Dropout Prevention

Edstar evaluated the dropout prevention grants for the state of North Carolina. We were hired one year into the grant cycles. Most of the grants were awarded to agencies that had designed programs through the Equity Lens of believing that demographically at-risk students (low-income or minority) need innovative services to raise their quality of life or to motivate them, and that this might then lead to on-time graduation. Fewer than 20% of the grantees had any measurable objectives, and most were not keeping records. They expected the success of their programs to be measured by whether overall graduation rates improved in the districts they served.

The NC Dropout Committee explained to us that they wanted to know how effective the services provided by these programs were at raising on-time graduation rates and reducing dropout rates. Our view of cause and effect on these projects meant that we needed to document why the students served were unlikely to graduate on time (e.g., they had failed core courses, had been repeatedly suspended and fallen far behind, or had enough absences to fail courses) and then link the services they received to reducing those factors. Most of the agencies with grants had the cause-and-effect perspective that they would raise the average quality of life for at-risk students, and the overall graduation rate would increase as a result. They saw no need to keep records of who was served, because they thought improving the quality of life for some at-risk students would raise the average graduation rate for the group as a whole. The link between the two would be indirect rather than direct.

For the second round of grants, Edstar worked with the NC Dropout Committee, and applicants were required to have objectives related to fulfilling graduation requirements.
They were to target students for services based on relevant data (e.g., course completion, attendance, suspensions, GPAs). The services needed to be related to helping students fulfill graduation requirements. Some programs had wanted to serve students who were too far behind to graduate on time, or who had already dropped out, but because the metric for success was on-time graduation, they were not going to. We talked to the Dropout Committee, and they changed the success metric so that the schools that wanted to include these students could. We set up consistent record keeping across all the programs.

Not all grantees were able to make this switch to serving students based on data indicating they were unlikely to graduate at all, or on time, without help. Some thought they were making the switch and then found out that the demographic data they believed was equivalent to academic data was not. For example, one program planned to serve students who had not passed algebra or were not likely to. Rather than serve students who had taken and failed algebra, or whose math scores predicted they were likely to fail, they served Black males. They reported to us that they were serving students who couldn't pass algebra, but they had not looked at any data. They had asked for referrals of "at-risk" Black males, believing that "at-risk" Black males were likely to fail algebra. They did not know about EVAAS predictions, or how to get a list of students who had taken and failed algebra. They had students complete online remedial arithmetic and pre-algebra modules, rewarding them with iPads and basketballs for successful completion.
At the end of the program, when they had to report how many of the students they served enrolled in and passed algebra after the services, they discovered that the majority had passed algebra before being in the program, and that some participants successfully completed calculus while in the program.

Working with these programs to help them write S.M.A.R.T. objectives, we learned that many of them did not know the graduation requirements, or what data would be relevant for identifying a target group to serve. Even when they could identify a target group, such as students who were not likely to pass algebra, they used demographic data and referrals to identify students to serve. When programs wrote S.M.A.R.T. objectives in terms of relevant data but used referrals to identify students to serve, it was not uncommon for up to 85% of the students in the programs to have already exceeded the program objectives before entering the program.

Which beliefs were influencing some of the Equity Lenses of people who were developing these programs? Click to check your answer.

B.1 Cause and Effect
B.2 Expert vs. Evidence
B.3 What At-Risk Means
B.4 Desired Outcomes and Goals
B.5 What is STEM and Why We Need to Fill the STEM Pipeline

Which skills were influencing the Equity Lenses? Click to check your answer.

S.1 Knowing What Can Be Known
S.2 How to Identify Kids to Align Services
S.3 How to Classify Things
S.4 Working With Data
S.5 Understanding Data Details
S.6 Understanding Federal Data-Handling Laws

Beliefs

B.1 Cause and Effect
Overall, some programs continued to operate under the Equity Lens that supported providing innovative activities for at-risk students to raise their quality of life and, hopefully, cause them to graduate on time. Other programs operated under the Equity Lens that supported providing algebra tutoring for students who had trouble passing algebra, a more straightforward relation of cause and effect.

B.2 Expert vs. Evidence
NA

B.3
What At-Risk Means
There were at least two different Equity Lenses in different programs. Some programs operated under the lens that low-income and minority students are at risk of dropping out regardless of their current status in school. Other programs operated under the lens that students who are not on track to meet the graduation requirements are at risk of dropping out or not graduating on time.

B.4 Desired Outcomes and Goals
We were able to change the desired goals over the course of these grants, so that programs would be willing to help the students who had already dropped out or could not possibly graduate on time.

B.5 What is STEM and Why We Need to Fill the STEM Pipeline
NA

Skills

S.1 Knowing What Can Be Known
The program in the example, and many other programs, did not know it was possible to use data to find the students who were not likely to pass algebra without help, or who had already taken and failed it.

S.2 How to Identify Kids to Align Services
Some programs identified students for services by demographic characteristics. Other programs, eventually, used academic and behavior data.

S.3 How to Classify Things
NA

S.4 Working With Data
People did not know what data would be relevant. Many did not know what the graduation requirements were, which made it difficult for them to write goals about helping students graduate. Both non-profit and school staff had difficulty working with data because they were not used to it. They did not know how to write objectives in terms of relevant data.

S.5 Understanding Data Details
Most of the non-profit staff, and many of the educators, did not know how to interpret education data. They did not know how to interpret EVAAS projections or EOG/EOC scale scores, did not know attendance could be excused or unexcused, etc.

S.6 Understanding Federal Data-Handling Laws
Many of the non-profit staff did not know what FERPA was.
Many educators who were aware of FERPA were not sure what its data-handling requirements were.
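The contrast the evaluation kept running into, selecting students by academic evidence rather than by demographic referral, can be sketched in code. This is a minimal hypothetical illustration: the `Student` fields, the score cutoff, and the roster are all invented for the example, and the projection field only loosely stands in for an EVAAS-style prediction, not the actual EVAAS methodology.

```python
# Hypothetical sketch: identify students for algebra support using
# academic indicators (failed the course, or a projection predicting
# failure) instead of demographic referrals. All names, fields, and
# thresholds are illustrative, not from any real student data system.
from dataclasses import dataclass

@dataclass
class Student:
    student_id: str
    failed_algebra: bool          # has taken and failed Algebra I
    projected_math_score: float   # stand-in for an EVAAS-style projection

def needs_algebra_support(s: Student, cutoff: float = 70.0) -> bool:
    """Target by academic evidence: already failed algebra, or a
    projection below the (hypothetical) passing cutoff."""
    return s.failed_algebra or s.projected_math_score < cutoff

roster = [
    Student("A", failed_algebra=True,  projected_math_score=81.0),
    Student("B", failed_algebra=False, projected_math_score=55.0),
    Student("C", failed_algebra=False, projected_math_score=92.0),
]

targets = [s.student_id for s in roster if needs_algebra_support(s)]
print(targets)  # → ['A', 'B']
```

Note what the selection rule does not use: no demographic fields appear in `needs_algebra_support`. Student "C" is excluded because the data show no algebra risk, which is exactly the check that would have caught the program whose participants had already passed algebra (or calculus) before being served.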