Edstar evaluated the dropout prevention grants for the state of North Carolina. We were hired one year into the grant cycles. Most of the grants were awarded to agencies that had designed their programs through the Equity Lens: they believed that demographically at-risk students (low-income or minority) needed innovative services to raise their quality of life or to motivate them, and that this might then lead to on-time graduation. Fewer than 20% of the grantees had any measurable objectives, and most were not keeping records. They expected that the success of their programs would be measured by whether overall graduation rates improved in the districts they served.
The NC Dropout Committee explained to us that they wanted to know how effective the services provided by these programs were at raising on-time graduation rates and reducing dropout rates. Our view of cause and effect on these projects meant that we needed to document why the students served were unlikely to graduate on time (e.g., they had failed core courses, been repeatedly suspended and fallen far behind, or accumulated so many absences that they failed courses) and then link the services they received to reducing those factors. Most of the agencies with the grants had a different cause-and-effect perspective: they would raise the average quality of life for at-risk students, and the overall graduation rate would increase as a result. They saw no need to keep records of who was served, because they thought improving the quality of life for some at-risk students would raise the average graduation rate for the group as a whole. There would be no direct link between the two, only an indirect one.
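To make that first kind of linkage concrete, here is a minimal sketch of the record keeping it requires, with hypothetical column names and made-up values: each served student's documented risk factors are recorded before and after services, and the change in each factor is computed per student.

```python
# A minimal sketch of linking services to risk-factor reduction.
# Column names and values are hypothetical, for illustration only;
# real data would come from district records.
import pandas as pd

records = pd.DataFrame({
    "student_id":          [1, 1, 2, 2],
    "period":              ["pre", "post", "pre", "post"],
    "core_courses_failed": [2, 0, 1, 1],
    "days_absent":         [24, 9, 15, 14],
    "suspensions":         [3, 1, 0, 0],
})

pre = records[records["period"] == "pre"].set_index("student_id").drop(columns="period")
post = records[records["period"] == "post"].set_index("student_id").drop(columns="period")

# Negative values mean a documented risk factor was reduced after services.
print(post - pre)
```

Nothing here is statistically fancy; the point is that without per-student records of who was served and why, this comparison cannot be made at all.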
For the second round of grants, Edstar worked with the NC Dropout Committee, and applicants were required to have objectives that related to fulfilling graduation requirements. They were to target students for service based on relevant data (e.g., course completion, attendance, suspensions, GPAs), and the services had to be related to helping students fulfill graduation requirements. Some of the programs had wanted to serve students who were too far behind to graduate on time, or who had already dropped out, but because the metric for success was on-time graduation, they were not going to. We talked to the Dropout Committee, and they changed the success metric so that the schools that wanted to could include these students. We set up consistent record keeping across all the programs.
Not all grantees were able to make this switch to serving students based on data indicating they were unlikely to graduate on time, or at all, without help. Some thought they were making the switch and then found out that the demographic data they believed was equivalent to academic data wasn't. For example, one program planned to serve students who had not passed algebra or were not likely to. Rather than serve students who had taken and failed algebra, or whose math scores predicted they were likely to fail, they served Black males. They reported to us that they were serving students who couldn't pass algebra, but they had not looked at any data. They had asked for referrals of "at-risk" Black males, believing that "at-risk" Black males were likely to fail algebra. They didn't know about EVAAS predictions or how to get a list of students who had taken and failed algebra. They had students complete online remedial arithmetic and pre-algebra modules, rewarding them with iPads and basketballs for successful completion. At the end of the program, when they had to report how many of the students they served enrolled in and passed algebra after the services, they discovered that the majority had passed algebra before entering the program, and that some participants successfully completed calculus while in it.
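By way of contrast, here is a hedged sketch of what data-based targeting for a program like that could have looked like: pulling students who took and failed algebra, or who have not yet taken it and have a low predicted chance of passing. The column names, the prediction source, and the 0.5 cutoff are all assumptions for illustration, not anyone's actual criteria.

```python
# A sketch of selecting a target list from academic data rather than
# demographic referrals. All columns and values are hypothetical; in
# practice, predictions might come from a source such as EVAAS.
import pandas as pd

students = pd.DataFrame({
    "student_id":          [101, 102, 103, 104, 105],
    "algebra_attempted":   [True, True, False, False, True],
    "algebra_passed":      [False, True, False, False, False],
    "predicted_pass_prob": [0.35, 0.90, 0.25, 0.80, 0.40],
})

# Took algebra and failed it.
failed_algebra = students["algebra_attempted"] & ~students["algebra_passed"]
# Hasn't taken it yet, and the prediction says passing is unlikely.
predicted_to_fail = ~students["algebra_attempted"] & (students["predicted_pass_prob"] < 0.5)

target_list = students[failed_algebra | predicted_to_fail]
print(target_list["student_id"].tolist())
```

A list like this would have excluded, from the start, students who had already passed algebra, which is exactly the group the referral approach ended up serving.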
Working with these programs to help them write S.M.A.R.T. objectives, we learned that many of them did not know the graduation requirements or what data would be relevant for identifying a target group to serve. Even when they could identify a target group, such as students who were not likely to pass algebra, they used demographic data and referrals to identify students to serve. When programs wrote S.M.A.R.T. objectives in terms of relevant data but used referrals to select students, it was not uncommon for up to 85% of the students served to have already exceeded the program objectives before entering the program.
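The baseline check behind a figure like that 85% is simple to express. A sketch, again with hypothetical columns: compare each served student's pre-enrollment record against the program objective and report the share who had already met it.

```python
# A sketch of a baseline check: what share of served students had already
# met the program objective (here, passing algebra) before enrollment?
# Column names and values are hypothetical.
import pandas as pd

served = pd.DataFrame({
    "student_id":              [201, 202, 203, 204],
    "passed_algebra_at_entry": [True, True, True, False],
})

already_met = served["passed_algebra_at_entry"].mean()
print(f"{already_met:.0%} of served students had already met the objective")
```

Run at intake rather than at final reporting, a check like this would flag a mismatched target group while there is still time to correct it.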