College Ready

In 2017, nearly 72% of all North Carolina juniors failed to meet the ACT college-ready benchmark in math (NC DPI data). Of the students who graduated from high school in 2013 and immediately enrolled in a North Carolina community college, 52% were required to take up to 10 remedial modules at the community college (General Assembly of North Carolina, 2015). Students who are required to take remedial education courses in college are less likely to graduate or earn a certification for a job that could earn them a living wage (Tippett & Stanford, 2019).

The North Carolina General Assembly has attempted to address the fact that many North Carolina high school graduates do not meet the math ACT benchmark (22) or the high school GPA requirements for enrolling in gateway math courses at the community college. (The math ACT benchmark score of 22 applies to non-STEM (Science, Technology, Engineering, and Mathematics) majors.) Until June 2019, students enrolling in community colleges who did not meet the Multiple Measures would take the North Carolina Diagnostic and Placement test (NC DAP) and then the developmental math (DMA) modules at the community college. Because of the low graduation rates, believed to be caused by the need to take remedial not-for-credit courses, the NC General Assembly passed a bill requiring the Career- and College-Ready Graduates (CCRG) program. This program requires high schools to offer the remedial modules to students so that they can complete them during their senior year of high school, rather than after they enter community college.

Edstar evaluated a pilot CCRG mathematics program in 11 North Carolina school districts, run in collaboration with their community colleges. The program was to serve high school seniors who did not meet the Multiple Measures for admission to community college without the need for remediation in math. In the pilot CCRG program, online EdReady mathematics modules replaced the DMA modules.
High school seniors who did not have the minimum GPA or ACT math scores were to enroll in the EdReady system through the partnering community college and complete the modules before graduation. Ideally, doing so would mean they could enter the gateway mathematics course required for their chosen path of study at a community college after high school.

As part of the evaluation, Edstar surveyed high school staff in the CCRG pilot schools to learn about the math courses offered. Staff reported that they offered Honors, standard, and slower-paced two-course versions of Math I, II, and III, plus a variety of courses for the fourth math option. The longer, slower-paced versions of Math I, II, and III were described as being for struggling math students, and the Honors courses as being for top-scoring students. Yet analysis of the data for these districts showed that many high-achieving students were enrolled in the longer two-course versions of Math I, II, and III. Comparing the Math I End-of-Course (EOC) scores of matched top-scoring mathematics students (Level 4 or 5 on the 8th Grade Math EOG) who took the standard course with those who took the slower-paced course showed that the standard-course students scored significantly higher on the Math I EOC (82% vs. 41% scoring college-ready).

Students had been placed in math courses based on the professional judgment of teachers, and the staff believed the placement to have been based on previous math scores. In fact, low-income and minority students were disproportionately placed lower by professional judgment than they would have been if placed by their math scores. Schools that embedded CCRG into existing math classes reported using very specific participation criteria. When asked why the data showed that many of the students served did not meet those criteria while students who did meet them were not served, they explained that they had actually just served all the students enrolled in specific courses.
They reported believing that most students who fit the criteria would probably take the courses in which they embedded the EdReady modules. Although schools reported using criteria to identify qualified students, these criteria differed from school to school; educators did not have a clear understanding of which students should be targeted. One school reported that rather than use GPA and ACT scores, they had a teacher create a placement test to identify students to participate in CCRG.

We also found that teachers overrode the adaptive features of EdReady that required students to stay in a module until they mastered it. They did this so that classes would all be working on the same topics at the same time. As a result, the data showed that some students took the initial test for a module, passed it, but completed the corresponding EdReady modules anyway; other students failed post-tests for modules but went on to the next module anyway.

We found that 46% of the students who participated in the CCRG modules were not seniors; these were primarily high-scoring, low-income and minority students who were taking a fourth math class in their junior year. (Their fourth math was one assumed to be for struggling students.) Of the seniors who participated, about a third had GPAs or ACT scores high enough to qualify them for gateway math courses at their community college; the other two-thirds needed the remediation. With all the juniors and already-qualified seniors in the program, only about 37% of the CCRG participants were seniors who needed the program. A review of all the data for the schools showed that nearly half of the seniors who needed the remediation did not get it because they were not enrolled in the math courses where it was offered. Nearly all of the students who passed CCRG modules were students who did not need them.

Which beliefs are influencing the Equity Lens behind the way the program was implemented? Click to check your answer.
B.1 Cause and Effect
B.2 Expert vs. Evidence
B.3 What At-Risk Means
B.4 Desired Outcomes and Goals
B.5 What STEM Is and Why We Need to Fill the STEM Pipeline

Which skills are influencing the Equity Lens? Click to check your answer.

S.1 Knowing What Can Be Known
S.2 How to Identify Kids to Align Services
S.3 How to Classify Things
S.4 Working With Data
S.5 Understanding Data Details
S.6 Understanding Federal Data-Handling Laws

Beliefs

B.1 Cause and Effect
NA

B.2 Expert vs. Evidence
They used professional judgment, or experts' opinions, to decide who should enroll in CCRG, but reported this to us as using the GPA and ACT data. They assumed the two were equivalent.

B.3 What At-Risk Means
The Equity Lenses of the schools probably included believing that low-income and minority students were at risk in mathematics. This is reflected in the high percentage of top-level students enrolled in the two-course versions of Math I that they described as being for struggling students. The belief that low-income and minority students struggle and are at risk of failing math may be why so many high-scoring juniors were enrolled in a fourth math course described as being for students who struggle to pass math.

B.4 Desired Outcomes and Goals
Although not discussed in this reading exercise, many of the high school staff members were confused about the goal of CCRG. One high school teacher who taught the CCRG course for two years commented on a survey, “A lot of my students in the CCRG see no benefit in the work just to get college credit for the module if they pass it.” (Students were not getting college credit. The goal was to qualify them to take for-credit courses if they enrolled in community college.)

B.5 What STEM Is and Why We Need to Fill the STEM Pipeline
Schools were taking students out of the STEM pipeline by giving top-scoring juniors what they described as math for struggling students as their fourth math class.
Skills

S.1 Knowing What Can Be Known
Edstar provided each school a list of its seniors who did not meet the GPA or ACT math requirements. However, by the time the program was in place, other staff had become involved who did not know the list existed or how to get such a list themselves.

S.2 How to Identify Kids to Align Services
They identified students by the courses they were enrolled in rather than by the relevant data.

S.3 How to Classify Things
NA

S.4 Working With Data
The program had many problems because of a lack of data skills or understanding. Because teachers overrode the adaptive feature, module completion was not recorded in the system. Some teachers said they tested students themselves on paper, but the results were never entered into the system, so students did not get credit.

S.5 Understanding Data Details
Teachers who overrode the adaptive feature of EdReady reported giving students paper tests when they felt the students might be able to pass previous modules, since students could not go backwards in the online system. As a result, the module-completion data the teachers provided to us did not match the data provided by EdReady, and students did not get credit for these modules because the test results were not in the system.

S.6 Understanding Federal Data-Handling Laws
Four different agencies were involved in this project: the schools, the community colleges, EdReady, and Edstar. It would have been a good idea for all parties to have a clear understanding of how FERPA applies to this program, and to make sure everyone involved understood it.