A major public university has paused its use of risk scores following a Markup investigation that found several universities using race as a factor in predicting student success. Our investigation also found that the software, Navigate, created by EAB and used by more than 500 schools across the country, was disproportionately labeling Black and other minority students “high risk”—a practice experts said ends up pushing Black kids out of math and science into “easier” majors.
Following our report, Texas A&M University announced that it would stop including such risk scores on adviser dashboards and asked EAB to create new models that do not include race as a variable.
“We are committed to the success of all Texas A&M students,” Tim Scott, Texas A&M’s associate provost for academic affairs and student success, wrote in an email to The Markup. “Any decisions made about our students’ success will be done in a way that is fair and equitable to all students.”
The response from other schools has been mixed.
Maryclare Griffin, a statistics professor at the University of Massachusetts Amherst, another school featured in the story, said her institution appears to have removed the option to view student risk scores for some Navigate users. Another professor at the school, however, told The Markup that they could still view the scores.
UMass Amherst spokesperson Mary Dettloff would not confirm whether the school had made changes to its Navigate system and declined to answer other questions for this story.
The University of Houston, one of the four schools from which The Markup obtained data showing racial disparities in the risk scores, has not made any changes to its use of EAB’s algorithms, Shawn Lindsey, a spokesperson for the university, said.
The other schools mentioned in the original story—the University of Wisconsin–Milwaukee, South Dakota State University, Texas Tech University, and Kansas State University—did not respond to questions for this story.
The Markup obtained data from public universities showing that the algorithms embedded in educational research company EAB’s Navigate software assigned Black students high risk scores at double to quadruple the rate of their White peers. The risk scores purport to predict how likely a student is to drop out of school if that student remains within his or her selected major.
At nearly all the schools The Markup examined, EAB’s algorithms explicitly factored students’ race into their predictive models. And in several cases, the schools used race as a “high impact predictor” of success, meaning it was one of the variables with the most influence over students’ risk scores.
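EAB has not published the internals of its models, but the phrase “high impact predictor” maps onto a familiar idea in predictive modeling: some input variables move the predicted risk far more than others. The sketch below, written in Python with scikit-learn on entirely made-up data and hypothetical feature names, shows one common way an institution might see which inputs dominate a risk model, by comparing logistic-regression coefficients fit on standardized features. It is an illustration of the concept, not a reconstruction of EAB’s algorithm.

```python
# Illustrative only: EAB's actual model is proprietary. This sketch shows one
# common way to see which inputs dominate a risk model: compare the
# coefficients of a logistic regression fit on standardized features.
# All feature names and data here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([
    rng.normal(3.0, 0.6, n),        # high_school_gpa
    rng.normal(1100, 150, n),       # sat_score
    rng.integers(0, 2, n),          # race_indicator (0/1 flag)
    rng.normal(60_000, 20_000, n),  # family_income
])
feature_names = ["high_school_gpa", "sat_score", "race_indicator", "family_income"]

# Made-up "dropped out" outcomes in which the race indicator is given an
# outsized weight, mimicking a model where a demographic flag dominates.
logit = -1.2 * (X[:, 0] - 3.0) - 0.004 * (X[:, 1] - 1100) + 1.5 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(StandardScaler().fit_transform(X), y)

# Larger absolute coefficients mean more influence over the predicted risk,
# which is roughly what "high impact predictor" describes.
for name, coef in sorted(zip(feature_names, model.coef_[0]),
                         key=lambda pair: -abs(pair[1])):
    print(f"{name:>16}: {coef:+.2f}")
```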
“EAB is deeply committed to equity and student success. Our partner schools hold differing views on the value of including demographic data in their risk models. That is why we are engaging our partner institutions to proactively review the use of demographic data,” EAB spokesperson John Michaels wrote in an email to The Markup. “Our goal has always been to give schools a clear understanding of the data that informs their customized models. We want to ensure that each institution can use the predictive analytics and broader platform as it is intended—to provide the best support for their students.”
EAB has marketed its advising software as a tool for cash-strapped universities to better direct their resources to the students who need help the most and, in the process, boost retention and avoid the additional cost of recruiting students to take the place of those who drop out.
But at the schools The Markup examined, we found that faculty and advisers who had access to EAB’s student risk scores were rarely, if ever, told how the scores were calculated or trained on how to interpret and use them. And in several cases, including at Texas A&M University, administrators were unaware that race was being used as a variable.
Instead, the software gave advisers a first impression of whether a student was at high, moderate, or low risk of dropping out within his or her selected major, and then, through a function called Major Explorer, showed how that student’s risk might decrease if the student switched to a different, “less risky” field of study.
Experts said that design feature, coupled with the racial disparities in risk scores, was likely to perpetuate historic racism in higher education and result in students of color, particularly Black students, being encouraged to leave science, math, and engineering programs.
Iris Palmer, a senior adviser for higher education and workforce policy at New America, has studied the predictive analytics systems universities use to boost retention and has written a guide for schools to follow when considering whether to implement such systems.
“I don’t think taking race explicitly out of the algorithm solves the problem or makes the situation better necessarily,” she said. “Algorithms can predict race based on all sorts of other things that go into the algorithm,” such as combinations of data like zip code, high school name, and family income.
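Palmer’s point about proxies can be checked directly on a school’s own data: if a classifier can recover students’ race from the remaining inputs well above chance, then dropping the race field does not remove racial information from the model. A minimal sketch, assuming a pandas DataFrame with hypothetical columns named zip_code, high_school, family_income, and race:

```python
# A rough proxy-leakage check with hypothetical column names: if race can be
# predicted from the remaining inputs well above chance, removing the race
# field does not remove racial information from the model.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

def proxy_leakage_score(students: pd.DataFrame) -> float:
    """Cross-validated balanced accuracy for predicting race from non-race columns."""
    X = students[["zip_code", "high_school", "family_income"]]
    y = students["race"]
    preprocess = ColumnTransformer(
        [("categorical", OneHotEncoder(handle_unknown="ignore"),
          ["zip_code", "high_school"])],
        remainder="passthrough",  # pass family_income through unchanged
    )
    model = Pipeline([("prep", preprocess),
                      ("clf", LogisticRegression(max_iter=1000))])
    # Chance level for balanced accuracy is 1 / (number of race categories);
    # scores well above that mean the other variables still encode race.
    return cross_val_score(model, X, y, cv=5, scoring="balanced_accuracy").mean()
```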
There is potential value in using predictive analytics to identify the students most in need of support, Palmer said, if schools actually train staff members on how the algorithms work and if the software explains, in a concise and understandable manner, which factors lead to each student being assigned a particular risk score. “And that’s a big if.”
Schools “need to do due diligence around disparate impact and why you’re seeing disparate impact on your campus,” she said. Had schools been doing that before signing multiyear contracts with EAB, “they wouldn’t have been caught unawares.”
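The due diligence Palmer describes could begin with a check schools can run on their own data before, or after, signing a contract: compare how often each racial group is labeled “high risk” and flag large gaps. A minimal sketch, assuming a DataFrame with hypothetical race and risk_level columns and made-up example rows:

```python
# A basic disparate-impact check with hypothetical column names: compare how
# often each group is labeled "High" risk relative to a reference group.
import pandas as pd

def high_risk_rate_ratios(df: pd.DataFrame, reference: str = "White") -> pd.Series:
    """Rate of 'High' risk labels per group, divided by the reference group's rate."""
    rates = (df["risk_level"] == "High").groupby(df["race"]).mean()
    return (rates / rates[reference]).sort_values(ascending=False)

# Made-up example: a ratio of 2.0 means that group is labeled high risk at
# twice the reference group's rate; The Markup found ratios of roughly two
# to four at the schools it examined.
df = pd.DataFrame({
    "race": ["Black", "Black", "White", "White", "Hispanic", "White", "Black", "Hispanic"],
    "risk_level": ["High", "High", "Low", "High", "High", "Low", "Moderate", "Low"],
})
print(high_risk_rate_ratios(df))
```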