While the Department of Education (ED) has allowed schools to test policy improvements through its Experimental Sites Initiative program for almost 30 years, it has failed to evaluate their success, according to a new report by New America.
The report, Putting the Experimental Back in the Experimental Sites Initiative, urges Congress, ED, and education advocates to support new language in the House Republicans' bill to reauthorize the Higher Education Act of 1965—the Promoting Real Opportunity, Success and Prosperity Through Education Reform (PROSPER) Act—that mandates the evaluation of such experiments.
ED's intention in creating the Experimental Sites Initiative was to allow institutions to test and evaluate policy changes on a small scale in order to inform future higher education legislation. Authors Clare McCann, Amy Laitinen, and Andrew Feldman argue that ED has instead used the program as an excuse to waive schools' obligations to adhere to federal rules, and has failed to collect sufficient data or evaluate the success of these trials.
"The Experimental Sites Initiative has been a missed opportunity to learn what works and for whom," the authors wrote. "Had there been a consistent commitment by the Department to rigorous evaluation—or a requirement by Congress to conduct credible evaluations—the Department, Congress, the higher education community, and taxpayers would today have evidence from those experiments to inform broader policy."
The Experimental Sites Initiative grew out of the Quality Assurance (QA) program, piloted in 1985, which allowed schools to replace ED's verification requirements with their own systems. Over the next decade, however, ED approved a variety of experiments requested by schools that complained of burdensome regulations, such as mandatory exit counseling for high-risk borrowers. None of these experiments was designed to produce significant results that could inform policy.
While Congress instituted a stricter set of rules for what experiments could test and what data must be collected in its 1998 bill to reauthorize the Higher Education Act, few new experiments were created due to these added reporting burdens, and Congress relaxed the rules a decade later.
ED has since created a new round of experiments with little planning for evaluation, according to the authors. For example, in 2015, ED announced two new experiments involving Pell Grants without a strategy to collect data on them. One experiment allowed high school students in dual-enrollment programs to access Pell Grants, and the other granted prisoners Pell Grants to enroll in education programs while incarcerated.
"The press releases for these experiments noted that thousands of individuals would benefit. However, without a credible evaluation, the Department cannot make claims about the extent to which federal aid generated positive outcomes for those students," the authors wrote.
At the beginning of 2017, under a new administration, ED announced plans to evaluate any ongoing experiments, though the authors argue that it will be difficult to procure data from initiatives that began years ago and that ED currently lacks the funding to conduct rigorous evaluations.
The authors urge senators to support new language in the PROSPER Act that requires that these experiments be rigorously evaluated, and recommend several steps that ED, Congress, and institutions can take to ensure that the Experimental Sites Initiative program is successful going forward.
The authors suggest that Congress designate funding to ensure that ED can afford to evaluate programs, and that ED produce reports to Congress on experimental sites every other year. They also recommend that ED gather input from stakeholders such as college administrators, higher education researchers, and policy organizations before designing experiments.
"The Trump administration—with support from Congress and encouragement from the higher education community—has a valuable opportunity to make a new start with the Experimental Sites Initiative by reviving its original mission as a strategy to innovate and rigorously learn what works, and, just as importantly, what does not," the authors wrote.
Publication Date: 1/23/2018