Additional Testimony on Opportunities to Improve Student Success in the Higher Education Act

Submitted to the U.S. Senate Committee on Health, Education, Labor, and Pensions

Lashawn Richburg-Hayes

MDRC is pleased to have this opportunity to provide additional information for the consideration of Chairman Alexander, Ranking Member Murray, and members of the Committee on ways research can be used to improve the academic success of low-income college students.

MDRC — a nonprofit, nonpartisan research organization based in New York City and Oakland, California — was founded more than 40 years ago to build reliable evidence on the effectiveness of programs for the disadvantaged and to help policymakers and practitioners use that evidence to improve policies and programs. MDRC is known for conducting large-scale evaluations and demonstration projects to test the impacts and cost-effectiveness of education and social programs. Many MDRC studies use a random assignment research design, the most rigorous method for evaluating such programs, because it can isolate the value an intervention adds to the status quo. This method, analogous to the one used in medical clinical trials, produces the most reliable evidence that a program works. As a result, it is the primary method accepted without reservations by the U.S. Department of Education’s What Works Clearinghouse (WWC).

The goal of this additional testimony is to reiterate the second recommendation in the original testimony of Dr. Lashawn Richburg-Hayes: to “encourage innovation paired with research, especially rigorous evaluation.” The testimony of Senators and witnesses at the August 5, 2015 hearing offered a number of promising ideas that are operationally feasible and whose outcomes seem encouraging, but that have not yet been subjected to rigorous evaluation based on WWC standards. In addition, other evaluated programs that were mentioned produced, at best, small effects on outcomes identified by the Institute of Education Sciences (IES) as key markers of collegiate progress. In a time of limited resources, building reliable evidence before making major investments is essential to the long-term goal of increasing college completion.

This additional testimony is intended to emphasize the importance of building into the reauthorization of the Higher Education Act incentives for innovation, coupled with requirements for rigorous evaluation through randomized experiments whenever feasible. This testimony identifies opportunities in two areas — financial aid and student support services — and offers caution in the area of performance funding for higher education.

A summary of our recommendations follows:

  1. Test variations of enrollment intensity tied to Pell Grant receipt before altering the policy. The Department of Education can test the use of Pell Grant funds to cover the summer term of the academic year. Offering Pell Grant aid during the summer would create an opportunity to test whether aid during short summer or winter terms (that is, those less than 12 weeks in duration) helps students make stronger progress toward degree completion. Such tests could evaluate tying the reintroduction of summer Pell awards to other strategies (for example, incremental aid disbursements) that could help control program costs and make the program more sustainable. Tests (some taking as little as a semester to yield pertinent information) could be designed to evaluate the effectiveness of using the Pell Grant as an incentive to enroll for 15 credits per semester, of offering the Pell Grant as a reward for reaching particular milestones, and of changes to the Pell Grant relative to alternative funding for students on the verge of dropping out because of tuition and fees.
  2. Include student support services as an IES grant priority and advertise this priority clearly through the Federal Register and other means before competitions open. Congress can capitalize on the IES annual grant opportunity to provide more information in areas of interest to policymakers, the Department of Education, and other stakeholders. One way to implement this is to identify specific competitions to focus proposal submissions in a desired area. Announcing the priority broadly would allow colleges and researchers to prepare models and find partners.
  3. Continue to use First in the World (FITW) to encourage innovation and research in student support services. The federal government has made notable strides in valuing evidence through the Investing in Innovation (i3) program and FITW. Both grant opportunities support programs while requiring strong evaluation that will benefit policymakers and practitioners. In addition, both use a tiered strategy of financial incentives, providing the most funding to expand proven strategies to a large scale and lesser amounts to support innovation and a learning agenda. This year’s FITW competition included “Improving Student Support Services” as an absolute priority in the validation grant category, further directing attention to areas where the research base is thin. Future competitions could foster additional innovation and research on support services.
  4. Consider the unintended consequences of implementing performance funding in higher education in the absence of standardized metrics of institutional effectiveness. While performance funding could create incentives for institutions to improve their effectiveness, it could also lead institutions to select students who are more likely to graduate, to lower their academic standards to achieve desired outcomes, or both. While strong critiques can be made of all funding schemes, the development of standard metrics should likely precede the implementation of a performance-funding alternative.