Could College Exit Exams Restore Confidence in Higher Ed?

Although there is no shortage of college graduates, a degree alone does not guarantee that students learned anything of substance while in college. The grade point averages listed at the top of many graduates’ resumes don’t always reflect students’ actual academic capabilities. University classes, particularly in the humanities, have become increasingly watered down, making students’ “A-plus” grades virtually meaningless. 

It’s no wonder employers have expressed concern about whether colleges adequately prepare students for the workforce. 

According to a 2021 report by the Association of American Colleges and Universities, 60 percent of surveyed employers said that “critical thinking skills” are “very important,” but only 39 percent reported that recent graduates are well-prepared in this area. Similarly, 56 percent of employers consider “application of knowledge/skills in real-world settings” to be very important, but only 39 percent thought recent graduates were well-prepared to do so. And 17 percent of employers under the age of 40 report having “very little confidence” in higher education. 

Of course, employers aren’t the only ones who lose out when colleges fail to produce knowledgeable graduates. Many students and their families make significant financial sacrifices to pay for a college education. The average federal loan borrower, for example, owes $36,510. 

Unless held accountable, colleges will likely continue to charge high prices for a substandard education. What can be done? 

Some have proposed the use of a college exit exam, a standardized test that students must take before graduating. The Tennessee Board of Regents, for example, requires students who attend its 40 community and technical colleges to take an exit exam, the ETS Proficiency Profile, as a condition of graduation. This exam tests “college-level skills in reading, writing, critical thinking and mathematics” and is “designed to measure the academic skills developed through general education courses, rather than the subject knowledge specifically taught in those courses.” 

Although exit exams come in several variations, the main idea is the same: measure how much students have learned in college and make the results public. Doing so would hold institutions more accountable for providing a quality education and might help employers identify promising candidates. 

Ohio University economics professor Richard Vedder proposes what he calls the National College Exit Examination (NCEE). In his book Restoring the Promise: Higher Education in America, Vedder writes that the NCEE is “envisioned to be perhaps a three and one-half hour test administered over a morning or afternoon.” He elaborates:

It consists of a largely essay-based ninety-minute examination of critical-reasoning and writing skills, perhaps using the existing [Collegiate] Learning Assessment exam, and a two-hour test of one hundred multiple choice questions. The largest portion of that multiple choice test, seventy-five questions, would examine the student’s basic knowledge in a variety of disciplines—things every college graduate should know…

The test may include questions in categories such as mathematics, biological sciences, statistics and data analysis, literature, history, and government. Vedder suggests that the last twenty-five questions be reserved for “a subject of the student’s choice—presumably the area the student has studied most (‘the major’).” 

Vedder argues that specialized exams already exist to demonstrate mastery of particular skill sets: 

We have special exams to indicate mastery of preparation for graduate-level work (the Graduate Record Examination), to show mastery of high school level educational attainment (the GED examination), to demonstrate competency in critical-thinking and writing skills (the [Collegiate] Learning Assessment), and even to show mastery of high-level specialized skills (for example, the CPA and state bar examinations).

Vedder continues, arguing, “There is no reason we cannot do the same in higher education, perhaps developing [an NCEE] that tests for critical-reasoning skills as well as knowledge that college-educated persons should possess.” 

Such an exam, Vedder argues, has the potential to help restore employer confidence in the value of a degree and make colleges more accountable. “[T]he scores on the NCEE… would provide a uniform metric of student knowledge currently unavailable,” he writes. Vedder even proposes that the NCEE could be used to provide the college equivalent of a GED:

Indeed, the NCEE (or its equivalent) could become like the GED is to high school diplomas. One approach would be to say that “anyone who scores above a 90 on the NCEE will be granted a diploma.” We are vastly less interested in the amount of inputs a student uses in the process of learning (the number of credit hours earned) than we are in the outcomes—the amount of learning itself.

He also argues that exit-exam scores could give future employers much more useful information about graduates’ abilities. “Who needs degrees?” asks Vedder. “[A] diploma is a crude binary information device—you have one, or you do not. A diploma with an NCEE score attached to it conveys far more information.”

The creation of a new exam such as the NCEE may not be necessary, however. Proponents of exit exams could argue that a satisfactory measure of student learning already exists: the Collegiate Learning Assessment (CLA). The CLA, which the Martin Center’s Jenna Robinson has written about previously, “assesses students’ abilities to think critically, reason analytically, solve problems and communicate clearly and cogently.” Robinson writes:

CLA is made up of four sections: a performance task, an analytical writing task, a make-an-argument section, and a critique-an-argument section. Scores are aggregated at the institutional level to inform the institution about how their students as a whole are performing. After controlling for college entrance scores (SAT or ACT), freshmen scores are compared with graduating senior scores to obtain the institution’s contribution to students’ results. Students’ entrance scores help CLA to determine whether a university is at, above, or below expected performance.

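The value-added logic Robinson describes can be made concrete with a small sketch. The Python snippet below, using entirely made-up numbers, shows one simple way such an adjustment could work: predict each senior’s expected score from his or her entrance score, then average the gaps between actual and expected scores. The benchmark coefficients here are assumptions for illustration only, not the CLA’s actual (and more sophisticated) scoring model.

```python
import numpy as np

# Hypothetical data: one row per graduating senior at a single institution.
# "entrance" is an SAT/ACT-derived entrance score; "senior" is that
# student's senior-year CLA-style score. All numbers are made up.
entrance = np.array([1050, 1180, 1260, 1340, 1420], dtype=float)
senior = np.array([1080, 1150, 1310, 1330, 1450], dtype=float)

# Assumed national benchmark: expected senior score as a linear function
# of the entrance score. In practice these coefficients would be estimated
# from a large cross-institution sample; here they are illustrative.
slope, intercept = 0.9, 120.0
expected = slope * entrance + intercept

# The institution's estimated contribution ("value added") is the average
# gap between actual and expected senior scores, after the entrance-score
# adjustment above.
value_added = float(np.mean(senior - expected))

print(f"Mean expected senior score: {expected.mean():.1f}")
print(f"Mean actual senior score:   {senior.mean():.1f}")
print(f"Estimated value added:      {value_added:+.1f}")
# A positive result would place the institution above expected performance;
# a negative one, below.
```
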
In their 2011 book Academically Adrift, Richard Arum and Josipa Roksa evaluated student learning by analyzing the CLA scores of over three thousand undergraduates from the fall of 2005 to the spring of 2009. For 45 percent of those students, the authors found very little growth in critical thinking, complex reasoning, and written communication. 

And in a follow-up report, the authors found that students’ CLA performance tracked their post-graduation outcomes: students in “the bottom quintile of CLA performance as seniors” were “more than three times as likely to be unemployed two years after college than graduates whose CLA scores were in the top quintile.” “They were also twice as likely to be living back at home with their parents,” the authors noted. Given these findings, some may argue that requiring college students to take a test like the CLA could provide helpful insight into how effectively schools are teaching their students. 

If they can reliably measure student learning, college exit exams offer several potential benefits. For one, an objective measure of how much students have learned in college would likely appeal to employers who have lost faith in the value of a degree.

Exit exams may also provide useful information about the overall quality of a given university. If an institution has poor learning outcomes, prospective students may opt for a different college. This sort of accountability could give colleges the incentive they need to prioritize high-quality instruction or risk losing their competitive edge, and perhaps their funding. 

Editor’s note: Next week, the Martin Center will feature an article arguing against the use of college exit exams. 

Shannon Watkins is associate editor at the James G. Martin Center for Academic Renewal.