The Failure Factory That Wasn’t

Picking a college is not easy. Seventeen-year-olds and their parents receive conflicting advice on where to go, what to look for, and how to get in. Reliable information can be hard to come by, and some of the information that exists—such as the U.S. News rankings—is only marginally helpful.

In a recent American Enterprise Institute report, “Filling in the Blanks,” Andrew Kelly and Mark Schneider contend that students and parents should pay more attention to college graduation rates.

Many colleges and universities do have very low graduation rates: at some four-year institutions, only a third or fewer of the students complete their degrees within six years. Yet seemingly comparable schools have substantially higher rates. Kelly and Schneider want the government to require schools that take federal student aid funds (almost all of them) to report their graduation and retention rates when communicating with prospective students. They contend that such disclosure would put pressure on low-performing schools to improve.

Perhaps, but that raises two questions.

Is a school’s graduation rate necessarily a good indicator of its educational quality?

If the government requires prominent reporting of this one statistic, will that help parents and students make better decisions?

We think that the answer to both is “no.”

We looked into a number of schools with consistently low graduation rates to find out why such a high percentage of their students do not complete their degrees “on schedule.”

One reason is that many of the students they enroll live on their own schedules, often complicated by family matters, financial difficulties, work obligations and other demands.

Among the schools identified as having a very low graduation rate in AEI’s 2009 report “Diplomas and Dropouts” is the University of Houston-Downtown, with a rate of 16 percent. Does that indicate that the university is not doing a good job? We spoke with Michael Dressman, interim vice president for academic affairs, who told us that few of the students there resemble those who attend prestigious schools. Most have neither good academic preparation nor strong family support. Some UH-D students progress right through to their degrees, but for many others, “life intervenes,” Dressman said.

UH-D is one of the many colleges Washington Post columnist Robert Samuelson had in mind when he wrote that America is fortunate to have a “learning system” in which people get second or third chances to succeed. Of course, schools that give those chances to a lot of students will have much lower graduation rates than ones that enroll only highly gifted and academically focused students.

AEI’s 2009 paper on the supposed problem of graduation rates, “Diplomas and Dropouts,” helps to make our point by showing that graduation rates correlate very strongly with the caliber of students enrolled. Among non-competitive schools, the six-year graduation rate averages 34.7 percent. Move up to “competitive” schools and the rate increases to 48.6 percent; move to the “most competitive” category and it rises to 87.8 percent.

Put it this way: Harvard doesn’t have a high graduation rate because Harvard is so exceptionally good, but because it attracts exceptionally good students. If those students went to UH-D instead, UH-D would have a high graduation rate. And vice versa.

Another institution with a low graduation rate is the University of North Carolina-Pembroke. The school, located in rural southeastern North Carolina, has a 34 percent six-year graduation rate according to IPEDS data. Should students and parents be leery of an institution where there appears to be only a one-in-three chance of graduating?

Dr. Elizabeth Normandy, associate vice chancellor for academic affairs at the university, said that would be a mistaken conclusion and cited perfectly understandable reasons why. Most UNC-P students are from the surrounding area, which is far from affluent, and a higher percentage of them face financial stress than students from more prosperous regions. “In many cases, they are working long hours to pay for college. This necessarily impedes academic progress and affects the graduation rate, yet the measure used by IPEDS does not allow for the impact of this variable,” she said.

Normandy further notes that quite a few UNC-P students don’t graduate from that school because they find it expedient to transfer to another school that offers the professional program they ultimately want, such as pharmacy.

So UNC-Pembroke’s rather low graduation rate doesn’t mean that the school is a poor choice for a student considering it; nor does it follow that the same student would be better off by enrolling in a school with a higher graduation rate. Whether a student graduates and benefits from the educational experience is up to that individual, not the institution.

That last phrase, “benefits from the educational experience,” leads to a related point: graduating from college doesn’t necessarily mean learning.

Many students who graduate have little or nothing other than the diploma to show for their time and effort. That is the striking conclusion reached by Professors Richard Arum and Josipa Roksa in their recent book Academically Adrift.

Academically Adrift relied on data from the Collegiate Learning Assessment (CLA), an essay-based test developed by the Council for Aid to Education to measure broad skills such as critical thinking and analytical reasoning. The CLA gives us additional reason to be skeptical of graduation rates as a mark of quality. According to CLA program manager Heather Kugelmass, “the correlation between 6-year graduation rates and CLA value-added scores is 0.01 (i.e., non existent).”

Therefore, if you think College A must be better than College B because it has a higher graduation rate, you might be making a mistake. College A may have a weaker curriculum, lower standards, and more inflated grades, while College B may enroll a high percentage of students with weak preparation and many outside demands on their time.

This brings us to the second question we raised above—should the government require schools to prominently show their graduation rates on communications with applicants? Probably not.

As we have seen, higher graduation rates don’t necessarily mean that students are learning much, and lower graduation rates don’t necessarily mean that students aren’t doing the best they can, given their circumstances. If the government were to mandate publication of graduation rates, treating them as a vitally important piece of information, that might scare students and parents away from schools like UH-D and UNC-P, but it is by no means certain that another school with a higher graduation rate would be a better choice.

Furthermore, if the government were to place so much emphasis on graduation rates (“privileging” that information, as academics like to say), college administrators might well reach into their “bag of tricks,” as Professor Robert Weissberg puts it in an essay on graduation rates for athletes, and raise those rates by lowering their standards. We would find more students graduating, but with less educational benefit. That would not be a good trade.

Lastly, keep in mind that colleges don’t graduate their students. Students graduate themselves by doing what is required. It isn’t a mark against a college if a high percentage of the students it admits can’t or don’t earn the sheepskin.