Will This New College Ranking System Upend the Status Quo?

Higher education reformers have long argued that colleges’ admissions selectivity reveals little about their educational quality and how well they prepare students for the workforce and civic life. The U.S. News & World Report rankings, which have dominated the field for more than 30 years, are based heavily on schools’ acceptance rates and students’ standardized test scores; in this rankings paradigm, it’s entirely possible for universities with robust curricula or high completion rates to remain unranked or otherwise receive limited mainstream recognition.

Fortunately, the conventional thinking about college rankings appears to be changing. In late September, the Wall Street Journal (WSJ) and Times Higher Education (THE) published a new ranking system which, while flawed in some ways, represents an improvement over the U.S. News framework. It shifts focus to students’ post-graduation success, something which could help to provide more useful information to prospective students and their families.

The new rankings weigh factors in four broad categories: student outcomes, school resources, student engagement, and campus environment. Student outcomes—which in this system include graduation rates, “value added” to graduates’ salaries, and students’ ability to repay tuition loan debt—account for 40 percent of a school’s total score.

In an era in which only 56 percent of full-time students complete four-year bachelor’s degrees within six years and student loan default rates remain a national concern, it’s vital that students and their parents have an accurate view of the likelihood of success at a particular college. The new rankings should help along those lines.

Missing from the student outcomes category, unfortunately, are student learning outcomes. Currently, there is no widely accepted learning assessment in place, which perhaps isn’t surprising given that the assessments and surveys that have been experimented with have revealed that many students are graduating unprepared for the white-collar workforce, ignorant about basic science, and woefully uninformed about the elementary mechanics of American government. Universities may not be very eager to share such results with the public.

The remaining 60 percent of a school’s WSJ/THE score is divided into three categories: resources, engagement, and environment. The resources category includes factors such as faculty-to-student ratio, dollars spent per student, and faculty research productivity; the engagement score is based on a survey intended to gauge students’ perceptions of faculty involvement, how well the school improves critical thinking skills, and whether students would recommend the school to their peers; and the environment category measures the “diversity” of the student body.
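The weighted-sum structure described above can be sketched in a few lines. Note that only the 40 percent weight for student outcomes is stated here; the exact split of the remaining 60 percent among resources, engagement, and environment is an assumption made purely for illustration.

```python
# Illustrative composite score in the style of the WSJ/THE rankings.
# Only the 40% student-outcomes weight is stated in the text; the
# 30/20/10 split for the other categories is assumed for this sketch.
WEIGHTS = {
    "outcomes": 0.40,     # stated: 40 percent of the total score
    "resources": 0.30,    # assumed
    "engagement": 0.20,   # assumed
    "environment": 0.10,  # assumed
}

def composite_score(category_scores: dict) -> float:
    """Weighted sum of per-category scores, each on a 0-100 scale."""
    return sum(WEIGHTS[c] * s for c, s in category_scores.items())

# A hypothetical school that is strong on outcomes but weak on environment:
example = {"outcomes": 90, "resources": 70, "engagement": 80, "environment": 50}
print(composite_score(example))  # 0.4*90 + 0.3*70 + 0.2*80 + 0.1*50 = 78.0
```

Because outcomes carry the largest weight, two schools with identical resources and engagement scores can land far apart overall if their graduation and loan-repayment numbers differ.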

Some of these metrics—student-to-faculty ratio and instructional spending, for instance—are more welcome additions than others, such as student diversity and faculty research productivity. Those metrics are, at best, tangentially related to student success and educational quality. Paul McNulty, president of Grove City College, has identified another shortfall. He criticized the new system’s use of College Scorecard, an online tool that allows prospective students to compare federally funded schools, in determining the student outcomes and environment scores. According to McNulty, “the WSJ/THE college rankings should not be based on inaccurate and incomplete data sources like the College Scorecard” because they exclude schools such as Grove City, which don’t receive federal funding.

Others have faulted the new system’s weighting of instructional spending. A spokesperson from Northeastern University said that this metric is “promoting inefficiency at a time when students and their families are looking for higher education to provide maximum value for the money.” That seems to be a fair criticism, especially since the new rankings don’t account for new online education delivery methods, which sometimes require much less funding per student.

Those caveats aside, let’s delve into the new rankings.

In the WSJ/THE system, private universities tend to rank much higher than public institutions. That’s partly because student outcomes and availability of resources make up a combined 70 percent of a university’s total score; public universities, especially those with low graduation rates and high student loan default rates, are at a disadvantage. And although private universities with large endowments score high overall, many smaller universities rank high in the engagement and environment categories. Also, research universities tend to score poorly in the engagement category, due to low faculty-student interaction (in this category, Christian-affiliated schools occupy the top five positions).

Among North Carolina universities, Duke University ranks seventh (for Duke and other traditionally “Top 20” universities, there isn’t much variance between the WSJ/THE rankings and those of U.S. News; there are, however, significant changes in lower-ranked schools), UNC-Chapel Hill ranks 30th, and Wake Forest 54th. Only one other school, Davidson College, cracked the top 100. Nine other universities landed in the top 500. The table below details the overall score (out of 100) and the score each university received in the four categories.

[Table: university rankings]

Some NC schools ranked low overall received higher marks for both engagement and environment. North Carolina A&T ranked highest for student engagement, although it received an overall rank between 501 and 600. In the environment category, Appalachian State and High Point received the highest marks, although both were ultimately ranked between 601 and 800.

Several notable universities rank significantly lower in the WSJ/THE rankings than in those of U.S. News. UNC-Charlotte, for example, fell 285 places in the new ranking, and East Carolina University landed 226 places behind its U.S. News rank. Wake Forest University, UNC-Greensboro, and North Carolina State University also placed significantly lower in the WSJ/THE system.

Whether these new rankings will impact students’ and parents’ college decisions, however, remains to be seen. The Wall Street Journal and Times Higher Education are not the first media organizations to attempt to break into the rankings field. Forbes, for instance, began publishing its annual rankings in 2008. At the time, the Pope Center’s then-president, Jane S. Shaw, called the Forbes list a “breath of fresh air,” saying that its formula, which also prioritizes student outcomes, was moving in the right direction. Still, after eight years, the Forbes rankings haven’t gained enough momentum to overtake those of U.S. News.

The U.S. News rankings’ popularity cannot be denied. A recent study found that for every ten positions a university falls, applications decline ten percent. The pressure to do well is so great that several universities have falsified data. And many recruiters from the nation’s top corporations use the U.S. News rankings as a factor in their decision to visit a campus.

But there are reasons to suspect that students, parents, and employers, the “consumers” of college rankings, may finally be open to a new alternative, such as the one now provided by the Wall Street Journal and Times Higher Education.

Higher education has changed dramatically, with student loan debt, low graduation rates, and underemployment issues frequently grabbing headlines. In this environment, ranking systems that rely on formulas that essentially equate a university’s rejection rate with its educational quality may be viewed with greater skepticism, or at least valued less, by students and parents. And employers, who routinely complain about recent graduates’ workforce unpreparedness and lack of “soft skills,” also may be in the market for an alternative ranking system.

While there is room for improvement in the Wall Street Journal/Times Higher Education rankings, the renewed focus on student outcomes is an important step toward holding universities accountable for things that truly matter. Ultimately, though, these rankings’ staying power, or lack thereof, will be determined by students, parents, and employers.