Rethinking College Rankings

This past summer brought good news about college rankings: the concept of a single hierarchy is breaking down. U.S. News & World Report’s annual rankings have long dominated the genre, but its listings are splintering into multiple mini-lists, and the magazine is even getting a competitor.

When U.S. News issued its 2008 rankings last month, the “buzz” was about the number of college presidents who are no longer willing to assess other schools’ academic reputations. Only 51 percent of schools surveyed returned the questionnaire this year, according to InsideHigherEd.com; a few years ago, the participation rate was 67 percent. This drop suggests that presidents are uncomfortable with a single list that is heavily based on schools’ reputation among their “peers.”

Furthermore, the magazine itself took two steps this year that undermine its official rankings as “the” guide to quality. First, it created a new, supplementary list of “up-and-coming” schools. Second, it issued a separate list of schools ranked by high school counselors.

By adding an up-and-coming list, U.S. News implicitly acknowledges that the big list is getting stale. The editors admitted (in the print edition) that the rankings “tend not to vary much from one year to the next.” So the magazine asked administrators to identify colleges that have made “striking improvements or innovations.”

Another reason for adding an up-and-coming list might be the fact that top-ranked schools are increasingly out of reach for all but the most accomplished applicants. For instance, Harvard and Yale accept 10 percent or fewer of their applicants. Thus the excitement of finding out who is at the top of the list may be fading. An up-and-coming list gives glamour to those schools in the middle of the pack.

The second change at U.S. News is perhaps more fundamental. The magazine surveyed guidance counselors at academically oriented public high schools around the country for their views of the best colleges and universities. The survey appears to be a response to the slide in college presidents’ evaluations. While college administrators are less inclined to rank schools on reputation, many high school guidance counselors are apparently willing to have their say. Judging by the new rankings, their opinions don’t seem dramatically different from those of the college presidents.

To be fair, for some years now, the magazine has been supplementing its big list with others, such as the “A+ Schools for B Students.” It also splits its list into fine-grained categories, based on mission and geography, so that many universities (even those of uncertain quality) can claim that they have landed on one list or another. For instance, Appalachian State is one of the top 10 master’s degree universities in the South. Biola University, an evangelical Christian school, is proud simply to be on the magazine’s list of “national universities,” which includes 262 institutions.

One other important thing happened that could reduce the impact of U.S. News’s listings: Forbes magazine issued its own college ranking system. Designed by Ohio University economist Richard Vedder, the list is supposedly based on outcome-based measurements rather than mere reputation or inputs. Vedder’s measurements are debatable (particularly the use of Who’s Who as his sample of successful graduates), but in my view he’s moving in the right direction. (A recent article by the Pope Center’s George Leef about the Forbes rankings belittles the idea of rankings entirely.)

Although I’m content to see the U.S. News balloon burst, please note that I’m not negative about the magazine’s goals. It is trying to inform parents and prospective students about their options—and people do love lists. (Perhaps that is the real value of such rankings: entertainment.)

The problem with the U.S. News ranking is twofold. First, for prospective students, ranking colleges is not really what’s important; what matters are specific characteristics. That’s why I’ve come to prefer the Princeton Review, which ranks schools on many dimensions, from their level of partying to their level of academic seriousness, based on student surveys.

The other problem is that U.S. News is using the wrong measures. Twenty-five percent of the ranking comes from what other college and university administrators think about a school, that is, reputation. Another 15 percent is based on the quality of the students a school admits, using SAT scores, high school class standing, and the percentage of applicants rejected. This means that as long as students keep coming to Harvard and Yale (based on reputation), those schools are likely to stay at the top, because the students themselves are part of the ranking. Other factors in the U.S. News ranking are mostly inputs, such as the amount of money spent on faculty (factors that tend to favor expensive schools).
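To make that weighting concrete, here is a minimal sketch of how a composite score of this kind is computed. The 25 percent and 15 percent weights are the ones reported above; the remaining 60 percent share for “input” measures, and the sample sub-scores, are purely illustrative assumptions, not U.S. News’s actual formula.

```python
# Illustrative sketch of a weighted composite ranking score.
# Weights 0.25 (peer reputation) and 0.15 (student selectivity) are
# from the article; the 0.60 for "input" measures is just the leftover
# share, and all sub-scores below are hypothetical.
def composite_score(reputation: float, selectivity: float, inputs: float) -> float:
    """Sub-scores normalized to 0-100; returns a 0-100 composite."""
    return 0.25 * reputation + 0.15 * selectivity + 0.60 * inputs

# The circularity the article points to: a strong reputation attracts
# strong applicants, which raises the selectivity sub-score, which in
# turn props up the composite.
print(composite_score(reputation=98, selectivity=95, inputs=90))  # 92.75
```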

I’m not so sure about U.S. News’s “up-and-coming” list, either. Those schools are listed on the basis of reputation, too. And which ones hit the top this year? The top national university was George Mason University, which zoomed to public fame because of its surprise success in the 2006 NCAA basketball tournament. The highest-ranked liberal arts college was Davidson College, which echoed George Mason’s basketball success in 2008 by reaching the Elite Eight. Is that why they are up-and-coming? I wonder.

The changes mentioned here are among many that make this an exciting time to be writing about higher education. Old assumptions—the idea that there is a single ranking for 4,000 colleges, for example—are breaking down. I’m hopeful that newer assumptions will get us closer to the goal of helping each student find the right school.