Higher Education and the Law of Diminishing Returns, Part II

Graduate and professional programs, too, should consider time-to-degree reductions.

Recently, I argued in a Martin Center article that the fourth year of study for the bachelor’s degree is probably relatively unproductive and that enormous resources could be saved by introducing three-year degree programs like those found in Europe. What works at Oxford should also work at, say, Appalachian State University.

Yet I told only half the story: Many students go on to get graduate or professional degrees. Does the Law of Diminishing Returns apply to these programs, as well? Yes. Universities offer such programs at least in part to collect tuition revenues and to allow faculty to teach the classes they like, depriving program participants of the opportunity to enter the Real World and make a good living several years earlier.

Let’s start with law schools.

I was once asked by the American Bar Association to speak at a conference on legal education, where a central question was: Why does law school take three years, in addition to four years of undergraduate training?

In some countries, notably Britain, students can go to law school right after high school, dramatically lowering their educational expenses. In some locations, students can study with a local lawyer and, upon passage of the bar exam, begin the practice of law. Why do we need seven years of university study in the U.S.?

Lawyers tell me that the fundamentally important courses are taught in the first year, that some possibly valuable ones are taught in the second year, and that the third year involves taking a bunch of electives—material that is not essential to passing the bar exam and that rarely touches on anything the future lawyer will encounter in his career. Why not allow students to choose a two-year law school, or simply let anyone practice who passes the bar exam, no matter where or how he learned the material?

Moreover, it is quite a conflict of interest to let the members of the American Bar Association serve as law-school accreditors: they determine entry to the profession and use expensive three-year law schools as a means of reducing the supply of lawyers, thus enhancing their own incomes. The extended period of legal study does nothing to make for more competent lawyers. (An aside: The recent attempt by the ABA’s accrediting arm to eliminate the mandatory LSAT for law-school admission is beyond reprehensible.)

Similarly, it’s questionable that one really needs four years of undergraduate training before entering medical school. Aside from the number of years required to formally earn an M.D. or D.O. degree, how much residency or interning experience is needed to develop a level of competency consistent with good health practices? I don’t know, but I suspect that some shortening of the current residency practices could happen with little impact on public health. That is the conclusion of Robert Orr and Anuska Jain in this paper written for the Niskanen Center. They write, “Apart from Canada, the United States is the only wealthy country requiring prospective doctors to earn a separate four-year bachelor’s degree prior to entering medical school. Establishing six-year, single-degree medical education programs … is therefore about as close to a free lunch for the U.S. health care system as you can get.”

Once again, we see needless years spent in formal education, taking us well past the point of diminishing returns. But the area where returns diminish most sharply is graduate education, particularly the Ph.D. degree.

I have a Ph.D. and have been reasonably successful in my field, if research accomplishments, teaching awards, or consulting opportunities are indicators. Yet I received my Ph.D. at 24, after less than three years of graduate training. I studied hard at coursework for two years (including summer school), learning the rudiments of both economic theory and statistics and taking courses in my field of specialized interest, economic history. I took rigorous oral and written exams prior to writing my doctoral dissertation, including exams demonstrating reading competency in French and German. I wasn’t that atypical in the 1960s—many of my friends got their degrees long before turning 30.

Today, however, students hardly ever take less than four years to get their Ph.D. In the humanities, seven- or eight-year spells earning the degree are commonplace. Since many Ph.D.-holders are marginally employed in humanities disciplines, staying in grad school for more years is not that injurious financially. It beats working at McDonald’s.

Tenured faculty members like to use these graduate students to teach the freshman survey courses they find tiresome, freeing themselves to teach their first loves, typically esoteric topics of extremely limited interest. Professors make doctoral students waste months, if not years, researching inconsequential issues. The diminishing-returns problem here is enormous: Time and money are wasted in prodigious quantities.

To be sure, there are wide differences by field. Graduate training in, say, English differs greatly from training in physics or nuclear engineering. However, the average training span for all doctoral students has risen over time. Are we getting a good “bang for the buck”? There is no reason to believe so.

The problems of doctoral programs are probably not found to the same extent in master’s-level training. Those programs tend to be one or two years in length and typically involve additional in-depth instruction in material students first encountered in undergraduate school. That said, I have never seen a detailed cost-benefit analysis of master’s-degree programs in standard academic disciplines, nor for such historically popular degrees as the M.B.A. A full analysis would no doubt show that most master’s programs also entail more time than is needed for competence.

I believe that more use should be made of comprehensive, rigorous examinations to measure competency—tests like the bar exam, the examinations required to become a Certified Public Accountant (CPA), and physician licensing exams. If a person can pass examinations like these, why do we care how or where he or she acquired the necessary knowledge? This reform would put the individual in charge of his or her training, rather than educational institutions that want to maximize revenues or professional groups that want to minimize competition.

Similarly, at the undergraduate level, I have often wondered why we don’t have a National College Equivalence Examination (NCEE) that measures the general (and some specialized) knowledge expected of a “college-educated person.” We should be interested in outcomes (knowledge acquired) rather than inputs (the hours of seat time a degree is deemed to require).

We already offer the GED, a high-school equivalency credential. Why not do the same thing at the college level? A student scoring very high on the NCEE could brag, “I scored above the average Harvard graduate on the test!” Perhaps our methods of credentialing would then change in ways that reduce the diminishing returns that pervade the college landscape today.

Richard K. Vedder is a distinguished professor of economics emeritus at Ohio University, a senior fellow at the Independent Institute, and a board member of the National Association of Scholars.