Interest in using the internet to slash the price of higher education is being driven in part by hope for new methods of teaching, but also by frustration with the existing system. The biggest threat those of us working in colleges and universities face isn’t video lectures or online tests. It’s the fact that we live in institutions perfectly adapted to an environment that no longer exists.
In the first half of the 20th century, higher education was a luxury and a rarity in the U.S. Only 5% or so of adults, overwhelmingly drawn from well-off families, had attended college. That changed with the end of WWII. Waves of discharged soldiers subsidized by the GI Bill, joined by the children of the expanding middle class, wanted or needed a college degree. From 1945 to 1975, the number of undergraduates increased five-fold, and graduate students nine-fold. PhDs graduating one year got jobs teaching the ever-larger cohort of freshmen arriving the next.
This growth was enthusiastically subsidized. Between 1960 and 1975, states more than doubled their rate of appropriations for higher education, from four dollars per thousand dollars of state revenue to ten. Post-secondary education extended its previous mission—liberal arts education for elites—to include both more basic research from faculty and more job-specific training for students. Federal research grants quadrupled; at the same time, a Bachelor’s degree became an entry-level certificate for an increasing number of jobs.
This expansion created tensions among the goals of open-ended exploration, training for the workplace, and research, but these tensions were masked by new income. Decades of rising revenue meant we could simultaneously become the research arm of government and industry, the training ground for a rapidly professionalizing workforce, and the preservers of the liberal arts tradition. Even better, we could do all of this while increasing faculty ranks and reducing the time senior professors spent in the classroom. This was the Golden Age of American academia.
As long as the income was incoming, we were happy to trade funding our institutions with our money (tuition and endowment) for funding them with other people’s money (loans and grants). And so long as college remained a source of cheap and effective job credentials, our new sources of support—students with loans, governments with research agendas—were happy to let us regard ourselves as priests instead of service workers.
Then the 1970s happened. The Vietnam war ended, removing “not getting shot at” as a reason to enroll. The draft ended too, reducing the ranks of future GIs, while the GI Bill was altered to shift new costs onto former soldiers. During the oil shock and subsequent recession, demand for education shrank for the first time since 1945, and states began persistently reducing the proportion of tax dollars going to higher education, eventually cutting the previous increase in half. Rising costs and falling subsidies have driven average tuition up over 1000% since the 1970s.
Golden Age economics ended. Golden Age assumptions did not. For 30 wonderful years, we had been unusually flush, and we got used to it, re-designing our institutions to assume unending increases in subsidized demand. This did not happen. The year it started not happening was 1975. Every year since, we tweaked our finances, hiking tuition a bit, taking in a few more students, making large lectures a little larger, hiring a few more adjuncts.
Each of these changes looked small and reversible at the time. Over the decades, though, we’ve behaved like an embezzler who starts by taking only what he means to replace, but ends up extracting so much that embezzlement becomes the system. There is no longer enough income to support a full-time faculty and provide students a reasonably priced education of acceptable quality at most colleges or universities in this country.
Our current difficulties are not the result of current problems. They are the bill coming due for 40 years of trying to preserve a set of practices that have outlived the economics that made them possible.
The students enrolled in places like CCSF (or Houston Community College, or Miami Dade) are sometimes called non-traditional, but this label is itself a holdover from another era, when residential colleges for teenage learners were still the norm. After the massive expansion of higher education into job training, the promising 18-year-old who goes straight to a residential college is now the odd one out.
Of the twenty million or so students in the US, only about one in ten lives on a campus. The remaining eighteen million—the ones who don’t have the grades for Swarthmore, or tens of thousands of dollars in free cash flow, or four years free of adult responsibility—are relying on education after high school not as a voyage of self-discovery but as a way to acquire training and a certificate of hireability.
Though the landscape of higher education in the U.S., spread across forty-six hundred institutions, hosts considerable variation, a few commonalities emerge: the bulk of students today are in their mid-20s or older, enrolled at a community or commuter school, and working towards a degree they will take too long to complete. One in three won’t complete, ever. Of the rest, two in three will leave in debt. The median member of this new student majority is just keeping her head above water financially. The bottom quintile is drowning.
One obvious way to improve life for the new student majority is to raise the quality of the education without raising the price. This is clearly the ideal, whose principal obstacle is not conceptual but practical: no one knows how. The value of our core product—the Bachelor’s degree—has fallen in every year since 2000, while tuition continues to increase faster than inflation.
The other way to help these students would be to dramatically reduce the price or time required to get an education of acceptable quality (and for “acceptable” read “enabling the student to get a better job”, their commonest goal). This is a worse option in every respect except one, which is that it may be possible.
All that was minor, though, compared to our willingness to rely on contingent hires, including our own graduate students, who make ideal cheap labor. The proportion of part-time and non-tenure-track teachers went from less than half of total faculty, before 1975, to over two-thirds now. In the same period, the proportion of jobs that might someday lead to tenure collapsed, from one in five to one in ten. The result is the bifurcation we have today: People who have tenure can’t lose it. People who don’t mostly can’t get it. The faculty has stopped being a guild, divided into junior and senior members, and become a caste system, divided into haves and have-nots.
Caste systems are notoriously hard to change. Though tenured professors often imagine we could somehow pay our non-tenured colleagues more, charge our students less, and keep our own salaries and benefits the same, the economics of our institutions remain as they have always been: our major expense is compensation (much of it for healthcare and pensions) distributed unequally between tenured and contingent faculty, and our major income is tuition.
I recently saw this pattern in my home institution. Last fall, NYU’s chapter of the American Association of University Professors proposed reducing senior administrative salaries by 25%, alongside a “steady conversion” of non-tenure-track jobs to tenure-track ones “at every NYU location”. The former move would save us about $5 million a year. The latter would cost us $250 million.
Now NYU is relatively well off, but we do not have a spare quarter of a billion dollars per annum, not even for a good cause, not even if we sold the mineral rights under Greenwich Village. As at most institutions, even savage cuts in administrative compensation would not allow for hiring contingent faculty full time while also preserving tenured faculty’s benefits. (After these two proposals, the AAUP also advocated reducing “the student debt burden by expanding needs-based financial aid”. No new sources of revenue were suggested.)
The number of high-school graduates underserved or unserved by higher education today dwarfs the number of people for whom that system works well. The reason to bet on the spread of large-scale low-cost education isn’t the increased supply of new technologies. It’s the massive demand for education, which our existing institutions are increasingly unable to handle. That demand will go somewhere.
Those of us in the traditional academy could have a hand in shaping that future, but doing so will require us to relax our obsessive focus on elite students, institutions, and faculty. It will require us to stop regarding ourselves as irreplaceable occupiers of sacred roles, and start regarding ourselves as people who do several jobs society needs done, only one of which is creating new knowledge.
It will also require us to abandon any hope of restoring the Golden Age. It was a nice time, but it wasn’t stable, and it didn’t last, and it’s not coming back. It’s been gone ten years more than it lasted, in fact, and in the time since it ended, we’ve done more damage to our institutions, and our students, and our junior colleagues, by trying to preserve it than we would have by trying to adapt. Arguing that we need to keep the current system going just long enough to get the subsidy the world owes us is really just a way of preserving an arrangement that works well for elites—tenured professors, rich students, endowed institutions—but increasingly badly for everyone else.
Via Clay Shirky