Back in 2013, student groups from the design school at Stanford University took on an interesting question: what will an undergraduate education at Stanford look like at the turn of the 22nd century?

They came up with four ideas, all of which look very different from the typical undergraduate experience today. Then they wrote pieces and made videos framed as dispatches from the future, explaining what had happened over the intervening 86 years to make college look this way in the year 2100.

This is a thought experiment, and the very idea that college will look different at all a century from now is provocative. The basic format of college — earning a degree in four years — is centuries old, and the classroom lecture dates back to the 12th century. Still, the thought experiment is interesting, in part because it reflects some trends that are already under way.


1) Replace four years of college with six years of education you can use whenever you want

Instead of going through college in one continuous stretch, students Stanford admitted would get six years of education to use over their lifetimes. It wouldn't have to be all at once: they could easily leave campus to take jobs or internships and then return. Students who had finished four years of education could come back later in their careers to teach a class on campus, or to take classes if they wanted to change careers. It’s sort of an all-you-can-eat buffet — eat as much as you want, then go back if you feel you need more. (It’s not clear whether graduation, or degrees at all, still exist in this futuristic universe, which focuses on how a college education can be more useful throughout your lifetime.)

Where this comes from: Stanford is a rare university where the idea of degrees becoming less important seems even slightly plausible. A 2012 New Yorker article — titled "The End of Stanford?" — argued that the university had essentially become a Silicon Valley incubator with a football team, and that education itself wasn’t important to many students. In that context, the idea that students could drop out to work at a startup, then come back a few years later when changing careers, starts to make a little more sense.


2) Replace a pre-major and major phase with a six-year “calibration, activation, elevation” journey

This idea is rife with jargon, but the basic concept is this: instead of a four-year college experience, students would start out with short, introductory courses (less than a month long) in various fields of interest, a phase that could last up to 18 months. Then they would choose one area to focus on in depth, including original research. After that, they’d take internships or conduct further research before leaving the university. In all, the three phases could take up to six years.

Where this comes from: An emphasis on “experiential learning,” or learning outside the classroom. A wide range of research emphasizes the value of internships, independent research, and other hands-on opportunities.


3) Focus on skills, rather than on knowledge

This idea is probably closest to becoming reality. It includes a reorganization of academic departments based on broad “competencies” — scientific analysis, quantitative reasoning, communication effectiveness, and so on. Instead of transcripts with grades in specific classes, students would earn a “skill-print” — a sort of heat map showing what things they could do extraordinarily well and with depth (such as advanced macroeconomic modeling) and what skills they were still developing, all based on the work students did in class.

Where this comes from: The idea of a gorgeous (but hard-to-read) skill-print is a pure design school invention, but there’s a growing consensus that college should focus more on measuring “competencies” — not just what students know, but what they’re able to do. Colleges that aim to help working adults earn a degree are particularly interested, because those students can get credit for skills they’ve learned elsewhere. But liberal arts colleges are interested too, because measuring what students can do is a way to translate an education in something abstract into something that sounds valuable in the workplace.


4) Replace majors with ‘missions’

This is the easiest change to understand — instead of picking a broad area of study to major in, students would pick a problem they wanted to solve and decide how they wanted to keep working on that issue as they started their careers. One mission, for example, could be to end world hunger by studying agriculture and biology. Students would then organize their college classes to support that goal.

Where this comes from: This is another way to focus on the real-world impact of what students are studying, while still keeping to the general format of a four-year college education. As a result, it’s probably the most likely to happen.


Image credit: Francisco Osorio | Flickr
Via Vox