MOOCs After the Pandemic: What the Technology Turned Out to Be Good At
In the autumn of 2011, Sebastian Thrun and Peter Norvig opened their Stanford course on artificial intelligence to anyone with an internet connection. Around 160,000 people signed up. Thrun, a roboticist who had led Google’s self-driving car project, later told Wired that teaching a room of thirty students felt impossible to go back to after teaching tens of thousands. Within a year he had cofounded Udacity. Coursera launched weeks later out of the same Stanford computer science department. edX followed from MIT and Harvard. The New York Times declared 2012 the Year of the MOOC.
The promise was enormous and genuinely stirring. A child in Lagos or Lima could sit in on a Harvard lecture. The cost of knowledge, that most expensive form of scarcity, would fall to near zero. The word revolution appeared in headlines without apology.
Then the numbers came in.
By 2014 and 2015, researchers looking at completion data were finding rates in the low single digits. Five percent was considered respectable. For open-enrollment courses with no barrier to entry, two or three percent was common. Studies from the Harvard-MIT team running HarvardX and MITx found that most enrollees never made it past the first week, and that the learners who did finish were not the underserved global population the rhetoric had promised. They were disproportionately people who already held degrees, already had professional jobs, and already lived in wealthy countries. The utopian story about closing the world’s education gap started to look more like a story about credentialed adults taking free electives in their spare time.
The credentialing question was even harder. A certificate of completion from an online course, even one stamped with a prestigious university logo, did not reliably open doors in hiring. Employers in 2015 did not know what to do with them. Admissions offices did not know what to do with them either. The format had outrun its institutional scaffolding.
So the hype receded, and for a few years the dominant tone around MOOCs was disappointment. Thrun himself gave an infamous interview to Fast Company in 2013 describing Udacity’s early product as a “lousy” one for the students who most needed help. The platforms pivoted, quietly, toward corporate training and professional upskilling. The dream of the world classroom curdled into something more modest: a new distribution channel for continuing education.
And then, in March 2020, the world’s classrooms closed.
The pandemic did not resurrect the original MOOC dream. It did something more interesting. It forced hundreds of millions of people to accept online learning as a normal mode of life, and it gave the platforms a second chance to figure out what they were actually for. Coursera reported enrollment surges that dwarfed any year in its history. edX, which was acquired by 2U in 2021 for roughly $800 million, saw similar jumps. Class Central, the independent meta-tracker run by Dhawal Shah, documented the scale of the shift in its annual reports: more learners, more courses, and a decisive move away from the standalone free lecture.
What replaced it was a stack of paid credentials with increasingly real institutional weight. Coursera’s Specializations bundled four or five related courses into a multi-month path, typically for a monthly subscription. edX’s MicroMasters programs offered graduate-level sequences that could be counted toward a real master’s degree at participating universities. Google, IBM, Meta, and others launched Professional Certificates aimed at specific job categories, priced low enough for individual learners but designed with hiring pipelines in mind. Coursera and the University of Illinois launched a fully online MBA. Georgia Tech’s online master’s in computer science, delivered in partnership with Udacity, enrolled thousands for a fraction of the on-campus cost and, by most accounts, actually worked.
The second-generation MOOC is not a free lecture. It is a structured product with a price, a deadline, and increasingly a paper trail that institutions recognize.
Which raises the practical question: what are these things good for now?
They are good, I think, at three concrete jobs. They can give you a structured path into a new technical field when you genuinely do not know what you do not know: a good Specialization in data analysis or product management walks you through a curriculum that a competent mentor would have designed, which most self-directed learners cannot design for themselves. They can serve as a real credential for a promotion or lateral move within a career you already have, especially when the certificate carries a university name your manager recognizes. And they can offer a genuinely affordable path to a graduate credential from a known institution, for the narrow slice of learners with the discipline to finish one. If you are weighing a MOOC against a coding bootcamp or another alternative credential, the tradeoffs are specific and worth thinking through carefully.
They remain bad, equally concretely, at other things. They do not replace a real degree, not because the content is worse but because a degree is also a social credential, a four-year accountability structure, and a network of people who will know you for the rest of your life. They do not create community. The forum threads are mostly ghost towns. They do not provide the accountability that makes most adults finish hard things. The completion rates tell you what you need to know about that. If you need someone to notice when you stop showing up, a MOOC will not notice.
The accountability gap is where a lot of recent experimentation has gone. Cohort-based courses, study groups on Discord, and paid community layers have all tried to patch it. Some learners are now using an AI tutor as a study partner inside or alongside their MOOC courses, mostly to get unstuck on exercises and to turn passive video-watching into something closer to active problem-solving. This works unevenly but seems to help the people who were already going to finish.
Looking back across the fifteen years since that Stanford AI class, what strikes me is that the technology was actually good at something different from what its founders promised. It did not democratize elite education in any meaningful sense. It did, however, quietly create a new layer of professional learning infrastructure, sitting somewhere between a corporate training department and a graduate school extension program, priced and structured for adults who already have jobs and want to move within them.
That is a smaller claim than the one made in 2012. It is also, unlike the 2012 claim, mostly true.