A criticism sometimes leveled at massive open online courses (MOOCs) is their user drop-off rate — many are called to the courses, but few are chosen to finish. This is a natural response to some of the more bombastic sign-up numbers the course providers are instinctively inclined to tout. Six-figure numbers are more headline-grabbing than five-figure numbers, even when the latter is an impressive completion count. Plenty of people should still be encouraged to put a toe in the water when there is no barrier to entry, even if not everyone is going to swim the English Channel.
There is some official data on starts versus completions in early MOOCs; the following is from the New York Times last year:
Besides the Artificial Intelligence course [160,000 registered, 23,000 completed], Stanford offered two other MOOCs last semester — Machine Learning (104,000 registered, and 13,000 completed the course), and Introduction to Databases (92,000 registered, 7,000 completed).
But if these numbers reveal a problem between start and finish, where along the race does it occur? That is a more interesting number, and the capacity to find it is at the heart of future MOOC success.
We already have a publicly referenceable proxy: the YouTubed classes that were the progenitors of the MOOC revolution. Watching a video doesn’t require the same effort as completing a problem set, but viewership probably aligns closely with engagement. Among the early online courses is the famed MIT Introduction to Computer Science course, first “televised” in 2008.
This graph is presented in logarithmic scale, and demonstrates that after the first fifth of the course, the retention rate is rather high. Of course, we don’t know how frequently a particular student watches each video — presumably more difficult material is viewed repeatedly and easier material less often. That would compress the “drop-off rate” further.
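With per-lecture view counts in hand, the retention behind a graph like this reduces to a few ratios. A minimal sketch — the view counts below are hypothetical placeholders, not the actual MIT figures; real numbers would come from YouTube’s public view counters:

```python
# Hypothetical per-lecture view counts for a 8-lecture course,
# shaped like the pattern described above: a big first-lecture spike,
# then a long, flat tail.
views = [310_000, 120_000, 95_000, 88_000, 84_000, 82_000, 80_000, 79_000]

# Retention relative to the previous lecture: what fraction of
# lecture n's viewers went on to lecture n+1.
step_retention = [round(b / a, 2) for a, b in zip(views, views[1:])]

# Retention relative to the first lecture: the overall "funnel".
overall_retention = [round(v / views[0], 2) for v in views]

print(step_retention)     # steep first step, then ~0.93+ per lecture
print(overall_retention)
```

On numbers shaped like these, the step-to-step retention after the first drop sits above 90%, which is the “after the first fifth, retention is rather high” pattern in a nutshell.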
The same course was taught and videotaped in 2011 — same professors, mostly the same exercises. Video recitations were also added to YouTube (not included in this graph to keep it apples-to-apples). The overlap between the later course’s material and the recitations probably helps account for a slightly steeper drop than in the 2008 course. Some students probably also selected videos from the pre-existing 2008 course instead.
No grand revelation, but another demonstration — in this case at the halfway point — that people sufficiently committed to a course tend to finish it, and that commitment didn’t drop off in the second half of the course. This drop-off rate is a touch steeper; for one thing, the latent demand for such a course from determined students was probably partially filled by the 2008 classes.
Similar MIT courses also show surprisingly low attrition. MIT 6.042J Mathematics for Computer Science, Fall 2010 was also recorded and has now been live long enough to make some sense of its drop-off data.
These are hard courses, yet they show high completion rates that approximate the known results of MOOCs to date.
Humanities courses have a lower drop-off rate after the introductory videos, which themselves get the lion’s share of the views. Here it’s not as clear that a “drop-off rate” is even a metric that can be generated — a viewer may want to watch a lecture on a particular topic instead of the full buffet, and understanding the material is less dependent on the previous videos. Yale’s Robert Shiller’s Intro to Financial Markets course comes out guns blazing with 183,000 views, and lecture 2 has 109,000. The remaining topics — which can be viewed and understood individually — range from lecture 7 on behavioral finance (81,000 views) to the surprisingly seldom-viewed part II of former Treasury Secretary Lawrence Summers’ Learning From the Financial Crisis (only 9,000). Most others draw view counts from the mid-teen thousands to the mid-forty thousands.
Online businesses of all kinds depend on the “customer funnel” in both the acquisition and retention of customers. The ability of an online course, over several iterations, to precisely measure retention from class to class — especially in sequential, cumulative material — will be invaluable. Once the pain points are scientifically identified (what is it, really, about organic chemistry that impedes further progress?), the A/B testing of which lessons best build understanding can begin. STEM courses are never going to be a piece of cake, but knowing where they’re cod liver oil is the first step toward the wider adoption MOOCs will soon achieve.
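As a sketch of what that pain-point identification might look like — all lesson names and completion counts below are made up for illustration — the transition with the worst class-to-class retention is the natural first target for an A/B test of alternative lesson material:

```python
# Hypothetical per-lesson completion counts for a sequential course.
lesson_completions = {
    "intro": 10_000,
    "stereochemistry": 8_900,
    "nomenclature": 8_400,
    "reaction mechanisms": 4_100,  # the made-up pain point
    "synthesis": 3_800,
}

lessons = list(lesson_completions)
counts = list(lesson_completions.values())

# Retention across each transition: fraction of lesson n's finishers
# who also finished lesson n+1.
drops = {
    (lessons[i], lessons[i + 1]): counts[i + 1] / counts[i]
    for i in range(len(counts) - 1)
}

# The weakest transition is where an A/B test would be targeted first.
pain_point = min(drops, key=drops.get)
print(pain_point, round(drops[pain_point], 2))
```

The same handful of ratios, computed across several iterations of a course, is exactly the class-to-class retention measurement the paragraph above describes.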