Assessing Coursera, the LMS

Coursera announced last week that it will be partnering with ten state university systems to “explore MOOC-based learning and collaboration on campus.” The news revealed what many of us who have been working in this field for some time have known since about when non-Canadians started talking about MOOCs: grand proclamations about the inevitable revolution coming to higher education via Silicon Valley are being propelled by a significant amount of hot air. Edupreneurs interested in changing higher education by developing new technological tools and curricula are going to have to do the same thing that a lot of us have been doing already for years: experimenting, failing better, iterating, scaling, dreaming, scratching and clawing within/against existing institutional frameworks, and persisting.

The most troubling aspect of the MOOC hype has been how quickly this approach to teaching and learning with technology has been seen by a variety of constituencies as a tool/excuse for slashing public funding for higher education, for “doing more with less,” and (in the spirit of capital accumulation) for proclaiming that only the elite universities can lead us through contemporary communicative changes. The MOOC hype has somehow squeezed another tier of employment into the system — “course guiders” — who will sit below adjuncts and slightly above peer mentors on the hierarchy of academic labor (and will displace both in some spaces). Proponents and detractors of xMOOCs agree on one thing: the primary goal of all this is “disruption.” The proponents are certain that whatever replaces the status quo in higher education will be better, while the detractors see such thinking as reckless academic planning at best, and mendacious privatization at worst.

The second most troubling aspect of the hype is how poorly informed by the scholarship of teaching and learning so much of what’s happening in the xMOOCosphere has been. In announcing their partnerships, Coursera noted that “studies have shown many benefits to blended learning.” They tried to link the word “benefits” to this study but screwed up the HTML in the post so that the link is dead (it’s still dead nearly a week after first posted). The study they failed to link to is focused primarily on K-12 instruction, and its citation evidences sloppiness and carelessness about this stuff. It suggests either they don’t know what they’re doing or they don’t care what anyone else thinks about what they’re doing. It’s actually kind of edupunk, if you convince yourself to think that way about it.

Much of the same sloppiness is embedded in the design of the Coursera learning platform. As Alex Usher and others have noted, Coursera just announced to the world that their service is essentially an LMS. If you’ve not taken a Coursera course, I encourage you to do so — with a pseudonym, if you like — because it becomes clear that absent the hype and praise from the New York Times op-ed page, Coursera is Blackboard with a hipper stylesheet and a slightly enhanced feature set.

Which is to say, Coursera is pretty meh as a space for teaching and learning. Online courses take place in spaces, and just as the physical environment in which we teach impacts the ways we communicate with our students, virtual environments can be structured to make certain things possible and other things difficult. The design of Coursera as an LMS reinforces traditional notions of the “class” and the classroom, and makes it difficult, if not impossible, to think about and experiment with new structures. Just like Blackboard.

It also makes clear that effective teaching online requires experienced, thoughtful, and engaged teachers. There are a lot of bad models out there, which, when combined with bluster and grand proclamations, have led to some deliciously loud failures this year. If you look at the primary modes of interaction that Coursera affords, you see a platform that, just like other LMSs, places significant barriers before the instructor who wants to do something new, something open, or something connected.

What follows is based on a review of 8-10 Coursera courses that ran in Spring 2013.

The 15-Minute Video Lecture
Students, we’re told, need to have lecture content broken down into digestible chunks, and these chunks must be less than 15 minutes. Universities that partner with Coursera have to bear their own costs for producing the courses, and much of the investment (which tends to be around $50k a course) goes into high definition video content. Coursera will help you integrate a quiz into your video lecture, which in theory isn’t a bad way to make a lecture more interactive. Courses provide transcriptions of the lectures (which are often sloppy), but do not offer audio files, which makes listening to them on the go quite difficult. Coursera tells you when your account has viewed a lecture, but it’s up to university partners to name the lectures, sometimes leading to titles like Rousseau 1, Rousseau 2, Marx 1, and Marx 2. University partners can also annotate lectures, but rarely do (it happens more in the math classes than others). In many cases you wonder where that $50k is going.

I don’t really have a problem with the idea of the chunked lecture, as long as it’s delivered by a seasoned lecturer. If I were ever to teach a course on the American Civil War, I’d draw heavily upon David Blight’s work, and would extract specific segments to combine with other texts. I do, though, have a problem with the fact that within Coursera these bits of content are usually locked into specific courses run during specific time periods on a specific service that requires a specific login. It’s clear that most Coursera courses view video content as the sine qua non of instructional modes. So did the Sunrise Semester.

A recommendation to colleges producing content for Coursera: make sure your video content lands in your institutional repository, and consult the librarians (always consult the librarians!) on ways to make this content discoverable and reusable. And, make it truly open.

Forums
Most student activity is located within the Forum area of a Coursera course, which, again, is planned, organized, and supported by each university partner. The faculty and/or course assistants must establish and name subforums in a way that fits the structure of the class. In any course this is a pedagogical process, where teachers first imagine what conversations they would like to nurture and then adapt the structure of the space to fit the dialogues that are actually emerging. In a fully online course, the stakes of instructional design are heightened. There’s a lot of room for error within this process, and it requires experienced, adaptive teachers.

The forums for each course are sortable by creation date, activity date, level of activity, and threads to which you’re subscribed; you can also see “top forum posters,” who are awarded points and ranked “based on the sum of the square root of all the votes received for each post.” (I don’t know, either). There are many paths to interact with the content, but if you dig down there’s very little sustained dialogue actually happening within the forums. Good ideas are raised and responded to once or twice, and then things tend to peter out. There’s little to no remix and iteration — which are central modes in innovative digital pedagogy — in part because the forums make it difficult to do this type of work. Technical questions are mixed with task-oriented ones; content-based questions often go without response; students get anxious.
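For the curious, that quoted ranking formula can be sketched in a few lines. This is only one possible reading of Coursera’s one-sentence description; the function name and data below are hypothetical, since the actual implementation isn’t public:

```python
import math

def poster_score(votes_per_post):
    """One reading of the quoted formula: sum sqrt(votes) across a
    poster's posts (treating negative vote totals as zero)."""
    return sum(math.sqrt(max(v, 0)) for v in votes_per_post)

# Ten posts with one vote each outscore one post with ten votes:
print(poster_score([1] * 10))  # 10.0
print(poster_score([10]))      # ~3.16
```

If that reading is right, the square root dampens the value of any single highly-voted post, so the ranking rewards breadth of posting over one strong contribution — which may help explain the volume of lightly-engaged posts in the forums.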

Fine. This stuff happens in all classes. But a good teacher anticipates concerns and confusion and corrals them toward productivity. In the Coursera courses I participated in, very little to none of this redirection happened. Posts in forums can be tagged, but rarely are. Useful tagging requires instruction, and instruction on tagging requires a sense of how to structure taxonomies. (Consult the librarians!).

Until recently, Coursera didn’t support permalinks in the forums, which made it very difficult to find your way around. Permalink functionality is now present, which allows email subscriptions to include anchors to specific comments in a thread, and for each member of a course to see a compiled stream of their forum posts from within that course. This is a massive improvement in the platform over what it was just a few months ago. At this moment, however, the activity stream is not extracted to the platform level; it’s only available within a course, and that course must be currently active or archived in order for you to access your posts, which you can only do by clicking into the course. Each course is atomized, just like in Blackboard, and thus there is no space currently on the platform for thinking about or working at the level of curricula.

Forums behind logins that are not permanently available are not open.

Assessments
There are basically two modes of assessment within a Coursera course: quizzes and short peer-evaluated essays. There’s some space here for thoughtful pedagogical work that reinforces certain ideas from course content (quizzes can be useful). The prompts for the essays, however, vary widely: in one class I “took,” the same prompt was used for every reading in the course:

Please write an essay that aims to enrich the reading of a fellow student who is both intelligent and attentive to the readings and to the course. Each essay should be between 270 and 320 words.

The essay should focus on this unit’s reading and the subject may be any literary matter that you studied in that reading: plot, style, theme, structure, imagery, allusion, narrator reliability, and so on. Such matters are discussed in the video clips.

This is a terrible prompt for many reasons: the target essay size is incredibly small and distracting, not even enough space for a blessay. And the focus — “any literary matter that you studied” — is far from what my comp/rhet friends would call an “enabling constraint.” Other prompts I’ve seen are stronger; one asks students to home in on a text’s arguments, exposition, and use of evidence, and then provides a detailed rubric that defines what it means to do that well.

The peer assessments are double-blind, and there’s no quality control or opportunity for continued exchange after the review phase is over. Students often create a forum thread to solicit more dialogue about what they’ve written. This is a microcosm of one of the overall structural problems with Coursera: the inflexibility of the platform locks communication into specific spaces, which poses significant challenges to iteration. What about making rubrics available outside of the assignment, or even outside of the course? What about allowing students to know whose work they’re reading, and who’s reading their work, to force more honest dialogue and accountability?

Openness on the web requires the flexibility to loosely join small pieces, and to directly engage whomever is engaging you. Coursera fails on this count. Not open.

Data Lock
Coursera students have no way to extract their content (other than to copy and paste) or to delete their account, and the only way to delete previously published content is to navigate to it individually and delete comments one by one. Users retain “ownership” over their content, but grant Coursera the right to do whatever it wants to do with it. What kind of ownership is that?

This data lock-in, more than any of the other structures, makes clear the level of concern Coursera has for students who use its platform. Disallowing a user from deleting their account and extracting their data? Not. Open.

MOCs

We’ve established that these are most definitely massive online courses. There’s a specific set of pedagogical benefits that truly open education offers students and the world: it foregrounds connectivity and puts the student at the center of his or her own learning; it prioritizes the generative iteration that is central to the evolution of ideas; it is skeptical of expertise; and it posits that learning is not limited to specific spaces but instead flows across them.

The design of Coursera as an LMS makes those learning goals very difficult to integrate. A good teacher can teach well using any set of tools, and it’s certainly possible to have “good” courses inside of Coursera. But the structures and design of these platforms matter, their settings and capabilities are ideological, and the notion that an institution can simply choose to scale up without experimentation, trial, and error is foolhardy. Spending $50k to do so is an outrageous waste of resources. Maybe Coursera is realizing all this and has determined that there’s more potential profit in changing their mission and competing against Blackboard under some perverted notion of “openness.” They’ll probably have a better chance of making a buck there than if they try to go toe-to-toe with the Canadians.

5 thoughts on “Assessing Coursera, the LMS”

  1. Thanks for the research, Luke! I hope you treated yourself to something nice after putting yourself through the pain.

  2. Fantastic post, Luke, and not just because of the librarian props. Open, indeed! Thanks for the expose, and to echo Trip, you should definitely buy yourself a present after going through so many Coursera courses.

  3. Thanks for sharing this excellent analysis of Coursera as a learning platform. I agree that it’s pretty much “Blackboard with a hipper stylesheet and a slightly enhanced feature set.” The effectiveness of Coursera (or Blackboard) as a platform depends almost entirely on how it is used by the instructor. I’ve seen rich and productive discussions in Coursera courses, thanks to guidance and structure by instructors. And I’ve seen discussion forums that went nowhere, largely because the instructor was absent from those forums. The forum system certainly imposes some constraints (you can’t “follow” another Coursera student, you can’t view discussion threads across courses), but there’s a lot an enterprising instructor can do within those constraints.

    What mystifies me is that Coursera hasn’t developed more platform tools that leverage the massiveness of their courses. Sure, “upvoting” discussion forum posts relies on the wisdom of the crowd, but that’s just about the only crowd-based tool the platform provides. The just-completed course TechniCity from Ohio State had to go outside the Coursera platform to a tool called MindMixer (http://technicity.osu.edu/) for a more robust crowdsourcing platform.

    Again, thanks for this cogent analysis of the platform. However, I’ll have to dispute a couple of statements in your introduction. You wrote, “Proponents and detractors of xMOOCs agree on one thing: the primary goal of all this is ‘disruption.'” That’s a pretty sweeping statement, and I think you’ll find those on both sides who don’t see “disruption” as the primary goal. At Vanderbilt, for instance, we’re viewing our MOOCs as educational outreach, not unlike the work we do locally with high school students or senior adults, not course replacements. Thus, our MOOCs aren’t geared toward “disruption,” but rather creating new opportunities for learners that probably aren’t going to college right now anyway.

    Also, the Department of Education study of blended learning you and Coursera mentioned is not, in fact, “focused primarily on K-12 instruction.” Its audience is the K-12 world, but most of the studies it includes in its meta-analysis took place in higher and adult education settings. In fact, the authors lament the fact that there weren’t more K-12 studies that fit the criteria of their meta-analysis. So the study is a reasonable one for Coursera to cite in this context.

    1. Thanks, Derek. I’m a fan of your work.

Your criticisms are valid, as you point to a couple less-than-careful statements I made. I probably should have qualified the “disruption” point with a “most.” I have some thoughts about MOOCs, VC partnerships, and “outreach” that require more time than I have to lay them out here, but thanks for sharing your example.

      And you’re also correct to point out the issue with the DoE study, though the executive summary notes: “The goal of the study as a whole is to provide policy-makers, administrators and educators with research-based guidance about how to implement online learning for K–12 education and teacher preparation.” My point anyway wasn’t really to say the study is irrelevant, but to note a parallel in how the study was kind of sloppily thrown into the release and what I see as the pedagogically limiting structure of the software itself. What I see in Coursera, for the most part, is a system that eschews many of the more exciting features of contemporary educational technology and instead foregrounds traditional structures. I certainly could have snarked less and made that point clearer.

      Thanks again-
