Taking a look at massive open online courses (MOOCs): My experience with edX

On several occasions over the past few years, I’ve put my prior experience as an English teacher and tutor to work, developing basic online course content for clients using learning management tools such as Moodle and LearnDash. This teaching and course development experience, paired with two years of taking online courses through a community college, has helped shape my opinions about online learning. But what of these massive open online courses (MOOCs) that have been gaining in popularity? How do they compare to taking a formal online course through a university or community college? What platforms do they use, and how is the material presented?

I decided to give edX a try for my first MOOC experience.

edX is an initiative spearheaded by Harvard University and the Massachusetts Institute of Technology that offers “high-quality courses from the world’s best universities and institutions to learners everywhere.” Those courses follow the MOOC paradigm, which aims to provide near-unlimited access to interactive, open educational content to anyone with internet access. edX touts itself as “the only leading MOOC provider that is both nonprofit and open source,” running its courses on its open-source Open edX platform. Users can take courses for free or pay a fee to receive a certificate of completion for an individual course or a series of courses.

Setting up a profile on edX was a simple process of creating an account and adding a few additional details. I had to take a few extra steps in account creation because I opted to purchase certificates for the two courses I took. This required verifying my identity using my webcam to photograph myself and a valid photo ID. I used a state-issued driver’s license, though I’m guessing many other photo IDs would be sufficient. (They don’t seem to place special requirements on the ID.) It took less than a day (less than an hour, if I remember correctly) for the verification process to be complete, and it’s valid for one year. Then I chose my courses.

I decided to go with the first two courses in a data science series taught by Rafael Irizarry, Professor of Biostatistics and Computational Biology at the Dana-Farber Cancer Institute and Professor of Biostatistics at the Harvard Chan School of Public Health. I chose those two courses, which cover the basics of the R programming language and data visualization, because R and data visualization are highly applicable to the work of many laboratorians and researchers. And should I decide to pursue the full professional certificate in data science, I’ll already have two of the nine courses completed.

Next came the hard part: understanding and passing the courses. While I had taken an Introduction to Statistical Reasoning course online previously, my experience with R and data analysis was otherwise practically zero. I’ve worked casually with JavaScript and PHP in the past, so understanding the concepts of syntax wasn’t a major roadblock. Rather, the process of linking the tools available in R to data visualization and analysis concepts that I was largely learning from scratch proved the most difficult.

Several components made up these two edX courses: video lectures, with transcripts; sub-section assessments, using DataCamp; discussion forums; supporting course documents, including a syllabus and installation guides; and a notes tool. Lecture material was almost exclusively presented as recorded videos. Fortunately, all but one or two of the lecture videos were accompanied by transcripts linked to their time of appearance in the video.

Users can copy and paste text from the time-linked transcript or download it as a .txt file. I noted, however, that the transcripts were at times rather subpar. After completing the courses, I found myself wondering how edX actually goes about the transcription process and whether they would be willing to pay a fair sum for transcription or, at a minimum, editing work. I found many instances where the transcript needed editing attention when compared against the video. Regardless, the transcripts were still useful and not so error-ridden that they detracted too much from the learning process.

After a series of videos in a sub-section, an assessment of that lecture material was presented. For these courses, edX has partnered with DataCamp, a provider of interactive R and Python courses and online learning tools. edX uses DataCamp’s browser-based R programming console to present graded activities covering the preceding material. Users get multiple tries at providing the expected answer, and they can even choose to take a hint for a minor deduction in the point value of the question. The hints provided with errors, like the hint tool itself, at times seemed to be based on what you had entered into the R console, which was much better than receiving a generic hint. This context-based hinting was valuable, particularly when the course developer had worded a question poorly.
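
To give a flavor of these graded exercises, here’s a minimal sketch of the sort of task the console presented. The dslabs murders dataset is my illustrative choice, and the wording is mine, not a quote from the course:

```r
# Sketch of a typical graded task: derive a new column, then answer a
# question about it in the R console.
library(dslabs)
data(murders)  # US gun murders by state

# Task: add a murder rate per 100,000 people...
murders$rate <- murders$total / murders$population * 10^5

# ...then report the state with the highest rate.
murders$state[which.max(murders$rate)]
```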

These assessments also had an associated discussion forum, which proved not only invaluable but at times necessary for deciphering the more cryptic assessment questions. (More on that later.) The individual forums associated with each assessment were also grouped together into a single course-wide discussion portal, but this was one of my least-favorite aspects of the platform.

Perhaps I’d have gotten used to it in time, but I think I’m much fonder of a more traditional forum format (such as those provided by phpBB and PunBB) that lets you dig down and see content incrementally. I suppose I can see the benefit of having search tools and search results appear all in one place, but the cramped, narrow column on the left is off-putting. Additionally, I found that I had to be especially careful when browsing the discussion forum associated with a specific assessment.

While I appreciated this assessment-specific view much more than the course-wide discussion view, I found that pressing the back button on my mouse while browsing the forum would take me back to the beginning of the sub-section, which was at times supremely frustrating.

The courses also came with a syllabus and course advice, which were expected and useful. They also apparently had a tool, which I didn’t learn about until writing this post, for making notes on and marking passages in lessons. I’m under the impression this doesn’t work for video transcripts, though you can bookmark videos within the platform. Perhaps for many users having the equivalent of a built-in course notepad is extremely useful, but I found myself exclusively taking my own notes outside the browser, in Word or Notepad++ (the latter being friendlier for code).

Then, of course, we have the instructor and teaching assistants, who are the major variable in a MOOC. The course platform can be standardized to support a number of pedagogies, but in the end the overall success of a MOOC rests heavily on the instructor: their teaching material, how it is presented, and how much attention went into crafting it. That careful attention to course development is vital in online learning, particularly with video-driven MOOCs. With static course text in an HTML or text document, it’s relatively painless to correct or modify the content and upload the updated version. That correction process is several degrees more complicated with video content, requiring more time, resources, and patience.

For example, there were at least four or five points in Professor Irizarry’s lecture videos where he either glossed over or didn’t even discuss a concept, yet the concept appeared in an assessment. In one case in particular, the professor briefly mentioned an R function, expected the student to put it to use in the subsequent assessment, and only after that assessment discussed the concept in full. I can appreciate a teaching method that doesn’t hand-hold students through every aspect of a course and instead encourages them to search for answers on their own during an assessment, but this still sits uneasily with me; in my view, the degree to which a professor introduces essentially new concepts without any prior explanation should be limited. In retrospect, I realize what Irizarry was trying to accomplish: push students toward self-discovery of both previously introduced and new concepts. Yet the handful of more severe issues caused by briefly glossing over a topic, including it in an assessment, and only later elaborating on it could have been mitigated by simply stating, for example, “this concept is explained in a later lesson and isn’t important to understand right now.”

My point: careful script writing and course content development are required. The script for a video lecture must be scrutinized, taking into account all the freshly introduced concepts and identifying points where students may have difficulty understanding or are likely to ask significant questions. Additionally, the script and course content must be checked for consistency. Several times the professor used shorthand R code, such as “col” for “column,” in a lecture, probably unaware that the DataCamp-supported assessments don’t accept shorthand. This inconsistency certainly caused more than a few students to scratch their heads. These and other types of misunderstandings and frustrations can be somewhat mitigated with careful video and course material planning. Without that planning, someone has to go back and update the video content, and in some cases that may never happen because the gains aren’t viewed as worth the effort.
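
My guess, and it is only a guess, is that such shorthand ran fine in the lectures because of R’s partial matching, which a strict autograder then rejects. The examples below are mine, not the course’s:

```r
# R partially matches argument names and data frame column names, so
# abbreviated code works fine interactively...
seq(0, 10, len = 5)                         # "len" matches length.out
df <- data.frame(population = c(100, 200))
df$pop                                      # "$" matches the population column

# ...but a grader checking for exact code expects the full names:
seq(0, 10, length.out = 5)
df$population
```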

Finally, I’d like to add that, from this experience (and past experience with online courses), presenting content and providing even rigorous assessments of it is no clear indicator of how well the student has learned or retained that content. I was able to complete the assessments with a score of 90 percent or better without any “studying” or significant practice, just careful note-taking. Even a week later, could I explain to you in detail how to code in R a faceted series of scatterplots with log scales and special filters? Only with serious review, and even then, have I really learned the material? Like any knowledge acquisition, remembering and using it effectively requires practice, practice, practice. A MOOC completion certificate means little without further application of what was presented and assessed.
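
For the curious, here’s roughly what such a plot involves, as a minimal sketch using ggplot2 and the gapminder example dataset (both my choices for illustration; the course worked with its own data):

```r
# Faceted scatterplots with a log scale and filtered data (sketch only)
library(ggplot2)
library(gapminder)  # country-level development indicators

# "Special filters": keep two years and drop a sparse continent
plot_data <- subset(gapminder, year %in% c(1962, 2002) & continent != "Oceania")

ggplot(plot_data, aes(x = gdpPercap, y = lifeExp)) +
  geom_point(alpha = 0.6) +
  scale_x_log10() +               # log scale: income spans orders of magnitude
  facet_grid(continent ~ year) +  # one panel per continent/year combination
  labs(x = "GDP per capita (log scale)", y = "Life expectancy (years)")
```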

To close, I’d say my experience taking these two classes on edX was certainly worth it. The platform seems like a prime example of where open online learning can go, building on the advances of the likes of Khan Academy. And despite the occasional disconnect between assessment questions and lecture material, I found myself appreciating even the add-on DataCamp integration and its hint system for helping learners of R get to the source of their errors. Professor Irizarry teaches a good set of data science courses, and I would probably take more of them despite my gripes about the minor inconsistencies. But the experience has reminded me that an effective and enjoyable MOOC can’t simply be thrown together; it requires careful planning, consideration of how students will respond, a logical and evolving learning platform, and a pedagogy that works in the online environment. And, of course, a MOOC is only as good as its instructor or organizer.

In the future, I hope to give a few more free MOOC platforms a try; Wikipedia has a decent list of them.