WISCAPE Blog


CONTACTING US

Main Office

WISCAPE
School of Education
UW-Madison
353 Education Building
1000 Bascom Mall
Madison, WI 53706-1326

Tel: 608/265-6342

Email: wiscape-info@education.wisc.edu
or by contact form

 


This Just Published: Unraveling Massive Open Online Courses (MOOCs)

by Brett Ranon Nachman | May 29, 2018

Brett Ranon Nachman is a doctoral student in the Department of Educational Leadership and Policy Analysis at UW-Madison and a project assistant for WISCAPE. His research focuses primarily on the experiences and depictions of college students on the autism spectrum, as well as STEM community college students’ experiences and transfer pathways.

Welcome back to 'This Just Published.' Massive Open Online Courses (MOOCs) emerged earlier this decade as a potential game-changer, allowing individuals to partake in a plethora of online courses anywhere and anytime. While the MOOC fever has considerably lessened and these courses have faced criticism for low completion rates, they continue to draw students. In this edition, I evaluate three recently published journal articles that address issues facing MOOCs across different subjects and spaces.

Rieber, L. P. (2017). Participation patterns in a massive open online course (MOOC) about statistics. British Journal of Educational Technology, 48(6), 1295-1304.

View the journal article.

Within his opening paragraph, Lloyd P. Rieber establishes the pitfalls of MOOCs, referencing studies that speak to student disengagement upon starting these courses. In his study, which analyzed a statistics MOOC serving both educational professionals and first-year doctoral students, Rieber aimed to determine what factors compelled students to participate in and stick with the MOOC, as well as to understand how students found value in the MOOC’s offerings.

In constructing the course, offered through the Canvas learning management system, Rieber provided a variety of tools, including video-based content, with a task-centered design approach. This compelled students to apply knowledge in a hands-on way, often filling out Excel spreadsheets to complete statistical problems. Across six sections of the MOOC, more than 5,000 students enrolled. Survey data collected from students at both the start and conclusion of the class generated valuable insights into what supported their completion, or lack thereof, as well as how they participated in different facets of the MOOC.

Although only 11.4 percent of students who enrolled finished the MOOC, Rieber points out that this exceeds completion rates reported in other studies. Interestingly, more than three-fourths of enrollees did not partake in any of the assessments, but among the rest, just over half completed the course with all eight assessments. This suggests that for those who actually start the course, participation is much stronger than meets the eye. It also illustrates the complexity of MOOC participation, in that students may indicate early interest and then become disengaged, yet a healthy number who initially participate will reach the finish line. Self-reported intentions regarding course participation are also telling, though only 38 percent of enrolled students completed the survey; of these, 54 percent said they intended to fully participate in and complete the course.

Other fascinating findings are sprinkled throughout the article. Notably, Rieber reports that students who both indicated they intended to participate and completed early course milestones had higher rates of course completion. Time overwhelmingly represented the biggest obstacle among students who failed to finish the MOOC in one of the sections. Students also noted that the Excel spreadsheet examples, hands-on activities based on video tutorials, and course organization and design, respectively, embodied the three most important factors in successful course completion.

Rieber warns readers to take certain findings with a grain of salt, and justly so, because survey participants represented only students who participated in the course to some degree. Thus, little is known about the vast majority of students who enrolled yet failed to participate. Using the metaphor of shopping for a birthday gift, Rieber points out that MOOC users are akin to shoppers who can browse a store but ultimately not purchase an item. That said, he acknowledges, this does not offer context for the high number of students who participate in only some elements of a MOOC. He contends that “MOOC instructors should design ‘low-stakes’ activities and challenges to be introduced in the beginning phases of the course so as to coax participation with a high probability of success.” That is wise, and instructors who design MOOCs should consider experimenting across sections of an individual MOOC by offering different times, types of instruction, and approaches to assignments to determine which techniques best account for student engagement and course completion. Rieber also acknowledges that, at times, students’ expectations do not reflect their behaviors, further raising the importance of matching data from students across both surveys and course participation.

Several factors enhance this study’s quality, such as the journal’s common inclusion of a list summarizing the study’s most salient aspects, the presentation of straightforward findings, and the author’s acceptance of the study’s inherent flaws, including the lack of representativeness among MOOC participants. However, the study lacks a qualitative component, thus boiling down students’ interpretations of the course to standardized statements. A number of questions linger. What other commitments inhibited students’ progress? What parts of the assessments did students find most useful? How did students translate knowledge from the course into practice? And, most importantly, what factors drive a student away from course participation after initial enrollment? Unraveling the mysteries behind MOOC engagement will continue to drive important scholarly work related to this topic.

Shapiro, H. B., Lee, C. H., Roth, N. E. W., Li, K., Çetinkaya-Rundel, M., & Canelas, D. A. (2017). Understanding the massive open online course (MOOC) student experience: An examination of attitudes, motivations, and barriers. Computers & Education, 110, 35-50.

View the journal article.
Similarly examining course engagement among students in MOOCs, Shapiro and colleagues incorporated text analysis of interviews with learners enrolled in two MOOCs. Through presenting common criticisms of MOOCs in the opening paragraphs, the authors immediately address the elephant in the room: MOOCs cannot be compared to, nor replace, face-to-face classes. MOOCs, however, can complement traditional classes and broaden access for learners who may not otherwise gain such experiences. The purpose of their study was to add to the emerging body of qualitative work on MOOCs, discover the motivations of students who take these courses, and better understand their experiences.

Dorian A. Canelas and Mine Çetinkaya-Rundel, two of the authors, taught the Introduction to Chemistry and Data Analysis and Statistical Inference MOOCs featured on Coursera, respectively; these courses were the sites of the study. Each class, utilizing a variety of mechanisms common in MOOCs, such as videos and quizzes, aimed to serve as a foundational course for students with minimal experience in the subject area. Following the e-learning theoretical framework developed by Aparicio, Bacao, and Oliveira (2016), the authors recognize the many contextual factors, including motivations, barriers, and demographics, that shape students’ course experiences.

Employing pre-course surveys, the authors captured interest from a few thousand students across both courses. Utilizing stratified sampling to obtain interviewees representing various backgrounds and ages (among other factors), the authors interviewed 20 chemistry students and 16 statistics students via Skype to learn about their past educational experiences and how they intended the Coursera course to support their objectives. The researchers later coded interviews via sentiment analysis, determining to what extent each sentence featured positive and negative words and assigning a score based on the sentiments expressed. In their qualitative analysis methods section, the authors demonstrate intentionality in explicitly listing the rigorous coding process. Among the more significant limitations, sentiment analysis becomes complicated when sentences are examined in isolation; therefore, coders reviewed interviews in their entirety to gain context. The authors also mention that interviewees were proficient in English and may not represent the wider population of students who participated in the courses. I find these limitations, along with a few others the authors offer, entirely appropriate to acknowledge, and doing so demonstrates self-awareness.
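The sentence-level scoring the authors describe can be illustrated with a minimal lexicon-based sketch. Everything here is hypothetical: the word lists, the scoring rule, and the thresholds are invented for illustration and are not the study’s actual coding scheme.

```python
import re

# Hypothetical sentiment lexicons (illustrative only; the study's
# coders used a far richer process and reviewed full transcripts).
POSITIVE = {"helpful", "great", "enjoyed", "clear", "flexible"}
NEGATIVE = {"confusing", "difficult", "frustrating", "slow", "boring"}

def score_sentence(sentence: str) -> int:
    """Count +1 for each positive word and -1 for each negative word."""
    words = re.findall(r"[a-z']+", sentence.lower())
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

def label(sentence: str) -> str:
    """Map the raw score to a positive/neutral/negative label."""
    s = score_sentence(sentence)
    return "positive" if s > 0 else ("negative" if s < 0 else "neutral")
```

A sketch like this also makes the authors’ caveat concrete: scoring sentences in isolation misses context (negation such as “not helpful” would score positive here), which is exactly why their coders read each interview in its entirety.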

The sheer scale of the sentiment analysis is astounding, with 10,954 statements examined and around 80 percent of comments labeled as neutral, suggesting a balance of positive and negative phrases. Consequently, the authors only analyzed positive and negative statements to have a clearer picture of where the courses hit their strides or fell short of the mark. A majority of the remaining comments were positive, and the authors recognize that this may also reflect the willingness of students who participated in interviews. Most interestingly, they reviewed sentiments based on students’ demographics, discovering that “education, gender, and continent were found to be significantly associated with whether a statement was positive or not positive.” Case in point, students with higher postsecondary degrees held more favorable opinions. The interpretation and rationale for this divide are not indicated, representing a missed opportunity.

While analyzing statements related to students’ motivations, the authors discovered that students’ intrinsic motivation to learn was a common factor, along with work, convenience, and interest in the subject. Remaining in the course was often attributed to seeking knowledge improvement, as articulated by 92 percent of interviewees. Tellingly, while participants sought to learn more about chemistry and statistics, the authors share nothing about the level of preexisting knowledge students may have possessed, save for a detail that one-third of interviewees felt they had “inadequate background” in the topic. Therefore, while these were introductory-level courses, students’ entry points remain largely unknown and could have influenced their interpretations.

Other enlightening findings emerged. Some students vocalized wanting to take the course for career advancement in STEM fields, which, as the authors indicate, may not apply across the board. Obtaining a course certificate was an inconsistent motivation, and the authors report that motivations on this front warrant further investigation. Many appreciated the flexibility of taking a MOOC on their own timeline and with little cost. Despite course convenience, many students noted course time demands and having bad previous experiences with the subject as major barriers.

Reviewing these findings leads me to believe that instructors have a responsibility to frame their courses more flexibly to meet students’ time constraints, complemented by being more upfront about expectations regarding the time necessary to complete particular units. At the end of the day, however, the student is responsible for determining to what extent a given barrier may inhibit successful persistence or course completion. On multiple fronts, the study unpacks the complexities of offering MOOCs to such a large and varied student population. Interview transcripts demonstrate the potential of MOOCs to reach students, no matter their location or knowledge level, and internal drive can greatly heighten engagement. Instructors must continue to find ways to honor and support students with these motivations, as well as consider conducting similar assessments of their students, much in the manner of these scholars, to determine areas of improvement in course design.

Ostashewski, N., Howell, J., & Dron, J. (2017). MOOCifying courses: Delivery of a MOOC to enhance university course activities. Journal of Learning for Development (JL4D), 4(2).

View the journal article.

While MOOCs have long represented an alternative way for students to engage in their learning, the notion of “MOOCifying” an existing course is a relatively novel option. Nathaniel Ostashewski, Jennifer Howell, and Jon Dron cover new ground in their case study focused on translating an undergraduate pre-service education course into one with MOOC elements. Through this project, the authors reveal how students view “MOOCified” courses and whether such courses enhance learning experiences for students.

In reviewing the literature, the authors acknowledge the flaws associated with MOOCs, particularly the difficulty in showing learning outcomes and the lack of accreditation. As a result, the notion of credentialing MOOCs remains an ongoing question, and an opportunity. Drawing on Butin’s (2012) four “V’s” of the Internet, the authors show how accounting for the volume of web content, the variability of programs, velocity (or availability), and variety of delivery can help us understand how content is presented. In considering these factors, they show the multidimensional aspects of learning, particularly in the Internet age. This framework proves useful in disassembling the notion of what a course can encompass.

The study centers on the Participating in the Digital Age (PDA) MOOC, which followed a connectivist-style cMOOC design. To illustrate what a cMOOC encompasses, the authors reference other scholars’ interpretations, which often harken back to the ideas of personalization and open exploration. This approach recognizes the utility of social media, video conferencing platforms, and other materials in helping learners engage with content based on their preferences. The first of its kind, the six-week PDA MOOC primarily served university students, with the open online segment running simultaneously.

Following a case study design, the researchers delivered a survey to the course’s 345 students, 149 enrolled face-to-face and 196 fully online. In the end, only 48 participated, a 14 percent response rate, which is not unexpected, yet still somewhat disappointing given the number of enrolled students. The authors organize the findings around two central topics.

First, they examine student engagement in the course. Via a simple but straightforward bar graph, they reveal which elements of the course students most utilized and appreciated, finding that videos, discussion board threads, and group blogs were most useful. Most students indicated they had no opinion of their fellow course collaborators. Open-ended questions prompting students to suggest course improvements revealed that some found the course structure disjointed and experienced difficulty using the online platform. Second, the authors asked how students’ experiences in the MOOC compared with prior online learning, finding that a majority of participants found Blackboard (a course management system) easier to use. The authors acutely note that the course elements students most valued in the MOOC, such as videos, are also common on Blackboard. This prompts me to wonder whether the accessibility of content on the MOOC may have served as a barrier. Perhaps the most notable finding is that, despite certain criticisms of the MOOC’s construction, nearly half of participants said they would prefer to take a course that includes a MOOC component over one without it. Here we see some paradoxical points, but they also unveil the complexity of MOOCs and how students engage with them.

While some of these results are interesting, the study comes across as surface-level and not as thorough as it could have been, stemming in part from both the simplicity of the survey design and the lack of interpretation. Furthermore, the small sample size inhibits a more precise understanding of what students generally thought of the class. Although a few student comments are scattered across the findings, the authors fail to meticulously break down participants’ potential reasons for holding particular opinions. Because the study does not offer much context on the course’s construction and student population, it is difficult to translate understandings from this study to others that may utilize MOOCs in conjunction with an in-person course. Thankfully, in their conclusion, the authors offer suggestions on how these courses might operate to meet students’ needs, most explicitly ensuring that MOOCs run the same length as the in-person course. The course was indeed novel, and this study represents an introduction to a handful of the challenges and opportunities behind cMOOCs, yet we should not draw firm conclusions about the merits of these types of courses more generally.

What are your thoughts? Please comment using the form below or email us.
Subscribe to receive notifications of WISCAPE blog posts.


© 2018 Board of Regents of the University of Wisconsin System • Please contact the School of Education External Relations Office with questions, issues or comments about this site.