To appear in The American Journal of Distance Education, 28(3).

Understanding MOOCs as an Emerging Online Learning Tool: Perspectives From the Students

Min Liu, Jina Kang, Mengwen Cao, Mihyun Lim, Yujung Ko, and Ryan Myers
The University of Texas at Austin

Amy Schmitz Weiss
San Diego State University

Abstract: This study examined participants' learning experiences in the context of a six-week massive open online course (MOOC) in journalism with 5,000 students from 137 countries. Three research questions were asked: (1) Who are the students and why are they enrolled in this MOOC? (2) How much time have the students spent in taking this MOOC and have they completed all the assignments? and (3) What have they learned and what aspects of this MOOC do the students find most helpful? Four hundred nine students responded to a survey and 44 responded to interview questions. The main findings showed that 84% of the participants were working professionals and only 28.9% were from a journalism background. Of those who did not complete the course, lack of time was the top reason. Most participants reported a positive learning experience, but lack of feedback and/or poor feedback quality were reported as negative experiences. The discussion forum was the least liked aspect of the course.

The emergent, fragmented, confusing at times, and self-defined nature of massive open online courses (MOOCs) (McAuley, Stewart, Siemens, and Cormier 2010) presents challenges to learners, instructors, and educational institutions. The current rush to offer MOOCs has raised questions "about the future of teaching, the value of a degree, and the effect technology will have on how colleges operate" (The Chronicle of Higher Education 2013). Evidence-based research is beginning to surface, helping instructors and institutions understand MOOC advantages and constraints as a teaching and learning tool (Liyanagunawardena, Adams, and Williams 2013; Siemens, Irvine, and Code 2013). A key factor in ensuring MOOC effectiveness is understanding the perspectives of the students enrolled in these courses (Milligan, Littlejohn, and Margaryan 2013).

RESEARCH QUESTIONS

This study aims to examine students' experience in taking a MOOC in journalism and the aspects of this MOOC that students find beneficial to their learning. We ask these research questions:

1. Who are the students and why are they enrolled in this MOOC?
2. How much time have the participating students spent in taking this MOOC and have they completed all the assignments? If not, why?
3. What have the participating students learned from taking this MOOC and what aspects of this MOOC do the students find most helpful to their learning?

METHOD

Participants and Structure of the MOOC

The participants were approximately 5,000 students from 137 countries. The students had registered for the MOOC Introduction to Infographics and Data Visualization, which was delivered by The Knight Center for Journalism in the Americas in the College of Communications at The University of Texas at Austin. (The researchers were neither participants in nor instructors of the MOOC; one of the co-authors manages the Moodle LMS for the Knight Center. This research was approved by the university's institutional review board [IRB].) This course was designed for practicing journalists, journalism students, media practitioners, and anyone else interested in learning how data can be used in a journalistic context and the various ways of working with graphics to communicate and analyze data. The MOOC was offered during the weeks of January 12 to February 23, 2013, for a total of six weeks, through Moodle, a free course-management system (CMS). The course content consisted of reading materials, video lectures, and tutorials for learning technical tools.
During the course, two quizzes, three infographic exercises, and five mandatory discussions were used to evaluate participants' performance. Each of the five mandatory discussions focused on a different topic. For each discussion topic, ten groups were formed, one group for each forum. A total of 165 forums were provided. In addition, there were weekly Q&A forums, student lounge forums, and weekly help forums. A certificate of completion was provided if a student met all of the following requirements: completion of the two quizzes with an 80% score, completion of the three exercises, and submission of posts to all five discussions. Given the current understanding of MOOC types, this course is an xMOOC.

Data Sources, Procedure, and Analysis

To answer the research questions, three data sources (both quantitative and qualitative) were employed for the purpose of triangulation (Creswell 2014).

Survey. A 24-question survey was created. The survey included questions on demographic information and on the experience and perceptions of students while they were participating in the MOOC. Construction of the survey was based upon the literature on MOOCs, and the survey was reviewed by journalism faculty for face and content validity. Both Likert scale questions and open-ended questions were used. Sample questions include: Why are you taking this MOOC? How do you compare learning in this MOOC with other face-to-face or online courses? Do you feel you have learned a lot from this MOOC? Of all the materials provided in the MOOC, which one is the most helpful? What do you like the most/least about the course?

Interviews. All participants were asked if they were willing to be interviewed. Given the nature of a MOOC, with its participants from all over the world, the interviews were conducted via e-mail (Creswell 2007, 130). Those who indicated their willingness were sent interview questions. Sample interview questions include: Please describe one SPECIFIC good/bad example of learning via this MOOC. Please describe what motivated you to spend time in doing all the different activities for this MOOC. Please specify any challenges you have encountered in taking this MOOC.

Course activity data. Data on the usage of course activities were also collected, including the number of logins, participation rates in the weekly discussions (both viewings and postings), and quiz completion rates.

Procedure. The survey was distributed during the last week of the MOOC. The participants were informed that their participation was entirely voluntary and anonymous. A total of 409 participants responded, representing a response rate of 8%. Forty-four participants also responded to seven guiding interview questions about their learning experience.

Analysis. Responses to the Likert scale questions in the survey were analyzed descriptively. Open-ended responses and interviews were analyzed using a systematic and iterative examination of the data, line by line. As we analyzed the qualitative data, we focused on the main idea in each response from the participants through "focused coding" (Charmaz 2006). The themes that emerged from the data were compared with each other and with our research questions until an "emergence of regularities" (Lincoln and Guba 1985) was reached and we felt we had uncovered the primary themes from the responses. In reporting the findings below, sample participants' comments are reported unedited. Six researchers were involved in analyzing the open-ended responses and interview data, as well as in checking and verifying the descriptive data from the Likert scale questions, until 100% interrater reliability was reached. Course activity data were also tallied and reported. The research team met weekly to discuss the data and resolve any disagreements in qualitative data interpretation.
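For readers interested in how such multi-coded open-ended responses translate into the percentages reported in the Findings, the following minimal sketch (written in Python, with hypothetical category labels and response data rather than the study's actual coding) shows one way to tally coding units when a single response can contribute more than one unit.

```python
# Illustrative sketch only: hypothetical coded responses, not the study's data.
# Each open-ended response may carry several codes, so percentages are computed
# out of the total number of coding units rather than the number of respondents.
from collections import Counter

coded_responses = [
    ["visualization concepts", "tool use"],   # one response, two coding units
    ["critiquing infographics"],
    ["visualization concepts", "peer feedback"],
]

units = [code for response in coded_responses for code in response]
totals = Counter(units)
total_units = len(units)  # denominator, analogous to the coding-unit totals reported below

for code, n in totals.most_common():
    print(f"{code}: {n} units ({n / total_units:.1%} of all coding units)")
```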
FINDINGS

Participants' Demographic Information and Reasons for Enrolling in This MOOC

Of the 409 participants who responded to the survey, 58.4% (n = 239) were female and 41.6% (n = 170) were male. Figure 1 provides information about the countries the participants were from, their occupations, the fields they were in, and the number of MOOCs they had taken previously. The participants were asked the reasons they took this MOOC, and the top three reasons were (1) learning more about the topic for their current job, (2) personal interest, and (3) career development (see Table 1).

<insert figure 1 about here.>

<insert table 1 about here.>

More than 80% of the participants indicated they were excited to take this MOOC, while only 2.2% were reluctant. The three main reasons for their excitement were related to the course topic (68%), the MOOC format (22%), and personal/professional development (10%). Specific reasons relating to enthusiasm for the course topic included interest in the topic, relevance to their jobs, the positive reputation of the instructor or university, and the course's function as an alternative to an equivalent course offering. Specific reasons relating to enthusiasm for the MOOC format were that it was free of charge, the flexibility of the schedule, and the opportunity to interact with others worldwide. Specific reasons related to personal/professional development were the opportunity to learn new concepts and career preparation.

The participants were also asked how well prepared they were to take this MOOC. Some 36.7% (n = 150) of the participants indicated they were well prepared, followed by 56.9% (n = 233) who stated they were somewhat or slightly prepared. Approximately 50% of the participants felt competent enough to complete the course, while around 10% did not feel competent.

Time Spent and Completion of Assignments

The students were assigned various activities, such as exercises and discussions, during the course in addition to watching video lectures and video tutorials. Approximately 42% of the participants indicated they spent more than four hours each week participating in the different activities, 36% spent between two and four hours, 14% spent one to two hours, and 6.8% spent less than one hour. Additionally, the participants were asked how many exercises and assignments they had done at the point when the survey was given. Slightly more than 62% of the participants had completed all or most of the activities, 28.2% had completed a few, and only 8.6% had not completed any of the activities. Regarding noncompletion of course activities, the participants cited two main reasons: lack of time and too much work (see Table 2). Other reasons included a language barrier, technical problems, and lack of interest in the topic.
Other reasons for not completing the assignments were that participants did not like the discussion forums, that they simply wanted the information, or that they were ill. The interview questions asked the students to discuss what motivated them to complete all the assignments and what challenges they faced in taking the MOOC. The responses revealed that students completed the exercises/assignments because they were "well-chosen and aligned with learning about the course topic," "engaging," and "typical of real-life professional projects," and because the students were "learning by doing." The primary challenge in completing the exercises/assignments was, as one student succinctly put it, "Time!"

<insert table 2 about here.>

Figure 2 provides the weekly activity data for the discussion forums, with a total of 16,810 posts and 36,769 views. At the end of the course, 33% of the participants (n = 1,630) had completed the first quiz, 26% (n = 1,277) had completed the second quiz, and 25% (n = 1,272) had completed both quizzes. In total, 44% of the participants (n = 2,214) interacted in the forums of the course. At the end of the MOOC, 5.6% (n = 281) had met all of the requirements and paid $30 to receive the certificate.

<insert figure 2 about here.>
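As an illustration of how course activity data of this kind can be aggregated, the sketch below (Python with pandas, using hypothetical forum-log records rather than the study's Moodle export; the column names are assumptions) tallies weekly posts versus views and counts the number of distinct forum participants, the kinds of figures summarized in Figure 2 and in the text above.

```python
# Illustrative sketch only: hypothetical forum-activity records, not the Moodle
# data used in the study. It shows one way weekly post/view counts and the
# number of unique forum participants could be tallied.
import pandas as pd

# Hypothetical log rows: one row per forum event.
log = pd.DataFrame(
    [
        {"week": 1, "user_id": "u01", "event": "post"},
        {"week": 1, "user_id": "u02", "event": "view"},
        {"week": 2, "user_id": "u01", "event": "view"},
        {"week": 2, "user_id": "u03", "event": "post"},
    ]
)

# Weekly counts of posts vs. views (the two series plotted in Figure 2).
weekly = log.groupby(["week", "event"]).size().unstack(fill_value=0)
print(weekly)

# Number of distinct users who interacted in the forums at least once.
print("unique forum participants:", log["user_id"].nunique())
```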
Students' Learning and Aspects of the MOOC That Students Found Most Helpful

The participants were asked if they felt they had learned from this MOOC. Eighty-six percent (n = 353) indicated they did, while only 13.7% (n = 56) indicated they did not. The reasons given for limited learning included lack of time, lack of participation/poor feedback, the topic not being new, lack of organization, and too much work.

The participants were asked to name three things they had learned. Learning how to visualize data and critique infographics (46%, n = 303), learning visualization concepts (32%, n = 211), and learning tool use (9%, n = 59) were the top responses (percentages are out of a total of 660 coding units, as each response could contain more than one coding unit). Other responses included a new way of learning (2.4%, n = 16) and helpful peer feedback (2%, n = 11). The top two reasons for not learning were that the students did not have enough time (39%, n = 19) and that they did not feel they received new knowledge from the course (24%, n = 12). Other reasons (37%, n = 18) included lack of feedback, disorganized content, and too much work.

The participants were also asked specifically about their learning experience through feedback or critiques from peers. Some 84% of the participants (n = 342) indicated that peers' feedback was useful, while 16% (n = 66) saw no value in peer feedback. Of the negative responses, the most cited reason was that feedback from peers was not useful/thoughtful/meaningful or that the peers lacked expertise. One participant commented, "I haven't had any useful feedback," and another said, "Many of the comments repeat the same things. Few are especially insightful, and those get buried. Most don't address particular questions/comments made by others." Other reasons included "too many students" and "the forum was not set up for it—impossible to keep track of who is responding to what easily and there are too many messages." Language could also be a barrier, as one participant stated: "Most class participants do not post meaningful content in the forums, but rather write a few sentences just to say they posted something. Also, since the students come from all over the world, there is a definite language barrier in interpreting what some students are trying to say through broken English."

The participants were asked the question "How do you compare learning in this MOOC with face-to-face courses or other online courses?" Most participants (75.31%, n = 186) indicated their MOOC experience was better than face-to-face instruction, about 6.48% (n = 16) indicated no difference, and 18.21% (n = 45) felt face-to-face instruction was better than their MOOC experience. The positive features the students indicated included self-pacing and flexibility, the diversity of worldwide participants, peer learning/monitoring, the usefulness and quality of the course materials (readings, videos, assignments), the expertise of the instructor, and other aspects of the MOOC such as being free, convenient, hands-on, and engaging. The negative aspects included lack of feedback or feedback that was not useful, lack of peer instruction, unorganized course structure, and too many people. The following are a few sample quotes from the students:

Can learn by oneself any time, anywhere as long as internet is available, very flexible which conventional way of face-to-face can't provide. (Positive: Flexibility)

The fact that we have 5000 colleagues it is a huge brainstorming and a great way of sharing ideas and connections. (Positive: Peer learning)

This MOOC's assignments keep students moving forward, continuously, towards a defined goal promotes a more effective learning experience (Positive: Engaging experience)

Lack of one-to-one interactions with professor/other students is lacking compared to face to face, but comparable to other MOOCs. (Negative: Lack of interaction)

The structure online is kind of confusing with so many forums and everything. (Negative: Course structure)

In this MOOC, various types of learning materials were provided, including readings, video lectures, tutorials, and external resources. The participants considered the readings the most helpful (49.1%, n = 201) of all the materials, followed by videos (39.4%, n = 161), discussion forums (5.9%, n = 24), others (3.9%, n = 16), and quizzes (1.5%, n = 6). The analysis of the qualitative data from the open-ended responses explaining these choices corroborated the descriptive statistics: the participants commented on the comprehensiveness and variety of the reading materials provided, the engaging video lectures delivered energetically by the instructor, and the usefulness of the hands-on activities in illustrating the content. The participants, however, found the discussion forum less helpful. As one participant stated, "Because it is impossible to read all forums, the information on the site are not very good structured."

The participants' responses indicated eight aspects they liked most: (1) content related (various types, different materials, high quality, good selection, challenging, informative/helpful), 29.2% (n = 151); (2) instructor related (expertise, engaging, enthusiastic, responsive), 16.6% (n = 86); (3) topic (interesting, relevant, focused), 11.4% (n = 59); (4) activities (challenging, practical), 11.0% (n = 57); (5) organization (well-structured, coherent, practical approach, combination of different activities), 7.9% (n = 41); (6) peer learning (different views, interaction with others), 7.9% (n = 41); (7) flexibility, 6.6% (n = 34); and (8) free, 3.5% (n = 18), plus other reasons (e.g., getting a certificate).
Their responses also indicated ten aspects they liked the least (percentages are out of 314 coding units, as each response could contain more than one coding unit): (1) discussion forum (too many or too few forums for the overwhelming number of participants, unstructured), 22.6% (n = 71); (2) course platform (not easy to navigate, usability issues), 10.2% (n = 32); (3) instructor related (lack of feedback and guidance), 8.9% (n = 28); (4) high course load, 7.3% (n = 23); (5) peer feedback (not helpful, misjudgment, low quality), 8.0% (n = 25); (6) materials (too basic, too much, repetitive), 6.1% (n = 19); (7) interaction (interaction among peers impossible, lack of interaction), 4.5% (n = 14); (8) course duration (too short), 4.1% (n = 13); (9) quizzes, 3.5% (n = 11); and (10) technical issues, 2.9% (n = 9), plus other reasons (e.g., language barrier, too many e-mails).

Interview data provided more evidence to support what the students liked and did not like about the course, as discussed above. A few sample responses:

I think the most effective part of the course was the overall structure of readings, video lectures and hands-on projects. The lectures and readings gave me the confidence to attempt the projects, and the projects then in turn reinforced the learning from the lectures and readings. (Positive, course organization)

[Instructor] used practical, easy-to-understand examples in his lectures. (Positive, instructor, practical examples)

I very much enjoyed the interactions with the other students in the class. There are some really talented people enrolled. And their work inspired me to work harder. (Positive, peer learning)

It's obviously inherent in the format, but having to do everything online is really hampered by a bad Internet connection. (Negative, technical issue)

The reliance of students (often uninformed) to provide this feedback was a real flaw in the model. While the act of reviewing the work of others has benefits, these benefits were thwarted by bad advice. Too often praise was given when condemnation was warranted. (Negative, feedback quality)

The large number of people enrolled on the MOOC meant that assignments which I spent a long time working on often only received one comment, which is not that helpful in terms of future development. (Negative, MOOC format)

DISCUSSION

Participants' Reasons and Readiness in Taking This MOOC and Completion Rate

More than 80% of the participants were excited to take this xMOOC because they were interested in the topic for their jobs or for professional/personal development, and/or because they were interested in the MOOC format. Only a few showed reluctance (2.2%). This finding is aligned with the current MOOC phenomenon in that there is much enthusiasm among institutions, instructors, and students who are eager to explore this format as a new way of learning (Davidson 2012; Ruth 2012). While a recent report on the first year of course offerings from edX (Ho et al. 2014) showed that the most typical course registrant was a male student, 58% of those surveyed in this MOOC were female and 84% were working professionals, whose primary purposes for enrolling were job-related learning, personal/lifelong learning, or interest in the MOOC format.

A characteristic of a MOOC is its open access to anyone around the world. Although this MOOC was designed primarily for people in journalism, only 28.9% of the participants were journalism professionals, while 71.1% were from other fields, including education, social science, science and technology, business, and health.
The finding that people from a variety of fields enrolled because of their interest in the topic of data visualization and in accessing free content supports an important advantage of MOOCs as a learning tool: offering free resources to anyone interested in learning about a topic, regardless of time, geographic location, or formal prerequisite constraints (McAuley et al. 2010; Stewart 2013). In addition, the fact that the topic of this course, data visualization, is new and applicable to many professions explains the interest of a large number of people from other fields. The students, overall, had adequate computer and Internet experience to participate online. The main technical problem they experienced was an occasionally slow Internet connection.

It is generally agreed that, compared to attending face-to-face courses, students need more discipline to succeed in an online course (Allen and Seaman 2014). Research looking into learner participation in MOOCs has indicated that it is critical for MOOC students to have self-directed learning, time management, and critical-analysis skills to benefit from MOOC instruction (Kop and Fournier 2010; Waite et al. 2013), especially because many MOOCs at this stage are still evolving, fragmented, and confusing at times (McAuley et al. 2010). As shown in this study with multiple data sources, lack of time was not only the main challenge but also the top reason students withdrew despite their interest in the topic.

As shown in recent reports, while thousands or even hundreds of thousands might register for a MOOC, the completion rate is often low (Parr 2013). In this six-week MOOC, only 5.6% of the students completed all the requirements and paid to receive their certificate, although during the course's final week more than 50% of those surveyed stated they felt competent to complete the course. This finding raised the question: Did those who completed all the requirements proceed to get certificates? As illustrated in Table 1, earning a certificate was not the primary goal for participants of this MOOC, which is consistent with the findings of the University of Edinburgh (MOOCs@Edinburgh Group 2013). Rather, the participants were interested in learning about the topic out of personal interest and for career development. One student stated, "[MOOC] an educational format that works well with my time and circumstances. I'm scheduled to start another MOOC at the end of March. I do think that this is an exciting opportunity for people who want to learn, but don't need the college degree."

Beyond treating a completion certificate as one of many indicators of success, Koller et al. (2013) suggested that understanding learners' intent is important when examining whether or not they received a certificate. The open-access, free, and widely disseminated nature of MOOCs allows them to provide valuable resources for lifelong learning that were not readily available until now. This finding is consistent with the results of the examination of the first year of course offerings through edX: "Large numbers of non-certified registrants access substantial amounts of course content" (Ho et al. 2014). Kizilcec, Piech, and Schneider (2013) pointed out that learners in MOOCs do not adhere to traditional expectations centered on regular assessments.
Ho and his colleagues (2014) indicated that "course certification rates are misleading and counterproductive indicators of the impact and potential of open online courses." For many participants, receiving a certificate of completion does not appear to be as important as gaining the knowledge and developing the skills that they seek, as the findings of this study suggest.

Students' Learning and Course Activities/Assignments

Probably the most important factors to consider are whether the students have learned from taking a MOOC, what they have learned, and how to design a MOOC that offers a good learning experience. The majority of the students (86.3%) responded that they learned a lot from this MOOC, similar to the finding by the University of Edinburgh (MOOCs@Edinburgh Group 2013). This study also found that, of all the different learning materials offered, the participants considered the reading materials and videos to be the most helpful resources. The students particularly appreciated the well-chosen reading materials and the practical, hands-on aspects of the course as reflected in the assignments/exercises. Qualitative data also revealed that students found the instructor's video lectures engaging and felt that through these lectures they could connect with him. One student commented, "[Instructor] makes this look very easy. He's very passionate and influences positively to the students."

Since a MOOC can involve thousands of students, students are expected to be self-directed and self-disciplined in their studies. Yet not all students are. Providing engaging materials therefore becomes even more important in MOOCs than in traditional classroom instruction. Interesting, useful, and multimedia-based materials can keep students engaged and motivated to stay enrolled in a MOOC (Kop and Fournier 2010).

Similar to other research on MOOCs (Zutshi, O'Hare, and Rodafinos 2013), these findings also revealed mixed results. For example, some students found this MOOC to be well organized while others did not, some found peer feedback useful while others did not, and some found the materials helpful while others did not. While the overall results were positive, both quantitative and qualitative results showed that the discussion forums were the least liked and least useful aspect of the course because of a lack of feedback and interaction. The number of postings in the discussion forums dropped from 4,153 posts in the first week to 1,918 posts in the final week. Although the students in this MOOC found peer assessment helpful, they also indicated that the large number of students was overwhelming and that interaction was not optimal. Students reported difficulty keeping up with the conversation. One student commented,

This is the first MOOC I've taken and I enjoyed and appreciated the experience. However, speaking as a teacher, I find that the strengths of the MOOC format are its openness—to large numbers of students in multiple time zones. Its main weakness is the lack of connection and interactivity between all the class members, students and instructor. My biggest complaint was the forums. I feel that they are clumsy to navigate and use.

This MOOC relied on two main forms of communication: discussion forums and e-mail communication with the instructor. Students' responses also indicated difficulty in navigating the Moodle system and clumsiness in interacting with peers using Moodle. Delivery of a MOOC typically relies on a technology system, such as a CMS or a learning-management system (LMS).
A CMS/LMS not only serves as a delivery platform but also supports the scaffolding of specific learning experiences. The features provided by these systems, and how they are used, can affect the ease and effectiveness of the learning experience (Daniel 2012). Some CMS/LMS setups, such as Moodle, existed prior to the emergence of MOOCs, and updating and adapting these systems for this new format of online instruction to facilitate interaction becomes critical. Institutions and instructors choosing these systems for MOOC delivery need to be aware of their constraints. CMS/LMS providers need to adjust and adapt their systems to keep up with evolving technologies in order to meet learners' expectations.

Limitations

This study used self-reported data from volunteer participants, who represented a small percentage of all course registrants, to provide a glimpse of participants' learning experiences. It is possible that these participants held strong positive or negative views about their learning experiences. In addition, due to the way the discussion forums were structured in this MOOC via Moodle, it was not possible to obtain accurate information about those who completed all the activities and yet did not proceed to apply for the certificate.

Implications and Conclusion

This study showed that the majority of this MOOC's participants had a positive experience and learned new knowledge and skills about a topic they were interested in. While the findings provide empirical evidence to support the suggested advantages of MOOCs, a number of challenges for course designers, instructors, and providers are highlighted. Emerging research on the topic is beginning to show that learning in a MOOC environment requires learners to be more self-directed, self-disciplined, and intrinsically motivated than in a typical face-to-face course. However, not all participants have those skills. Since anyone can participate in a MOOC for any learning purpose, including professional development, lifelong learning, or course credit, course designers and instructors are challenged to meet learners' diverse needs. Making the goals and expectations of a MOOC clear and explicit can help potential students decide whether taking a MOOC suits them.

Large student enrollments and the expected learner autonomy warrant more innovative uses of instructional materials and strategies that go beyond the current common practice of videos, readings, and discussion forums. MOOCs typically rely on peer interaction and peer assessment as critical elements. As this study has shown, designing effective peer assessment and encouraging peer interaction are important issues. The large-scale nature of MOOCs is pushing the envelope in using discussion forums, e-mail, and social networking tools as means of communicating differently and innovatively.

ACKNOWLEDGEMENT

We thank Professor Rosental Calmon Alves, director of the Knight Center for Journalism in the Americas at The University of Texas at Austin, for his encouragement and support of this research.

REFERENCES

Allen, I. E., and J. Seaman. 2014. Grade change: Tracking online education in the United States. Wellesley, MA: Babson College/Sloan Foundation.

Charmaz, K. 2006. Constructing grounded theory: A practical guide through qualitative analysis. Thousand Oaks, CA: Sage.

Creswell, J. W. 2007. Qualitative inquiry and research design: Choosing among five approaches, 2nd ed. Thousand Oaks, CA: Sage.
———. 2014. Research design: Qualitative, quantitative, and mixed methods approaches, 4th ed. Thousand Oaks, CA: Sage.

Daniel, J. 2012. Making sense of MOOCs: Musings in a maze of myth, paradox and possibility. Seoul: Korean National Open University. Available online at http://www.tonybates.ca/wp-content/uploads/Making-Sense-of-MOOCs.pdf

Davidson, C. 2012. What can MOOCs teach us about learning? October 1. Available online at http://www.hastac.org/blogs/cathy-davidson/2012/10/01/what-can-moocs-teach-us-about-learning

Ho, A. D., J. Reich, S. Nesterko, D. T. Seaton, T. Mullaney, J. Waldo, and I. Chuang. 2014. HarvardX and MITx: The first year of open online courses (HarvardX and MITx Working Paper No. 1). Available online at http://ssrn.com/abstract=2381263 or http://dx.doi.org/10.2139/ssrn.2381263

Kizilcec, R. F., C. Piech, and E. Schneider. 2013. Deconstructing disengagement: Analyzing learner subpopulations in massive open online courses. In Proceedings of the Third International Conference on Learning Analytics and Knowledge, 170–179. New York, April.

Koller, D., A. Ng, C. Do, and Z. Chen. 2013. Retention and intention in massive open online courses: In depth. EDUCAUSE Review Online, June 3. Available online at http://www.educause.edu/ero/article/retention-and-intention-massive-open-online-courses-depth-0

Kop, R., and H. Fournier. 2010. New dimensions to self-directed learning in an open networked learning environment. International Journal of Self-Directed Learning 7 (2): 1–19. Available online at http://selfdirectedlearning.com/documents/Kop&Fournier2010.pdf

Lincoln, Y. S., and E. G. Guba. 1985. Naturalistic inquiry. Beverly Hills, CA: Sage.

Liyanagunawardena, T. R., A. A. Adams, and S. A. Williams. 2013. MOOCs: A systematic study of the published literature 2008–2012. The International Review of Research in Open and Distance Learning 14 (3): 202–227. Available online at http://www.irrodl.org/index.php/irrodl/article/view/1455/2531

McAuley, A., B. Stewart, G. Siemens, and D. Cormier. 2010. The MOOC model for digital practice. Available online at http://www.elearnspace.org/Articles/MOOC_Final.pdf

Milligan, C., A. Littlejohn, and A. Margaryan. 2013. Patterns of engagement in connectivist MOOCs. MERLOT Journal of Online Learning and Teaching 9 (2): 149–159. Available online at http://jolt.merlot.org/vol9no2/milligan_0613.pdf

MOOCs@Edinburgh Group. 2013. MOOCs @ Edinburgh 2013: Report #1. May 10. Available online at http://hdl.handle.net/1842/6683

Parr, C. 2013. Not staying the course. Inside Higher Ed, May 10. Available online at http://www.insidehighered.com/news/2013/05/10/new-study-low-mooc-completion-rates

Ruth, S. 2012. Can MOOCs and existing e-learning paradigms help reduce college costs? International Journal of Technology in Teaching and Learning 8 (1): 21–32.

Siemens, G., V. Irvine, and J. Code. 2013. An academic perspective on an emerging technological and social trend. MERLOT Journal of Online Learning and Teaching 9 (2): 3–10. Available online at http://jolt.merlot.org/vol9no2/siemens_editorial_0613.pdf

Stewart, B. 2013. Massiveness + openness = New literacies of participation? MERLOT Journal of Online Learning and Teaching 9 (2): 228–238. Available online at http://jolt.merlot.org/vol9no2/stewart_bonnie_0613.pdf

The Chronicle of Higher Education. 2013. What you need to know about MOOCs. The Chronicle of Higher Education, May 6. Available online at http://chronicle.com/article/What-You-Need-to-Know-About/133475/
Waite, M., J. Mackness, G. Roberts, and E. Lovegrove. 2013. Liminal participants and skilled orienteers: Learner participation in a MOOC for new lecturers. MERLOT Journal of Online Learning and Teaching 9 (2): 200–215.

Zutshi, S., S. O'Hare, and A. Rodafinos. 2013. Experiences in MOOCs: The perspective of students. The American Journal of Distance Education 27 (4): 218–227.

TABLE 1
Reasons for Taking This MOOC

Purpose for taking this MOOC                                    %*
To learn more about the topic for my current job                70.9 (n = 290)
To learn more about the topic because of personal interest      66.7 (n = 273)
To learn more about possible future career                      39.4 (n = 161)
Curious to find out what MOOC is like                           24.2 (n = 99)
To get the credit for certification or degree                   14.2 (n = 58)
Curious to find out what this topic is                          14.2 (n = 58)
Other                                                            3.7 (n = 15)

*Note: This is a multiple answer question. Participants were asked to select all that applied. % = n/total number of participants.

TABLE 2
Reasons for Not Completing All Exercises and Assignments

Reasons for not completing all exercises and assignments        %*
Lack of time                                                     54 (n = 221)
Too much work                                                    24 (n = 98)
Language barrier                                                 4.6 (n = 19)
Technical problems related to this MOOC                          3.4 (n = 14)
Not interesting topic                                            2 (n = 8)
Other                                                            9 (n = 37)

*Note: This is a multiple answer question. Participants were asked to select all that applied. % = n/total number of participants.

FIGURE 1
Participants' Demographic Information (panels: countries of participants, fields of participants, occupations of participants, and number of MOOCs taken prior to this MOOC)

FIGURE 2
Number of Posts vs. Views in the Discussion Forums (Week 3 did not have formal discussions)