Online Learning: From Research to Application
Dr. Curtis J. Bonk
Associate Professor, Indiana University
President, CourseShare.com
http://php.indiana.edu/~cjbonk, cjbonk@indiana.edu

Are you ready???

Brains Before and After E-learning
[Images: brains before and after e-learning, and when using synchronous and asynchronous tools]

Tons of Recent Research
Not much of it ...is any good...

Basic Distance Learning Finding?
• Research since 1928 shows that DL students perform as well as their counterparts in a traditional classroom setting.
Per: Russell, 1999, The No Significant Difference Phenomenon (5th Edition), NCSU, based on 355 research reports.
http://cuda.teleeducation.nb.ca/nosignificantdifference/

Online Learning Research Problems
(National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999)
• Anecdotal evidence; minimal theory.
• Questionable validity of tests.
• Lack of control groups.
• Hard to compare results given different assessment tools and domains.
• Fails to explain why the drop-out rates of distance learners are higher.
• Does not relate learning styles to different technologies or focus on the interaction of multiple technologies.

Online Learning Research Problems (Bonk & Wisher, 2001)
• Different purposes or domains: in our study, 13% concern training, 87% education.
• Flaws in research designs:
  - Only 36% have objective learning measures
  - Only 45% have comparison groups
• When effective, it is difficult to know why:
  - Course design?
  - Instructional methods?
  - Technology?

Evaluating Web-Based Instruction: Methods and Findings (41 studies)
(Olson & Wisher, October, 2002; International Review of Research in Open and Distance Learning)
[Figure: number of studies by year of publication, 1996-2001]
http://www.irrodl.org/content/v3.2/olsen.html

Wisher's Wish List
Effect size of .5 or higher in comparison to traditional classroom instruction.
Average effect size (number of studies): Web-Based Instruction .31 (11); CBI, Kulik [8] .32 (97); CBI, Liao [18] .41 (46)
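Wisher's wish-list threshold, an effect size of .5 or higher, refers to the standardized mean difference (Cohen's d) between online and classroom groups. As a hedged illustration only, with made-up score lists rather than data from any study cited here, the computation looks like this:

```python
# Minimal sketch of Cohen's d: difference in group means divided by the
# pooled standard deviation. Scores below are hypothetical, not study data.
from statistics import mean, stdev

online = [78, 85, 92, 74, 88, 81, 90, 79]      # hypothetical online exam scores
classroom = [72, 80, 77, 69, 84, 75, 82, 71]   # hypothetical classroom exam scores

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled_sd

d = cohens_d(online, classroom)
print(round(d, 2))  # prints 1.2 for these made-up scores
```

By this convention, any d at or above .5 would meet the wish-list benchmark; the meta-analytic averages in the table above fall short of it.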
Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review)
"…there is little consensus as to what variables should be examined and what measures of learning are most appropriate, making comparisons between studies difficult and inconclusive."
e.g., demographics (age, gender), previous experience, course design, instructor effectiveness, technical issues, levels of participation and collaboration, recommendation of course, desire to take additional online courses.

Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, 2002)
Variables Studied:
1. Type of Course: Graduate (18%) vs. undergraduate (81%) courses
2. Level of Web Use: All-online (64%) vs. blended/mixed (34%) courses
3. Content area (e.g., math/engineering (27%), science/medicine (24%), distance ed (15%), social science/educ (12%), business (10%), etc.)
4. Attrition data (34%)
5. Comparison group (59%)

Some of the Research Gaps (Bonk & Wisher, 2000)
1) Variations in Instructor Moderation
2) Online Debating
3) Student Perceptions of E-learning Environments
4) Development of Online Learning Communities
5) Time Allocation: Instructor and Student
6) Critical Thinking and Problem Solving Applications in Synchronous/Asynchronous Environments
7) Peer Tutoring and Online Mentoring
8) Student Retention: E-learning and Attrition
9) Graphical Representation of Ideas
10) Online Collaboration

Many Forms of Online Instruction
The Web Integration Continuum (Bonk et al., 2001)
Level 1: Course Marketing/Syllabi via the Web
Level 2: Web Resource for Student Exploration
Level 3: Publish Student-Generated Web Resources
Level 4: Course Resources on the Web
Level 5: Repurpose Web Resources for Others
================================
Level 6: Web Component is Substantive & Graded
Level 7: Graded Activities Extend Beyond Class
Level 8: Entire Web Course for Resident Students
Level 9: Entire Web Course for Offsite Students
Level 10: Course within Programmatic Initiative

Learning Improved… (Maki et al., 2000)
Intro to Psych: Lecture vs. Online
• Online students performed better on midterms.
• Web-based course students scored higher since they had weekly activities due.
• Lecture students could put off reading until the night before the exam.

Learning Improved… (review by Chang, 2003)
• Online students outperformed peers in a histology (anatomy: plant and animal tissues under the microscope) course (Shoenfeld-Tacher et al., 2001).
• Web enhancements raised exam performance, grades, & attitudes toward economics (Agarwal & Day, 1998).
• Online business communications students performed better on final exams than on-campus students (Tucker, 2000).

Integrating Wireless Content
Syllabus Magazine, May 13, 2003
Study by Mobile Learning Corp. with a group of college institutions:
• Digital content helped first-year college accounting students learn.
• Online interactive exercises were useful to student learning.
• Encouraged independent student learning, and encouraged instructors to adopt a coaching role.

Learning Worse (Wang & Newlin, 2000)
Stat Methods: Lecture vs.
Online
• No differences at midterm.
• Lecture students averaged an 87 on the final; Web students a 72.
• The course was relatively unstructured.
• Web students were encouraged to collaborate; lecture students could not collaborate.
• All exams but the final were open book.

Learning Improved or Not…
Organizational Behavior, IUSE (Keefe, Educause Quarterly, 1, 2003)
• Keefe studied 4 semesters of courses, 6 sections, 118 students.
• Face-to-face students were more satisfied with the course and instructor.
• Being in the online course was associated with lower grades.

Online Findings: Other Concerns
• Requires the instructor to be responsive at any time (Ottenhoff & Lawrence, 1999).
• A study of 436 educational Web sites found instructors use simple and limited communication tools (Mioduser, Nachmias, Lahav, & Oren, 1998).
• Few syllabi posted to the World Lecture Hall utilized the Web for interaction and collaboration; none utilized practitioners as mentors (Cummings, Bonk, & Jacobs, 2002).

Learning Improved or Not… (Sankaran et al., 2000)
• Students with a positive attitude toward the Web format learned more in the Web course than in the lecture course.
• Students with a positive attitude toward the lecture format learned more in the lecture format.

Contrasting Findings are the Norm
• Some courses are impersonal, isolating, and frustrating (Hara & Kling, 2001).
• Sense of community and lower attrition rates result when courses support interactivity, reflection, and sharing (Harnishfeger, March, 2003).

Different Goals…
• Making connections
• Appreciating different perspectives
• Students as teachers
• Greater depth of discussion
• Fostering critical thinking online
• Interactivity online

Student Basic Quantitative
• Grades, Achievement Test Scores, etc.
• Number of Posts
• Overall Participation
• Computer Log Activity: peak usage, messages/day, time on task or in system
• Attitude Surveys

Student High-End Success
• Message complexity, depth, interactivity, questioning
• Collaboration skills
• Problem finding/solving and critical thinking
• Challenging and debating others
• Case-based reasoning, critical thinking measures
• Portfolios, performances, PBL activities

Other Measures of Student Success
(Focus groups, interviews, observations, surveys, exams, records)
• Positive Feedback, Recommendations
• Increased Comprehension, Achievement
• High Retention in Program
• Completion Rates or Course Attrition
• Jobs Obtained, Internships
• Enrollment Trends for Next Semester

Electronic Conferencing: Quantitative Analyses
• Usage patterns; # of messages, cases, responses
• Length of case, thread, response
• Average number of responses
• Timing of cases, commenting, responses, etc.
• Types of interactions (1:1; 1:many)
• Data mining (logins, peak usage, location, session length, paths taken, messages/day/week)
• Time-Series Analyses (trends)

Electronic Conferencing: Qualitative Analyses
• General: Observation Logs, Reflective Interviews, Retrospective Analyses, Focus Groups
• Specific: Semantic Trace Analyses, Talk/Dialogue Categories (content talk, questioning, peer feedback, social acknowledgments, off task), Levels of Questioning, Degree of Perspective Taking, Case Quality, Participant Categories
• Emergent: Forms of Learning Assistance

[Figure: overall frequency of interactions across chat categories (6,601 chats): roughly On-Task 55%, Social 30%, Mechanics 15%, tracked over Months 1-2, 3-4, and 5-6]

Network Conferencing Interactivity (Rafaeli & Sudweeks, 1997)
1. More than 50 percent of messages were reactive.
2. Only around 10 percent were truly interactive.
3. Most messages were factual statements or opinions.
4. Many also contained questions or requests.
5. Frequent participators were more reactive than low participators.
6. Interactive messages contained more opinions & humor.
7. More self-disclosure, involvement, & belonging.
8. Participants were attracted to fun, open, frank, helpful, supportive environments.

[Figures: Starter-Centered Interaction (Hara, Bonk, & Angeli, 2000) vs. Scattered Interaction (no starter), Week 4]

Ching-Fen Chang (May 2003)
• Nonnative speakers did not assume roles; Americans used role names.
• "…it appeared that the Web-based forum discussions especially enabled the nonnative speakers of English to contribute to the class discussions by providing more opportunities to contribute than face-to-face discussions."

Schallert & Reed, AERA, April 2003
• Nonnative students do not participate equally in written discussions.
• Enthusiastic and frequent contributors do not necessarily make intellectually significant contributions.
• Some who seem deeply engaged may be less rigorously engaged in many conversations.

Collaborative Behaviors (Curtis & Lawson, 1997)
• Most common were: (1) Planning, (2) Contributing, and (3) Seeking Input.
• Other common events were: (4) Initiating activities, (5) Providing feedback, (6) Sharing knowledge.
• Few students challenge others or attempt to explain or elaborate.
• Recommendation: use debates and model appropriate ways to challenge others.

Online Collaboration Behaviors by Categories (US and Finland)
Behavior Category        Finland (%)  U.S. (%)  Average (%)
Planning                     0.0         0.0        0.0
Contributing                80.8        76.6       78.7
Seeking Input               12.7        21.0       16.8
Reflection/Monitoring        6.1         2.2        4.2
Social Interaction           0.4         0.2        0.3
Total                      100.0       100.0      100.0

Dimensions of Learning Process (Henri, 1992)
1. Participation (rate, timing, duration of messages)
2. Interactivity (explicit interaction, implicit interaction, & independent comment)
3. Social Events (statements unrelated to content)
4. Cognitive Events (e.g., clarifications, inferencing, judgment, and strategies)
5. Metacognitive Events (metacognitive knowledge: person, task, and strategy; metacognitive skill: evaluation, planning, regulation, and self-awareness)

Some Findings: Cognitive Skills Displayed in Online Conferencing (see Hara, Bonk, & Angeli, 2000)
• Cognitive (in 81.7% of units coded): more inferences & judgments than elementary clarifications and in-depth clarifications.
• Metacognitive (in 56% of units): more reflections on experience & self-awareness; some planning, evaluation, & regulation, & self-questioning.
• Social (in 26.7% of units coded): social cues decreased as the semester progressed; messages gradually became less formal and more embedded within statements.
[Figure: percent of coded units by cognitive skill: Elementary Clarification, In-Depth Clarification, Inferencing, Judgment, Strategies, Application]

Surface vs. Deep Posts (Henri, 1992)
Surface Processing
• making judgments without justification,
• stating that one shares ideas or opinions already stated,
• repeating what has been said,
• asking irrelevant questions
• i.e., fragmented, narrow, and somewhat trite.
In-depth Processing
• linked facts and ideas,
• offered new elements of information,
• discussed advantages and disadvantages of a situation,
• made judgments that were supported by examples and/or justification
• i.e., more integrated, weighty, and refreshing.

Level of Cognitive Processing: All Posts
Surface: 33%; Deep: 55%; Both: 12%

Critical Thinking (Newman, Johnson, Webb & Cochrane, 1997)
• Used Garrison's five-stage critical thinking model.
• Critical thinking appeared in both CMC and FTF environments.
• Depth of critical thinking was higher in the CMC environment: more likely to bring in outside information, link ideas and offer interpretations, and generate important ideas and solutions.
• FTF settings were better for generating new ideas and creatively exploring problems.

Unjustified Statements (US)
24. Author: Katherine, Date: Apr.
27, 3:12 AM, 1998
"I agree with you that technology is definitely taking a large part in the classroom and will more so in the future…"
25. Author: Jason, Date: Apr. 28, 1:47 PM, 1998
"I feel technology will never over take the role of the teacher...I feel however, this is just help us teachers..."
26. Author: Daniel, Date: Apr. 30, 0:11 AM, 1998
"I believe that the role of the teacher is being changed by computers, but the computer will never totally replace the teacher... I believe that the computers will eventually make teaching easier for us and that most of the children's work will be done on computers. But I believe that there…"

Indicators for the Quality of Students' Dialogue (Angeli, Valanides, & Bonk, in review)
ID / Indicator / Examples
1. Social acknowledgement/Sharing/Feedback: "Hello, good to hear from you…"; "I agree, good point, great idea"
2. Unsupported statements (advice): "I think you should try this…"; "This is what I would do…"
3. Questioning to clarify and extend dialogue: "Could you give us more info?"; "…explain what you mean by…?"
4. Critical thinking, reasoned judgment: "I disagree with X, because in class we discussed…"; "I see the following disadvantages to this approach…"

Social Construction of Knowledge (Gunawardena, Lowe, & Anderson, 1997)
Five-Stage Model:
1. Share ideas
2. Discovery of idea inconsistencies
3. Negotiate meaning/areas of agreement
4. Test and modify
5. Phrase agreements
In a global debate, discussion was very task driven.
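Coding schemes like Henri's dimensions or Gunawardena's phases ultimately reduce to frequency counts over coded message units, like the percentages reported above. A minimal sketch, with hypothetical codes standing in for real transcript data:

```python
# Minimal sketch: tallying coded message units into category percentages,
# the arithmetic behind content-analysis results like Henri's dimensions.
# The coded_units list is hypothetical illustrative data, not study results.
from collections import Counter

coded_units = ["cognitive", "social", "cognitive", "metacognitive",
               "cognitive", "social", "metacognitive", "cognitive"]

counts = Counter(coded_units)
total = len(coded_units)
percentages = {code: round(100 * n / total, 1) for code, n in counts.items()}
print(percentages)  # → {'cognitive': 50.0, 'social': 25.0, 'metacognitive': 25.0}
```

Real studies add a reliability step (two coders, agreement statistics) before tallying, but the reported figures are percentages of coded units exactly like these.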
Dialogue remained at Phase I: sharing information.

Problem-Based Learning (Distance Ed, 23(1), 2002)
• Practical learning issues generated more interactions and higher levels of interaction than theoretical issues.
• Communities of learners need to negotiate identity and knowledge, and need milestones (chat session agreements, producing reports, sharing stories, and new work patterns).
• Group development: (1) negotiate the problem and timetable, (2) divide work into subgroups, and (3) produce drafts of products.

Social Constructivism and Learning Communities Online (SCALCO) Scale (Bonk & Wisher, 2000)
___ 1. The topics discussed online had real-world relevance.
___ 2. The online environment encouraged me to question ideas and perspectives.
___ 3. I received useful feedback and mentoring from others.
___ 4. There was a sense of membership in the learning here.
___ 5. Instructors provided useful advice and feedback online.
___ 6. I had some personal control over course activities and discussion.

Problems and Solutions (Bonk, Wisher, & Lee, in press)
1. Tasks Overwhelm → Train and be clear
2. Confused on Web → Structure time/dates due
3. Too Nice Due to Limited Shared History → Develop roles and controversies
4. Lack Justification → Train to back up claims
5. Hard not to preach → Students take lead role
6. Too much data → Use Email Pals
7. Communities not easy to form → Embed Informal/Social

Benefits and Implications (Bonk, Wisher, & Lee, in press)
1. Shy open up online → Use async conferencing
2. Minimal off task → Create social tasks
3. Delayed collab more rich than real time → Use async for debates; sync for help, office hours
4. Students can generate lots of info → Structure generation and force reflection/comment
5. Minimal disruptions → Foster debates/critique
6. Extensive e-advice → Find experts or practitioners
7. Excited to publish → Ask permission

More Implications
• Include variety: tasks, topics, participants, accomplishments, etc.
• Make interaction extend beyond class
• Have learners be teachers
• Find multiple ways to succeed
• Add personalization and choice
• Provide clarity and easy navigation

Ten Ways Online Ed Matches or Surpasses FTF
(Mark Kassop, The Technology Source, Michigan Virtual University, May/June 2003)
1. Student-centered learning
2. Writing intensity
3. Highly interactive discussions
4. Geared for lifelong learning
5. Enriched course materials
6. On-demand interaction and support
7. Immediate feedback
8. Flexibility
9. An intimate community of learners
10. Faculty development and rejuvenation

My Evaluation Plan…
Considerations in Evaluation Plan (from student to organization):
1. Student, 2. Instructor, 3. Training, 4. Task, 5. Tech Tool, 6. Course, 7. Program, 8. University or Organization

Other Evaluation Plans
1. Quality on the Line: Benchmarks for Success in Internet-Based Distance Ed (e.g., the teaching/learning process) (Blackboard & NEA, 2000) http://www.ihep.com/Pubs/PDF/Quality.pdf
2. The Pedagogical Rating of Online Courses, Syllabus Magazine, Jan. 2002, Nishikant Sonwalkar

Best Practices: Who are some of the key scholars and promoters…???

Three Most Vital Skills
The Online Teacher, TAFE, Guy Kemshal-Bell (April, 2001)
• Ability to engage the learner (30)
• Ability to motivate online learners (23)
• Ability to build relationships (19)
• Technical ability (18)
• Having a positive attitude (14)
• Adapt to individual needs (12)
• Innovation or creativity (11)

Let's brainstorm comments (words or short phrases) that reflect your overall attitudes and feelings toward online teaching…

Feelings Toward Online Teaching
The Online Teacher, TAFE, Guy Kemshal-Bell (April, 2001)
(Note: 94 practitioners surveyed.)
• Exciting (30)
• Challenging (24)
• Time consuming (22)
• Demanding (18)
• Technical issues (16); Flexibility (16)
• Potential (15)
• Better options (14); Frustrating (14)
• Collab (11); Communication (11); Fun (11)

Changing Role of the Teacher
The Online Teacher, TAFE, Guy Kemshal-Bell (April, 2001)
• From oracle to guide and resource provider
• From provider of answers to expert questioner
• From solitary teacher to member of a team
• From total control of the teaching environment to sharing as a fellow student
• From provider of content to designer of learning experiences

Dennen's Research on Nine Online Courses
(sociology, history, communications, writing, library science, technology, counseling)
Poor Instructors:
• Little or no feedback given
• Always authoritative
• Kept narrow focus of what was relevant
• Created tangential discussions
• Only used "ultimate" deadlines
Good Instructors:
• Provided regular qualitative/quantitative feedback
• Participated as a peer
• Allowed perspective sharing
• Tied discussion to grades and other assessments
• Used incremental deadlines

Common Instructor Complaints
a) Students don't participate
b) Students all participate at the last minute
c) Students post messages but don't converse
d) Facilitation takes too much time
e) If the instructor must be absent, the discussion dies off
f) Students are confused

Reasons why...
Students don't participate:
• Because it isn't required
• Because they don't know what is expected
Students all participate at the last minute:
• Because that is what was required
• Because they don't want to be the first
• Because the instructor posts at the last minute

Research on Instructors Online
• If teacher-centered, students explore, engage, and interact less (Peck & Laycock, 1992).
• Informal, exploratory conversation fosters risk-taking & knowledge sharing (Weedman, 1999).
• The job varies: planning, interaction, administration, teaching (McIsaac, Blocher, Mahes, & Vrasidas, 1999).

Study of Four Classes (Bonk, Kirkley, Hara, & Dennen, 2001)
• Technical: train, early tasks, be flexible, orientation task
• Managerial: initial meeting, FAQs, detailed syllabus, calendar, post administrivia, assign e-mail pals, gradebooks, email updates
• Pedagogical: peer feedback, debates, PBL, cases, structured controversy, field reflections, portfolios, teams, inquiry
• Social: café, humor, interactivity, profiles, foreign guests, digital pics, conversations, guests

But there is a Problem… How Bad Is It?
"Some frustrated Blackboard users who say the company is too slow in responding to technical problems with its course-management software have formed an independent users' group to help one another and to press the company to improve." (Jeffrey Young, Nov. 2, 2001, Chronicle of Higher Ed)

Must Online Learning be Boring?
What Motivates Adult Learners to Participate?
Motivational Terms? See Johnmarshall Reeve (1996). Motivating Others: Nurturing inner motivational resources. Boston: Allyn & Bacon. (UW-Milwaukee)
1. Tone/Climate: Psych Safety, Comfort, Belonging
2. Feedback: Responsive, Supports, Encouragement
3. Engagement: Effort, Involvement, Excitement
4. Meaningfulness: Interesting, Relevant, Authentic
5. Choice: Flexibility, Opportunities, Autonomy
6. Variety: Novelty, Intrigue, Unknowns
7. Curiosity: Fun, Fantasy, Control
8. Tension: Challenge, Dissonance, Controversy
9. Interactive: Collaborative, Team-Based, Community
10. Goal Driven: Product-Based, Success, Ownership

1. Tone/Climate: Ice Breakers
A. Eight Nouns Activity:
   1. Introduce yourself using 8 nouns
   2. Explain why you chose each noun
   3. Comment on 1-2 peer postings
B. Coffee House Expectations:
   1. Have everyone post 2-3 course expectations
   2. Instructor summarizes and comments on how they might be met (or students make public commitments about how they will fit the course into busy schedules!)

2. Feedback: Requiring Peer Feedback
Alternatives:
A. Require a minimum # of peer comments and give guidance (e.g., what they should do…)
B. Peer Feedback Through Templates: give templates for completing peer evaluations
C. Hold e-papers contest(s)

3. Engagement: Electronic Voting and Polling
1. Ask students to vote on an issue before class (anonymously, or sent directly to the instructor)
2. Instructor pulls out the minority point of view
3. Discuss with the majority point of view
4. Re-poll students after class
(Option B: Delphi or Timed Disclosure Technique: anonymous input until a due date, then post results and reconsider until consensus; Rick Kulp, IBM, 1999)

4. Meaningfulness: A. Professional/E-mail Interviews
1. Field Definition Activity: Have students interview (via email, if necessary) someone working in the field of study and share their results
2. As a class, pool interview results and develop a group description of what it means to be a professional in the field

4. Meaningfulness: B. Field Observation Reflections
1. Instructor provides a reflection prompt for job-related or field observations
2. Students reflect on their job setting or observe in the field
3. Record notes on the Web and reflect on concepts from the chapter
4. Respond to peers
5. Instructor summarizes posts

5. Choice: A. Discussion: Starter-Wrapper
1. Starter reads ahead and starts the discussion, others participate, and the wrapper summarizes what was discussed.
2. Starter-wrapper with roles: same as #1 but include roles for debate (optimist, pessimist, devil's advocate).
Alternative: Facilitator-Starter-Wrapper. Instead of starting the discussion, the student acts as moderator or questioner to push student thinking and give feedback.

5. Choice: B. Discussion: Multiple Topics
• Generate multiple discussion prompts and ask students to participate in 2 out of 3
• Provide different discussion "tracks" (much like conference tracks) for students with different interests to choose among
• List possible topics and have students vote (students sign up to lead different weeks)
• Have students list and vote

6. Variety: Just-In-Time-Teaching
Gregor Novak, IUPUI Physics Professor (teaches teamwork, collaboration, and effective communication):
1. Lectures are built around student answers to short quizzes that have an electronic due date just hours before class.
2. The instructor reads and summarizes responses before class, weaves them into the discussion, and changes the lecture as appropriate.

7. Curiosity: A. Electronic Seance
• Students read books by famous dead people
• Convene when dark (synchronous or asynchronous)
• Present a present-day problem for them to solve
• Participate from within those characters (e.g., read direct quotes from books or articles)
• Invite expert guests from other campuses
• Keep the chat open for a set time period
• Debrief

7. Curiosity: B. Electronic Guests & Mentoring
1. Find an article or topic that is controversial
2. Invite a person associated with that article (perhaps based on student suggestions)
3. Hold a real-time chat
4. Pose questions
5. Discuss and debrief (i.e., did anyone change their mind?)
(Alternatives: email interviews with experts; assignments with expert reviews)

8. Tension: Role Play
A. Role Play Personalities
• List possible roles or personalities (e.g., coach, optimist, devil's advocate, etc.)
• Sign up for a different role every week (or 5-6 key roles)
• Perform within roles and refer to the different personalities
B. Assume Persona of Scholar
• Enroll famous people in your course
• Students assume the voice of that person for one or more sessions
• Enter a debate topic, respond to a debate topic, or respond to reading reflections

9. Interactive: A. Critical/Constructive Friends, Email Pals, Web Buddies
1. Assign a critical friend (perhaps based on commonalities).
2. Post weekly updates of projects, send reminders of due dates, help where needed.
3. Provide criticism to the peer (i.e., what is strong and weak, what's missing, what hits the mark) as well as suggestions for strengthening.
4. Reflect on the experience.

9. Interactive: B. Symposia, Press Conference, or Panel of Experts
1. Find a topic during the semester that piques interest
2. Find students who tend to be more controversial
3. Invite them to a panel discussion on a topic or theme
4. Have them prepare statements
5. Invite questions from the audience (rest of class)
6. Assign panelists to start
(Alternative: Have a series of press conferences at the end of small-group projects, one for each group)

10. Goal Driven: Gallery Tours
• Assign a topic or project (e.g., team or class white paper, business plan, study guide, glossary, journal, model exam answers)
• Students post to the Web
• Experts review and rate
• Try to combine projects

Motivational Top Ten
1. Tone/Climate/Ice Breakers: 8 nouns, expectations
2. Feedback: require feedback, templates, e-papers contests
3. Engagement: polling, voting, timed disclosure
4. Meaningfulness: e-mail interviews, field observations
5. Choice: starter-wrapper, multiple tracks/topics
6. Variety: just-in-time-teaching
7. Curiosity: seances, electronic guests/mentors
8. Tension: role play, assume persona of a scholar
9. Interactive: e-pals, symposia, expert panels
10. Goal Driven: gallery tours

Pick one you can use…??? (circle one)

Some Final Advice… Or Maybe Some Questions???