The Journal of Interactive Online Learning www.ncolr.org

Volume 2, Number 1, Summer 2003

ISSN: 1541-4914

Research-Supported Best Practices for Developing Online Learning

Dennis W. Sunal and Cynthia S. Sunal

University of Alabama

Michael R. Odell

University of Idaho

Cheryl A. Sundberg

Louisiana Tech University

Abstract

An analysis was conducted of the body of research studies on best practice in asynchronous or synchronous online instruction in higher education. The analysis used specific research design criteria and categorized studies by the type of theory used, such as creation of typologies. Many studies had flaws in research design and generally were pre-experimental case studies. Those studies most closely meeting the research criteria indicate online learning is viable and identify potential best practices in four categories: student behaviors, faculty-student interactions, technology support, and learning environment.

New technology and skills have created an opportunity in which students can instantly download even the most obscure information, previously difficult or tedious to obtain. These developments have combined to fuel an information explosion that continues to change higher education and the ways learning can be supported. To exploit the possibilities for instruction engendered by the Internet, universities are searching for ways to provide meaningful instruction using the Internet as a delivery tool. University faculty are often being pressured to teach online courses.

Online instruction is a relatively new phenomenon for most faculty. Few consider themselves expert at online instruction. Questions asked include: What are the differences between face-to-face and online teaching? What new concerns are involved in online student learning? The few faculty who are experienced in teaching online courses generally may not have shared their expertise or read the research literature. A few articles provide “interview” data from experts in online teaching (Smith, Ferguson, & Caris, 2001). However, the statements are anecdotal and are not evidence based. Concerns currently being explored by researchers are student achievement and attitudes, course design and delivery, course evaluation, and instructor behaviors and attitudes. Evaluations of these factors utilizing well-developed research plans are relatively scarce at this time, however. The purpose of this research project was to provide an overview of a sample of research that has been conducted on online learning, as well as to describe the extent and type of research currently being carried out. The goals of the online research project involve (a) identifying research design criteria to be applied to evaluate the quality of research on online learning environments, (b) sampling research examples to determine if they meet the criteria for research, (c) determining the results of the review of research on online learning, and (d) determining whether valid guidelines can be developed for best practice in designing and learning online.

Procedure

The research began with the selection of articles published during the period in which online learning became a common part of the use and development of the Internet. The purpose of the selection was to show what is being accepted, categorized, and published in the field. The selection was followed by the development of a research design and evaluation procedure with the specific purpose of developing valid guidelines for best practices in online learning. Four questions were used to guide the analysis of the online learning research studies reviewed. They are:

1. What scientific research design criteria and types of theory can be applied to evaluate the quality of research on online learning environments?

2. How do the online learning research examples that have been sampled meet the criteria for scientific research?

3. What can we learn from the results of the review of research on online learning based on the type of research performed?

4. What are implications from the results of research on online learning environments to date and what can be done to develop valid guidelines for best practice in designing and learning online?

The professional literature sources searched were selected as representative of the core of literature databases in the area of online learning and classroom teaching and learning. The sources included entries and journals listed through the Educational Resources Information Center (ERIC), Dissertation Abstracts, abstracts from the annual and regional conferences of the American Educational Research Association (AERA) as well as various other professional association conferences such as the National Council for the Social Studies (NCSS/CUFA) and the National Association for Research in Science Teaching (NARST), Internet online journals specializing in research such as Asynchronous Learning Networks (ALN), and journals not accessed through ERIC that publish research on online teaching and learning, such as Syllabus.

Nominations of citations to consider were related to several selection criteria. The first required citations to report results of studies published between 1997 and 2002. Few pre-1997 studies of online learning warranted inclusion; the technology has been changing so quickly that recent studies use significantly evolved hardware and software, possibly yielding different conclusions. The second criterion required that the study relate to courses or modules involving online learning providing education via the Internet. The third criterion required the citation to relate to evidence of learning outcomes of the students involved. Particular priority was given to finding studies that compared online and classroom learning outcomes. An effort was made to eliminate citations that focused on “how to” and “show and tell.” For example, we did not use a citation by Miltenoff (2000) that discussed techniques for enhancing online instruction; the article cites no evidence for the arguments made for using the techniques. Finally, only citations relating to courses and modules for higher or adult education were sampled.

Multiple searches were performed on each database. Searching on the queries related to “online and classroom learning” and “online courses and classroom teaching” yielded more than 400 citations. Adding the term “research” reduced the yield to 155 citations. Full articles were obtained for these 155 citations. They were read and reviewed. Final selection of the studies to be included in this review of research applied the four criteria cited above. The final list included 28 studies.

The final selected “online research” studies were reviewed and evaluated using the following measures and criteria, discussed below: type of study and comparison basis, number of subjects, data collection instruments, outcome variables measured, courses studied, results and conclusions based on evidence obtained from student outcomes, and threats to study validity. An overview of the selected studies based on the evaluation categories is presented in Table 1. A summary of these studies is presented in the discussion below. It is also useful to consider the types of online courses involved in the research studies based on the level of use of the Internet and the number of hours, or percentage of time, spent on the Internet as instruction. The levels of use and time were categorized as traditional face-to-face instruction, Web presence, Web-enhanced, Web-centric, and Web course (Boettcher, 1999). A description of the levels of use and time and the comparison basis is found in Tables 2 and 3.

To respond to the first research question, design criteria and types of theory that can be appropriately applied to evaluate the quality of research on online learning environments were identified. The literature identifies specific design criteria for quantitative and qualitative research (Brause & Mayher, 1991). Among these criteria are random sampling, appropriate group size, controlled variables, reliability, internal and external validity, significance, and objectivity, all of which are addressed in the research design (Yore, 2003; Campbell & Stanley, 1963; see Table 4). For qualitative research, sample criteria are a clear statement of the research purpose and question, the research design, the sample population, the observer researcher, and the data analysis procedures, dependability, credibility, believability, and confirmability (Yore, 2003; Stake, 1995; Denzin & Lincoln, 1994; Wolcott, 1990). Types of theory used to evaluate the research studies can be summarized with a hierarchy of categories leading from nonpredictive typologies and predictions using correlations to explanations that involve various levels of understanding of cause-and-effect relationships (Reynolds, 1971; Snow, 1973; see Table 5). The least useful type of theory involves the creation of typologies and can best be identified as descriptive research. The next level of theory involves looking for correlation between events. The third level involves cause and effect; it requires an experimental design with control and experimental groups and may involve application of inferential statistics. The fourth and fifth levels of theory, sense of understanding and control, were not found in the research reviewed.

Discussion of Results

The following sections provide a brief narrative review of the 28 selected studies grouped in four categories: online comparisons with traditional face-to-face learning, student and teacher attitudes, student and teacher perceptions, and reports of online course design.

Online Comparisons with Traditional Face-to-Face Learning

Few researchers have directly compared an online course with its traditional course counterpart. Davies and Mendenhall (1998) compared Web-based and classroom instructional delivery methods for the health education/physical education course, Fitness and Lifestyle Management, at Brigham Young University. In this study, the researchers evaluated the effectiveness of online instruction in terms of student achievement, attitudes, and course preference. Students in the traditional course were required to attend 18 lectures and take 6 exams. Students in the Web-based course were required to access course materials online. These materials included lecture notes, video clips, animations, sound clips, graphics, and Web links.

The Web site also included lesson tests called “Speedback” exercises that provided students with immediate feedback concerning their performance.

The researchers invited 129 students who were enrolled in the traditional course to participate in the study. Thirty-nine students volunteered to participate in the Web-based lessons. These Web-based participants were randomly divided into three rotating groups. While two groups studied the content from a specific module using only the Web-based lesson material, one of the three groups remained in the classroom to receive the classroom version of instruction. The groups rotated for each of the three exams to counter any inequity in the groups that might have been unintentionally caused by the process of their selection. All students were required to take three identical exams.

Mean scores and standard deviations were calculated for each group, and a measure of statistical significance was used to calculate the effect size. The results indicated that the students participating in the traditional class scored slightly higher than those studying the material online. The effect sizes for the exams, respectively, were .26, .01, and .17. The practical significance of these differences was low to moderate. From this information, the researchers concluded that either method for providing instruction could satisfactorily be used as an instructional delivery method for the course in question. However, the researchers concluded that neither the classroom nor the online instruction fully met the learning needs of all students. They advocated the continued existence of classroom courses with equivalent online courses.
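As a point of reference for the effect sizes reported above, the sketch below illustrates one common way such an effect size (Cohen's d) is computed from group means and standard deviations; the numbers are placeholders rather than values from the Davies and Mendenhall study, and the specific effect-size index they used is not stated here.

```python
# Illustrative only: Cohen's d computed from group summary statistics.
# The inputs below are hypothetical, not data reported by Davies and
# Mendenhall (1998).
import math

def cohens_d(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Standardized difference between two group means, using the
    pooled standard deviation."""
    pooled_sd = math.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                          / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical exam summaries for a classroom group and an online group.
d = cohens_d(mean_a=84.0, sd_a=8.0, n_a=90, mean_b=82.0, sd_b=8.0, n_b=39)
print(round(d, 2))  # 0.25
```

By the conventional benchmarks for d (roughly .2 small, .5 moderate, .8 large), values of .26, .01, and .17 are consistent with the authors' characterization of low to moderate practical significance.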

In another study, Wegner, Holloway, and Crader (1997) also compared the online version of a course with its traditional course format by studying the effectiveness of an online, problem-based, upper level curriculum course at Southwest Missouri State University. The course content dealt with administration, development, and evaluation of school curricula. During the development of the course, instructors generated research-based problems using expected student outcomes from the course syllabus. The instructors then developed structures, or embedded parameters, to give some direction to students without compromising the problem-based delivery model. Finally, the instructors designed methods for monitoring student progress and for providing Socratic questioning. These methods included the use of e-mail, fax, videoconferencing, and telephones. The experimental group differed in two ways: (a) pedagogy was a problem-based learning strategy and (b) the medium of delivery was online distance learning for part of the course.

In order to evaluate the effectiveness of the course, the researchers compared final exam scores from students in the traditional and Web-based versions of the course (the number of participants in the study was not given). The traditional students averaged 92.5, while the online students averaged 90.4, a difference the researchers concluded was insignificant. The researchers indicated that the online, problem-based delivery method is both workable and productive, yet they stated that this type of delivery method might not be applicable to all types of classes.

In a later study, Wegner, Holloway, and Garton (1999) looked at the impact online courses have on student learning as measured by student performance and perceptions of student learning in a curriculum design and evaluation course. Achievement was measured using a content exam with objective, short answer, and essay questions at the end of the course.

Perceptions of student learning were measured using surveys and student evaluation instruments.


Students studied were master’s degree students in a certification program for principals. The students, mostly rural classroom teachers, were allowed to self-select either the traditional classroom (n = 17) or the “experimental problem based learning” online classroom (n = 14). Little description was given of the traditional control group treatment. The experimental group differed in two ways: (a) pedagogy was a problem-based learning strategy and (b) the medium of delivery was online distance learning for part of the course. The control group received face-to-face training in technology at the beginning of the course and utilized classroom discussion. Both courses were taught by the same instructor and based on a problem-based model.

There was no significant difference in the exam scores of the control group (M = 92.6) and the experimental group (M = 91.6), t = -0.7473, p = .2305. There was no statistically significant difference in student ratings on course evaluations. Qualitative data indicated, however, that the students in the online course had a more positive feeling about the course and both groups enjoyed the problem-based nature of the course.
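For readers less familiar with how such comparisons are computed, the sketch below shows an independent-samples t test of the kind summarized above. The score arrays are hypothetical stand-ins (the raw scores were not published), and SciPy is assumed to be available.

```python
# Illustrative only: independent-samples t test on two groups' exam scores.
# The arrays are hypothetical, not data from Wegner, Holloway, and Garton (1999).
from scipy import stats

control_scores = [95, 90, 94, 92, 93, 89, 96, 91, 93, 92]       # classroom group
experimental_scores = [91, 93, 90, 89, 94, 92, 88, 95, 90, 92]  # online group

t_statistic, p_value = stats.ttest_ind(control_scores, experimental_scores)
print(f"t = {t_statistic:.3f}, p = {p_value:.3f}")
# A p value above the chosen alpha (commonly .05) is read as "no statistically
# significant difference between the group means."
```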

Jewett (1998) analyzed the benefits and costs of restructuring two undergraduate courses to better utilize instructional technology. Professors from two classes, one in philosophy and one in microbiology, participated in workshops designed to help them restructure their courses to an online format.

The restructured philosophy course required students to discuss and provide written responses to philosophical writings, movies, and short stories. The students responded through live computer-mediated communication, synchronous chat sessions, asynchronous discussion postings, homework assignments submitted over the Web, or essays written and submitted in hard copy to the instructor. Approximately 210 students participated in the Web-based version of the course. The restructured course still required students to meet as a group once per week for a regular lecture. The purpose of this lecture was to provide students with an overview of the study topic of the week. In the traditional class, students read the text, attended class frequently, took extensive notes, and wrote a few long and polished essays during the semester. Comparatively, in the restructured course, students also read the text, but engaged in more writing (online), and received frequent challenges and critiques of their thoughts and assertions (via chat and other online accessories). Students in the restructured course received significantly less information directly from the professor.

In order to evaluate the effectiveness of the revised course, the researchers designed a study to compare performance on an essay assignment written by students in both the traditional and restructured courses. In the 10th week of the semester, students in both sections of the course were assigned to write a persuasive essay on the same topic. Forty essays were randomly selected from each section. The essays were scored and graded by three graduate students from another university. The graders used a 10-point scale to rate each essay on 16 criteria deemed important in terms of philosophical discourse. A multivariate analysis of variance was performed on the essay scores. This analysis showed significant differences, so a univariate analysis of variance was conducted for each item. This analysis revealed no statistically significant difference between the two groups on seven criteria: (a) the essay contains a thesis statement or conclusion, (b) the thesis or conclusion is clearly stated, (c) the reasons or arguments are relevant to the thesis or conclusion, (d) no reasons are anachronistic, (e) the reasons or arguments are cogent, (f) the essay’s organization contributes to its readability, and (g) the essay contains no extraneous material. It also revealed that the Web-based section scored better on eight criteria: (a) the essay went beyond stating the obvious, (b) the organization of the essay contributed to the force of the argument, (c) no reasons or premises were missing, (d) the author understood the philosophical views described in the paper, (e) the essay contained reasons or arguments to support the thesis or conclusion, (f) the essay did not contain factual errors, (g) the essay exhibited sensitivity to counter-arguments or counter-examples, and (h) there were no errors in language or usage that obscured the meaning of the sentences. The traditional section scored better on one criterion, whether the essay was succinctly written.

In regard to learning outcomes in the Web-based philosophy section, Jewett (1998) concluded that the evidence concerning learning outcomes was ambiguous. While students in the Web-based course scored higher than those in the traditional course on 8 of 16 criteria deemed important to philosophy, there were no significant differences between the 2 groups on 7 of the 16 criteria. However, results indicated that the increased writing and interaction required in the Web-based course should improve student performance in these areas.

Jewett (1998) also examined a restructured microbiology course where an elaborate Web site was created that allowed students to access class announcements, frequently asked questions, course documents, color PowerPoint slides of lecture notes, printable black-and-white slides, practice exams, live communication links to the professor, direct e-mail to faculty, asynchronous discussions with other students, and links to Web sites related to microbiology. The 185 students enrolled in the redesigned course were still required to attend class for the same amount of time as students enrolled in the traditional course. In order to evaluate the success of the redesigned course, student course grade averages were compared to those from students in past years. Jewett found no significant differences, but the types of statistical analysis used were not identified. The researcher discussed several reasons why comparing grades is not a good basis for comparing a Web-based course to a traditional course. Included in these reasons were that (a) no standard tests were administered to compensate for differences in teaching and testing methods; (b) although students may learn material presented online in less time, that does not necessarily mean that they will use any time saved to raise their grade in a course; and (c) test grades may not reflect changes in the types of learning that students may gain from the technological aspect of a course. Jewett concluded that the students found the Web-enhanced course useful to their studies.

Ryan (2000) evaluated the effectiveness of a Web-based construction equipment and methods class. Several universities implemented this course. The Web-based class was administered from a custom-built Web site that was designed, implemented, and maintained specifically to replace traditional class methodology. The online method required students to check the site on a regular basis. The site was formatted in a manner very similar to that of a textbook, and predominantly consisted of online class notes. Class participants were required to use e-mail, chatting, and desktop videoconferencing for communication between each other and the instructor.

In order to evaluate the effectiveness of the online course, the instructors employed several methods. Students in the lecture course (26 students from Oklahoma State University) and the Web course (14 students from Oklahoma State, Texas A & M, Texas Tech, Auburn, University of Wisconsin–Stout, and California Polytechnic–San Luis Obispo) were compared to determine participants’ similarity in background, knowledge, and attitude. The Test of Logical Thinking and the Test of Construction Attitude were used to investigate these factors. A statistical analysis of these tests (no exact type of analysis was provided) showed no significant differences between the groups. Moreover, the final grades for the Web-based and lecture participants were not statistically different for either course offering. Researchers concluded that online courses are potentially useful delivery methods for instruction.


The effect of online teaching on faculty workload and student achievement was studied by Gaud (1999). He examined how much time and effort instructors spent developing and managing Web-based courses, and the effect on students’ achievement. To collect data, two Web-based courses were taught in spring 1999. Each course included units, reading assignments, and activities which students were required to complete and return to the instructor for grading. Students were also tested using multiple-choice exams.

Some of the factors influencing the achievement of students taking Web-based courses were examined by Shih, Ingebritsen, Pleasants, Flickinger and Brown (1998). The factors were learning strategies, patterns of learning, and learning styles. It was considered crucial to understand how learning by diverse students would be influenced when using a new technology as a means of delivering instruction. Identifying students’ learning styles could allow teachers to have a deeper appreciation of how learners perceive and process information in different ways.

Earlier studies suggested that knowledge of learning strategies could help instructors understand which techniques and skills students use in accomplishing a learning task. It may be difficult for students to change their learning styles, but their learning strategies might be controlled and changed through teaching. Thus, it is essential to examine important learning factors that affect student achievement in Web-based instruction. Following this perspective, Shih et al. attempted to determine how students’ demographic characteristics (e.g., whether or not university students, gender, previous experience in the subject areas, study and work hours per week) impacted their learning styles. They also examined how students’ learning strategies, patterns of learning, and achievement differed in relation to their learning styles and the relationship among these three factors and selected variables.

Shih et al. (1998) conducted their study on 99 students taking two Web-based non-major introductory courses offered in the fall of 1997. For instrument design, a 5-point Likert-type learning strategy scale and a patterns of learning scale were developed. In addition, demographic questions and a learning style test were administered. Students’ grades were taken into consideration for the comparison study. A panel of three faculty members and three graduate students established internal validity for the questionnaire. For reliability, the Likert-type scales were pilot-tested with 38 students taking a different undergraduate Web-based course.

The results of the t tests indicated no significant differences between field-independent and field-dependent students’ overall achievement scores by learning style. Also, no significant differences were found between student achievement, demographic characteristics, and learning styles. However, a strong correlation was observed between student achievement and learning strategies (r = .50). A hierarchical regression analysis conducted to identify the amount of variance in students’ standardized achievement scores indicated that learning strategies was the only significant variable that explained the difference in students’ achievement scores. Consequently, student learning styles, patterns of learning, and characteristics did not impact Web-based learning achievement. The researchers concluded that diverse students with distinct learning styles could benefit equally well from Web-based instruction. They recommended that educators provide students with learning opportunities facilitating the use of a variety of learning strategies to ensure a deep understanding and mastery of the subject matter. Also, they urged teachers to encourage learners to use more communication techniques (e-mail, discussion, and chat forums) for more interactive learning in Web-based instruction. Finally, they suggested that further study be done to determine the relationship between learning strategies and patterns of learning and the degree of students’ achievement in online teaching.
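To make the form of the analysis reported by Shih et al. concrete, the sketch below pairs a Pearson correlation with a simple hierarchical regression in which a learning-strategies score is entered after a background variable; the variable names and data frame are hypothetical illustrations (not the study's instruments), and pandas, NumPy, and statsmodels are assumed to be available.

```python
# Illustrative only: Pearson correlation plus a two-block hierarchical regression.
# Variable names and data are hypothetical, not from Shih et al. (1998).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 99
df = pd.DataFrame({
    "achievement": rng.normal(80, 10, n),       # course achievement score
    "strategy_score": rng.normal(3.5, 0.5, n),  # learning-strategies scale
    "work_hours": rng.normal(15, 5, n),         # background/demographic variable
})

# Bivariate correlation between achievement and learning strategies.
r = df["achievement"].corr(df["strategy_score"])

# Hierarchical regression: block 1 = background only, block 2 adds strategies.
block1 = smf.ols("achievement ~ work_hours", data=df).fit()
block2 = smf.ols("achievement ~ work_hours + strategy_score", data=df).fit()
r2_change = block2.rsquared - block1.rsquared  # variance attributable to strategies

print(f"r = {r:.2f}, R-squared change = {r2_change:.3f}")
```

With random placeholder data these statistics will hover near zero; the point is the structure of the analysis, in which the increment in explained variance when the learning-strategies variable is added indicates how much of the achievement difference that variable accounts for.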


A case study of an experimental course on information design conducted via the Internet, World Wide Web, e-mail, and listserv was developed by McLellan (1997). The course served 25 graduate students across the western United States and Canada. One goal in designing the course was to create a virtual community that supported learning. Thus, a model by Michael Schrange (as cited in McLellan, 1997) was used to design the course. In this model, collaboration in the form of shared creation is vital, but problem solving, creating value, and achieving mastery of materials are also key themes. The author noted that students enrolled in classes on the Internet could become virtual communities of learners. The goal, according to Schrange, is to create a “shared experience” instead of “an experience that is shared” (p. 11). A shared experience is participatory, in contrast with an experience that is shared, such as a speech or lecture in which the learner maintains a passive role.

McLellan (1997) emphasized that setting deadlines is important to motivate students and maintain collective communication. However, deadlines should be flexible to accommodate technical problems. In addition to online conversations, the instructor and other students can use the telephone, which allows the conveyance of more empathy and personality than e-mail. To create more openness in student communication, students should publish brief biographies and photographs online so that other members of the class can put a face to a name. An advantage to using online discussions in addition to face-to-face sessions is that some students will communicate more online because of the anonymity of the medium and because some students are more expressive in written communiqués than in oral discourse. McLellan concluded that online learning provides a powerful and flexible medium for education, but emphasized that it is important to design online courses so that students dynamically interact with the course content and each other to form a virtual community of learners.

An introductory education course on American education offered online was examined by Teeter (1997). The study compared the traditional course with an online course. The only differentiating factor was the use of the technology for delivery of the instruction and for classroom conversations. While no data were presented, the author observed the following benefits of the online course: increased student motivation, an informal nature (one student said that he liked that he did not need to get dressed to come to class), exposure to extended resources, and improved quality in the online discussions and written assignments. On the other hand, there were drawbacks in teaching via the online course. While the students had access to the hardware and software needed, they did not have technical support to help them troubleshoot problems. Also, the online course was more time-consuming for the instructor and the students. In addition, the instructor noted that student focus shifted from primarily centering on course content to emphasizing the use of the Internet. However, the author did not find any significant differences in the performance of the online group when compared to the traditional group.

Student and Instructor Attitudes

Much of the online research literature concentrates on student and instructor attitudes towards online learning. Mende (1998) correlated student attitudes and participation with student achievement in a six-week non-credit course on HTML authoring. The students’ assignment was to produce a personal Web page. The course text was a specially designed Web site and the means of communication was a mailing list. Fourteen students registered for the course. During the course, 194 class postings were logged and later examined. While the levels of participation by each student varied greatly, only 7 of the 14 students produced a course product. Mende found there was a correlation between a student’s level of participation in the class postings and the likelihood that a student would produce the final project.

In order to elicit feedback, Mende conducted an evaluation at the end of the six-week class. The survey contained open-ended questions to allow the students to freely express themselves. In one student’s responses, the word “flexibility” occurred numerous times. On the other hand, students who were non-participators indicated that they needed some form of structure. The author hypothesized that the results obtained from this limited study indicated that online education is not for everyone; online learning requires more teacher commitment than a traditional classroom; the same teacher characteristics that are effective in a traditional classroom will be effective in a virtual classroom; many learners require more structure than is easily supported in a virtual classroom; social interaction is an important part of a learning experience for many people; and not all learners, even mature adults, are self-directed (p. 8).

In addition to assessing student performance, Davies and Mendenhall (1998) also required students to fill out a survey designed to gauge student attitudes concerning the effectiveness of the online instruction method. In this survey, 52% of the online students indicated they believed that the online instruction did not prepare them very well for the exam.

Seventy-six percent of the online students reported minor to major problems with the online technology. When asked how they would like to complete the course, 57% of the students indicated they preferred the traditional classroom experience. The researchers noted that students’ reasons for choosing one method over the other had little to do with the quality of instruction. An individual’s personal situation, social issues, and the perceived benefits of a specific delivery method were often the determining factors in a student’s preference. The survey results also indicated that the online students felt the need for more direction and course structure and that several technological problems needed to be corrected.

In addition to examining student achievement, Wegner et al. (1997) evaluated student attitudes towards online learning. The online students were also given exit interviews and required to fill out surveys. These instruments were used to gather data on student perceptions concerning the effectiveness of the online, problem-based delivery method. Generally, the online course received high marks from students. Students stated that some of the advantages of the online method included the authentic nature of the course, the novelty of the online approach, convenient access to online research materials, the online collaboration required for the course, the focus on a product, and the development of practical research skills. Students also noted some negative aspects of the course, including a lack of prerequisite technology skills among some students, a lack of professorial direction, frequent technological difficulties, and a lack of structure provided by a more traditional delivery method.

Student attitudes as well as student achievement were also examined by Jewett (1998). In his study, the Web-based philosophy students completed a survey designed to gather information concerning their reaction to the technology, their opinions about its usefulness in the course, and their learning styles. The survey indicated that student behavior was affected to the extent that over half increased their use of the Internet and that over three quarters better appreciated discussion, peer review, and informal writing as a component of the learning process. The advantages most frequently mentioned were better access to course materials and convenience and flexibility in studying. The disadvantages most frequently mentioned were technical problems and the impersonal nature of technology. Microbiology student survey responses indicated that the course Web site improved student access to course materials and that, in terms of both activities and attitude changes, students learned to use the computer and the Internet as tools to assist them in their studies.

Like Jewett, Ryan (2000), in addition to studying student achievement, surveyed students in both the online and traditional course to gather students’ perceptions of instructor performance, course content, and self-evaluation. After the completion of the course, students in both groups were administered the Non-Studio Course Evaluation. Instructors collected additional information from the online class through the administration of a participant survey.

Other information was collected from instructor observations and online participant communications. The researchers took a qualitative analytical approach to the post-assessment data, but did not describe their analytical procedures.

According to the researchers, the data showed that students felt the major advantages of the online course were the opportunity it provided students to work at their own pace and the ability to communicate anonymously with other students. However, the Web-based students indicated that the greatest weakness of the online course was fewer opportunities to interact with the instructor and other students than in the traditional lecture format. The Web-based students mentioned that interaction via e-mail, chat, and so forth, demands greater efficiency than open oral discussion and is more limited. The researchers concluded that there is great potential for online courses.

Ory, Bullock, and Burnaska (1997) used student surveys, course monitoring, and group interviews to study gender differences in the use of and attitudes about online learning. The student surveys were given to a total of 2,151 students in 6 different colleges within the university (17 courses in the fall and 23 courses in the spring) across two semesters. Four courses were monitored to observe how often students and faculty participated in the conferencing activities, using weekly tallies of student, teaching assistant, and professor postings. Twenty-eight end-of-course group interviews (of students and teaching assistants) were conducted, but those results are not included in this study. The participants were approximately evenly split between males and females, primarily freshmen and sophomores, and predominantly Caucasian.

They found that the frequency of online learning use is different for males and females.

Females used computers more often for conferencing than did male students but only statistically significantly more in the fall 1995 semester, t = 3.43, p = .006. Male students accessed the Web more than female students, with statistically significant differences also only in the fall, t = -2.49, p = .01. The number of postings by males and females was approximately equal.

They found that slightly more female than male students used conferencing for social and instructional interaction with others. In general, both males and females found the computers easy to use, but males reported statistically significantly less difficulty than females in the fall semester, t = 2.20, p = .03. There was no significant difference between male and female responses overall, with both genders reporting positive experiences as well as a desire to take another online course.

Females also reported statistically significantly greater gains than males when asked about increases in computer familiarity. Females reported that the use of online learning had made a positive impact on their computer familiarity on both the fall 1995 (t = -3.65, p = .003) and spring 1996 (t = -3.915, p = .0001) surveys.

Shaw and Pieter (2000) studied student attitudes, experiences, and performance in a Web-assisted online nutrition education course (nutritional medicine module) at the undergraduate level. The course, based on the traditional lecture course, was supplemented by alternating virtual and teacher-led tutorials. Students were able to access the lecture synopses, past papers, and virtual tutorials only during computer room open hours, and these were limited to users at local computer terminals. The virtual tutorials asked questions based on the covered lecture material. Video footage of surgical techniques and conference proceedings was also used to supplement traditional lectures and slides. The module also used e-mail and newsgroups to facilitate discussion between students and staff. This access was not limited to local computer terminals and was truly “anytime, anyplace.”

The students (N = 51), of whom 44 passed, 5 failed, and 2 did not complete the module, were all full time, 20 to 45 years old, and in their second year of higher education. They answered a Likert-type questionnaire (the IT Appreciation questionnaire); due to outliers, only 46 students’ questionnaires were left for statistical analysis.

The majority of the students (68%) felt that the proportion of Computer and Instructional Technology (C&IT) to traditional delivery techniques used in the module was appropriate. Only 24% expressed a preference for C&IT-supported learning over traditional methods. The majority of the students acknowledged that the use of online methods in this module made the material easier to understand (52%), made the lecturer more accessible (66%), and enabled them to take a more active role in the learning process (55%).

Students who failed the module agreed statistically significantly more with the statement, “I preferred the use of online technology used in this module to more traditional means of delivery,” than those who passed, F(1, 44) = 6.007, p = .018. Those who passed the module disagreed more with the statement “I think newsgroup contributions should be assessed” than those who failed, F(1, 44) = 6.007, p = .018.

Twenty-seven percent of the students failed to use the newsgroup at all and 90% of students used it less than five times. However, over 77% of the students found reading the contributions of others worthwhile. The most commonly cited reasons for not contributing were technical and personal, both dealing with the inaccessibility of the computers.

There was no interaction between age and newsgroup contribution, but there were main effects for age and for newsgroup contribution. Younger students earned lower grades, and those who contributed to the newsgroup had higher grades than those who did not.

Student and Teacher Perceptions

Student and teacher perceptions of an online undergraduate educational psychology course conducted over two semesters were reported by De Simone, Schmid, and Lou (2000). The course began with an ethnically and linguistically diverse group of 60 students, but concluded with 30 students. The researchers did not identify reasons why the number of students dropped. The pilot program, in its second year, used the metaphor of a ship to create an online learning environment familiar to students. The teacher served as the captain and the students served as the crew. The students were randomly assigned to groups (3 to 5 students) throughout the course to cultivate a learning community. To guide group discussions, a “Big Idea” was presented every third or fourth week for the students to discuss. The “Big Idea” was a key principle such as “we are active learners.” Based upon their discussions, the students wrote several “Synthesis and Reflection” papers over the duration of the course. The teacher made available to the students resources that included text, videos, student guides, and FirstClass (computer-mediated communication software). In addition, students took individual exams and were assigned a research project or research paper, which could be done by an individual or a group.

To collect data, researchers administered a Teacher and Course Evaluation Questionnaire (TCEQ) to the students. They also examined student comments, student online interactions, and students’ overall performance. De Simone et al. (2000) determined that the use of the ship as a metaphor provided students with a meaningful coping strategy. For example, when students encountered new or difficult material, they would warn other students to watch out for “icebergs ahead.” Analysis of the online discussions revealed that they could be classified based upon emerging themes. Thus, student-student discussions were classified into one of several categories: personal, management of learning environment, administrative, and cognitive. Student-teacher discussions included the following categories: personal, administrative, managing learning, and cognitive.

In response to the TCEQ, which was based upon a 5-point Likert-type scale, 71.3% of the respondents either agreed or strongly agreed that the course had developed their “ability to respect and care for others” (p. 3). Over 91% of the students agreed or strongly agreed that they were able to get help from their instructor online. All of the respondents replied that they either agreed or strongly agreed with the statement, “This course helped me to relate my own ideas and experiences with ideas and concepts in the course” (p. 3). Overall, the students and instructor had a successful learning experience in the course. The researchers concluded that the use of metaphors, structured activities, and group collaboration provided a framework to support meaningful online learning.

Graduate students enrolled in a distance learning graduate-level course were studied by Saunders, Malm, Malone, Nay, Oliver, and Thompson (1997). The purpose of the study was to describe student attitudes towards interactions with participants online. Class participants included 13 in-studio students, 24 off-site students, 2 graduate students, and a professor. The course included two-way audio and one-way video signals for in-class instruction and interaction, and an online “Class Page” for out-of-class interaction. The Class Page provided the following: an e-mail “Post Office” which contained photographs of, and links to, all class members, instructors, and technical personnel; a discussion area for out-of-class discussion; an area called “Project Reports and Motivational Ideas” where students posted assignments for the mutual benefit of class members; a resource section which linked the Class Page to hundreds of relevant Web sites; a “Notice Board” for posting weekly announcements and class assignments; a section for class handouts; a “Class Questions” section; and a biweekly class participant survey. The students were required to participate via the Web site.

The qualitative case study used the following naturalistic research techniques: a focus group session with students at midterm, telephone interviews, and surveys. In all, eight survey instruments were administered, including a pre-course survey of computer skills, a Likert scale survey regarding the Class Page, and an overall course evaluation. The students indicated that the Class Page provided them with a sense of involvement in the course. Students noted concerns about the use of and access to a computer, use of the Internet, and the online interpersonal communication. For instance, the students completed a Self-Assessment of Computer Skills survey at the first class meeting. Only 3 of the 37 students enrolled in the course reported that they were comfortable using the Internet. The instructor noted that the most prominent factor impacting class performance was the lack of active, quality participation. Saunders et al. (1997) recommended that a moderator serve to provide leadership in the interactive areas of the course, but concluded the online instruction was effective in developing a learning community.


The effectiveness of a Web-based program of instruction was investigated by Schlough and Bhuripanyo (1998). The authors examined Web-based course instruction. The study included 22 subjects (respondents = 58%) completing an 8-week Web-based course on task analysis. The participants were technical college instructors and students completing undergraduate degrees. Graduate students enrolled in the course analyzed the surveys.

Researchers collected data using an Asymetrix ToolBook II Instructor software evaluation form rated on a 5-point Likert scale.

When respondents were asked if they would prefer to take the class again via the Internet or in the classroom, 77% chose the classroom. On the survey, the Web-based course received higher scores for content organization, relevance, accuracy, effective illustrations, attractive design, and effective graphics. Lower scores (none below the midpoint of the scale) were given for clarity of content, understandability, navigation, control of program, student expectations, and appropriate instructional format. Similarly, the graduate students enrolled in the course cited time and space convenience, individual learning, asynchronous work, access to classmates’ opinions, and content clarity as strengths of the course. The graduate students reported that weaknesses of the Web-based course included learner self-discipline, inappropriate delivery for all learner styles, problems with group contact, and impersonal delivery. The authors concluded that the course offered a viable alternative for delivering teacher education instruction to students at remote locations, yet acknowledged problems with Internet delivery as compared to classroom instruction.

A virtual classroom was assessed, using factors associated with effective instruction, by Powers, Davis, and Torrence (1998). Participants were 20 education graduate students enrolled in three Web-based instructional technology courses. Thirteen participants (65%) returned surveys after the Web-based courses had been completed. The College and University Classroom Environment Inventory (CUCEI), rating student cohesiveness, individualization, innovation, involvement, personalization, satisfaction, and task orientation, was used to collect data. The authors statistically analyzed the frequency of unanswered survey items and mean scores on the survey.

Problems reported in the survey dealt with unanswered questions relating to differences in interpretations of where the classroom was actually located (e.g., is it the computer lab, home, the Web site, or nonexistent for Web-based courses?) and the arrangement of physical objects as requested in survey items. Other items were problematic because of the asynchronous nature of the Web-based courses. High satisfaction items on the survey included course satisfaction, individualization, and student cohesiveness. Low satisfaction items included personalization and task orientation. Based upon the results, the authors concluded that the CUCEI has promise for assessing virtual classroom environments.

Electronic discourse was analyzed by Hegngi (1998) to understand online teaching and learning. Three undergraduate and eight graduate students enrolled in a technology and diversity course, along with two team teachers and two teaching assistants, participated in the study. Data sources included field notes, course design artifacts, Web chat and e-mail archives, interviews, and student Web pages. NUD*IST software (Qualitative Solutions and Research) was used to analyze student participation in electronic discourse. Results demonstrated that online teaching and learning (a) facilitates writing and discussion assignments, (b) reshapes the roles of teachers and some students, (c) involves course design and development that is very labor intensive, (d) allows new types of interactions to emerge in online classrooms, (e) results in synchronous interactions (instructor initiated) encouraging greater participation than does asynchronous communication, and (f) causes more open discussions and a greater number of perspectives when asynchronous interactions are used. The author concluded that the design, development, and implementation of Web-based instruction involving fast-changing technology requires vigorous and persistent research.

The development of flexible university learning environments and the design of educationally sound learning environments were examined by McNaught, Kenny, Kennedy, and Lord (1999). Participants in the study consisted of 40 staff and 717 university students during a semester covering many subject areas. Initial feedback from the staff and students was collected and coded to identify comment categories. The researchers collected data via weekly questionnaires, help-desk data, e-mail, student surveys, and reports from focus groups. The comments from these data were assigned to categories and marked as either positive or negative in nature. The data were then tallied and ranked in table format.

Researchers identified 39 comment categories under headings that included access, toolset competence, support, student issues, educational outcomes, and communication. Issues receiving the highest number of negative comments included network reliability, faculty support, registration, staff workload, and course information. Issues receiving the highest number of positive comments included educational strategies considered, staff enthusiasm, help-desk, multi-functionality, and effects on flexibility. The authors of this study concluded that designing an online environment requires new educational design skills. They also found that students must have fast and easy access to online environments and that, after students have access and training, they appreciate the flexibility of online environments. Local technology support for staff and students was found to be important.

Interactions in an online graduate course, Telecommunications for Instruction, involving eight students and one instructor were studied by Vrasidas and McIssac (1999). During the last three weeks of the course, the researchers interviewed participants regarding their perceptions of interaction during the course. The researchers also analyzed participants’ online chats. Several factors influencing interactions in the online course were identified. First, structural elements influenced interaction; for instance, required aspects of the course led to more interactions. Second, small class size appeared to reduce interactions, particularly asynchronous online discussions. Third, students indicated that more feedback from the instructor would have increased interaction, particularly in reference to the online discussions. Finally, students who had not experienced computer-mediated communication were uncomfortable participating in the online synchronous discussions. However, these students reported that they were more comfortable with the asynchronous discussions because they were able to reflect upon their ideas and responses.

The researchers concluded that students needed training in Internet customs, such as “emoticons” (for example, :) ) and netiquette. They recommended that, at the beginning of an online course, a survey of the students be conducted to ascertain which students will require more technical training and support.

Reports of Online Course Design

A number of reports have been published describing the purpose, content, and structure of online courses. To a great extent these reports include little discussion of evaluation instruments or findings of the course. Primarily descriptive in nature, they explain the creation of one or more courses, problems encountered, means of addressing problems, and plans for continuing revision of courses and development of new courses.

The use of animation and video in developing and delivering a lecture course, Developmental Biology, online was reported by Stith (2000), who described the developmental stages and details of running the online course. Stith considered how often, and for how long, students were involved in online course components such as a chat room, the number of articles on the course Web site bulletin board read by students, and correlations with final student grades. The number of hits did not correlate with the final grade achieved, while the number of articles read did correlate. Furthermore, Stith found that students needed help in order to understand how to use the Web site, which contained several components including e-mail, a bulletin board, chat groups, announcements, lecture notes, streaming videos, an ability to monitor one’s own grades, and timed quizzes. Several practical suggestions were offered for those who wish to implement online coursework.

The launching of online graduate coursework by two institutions was described by Kroder, Suess, and Sachs (1998). They briefly discussed three methods of collecting comments and suggestions as their evaluation scheme: a traditional institutional course evaluation, a focus group, and a detailed post-class survey administered to both faculty and students. They reported that 80% of students would take another online course, although 80% also reported the course took more time than a classroom course. Both students and faculty found the online course to be more time-intensive. Results from the study culminated in three major suggestions: implement streaming audio to reduce reading text online, provide more complete instructions for use of the WebBoard, and require more interaction among students. Among several issues cited by students were the vital role of support from management, a well-constructed development plan, the need for technology and support staff to be in place, faculty willingness to innovate and to ensure learning occurs, and student understanding of both the benefits and the challenges faced.

Summary of Results from the Selected Research Literature

The research results were summarized to answer the second research question: How do the online learning research examples that have been sampled meet the criteria for scientific research?

In reviewing the research literature selected for online learning, no study was found to meet the scientific research design criteria developed for question 2. The research designs were flawed in some way, making the conclusions derived from the studies open to debate. Of the 28 studies reviewed, 18 had serious research design flaws. Using these studies, it is not possible to determine whether learning online results in more positive attitudes and higher achievement than does learning in the traditional classroom context. These studies do not meet a level-three type of theory. They were, at best, pre-experimental case studies of Web design courses. They used the development of typologies or the determination of correlations among events as the main form of analysis.

The remaining 10 studies compared traditional with Web-based courses. Eight of these studies were quantitative and two were qualitative. In each case, the study had serious threats to the validity, generalizability, or dependability of the research outcomes. Threats in the quantitative studies included lack of random sampling, small sample size, lack of control of alternative variables, treatment periods that were too short, and lack of assessment of instruction or instructor quality. (See Table 4 for a summary of threats to research.) The two qualitative studies were designed for hypothesis generation rather than hypothesis testing. Because of a lack of triangulation, the small number of subjects, and the few specific situations studied, results cannot be considered dependable beyond these specific studies' settings.

In both types of studies there was a failure to describe adequately the pedagogical methods used. There was little description of the prior knowledge students brought into the learning situation, such as knowledge of the content taught or their ability and experience with the technology used. Therefore, it is not possible to determine whether improvement in achievement was the result of online instruction or of other uncontrolled factors. No valid or dependable generalizations about best practices can be made that would inform other designers, teachers, and students involved in online learning.

Conclusions

Summarizing the results of the analysis of the research reported, we now address the third research question: What can we learn from the results of the review of research on online learning environments, based on the type of research performed?

The studies selected generally concluded that online instruction was as good as, or better than, traditional instruction, or that online learning is a viable strategy. These conclusions were not warranted because of serious flaws in research design or threats to research validity. Using the selected studies, one can say only that online learning is neither better nor worse than face-to-face classroom instruction. No conclusion can be drawn that warrants action one way or the other.

However, online learning research in the selected studies can inform us in regard to variables and best practices that may form the basis of future research. The research base to date should be seen as hypothesis-generating research and as a basis for beginning hypothesis-testing research. These previous studies point to suspected and potential variables that should be considered in creating research instruments and research designs, and in completing the research process to determine cause and effect, leading to higher levels of theory and more useful knowledge about online learning.

In examining the history of research on educational media over the past 75 years, one can find studies focusing on correspondence courses, film, radio, pictures in textbooks, programmed instruction, television, and computer-based instruction. There have been thousands of studies and hundreds of reviews of research on these "new" and technology-based media. In all of this research literature, the instructional media studied have not been found to supplant effective pedagogical practices used in the classroom (Joy & Garcia, 2000). As a result, it has been argued that the research supports the idea that media, per se, do not influence learning (Gagne, Briggs, & Wager, 1992). Instead, learning is caused by the pedagogical methods embedded in the media presentation. Clark (1994, p. 23) cites Gavriel Salomon as defining pedagogical methods as ways of shaping information that activate, supplant, or compensate for the cognitive processes necessary for achievement or motivation. As a counter-argument, Robert Kozma (1994) states that media and methods are both part of the instructional design; media must be structured to give us new methods that take appropriate advantage of the media's capabilities. The online learning research base has taken a direction similar to that of previous instructional media research.


At the present time, the lack of adequately designed research does not allow us to rate online instruction as better than, or even the same as, traditional forms of classroom instruction.

However, the results of the studies reviewed provide useful information and variables to be explored in using the media with effective, research-supported classroom pedagogical practices.

Some of these pedagogical practices include adequate and timely feedback, student-student and student-teacher interaction, and a safe and supportive climate for learning. Thus, the results of the online studies demonstrate that potentially viable online courses require careful attention to structuring, timing, and the use of specific pedagogical practices, and they suggest possible approaches to implementing these practices successfully.

Implications and Future Research

These possible approaches lead to the fourth research question: What are the implications of the results of research on online learning environments to date, and what can be done to develop valid best practice guidelines for designing and learning online?

The studies reviewed were hypothesis generating. They provide tentative support for beginning the exploration, through hypothesis testing, of particular practices in an online learning format. These practices are described in Table 6 and are grouped into four categories: (a) student behaviors, (b) faculty-student interactions, (c) technology support, and (d) learning environment. These best practices are associated with predicted positive learning attitudes and higher achievement. Where these best practices can be related to, or derived from, research-supported pedagogical practice, they may become effective practices in online course design. These best practices need to be addressed in an action research process to determine their effectiveness in specific courses, and they need to become the subject of studies that meet scientific research design criteria.

An example of category 1, student behavior, is the encouragement of student use of a variety of communication techniques, such as discussion boards, journals, and e-mail, as a means to enhance interactive learning. Also, students should personalize themselves by publishing online biographies and photographs to enable other class members to visualize them (McLellan, 1997). An example of category 2, faculty-student interaction, is personalizing communications with students and providing continuous, frequent, supportive feedback throughout the online course (Jewett, 1998). Category 3, technology support, includes, for example, insuring a low level of technical difficulty in accessing the Web site or communicating with others. Technology support has been associated with reducing student dropout rates and poor attitudes toward learning in online courses (Wegner et al., 1997). An example of category 4, learning environment, is social interaction through group collaboration, which has a strongly positive association with student attitude and achievement (De Simone et al., 2000).

The next level of development is the construction of a checklist of best practices for developing and using online materials. This checklist, although not yet validated, could form the basis for evaluation of courses and modules used in online learning environments. An example of one such checklist, the authors' Checklist for Online Interactive Learning (COIL), is shown in Table 7. Validation of COIL is presently being conducted using three techniques. The first involves establishing expert validity using the existing research literature on classroom and online learning. Due to the paucity of good online research studies, only classroom research results have so far been effective in establishing validity; some examples are provided in the third column of Table 6. The second technique involves determining empirical validity through experimental studies designed to measure the effect of single or related groups of best practices on student learning outcomes. Several such studies are in progress at the University of Alabama and the University of Idaho, partners in the National Center for Online Learning Research (NCOLR), to determine the effect of specific best practices in undergraduate and graduate courses in their education and engineering colleges. The third technique involves determining empirical validity by using the COIL instrument to rate several online learning modules or courses with similar learning outcome objectives; comparing the COIL ratings with student learning outcomes can provide validity for the instrument as an evaluation tool.

Summary

Many studies on online learning have focused on Internet and Web features, designing

Web-based instruction, and student attitudes and participation in Web-based courses. A number of reports describe the purpose, content, and structure of online courses, as well as the pedagogical aspects of online course delivery. However, despite the large volume of articles published to date, most of those sampled include little discussion of research design, data collection instruments, or findings based on data gathered from course outcomes. Primarily descriptive in nature, the literature explains the creation of one or more courses, problems encountered, means of addressing problems, and plans for continuing revision of courses and development of new courses. There is very little information on how such efforts are evaluated: what instrumentation is used, how valid and reliable evaluation efforts are found to be, and whether the instructional theory underlying the coursework has been articulated and used to focus course development and evaluation. The descriptive literature is practice based, with the primary purpose of helping others in the "how-to-do-it" stage of online coursework development. While improvements and further revision of coursework are often suggested, the evidence for these suggestions is not clearly delineated. The descriptive literature needs to be supported with evidence from research that is based in a theory of instruction and designed to yield data that can be considered by others. The inclusion of a well-structured research plan as a component of online course development appears to be much needed.

Acknowledgements

This research was developed through a grant from the National Center for Online

Learning Research (NCOLR) at the University of Alabama, Tuscaloosa, AL, funded through a sub-grant from the University of Idaho, Moscow, ID, and the U.S. Department of Education.


References

Blum, K. D. (1999, May). Gender differences in asynchronous learning in higher education: Learning styles, participation barriers and communication patterns. Journal of Asynchronous Learning Networks, 3(1), Article 5. Retrieved July 12, 2003, from http://www.aln.org/publications/jaln/v3n1/pdf/v3n1_blum.pdf
Boettcher, J. (1999). Another look at the tower of WWWebble. Syllabus, 13(3), 7.
Bonk, C., & Cummings, J. (1998). A dozen recommendations for placing the student at the center of Web-based learning. Educational Media International, 35(2), 82-89.
Brause, R., & Mayher, J. (1991). Search and research: What the inquiring teacher needs to know. New York: Falmer Press.
Brophy, J. E., & Good, T. L. (1986). Teacher-child dyadic interaction: A manual for coding classroom behavior. Austin, TX: University of Texas.
Campbell, D., & Stanley, J. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.
Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(2), 21-29.
Cornell, R. (1999). Paradigms for the new millennium: How professors will certainly change! Educational Media International, 36(2), 89-96.
Davies, R., & Mendenhall, R. (1998). Evaluation comparison of online and classroom instruction for HEPE 129—Fitness and lifestyle management course. Salt Lake City, UT: Brigham Young University. (ERIC No. ED427752)
De Simone, C., Schmid, R., & Lou, Y. (2000, April). A distance education course: A voyage using computer-mediated communication to support meaningful learning. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.
Denzin, N., & Lincoln, Y. (Eds.). (1994). Handbook of qualitative research. Thousand Oaks, CA: Sage.
Gagne, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design. Orlando, FL: Harcourt Brace Jovanovich.
Gaud, W. S. (1999). Assessing the impact of Web courses. Syllabus, 13(4), 49-50.
Gillani, B. (1998). The web as a delivery medium to enhance instruction. Educational Media International, 35(3), 197-202.
Gray, S. (1998). Maintaining academic integrity in web-based instruction. Educational Media International, 35(3), 186-188.
Hegngi, Y. N. (1998, April). Changing roles, changing technologies: The design, development, implementation, and evaluation of a media technology and diversity on-line course. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.
Jewett, F. (1998). Course restructuring and the instructional development initiative at Virginia Polytechnic Institute and State University: A benefit cost study. Blacksburg, VA: Virginia Polytechnic Institute and State University. (ERIC Document Reproduction Service No. ED423802)
Jiang, M., & Ting, E. (1999, October). A study of students' perceived learning in a web-based online environment. Paper presented at the WebNet World Conference, Honolulu, HI.
Johnson, D. W., & Johnson, R. T. (1978). Cooperative, competitive, and individualistic learning. Journal of Research and Development in Education, 12(1), 3-15.
Joy, E. H., II, & Garcia, F. E. (2000, June). Measuring learning effectiveness: A new look at no-significant-difference findings. Journal of Asynchronous Learning Networks, 4(1), Article 4. Retrieved July 13, 2003, from http://www.aln.org/publications/jaln/v4n1/pdf/v4n1_joygarcia.pdf
Kavale, K. A., & Glass, G. V. (1982, December). The efficacy of special education interventions and practices: A compendium of meta-analysis findings. Focus on Exceptional Children, 15(4), 1-14.
Kozma, R. B. (1994). A reply: Media and methods. Educational Technology Research and Development, 42(3), 1-14.
Kroder, S. L., Suess, J., & Sachs, D. (1998). Lessons in launching Web-based graduate courses. T.H.E. Journal, 25(10), 66-69.
Levin, B. (1999). Analysis of the content and purpose of four different kinds of electronic communications among preservice teachers. Journal of Research on Computing in Education, 32(1), 139-156.
Loomis, K. D. (2000, June). Learning styles and asynchronous learning: Comparing the LASSI model to class performance. Journal of Asynchronous Learning Networks, 4(1), Article 3. Retrieved July 12, 2003, from http://www.aln.org/publications/jaln/v4n1/pdf/v4n1_loomis.pdf
McLellan, H. (1997). Information design via the Internet. Saratoga Springs, NY: McLellan Wyatt Digital. (ERIC No. ED408942)
McNaught, C., Kenny, J., Kennedy, P., & Lord, R. (1999, October). Developing and evaluating a university-wide online distributed learning system: The experience at RMIT University. Educational Technology & Society, 2(4). Retrieved July 12, 2003, from http://ifets.ieee.org/periodical/vol_4_99/mcnaught.pdf
Mende, R. (1998, May). Hypotheses for the virtual classroom: A case study. Paper presented at the IT97 Conference, St. Catharines, Ontario.
Miller, S. M., & Miller, K. L. (1999, July). Using instructional theory to facilitate communication in Web-based courses. Educational Technology & Society, 2(3). Retrieved July 12, 2003, from http://ifets.ieee.org/periodical/vol_3_99/miller.pdf
Miltenoff, P. (2000). Integrating streaming media into web-based learning: A modular approach. Syllabus, 14(1).
Newman, D. R., Johnson, C., Cochrane, C., & Webb, B. (1996). An experiment in group learning technology: Evaluating critical thinking in face-to-face and computer-supported seminars. Interpersonal Computing and Technology, 4(1), 57-74.
Ory, J. C., Bullock, C., & Burnaska, K. (1997, March). Gender similarity in the use of and attitudes about ALN in a university setting. Journal of Asynchronous Learning Networks, 1(1), Article 6. Retrieved July 12, 2003, from http://www.aln.org/publications/jaln/v1n1/pdf/v1n1_ory.pdf
Pincas, A. (1998, October). Successful online course design: Virtual frameworks for discourse construction. Educational Technology & Society, 1(1). Retrieved July 12, 2003, from http://ifets.ieee.org/periodical/vol_1_98/pincas.html
Powers, S. M., Davis, M., & Torrence, E. (1998, October). Assessing the classroom environment of the virtual classroom. Paper presented at the annual meeting of the Mid-Western Educational Research Association, Chicago, IL.
Prawat, R. S. (1989). Teaching for understanding: Three key attributes. Teaching and Teacher Education, 5(4), 315-328.
Reynolds, P. (1971). A primer in theory construction. Indianapolis: Bobbs-Merrill Educational.
Rosenshine, B., & Meister, C. (1996). Teaching students to generate questions: A review of the intervention studies. Review of Educational Research, 66(2), 181-222.
Rossman, M. H. (1999, November). Successful online teaching using an asynchronous learner discussion forum. Journal of Asynchronous Learning Networks, 3(2), Article 6. Retrieved July 12, 2003, from http://www.aln.org/publications/jaln/v3n2/pdf/v3n2_rossman.pdf
Ryan, R. (2000). Lecture and online construction equipment and methods classes. T.H.E. Journal, 27(3), 78-83.
Saunders, N., Malm, L., Malone, B., Nay, F., Oliver, B., & Thompson, J., Jr. (1997). Student perspectives: Responses to Internet opportunities in a distance education environment. Muncie, IN: Ball State University. (ERIC No. ED413816)
Schlough, S., & Bhuripanyo, S. (1998, March). The development and evaluation of the Internet delivery of the course "Task Analysis." Paper presented at the annual conference of SITE 98: Society for Information Technology & Teacher Education, Washington, DC.
Shaw, G. P., & Pieter, W. (2000, June). The use of asynchronous learning networks in nutrition education: Student attitude, experiences, and performance. Journal of Asynchronous Learning Networks, 4(1), Article 5. Retrieved July 12, 2003, from http://www.aln.org/publications/jaln/v4n1/pdf/v4n1_shawpieter.pdf
Shih, C. C., Ingebritsen, T., Pleasants, J., Flickinger, K., & Brown, G. (1998). Learning strategies and other factors influencing achievement via Web courses. Ames, IA: Iowa State University. (ERIC No. ED422876)
Slavin, R. E. (1995). A model of effective instruction. Educational Forum, 59(2), 166-176.
Smith, G., Ferguson, D., & Caris, M. (2001). Teaching college courses online. T.H.E. Journal, 28(9), 18-22, 24, 26.
Smith, S., & Benscoter, A. (1999). Implementing an Internet tutorial for web-based courses. American Journal of Distance Education, 13(2), 74-80.
Snow, R. (1973). Theory construction for research on teaching. In R. Travers (Ed.), Second handbook of research on teaching (pp. 77-112). Chicago: Rand McNally.
Stake, R. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Stevens, R. J., & Rosenshine, B. (1980, April). Student ratings of college instruction: Some low inference variables. Paper presented at the annual meeting of the American Educational Research Association, Boston, MA.
Stith, B. (2000). Web-enhanced lecture course scores big with students and faculty. T.H.E. Journal, 27(8), 21-28.
Teeter, T. (1997). Teaching on the Internet: Meeting the challenges of electronic learning. Little Rock, AR: University of Arkansas at Little Rock. (ERIC No. ED418957)
Vrasidas, C., & McIssac, M. (1999). Factors influencing interaction in an online course. American Journal of Distance Education, 13(3), 22-35.
Wegerif, R. (1998, March). The social dimension of asynchronous learning networks. Journal of Asynchronous Learning Networks, 2(1), Article 3. Retrieved July 12, 2003, from http://www.aln.org/publications/jaln/v2n1/pdf/v2n1_wegerif.pdf
Wegner, S. B., Holloway, K. C., & Garton, E. M. (1999, November). The effects of Internet-based instruction on student learning. Journal of Asynchronous Learning Networks, 3(2), Article 7. Retrieved July 12, 2003, from http://www.aln.org/publications/jaln/v3n2/pdf/v3n2_wegner.pdf
Wegner, S., Holloway, K., & Kroder, A. (1997). Utilizing a problem-based approach on the World Wide Web. Springfield, MO: Southwest Missouri State University. (ERIC No. ED414262)
Wolcott, H. (1990). Writing up qualitative research. Newbury Park, CA: Sage.
Yore, L. (2003). Quality science and mathematics research: Considerations of argument, evidence, and generalizability. School Science and Mathematics, 103(1), 1-9.

Table 1

Review of Research on Online Learning

Each entry lists the source, type of study and subjects, measurement instruments, outcome variables, course content, and overall results and conclusions.

Experimental

Ryan (2000). Comparison: web-enhanced vs. web course; 40 students. Measures: Test of Logical Thinking (TOLT), Test of Constructive Attitude (TOCA), course evaluation, exams, instructor observations, communication artifacts. Outcomes: perceptions, achievement, attitude. Course: architectural construction equipment and methods (undergraduate). Results: No significant differences. Conclusions: Teaching techniques and student learning styles were equally successful for both approaches; greater effort and effectiveness in communication is required of both teacher and students in online courses; assessment format problems were noted.

Wegner, Holloway, & Garton (1999). Comparison: traditional vs. web courses; 31 students. Measures: exam, Likert-scale satisfaction survey. Outcomes: student performance, student perception. Course: curriculum design and evaluation (graduate). Results: No significant difference between test scores of control and experimental groups; experimental students had a more positive feeling about the course. Conclusion: An Internet-based instructional model makes it easier to offer a problem-based, learner-centered course. It may not be for all students but is as effective as the traditional classroom.

Blum (1999). Interpretive qualitative case study of traditional vs. web course (asynchronous); 2 classes. Measures: communication artifacts. Outcomes: patterns of communication, learning styles, barriers to communication, gender differences. Course: graduate courses. Results: Females communicate more formally, strive to help, use age to justify, and use a more personal manner. Conclusion: Gender differences between male and female students are in some respects similar to, and in others different from, those in the traditional class. Dispositional, situational, and institutional barriers exist for female students in online courses.

Davies & Mendenhall (1998). Comparison: traditional face-to-face lessons vs. web lessons; 104 students. Measures: exams, attitude survey, preference survey. Outcomes: achievement, attitude, course preference. Course: health education (undergraduate). Results: Achievement was not significantly different between groups. Though comparable, some students' learning needs were better served by the online delivery method; other factors affecting student attitude toward online instruction were technology problems and social issues. Conclusion: Both traditional and online delivery were satisfactory in meeting student needs.

Wegerif (1998). Qualitative case study of a problem-based web course; 1 class. Measures: communication artifacts. Outcomes: social, linguistic, and reflective variables. Course: teaching and learning online (graduate). Results: Social difficulties result from unequal access to communication, conflict or domination in discussion, moving from structured to open activities controlled by students, and scheduling time for reflective discussion. Conclusion: Social factors influence course ownership, course design, the role of moderators, interaction styles, and features of the technology used.

Jewett (1998). Comparison: traditional vs. web-centric courses; 210 students. Measures: essay test. Outcomes: writing argumentation. Course: philosophy (undergraduate). Results: Web-centric students performed better in organization, depth, clarity of views expressed, and argumentation; the overall assessment of essays was not significantly different. Conclusion: Online instruction is a viable alternative.

Jewett (1998). Comparison: traditional vs. web-assisted courses; 185 students. Measures: exams. Outcomes: achievement. Course: introductory microbiology (undergraduate). Results: No significant differences; the online course improved student access to resources. Conclusion: Results were ambiguous. However, the course web site improves student access to resources and improves writing skills and attitudes toward writing and discussion.

Wegner, Holloway, & Kroder (1997). Comparison: traditional vs. web courses; 1 class. Measures: exam. Outcomes: achievement. Course: evaluation of school curriculum (graduate). Results: No significant difference between the problem-based delivery method online and the traditional delivery method. Conclusion: The problem-based method is workable but not applicable to all classes.

Teeter (1997). Comparison: traditional vs. web course; 2 classes. Measures: exams, student survey. Outcomes: achievement, perceptions. Course: American education (undergraduate) (www.ualr.edu/~coedept). Results: No significant difference in performance between groups. Conclusion: Benefits for online students included student motivation, the informal nature of the course, expanded resources, and quality of discussion; problems included lack of technical support and the extreme amount of time required of the instructor. Not appropriate for all students.

Newman, Johnson, Cochrane, & Webb (1996). Comparison: traditional face-to-face seminar vs. web seminar; 36 students. Measures: communication artifacts. Outcomes: critical thinking types and depth. Course: seminar in business practices (graduate). Results: Depth of critical thinking was greater in computer conferencing than in face-to-face seminars. Conclusion: Critical thinking is a viable outcome using communication technology. Technology helps the more structured, less creative parts of the critical thinking process; however, it is weakest in problem exploration.

Pre-Experimental & Pilot Studies

De Simone, Schmid, & Lou (2000). Pre-experimental single-semester case study of a web-centric course; 30 students. Measures: Teacher and Course Evaluation Questionnaire (TCEQ). Outcomes: student attitudes/perceptions. Course: educational psychology (undergraduate). Results: Most students agreed that the online course was a successful format. Conclusion: A web-centric course should use structured activities and group collaboration.

Loomis (2000). Pre-experimental single-semester case study of a web course; 23 students. Measures: LASSI survey, exams. Outcomes: attitudes, achievement. Course: research methods (graduate). Results: Attitude toward the value and relevance of college courses predicted students dropping the online course; student inability to focus attention and poor study skills predicted low achievement. Conclusion: Specific factors can be identified that affect achievement in an asynchronous online course.

Shaw & Pieter (2000). Pre-experimental single-semester case study of a web-centric course; 46 students. Measures: Likert-type IT appreciation questionnaire and open-ended questions. Outcomes: perceptions, attitudes, achievement. Course: nutritional medicine module. Results: The majority of students had a positive view according to the Likert-scale questions but were negative in the open-ended questions; the majority of problems were technology and personnel related; student performance was significantly influenced by age and by contributions to the news group (older students and those with more contributions performed better). Conclusions: Students appreciated the advantages of online learning, but technology and personnel problems need to be addressed; both Likert-type and open-ended questions are needed in evaluation.

Stith (2000). Pre-experimental single-semester case study. Measures: interviews, surveys. Outcomes: perceptions, achievement. Course: developmental biology (undergraduate). Results: The number of hits did not correlate with final grade, but the number of articles read did. Conclusion: Students need help in understanding the use of a web site.

Kroder, Suess, & Sachs (1998). Pre-experimental single-semester case study of 3 web courses; 7 to 15 students. Measures: course evaluation, focus group, interviews, survey. Outcomes: attitudes, perceptions. Courses: telecommunications for (1) managers, (2) public policy, and (3) applications. Results: Most students reported they would take an online course again; however, it takes more student time. Conclusion: Online courses are effective for student learning.

Levin (1999). Pre-experimental long-term case study of electronic interaction; 35 students. Measures: communication artifacts. Outcomes: communication content, purpose, and frequency. Course: preservice elementary teachers. Results: Seven categories of content and eight categories of purpose emerged from the communication analysis. Conclusion: Non-significant differences were found among the kinds of electronic communication between students and (a) peers, (b) key pals, (c) the instructor, and (d) the peer group.

Vrasidas & McIssac (1999). Pre-experimental single-semester case study of a web-centric course; 8 students. Measures: interviews. Outcomes: perceptions. Course: telecommunications for instruction (graduate). Results: Several factors influenced interactions, including structural elements and feedback. Conclusion: Students required training in technology and communication skills.

Hegngi (1998). Pre-experimental single-semester case study of a web-centric course; 11 students. Measures: interviews, course design, artifacts. Outcomes: perceptions. Course: technology and diversity (graduate). Results: Online learning facilitates writing and discussion; asynchronous interactions are more reflective in content areas than synchronous interactions. Conclusion: Implementation of online courses involves continuous research.

Newman et al. (1996). Pre-experimental single-semester case study of a web course; 25 students (1 class). Measures: student survey and interview. Outcomes: perceptions. Course: information design (graduate). Results: Important features include setting deadlines and personalizing online experiences; students interacted dynamically. Conclusion: The web course created a community of learners and greater communication and expression.

Mende (1998). Pre-experimental single-semester static group comparison of students in a web course; 14 students. Measures: exams, surveys. Outcomes: attitudes, participation, achievement. Course: HTML authoring (graduate). Results: Correlation between participation and achievement. Conclusion: Online learning is not effective for everyone; greater teacher commitment and greater structure are required for an online course.

Powers, Davis, & Torrence (1998). Pre-experimental single-semester case study of 3 web courses; 13 students. Measures: College and University Classroom Environment Inventory (CUCEI). Outcomes: perceptions. Courses: (1) history and theory of instructional technology, (2) information technology and media literacy, and (3) distance learning technologies (graduate). Results: Students were satisfied with the course. Conclusion: The CUCEI is useful in assessing online learning.

Saunders, Malm, Malone, Nay, Oliver, & Thompson (1997). Pre-experimental single-semester case study of a web-centric course; 37 students. Measures: attitude survey. Outcomes: attitude. Course: educational leadership (graduate). Results: Most students were comfortable with the online course format. Conclusion: Online-enhanced instruction was effective in developing a learning community, and active, quality interaction was a critical variable.

Schlough & Bhuripanyo (1998). Pre-experimental single-semester case study of a web course; 22 students. Measures: course evaluation, self-assessment of computer skills, Asymetrix ToolBook, instructor evaluation. Outcomes: attitude, preference, technology skills. Course: task analysis (graduate). Results: Student attitudes were affected by the online course's structure, impersonal delivery, and hardware and software problems; most students preferred classroom to online learning. Conclusion: The online course offered a viable alternative for delivering instruction to students in remote locations.

Shih, Ingebritsen, Pleasants, Flickinger, & Brown (1998). Pre-experimental static single-semester group comparison of students in a web course; 99 students. Measures: Learning Strategy Scale questionnaire (modified from the MSQL test), Patterns of Learning Scale questionnaire (modified from the MSQL test), GEFT test, exams. Outcomes: learning strategies, field dependence, achievement. Courses: zoology (undergraduate), biology (undergraduate). Results: No significant differences between students with different learning styles or field dependence; significant differences with different student learning strategies. Conclusion: Students with diverse learning styles can benefit equally from a web course.

Jiang & Ting (1999). Pre-experimental single-semester case study of web-centric courses; 287 students in multiple classes. Measures: course evaluation, surveys. Outcomes: perceptions of factors related to learning. Courses: multiple undergraduate courses. Results: Interaction with the instructor and online discussion were rich and meaningful; written assignments correlated with students' perception of learning; time spent was negatively correlated with perceived learning. Conclusion: Students' perceived learning was related to an interactive online environment and interactive online communication from the instructor.

McNaught, Kenny, Kennedy, & Lord (1999). Pre-experimental single-semester case study of web-centric courses; 757 students in multiple classes. Measures: questionnaires, surveys, interviews. Outcomes: attitudes, perceptions. Courses: multiple undergraduate courses. Results: Student technology skills needed additional support. Conclusion: Students report that online courses are a viable alternative to traditional courses.

Rossman (1999). Pre-experimental single-semester case study of web-centric courses; 3000 students in multiple classes. Measures: course evaluation, surveys. Outcomes: perceptions. Courses: multiple undergraduate courses. Results: Specific faculty feedback, open online discussion, and course guidelines were perceived by students as effective practice. Conclusion: Students report that online courses are a viable alternative to traditional courses.

Ory, Bullock, & Burnaska (1997). Pre-experimental single-semester case study using a large survey of computer conferencing in traditional and web-enhanced courses; 2151 students. Measures: surveys, course monitoring, group interviews. Outcomes: grade level, gender, ethnic background, Web access, computer use, quality of experience. Courses: 40 college courses in six colleges; four courses were monitored for gender, and students in 17 courses were interviewed. Results: Females used computers for conferencing (both social and instructional interaction) more than males; males had slightly less difficulty than females with computer technology; females used computers more in residence halls or on-campus labs; no difference was reported in quality of experience. Conclusion: Few real differences were reported between genders. Online learning may help bridge the gender gap in familiarity with computers; one gender does not profit more or less from the integration of computers in instruction.

Table 2

Type of Course Based on Levels of Use of the Internet as Instruction

The level of use of the Web site defines the type of Web integration in instruction.

Level 1: Traditional Course

No web presence for a regular lecture campus course with a separate laboratory.

Level 2: Web-presence

A course that has at least some information about the course on a web site. In addition, the course web site may have course syllabi, course bibliography, assignments, pictures of faculty, and comments from students. It is more marketing of a course than instruction. Regular classes are held on campus as in a traditional course.

Level 3: Web-enhanced

A regular campus course that makes use of web pages to distribute course materials, assists in student-student and student-teacher communication, offers some type of virtual class meetings, and has resources that help students to access the Internet. Regular classes are held on campus as in a traditional course or may be presented through distance education using videotapes of lectures.

Level 4: Web-centric

The course looks like a regular campus course but uses the web site to facilitate access to class materials and to support student-student communication, student-teacher communication, and student-materials-resources interaction. The main instructional interaction in the course has shifted from the physical classroom to the web site. The course has greater geographic potential for including students, more time flexibility, and fewer classroom meetings. The course may include location-based orientation activities, weekend classroom seminars, physical presence through video lectures, and other special events involving physical presence. Fewer classes are held on campus as compared to a traditional course.

Level 5: Web-course

A web course is a complete course that can be accessed through the Internet. Times may be open (asynchronous) or partially scheduled (synchronous). A web course facilitates access to class materials and supports student-student communication, student-teacher communication, and student-materials-resources interaction. The course can be used to reach remote students or students with constrained schedules. The course does not require attendance or participation at location-specific sites.

5A: Web-course, Traditional -- A web course centered on traditional approaches to instruction that may include a web site with lecture notes and synchronous discussion in a distance learning format.

5B: Web-course, Cognitive Approach -- A web course centered on activities involving students in working with materials. This may include a web site with materials and problems for students to work through in asynchronous discussion. Learning takes place through cooperative interaction between the students and the course materials.

Meaning is constructed through engagement of the student while attempting to acquire knowledge through web presentations.

5C: Web-course, Constructivist Approach -- A web course centered on student-student activities. This may include a website with issues and problems for students to work through, supported by resource materials, in asynchronous and/or synchronous discussion. Learning takes place through cooperative interaction among students.

Meaning is constructed through engagement of the student in dialog and reflection facilitated through the web course site.


Table 3

Number of Hours to be Redesigned for Web Instruction by Level

(Using a Ratio of 3 hours of Out-of-Class Work Per Hour of Instruction)

Level (% of instruction on the Web) | Instruction hours designed for the Web | Physical classroom instruction hours | Hours designed for independent work | Total hours of instruction and learning
Level 1 (0%)   |  0 | 45 | 135 | 180
Level 2 (0%)   |  0 | 45 | 135 | 180
Level 3 (11%)  |  5 | 40 | 135 | 180
Level 4 (44%)  | 20 | 25 | 135 | 180
Level 5 (100%) | 45 |  0 | 135 | 180

*Modified from Boettcher (1999).
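As a check on the percentages in the first column (these follow from the table's own figures rather than from any statement in the text), each level's share is its instruction hours designed for the Web divided by the 45 contact hours assumed for the course:

Level 3: 5/45 ≈ 11%; Level 4: 20/45 ≈ 44%; Level 5: 45/45 = 100% (Levels 1 and 2: 0/45 = 0%).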


Table 4

Threats to Valid Research

Factors jeopardizing internal and external validity in quantitative studies.

1) Internal Validity

Could factors other than the experimental treatment account for differences observed in the study, even if the experimental variable itself had no effect?

History: events occurring between first and second measurements in addition to experimental variable

Maturation: processes within subjects operate as a function of time

Testing: effects of testing

Instrumentation: threats due to construction, administration, reliability, or validity

Statistical regression: groups selected based on extreme scores

Selection: biases in selecting subjects

Random sampling

Sampling errors

Group size

Experimental mortality: subjects not completing all aspects of study

Selection-maturation interaction: effects mistaken for the experimental variable

2) External validity

To what populations, settings, treatment variables, and measurement variables can this effect be generalized?

Interaction effects of the testing: effects of testing on outcome

Interaction effects of selection biases: effects related to selection of subjects

Interaction effects of experimental arrangements: effects of the study conditions

Multiple-treatment interference: several variables interacting at the same time

* Modified from Campbell & Stanley (1963).


Table 5

Theory Construction*

Types of Theory

1. Typologies: A method of organizing and categorizing things

Criteria used to determine the most useful typology include (a) exhaustivity, (b) mutual exclusivity, and (c) consistency. The problem becomes one of determining which typologies are the most useful.

2. Predictions: Predicting events that will occur in the future

Under certain conditions, a change in one variable is followed by a change in another variable. If a statement can be used to predict, the concepts contained in that statement can be used to organize and classify, providing a typology.

3. Explanations: Explaining events that have occurred in the past

Explanations involve a cause and effect relationship, an if-then symbolic logic. They are logically derived from empirical generalizations. Explanations become abstract and are independent of time. If a statement can be used to explain events, the concepts contained in that statement can be used to organize and classify, providing a typology.

4. A sense of understanding: Understanding what causes events

A sense of understanding is provided only when the causal mechanisms that link changes have been fully described. There are no missing links or black boxes. Description of a causal mechanism provides a sense of understanding.

5. Control of Events: The potential for control of events

A distinction must be made between understanding how variables affect one another and being able to change those variables; both are needed for control. This level requires the ability to influence the variables that will affect the events one wishes to control.

* Modified from Reynolds (1971).


Table 6

Best Practices for Online Learning Environments

Category 1: Student Behaviors

Demonstrate that prerequisite technology skills at the beginning of the course are adequate for hardware, software, and website use
Seek opportunities for, and support in, interacting with the instructor and other students
Actively participate in all online activities
Become actively involved through writing and interaction in web-based courses (improves student writing performance)
Use a variety of communication techniques to enhance online learning
Personalize themselves by publishing online biographies and photographs to allow other members of the class to visualize them
Seek assistance in understanding and mastering different learning strategies
Demonstrate prerequisites and become more proficient in technology communication skills

Online research citations: Stith (2000); McNaught et al. (1999); Vrasidas & McIssac (1999); Wegner et al. (1997); Ryan (2000); Smith & Benscoter (1999); Saunders et al. (1997); Mende (1998); Jewett (1998); Shih et al. (1998); McLellan (1997)
Classroom research citations: Johnson & Johnson (1978); Slavin (1995)

Category 2: Faculty-Student Interactions

Provide clear and adequate guidance
Use action research regularly to evaluate the success or failure of the course and meet student concerns
Personalize communications by/with student-student and student-teacher
Use a variety of communication techniques to provide for greater empathy and a more personal approach than e-mail and the website alone
Plan for increased time for student interactions as compared to traditional courses
Clearly delineate institutional policy on cheating and plagiarism at the start of the course
Maintain a separate e-mail account for web courses
Forward responses to frequently asked questions to all students to avoid duplication
Give faculty reduced load and increased support to develop course materials
Provide students with continuous, frequent support and feedback
Scaffold virtual discourse construction
Emphasize the importance of good study skills throughout the course
Closely monitor each student's progress
Create opportunities to coach and facilitate student construction of knowledge
Give negative comments to students privately, preferably by phone
Clearly delineate course requirements

Online research citations: Wegner et al. (1999); Kroder et al. (1998); Gillani (1998); Jewett (1998); Powers et al. (1998); McLellan (1997); Ryan (2000); Gray (1998); Gaud (1999); Vrasidas & McIssac (1999); Jiang & Ting (1999); Pincas (1998); Loomis (2000); Miller & Miller (1999); Rossman (1999)
Classroom research citations: Stephens & Rosenshine (1986); Slavin (1995); Brophy & Good (1986)

Category 3: Technology Support

Insure a low level of technological difficulty in accessing the website and communicating
Provide adequate, friendly, easy, continuous technical support

Online research citations: Jewett (1998); Cornell (1999); Wegner et al. (1997); Teeter (1997); McNaught et al. (1999); Davies & Mendenhall (1998); Kroder et al. (1998); Gillani (1998)

Category 4: Learning Environment

Use structured activities to provide an effective framework for online learning
Mandate smaller class sizes for online courses to give faculty appropriate time to deliver quality instruction
Use flexible deadlines to motivate students, maintain communication, and allow for technical problems
Create social interaction through group collaboration to facilitate high achievement
Use streaming audio to reduce reading online
Present course content in a manner that hierarchically structures the sequence of information
Organize the website to enable students to interact with the content, other students, and the instructor
Create a welcoming, safe, nurturing online environment
Present problem-solving situations in a realistic context
Provide opportunities for students to question the instructor to insure accuracy of understanding
Create opportunities for students to communicate with each other to share understanding of course content
Provide opportunities to collaboratively construct knowledge based on multiple perspectives, discussion, and reflection
Provide opportunities for students to articulate and revise their thinking to insure accuracy of knowledge construction
Insure that an equitable environment exists with respect to gender differences in learning styles, reduction of barriers to participation, and communication
Include cooperative and collaborative learning to distribute the workload through the group and support female students' preferred method of connected learning
Promote gender equality by encouraging females to post messages while asking males to subside if a pattern of male domination is noticed
Insure that an equitable learning environment exists for all
Allow time for reflection at the end of the course
Include a "warm-up" period with light-hearted exercises aimed at helping students get to know one another
Start the online course with all students together at the same time
Provide equal access to the shared conversation in the course
Provide opportunities for students to control online learning and structure it for themselves
Provide discussion forums encouraging open and honest dialog
Conduct a teleconference during and at the end of the course to discuss successes and problems
Use computer conferencing to develop overall critical thinking skills

Online research citations: Powers et al. (1998); De Simone et al. (2000); Mende (1998); Gaud (1999); McLellan (1997); Wegner et al. (1997); Blum (1999); Jiang & Ting (1999); Kroder et al. (1998); Miller & Miller (1999); Gillani (1998); Bonk & Cummings (1998); Wegerif (1998); Rossman (1999)
Classroom research citations: Kavale & Glass (1982); Slavin (1995); Brophy & Good (1986); Prawat (1989); Johnson et al. (1981); Rosenshine & Meister (1996)

Table 7

Checklist for Online Interactive Learning (COIL)

Each item is rated on a scale of 1 to 5 according to how well the course meets the criterion; a Total Section Rating is recorded for each category and a Total Score for the checklist as a whole.

Category 1: Student Behaviors
1. Demonstrate that prerequisite technology skills at the beginning of the course are adequate for hardware, software, and website use
2. Seek opportunities for, and support in, interacting with the instructor and other students
3. Actively participate in all online activities
4. Become actively involved through writing and interaction in web-based courses (improves student writing performance)
5. Use a variety of communication techniques to enhance online learning
6. Personalize themselves by publishing online biographies and photographs to allow other members of the class to visualize them
7. Seek assistance in understanding and mastering different learning strategies
8. Demonstrate prerequisites and become more proficient in technology communication skills
Total Section Rating

Category 2: Faculty-Student Interactions
9. Provide clear and adequate guidance
10. Use action research regularly to evaluate the success or failure of the course and meet student concerns
11. Personalize communications by/with student-student and student-teacher
12. Use a variety of communication techniques to provide for greater empathy and a more personal approach than e-mail and the website alone
13. Plan for increased time for student interactions as compared to traditional courses
14. Clearly delineate institutional policy on cheating and plagiarism at the start of the course
15. Maintain a separate e-mail account for web courses
16. Forward responses to frequently asked questions to all students to avoid duplication
17. Give faculty reduced load and increased support to develop course materials
18. Provide students with continuous, frequent support and feedback
19. Scaffold virtual discourse construction
20. Emphasize the importance of good study skills throughout the course
21. Closely monitor each student's progress
22. Create opportunities to coach and facilitate student construction of knowledge
23. Give negative comments to students privately, preferably by phone
24. Clearly delineate course requirements
Total Section Rating

Category 3: Technology Support
25. Insure a low level of technological difficulty in accessing the website and communicating
26. Provide adequate, friendly, easy, continuous technical support
Total Section Rating

Category 4: Learning Environment
27. Use structured activities to provide an effective framework for online learning
28. Mandate smaller class sizes for online courses to give faculty appropriate time to deliver quality instruction
29. Use flexible deadlines to motivate students, maintain communication, and allow for technical problems
30. Create social interaction through group collaboration to facilitate high achievement
31. Use streaming audio to reduce reading online
32. Present course content in a manner that hierarchically structures the sequence of information
33. Organize the website to enable students to interact with the content, other students, and the instructor
34. Create a welcoming, safe, nurturing online environment
35. Present problem-solving situations in a realistic context
36. Provide opportunities for students to question the instructor to insure accuracy of understanding
37. Create opportunities for students to communicate with each other to share understanding of course content
38. Provide opportunities to collaboratively construct knowledge based on multiple perspectives, discussion, and reflection
39. Provide opportunities for students to articulate and revise their thinking to insure accuracy of knowledge construction
40. Insure that an equitable environment exists with respect to gender differences in learning styles, reduction of barriers to participation, and communication
41. Include cooperative and collaborative learning to distribute the workload through the group and support female students' preferred method of connected learning
42. Promote gender equality by encouraging females to post messages while asking males to subside if a pattern of male domination is noticed
43. Insure that an equitable learning environment exists for all
44. Allow time for reflection at the end of the course
45. Include a "warm-up" period with light-hearted exercises aimed at helping students get to know one another
46. Start the online course with all students together at the same time
47. Provide equal access to the shared conversation in the course
48. Provide opportunities for students to control online learning and structure it for themselves
49. Provide discussion forums encouraging open and honest dialog
50. Conduct a teleconference during and at the end of the course to discuss successes and problems
51. Use computer conferencing to develop overall critical thinking skills
Total Section Rating

Total Score
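For a sense of scale, if the total score is taken as the simple sum of the 51 item ratings (an assumption; the checklist itself does not state how the total is computed), possible totals range from 51, when every item is rated 1, to 255, when every item is rated 5.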
