
An Interactive Online Course:
A Collaborative Design Model
Mahnaz Moallem
The purpose of this paper is to describe the evaluation results of using an interactive design model for the development of an online course. Specifically, it examines: (a) how an interactive design model was used to develop collaborative and cooperative learning activities; (b) how activities were structured to promote the level and quality of communications among students, as peers, and between students and the instructors; and (c) how students responded to such an interactive design model. The paper also provides information about the delivery process and describes what happened when this interactive model was fully implemented and used.
As the number of Internet-based courses increases and distance learning programs grow in
popularity, educators raise important questions
about the quality of these courses and programs
(Muirhead, 2000, 2001). One of the concerns is
the level of interactivity (communication, participation, and feedback) between students and
between teachers and their students (LaRose &
Whitten, 1999; McNabb, 1994; Sherry, 1996). As
Foshay and Bergeron (2000) observed, there is a
big difference between being able to distribute
information with the Internet and being able to
teach with the Internet. While learning is ultimately an individual enterprise, the support of
a group with a common learning objective can
produce a synergistic facilitation of learning by
each member of that group.
Nonetheless, the social dimension of learning
in online courses or Internet-based instruction
has received little attention. Many educators advocating distance learning believe that interactivity is a vital element in the educational
process (e.g., Moore, 1991, 1992, 1993; Moore &
Kearsley, 1995; Muirhead, 1999; Parker, 1999;
Saba & Shearer, 1994; Spitzer, 2001; Zirkin &
Sumler, 1995). However, critics stress that interactivity is the missing element in distance education because online classes either do not
emphasize online interaction or face reluctance
from the students to participate in online discussion. A few researchers who studied online
courses (e.g., Boshier et al., 1997; Hiltz, 1997;
Kearsley, 1995; McNabb, 1994; Sherry, 1996) observed that while communication options (e.g.,
e-mail, bulletin boards, conferencing systems,
whiteboards, chat rooms, and videoconferencing) are plentiful and increasing, Internet-based
instruction (online courses) has been focused mainly on student-content and self-study lessons and materials. They further argue that
simply making communication tools available
to online students does not mean that students
can and will use them (Berge, 1999). If the interaction is not an integrated, essential, and graded
part of an online learning environment, the
majority of students will never use it at all, and
those who start to use it will generally decide
that nothing is going on there, and will stop
using it.
The purpose of this paper is to describe the
evaluation results of applying an interactive
design model for the development of an online
course. The paper also discusses the delivery
process and explains what happened when this
interactive model was implemented and used.
INTERACTIVITY AND
INTERNET-BASED LEARNING
Two types of interactivity are identified in computer-mediated learning: (a) cognitive or individual interaction (interaction with content)
and (b) social or interpersonal interaction. While
both types of interactivity are important to
learning, the social constructivist view of knowing emphasizes the vital role of the human
dimension of interactivity in learning (Gilbert &
Moore, 1998; Knowles, 1990; Moore, 1992;
Mortera-Gutierrez & Murphy, 2000; Muirhead,
1999, 2000). According to social constructivists,
learning is a social construct that is mediated by
language and social discourse (Vygotsky, 1978).
The social view of knowing highlights the notion that it is through the construction of shared
outcomes or artifacts that learners engage in
developmental cycles that facilitate conceptual
change (Shaw, 1996). The social view of interactivity places emphasis on a collaborative and
cooperative learning environment and encourages active dialogue (Moore, 1991; Saba &
Shearer, 1994). In such an environment learners
are exposed to multiple perspectives that serve
to form cognitive scaffolds as the students exchange information with each other, the people
around them and experts in the field (Harasim,
1989). Furthermore, the social view of interactivity uses problem-based learning (Barrows &
Tamblyn, 1980; Blacklow & Engel, 1991; Boud,
1985; Boud & Feletti, 1991; Engel, 1997) as an instructional procedure in order to transfer control
over the learning process from the teacher to the
students (Knowles, 1975; Peterson, 1996) and to
structure and support a carefully planned series
of collaborative learning activities, which constitute the content and assignments of the online
instruction.
AN ONLINE DESIGN MODEL
WITH FOCUS ON
SOCIO-CULTURAL VIEW
The social constructivist notion of interactivity
described above was used as a theoretical
framework for building a design and development model that focused on online collaborative
learning (see Figure 1). To build the model it was
assumed that knowledge, understanding, and
meaning gradually emerge through interaction
(social discourse) and become distributed
among those who are interacting (construction
of shared knowledge). Moreover, knowledge is
often distributed among participants and
situated in a specific activity context (Brown,
Collins, & Duguid, 1989; Greeno, 1997; Lave &
Wenger, 1991). In this situative approach, social
knowledge construction develops distributed
knowledge, skills, and understanding around
the target activity. However, as Salomon and
Perkins (1998) noted, even though knowledge
and learning are socially situated, the learner
still exists as an individual within the learning
situation. Thus, even when learning is fostered
through processes of social communication, individual activity and reflection still play a critical role (Perkins, 1993). As such, it was assumed
that both forms of interaction (individual and
social) are part of the same process of
knowledge construction and are essential to the
construction and assimilation of knowledge.
Figure 1. Collaborative design model.

Emotions, feelings, motivation, and attitudes
are integral parts of an intellectual and social
development. A community of learners cannot
exist if its members do not care for each other
and do not understand each other's feelings.
Furthermore, in order to maintain positive
relationships with one another, members of a
community must have feelings of empathy
(Martin & Reigeluth, 1999) for each other and
provide emotional support (Jonassen, 1999;
Reigeluth, 1999) when needed. This emotional
support could be in the form of providing feedback, sharing frustration, providing encouragement or offering help and hints. Thus, it was
assumed that in a collaborative and conversational learning environment, emotional support
would be provided along with social and cognitive support. In addition, it was assumed that a
problem-based learning environment provides
structure for generating a transaction between
social knowledge and personal knowledge. In
such an environment personal relevance is
stimulated by authentic problems without
lowering the degree of cognitive complexity.
Importance of the Nature or
Type of Learning Task
Research on small-group interaction indicates
that group discussion or conversation is highly
influenced by the nature of the problem-solving
task (e.g., Daft & Lengel, 1986; Hackman & Morris, 1975; McGrath, 1984; Straus & McGrath,
1994). In addition, communication media research shows that different tasks vary in how
much social context information (cues) their effective execution requires (McGrath, 1990;
Straus & McGrath, 1994). A task that has a high
need for coordination may not be appropriate
for a text-based computer-mediated communication where social context cues are
primarily absent (Argyle, Lalljee, & Cook, 1968;
Kendon, 1967; Rutter & Stephenson, 1975). Some
problem-solving tasks may be more suitable for
the online collaborative learning environment
(Berge, 1995; Hiltz, 1994) than others.
McGrath and Hollingshead (1993) proposed
a model that predicts the effects of computer-mediated communication and task type on
group task performance. In their task classification model, McGrath and Hollingshead suggested
that most group tasks can be classified into
categories that reflect four basic processes. The
four categories require learners to (a) generate (e.g., generate ideas or plans), (b) choose (e.g., choose the correct answer or a preferred answer), (c) negotiate (e.g., make a decision or
resolve conflicts of interest), and (d) execute
(e.g., perform intellectual and psychomotor
tasks), as each of these processes is related to
one another. On the basis of this theory, patterns
of difference occur between the information
richness requirements of the task and the information richness potential of the communication
medium. Generating and choosing tasks, which are
also called intellectual tasks, are labeled collaborative tasks because they are less dependent
on social context cues, while negotiating and
decision-making tasks are labeled coordination
tasks because they are more dependent on social
context cues. Therefore, given this framework,
there is a good task-media fit between generating tasks and computer-mediated communication (online discussion) and a good task-media
fit between negotiating tasks and face-to-face
communication (see Table 1).
In order to design and develop problem-solving tasks that have high potential for promoting
collaboration and fostering conversation, McGrath and Hollingshead's task classification
theory (1993) was used to identify the authentic
problem-solving tasks that are more appropriate
for an online collaborative learning environment. Given this theory, two types of problem-solving tasks were selected: generative tasks and
intellective (choosing) tasks. It was assumed that
as the model predicted, generative tasks are the
best type of task for promoting online discussion
and collaboration. It was also assumed that as
the model predicted, tasks requiring groups to
solve intellective problems (problems that have
correct answers) are also appropriate for online
discussion although they are not the best type of
tasks for fostering online conversations among
the group members.
The Importance of the Collaborative
Groups and Collaborative Context
In addition to the type of collaborative problem-solving tasks, the following were applied to create a better social context for collaborative online
learning:
• Establish individual accountability (Johnson,
Johnson, & Smith, 1991; Slavin, 1995), where
both the individual and other members are
aware of the individual's performance
toward the group task.
• Encourage commitment to the group and its
goals, where group members help one
another, exchange needed resources, provide
appropriate feedback on performance, and
encourage efforts toward achieving the
group goals (Johnson et al., 1991; Slavin, 1995).
• Facilitate smooth interaction among group
members at both an interpersonal and a
group level (Rubin, Rubin, & Jordan, 1997),
where group members demonstrate the
necessary social skills or communication
competencies.
• Provide stability of groups so that group
members can work with each other for longer
periods of time in order to reduce the time
and effort for establishing group norms,
group task performance, and interaction patterns (McGrath, 1992).
Table 1. Task-media fit on information richness (McGrath & Hollingshead, 1993).

Task Type | Computer Systems | Face-to-Face Communication
Generating ideas and plans (collaborative) | Good fit | Poor fit (medium too rich)
Choosing correct answer: intellective tasks | Marginal fit (medium too constrained) | Poor fit (medium too rich)
Choosing preferred answer: judgment tasks | Poor fit (medium too constrained) | Marginal fit (medium too rich)
Negotiating conflicts of interest | Poor fit (medium too constrained) | Good fit
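The framework in Table 1 can also be expressed in a compact, computable form. The sketch below is a minimal illustration (not part of the original study; the dictionary and function names are hypothetical) of how the task-media fit matrix might be encoded to flag which task types are reasonable candidates for computer-mediated collaboration. Run as written, only the generating and intellective tasks pass the check, which mirrors the two task types selected for the course described later.

```python
# Minimal sketch (not from the original study): encoding Table 1's task-media
# fit matrix (McGrath & Hollingshead, 1993) to check which task types are
# reasonable candidates for computer-mediated (online) collaboration.
# The dictionary and helper names below are hypothetical.

TASK_MEDIA_FIT = {
    # task type: (fit with computer systems, fit with face-to-face communication)
    "generating ideas and plans (collaborative)": ("good", "poor"),
    "choosing correct answer (intellective)": ("marginal", "poor"),
    "choosing preferred answer (judgment)": ("poor", "marginal"),
    "negotiating conflicts of interest": ("poor", "good"),
}

def suitable_for_online(task_type: str) -> bool:
    """Return True when a task has at least a marginal fit with
    computer-mediated communication."""
    computer_fit, _face_to_face_fit = TASK_MEDIA_FIT[task_type]
    return computer_fit in ("good", "marginal")

if __name__ == "__main__":
    for task in TASK_MEDIA_FIT:
        print(f"{task}: suitable for online discussion = {suitable_for_online(task)}")
```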
DESIGNING AND DEVELOPING AN
INTERACTIVE ONLINE COURSE
Course Description
The course that was designed to be delivered
over the Internet is entitled "Instructional Systems Design: Theories and Research." It is a required, three-unit core course for a graduate
degree in instructional technology. Participants
enrolled in this course are primarily graduate
students seeking a master's degree in instructional technology or education majors seeking
an elective course in the area of design and
development. The course expects students to
develop knowledge of theoretical foundations of
instructional design by exploring a full range of
theories, approaches, and methods of instruction. It also expects students to learn skills of applying the instructional design theories in the
design and development of an instructional
material, which is the major requirement of the
course.
The course was first designed for Web
delivery using a Web-based course management
tool (Eduprise Database), adopted by the
university, and in the following semester it was
converted to the WebCT course management
system, with some revision of the process. There were some differences in the ways the above-mentioned course
management systems offered course content
tools, flexibility in data collection and data mining, and ability to customize. However, the
designer tried to use the established theoretical
framework or model for the design of the course
in order to keep the instructional design
specifications of the course the same across the
two different course management systems in
two consecutive semesters. Apart from the change in
course management systems, this was
accomplished. A total of 24 students (12 each semester)
enrolled in the course in two semesters. The
designer of the course was also the instructor of
record for the course delivery and its evaluation
in both semesters.
Course Design and Development
Specifications
As was indicated earlier, problem-based learning was used as the general instructional design
model to develop both a culminating project (a
real-world problem-solving task) and a series of
authentic but generative and intellective problem-solving tasks or collaborative activities to
organize the course content, as well as to structure students' social interactions. The general
goals of the course were to develop knowledge
of theoretical foundations of instructional
design and to apply instructional design
theories in the design and development of instructional material. The course-culminating
problem-solving project required students to
choose an instructional design theory or model
to design and develop instructional material for
a unit of instruction. The course general goals
and its culminating project were used to identify
the course content (knowledge and skills), its
units, and weekly lessons. After identifying the
content of the weekly lessons for each unit of instruction, a problem that simulated a situation
that instructional technologists encountered in
everyday professional practice was developed.
Problems were to be used as starting points for
learning the content of the lessons and for
achieving their objectives. The problems were
designed so that they were content specific, but
ill defined. Also, the problem statement did not
present all of the information that students
needed in order to solve the problem (Jonassen,
1999). They were open ended in the sense that
students had to fill the information gaps, to
make judgments about the problem, and to
defend their judgments by expressing personal
opinions or beliefs. The hope was that the
generative (multiple solutions) characteristic of
the problems would motivate students to initiate
and continue the discussion and that the domain
specific characteristic of the problem would help
students stay within the task knowledge domain
(the content of the lesson).
In order to cognitively support students
during their problem-solving process or social
discussion (Jonassen, 1999; Salomon & Perkins,
1998), two strategies were used. (a) The first
strategy was to provide a set of related cases or
worked-out examples that could help students
explore how a similar problem has been solved
(Jonassen). Upon developing or locating such
examples, they were linked to the problem statement page. Related cases or worked-out examples were offered in an effort to provide
learners with an example of the desired performance, and simultaneously to demonstrate actions and decisions involved in the performance
(Jonassen). Furthermore, in order to facilitate
student access to expert opinions during the
problem-solving process, the problem statement
page was linked to an active online forum (IT
Forum [http://it.coe.uga.edu/itforum/index.html]) where experts in the field of instructional
design responded to student questions and discussed emerging theories and issues in the field
of instructional technology.
(b) The second strategy was to assist students
in their effort to construct individual understanding and to interact with their own prior experiences before presenting, defending, and
discussing them with their peers (Perkins, 1993;
Salomon & Perkins, 1998). To that end, an individual assignment was developed for each lesson, in which students were asked to read the
suggested materials and resources, to synthesize
their understandings of the readings in a summary format, and to post summaries in the individual assignment area. Students were
required to complete this individual assignment
before they could begin discussing and solving
the weekly problem or collaborative activity.
Again, to provide scaffolding strategies (scaffolding is a temporary support that is removed when
no longer necessary) for individual assignments,
a list of open-ended questions was developed
for each individual assignment, and students
were asked to use those probing questions to
synthesize their own understanding of the reading as well as to reflect. At the beginning of the
week, the instructor would read the individual
assignments and give students feedback. The
instructor's feedback often included summaries
of student discussion or thoughts, guidance
about alternative resources, and thoughtprovoking questions to stimulate more thinking
and to promote reflection. In addition, to assist
students in understanding the problem and in
developing personal knowledge, two strategies
were used. First, several information pages
(slides, lecture notes, links to informative Websites) were developed and then linked to each
weekly lesson. Second, a list of reading materials
(textbook and reading package) was suggested
for each lesson, and students were advised to
read those materials before beginning to work
on the problem.
Another design issue was to identify the activity structures or rules that regulate actions
and interactions (Jonassen & Rohrer-Murphy,
1999) required to solve problems. This was the
most difficult part of the task because it could
either advance or discontinue student interactions or conversations. For me as a designer and
the instructor of the course, it was important to
decide how much scaffolding should be
provided in order to help learners perform the
task. To accomplish this goal, a list of focusing
questions, product specifications, and procedures for completing the problem-solving task
or collaborative activity were developed and
added to the problem situation page. By providing guidance, the hope was that the directions in
the task would allow spontaneity and experimentation during the problem-solving
process, while lessening the confusion and
regulating the actions and interactions.
Facilitating Student
Collaboration and Group Work
Research suggests that small groups of three to
four students are preferred (e.g., Imel & Tisdell,
1996; Johnson, Johnson, & Smith, 1998; Rau &
Heyl, 1990; Slavin, 1995) because a small group:
• Reduces the likelihood that members take a free ride on the contributions of others (Shepperd, 1993);
• Makes it easier for the instructor to monitor individual contributions and to scaffold each team's progress;
• Provides more opportunities for quality interaction and improves commitment to the group;
• Improves each student's social skills to interact smoothly with others at the group level;
• Helps team members by developing needed behaviors and eliminating deferring behaviors to facilitate the productivity of the group; and finally,
• Helps teams see the value of working together.
Therefore, small groups (four members) were
formed for weekly problem-solving tasks. In
order to support and promote collaboration
within the team members, several research-based strategies were applied. Research on
group theory in a computer-mediated environment (McGrath, 1991, 1992) suggests that
change in the group's membership affects the
group interaction process, member reactions,
and group task performance. Computer-mediated communication research also points
out that imposed change in a group's membership causes more perturbations in computer-mediated communication than it does in
face-to-face instruction. Given these research
results, the decision was to keep the groups the
same and not to change the team members for
the entire semester. The team members were
also asked to (a) introduce themselves to one
another in a "getting to know each other" assignment, (b) create an e-mail address list for
group members, and (c) use each other's names
all the time (the instructor also modeled these
behaviors). The teams were also formed very
carefully. The results of student learning styles
inventories (completed as part of introducing
themselves to one another) were used to form
students into the small teams. I tried to group
students with different learning styles (e.g., active and reflective; visual and verbal; sensing
and intuitive) in each team to allow enough difference of viewpoints to trigger interactions.
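One way to picture this grouping step is with a short sketch. The code below is a hypothetical illustration only (the roster, style labels, and function name are invented, and the actual teams were formed by the instructor's judgment): it deals students from each learning-style pool across teams of four so that every team ends up with a mix of styles.

```python
# Hypothetical sketch of heterogeneous team formation (not the actual procedure
# used in the course): students are pooled by a simplified learning-style label
# and then dealt round-robin across teams so each team mixes styles.
from collections import defaultdict

def form_mixed_teams(students, team_size=4):
    """students: list of (name, style_label) tuples, e.g. ('S1', 'active')."""
    by_style = defaultdict(list)
    for name, style in students:
        by_style[style].append(name)

    n_teams = max(1, len(students) // team_size)
    teams = [[] for _ in range(n_teams)]
    seat = 0
    # Deal each style pool across the teams in turn, so no team is
    # dominated by a single learning style.
    for pool in by_style.values():
        for name in pool:
            teams[seat % n_teams].append(name)
            seat += 1
    return teams

roster = [("S1", "active"), ("S2", "reflective"), ("S3", "active"),
          ("S4", "reflective"), ("S5", "active"), ("S6", "reflective"),
          ("S7", "active"), ("S8", "reflective")]
print(form_mixed_teams(roster))  # two teams of four, each with both styles
```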
Research on group theory also shows that
there is a positive relationship between successful task performance, team members' perception
of effective group process, and the level of satisfaction with the task performance and communication (McGrath & Hollingshead, 1993). In
order to help the teams with the process of completing the task and feeling more satisfied,
several strategies were applied. First, the procedures for collaborative work were explicitly
described. Second, the responsibilities of the
team members (established social norms) were
spelled out in a separate Web page that was
linked to the lesson's collaborative activity.
Third, each team was advised to use a team assessment tool to evaluate its collaborative work,
and to use the results as a means to improve its
collaborative work. Fourth, each team was
asked to identify one member as a team leader
and one member as a team recorder for each
problem-solving task, and to rotate the responsibilities so that every member would have a
chance to serve both as a leader and as a recorder. Finally, the instructor used a conversational style (spontaneous and informal, with
comments directed to individual students or to
individual comments) in team discussions.
COURSE EVALUATION
The formative evaluation of the course design
model was focused on the following questions:
1. What happened when different components
of the model were implemented and used in
practice?
2. What did students think about the course
design specifications?
3. Which components of the course design
model were found to be most useful from the
student's perspective?
4. How did students use the cognitive support
strategies integrated in the course design?
5. How did the nature and type of learning tasks influence group discussion or conversation?
6. Which problem-solving tasks (intellective vs.
generative) created the best environment for
conversation and sharing of knowledge?
7. In what ways did the course design model influence student learning and satisfaction?
The evaluation results presented here are
based on the quantitative and qualitative
analysis of the data gathered from 24 (6 male
and 18 female) graduate students (12 students in
each semester) enrolled in the Web-based course
in two semesters (Fall 2000 and Fall 2001). All
students enrolled in the course lived in counties
surrounding the university, although several
had to commute for about an hour to come to the
campus if needed. The majority of students also
worked full time during the day and were considered part-time graduate students. The course
design, delivery specifications, and evaluation
strategies were kept the same for both semesters.
Data Gathering Strategies
The plan was to gather data from multiple sources to test the consistency of the findings. The following data-gathering strategies were used:

• Student questionnaire. Twice during the semester (once in the middle of the semester and once at the end of the semester), students completed a questionnaire in which they responded to a list of questions (both open-ended and closed-ended items) about the course design specifications (e.g., Which components of the course helped you understand the content? List three most useful and three least useful features of the course, etc.), and a list of questions that measured student attitudes and satisfaction.

• Student biographical information and results of learning styles surveys (Felder, 1993). At the beginning of each semester students were asked to complete the Felder-Silverman Index of Learning Styles (ILS) inventories (a 44-question, self-scoring instrument; Felder & Silverman, 1988) and report the results in their biographical information posted in the first week's large-group discussion.

• The questions students posted in the "help" thread. The first day of the course, a help thread was created in the common forum and students were instructed to post their general questions and concerns about the course in this thread. The purpose of creating a help thread instead of using e-mail was to prevent students' asking and answering similar questions. The content and nature of students' postings (or help e-mail messages) were analyzed.

• Student postings in the team discussion (eight team activities) and weekly large-group discussion corresponding to each team activity. The first large-group discussion was devoted to getting to know each other. In addition, one group discussion was focused on how to conduct needs assessment and needs analysis for instructional design materials, and the last three large-group discussions emphasized guiding students in applying an instructional design model to develop instructional materials.

• Chat logs for small-group discussion.

• Student performance results (responses to individual assignments, responses to problem-solving tasks, and the written documentation for the course design projects).

• Student evaluation of the course, measured by the instrument administered by the university at the end of each semester.
Analysis Tools and Strategies
The qualitative analysis of student chat logs and
postings in small- and large-group discussions
was conducted using the NUD*IST qualitative
analysis software (© 2002 QSR International Pty.
Ltd.). Student discussions and chat logs were
imported into NUD*IST as plain text files and
were used to create nodes (containers for
coding) and codes. An open coding strategy was
used for creating codes and nodes. Statistical
Package for the Social Sciences (SPSS) data
analysis software was also used to analyze the
quantitative data.
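For readers unfamiliar with this kind of coding workflow, the sketch below is a rough analogue only; it does not reproduce the NUD*IST interface or the SPSS analysis. Discussion excerpts (abbreviated here from student quotes reported later in this paper) are tagged with open codes, and the code frequencies are tallied and exported so they could feed a quantitative analysis. The code labels and output file name are illustrative.

```python
# Rough, hypothetical analogue of the coding workflow described above (it does
# not reproduce NUD*IST or SPSS): each excerpt is assigned one or more open
# codes, and code frequencies are tallied and written out for later analysis.
from collections import Counter
import csv

coded_excerpts = [
    ("What is a mini lesson and how do we incorporate the theories?",
     ["exploring concepts"]),
    ("When my daughter was in 1st grade she was exposed to ...",
     ["providing real-life examples"]),
    ("What's wrong with that analogy?",
     ["debating different perspectives"]),
]

code_counts = Counter()
for _text, codes in coded_excerpts:
    code_counts.update(codes)

# Export the tallies in a form a statistics package could read.
with open("code_counts.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["code", "frequency"])
    writer.writerows(sorted(code_counts.items()))
```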
EVALUATION RESULTS
What Happened When Different
Components of the Model Were
Implemented and Used?
The course management and delivery system seemed
to influence student interaction during the implementation process. The first tool
(Eduprise Database) provided some support for
communication (forums, chat rooms, e-mail,
and electronic file sharing); however, the system was
limited in several ways (see Table 2). First, the
system was relatively slow for a course that had
conversation and discourse as its core pedagogy. The students and the instructor had to
spend many hours to manage a discussion that
could have been completed within 30 min in a
face-to-face situation. Second, students had to go
from one database (course site-lesson page) to
another database (forum) to communicate or
converse with their peers. They did not have access to an internal e-mail system and were only
able to participate in one chat room (this was not
recorded for the instructor, and the instructor
had to attend the chat session to copy and paste
the log after the conversation). Furthermore,
uploading and downloading files for the purpose of sharing ideas or adding to other
members' ideas was not only slow, but also difficult to manage, especially when students
began sharing images and graphics.
With the adoption of WebCT in the following
semester some of the above-mentioned problems were solved (see Table 2). It appeared that
WebCT was more suitable for a conversational
course in which students saved time by having
access to (a) both asynchronous small- and
large-group discussion and synchronous discussion on the same page/screen/site; (b) an
internal e-mail system, which facilitated conversation; and, (c) a total of six chat rooms, four customizable rooms that were logged (recorded for
the instructor), and two general purpose rooms
that were not logged. Students were able to participate in several chat rooms at the same time. It
was also possible to send URL links via the chat
rooms, so that conversation members could
share additional information. The availability of
the whiteboard (shared work space for visual
messages) during the conversation was also useful. Students were able to share images and
shapes along with text during conversations.
Furthermore, students seemed to have fewer
complaints about the system being down or
slow. The availability of an internal e-mail system and more chat rooms, and access to the
whiteboard combined with the ease of file sharing among students within WebCT also seemed
to better support student interactions.

Table 2. Number of messages posted in the team area for the total of eight team activities.

Number of Entries in the Team Area | Fall 2000 | Fall 2001
Team 1 (n = 4) | 185 | 220
Team 2 (n = 4) | 272 | 317
Team 3 (n = 4) | 589 | 82
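For reference, the team totals in Table 2 are consistent with the per-activity figures reported below: across the eight weekly activities, the three teams posted 185 + 272 + 589 = 1,046 team-area messages in Fall 2000 and 220 + 317 + 82 = 619 in Fall 2001, or roughly 130 and 77 messages per activity.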
Another challenge during implementation
was time, not only for students but for the
instructor as well. Compared to face-to-face classes, the instructor had to spend many more
hours reading student postings, responding to
their ideas, participating in each team's discussion for the collaborative activities, providing
timely scaffolding in both the team area and
common forum, and giving timely feedback to
both individual assignments and group works.
If one adds the slow speed of the system to the
hours that the instructor had to spend online,
one can easily say that, for the instructor,
managing this online collaborative course was
equal to teaching two similar standard face-to-face graduate courses.
The other challenge was, indeed, a pleasant
surprise. This challenge proved to be the same
across two semesters with two different groups
of students and two different course management systems. Students' desire to do the best collaborative work, and their willingness to spend
as many hours as required in order to produce
their best solution were above and beyond expectations. In addition to exploring issues, sharing resources, and coaching one another in
understanding underlying concepts and
theories, students tended to spend tremendous
amounts of time working on the product or the
response as they tried to include everyone's
ideas in the team product (an average of 130
messages per activity in Semester One and an
average of 77 messages per activity in Semester
Two with a minimum of one hour-long chat session in Semester Two). This result was very impressive. The team products or responses were
high quality work. After the first two collaborative activities, the instructor's coaching and scaffolding strategies decreased, and she became a
team member, for the most part, during team
discussions. However, while it was desirable to
see high student involvement in the development of the response or solution, it was
suspected that the time and effort that students
were putting into their work would eventually
frustrate them and might, in fact, have a negative effect on their overall performance and
satisfaction. Therefore, it was a challenge to help
teams understand that whereas the instructor
was very impressed with their creative and collaborative products or responses, the process of
social interactions and negotiations was more
valued than what they had developed as end
products. Student investment of time on their
products also made it more difficult to critique
their final responses and to provide constructive
feedback without making them unhappy. Some
student responses to the instructor's feedback
(posted in teams' chat logs or informal conversation with the instructor) for the first three team
activities indicated that students expected to
hear praise for their best effort and work rather
than constructive criticism (although they did
not deny the value of the constructive feedback).
The last challenge in managing the course
was related to effective collaborative team skills.
It was a challenge to develop a collaborative
work environment in order to help teams learn
how to work effectively and collaboratively. Independent and task-oriented students, as identified by the learning styles inventories, seemed
to become distressed with one another easily,
and appeared to be more concerned about completing the task than exploring alternative solutions and negotiating multiple perspectives. At
the end of the semester, students learned to use
verbal communication effectively and did not
have to spend much time rephrasing what they
wanted to say and how they wanted to say it,
whereas, at the beginning of the semester, it was
a challenge that most of them had to face and
learn.
The problem-solving tasks appeared to influence student discussion and conversation. The number of
student postings in the team area (see Table 2)
and in the weekly large-group discussion board
(see Table 3), the content analysis of student discussion logs for both large-group and small-group discussions, and student responses to the
questionnaire indicated that problem-solving
tasks stimulated high quality discussion and
conversation among students. Students actively
participated in the weekly asynchronous discussion topic (weekly topics focused on the issues
underlying the weekly collaborative problem-solving task) and chat session (30 min).

Table 3. Number of messages posted in large-group discussion corresponding to eight team activities.

Weekly Discussion | Fall 2000 (n = 12), number of entries | Fall 2001 (n = 12), number of entries
Day 1 | 44 | 94
Day 2 | 30 | 66
Day 3 | 25 | 58
Day 4 | 19 | 71
Day 5 | 12 | 56
Day 6 | 16 | 33
Day 7 | 11 | 56
Day 8 | 9 | 38

Table 4. Average number of messages posted in the team area for the first and last collaborative problem-solving activities.

Average Number of Entries in the Team Area | Fall 2000 | Fall 2001
Activity 1 (n = 12) | 209 | 144
Activity 8 (n = 12) | 45 | 52

The
average number of postings for each week's
large-group discussion topic was 20 for the first
semester and 52 for the second semester. Students also actively participated in their team
discussions for eight weekly problem-solving
tasks (average of 130 messages per activity in
Semester One and 77 messages per activity in
Semester Two). Although the number of postings for the collaborative problem-solving activities decreased from the first team activity
(see Table 4) to the last team activity, the qualitative analysis showed that this decrease was not
due to the lack of participation. In the later collaborative problem-solving tasks, students
tended to post more messages related to the
problem-solving tasks and fewer messages
about the collaborative group process and technical problems. In addition, one out of the three
teams in the first semester and two out of the
three teams in the second semester participated
in a minimum of one 60-min chat session to
solve each week's collaborative problem-solving
task. The two remaining teams in the first
semester and the one in the second decided to
have face-to-face meetings instead of chat sessions for each collaborative team activity (students reported that they spent a minimum of 60
min discussing the task).
Qualitative analysis of student postings in
both large-group and small-group (team) discussion suggested that students engaged in a
highly focused discussion. In their synchronous
and asynchronous team conversations and in attempts to solve the problems, students shared
understanding of the problem and its underlying concepts, helped each other understand the
new concepts, referenced the appropriate reading materials, shared new resources, compared
strategies, and developed a fully collaborative
product. The following are excerpts of different
students' postings during small-group discussion.
"Okay folks, what is a mini lesson and how do we incorporate the theories without making it a JUMBO lesson?" "In the DESIGN and DEVELOPMNENT phases of this
theory, the materials are formulated. We should look
at what the materials need to say and how they should
look (depends on audience) and then decide what is
most cost-effective. (Reverse the order)." "Maybe we
should split the history into four parts and each use or
make a collage . . ." "Okay, here is a very simple
sample, let me know what you think." "The software
'Inspiration' is very good for showing how things are
interrelated." "C, are you using the basic information
on Dr. M's site? I love to sneak a peak at it when you
get it together." "The first component is simply
answering any of the five questions that are pertinent
to your theory. I think it would be easier if each of us is
responsible for researching one of the theories and
answering up to five questions for our theory. The
second component is where we all need to bring our
info together and summarize, compare and contrast."
In the large-group discussion forum corresponding to the teams' domain knowledge,
students explored the concepts, provided real
life examples, discussed related issues and
topics and debated different perspectives. Table
5 presents sample excerpts of different students'
comments in a large-group discussion (the
group discussion was randomly selected for this
illustration).
What Did Students Think About the
Course Design Specifications?
The results of the student questionnaire also
showed that students rated the team and large-group discussion forum (board) as very helpful
in understanding the course content and in contributing to the quality of the online learning environment (see Table 6). However, when asked
to rank order the importance to their learning of
each component of the weekly lessons (lesson
overview, lesson goals, required readings, individual assignments, collaborative team activities, resource materials, instructor's notes,
and lectures), students seemed to differ in their
perspectives; that is, student ratings for the weekly
problem-solving tasks ranged from 2 to 8 (where
1 = most important and 10 = least important), with
more students rating team activities either 4
(36.4%, n = 8) or 7 (27.3%, n = 6). Further analysis
of the results of student learning styles surveys
and their narrative comments on their reasons
for ranking items as most and least important indicated that there may be a relationship between
student learning styles and their ratings of the
components of each lesson. For example, students who reported being active learners (12 out
of 16) tended to rate the collaborative problem-solving tasks higher (from 2 to 5) than did students who reported being reflective learners
(e.g., "I need interaction/discussion to fully understand ideas" "My learning occurs here [discussion]" "It is very helpful to hear others'
perspective."). Likewise, students who reported
being reflective learners (4 out of 4) tended to
rate individual assignments higher (1 or 2) than
the collaborative problem-solving tasks (e.g., "I
like to think on my own" "I am not quick to use
it [discussion]" "They [individual assignments]
require me to do research.").
Table 5. Examples of different students' comments in a large-group discussion.

Assigned Codes | Student Comments

Exploring concepts | "Instructional Theories have been based on data obtained through several methods while Curriculum Theory has been based on philosophy or values. This seems to relate to deciding between your head and heart." "The more I read and the more we talk about the more unsure I am of the difference between instructional theories and instructional design theories. Can anyone clear that up for me?" "This a very difficult concept and a very minor difference that I am not surprised that you are confused . . ." "I may be completely wrong, but here is the difference I see between them: The Instructional Design Theory gives more guidance in how the lesson should proceed (step-by-step, with different ways to do each step) . . ."

Providing real life examples | "When we were discussing the specifics of the new paradigm, it reminded me of the 'Quality Workshops' that . . . County started several years. It started with a few people from each school who were required to attend. . ." "When my daughter was in 1st grade she was exposed to the 'Writing to Read program' by IBM. I remember vividly attending the sessions for parents and asking questions . . . old vs. new instruction. I had my doubts about this program. . ." "I also remember going through training programs on Working on the Work and advisor/advisee. Not much is mentioned about those theories now. If you talk to a veteran teacher they will tell you that every couple of years . . ."

Discussing related issues and topics | "I have learned that I am a knowledge user vs. a knowledge producer. I do see the need to see the process that takes place in instruction. . ." "As a mathematician, I would like to be able to say that I believed in something 100%, but I can't. There is an 'inner voice' that always plays devil's advocate with my thoughts, ideas, and feelings. I can say I believe in something emotionally, but reason always questions whether or not I am thinking clearly. . ."

Debating different perspectives | "Instructional Theories have been based on data obtained through several methods while Curriculum Theory has been based on philosophy or values. This seems to relate to deciding between your head and heart. . . ." "What's wrong with that analogy? In the case of choosing between you heart and your head you are the one that has to live with the decision" "I agree with . . . There is nothing wrong with that analogy. It sounds a lot better than my analogy: 'What ever works.'"
Table 6. Student rating of the team and large-group discussion forum.

Question | M | SD
Did participating in the weekly forum discussion contribute to your understanding of the course content? | 2.73 | .46
Did participating in the team discussion forum contribute to your understanding of the course content? | 2.91 | .29
Did participating in the weekly forum discussion contribute to the quality of the course learning environment? | 2.82 | .40
Did participating in the team discussion forum contribute to the quality of the course learning environment? | 2.91 | .29
Did individual assignments contribute to your understanding of the course content? | 2.91 | .29

Note. N = 22. 3-point scale: 3 = Very helpful, 2 = Some help, 1 = No help.
How Did Students Use the Cognitive
Support Strategies Integrated in the
Course Design?
The results of student questionnaires, together
with the analysis of discussion logs from the
large-group discussion board (forum) showed
that weekly individual assignments played a
major role in the quality of student interactions.
In their responses, students indicated that individual assignments were very helpful (see
Table 6) and assisted them in exploring the issues individually and in forming some opinions
before working on the problem with their team
members. Students who had completed their individual assignments before participating in the
team and large-group discussion posted more
messages, asked more questions, and raised
more underlying issues related to each topic and
problem in hand than those who did not complete their assignments on time. Students also
thought the large-group discussion board or
forum encouraged them to explore the ideas that
they either did not think about or had a problem
understanding. The e-mail logs documented in
the second semester showed that several students also used the e-mail system to ask the instructor individual questions.
However, none of the students indicated that
they used the IT Forum to explore expert
opinions during their team activities or individual assignments. Overall, it appeared that
individual assignments and the large-group discussion board provided cognitive support for
the teams' problem-solving tasks and influenced
the quality of student interaction and exchange
of ideas.
Analysis of each team's discussion logs and
the large-group discussion forum revealed that
after the first two problem-solving tasks, students tended to depend more on their peers for
information and discussion than on the
instructor's responses, notes, and comments.
While in the first two teams' discussion logs and
large-group discussion board, students either
waited for the instructor to reply or addressed
the instructor's comments, questions, or ideas;
in the later team and large-group discussions,
this was not the case. This result, combined with
evidence of positive interpersonal relationships
among team members, indicated that a community of leamers was formed in this online
course. Toward the middle of the semester, students were well acquainted with each other and
particularly with their team members. All of the
messages that were exchanged in the team area
were related to the problems at hand (except for
some emotional support). In their comments,
students also did admit that they spent less time
completing the last two problem-solving tasks
than the earlier ones.
How Did the Learning Task Influence
Group Discussion?
Which problem-solving tasks (intellective vs.
generative) created a better environment for
conversation and sharing of knowledge? Out of
11 collaborative problem-solving tasks, 1 task
(Collaborative Task Two) was identified as
being an intellective task because it had a correct
solution, while the rest of the tasks were
designed to be generative problem-solving tasks
(McGrath & Hollingshead, 1993). Student conversations for the intellective task (Collaborative
Task Two) were compared with student discussions in a generative task (Collaborative Task
One). Because both tasks were discussed at the
beginning of the semester, there seemed to be
more similarities between them in terms of student familiarity with the online discussion,
course content, and materials than with later
tasks. The qualitative analysis of student postings indicated that the contents of student conversations were somewhat different for different
types of tasks. The analysis confirmed that, as
McGrath and Hollingshead observed, the
generative task created a better environment for
discussion and construction of knowledge than
did the intellective task. Furthermore, student
conversations for generative problem-solving
tasks across both semesters and for all teams
seemed to be friendlier than for the intellective
task, in that students tended to praise and accept
each other's ideas and to add to them as their
conversations continued (i.e., "Yes, I like the
idea of visual, I am thinking of a collage type."
"After reading C's idea using visuals I now
think this approach would make an impressive
display." "How does everyone think about including text?"). However, discussions for the intellective task pointed to student attempts to
evaluate and validate each other's responses
against readings before accepting or rejecting
them (i.e., "I have looked them over and here is
my thinking . . ." "Okay, this is the third bullet
on the directions. I matched instructional theory
with theory 4. I think this theory builds on . . . "
"Which readings did you use M?"). Students
seemed to share more ideas, discuss alternative
ways of approaching the task and its solution,
and spend time trying to integrate different
ideas in one solution for generative tasks. On the
other hand, for the intellective task, it seemed
that each team member first tried to find an
answer to the problem, and then the team discussed which answer was the best one. The level
of student engagement in the discussion, however, did not seem to be different across different
tasks. After eliminating postings that were related to technical issues, it appeared that students
actively participated in both problem-solving tasks and posted a comparable
number of responses (177 postings for Activity 1
vs. 155 postings for Activity 2 in Fall 2000, and
80 postings for Activity 1 vs. 77 postings for Activity 2 in Fall 2001).
How Did Course Design Influence
Learning and Satisfaction?
Analysis of team performance products, the
quality of student interactions (in the team area),
and the nature of the questions and comments
that students posted in the discussion board or
forum demonstrated a deep understanding of
the course content and a high level of commitment to collaborative and cooperative work (see
Table 5 for examples of student thoughts and
Figure 2 for an example of student products). In
responses to the Incomplete Statements questionnaire, the majority of students (more than
80%, n = 22) in one way or another noted that
they liked the course because they were able to
work as teams and learn from each other. Student responses to the weekly individual assignments also demonstrated both student desire for
learning the content and their grasp of underlying concepts and ideas. Student design projects
also showed that 75% of students (19 out of 24)
achieved the majority of the course objectives
(one student in Fall 2000 and two students in
Fall 2001 missed attending the course in the last
three or four weeks of the course because of unexpected personal or job related problems and
did not complete the course capstone project).
Those students who did not achieve the course
objectives tended to miss participating in the
large-group and team discussions and completing individual assignments. The results of the
second survey (conducted at the end of the
course) showed that in spite of the heavy
workload of the course, students developed a
very positive attitude toward it. All students
who responded to the second survey (21) indicated that they would suggest this course to
other students even though some mentioned
that they would remind future students of the
heavy workload and the demand on collaborative work.

Figure 2. Example of student product.
The results of the Student Perception of
Teaching (SPOT)-a five-point scale instrument
(with 5 = strongly agree, and 1 = strongly disagree)
administered and analyzed by the university-confirmed the student course evaluation results
(students rated the course highly and thought
they had a good learning experience).
DISCUSSION AND IMPLICATIONS
The purpose of this paper was to describe
processes and results of using an interactive
design model for development of an online
course. The process of design and development
began with reviewing literature and establishing
an interactive design model, which was later
used to develop the course. The design specifications were evaluated as the course was
delivered. The design, development, and implementation process and the evaluation results
of this interactive online course pointed to
several important issues.
The course design model seems to be an influential factor
in creating an online interactive learning environment. During the design, development, and implementation process of this online interactive
course, it became clear that developing an online
course that encourages student exploration and
reflection required much more thinking, time,
and effort than had been predicted. Later, the
course evaluation results further confirmed that
designing an interactive and collaborative
course for online delivery was more of a
pedagogical issue than a technological issue.
The design model used in this study confirmed a
reciprocal interactive relationship among the
design factors or specifications (Moore, 1991),
suggesting that without such conceptualization
during the design process, it might be difficult to
create an interactive online course. Since many
classroom instructors are being encouraged to
design online courses with neither a strong
background in the pedagogy nor a clear understanding of the strengths and limitations of the
technology, it seems necessary to design and
develop instructional design models that are appropriate for this learning environment.
Moreover, although it is important to work with
the course management system and to develop
some expertise in multimedia, graphic arts, and
Web design, it is even more important that
designers of online courses learn to adapt and
design instructional activities and materials that
are functional within the courseware, while
being able to facilitate student learning, communication, and resource sharing.
Task structure and organization influence the nature
and quality of student interaction. Evaluation
results of this online course confirm that a successful, interactive and collaborative online
course requires well-designed and well-developed collaborative tasks, problems, or
activities that stimulate peer interaction and encourage peer collaboration (e.g., Hannafin,
Land, & Oliver, 1999; Jonassen, 1999; Nelson,
1999). They also suggest that online collaborative tasks or activities that provide structure can
diminish student confusion. Because of the
flexibility of time and place and the immediate
problems posed by the absence of rich nonverbal communication in online collaborative
tasks, developing a focus, a timeline, clear expectations, well-defined roles for each participant, and a clear evaluation format for the
online tasks is very important in improving interactivity and preventing confusion and
frustration. Furthermore, it is important that
domain knowledge be well integrated into the
problem. Well-integrated domain knowledge is
essential to online problem-solving tasks because it helps students understand the problem
and remain within the knowledge domain as
they are solving the problem.
Collaborative learning tasks should be carefully
designed and developed if they are to promote construction of knowledge through discussion and conversation. The results of formative evaluation in
this course echo previous research in collaborative and interactive learning pedagogy. Collaborative interactive learning is not just having
students talk to each other, either face-to-face or
in a computer-mediated conference while they
do their individual assignments. It is not having
students do the task individually, and then have
those who finish first help those who have not yet
finished, or talk about it. It is not having students
learn certain facts and concepts and then share
them with other peers. It is not having one or a few
students do all the work while others append
their names to the report. The idea of collaborative, interactive learning is the development of
shared meaning among group members, a
perspective that emphasizes the social creation of
knowledge as the basis of learning. This shared
meaning needs to occur within a learning activity
that provides a means for both individual
development and collaborative construction of
knowledge. Such learning activities should be
carefully designed to be suitable for group work,
and should be designed in such a way as to encourage learners to explore and make use of new
knowledge and skills in order to solve the problem at hand. The collaborative activities should
also carefully structure positive interdependence
to ensure that students are committed to each
other as persons and to each other's success. The
structure of the collaborative activity should
promote individual accountability while simultaneously requiring coordinated effort to complete joint responses. The design model used in this course appeared to be effective in creating an environment in which students shared meaning and ownership of their knowledge and were committed to each other's success.
The nature and type of collaborative task influence the content of students' interaction and the process of shared knowledge construction. The formative evaluation data in this course indicated that generative problem-solving tasks seemed to create an environment in which students constructed shared knowledge and products, and formulated and negotiated their understanding of the content. In other words, as a result of these activities, students actively participated in generating the course content. However, similar results were not observed for the intellective problem-solving task. Although the limited number of intellective tasks used in this course makes it difficult to draw firm conclusions, it seems that, as McGrath and Hollingshead (1993) observed, the existence of a correct response for intellective tasks prevents students from generating their own ideas and encourages them to search the available information for the right answer. Future research should focus more on the nature of problem-solving tasks and their effects on the social construction of knowledge in online learning environments.
Augmenting group activities with individual assignments seems to improve the quality of interactions and to encourage student participation. As Salomon and Perkins (1998) and Jonassen (1999) noted, in contexts of active social mediation the learner still remains an individual learner in significant ways. The experience in this course indicates that one cannot expect learners to know enough about the knowledge domain to participate constructively in the problem-solving task; the learners must be prepared for an intellectual discussion. Supplementing group activities with individual activities was therefore a good design feature in this course. Asking students to individually explore the underlying concepts and issues in advance, and to connect them with their own previous experiences, helped students to better understand the collaborative task and prepared them to formulate ideas and participate in both team and large-group discussions. Furthermore, by providing individual feedback to each student, the instructor was able not only to encourage students to think more, but also to prompt them to examine alternative perspectives and resources.
As a facilitator, the instructor of an interactive online course should prepare students for
discussions. Part of the role of a facilitator
during the team and large-group discussion is to
remind students of some norms (netiquette) for
an online discussion. This finding, which emerged during the implementation of this online course, suggested that the facilitator may need to remind students of issues such as:
* How to reply to each other's comments (e.g.,
not just saying "I agree" or "I disagree" but
expanding on the topic of agreement or disagreement).
* How to disagree, but be respectful.
* How to reflect and reformulate ideas, and so
on.
In facing the challenge of preparing students for a text-based online discussion, the best way for the facilitator to coach students is to model the expected behaviors and become a member of the collaborative teams. Modeling what is expected of a student as a team member seems to be more effective than telling students how to manage online discussion.
Students in an interactive online course need time to adjust to the new technology. The results of the
formative evaluation along with the delivery
process showed that the newness of the medium
and the learning environment required that students learn about the software, adjust to a textbased communication environment, and
become accustomed to less rich information
(compared to rich face-to-face communication).
Therefore, the instructors of online courses must
be patient, encouraging, and considerate of the
fact that the initial discussions are not indicative
of student performance.
Immediacy behaviors (behaviors that enhance closeness to and nonverbal interaction with others) affect student motivation to carry on the discussion and discourse. Both the instructor and students in this course agreed that
online text-based discussion was not as stimulating and interesting as face-to-face discussion, in
which they were able to see faces, gazes, facial expressions, and lip movements, and could hear
vocal expressions. However, as the discussion
progressed in the semester, both the instructor
and the learners learned to include some of the
interactive immediacy cues, such as praising
each other, addressing one another by name,
digressing to respond to the posted comments,
and providing prompt feedback in their discussions. These practices seemed to influence student perceptions and attitudes and to increase their willingness to participate in the discussion
and interaction. Incorporating videotape recordings and videoconferencing could improve online discussion by supplying the nonverbal cues that were absent in this online course.
In conclusion, Internet communication tools
do not enhance learning by themselves. Rather,
they provide avenues for learning when placed
in the capable hands of skillful teachers and
designers of instruction.
Mahnaz Moallem [moallemm@uncw.edu] is
Associate Professor of Instructional Technology in
the Watson School of Education at the University of
North Carolina at Wilmington.
The author would like to thank ETR&D reviewers
for their constructive feedback on the earlier version
of this paper, and the graduate students of the
instructional technology program for their
willingness to participate in this project.
REFERENCES
Argyle, M., Lalljee, M., & Cook, M. (1968). The effects of visibility on interaction in a dyad. Human Relations, 21, 3-17.
Barrows, H.S., & Tamblyn, R.M. (1980). Problem-based learning: An approach to medical education. NY: Springer Publishing Co.
Berge, Z.L. (1995). Facilitating computer conferencing: Recommendations from the field. Educational Technology, 35, 22-30.
Berge, Z.L. (1999). Interaction in post-secondary Web-based learning. Educational Technology, 39(1), 5-11.
Blacklow, R.S., & Engel, J.D. (1991). The University of Delaware/Jefferson Medical College Medical Scholars Program: An approach to educating physicians for academic leadership and practice. Del. Med. J., 63, 303-307.
Boshier, R., Mohapi, M., Moulton, G., Quayyum, A., Sadownik, L., & Wilson, M. (1997). Best and worst dressed Web courses: Strutting into the 21st century in comfort and style. Distance Education, 18, 327-349.
Boud, D. (Ed.). (1985). Problem-based learning for the professions. Sydney: HERDSA.
Boud, D., & Feletti, G. (Eds.). (1991). The challenge of problem based learning. London: Kogan Page.
Brown, J.S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18, 32-42.
Daft, R.L., & Lengel, R.H. (1986). A proposed integration among organizational information requirements, media richness, and structural design. Management Science, 32, 554-571.
Engel, C.E. (1997). Not just a method but a way of learning. In D. Boud & G. Feletti (Eds.), The challenge of problem based learning (2nd ed.). London: Kogan Page.
Felder, R.M. (1993). Reaching the second tier: Learning and teaching styles in college science education. Journal of College Science Teaching, 23(5), 286-290.
Felder, R., & Silverman, L. (1988). Learning and teaching styles in engineering education. Engineering Education, 78(7), 674-681.
Foshay, R., & Bergeron, C. (2000). Web-based education: A reality check. TechTrends, 44, 16-19.
Gilbert, L., & Moore, D.R. (1998). Building interactivity into Web courses: Tools for social and instructional interaction. Educational Technology, 38(3), 29-35.
Greeno, J.G. (1997). Response: On claims that answer the wrong question. Educational Researcher, 20(1), 5-17.
Hackman, J.R., & Morris, C.G. (1975). Group task, group interaction process and group performance effectiveness: A review and proposed integration. In L. Berkowitz (Ed.), Advances in experimental social psychology (pp. 47-100). San Diego, CA: Academic Press.
Hannafin, M.L., Land, S.M., & Oliver, K. (1999). Open learning environments: Foundations, methods, and models. In C.M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (Vol. III) (pp. 115-141). Hillsdale, NJ: Lawrence Erlbaum Associates.
Harasim, L. (1989). On-line education: A new domain. In R. Mason & A. Kaye (Eds.), Mindweave: Communication, computers and distance education (pp. 50-62). Oxford: Pergamon Press.
Hiltz, S.R. (1994). The virtual classroom: Learning without limits via computer networks. Norwood, NJ: Ablex Publishing Corporation.
Hiltz, S.R. (1997). Impacts of college-level courses via asynchronous learning networks: Some preliminary results. Available online: [http://eies.njit.edu/hiltz/workingpapers/philly/philly.htm].
Imel, S., & Tisdell, E.J. (1996). The relationship between theories about groups and adult learning groups. In S. Imel (Ed.), Learning in groups: Exploring fundamental principles, new uses, and emerging opportunities. New Directions for Adult and Continuing Education, no. 71 (pp. 15-24). San Francisco: Jossey-Bass.
Johnson, D.W., Johnson, R.T., & Smith, K.A. (1991). Cooperative learning: Increasing college faculty instructional productivity (ASHE-ERIC Higher Education Report No. 4). Washington, DC: The George Washington University, School of Education and Human Development. (ERIC Document Reproduction Service No. ED 343 465)
Johnson, D.W., Johnson, R.T., & Smith, K.A. (1998). Active learning: Cooperation in the college classroom. Edina, MN: Interaction Book Company.
Jonassen, D. (1999). Designing constructivist learning environments. In C.M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (Vol. III) (pp. 215-241). Hillsdale, NJ: Lawrence Erlbaum Associates.
Jonassen, D., & Rohrer-Murphy, L. (1999). Activity theory as a framework for designing constructivist learning environments. Educational Technology Research and Development, 47(1), 61-79.
Kearsly, G. (1995). The nature and value of interaction in distance learning. Retrieved 2003 from http://www.gwu.edu/~etl/interact.html
Kendon, A. (1967). Some functions of gaze direction in social interaction. Acta Psychologica, 26, 1-47.
Knowles, M.S. (1975). Self-directed learning: A guide for learners and teachers. New York: Association Press.
Knowles, M.S. (1990). Fostering competence in self-directed learning. In R.M. Smith (Ed.), Learning to learn across the life span (pp. 123-136). San Francisco: Jossey-Bass.
LaRose, R., & Whitten, P. (1999). Websection: Building Web courses with instructional immediacy. Retrieved December 1, 1999 from http://www.telecommunication.msu.edu/faculty/larose/websectionlite.htm
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge University Press.
Martin, B.L., & Reigeluth, C. (1999). Affective education and the affective domain: Implications for instructional design theories and models. In C.M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (Vol. III) (pp. 485-511). Hillsdale, NJ: Lawrence Erlbaum Associates.
McGrath, J.E. (1984). Groups: Interaction and performance. Englewood Cliffs, NJ: Prentice Hall.
McGrath, J.E. (1990). Time matters in groups. In J. Galegher, R.E. Kraut, & C. Egido (Eds.), Intellectual teamwork: Social and technological foundations of cooperative work (pp. 23-61). Hillsdale, NJ: Lawrence Erlbaum Associates.
McGrath, J.E. (1991). Time, interaction, and performance (TIP): A theory of groups. Small Group Research, 22, 147-174.
McGrath, J.E. (1992, November). Group, technologies, tasks, and time. Paper presented at the annual meeting of Computer Supported Cooperative Work, Toronto, Ontario.
McGrath, J.E., & Hollingshead, A.B. (1993). Putting the "group" back in group support systems: Some theoretical issues about dynamic processes in groups with technological enhancements. In L.M. Jessup & J.S. Valacich (Eds.), Group support systems: New perspectives (pp. 78-96). NY: Macmillan.
McNabb, J. (1994). Telecourse effectiveness: Findings in the current literature. TechTrends, 39(4), 39-40.
Moore, M.G. (1991). Editorial: Distance education theory. The American Journal of Distance Education, 5(3), 1-6.
Moore, M.G. (1992). Three types of interaction. The American Journal of Distance Education, 3(2), 1-6.
Moore, M.G. (1993). Theory of transactional distance. In D. Keegan (Ed.), Theoretical principles of distance education (pp. 22-38). London & New York: Routledge.
Moore, M., & Kearsley, G. (1995). Distance education: A systems view. NY: Wadsworth Publishing Co.
Mortera-Gutierrez, F., & Murphy, K. (2000). Instructor interactions in distance education environments. Paper presented at the Annual Distance Education Conference (7th, Austin, TX, January 25-28, 2000).
Muirhead, B. (1999). Attitudes toward interactivity in a graduate distance education program: A qualitative analysis. Parkland, FL: Dissertation.com.
Muirhead, B. (2000). Interactivity in a graduate distance education school. Educational Technology & Society, 3(1).
Muirhead, B. (2001). Enhancing social interaction in computer-mediated distance education. Ed at a Distance Journal, 15(4). Retrieved 2003 from http://usdla.org/ED-magazine/illuminactive/APR01_Issue/article02.html
Nelson, L.M. (1999). Collaborative problem solving. In C.M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (Vol. III) (pp. 241-265). Hillsdale, NJ: Lawrence Erlbaum Associates.
Parker, A. (1999). Interaction in distance education: The critical conversation. Educational Technology Review, 12 (Autumn-Winter), 13-17.
Perkins, D.N. (1993). Person plus: A distributed view of thinking and learning. In G. Salomon (Ed.), Distributed cognitions (pp. 88-110). NY: Cambridge University Press.
Peterson, M. (1996). A team-based approach to problem-based learning: An evaluation of structured team problem solving. Journal on Excellence in College Teaching, 7(3), 129-153.
Rau, W., & Heyl, B.S. (1990). Humanizing the college classroom: Collaborative learning and social organization among students. Teaching Sociology, 18, 141-155.
Reigeluth, C.M. (1999). Instructional design theories and models: A new paradigm of instructional theory (Vol. I). NJ: Lawrence Erlbaum Associates.
Rubin, R.B., Rubin, A.M., & Jordan, F.F. (1997). Effects of instruction on communication apprehension and communication competence. Communication Education, 46, 104-114.
Rutter, D.R., & Stephenson, G.M. (1975). The role of visual communication in synchronizing conversation. European Journal of Social Psychology, 7, 29-37.
Saba, F., & Shearer, R.L. (1994). Verifying key theoretical concepts in a dynamic model of distance education. The American Journal of Distance Education, 8(1), 36-59.
Salomon, G., & Perkins, D.N. (1998). Individual and social aspects of learning. Review of Research in Education, 23. Retrieved 2003 from http://construct.haifa.ac.il/~gsalomon/indsoc.htm
Shaw, A. (1996). Social constructionism and the inner city. In Y. Kafai & M. Resnick (Eds.), Constructionism in practice: Designing, thinking, and learning in a digital world. Mahwah, NJ: Lawrence Erlbaum Associates.
Shepperd, J.A. (1993). Productivity loss in performance groups: A motivation analysis. Psychological Bulletin, 113, 67-81.
Sherry, L. (1996). Issues in distance learning. International Journal of Educational Telecommunications, 1(4), 337-365.
Slavin, R.E. (1995). Cooperative learning: Theory, research, and practice (2nd ed.). Boston: Allyn & Bacon.
Spitzer, D.R. (2001). Don't forget the high-touch with the high-tech in distance learning. Educational Technology, 41(2), 51-55.
Straus, S.G., & McGrath, J.E. (1994). Does the medium matter? The interaction of task type and technology on group performance and member reactions. Journal of Applied Psychology, 70(1), 87-97.
Vygotsky, L.S. (1978). Mind in society (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds.). Cambridge, MA: Harvard University Press.
Zirkin, B., & Sumler, D. (1995). Interactive or non-interactive? That is the question! An annotated bibliography. Journal of Distance Education, 10(1), 95-112.