Interactive Learning Systems Evaluation

Six Facets of Instructional Product Evaluation

Evaluation Functions → Development Activities:
• Review → Product Conceptualization
• Needs Assessment → Design
• Formative Evaluation → Development
• Effectiveness Evaluation → Implementation
• Impact Evaluation → Institutionalization
• Maintenance Evaluation → Project Re-conceptualization
EMCC Design Document
• Urban Science Course Environment
Dimensions of effective technology-enhanced learning environments:
• Task-Oriented
• Challenging
• Collaborative
• Constructionist
• Conversational
• Responsive
• Reflective
• Formative
Task-Oriented (Academic ↔ Authentic)
The tasks faculty set for students define the essence of the learning environment. If appropriate, tasks should be authentic rather than academic.
Task-Oriented Example
Students in online instructional design courses are tasked with designing interactive modules for real clients.
Challenging (Simple ↔ Complex)
The notion that interactive learning is easy should be dispelled. Learning is difficult, and students should not be spoon-fed simplified versions of their fields of study.
Challenging Example
In a Master of Public Health program, students confront problems as complex and difficult as the ones they’ll face in the real world.
Collaborative (Unsupported ↔ Integral)
Web-based tools for group work and collaboration can prepare students for teamwork in 21st-century work environments.
Collaborative Example
Art, dance, and music students are collaborating to produce online shows with digital versions of their works and performances for critique by international experts.
Constructionist (Replication ↔ Origination)
Faculty should engage students in creating original knowledge representations that can be shared, critiqued, and revised.
Constructionist Example
Students in fields ranging from aero-engineering to zoo management are producing digital portfolios as integral components of their academic programs.
Conversational (One-way ↔ Multi-faceted)
Students must have ample time and secure spaces for in-depth discussions, debates, arguments, and other forms of conversation.
Conversational Example
New knowledge and insight are being constructed in conversation spaces such as the e-learning forums found in Blackboard, WebCT, Desire2Learn, and other online learning authoring platforms.
Responsive (Superficial ↔ Genuine)
In learning communities, both faculty and students have a mutual responsibility to respond quickly, accurately, and with respect.
Responsive Example
This is an area where R&D is needed. Some universities are seeking to establish supportive online networks that will continue throughout a career, indeed throughout a life.
Reflective (Shallow ↔ Deep)
Both faculty and learners must engage in deep reflection and metacognition. These are not instinctive activities, but they can be learned.
Reflective Example
Teacher-preparation students are keeping electronic journals to reflect upon the children they teach and their roles as advocates for children.
Formative (Fixed Assessment ↔ Developmental)
Learning environments can be designed to allow students to develop prototype solutions over time rather than to find one right answer that someone else has defined.
Formative Example
Faculty should engage their students in ongoing efforts to evaluate and refine their work related to authentic tasks to encourage lifelong learning.
Traditional Course vs. Online Course: a comparison of the two course types across the eight dimensions (Task-Oriented, Challenging, Collaborative, Constructionist, Conversational, Responsive, Reflective, Formative).
Heuristic Review
What is usability?
• The concern with designing software applications which people find easy to use and personally empowering.
• Usable computer programs are logical, intuitive, and clear to the people who use them.
Web Site Usability
• “The most common user action on a Web site is to flee.” (Edward Tufte)
• “At least 90% of all commercial Web sites are overly difficult to use... the average outcome of Web usability studies is that test users fail when they try to perform a test task on the Web. Thus, when you try something new on the Web, the expected outcome is failure.” (Jakob Nielsen)
Typical Web Usability Problems
• bloated page design
• internally focused design
• obscure site structures
• lack of navigation support
• writing style optimized for print
(Jakob Nielsen, http://www.useit.com/)
Key Usability Principles
• Structure - organize meaningfully
• Simplicity - make common tasks easy
• Visibility - all data needed for a task
• Feedback - keep users informed
• Tolerance - allow cancel, back
• Reuse - reduce the users' need to remember
Nielsen’s Web Usability Rules
• Visibility of system status
• Match between system and real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Help users recognize, diagnose, and recover from errors
• Help and documentation
• Aesthetic and minimalist design
Two Major Ways to Evaluate Usability
• Heuristic Review
– quick and relatively inexpensive
– based on expert analyses
– no user involvement
• Usability Testing
– finds more problems
– user involvement increases validity
– when designers see problems live, it has a huge impact
Heuristic Review
• Several experts individually compare a product to a set of usability heuristics.
– Typical heuristic: Visibility of system status. The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
Heuristic Review
• Violations of the heuristics are evaluated for their severity and extent.

Severity Scale:
1 Cosmetic: fix if possible.
2 Minor: fixing this should be given low priority.
3 Medium: fixing this should be given medium priority.
4 Major: fixing this should be mandatory before the system is launched. If the problem cannot be fixed before launch, ensure that the documentation clearly shows the user a workaround.
5 Catastrophic: fixing this is mandatory; no workaround possible.

Extensiveness Scale:
1 Single case
2 Several places
3 Widespread
Heuristic Review
• At a group meeting, violation reports are categorized and assigned:
– Heuristics violated are identified
– Average severity and extensiveness ratings are compiled
– Opportunities for improvement are clarified
– Feasible solutions are recommended
Heuristic Review
• Example of an Opportunity for Improvement (a sketch of how such summaries can be compiled in code follows this slide):

Opportunity 1 (4 reports. Avg. Severity=2.25, Avg. Extent=2.34, Heuristics Used: 1, 3)
Consider providing more user feedback about where they are and what they should do next. Examples cited:
– No page-progress indicator
– No indication of how to start
Suggestions:
– Provide a page-progress indicator, such as “page 3 of 12”
– Put a “Click a section below to start:” on the first screen, as a TOC header
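
The compilation step above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original method: the report data, ratings, and heuristic numbers below are invented for the example, loosely echoing Opportunity 1.

from collections import defaultdict
from statistics import mean

# Hypothetical violation reports from individual expert reviewers.
# severity uses the 1-5 scale above; extent uses the 1-3 scale.
reports = [
    {"opportunity": 1, "heuristics": [1, 3], "severity": 3, "extent": 3},
    {"opportunity": 1, "heuristics": [1], "severity": 2, "extent": 2},
    {"opportunity": 1, "heuristics": [3], "severity": 2, "extent": 2},
    {"opportunity": 1, "heuristics": [1], "severity": 2, "extent": 2},
]

# Group the reports by the opportunity they were assigned to
# at the group meeting.
by_opportunity = defaultdict(list)
for report in reports:
    by_opportunity[report["opportunity"]].append(report)

# Compile each opportunity's summary line: report count, average
# severity, average extensiveness, and the heuristics violated.
for opp, rs in sorted(by_opportunity.items()):
    heuristics = sorted({h for r in rs for h in r["heuristics"]})
    print(f"Opportunity {opp} ({len(rs)} reports. "
          f"Avg. Severity={mean(r['severity'] for r in rs):.2f}, "
          f"Avg. Extent={mean(r['extent'] for r in rs):.2f}, "
          f"Heuristics Used: {', '.join(map(str, heuristics))})")

Run as-is, this prints a summary line in the same format as the example above.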
Heuristic Review
• Advantages
– Quick: Do not need to find or schedule users
– Easy to review problem areas many times
– Inexpensive: No fancy equipment needed
• Disadvantages
– Validity: No users involved
– Finds fewer problems (50% fewer in some cases)
– Getting good experts can be challenging
– Building consensus with experts is sometimes difficult
Another Weakness
• Some people believe that heuristic evaluation is too subjective.
• Human judges are prone to poor judgment at times.
Usability Standards
http://www.astd.org/ASTD/marketplace/ecc/ecc_home
ASTD offers certification of e-learning courses, including 8 usability standards:
– Navigation
– Orientation
– Feedback cues
– Link cues
– Links labeled
– Help
– Legibility
– Text quality
Heuristics for E-Learning Evaluation
1. Visibility of system status: The e-learning program keeps the learner informed about what is happening, through appropriate feedback within reasonable time. For example:
• red for a problem
• yellow for a warning
• green for OK
Heuristics for E-Learning Evaluation
2. Match between system and the real world: The e-learning program’s interface employs words, phrases, and concepts familiar to the learner or appropriate to the content, as opposed to system-oriented terms. Wherever possible, the e-learning program utilizes real-world conventions that make information appear in a natural and logical order.
Heuristics for E-Learning Evaluation
3. Error recovery and exiting: The e-learning program allows the learner to recover from input mistakes and provides a clearly marked “exit” to leave the program without having to go through an extended dialogue.
Heuristics for E-Learning Evaluation
4. Consistency and standards: When appropriate to the content and target audience, the e-learning program adheres to general software conventions and is consistent in its use of different words, situations, or actions.
Heuristics for E-Learning Evaluation
5. Error prevention: The e-learning program is carefully designed to prevent common problems from occurring in the first place.
Heuristics for E-Learning Evaluation
6. Navigation support: The e-learning program makes objects, actions, and options visible so that the user does not have to remember information when navigating from one part of the program to another. Instructions for use of the program are always visible or easily retrievable.
Heuristics for E-Learning Evaluation
7. Aesthetics: Screen displays do not contain information that is irrelevant, and “bells and whistles” are not gratuitously added to the e-learning program.
Heuristics for E-Learning Evaluation
8. Help and documentation: The e-learning program provides help and documentation that is readily accessible to the user when necessary. The help provides specific, concrete steps for the user to follow. All documentation is written clearly and succinctly.
Heuristics for E-Learning Evaluation
9. Interactivity: The e-learning program provides content-related interactions and tasks that support meaningful learning.
Heuristics for E-Learning Evaluation
10. Message design: The e-learning program presents information in accord with sound principles of information-processing theory.
Heuristics for E-Learning Evaluation
11. Learning design: The interactions in the e-learning program have been designed in accord with sound principles of learning theory.
Heuristics for E-Learning Evaluation
12. Media integration: The inclusion of media in the e-learning program serves clear pedagogical and/or motivational purposes.
Heuristics for E-Learning Evaluation
13. Instructional assessment: The e-learning program provides assessment opportunities that are aligned with the program objectives and content.
Heuristics for E-Learning Evaluation
14. Resources: The e-learning program provides access to all the resources necessary to support effective learning.
Heuristics for E-Learning Evaluation
15. Feedback: The e-learning program provides feedback that is contextual and relevant to the problem or task in which the learner is engaged.
Review
• The purpose of review is to ensure that the development team is well-informed about previous work done in the area during the early stages of product conceptualization.
• Designers must avoid reinventing the wheel.
Review
• The two primary methods used are reviewing the related literature and reviewing competing products.
• Regularly reviewing competing products is a great professional development practice.
Needs Assessment
• The purpose of needs assessment is to identify the critical needs that an instructional product is supposed to meet.
• Needs assessment provides essential information to guide the design phase of the development process.
Needs Assessment
• The primary methods are:
– task analysis,
– job analysis, and
– learner analysis.
• One of the most important results is a list of specific goals and objectives that learners will accomplish with the new product.
Formative Evaluation
• The purpose is to collect information that can be used for making decisions about improving interactive learning products.
• Formative evaluation starts with the earliest stages of planning and continues through implementation.
Formative Evaluation
• Provided the results are used, formative evaluation usually provides the biggest payoff of all evaluation activities.
• Clients may be reluctant to accept the results of formative evaluation, especially as a program nears completion.
Effectiveness Evaluation
• The purpose is to estimate short-term effectiveness in meeting objectives.
• It is a necessary, but insufficient, approach to determining the outcomes of interactive learning.
Effectiveness Evaluation
• Evaluating implementation is as important as evaluating outcomes.
• If you don’t understand how instructional products were actually implemented, you can’t interpret results.
Impact Evaluation
• The purpose is to estimate the long-term impact on performance, both intended and unintended.
• It is extremely difficult to evaluate the impact of interactive learning products.
Impact Evaluation
• Evaluating impact is increasingly critical because of the emphasis on the bottom line.
• More and more clients expect impact evaluation to include “return-on-investment” (ROI) approaches; a worked example follows.
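
For reference, the basic ROI calculation commonly applied in training evaluation takes this form (the slides do not prescribe a specific formula, and the figures below are purely illustrative):

ROI (%) = ((program benefits − program costs) / program costs) × 100

For example, if an interactive course costs $50,000 to develop and deliver and yields $65,000 in measured performance benefits, then ROI = ((65,000 − 50,000) / 50,000) × 100 = 30%.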
Maintenance Evaluation
• The purpose of maintenance evaluation is to ensure the viability of an interactive product over time.
• Maintenance is one of the weakest links in web-based learning environments.
Maintenance Evaluation
• Document analysis, interviews, observations, and automated data collection are among the methods used in maintenance evaluation.
• Very few education and training agencies engage in serious maintenance evaluation.
Planning is the key to successful instructional product evaluation.
• Evaluation requires good planning, careful implementation, and systematic follow-up.
• A major challenge is getting clients to identify the decisions they face.
• Clear decisions drive the rest of the planning.