Summative evaluation questions

Microsoft Excel 2002
Meeting the Workbook
Evaluation
Instructions: Evaluate the website for Meeting the Workbook by checking “Yes” or “No” for the statements below.
http://komodo.utsystem.edu/med/cmclain/excel/overview.htm
Start time: ______
End time: ______
Learning Objectives and Value
The objectives for Meeting the Workbook, Lesson 1, and Lesson 2 were clearly stated.
The materials and activities were directly related to the objectives.
The learner was actively involved throughout Meeting the Workbook.
The activities and practices promoted active learning.
Meeting the Workbook provided opportunity for self-assessments.
Comments:
Course Navigation and Instruction
Meeting the Workbook is easy to navigate, all links are active, and images download properly.
The design is flexible enough for me to move around at my own pace.
The language used is appropriate for Meeting the Workbook learners.
The directions were clear.
The amount of information on each page is appropriate.
The screens are aesthetically pleasing and effective.
Meeting the Workbook maintains a consistent use of headers, graphics, page layout, and text
(font type, style, size, and color).
The text is easy to read and information is broken into short paragraphs with adequate use of
white space, bulleted lists, and tables.
Comments:
Course Content and Activities
The content is clearly explained.
The introduction draws the learner into the lesson by relating to the learner’s interest and
describes a compelling question or problem.
The examples and illustrations help the learner understand the concepts and skills.
There are an ample number of activities and the placement makes sense.
The activities helped to reinforce understanding the content.
Feedback on practices and assessments is immediate.
Comments:
Test Results
Lesson 1 - ___ out of 8 terms correctly identified
Terms missed:
Lesson 2 - ___ out of 10 questions answered correctly
Questions missed:
Conclusion –
Acme information entered correctly
BBunny information entered correctly
Adapted and modified from Dick, W., Carey, L., & Carey, J. O. (2005). The Systematic Design of Instruction (6th ed.). Boston: Pearson, pp. 289 and 302;
http://www.e-learningguru.com/tools/pilottest.doc; http://www.pinnaclelearning.com/IDToolkit/Evaluation/formative.htm; http://jan.ucc.nau.edu/~ksw8/evaluation.htm;
http://webquest.sdsu.edu/webquestrubric.html; http://help.ramct.colostate.edu/training/course_evalform.pdf; and CTSS
http://www.ed.uiuc.edu/courses/ci235/tutorial/composer/eva_form.html.
1. What is the purpose of the evaluation session?
The purpose of the summative evaluation is to determine the worth of the program by recommending that the instruction be
reviewed, retained, rejected, or revised. The summative evaluation collects data and information to verify and
measure the instructional outcomes reflected in the instructional goal and the effectiveness of the instruction for the
target audience. The summative evaluation is divided into two phases: expert judgement and field trial. The expert
judgement phase determines whether the instruction met the instructional goal by evaluating the congruency,
completeness and accuracy, instructional strategy, utility, and user satisfaction of the instruction. The field trial
phase measures student achievement against the instructional objectives using a target group of learners in the
intended instructional setting. The summative evaluation can be viewed as the instruction’s final grade.
2. Who is involved in the session? What is his/her profile?
3. Where is the session taking place?
The expert evaluator, Mrs. Emily Moore, was asked to participate in the evaluation because of her background in
copy editing and proofreading. Mrs. Moore is a peer in the Master of Education program at the University of Texas
at Brownsville. Her occupations have included copy and developmental editor, freelance writer/editor, and
published author. Mrs. Moore edited the book Excel for Starters. She reviewed the instruction in a personal setting,
such as her home or office.
The field trial target audience included three learners.
- Tom McLain is a 42-year-old male. He has completed college courses and works as an electrician.
He is knowledgeable about computer hardware components and is learning Word, Excel, and
AutoCAD.
- Ethel Niemeyer is a 67-year-old female. She is a homemaker and a novice computer user currently
learning how to use email.
- Robert Ross is a 22-year-old male. He is a junior philosophy major at the University of North Texas.
He is an experienced computer user.
The learners completed the evaluation in their own homes.
4. What are the results?
5. What is your reaction to the results?
6. What is your action to the results?
Expert Evaluator
Mrs. Moore’s evaluation identified areas of concern and suggested ways to improve the instruction. I agreed with
her findings, and my response to the results is to use the information to revise the course. Mrs. Moore suggested
ways to make the instruction more congruent and complete, such as telling the learners why they need to know
specific information. She liked that the course included a “hook” explaining why learners should study Excel and
that the course was short and concise. She found the instruction effective in its use of screenshots, immediate
feedback, and Excel itself to give a quiz about Excel. Learner satisfaction should improve once the instruction is
changed to incorporate her copy edits and improvement suggestions. Overall, Mrs. Moore indicated I did a “great
job” “taking a big topic and condensing it into small, manageable chunks.”
Field Trial
The field trial produced test data and identified areas of concern. I agreed with the concerns raised by the target
audience, and my response to the results is to revise the course. Two of the target group members experienced
problems with the button links at the top of each webpage. The problem was resolved when it was discovered that
Java needed to be enabled for the links to work; the learners had no problems once Java was enabled. The
instruction will be revised to include a note about how to fix the links if they are not working properly. Student
achievement results were based on the learning goal and performance objectives:
Learning Goal – The instruction will teach five adult learners how to create a Microsoft Excel 2002 workbook.
Performance Objectives – Given a computer with Microsoft Excel 2002,
I. adult learners will identify spreadsheet terminology correctly
II. adult learners will create a workbook without referencing a study guide
The first performance objective resulted in all learners identifying 8 out of 8 terms correctly. The second
performance objective resulted in all learners creating a workbook without the use of a study guide. The instruction
is effective because the target group successfully met the performance objectives in the intended instructional
setting, the learner’s home.
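The mastery check described above is a simple decision rule: an objective is met only when every learner in the target group reaches the standard set by that objective. A minimal sketch of that rule in Python (the learner names are from the field trial, but the per-learner Lesson 2 scores are illustrative, since only the Lesson 1 scores were reported individually):

```python
def objective_met(scores, standard):
    """True when every learner's score meets or exceeds the standard."""
    return all(score >= standard for score in scores)

# Objective I: identify 8 of 8 spreadsheet terms correctly.
lesson1_scores = {"Tom": 8, "Ethel": 8, "Robert": 8}

# Objective II: create a workbook without a study guide
# (pass/fail, coded here as 1 = created without the guide).
lesson2_results = {"Tom": 1, "Ethel": 1, "Robert": 1}

print(objective_met(lesson1_scores.values(), standard=8))   # True
print(objective_met(lesson2_results.values(), standard=1))  # True
```

If any single learner fell below the standard, the objective would be reported as unmet, which is why one low score in the field trial would have triggered a revision rather than a pass.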
Resources:
http://www.gse.pku.edu.cn/jxsj/materials2/Dick%20&%20Carey.htm
http://www.pittcc.edu/distance-learning/faculty-resources/ID_demo.swf
http://en.wikiversity.org/wiki/Before_You_Begin
http://www.tomdorgan.com/eportfolio/defEval.html
http://classweb.gmu.edu/ndabbagh/Resources/IDKB/eval_techniques_summ.htm
http://windev.cis.uncw.edu/mit/students/rose/eportfolio/evaluation.htm
From Dr. Pan’s information
“Effectiveness normally refers to student achievement. You conduct an evaluation by running your learning object
by a good sample of the target audience. The assessment part of the learning object measures and assesses how
much the intended learners have achieved. The assessment results are then analyzed and evaluated against the
pre-determined standards set by the performance objectives in your design document. If the results and standards
match, or learners outperform the set standards, effectiveness is proved.
Quality, on the other hand, usually deals with issues such as typos and errors. You may have a subject matter
expert or colleague read through your learning object for typos, errors, illegibility, or inconsistency. A copy editor
or a technical writer may be even better. The goal is to reduce such quality issues as much as possible.
Drawn from the Dick and Carey model, six questions are used to document the evaluation process.”