CR-Eve of Construction

Constructed-Response Items
EXAMPLES:
Spelling Test
Short Answer
Completion
List
Label a Diagram
Short Essay
Extended Essay
Performance
Sketch/Drawing
Speech
Portfolio
Topics: Constructed Response Item Family
 Nature and Challenges
 Short Answer and Completion Items
 Essay Items
 Performance Tasks and Group Work
 Formative Classroom Assessment Techniques
 Portfolio Assessment
General Nature of
Constructed Response Items
 Students respond “from scratch” based on a prompt.
 Many variations exist, but items fall along a continuum from very simple
responses to very extended responses.
 Many educators receive this item grouping more positively than they do
the selected response grouping. You will hear other professionals in
education use different terms, usually in the context of saying "we need
to replace our school's multiple choice tests with . . ."
 ". . . essay exams." (the speaker is "old-school"; we now think of the
essay as one type among many types of constructed response items)
 ". . . performance assessment." (the speaker usually means he/she
advocates something other than filling in bubbles on a test sheet)
 ". . . authentic assessment." (the speaker wants the assessment task to
be closer to a task we would do in everyday experience)
 ". . . alternative assessment." (the speaker wants an alternative to
multiple choice exams, but this term is ambiguous as it also has other
meanings, such as making individual student accommodations)
Key Challenges for
Constructed Response Items
Scoring Reliability
 The essence of a constructed response item is that it allows for variation in
response; thus evaluation requires human judgment. Different scorers may
reach different judgments, and a single scorer may not be consistent over
time or across the students in a class.
Adequacy of Content Coverage
 Constructed response items tend to focus on central aspects of the content.
If your testing target is a large body of knowledge or a large number of
learning objectives, the constructed response approach may come up short.
Constructed response items also take longer for students to answer, which
further cuts coverage.
Consequences for Student Misunderstanding
 The price a student pays when a constructed response item is misread is
often more severe than when a selected response item is misunderstood.
Consider a 50-item multiple-choice exam versus a 5-item essay exam. A
student who misunderstands one essay item has 20% of the final score
affected; misunderstanding one multiple-choice item affects only 2%.
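The arithmetic behind that comparison, assuming all items are weighted equally (a worked illustration, not from the original slide):

\[
\frac{1}{5} = 20\% \text{ of the essay exam} \qquad \frac{1}{50} = 2\% \text{ of the multiple-choice exam}
\]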
Constructed Response Items . . .
Examples ahead
 There are a wide variety of constructed response items. For the
purposes of this class we will restrict our discussion to creating
and scoring the following items. As you come across other
constructed response items, many of the suggestions and
principles discussed here will apply to them.
Short Answer
 The short-answer item uses the
constructed-response format. It requires
the student to supply rather than select the
correct answer.
 The typical task relates to simple facts or
skills.
 The item can either be a direct question or
an incomplete statement. Both versions
can be answered by a word, phrase,
number, or symbol.
 NOTE: Professionals use the term
"incomplete statement" rather than "fill-in-the-blank." Why?
Short Answer Questions
. . . Presented via the classic chicken joke.
What do you get when a chicken lays an egg on top of a barn?
An eggroll
Is chicken soup good for your health?
Not if you're the chicken
Which day of the week do chickens hate most?
Fry-day
Why does a chicken coop have two doors?
With four doors it would be a sedan.
Why did the chicken cross the basketball court?
He heard the referee calling fowls
How do two chickens dance when they dance together?
Chick to chick
Which ballroom dance will a chicken NOT do?
The foxtrot
How do you stop a rooster from crowing on Sunday?
Eat him on Saturday
Short Answer Becomes Completion
Using principles of good item writing, change each of the following short
answer questions to completion items. Consider good item construction
techniques when answering these questions: Where should the “blank” be
located in the sentence? How long should it be? Should I use multiple
blanks for phrases?
Which day of the week do chickens hate most?
How do two chickens dance when they dance together?
Which ballroom dance will a chicken NOT do?
How do you stop a rooster from crowing on Sunday?
Scoring judgments . . . How close does the student’s response need to be
to your envisioned answer to receive credit? What about spelling?
Some Final Thoughts on . . .
Completion Items
Completion items are similar to multiple choice questions without the
distracters. Students need to recall the information being asked for based on
context clues found in the stem. Reasons for using them (and one difficulty to
watch for) include:
 Completion items support learning objectives which focus on having students
summon up specific information from memory; however, don't take stems
verbatim from textbooks, lectures, overheads, etc. Why?
 Completion items can be constructed more quickly than multiple-choice items,
since you don't need to create distracters.
 Sometimes it is difficult to construct a multiple-choice item without making the
answer obvious. Since a completion item presents no answer choices, it avoids
this kind of problem.
 Completion items can help students gain proficiency in using the context clues
found in the item stem: if students seek any memory associations to help their
response, the help will be found in the stem.
 One difficulty in creating a completion item is formulating the stem with
sufficient contextual clues that the wanted word is clearly indicated and
ambiguity is avoided.
Let’s Look at . . .
Using and Assessing Essays
Some overall thoughts on . . .
Using Essay Items
The structure of the essay item often means that successful essay responses may be
measuring writing skill as well as content knowledge.
Teach these skills before the test, not just on tests. Regular in-class essay writing should
make the essay test approach less threatening and the test results more meaningful.
EXAMPLE FOR STUDENTS:
These simple steps will guide you
through the essay writing process:
 Decide on your topic.
 Prepare an outline or diagram of your ideas.
 Write your thesis statement.
 Write the body.
   Write the main points.
   Write the subpoints.
   Elaborate on the subpoints.
 Write the introduction.
 Write the conclusion.
 Add the finishing touches.
Some overall thoughts on . . .
Using Essay Items
 Essay items are best for measuring students' higher level cognitive abilities (e.g.,
when freedom of response and originality are important, the essay measures the
ability to organize, integrate, relate, and evaluate ideas); if you are thinking of
measuring knowledge only, consider using something besides an essay.
 The essay prompt (called many names – question, stimulus) must be clearly
stated for the students so they can write to it and so you can evaluate it effectively
later. Consider this example:
 Poor item – "Why did we enter World War II?"
 Better – "State three reasons cited by historians that you feel best
explain America's entry into World War II."
 Provide a suggested length in terms of paragraphs or pages.
 Avoid optional questions (e.g., choose 3 of the following 5). While choice is good
for student morale, it makes scoring problematic: the essays are not likely to be of
equal difficulty, and if students know there will be a choice, they can focus their
study away from your learning objectives.
Thoughts to consider as you . . .
Create Individual Essay Items
As you create a high quality, valid essay item experience for your students, ask
yourself these questions about every item and the scoring plan:
 1. Does the item target a specified learning objective?
 2. Is the level of reading skills required by this item below that of student ability?
 3. Can all students answer the item in less than the allotted time?
 4. Are higher level thinking verbs like "predict" or "compare and contrast" used
rather than recall verbs like "list" and "name" or ambiguous verbs like "discuss"
and "tell"?
 5. Will all or most all content experts agree that the scoring plan outlines the
correct response to the item?
 6. Will the scoring plan ensure that your judgments on each essay are protected
from bias?
 7. Are all students aware of how the essay will be scored?
Thoughts to consider when you . . .
Create Your Scoring System
Essay Scoring Systems – Some Basic Choices
1. Point method – Have a written outline for yourself which expresses your
preconceived model of a high quality answer (i.e., key points to be included or
skills to be demonstrated). Simply sum these points.
2. Analytic method – Use a two-way scoring rubric (e.g., rate on subscales from 1
to 4); raters break the essay task into important predetermined sub-tasks
associated with key points and skills. (A tallying sketch appears after this list.)
3. Holistic method – Use a one-way scoring rubric (e.g., rate on an overall scale
from 1 to 9); raters compare each essay, taken as a whole, to the model. In one
variation of this method, the raters sort all the essays into three categories (for
example: "below average", "average", "above average"), then fine sort within
categories. Some teachers use this method for A, B, C, D, F.
4. Primary Trait method – Used most often when the essay task is a practical one
(for example, "Write a letter to your French pen pal."). The score is determined
by whether the task was completed or not; sometimes we say "met or unmet."
The students receive a predetermined score when the task is completed
satisfactorily.
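To make the tallying concrete, here is a minimal sketch in Python of how an analytic-method total might be computed. The subscale names, the 1-to-4 scale, and the sample ratings are hypothetical illustrations, not prescriptions from these slides:

# Minimal sketch of the analytic method: each essay is rated on several
# predetermined sub-tasks, and the subscale ratings are summed.
# Subscale names and sample ratings are hypothetical.

SUBSCALES = ("thesis", "evidence", "organization", "mechanics")

def analytic_score(ratings):
    """Sum 1-4 subscale ratings into a total score (analytic method)."""
    for name in SUBSCALES:
        if not 1 <= ratings[name] <= 4:
            raise ValueError(f"rating for {name!r} must be between 1 and 4")
    return sum(ratings[name] for name in SUBSCALES)

# One student's essay, rated on each subscale:
essay = {"thesis": 3, "evidence": 2, "organization": 4, "mechanics": 3}
print(analytic_score(essay), "out of", 4 * len(SUBSCALES))  # 12 out of 16

The point method is the same tally with the subscales replaced by the specific key points in your model answer; the holistic method would instead record a single overall rating per essay.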
Thoughts to consider as you . . .
Score Individual Essay Items
1. Have your scoring key or scoring rubric physically with you as you score.
2. Prior to the start of reading your students' essays, decide how to handle writing
mechanics issues such as grammar, penmanship, spelling, and punctuation.
3. Evaluate one question at a time; avoid the "halo" effect of the first good or bad
answer impacting future judgment.
4. Don't look at the student's name – "I know she knows, but she just didn't express
herself." OUCH! Or "How did he come up with this answer? He must have
cheated." DOUBLE OUCH! One solution is to have students place their names on
the back of the essay . . . of course, you may recognize their handwriting.
5. Watch for the tricks of bluffing – name dropping; addressing the significance of a
problem but not its solution; making some great points that are off the topic; just
writing and writing and saying nothing.
6. Use two or more raters if the decision based on this essay is critical (a quick
consistency check between raters is sketched below).
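A minimal sketch in Python of one such consistency check, the exact-agreement rate between two raters; the score lists are invented for illustration:

# Minimal sketch: exact-agreement rate between two raters' holistic
# scores on the same set of essays (invented data for illustration).

rater_a = [4, 3, 5, 2, 4, 3, 5, 4]
rater_b = [4, 3, 4, 2, 4, 2, 5, 4]

matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
print(f"Exact agreement: {matches / len(rater_a):.0%}")  # 75% here

If agreement is low, the raters should reconcile their scoring models before marks are combined; more formal indices (e.g., Cohen's kappa) also correct for chance agreement.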
Some overall thoughts on . . .
Using Performance Task Assessment
 Certainly by asking students to take written exams we are interested in
their performance, but we are thinking of performance a bit differently
here. In performance task assessment we are interested in having them
do something other than paper and pencil testing.
 Performance testing can be standardized and can have norms just like
paper and pencil tests. Most likely, however, you will create performance
tests for use in your own classroom, much as you create essay exams.
 So, the students are active in producing something. In fact, it might look
like an instructional activity. It would distinguish itself from an
instructional activity in that it would have an assessment component.
 As the teacher, you might assess the process the student is using, or you
might assess the product. Or both.
 Assessing a performance task raises scoring issues similar to those of an
essay, so look back at those guidelines.
Some Final Questions to Consider on . . .
Performance Projects and Assessment
 Performance tasks, whether they be individual or group, have special
questions to consider as we evaluate the products and processes
associated with them, for example:
 How can I restructure the class period in order to give students time
to work on the products? The time needed will expand if the projects
involve group work. Is this taking away from important content I
should be teaching?
 How can I restructure my class time so I can fairly assess both the
process and the product? What will the rest of the class be doing
while I am assessing the performance task (since, by the nature of
these assessments, not everyone is "on stage" at once)?
 How can I be certain that tasks completed outside of my direct
supervision were really done by the student? Certainly there is
cheating on paper and pencil exams, but if work completed at home
is a large percent of one's final score, I may be asking for trouble.
Using Constructed Response Items for . . .
Formative Assessment
 The intent of this group of techniques is to collect data which will allow
you to immediately redirect learning, if necessary. Authors Angelo and
Cross (1993) used the unfortunate term "Classroom Assessment
Techniques" (why unfortunate?) and it has caught on in the literature.
 It functions quite simply. At key points decided by the teacher, the
students are asked for brief, written responses to open ended questions
(some teachers like oral responses). Students are told their responses
will not be graded (as an alternative, the questions might be blanket
scored with low point values). When written, 3 by 5 cards or even scrap
paper might be used; allow students 1-3 minutes to write.
 The teacher reviews the responses simply to see if the students “get it.”
No rubric is used. Teachers can read these quickly and determine
follow-up activities based on the cards.
 The next slide has examples of constructed response items that might
be used in formative assessment.
Examples of Brief Constructed Response Items for . . .
Formative Assessment
WRITTEN (delayed feedback but private)
Directed paraphrasing – Write the meaning of a key concept or term in
your own words.
Muddiest point – Identify the most confusing point discussed.
Pro and Con Grid – Provide thoughts both for and against an idea
discussed.
Test Item – Prepare a test item appropriate for the topic.
ORAL (immediate feedback but public)
Lecture Pause - Teacher stops lecture at 1 or 2 key points and asks
students to reflect on how they are feeling about what they are
learning. After allowing reflection time, call on a few students to sample
the feelings.
Opinion Poll - Teacher poses questions, students respond in unison by
each holding up cards (Yes or No; A, B, or C). Notice this is really a
selected response variation. Some schools use electronic clickers.
Practical Advice . . .
To follow when using constructed response items.
1. Become proficient in, and use a mix of, both selected response and
constructed response items.
2. Devise your scoring system in advance.
3. Make sure you are assessing your learning objectives and not
extraneous skills.
4. If you are interested in assessing higher level cognitive skills, make
certain that the range of anticipated responses is truly open-ended.
If there is truly only one possible response, consider re-crafting the
item as a selected response item.