Chapter 14: Assessing Learning Outcomes
Objectives:
1. Define traditional, alternative, authentic, and performance assessments
2. Describe effective classroom assessment programs
3. Describe various methods of science assessment
4. Explain scoring assessments and assigning grades
I. What is Assessment?
A. Plays a critical role in education
1. Teachers plan to ensure students do well on assessments
2. Students are motivated to learn content to do well on assessments
B. Science Reform and Assessment
1. Traditional assessment = paper-and-pencil tests
2. Hein and Price (1994) identified reasons for assessment reform
a. Tests reveal what students don't know; they can't assess all outcomes
b. Science education is now less about content, more about literacy
c. Accountability of schools
d. We know more about how students learn: inquiry learning is better
C. Contemporary Science Assessment
1. Alternative formats: portfolios, journaling, concept mapping, etc.
2. Authentic: real-world situations; doing what real scientists do
3. Performance: hands-on or creative tasks rather than regurgitation
4. Choosing an appropriate assessment is a challenging task
D. Other important assessment concepts
1. Diagnostic assessment: used to find out what students know
before beginning a unit; may include interviews, journals, pre-tests,
etc…
2. Formative assessment: used during instruction to find out student
progress and provide feedback to students
3. Summative assessment: final evaluation of learning achieved;
usually comprehensive; often serves as the basis of a grade
4. Reliability: consistency or repeatability of an assessment tool;
confidence in the assigned grade depends on reliable assessment
5. Validity: does the assessment measure what it is supposed to;
content validity means asking the appropriate questions; form
validity means having students answer in an appropriate way
6. Evaluation: assessment = collection of data; evaluation = using the
data from assessment to make a decision about quality of work
II. Assessment Methods
A. Performance Tasks
1. A laboratory practical exam is a performance task
2. Using materials, equipment, models to demonstrate learning
3. Teacher assesses by observing the process
4. Logistics and management challenge the use of this assessment
a. Use of stations throughout the room can help
b. Assessing during regular lab activities
5. Checklists allow timely use of performance assessments (p. 280)
6. Computer simulations can allow for storing files of student work
B. Open-Ended Problems
1. Many and varied ways of arriving at a solution
2. Usually, the solution is presented in writing
3. Often use real-world issues
4. Students may use multiple resources to find a solution
5. Examples:
a. Describe a method for removing nitrate contamination from well water
b. Experiments to determine what gave someone food poisoning
C. Inquiry-Oriented Investigations
1. Analyze problem, plan and conduct experiment, organize results,
and communicate findings (Doran, 1998)
2. Provide students with materials, directions, and safety precautions
3. Some teachers require approval of a plan before students start
4. WOWBugs example p. 282 (behavior of wasps)
5. Individual accountability (Reynolds, 1996)
a. Group report, but individual reports allowed if student disagrees
b. Group does investigation, but all students submit own report
c. Individuals don’t discuss results; prepare separate reports
6. Large scale investigations
a. Can last for weeks or even a semester
b. Assessment and instruction are intermixed
c. Students get an idea of what real scientists do
d. Students report various segments of the project over the time period
e. McPherson College Undergraduate Research program
D. Concept Maps
1. Graphical method to show learning or to infer misconceptions
2. Must provide instruction and practice before using as assessment
E. Observation
1. Science teachers continually do this anyway: write it down
2. Example: safety goggles? correct technique? correct procedure?
3. How to decide what to observe (Hein and Price, 1994)
a. Knowledge application: how students solve problems
b. Information assimilation: how students relate new information to class
c. Vocabulary: listen to student conversation and discussion
4. Challenges
a. Many students per day
b. Many learning objectives per student
c. Checklists can help (p. 284, p. 285)
F. Interviews
1. Verbal questions from teacher to student
2. Can be used before, during, or after instruction
3. Open-Ended
a. Teacher asks fewer, broader questions
b. What do you know about…?
c. Can you explain how that is used outside of school?
4. Partially structured interview
a. Written set of questions to probe specific knowledge
b. Probing questions can help clarify the student's response
c. Can be particularly helpful with writing-challenged students
d. In your own words, what is the theory of…?
e. What evidence supports the conclusion that…?
f. What did scientists learn from the study of…?
5. Challenges
a. Not practical to get lengthy interviews of all students in large classes
b. Interview a few students at a time, dispersing questions
c. Interview before or after school, at lunch, during study hall, etc.
d. Tape student-student interviews
G. Journals
1. Assess attitudes and growth, and improve writing at the same time
2. Can include free writing on a subject or specific questions
3. Often overlook spelling/grammar to get at the science
4. Challenge: teachers don't have time to read them all
a. Read randomly selected journals each week
b. Not summative assessments
H. Drawings
1. Nonthreatening, simple, and useful for reading/writing-challenged students
2. May include written descriptions, summaries
3. Provide evidence of conceptual change
I. Portfolios
1. Organization, synthesis, and summarization of student learning
2. Both formative and summative assessments are included
3. May ask for student reflection on past assessments
4. May include student-written captions describing how each assessment was used to demonstrate learning
5. Involves the student in the assessment process
6. Looks at the totality of the experience rather than isolated data
7. Students are often allowed to choose a limited number of items
8. Judgement
a. Holistic: on portfolio as a whole
b. Analytic: rating of individual items
III. Developing Assessments
A. Challenges to Using Contemporary Assessments
1. Often harder than writing exams
2. Often must construct them yourself
3. Authenticity and complexity of tasks
4. Scoring with rubrics is more subjective than scoring exams
5. Step-by-step process (Lewin and Shoemaker, 1998)
a. Be clear about the skills, knowledge, and standards targeted
b. Be familiar with the traits of a strong performance
c. Use a meaningful context within which you assess
d. Write and rewrite the task clearly and concisely
e. Assign the task with step-by-step instructions
f. Provide examples of “good” work
g. Score the task and then make revisions for the next use
B. Rubrics
1. Written criteria by which student work is judged
2. A numeric scale is tied to a specific level of performance (see the sketch after this list)
3. Students are usually given the task and the rubric at the same time
4. Tasks can be general and scores specific, or vice versa (p. 290)
5. Some teachers start with a generic rubric and adapt it to a task (p. 291)
6. Value of Rubrics
a. Communicate what students know and can do
b. Provide understandable performance targets to students
c. How will I be graded? How am I doing?
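As a concrete illustration of item 2 above, here is a minimal Python sketch of an analytic rubric: each criterion carries a numeric scale tied to a performance description, and the analytic score is the sum across criteria. The criteria, level descriptions, and point values are hypothetical examples, not drawn from the text.

```python
# Hypothetical analytic rubric for a lab report; criteria, levels, and wording are illustrative only.
RUBRIC = {
    "hypothesis":    {3: "testable and clearly stated", 2: "stated but vague",    1: "missing or untestable"},
    "procedure":     {3: "replicable, safety noted",    2: "mostly complete",     1: "major steps missing"},
    "data_analysis": {3: "correct, with units/graphs",  2: "minor errors",        1: "incorrect or absent"},
    "conclusion":    {3: "supported by the data",       2: "partially supported", 1: "unsupported"},
}

def score_report(ratings):
    """Analytic scoring: sum the rating chosen for each criterion; return (earned, possible)."""
    earned = sum(ratings[criterion] for criterion in RUBRIC)
    possible = sum(max(levels) for levels in RUBRIC.values())
    return earned, possible

# A student rated 3, 2, 3, 2 on the four criteria earns 10 of 12 points.
print(score_report({"hypothesis": 3, "procedure": 2, "data_analysis": 3, "conclusion": 2}))
```

A holistic judgement, by contrast, would assign a single rating to the work as a whole rather than summing per-criterion scores.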
C. Resources for Assessment Tasks
1. Textbooks often include suggestions of contemporary assessments
2. Internet
a. Performance Assessment Links in Science http://pals.sri.com/
b. Links http://www.col-ed.org/smcnws/assesstask.html
3. Journals and tradebooks
a. National Science Teachers Association
b. Association for Supervision and Curriculum Development
c. The Science Teacher
d. Science Scope
e. Educational Leadership
IV. Grading and Reporting Grades
A. Importance of Grades
1. Impact the lives of students by evaluating success and failure
2. Compare students with each other
3. Determine scholarships or even acceptance to further education
B. Types of Grading
1. Criterion-referenced: judged relative to established criteria
a. Allow as many A, B, C, D, and F grades as students earn
b. Assumes all students can earn an A
c. Does not ensure a normal distribution of grades
2. Norm-referenced: judged relative to other members of the class
a. Assumes the class should match a normal distribution
b. Particularly difficult to defend with small class sizes
c. Applied assessments rarely produce this distribution on their own
d. Curving grades = adjusting grades to fit a normal distribution
e. Works poorly for homogeneous classes (e.g., advanced placement)
C. Assigning Final Grades
1. Numerical average of all assessments is the traditional method
a. Some argue criterion-referenced scores shouldn’t do this
b. Replace grades with statements of student attainment
c. Norm-referenced scores must use numbers in any event
2. Point and Percentage Systems (see the first sketch at the end of this section)
a. Assign points for each assignment reflecting the importance to course
b. Assign grades based on total points earned or percentage of points
c. Flaw: a percentage doesn’t tell you specifically what was learned
3. Fairness, Consistency, and Communication are Key
4. Scaling: adjusting letter-grade cut-offs, for example from 90/80/70/60 to 88/78/68/58 (illustrated in the first sketch below)
5. Curving: assigning grades based on a normal distribution (illustrated in the second sketch below)
a. Mean: μ = (1/N) Σ x_i, where N = number of measurements and x_i = an individual measurement
b. Standard deviation: σ = sqrt[ Σ (x_i − μ)² / (N − 1) ]
c. [Figure: normal curve centered on the mean μ, with letter grades F, D, C, B, A assigned to bands marked off in standard deviations from the mean]
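A minimal Python sketch of the point-and-percentage system with scaled cut-offs referenced in items 2 and 4 above. The scaled cut-offs (88/78/68/58) follow the example in item 4; the categories, point totals, and student scores are hypothetical.

```python
# Hypothetical point values per category; the scaled letter cut-offs follow item 4 above.
POINTS_POSSIBLE = {"labs": 200, "quizzes": 100, "unit_exam": 100, "portfolio": 50}
DEFAULT_CUTOFFS = [(90, "A"), (80, "B"), (70, "C"), (60, "D"), (0, "F")]
SCALED_CUTOFFS  = [(88, "A"), (78, "B"), (68, "C"), (58, "D"), (0, "F")]

def percentage(earned):
    """Total points earned as a percentage of total points possible."""
    return 100 * sum(earned.values()) / sum(POINTS_POSSIBLE.values())

def letter(pct, cutoffs=DEFAULT_CUTOFFS):
    """Return the letter for the highest cut-off the percentage meets or exceeds."""
    return next(grade for cutoff, grade in cutoffs if pct >= cutoff)

# Hypothetical student: 401 of 450 points, about 89.1%.
student = {"labs": 180, "quizzes": 90, "unit_exam": 88, "portfolio": 43}
pct = percentage(student)
print(f"{pct:.1f}% -> {letter(pct)} unscaled, {letter(pct, SCALED_CUTOFFS)} scaled")
# 89.1% -> B unscaled, A scaled
```

The flaw noted in item 2c is visible here: the percentage alone does not say which objectives the 49 missing points represent.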
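A second sketch applies the mean and standard-deviation formulas in 5a and 5b to curve a set of scores: each score is converted to its distance from the class mean in standard deviations and mapped to a letter band. The sample scores and the band boundaries (±0.5σ and ±1.5σ) are illustrative choices, not prescribed by the text; the figure above marks bands in whole standard deviations, and the exact cut-offs are the teacher's to set.

```python
from statistics import mean, stdev  # stdev uses the N - 1 denominator, matching σ in 5b

def curve(scores):
    """Curved grading: map each score's distance from the class mean (in standard
    deviations) to a letter band. The band boundaries here are an illustrative choice."""
    mu, sigma = mean(scores), stdev(scores)
    def band(x):
        z = (x - mu) / sigma
        if z >= 1.5:
            return "A"
        if z >= 0.5:
            return "B"
        if z >= -0.5:
            return "C"
        if z >= -1.5:
            return "D"
        return "F"
    return [(x, band(x)) for x in sorted(scores)]

# Hypothetical exam scores: the middle of the class earns C regardless of raw percentage.
print(curve([52, 61, 64, 68, 70, 71, 75, 79, 84, 93]))
# [(52, 'F'), (61, 'D'), (64, 'D'), (68, 'C'), (70, 'C'), (71, 'C'), (75, 'C'), (79, 'B'), (84, 'B'), (93, 'A')]
```

This is norm-referenced grading in miniature: students near or below the class mean receive a C or lower no matter how much they learned, which is why items 2b and 2e under Types of Grading caution against curving small or homogeneous classes.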