The Portals (Coolfont) Workshop decided that:
• DLESE will be a two-level collection:
– The unreviewed collection: a broad collection of
content which is relevant to Earth System education
and meets minimum quality and technical standards
– The reviewed collection: a subset of high quality
teaching and learning materials which have been
rigorously evaluated
• The rationale for having a reviewed collection:
– For the user: guaranteed high-quality resources, even
for a teacher without expertise in the field or time to
“comparison shop”
– For the creator: inclusion in the reviewed section of
DLESE can become a recognized stamp of professional
approval
• The rationale for having an unreviewed collection:
– For the user: access to a wider range of teaching and
learning resources
– For the library builders: a pool from which to select the
reviewed collection
OK, so how do we decide what
goes into the reviewed collection?
From the Portals (Coolfont) workshop:
• Selection Criteria:
– Accuracy, as evaluated by
scientists
– Importance/significance
– Pedagogical Effectiveness
– Well-documented
– Ease of use for students and
faculty
– Motivational/inspirational for
students
– Robustness/Sustainability
Familiar review procedures:
• “Traditional peer review”
• “Traditional educational
evaluation”
Traditional “Peer-Review”
• Reviewers are
selected for their
expertise by an
editor.
• Reviewers examine
the material, or a
description of the
material, in their
home or office.
• Typically two
reviews.
Traditional
“Peer-Review”
What’s wrong with
this picture?
Traditional
“Peer-Review”
There are no
students in this
picture!
“Traditional Educational Evaluation”
• Evaluator (reviewer)
is selected by the
developer.
• Evaluator observes
students in another
teacher’s classroom
and/or administers
evaluation
instruments
• Typically one
evaluator, several
classes of students.
“Traditional Educational Evaluation”
What’s wrong with this picture?
“Traditional Educational Evaluation”
Evaluation by independent professional
evaluators is labor-intensive and expensive!
Community Review Concept
Premises
• The materials in the “inner-circle” of reviewed, DLESE-stamp-of-approval-bearing resources must be classroom-tested.
– However, testimony from the creator of a resource that learning
has occurred in his or her classroom is insufficient.
– It is not realistic to pay for professional evaluators to go into
classrooms to evaluate whether student learning has occurred for
every potential DLESE resource.
– Experienced educators can tell whether or not their own students
are learning effectively from an educational resource.
– It is easier to answer: “Did your students learn?” than “Do you
think students would learn?”
Community Review Concept
Premises (cont’d)
• In order to be useful, DLESE has to contain lots of
resources. Therefore it must grow fast.
• In the DLESE ecosystem, teachers, classrooms and
students will be abundant resources.
• The rate-limiting resources in DLESE’s growth will be
money, and the time of paid librarians/editors/gatekeepers
and computer wizards.
• This is a digital library; we can and should take advantage
of automated information gathering.
Community Review Concept
Procedure
Coolfont Selection Criteria, and how to implement them:
• Well documented: Review by library staff
• Importance/Significance: More than N (threshold number to
be chosen) educators from the DLESE community tried this
resource in their classroom
• Pedagogical Effectiveness, Ease of use for students and
faculty, and Inspirational or motivational for students:
On-line questionnaire filled out by educators who used the
resource in their classroom
• Accuracy, as evaluated by scientists: Invited review by a
scientist, recruited by an editor
• Robustness/Sustainability: Testing by a quality-assurance
professional
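As a rough sketch (not from the slides), the mapping above could be encoded so that a resource's progress toward the reviewed collection can be checked criterion by criterion. The dictionary keys, mechanism labels, and the unmet_criteria helper are illustrative assumptions only.

```python
# Hypothetical encoding of the Coolfont criteria-to-mechanism mapping.
REVIEW_MECHANISMS = {
    "Well documented": "library staff review",
    "Importance/Significance": "classroom use count exceeds threshold N",
    "Pedagogical Effectiveness": "educator questionnaire",
    "Ease of use for students and faculty": "educator questionnaire",
    "Inspirational or motivational for students": "educator questionnaire",
    "Accuracy": "invited scientist review",
    "Robustness/Sustainability": "quality-assurance testing",
}

def unmet_criteria(passed: set[str]) -> list[str]:
    """List the Coolfont criteria a resource has not yet satisfied."""
    return [c for c in REVIEW_MECHANISMS if c not in passed]

# Example: a resource vetted by staff and a scientist still needs
# classroom evidence and quality-assurance testing.
print(unmet_criteria({"Well documented", "Accuracy"}))
```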
Community Review Concept
Procedures (cont’d)
• What happens to the questionnaire information?
– Creator receives all feedback from “YES” and “NO” respondents.
– Builders of Discovery System receive feedback from “NO”
respondents.
– Suggestions typed in the teaching tips field are added to the
Teachers’ section of the resource.
– “Editor” or “gate-keeper” is automatically notified and receives
full packet of reviews when number of complete reviews exceeds
N and the average, or weighted average, of the numerical scores
exceeds Y.
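A minimal sketch of the notification trigger described above, assuming hypothetical names: the Review and Resource records, the weight field, and maybe_notify_editor are illustrative inventions, and N and Y stand in for the thresholds the slides leave to be chosen.

```python
from dataclasses import dataclass, field

@dataclass
class Review:
    educator_id: str
    used_in_classroom: bool  # the "YES"/"NO" answer on the questionnaire
    score: float             # numerical score from the questionnaire
    weight: float = 1.0      # optional per-reviewer weighting
    teaching_tips: str = ""  # free text for the Teachers' section

@dataclass
class Resource:
    title: str
    reviews: list[Review] = field(default_factory=list)

def weighted_average(reviews: list[Review]) -> float:
    total = sum(r.weight for r in reviews)
    return sum(r.score * r.weight for r in reviews) / total

def maybe_notify_editor(resource: Resource, n: int, y: float) -> bool:
    """Notify the editor/gatekeeper when the number of complete
    reviews exceeds N and the weighted average score exceeds Y."""
    complete = [r for r in resource.reviews if r.used_in_classroom]
    if len(complete) > n and weighted_average(complete) > y:
        # A real system would forward the full packet of reviews here.
        print(f"Editor notified: '{resource.title}' has {len(complete)} "
              f"complete reviews, average {weighted_average(complete):.2f}")
        return True
    return False
```

Once the trigger fires, the scarce human steps (scientist review, quality-assurance testing) are applied only to resources that have already passed the classroom test.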
Community Review Concept
Strengths
• Inclusive: The community builds the library.
• Scalable: Hundreds or thousands of resources can be
classroom-tested.
• Thorough: All seven Coolfont/Portals selection criteria
are applied.
• Economical: Scarce talents are applied at the end of the
process, to the smallest number of items.
Community Review Concept
Issues
• How do we get educators to send in their reviews?
• How do we ensure that reviews come from bona fide educators?
• Would creators “spin” the review process by soliciting reviews from
their friends?
• Would the merely-good early arrival tend to keep out the truly
excellent later arrival?
• Some topics are inherently less inspirational/motivational than others;
how do we avoid filtering out resources on such topics?
• What about off-the-wall, or erroneous, or malicious reviews?
Comparing the three review models:

Traditional “Peer-Review”:
• Reviewers are selected for their expertise by an editor.
• Reviewers examine the material, or a description of the
material, in their home or office.
• Typically two reviews.
• Moderate cost.
• Editor is supposed to filter out unfair or malicious
reviews.

Traditional “Educational Evaluation”:
• Evaluator (reviewer) is selected by the developer.
• Evaluator observes students in another teacher’s classroom
and/or administers evaluation instruments.
• Typically one evaluator, several classes of students.
• High cost.
• Evaluator is a neutral party, neither student’s teacher
nor resource creator.

Proposed “Community Review”:
• Reviewers step forward from the community.
• Reviewers test the material in their own classroom with
their own students.
• A large number of reviewers/students is possible.
• Moderate cost.
• Unfair or malicious reviews can be swamped out by the
numerical weight of good reviews.
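To make the “swamped out” point concrete, here is a small illustration (not from the slides): with many honest reviews, a plain mean dilutes a single malicious score, and a trimmed mean discards it entirely. The scores and the trimmed_mean helper are assumptions for illustration, not a DLESE design.

```python
def trimmed_mean(scores: list[float], trim: int = 1) -> float:
    """Mean after dropping the `trim` lowest and highest scores."""
    s = sorted(scores)
    if len(s) > 2 * trim:
        s = s[trim:-trim]
    return sum(s) / len(s)

honest = [4.5] * 20          # twenty honest scores
scores = honest + [0.0]      # plus one malicious zero

print(sum(scores) / len(scores))  # plain mean ~4.29: outlier diluted
print(trimmed_mean(scores))       # trimmed mean 4.5: outlier dropped
```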
How can I become part of DLESE?
• … as a resource creator/contributor
• … as a user
• … as a reviewer/tester
Continue the conversation at:
collections@dlese.org
or
http://www.ldeo.columbia.edu/dlese/collections