Indiana University
Scholarship of Teaching and Learning
September 17, 2010
Establishing a Culture of Experimentation and Evidence
Reframing Assessment
Teaching as an Iterative Process of Inquiry
Robert J. Thompson Jr.
Cultural Transformation is Occurring in the Academy
• Acceptance of the need to do better regarding undergraduate education
• Willingness to reexamine familiar practices and search for new methods
• Appreciation that experimentation to improve learning can be rewarding
Drivers of the Cultural Transformation
• Internal
  • Faculty
  • Institution
  • Academy
• External
  • Society
Key Issues
• Focus on improving student learning
• Relationship between assessment and accountability
• Teaching as a process of inquiry
• Becoming a learning organization
Approaches
• Engage societal & academy organizations in addressing key issues
• Establish collaborative projects
  • Iterative T & L experiments
  • Share approaches to teaching and learning
  • Share assessment methods and measures
• Form and engage networks in collaborative studies
Improving Student Learning
• A thousand wildflowers
• Increase the yield
• Accelerate the pace
• Engage more faculty
• Integrate disparate efforts/findings
• Scale
Engaging organizations
• AAC&U
• Reinvention Center
• Department of Education
AAC&U
• Learning Objectives
• Curriculum
• Pedagogy
• Assessment
Curriculum
• General education
• Coherent
Pedagogies of Engagement
• Learning Communities
• Research/Capstone/Thesis
• Writing-intensive courses
• Experiential Learning
  • Internships
  • Study Abroad
  • Service Learning
The Reinvention Center
Consortium of research universities (60+)
Mission: to strengthen undergraduate education through networking, promoting the exchange of information and ideas, and multi-campus collaborative projects.
www7.miami.edu/ftp/ricenter/
Assessment and Accountability
Department of Education: Spellings Commission Report (2006)
• Access
• Affordability
• Quality
• Accountability
Commission on Higher Education: Recommendations
• Provide more evidence of student achievement and institutional performance and make this evidence primary when judging academic quality.
• Make information easily understandable and readily accessible to the public.
• Develop various means to compare institutions regarding their success in student achievement and institutional performance.
• Establish threshold standards for collegiate learning.
Judith Eaton, Change, September/October 2007, pp. 18-19
Issue: Relationship between assessment and accountability
“… if assessment becomes synonymous with standardized testing, what will happen to assessment undertaken for the purpose of guiding improvement in instruction, curricula and student services?”
Trudy Banta, Peer Review, Spring 2007, p. 12
Premise:
“Assessment can and should be designed to deepen and strengthen student learning, not just to document it.”
Carol Geary Schneider, Peer Review, Spring 2007, p. 3
Marketplace Perspective on Accountability
• Education is seen as a product that is provided
• Focus is on student achievement and institutional performance outcomes
• Approach is comparative and competitive among institutions
Academic View of Accountability
• Education is seen as a process of enabling growth
• Focus is on instituting a culture of experimentation and evidence with regard to teaching and learning
• Approach is evaluative within institutions and collaborative among institutions in pursuit of best practices
Academic Model of Assessment
The primary function of assessment is to improve our efforts to promote the intellectual growth and personal development of our students.
Accountability Reframed
Institutions of higher education hold themselves accountable to their multiple constituencies for establishing continuous, systematic, iterative processes to improve the quality of teaching and learning.
Teaching as a Process of Inquiry
Process of Inquiry
• Make observations
• Pose questions
• Identify sources of information
• Identify assumptions
• Propose an explanation/formulate a hypothesis
• Conduct an experiment or study
• Analyze findings
• Consider alternative explanations
• Communicate findings
Boyer (1990): Scholarship Reconsidered
• Scholarship of Discovery
• Scholarship of Integration
• Scholarship of Application
• Scholarship of Teaching
Becoming a Learning Organization
Derek Bok: Our Underachieving Colleges (2006)
• Argued the need for continuous, systematic experimentation and evaluation
• Process of enlightened trial and error
• Encouraged funding agencies to support institutional efforts to establish systematic, continuous processes to improve undergraduate education rather than particular innovations
Systematic Improvement in Undergraduate Education in Research Universities
The Teagle Foundation
The Spencer Foundation
Teagle and Spencer Project
• Goal: to foster a culture of experimentation and evidence for undergraduate education at research universities, such that iterative approaches to curricular and pedagogical efforts to enhance student learning and engagement become the standard of practice for departments and programs responsible for undergraduate education.
Objective
• To seed a number of iterative curricular and pedagogical experiments aimed at improving undergraduate teaching and learning at research universities.
Objective
• To build a knowledge base about promising practices regarding effective teaching and learning processes in two core areas of intellectual skills central to a liberal education: writing and critical thinking.
Objective
• To promote a spread of effect of both best practices and the adoption of iterative approaches to undergraduate education within and across fields and institutions.
Collaborative Project
• Carnegie Mellon
• Duke
• Georgetown
• Indiana
• Penn State
• UC-Berkeley
• UC-Davis
• Illinois Urbana-Champaign
• Kansas
• Michigan
• Nebraska-Lincoln
• UNC-Chapel Hill
• USC
University of Nebraska
• Developing a system for assessing student learning regarding 10 general education learning objectives
University of California-Davis
• Writing in large lecture courses
Carnegie Mellon University
• Using argument diagramming in a freshman writing course to promote critical thinking
Indiana University
The History Learning Project: “Decoding the Disciplines”
Leah Shopkow, Arlene Díaz, Joan Middendorf, and David Pace
Kansas University
Evaluating the Cognitive Apprenticeship Model for Improving Undergraduate Students’ Critical Thinking and Writing Skills
Andrea Greenhoot
Department of Psychology
University of Kansas
Collaborators
Dan Bernstein
Director, KU Center for Teaching Excellence
Jennifer Church-Duran
Associate Dean of Library Instruction
Terese Thonus
Director, KU Writing Center
Catherine Weaver
Assistant Professor of Political Science
Kansas University Project
• The Goal
  • Improve critical thinking, application, and writing skills in undergraduate students
• Model: Cognitive Apprenticeship
  • Identification and modeling of expert-like processes
  • Staged, scaffolded learning tasks
  • Students are supported by experts and peers
Inquiries and Measurement
• What is the impact of the program on targeted skills in the course?
  • Systematic application of rubrics each semester
  • Course-specific pre- and post-tests of target skills
• What is the impact of the program on general skills not specific to course goals?
  • Application of a general intellectual skills rubric to samples of student work
  • General dimensions of learning, but assessed in embedded student work rather than stand-alone testing
• Do target skills generalize to new contexts?
  • Pre- and post-test using the Collegiate Learning Assessment (CLA)
University of Michigan
The Impact of Metacognitive Strategies within Writing in the Disciplines
Naomi Silver, Matthew Kaplan, Deborah Meizlish, and Danielle LaVaque-Manty
University of Michigan: Aim
The goal: to discover what types of interventions and pedagogical strategies will help students better understand not only course content, but also discipline-specific modes of thinking and writing.
Questions:
• To what extent does engagement in metacognitive activities bring students’ conceptions of thinking and writing in the disciplines closer to those of faculty?
• To what extent does a better understanding of these modes of thinking improve students’ actual writing?
Assignments designed to:
• Expose student thinking
• Enable productive dialogue between instructors and students
Reflective process with writing assignments
• Planning: how the assignment will develop disciplinary thinking/writing
• Monitoring: marginal comments on difficult/interesting moments in writing
• Evaluation: how the assignment promoted disciplinary thinking/writing, and what students can apply to future assignments
Evaluation
• Rubrics for “thinking like a psychologist” and “thinking like a political scientist”
• Pre/post student appraisals
• Students’ written responses to planning and monitoring prompts
• Pre-semester interviews and post-semester focus groups with instructors
Duke University
• Goal: Increase undergraduate research and honors theses
• Questions:
  • How will faculty manage the increased workload?
  • How will we know if our approach is successful?
The Thesis Assessment Protocol
Julie Reynolds
Department of Biology
Duke University
TAP Protocol
• Framework: scientific peer review process
• Timeline for revisions
• Methods for soliciting and responding to feedback
• Assessment guidelines & rubrics for faculty
Methods for soliciting and providing feedback
• Students write first draft
• Students solicit feedback
• Faculty provide feedback through rubric worksheet
• Students respond to feedback via worksheet
• Students write final paper
Feedback & Assessment Rubric
Quality of Research
1. Does the thesis represent the student’s significant research?
2. Is the literature review accurate and complete?
3. Are the methods appropriate, given the student’s research question?
4. Is the data analysis appropriate, accurate, and unbiased?
Feedback & Assessment Rubric
Critical Thinking Skills
5. Does the thesis make a compelling argument for the significance of the student’s research within the context of the current literature?
6. Does the thesis skillfully interpret the results?
7. Is there a compelling discussion of the implications of findings?
Writing Skills
8. Is the writing appropriate for the target audience?
9. Does the thesis clearly articulate the student’s research goals?
10. Is the thesis clearly organized?
11. Is the thesis free of writing errors?
12. Are the citations presented consistently & professionally throughout the text & in the list of works cited?
13. Are the tables and figures clear, effective, and informative?
Student response to feedback
Worksheet columns: Summary of reader’s comment (and reader) | Student response | Location in revised thesis

Examples:
1. Reader’s comment (Faculty Reader): She said she didn’t see the relevance of the article by Smith and Jones (2002) to my research.
   Student response: I rewrote the introduction to the paragraph in which I reviewed Smith and Jones’ research, making it more explicit that this research influenced the choice of methods that are commonly used in this field.
   Location in revised thesis: Literature review (in Introduction)
2. Reader’s comment (Research Supervisor): He said he didn’t think I needed to provide so many background details in the Introduction.
   Student response: I discussed this with my Faculty Reader, who said that as an outside reader she appreciated the extended background section. So I decided to keep all the details I presented in the background section, but to revise it for conciseness.
   Location in revised thesis: Introduction
Teagle/Spencer Project Findings
• Approaches – insights regarding what and how
• Findings – student learning
Critical Thinking
• Discipline-specific
• Explicate the metacognitions
• Provide the scaffolding
Curriculum: From General Education to Education in the Disciplines
“If we improve learning in the disciplines, we will have improved general education.”
(Beyer, Gillmore, & Fisher, 2007, p. 363)
Writing
• Makes thinking visible
• Enables dialogue about how to think more critically
• Engages metacognitive processes:
  • Planning
  • Monitoring
  • Evaluation
Learning to Write & Writing to Learn
• Assignments
• Feedback > revision
  • Regarding what – targets
  • How – methods
• Grading
  • Criteria
  • How – methods
Rubrics
Teagle/Spencer Project Conference
• What does evidence of metacognition look like on your campus?
• How does it differ across disciplines?
• How does it differ across levels of student development?
• What are the challenges to implementation of strategies related to metacognition?
Teagle/Spencer Project Conference
• What is the relationship between assessment and faculty development?
• Is either one a good and useful wedge into the other?
• Is assessment a teachable moment for good practice?
• Is department-level assessment a route to participation in teaching improvement?
Establishing a Culture of Experimentation and Evidence: Next Steps
“Just as weighing a pig will not make it fatter, …” (p. 10)
Banta, T. W. (2007). Can assessment for accountability complement assessment for improvement? Peer Review, 9(2), 9-12.
National Center for Postsecondary Improvement
• Extensive involvement in student assessment but limited use and impact
• “… student assessment data has only a marginal influence on academic decision making.”
• Research universities make the least use of student assessment data for academic decisions
Correlates of Use
• The extent of student assessment being conducted
• The extent of professional development related to student assessment for faculty, staff, and administrators
(Peterson & Augustine, 2000, Research in Higher Education, 41, pp. 21-52, 443-479)
National Institute for Learning Outcomes Assessment
• Established in 2008, NILOA assists institutions and others in discovering and adopting promising practices in the assessment of college student learning outcomes.
www.learningoutcomeassessment.org
NILOA’s primary objective:
• To discover and disseminate ways that academic programs and institutions can productively use assessment data internally to inform and strengthen undergraduate education, and externally to communicate with policy makers, families, and other stakeholders.
NILOA Report
More Than You Think, Less Than We Need: Learning Outcomes Assessment in American Higher Education
(Kuh & Ikenberry, 2009)
Key Points
• Major drivers for assessment:
  • Accreditation
  • Commitment to improvement
• Must provide evidence of how findings are being used to improve student learning
• Program-level findings are more likely to be actionable
• Know what to do to improve student learning
Occasional Papers
• Ewell, P. (2009, November). Assessment, accountability, and improvement: Revisiting the tension.
• Banta, T. W., Griffin, M., Flateby, T. L., & Kahn, S. (2009, December). Three promising alternatives for assessing college students' knowledge and skills.
Occasional Papers
• Wellman, J. V. (2010, January). Connecting the dots between learning and resources.
• Hutchings, P. (2010, April). Opening doors to faculty involvement in assessment.
• Swing, R. L., & Coogan, C. S. (2010, May). Valuing assessment: Cost-benefit considerations.
Engage Networks
Reinvention Center Assessment Working Groups
• General education
• E-portfolios and rubrics
• Writing to learn in the STEM disciplines
  • NSF-supported workshop
  • Multi-campus collaborative proposal