Student Reflections as Data - UBC Centre for Teaching, Learning

Student Reflections as Data
Dr. Allyson Fiona Hadwin
University of Victoria
(EPLS & LTC)
hadwin@uvic.ca
UBC Institute for the Scholarship of Teaching and Learning
1
Examining and Interpreting
Data Collected over Time
Student reflection data
Student interview data
2
Student Reflection Data
Journals
Portfolios
Reflection assignments
3
Why do you want to look at
this data in the first place?
What is your research question?
4
There are different kinds of
research questions
Introducing the language so you know
where to look beyond this course
Covered in many introductory research
methods texts in education
Creswell, J. W. (2005). Educational Research: Planning,
conducting and evaluating quantitative and qualitative
research. Upper Saddle River, NJ: Pearson-Merrill Prentice
Hall.
5
Develop an understanding of what students gained
through the experience (THEORY GENERATION)
What are some possible models to explain student
learning in this program?
INDUCTIVE
6
Examine students' reflections according to
theoretically driven constructs
(THEORY DRIVEN)
What kinds of sensitivities and awareness
developed with respect to each of these
issues: X, Y, Z?
Theory verification
7
Understand the essence of student
experiences of X.
What is the meaning of this learning
experience for students?
Phenomenology
8
Compare students to one another
How were these students’ experiences
similar or different?
Cross case study
9
Compare students to themselves at
different points in time
How did students change over the course of
this instructional experience?
Within case study
10
What kind of question do you
have?
Top Down – looking for something
Bottom Up – no idea what to expect
11
What is the question?
A characteristic of reflection data:
Usually interested in higher order, complex
thinking
Critical thinking
Community awareness
Self-regulated learning
Difficult to operationalize and measure
12
Caution
Data analysis and interpretation are
limited (and afforded) by:
– the kind of data you collected
– how you collected the data
– your own biases and roles in the process
13
Who is doing the analysis and
interpretation?
Researcher
The student
Collaboration between researcher and
student
14
Beginning steps in the
analysis process
1. What is the unit of analysis?
2. Organize data according to unit of
analysis
3. Choose an analysis approach that is
consistent with your research
question
15
Unit of Analysis
Individual
Responses to questions or themes
Timing of events
Key Events
Key Processes
16
Data Sets
1. Reflections on specific questions
collected by Dr. Dana Damian (University
of Victoria)
2. Reflections about studying and learning
over 3 study sessions using computer-based
technologies for learning
(collected by Allyson Hadwin)
17
Data sets
What unit of analysis might you
choose for each data set?
Explain why.
18
Organize data
Make 2 copies of everything
Make sure identifier information is on
each document (and unit)
Group data into the desired unit of
analysis
Choose an appropriate analytic
approach
19
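The grouping step above can be sketched in a short script. This is a minimal illustration, not part of the original workshop materials; the record fields and sample texts are hypothetical placeholders for your own transcribed reflections.

```python
from collections import defaultdict

# Hypothetical reflection records; in practice these would be loaded
# from your transcribed documents, each carrying identifier information.
reflections = [
    {"student": "S01", "question": "Q1", "text": "I planned my study time..."},
    {"student": "S01", "question": "Q2", "text": "Working with peers helped..."},
    {"student": "S02", "question": "Q1", "text": "I struggled to focus..."},
]

def group_by_unit(records, unit_key):
    """Group records by the chosen unit of analysis (e.g. student or question)."""
    groups = defaultdict(list)
    for record in records:
        groups[record[unit_key]].append(record)
    return dict(groups)

by_student = group_by_unit(reflections, "student")    # within-case view
by_question = group_by_unit(reflections, "question")  # cross-case view
print(sorted(by_student))   # ['S01', 'S02']
print(sorted(by_question))  # ['Q1', 'Q2']
```

Regrouping the same records by a different key gives a quick check on whether a within-case or cross-case unit suits your question.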
Semi-structured or
Loosely structured?
Semi-structured
– Responses to specific questions
– Reflections on particular themes
Loosely structured
– Open discussion
– Reflections about learning without specific
criteria
20
Which data set is:
Semi-structured?
Loosely structured?
21
Analytic approaches
Content analysis
Case study
Inductive analysis
Narrative analysis
Phenomenological analysis
Grounded theory analysis
22
Combining analytic
approaches
Often these analytic approaches are
combined.
I describe each separately to give you a
sense of its main focus
23
Content Analysis
Code (from theory or data)
Pull apart
Re-construct according to categories and
themes
Summarized numerically & with quotes
Build theory and explanation
Not very sensitive to context of statements
and ideas
24
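The numerical-summary-with-quotes step of content analysis can be sketched as below, assuming coded segments are stored as (code, quote) pairs; the codes and quotes here are invented for illustration.

```python
from collections import Counter

# Hypothetical coded segments: (code, quote) pairs produced during coding.
coded_segments = [
    ("planning", "I made a schedule before the session"),
    ("monitoring", "I kept checking whether I understood"),
    ("planning", "I set a goal for each reading"),
]

# Numerical summary: frequency of each code across the data set.
code_counts = Counter(code for code, _ in coded_segments)

# Illustrative quotes: keep the first example seen for each code.
example_quotes = {}
for code, quote in coded_segments:
    example_quotes.setdefault(code, quote)

print(code_counts["planning"])       # 2
print(example_quotes["monitoring"])  # I kept checking whether I understood
```

Note the limitation named on the slide: once segments are reduced to (code, quote) pairs, the surrounding context of each statement is gone.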
Narrative Analysis
Preserves the whole
Represents the perspective of the “teller”
Themes in context
Researcher voice appears in explanation
and representation
Many of the participants' words included
25
Case Study Analysis
Assemble all sources of data for one case
(reflections, interviews, etc.)
Construct a case record – classifying and
editing raw data into an accessible
package
Write a case narrative (organized
chronologically or thematically, or both)
Pattern matching (comparing theoretically
predicted patterns to data-based patterns)
Explanation building (theory building)
Time series analysis – trends over time
26
Inductive analysis
Themes emerging from the data
Also top down sensitizing from the
researcher
Movement toward typologies
Synthesis and explanation of typologies
27
Grounded Theory analysis
Fine-grained analysis of small units (sentences or
phrases)
Description
Conceptual ordering
Theorizing as interplay between data and theory
Asking questions and making comparisons
throughout
Layers of coding from micro-analysis to modelling
Iterative inductive cycle with specific stages
Constant comparative approach
Saturation
28
Phenomenological analysis
• Examining or uncovering the essence of an
experience or behaviour
• Involves exploration of personal biases
• Looking for what is “real” & teasing apart judgment
• Bracketing or identifying data in its pure form
1. Epochè – look inside at personal bias
2. Phenomenological reduction – bracketing from the
world and all presuppositions
3. Horizontalizing – data spread out for interpretation
4. Structural synthesis – deeper meaning or essence of
phenomenon is revealed
29
Important elements of analysis
Systematic approach (be open about and
describe the approach)
Coding data
Description
Conceptual Ordering
Higher order analysis or theorizing
Strauss and Corbin (Grounded theory)
30
Steps for your analysis
1. Focus your analysis
2. Organize your data
3. Decide upon and justify the
appropriate way to analyze the data
4. Code the data
5. Conceptually organize data (try 2
different ways of presenting findings
to bring meaning to themes)
31
Coding
Marking segments of data with symbols,
descriptive words, labels or names
Keep a master list of codes, their meaning,
examples, & non-examples
Theory driven
Data driven
32
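The master list of codes recommended above can be kept as a simple structure that rejects any code not yet defined in it. This is a minimal sketch; the code names, meanings, examples, and non-examples are hypothetical.

```python
# Master code list: each code paired with its meaning, an example,
# and a non-example, as the slide recommends. Entries are hypothetical.
codebook = {
    "SRL-PLAN": {
        "meaning": "Student describes planning before studying",
        "example": "I listed what I wanted to finish first.",
        "non_example": "I just started reading.",
    },
    "SRL-MON": {
        "meaning": "Student describes monitoring understanding",
        "example": "I noticed I was lost and re-read the section.",
        "non_example": "The chapter was long.",
    },
}

def code_segment(segment_text, codes):
    """Attach codes to a data segment, rejecting codes not on the master list."""
    unknown = [c for c in codes if c not in codebook]
    if unknown:
        raise ValueError(f"Codes missing from master list: {unknown}")
    return {"text": segment_text, "codes": list(codes)}

segment = code_segment("I made a plan and checked it as I went.",
                       ["SRL-PLAN", "SRL-MON"])
print(segment["codes"])  # ['SRL-PLAN', 'SRL-MON']
```

Forcing every code through the master list keeps theory-driven and data-driven codes documented in one place as the scheme grows.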
Description
Emerges from coding the data
What does the data say?
What is important?
What are the patterns?
What are the basic stories?
33
Description
34
Numerical description
35
Conceptual Ordering – logical
analysis
• Creating categories, finding means for identifying and
displaying themes
• Keeping an eye toward interpretation
• Must be careful – once you start organizing data
according to a frame it becomes difficult to let go and
try something different when it is not working
• Matrixes
• Concept maps
• Summary tables
• Diagrams
36
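One of the displays listed above, a theme-by-student matrix, can be sketched as follows. The students, themes, and counts are invented for illustration.

```python
# Build a theme-by-student count matrix (a simple conceptually
# clustered display). Names and codes are hypothetical.
coded = [
    ("S01", "planning"), ("S01", "planning"), ("S01", "monitoring"),
    ("S02", "monitoring"), ("S02", "reflection"),
]

students = sorted({s for s, _ in coded})
themes = sorted({t for _, t in coded})
matrix = {t: {s: 0 for s in students} for t in themes}
for student, theme in coded:
    matrix[theme][student] += 1

# Print a small summary table.
print("theme".ljust(12) + "".join(s.ljust(6) for s in students))
for theme in themes:
    row = "".join(str(matrix[theme][s]).ljust(6) for s in students)
    print(theme.ljust(12) + row)
```

Because the matrix is rebuilt from the coded segments each time, trying a different frame (say, question-by-theme) is a one-line change rather than a reorganization you have to undo by hand.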
Conceptual ordering
37
Data may be organized:
• Time ordered
• Role ordered
• Role by time ordered
• Role by group
• Conceptually clustered
• Site dynamics
• Predictor – outcome
• Process – outcome
• See Miles & Huberman (1994) for
details
38
Numerical analysis
39
Interpretation – Theory
• Focus is illumination, extrapolation, and
understanding
• Does not focus on cause and
consequence
• Going beyond the data to connect with
theory or to generate theory/explanation
• Dealing with rival explanations,
disconfirming cases, and data irregularities
40
Higher order theorizing
41
Group activity
What have students learned
about their learning through the
computer-based learning activity?
42
Analyzing data activity
Focus your analysis
Organize your data
Decide upon and justify the appropriate
way to analyze the data
Code the data
Conceptually organize data (try 2
different ways of presenting findings to
bring meaning to themes)
43
When a theoretical frame
informs your analysis
Coding scheme comes from the theory
Top Down
For example, you can use a rubric to
analyze (and assess) deeper level thinking
44
Working with theory driven
constructs
Defining your construct
– From the data (inductive)
– From the theory (lens for the data)
Rubrics
Content analysis
45
An Example from Critical
Thinking
VanGyn, G., Ford, C., & Associates
(2005). Teaching Critical Thinking.
Unpublished manuscript.
46
Critical Thinking Rubric
Interviewed faculty
Examined critical thinking assignments
looking for ways to define and assess
critical thinking
47
Key Dimensions of critical
thinking
Intellectual habits
Intellectual deliberations
Reflexive Disposition
48
Intellectual habits
Intellectual curiosity
Respect for truth and reason
Fair and open mindedness
Tolerance for ambiguity
Intellectual courage to take a position
Intellectual work ethic
Willingness to engage in collaborative
inquiry
49
Intellectual deliberations
Identify the challenge, situation or task
Gather, interpret and understand
background information & evidence
Apply thinking strategies relevant to the
type of inquiry relevant to the challenge
Generate or select alternatives
Make evaluative judgments among
alternatives based on criteria
Provide justification for the conclusion
50
Reflexive disposition
Plan ahead for critical thinking
Monitor its quality throughout
Reflect on the strengths and limitations of
the use of intellectual habits and
intellectual deliberations in making a
judgment
51
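Ratings against the three rubric dimensions above could be recorded as in this sketch. The 0–3 scale and the sample scores are assumptions for illustration, not part of the VanGyn and Ford rubric itself.

```python
# Rubric dimensions from the slides; the rating scale is a hypothetical
# 0-3 range chosen for this example.
DIMENSIONS = ("intellectual_habits", "intellectual_deliberations",
              "reflexive_disposition")

def rate_reflection(ratings):
    """Check that every rubric dimension has an in-range rating; return the mean."""
    for dim in DIMENSIONS:
        if dim not in ratings:
            raise ValueError(f"Missing rating for {dim}")
        if not 0 <= ratings[dim] <= 3:
            raise ValueError(f"Rating for {dim} out of range")
    return sum(ratings.values()) / len(DIMENSIONS)

mean_score = rate_reflection({"intellectual_habits": 2,
                              "intellectual_deliberations": 3,
                              "reflexive_disposition": 1})
print(mean_score)  # 2.0
```

Requiring a rating on every dimension before computing a summary keeps partial scorings from silently skewing comparisons across reflections.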
Quality, rigor, credibility
Qualitative analysis is judged by different criteria,
largely governed by:
– Design chosen
– Philosophical position (often represented
in the design)
Identify/reference the framework for credibility you
are using. Think of your writing as an opportunity
to educate the reader about rigor in qualitative
research.
52
Internal validity
Triangulation (data, perspective, theory)
Member checks
Long-term observation
Peer examination
Participatory models
Exposing researcher biases
53
Fossey et al. (2002)
Congruence
Responsiveness to social context
– Emergent research design
– Sampling, data collection,
analysis
Appropriateness
– Sampling
– Data collection
Adequacy
– Sampling
– Data gathering and analysis
54
Transparency – data collection and analysis
Authenticity – presentation of findings and interpretations
Coherence – presentation of findings and interpretations
Reciprocity – analysis, findings and interpretations
Typicality – written report
Permeability – findings and interpretations;
written report
55
References
Creswell, J. W. (2005). Educational Research: Planning, conducting and evaluating quantitative and
qualitative research. Upper Saddle River, NJ: Pearson-Merrill Prentice Hall.
Fossey, E., Harvey, C., McDermott, F., & Davidson, L. (2002). Understanding and evaluating
qualitative research. Australian and New Zealand Journal of Psychiatry, 36, 717-732.
Hadwin, A. F., Boutara, L., Knoetze, T., & Thompson, S. (2004). Cross case study of self-regulation
as a series of events. Educational Research and Evaluation, 10, 365-418.
Hadwin, A. F., Wozney, L., & Pontin, O. (in press). Scaffolding the appropriation of self-regulatory
activity: A social constructivist analysis of changes in student-teacher discourse about a graduate
student portfolio. Special Issue of Instructional Science.
Hadwin, A. F., Wozney, L., & Venkatesh, V. (2003, April). A narrative analysis of the dynamic
interplay between students’ emerging task understanding and instructional scaffolds. Paper
presented at the Annual Meeting of the American Educational Research Association, Chicago, IL,
USA.
Krathwohl, D. (1998). Methods of educational and social research: An integrated approach. New York:
Longman.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage.
Patton, M. Q. (2002). Qualitative research and evaluation methods. Thousand Oaks, Calif.: Sage.
Strauss, A. L., & Corbin, J. M. (1990). Basics of qualitative research: Grounded theory procedures and
techniques. Newbury Park, CA: Sage Publications.
VanGyn, G., Ford, C., & Associates (2005). Teaching Critical Thinking. Unpublished manuscript.
56