Richard W. Riley College of Education and Leadership
Research Methodology for the Project Study
EdD Residency Program
Presenters
• Dr. Wade Smith, Wade.smith@waldenu.edu
• Dr. Paul Englesberg, paul.englesberg@waldenu.edu, 360-380-2238 (PST)
• Dr. Wallace Southerland, wallace.southerland@waldenu.edu, 352-895-9900 / 919-815-5323
• Dr. Martha Richardson, martha.richardson@waldenu.edu
Collaborative Space for Notes
• http://edd-res-method.wikispaces.com/
• Ask to join
• Upload notes for colleagues' use
Purpose of this Session
• Select and apply the appropriate method to a given problem statement and research questions.
• Align methodology with the problem statement, research questions, data collection, and analysis.
• Recognize methodology alignment in a peer-reviewed article.
• Practice summarizing primary research.
Methodology for a Project Study (EdD)
• STUDY TYPE
  • PROJECT STUDY – Scholarly paper and project
• STUDY METHODOLOGY
  • Quantitative
  • Qualitative
  • Mixed
Quantitative Research
• Concise statement of purpose and question
  • NOT creative or emotive rhetorical writing
• Specification of measurable constructs (variables)
• Question poses a relationship or comparison
• Results are numerically presented
• Narrative is objectively oriented: all elements are congruent, with exactly consistent statements of key elements
Research Questions and Purpose
The research question is the most important element of the research endeavor. More time and care should be invested in determining the right question in the correct form than in any other part of the process. Once a proper question is articulated, the rest of the process falls into place [paraphrase] (Creswell, personal communication, 1995).
DVs and IVs for a Quantitative Problem Statement
• Study must include a conjectured relationship between at least 2 variables:
  • Independent Variable (IV): The variable that is being manipulated or tested
  • Dependent Variable (DV): The variable that varies based on manipulation of the IV
• Example: SIGNIFICANT differences in reading comprehension scores (DV) by level of parental involvement (IV)
Dependent and Independent Variables
• According to Creswell (2003), independent variables are generally defined as consisting of the two or more treatment conditions to which participants are exposed. These variables “cause, influence, or affect outcomes” (Creswell, 2003, p. 94).
• Dependent variables are observed for changes in order to assess the effect of the treatment. They “are the outcomes or results of the influence of the independent variables” (Creswell, 2003, p. 94).
Dependent and Independent Variables, cont’d.
For example: The independent variable could be parental involvement in the reading activities of English language learners, while the dependent variable is reading comprehension performance as measured by the reading portion of the Criterion-Referenced Competency Test (CRCT).
I. Quantitative Purpose and Questions
• Specific purpose, research questions, hypotheses, and/or research objectives are concise and clearly defined.
• Must include measurable elements.
Quantitative Research Questions
• There are three basic types of questions:
  • Descriptive: When a study is designed primarily to describe what is going on or what exists
  • Relational: When a study is designed to look at the relationships between two or more variables
  • Comparative: When a study is designed to compare differences between groups or conditions in terms of measurable outcomes; sometimes called causal comparative
Hypotheses
• Statements of relationships between variables you want to test; used only with statistical tests
• Expectations based on what the literature indicates about the relationship between the variables
• Stated in terms of a null hypothesis (no SIGNIFICANT difference or no SIGNIFICANT change) and a research hypothesis (statement of a conjectured SIGNIFICANT difference)
Hypotheses as Operational Definitions
• Hypotheses are, in a sense, a statement of operational definitions
• An operational definition matches a concept, such as intelligence, with a measurement tool, such as the Wechsler Adult Intelligence Scale (WAIS)
• Your hypotheses MUST operationalize your concepts; this makes your hypotheses TESTABLE.
Hypothesis Example
• You are interested in cognition and ADD
• Research (H1): There is a [statistically significant] difference in time on task [as measured by the TaskTest] between ADD children who are given 5 minutes of physical exercise every half hour and ADD students who go through the normal daily routine.
• Null (H01): There is no [statistically significant] difference in time on task [as measured by the TaskTest] between ADD children who are given 5 minutes of physical exercise every half hour and ADD students who go through the normal daily routine. (A sketch of how this pair could be tested follows below.)
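A minimal sketch of how H01 could be tested with an independent-samples t test in Python (scipy). The scores are invented illustration data, and "TaskTest" is the hypothetical measure from the example, not a real instrument.

```python
# Hedged sketch: testing H01 with an independent-samples t test.
# All scores below are invented illustration data, not real results.
from scipy import stats

# Hypothetical time-on-task scores (minutes) on the "TaskTest"
exercise_group = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 13.4, 14.7]
routine_group = [12.1, 13.0, 11.8, 12.6, 13.3, 11.5, 12.9, 12.2]

t_stat, p_value = stats.ttest_ind(exercise_group, routine_group)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

if p_value < 0.05:  # conventional alpha level
    print("Reject H01: a statistically significant difference in time on task.")
else:
    print("Fail to reject H01: no statistically significant difference.")
```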
Hypotheses
• Statistical hypotheses should be straightforward statements of what is expected to happen in regard to the independent and dependent variables.
• NOTE: Descriptive questions do not suggest hypotheses.
Hypotheses, cont’d.
• The NULL is a statement of NO SIGNIFICANT effect, NO SIGNIFICANT relationship, or NO SIGNIFICANT difference, depending on the research question and design. The statement should be clear, understandable, and not unnecessarily complex or obtuse.
Hypotheses, cont’d.
Null hypotheses take the following forms:
• The independent variable has NO SIGNIFICANT effect on the dependent variable.
• There is NO SIGNIFICANT relationship between the independent and dependent variables.
• There is NO SIGNIFICANT difference between the treatment group and the placebo group in terms of the dependent variable.
Hypotheses, cont’d.
• For complex designs, multiple hypotheses are sometimes required. For example, when the design incorporates multiple independent variables (factorial designs), there might be two or more sets of hypotheses.
Research Question Example
Example: Research questions:
1. Does gender of student affect math performance?
2. Does pedagogy X affect math performance?
3. Is there an interaction between gender and
pedagogy X in terms of math performance?
Hypotheses Examples
• Hypotheses (see the factorial ANOVA sketch below):
  • Null 1: Gender has no SIGNIFICANT effect on math performance.
  • Null 2: Pedagogy X has no SIGNIFICANT effect on math performance.
  • Null 3: There is no SIGNIFICANT interaction between gender and pedagogy X in terms of math performance.
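In a factorial design like this, each null corresponds to one F test in a two-way ANOVA. A minimal sketch with Python's statsmodels, using invented scores and assuming a "Pedagogy X" versus control comparison for illustration:

```python
# Hedged sketch: 2 (gender) x 2 (pedagogy) factorial ANOVA on invented data.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.DataFrame({
    "score": [78, 82, 85, 74, 90, 88, 73, 80, 84, 79, 92, 86],
    "gender": ["M", "F"] * 6,
    "pedagogy": ["X"] * 6 + ["control"] * 6,
})

# The main effects test Nulls 1 and 2; the interaction term tests Null 3
model = ols("score ~ C(gender) * C(pedagogy)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))  # one F test and p value per null
```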
Measured Variables
• Note that quantitative analysis requires numeric measurement of variables. So focus groups, interviews, and open-ended observations are NOT typically part of a quantitative study. Quantitative inquiry focuses on variables that can be measured by reliable and valid instruments, such as tests, numerically represented attitude scales, or other tools that yield numerical results.
I. Purpose of the Study
• A simple paragraph that describes the intent of your study. It should flow directly from the problem statement. Two to three sentences are sufficient.
• It should be logical and explicit.
• Review that it directly relates to the problem statement and research questions.
Purpose of the Study Example
• The purpose of this correlational study is to examine the relationship between the level of parental involvement and reading comprehension performance on the XYZ assessment among elementary English language learners.
• (Matches the earlier DV and IV example)
Quantitative Research – Design and Methods
• All elements of the quantitative study MUST be exactly aligned.
• The problem, purpose, questions, hypotheses, design, and methods MUST be congruent and directly aligned.
• The same terms must be used every time an element is discussed, described, or mentioned.
Quantitative Research – Design and Methods, cont’d.
• The design and methods narrative includes:
  • The larger population of interest, to which the results will be generalizable
  • The location and context of the study (how do the location and context relate to the research purpose and questions?)
  • Instruments or means of measuring variables (only variables mentioned in the purpose and questions are measured)
Quantitative Research – Design and Methods, cont’d.
• The subjects who will provide the measurement on variables of interest (why are these the best subjects for the purpose and questions?)
• The sampling strategy (for quantitative inquiry, a random/representative or equivalent sample is required)
Quantitative Research – Design and Methods, cont’d.
• Data analysis: How will the data collected be analyzed to answer the research questions?
Main Research Designs
• Experimental
  • Random assignment – comparative design
• Quasi-experimental – pre-selected groups (not random; convenient)
  • Causal comparative design
  • Within-group designs (pretest/posttest or matched-pairs comparisons): a very weak, unconvincing design
  • Between-group designs (comparisons between groups)
• Non-experimental
  • Descriptive
  • Correlational/secondary data analysis
Selecting a Research Design
• What are the underlying constructs (variables)?
• What is the intended population?
• What is the intended interpretation?
• Sometimes you need more than one type of design
• You can/should use an existing design (Creswell, pp. 168–171)
A Few Examples of Quantitative Studies
• Treatment/intervention outcomes (e.g., in
schools/classrooms)
• Activity analysis (# of behaviors/episode)
• Policy analysis (Breakdown by criteria – content
analysis)
• Program Evaluation
• Needs Assessment
• Surveying an organization to understand the impact of
management practices on employees
• Secondary data analysis
• Developing reliable and valid scales to assess attitudes
Methodology – Setting and Sample
• Setting and Sample
  • Population from which the sample will be drawn
  • Describe and defend the sampling method
  • Describe and defend the sample size
    • Use sample size generators for random selection designs (see notes)
  • Eligibility criteria for participants
  • Characteristics of the selected sample
Population and Sample Example
• What is the population of interest? (Ex: 4th grade boys)
• Your sample is drawn from a population. Note: You can only generalize to the populations from which you have sampled. Describe how you will select your participants.
  • Recruitment strategies
  • What is your role as a researcher?
• Describe the demographics of your sample
  • Gender
  • Age
  • Independent variables
Types of Samples
• WEAK: Convenience sample – the subset of a population that is available for study (may or may not be representative of the population)
• Random sample – each member of a population has an equal chance of being picked (see the sketch below)
  • Equivalent to random/representative
• Website for further information:
  http://www.socialresearchmethods.net/tutorial/Mugo/tutorial.htm
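A minimal sketch of drawing a simple random sample with Python's standard library; the roster of 400 students is hypothetical.

```python
# Hedged sketch: simple random sampling, where every member of the
# (hypothetical) population roster has an equal chance of selection.
import random

population = [f"student_{i}" for i in range(1, 401)]  # hypothetical roster of 400
random.seed(42)                             # fixed seed for a reproducible draw
sample = random.sample(population, k=40)    # 40 students, without replacement
print(sample[:5])
```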
Sample Size
• Rule of thumb: To estimate the number needed for your sample, use 15–20 participants per variable. The larger the sample, the better!
• Statistical power analysis basically answers the question: How large must my sample be to ensure a reasonable likelihood of detecting a difference if it really exists in the population? (See the sketch below.)
• You will need to report statistical power in your study (http://statpages.org/).
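As an alternative to the online generators mentioned above, an a priori power analysis can be run directly in Python with statsmodels. A minimal sketch, assuming a medium effect size (Cohen's d = 0.5), alpha = .05, and the conventional power of .80:

```python
# Hedged sketch: a priori power analysis for an independent-samples t test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,  # Cohen's d (medium)
                                   alpha=0.05,       # significance level
                                   power=0.80)       # 1 - beta
print(f"Participants needed per group: {n_per_group:.0f}")  # roughly 64
```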
Methodology – Instrumentation
• Instrumentation and materials: describe data collection tools
  • Name of instrument
  • How participants will complete it
  • Concepts measured by the instrument
  • How scores are calculated, and the meaning of the scores
  • Processes of assessing the reliability and validity of the instrument (e.g., Cronbach’s alpha); see notes
  • Detailed description of each variable
Instrumentation and Materials
• Describe the survey or other tools you will use in your investigation
  • Give a brief description of the methods, with references to previous studies that have used them
  • Identify the underlying constructs. Constructs should derive from the theoretical foundation.
• Include copies of your measures in an Appendix.
Test Validity
• A score from a test or other measurement instrument must represent what it is intended to represent.
• The researcher must provide some evidence and support for validity.
Test Reliability
• Tests/measurement scores must be reliable.
  • Reliability is a technical issue.
• Reliable means that the score will be similar over different administrations, or that the score is based on an internally consistent set of items/tasks (see the alpha sketch below).
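A minimal sketch of the internal-consistency idea: Cronbach's alpha computed from its definition with NumPy. The 5-item scale and responses are invented for illustration.

```python
# Hedged sketch: Cronbach's alpha = k/(k-1) * (1 - sum(item vars)/total var),
# computed on an invented, hypothetical 5-item attitude scale.
import numpy as np

# Rows = respondents, columns = items
items = np.array([
    [4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 3, 2, 3, 2],
    [4, 4, 3, 4, 4],
    [3, 2, 3, 2, 3],
])

k = items.shape[1]                               # number of items
sum_item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
alpha = (k / (k - 1)) * (1 - sum_item_vars / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")         # ~.70+ is a common benchmark
```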
Methodology – Data Analysis
• An explanation of all descriptive and/or inferential analyses
• Null hypotheses as they relate to research questions
• Specific explanations of variables
• Best presented in a sequence that matches the research questions
Statistical Analysis
• Based on frequency in category, average of measurement, and variability of measurement
• Answers questions such as: Is the frequency of occurrence what we expect? Is frequency the same across groups?
• Are the average results different between or among groups?
• Is there a relationship between or among measured variables?
Characteristics of a Quantitative Method
• Rooted in testable and confirmable theories
• Looking for relationships
• Statistical tests are used to analyze the data:
  • t tests
  • Chi-square analysis
  • Analysis of variance (ANOVA), analysis of covariance (ANCOVA)
  • Correlation
  • Linear/logistic regression (see notes)
Types of Tests
• Independent-samples t test (compare scores of two independent groups)
  • Compare achievement of two groups
  • Compare employees from two companies on morale
• Paired-samples t test (compare two groups of scores that are matched)
  • Compare the pretest and posttest scores provided by participants of an intervention (pre-post design); see the sketch below
• ANOVA (comparing two or more levels of an independent variable)
  • Can be between-groups (independent groups) or repeated-measures (matched scores)
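A minimal sketch of the paired-samples case for a pre-post design, with invented pretest and posttest scores from the same hypothetical participants:

```python
# Hedged sketch: paired-samples t test on invented pre-post data.
from scipy import stats

pretest = [62, 70, 58, 65, 74, 61, 68, 59]    # same 8 participants,
posttest = [68, 75, 63, 70, 80, 66, 74, 61]   # measured twice

t_stat, p_value = stats.ttest_rel(pretest, posttest)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```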
Types of Tests, cont’d.
• Chi-square (examine the statistical relationship between categorical variables)
  • Association – relationship between two categorical variables
  • Goodness of fit – is the distribution in your sample the same as in the population, or the same as in another study?
• Correlation (relationship between 2 variables; see the sketch below)
  • Pearson r (parametric) or Spearman (nonparametric)
• Regression (examine the effect of multiple independent variables on one variable); see notes
  • How do various differentiation strategies predict achievement?
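Minimal sketches of a chi-square test of association and a Pearson correlation with scipy. The counts and scores are invented, and the correlation variables loosely echo the parental-involvement example used earlier.

```python
# Hedged sketch: chi-square and Pearson r on invented illustration data.
from scipy import stats

# Chi-square test of association: 2x2 table of observed counts
observed = [[30, 10],    # e.g., group A: pass / fail (hypothetical)
            [22, 18]]    #       group B: pass / fail
chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi-square({dof}) = {chi2:.2f}, p = {p:.4f}")

# Pearson r: parental involvement (hours) vs. reading score (hypothetical)
hours = [2, 5, 1, 4, 6, 3, 7, 2]
scores = [55, 72, 50, 68, 80, 60, 85, 58]
r, p = stats.pearsonr(hours, scores)
print(f"r = {r:.2f}, p = {p:.4f}")
```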
Which test should I use?
• Tutorial that helps you decide:
  http://www.wadsworth.com/psychology_d/templates/student_resources/workshops/stat_workshp/chose_stat/chose_stat_01.html
• This site has four different interactive webpages that help you decide the correct analytical procedure:
  http://statpages.org/#WhichAnalysis
Tool for Aligning Methods and Questions
• Problem
• Purpose
• Research Question
• Hypothesis
• Analysis
GOAL = Overall Alignment in your Study
 Problem Statement
 Nature of the Study/Guiding Question
 Purpose of the Study
and
 Research Design
 Setting and Sample
 Data Collection
 Data Analysis
References
• Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Mahwah, NJ: Lawrence Erlbaum.
• Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage Publications.
• Gravetter, F. J., & Wallnau, L. B. (2004). Statistics for the behavioral sciences (6th ed.). Belmont, CA: Thomson-Wadsworth.
• Hallahan, M., & Rosenthal, R. (1996). Statistical power: Concepts, procedures, and applications. Behaviour Research and Therapy, 34, 489–499.
• Lipsey, M. W., & Wilson, D. B. (1993). The efficacy of psychological, educational, and behavioral treatment: Confirmation from meta-analysis. American Psychologist, 48(12), 1181–1209.
• Murphy, K. R., & Myors, B. (1998). Statistical power analysis: A simple and general model for traditional and modern hypothesis tests. Hillsdale, NJ: Erlbaum.
• Patten, M. L. (2007). Understanding research methods: An overview of the essentials (6th ed.). Los Angeles, CA: Pyrczak Publishing.
• Rossi, J. (1990). Statistical power of psychological research: What have we gained in 20 years? Journal of Consulting and Clinical Psychology, 58(5), 646–656.
• Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics (4th ed.). Needham Heights, MA: Allyn & Bacon.
Qualitative Research/Naturalistic Inquiry: Project Study
• Natural setting & culture/context-bound
• Use of researcher’s tacit/implicit knowledge
• Purposive sampling (based on research question)
• Researcher as instrument (interpersonal skills essential)
• Qualitative methods (narrative observation & in-depth interview)
• Inductive data analysis (emic vs. etic)
• Grounded theory (emerges from data/represents “reality”)
Qualitative Research/Naturalistic Inquiry (cont’d.)
• Negotiated outcomes (interpret with participants)
• In-depth contextualized reporting (thick description)
• Idiographic (vs. nomothetic) representation
• Tentative (conditional) application
• Focus-determined but emerging boundaries
• Criteria for trustworthiness (confidence in findings)
Qualitative Research: Approaches
• Biography – life history of an individual (n = 1)
• Grounded theory – develop theory inductively (20-30+)
• Ethnography – describe/interpret culture
• Phenomenology – meaning of experiences re: particular phenomenon/construct (5-10)
• Case study – in-depth study of one or more cases
• Action research – theory-research-action
• Participatory (action) research – stakeholders as partners
Biography
• Involves the study of an individual’s experiences.
  • Can be told to the researcher
  • Can be biographies, autobiographies, life histories, and oral histories
Researcher Role in Biographies
• Select the type of biographical study
• Begin with objective experiences, noting the life course stages and experiences
• Gather stories or concrete contextual biographical material
• Organize stories around themes that account for pivotal events/epiphanies
• Explore the meaning of the stories
• Look to explain the meaning
Phenomenology
• Explores how people make meaning of their experiences.
• Based on the philosophical perspective of Husserl
• Focus is on the intentionality of consciousness (experiences are mediated by memory, image, and meaning)
Researcher Role in Phenomenology
• Reduction
• Analysis of specific statements and themes
• Search for all possible meanings
• Setting aside all prejudgments
• Bracketing researcher experiences
Researcher Role in Phenomenology (cont’d.)
• Understand the philosophical perspectives of the approach
• Write research questions that explore what experiences mean to people, and ask individuals to describe everyday life experiences
• Collect data from people who have experienced the issue or phenomenon under investigation
• Analyze data
• Write reports that describe the essence of the experiences and the similarities in the meaning of these experiences
Case Study
• Exploration of a case or multiple cases over time
  • The case can be a program, an event, an activity, or individuals.
  • The case should be viewed within its physical, social, historical, or economic setting.
• Involves detailed, in-depth data collection from multiple sources of information
Researcher Role in Case Study
• Consider which type of case study will be most useful
• Select case studies that show an array of perspectives on the problem, process, or event
• Collect data using multiple sources of information, such as observations, interviews, documents, and audiovisual materials
• Analyze data through holistic or embedded analysis, using within-case analysis or cross-case analysis
• Report the lessons learned from the case.
Action Research
• Seeks full collaborative inquiry from all participants
• Often aimed at sustained change in organizational, institutional, or community contexts
• The distinction between participants and researchers blurs.
Researcher Role in Action Research
• Begin with a critical review of past actions
• Engage in research to improve practice and to gain understanding and change at the same time
• Engage in a process of regular, critical, and systematic reflection as you act or practice at every stage
• Maintain flexibility
• Plan the next action using the information gained from that review
• Report
Grounded Theory
• Generates the discovery of a theory closely related to the phenomenon being studied
• Focuses on how individuals interact, take actions, or engage in a process in response to a phenomenon
Researcher Role in Grounded Theory
• Theoretically choose participants
• Conduct field interviews to the point of saturation for categories
• Collect and analyze observations and documents
• Analyze data as you go
• Use a systematic approach to data analysis that involves open, axial, and selective coding and a conditional matrix
• Produce a substantive-level theory
Methods
• Participant (narrative) observation
• Key informant interviewing
• In-depth interviewing
• Focus group interviews
• Ethnographic (culture-specific) survey
• Freelist & pilesort (elicitation techniques)
• Fieldnotes (researcher)
• Journal/logs/diaries (participants)
• Social networks
• Spatial mapping
• Tangible products (“artifacts”)
Trustworthiness (Lincoln & Guba, 1985)
• The extent to which one can have confidence in the study’s
findings
• Parallel of reliability, validity, & objectivity in traditional
“quantitative” research
• Criteria
• Credibility—plausible, reflect reality of participants (cf.
internal validity)
• Transferability—applicable to other contexts/samples
(cf. external validity)
• Dependability—accounts for instability/change (cf.
reliability)
• Confirmability—authenticate process & document
researcher bias (cf. objectivity)
Ensuring Trustworthiness (Lincoln & Guba, 1985)
• Prolonged engagement (a)
• Persistent observation (a)
• Triangulation (a)
• Peer debriefing (a)
• Member checks (a)
• Thick description (b)
• Audit trail (c, d)
• Negative case analysis (a)
• Reflexive journal (a, b, c, d)
• Referential adequacy (a)
Key: (a) credibility; (b) transferability; (c) dependability; (d) confirmability
Principles of Data Transformation (Description, Analysis, Interpretation)
• Ongoing and recursive
• Requires systematic documentation of procedures
• Specific to purpose and audience
• Integrates emic (meaningful to participants) and etic (meaningful to observers) perspectives
• Includes both descriptive and inferential language
• Maximizes data triangulation
• Involves search for confirming and disconfirming evidence
• Participatory process
Qualitative Research: Sampling
• Sampling refers to selection of individuals, units, settings
• Typically purposeful or criterion-based, that is, driven by characteristics relevant to research questions
• Purposes/goals:
  • Identify representative or typical cases
  • For controlled comparison (to demonstrate differences based on criteria; e.g., multiple case studies)
  • Reflect heterogeneity or variation
  • Find cases critical to theory/question (e.g., extreme, unique, ideal)
Sample Size: Rule of Thumb
• Biography/case study: 1
• Phenomenology: < 10
• Grounded theory/ethnography/action research: 20-30 to reach saturation
• Methods (examples)
  • Key informants: 5
  • In-depth interviews: 30
  • Freelist/pilesort: 30
  • Focus group: based on “groupings” represented (e.g., male & female, 3 age levels)
  • Ethnographic survey: typically larger & representative (purposeful or random based on purpose)
References
• Creswell, J. W. (1998). Qualitative inquiry & research design: Choosing among five traditions. Thousand Oaks, CA: Sage.
• Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.
• Maxwell, J. A. (1996). Qualitative research design: An interactive approach. Thousand Oaks, CA: Sage.
• Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.
• Schensul, J. J., & LeCompte, M. D. (Eds.). (1999). Ethnographer’s toolkit: Volumes 1–7. Walnut Creek, CA: AltaMira Press.
Mixed-Methods Design
• Decisions
  • Regarding the type of data to be given priority (qualitative or quantitative)
  • Regarding when data are collected and analyzed
• Most common designs sequence the use of qualitative and quantitative methods
Explanatory Design
• Data are collected in two phases
  • Quantitative data are collected first; qualitative data are collected later
  • Quantitative data are the emphasis, with qualitative data used to explain the quantitative findings.
Example of Explanatory Design
• Schools that participate in a National Science Foundation program are surveyed. Teachers from the schools select the most important outcomes for their students. Schools with high numbers of students who received awards at science contests are selected. Teachers are interviewed to understand how they are using the NSF program.
Exploratory Design
• Data are collected in two phases
  • Qualitative data are collected first (interviews); quantitative data are collected later (surveys)
  • Advantage: Surveys are constructed from the language used in the interviews
Example of Exploratory Design
• Ten schools are selected, and case studies are conducted using observations, interviews, and document review to describe the procedures and activities these schools used to address school improvement for No Child Left Behind. Teachers and administrators are interviewed. A survey is developed following analysis of the case studies. One hundred schools with similar problems are surveyed, which produces extensive data from this larger sample.
Triangulation Design
• Most complex design
  • Qualitative and quantitative data are collected simultaneously
  • Results from both types of data are compared (triangulated) to see if similar findings are produced.
Example of Triangulation Design
• A researcher attempted to paint a portrait of effective teacher techniques and behaviors. He used qualitative techniques (classroom observations, daily logs, interviews with students and teachers). He also used quantitative instruments (performance checklists, rating scales, discussion flowcharts). Triangulation was achieved by comparing the qualitative data with the results of quantitative measures of classroom interaction and achievement to develop a detailed description of each teacher’s teaching style, behaviors, and techniques and their effects on students. Teachers were compared for similarities and differences.
Guide to Conducting a Mixed-Methods Study
• Determine the feasibility
  • Do I have the time to use a Mixed-Methods Design?
  • Do I have the knowledge to use a Mixed-Methods Design?
  • Do I have the resources to use a Mixed-Methods Design?
• Identify a clear rationale
  • What are the reasons for needing both qualitative and quantitative data?
Evaluating Mixed-Methods Research
• Complex
  • Consider whether the criteria for the qualitative/quantitative approaches are appropriate
• Evaluated to determine whether the design and research procedures match the guide for conducting the mixed-methods study
  • Was the survey pilot tested? What was the response rate? (Quantitative)
  • Did the researcher spend an adequate amount of time in the field? Did the researcher use multiple sources? (Qualitative)
Questions for Evaluating a Mixed-Methods Study
• Is there a clear rationale for using each type of data?
• Is the type of design clearly described or clearly presented in a visual manner?
• Are there research questions for each type of data, and are these appropriate for the sequence and priority of data collection and analysis?
Questions for Evaluating a Mixed-Methods Study (cont’d.)
• Are the data collection procedures clearly described? Is evidence of the quality of the methods of data collection presented, as appropriate, for both quantitative (reliability and validity) and qualitative (credibility and dependability) measures?
• Are the procedures for data analysis consistent with the type of design chosen and appropriate to the research questions asked?
• Do the written results have a structure that is appropriate to the type of mixed-methods design being used?
(Creswell, 2010, p. 288)
References
• Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.
• Morse, J. M. (1991, March/April). Approaches to qualitative-quantitative methodological triangulation. Nursing Research, 40(1), 120–123.
• Tashakkori, A., & Creswell, J. W. (2007). Exploring the nature of research questions in mixed methods research. Journal of Mixed Methods Research, 1(3), 207–211.
• Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.
• Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.
Group Activity: Your Turn to Draft – DIRECTIONS
• Divide students into small groups.
• Instruct students to examine the exemplar study and rubric.
• Using the exemplar study, students should locate the chosen method in Section 2.
• Using the rubric and the exemplar, students will locate the standards and sub-standards in the rubric and match them to those elements in the study exemplar.
• Instruct students to enter the page numbers in the rubric where the standards are met in the study exemplar.
• Students will then complete an alignment grid to show how the chosen methodology aligns with the research questions and data collection/analysis.
• Notify students that the Tool for Aligning Methods and Questions handout is for quantitative studies, and quantitative portions of mixed-methods designs, only.
It’s Your Turn…
• Examine your proposal, rubric, and alignment grid.
• Decide which methodology would be appropriate for your study.
• Complete your alignment grid using 1–2 research questions.
Questions?
Don’t leave until I get to answer your question.
Evaluation of Session
• Submit Your Seminar Evaluation
• Provide Feedback
• Include Improvements!