Faculty Assessment of Doctoral Student Research: Conceptions of

Exploring the Roles of Faculty Supervision: Improving Qualitative Doctoral Dissertation Methodology
Dan Kaczynski, PhD
dkaczyns@uwf.edu
Research Problem
• More qualitative dissertations
• Increasing emphasis on quality
• Shifting supervisory roles
• Understand assessment practices
• Strengthen qualitative research skills
• Develop future researchers
Open Discussion
• Do you use technology in your research?
Open Discussion
• How should we explore the tensions within and between:
– Assessing Quality
– Adoption of QDAS
Qualitative Software
NVivo, MAXQDA, ATLAS.ti, QDA Miner, Qualrus, Transana
Kaczynski (2004)
http://www.aare.edu.au/04pap/kac041065.pdf
What is Good Qualitative Research?
What does good work look like?
• Identify indicators of quality in a thesis:
• Identify common errors in a thesis:
What is Quality?
The researcher's logic of justification
“Flaws in the logic of justification can potentially occur anywhere in the inquiry process. The nature of such flaws and where they occur can jeopardize the soundness of a study in one or more ways.”
(Piantanida & Garman, 1999, p. 147)
Types of Quality Criteria
• Philosophical Criteria (Lincoln, 1995; Lincoln & Guba, 1985)
• Procedural Criteria (Creswell, 1998)
Philosophical Criteria (Lincoln & Guba, 1985)
• Credibility: Is the work authentic?
• Transferability: Will the work fit outside this situation?
• Dependability: Is the researcher consistent?
• Confirmability: Are interpretations defensible?
Procedural Criteria
• Quality of methods (open-ended interviews)
• Quality of data (verbatim long transcripts)
• Quality of data analysis (comprehensive data treatment)
(Silverman, 2004 [Sacks, 1984])
Standardized Procedural Criteria
(controversial checklists or guidelines)
• Does the title reflect the study focus?
• Is the problem socially important?
• Is the literature review comprehensive?
• Has the study conformed to ethics standards?
• Are issues of sampling discussed?
• Did the findings answer the questions?
• Was the study written convincingly?
• Are issues of trustworthiness addressed?
Quantitative and Qualitative Criteria for Assessing Research Quality and Rigor

Quantitative term | Qualitative term | Strategy employed
Internal validity | Credibility | Prolonged engagement in field; use of peer debriefing; triangulation; member checks; time sampling
External validity | Transferability | Provide thick description; purposive sampling
Reliability | Dependability | Create an audit trail; code-recode strategy; triangulation; peer examination
Objectivity | Confirmability | Triangulation; practice reflexivity

(Anfara, Brown, & Mangione, 2002, p. 30)
Transparent Assessment
• Explore rich diversity of meanings
• Sensitized appreciation of worth
• Deeper assessment of analysis
• Multiple paths to look inside
• Transparency strengthens credibility
Data Collection
• Stage 1: Survey research (supervisory ability, quality factors, technology, resources, professional development)
• Stage 2: Interviews/documents (perceptions, processes, assessment practices)
• Stage 3: Document review
Findings: Knowledge, Ability, and Confidence
• Supervising: 73.9% rated satisfactory or higher (M = 3.48, SD = 1.28)
• Serving on a committee: 91.3% rated satisfactory or higher (M = 4.17, SD = 0.94)
• Judging quality: 91.3% rated satisfactory or higher (M = 4.09, SD = 0.95)
Findings: Technology Tools Used in Assessment
[Bar chart: number of respondents by technology tool used in assessment. Key results: 52% none, 30% not applicable, 17% NVivo, 4% InfoRapid.]
Findings: Resources Consulted for Expertise
[Bar chart: number of respondents by resource used. Key results: 83% others, 70% publications, 30% conference workshops, 22% campus workshops, 13% continuing education.]
Findings: Conceptualizations of Quality (cont.)
AI: Alternative interpretations
CE: Consideration of ethical issues
AG: Ability to generalize findings
HC: Hierarchical code structure
MC: Member-checking
MS: Memos
AT: Methodological audit trail
PD: Peer debriefing
PF: Prolonged field engagement
AS: Qualitative data analysis software
RO: Researcher objectivity
SS: Sampling strategies
SD: Self-disclosure
SC: Social context
TO: Theoretical orientation
TN: Triangulation
VY: Validity
Stage 2 Findings: Critical Needs
• Building knowledge and skills
– Moving beyond superficial assessment
– Significance of researcher transparency
– Teaching students to self-assess
• Building a community of practice
– Strengthening qualitative research skills
– Sharing assessment strategies
– Engaging in professional development
Stage 3 Findings: Role of Technology
[Bar chart summarizing the document review: total dissertations, qualitative, quantitative, mixed methods, and analysis software (not identified); counts on a 0–90 scale.]
Study Findings
• Highly favorable attitudes toward qualitative research
• Diverse conceptualizations of quality
• Need for alternative assessment frameworks
• Need and desire to strengthen knowledge and skills
• Need for professional development
Future Challenges
• Progressing research methods: Mixed → Blended → Integrated
• Emerging research innovations: mainstream adoption of QDAS
• Positioning standards for quality research
Future Research Questions
• What does it mean to disclose or conceal the role of technology?
• Does nondisclosure of analysis software imply the presumption that the use of technology is ubiquitous and commonly accepted?
• Does nondisclosure of QDAS suggest a student’s fear of the supervisor’s acceptance or sanctioning?
• In what ways and under what conditions does a technological tool become a barrier to the learning process for the teacher and the learner?