
Qualitative Data Analysis
Mary Cassatt: The Sisters, 1885
Quantitative and Qualitative
Some Definitions
• Quantitative data are observations coded in
numerical format.
• Qualitative data are observations coded in any
other format besides numbers (e.g., text,
photographs, film, voice recordings).
• Quantitative methods generally refer to survey
research methods.
• Qualitative methods typically refer to any method
besides survey research methods.
Quantitative and Qualitative
Some Definitions (Continued)
• Quantitative methods can be used to collect either
quantitative or qualitative data.
• Qualitative methods can be used to collect either
quantitative or qualitative data.
• Both quantitative and qualitative data can be used
to build theory (i.e., induction).
• Both quantitative and qualitative data can be used
to test theory (i.e., deduction).
Quantitative and Qualitative
Some Definitions (Continued)
• Epistemologically, quantitative and qualitative
methods do not differ in objectivity, validity, or
reliability.
• Epistemologically, quantitative and qualitative data
do not differ in objectivity, validity, or reliability.
• The researcher selects the most appropriate
method(s) to collect the most appropriate data to
answer the research questions.
Linking Theory and Analysis
1. Discovering Patterns
• We defined theory as “an empirically falsifiable
set of abstract statements about reality.”
• The term, “empirically falsifiable” reflects a
broader range of testing than statistics
generated from quantitative data.
• Theories can be empirically falsified using
qualitative data.
• Example: “From my observations of women
in the workplace over the past three months,
it seems plausible that they are more likely
to experience discrimination than are men.”
Linking Theory and Analysis
1. Discovering Patterns (Continued)
• The qualitative test of a hypothesis, although subjective in the sense of relying on the researcher's observations rather than on statistics, can be as valid as a statistical test.
• A qualitative analysis of data uses the same processes of data organization and evaluation as are found in quantitative analysis, albeit with non-numerical observations.
Linking Theory and Analysis
1. Discovering Patterns (Continued)
• The process of data analysis:
• Frequencies: How often?
• Magnitudes: How much?
• Structures: To whom?
• Processes: In what order?
• Causes: Why?
• Consequences: With what outcomes?
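• Illustration (not from the original slides): a minimal Python sketch of answering the "How often?" question by tallying code labels attached to field notes; the observations and code names are invented.
```python
from collections import Counter

# Hypothetical coded field notes: each observation carries one or more codes.
observations = [
    {"note": "Supervisor interrupted J. twice in meeting", "codes": ["interruption", "meeting"]},
    {"note": "J. passed over for client presentation",     "codes": ["exclusion"]},
    {"note": "K. interrupted while presenting figures",    "codes": ["interruption", "meeting"]},
]

# Frequencies: how often does each code appear across the observations?
frequencies = Counter(code for obs in observations for code in obs["codes"])
print(frequencies.most_common())  # e.g., [('interruption', 2), ('meeting', 2), ('exclusion', 1)]
```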
Linking Theory and Analysis
1. Discovering Patterns (Continued)
• Variable-oriented analysis: The use of one or more independent variables to predict the outcome of a dependent variable.
• Case-oriented analysis: Thorough, idiographic examination of a single case.
• Cross-case analysis: Examination of
similarities across cases (e.g., an inductive
approach to variable-oriented analysis).
Linking Theory and Analysis
2. Grounded Theory Method
• The use of cross-case analysis to inductively
create/adopt concepts and build theory.
• A key feature of GTM is the use of the constant
comparative method.
Linking Theory and Analysis
2. Grounded Theory Method (Continued)
• Constant comparative method.
1. Comparing incidents across cases.
2. Developing/adopting concepts.
3. Comparing concepts across cases.
4. Integrating concepts from different avenues
of inductive inquiry.
5. Delimiting the theory (creating/adopting a
theoretical approach; ruling out some
concepts that seem less important).
6. Writing theory: Explaining the approach and
theory to others.
Linking Theory and Analysis
3. Semiotics
• The “science of signs.”
• Learning the meaning of language, symbols,
and behavior within a social setting.
• Focus on content validity of concepts.
• “Signs” can be material artifacts or nonmaterial
instances (e.g., “body language,” gestures,
word usage).
• Semiotics also focuses on the dramaturgy of
everyday life: the presentation of self and
meanings to others.
Linking Theory and Analysis
4. Conversation Analysis
• Extremely close scrutiny of the way people
converse with one another.
• Especially important to studies of
ethnomethodology.
• Assumptions of conversation analysis:
1. Conversation is a socially structured
activity.
2. Conversation must be understood
contextually.
3. Conversation analysis seeks to understand the structure and meaning of conversation.
Qualitative Data Processing
1. Coding
• Coding involves classifying or categorizing
individual pieces of data.
• Coding of qualitative data can create either
qualitative or quantitative categories.
• Coding units: concepts are coded as the units of analysis. Thus, a single sentence or several pages of text might be coded identically, as an expression or example of a concept.
• Similarly, a single unit of analysis might be assigned to several codes simultaneously.
Qualitative Data Processing
1. Coding (Continued)
• Coding as a physical act: The researcher
decides how each event or artifact is to be
coded.
• This process can be done “by hand” or with the aid of computer software.
• Creating codes: The researcher might decide
which concepts to investigate a priori. In most
cases of qualitative analysis, the researcher
builds a series of codes inductively through the
process of analyzing the data.
Qualitative Data Processing
1. Coding (Continued)
• Creating codes: Creating codes inductively
typically requires several iterations of trial and
error to decide which codes will be used.
• The researcher begins with “open coding,” the
process of creating many codes as one takes
an initial look at the data.
• Open coding is followed by “axial coding,” or
the process of selecting the key codes and
concepts of interest. Axial coding involves a
regrouping of the data into the main coding
scheme.
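• Illustration: a rough Python sketch of the open-to-axial regrouping described above; the open codes and axial categories are invented for the example.
```python
# Hypothetical open codes produced on a first pass through the data.
open_codes = ["interruption", "exclusion", "pay gap", "mentoring",
              "promotion denial", "informal networks"]

# Axial coding: regroup the open codes under a smaller set of key categories.
axial_scheme = {
    "everyday interaction": ["interruption", "exclusion"],
    "career outcomes":      ["pay gap", "promotion denial"],
    "social capital":       ["mentoring", "informal networks"],
}

# Check that every open code has found a home in the axial scheme.
assigned = {c for codes in axial_scheme.values() for c in codes}
unassigned = set(open_codes) - assigned
print("unassigned open codes:", unassigned or "none")
```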
Qualitative Data Processing
1. Coding (Continued)
• “Selective coding” seeks to identify the central
code in the study, the one to which all other
codes are related.
• Once a coding scheme is finalized, to the
extent that any coding scheme is “final,” the
researcher will try to assign instances to the
existing coding scheme (i.e., one has to stop
building a coding scheme at some time to
complete the task of coding).
Qualitative Data Processing
2. Memoing
• Memoing is the process of writing memos to
yourself as you develop the coding scheme.
• These notes help the researcher recall
ideas for coding and developing concepts
as the data analysis progresses.
• Code notes identify the code labels and
their meanings to the researcher.
• Theoretical notes remind the researcher of
ideas for concept and theory development
during the coding process.
Qualitative Data Processing
2. Memoing (Continued)
• Operational notes deal primarily with
methodological issues. These are ideas and
reminders about the data gathering or coding
process itself.
• The elemental memo is a detailed account of
relatively specific points of interest to the
researcher. The final coding scheme and
conceptual development of the study rely upon
the compilation of these memos.
Qualitative Data Processing
2. Memoing (Continued)
• Sorting memos are notes regarding the
organization and compilation of the elemental
memos. These are ideas about how to move
to axial codes.
• Integrating memos are ideas about how to
organize the axial codes into a coherent
account of the data.
• The process of memoing is iterative. The
many memos and notes entail a trial and error
process of developing the overall conceptual
account of the data.
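• Illustration: one minimal way to keep code notes, theoretical notes, and operational notes distinct is to store each memo with an explicit type; a Python sketch with invented memo content.
```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Memo:
    kind: str                      # "code", "theoretical", or "operational"
    text: str
    related_codes: list = field(default_factory=list)
    written: date = field(default_factory=date.today)

# Hypothetical memos written during coding.
memos = [
    Memo("code", "'exclusion' = being left out of decisions one is entitled to join",
         related_codes=["exclusion"]),
    Memo("theoretical", "Exclusion and interruption may both express status hierarchies."),
    Memo("operational", "Re-check week 2 field notes; they were coded before 'exclusion' existed."),
]

# Sorting memos: pull together everything bearing on one emerging code.
exclusion_memos = [m for m in memos if "exclusion" in m.related_codes]
print(len(exclusion_memos), "memo(s) attached to the code 'exclusion'")
```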
Qualitative Data Processing
3. Concept Mapping
• In qualitative data analysis, the researcher spends a great deal of time committing thoughts to paper and organizing ideas into a coherent conceptual approach to the data.
• This process is iterative and one of trial and
error.
• Placing concepts in a graphical format, called
concept mapping, can help the researcher
organize thoughts.
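• Illustration: a concept map is essentially a labeled graph; a minimal Python sketch that stores invented concepts and relations as an adjacency mapping and prints each link for review.
```python
# Hypothetical concept map: each key is a concept, each value lists
# (related concept, label for the relationship) pairs.
concept_map = {
    "workplace discrimination": [("exclusion", "expressed through"),
                                 ("promotion denial", "expressed through")],
    "exclusion":                [("informal networks", "reinforced by")],
}

for concept, links in concept_map.items():
    for target, relation in links:
        print(f"{concept} --{relation}--> {target}")
```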
Content Analysis
1. Introduction
• The analysis of existing documents that can
reveal important information about human
behavior.
• Content analysis can be used for exploratory, descriptive, or explanatory research.
• Content analysis can create qualitative or
quantitative data.
• Content analysis can be used for deductive or
inductive research.
Content Analysis
2. Operationalization
• Concepts of interest must be operationalized,
as with any method.
• Content analysis allows for revisions to
operationalization.
• After examining the first 10 of 50
documents, for example, the researcher
might decide to revise the operational
definition of one or more concepts and then
restart the analysis.
Content Analysis
3. Unit of Analysis
• Content analysis can be conducted on the
population of events (e.g., all letters to the
newspaper Editor within a three-month period)
or on a sample of the population.
• One or more units of analysis can be
investigated.
• For example, the unit of analysis might be the letters to the Editor, paragraphs within letters, sentences within paragraphs, or any combination of these during the analysis.
Content Analysis
4. Sampling
• Content analysis can utilize any type of
sampling design.
• Cluster sampling is common in the analysis of text, wherein the researcher identifies clusters (e.g., documents and the paragraphs within them) and then draws a sample of those paragraphs.
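• Illustration: a minimal Python sketch of cluster sampling of text, assuming (for the example) that letters serve as the clusters and paragraphs are sampled within the selected letters; all documents and counts are invented.
```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical corpus: letters to the Editor, each split into paragraphs.
letters = {
    "letter_01": ["para 1 ...", "para 2 ...", "para 3 ..."],
    "letter_02": ["para 1 ...", "para 2 ..."],
    "letter_03": ["para 1 ...", "para 2 ...", "para 3 ...", "para 4 ..."],
}

# Stage 1: sample letters (the clusters).
sampled_letters = random.sample(sorted(letters), k=2)

# Stage 2: sample paragraphs within each sampled letter.
sample = [(name, p) for name in sampled_letters
          for p in random.sample(letters[name], k=min(2, len(letters[name])))]
print(sample)
```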
Content Analysis
5. Coding
• Coding—the process of transforming raw data
into a standardized form using some
conceptual framework—can be qualitative or
quantitative.
• The researcher can code the data manually,
use computer software packages to help store
and organize codes, or use sophisticated
software that codes text.
Content Analysis
5. Coding (Continued)
• Qualitative coding involves assigning artifacts
to one or more categories.
• Consider, for example, the phrase “The state constitution should be amended to ban same-sex marriage,” which might appear in a letter to the newspaper Editor.
• This phrase might be classified as:
• Same-sex marriage: opposed.
• Constitutional amendment.
• Policy suggestions.
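• Illustration: a minimal Python sketch of assigning that phrase to several categories at once and retrieving segments by code; the storage scheme is an invented stand-in, not any particular software package.
```python
# One text segment can carry several qualitative codes at once.
segment = "The state constitution should be amended to ban same-sex marriage."

coded_segments = {}  # maps each segment to the list of codes assigned to it
coded_segments[segment] = ["same-sex marriage: opposed",
                           "constitutional amendment",
                           "policy suggestions"]

# Retrieval: find every segment carrying a given code.
target = "constitutional amendment"
print([s for s, codes in coded_segments.items() if target in codes])
```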
Content Analysis
5. Coding (Continued)
• Sophisticated computer software packages,
such as N-Vivo, facilitate qualitative coding of
artifacts.
• Text, for example, is scanned into the
program.
• Then, the researcher assigns one or more
markers to each unit of analysis.
• Later, the researcher can sort the text by
markers (e.g., all text related to “same-sex
marriage: opposed”) for further analysis.
Content Analysis
5. Coding (Continued)
• The researcher might choose to use quantitative coding of artifacts.
• Artifacts can be coded as nominal-, ordinal-, or
interval-level data.
• The researcher can apply numerical codes or
use sophisticated computer software packages
to conduct the coding.
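• Illustration: a rough Python sketch of ordinal-level quantitative coding of letters; the scale and the example letters are invented.
```python
# Hypothetical ordinal scale for the tone of objection expressed in a letter:
# 0 = no objection, 1 = respectful objection, 2 = hateful objection.
tone_codes = {"no objection": 0, "respectful objection": 1, "hateful objection": 2}

coded_letters = [
    ("letter_01", "respectful objection"),
    ("letter_02", "hateful objection"),
    ("letter_03", "no objection"),
]

numeric = [tone_codes[label] for _, label in coded_letters]
print(numeric)  # [1, 2, 0]
print(sum(1 for v in numeric if v > 0), "letters expressing some objection")
```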
Content Analysis
5. Coding (Continued)
• Both manifest and latent content can be coded
during content analysis.
• For example, the latent content of the
phrase, “There should be a constitutional
ban against same-sex marriage” might be
coded as “respectful objection,” whereas the
latent content of the phrase, “Those damn
[enter bad word here] should not be allowed
to marry,” might be coded as “hateful
objection.”
Content Analysis
6. Inductive and Deductive Research
• As is the case in many instances of research—
both quantitative and qualitative—the
researcher might pursue both deductive and
inductive processing of the data.
• The researcher might begin by using the
data—in either qualitative or quantitative
form—to test hypotheses.
• Then, the researcher might try altering an
existing theory or positing a new theory
during content analysis.
Content Analysis
7. Reliability
• The reliability of coding schemes used in
content analysis typically is assessed by using
the method of inter-rater reliability.
• That is, multiple researchers will code the raw
data and compare notes with one another.
• Discussion among researchers results in a
consensus opinion about how to code certain
artifacts.
• This approach provides intersubjectivity to the
analysis but also might create group-level bias.
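• Illustration: a minimal Python sketch of checking agreement between two coders using simple percent agreement (Cohen's kappa, which corrects for chance agreement, is a common refinement); the codes are invented.
```python
# Codes assigned to the same ten artifacts by two independent coders (hypothetical).
coder_a = ["opposed", "opposed", "supportive", "neutral", "opposed",
           "supportive", "neutral", "opposed", "supportive", "neutral"]
coder_b = ["opposed", "neutral", "supportive", "neutral", "opposed",
           "supportive", "opposed", "opposed", "supportive", "neutral"]

agreements = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = agreements / len(coder_a)
print(f"percent agreement: {percent_agreement:.0%}")  # disagreements go to group discussion
```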
Computer Programs for Qualitative Data
1. N-Vivo (NUD*IST)
• Computer software programs, such as N-Vivo (the successor to NUD*IST), are designed to help the researcher organize thoughts regarding qualitative data.
• The computer program does no active coding
itself, but only assists the researcher in
marking text, organizing codes, and retrieving
text in various grouping schemes as a means
of developing axial codes.
Computer Programs for Qualitative Data
1. N-Vivo (Continued)
• Textual data, which might be the raw data or
notes taken by the researcher about artifacts,
is entered into the word processing component
of the program.
• Each unit of analysis is marked with one or more codes.
• The researcher can retrieve text within different groupings of codes, develop higher-order (axial) codes, and write memos and notes as the data analysis proceeds.
Computer Programs for Qualitative Data
2. Other Computer Programs
• The Ethnograph
• HyperQual
• HyperResearch
• HyperSoft
• Qualrus
• QUALOG
• Textbase Alpha
• SONAR
• Atlas.ti
Qualitative Analysis of Quantitative Data
Special Note
• It is important to note that qualitative analysis
and quantitative analysis are neither competing
nor incompatible.
• Unless one can conduct both types of analysis, in some cases simultaneously, one is limiting one's potential as a social researcher.
• Qualitative analysis of quantitative data, for
example, involves the visual presentation of
numerical information to aid understanding.
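• Illustration: a minimal Python sketch of presenting numerical information visually, here with matplotlib; the survey counts are invented.
```python
import matplotlib.pyplot as plt

# Hypothetical counts of survey responses on perceived workplace discrimination.
categories = ["never", "rarely", "sometimes", "often"]
counts = [12, 25, 31, 9]

plt.bar(categories, counts)
plt.xlabel("Reported frequency of discrimination")
plt.ylabel("Number of respondents")
plt.title("Visual presentation of quantitative data")
plt.show()
```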