Content Analysis and Survey Research

Epistemology and Methods
Content Analysis and Survey Research
May 13 2008
Overview
• Content Analysis
• Survey Research
• Group Exercise
Content Analysis
• Mannheim and Rich 1995, chapter 10
• Content Analysis: the systematic counting, assessing, and
interpreting of the form and substance of communication
• Precondition: physical record of communication
• Type of communication: books, magazines, newspapers,
documentaries, films, recordings, photographs, meeting
minutes, government documents, diplomatic communiqués,
cartoons, political advertisements, speeches, letters and diaries,
blogs, …
• Relatively low cost research method
Mannheim/Rich
• Steps in using this method:
– Define the population of communications we want to study (this is
determined by the research question!)
– Once the population is defined, one possibility is sampling (keep n large enough to generalize from the results)
• Focus on
– Type of communication
– Time period of communication
– The size of communication (e.g. op-ed articles)
– The frequency of communication (e.g. monthly memos)
– The distribution of communication (e.g. home-delivered newspapers)
– The location of communication
– The parties to the communication (sender, receiver)
Mannheim/Rich
• Define the unit of analysis
• Word (e.g. "peace" in presidential speeches)
• Pitfalls:
• Nonstandardized measures can lead to biased results (standardize, e.g., as references per 1,000 words)
• Different meanings of the word (e.g. "we seek peace" vs. "we will never allow peace"): count the words in context (see the sketch below)
– We can use judges or coders
– Or we move to a second unit of analysis: Theme
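Below is a rough Python sketch of the two ideas above: a standardized count per 1,000 words and a keyword-in-context listing for coders. The speech text, the target word "peace", and the five-word window are invented for illustration.

```python
# Sketch: a standardized rate per 1,000 words and keyword-in-context windows.
import re

def rate_per_1000(text: str, target: str) -> float:
    """Occurrences of `target` per 1,000 words (a standardized measure)."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == target.lower())
    return 1000 * hits / len(words)

def keyword_in_context(text: str, target: str, window: int = 5):
    """Yield the words surrounding each occurrence of `target`."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    for i, w in enumerate(words):
        if w == target.lower():
            yield " ".join(words[max(0, i - window): i + window + 1])

speech = "We seek peace with our neighbours; we will never allow peace to fail."
print(rate_per_1000(speech, "peace"))            # ~153.8 per 1,000 words
print(list(keyword_in_context(speech, "peace"))) # contexts for coders to judge
```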
Mannheim/Rich
• Theme: more context, but also more complexity
• Example: Development (trade openness increases
economic development; the developmental concerns
of LDCs, …this contributes to sustainable
development)
• Spend time on coding rules…
Mannheim/Rich
• Item: the communication itself taken as a whole
• Example: how many times George Bush Sr. referred to Saddam Hussein as Hitler-like prior to the 1991 Persian Gulf War
• Full-text search (e.g. NYT) for words “George Bush”,
“Saddam Hussein” and “Hitler”
• Count an item when these words appear within 10 words of each other in either direction (see the sketch below)
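A minimal Python sketch of the item rule above, assuming a simple whole-word match; the article sentence is invented, and a real study would apply this to full-text search hits.

```python
# Sketch: does term_a appear within n words of term_b (in either direction)?
import re

def within_n_words(text: str, term_a: str, term_b: str, n: int = 10) -> bool:
    """True if any occurrence of term_a is within n words of term_b."""
    words = re.findall(r"\w+", text.lower())
    pos_a = [i for i, w in enumerate(words) if w == term_a.lower()]
    pos_b = [i for i, w in enumerate(words) if w == term_b.lower()]
    return any(abs(i - j) <= n for i in pos_a for j in pos_b)

article = "President Bush again compared Saddam Hussein to Hitler before the war."
print(within_n_words(article, "hussein", "hitler"))  # True -> count one item
```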
Mannheim/Rich
• Create a “dictionary”: define each and every observation we
might make and allocate a particular coding
• What constitutes a salient reference: e.g. Cuban school books
(American, US, Imperialists, Outlaw Regime in Washington,
etc…)
• Evaluation of salient references (good or bad, pro or anti,
ranking, intensity)
• E.g. ranking statements to measure “intensity” of newspaper
endorsement of candidates (Figure 10.1)
• Assistance from a group of judges (about the meaning or
intensity of a term)
• The role of pre-testing!
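A minimal Python sketch of such a coding dictionary; the terms, codes, and valence values are illustrative only, loosely following the Cuban schoolbook example.

```python
# Sketch: a coding dictionary mapping salient references to a code and an
# evaluation (here a crude valence). All entries are invented for illustration.
SALIENT_REFERENCES = {
    "american":                    {"code": "US_REFERENCE", "valence": 0},
    "imperialists":                {"code": "US_REFERENCE", "valence": -1},
    "outlaw regime in washington": {"code": "US_REFERENCE", "valence": -2},
}

def code_text(text: str):
    """Return every dictionary entry found in the text, with its coding."""
    lowered = text.lower()
    return [(term, entry) for term, entry in SALIENT_REFERENCES.items()
            if term in lowered]

sample = "The outlaw regime in Washington and the imperialists oppose our schools."
for term, entry in code_text(sample):
    print(term, entry["code"], entry["valence"])
```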
Mannheim/Rich
• Various procedures:
• E.g. define a rating scale (1 to x) and ask judges to distribute the statements across the categories, with a quota for each category (1 to x)
• The mean rating is then calculated for each statement
• E.g. pair-comparison scaling
• Pitfalls: number and selection of judges
• Pair-comparison quickly becomes impractical for 100 to 200 items! (see the sketch below)
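A minimal Python sketch of the two points above: averaging hypothetical judges' scores per statement (assuming a 1-7 intensity scale, which is not specified in the reading) and counting how many pairwise judgments pair-comparison scaling would require.

```python
# Sketch: mean judge ratings per statement, and the pair-comparison workload.
from statistics import mean
from math import comb

judge_scores = {  # statement -> scores from three hypothetical judges (1-7 scale)
    "paper enthusiastically endorses the candidate": [7, 6, 7],
    "paper mentions the candidate neutrally":        [4, 3, 4],
}
for statement, scores in judge_scores.items():
    print(statement, "-> mean intensity:", round(mean(scores), 2))

# Pair-comparison scaling needs every pair of items to be judged:
for n_items in (10, 100, 200):
    print(n_items, "items ->", comb(n_items, 2), "pairwise judgments")
```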
Mannheim/Rich
• Special problems:
• Communications are issued for a purpose: "All Chinese people believe that the new agricultural policy is a major step…"
• If we want to assess the impact of communication, we need to
know whom it reaches…
• What is the degree of our own access to communication (free
choice over material we analyze?)
• Is the research sample representative?
• Intercoder reliability (human judgments): degree of consensus among coders (see the sketch below)
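A minimal Python sketch of intercoder reliability for two hypothetical coders, using simple percent agreement and Cohen's kappa (one common measure; the reading does not prescribe a specific statistic).

```python
# Sketch: percent agreement and Cohen's kappa for two coders' category labels.
from collections import Counter

coder_a = ["pro", "anti", "pro", "neutral", "pro", "anti"]
coder_b = ["pro", "anti", "anti", "neutral", "pro", "anti"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Expected chance agreement from each coder's marginal distribution.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum((freq_a[c] / n) * (freq_b[c] / n)
               for c in set(coder_a) | set(coder_b))

kappa = (observed - expected) / (1 - expected)
print(f"percent agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")
```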
Laver/Garry
• Estimating Policy Positions from Political Texts
• The Manifesto Research Group (MRG) / Comparative
Manifestos Project (CMP): biggest show on the road (started
in the early 1980s)
• Party manifestos are strategic documents written by politically
sophisticated party elite with many different objectives in mind
• Usually issued during election campaigns; they reflect party positions
• Provide evidence of party position changes over time
Laver/Garry
• MRG: measuring the relative emphasis of an issue (salience, not party position!); position ≠ emphasis (attention paid to an issue)
• MRG categories unipolar (e.g. law and order)
• MRG categories bipolar (e.g. social service expansion)
• In order to retrieve information on position, additional coding
of MRG raw data (e.g. Harmel-Janda Party Change Project
(PCP))
• Definition of 19 issues of interest, coded from -5 to +5 (few experts engage in the coding)
• Caveat: Prior knowledge (subjective placement of parties)…
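A minimal Python sketch of the salience/position distinction, with invented sentence counts; the (pro − con)/(pro + con) position scale is one common transformation, not the exact MRG or PCP formula.

```python
# Sketch: salience (relative emphasis) vs. position on a bipolar dimension.
counts = {
    "law_and_order":          40,  # unipolar category
    "social_services_expand": 30,  # bipolar pair: expansion ...
    "social_services_limit":  10,  # ... vs. limitation
}
total = sum(counts.values())

# Salience: relative emphasis, regardless of direction.
salience = {category: n / total for category, n in counts.items()}

# Position on the bipolar social-services dimension, ranging from -1 to +1.
pro, con = counts["social_services_expand"], counts["social_services_limit"]
position = (pro - con) / (pro + con)

print(salience)
print("social services position:", position)  # 0.5, leaning pro-expansion
```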
Laver/Garry
• Estimating policy positions / expert coding:
• Data reduction (coding scheme)
• in MRG, sentence counts for 54 coding categories; in PCP, transforming text into scores on 19 policy scales
• New approach for systematic coding: an example of a hierarchical decision-making process (Table 1; see the sketch below):
• 1 ECONOMY (Role of state in economy)
– 11 Economy/+State+ (Increase role)
• 111 Economy/+State+/Budget (Budget)
• 1111 Economy/+State+/Budget/Spending (Public Spending)
– 11111 Health
– 11112 Education and Training
– 11113 Housing
– 11114 Transport
– ….
• 1112 Economy/+State+/Budget/Taxes (Increase Taxes)
• 1113 Economy/+State+/Budget/Deficit (Increase Budget Deficit)
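A minimal Python sketch of how the hierarchical codes in Table 1 can be stored and rolled up to coarser levels; the data structure is an illustration, not the coding software actually used.

```python
# Sketch: the hierarchical coding scheme as a flat dictionary keyed by code,
# so detailed codes can be truncated ("rolled up") to coarser levels.
HIERARCHY = {
    "1":     "Economy (role of state in economy)",
    "11":    "Economy / +State+ (increase role)",
    "111":   "Economy / +State+ / Budget",
    "1111":  "Economy / +State+ / Budget / Public spending",
    "11111": "Public spending: Health",
    "11112": "Public spending: Education and Training",
    "11113": "Public spending: Housing",
    "11114": "Public spending: Transport",
    "1112":  "Economy / +State+ / Budget / Increase taxes",
    "1113":  "Economy / +State+ / Budget / Increase budget deficit",
}

def roll_up(code: str, level: int) -> str:
    """Truncate a detailed code to a coarser level of the hierarchy."""
    return code[:level]

print(roll_up("11112", 2), "->", HIERARCHY[roll_up("11112", 2)])
# prints: 11 -> Economy / +State+ (increase role)
```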
Laver/Garry
• Estimating policy positions / computer coding
• Quantitative content analysis
• Dictionary – mechanical criteria (see the sketch below)
• But need to focus on less ambiguous words or phrases (unipolar)
• Example on "taxes": it is possible both to lower and to raise taxes, but manifesto references to taxes are typically in favor of lowering them (the question is how often such an assumption is wrong, and whether it needs correcting)
• The validity costs are offset by gains in reliability (computer coding is reliable) and lower costs
• Computers code without knowledge of context (no prior information…); a computer could potentially detect policy changes more easily
• Laver/Garry on policy positions in the UK and Ireland
• Cross-validation of techniques (computer, revised expert, MRG, expert surveys): high correlation on economic policy, less so on social policy (e.g. the MRG technique of saliency…)
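A minimal Python sketch of dictionary-based computer coding, with an invented word list and manifesto snippet; it is not the dictionary Laver and Garry actually use.

```python
# Sketch: count relatively unambiguous dictionary words per policy category
# and report relative emphasis. Word lists and text are illustrative only.
import re

DICTIONARY = {
    "economy_right": {"taxes", "deregulation", "privatisation"},
    "economy_left":  {"spending", "services", "investment"},
}

def score(text: str) -> dict:
    """Share of dictionary hits falling in each category."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = {cat: sum(w in vocab for w in words)
              for cat, vocab in DICTIONARY.items()}
    total = sum(counts.values()) or 1
    return {cat: c / total for cat, c in counts.items()}

manifesto = "We will cut taxes and pursue deregulation while protecting services."
print(score(manifesto))  # e.g. {'economy_right': 0.67, 'economy_left': 0.33}
```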
Content Analysis
• Questions?
Survey Research
• Mannheim and Rich 1995, chapter 7
• Survey research is a method of data collection in which information is obtained directly from individual persons who are selected so as to provide a basis for making inferences about some larger population
• Methods: direct questioning (face-to-face or telephone
interviews) or questionnaires…
• Types of information: facts, perceptions (what respondents
know (or think they know)), opinions, attitudes (more stable
preferences), behavioral reports
Survey Research
• Stages of Survey Research Preparation
• Conceptualization (objective, theory, specific set of questions)
• Survey Design (explorative, descriptive, explanatory)
• linking objective with data-collecting method;
• cross-sectional or longitudinal design (trend and cohort
studies, panel studies)
– Panel studies – watch out for: costly, need to keep track of the sample, risk of moving towards a biased sample (respondent changes due to repeated interviews, drop-outs)
Survey Research
• Instrumentation (operationalization): define content, form,
format, wording and order of questions
• Content: Limit the number of questions and time, keep
number of hypotheses to be tested small (degree of data!)
• Form: Open-ended vs. closed-ended questions (choice of
options influences responses)
• Format: Techniques how questions are presented (e.g. visual
aids…)
Survey Research
• Wording and order of questions
• compare with prior research
• are respondents competent to answer? Use contingency questions to find out (to assess their level of knowledge)
• statements are more useful than questions (e.g. to measure
intensity of opinion, respondents use the same frame of
reference)
• but beware the response set (the tendency to agree with statements); items should be mixed (see the sketch below)
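A minimal Python sketch of two precautions against a response set — mixing item order and reverse-coding negatively worded statements; the items, scale, and answers are invented.

```python
# Sketch: mix positively and negatively worded items and reverse-code the
# negative ones before averaging, so "agree with everything" cannot inflate scores.
import random

items = [
    {"text": "Free trade benefits my country.",      "reverse": False},
    {"text": "Free trade destroys local jobs.",      "reverse": True},
    {"text": "Trade agreements should be expanded.", "reverse": False},
]
presented = random.sample(items, k=len(items))  # mix the order per respondent

# Hypothetical answers on a 1-5 agree/disagree scale, in presented order.
answers = [4, 2, 5]

# Reverse-code negatively worded items so higher always means "pro trade".
scored = [6 - a if item["reverse"] else a for item, a in zip(presented, answers)]
print("mean attitude score:", round(sum(scored) / len(scored), 2))
```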
Survey Research
• Generally 4 steps: explanation, warm-up questions, substantive
questions, demographic questions
• Explanation: should not reveal study information that would bias responses (eliminate fear of the study, convey enough importance to warrant the respondent's time and attention)
• Warm-up: establish a good relationship with the respondents
• Substantive questions: check question ordering, pre-test will
give some guidance (experiment with different orderings)
• Demographic items: at the end; personal information is placed there to prevent respondents from feeling ill at ease…
– Do not crowd items
Survey Research
• Techniques:
• Face-to-face surveys: …
• Mail surveys:
• Telephone surveys: ….
• Discussion on pros and cons?
• Sampling, Validity and Reliability
Brady: Contributions to Political Science
• No other method for understanding politics is used more
• Sample surveys analyzed with the help of statistical techniques
• New survey methods involve "experiments embedded in surveys": modify question wordings to determine whether counter-arguments, subtle cues, or rhetorical, emotional or cognitive factors can change opinions or behaviors (how stable are opinions, and how do respondents treat information?)
• Examples: The Use of Competing Frames or Priming (see the sketch below)
• Experiments have made surveys even stronger methods for testing theories; they potentially score high on conceptual richness and policy relevance
• Surveys have the great virtue of letting you ask the questions you want to ask, when and where you want!…
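A minimal Python sketch of an experiment embedded in a survey: random assignment of competing question frames and a comparison of group means; the frames and responses are invented.

```python
# Sketch: randomly assign each respondent one of two competing frames of the
# same question, then compare mean answers across the two groups.
import random
from statistics import mean

FRAMES = {
    "jobs":  "Supporters say free trade creates jobs. Do you favour free trade? (1-5)",
    "wages": "Critics say free trade lowers wages. Do you favour free trade? (1-5)",
}

def assign_frame(respondent_id: int) -> str:
    """Deterministic random assignment so the split is reproducible."""
    return random.Random(respondent_id).choice(sorted(FRAMES))

print(assign_frame(1), assign_frame(2))  # which wording each respondent sees

# Hypothetical answers collected under each frame; a gap between the means
# suggests the opinion is not stable across frames.
responses = {"jobs": [4, 5, 4, 3], "wages": [2, 3, 2, 3]}
for frame, values in responses.items():
    print(frame, "frame -> mean support:", mean(values))
```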
Designing a survey experiment on
• Measuring
• a) the attitudes of citizens vis-à-vis free trade
• Questions / Statements
• Stability of attitudes (Framing, Priming)
• b) the attitudes of citizens vis-à-vis international law (e.g. WTO law)
• Questions / Statements
• Stability of attitudes (Framing, Priming)
Good luck with the moot and assignment 2!