CMNS 260: Empirical Communication Research Methods
13 – Review and Overview of the Course
Professor: Jan Marontate
Teaching Assistants: Nawal Musleh-Motut, Megan Robertson
Lab Instructor: Chris Jeschelnik
School of Communication, Simon Fraser University, Fall 2011

Outline of Class Activities Today
• Syllabus & outline of class sessions – objectives
• Selected excerpts of lecture material to review for the final examination
• Study tips for the final examination
• Discussion of the last assignment

Course Content
• Introduce different forms of research
• Analyze relationships between goals, assumptions, theories and methods
• Study basic data collection and analysis techniques
• The research process – focusing on empirical methods

Why Study Methods? Practical Aspects
– learn to read other people’s research & critically evaluate it
– learn ways to find your own “data” to answer your own research questions
– acquire skills potential employers seek
– self-defense (against misinformation) & responsible citizenship

The Research Process
[Figure: the research process, from Babbie (1995: 101)]

Why Study Methods?
– “Knowledge is power” (to acquire skills for social action or change)
• “Savoir pour prévoir, prévoir pour pouvoir” (Auguste Comte)
• “To know in order to foresee, to foresee in order to act”
– “Knowledge is understanding”
• “décrire, comprendre, expliquer” (Gilles Gaston Granger)
• “to describe, to understand and to explain”

Research has the potential to inform and misinform
• even well-done research is not always used accurately
• some research is technically flawed
• knowledge of methods is an important tool for understanding the logic and limits of claims about research

Research Methodology (Scholarly Perspectives)
• Process – methods – logic of inquiry (assumptions & hypotheses)
• Produces laws, principles and theories that can be tested
• (Karl Popper’s notion of falsifiability; Popper was a politically engaged scholar writing against the totalitarianism and genocide of the early 20th century)

Other Ways of Knowing
– authority (parents, teachers, religious leaders, media gurus)
– tradition (past practices)
– common sense
– media (TV, etc.)
– personal experience
(Examples: talk show host Oprah Winfrey; Cory Doctorow of the Electronic Frontier Foundation & Boingboing.net)

Ordinary Inquiry vs. Scholarly Inquiry
Risks of “errors” associated with non-scholarly knowledge:
• selective observation – only noticing some phenomena and missing others
• overgeneralization – evidence applied to too wide a range of conditions
• premature closure – jumping to conclusions
• halo effect – being influenced by prestige

Communication as a Science?
• The field is more recent – affiliations with the sciences, social sciences & the humanities
• Scholarly work (like old ideas of science) is distinguished from mythology by methods AND goals
• many different approaches

Relations between Theory and Empirical Observation
• Testing theories through empirical observation (deductive)
• Using empirical observation to develop theories (inductive)
• Research does not have to start with theory (empirical and logical foundations)

The Scientific Process
[Figure: the cycle Theories → Predictions (Hypotheses) → Observations → Empirical Generalizations → Theories. Source: Singleton & Straits (1999: 27); Babbie (1995: 55)]

Scholarly Communities – Norms
• universalism – research judged on “scientific” merit
• organized scepticism – challenge and question research
• disinterestedness – openness to new ideas, non-partisan
• communalism – sharing with others
• honesty

Research Questions
• Questions researchers ask themselves, not the questions they ask their informants
• Must be empirically testable
• Not: too vague, too general, or untestable (with implicit, untested assumed outcomes)

Developing Research Topics: “Dimensions” of Research
• Purpose of study: exploratory, descriptive, explanatory
• Intended use of study: basic; applied (action, impact, evaluation)
• Treatment of time in study: cross-sectional; longitudinal (panel, time series, cohort analysis, trend study); case study
• Space
• Unit of analysis (examples): individual, family, household, artifact (media, technology); variables may be dependent or independent
Source: Neuman (2000: 37)

Exploratory Research
• When not much is known about the topic
• Surprises (e.g. the serendipity effect)
• Acquire familiarity with basic concerns and develop a picture
• Explore the feasibility of additional research
• Develop questions

Descriptive Research
• Focuses on “who”, “what” and “how”
• Background information, to stimulate new ways of thinking, to classify types, etc.
Explanatory Research
• To test theories, predictions, etc.
• The idea of “advancing” knowledge

Intended Use of Study
• Basic
• Applied
– action research (We can make a difference)
– social impact assessment (What will be the effects?)
– evaluation research (Did it work?)
– needs assessment (Who needs what?)
– cost-benefit analysis (What is it worth?)

Basic or Fundamental Research
• Concerns of the scholarly community
• Inner logic and relation to theoretical issues in the field

Applied Research
• commissioned/judged/used by people outside the field of communication
• goal of practical applications – usefulness of results

Types of Applied Research
• Action research
• Social impact assessment
• Needs assessment
• Evaluation research: formative (built in), summative (final outcomes)
• Cost-benefit analysis

Treatment of Time
• Cross-sectional (one point in time)
• Longitudinal (more than one point in time)

Main Types of Longitudinal Studies
• Panel study – exactly the same people, at least twice
• Cohort analysis – the same category of people or things (but not exactly the same individuals) who/which shared an experience, observed at least twice
– Examples: birth cohorts, graduating classes, video games invented in the same year
– [Figure: cohort table comparing age groups 41-50, 51-60, 61-70, 71-80 in 2000 and 2010]
• Time series – the same type of information, not exactly the same people, multiple time periods
– Example: 2006 Burnaby residents vs. 2011 Burnaby residents
• Case studies may be longitudinal or cross-sectional
• Lexis diagram (to study cohort survival)

Importance of Choosing an Appropriate Unit of Analysis
• Example: the ecological fallacy (cheating example)

Ecological Fallacy & Reductionism
• ecological fallacy – wrong unit of analysis (too high)
• reductionism – wrong unit of analysis (too low)

Relationship of Theory & Empirical Observation (Wheel of Science)
Deductive & Inductive Methods (p. 71)

Conceptualization & Operationalization of Research Questions
• Conceptualization: development of abstract concepts
• Operationalization: finding concrete ways to do research

Reliability & Validity
• Reliability – dependability: is the indicator consistent? does it give the same result every time?
• Validity – measurement validity: how well the conceptual and operational definitions mesh with each other; does the measurement tool measure what we think it does?

Hypothesis Testing: Possible Outcomes (using empirical research)
• support (confirm) the hypothesis
• reject (not support) the hypothesis
• partially confirm or fail to support
• avoid use of PROVE

Causal Diagrams
• Direct relationship (positive correlation): X → Y (+)
• Indirect (inverse) relationship (negative correlation): X → Y (–)
[Figure: causal diagrams with positive (+) and negative (–) paths among X, X1, X2, Z and Y. Source: Neuman (2000: 56)]

Types of Errors in Causal Explanation
• ecological fallacy
• reductionism
• tautology
• teleology
• spuriousness

Double-Barrelled Hypothesis & Interaction Effect
• A double-barrelled hypothesis means one of THREE things: effect 1, effect 2, OR an interaction effect

Recall: Importance of Choosing an Appropriate Unit of Analysis
• ecological fallacy – wrong unit of analysis, too high (recall the cheating example)
• reductionism – wrong unit of analysis (too low)

Teleology & Tautology
• tautology – circular reasoning (true by definition)
• teleology – too vague for testing
Source: Neuman (2000: 140)

Spurious Relationship
• spuriousness – a false relationship (an unseen third variable, or simply not connected)
Source: Neuman (2000: 140)

Example: Storks & Babies
• Observations:
– Lots of storks seen around apartment buildings in a new neighbourhood with low-cost housing
– An increase in the number of pregnancies
– Did the storks bring the babies???
But...
• The relationship is spurious.
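The spuriousness idea can be made concrete with a small simulation. This is a minimal sketch with invented numbers (not data from the lecture): a third variable drives both "storks" and "babies", producing a correlation between two variables that have no causal link, and the correlation vanishes once the third variable is controlled for.

```python
import random

# Hypothetical simulation of a spurious relationship.
# A confounder (newlywed households) raises both stork sightings
# (storks like the warm chimneys on the buildings newlyweds live in)
# and births. Storks and babies then correlate without either
# causing the other.

random.seed(1)

def simulate(n=1000):
    rows = []
    for _ in range(n):
        newlywed = random.random() < 0.5                 # confounder
        storks = random.gauss(3 if newlywed else 1, 1)   # sightings
        babies = random.gauss(2 if newlywed else 0, 1)   # births
        rows.append((newlywed, storks, babies))
    return rows

def corr(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

data = simulate()
storks = [r[1] for r in data]
babies = [r[2] for r in data]
print(round(corr(storks, babies), 2))  # sizeable overall correlation

# Control for the confounder: within each group the correlation
# all but disappears (the bivariate relationship was spurious).
for flag in (True, False):
    grp = [r for r in data if r[0] == flag]
    print(round(corr([r[1] for r in grp], [r[2] for r in grp]), 2))
```

This mirrors the "explanation" pattern of the elaboration paradigm covered later in the course: controlling for a variable that comes before the independent variable makes the bivariate relationship weaken or disappear.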
– The storks liked the heat coming from the smokestacks on the roof of the building, and so were more likely to be attracted to that building.
– The tenants of the building were mostly young newlyweds starting families.
– So… the storks didn’t bring the babies after all.

Causal Diagram for Storks
• Stork = S, Baby = B, Newlywed = N, Chimneys on building = C
[Diagram: N → B (+); N → C (+); C → S (+)]

Another example of a spurious relationship: firefighters & damage
• The larger the number of firefighters, the greater the damage
But...
• A larger number of firefighters is needed to fight a larger fire, and a larger fire causes more damage than a small one.
• Debate about the hockey riots in Vancouver:
– Did the size of the crowd & the amount of drinking cause the riots?
– Did bad planning and inadequate policing cause the riots?

Causal Diagram
• Firefighter = F, Damage = D, Size of fire = S
[Diagram: S → F (+); S → D (+)]

Ethics & Legality
Typology of legal and moral actions in research:
• both moral and legal – ethical
• legal only (legal but immoral)
• moral only (moral but illegal)
• both immoral and illegal – unethical
Source: figure adapted from Neuman (2000: 91)

Privacy, Anonymity, Confidentiality
• privacy: a legal right (note: public vs. private domain) – even if the subject is dead
• anonymity: subjects remain nameless & responses cannot be connected to them (a problem in small samples)
• confidentiality: subjects’ identity may be known but is not disclosed by the researcher; identity can’t be linked to responses

4 – Measurement: Scales & Indices (Part 2 of 2 slideshows)
Neuman & Robson, Chapter 6
• systematic observation
• can be replicated

Creating Measures
Measures must have response categories that are:
• mutually exclusive – each possible observation must fit in only one category
• exhaustive – the categories must cover all possibilities

Composite Measures
• Composite measures are instruments that use several questions to measure a given variable (construct).
• A composite measure should be unidimensional (all items measure the same construct)
• Two main kinds: indices (plural of index) and scales

Logic of Index Construction
• items combined in a single measure, often at an ordinal level of measurement

Logic of Scales
• items ranked

Treatment of Missing Data
• eliminate cases with missing data?
• substitute the average score?
• guess?
• insert a random value?

Rates & Standardization
• deciding what measure to use for reference populations (example: employment rates)

Sampling: Key Ideas & Terms
• Bad sampling frame = parameters do not accurately represent the target population
– e.g., a list of people in the phone directory does not reflect all the people in a town, because not everyone has a phone or is listed in the directory

Types of Nonprobability Samples
4 Types of Probability Samples (e.g. stratified sampling)
• link to a useful webpage: http://www.socialresearchmethods.net/kb/sampprob.php

Evaluating Sampling
• Is the sample representative of the population under study?
• Assess whether every case had an equal chance of being chosen
• Examine the sampling distribution of the population parameters
• Use the central limit theorem to calculate confidence intervals and estimate the margin of error

Asking Questions That Can Be Answered: Types of Surveys & Survey Instruments
• Self-administered surveys: mail, web
• Surveys based on interactive interviews: telephone, online (interactive), face-to-face (individuals, focus groups)
• Survey instruments:
– questionnaires (self-administered: the respondent reads the questions & records answers)
– interview schedules (the interviewer reads the questions & records responses)

Main Types of Unobtrusive Measures
• Physical traces
– erosion (e.g. wear on the floor near museum displays as a measure of a display’s popularity)
– accretion (e.g. garbage)
• Simple observation
• Media analysis such as content analysis or critical discourse analysis (e.g. advertisements, news reports, films, music lyrics, etc.)
• Analysis of archives, existing statistics & running records (e.g. shoppers’ records, library borrowers’ histories)

Types of Equivalence for Comparative Research Using Existing Statistics
• lexicon equivalence (technique of back-translation)
• contextual equivalence (e.g. the role of religious leaders in different societies)
• conceptual equivalence (e.g. income)
• measurement equivalence (e.g. different measures for the same concept)

Discrete & Continuous Variables
• Continuous – the variable can take an infinite (or large) number of values within a range (e.g. age measured by exact date of birth)
• Discrete – the attributes of the variable are distinct but not necessarily continuous (e.g. age measured by age groups)
(Note: techniques exist for making assumptions about discrete variables in order to use techniques developed for continuous variables)

Cleaning Data
• checking accuracy & removing errors
• possible code cleaning – check for impossible codes (errors)
– some software checks at data entry
– examine distributions to look for impossible codes
• contingency cleaning – inconsistencies between answers (impossible logical combinations, illogical responses to skip or contingency questions)

Treatment of Missing Data (%)
• Comparison with the medium & low categories collapsed, non-respondents included vs. eliminated

Table 5-1 Alienation of Workers (non-respondents included)
Level of Alienation    F     %
High                   30    14
Medium & Low           120   58
No Response            60    29
(Total)                210   100

Table 5-1 Alienation of Workers (non-respondents eliminated)
Level of Alienation    F     %
High                   30    20
Medium & Low           120   80
(Total)                150   100

Grouping Response Categories (%)
• Comparison with the high & medium response categories collapsed

Table 5-1 Alienation of Workers (non-respondents included)
Level of Alienation    %
High & Medium          62
Low                    10
No Response            29
(Total N = 210)        100

Table 5-1 Alienation of Workers (non-respondents eliminated)
Level of Alienation    %
High & Medium          87
Low                    13
(Total N = 150)        100

Core Notions in Basic Univariate Statistics
Ways of describing data about one variable (“uni” = one):
• Measures of central tendency
– summarize information about one variable
– three types of “averages”: arithmetic mean, median, mode
• Measures of dispersion
– analyze variations or “spread”
– range, standard deviation, percentiles, z-scores

Normal & Skewed Distributions
Details on the Calculation of Standard Deviation – Neuman (2000: 321)
The Bell Curve & Standard Deviation

If Time: Begin Bivariate Statistics (results with two variables)
• Types of relationships between two variables:
– Correlation (or covariation): when two variables “vary together” – a type of association, not necessarily causal
• can be in the same direction (positive correlation or direct relationship)
• can be in different directions (negative correlation or indirect relationship)
– Independence: no correlation, no relationship – cases with particular values on one variable do not have any particular value on the other variable

Recall (Lecture 2): Types of Variables
• independent variable (cause)
• dependent variable (effect)
• intervening variable (occurs between the independent and the dependent variable temporally)
• control variable (temporal occurrence varies; illustrations later today)

Causal Relationships
• proposed for testing (NOT like assumptions)
• 5 characteristics of a causal hypothesis (p. 128):
– at least 2 variables
– a cause-effect relationship (the cause must come before the effect)
– can be expressed as a prediction
– logically linked to a research question & a theory
– falsifiable

Types of Correlations & Causal Relationships between Two Variables
(X = independent variable, Y = dependent variable)
• Positive correlation (direct relationship) – when X increases, Y increases, or vice versa
• Negative correlation (indirect or inverse relationship) – when X increases, Y decreases, or vice versa
• Independence – no relationship (the null hypothesis)
• Covariation – the variables vary together (a type of association, but not necessarily causal)

Five Common Measures of Association between Two Variables

General Idea of Statistical Significance
• In everyday English, “significance” means important or meaningful, but this is NOT how the term is used in statistics
• Tests of statistical significance show how likely it is that a result is due to chance

Multivariate Statistics: Elaboration Paradigm (Types of Patterns)
• Replication: the same relationship appears in both partial tables as in the bivariate table
• Specification: the bivariate relationship is seen in only one of the partial tables
• Interpretation: the bivariate relationship weakens greatly or disappears in the partial tables (the control variable is intervening – it happens in between the independent & dependent variables)
• Explanation: the bivariate relationship weakens or disappears in the partial tables (the control variable comes before the independent variable)
• Suppressor: no bivariate relationship; the relationship appears only in the partial tables

Elaboration Paradigm Summary

Study Tips for Final Exam
• Practice questions
• Other ideas for preparation
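As a study aid, the idea of a significance test can be made concrete. The sketch below uses invented survey counts (not data from the course) to compute a chi-square statistic for a 2×2 bivariate table by hand; if the statistic exceeds the critical value (3.84 for 1 degree of freedom at the 0.05 level), the observed relationship is unlikely to be due to chance alone.

```python
# Chi-square test of independence on a 2x2 table, computed by hand.
# The counts are hypothetical, chosen only for illustration.

def chi_square_2x2(table):
    """table: [[a, b], [c, d]] observed counts; returns the chi-square statistic."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under the null hypothesis of independence
            expected = row_totals[i] * col_totals[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

observed = [[30, 20],   # e.g. heavy media users: agree / disagree
            [20, 30]]   # light media users: agree / disagree
stat = chi_square_2x2(observed)
print(round(stat, 2))  # 4.0 – exceeds 3.84, so significant at the 0.05 level
```

Note the wording: the test tells you how unlikely the pattern is under chance, not that the result is important or that the relationship is causal, which is exactly the distinction the slide draws between everyday and statistical "significance".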