• Not all information is equally valuable.
• Not all interpretations are equally valid.
• Evaluate the Sources
  - Is the source up-to-date?
    o Different kinds of information have different shelf lives.
  - Is the source dependable?
  - Is the information relatively unbiased?
  - How does the source measure up to others?
• Evaluate the Evidence
  - Is the evidence sufficient?
  - Is the presentation of the evidence balanced and reasonable?
    o Overstatement (overgeneralization or exaggeration)
    o Omission of vital facts
    o Deceptive framing of the facts
  - Can the evidence be verified?
• Interpret Your Findings
  - What level of certainty is warranted?
    o The ultimate truth: the conclusive answer
    o The probable answer: most likely true given current knowledge
    o The inconclusive answer: the realization that the truth of the matter is currently unknown
  - Are the underlying assumptions sound?
    o Assumptions are notions taken for granted. An argument can make sense given its assumptions yet fail if the assumptions themselves are untrustworthy.
  - To what extent has bias influenced the interpretation?
  - Are other interpretations possible?
• How Standards of Proof Vary for Different Audiences
  - The scientist demands evidence that indicates at least 95% certainty.
  - The juror demands evidence that indicates only 51% certainty.
  - The corporate executive demands immediate (even if insufficient) evidence.
  - Specific cultures may have their own standards for authentic, reliable, persuasive evidence.
• Avoid Errors in Reasoning
  - Faulty generalization
    o A jump from a limited observation to a sweeping conclusion
  - Faulty causal reasoning
    o Links two factors as cause and effect while failing to recognize the real cause
    o Ignores other causes
    o Ignores other effects
    o Invents a causal sequence
    o Confuses correlation with causation
    o Rationalizes
• Avoid Statistical Fallacies
  - Common statistical fallacies
    o The sanitized statistic
    o The meaningless statistic
    o The undefined average (see the first sketch after this outline)
    o The distorted percentage figure
    o The bogus ranking
  - The limitations of number crunching
    o Confusion of correlation with causation (see the second sketch after this outline)
    o The biased meta-analysis
    o The fallible computer model
  - Misleading terminology
• Interpret the Reality Behind the Numbers
• Acknowledge the Limits of Research
  - Obstacles to validity and reliability:
    o People often see themselves as more informed, responsible, or competent than they really are.
    o Respondents might suppress information that reflects poorly on their behavior or attitudes.
    o Respondents might exaggerate or invent facts or opinions that present a more admirable picture.
    o Even when respondents don't know, don't remember, or have no opinion, they tend to guess in ways designed to win the researcher's approval.
• Reliable research produces replicable results.
• Flaws in Study Design
  - Epidemiological studies (population studies)
    o Faulty sampling techniques
    o Observation or cognitive bias
    o Coincidence can easily be mistaken for correlation.
    o Confounding factors (other explanations) often affect results.
  - Laboratory studies
    o The reaction of an isolated group of cells does not always predict the reaction of the entire organism.
    o The reactions of experimental animals to a treatment or toxin often are not generalizable to humans.
    o Faulty lab techniques may distort results.
  - Human exposure studies (clinical trials)
    o The study group may be nonrepresentative of or too different from the general population.
    o Anecdotal reports are unreliable.
    o Lack of objectivity may distort results.
• Sources of Deception
  - Underreported hazards
  - The untouchable research topic
  - A "good story" but bad science
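The "undefined average" fallacy is easier to see with concrete numbers. Below is a minimal sketch in Python (standard library only; the salary figures are invented purely for illustration) showing how the mean, median, and mode of the same data tell very different stories, so a report citing only "the average" hides which measure is meant.

```python
"""The "undefined average" fallacy: which "average" is being reported?"""
from statistics import mean, median, mode

# Nine modest salaries plus one very large outlier (a skewed distribution).
# These figures are invented purely for illustration.
salaries = [30_000, 32_000, 33_000, 34_000, 35_000,
            36_000, 37_000, 38_000, 38_000, 400_000]

print(f"Mean:   ${mean(salaries):,.0f}")    # $71,300 -- pulled upward by the outlier
print(f"Median: ${median(salaries):,.0f}")  # $35,500 -- the middle value
print(f"Mode:   ${mode(salaries):,.0f}")    # $38,000 -- the most frequent value
```

A claim that "the average salary is over $70,000" is technically true here, yet most people in the group earn roughly half that; unless the writer states which average is being used, the statistic misleads.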
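Confusing correlation with causation can also be demonstrated directly. The second sketch (again Python, standard library only; the variables, coefficients, and the ice-cream-and-drownings scenario are hypothetical) simulates a confounder, temperature, that drives two otherwise unrelated quantities, which then appear strongly correlated with each other.

```python
"""Correlation without causation: two effects of one hidden cause."""
import random
from statistics import mean

random.seed(42)

# The confounder: daily temperature (Celsius) over a simulated season.
temperature = [random.uniform(15, 35) for _ in range(200)]

# Both quantities depend on temperature, not on each other.
ice_cream_sales = [10 * t + random.gauss(0, 10) for t in temperature]
drownings = [0.3 * t + random.gauss(0, 1) for t in temperature]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson(ice_cream_sales, drownings)
print(f"Correlation between ice cream sales and drownings: r = {r:.2f}")
# A strong positive r does NOT mean one causes the other; both are
# driven by the confounder (temperature).
```

The strong positive correlation is exactly the kind of result a careless interpreter would read as cause and effect; the sound interpretation is that both variables respond to a third factor.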
• Guidelines for Evaluating and Interpreting Information
  - Check the source's date of posting or publication.
  - Assess the reputation of each printed source.
  - Assess the quality of each electronic source.
  - Identify the study's sponsor.
  - Look for corroborating sources.
• Evaluate the Evidence
  - Decide whether the evidence is sufficient.
  - Look for a reasonable and balanced presentation of the evidence.
  - Do your best to verify the evidence.
• Interpret Your Findings
  - Don't expect "certainty."
  - Examine the underlying assumptions.
  - Identify your personal biases.
  - Consider alternate interpretations.
• Check for Weak Spots
  - Scrutinize all generalizations.
  - Treat causal claims skeptically.
  - Look for statistical fallacies.
  - Consider the limits of computer analysis.
  - Look for misleading terminology.
  - Interpret the reality behind the numbers.
  - Consider the study's possible limitations.
  - Look for the whole story.