Chapter 7: Statistical Issues in Research Planning and Evaluation

Chapter Outline
• Probability
• Meaningfulness
• Power
• Using information in the context of the study
• Reporting statistical data

Interpreting Statistical Findings
• Probability
  – Alpha: level of chance occurrence (Type I error)
    • Typical: p < .05 or p < .01
    • Varying alpha
    • Truth table
    • Exact probability
  – Beta (Type II error)
• Meaningfulness (effect size)
• Confidence intervals
• Power: the probability of rejecting the null hypothesis when it is false

Truth Table for the Null Hypothesis

               H0 true                  H0 false
Accept H0      Correct decision         Type II error (beta)
Reject H0      Type I error (alpha)     Correct decision

Sampling for the Null Hypothesis
[Figure. From Experimental Design: Procedures for the Behavioral Sciences, 3rd ed., by R. E. Kirk, © 1995. Reprinted with permission of Brooks/Cole, an imprint of the Wadsworth Group, a division of Thomson Learning.]

Estimating Effect Size
• Effect size represents the standardized difference between two means.
• Formula: ES = (M1 – M2) / s
• ES allows comparison across studies that use different dependent variables because it expresses differences in standard deviation units.
• An effect size of 0 indicates no difference; 0.2 is small, 0.5 is medium, and 0.8 is large.
(A short code sketch of this calculation appears at the end of this section.)

[Figure: Effect size examples of 0.5s and 1.0s]

[Figure: Effect size curve to estimate sample size when p = .05]

[Figure: Effect size curve to estimate sample size when p = .01]

Context of the Study
How do findings from the study fit within the context of
• Theory
• Practice

Planning Research
Information needed in planning
• Alpha
• Effect size
• Power
• Sample size

Using the Power Calculator When Reading a Research Study
When reading research, the sample size, means, and standard deviations are often supplied. You can calculate the effect size with the formula given earlier in this chapter. Using these data and the Power Calculator at the Web site below, you can estimate the power to detect a difference or relationship. (A code sketch of this calculation appears at the end of this section.)
http://calculators.stat.ucla.edu/powercalc/

Using the Power Calculator to Plan Research
If you are planning your own research, you can often estimate the effect size from other studies. By setting your alpha (say, .05) and power (say, .8), you can use the Power Calculator at the Web site below to estimate the sample size you need. (A code sketch of this calculation appears at the end of this section.)
http://calculators.stat.ucla.edu/powercalc/

Reporting Statistical Data (Summary From APA and APS)
• Report how the power analysis was done.
• Always report complications (screen your data).
• Select minimally sufficient statistical analyses.
• Report p values and confidence intervals.
• Report the magnitudes of effects.
• Control for multiple comparisons.
• Report variability using standard deviations.
• Report data to an appropriate level of precision.
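Code Sketch 1: Estimating Effect Size
A minimal sketch of the ES = (M1 – M2) / s formula from the Estimating Effect Size slide. It assumes s is the pooled standard deviation of the two groups (other choices, such as the control group's standard deviation, are also used in practice); the function name and the example numbers are hypothetical.

```python
import math

def effect_size(m1, m2, s1, s2, n1, n2):
    """ES = (M1 - M2) / s, with s taken as the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical example: treatment group M = 52, s = 10, n = 30;
# control group M = 47, s = 10, n = 30.
es = effect_size(52, 47, 10, 10, 30, 30)
print(round(es, 2))  # 0.5 -- a medium effect by the conventions above
```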
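Code Sketch 2: Estimating Power When Reading a Research Study
The slides point to the online Power Calculator; as an aside not in the original slides, the same estimate can be made in Python with the statsmodels package. The sketch assumes an independent-samples t test with equal group sizes, and the study values (ES = 0.5, 30 participants per group, alpha = .05) are hypothetical.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical published study: ES = 0.5, n = 30 per group, alpha = .05
analysis = TTestIndPower()
power = analysis.power(effect_size=0.5, nobs1=30, alpha=0.05, ratio=1.0)
print(f"Estimated power: {power:.2f}")  # about .48 -- underpowered for a medium effect
```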
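Code Sketch 3: Estimating Sample Size When Planning Research
The same statsmodels tool can be used for the planning case described in the slides: fix alpha, desired power, and the expected effect size, then solve for the sample size. Again this assumes an independent-samples t test with equal group sizes, and the planning values are hypothetical.

```python
import math
from statsmodels.stats.power import TTestIndPower

# Hypothetical planning values: expected ES = 0.5, alpha = .05, desired power = .80
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80, ratio=1.0)
print(f"Participants needed per group: {math.ceil(n_per_group)}")  # about 64
```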