Assumption Testing:
t-test:
Outliers: Use a Box and Whisker plot for each group and/or variable. Look for extreme outliers.
Assumption of Normality: Use the Shapiro-Wilk or Kolmogorov-Smirnov test.
Assumption of Equal Variance: Use Levene's Test of Equality of Error Variances.
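These checks can also be run outside SPSS. The sketch below uses SciPy and Matplotlib on two hypothetical groups (group_a and group_b are assumed example data, not from the text):

    # Sketch: t-test assumption checks on assumed example data.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy import stats

    rng = np.random.default_rng(0)
    group_a = rng.normal(50, 10, 40)   # hypothetical group 1 scores
    group_b = rng.normal(55, 10, 40)   # hypothetical group 2 scores

    # Outliers: box-and-whisker plot for each group.
    plt.boxplot([group_a, group_b], labels=["Group A", "Group B"])
    plt.show()

    # Normality: Shapiro-Wilk per group (p > .05 suggests normality is tenable).
    print(stats.shapiro(group_a))
    print(stats.shapiro(group_b))

    # Equal variance: Levene's test (p > .05 suggests equal variances).
    print(stats.levene(group_a, group_b))
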
ANOVA:
Outliers: Use a Box and Whisker plot for each group and/or variable. Look for extreme outliers.
Assumption of Normality: Use the Shapiro-Wilk or Kolmogorov-Smirnov test.
Assumption of Equal Variance: Use Levene's Test of Equality of Error Variances.
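With more than two groups the same checks apply group by group. A sketch on three hypothetical groups of assumed example data:

    # Sketch: one-way ANOVA assumption checks on three assumed example groups.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    groups = [rng.normal(m, 8, 30) for m in (48, 52, 57)]   # hypothetical group scores

    # Normality: Shapiro-Wilk within each group.
    for i, g in enumerate(groups, start=1):
        res = stats.shapiro(g)
        print(f"group {i}: W = {res.statistic:.3f}, p = {res.pvalue:.3f}")

    # Equal variance: Levene's test across all groups at once.
    print(stats.levene(*groups))
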
Two-way ANOVA:
Outliers: Use a Box and Whisker plot for each group and/or variable. Look for extreme outliers.
Assumption of Normality: Use the Shapiro-Wilk or Kolmogorov-Smirnov test.
Assumption of Equal Variance: Use Levene's Test of Equality of Error Variances.
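For a factorial design, one reasonable way to apply the same checks (an assumption on my part, since the text does not spell it out) is to run them within each cell, i.e. each combination of factor levels. A sketch on assumed example data:

    # Sketch: two-way ANOVA checks applied within each cell (assumed example data).
    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "a": np.repeat(["a1", "a2"], 60),                # hypothetical factor A
        "b": np.tile(np.repeat(["b1", "b2"], 30), 2),    # hypothetical factor B
        "y": rng.normal(100, 15, 120),                   # hypothetical outcome
    })

    # Normality: Shapiro-Wilk within each cell of the design.
    for name, cell in df.groupby(["a", "b"]):
        print(name, stats.shapiro(cell["y"]))

    # Equal variance: Levene's test across all cells.
    print(stats.levene(*[cell["y"].to_numpy() for _, cell in df.groupby(["a", "b"])]))
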
ANCOVA:
Outliers: Use a Box and Whisker plot for each group and/or variable. Look for extreme outliers.
Assumption of Normality: Use the Shapiro-Wilk or Kolmogorov-Smirnov test.
Assumption of Linearity: Use a series of scatter plots between the pre-test variable and post-test variable for each group.
Assumption of Bivariate Normal Distribution: Use a series of scatter plots between the pre-test variable and post-test variable for each group. Look for the classic “cigar shape.”
Assumption of Homogeneity of Slopes: Test for an interaction between the grouping variable and the pre-test variable; a significant interaction indicates the assumption is violated.
Assumption of Equal Variance: Use Levene's Test of Equality of Error Variances.
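A sketch of the linearity and homogeneity-of-slopes checks using pandas, Matplotlib, and statsmodels; the columns group, pre, and post are assumed example data standing in for the grouping variable, pre-test, and post-test:

    # Sketch: ANCOVA checks on assumed example data (columns: group, pre, post).
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(3)
    n = 60
    df = pd.DataFrame({
        "group": np.repeat(["treatment", "control"], n // 2),   # hypothetical grouping
        "pre": rng.normal(50, 10, n),                            # hypothetical pre-test
    })
    df["post"] = 5 + 0.8 * df["pre"] + rng.normal(0, 5, n)       # hypothetical post-test

    # Linearity and the "cigar shape": scatter plot of pre vs. post within each group.
    for name, g in df.groupby("group"):
        plt.scatter(g["pre"], g["post"], label=name)
    plt.xlabel("pre-test"); plt.ylabel("post-test"); plt.legend(); plt.show()

    # Homogeneity of slopes: a significant group:pre interaction violates the assumption.
    model = smf.ols("post ~ group * pre", data=df).fit()
    print(anova_lm(model, typ=2))
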
MANOVA:
Outliers: Use a Box and Whisker plot for each group and/or variable. Look for extreme outliers.
Assumption of Normality: Use the Shapiro-Wilk or Kolmogorov-Smirnov test.
Assumption of Multivariate Normal Distribution: Look for a linear relationship between each
pair of dependent variables. If the variables are not linearly related, the power of the test is
reduced. You can test for this assumption by plotting a scatterplot matrix for each group of the
independent variable. Look for the classic “cigar shape.”
Assumption of Homogeneity of Variance-Covariance Matrices: You can test this assumption in
SPSS using Box's M test of equality of covariance matrices. If your data fails this assumption
(p < .05), you may also need to use SPSS to carry out Levene's test of homogeneity of variance
on each dependent variable to determine where the problem may lie.
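SPSS reports Box's M directly; outside SPSS it can be computed by hand. The sketch below implements the usual chi-square approximation to Box's M on assumed example data (two groups, two dependent variables):

    # Sketch: Box's M test of equality of covariance matrices (assumed example data).
    import numpy as np
    from scipy import stats

    def box_m(groups):
        """groups: list of (n_i x p) arrays, one per level of the independent variable."""
        k = len(groups)
        p = groups[0].shape[1]
        ns = np.array([g.shape[0] for g in groups])
        covs = [np.cov(g, rowvar=False) for g in groups]                 # per-group S_i
        pooled = sum((n - 1) * S for n, S in zip(ns, covs)) / (ns.sum() - k)

        # Box's M statistic and its chi-square approximation.
        M = (ns.sum() - k) * np.log(np.linalg.det(pooled)) \
            - sum((n - 1) * np.log(np.linalg.det(S)) for n, S in zip(ns, covs))
        c = (sum(1 / (ns - 1)) - 1 / (ns.sum() - k)) \
            * (2 * p**2 + 3 * p - 1) / (6 * (p + 1) * (k - 1))
        chi2 = M * (1 - c)
        dof = p * (p + 1) * (k - 1) / 2
        return M, chi2, dof

    rng = np.random.default_rng(4)
    g1 = rng.multivariate_normal([0, 0], [[1, .5], [.5, 1]], size=40)    # hypothetical group 1
    g2 = rng.multivariate_normal([1, 1], [[1, .5], [.5, 1]], size=40)    # hypothetical group 2
    M, chi2, dof = box_m([g1, g2])
    print(f"Box's M = {M:.3f}, chi2 = {chi2:.3f}, df = {dof:.0f}, p = {stats.chi2.sf(chi2, dof):.3f}")
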
Correlation and/or bivariate regression:
Assumption of Bivariate Outliers: Use a scatter plot between the predictor variable (x) and
criterion variable (y). Look for extreme bivariate outliers.
Assumption of Linearity: Use a scatter plot between the predictor variable (x) and criterion
variable (y).
Assumption of Bivariate Normal Distribution: Use a scatter plot between the predictor variable
(x) and criterion variable (y). Look for the classic “cigar shape.”
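A single scatter plot covers all three checks. A sketch with x and y as assumed example data:

    # Sketch: scatter plot for bivariate outliers, linearity, and the "cigar shape".
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy import stats

    rng = np.random.default_rng(5)
    x = rng.normal(0, 1, 100)               # hypothetical predictor
    y = 2 * x + rng.normal(0, 1, 100)       # hypothetical criterion

    plt.scatter(x, y)
    plt.xlabel("predictor (x)"); plt.ylabel("criterion (y)")
    plt.show()

    # The correlation itself, once the plot looks acceptable.
    print(stats.pearsonr(x, y))
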
Multiple regression:
Assumption of Bivariate Outliers: Use scatter plots between all pairs of predictor variables (x,
x) and also between each predictor variable (x) and the criterion variable (y). Look for extreme
bivariate outliers.
Assumption of Multivariate Normal Distribution: Look for a linear relationship between each
pair of variables. If the variables are not linearly related, the power of the test is reduced. You
can test for this assumption by plotting a scatter plot for each pair of predictor variables (x, x)
and between the predictor variables (x) and the criterion variable (y). Look for the classic “cigar
shape.”
Assumption of No Multicollinearity among the Predictor Variables: If a predictor variable (x) is
highly correlated with another predictor variable (x), they essentially provide the same
information about the criterion variable. If the Variance Inflation Factor (VIF) is too high
(greater than 10), you have multicollinearity and have violated this assumption; acceptable
values are between 1 and 5.
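A sketch of the pairwise scatter plots and the VIF check using pandas and statsmodels, with x1, x2, and y as assumed example data:

    # Sketch: scatter-plot matrix and VIFs for multiple regression (assumed example data).
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt
    from pandas.plotting import scatter_matrix
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(6)
    n = 200
    x1 = rng.normal(size=n)                          # hypothetical predictor 1
    x2 = 0.8 * x1 + rng.normal(scale=0.6, size=n)    # hypothetical, correlated predictor 2
    y = 1 + 2 * x1 - x2 + rng.normal(size=n)         # hypothetical criterion
    df = pd.DataFrame({"x1": x1, "x2": x2, "y": y})

    # Bivariate outliers / multivariate normal shape: all pairwise scatter plots.
    scatter_matrix(df)
    plt.show()

    # Multicollinearity: VIF for each predictor (roughly 1-5 acceptable; above 10 is a violation).
    X = sm.add_constant(df[["x1", "x2"]])
    for i, name in enumerate(X.columns):
        if name != "const":
            print(name, variance_inflation_factor(X.values, i))
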