R-square

Effect Size
The importance or magnitude of an effect is not indicated by the probability
value, because the probability value depends on the sample size. The magnitude
of an effect can be estimated by R² ("R-Squared"). R² measures how much of the
variation in the dependent variable can be accounted for by the effect. R²
ranges from 0 to 1. In general, the larger the value of R², the better the effect
accounts for the variation in the dependent variable. R² is the sum of squares
for the effect divided by the sum of squares for the corrected total.
In SAS, the effect names are listed under "Source" in ANOVA tables. The
R-Square is calculated by dividing the Anova SS for an effect by the Sum of
Squares for Corrected Total. If there is a Type III SS, use it for the sum of
squares for an effect; do not use the Type I SS.
R-Square = Anova SS for an effect / Corrected Total Sum of Squares

or

R-Square = Type III SS for an effect / Corrected Total Sum of Squares
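The division above can be sketched numerically. This is a minimal illustration, not SAS output: the three treatment groups below are made-up values, and the between-groups sum of squares stands in for the Anova SS of a one-way effect.

```python
# Illustrative one-way ANOVA data (hypothetical, not from the text)
groups = [
    [23.0, 25.0, 21.0, 24.0],   # treatment A
    [30.0, 32.0, 29.0, 31.0],   # treatment B
    [26.0, 27.0, 25.0, 28.0],   # treatment C
]

all_obs = [x for g in groups for x in g]
grand_mean = sum(all_obs) / len(all_obs)

# Sum of squares for the effect (between groups)
ss_effect = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)

# Corrected total sum of squares (every observation vs. the grand mean)
ss_total = sum((x - grand_mean) ** 2 for x in all_obs)

r_square = ss_effect / ss_total
print(round(r_square, 4))
```

Because the effect sum of squares can never exceed the corrected total, the ratio always lands between 0 and 1.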
When reporting results, it might be useful to explain the meaning of R² in
context. An example for an ANOVA might go like this: "The size of the treatment
effect (R²) is .72, which means 72% of the variation in the dependent variable is
explained by the different levels of the independent variable." An example for a
regression analysis might go like this: "The magnitude of the treatment effect (R²)
is .72, which means 72% of the variation in the criterion variable is attributed to
the predictor variable."
The R-square statistic is also called the coefficient of determination or
Eta-square (η²).
R² is the square of the Pearson product moment coefficient of correlation, r,
only if (a) there is a straight line (linear) relationship between two variables X and
Y, (b) there is only one X variable, and (c) there is only one Y variable.
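Under those three conditions, the identity can be checked directly: fit a least-squares line to one X and one Y, compute R² from the residuals, and compare it with r². The data below are illustrative values, not from the text.

```python
import math

# Hypothetical (x, y) pairs for a simple linear regression
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Pearson product-moment correlation r
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
sxx = sum((xi - mx) ** 2 for xi in x)
syy = sum((yi - my) ** 2 for yi in y)
r = sxy / math.sqrt(sxx * syy)

# R-square from the least-squares regression of Y on X:
# 1 minus (residual SS / corrected total SS)
slope = sxy / sxx
intercept = my - slope * mx
ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
r_square = 1 - ss_res / syy

print(round(r_square, 6), round(r ** 2, 6))
```

With more than one X variable, or a nonlinear fit, the two quantities no longer coincide.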
R-square is a statistic that explains the variation within a sample. If you
want an estimate of the population variation, you should use adjusted R-square. It
is a modification of R-square that adjusts for the number of terms (Xs or predictor
variables) in the model. R-square always increases when a new term (X) is added to
a model, but adjusted R-square increases only if the new term improves the model
more than would be expected by chance. Adjusted R-square is approximately equal
to Omega-squared (ω²).
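One common form of the adjustment, sketched below, penalizes R² for the number of predictor terms. The sample size, term counts, and R² values here are hypothetical; they simply show that a small rise in R² from an extra term can still lower the adjusted value.

```python
def adjusted_r_square(r_square, n, k):
    """Adjusted R-square: n = sample size, k = number of predictor terms (Xs)."""
    return 1 - (1 - r_square) * (n - 1) / (n - k - 1)

# Hypothetical scenario: adding a weak third predictor nudges
# R-square from .72 to .73, but the adjusted value goes down.
with_two = adjusted_r_square(0.72, n=30, k=2)
with_three = adjusted_r_square(0.73, n=30, k=3)
print(round(with_two, 4), round(with_three, 4))
```

Note that the adjusted value is always at or below the unadjusted R² and can even be negative for very weak models.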