Validity of web-based placement testing outcomes: Some recent findings

Hoi Suen
Professor of Educational Psychology
Pennsylvania State University
University Park, PA 16802-3108, U.S.A.
Office: 814-865-2235
Fax: 814-863-1002
Email: HoiSuen@psu.edu
Website: http://suen.ed.psu.edu

Distance learning and assessment
• We deliver and facilitate instructional activities over the Web or other media.
• However, we require students to take high-stakes exams at designated centers where the tests are secured, where the identities of the examinees can be authenticated, and where the examinees can be proctored and observed.

Unproctored Web-based exam security and authentication
• Without the physical presence of a human proctor, there is no known easy way of ensuring that the responses to exam questions in fact originated from the examinee, nor that the examinee did not obtain assistance from other people, from printed reference books, from his/her own computer hard drive, or even from the Web itself while taking the exam.

Assumption regarding unproctored Web-based placement testing
• Students should not and would not cheat when taking a placement test, because cheating in this situation would only hurt the student himself/herself. If a student did better due to cheating and were thus placed into a course more advanced than his/her capabilities, the student would be the one who would suffer.

We theorized that:
1. When given convenient opportunities to improve test scores, students will tend to take advantage of those opportunities, whether the opportunities are appropriate or not and whether the end result will hurt the student or not. Students today have been brought up in a high-stakes testing environment and generally believe that it is important to score as high as possible on any exam. Therefore, even for a placement test, when it is administered in an unproctored Web-based setting, students will cheat.

We theorized that:
2. As a result of cheating on these placement tests, a substantial portion of students will be placed into courses beyond their capabilities.
3. When confronted with a substantial portion of students placed beyond their levels of ability, instructors will change instructional methods and/or content to adjust to the students’ lower level of ability; i.e., as an indirect result of unproctored Web-based placement testing, instructors will tend to “water down” instruction.

Study to test the theory designed and executed by:
Dawn Zimmaro, Ph.D.
Research Associate
University Testing Center
Penn State University
University Park, PA 16802
Email: dmz115@psu.edu

SAMPLE
• Penn State freshman placement testing program.
• Focused on math testing: The mathematics placement exam used a multiple-choice format. The test contained 72 five-option items based on a set of 29 mathematical topics ranging from arithmetic of integers to inverse trigonometric functions.
• Matched samples: Web-based testers and paper testers were matched on SAT Math score and high school grade point average; a total of 1,010 exactly identical matches on these two variables were identified (a sketch of one way to implement such matching follows below). The final sample included 1,010 Web testers and 1,010 paper testers with an average high school grade point average of 3.79 (s.d. 0.25) and an average SAT Math score of 628 (s.d. 66).
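One way to carry out this kind of exact one-to-one matching is sketched below in Python. This is a minimal sketch under assumed inputs: the data frames (web_df, paper_df) and column names (sat_math, hs_gpa) are hypothetical illustrations, not taken from the study.

import pandas as pd

def exact_match(web_df: pd.DataFrame, paper_df: pd.DataFrame,
                keys=("sat_math", "hs_gpa")) -> pd.DataFrame:
    """Pair each Web tester with a distinct paper tester whose values on the
    matching keys are exactly identical (matching without replacement)."""
    # Index the paper testers by their (SAT Math, HS GPA) combination.
    paper_pool = {key: list(group.index) for key, group in paper_df.groupby(list(keys))}
    pairs = []
    for key, web_group in web_df.groupby(list(keys)):
        available = paper_pool.get(key, [])
        # zip() stops at the shorter list, so each paper tester is used at most once.
        for web_idx, paper_idx in zip(web_group.index, available):
            pairs.append({"web_id": web_idx, "paper_id": paper_idx})
    return pd.DataFrame(pairs)

Each returned row identifies one matched pair (one Web tester and one paper tester with identical SAT Math and high school GPA), analogous to the 1,010 matched pairs reported above.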
Major Hypotheses: Focus of this presentation
I. After controlling for differences in mathematics ability, students who take an unproctored Web mathematics placement test will score higher on the placement test than students who take an equivalent proctored paper-and-pencil mathematics placement test.
II. After controlling for differences in mathematics ability, there will be an interaction between the type of placement test and the type of math course in which the student enrolled, with lower first exam scores in calculus courses for students placed on the basis of Web test results.
(The testing of a third, “watered-down” hypothesis was inconclusive.)

Results: Math placement test scores of matched samples

  TEST TYPE        Mean    Std. Deviation    N
  Paper testers    48.89   11.89             1010
  Web testers      51.34   12.16             1010
  Total            50.11   12.08             2020

t(2018) = 4.591, p < .001. The correlation between math placement test score and placement test type was r = 0.102; the proportion of variation in total math placement test score attributable to the type of test the student took was about 1% (r² = .010).

Results: First exam scores by course by placement test type

  COURSE TYPE        TEST TYPE        Mean    Std. Deviation    N
  Non-calculus       Paper testers    73.43   16.37             234
                     Web testers      70.13   18.04             182
                     Total            71.98   17.18             416
  Calculus courses   Paper testers    67.77   17.47             202
                     Web testers      61.20   19.26             231
                     Total            64.27   18.72             433
  Total              Paper testers    70.81   17.10             436
                     Web testers      65.13   19.23             413
                     Total            68.05   18.38             849

Results: First exam scores by course by placement test type

[Figure: Estimated marginal means of First Math Exam Score plotted by COURSE TYPE (non-calculus course vs. calculus course), with separate lines for TEST TYPE (paper test vs. Web test).]

Results: First exam scores by course by placement test type

Tests of Between-Subjects Effects
Dependent Variable: First Math Exam Score

  Source                    SS             df    Mean Square    F          Sig.
  Model                     18408.392      3     6136.131       19.346     .000
  Intercept                 3899194.845    1     3899194.845    12293.290  .000
  COURSE TYPE               11163.866      1     11163.866      35.197     .000
  TEST TYPE                 5118.687       1     5118.687       16.138     .000
  COURSE TYPE * TEST TYPE   562.125        1     562.125        1.772      .183
  Error                     268017.723     845   317.181
  Total                     4217644.000    849

R Squared = .064 (Adjusted R Squared = .061)
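For readers who want to reproduce this kind of analysis, the sketch below shows how the two comparisons reported above (the t test on placement scores and the two-way ANOVA on first exam scores) could be computed in Python. It is a minimal sketch under assumptions: the data frame df and its columns (placement_score, first_exam, test_type, course_type) are hypothetical stand-ins, not the study's actual data or code.

import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def placement_score_comparison(df: pd.DataFrame) -> None:
    """Independent-samples t test on placement scores by test type, plus the
    share of placement-score variance associated with test type (r squared)."""
    paper = df.loc[df["test_type"] == "paper", "placement_score"]
    web = df.loc[df["test_type"] == "web", "placement_score"]
    t, p = stats.ttest_ind(web, paper)
    dof = len(web) + len(paper) - 2
    # Point-biserial relation: r^2 = t^2 / (t^2 + df);
    # e.g. 4.591^2 / (4.591^2 + 2018) is approximately .010, as reported above.
    r_squared = t**2 / (t**2 + dof)
    print(f"t({dof}) = {t:.3f}, p = {p:.4f}, r^2 = {r_squared:.3f}")

def first_exam_anova(df: pd.DataFrame) -> pd.DataFrame:
    """Two-way ANOVA of first exam score on course type, test type, and their
    interaction (the design behind the between-subjects table above)."""
    # Sum-to-zero contrasts so that Type III sums of squares are interpretable
    # in the conventional ANOVA sense for an unbalanced design.
    model = smf.ols("first_exam ~ C(course_type, Sum) * C(test_type, Sum)", data=df).fit()
    return anova_lm(model, typ=3)

The table returned by first_exam_anova would contain rows analogous to COURSE TYPE, TEST TYPE, and their interaction in the between-subjects table above.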