Types of Multiple Measures Used in California Community College Mathematics, English, and English as a Second Language Course Placement: Summary Report of Survey Results

Prepared by REL West at WestEd
December 21, 2011

The report has not been reviewed by IES and, thus, cannot be verified as meeting IES standards. Not for general distribution.

Introduction

The California Community Colleges Chancellor's Office has convened a Multiple Measures Workgroup to develop a resource document on effective multiple measures and their application in the assessment and placement process. The term multiple measures refers to the use of measures of student readiness for coursework in addition to a single test score. The Chancellor's Office is developing a Framework for evaluating the technical adequacy of multiple measures used by the state's community colleges. The Framework will include a list of current measures or types of measures used in community colleges to place matriculating students in courses and is intended to help the Chancellor's Office, the Multiple Measures Workgroup, and local colleges evaluate the technical adequacy of these measures. Measures include both standardized tests (e.g., ACCUPLACER) and additional measures such as high school grades and student self-reports of readiness or goals for attending college.

In order to obtain accurate information on which multiple measures colleges are currently using, and what information colleges have on the validity of these measures, the Chancellor's Office asked the Regional Educational Laboratory West (REL West) to develop, and then analyze the results of, a survey of all California community colleges, administered by the Chancellor's Office. The survey was conducted in November 2011. This report is a summary of the findings of the survey.

Methodology

In collaboration with the Chancellor's Office, WestEd developed survey questions, including multiple-choice and open-ended items, intended to elicit information on which multiple measures colleges use, how student data are collected using the measures, and how the collected information is used, as well as the existence of validation studies on the measures used. The survey was administered by the Chancellor's Office using SurveyMonkey; an explanatory email including a link to the survey was sent to matriculation officers at the 112 community colleges in California. Of the 112 colleges, 59 responded, for a response rate of 53 percent.

Findings

This section summarizes the overall data on the types of multiple measures used at the colleges; the way student data are collected and used for placement in mathematics, English, and English as a second language (ESL) courses; and the existence of validation studies of these measures. A narrative summary of responses is provided for each question on the survey; for some questions, a breakdown of the number and percentage of responses for each item is also provided.

Question 1: Select all content areas tested that apply to each test given at your college.
Number of colleges reporting use of each test, by content area (percentages are of the total number of responses for that test; colleges could select more than one content area):

- ACCUPLACER: mathematics 33 (85%), English reading 35 (90%), English writing 35 (90%), ESL 13 (33%); total responses 39
- CELSA (Combined English Language Skills Assessment): mathematics 0 (0%), English reading 0 (0%), English writing 0 (0%), ESL 24 (100%); total responses 24
- COMPASS: mathematics 6 (46%), English reading 7 (54%), English writing 7 (54%), ESL 12 (92%); total responses 13
- College Tests for English Placement (CTEP): mathematics 0 (0%), English reading 9 (100%), English writing 8 (89%), ESL 0 (0%); total responses 9
- Mathematics Diagnostic Testing Project (MDTP): mathematics 21 (100%), English reading 0 (0%), English writing 0 (0%), ESL 0 (0%); total responses 21
- Locally developed multiple choice test: mathematics 3 (38%), English reading 2 (25%), English writing 1 (13%), ESL 4 (50%); total responses 8
- Locally developed direct performance assessment test (e.g., writing sample): mathematics 1 (11%), English reading 2 (22%), English writing 4 (44%), ESL 8 (89%); total responses 9
- Other: total responses 14

Of the 59 respondents to this question, 57 reported using some form of placement test. Of those who reported using exams, 68 percent (n = 39) reported using ACCUPLACER, with almost 90 percent of those respondents reporting using it for English reading and/or English writing (n = 35 for both), slightly fewer (85 percent; n = 33) reporting using it for mathematics, and substantially fewer (33 percent; n = 13) reporting using it for ESL. The next most widely used test was the Combined English Language Skills Assessment (CELSA), with 42 percent (n = 24) of respondents reporting using the test for ESL. The Mathematics Diagnostic Testing Project (MDTP) was the third most widely used, with over one-third of respondents reporting using it for mathematics (36 percent; n = 21). Respondents also reported using locally developed assessments, including both multiple-choice assessments (14 percent; n = 8) and direct performance assessments (16 percent; n = 9), across all content areas listed. In comments related to the "Other" option, respondents reported using the Degrees of Reading Power (DRP) and Nelson-Denny assessments for reading, as well as tests for chemistry.

Question 2: Please select all data collection methods that apply to each measure concerning Educational Background implemented at your college.

Number of colleges reporting each data collection method for each educational background measure. Methods are self-reported in a written questionnaire, self-reported in person (e.g., interview with a counselor), or independently verified (e.g., transcripts); percentages are of the total number of responses for that measure, and colleges could select more than one method:

- Length of time out of school: written questionnaire 21 (72%), in person 11 (38%), independently verified 6 (21%); total responses 29
- Highest level of educational attainment: written questionnaire 27 (87%), in person 8 (26%), independently verified 7 (23%); total responses 31
- High school GPA: written questionnaire 29 (76%), in person 11 (29%), independently verified 8 (21%); total responses 38
- General proficiency in math: written questionnaire 14 (67%), in person 10 (48%), independently verified 8 (38%); total responses 21
- Grade in last math class completed: written questionnaire 32 (76%), in person 11 (26%), independently verified 11 (26%); total responses 42
- Highest math course completed: written questionnaire 36 (84%), in person 11 (26%), independently verified 12 (28%); total responses 43
- Length of time since last math course: written questionnaire 31 (82%), in person 10 (26%), independently verified 9 (24%); total responses 38
- General proficiency in reading and writing: written questionnaire 14 (67%), in person 11 (52%), independently verified 7 (33%); total responses 21
- Grade in last English class completed: written questionnaire 29 (74%), in person 11 (28%), independently verified 11 (28%); total responses 39
- Highest English course completed: written questionnaire 19 (76%), in person 9 (36%), independently verified 11 (44%); total responses 25
- Number of years of high school English: written questionnaire 24 (80%), in person 8 (27%), independently verified 7 (23%); total responses 30
- Other: total responses 15

Of the 59 survey respondents, 48 (81 percent) reported using some form of educational background measure to support placement decisions. Of those 48, the vast majority (more than 70 percent) reported that they use written questionnaires to obtain information; of the measures listed, respondents reported using the following most commonly: "highest math course completed" (90 percent; n = 43), "grade in last math class completed" (88 percent; n = 42), and "grade in last English class completed" (81 percent; n = 39). In addition to the measures listed in the survey, a number of respondents (15 percent; n = 7) reported using questions aimed at gauging the extent to which potential students are familiar with and/or use the English language as part of their daily lives.
Question 3: Please select all data collection methods that apply to each measure of College Plans, Goals, and Experience implemented at your college.

Number of colleges reporting each data collection method for each measure of college plans, goals, and experience (methods and percentages as in Question 2; colleges could select more than one method):

- Student's educational goals: written questionnaire 23 (66%), in person 20 (57%), independently verified 1 (3%); total responses 35
- Student's choice of major: written questionnaire 17 (57%), in person 20 (67%), independently verified 0 (0%); total responses 30
- Number of units student plans to enroll in: written questionnaire 19 (56%), in person 20 (59%), independently verified 1 (3%); total responses 34
- Highest math course student plans to take: written questionnaire 5 (24%), in person 16 (76%), independently verified 1 (5%); total responses 21
- Time of day attending classes: written questionnaire 10 (50%), in person 11 (55%), independently verified 0 (0%); total responses 20
- Student's attitude towards studying: written questionnaire 10 (42%), in person 15 (63%), independently verified 0 (0%); total responses 24
- Number of hours student plans to devote to studying/homework: written questionnaire 12 (44%), in person 16 (59%), independently verified 0 (0%); total responses 27
- College GPA: written questionnaire 4 (18%), in person 8 (36%), independently verified 15 (68%); total responses 22
- College units completed: written questionnaire 4 (18%), in person 8 (36%), independently verified 16 (73%); total responses 22
- College degree earned (foreign students): written questionnaire 7 (29%), in person 10 (42%), independently verified 15 (63%); total responses 24
- Other: total responses 7

Of the 59 colleges responding to the survey, 44 (75 percent) reported using measures reflecting college plans, goals, and experience. With the exception of college GPA, college units completed, and college degree earned (which were mostly independently verified), these measures were largely collected through self-reporting, either in person or in a written questionnaire. Of the 44 who reported using these measures, 80 percent (n = 35) reported using the student's educational goals as a measure. Also reported as widely used were "number of units student plans to enroll in," with 77 percent (n = 34) of those responding reporting that they use this measure, and "student's choice of major," with 68 percent (n = 30) reporting using this measure.

Question 4: Please select all data collection methods that apply to each measure of Personal Characteristics and Situational Characteristics implemented at your college.

Number of colleges reporting each data collection method for each measure of personal and situational characteristics (methods and percentages as in Question 2; colleges could select more than one method):

- Age: written questionnaire 18 (78%), in person 4 (17%), independently verified 2 (9%); total responses 23
- Veteran status: written questionnaire 15 (60%), in person 10 (40%), independently verified 9 (36%); total responses 25
- Importance of college to student: written questionnaire 16 (57%), in person 15 (54%), independently verified 1 (4%); total responses 28
- Importance of college to those closest to student: written questionnaire 11 (48%), in person 12 (52%), independently verified 0 (0%); total responses 23
- Number of hours employed: written questionnaire 14 (50%), in person 20 (71%), independently verified 1 (4%); total responses 28
- Amount of time spent on extracurricular activities: written questionnaire 4 (27%), in person 12 (80%), independently verified 1 (7%); total responses 15
- Amount of time devoted to family commitments: written questionnaire 3 (19%), in person 14 (88%), independently verified 1 (6%); total responses 16
- Student perseverance with academic challenges: written questionnaire 5 (25%), in person 15 (75%), independently verified 3 (15%); total responses 20
- Time spent reading in English: written questionnaire 10 (50%), in person 11 (55%), independently verified 1 (5%); total responses 20
- Ease/confidence in reading/writing in English: written questionnaire 9 (47%), in person 11 (58%), independently verified 1 (5%); total responses 19
- Other: total responses 5

Of the 59 colleges responding to the survey, 71 percent (n = 42) reported that they include personal and/or situational characteristics as multiple measures, and the vast majority of these reported that they obtain this information via self-reports (either in person or in a written questionnaire). Of the 42 who reported using these measures, 67 percent (n = 28) reported using "importance of college to student" and "number of hours employed." Other measures that respondents reported they use include "veteran status" (60 percent; n = 25), "age" (55 percent; n = 23), and "importance of college to those closest to student" (55 percent; n = 23).
Question 5: In the text box below, please specify how your college uses the information.

Of the 59 colleges that responded to the survey, 48 (81 percent) reported on how they use the multiple measures. Of the 48 that responded to this question, 71 percent (n = 34) reported using multiple measures as a weighted score: placement test results, plus or minus points for additional factors such as educational background; college plans, goals, and experience; and personal and situational characteristics. Twenty-five percent (n = 12) of respondents to this question reported using qualitative data and analysis to drive their decisions, factoring in test scores but relying more heavily on additional factors to determine placement. Most of the respondents who reported using qualitative means reported relying on college advisors or counselors to make the determination for placement. Though most colleges used a weighted score as the ultimate determination of placement, the manner in which the scores were weighted appears to vary widely, with some colleges relying on regression analysis to factor in multiple variables in order to predict future success, and others combining weighted scores with more qualitative assessments added as "weights."
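To illustrate the weighted-score approach that most of these respondents described, the sketch below shows one hypothetical way a college might add or subtract points from a placement test score based on multiple measures and then compare the adjusted score with local cut scores. Every point value, cut score, threshold, and course name in the sketch is invented for illustration and does not come from the survey responses.

```python
# Hypothetical sketch of a weighted-score placement rule of the kind described
# under Question 5: start from a placement test score, add or subtract points
# for multiple measures, then compare the adjusted score with local cut scores.
# All point values, cut scores, and course names below are invented.

CUT_SCORES = [  # (minimum adjusted score, course placement), highest first
    (80, "transfer-level mathematics"),
    (55, "intermediate algebra"),
    (0, "pre-algebra"),
]

def adjusted_score(test_score, highest_math_course, years_since_last_math, units_planned):
    """Apply illustrative plus/minus adjustments to a raw placement test score."""
    score = test_score
    if highest_math_course in ("intermediate algebra", "precalculus", "calculus"):
        score += 3  # educational background: more advanced prior coursework
    if years_since_last_math >= 5:
        score -= 2  # educational background: long time since last math course
    if units_planned >= 12:
        score += 1  # college plans: full-time enrollment
    return score

def place(score):
    """Return the highest placement whose cut score the adjusted score meets."""
    for cut, course in CUT_SCORES:
        if score >= cut:
            return course
    return CUT_SCORES[-1][1]

# Example: a raw score of 79 is nudged over the invented transfer-level cut score.
print(place(adjusted_score(79, "intermediate algebra", 6, 15)))  # transfer-level mathematics
```

Colleges that instead rely on regression analysis would replace the fixed point adjustments above with empirically estimated weights, but the overall structure (test score plus weighted contributions from additional measures, compared against cut scores) is the same kind of rule respondents described.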
Questions 6, 7, and 8: Are you aware of completed/in progress validation studies for the multiple measures implemented at your college?

Of the 51 survey respondents who answered this question, 55 percent (n = 28) reported being aware of validation studies for the multiple measures they use for placement purposes. Slightly more than half of survey respondents (51 percent; n = 26) were aware of completed validation studies, and slightly fewer than half (45 percent; n = 23) responded that they were aware of ongoing or planned validation studies.

Summary

As reported in the Findings section, of the 112 California community colleges surveyed, 59 (53 percent) responded to the survey about the multiple measures used in community colleges to place matriculating students in courses. Of those colleges responding, 97 percent (n = 57) reported using some form of multiple-choice or open-ended exam, typically as a baseline score to which additional points are added to take into consideration qualitative background characteristics of the student. ACCUPLACER appears to be the most frequently used of these tests. Most respondents reported using information on students' educational background characteristics as a measure, with the most widely collected information being the highest course completed and the grade earned in the last class completed in a given subject (most typically mathematics). Most respondents also reported using information on students' college plans, goals, and experiences as a measure; the types of information clustered around students' educational goals, numbers of units planned, and choices of major. Additionally, most respondents reported using information on students' personal and situational characteristics as a measure, focusing largely on number of hours worked, importance of college to the student, and veteran status.

Most of the colleges that reported on their multiple measures were aware of completed, in-progress, or planned validation studies for the multiple measures implemented. Matriculation officers at 33 of the colleges provided their contact information, allowing the Chancellor's Office to follow up with these colleges to collect further information on multiple measures and validation studies of multiple measures at these campuses.