An Exploratory Research Study



An Exploratory Research Study Examining College Placement Procedures Across Ohio Higher Education Institutions Focusing on Cognitive and Noncognitive Assessment Methods

Lana Evans

Northwest State Community College

February 2005


Table of Contents

Abstract
Introduction
Statement of Problem
Literature Review
Purpose of the Study
Limitations
Significance of the Study
Methodology
Subjects
Apparatus
Statistical Procedure
Results
Discussion
Conclusion and Recommendations
References
Appendices
Tables
Figures


Abstract

This exploratory research study provides a comprehensive description of college placement procedures across Ohio higher education institutions and, in particular, focuses on cognitive and noncognitive assessment methods. The purpose of this research is to develop a definitive and descriptive resource that details the cognitive and noncognitive assessment methods used across Ohio’s higher education institutions. One hundred seventy colleges and universities (public and private, two- and four-year) were surveyed to determine the cognitive and noncognitive assessment methods in place at each institution. The questionnaire included sections on general developmental education information, diagnostic assessment and placement procedures, developmental/remedial education course work, student services, academic advising, and services for students with disabilities. For the purposes of this Practicum Project, only cognitive and noncognitive assessment methods will be discussed.


An Exploratory Research Study Examining College Placement Procedures Across Ohio Higher Education Institutions Focusing on Cognitive and Noncognitive Assessment Methods

The Ohio Association for Developmental Education (OADE) is developing a research agenda designed to enable the organization and its membership to shape the caliber of the public and political discussion regarding developmental education. Casazza and Silverman (1996) state the following in regard to developmental education practitioners:

Whatever practitioners call themselves, they frequently exist on the margins of academia. It is in their best interests and those of their students to strengthen their professional field of study through collaborative research and documentation of their very effective practices. All practitioners need to become more visible and articulate what they do. In large part, it is their responsibility to provide a model for all educators that emphasizes the enhancement of academic standards and realistic access to continuing education for all populations. (p. xiii)

To shape the caliber of discussion on developmental education between practitioners, researchers, policymakers, and consumers, it is first necessary to define and describe developmental education programs and practices so that there is a shared understanding of developmental education. The research discussed in this report is intended to serve as the foundation for OADE’s research agenda that will shape the public and practitioner debate, particularly in reference to cognitive and affective assessment methods.

Statement of the Problem

Ohio’s myriad approaches to developmental education, and in particular cognitive and noncognitive assessment and placement methods, preclude practitioners and constituents from understanding and evaluating the value of such programs and practices. Further, students may receive different placement outcomes at different institutions; this can be a product of different placement methods and policies among colleges and universities, different levels of academic preparation based on the high school attended or the high school curriculum (Adelman, 1999), or the number of years out of high school prior to matriculating in college, among other factors.

It would seem reasonable to expect that what constitutes college-level English at one institution would be the same at an entirely different institution. It would also seem reasonable to expect that students who complete a rigorous academic program during high school would be adequately prepared for college upon entry. The number of students requiring remedial course work across all Ohio higher education institutions, however, documents that these assumptions are not warranted. Placement methods that are not regularly examined or that do not use rigorous methodology for establishing cutoff scores are suspect at best.

It is critical that the Association provide practitioners and other constituents with (a) the opportunity to examine their institutions’ cognitive and noncognitive assessment methods and (b) the ability to compare practices with those of other institutions. Developing a better understanding of current assessment and placement methods across the state through research enables institutions to review and consider other institutions’ practices and perhaps revise procedures based on this information.

Literature Review

Statewide, the percentage of students who require remediation¹ is high, though relatively comparable with the nation’s rate of remediation for first-year students. In Ohio, 37% of first-year students in 2001-2002 across all types of higher education institutions enrolled in remedial mathematics or English (Ohio Board of Regents, 2003b), as compared to 29% of first-year students nationally who enrolled in at least one remedial reading, writing, or mathematics course (National Center for Education Statistics, 2003). Although comparable, the difference between these two statistics is striking given that the Ohio statistic refers only to students enrolled in remedial mathematics or English, while the national figure refers to students enrolled in remedial reading, writing, or mathematics courses.

¹ The term remediation is used to refer to students who require pre-college course work in reading, writing, or mathematics upon entry to college.

The prevalence of remediation warrants a thorough review and understanding of cognitive and noncognitive assessment methods as placement methods determine who needs remediation and who is academically prepared for college. The first step in improving practice is to describe and understand the current state of practice. Practices across Ohio higher education institutions are widely varied. What constitutes a placement score that warrants remediation at one college may be considered college-level work at another institution. This report seeks to provide the base-level description of cognitive and noncognitive assessment methods so that future research can begin to identify best practices within the state by providing the opportunity to compare the differences between placement programs and outcomes.

Issues surrounding access to higher education are of critical importance to society, particularly in reference to developmental education. Immerwahr and Foleno’s research findings (as cited in Martinez, 2004) indicate that all Americans view higher education as the key to success in today’s world. Martinez (2004) states, “it seems ironic, then, that the issue of how states can maximize access to higher education receives so little public debate and policy discussion” (p. I). Although there are numerous newspaper editorials about developmental education (National Center for Education Statistics, 2003), much of the public debate that these editorials foster is not based on research-based evidence. This report seeks to clarify the understanding of current practice in cognitive and noncognitive assessment so that an informed debate can occur and future research studies can be designed and conducted.

This base-level description of cognitive and noncognitive assessment methods also provides an opportunity for higher education institutions, secondary institutions, and the state to consider how varied assessment practices complicate the process of developing statewide standards for high school graduation. To be sure, different curricula among high schools across the state also contribute to the issue of placement into remedial course work. The Ohio Board of Regents appointed a subcommittee and charged it to “align high school and college English and math requirements” in the effort to “create a seamless transition to college” (Buttermore, 2004, p. 3). This is no small task given the varied practices and cutoff scores in place among all of the state’s colleges and universities.

As Ohio’s budget becomes further constrained by limited revenues and increasing costs, higher education is repeatedly called on to absorb budget cuts. In addition, calls for accountability and explanation of practices abound. Internally, institutions are critically examining their programs and services in light of their mission and reduced revenues.

Developmental education, and more specifically remedial education, continues to come under scrutiny as legislators, the general public, and secondary school leaders and teachers question the need for remediation, in part because they perceive that taxpayers are “paying twice” for the same course work provided by different entities at different stages in students’ academic careers. To further complicate matters, Brothen and Wambach (2004) state, “critics from both inside and outside the field question whether remedial courses really prepare students for future college work or even if they are properly part of the college mission” (p. 16). Needless to say, the argument against paying twice is not as straightforward as it appears. One potential response to legislators is that the argument about paying twice cannot be articulated with confidence because alignment between high school graduation requirements and college entry requirements cannot be clearly demonstrated.

Students frequently come to college with different levels of preparation. In Ohio, it is estimated that of the 52,672 recent high school graduates (students under 20 years of age who were classified as first-year students in 2001-2002) who enrolled in college, 30% failed to complete the high school core curriculum², while 50% completed it; for the remaining 20% of students it is unknown whether they completed the core curriculum (Ohio Board of Regents, 2003a). In spite of the fact that perhaps almost half of students fail to complete the core curriculum, most intend to enroll in college. In addition, older adults are enrolling in college in increasing numbers after having been out of school for several years. When considering these two constituent groups, it is fair to question whether the state has in actuality paid twice for the same course work.

Admittedly, the lack of consistency across institutions regarding cognitive placement (one institution may require a student to enroll in one or more remedial courses while another does not) exacerbates the confusion surrounding the legitimacy of remedial course work and the cognitive and noncognitive assessment methods that are used to make placement decisions.

Purpose of the Study

The purpose of this study is to develop a definitive and descriptive resource detailing cognitive and noncognitive assessment methods across Ohio’s higher education institutions, a base-level description that will inform future research.

² “College core” coursework as measured by ACT “consists of 4 years of English, 3 years of math, 3 years of science, and 3 years of social studies” (Mortenson, 1997, p. 3).


Limitations

This research study is descriptive; its results are not appropriate for drawing statistical inferences.

Significance of the Study

This study serves as the first comprehensive description of developmental education programs and practices in the state of Ohio. Its significance stems from three main characteristics of the research: (a) this study serves as the foundational research for OADE’s research agenda, (b) this study provides a means for practitioners, students, parents, and policymakers to develop a better understanding of the programs and services encompassed by the term developmental education, and (c) this study provides the information necessary to begin researching the effectiveness of placement procedures that include both cognitive and noncognitive assessment.

Methodology

Ohio’s higher education landscape comprises 170 institutions across the 5 major Carnegie Classification categories and types of control (see Tables 1 and 2) (Carnegie Foundation, 2000). Survey research methods were employed in this study because the research seeks to describe developmental education programs and practices across Ohio higher education institutions.

Subjects. Data were collected on the entire population of 170 institutions because the population size was not prohibitively large and the purpose of the research was to describe cognitive and noncognitive assessment methods across all Ohio higher education institutions. Of the 170 institutions surveyed, 59 responded, yielding a response rate of 34.71% (see Figure 1).
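The response rate reported above is straightforward arithmetic; as an illustrative sketch (the figures come from the text above, but the code itself is not part of the study):

```python
# Response-rate arithmetic using the figures reported above.
surveyed = 170   # entire population of Ohio higher education institutions
responded = 59   # institutions that returned the questionnaire

response_rate = responded / surveyed * 100
print(f"{response_rate:.2f}%")  # 34.71%
```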


Several measures were taken to boost the response rate. All 170 institutions were contacted in advance of questionnaire distribution to (a) determine the appropriate contact person who would be responsible for completing the questionnaire and (b) inform the contact person that the questionnaire was forthcoming. In addition, the Executive Director of the National Center for Developmental Education and the then President of the Ohio Association for Developmental Education co-signed the cover letter sent with the questionnaire (see Appendix A). Confidentiality of respondents was guaranteed in the cover letter, and a self-addressed, pre-stamped envelope was included with the questionnaire. Follow-up calls to non-respondents were conducted two weeks after the questionnaire went out to encourage participation and representation in the survey research.

Apparatus. A questionnaire was designed to collect data on general developmental education information, diagnostic assessment and placement procedures, developmental/remedial education course work, student services and academic advising, and services for students with disabilities (see Appendix B). Questionnaire content was based on three primary sources: Boylan’s (2002) text, What works: Research-based best practices in developmental education, and the 1998 and 2003 national studies on remedial education at degree-granting institutions. Prior to distribution, the questionnaire was reviewed by a panel of experts; minor revisions to the questionnaire content and design were incorporated as a result of the panel’s feedback.

Statistical procedure. Frequency distributions and percentages were calculated to determine the proportion of institutions engaging in various practices and methods and using particular instruments.
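As a minimal sketch of this procedure (the instrument names and tallies below are hypothetical examples, not the study's actual data):

```python
from collections import Counter

# Tally categorical questionnaire responses (a frequency distribution)
# and convert each count to a percentage of all responses.
# The response values here are illustrative only.
responses = ["ACT", "COMPASS", "ACT", "SAT", "ACT", "COMPASS", "Other", "SAT"]

counts = Counter(responses)                 # frequency distribution
total = sum(counts.values())
percentages = {name: round(n / total * 100, 2) for name, n in counts.items()}

print(counts["ACT"])       # 3
print(percentages["ACT"])  # 37.5
```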


Results

The top three instruments used for cognitive assessment in reading, writing, and mathematics were the ACT (22.58%), the COMPASS test (17.14%), and the SAT (15.00%) (see Figure 2). Respondents indicating that they used “other” instruments or methods totaled 11.43% of the institutions; “other” included such instruments or methods as admissions essays, the Degrees of Reading Power test, high school references, on-site interviews, the Wonderlic, and student input. Roughly 11% to 13% of the respondents indicated that their institutions used student self-assessment in reading, writing, and mathematics for cognitive placement (see Figure 3).

When asked whether placement into remedial course work based on placement test performance was mandatory, 35.92% of institutions indicated that math placement was mandatory, 24.60% indicated that reading placement was mandatory, and 33.33% indicated that English placement was mandatory (see Figure 4). Institutions were also asked to indicate whether they allow students to bypass remedial course work with permission, without permission, or not at all; the results were 52%, 15%, and 33%, respectively (see Figure 5).

A small percentage of institutions (27%, or 15 institutions) measure noncognitive characteristics in addition to cognitive skills, as compared to 73% who do not measure noncognitive characteristics (see Figure 6). Of the 15 institutions that measure noncognitive characteristics, 47% (7) are associate’s colleges, 20% (3) are baccalaureate colleges, 13% (2) are master’s colleges or universities, and 20% (3) are doctoral/research universities. When asked about specific instruments used to measure noncognitive characteristics, 31.25% of institutions indicated they used the Learning and Study Strategies Inventory, 18.75% indicated they used the Study Behavior Inventory, and 12.50% indicated they used the Noncognitive Questionnaire (see Figure 7). Many institutions (37.50%) indicated “other” when responding to this question; open-ended responses included the Myers-Briggs Type Indicator and the College Student Inventory.

The offices that use the results of noncognitive measures include academic support services (38.46%), admission and support services (23.08%), and “other” (23.08%). Open-ended responses accompanying the “other” category indicated that the results are used as a metacognitive strategy, in institutional research, and in confidential one-on-one conferences with students. Only 19.23% of institutions that measure noncognitive skills require students to participate in services designed to develop those skills or overcome identified barriers (see Figure 9). The top four support services offered to students for noncognitive development are tutoring (25.25%), workshops (study skills, college success) (20.79%), mandatory orientation (14.85%), and non-mandatory orientation (11.39%) (see Figure 10).

Discussion

While this research was designed to provide a comprehensive description of cognitive and noncognitive assessment methods across Ohio higher education institutions, two results from this study are of particular interest in light of research-based best practices in the field: mandatory placement is not enforced, and a small number of institutions (15) collect information on noncognitive characteristics and use the results to provide services to students.

The finding in this study regarding the ability to bypass remedial course work is not unique to Ohio (Boylan, 2002). Previous research studies have shown that mandatory placement facilitates student success (McCabe, 2000; McCabe & Day, 1998; Roueche & Roueche, 1999). Boylan (2002) states, “for mandatory assessment to be meaningful, however, it must be supported by mandatory placement” (p. 36). Institutions should assess the effectiveness of their current placement procedures, including choice of instrument, method of establishing cutoff scores, and student performance by placement results; once the effectiveness of the placement practices has been established, institutions can confidently begin (or continue) to require mandatory placement into remedial course work based on placement test scores, considered in conjunction with other factors.

The results from noncognitive measures could be useful in enabling students to achieve more success in college overall, not just in remedial course work. In recent years, the number of colleges providing supplemental programming beyond remedial course work designed to enhance student success has grown significantly. The establishment of the National Resource Center for the First-Year Experience and Students in Transition, which celebrated its 20th anniversary in 2001 (National Resource Center, n.d.), demonstrates the widespread recognition that noncognitive skills significantly influence college success. Sedlacek (2004) set about designing an assessment instrument that would estimate a student’s score on noncognitive variables that are related to student ability and performance. Sedlacek’s instrument, the Noncognitive Questionnaire, however, is not without its critics (Glenn, 2004). Arguably, this is an area that requires further research from several standpoints, two of which are (a) the need for further research examining the possibility of creating a valid and reliable measure of students’ noncognitive skills, particularly those that influence college success, and (b) the difference between students’ success at institutions that measure noncognitive skills and institutions that do not. These proposed research studies will serve students’ interests in that the results can influence institutional practices that ultimately impact students’ success.


Conclusions and Recommendations

Ohio’s 170 higher education institutions provide access to developmental education in a variety of ways. Many institutions provide comprehensive academic support services and remedial course work, while others do not. The way in which developmental education is defined and provided across Ohio campuses is as varied as the number and type of institutions in the state. The institution’s mission is (and should be) the factor that drives the degree of developmental education programs and remedial course work that are provided at any one institution. In an era of declining resources, institutions must consider which programs and practices are central to their mission and focus their resources on those activities and programs.

The fragmented landscape of developmental education in Ohio contributes to the confusion that surrounds the field.

This study serves as the foundation for OADE’s research agenda and it provides a means for practitioners and advocates to educate the field’s external constituents as to what developmental education means and the programs and courses it encompasses. This study also provides an opportunity for the Association to consider its challenges as a professional association based on the landscape of developmental education in the state. The Association can better support its members by providing access to (a) evidence-based best practices in the field and (b) data that enables each member to compare institutional practice with peer institutions and colleagues. This study serves as the first step in providing access to the data on developmental education in Ohio for comparative purposes and it allows the Research Committee to critically examine the state of the field and propose future research studies that can advance the field in the best interests of its students.


Failure to engage in developmental education best practices compromises the future of the field and the students and institutions it serves. Mandatory assessment and mandatory placement are not consistently enforced across institutions, and this compromises student success. Access to higher education without the academic support programs and assessment methods that enable students to become better learners is not true access.

The data show that few institutions, only 15 of the 59 that responded, concurrently assess students on their cognitive and noncognitive skills. It is possible that coupling these assessment methods would enable students to achieve greater success, but it is not possible to know this without additional research; this study provides the information necessary for developing such research. Future research that includes more in-depth analysis of college placement examinations and cutoff scores is also necessary.

The future of the state and its constituents depends upon an educational system, at all levels, that engages in critical self-examination, identifies best practices, implements programs and courses that are founded on evidence-based best practices, and defines the future according to the rate of student success. The field of developmental education is responsible for engaging in this examination as well; this research serves as the first step in a journey toward enhanced effectiveness as defined by student success.


References

Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and bachelor’s degree attainment. Washington, DC: U.S. Department of Education.

Boylan, H. R. (2002). What works: Research-based best practices in developmental education. Boone, NC: National Center for Developmental Education with the Continuous Quality Improvement Network.

Brothen, T. B., & Wambach, C. A. (2004). Refocusing developmental education. Journal of Developmental Education, 28(2), 16-17, 22, 33.

Buttermore, K. (2004). 2004 Ohio Association for Developmental Education and College Reading and Learning Association conference program: Pioneers in the frontiers of teaching and learning [Conference program]. Toledo, OH.

Carnegie Foundation for the Advancement of Teaching. (2000). The Carnegie Classification of Institutions of Higher Education. Retrieved May 1, 2003, from http://www.carnegiefoundation.org/Classification/

Casazza, M. E., & Silverman, S. L. (1996). Learning assistance and developmental education: A guide for effective practice. San Francisco: Jossey-Bass.

Glenn, D. (2004, June 1). Admissions questionnaire used to measure noncognitive traits is said to be nearly invalid [Electronic version]. The Chronicle of Higher Education .

Martinez, M. C. (2004). Postsecondary participation and state policy: Meeting the future demand . Sterling, VA: Stylus.

McCabe, R. (2000). No one to waste: A report to public decision makers and community college leaders . Washington, DC: Community College Press.


McCabe, R., & Day, P. (1998). Developmental education: A twenty-first century social and economic imperative. Mission Viejo, CA: League for Innovation in the Community College.

Mortenson, T. (1997, December). Academic preparation for college. Postsecondary Education OPPORTUNITY, 66. Retrieved December 16, 2004, from http://www.postsecondary.org/ti/ti_02.asp

National Center for Education Statistics. (2003). Remedial education at degree-granting postsecondary institutions in fall 2000. Retrieved July 13, 2004, from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2004010

National Resource Center for the First-Year Experience and Students in Transition. (n.d.). 20th anniversary presentation. Retrieved February 5, 2005, from http://www.sc.edu/fye/center/20th/20th.html

Ohio Board of Regents. (2003a). ACT scores core vs. non-core. In Ohio’s colleges and universities: Profile of student outcomes, experiences, and campus measures, preparation and related outcomes. Retrieved December 16, 2004, from http://www.regents.state.oh.us/perfrpt/2003-III.html

Ohio Board of Regents. (2003b). Developmental education by core. In Ohio’s colleges and universities: Profile of student outcomes, experiences and campus measures, preparation and related outcomes. Retrieved December 16, 2004, from http://www.regents.state.oh.us/perfrpt/2003-III.html

Roueche, J., & Roueche, S. (1999). High stakes, high performance: Making remedial education work. Washington, DC: American Association of Community Colleges.


Sedlacek, W. E. (2004). Beyond the big test: Noncognitive assessment in higher education. San Francisco: Jossey-Bass.


Appendix A

Cover Letter

August 2004

The Ohio Association for Developmental Education (OADE) is conducting a survey of developmental education programs and placement procedures across Ohio’s higher education institutions, and your institution’s representation in this study is critical to ensure the accuracy of this research. OADE plans to use the data to prepare a compendium that describes developmental education programs and placement procedures as they currently exist in the effort to better inform practitioners, parents, students, and policymakers about the field of developmental education. The National Center for Developmental Education endorses this study and urges you to participate in this important research.

The final report will be made available through the Association’s web site (www.oade.org) as a PDF (free of charge) and in hard-copy form (for printing and postage costs). This compendium will ultimately serve as (1) a resource for developmental education practitioners, parents and students, and policymakers, and (2) a foundation for future OADE research projects.

Your institution was chosen as part of the population of accredited two- and four-year higher education institutions in Ohio. There are four major sections in the questionnaire, which are as follows: (a) general developmental education; (b) diagnostic assessment and placement procedures; (c) developmental/remedial course work; and (d) student services/academic advising.

We have placed an identification number on the top right-hand corner of the questionnaire that makes it possible for us to identify your institution. There are two purposes for this identification: (1) it will reduce our mailing costs as only those institutions who fail to respond will be sent follow-up letters, and (2) it allows us to integrate these data with the Integrated Postsecondary Education Data System (IPEDS). If for any reason you choose not to be identified, just remove the ID number on the first page of the questionnaire and complete and return the questionnaire. Individual institutions will not be identified in the final report.

Thank you in advance for your efforts at informing developmental education practice in Ohio through your participation in this research. Please return the completed questionnaire as soon as possible. If you have any questions, please contact Lana Evans, Research Chair, via e-mail at levans@northweststate.edu or by phone at 419-267-5511, extension 225, or David Haiduc (Research Committee member) via e-mail at David.Haiduc@tri-c.edu or by phone at 216-987-2515.

Sincerely,

Kathleen Buttermore
President
Ohio Association for Developmental Education

Hunter Boylan, Ph.D.
Executive Director
National Center for Developmental Education


Appendix B

2004 Statewide Study of Developmental Education Programs and Placement Procedures Across Ohio’s Higher Education Institutions

Sponsored and Supported by:

Ohio Association for Developmental Education www.oade.org

August 2004

Authors: Lana Evans and David Haiduc

Endorsed by:

National Center for Developmental Education www.ncde.appstate.edu/

Mailing Address:

Lana Evans

Northwest State Community College

22600 State Route 34

Archbold, OH 43502-9542


Instructions: Developmental education encompasses policies, procedures, programs, and courses that are designed to determine, support, and enhance students’ cognitive and noncognitive development. This questionnaire is designed to collect data on the types of developmental education courses and programs currently offered across all Ohio higher education institutions. The Ohio Association for Developmental Education (OADE) plans to prepare a compendium describing the myriad types of developmental education programs and practices available at colleges and universities that will serve as a resource for developmental education practitioners, parents, students, and policymakers. Your participation in this research project is critical to ensure that all institutions and programs are represented in the compendium.

It is estimated that it will require approximately 30 minutes of your time to complete this questionnaire. Thank you in advance for your willingness to support informed practice in developmental education in Ohio.

PLEASE RETURN THE COMPLETED QUESTIONNAIRE IN

THE ENCLOSED PRE-PAID ENVELOPE.

Contact Information:

Please provide the name and contact number of the person responsible for completing this questionnaire in the event that we need to clarify response information.

Name:________________________ Title:________________________

Institution Name:______________________ E-mail address:_________________

Telephone number: _____________

RETURN COMPLETED FORM TO:
Lana Evans
Northwest State Community College
22600 State Route 34
Archbold, OH 43502-9542

IF YOU HAVE ANY QUESTIONS, CALL:
Lana Evans (Northwest State Community College)
Phone: (419) 267-5511, Ext. 225 (8:00 a.m. to 4:30 p.m.)
Fax: (419) 267-5692

General Developmental Education Information³

1. How would you categorize your Developmental Education program/services?

☐ Primarily centralized (courses/support services offered through a division or center)
☐ Primarily decentralized (courses/support services offered through separate academic departments)

Diagnostic Assessment and Placement Procedures

Cognitive Assessment

2. Which of the following cognitive assessment instruments or characteristics does your institution use for students' initial assessment/placement into courses? Please check all that apply.

☐ ACT
☐ SAT
☐ ACCUPLACER
☐ COMPASS
☐ ASSET
☐ Nelson-Denny
☐ Test of Adult Basic Education
☐ Test developed by your institution
☐ High School Grade Point Average
☐ Other _______________________________________________________________________

Noncognitive Assessment⁴

³ The term developmental education as used here is based on the National Association for Developmental Education definition, as follows: "…Developmental education programs and services commonly address academic preparedness, diagnostic assessment and placement, development of general and discipline-specific learning strategies, and affective barriers to learning….includes but is not limited to all forms of learning assistance, such as tutoring, mentoring, and supplemental instruction; personal, academic, and career counseling; academic advisement; and course work." (http://www.nade.net/A1.%20de_definition.htm)

⁴ The term "noncognitive" as it is used here "refers to variables relating to adjustment, motivation, and student perceptions" (Sedlacek, 2004, p. 7).


3. Do you measure students' noncognitive characteristics/variables?

☐ Yes
☐ No (Skip to question #6)

4. Are the results from the noncognitive assessments used to make: (Please check all that apply)

☐ Admission decisions
☐ Decisions regarding academic support services required to facilitate student success
☐ Other (Please list): ___________________________________________________________

5. Which of the following instruments do you use to measure noncognitive characteristics/variables? (Please check all that apply)

☐ Learning and Study Strategies Inventory
☐ Canfield Learning Styles Inventory
☐ Study Behavior Inventory
☐ Non-Cognitive Questionnaire
☐ Other _____________________________________________________________________

6. Please list below any other instruments you are aware of that purport to measure noncognitive variables:

_____________________________________________________________________________________
_____________________________________________________________________________________

Placement Procedures & Policies

7. Placement into remedial/developmental course work is mandatory based on cognitive assessment in which of the following categories (Please check all that apply):

☐ Math   ☐ Reading   ☐ Writing   ☐ Not Applicable

8. Students self-assess their skills (i.e., no placement testing occurs) in which of the following categories (Please check all that apply):

☐ Math   ☐ Reading   ☐ Writing   ☐ Not Applicable

9. At your institution students can bypass remedial coursework:

☐ With permission
☐ Without permission
☐ Not permitted to bypass

10. Placement into programs/services that support students' noncognitive development is mandatory based on assessment results.

☐ Yes   ☐ No


11. Which of the following services are offered to support students' noncognitive/affective development?

☐ Workshops (study skills, time management, college success, etc.)
☐ Orientation (mandatory)
☐ Orientation (not mandatory)
☐ Mentoring/coaching program
☐ Tutoring
☐ Support groups
☐ Learning communities
☐ Other ______________________________________________________________________

Developmental/Remedial Course Work⁵

12. Does your institution offer remedial course work as defined in Footnote #3?

☐ Yes   ☐ No

13. Please check below the reason(s) your institution does not provide developmental/remedial course work. (Please check all that apply)

☐ Students attending this institution do not need developmental/remedial course work
☐ Institutional policies prohibit offering developmental/remedial course work
☐ Students complete developmental/remedial course work at another institution

(Now skip to question #19)

14. Developmental/remedial course work is offered in the following areas:

☐ Math   ☐ Reading   ☐ Writing   ☐ Other _______________________

15. How many levels of developmental/remedial coursework does your institution offer in each of the following subject areas?

Writing:          ☐ 1   ☐ 2   ☐ 3   ☐ 4
Reading:          ☐ 1   ☐ 2   ☐ 3   ☐ 4
Mathematics:      ☐ 1   ☐ 2   ☐ 3   ☐ 4
Other course(s):  ☐ 1   ☐ 2   ☐ 3   ☐ 4

16. What time of day does your institution offer remedial courses? (Please check all that apply):

☐ Day   ☐ Evening   ☐ Weekends

17. Developmental/remedial courses are offered (Please check all that apply):

☐ At a distance   ☐ Web enhanced   ☐ Classroom only

18. Are students restricted from taking college-level courses while enrolled in developmental/remedial course work?

☐ Yes   ☐ No

19. Does your institution have formal/informal agreement(s) with other institution(s) to provide developmental/remedial course work? If no, skip to #21.

☐ Yes   ☐ No

⁵ Developmental/remedial course work is defined as "courses in reading, writing, or mathematics for college students lacking those skills necessary to perform college-level work at the level required by your institution." (National Center for Education Statistics, 1995, p. 1 of questionnaire: Remedial Education in Higher Education Institutions)


20. Which type of institution is typically the provider of your institution's developmental/remedial course work (either on a formal or informal basis)?

☐ Public 2-year college    ☐ Private 2-year college
☐ Public 4-year college    ☐ Private 4-year college

21. Which of the following learning assistance/facilitation programs is offered by your institution? (Please check all that apply)

☐ Tutoring (In-person)
☐ Tutoring (On-line)
☐ Content-specific Labs
☐ Academic Advising
☐ Mentoring/Coaching
☐ Adult Basic Literacy Education
☐ Supplemental Instruction (SI)
☐ Career Counseling
☐ English for Speakers of Other Languages (ESOL)
☐ Freshman Seminar Course
☐ Orientation (Short-term)
☐ Orientation (On-going)
☐ Personal Counseling
☐ Workshops (Time management, college success, computer skills, etc.)
☐ Learning Communities

Student Services/Academic Advising

22. At your institution, academic advising services are provided by:

☐ Full-time Academic Advisors
☐ Part-time Academic Advisors
☐ ____________ of this institution

23. Academic advisors are also required to be Licensed Professional Counselors.

☐ Yes   ☐ No

24. Are your academic advisors formally trained? If yes, please complete question #25.

☐ Yes   ☐ No

25. Academic advisors are trained using: (Please check all that apply)

☐ Materials developed "in-house"
☐ Materials purchased from an external source

Services for Students with Disabilities

26. What services does your institution offer for students with disabilities? (Please check all that apply)

☐ One-on-One reader
☐ Other _______________________________________________________

27. What documentation does your institution accept to provide a student with accommodations?

☐ Multi-Factored Evaluation (MFE)
☐ Documentation listing appropriate test scores and recommended accommodations

Thank you for your participation!


Table 1

Distribution of Ohio Higher Education Institutions by 2000 Carnegie Classification

Category ᵃ                                        Number of Institutions    Percentage Distribution
Doctoral/Research Universities                             13                       7.6
    Doctoral/Research Universities-Extensive                6                       3.5
    Doctoral/Research Universities-Intensive                7                       4.1
Master's Colleges and Universities                         16                       9.5
    Master's Colleges and Universities I                   12                       7.1
    Master's Colleges and Universities II                   4                       2.4
Baccalaureate Colleges                                     30                      17.6
    Baccalaureate Colleges-Liberal Arts                    10                       5.9
    Baccalaureate Colleges-General                         16                       9.4
    Baccalaureate/Associate's Colleges                      4                       2.4
Associate's Colleges                                       82 ᵇ                    48.2
Specialized Institutions                                   29                      17.1
Total                                                     170                     100.00

ᵃ The number of institutions by category includes public, private not-for-profit, and private for-profit institutions.
ᵇ The number of Associate's Colleges includes 13 branch campuses counted as separate and distinct institutions.
Note: Tribal Colleges are excluded as there are no Tribal Colleges located in Ohio.


Table 2

Distribution of Ohio Higher Education Institutions by 2000 Carnegie Classification and Control

Category                                     Public    Private,          Private,      Percentage
                                                       Not-For-Profit    For-Profit    Distribution
Doctoral/Research Universities-Extensive        5             1               0              3.5
Doctoral/Research Universities-Intensive        5             2               0              4.1
Master's Colleges and Universities I            1            11               0              7.1
Master's Colleges and Universities II           0             4               0              2.4
Baccalaureate Colleges-Liberal Arts             1             9               0              5.9
Baccalaureate Colleges-General                  1            14               1              9.4
Baccalaureate/Associate's Colleges              4             0               0              2.4
Associate's Colleges                           41 ᵃ           4              37             48.1
Specialized Institutions                        3            25               1             17.1
Total                                          61            70              39            100.00

ᵃ The number of Associate's Colleges includes 13 branch campuses counted as separate and distinct institutions.
Note: Tribal Colleges are excluded as there are no Tribal Colleges located in Ohio.


Figure 1. Institutions responding by Carnegie Classification.

[Bar chart]
Associate's Colleges: 52.35%
Baccalaureate Colleges: 20.59%
Master's Universities: 11.76%
Doctoral/Research Universities: 7.65%
Medical School: 2.94%
Arts Institution: 2.35%
Religious Studies Institution: 1.76%
Specialized Masters Only: 0.59%

Figure 2. Instruments used to measure cognitive skills.

[Bar chart]
ACT: 22.86%
COMPASS: 17.14%
SAT: 15.00%
Other: 11.43%
Own institution's test: 11.43%
HS GPA: 10.71%
ASSET: 5.00%
TABE: 2.86%
ACCUPLACER: 2.14%
Nelson-Denny: 1.43%


Figure 3. Institutions that use self assessment for placement.

[Pie chart; categories: Math, Reading, Writing, Not applicable; slice values: 63%, 11%, 13%, 13%]


Figure 4. Percent of institutions where placement into courses is mandatory based on cognitive assessment.

[Bar chart]
Math: 34.92%
Reading: 24.60%
Writing: 33.33%
Not Applicable: 7.14%


Figure 5. Percent of institutions that permit students to bypass remedial course work.

[Pie chart; categories: With permission, Without permission, Not permitted to bypass; slice values: 33%, 52%, 15%]


Figure 6. Percent of institutions who do or do not measure students' noncognitive characteristics.

[Pie chart; categories: No, Yes; slice values: 27%, 73%]


Figure 7. Instruments used to measure noncognitive characteristics.

[Bar chart]
Other: 37.50%
Learning and Study Strategies Inventory: 31.25%
Study Behavior Inventory: 18.75%
Noncognitive Questionnaire: 12.50%
Canfield Learning Styles Inventory: 0.00%


Figure 8. Offices in which noncognitive characteristics are used to make decisions regarding support services and programs.

[Bar chart]
Admissions: 7.69%
Academic Support Svcs: 38.46%
Other: 23.08%
Admissions and Support Svcs: 23.08%
Support Svcs and Other: 7.69%


Figure 9. Placement into services mandatory based on noncognitive assessment.

[Pie chart; categories: No, Yes; slice values: 19.23%, 80.77%]


Figure 10. Support services offered to students for noncognitive development.

[Bar chart]
Tutoring: 25.25%
Workshops (study skills, college success): 20.79%
Orientation (mandatory): 14.85%
Orientation (not mandatory): 11.39%
Mentoring/coaching program: 9.90%
Learning communities: 6.44%
Other: 5.94%
Support groups: 5.45%
