Criterion 3: Student Learning and Effective Teaching

The Organization provides evidence of student learning and teaching effectiveness that demonstrates it is fulfilling its educational mission.

Introduction

Purdue University is committed to using learning assessment to refine our curriculum at all levels--university, college, and departmental. At all levels, there is recognition of the fundamental importance of a flexible model that allows learning assessment to be a normal part of our procedures. The following chart shows how the three components of that model function together:

[Chart: Student learning outcomes -- Instructional activities -- Instructional assessment]

At the University level, Purdue has learning goals (drafted in 1989 and approved by the faculty) that articulate faculty expectations:

Students at Purdue are expected to acquire knowledge, develop the abilities to assess what they learn, and apply it effectively. To accomplish this, they must be able to read and think critically and to communicate--both orally and in writing--with clarity and precision. Developing competence in quantitative and scientific reasoning is equally necessary. They also must become aware of the cultural, social, political, and economic forces and the technologies that shape our world. In their area of specialization, Purdue students at all levels are expected to achieve depth of understanding of both the essential content and principal modes of inquiry and to become familiar with the ethical issues facing their chosen fields. A Purdue education should prepare them for a lifetime of continual learning.

These core values are embedded in the curricula across campus. The responsibility for defining the curriculum for each undergraduate program rests with the faculty members in each individual college. At the graduate level, the Graduate School and the faculty within the colleges share the responsibility for defining the curriculum for graduate programs.

Core Component 3a: The organization's goals for student learning outcomes are clearly stated for each educational program and make effective assessment possible.

To be confident that appropriate assessment is taking place in all parts of Purdue's campus at both the graduate and undergraduate levels, there is a campuswide Director of Assessment. The Assessment Director leads the Student Learning Outcomes Assessment Workgroup (SLOAW), a group that represents all areas of the university and meets regularly to communicate about assessment.

The organization clearly differentiates its learning goals for undergraduate, graduate, and post-baccalaureate programs by identifying the expected learning outcomes for each. It is easy for anyone with Web access (faculty, administrators, staff, students, or state legislators) to view the learning outcomes for all of the undergraduate, graduate, and post-baccalaureate programs on campus with a few clicks of the mouse. The SMART site (Student Measurement and ...) provides an efficient way of articulating learning outcomes, evaluating student progress, and facilitating curricular changes that lead to better student learning.

Purdue has joined the Higher Learning Commission's Assessment Academy, resulting in a four-year plan for assessment.

Graduate Assessment

At Purdue University, all graduate degrees are granted by the Graduate School, regardless of the program in which a degree was earned, so there are shared requirements that graduates of all programs must meet. These are:
1. Knowledge and scholarship: Graduate students of Purdue University Ph.D. programs will be able to demonstrate the ability to identify and conduct original research, scholarship, or creative endeavors.

2. Communication: Graduate students of Purdue University Ph.D. programs will be able to demonstrate the ability to communicate their field of study effectively.

3. Critical thinking: Graduate students of Purdue University Ph.D. programs will be able to demonstrate the ability to think critically and creatively and to solve problems in their field of study.

4. Ethical and responsible research: Graduate students of Purdue University Ph.D. programs will be able to demonstrate the ability to conduct research in an ethical and responsible manner.

These outcomes were established after wide discussion with the faculty, and the faculty in each graduate program can establish additional rules and expectations for their graduate students. Minutes documenting such discussion are available at http://www.edci.purdue.edu/misc/Minutes/Grad/Minutes/2007/Graduate_Student_Learning_Outcomes%20.rtf

Department faculty have developed student learning outcomes for the MS, MA, and MFA degrees. Many colleges and departments used the Ph.D. student learning outcomes as the starting point for their master's outcomes. The most significant changes from Ph.D. outcomes to master's (with thesis) outcomes were in the "knowledge and scholarship" outcome. The typical Ph.D. student is expected to define his or her own research question and to conduct original research. Neither is usually required of the master's (with thesis) student, and that difference is reflected in the first student learning outcome for those degree programs. The master's nonthesis program varies widely from college to college and even among departments within a college. Therefore, more variation is expected, and development of those outcomes has been left to the faculty members in each college.

Undergraduate Assessment

Purdue has more than 200 distinct undergraduate majors, and for each one, faculty members have determined learning outcomes, activities that assess those outcomes, and a process for determining appropriate curricular changes. The University's average class size across all programs is 24 students.

Assessment of student learning provides evidence at multiple levels: course, program, and institutional.

Assessment at Purdue takes place at all of these levels, at both the undergraduate and graduate level. The following examples, beginning with the College of Pharmacy, Nursing, and Health Sciences, illustrate this.

Example 1 (program and course levels--undergraduate): Ongoing outcomes assessment in the undergraduate nursing program includes, among other criteria, the use of pass rates on the National Council Licensure Examination (NCLEX). In 2002-2003, the scores for graduates of the School of Nursing were just below the national rate. A comprehensive review of admission, progression, and readmission data, grades in key courses, and student exposure to seasoned faculty revealed that a change to a centralized rolling admissions process, ambiguity in the readmission policy, low grades in the biology and pathophysiology courses, and multiple repeats of core nursing courses coincided with this cohort. In addition, a gap between junior-year courses (whose content is heavily tested on the NCLEX) and the timing of the exam was examined. Student exit surveys and employer surveys also identified strong interest in a capstone course in the senior year. There was no correlation between this cohort of students and inexperienced teachers.
These data prompted the school's curriculum committee to recommend the institution of a clinical capstone course at the senior level. In addition, faculty approved the integration of a computer-based NCLEX preparation program into core courses in each of the four years of the undergraduate curriculum. The student affairs committee reviewed and revised the admission, progression, and retention policy to require a higher level of performance in core science courses prior to progression to junior-level nursing courses, and it clarified the ambiguity in the readmission policy. Faculty approved these policy changes. Concurrently, conversations were held with Admissions staff to share findings, clarify the desired criteria, and discuss the rolling admissions process. As a result of these actions, the quality of candidates admitted to the School of Nursing improved (SAT scores up 7.5%), and the capstone course was offered as an elective to all students in the program. Enrollment has increased by 400% since the first offering, and beginning in 2009 all senior students have the capstone as part of their required program of study. In addition, the school supports faculty participation in development activities to improve teaching effectiveness; these programs are offered both on campus and in the school. The most recent NCLEX pass rate reflects a 13% increase (to 97.2%), indicating that the curricular changes made as a result of assessment have led to better learning by the students.

Example 2 (program and course levels--graduate): Learning assessment has also been used successfully at the graduate level. A traditional "high stakes" written examination administered at the end of the graduate student's second year in the Ph.D. program of the Department of Medicinal Chemistry and Molecular Pharmacology was replaced with a cumulative-type examination that begins during the spring semester of the student's first year. First-year graduate students are expected to take their first examination in February of their second semester. At each examination period, one question is offered from each of three general areas: 1) Chemistry and Chemical Biology, 2) Quantitative and Analytical Sciences, and 3) Molecular and Cellular Biology. The student selects only one of the three questions to answer, with no restrictions on which question may be chosen. The written assessment provides a platform for development of critical thinking and problem-solving skills based on the analysis of contemporary literature in the areas of medicinal chemistry and molecular pharmacology. The students are challenged to read a paper from the literature and then respond to a series of questions related to the paper (or the topic covered by the paper). The questions are designed specifically to examine the students' ability to critically analyze scientific data, solve problems, and design experiments. The assessment provides for a high pass, pass, or no pass designation. After each examination, students who did not receive a high pass are strongly encouraged to meet with the faculty mentor who authored the exam question that they selected. During this meeting, the student receives direct feedback and guidance. This format has several advantages for student learning. Because the assessment begins in the student's first year, it encourages early and regular engagement with the scientific literature.
Instead of simply assessing student knowledge (as the previous "high stakes" written examination did), the questions are designed to explore important student learning outcomes such as problem solving and critical thinking. Over the course of several examinations, the students have the opportunity to discover and learn how to respond to the questions being asked. The students also have an opportunity to meet with the faculty following the examination to explore answers to the questions. The cumulative nature of the examination format allows the faculty to assess student progress and work with students to enhance outcomes.

Example 3 (course and institutional level): Another way that Purdue has utilized technology and maximized teaching is by making appropriate changes in the way that writing instruction is delivered. After careful evaluation--including the realization that many students were not taking the full six hours of writing instruction--and a review of best practices across the country, the Introductory Writing Program of the Department of English redesigned first-year composition in 2001. The former English 101 (3 hours) and English 102 (3 hours) have become a four-hour course (English 106), organized with two lecture classes, one class held in a computer classroom, and up to two hours of individual and small-group conferences per week. The new curriculum emphasizes persuasive writing and has a number of benefits for Purdue undergraduates. First, the new English 106 acknowledges the importance of one-to-one and small-group interaction in writing courses. Because students are required to meet regularly with instructors to receive feedback and advice about their writing, they become better writers more quickly. The move to individualized instruction also included lowering class size--a change strongly endorsed by writing programs across the US. Second, Purdue students now receive instruction in using writing technology, since each class spends time each week in a computer classroom. Third, all undergraduates now receive instruction in research and academic writing. Formerly, many students were not required to take English 102, thereby short-circuiting the learning goals that were established by the English faculty. The new course assures faculty that all students have received instruction in these skills. The change from English 101 and 102 is a prime example of how Purdue faculty at the individual course level gather feedback, use best practices, and work together to improve learning.

Example 4 (institutional level): The College of Science redesigned its core curriculum in 2006. With considerable input and involvement from faculty, advisors, students, and alumni, the Task Force identified six primary learning outcomes for Science undergraduates:

– Demonstrated depth in the major
– Ability to think and function as a scientist
– Ability to communicate well, both orally and in writing
– Ability to collaborate as part of a team
– Ability to function in a multidisciplinary setting
– Demonstrated breadth of knowledge and cultural appreciation

The College of Science then proposed new curricular and co-curricular models to meet those outcomes and developed strategies for assessment:
http://www.science.purdue.edu/faculty_staff/committees/undergrad_task_force/index.asp

Assessment of student learning includes multiple direct and indirect measures of student learning.

There are multiple ways that groups across campus measure student learning.
Each degree-granting program has identified desirable student learning outcomes, which include mastery of specific knowledge, processes, and skills as well as development of attributes and attitudes such as a dedication to lifelong learning. Direct measures include scores on exams, both in class and on professional examinations required for licensure in some disciplines; performance on class projects such as those in capstone courses; and acceptance of papers submitted to conferences and refereed journals. Indirect measures include surveys of students before and after a course to determine how much they believe their skills have improved as a result of the course, and surveys of alumni and employers asking them to rate Purdue University graduates' skills on a scale of 1 to 5.

The previous section showed many direct measures of student learning. Other indirect measures are routinely used across campus. For example, Purdue administered the National Survey of Student Engagement (NSSE) in both 2004 and 2007, and plans to administer it every three years. All of the NSSE questions have been mapped to HLC/NCA criteria, allowing us to use them for curricular change. The results are shared widely at the University level (with deans, department heads, and directors at the President's Forum, for example) and distributed to stakeholders. Full results for Purdue, including reports and common data sets, are available on the Web: http://www.purdue.edu/OIR/resources.htm

Another good example of indirect data that is collected and disseminated to stakeholders is the information gathered from graduating students and prospective employers about salaries and expectations. All of this information is on the Center for Career Opportunities Web site, including salary data by college for the past three years: https://www.cco.purdue.edu/About/PostGradData.shtml

Writing is a skill that is valued across campus, and Purdue has established effective ways of improving student writing. The Writing Lab, located in Heavilon Hall, had more than 2,500 individual users during the 2006-07 academic year, aiding writing skills across campus. Establishing a satellite location in Meredith Hall (a residence hall) increased the number of students who are helped each year. Beyond the face-to-face visits, Purdue is committed to teaching and learning with technology. The Purdue University Online Writing Lab (OWL) had almost 85 million hits in 2006-07. The site serves more than 125 countries and generates almost 5,500 e-mail responses each year. Information about how the OWL aids student writing is available here: http://owl.english.purdue.edu/

At the graduate level, abundant information is collected about programs, including direct evidence such as performance on the preliminary examination, the dissertation proposal defense, and the dissertation defense. There are also indirect measures, such as a survey of graduating students about job placement and a reporting of recognitions and awards.
The Graduate School also helps the departments by providing five-year trends on:

– The size of the graduate program;
– The number of graduate teaching and research assistants per department;
– The number of fellowships by department;
– The average duration of funding for master's and doctoral students;
– The number of graduate students per faculty member;
– The composition by gender and race/ethnicity;
– The number of theses and dissertations;
– GRE or other test scores and collective grade point averages;
– Stipend levels for MS and PhD students;
– Retention rates for both master's and doctoral students;
– Average time to degree completion;
– Degrees granted per year for the last five years (MS and PhD);
– Plans of Study filed; and
– Preliminary examinations passed.

This information can help departments assess their programs and find ways to help students.

Results obtained through assessment of student learning are available to appropriate constituencies, including students themselves.

The constituencies we have identified include students, prospective students, faculty, alumni, and prospective employers. The BALOTS Web site shows the assessment results for all programs. In addition, the university has joined a national system, the Voluntary System of Accountability (VSA), which currently includes more than 40% of the combined membership of NASULGC and AASCU. The College Portrait on this site will afford our constituencies the opportunity to view the outcomes of students along with other information such as costs.

Undergraduates

Students, prospective students, faculty, alumni, and employers can find information about organizations that recruit and hire Purdue graduates at the Center for Career Opportunities (CCO) Web site: https://www.cco.purdue.edu/. The CCO is among 14 student services areas that participate in the Student Importance and Satisfaction Survey, facilitated by the Office of the Vice President for Student Services. The survey items include interviewing with employers on campus, Web site resources, career programs, and career exploration/job search consultation; the survey uses a "gap analysis" technique to compare the average satisfaction index with the average importance index. The 2007 survey report can be found at http://www.purdue.edu/VPSS/plan_assess/studentsurvey/2007/Welcome.php. The employer survey is a hard-copy form distributed to recruiters who interview students through the CCO. The purpose of this feedback is to identify the nature of the employer organizations' relationships with Purdue, and it includes, among other things, feedback on their experience with students.

Specific undergraduate programs participate in accreditation processes, and the outcomes of these are published. For example, all programs in the College of Engineering have posted ABET results (e.g., information on the outcomes of the Nuclear Engineering program can be located at https://engineering.purdue.edu/NE/ABET/Outcomes/effectiveness_importance/effectiveness_importance). The College of Science significantly changed its core as a result of student and faculty input, as well as input from its advisory groups. The old and new cores are posted here: http://www.science.purdue.edu/core/index.asp.
The College of Education has Praxis results for each of its education majors; these are included in the annual report of the Office of Professional Preparation and Licensure.

The Graduate School keeps records and disseminates to programs the following types of important material:

Direct evidence
– Performance on the preliminary examination
– Dissertation proposal defense
– Dissertation defense

Indirect evidence
– Survey of graduating students
– Job placement
– Recognitions/awards

Data provided by the Graduate School for all departments (five-year trend)
– Size of the graduate program (number of graduate faculty, MS students, and PhD students)
– Graduate teaching and research assistants per department
– Number of fellowships by department
– Average duration of funding for master's and doctoral students
– Number of graduate students per faculty member (as major professor), whether master's or doctoral, and the advisory committee assignments
– Gender and race/ethnicity composition
– Number of theses and dissertations over five years
– GRE or other test scores and collective grade point averages
– Stipend levels for MS and PhD students
– Retention rates for both master's and doctoral students
– Average time to degree completion
– Degrees granted per year for the last five years (MS and PhD)
– Plans of Study filed
– Preliminary examinations passed

The organization's assessment of student learning extends to all educational offerings, including credit and noncredit certificate programs.

All educational offerings are assessed as to student learning, including credit and noncredit certificate programs. For example, graduate certificate programs are approved by the Graduate Council. The council expects that each school developing or administering a graduate certificate program will incorporate the four general overarching outcomes categories (knowledge and scholarship, communication, critical thinking, and ethical and responsible research) in specific, measurable ways as pertinent to the specific master's certificate programs.

The undergraduate certificate in entrepreneurship and innovation (15 credit hours) has been developed over the last three years. This program is similar to a minor and is open to students in any discipline. The impetus for this program came from former President Martin Jischke, who saw the value of entrepreneurial skills for students in all majors, not just management. It coincided with the expansion of Discovery Park, a place for interdisciplinary research. The goal was for 1,000 students to participate within three years. This goal has been met, with 1,100 students completing the first course in the certificate program. Forty students had graduated as of December 2007. Because Discovery Park is not an academic unit, the Krannert School of Management provides administrative support. A faculty advisory committee composed of a representative from each college works with outside evaluators to determine the effectiveness of the program. Data from the assessment program include student course assessments, student exit surveys, participation in competitions (business plans), and employer surveys following the capstone course.

Study Abroad certificate programs are offered by overseas institutions for a concentration of courses. The assessment completed by the Study Abroad program is on behalf of the academic units.
Study Abroad staff make arrangements for the course evaluation and use a few questions to monitor program quality (logistics), but the data go to the instructor.

The organization integrates into its assessment of student learning the data reported for purposes of external accountability (e.g., graduation rates, passage rates on licensing exams, placement rates, transfer rates).

Purdue University incorporates data from external sources into its assessment activities in many ways. For instance, Engineering students first take the Engineer in Training (EIT) exam and then the Professional Engineer (PE) exam (which is taken later in the person's career). The pass rate at Purdue University has been consistently about 95-100% through the years; it was 90% in 2008.

Another example is found in the College of Education (COE). The COE Title II reports from 2001 to 2008 show results for Purdue students for those years: http://www.education.purdue.edu/oppl/2002/title2/

The 2006-2007 Annual Report from the College of Education Office of Professional Preparation and Licensure (OPPL) includes information on pages 15-20 of the report (16-21 on the PDF navigation bar) about three licensure tests: the Praxis I basic academic skills assessment for admission into Teacher Education at Purdue (all participating colleges: Agriculture, Consumer and Family Sciences, Education, Liberal Arts, Science, and Technology); the Praxis II subject area and specialty tests for licensure; and the School Leaders Licensure Assessment (SLLA) for administrative licenses. Program completer pass-rate data for 1999-2000 through 2005-2006 are included on report pages 18-19. In most cases, the pass rates are at or close to 100%. In virtually all cases where the pass rate is below 90%, that percentage is based on fewer than 10 individual results.

Yet another example comes from the veterinary technician program. Graduates must pass the Veterinary Technician National Examination (VTNE) in order to be licensed, certified, or registered in each state. The statistics for the past eight years are shown below; Purdue has one of the highest pass rates on the VTNE in North America.

Year of January exam | Purdue candidates | Number passed | % passed | Purdue average score | Average score, all test takers
2000 | 28 | 28 | 100.0% | 585 | 529
2001 | 25 | 25 | 100.0% | 574 | 521
2002 | 27 | 26 | 96.3% | 577 | 524
2003 | 29 | 29 | 100.0% | 600 | 521
2004 | 30 | 30 | 100.0% | 610 | 527
2005 | 27 | 27 | 100.0% | 599 | 537
2006 | 27 | 27 | 100.0% | 565 | 500
2007 | 24 | 22 | 91.7% | 563 | 483

At the graduate level, an example is the Doctor of Pharmacy (Pharm.D.) program, whose graduates take two national licensing examinations. The exams and pass rates for the past four years are as follows:

North American Pharmacist Licensure Examination (NAPLEX): 2007 = 97.83%; 2006 = 95.05%; 2005 = 96.30%; 2004 = 97.30%

Multistate Pharmacy Jurisprudence Examination (MPJE): 2007 = 96.84%; 2006 = 95.11%; 2005 = 96.00%; 2004 = 90.54%

In the School of Health Sciences, the pass rate for the National Accrediting Agency for Clinical Laboratory Sciences (NAACLS) examination taken by Medical Technology graduates has been 100% for the past several years.

Faculty are involved in defining expected student learning outcomes and creating the strategies to determine whether these outcomes are achieved.

Determining whether the student learning outcomes are being achieved is perhaps the most important part of our task, and one we take very seriously. Faculty are involved on committees, in department meetings, and indeed in each step.
All of the examples are available on the BALOTS Web site, but the following three examples illustrate specific work done in departments.

Learning outcomes in the Department of Food Science are specified by the professional organization, the Institute of Food Technologists (IFT). The department created a mapping of the learning outcomes in each course to those required by the IFT. The resulting analysis indicated that not all of the learning outcomes could be achieved through the core curriculum alone; in order to realize all of the desired outcomes, students needed to take some of the elective courses. As a result, some of the elective courses were included in the core. In addition, the department began to survey alumni and their employers annually to determine how well those groups believe the students are meeting the learning outcomes. Courses and the curriculum are modified as necessary to ensure that students are well prepared for their jobs.

The Master of Science in Management was established in 1972 and renamed the Master of Business Administration in 2001. As part of the regular program review in 2003-04, a faculty committee studied the MBA options offered at other schools. In addition, the committee reviewed feedback from focus groups of graduating students and data gathered by the Graduate Career Services Office. Three areas of interest and growth were identified by all three sources of information. As a result, the Krannert School of Management began to offer options in all three areas: Analytical Consultation, Global Supply Chain Management, and Technological Innovation and Entrepreneurship.

The Doctor of Veterinary Medicine professional degree program was established in 1959. In recent years, the American Veterinary Medical Association, the accrediting body for veterinary medicine, published a list of clinical competency outcomes. Among those outcomes was mastery of anesthesia. Students in Purdue University's program were required to study the principles and practice of anesthesiology; however, they were not required to take anesthesia during their final, clinical year, and because of the limited number of faculty members in that area, the clinical course was not offered as an elective until 1999-2000. Exit interviews of graduating seniors who had taken the elective indicated that the course was valuable, and over a period of four years, graduating seniors overwhelmingly indicated that they thought the course should be a requirement. The faculty took this input into account, and beginning with the senior class graduating in May 2008, the clinical course in anesthesiology is required.

Faculty and administrators routinely review the effectiveness and uses of the organization's program to assess student learning.

There is a process for routinely reviewing the effectiveness of assessment at Purdue University. The chart at the beginning of this section showed the components of assessment. Program student learning outcomes define what a student will be able to know and do upon completion of the program. Instructional activities provide students with meaningful opportunities to achieve a stated learning outcome. Instructional assessments allow faculty to monitor student learning and provide feedback about the effectiveness of the instructional activities for achieving a stated learning outcome.
To make sure that these three parts of Purdue University's assessment plan work in conjunction with one another, the SLOAW group will review the assessment process each summer; the specific procedure for this review is still being determined by the group.

Another way that the University has had faculty examine their assessment practices is by bringing experts to campus to talk about different ways of assessing student learning. For example, Tony Angelo was an invited speaker, and Trudy Banta has also spoken to the faculty about assessment.

Conclusion

The assessment of student learning is an integral part of what Purdue faculty members do on a regular basis, and it forms a three-part feedback loop: learning goals are articulated, activities to achieve the goals are implemented, and appropriate changes, as evidenced by assessment, are made. The SMART Web site makes this three-step assessment process more transparent and allows faculty members to see where they have made progress, as well as to focus more directly on places that need more attention. Purdue has an obligation to its students to evaluate its strengths and weaknesses in relation to student learning, and this is a commitment that has been realized.

For phase two of our project this coming fall:

Strengths

Concerns: Few if any departments have set international learning outcomes. A few colleges, like Science and Agriculture, have international/intercultural requirements in the curriculum, but these are only now getting into individual courses as learning outcomes. Our assessment attempt is an early precursor to what will have to happen campus-wide. As for recruiting and orienting students to course outcomes, these are faculty courses; the Study Abroad program acts as the facilitator.

Issues for the Future

1. Service Learning
2. EPICS
3. Information about technology--labs and library