University of Colorado at Boulder
CCHE Quality Indicator System for funding for 2001-02
Indicator 16, Graduation-year assessment plan
Assessing graduating seniors’ knowledge and skills in the major field
November 2000

CCHE request

Description: The assessment program should build upon existing institutional, college, department, or program assessment and shall measure the student's knowledge and skills in his/her major field, vocational, or training area. Nationally normed major field tests should be used whenever available and applicable to the institution's program. If a national normed major field test exists and is being utilized by similar institutions across the United States, an explanation and justification for its non-utilization by the Colorado institution must accompany the materials submitted to the CCHE. Portfolios of accomplishment and/or demonstrations of competency may be used. Sampling of students and a spreading of the number of degree programs over several years may be considered.

Measures, Data, Documentation: Institutional graduation assessment programs, submitted by the respective governing board, must be received by CCHE no later than November 24, 2000. Programs may be piloted in spring and summer 2001 with full implementation thereafter.

History of undergraduate outcomes assessment at CU-Boulder

CU-Boulder has a long history of assessing undergraduate educational outcomes. In 1985, the Colorado State Legislature passed House Bill 1187, which established accountability requirements for higher education in the state. The statute required institutions to assess undergraduate student “knowledge, capacity, and skills,” and to report results yearly to CCHE, which in turn summarized the institutions’ reports for the legislature. HB 1187 allowed institutions until fall 1989 to develop their assessment programs, with the first data to be reported for academic year 1989-90.

In response to HB 1187, CU-Boulder developed a comprehensive and continuing undergraduate assessment program. The policy governing this program was written in AY 1986-87 by a “blue ribbon” faculty and administrative committee appointed by the Chancellor, and was approved by him in March 1988. The committee’s premise was that the outcomes assessment program should help individual academic units (i.e., departments, degree-granting programs, and schools and colleges without a department structure) evaluate their curricula, instruction, and student services; plan improvements where necessary; and then evaluate the effects of any changes.

The policy statement mandates assessment of both general education and education in the major discipline. It further specifies that all units will explicitly state goals for undergraduates in terms of skills, knowledge, and/or capacities, and that they will examine programs in light of those goals, choose and/or develop and implement ways to measure their achievement, and use results of assessment to strengthen programs. The policy also specifies that every graduating senior will participate in at least one assessment in addition to those required in coursework.
In 1996, House Bill 1219 updated the old statute and replaced the accountability program with a system of institutional performance indicators, in particular Indicator 8, which concerns “existence and operation of a formal, comprehensive, and effective institutional assessment and accountability program,” and its subsections, which go into more specific detail. The indicator system legislation was updated again in 1999 with Senate Bill 229, with outcomes assessment remaining part of the indicator system. In addition, CCHE’s policy on academic program review requires an ongoing outcomes assessment program, as does the North Central Association of Colleges and Schools, the accrediting body for higher education institutions in our region.

CU-Boulder’s assessment program is built into the administrative structure of the institution. The Associate Vice Chancellor for Undergraduate Education (AVCAA-UGE) and a senior researcher from Planning, Budget & Analysis (PBA) oversee and coordinate the overall process. The AVC provides the financial resources needed. Outcomes assessment is also incorporated into CU-Boulder’s formal program review process (PRP) for academic units. As part of PRP, required for each unit every seven years, internal and external review committees examine the unit’s outcomes assessment process and results, and how the information has been used.

CU-Boulder’s undergraduate programs have published skills and knowledge goals in the university catalog for some years. For example, the statement for economics is as follows:

The undergraduate degree in economics emphasizes knowledge and awareness of: the conditions for efficiency in free market production and exchange; contemporary theories concerning economic growth, inflation, unemployment, distribution of income, and international environment; a few of the specialized fields of economics, such as international economics and finance, natural resources and environment, the economics of gender and discrimination, and public economics; the descriptive statistics commonly used by economists; and the institutional characteristics of the U.S. economy, and how these differ from those in other economies.

In addition, students completing the degree in economics are expected to acquire the ability and skills to: apply the tools of microeconomic theory to reach sound conclusions for simple economic problems; follow arguments concerning macroeconomic theory, distinguish between sound and fallacious reasoning, and understand how differences in policy prescription may arise; perform statistical analysis such as multiple regression and understand similar analyses performed by others; and communicate economic reasoning in writing, understand similar writing by others, and appreciate the diversity of views that may reasonably exist about economic problems.

Each undergraduate program has an assessment coordinator. Programs report on their activities, and on changes made as a result, in alternate years. Longitudinal accounts of activities in each program are posted on the undergraduate outcomes assessment page; see the individual academic unit summaries. The web site has helped gain CU-Boulder a national reputation for comprehensive, quality undergraduate outcomes assessment.
Programs use a variety of assessment methods, including portfolios, questions embedded in course exams, panel reviews of course papers, nationally normed tests, and exit surveys. Many programs have used assessment results to identify and implement changes. For example, Theatre and Dance revised the sequence of courses for majors and added material on theatre history and dramatic literature in the senior seminar. English added a writing component to two introductory courses. Mathematics began requiring students in one of its tracks to take an upper-division modern algebra course.

Why change is needed

With a working assessment process in place for over 10 years, and a national reputation for leadership, why should any changes be made?

It’s time: the process has become routine, pro forma for many programs. It has been running on “automatic pilot,” with no meetings of an oversight committee in some years. At the same time, the process is not defined clearly enough for many coordinators. Some programs have not communicated effectively about their assessment activities and their use of assessment results to improve. In some cases these programs have not assessed outcomes at all; in others the deficit is simply in reporting.

The expectations of external constituencies have changed. The North Central Association of Colleges and Schools (NCA), our accrediting agency, places much more emphasis on assessment now than in the ’80s. NCA has asked for a “progress report on the use of assessment as a tool to improve undergraduate and graduate student learning and for institutional improvement,” due fall 2003. Assessment of graduating seniors’ knowledge and skills in the major field is one of the CCHE QIS indicators for future years, as is assessment of general education goals for lower-division undergraduates.

In recent years students, employers, and parents have become more interested in information about what graduates of a particular program can be expected to know and do. They also expect delivery of this information via the web.

CU-Boulder is paying more attention to accountability and strategic goals in its budgeting process. The introduction of a “unit merit” component in the allocation process may allow real consequences to be attached to the collection and use of assessment information.

Plan to reinvigorate assessment in the major discipline

We plan to revise and enhance our current processes, not replace them. In this section we outline the responsibilities of the several actors involved: the Associate Vice Chancellor for Undergraduate Education (AVCAA-UGE); Planning, Budget, and Analysis (PBA) and its institutional analysis area; a new campus-wide Assessment Oversight Committee; academic programs; and students. In the list of responsibilities below we have noted whether each responsibility is new or continuing, and whether it is ongoing or one-time.

The Associate Vice Chancellor for Undergraduate Education (AVCAA-UGE)

- Recruit and chair the oversight committee. New, ongoing.
- With the VCAA, set the charge for the oversight committee. New, one-time. A preliminary list of items in the immediate charge is given below, under committee responsibilities. All these items focus on assessment of graduating seniors’ knowledge and skills in the major discipline.
  Eventually the charge will be expanded to include undergraduate general education, graduate education, student development, and student satisfaction with university services and life. It will also include reporting to the NCA on assessment activities in all areas.
- Manage the assessment budget, with the advice of the oversight committee. Allocate funds to academic programs and other uses; request new funds as necessary. Continuing, ongoing.
- With PBA, consult with individual academic programs on their assessment activities. New, ongoing.
- Apprise and consult with campus officials and organizations[1] about assessment activities. Obtain any necessary approvals for policies and recommendations made by the committee. New, ongoing.

[1] E.g., faculty government, student government, deans, associate deans, student affairs directors, parent association.

Campus-wide Assessment Oversight Committee (AOC)

The committee is new, ongoing. Note: All activities listed here are relevant to assessment of graduating seniors’ knowledge and skills in the major discipline. As noted above, the committee’s charge will eventually expand to include undergraduate general education, graduate education, student development, and student satisfaction with university services and life, plus reporting to the NCA on assessment activities in all areas. Committee members are listed in Appendix A. The inaugural meeting will be in November or December 2000.

- State requirements for academic programs for assessing graduating seniors’ knowledge and skills in the major discipline, including both ongoing work and periodic reporting. State requirements for documenting use of assessment information in program improvement. Set and state consequences of not meeting these requirements (to date there have been virtually no consequences). A preliminary version of the requirements is listed under academic program responsibilities. Previously these requirements were stated in the campus policy adopted in 1988; a formal revision to the policy may be needed, with regular review and revision as necessary.
- State requirements for assessment activities that are most appropriately administered by a campus-wide unit such as PBA rather than by individual academic programs. Survey research and standardized testing are two candidates. In formulating the requirements, focus on utility for programs with the largest numbers of graduating seniors. A preliminary version of these requirements is listed under PBA responsibilities.
- State requirements for students, if determined necessary. As an example, current requirements for the BS in computer science include (1) course requirements; (2) requirements concerning the total number of credit hours; (3) grades; (4) hours completed on campus; and (5) a requirement to take part in a senior exit exam and questionnaire.
- Solicit and oversee reviews of submissions from academic programs, and of reports from PBA. Provide feedback to academic programs and to PBA, especially about additional opportunities for communicating assessment results and actions, and for using the results in program improvement.
- Develop methods and guidelines for using information from academic programs about their assessment activities in the unit merit component of the Academic Affairs budget allocation process.
  Merit would be judged by the effectiveness of the program’s collection and use of assessment results for program improvement, not by the results themselves.
- Advise the AVCAA-UGE on use of the assessment budget.
- Increase faculty and student awareness of assessment activities and methods, and especially of their use by campus academic programs. Incorporate assessment activities into routine campus processes such as unit merit. Make the use of assessment information more visible on campus, with greater integration into course and curriculum revisions, advising discussions, and other forums focusing on undergraduate education. In doing so, build faculty support for and involvement in assessment.
- As part of increasing awareness, advise on special kick-off events during calendar year 2001. Possibilities include attendance at the annual assessment conference of the American Association for Higher Education, to be held in Denver June 23-26, 2001, and inviting outside experts in for consultations. Names mentioned to date include Peter Ewell of NCHEMS; Karl and Karen Schilling from the State Council of Higher Education for Virginia and Miami University of Ohio; local scientist Elaine Seymour; and Ephraim Schechter of North Carolina State.

Planning, Budget, and Analysis

- Staff the work of the AVCAA-UGE and of the committee. Continuing, ongoing.
- With the AVCAA-UGE, consult with individual academic programs on assessment implementation plans. Continuing, ongoing. Even though this is a continuing responsibility, it will be carried out more aggressively than over the prior five years.
- Serve as liaison to CCHE on assessment. Continuing, ongoing.
- Carry out (or coordinate) assessment activities that are most appropriately administered by a campus-wide unit rather than by individual academic programs. Ensure that each academic unit receives and understands information relevant to these activities. This is a continuing, ongoing responsibility with some new activities. Activities anticipated now include:
  - A continuing cycle of standardized tests. PBA will test, or work with departments to test, representative samples of about 40 students per major, rotating through relevant majors on a three-year cycle. New, ongoing. Appendix B presents details.
  - A continuing cycle of student surveys evaluating individual courses (the faculty-course questionnaire), academic degree programs, and the campus as a whole. Continuing, ongoing.
- Maintain and enhance the outcomes assessment website. Include information on requirements, methods, activities, results, and use. Include materials for departments, materials supporting committee work, and summaries of activities by each academic program. Design the site to serve audiences including coordinators in academic programs, the oversight committee, students, parents, employers, and the public. Continuing, ongoing.

Academic programs

- Ensure that skill and knowledge goals for students in the undergraduate program are published in the university catalog, and are reviewed and revised periodically. Continuing, ongoing.
- Ensure that the program can state, and document, how well it has been able to help students achieve the stated goals. Documentation is the first step toward the ultimate goal: use of the information in program improvement. Continuing, ongoing.
- Publish any assessment requirements for students. Continuing, ongoing.
- Use assessment information (collected by the program, plus survey and test results collected by PBA) to consider and design changes, as deemed necessary and desirable by program faculty, to courses and curriculum, instructional practices, course assignment practices, instructional facilities, student support services, and other components of the undergraduate program. Continuing, ongoing.
- Submit to the oversight committee, in writing, on the requested schedule, sufficient information to demonstrate conformance with the first four requirements: stated goals, stated requirements for students, collection and documentation, and use of assessment information. Include in the submission the results of any standardized tests and surveys, and the department’s use of those results. Continuing, ongoing.
- Provide the committee with other follow-up information as requested. Continuing, ongoing.

The direct responsibilities of programs are limited to the above list. Programs will have latitude to use whatever assessment methods fit them best, and can request funds for assessment from the AVCAA-UGE. However, the AOC will offer guidelines and suggestions to help programs accomplish assessment efficiently and effectively. Samples are listed here for illustration.

Guidelines: Programs should ensure that

- Some assessments cover papers, exams, and survey responses from students who are representative of all students in the program, not just of a subset who take honors, go on to graduate school, or are in a particular course.
- The processes of teaching/instructing and evaluating/assessing work are separated, and not always carried out by the same individual.
- Periodically, at least once every xx years (interval to be determined by the AOC), individuals external to the department or program are involved in assessment.

Suggestions: Assessment tools to consider

- Surveys and exit interviews
- Post-graduation surveys and follow-ups
- Employer and/or graduate school surveys
- Student portfolios
- Close examination of a sample of papers and exams from classes
- Authentic performance assessments
- National exams
- A matrix relating each course taught by the program to each skill and knowledge goal (example: California State University, Sacramento, sociology)

Students

- Participate in graduation-year assessment activities stated as requirements by their academic program.
- Offer thoughtful, honest, and constructive feedback to their programs about courses, instructors, curricula, requirements, advising, and the like.
Appendix A: Assessment Oversight Committee members

As of November 14, 2000

Chair
- Michael Grant, Associate Vice Chancellor for Undergraduate Education, professor and former chair of the Department of Environmental, Population, and Organismic Biology

Members
- Gordon Brown, Mathematics
- Shelley Copley, Chemistry
- Sam Fitch, Political Science, chair
- Stephen Jones, College of Journalism, Assistant Dean
- Padraic Kenney, History
- Merrill Lessley, College of Arts and Sciences, Associate Dean
- Michael Main, Computer Science and chair of the Boulder Faculty Assembly’s academic affairs committee
- Ronald Melicher, Business
- Elease Robbins, Dean of Students and Associate Vice Chancellor of Student Affairs
- James Sherman, College of Engineering, Assistant Dean
- Kumiko Takahara, East Asian Languages and Civilizations
- University of Colorado Student Union student to be named

Staff
- Lou McClelland, PBA
- Perry Sailor, PBA

Appendix B: Use of standardized tests

Background

Standardized national tests exist for some disciplines in which CU-Boulder awards bachelor’s degrees. The Major Field Achievement Tests (MFATs) published by ETS are one example; the Fundamentals of Engineering exam is another. Some CU-Boulder departments, including mathematics and computer science, have used exit exams as part of their graduation-year assessments for years and found them useful.

In many other disciplines in which CU-Boulder awards bachelor’s degrees, no national standardized tests exist. And even though tests such as the MFATs are available nationally, ETS records indicate that fewer than five research universities use any of the exams, so comparative results for what CU-Boulder programs consider peer institutions are not available. The results can be useful nevertheless: for monitoring change over time, comparing student performance in several subdisciplines, comparing performance of students with different experiences in the major, and assessing performance relative to absolute standards.

Even the best standardized test in the world, perfectly suited to departmental goals and curricula, would not tell a program everything it needs to know to assess its success in helping students and to plan change. A test tells little or nothing about what students want, what they like and don’t like, or what they have actually done in earning their degrees.

CCHE’s QIS indicator #16 states that “Nationally normed major field tests should be used whenever available and applicable to the institution's program. If a national normed major field test exists and is being utilized by similar institutions across the United States, an explanation and justification for its non-utilization by the Colorado institution must accompany the materials submitted to the CCHE.” In fall 2000 we obtained inspection copies of all MFAT exams and queried academic programs about their appropriateness and utility.

Rationale of our plan

Standardized tests can contribute useful information to an assessment program, and are especially appealing to external agencies such as CCHE. Test results from a representative sample of seniors, once every three years, should yield essentially as much utility as results gathered every year and/or by testing all seniors.
Given the psychometric properties of the tests, forty test-takers per program or discipline should be sufficient to yield reliable results. Standardized tests should therefore be used as one part of assessment activities in the academic programs for which they are appropriate. Programs should emphasize the use of test results in determining and planning any needed changes. Programs will be assisted in testing by Planning, Budget, and Analysis, a campus-wide administrative unit.

Elements of our plan

We will conduct a continuing cycle of standardized tests. Planning, Budget, and Analysis (PBA) will test, or work with departments to test, representative samples of about 40 students per major, rotating through relevant majors on a three-year cycle. Academic programs that wish to test students every year may continue to do so, subject to committee review.

Degree programs that find the available tests in their disciplines counterproductive may veto participation. To date we have vetoes from history and political science, and strong misgivings from environmental, population, and organismic biology; biochemistry; and applied mathematics. History and political science state that the tests in their areas are based on outmoded models of their disciplines that emphasize “a collection of facts” rather than “interpretation,” “critical thinking,” and writing.

With some 8 to 12 MFATs deemed appropriate, we would test students in three or four programs per year. This small number would allow PBA to work closely with programs to determine optimal settings for testing (in class, at a required extra session, etc.). In departments with over 50 senior majors, PBA would draw representative samples for testing. The cost is estimated at about $6,000 per year, exclusive of staff time and any payments to students. Student motivation on these specialized tests is not expected to be an issue. The MFATs not vetoed, plus the Fundamentals of Engineering for selected engineering majors, are in disciplines graduating about half of CU-Boulder seniors.

The first audience for test results is the academic program. Each program will be asked to report test results, and any departmental use of those results, in its regular submission to the AOC.
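For illustration only, the minimal Python sketch below shows one way the sampling rotation described above could be organized: roughly 40 test-takers per program, a full census in programs with 50 or fewer graduating seniors, and a three-year rotation through participating programs. The program names, senior counts, and cycle assignments in the sketch are hypothetical; actual rotations and samples would be set by PBA in consultation with programs and the AOC.

    import random

    # Hypothetical illustration only: program names, senior counts, and the
    # year-of-cycle assignment below are invented for the example. The sample
    # size, 50-senior census threshold, and three-year cycle come from the plan.

    SAMPLE_SIZE = 40   # target test-takers per program
    CENSUS_MAX = 50    # programs at or below this size test all seniors
    CYCLE_YEARS = 3    # each participating program is tested once per cycle

    # Example roster: program -> list of graduating-senior IDs (hypothetical).
    seniors_by_program = {
        "economics": ["econ%03d" % i for i in range(180)],
        "psychology": ["psyc%03d" % i for i in range(250)],
        "chemistry": ["chem%03d" % i for i in range(45)],
    }

    # Assign each program to year 0, 1, or 2 of the cycle (simple round-robin
    # here; a real schedule would balance test availability and program size).
    cycle_year = {
        program: index % CYCLE_YEARS
        for index, program in enumerate(sorted(seniors_by_program))
    }

    def programs_for_year(year):
        """Programs scheduled for testing in the given year of the cycle."""
        return [p for p, y in cycle_year.items() if y == year % CYCLE_YEARS]

    def draw_sample(program, seed=0):
        """Draw a simple random sample of seniors, or test everyone if the
        program has CENSUS_MAX or fewer graduating seniors."""
        roster = seniors_by_program[program]
        if len(roster) <= CENSUS_MAX:
            return list(roster)
        return random.Random(seed).sample(roster, SAMPLE_SIZE)

    if __name__ == "__main__":
        for year in range(CYCLE_YEARS):
            for program in programs_for_year(year):
                sample = draw_sample(program, seed=year)
                print("Year %d: test %d seniors in %s" % (year, len(sample), program))

The simple random draw stands in for whatever representative sampling procedure PBA adopts; stratifying by subdiscipline or track would be a natural refinement.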