2012 State Assessment Meeting - Valencia College, June 7 and 8
Attendee Synopsis

Attended: Jesse Coraggio, Daniel Gardner, Cynthia Grey, Carla Rossiter, Janice Thiel, Maggie Tymms, Carol Weideman

The Role of Professional Development (especially Faculty Development) in Program Assessment
by Steven Sheeley, Vice President, SACS-COC

Key takeaway: Do not overlook the value of faculty professional judgment in the assessment of the teaching and learning process. Posed the question, “If you are the assessment person, are you checking with faculty as to why they are assessing as they are and asking for feedback on the quantifiable data… looking for reasons ‘why’?”

There is a perceived notion that quality of curriculum + quality of instruction = good student learning. This is not necessarily quantifiable but is accepted, which supports including faculty professional judgment in the assessment process. Someone on campus needs to listen to faculty to identify problems and work to remedy them. How do you capture the well of data that lies in your faculty’s professional judgment? Experience counts; you can identify a problem with student success based upon your experience.

Get faculty on board to look at the data. Do not just inundate them with numbers; provide trends. Ask faculty what their experience tells them might be a reason for the trend and, based upon their experience, whether there is a solution. Ask faculty what the most important areas are to look at when trends are identified. There should be a partnership between quantitative and qualitative data. We need to move away from mechanizing our assessments. Include faculty in the evaluation of assessment data and include their professional judgment. Use focus groups as part of qualitative research.

Assessment should have a linear quality:
1. Gather appropriate (observable) data.
2. Evaluate data for importance (not just for validity and reliability).

Fallacy: Assessors claim that faculty professional judgment is not valid unless something quantifiable is used, i.e., a test assessment. Faculty observations of student assessments are valid (e.g., student papers over time).

Set appropriate standards for student achievement; tell students how we will be grading them. Faculty and the college should make decisions on the content presented in the classroom. Maybe there should be a change in the process: measure student achievement and change our behaviors, looking at what “I” (or the institution) can do to change the outcome. The assessment process is trial and error. Course grades do not usually provide data you can act upon, but faculty observations and professional judgment do.

Assessment should be a sustained effort. Do not get bogged down in the process at the expense of RESULTS. Misconception: if there is a problem with the product, it must be the process. If every student were the same, then we could always tweak the process. Assessments should not be done just for SACS. The course level may be important, but SACS is looking for program outcomes. Do not measure every course-level outcome every year. Program-level outcomes are vital signs and need to be monitored continually.

FAILURE TO LAUNCH
1. Constantly returning to process and assessment of instruction for problem solving.
2. The statement, “no further action required.”
3. Program survival rather than program improvement.
4. Focus on “it’s all about stats” without asking “but what does it mean?”
5. Lake Wobegon syndrome - the women are strong, the men are good-looking, and all the children are above average.
It is okay to trust professional judgment. Stats can be helpful but can mask the true picture.
6. The concept that no one can describe a successful student – a description of the successful student should precede the curriculum.
7. Just ignore it and it will go away.
8. Gathering data just for the sake of collecting data. Data needs to be used.
9. Quality assurance rather than assessment.

Recent Changes in SACS Standards, Interpretations, and the Resource Manual
by Steven Sheeley, Vice President, SACS-COC

Resource Manual: The last update was May 2012. SACS will be updating the manual continually. There was a failure by SACS in the past to keep the manual updated.

2.7.3.1 General Education: The on-site committee will be looking at general education even if the off-site committee said the college was in compliance. They will be reviewing whether your public document gives students a reasonable expectation of understanding your general education requirements. If the committee cannot understand the public document, your students cannot understand it. General education requirements vary by state. Be sure that the committee knows what the requirements are in your state, because no one on the committee will be from Florida; provide them with this information.

General education courses cannot be program based, i.e., nursing-specific general education classes or Algebra for Engineers. Everyone on campus must be able to take the course, and the course material must be general enough that every student can make use of that knowledge outside the classroom (general knowledge everyone should have). A question that has not been answered is whether a program-specific class that is open to all will still be acceptable. The general thought at the moment is to avoid this. SACS’s concern is trying to determine how much the course content is being influenced by the specific program.

General education classes cannot be completely skills based. For instance, if you require 6 hours of English, 3 hours can be skills based (i.e., Comp I), but you cannot require Comp II (also skills based) for the remaining 3 hours. You must offer something like Lit I. SACS sets the minimum requirements; Florida sets 30 hours as the maximum.

Policies and Procedures: Demonstrate how you administer ALL policies and procedures. The Preamble applies across the board all the time.

QEP: Must be focused on student learning. You must make the connection. There must be student learning outcomes associated with the QEP. What is going to be enhanced?

GENERAL STATEMENTS: Anything written in the future tense will already place us out of compliance. SACS is looking for the constant move forward, but you also need to reflect. A continuous cycle must be presented.

Parts I and II: Intensive Workshop on Developing and Assessing Program Learning Outcomes
by Wendi Dew and Laura Blasi

Key takeaway (Janice): Valencia employs a professional development model when it comes to involving faculty in program outcome assessment. This session was conducted in partnership with their lead faculty development person and assessment person. They recognize that not all faculty are motivated by college-wide goals like improving graduation rates. Valencia has had better success working with faculty at the course level to employ basic instructional design principles such as determining learning objectives, designing learning activities, and aligning assessment strategies. They credit their success to institutional commitment and critical partnerships. http://valenciacollege.edu/instassess/