SACS Working Group Meeting Minutes
8/31/2011
Location: EpiCenter - 1-324
Time: 2:30-3:30 p.m.

Attended: Anne Cooper, Carol Weideman, Cynthia Grey, Cynthia Jolliff-Johnson, Darlene Westberg, Doug Duncan, Eric Tucker, Frank Appunn, Gail Lancaster, George Greenlee, Janice Thiel, Jesse Coraggio, Joseph Leopold, Kim Molinaro, Kim Wolff, Laurie King, Leigh Goldberg Hopf, Michael Hughes, Richard Flora, Shirley Johnson, Stan Vittetoe, Tom Furlong, Tom Philippe, Wendy Rib, Cathy Ladewig, Jennifer Fernandes

Topics:

I) Members who attended the Summer SACSCOC meeting in Fort Worth, TX shared key points drawn from the speakers' presentations. See the matrix of key issues on page 3, and see the last pages for comprehensive notes on Assessment of Student Learning Outcomes (pages 4-6) and the Fifth-Year Interim Report (pages 7-10).
   A) Key points from the sessions on metacognition focused on teaching students how to learn and teaching faculty how to explain effective learning strategies to students. As part of this process, it was suggested that the Bloom's Taxonomy pyramid be included in all classes as a tool to achieve these goals. By discussing and including Bloom's Taxonomy in classes, students will understand that the level at which learning occurs is important.
   B) Each session seemed to carry common themes:
      1) Focus on improvement
      2) Focus on being a learner-centered institution
      3) Focus on assuring equivalency in learning and student support for distance and face-to-face education
      4) Align assessments with outcomes
      5) In areas of substantive change, address distance education, off-site facilities, mergers, and acquisitions
      6) Avoid common errors in the Fifth-Year Interim Report, which include: 1) failing to address the changes in item 5 above; 2) not providing sufficient evidence to support statements made in the report; 3) not assuring that all material information aligns; 4) not providing supporting data and explanations for faculty credentials and for administrators and academic officers

II) A quick assessment was done with the representatives of each standard to identify where each team is in the review process (0-10 scale).

III) Quick Topics
   A) The Fall SACS awareness contest went well. Manuel Gerakios was the winner. The Foundation offered to match the $50 prize; we have asked them to hold it for another reward opportunity.
   B) There was a discussion on including students and possible new faculty/administrators in the working group. SGA will be approached about inviting interested students to become part of the working group. Several faculty members have already had contact with SGA representatives, and the response has been positive. We will also add new faculty/administrators to the standard groups when we receive requests.
   C) SACSCOC Annual Meeting in Orlando, December 4-6: potential for sending faculty and staff from the SACS Working Group. Please reply if interested/available to attend.

IV) Tightening the Timeline: We have scheduled two more meetings before the year's end (see below), and we will schedule meetings monthly beginning in January.

V) In addressing our mission, a rough draft of a mission statement was presented: "To effect change in practices resulting in educational improvement and maintaining good standing with our accrediting bodies." Please forward your suggestions for changes.
VI) It was discussed that, at the end of every meeting, the group would list the identified requests for changes that should be submitted to the Student Achievement Committee as we go through the standards review process. Please review and relay any comments/suggestions:
   A) Establish methods for maintaining lists of those assigned primary responsibilities (e.g., curriculum developers)
   B) Revisit college goals (consider workforce)
   C) Improve efficiencies of maintaining faculty lists and documenting qualifications/credentials

VII) Dr. Furlong met with SACS and provided the document on pages 11-12 for your review. The correspondence focuses on how SACS is working with institutions offering baccalaureate degrees that have associate degrees as their foundation.

VIII) Next Meetings
   A) Thursday, October 6, 2:00-3:00 p.m., EpiCenter - 1-324. Please note the earlier time, so as not to conflict with the standing Deans/Provosts Meeting.
   B) Wednesday, November 30, 2:30-3:30 p.m., EpiCenter - 1-324
   C) Monthly, beginning January 2012

Matrix of Key Issues (page 3)

Standards under review and assigned leads:
   Faculty (2.8) - Anne Cooper, Kay Burniston
   Student Support Services (2.10) - Tonjua Williams, Cynthia Jolliff-Johnson
   Qualified Administrative/Academic Officers (3.2.8) - Patty Jones
   Institutional Effectiveness (3.3.1.1) - Jesse Coraggio, Leigh Hopf
   Admissions Policies (3.4.3) - Anne Cooper, Kay Burniston
   Academic Program Coordination (3.4.11) - Anne Cooper
   Physical Facilities (3.11.3) - Doug Duncan
   Student Achievement (4.1) - Jesse Coraggio, Leigh Hopf
   Program Curriculum (4.2) - Stan Vittetoe, Anne Cooper, Kay Burniston, Jim Olliver
   Publication of Policies (4.3) - Tonjua Williams, Richard Flora
   Program Length (4.4) - Anne Cooper, Kay Burniston
   Student Complaints (4.5) - Tonjua Williams
   Recruitment Materials (4.6) - Patty Jones
   Title IV Program Responsibilities (4.7) and Financial Aid Audits (3.10.3) - Michael Bennett, Tonjua Williams

Key take-aways from the SACS Institute (matrix columns):
   Accreditation Myths; Common Areas of Non-Compliance (2.8, 3.3.1.1, 3.4.11, 3.11.3); Alignment of Assessments and Outcomes; Faculty Ratios; Substantive Changes; Off-Site Campuses; Distance Education; Management of Information; Metacognition; Bloom's Taxonomy; Student Accountability/Responsibility; Fifth-Year Interim Report and Key Issues

Notes from Ashley Hendrickson: SACS Institute Topic - Alignment of Assessments and Outcomes

Assessment of Student Learning Outcomes:
   - It's about evaluating the effectiveness of programs, courses, and services, NOT individual students.
   - Closing the loop: identifying the gap between what the goal was and what actually is, and outlining a plan for closing the disconnect between the two.
   - Institutions tend to give copies of Allied Health program courses as their sample for SACS reviewers. Don't do this; reviewers want to see other programs that aren't necessarily going through regular accreditation.
   - Course-level assessment is NOT required by any SACS requirement; however, in order to show assessment at the program level, all sections of a course have to assess performance, and they should do so consistently. Otherwise, how can you say you are assessing the program when the program is made up of courses?
   - Distance education: it's about equivalency of learning. Were these programs/courses taught in the same way as traditional courses?
   - Identify high-risk courses, those with high enrollments and low completion rates; these are problematic.
   - Institutions spend too much time trying to engage faculty who just don't want to participate; they should concentrate on, and put more effort into, those who are at the table.
   - 63% are found in non-compliance on 3.3.1.
   - Course grades are not sufficient evidence of program assessment.
   - Goals and strategies are not assessments.
   - No results = no use = no improvement = NON-COMPLIANCE.

Advanced IE institutions:
   - Recognize that surveys are indirect, subjective, incomplete assessments of academic programs.
   - Have modified more than the assessment measure or means (not just adding a study session; data-driven instruction).
   - Understand what outcomes are and how they are written.
   - Have refined and aligned assessments with outcomes.
   - Exhibit multiple years of data to illustrate improvement.
   - Benchmark against other institutions (not required, but this is a good practice).
   - Have established standards for success in meeting their outcomes/goals.
   - Exhibit polished documentation (e.g., comparing to prior years' results).
   - Offer cogent analysis.
   - Use an assortment of well-matched assessment types:
      o Academic units: major field tests, certification exams, exit interviews, surveys, portfolios, etc.
      o Administrative units: internal logs, financial records, financial audits, etc.
   - Can differentiate between strategic planning and IE:
      o Planning is long-term and future-oriented, with targeted goals; it moves beyond the status quo.
      o IE is operational: it covers the bases, maintains the core operations, and sees that you accomplish your reason for existence. Unless the scope of the academic program or department changes, these outcomes don't change much from year to year; what changes are the strategies for accomplishing the outcomes or goals.
   - It is NOT necessary to assess every outcome every year, as long as each one is assessed.

IE reviewers need to see:
   - Policy
   - Practice: how you do things
   - Product: evidence, documentation, proof
Narratives for 2.5 and 3.3.1 could be as short as a page; it's the evidence that matters: numbers, percentages, comparative and longitudinal data, research-based.

2.5:
   - There is no magic number of cycles to go through; a minimum of two years is recommended.
   - If providing a representative sample, it had better be GOOD.
   - The system should be systematic and ongoing: leaders analyze, share, discuss, and act upon the results.
   - Analysis integrated into the budget.

3.3.1:
   - Could be in a budget cycle or part of a cabinet meeting, showing that IE is considered when it comes time to make budget decisions.
   - Highlighted sections pointing to the proof; evidence of improvement.
   - For reports that may contain a lot of information college-wide, an executive summary is good.

Where does the IE reviewer begin? Org charts (functional with detail, and higher-level with departments), policy books, catalogs, websites.

Problems with organization of evidence:
   - No overview of which units submit IE and when.
   - Multiple formats for documentation.
   - Confusion about programs offered in traditional vs. non-traditional formats, main campus vs. sites, etc.

Saying outright that you don't have something for a year here or there for different departments, but have sufficient evidence over the course of the review period, adds credibility to the institution, rather than trying to make it look like everything was done perfectly.

Organize the narrative by key terms. Instead of "2.5":
   - Ongoing
   - Integrated
   - Institution-wide
   - Research-based
   - Systematic
   - Accomplishing mission

Many institutions struggle with their response to CS 3.3.1: Institutional Effectiveness.
In fact, CS 3.3.1 continues to be among those standards with the highest percentage of negative findings. It states: "The institution identifies outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas:
   3.3.1.1 educational programs, to include student learning outcomes
   3.3.1.2 administrative support services
   3.3.1.3 educational support services
   3.3.1.4 research within its educational mission, if appropriate
   3.3.1.5 community/public service within its educational mission, if appropriate"

3.3.1.1 non-compliance (percent of SACS institutions found non-compliant, with issue rank):
   Review Point       Levels I-II        Levels III-VI
   Off-Site Review    79% (#3 issue)     89% (#1 issue)
   On-Site Review     43% (#3 issue)     59% (#1 issue)
   C&R Review         21% (#1 issue)     34% (#1 issue)

If 50% or more of a program is offered online, it must be assessed separately.

Grades are like having your temperature taken at the doctor's office: the reading tells you nothing but "direction." Without more tests, you won't know WHY you are sick, and you won't know how or what to fix or change.

Notes from Cynthia Grey: Fifth-Year Interim Report - Common Issues

2.8 Number of Full-Time Faculty
   - Providing data without presenting a case for why the number is adequate. SACS is not telling colleges what the ratio should be but expects colleges to support why their ratio is adequate. If during analysis you find that your ratio is not adequate, explain the plan for meeting compliance.
   - Not addressing faculty loads. Provide information about faculty loads and expectations for faculty outside of the classroom, such as required committee work, service, advising, and curriculum development.
   - Not providing data disaggregated down to the program or discipline level. Include information on faculty teaching in all programs, including those teaching via distance education and at off-campus sites.

2.10 Student Support Programs
   - Alignment of media and reality, i.e., the website and/or catalog identify student services that are not mentioned in the report.
   - Student support program descriptions are inadequately explained. Explain how the services meet the needs of the students. Be sure to address support provided for distance and off-campus learners.

3.2.8 Qualified Administrators and Academic Officers
   - Providing lists or vitae without explanations. Provide a description of qualifications for administrators and academic officers; build a case for why they are qualified to fulfill their roles.
   - Not providing an organizational chart to help evaluators understand who oversees what.
   - Providing degree-level information without listing a major.
   - Consider using the SACSCOC faculty roster form: http://www.sacscoc.org/pdf/FACULTY%20ROSTER%20INSTRUCTIONS.pdf

3.3.1.1 Institutional Effectiveness: Educational Programs
   - Lack of defined learner outcomes and methods of assessing the outcomes.
   - Use mature data; the emphasis is on demonstrating ongoing compliance.
   - If using representative sampling, be sure the sample is indeed representative. Include a rationale for what makes the sample representative of the programs offered.
   - Again, be sure to include programs offered off campus and via distance education.

3.4.3 Admissions Policies
   - Provide a clear and consistent narrative of these policies.
   - Address special admissions policies for specific programs.

3.4.11 Qualified Academic Coordinators
   - Identify coordinators for all programs, including off-campus and distance programs.
   - List each coordinator's degree and major.
   - Make a case for the coordinator's qualifications to oversee the development of the program.
   - Again, the recommendation is to use the Faculty Roster form.

3.11.3 Physical Facilities
   - Include supporting documentation, i.e., the current facilities master plan, space utilization reports, and facilities maintenance schedules.
   - Again, address off-campus sites.

FR 4.1 Student Achievement
   - Avoid using select program graduation rates alone to evaluate overall student achievement. Include a variety of measures: general and program-specific graduation rates, job placement rates, course completion rates, and licensure exam pass rates.

FR 4.2 Program Curriculum
   - Provide a rationale for the programs offered, including supporting documentation. Be sure to include a discussion of distance education and off-campus sites.
   - Explain how the mission and curriculum are related.
   - Document how the curriculum is developed.

FR 4.3 Publication of Policies
   - Be sure that all published versions of policies are current and accurate (calendar, grading, and refund policies).
   - Include information on how the information is disseminated to distance learners and off-campus sites.

FR 4.4 Program Length
   - Identify measures of program length for all programs, including those for off-campus and distance learners; explain why the measures are appropriate for all programs, including compressed/accelerated programs; be sure program lengths are within the appropriate ranges; verify that the published program lengths are accurate.

FR 4.5 Student Complaints
   - Provide documentation of student complaint issues: examples of real student (emphasis on student) complaints with the names redacted. Be sure to include distance education programs and off-campus locations.

FR 4.6 Recruitment Materials
   - Address how the institution ensures that recruitment materials and presentations are accurate representations of institutional practices and policies.

FR 4.7 Title IV Program Responsibilities and CS 3.10.2 Financial Audits
   - Work with auditors well in advance of your report to ensure that the audit reports are available by the due date. Include evidence of financial aid audits as required by the state, not just the federal government.

Accreditation Myths
   This was a round-table discussion forum. Participants shared that SACS' focus is on continuous improvement: how can we improve teaching to improve learning? SACS is the path to doing better. The goal of SACS is to help every institution create a culture of continuous quality improvement.