St. Petersburg College QEP Impact Report August 1, 2013



Table of Contents

QEP Executive Summary ............................................................................................................. i

The Focus of the Plan: Improving Students’ Critical Thinking ................................................... i

QEP Initiatives in Brief .............................................................................................................. i

Initiative 1. Student Success Initiative .................................................................................. ii

Initiative 2. Professional Development Initiative ................................................................... ii

Initiative 3. Critical Thinking Resources Initiative .................................................................. ii

QEP Goals ............................................................................................................................. iii

Student Success Initiative ................................................................................................... iii

Professional Development Initiative .................................................................................... iii

Critical Thinking Resources Initiative ................................................................................... iii

Expected Outcomes and Benefits ........................................................................................... iii

QEP Impact Report .................................................................................................................... 1

Critical Thinking is Mission Critical .......................................................................................... 1

1. Goals and Outcomes .......................................................................................................... 1

Student Success ................................................................................................................. 1

Professional Development .................................................................................................. 2

Critical Thinking Resources ................................................................................................ 2

2. Changes to the QEP ........................................................................................................... 4

3. Impact on Student Learning ................................................................................................ 4

INITIATIVE #1: Student Success Initiative .......................................................................... 5

INITIATIVE #2: Professional Development Initiative ...........................................................12

INITIATIVE #3: Critical Thinking Resources Initiative .........................................................15

4. Reflection on What We Learned ........................................................................................17

QEP Executive Summary

The Focus of the Plan: Improving Students’ Critical Thinking

The focus of the Quality Enhancement Plan (QEP) for St. Petersburg College (SPC) is enhancing student learning by improving students’ ability to think critically. SPC involved a broad range of faculty, staff, and key stakeholders in considering various ideas for the QEP.

After identifying critical thinking as the most important and urgent topic and reviewing definitions from the critical thinking literature, the Quality Enhancement Committee (QEC) formulated the following definition for critical thinking:

Critical thinking is the active and systematic process of communication, problem-solving, evaluation, analysis, synthesis, and reflection, both individually and in community, to foster understanding, support sound decision-making, and guide action.

QEP Initiatives in Brief

SPC has done an in-depth review of strategies in instruction and institutional improvement to determine ways of improving students’ critical thinking skills. As a result of this research, the College identified key initiatives that faculty believe will have a favorable impact on students’ critical thinking. Those initiatives cover three broad areas: Student Success, Professional Development, and Critical Thinking Resources. The Student Success Initiative is the primary focus of the QEP, supported by professional development for faculty and resource materials that reflect and facilitate faculty research on integrating critical thinking activities in the classroom.

SPC QEP Focus: Enhance Student Learning by Improving Students’ Ability to Think Critically

1. Student Success Initiative
2. Professional Development Initiative
3. Critical Thinking Resources Initiative

Critical thinking will be infused throughout the institution through a comprehensive set of initiatives developed around a common language, making current practices more effective and producing new instructional rubrics and strategies.

Executive Summary i

Initiative 1. Student Success Initiative

This initiative will focus on implementation of classroom critical thinking activities, supported by key club and student leadership programs and tools that assess and document critical thinking, such as student ePortfolios. Students will be exposed to critical thinking throughout the College and will be offered opportunities to create, collect, and reflect on their own artifacts within their ePortfolios. A Collegewide assessment rubric template and discipline-specific assessments will be used by faculty to evaluate the students’ critical thinking skills.

Academic programs will be selected for implementation over five years, and lead faculty and staff will receive advanced professional development geared to their disciplines or fields.

Key student organizations will be included in the five-year rollout process.

Initiative 2. Professional Development Initiative

This initiative concentrates on offering professional development opportunities to faculty and staff at the College in order to impact students’ critical thinking skills. The College will systematically train a small core of faculty members using a “train-the-trainer” approach, and then build on that existing base of knowledge and expertise. The initiative also will include seminars led by outside experts, development of in-house and online training, travel to conferences to learn new techniques, and the use of Academic Roundtables (ARTs) on the campus sites to explore and implement strategies. Faculty and staff will have access to a variety of professional development opportunities.

Initiative 3. Critical Thinking Resources Initiative

This initiative calls for the creation of an array of electronic resources, many of which will be available from a single gateway website. It also calls for identifying, organizing, linking to, and describing outside resources that can be used in the effort. In partnership with other SACS institutions, SPC will collect, create, and house a library of electronic critical thinking tools that can be used in online, traditional face-to-face, or blended classrooms. These tools include Reusable Learning Objects (RLOs), which are small segments of instruction, usually electronic, that can be used in multiple courses, as well as instructional portfolios of critical thinking activities created by faculty. Lastly, physical resources will be collected through this initiative and housed at Critical Thinking Resource Centers at each library.


QEP Goals

The specific goals from the three initiatives in the QEP, all directed at improving students’ critical thinking skills and faculty ability to develop, infuse, and assess those skills, include the following:

Student Success Initiative

Goal 1-1. Enhance students’ critical thinking skills through “teaching for critical thinking” classroom activities across the curriculum.

Goal 1-2. Develop and use general and discipline-specific assessment tools and strategies for measuring students’ critical thinking skills.

Goal 1-3. Collect student artifacts through ePortfolio.

Goal 1-4. Implement critical thinking programs supported by key student organizations.

Professional Development Initiative

Goal 2-1. Provide professional development opportunities to assist faculty in developing class activities to support “teaching for critical thinking.”

Goal 2-2. Develop in-house critical thinking expertise (i.e., faculty champions) using a “train-the-trainer” approach.

Goal 2-3. Institute Academic Roundtables (ARTs) to investigate general and discipline-specific strategies for “teaching for critical thinking.”

Critical Thinking Resources Initiative

Goal 3-1. Compile electronic critical thinking resources for SPC faculty and staff organized through a College gateway website.

Goal 3-2. Create and collect critical thinking reusable learning objects (RLOs) for SPC and other institutions in Florida and across the world that are seeking multimedia/electronic critical thinking materials.

Goal 3-3. Contribute to the critical thinking literature through presentation and publication of instructional portfolios of strategies that support “teaching for critical thinking.”

Goal 3-4. Acquire and use print and multimedia critical thinking resources available at Critical Thinking Resource Centers housed in campus libraries.

Expected Outcomes and Benefits

First and foremost, SPC expects improvements in critical thinking skills to translate into deeper learning and understanding congruent with the College’s mission. This improved learning will be spearheaded by an engaged and energized faculty, reinforced programmatically across the College and by other staff, and recognized by students and employers. SPC expects to contribute to the applied research in the field. At the conclusion of the implementation, decisions will be made about which activities and initiatives were effective in promoting improved critical thinking, and about how the institution will sustain these effective approaches.


QEP Impact Report

Critical Thinking is Mission Critical

The focus of the Quality Enhancement Plan (QEP) for St. Petersburg College (SPC) is enhancing student learning by improving students’ ability to think critically. The College identified key initiatives that faculty believed would have a favorable impact on students’ critical thinking. Those initiatives covered three broad areas: Student Success, Professional Development, and Critical Thinking Resources. The Student Success Initiative is the primary focus of the QEP, supported by professional development for faculty and resource materials that reflect and facilitate faculty research on integrating critical thinking activities in the classroom.

To guide the institution in its efforts to improve students’ critical thinking skills, SPC defined critical thinking as “the active and systematic process of communication, problem-solving, evaluation, analysis, synthesis, and reflection, both individually and in community, to foster understanding, support sound decision-making, and guide action.”

1. Goals and Outcomes

The specific goals from the three initiatives in the QEP, all directed at improving students’ critical thinking skills and faculty ability to develop, infuse, and assess those skills, include the following:

Student Success

1-1. Enhance students’ critical thinking skills through “teaching for critical thinking” classroom activities across the curriculum.

All students will have demonstrated improvement in critical thinking skills, as evidenced by scores on external tests and ratings on the Assessment Rubric for Critical Thinking (ARC).

Key stakeholders will report positively regarding improvements in critical thinking skills of SPC graduates.

Students will report an increase in instructional practices improving critical thinking skills in the majority of modified courses or class activities across the curriculum.

1-2. Develop and use general and discipline-specific assessment tools and strategies for measuring students’ critical thinking skills.

A majority of programs will have at least one discipline-specific critical thinking assessment tool or strategy for measuring students’ critical thinking skills.

1-3. Collect student artifacts through ePortfolio.

A range of artifacts will have been collected that demonstrate student growth in critical thinking skills in selected courses across the curriculum.

1-4. Implement critical thinking programs supported by key student organizations.


Each key student organization will have had at least one program related to critical thinking annually.

The majority of students participating in student organizations will report that the critical thinking programs add value to their development of critical thinking skills.

Professional Development

2-1. Provide professional development opportunities to assist faculty in developing class activities to support “teaching for critical thinking.”

SPC will have developed advanced critical thinking seminars with a discipline-specific focus for identified disciplines.

At least 75% of full-time faculty and the majority of adjuncts will have participated in seminars on “teaching for critical thinking.”

The majority of surveys and other forms of feedback on critical thinking seminars will be positive.

2-2. Develop in-house critical thinking expertise (i.e., faculty champions) using a “train-the-trainer” approach.

SPC will have institutionalized the “train-the-trainer” program in order to continue developing expertise.

2-3. Institute Academic Roundtables (ARTs) to investigate general and discipline-specific strategies for “teaching for critical thinking.”

SPC will have formed ARTs for the majority of General Education, A.S., and Baccalaureate programs.

The majority of faculty participating in ARTs will affirm the value of ARTs for researching strategies.

Critical Thinking Resources

3-1. Compile electronic critical thinking resources for SPC faculty and staff organized through a College gateway website.

The majority of faculty will identify the gateway website as a valuable source of information and ideas.

3-2. Create and collect critical thinking reusable learning objects (RLOs) for SPC and other institutions in Florida and across the world that are seeking multimedia/electronic critical thinking materials.

SPC will have collected or created a minimum of 50 RLOs promoting critical thinking in a variety of disciplines.

A majority of RLOs will receive favorable feedback in the form of positive student and faculty reactions.

3-3. Contribute to the critical thinking literature through presentation and publication of instructional portfolios of strategies that support “teaching for critical thinking.”


Instructional portfolios will be available for the majority of programs at the College.

The majority of faculty will give a positive rating to the peer presentations and portfolios on teaching for critical thinking.

3-4. Acquire and use print and multimedia critical thinking resources available at Critical Thinking Resource Centers housed in campus libraries.

The majority of faculty will identify the Critical Thinking Resource Centers as valuable sources of information and ideas.


2. Changes to the QEP

The timing of implementing the QEP coincided with the nation’s economic downturn, so a number of creative staffing solutions were implemented. While the QEP Director position was fully funded, existing staff from two departments, Web & Instructional Technology Services and Institutional Research & Effectiveness, stepped in to supplement the responsibilities of the Technology Coordinator and Assessment Coordinator positions. Additionally, during 2009, a faculty champion for assessment was utilized, and the Assessment Coordinator position was eventually filled in March of 2011.

The QEP stated 11 goals; all were achieved with the exception of “1-3. Collect student artifacts through ePortfolio.” During the onsite visit in September of 2007, members of the Reaffirmation Committee questioned whether the current learning management system, ANGEL, was capable of supporting the intended goals of implementing ePortfolio for students college-wide. In its follow-up report, the Committee stated, “The portfolio portion of Angel software may not be the best mechanism for assessing student portfolios. In the demonstration of the software for a portion of the On-Site Review Committee, it did not appear as though this was a software system designed for assessment of portfolios.” Although this was not a formal recommendation, SPC reevaluated its planned use of ANGEL and considered alternatives until an effective version of ANGEL was developed and implemented.

In 2009, the challenge of implementing ePortfolio with ANGEL continued, particularly since the functionality promised by ANGEL Learning was yet to be realized. During a critical thinking advisory meeting in October of that year, a recommendation was forwarded to the Educational Oversight Group: “After considering the various critical thinking assessments and comparing institutional benefit, it is recommended that ePortfolio resources be reallocated to administration of the Community College Survey of Student Engagement (CCSSE).” The recommendation was approved, and the CCSSE was administered in 2010-11 and 2011-12.

Additional changes related to student performance data included:

• Alignment of SSI items.

• Changes to the annual administration of the MAPP: St. Petersburg College administered the ETS MAPP during the 2007-2008 academic year, with the expectation of continuing the assessment annually as part of the QEP. Due to several logistical issues that prevented an effective administration process, as well as ETS’ plan to make particular modifications to the instrument, SPC did not administer the assessment the following three years. Upon review of the ETS Proficiency Profile assessment, the new version of the ETS MAPP, in 2011, St. Petersburg decided to reinstate implementation of this assessment as the third direct measure aligned to four of the six elements of critical thinking.

• Changes to the annual administration of the CCSSE.

• Removal of the “targets” from each measure (using 06-07 data as the baseline, one year before actually implementing the QEP).

• Journaling activities.

3. Impact on Student Learning

St. Petersburg College implemented each of the activities related to the three initiatives, as outlined in the original QEP in 2007. The initiatives, goals, intended outcomes of each goal and their respective results are outlined below.


INITIATIVE #1: Student Success Initiative

Goal 1-1. Enhance students’ critical thinking skills through “teaching for critical thinking” classroom activities across the curriculum.

1-1.1: By 2012, students will have demonstrated improvement in critical thinking skills, as evidenced by scores on external tests and ratings on the Assessment Rubric for Critical Thinking (ARC).

Determination of compliance:

SPC assessed both performance and perceptions of students’ critical thinking skills by aligning a total of seven assessment instruments to the six elements of the institution’s definition of critical thinking. The six elements of the definition are: I. Effective Communication, II. Problem Solving, III. Evaluation, IV. Analysis, V. Synthesis, and VI. Reflection. The aligned assessment instruments, comprising three direct and four indirect assessments, were implemented between the 2006-2007 and 2011-2012 academic years. Each element of critical thinking was aligned to at least three different instruments, with two of the three being direct assessments; four of the six elements were aligned to six different instruments. Although not all instruments assessed students annually, at least three instruments were implemented in every year of the plan.

Aligned Measures | Direct or Indirect | Internal or External** | Aligned Critical Thinking Elements | Number of Years Implemented (during QEP) | # of Students Assessed (all years)
CAT | Direct | External | I-VI (All) | 5 | 429
ARC | Direct | Internal | I-VI (All) | 3 | 370
MAPP | Direct | External | I-IV | 2 | 285
Employer | Indirect | Internal | I-V | 6 | 488
Alumni | Indirect | Internal | I-V | 6 | 4067
CCSSE | Indirect | External | I, II, IV, V, VI | 3 | 3836
SSI* | Indirect | Internal | V | 3 | 99384

*SPC administers the SSI every semester. Data aligned to the Critical Thinking initiative have only been available since 2009-2010, when the instrument was modified. In addition, the number of responses is duplicated across semesters, so the total number of responses reflects the average received each academic year.

**Each measure is identified as internal or external, where internal measures were developed and are administered by the institution; external measures are those that are managed by external agencies and are administered at SPC.

Every year during the QEP, approximately 1,600 students were assessed by direct and indirect measures; more than 9,000 students were assessed over the six-year period. At the close of the 2011-2012 academic year, the sixth year of assessment, the total number of SPC students assessed on items related to critical thinking surpassed 100,000.

The large increase in annual assessments occurred during the 2009-2010 academic year, when SPC added two critical thinking items to the Student Survey of Instruction (SSI), shifting the annual average number of assessments from 1,600 in years 1-3 to 34,000 in years 4-6. It should be noted, however, that the SSI is aligned to only one of the six elements of critical thinking. Therefore, with the exception of the SSI, the approximate number of annual assessments remained at 1,600 for the duration of the QEP.


Direct Assessments

To assess students’ performance on critical thinking skills, SPC aligned three direct assessments – CAT, ARC, and MAPP, consisting of 25 total measures – to all six elements of critical thinking. These assessments were administered between the 2007-2008 and 2011-2012 academic years. Nearly 6,000 students were assessed during the five years.

Critical Thinking Assessment Test (CAT)

The CAT is an external assessment developed by TN Tech University and is implemented by SPC. The CAT is administered each spring semester in six randomly selected sections of face-to-face Elementary Statistics, STA 2023, and College Algebra, MAC 1105. The population of these courses includes students of varying academic performance levels and matriculation statuses; however, a large majority of students in these courses are enrolled in lower division program plans (A.A., A.S.). There are 15 measures, all of which are aligned to the six elements of critical thinking. Available points vary by question, where some questions are worth one point and others are worth as many as 5 points. A total of 429 students have been assessed by the CAT over five academic years, between 2007-08 and 2011-12. As described by TN Tech University (CAT Institutional Report, p. ii):

The CAT Instrument is a unique tool designed to assess and promote the improvement of critical thinking and real-world problem solving skills. The instrument is the product of extensive development, testing, and refinement with a broad range of institutions, faculty, and students across the country. The National Science Foundation has provided support for many of these activities. The CAT Instrument is designed to assess a broad range of skills that faculty across the country feel are important components of critical thinking and real world problem solving. The test was designed to be interesting and engaging for students. All of the questions are derived from real world situations. Most of the questions require short answer essay responses and a detailed scoring guide helps ensure good scoring reliability. The CAT Instrument is scored by the institution's own faculty using the detailed scoring guide. Training is provided to prepare institutions for this activity. During the scoring process faculty are able to see their students' weaknesses and understand areas that need improvement. Faculty are encouraged to use the CAT instrument as a model for developing authentic assessments and learning activities in their own discipline that improve students' critical thinking and real-world problem solving skills. These features help close the loop in assessment and quality improvement.

Assessment Rubric for Critical Thinking (ARC)

The ARC is an internal assessment instrument developed by SPC faculty for purposes of assessing critical thinking activities. The instrument was designed to be flexible enough to apply to multiple modalities while also maintaining the requisite competencies for demonstrating critical thinking skills. Each fall term, six randomly selected sections of PHI 1600 participate in the Assessment Rubric for Critical Thinking (ARC) workshop by submitting their students’ Critical Thinking Application Papers (CTAPs) to be scored by a group of faculty scorers. The sample is pulled from the list of regular (16-week) PHI 1600 (non-honors) courses that are offered on all campuses and are either online, face-to-face, or in a blended format. The population of this course is typically comprised of newer SPC students. While it includes students of varying academic performance levels and matriculation statuses, a large majority of students in this course are enrolled in lower division program plans (A.A., A.S.). There are six measures, which align to the six elements of critical thinking. Students are assessed on a scale of 0-4, where 0 is “Poor” and 4 is “Excellent”. A total of 370 students have been assessed by the ARC over three academic years, between 2009-10 and 2011-12.


ETS© Measure of Academic Proficiency and Progress (MAPP)/Proficiency Profile

The MAPP (prior to 2011)/Proficiency Profile is an external assessment developed by Educational Testing Services (ETS) and is administered by SPC. The Proficiency Profile follows the same design as, and is statistically equated to, the former ETS MAPP assessment. Students’ critical thinking skills are scored on a scale of 100-130 (31 points), and results are reported as composite subscores for each topic. As described by ETS (http://www.ets.org/proficiencyprofile/about/content/):

Questions on the ETS Proficiency Profile are multiple choice and are arranged in blocks of three to eight. Each section tests the same types of skills. This integrated design prevents a particular skill area from appearing all at once late in the test when fatigue can affect student performance.

Students who are proficient [in critical thinking] can: evaluate competing causal explanations; evaluate hypotheses for consistency with known facts; determine the relevance of information for evaluating an argument or conclusion; determine whether an artistic interpretation is supported by evidence contained in a work; recognize the salient features or themes in a work of art; evaluate the appropriateness of procedures for investigating a question of causation; evaluate data for consistency with known facts, hypotheses or methods; and, recognize flaws and inconsistencies in an argument.

To establish a baseline of student performance, the assessment was administered both face-to-face and online during the 2007-2008 academic year to students across the institution, including those enrolled in A.A., A.S., B.S., and B.A.S. programs. In 2011, the instrument was administered to a similar population, sampling students who also met qualifying criteria. All students enrolled in lower-division programs who had earned between 45 and 55 credits (i.e., approaching the end of their program) were invited to participate in an assessment. Two-thirds of the students selected were routed to take SPC’s General Education Assessment Test, while the remaining one-third were routed to participate in the Proficiency Profile. To ensure baccalaureate students were assessed in the same fashion as A.A. or A.S. students, towards the end of their program, all B.S. and B.A.S. students enrolled in their final capstone course were given the opportunity to complete the Proficiency Profile. There are four measures from the assessment that are aligned to four of the six elements of critical thinking. A total of 285 students completed the Proficiency Profile between the two administration periods.
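The two-thirds/one-third routing described above can be sketched as a simple random split. This is only an illustration of the sampling protocol; the student names, cohort size, and seed below are invented, and SPC's actual selection mechanism is not detailed in the report.

```python
import random

# Hypothetical sketch of the 2:1 routing described above: qualifying
# lower-division students (45-55 credits earned) are shuffled, then split so
# that two-thirds take the General Education Assessment Test and one-third
# take the Proficiency Profile. All names and counts here are invented.
random.seed(42)
qualifying = [f"student_{i}" for i in range(90)]  # invited students
random.shuffle(qualifying)

cut = (2 * len(qualifying)) // 3  # two-thirds boundary
routing = {s: "GE Assessment Test" for s in qualifying[:cut]}
routing.update({s: "Proficiency Profile" for s in qualifying[cut:]})
```

With 90 invited students, this yields 60 routed to the General Education Assessment Test and 30 to the Proficiency Profile.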

Results of direct assessments

Differences in means, or gain scores, were calculated for every direct measure aligned to each of the six elements. Gain scores were calculated by subtracting the first year’s results from the most recent year’s results. Due to varying scales and point ranges, the gain scores were standardized. Standardized gain scores were calculated for each measure, and an average standardized gain score was calculated for each element. The table below illustrates the demonstrated improvement in critical thinking skills, as evidenced by scores on the internal and external direct assessment measures.
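The gain-score computation can be sketched as follows. The report does not specify the exact standardization SPC used, so this example assumes one common approach, expressing the mean difference as a percentage of each measure's point range so that measures on different scales can be averaged; all numbers are illustrative, not actual QEP data.

```python
# Sketch of gain-score standardization. Assumption: the mean difference is
# expressed as a percentage of the measure's point range; SPC's actual
# formula is not stated in the report. All values below are illustrative.
def standardized_gain(first_mean, last_mean, scale_min, scale_max):
    """Gain from first to most recent year, as a percent of the scale range."""
    return 100.0 * (last_mean - first_mean) / (scale_max - scale_min)

# e.g., a hypothetical ARC measure scored 0-4 whose mean rose from 2.10 to 2.43
arc_gain = standardized_gain(2.10, 2.43, 0, 4)  # 8.25

# An element's result is the average of the standardized gains of the
# measures aligned to it (hypothetical gains shown).
element_gain = sum([8.25, 3.1, 1.7]) / 3
```

Because each standardized gain is unitless, averaging across measures with different point ranges (e.g., the 0-4 ARC scale and the 100-130 Proficiency Profile scale) becomes meaningful.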


Direct measures of students’ critical thinking skills: 2007-2008 through 2011-2012

Critical Thinking elements | Average standardized gain score | Total N | # of measures
I. Effective Communication | 4.9 | 1055 | 3
II. Problem Solving | -0.7 | 1078 | 5
III. Evaluation | 1.4 | 1079 | 5
IV. Analysis | 1.3 | 1079 | 6
V. Synthesis | 8.2 | 795 | 4
VI. Reflection | 4.3 | 790 | 2
Total | | 5876 | 25

Students demonstrated improvement in critical thinking skills in five of the six elements across twenty-five measures. The most notable improvement was in the area of Synthesis (standardized gain score of 8.2), followed by Effective Communication (4.9) and Reflection (4.3). Gains were also observed in Evaluation (1.4) and Analysis (1.3), while a slight decline was observed in the area of Problem Solving (-0.7). A comprehensive breakdown of standardized gain scores by measure can be found here. (Link to excel workbook with tabs for each CT element.)

Indirect Assessments

To assess students’ and stakeholders’ perceptions regarding critical thinking skills, SPC aligned four indirect assessments – Employer, Alumni, CCSSE, and SSI, consisting of 33 total measures – to all six elements of critical thinking. The surveys were administered between the 2006-2007 and 2011-2012 academic years. More than 8,000 students and employers were surveyed during the six-year period. While SPC did not begin implementing the QEP until the 2007-2008 academic year, SPC aligned these measures to the elements of critical thinking in order to establish a baseline one year before the QEP efforts began.

Employer Satisfaction Survey

The Employer survey is an internally developed instrument that is administered once per year, each spring semester. Employers of graduates who provide their employer’s information and give permission to SPC to contact their employer are surveyed. Employer surveys are sent to employers of graduates from the previous academic year. There are twelve measures from the Employer survey aligned to five of the six elements of critical thinking. Employers rate the graduate on a scale of 1 “Poor” to 5 “Excellent”. Data from the survey are available from 2006-07 through 2010-11 (where the academic year refers to the year in which the employee, or graduate, completed their program at SPC). Data from 2010-11 are not included in results at this time, but will be available and added to the Impact Report in October 2012.

Alumni Satisfaction Survey

The Alumni survey is an internally developed instrument that is administered three times per year, six months after graduation. All students who complete an academic program at SPC are surveyed. There are twelve measures from the Alumni survey aligned to five of the six elements of critical thinking. Graduates rate how prepared they feel in a variety of areas and on specific tasks on a scale of 1 “Poor” to 5 “Excellent”, which align with questions on the Employer Survey. Data from the survey are available from 2006-07 through 2010-11 (where the academic year refers to the year in which the graduate completed their program at SPC). Data from 2010-11 are not included in results at this time, but will be available and added to the Impact Report in October 2012.

Community College Survey of Student Engagement (CCSSE)

The CCSSE is an externally developed instrument managed by the Center for Community College Student Engagement and administered at SPC. The CCSSE was first administered in the spring semester of the 2006-2007 academic year. After receiving recommendations from the onsite reaffirmation team in 2007, funds allocated for an electronic portfolio system were reallocated so that the institution could continue administering the CCSSE. The survey was administered again in the spring semesters of 2011 and 2012. The instrument assesses mostly returning students and asks about institutional practices and student behaviors that are highly correlated with student learning and retention. Students rate a variety of topics on a four-point scale from 1 “Very Little” to 4 “Very Much”. The CCSSE is aligned to five of the six elements of critical thinking, and a total of 3,836 students participated during the three administrations.

Student Survey of Instruction (SSI)

The SSI is an internal assessment instrument developed by faculty. During 2009-10, the SSI items were reviewed and revised by a committee composed of faculty and administrators. As a result of the revision process, the lecture, non-lecture, and eCampus forms were consolidated into one form, independent of modality, which has been administered online since spring 2010.

As part of the instrument validation process, the results from the SSI over the last few years were assessed for reliability and validity. This assessment suggested three underlying factors – faculty engagement, preparation and organization, and course instruction – and the survey questions are grouped into these categories. The two SSI critical thinking questions aligned to the element of Synthesis are mapped to the Course Instruction and Faculty Engagement categories. Students rate the instruction of the faculty member for each course in which they are enrolled, scoring faculty on a scale from 1 to 7. More than 99,000 students participated in the SSI between 2009-2010 and 2011-2012.

Results of indirect assessments

Differences in means, or gain scores, were calculated for every indirect measure aligned to each of the six elements. Gain scores were calculated by subtracting the first year’s results from the most recent year’s results. Because the instruments use varying scales and point ranges, the gain scores were standardized: a standardized gain score was calculated for each measure, and an average standardized gain score was calculated for each element. For several of the indirect measures, SPC identified that perceptions of critical thinking had already reached high ratings in each of the elements of critical thinking, as illustrated in the 2006-2007 baseline data. This ceiling effect limited the room for improvement available over the five years of implementing the QEP. The table below illustrates the perception of critical thinking skills, as evidenced by results on the internal and external indirect assessment measures.
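The gain-score computation described above can be sketched in code. The report does not state which standardization formula was used, so this sketch assumes a common choice, dividing the raw difference in means by the pooled standard deviation of the two administrations; the function names and sample ratings are illustrative only.

```python
from math import sqrt
from statistics import mean, stdev

def standardized_gain(first_year, recent_year):
    """Standardized gain for one measure: the difference in mean ratings
    divided by the pooled standard deviation of the two administrations
    (an assumed formula; the report does not specify its method)."""
    gain = mean(recent_year) - mean(first_year)
    pooled_sd = sqrt((stdev(first_year) ** 2 + stdev(recent_year) ** 2) / 2)
    return gain / pooled_sd

def element_average(measure_gains):
    """Average standardized gain across all measures aligned to one element."""
    return mean(measure_gains)

# Hypothetical ratings on a 1 "Poor" to 5 "Excellent" scale for one measure
baseline_2006_07 = [3, 4, 4, 5, 3, 4]
recent_2011_12 = [4, 4, 5, 5, 4, 4]
print(round(standardized_gain(baseline_2006_07, recent_2011_12), 2))  # 0.77
```

Standardizing in this way is what makes gains comparable across the 1-5 Employer/Alumni scales, the 1-4 CCSSE scale, and the 1-7 SSI scale.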


Indirect measures of students’ critical thinking skills: 2006-2007 through 2011-2012

Critical Thinking element       Average standardized gain score   Total N   # of measures
I. Effective Communication       0.7                                8,331        10
II. Problem Solving             -1.0                                8,269         7
III. Evaluation                 -0.7                                4,544         4
IV. Analysis                     0.4                                8,375         7
V. Synthesis                    -0.3                              107,713         5
VI. Reflection                   3.8                                3,825         1
Total                                                             141,057        33

Students and stakeholders reported positive perceptions regarding critical thinking skills in three of the six elements across thirty-three measures. The most notable improvement was in the area of Reflection (standardized gain score of 3.8), which is similar to the improvement demonstrated by students on the direct measures (Reflection, 4.3). Additional improvements in perceptions were observed in Effective Communication (0.7) and Analysis (0.4). A slight decline was observed in the area of Problem Solving (-1.0), which is in line with students’ demonstration of problem-solving skills on the direct measures (-0.7). Similar slight declines also occurred in the areas of Evaluation (-0.7) and Synthesis (-0.3), although these perceptions are much lower than the actual observed demonstration of these skills on the direct assessments (Evaluation, 1.4; Synthesis, 8.2). A comprehensive breakdown of standardized gain scores by measure can be found here.

(Link to excel workbook with tabs for each CT element.)

Data from Faculty Champions and Academic Round Tables

Following a yearly rollout schedule, academic roundtables (ART) were formed and guided by one or two lead faculty members identified by their program director/dean as faculty champions.

An ART is a learning community focused on academic disciplines or related discipline clusters.

The primary goal of the ART was to investigate general and discipline-specific strategies for teaching for critical thinking and to compile an instructional portfolio based upon Ernest L. Boyer’s (1990) Ongoing Cycle of Scholarly Teaching and the Scholarship of Teaching.

Members of the ART devised a strategy or intervention to teach for critical thinking within their curriculum.

For example, the Student Life Skills ART utilized a reusable learning object that stepped students through six elements of critical thinking. The Paralegal Studies ART focused on case assessment and developed a standard methodology to guide students through case briefing. The Respiratory Care ART applied a clinical problem-solving model to interactive exercises that students completed to solve clinical cases. The Mathematics ART integrated short problem-solving exercises into daily lessons to engage students in regular practice of critical thinking.

Classroom activities and outcomes are detailed in the instructional portfolios.

Provide results for examples.


While the portfolios focused at the program level, college-wide assessment was also conducted using the CAT, ARC, CCSSE, and SSI, among other measures.

1-1.2: By 2012, key stakeholders will report positively regarding improvements in critical thinking skills of SPC graduates.

Determination of compliance:

Faculty Champion Portfolio data

SPC faculty survey data, after administering in fall 2012.

Employer Survey

1-1.3: By 2012, students will report an increase in instructional practices improving critical thinking skills in the majority of modified courses or class activities across the curriculum.

Determination of compliance:

 Faculty Champion Portfolio data…did any of them conduct student surveys in classes where they modified instruction for critical thinking?

CCSSE

SSI items 6 and 13…for now. May remove SSI data in 1-1.1 and move to this section.

Students are not reporting on self-perceptions of skills in SSI, but rather if instructors have promoted specific skills.

Goal 1-2. Develop and use general and discipline-specific assessment tools and strategies for measuring students’ critical thinking skills.

1-2.1: By 2012, the majority of programs will have at least one discipline-specific critical thinking assessment tool or strategy for measuring students’ critical thinking skills.

Determination of compliance:

The Assessment Rubric for Critical Thinking (ARC) was developed by QEP staff and faculty champions during the inaugural year. As part of their study of critical thinking, faculty champions guided their academic roundtables to investigate critical thinking assessments and to compose discipline-specific scenarios aligned to the ARC.

Among the first disciplines to implement the ARC was the Ethics department, which integrated its use into the Critical Thinking Application Paper (CTAP) that students write in PHI 1600 – Studies in Applied Ethics.

Describe a few examples from the portfolios.


Goal 1-3: Collect student artifacts through ePortfolio.

Details regarding this goal can be found under the “changes to the QEP” section.

Goal 1-4. Implement critical thinking programs supported by key student organizations.

1-4.1: By 2012, each key student organization will have had at least one activity related to critical thinking annually.

Determination of compliance:

Critical thinking programs have been held on campus for students to participate in outside of class time. These ranged from large college-wide initiatives like the Great Debates to smaller campus events like Constitution Day. Many were sponsored by Student Life & Leadership and the Student Government Association.

The Extreme Entrepreneurship Tour (EET) was held in Fall 2010 and highlighted several young entrepreneurs who shared their experiences and strategies for building successful, innovative businesses. The objective was to raise students’ awareness of the importance of opening one’s mind to possibilities by making the link between an entrepreneurial spirit and critical thinking.

Pre- and post-surveys were administered that focused on workplace characteristics linked to critical thinking: being creative, examining assumptions before coming to a conclusion, considering different points of view, questioning why things are done in a certain way, and being involved in decision-making. Analysis of the survey results indicates that after attending the EET, participants rated all of these characteristics higher in terms of importance.

1-4.2: By 2012, the majority of students participating in student activities will report the activities add value to their development of critical thinking skills.

Determination of compliance:

Add student survey results from activities

INITIATIVE #2: Professional Development Initiative

Goal 2-1. Provide professional development opportunities to assist faculty in developing class activities to support “teaching for critical thinking.”

2-1.1: SPC will have developed advanced critical thinking seminars with a discipline-specific focus for identified disciplines.

Determination of compliance:

Beginning in the spring of 2008, a variety of professional development opportunities were provided to faculty and staff. The largest were the critical thinking institutes held each spring and fall. Institutes were organized as mini-conferences, with an opening keynote presentation followed by concurrent breakout sessions. Prominent scholars (L. Dee Fink, Gerald Nosich, David Sousa, Barry Stein, Milton Cox, Dean Kohrs, Johnny Good, and Edna Ross) were featured speakers. Faculty champions who had been engaged in the scholarship of teaching and learning with their academic roundtables led discipline-specific sessions, and QEP staff presented additional critical thinking concepts, including teaching and assessment strategies.

Attendance at the institutes ranged from 80 to over 300, and a steady increase was noted at the fall institutes, from 80 in 2008 to 155 in 2011. Annual critical thinking retreats were also held beginning in 2009. Retreats were designed to bring the previous year’s faculty champions together with the upcoming year’s faculty champions for a more relaxed and intimate opportunity for the exchange of ideas and expertise.

Faculty champions attended workshops as a group approximately five times per year to learn to research critical thinking in their discipline, to devise a critical thinking teaching intervention, to plan an assessment strategy, and to gain skills to compile evidence of their study in an online instructional portfolio.

Faculty attended scoring workshops to assess students’ critical thinking skills using the Critical Thinking Assessment Test (CAT), developed by Tennessee Technological University, or the Assessment Rubric for Critical Thinking (ARC), developed by SPC. CAT and ARC scoring workshops were opportunities for faculty to gain skill in assessing their students’ ability to think critically.

2-1.2: At least 75% of full-time faculty and the majority of adjuncts will have participated in seminars on “teaching for critical thinking.”

Determination of compliance:

A total of 1,104 faculty and staff attended the nineteen events over the five-year period (duplicated attendees across events). The nineteen events are categorized into five unique experiences: ARC workshops, CAT workshops, Critical Thinking Retreats, Fall Critical Thinking Institutes, and Spring Critical Thinking Institutes/Narrowing the Gulf conference.

Faculty survey data (to be administered fall 2012)

2-1.3: The majority of surveys and other forms of feedback on critical thinking seminars will be positive.

Determination of compliance:

Between fall 2008 and spring 2012, nearly 30 critical thinking events were held for faculty and staff professional development. The events were facilitated by the QEP liaison and other administrators, and many were held one or more times each academic year. A total of nineteen events were evaluated; individual evaluations were sent to faculty and staff who participated in those events to determine the perceived value of the information presented and their experiences. The nineteen events are categorized into five unique experiences: ARC workshops, CAT workshops, Critical Thinking Retreats, Fall Critical Thinking Institutes, and Spring Critical Thinking Institutes/Narrowing the Gulf conference. A total of 1,104 faculty and staff attended the nineteen events over the five-year period (duplicated attendees across events), and 476 responses were received, for a response rate of 43%. Overall, a strong majority (between 82% and 99%) of faculty and staff reported that the information they received and/or their experience from the event they attended will be useful to or enhance their teaching. In addition, nine in ten faculty/staff would recommend the scoring workshops to colleagues, and the same number indicated that they planned to attend the Critical Thinking Institutes (fall or spring) the next year, as illustrated in the table below.

Stakeholder Response to Critical Thinking Events: # of faculty and staff who agree that…

                       The event has or will      They would recommend    They plan to attend
Event                  impact/enhance/be useful   to colleagues           next year
                       to teaching
                         N       %                  N       %               N       %
ARC workshop             27     82%                 31     91%             n/a     n/a
CAT workshop             32     89%                 23    100%             n/a     n/a
Retreat                  62     98%                n/a     n/a             n/a     n/a
Fall CT Institute       186     95%                n/a     n/a             216     99%
Spring CT Institute      93     99%                n/a     n/a              85     97%

Faculty survey data (to be administered fall 2012)

The Critical Thinking Gateway website serves as the institution’s system for recording and archiving appropriate presentations for use in subsequent years.

Goal 2-2. Develop in-house critical thinking expertise (i.e., faculty champions) using a “train-the-trainer” approach.

2-2.1: SPC will have institutionalized the “Train-the-trainer” program in order to continue developing expertise.

Determination of compliance:

Faculty representing the range of programs at SPC were selected by their program director/dean to serve as a faculty champion for their discipline. In some cases, two faculty partnered for this position. Each semester, the faculty champion was paid a stipend to attend training and to lead a discipline-specific academic roundtable of peers.

Faculty champions attended conferences such as the International Lilly Conference on College Teaching, the Teaching Critical Thinking program at Tufts University, the International Conference on Critical Thinking, the train-the-trainer workshop for the Critical Thinking Assessment Test (CAT), and the Critical Thinking for Instruction and Learning online course provided through the Foundation for Critical Thinking and Sonoma State University. In addition to working with members of their academic roundtable, faculty champions shared their projects and relayed their expertise by giving presentations at the fall and spring critical thinking institutes.


Goal 2-3. Institute Academic Roundtables (ARTs) to investigate general and discipline-specific strategies for “teaching for critical thinking.”

2-3.1: SPC will have formed ARTs for the majority of General Education, A.S., and Baccalaureate programs.

Determination of compliance:

Under the leadership of a faculty champion, faculty and staff formed a discipline-specific academic roundtable to study critical thinking within their field, design a strategy to teach for critical thinking, implement the strategy, and assess its effectiveness. The process followed the model of the Scholarship of Teaching and Learning (SoTL), and efforts were documented in an online instructional portfolio.

2-3.2: By 2012, the majority of faculty participating in ARTs will affirm the value of ARTs to research strategies.

Determination of compliance:

Faculty survey data (to be administered fall 2012)

Between spring 2009 and spring 2012, four retreats were held for Academic Round Table (ART) members. A total of 145 ART members have attended the retreats. Nearly all faculty (98%) who responded to the evaluations of the retreat (n=62) reported that the information they received and/or their experience from the event they attended will be useful to or enhance their teaching, as illustrated in the table below.

INITIATIVE #3: Critical Thinking Resources Initiative

Goal 3-1. Compile electronic critical thinking resources for SPC faculty and staff organized through a College gateway website.

3-1.1: By 2012, the majority of faculty will identify the gateway website as a valuable source of information and ideas.

Determination of compliance:

The critical thinking gateway website has grown over the years to contain over 270 documents linked from 35 webpages. The gateway website is organized with resources for the three initiatives: student success, professional development, and critical thinking resources. For example, the catalog of critical thinking materials housed in the campus libraries is linked from the site, and videos of sessions, presentation files, handouts, and other materials from the critical thinking institutes are available. The gateway website is also home to minutes and other documentation of committees and groups working on the critical thinking initiative.

Additionally, resources to assist faculty champions and their academic roundtables were developed and compiled in a community group in SPC’s online learning management system.

These include tutorials, videos, checklists, and links to online resources.


Goal 3-2. Create and collect critical thinking reusable learning objects (RLOs) for SPC and other institutions in Florida and across the world that are seeking multimedia/electronic critical thinking materials.

3-2.1: By 2012, the majority of faculty will identify the gateway website as a valuable source of information and ideas.

Determination of compliance:

Reusable Learning Objects (RLOs) were collected and incorporated into 31 learning activities for a variety of disciplines. For example, one activity uses a chaos game RLO to have students apply their problem-solving skills to determine a solution in mathematics. Another uses a skeletal system RLO so that students learn the bony structures and build foundational knowledge of the importance of the skeletal system. Another RLO, for reaction time, is used with students studying how to clicker train small animals, particularly how timing affects outcomes. RLO guidelines developed by SPC were consulted by faculty as they selected RLOs and designed the learning activities. The guidelines address elements such as ease of use, interactivity, meaningfulness, and feedback.

3-2.2: By 2012, the majority of RLOs will receive favorable feedback in the form of positive student and faculty reactions.

Determination of compliance:

Faculty survey data (to be administered fall 2012)

Goal 3-3. Contribute to the critical thinking literature through presentation and publication of instructional portfolios of strategies that support “teaching for critical thinking.”

3-3.1: By 2012, instructional portfolios will be available for the majority of programs at the College.

Determination of compliance:

As academic roundtables progressed through the steps of studying critical thinking within their disciplines, and implementing and assessing strategies, they documented their efforts in an online instructional portfolio. The roundtables followed the Scholarship of Teaching and Learning (SoTL) model, which was mirrored by the portfolio format: consult literature, choose and apply an intervention, conduct systematic observation, document observations, analyze results, obtain peer evaluation, identify key issues, synthesize results, and place into the context of the knowledge base.

Instructional portfolios were completed for the following disciplines: Business, Business Technologies, Communication, Computer/Information Technology, Dental Hygiene-Orthotics/Prosthetics, Early Childhood, Education, Emergency Medical Services, Ethics, Funeral Services, Health Information Management, Hospitality/Tourism Management-Parks/Leisure Services, Human Services, Humanities/Fine Arts, Library, Mathematics, Medical Laboratory Technology, Natural Science, Nursing, Paralegal Studies, Physical Therapist Assistant, Public Safety Administration, Radiography Care, Sign Language Interpretation, Social/Behavioral Sciences, Student Life Skills, Veterinary Technology BAS, and Veterinary Technology AS.

Describe a couple portfolios?

3-3.2: By 2012, the majority of faculty will give a positive rating to the peer presentations and portfolios on teaching for critical thinking.

Determination of compliance:

Faculty survey data (to be administered fall 2012)

Goal 3-4. Acquire and use print and multimedia critical thinking resources available at Critical Thinking Resource Centers housed in campus libraries.

3-4.1: By 2012, the majority of faculty will identify the Critical Thinking Resource Centers as valuable sources of information and ideas.

Determination of compliance:

Faculty survey data (to be administered fall 2012)

Critical thinking resources are accessible online via the critical thinking gateway and also housed in the campus libraries. Items considered to be part of the critical thinking collection are specially cataloged.

To aid their literature review, faculty champions were provided a set of books borrowed from the critical thinking collection. Along with supporting faculty in conducting a literature review, library staff provided sessions aimed at acquainting faculty with the materials contained in the critical thinking collection.

List some titles?

4. Reflection on What We Learned

