Critical Thinking Assessment Test (CAT) Scoring Session
Workshop Summary
Date: July 11, 2008
Location: SPC CL Campus, Crossroads Gallery Room 156
Attendees: Earl Fratus, Michael Earle, Lynn Grinnell, Mark Peebles, Mark Lulek, Maureen Mahoney, Mary Ann Goodrich, Barbara Scarsbrook, Ann McNichol, Brenda Collins, Nancy Watkins, Gail Lancaster, Anne Sullivan, Jesse Coraggio, Maggie Tymms
Background:
“The CAT instrument is a unique tool designed to assess and promote the
improvement of critical thinking and real-world problem solving skills. The instrument
is the product of extensive development, testing, and refinement with a broad range
of institutions, faculty, and students across the country. The National Science
Foundation has provided support for many of these activities. The CAT Instrument is
designed to assess a broad range of skills that faculty across the country feel are
important components of critical thinking and real world problem solving. The test
was designed to be interesting and engaging for students. All of the questions are
derived from real world situations. Most of the questions require short answer essay
responses and a detailed scoring guide helps insure good scoring reliability”.
(Tennessee Tech University, Critical Thinking Assessment Test Overview).
In collaboration with Tennessee Technological University and with support from the
National Science Foundation, St. Petersburg College received a grant to administer
the Critical Thinking Assessment Test (CAT) instrument to a representative sample of
approximately 100 students enrolled in the College during 2008. Three SPC
administrators attended a regional training workshop at Tennessee Technological
University in May 2008.
Subsequently, eighty-seven CAT assessments were
administered to SPC students enrolled in the courses listed below. For a copy of the
Student Consent form, please see Appendix A.
Table 1
Distribution of Students by Course
Course      Discipline                  Completed CAT Assessments
PHI 1600    Ethics                      29
PHI 1600    Ethics                      18
EEC 2300    Early Childhood Education   15
SLS 1101    Life Skills                 13
PCB 3043C   Ecology                     12
Description:
SPC faculty were invited to participate in the CAT scoring session in July; for a copy of the faculty recruiting letter, please see Appendix B.
The CAT Scoring Session was held on July 11, 2008, at the Clearwater Campus of St. Petersburg College. One hundred CAT assessments were originally purchased from Tennessee Tech University; eighty-seven were administered, and all eighty-seven were scored that day.
Copies of the CAT Scoring Session agenda and faculty consent form are located in Appendices C and D, respectively.
Most of the scoring faculty (13) and facilitators (2) arrived by 8:15 a.m. and
participated in a breakfast buffet. Each faculty member was given an agenda, a
consent form to sign, and a bin containing six or seven assessments. When most
participants had arrived, Jesse Coraggio welcomed everyone, thanked them for
participating, and asked for introductions to be made. Each attendee was then given
the scoring rubric and the CAT overview was presented using a screen projector.
The CAT overview consisted of a history and synopsis of the CAT development process, the purpose of the assessment as a tool for improving student success, best practices, and the importance of assessing critical thinking skills.
Please see Figures 1, 2, and 3 below, taken from the CAT Overview presented to the scoring faculty. Figure 1 presents the history of CAT development, Figure 2 the development of the CAT instrument, and Figure 3 best practices for improving critical thinking.
Figure 1. History of CAT development.
Source: CAT Overview, Center for Assessment & Improvement of Learning, Tennessee Tech University
2008.
Figure 2. Developing the CAT instrument.
Source: CAT Overview, Center for Assessment & Improvement of Learning, Tennessee Tech University
2008.
Figure 3. Best practices for improving critical thinking.
Source: CAT Overview, Center for Assessment & Improvement of Learning, Tennessee Tech University
2008.
Following the CAT Overview presentation and some group discussions, the CAT scoring
sessions began. During each CAT scoring session, the procedure listed below was
followed for each question, beginning with test item number one.
1. The CAT Training Module, presented on a projection screen, provided the
criterion and scoring rubric for a specific test item.
2. Next, a sample test item was presented on the screen, and the presenter on the training module discussed and scored various responses based on the scoring rubric for that item.
3. Lastly, each scorer reviewed the response to that item on his or her first assessment and scored it using the rubric. This process was repeated for each of the six or seven assessments in the scorer's bin.
4. Scorers who encountered a response which did not clearly follow the rubric
discussed the response with the group for clarification.
5. Each scorer then passed the scored assessments to the person on their right,
and the same test item on all assessments was scored by the second scorer.
6. In the event that two scores differed, the assessment was provided to a third
scorer, and a third score was recorded.
7. When all scoring for the specific test item on all assessments was completed,
the assessments were collected and redistributed randomly.
8. Finally, steps 1 through 7 were repeated for each test item until all questions were scored. (A brief illustrative sketch of the double-scoring logic in steps 5 and 6 follows this list.)
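The double-scoring logic in steps 5 and 6 can be summarized in a few lines of code. The sketch below is only illustrative: the report does not state how a final value was chosen once a third score was recorded, so the median used here is an assumption, and the function and variable names are hypothetical.

import statistics

def resolve_item_score(score_a, score_b, score_c=None):
    """Resolve one test item scored independently by two faculty graders.

    If the first two scores agree, that score stands. If they differ, a
    third score is required; taking the median of the three is only an
    illustrative assumption, since the report does not say how the final
    value was determined.
    """
    if score_a == score_b:
        return score_a
    if score_c is None:
        raise ValueError("Scores differ; a third scorer is needed.")
    return statistics.median([score_a, score_b, score_c])

# Agreement: the shared score stands.
print(resolve_item_score(3, 3))             # -> 3
# Disagreement: a third score is recorded and used to settle the item.
print(resolve_item_score(2, 4, score_c=4))  # -> 4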
A fifteen-minute break was offered after each hour of scoring, and a one-hour working lunch provided ample time for discussion and review of the morning scoring sessions. Once the scoring of all assessments was complete, a forty-five-minute review and discussion session followed. The day came to a close at approximately 4:00 p.m.
The eighty-seven graded assessments and thirteen unused assessments were returned
to Tennessee Tech University, together with all the scoring material as required.
Results:
The results of the eighty-seven scored assessments show a mean score of 13.8 out of a highest possible score of 38, with a maximum observed score of 27 and a standard deviation of 5.8. There were 28 male and 57 female students (two did not indicate gender), ranging in age from 18 to 61. The students reported having earned between 3 and 157 credits and came from five different course sections. The assessments were aggregated by gender, age, number of credits, course, and Grade Point Average (GPA).
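The aggregation described above could be reproduced with a short script along the lines of the sketch below. It is illustrative only: the file name cat_scores_2008.csv and the column names (score, gender, age, credits, course, gpa) are hypothetical stand-ins for however the scored assessments and student records were actually stored.

import pandas as pd

# Hypothetical file and column names; one row per scored assessment.
df = pd.read_csv("cat_scores_2008.csv")  # columns: score, gender, age, credits, course, gpa

# Overall descriptive statistics reported above (mean, standard deviation, maximum).
print(df["score"].agg(["count", "mean", "std", "min", "max"]).round(1))

# Breakdowns by gender and by course section (Tables 2 and 5).
for column in ["gender", "course"]:
    print(df.groupby(column)["score"]
            .agg(["count", "mean", "std", "min", "max"])
            .round(1))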
There was a slight difference between the mean scores of male and female students. This could be attributed to the smaller number of males assessed, coupled with the fact that one of the 28 males received zero points on the assessment. Details can be seen in Table 2.
Table 2
CAT Score by Gender
CAT Score 2008 by Gender
Gender                 Total   Mean   Standard Deviation   Minimum   Maximum
Male                   28      13.1   6.5                  0         27
Female                 57      14.2   5.6                  4         25
Gender Not Indicated   2       14.0   0.0                  14        14
Students were divided into three age categories, selected based on standard college student age ranges, and results were calculated for each. They
included ‘18 to 25’, ‘26 to 44’, and ‘Over 44’. There was little difference in the mean
score of students based on age, as seen in Table 3.
Table 3
CAT Score by Age
CAT Score 2008 by Age
Age Range   Total   Mean   Standard Deviation   Minimum   Maximum
18 to 25    54      13.6   5.4                  0         27
26 to 44    25      14.7   6.0                  6         27
Over 44     8       13.5   8.1                  1         24
Students were also divided into categories based on the number of credits earned. The cut points were chosen so that the groups would be close to equal in size. The groups were: less than 10 credits earned (24.1% of students), 11 to 30 credits (26.4%), 31 to 50 credits (20.7%), and more than 50 credits (28.7%). There was a notable difference between the students with more than 50 credits and the other groups, as seen in Table 4: the mean, minimum, and maximum scores were all higher for this group, while the standard deviations were similar across groups. A brief sketch of how this grouping could be reproduced follows Table 4.
Table 4
CAT Score by Credits
CAT Score 2008 by Credits Earned
Credits        Total   Mean   Standard Deviation   Minimum   Maximum
Less than 10   21      10.9   4.8                  0         22
11 to 30       23      11.9   4.9                  4         21
31 to 50       18      12.2   4.8                  1         22
More than 50   25      19.2   4.5                  13        27
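As noted above, a grouping like this could be reproduced with a simple binning step. In the sketch below the data frame and column names are hypothetical (as in the earlier sketch), and the exact boundary handling is an assumption, since the report only names the four bands.

import pandas as pd

# Hypothetical data frame with one row per student (see the earlier sketch).
df = pd.read_csv("cat_scores_2008.csv")

# Credit bands used in Table 4. right=False makes each band include its
# lower edge; the report does not say how boundary values were assigned.
df["credit_group"] = pd.cut(df["credits"],
                            bins=[0, 10, 30, 50, float("inf")],
                            labels=["Less than 10", "11 to 30",
                                    "31 to 50", "More than 50"],
                            right=False)

print(df.groupby("credit_group")["score"]
        .agg(["count", "mean", "std", "min", "max"])
        .round(1))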
Student scores were also aggregated by the course section in which students were enrolled when the CAT was administered. The mean was noticeably higher for students enrolled in the PCB 3043C course: 21.8, with a minimum of 14, as shown in Table 5.
Table 5
CAT Scores by Course
CAT Score 2008 by Course Section
Course      Total   Mean   Standard Deviation   Minimum   Maximum
PCB 3043C   12      21.8   4.5                  14        27
EEC 2300    15      13.0   5.2                  5         23
PHI 1600    30      13.6   4.6                  1         21
PHI 1600    17      13.4   4.7                  6         22
SLS 1101    13      8.7    4.3                  0         17
Since there seemed to be a notable difference for the PCB 3043C course, an analysis was conducted to examine the relationship between the number of credits earned and the course in which the student was enrolled. All of the students enrolled in PCB 3043C had earned more than 50 credits, while SLS 1101, the course with the lowest mean score, had no students with more than 50 credits. The higher mean score may therefore be attributable to credits earned rather than to the course in which the student was enrolled when the CAT was administered. The details of this analysis can be seen in Table 6; a brief sketch of the cross-tabulation follows the table.
Table 6
Crosstab Between Course and Number of Credits Earned
Credits Earned   PCB 3043C   EEC 2300   PHI 1600   SLS 1101   Total
Less than 10     0           2          7          12         21
11 to 30         0           4          18         1          23
31 to 50         0           4          14         0          18
More than 50     12          5          8          0          25
Total            12          15         47         13         87
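The crosstab above is a standard contingency-table operation. The sketch below is again illustrative, reusing the hypothetical data frame and the credit bands from the earlier sketches.

import pandas as pd

# Hypothetical data frame; credit bands built as in the earlier binning sketch.
df = pd.read_csv("cat_scores_2008.csv")
df["credit_group"] = pd.cut(df["credits"],
                            bins=[0, 10, 30, 50, float("inf")],
                            labels=["Less than 10", "11 to 30",
                                    "31 to 50", "More than 50"],
                            right=False)

# Contingency table of credit band by course section, with totals (Table 6).
print(pd.crosstab(df["credit_group"], df["course"],
                  margins=True, margins_name="Total"))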
The CAT scores were also compared by GPA. The students without a recorded GPA had the lowest mean score (7.0). There was some difference between the students with a 2.5 to 2.9 GPA (mean of 12.4) and the students with a 3.0 to 3.4 or 3.5 to 4.0 GPA (means of 15.7 and 15.0, respectively). It is difficult to draw comparisons for students with a GPA below 2.5 because there were so few in those groupings, and there was noticeable variability within the 2.0 to 2.4 GPA group. Details are shown in Table 7; a brief sketch of how the GPA bands could be formed follows the table.
Table 7
CAT Scores by GPA
CAT Score 2008 by GPA
GPA                                Total   Mean   Standard Deviation   Minimum   Maximum
No GPA reported                    3       7.0    6.2                  0         12
Greater than 0 and less than 2.0   3       14.0   6.2                  9         21
2.0 to 2.4                         9       14.0   7.0                  4         27
2.5 to 2.9                         30      12.4   4.9                  1         23
3.0 to 3.4                         21      15.7   5.9                  4         25
3.5 to 4.0                         21      15.0   5.8                  8         27
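The GPA bands could be formed the same way, with one extra step to keep the students who have no recorded GPA as their own group. As before, the data frame and column names are hypothetical and the boundary handling is an assumption.

import numpy as np
import pandas as pd

# Hypothetical data frame; 'gpa' is missing (NaN) for students with no recorded GPA.
df = pd.read_csv("cat_scores_2008.csv")

# GPA bands used in Table 7. np.inf keeps a GPA of exactly 4.0 in the top
# band, and right=False makes each band include its lower edge; the report
# only names the bands, so this boundary handling is an assumption.
bands = pd.cut(df["gpa"],
               bins=[0, 2.0, 2.5, 3.0, 3.5, np.inf],
               labels=["Greater than 0 and less than 2.0", "2.0 to 2.4",
                       "2.5 to 2.9", "3.0 to 3.4", "3.5 to 4.0"],
               right=False)

# Keep students with no recorded GPA as a separate group rather than dropping them.
df["gpa_group"] = bands.cat.add_categories("No GPA reported").fillna("No GPA reported")

print(df.groupby("gpa_group")["score"]
        .agg(["count", "mean", "std", "min", "max"])
        .round(1))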
Conclusion:
There are early indications of a relationship between the number of credits earned and a student's score on the CAT. This could have positive implications as an indicator for the college. The second major requirement for meeting the accreditation standards of the Southern Association of Colleges and Schools (SACS) is a Quality Enhancement Plan (QEP): a faculty-driven initiative, with broad-based involvement, that addresses a significant issue related to student learning. Critical thinking has been the QEP focus at SPC, and this measure will assist the institution as one of multiple measures assessing SPC's ability to carry out the QEP.
These results suggest an increase in critical thinking skills for students who have completed more than 50 credits of coursework. There are, however, some limitations in the analysis. The students with more than 50 credits who were given the assessment not only had earned a large number of credits but had also chosen to continue their education, which makes it difficult to conclude that the number of credits is the primary cause of the higher scores. There were only a few males in the tested group (28), and the age distribution was slightly younger (62% under 25) than the overall population at SPC, which, according to the 2008-09 Fact Book, has only 54% of students under the age of 25.
Despite these limitations in the data collection, the overall accomplishments of the
grant were highly beneficial to St. Petersburg College. The faculty who received the
training and had the opportunity to utilize the scoring rubric will have transferable
skills they can use in the future with their students. The administrators and faculty
who conducted the training are able to continue to provide professional development
to faculty. The continued use of quantifiable instruments to evaluate St. Petersburg College's implementation of the critical thinking initiative is another example of SPC's Institutional Effectiveness model for continuous improvement.
References:
Tennessee Tech University. Critical Thinking Assessment Test Overview. Retrieved July 17, 2008, from http://www.tntech.edu/cat/
Appendix A: Informed Consent Form for Student Participants
St. Petersburg College
Informed Consent Form
For Student Participants
PURPOSE, BENEFITS, and Risks
The Center for Assessment and Improvement of Learning at Tennessee Technological University has an
NSF funded grant to nationally disseminate an innovative assessment instrument and to encourage its
use for improving students’ critical thinking skills. St. Petersburg College has been invited to
participate in this NSF funded grant and will use the results of this critical thinking assessment to assist
in supporting the institution’s Quality Enhancement Plan (QEP). This test involves no known risks, as it
is a measure of intellectual performance.
PROCEDURES
The test involves short answer, essay questions. The test also contains demographic questions to
evaluate the cultural fairness of the test. The test takes approximately one hour to complete. Your
participation in this activity is voluntary, and if you refuse to participate there is no penalty.
OTHER INFORMATION and Confidentiality of Responses
We may need to access information about your academic performance to help evaluate this new test.
Therefore, you will be asked to provide your student identification number. This information will be
kept confidential.
Questions
You are encouraged to ask your instructor questions about anything you do not understand prior to
signing this form. You may also contact Jesse Coraggio, Assessment Coordinator for Academic
Programs, in the Department of Institutional Research and Effectiveness at 341-3084 or
Coraggio.jesse@spcollege.edu.
PARTICIPANT'S STATEMENT
I have read the statements above and I have had the opportunity to ask questions about this study.
I agree to participate.
_______________________________________
Student printed name
_______________________________________
Student Identification Number
_______________________________________
Student signature
_______________________________________
Date
Appendix B: Invitation to participate in CAT Scoring Workshop
Dear colleagues,
You are cordially invited to attend an all-day Critical Thinking Assessment Test (CAT) scoring workshop
on Friday, July 11th, in the Crossroads Gallery room 156, Clearwater campus (map attached).
The workshop is scheduled from 8:00 until 5:00 but we hope to be done before that. Faculty will be
paid approximately $125 ($150 minus taxes, etc.) for the day if you can arrange your normal duty hours
so that Friday is not a duty day for you (e.g., by working longer hours other days or taking a personal
day - check with your Dean or Program Director). Because it is an all-day workshop, lunch and snacks
will be plentiful.
We believe you will find the day interesting. The CAT website says: "The CAT Instrument is designed to
assess a broad range of skills that faculty across the country feel are important components of critical
thinking and real world problem solving. The test was designed to be interesting and engaging for
students. All of the questions are derived from real world situations. Most of the questions require
short answer essay responses and a detailed scoring guide helps insure good scoring reliability. … The
instrument is the product of extensive development, testing, and refinement with a broad range of
institutions, faculty, and students across the country. The National Science Foundation has provided
support for many of these activities."
During the workshop, faculty graders will be trained and simultaneously score the CAT instrument.
Faculty graders will also complete several questionnaires designed to assess the scoring workshop and
the adequacy of the training. The scoring workshop will take approximately 7-8 hours. Your
participation in the activity is voluntary.
The CAT scoring workshop will provide faculty graders with an opportunity to:
- Understand student strengths and weaknesses in critical thinking skills and real-world problem solving
- Discuss new pedagogical methods (best practices) that could impact student learning
- Work with faculty in other disciplines who share your interest in improving teaching and student learning
We hope you will be able to attend -- if you can't, please let us know right away because we need
about fifteen faculty members to accomplish the goals of the NSF grant.
Thanks for your participation - please contact QEC faculty chair Gail Lancaster, IE Director Jesse
Coraggio, QEP Director Lynn Grinnell, or faculty champions Maureen Mahoney (TS) or Anne Sullivan
(SP/G) if you have any questions.
Appendix C: CAT Scoring Session Agenda
CAT Scoring Session Agenda
SPC CL Campus, Crossroads Gallery Room 156
July 11, 2008
Start Time   Activity
8:00         Breakfast
8:15         Welcome and Introductions
8:30         CAT Overview
9:00         CAT Scoring Session
10:00        Break
10:15        CAT Scoring Session
11:15        Break
11:30        CAT Scoring Session
12:30        Lunch, Discussion, and Review
1:30         CAT Scoring Session
2:30         Break
2:45         CAT Scoring Session
3:45         Break
4:00         CAT Scoring Session
5:00         End CAT Scoring Session
Appendix D: Informed Consent Form for Faculty Grader
St. Petersburg College
Informed Consent Form
For Faculty Grader
PURPOSE, BENEFITS, and RISKS
The Center for Assessment and Improvement of Learning at Tennessee Technological
University has an NSF funded grant to nationally disseminate an innovative assessment
instrument and to encourage its use for improving students’ critical thinking skills.
The scoring and evaluation of this test involves no known risks.
PROCEDURES
Faculty graders will be trained and simultaneously score a test consisting of mostly
short answer, essay questions (the CAT instrument). Faculty graders will also
complete several questionnaires designed to assess the scoring workshop and the
adequacy of the training. The scoring workshop will take approximately 7 – 8 hours.
Your participation in this activity is voluntary, and if you refuse to participate there is
no penalty.
OTHER INFORMATION and CONFIDENTIALITY OF RESPONSES
The scores you assign to individual test questions will be anonymous as will any
questionnaires that you complete.
QUESTIONS
You are encouraged to ask questions about anything you do not understand prior to
signing this form.
PARTICIPANT’S STATEMENT
I have read the statements above and I have had the opportunity to ask questions
about this workshop.
I agree to participate in this workshop.
Printed name
Signature
Date