
Academic Readiness in Reading
MIDTERM EVALUATION REPORT
_______________________________________________________
February 2010
Prepared in accordance with Memorandum of Agreement between the
Collaborative Center for Literacy Development and the Office of Undergraduate
Education at the University of Kentucky
By Dr. Laurie A. Henry, College of Education, University of Kentucky
©2010 Collaborative Center for Literacy Development
UNIVERSITY OF KENTUCKY
TABLE OF CONTENTS
EXECUTIVE SUMMARY
I. OVERVIEW, PROGRAM PURPOSE AND GOALS
II. READINESS IN READING COURSE DESCRIPTION
III. STUDENT ENROLLMENT IN READING LAB COURSE
IV. TRAINING WORKSHOP FOR TEACHING ASSISTANTS
V. OVERVIEW OF EVALUATION ACTIVITIES
VI. DATA COLLECTION, ANALYSES, FINDINGS, AND LIMITATIONS
    TRAINING WORKSHOP
    READING LABS
        INDICATOR 1. COMPASS READING ASSESSMENT
        INDICATOR 2. PLATO READING ASSESSMENT
        INDICATOR 3. STUDENT MOTIVATION SCALES
        INDICATOR 4. SEMESTER GRADES
        INDICATOR 5. SEMESTER GPA
        INDICATOR 6. WEEKLY REPORTS
        INDICATOR 7. FOCUS GROUP INTERVIEWS
VII. CONCLUSIONS
VIII. RECOMMENDATIONS
REFERENCES
APPENDIX A – A & S 100 COURSE SYLLABUS
APPENDIX B – SUMMARY OF TRAINING WORKSHOP FEEDBACK
APPENDIX C – PLATO READING ASSESSMENT/CURRICULUM
APPENDIX D – STUDENT MOTIVATION SCALE
APPENDIX E – FOCUS GROUP INTERVIEW PROTOCOL
EXECUTIVE SUMMARY
The Collaborative Center for Literacy Development (CCLD) entered into a Memorandum
of Agreement with the Office of Undergraduate Education for the main purpose of designing and
implementing a pilot program, referred to as Academic Readiness in Reading, as part of the
Academic Readiness Program at the University of Kentucky. This pilot program consisted of two
main components: 1) Development and facilitation of Reading Labs (A&S 100) implemented
during the Fall 2009 semester; and 2) Development and facilitation of Reading Clinics
implemented during the Spring 2010 semester. This Midterm Evaluation Study[1] was conducted
to determine if the pilot of the Reading Labs (during the Fall 2009 semester) effectively met the
following objectives:
• To increase students’ comprehension and reading level
• To foster an increased motivation to learn in the areas of literacy and paired content course (i.e., anthropology, history, sociology, agriculture)
• To foster independent literacy and study skills
This evaluation was conducted during the Fall semester of the 2009-2010 academic year. The
evaluator looked for evidence of efficacy and sustainability of the Reading Labs based upon the
collected data.
Data Points Used for Program Evaluation
Data collection consisted of multiple quantitative and qualitative measures including the
following:
• COMPASS reading sub-scores as a pre/post assessment measure
• PLATO Fast Track Reading Assessment, Level I, as a pre/post assessment measure
[1] A full evaluation report for both the Reading Labs and Reading Clinics will be completed following implementation of the Reading Clinics during the Spring 2010 semester.
• Student Motivation Scale (for Reading and paired content course) as a pre/post measure
• Fall 2009 semester grades (for Reading Lab and paired content course)
• Fall 2009 semester Grade Point Average (GPA)
• Weekly reports provided by the teaching assistants
• Focus group interviews
These multiple data points allowed for a more thorough understanding of the efficacy of the Reading Lab implementation and its impact on student success, making it possible to formulate justifiable conclusions and recommendations.
Conclusions and Recommendations
The following conclusions and recommendations were derived from all of the data sources identified above. They are intended as a guide to the next steps the University may choose to take for the Academic Readiness in Reading program.
Conclusions
• Program personnel were successful in planning and implementing the summer training workshop to train teaching assistants to facilitate the Reading Labs.
• Students may not have viewed the COMPASS assessments as important to their overall success, which may account for the decrease in scores between the pre- and post-administrations.
• Students did not like the PLATO online system, which may have led to increased stress levels.
• Students held negative views of both reading and their paired content course as measured by the Student Motivation Scale.
• The majority of students (90%) were successful in the Reading Lab course, earning a final grade of C or better.
• The majority of students (79%) were successful in their paired content course, earning a final grade of C or better.
• The majority of students (71%) experienced academic success by earning a GPA of 2.0 or higher.
• The meeting time for the Reading Labs was too short, and meeting locations were at times problematic (in relation to both space and AV and technology issues).
• Students viewed having teaching assistants who were content experts as positive and very helpful; they also saw the teaching assistants as role models.
• Teaching assistants did not address all of the instructional objectives related to the Reading Lab content.
• Many students stated that their participation in the Reading Labs helped them be more successful in their content courses.
• Many students reported using specific reading and study strategies in their content courses.
Recommendations
• Develop a streamlined, more accurate procedure for documenting and providing the data collected by the Office of Undergraduate Education (e.g., COMPASS scores, grades, GPA).
• Place greater emphasis on the importance of the reading assessments to the students. Consider an alternate reading assessment that could be administered during the midterm and/or final exam periods.
• Administer the Student Motivation Scale through an online platform to reduce human error when recording scores. Include a student identifier so results of the motivation scale can be further analyzed for individual students.
• Conduct follow-up interviews at the end of the Spring 2010 semester with students who were enrolled in the Reading Labs to determine whether they continued to use the reading and study strategies they learned.
• Revise the summer training workshop according to the feedback provided by the teaching assistants regarding length and other recommendations.
• Revise the summer training workshop to place additional emphasis on the instructional objectives.
• Revise the course syllabus to include additional details regarding the dossier assignment. Include a reflective paper, completed at the end of the semester, focused on the application of the learned strategies as documented through the dossier assignments. Develop a common rubric or scoring metric to assess the dossier content.
• Remove the requirement related to the use of the PLATO online system. Consider the use of alternate support services available through other units at the university. If PLATO use is continued, provide on-site supervision and support.
• Compare data collected on Reading Lab students (e.g., attendance rates, grades, and GPA) to those of students of similar standing not enrolled in the Reading Labs to better determine overall academic success.
• Increase the class time for the Reading Labs to a minimum of 75 minutes per week to allow for additional emphasis on the practice and application of strategies, and increase the number of credit hours earned accordingly to elicit additional buy-in from the students.
• Consider the use of teaching assistants from the College of Education who are content experts but also possess foundational knowledge in curriculum and instruction to ensure that learning activities and assessments address all instructional objectives.
OVERVIEW, PROGRAM PURPOSE AND GOALS
This evaluation study examined the efficacy of the Readiness in Reading component of
the Academic Readiness Program at the University of Kentucky. The purpose of this report is to
provide a midterm program evaluation of the impact of the Readiness in Reading pilot program
implemented during the Fall 2009 semester. The target population for enrollment in this program
was first-time, first-year, full-time students having one or more deficit areas in reading skill level.
The Office of Undergraduate Education (OUE) created the Readiness in Reading program as a
supplemental college reading program to promote successful undergraduate education for the
target population. The OUE determined initial eligibility and student enrollment criteria,
developed the format for facilitation with 50-minute weekly Reading Lab courses, and staffed
the Reading Labs with content area Teaching Assistants. The Collaborative Center for Literacy
Development (CCLD) entered into a Memorandum of Agreement (June 2009) in order to
provide staff and faculty who would help “craft, implement and assess the success” of this
program. Personnel who facilitated this program included Dr. Ellen Godbey at CCLD, who
assumed the role of Reading Lab Coordinator, and Dr. Laurie A. Henry, a literacy faculty
member from the Department of Curriculum and Instruction in the College of Education, who
provided ongoing support for the development and implementation of the pilot program as well
as assuming the role of evaluator. Dr. Carol Eades, Associate Director of CCLD, was acting
supervisor for the facilitation of the program.
The main purpose of the Readiness in Reading component of the Academic Readiness
Program was to “promote successful undergraduate educational experiences for students whose
ARP reading plans include enrollment in supplemental college reading instruction, tutoring, and
mentoring” with an end result of providing “better support services to incoming first-year
students with low reading scores, more efficient and effective use of resources currently in place
for all UK students, and to help eliminate barriers for students in attaining their educational
goals” (see Memorandum of Agreement [MOA] stated Purpose and Goals, 2009). The three
goals of this pilot program included the following:
• Help improve the University of Kentucky retention and graduation rates
• Create a first-year, cross-college academic developmental model in critical reading skills
• Increase student diversity and success
This evaluation report will address each of these specific program goals to determine if the pilot
of the Reading Labs effectively met the following objectives:
• To increase students’ comprehension and reading level
• To foster an increased motivation to learn in the areas of literacy and paired content course (i.e., anthropology, history, sociology, agriculture)
• To foster independent literacy and study skills
This midterm evaluation was conducted at the mid-point of the Readiness in Reading pilot
program. The evaluator used multiple data points to provide evidence of the efficacy and
sustainability of the Readiness in Reading course (known as A&S 100) implemented during the
Fall 2009 semester. This report describes the activities conducted for the evaluative portion of
this pilot program and summarizes the data collection, data analyses procedures, and findings.
The final portion of the report provides recommendations to the Office of Undergraduate
Education for future planning in relation to the academic preparedness of the targeted population
(i.e., first-time, first-year, full-time students).
READINESS IN READING COURSE DESCRIPTION
The Readiness in Reading component of the Academic Readiness Program for Fall 2009
consisted of the design and implementation of Reading Labs (A&S 100) that were “linked” to
content courses in anthropology (ANT 160), history (HIS 108), sociology (SOC 101), and
agriculture (GEN 109). Students who successfully completed the requirements of the Reading Lab course earned one credit hour toward their degree program. The Reading Labs met once per week for a period of 50 minutes over the 16 weeks of the Fall 2009 semester and were conducted by six content-specific graduate student teaching assistants recommended by faculty in the various departments of the College of Arts and Sciences in conjunction with the Office of Undergraduate Education. The teaching assistants utilized a common syllabus developed by Dr. Godbey (see Appendix A) and a common core textbook (Van Blerkom & Mulcahy-Ernt, 2005) to facilitate instruction in the Reading Labs. The syllabus was designed as a template so the teaching assistants could customize it to their own instructional style. Dr. Godbey also provided the teaching assistants with slide presentations that
coincided with each chapter of the core text, suggested assignments to address the stated
instructional objectives, and access to other texts from her personal library as additional
resources. An overview of these materials was provided to the teaching assistants during a 5-day
summer workshop developed and facilitated by Dr. Godbey and Dr. Henry prior to the start of
the Fall 2009 semester.
STUDENT ENROLLMENT IN READING LAB COURSE
Students targeted for enrollment in the Reading Lab course included first-time, first-year,
full-time students who were identified by the Office of Undergraduate Education as having one or
more deficit areas in reading skill level. Table 1 provides a description of the placement
recommendations that were used.
Table 1.
Placement Recommendations for incoming first-time, first-year freshmen with reading deficits

ACT      COMPASS   SAT*       Placement Recommendation
17-19    75-85     411-470    Enrolled in one of four targeted courses paired with one-hour reading lab.
                              • Students are taught by trained TAs using course materials.
                              • Ongoing evaluation and monitoring of student progress through portfolio.
14-16    64-74     351-410    Enrolled in targeted course with paired one-hour reading lab.
                              • In addition: Independent Online Intervention (PLATO), 2 hours per week minimum for the semester.
< 13     < 63      < 350      Enrolled in targeted course with paired one-hour reading lab.
                              • In addition: Enrolled in Reading Clinic during Spring 2010 semester for intensive one-on-one skill and strategy development.
                                o Limited to 10 students in clinic
                                o Minimum 2 hours per week in clinic

*This table represents the most common cut-offs in use for college placement classes. An SAT score of 480 is considered “college ready” in reading.
At the start of the Fall 2009 semester, a group of identified freshmen (n=112) were enrolled in
one of 16 sections of the A&S 100 Reading Lab course offered across several different days and
at various times in order to meet scheduling demands of the students. However, only 14 sections
of the A&S 100 Reading Lab course were populated with students. Two sections (014 and 018)
did not have any student enrollment and were thus cancelled. Student enrollment in the Reading
Lab sections ranged from a low of 3 students to a high of 17 students with an average class size
of 8 students. Approximately 10 percent (n=11) of the enrolled students dropped the course prior to the start of the semester, leaving a total of n=101 students who began in the Reading Labs on
the first day of classes.
TRAINING WORKSHOP FOR TEACHING ASSISTANTS
Prior to the start of the Reading Labs, seven teaching assistants participated in a 5-day
training workshop developed and implemented by Drs. Godbey and Henry. This training
workshop was scheduled during the month of August prior to the start of the Fall 2009 semester.
The main purpose of the workshop was to train teaching assistants as college reading strategists
in their content area (see MOA, 2009). To that end, the workshop provided an opportunity for
the teaching assistants to develop knowledge and skills in teaching content area reading and
study skill strategies for the college level while orienting them to the specific instructional
objectives for the Reading Labs as identified on the course syllabus (see Appendix A). The
workshop objectives included:
Day 1: To introduce instructors to their overall responsibilities to lab students and
CCLD, to share the strategic plan for the pilot study, and to provide an overview
of the Reading lab curriculum, 21st Century Skills, college reading research,
plagiarism, evaluation of web sites, student support services, PLATO software,
and the Ning web site (an online support network).
Day 2: To demonstrate a lesson on the importance of getting motivated to learn, to prioritize test-taking chapters and prepare to teach a mini lesson on Day Three, to review note-taking methods and practice taking notes while viewing a presentation, and to introduce college reading strategies.
Day 3: To introduce strategies for teaching vocabulary development, improving
concentration, and improving memory, to practice teaching a mini lesson on exam
preparation or taking exams, to practice making a graphic organizer as a
learning/memory enhancer, and to share experiences/methodology for time
management.
Day 4: To review methods for taking text notes and to practice taking notes using
one of the methods, to introduce methods for teaching college reading, to practice
teaching reading skills, and to familiarize the instructors with methods for
improving memory.
Day 5: To allow instructors to share their experiences in taking text notes, to
allow instructors to practice teaching college reading strategies, to familiarize the
instructors with the components of the reading dossier which are paired with the
students’ content course readings, to evaluate the workshop, and to set monthly
meetings.
Input and feedback from the teaching assistants were encouraged throughout the training to
increase their level of commitment to teaching in this program as they were given the
opportunity to take ownership and make decisions about how the curriculum would be facilitated
in the classroom. A summary of this feedback as well as evaluation forms were used to evaluate
the effectiveness of the workshop training.
In addition to the face-to-face training workshop, Dr. Henry set up an online portal
(http://ukreading.ning.com) to provide ongoing support to the teaching assistants throughout the
semester. This space was used to communicate monthly meetings to the teaching assistants.
Additionally, the teaching assistants used this space to share PowerPoint files and other resources
that were used during instruction.
OVERVIEW OF EVALUATION ACTIVITIES
The evaluator for this program met with key personnel from both the CCLD and Office
of Undergraduate Education (OUE) at the outset of this pilot program in order to develop the
evaluator’s scope of work. Beginning in June 2009, on-going formal and informal conversations
were held to plan the development, implementation, and evaluation of the Readiness in Reading
pilot program. The program evaluation consisted of these key activities:
• Identification of learner outcomes, sources of data, and development of instruments as appropriate (e.g., workshop evaluation, interview protocol)
• Data collection activities
• Data analysis and synthesis
• Preparation of midterm evaluation report
• Preparation of final evaluation report[2]
The program evaluator also provided on-going program support and monitoring. Monthly
meetings were conducted with the teaching assistants, which provided an opportunity for them to
bring any concerns or issues to the attention of Dr. Godbey and Dr. Henry. These meetings also
provided an opportunity to remind the teaching assistants of the Reading Lab objectives and
distribute additional information related to the Reading Lab requirements. In addition, Drs.
Godbey and Henry observed each teaching assistant during instruction in the Reading Lab on
two separate occasions over the course of the semester.
DATA COLLECTION, ANALYSES, FINDINGS, AND LIMITATIONS
Both quantitative and qualitative data were gathered from multiple sources. What follows
is a description of each data point collected as well as the source of that data for the purpose of evaluating both the training workshop for the teaching assistants and the facilitation of the Reading Labs during the Fall 2009 semester.

[2] The final evaluation report for the Academic Readiness in Reading pilot program implemented during the 2009-2010 academic year is to be submitted to the Executive Director of CCLD on or before June 30, 2010.
Training Workshop Evaluation
The seven teaching assistants who participated in the 5-day training workshop completed
an evaluation form on the last day of training. Dr. Ellen Godbey developed this informal
evaluation tool for the sole purpose of obtaining feedback from the teaching assistants regarding
the training. This tool consisted of ten items using a 4-point Likert scale for responses that
ranged from 1=Strongly Disagree to 4=Strongly Agree. The evaluation tool also included four
open-ended items: 1) The strongest features of this workshop were; 2) Things I think could be
improved; 3) Topics I would like to see addressed; and 4) Additional comments. The teaching
assistants completed the evaluation forms at the close of the final day of training. Responses
were provided anonymously. Table 2 below highlights the results of this evaluation with
frequency counts for how the Teaching Assistants rated each item.
Table 2.
Academic Reading Readiness Workshop Training Evaluations

                                                                     Strongly                       Strongly
Item                                                                 Disagree   Disagree   Agree    Agree
                                                                       (1)        (2)       (3)      (4)
1. The timing of the workshop worked well with my schedule             0          0         3        4
2. The objectives were stated clearly at the beginning of the
   workshop                                                            0          0         3        4
3. The information provided in the handouts was helpful                0          0         6        1
4. I learned valuable information in this workshop that I will
   be able to use in teaching the Reading Labs                         0          0         3        4
5. The instructors had good knowledge of the subject                   0          0         2        5
6. The instructors were prepared for the workshop                      0          0         2        5
7. The instructors were able to convey the topics clearly and
   provided relevant examples                                          0          0         3        4
8. I felt as if I could ask the instructors questions                  0          0         1        6
9. The pace of the workshop was just right                             0          2         4        1
10. The amount of material in the workshop was just the right
    amount of information                                              0          1         4        2
Overall, the teaching assistants rated the workshop favorably, with 100 percent of them rating the first eight items positively as either “agree” or “strongly agree.” The final two items were rated less favorably, with a total of three “disagree” responses.
None of the items were rated at the lowest end of the scale (i.e. “strongly disagree”). Although
space was not provided on this portion of the evaluation for comments, three of the teaching
assistants wrote comments in response to the last two items that may help explain the lower
ratings on items 9 and 10 (shown in the table above). In response to item number 9, written
comments indicated that the pace could have been faster (e.g. “a little slow possibly”, “could
have moved faster”, “could be faster/shorter”). Additionally, one of these three teaching
assistants also added the comment “less redundant” to item number 10 indicating that the amount
of material and information could have been condensed, which is related to the previously stated
issue regarding the pace of the workshop.
The teaching assistants’ responses to the four open-ended items were generally positive. Five of them felt that the available resources, including PowerPoint
slides coinciding with each chapter of the text, were the strongest features of the workshop.
There were three additional comments regarding the length of the workshop with suggestions on
how it could be condensed (e.g. “the workshop could be shortened, perhaps cut a day off or an
hour each day since the pace was a bit slow”). Other suggestions for improvement focused on the
desire for additional information regarding PLATO, COMPASS, and the logistics of how these
would fit into the Reading Labs. Comments related to topics that the teaching assistants would
have liked addressed included additional information on the integration of core course material
into the Reading Labs, use of media literacy, and the manner in which the Reading Labs
addressed the university’s objectives (no specific detail was provided regarding the reference to
the university’s objectives). Other comments included positive remarks regarding the enthusiasm
of the instructors, relaxed atmosphere of the training, ability to share ideas openly, and the food
that was provided.
Following the completion of the training workshop, Dr. Godbey documented the
strengths and weaknesses of the workshop training using reflective field notes (see Appendix B).
These notes included summaries of the strengths and weaknesses that were identified on the
evaluation forms completed by the teaching assistants as well as conversations with Dr. Henry
and Dr. Carol Eades at the completion of the workshop training. Dr. Godbey also documented
changes that were made to the training workshop and materials, which addressed questions,
concerns, and comments that were raised by the teaching assistants during the training.
It should be noted that one of the teaching assistants who attended the 5-day training
workshop did not have clarification regarding whether she would teach one or more sections of
the Reading Lab or one of the paired content courses. She attended all five days of the training
workshop as well as two follow-up monthly meetings until it was determined that her attendance
was no longer required as her teaching assignment was focused on teaching one of the paired
content courses. Thus, the data analyzed in this evaluation are from six teaching assistants who
taught sections of the Reading Labs.
Reading Lab Evaluation
The Reading Lab evaluation for the Fall 2009 semester consisted of multiple data points
and sources. The evaluator obtained data from a variety of sources as identified below. A
description of each data point is also provided. A combination of quantitative and qualitative
data analysis techniques were used, including analysis of variance (ANOVA), t-tests,
correlations, frequency counts, and constant comparative methods, in order to interpret the data
by seeking patterns and relationships across data points. In the paragraphs that follow, data
collection, analysis procedures and findings as well as limitations are provided for each of the
seven indicators that contributed to the overall evaluation of the Reading Labs.
Indicator 1. COMPASS Reading Diagnostics Test
The COMPASS Reading Diagnostics Test was used as a pre-/post-test assessment to
measure the efficacy of the Readiness in Reading Fall 2009 Reading Labs (A&S 100). This
assessment developed by ACT, Inc. (see http://www.act.org/compass) evaluates students’
specific skill sets in reading comprehension and vocabulary as well as identifying an individual’s
reader profile. For the purpose of this evaluation, the pre-/post-assessment scores for reading
comprehension and vocabulary as well as the composite reading score were used to conduct the
analyses. The Office of Undergraduate Education provided pre-test COMPASS data for students
enrolled in the Reading Labs that was collected prior to their enrollment in the Fall 2009
semester. The post-test COMPASS data was collected through the Office of Undergraduate
Education during the final exam period of the Fall 2009 semester. All COMPASS data was
provided to the evaluator in the form of Microsoft Excel spreadsheets via an email attachment.
Data Analysis and Findings
A one-way within-subjects repeated measures Analysis of Variance (ANOVA) was
conducted for the pre-/post-composite COMPASS scores to determine a main effect. The means
and standard deviations for the COMPASS scores are presented in Table 3. As can be seen from
this table, the mean score for the COMPASS post-test scores (70.66) is lower than the mean
score for the COMPASS pre-test scores (71.11).
Table 3.
Mean and Standard Deviations for COMPASS Scores

COMPASS     Mean    Standard Deviation
Pre-test    71.11   10.63
Post-test   70.66   15.19
The results for the ANOVA indicated a non-significant effect, F(1, 85) = 0.059, p = .808, multivariate η² = .001, between the pre-test and post-test scores on the COMPASS. Because there was no significant difference between the COMPASS pre- and post-test composite scores, no additional comparisons for the sub-scores (i.e., comprehension and vocabulary) were conducted.
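For readers who wish to replicate this style of analysis, the sketch below shows how a one-way within-subjects ANOVA on two time points could be run in Python with statsmodels; the file name and column labels (student_id, pre, post) are hypothetical, and with only two administrations the test is equivalent to a paired t-test (F = t²).

    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Hypothetical wide-format spreadsheet: one row per student with both scores
    wide = pd.read_excel("compass_scores.xlsx")   # columns: student_id, pre, post
    wide = wide.dropna(subset=["pre", "post"])    # AnovaRM requires complete pairs

    # Reshape to long format: one row per student per administration
    long = wide.melt(id_vars="student_id", value_vars=["pre", "post"],
                     var_name="time", value_name="score")

    # One-way within-subjects (repeated measures) ANOVA on the time factor
    result = AnovaRM(long, depvar="score", subject="student_id", within=["time"]).fit()
    print(result)  # reports F, degrees of freedom, and p for the pre/post effect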
A one-way analysis of covariance (ANCOVA) was conducted to determine whether there were significant differences in the COMPASS pre- and post-test assessment scores based on group placement (i.e., a student’s enrollment in one of the A&S 100 sections). A preliminary analysis evaluating the homogeneity-of-slopes assumption indicated that the relationship between the covariate (COMPASS pre-test) and the dependent variable (COMPASS post-test) did not differ significantly (at the .01 level) as a function of the independent variable (A&S 100 section), F(11, 61) = 2.18, p = .028, partial η² = .282. The ANCOVA was non-significant, F(11, 72) = 1.351, p = .056, indicating that there were no significant differences in pre- and post-test assessment scores based on placement in A&S 100 sections.
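A comparable ANCOVA, including the preliminary homogeneity-of-slopes check, could be sketched as follows using the statsmodels formula interface; again the file and column names (pre, post, section) are hypothetical.

    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_excel("compass_scores.xlsx")   # hypothetical columns: pre, post, section

    # Homogeneity-of-slopes check: test the covariate-by-group interaction
    slopes = smf.ols("post ~ pre * C(section)", data=df).fit()
    print(sm.stats.anova_lm(slopes, typ=2))     # inspect the pre:C(section) row

    # ANCOVA proper: effect of section on post-test scores, adjusting for pre-test
    ancova = smf.ols("post ~ pre + C(section)", data=df).fit()
    table = sm.stats.anova_lm(ancova, typ=2)
    print(table)

    # Partial eta squared for the section effect: SS_effect / (SS_effect + SS_residual)
    ss = table["sum_sq"]
    print(ss["C(section)"] / (ss["C(section)"] + ss["Residual"]))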
Limitations
The Office of Undergraduate Education provided pre- and post-test COMPASS data
across five separate spreadsheets. In reviewing these data, several inconsistencies and problems
were noted and included the following:
• Multiple students were recorded with the same student ID number
• Pre- and post-data for individual students were inconsistent across data files
• Some scores did not have student names or ID numbers associated with them
• Some students had partial data points or multiple grades documented for the same course
Through detailed email communications with Chela Kaplan in the Office of Undergraduate
Education, many of these inconsistencies and problems were clarified. However, because the
data used for these analyses were compiled from several different sources, the evaluator’s
confidence in the accuracy of the COMPASS scores is somewhat diminished.
Students taking the COMPASS post assessment may not have taken the test seriously.
Chela Kaplan from the Office of Undergraduate Education proctored the administration of this
assessment and indicated that many students had not slept the night before and appeared
lethargic during the test session. Additionally, the internal algorithms of the online interface
allow students to progress through the test without responding to every item. Items with no
response are scored as “incorrect responses” thus skewing the data. This may account for the
decrease in COMPASS scores on the post-test administration. Thus, the results of this
assessment should be interpreted with caution.
Indicator 2. PLATO FASTRACK Advantage Reading Assessment
The PLATO FASTRACK Advantage Reading Assessment developed by PLATO
Learning (see http://www.plato.com/Post-Secondary-Solutions.aspx) was used as a pre- and
post-assessment measure. The PLATO Advanced Reading Strategies is designed for high school
and adult learners to help them develop reading comprehension and critical-thinking skills that
are typically taught in grades 9-14. Students were directed to begin the assessment at Level I and
continue through Level J (see Appendix C). This assessment provided two sub scores, Reading
Skills and Reading Comprehension. Access to the PLATO online system was available on the
computers at The Hub located in the W. T. Young Library. Students were asked to complete the
pre-assessment during the first two weeks of classes and again during the final two weeks of
classes. Responsibility for scheduling these two assessments was placed on the individual
student as a course requirement. IT workers in The Hub were available as a resource to support
the students as they logged into their accounts on PLATO. A compilation of the student data was
provided to the evaluator in the format of a summary report generated by the PLATO system,
which was submitted by Signe Dunn, Project Manager for the Kentucky Virtual Campus for K-12 Students and PLATO Representative for the University of Kentucky, via email attachment.
Data Analysis and Findings
A one-way within-subjects repeated measures Analysis of Variance (ANOVA) was
conducted for the pre-/post- PLATO Reading Skills scores to determine a main effect. The
means and standard deviations for the PLATO Reading Skills scores are presented in Table 4. As
can be seen from this table, the mean score for the PLATO Reading Skills post-test scores (4.85)
is higher than the mean score for the PLATO Reading Skills pre-test scores (1.72).
Table 4.
Mean and Standard Deviations for PLATO Reading Skills Scores

PLATO Reading Skills   Mean   Standard Deviation
Pre-test               1.72   2.02
Post-test*             4.85   1.70
*Note that this analysis is based on the availability of n=23 post-test scores
The ANOVA for PLATO Reading Skills was significant, F(1, 22) = 58.14, p < .01; however, this result should be interpreted with caution due to the small number (n=23) of post-test scores currently available. A one-way within-subjects repeated measures Analysis of Variance (ANOVA) could not be conducted for the pre-/post- PLATO Reading Comprehension scores because there were no cases in which both the pre- and post-assessment scores were available for this sub-score.
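The sketch below illustrates the complete-case pairing such an analysis requires; the file and column names are hypothetical. With two time points, a paired t-test gives the same result as the repeated-measures ANOVA (F = t²).

    import pandas as pd
    from scipy import stats

    plato = pd.read_excel("plato_summary.xlsx")  # hypothetical: student_id, skills_pre, skills_post

    # Keep only students with both administrations recorded (here, n=23)
    pairs = plato.dropna(subset=["skills_pre", "skills_post"])

    # Paired t-test on the complete cases; F = t**2 for two time points
    t, p = stats.ttest_rel(pairs["skills_post"], pairs["skills_pre"])
    print(f"t = {t:.2f}, F = {t**2:.2f}, p = {p:.4f}, n = {len(pairs)}")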
Limitations
It should be noted that the administrations of the pre- and post-test assessments in the
PLATO system were problematic. In regard to the pre-assessment administration, many students
had difficulty accessing the PLATO system and could not complete the pre-assessment in a
timely manner. Several students still had not completed the pre-assessment by week 9 of the
semester, thus severely diminishing the time between the pre- and post-assessment. Students had
less difficulty accessing the post-test assessment, however there was an issue with the recording
of this data within the PLATO system. Communications between the evaluator and Signe Dunn
(PLATO Representative) regarding this issue resulted in the following determination:
Excerpt from email communication (February 11, 2010): Unfortunately, the news
is not so good. The only report available is the Fastrack Report I sent you back in
December. For future use, Plato informed me that the student needs 2 accounts--one for the initial assessment and another for the reassessment. We have never
done this before, and didn't know the process.
The data report that was provided included post-assessment data for only 23 out of 100 students
used in the preceding analyses as noted.
Indicator 3. Student Motivation Scales
The Student Motivation Scale (Christophel, 1990) was used as a pre-/post-measure of
students’ motivational attitudes toward reading and their paired content course (i.e.
anthropology, history, sociology or agriculture). This instrument consists of twelve bi-polar,
semantic differential adjectives (e.g. motivated/unmotivated, interested/uninterested, excited/not
excited) that are ranked on a scale of 1 to 7 in which students indicated their feelings toward the
specific classes in which they were enrolled (see Appendix D). The teaching assistants
administered the instrument on the first day of classes and on the final day of classes. Responses
were collected anonymously.
Data Analysis and Findings
The Student Motivation scale was administered to the students at two different time
periods, during the first day of classes (T1) and on the final day of classes (T2). Seven variables
were reverse coded because they were posed using an opposite response metric in relation to the
other nine variables on the scale. Coefficient alpha reliabilities were .82 at T1 and .91 at T2
indicating satisfactory reliability. The average for each item was computed for the two scales
administered during T1 and T2 (i.e. reading and content course) creating four separate average
scores: RDG-Pre, RDG-Post, CONT-Pre, CONT-Post. A one-sample t test was conducted for
each of the four averaged scores to determine if the mean was significantly different from 4. The
test value of 4 was used because it is the midpoint rating (between 1 and 7) on the scale. A value
less than 4 implies a negative view toward the subject (i.e. reading or the content course), a value
greater than 4 implies a positive view toward the subject. With alpha set at .05, the sample means
for all four scores were significantly different (p < .01) from 4. The means and standard
deviations are presented in Table 5.
Table 5.
Means and Standard Deviations for Student Motivation Scale

Averaged Score   Mean   Standard Deviation
RDG-Pre          3.54   0.76
RDG-Post         3.57   0.91
CONT-Pre         3.79   0.44
CONT-Post        3.58   1.00
These results indicate that, overall, students enrolled in the Reading Labs had a negative orientation toward both reading and their paired content course, as shown by the mean ratings below the midpoint of 4 on the rating scale. The mean rating on the reading course pre-administration (M=3.54) was slightly lower than on the reading post-administration (M=3.57), indicating that students’ orientations toward reading remained about the same. Conversely, mean ratings on the content course pre-administration (M=3.79) were higher than on the content course post-administration (M=3.58), indicating that students’ orientations toward their paired content course became more negative.
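As an illustration of the scoring steps described above (reverse coding, scale reliability, and the one-sample test against the scale midpoint), a minimal Python sketch follows; the file name, the item column labels, and which seven items are reverse-coded are hypothetical.

    import pandas as pd
    from scipy import stats

    # Hypothetical file: one row per student, twelve item columns rated 1-7
    items = pd.read_csv("motivation_rdg_pre.csv")

    # Reverse-code the seven items posed with the opposite polarity (hypothetical item numbers)
    for col in ["item2", "item4", "item6", "item8", "item9", "item11", "item12"]:
        items[col] = 8 - items[col]          # on a 1-7 scale, 8 - x maps 1<->7, 2<->6, ...

    # Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    alpha = (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

    # One-sample t-test of each student's item average against the scale midpoint of 4
    means = items.mean(axis=1)
    t, p = stats.ttest_1samp(means, popmean=4)
    print(f"alpha = {alpha:.2f}, M = {means.mean():.2f}, t = {t:.2f}, p = {p:.4f}")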
Limitations
Students responded to the Student Motivation Scale anonymously; therefore, it was not possible to conduct student-level comparisons to determine whether individual students had a more positive
or negative orientation toward reading or the paired content course. When using self-report data,
social desirability effects can present threats to validity because self-report responses, by
definition, are subjective and systematically biased. Respondents may tailor their responses
(either consciously or unconsciously) to portray a more favorable or unfavorable perception
based upon the context and what might be considered socially acceptable (Crockett,
Schulenberg, & Petersen, 1987). “The crucial problem with self-report, if it is to be interpreted
as a picture of typical behavior, is honesty…Even when [the respondent] tries to be truthful we
cannot expect him to be detached and impartial. His report about himself is certain to be
distorted to some degree” (Cronbach, 1970, p. 40). An additional limitation is related to the input
of the data from hard copies of the instrument administered by the teaching assistants. When
inputting data, there is always an element of human error that needs to be considered when
interpreting the results.
Indicator 4. Course Grades
The Office of Undergraduate Education provided mid-semester and final grades for the
students enrolled in the Reading Labs. Final grades in the paired content courses were also
provided. All data related to grades was submitted to the evaluator in the form of Microsoft
Excel spreadsheets via an email attachment. Each student enrolled in the Reading Labs
completed a dossier as a requirement for the course, which accounted for 50 percent of his or her
final grade. Since the dossier grades are highly correlated with the final grades, a separate
analysis of these grades was not completed. Instead, a description of how the dossier assignment
sought to measure the instructional objectives for the Reading Lab is included.
Data Analysis and Findings
Basic frequency counts were computed for midterm and final grades in the Reading Lab
course (i.e. A&S 100). Chart 1 provides a data display of the midterm grades for the Reading
Lab broken out by the percent of students earning each grade denomination.
Chart 1.
Midterm grades recorded for Reading Labs during Fall 2009 semester
[Pie chart of midterm grade distribution: A = 53%, B = 18%, C = 16%; the remaining students were split among D, E, I, and no recorded grade (5%, 4%, 2%, and 2%).]
*Note: 2 percent of the student population did not have a midterm grade recorded as identified by “none” in the chart above
At mid-semester, more than half of the students (53%) enrolled in the Reading Labs had earned
an A in the course. The majority of students (89%) earned a midterm grade of C or better. Chart
2 provides a data display of the final grades for the Reading Lab broken out by the percent of
students earning each grade denomination.
Chart 2.
Final grades recorded for Reading Labs during Fall 2009 semester
[Pie chart of final grade distribution: A = 58%, B = 22%, C = 11%, D = 5%, E = 4%.]
A total of 80 percent of the students earned either an A or B as their final grade in the Reading
Lab. The percent of students earning either an A or B for the final grade in the Reading Lab
increased from the recorded midterm grades by 5 percent and 4 percent respectively. The
majority of students (90 percent) enrolled in the Reading Labs earned a final grade of C or better.
Four percent of the students received a failing grade (E).
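The frequency counts reported here amount to simple proportions over the grade column; a minimal sketch, with a hypothetical file and column name, follows.

    import pandas as pd

    grades = pd.read_excel("final_grades.xlsx")   # hypothetical: student_id, lab_grade

    # Percent of students earning each grade, and the share earning C or better
    pct = grades["lab_grade"].value_counts(normalize=True).mul(100).round(1)
    print(pct)
    print("C or better:", pct.reindex(["A", "B", "C"]).sum(), "percent")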
Basic frequency counts were computed for final grades in the content area courses linked
to the Reading Labs (i.e. ANT160, HIS108, SOC101, or GEN 109). Chart 3 provides a data
display of the final grades for the paired content courses broken out by the percent of students
earning each grade denomination. The chart also includes the percent of students who officially
withdrew from the course (denoted with a “W”).
Chart 3.
Final grades recorded for paired content courses during Fall 2009 semester
[Pie chart of final grade distribution: A = 24%, B = 43%, C = 12%, D = 13%, E = 4%, W = 2%, no recorded grade = 2%.]
*Note: 2 percent of the student population did not have a final grade recorded as identified by “none” in the chart above
The majority of students (79%) earned a final grade of C or better in the linked content courses,
with a large portion (43%) earning a B and about a quarter of the students (24%) earning an A.
Four percent of the students failed (grade = E), two percent “Withdrew” after midterm grades
were posted, and no students were given an “Incomplete” for the course.
Limitations
Course grades are subjective in nature and can differ from instructor to instructor. That is,
completion of a course with a grade of “A” may have different requirements or standards based
on the content area or course instructor. Although the Reading Lab grades were based on the
same criteria (class work/participation = 10%, quizzes = 40%, reading dossier = 50%, see
Appendix A), there was no common scoring metric or rubric for these three categories that
contributed to the overall grade for the course.
Dossier Assignment
Students enrolled in the Reading Labs were required to complete a dossier that consisted
of weekly assignments that provided opportunities for them to apply the strategies that were
learned in the class sessions. Dossier assignments provided to the teaching assistants during the
summer training workshop corresponded with the 16 chapters of the textbook, however the
teaching assistants were encouraged to use their own discretion regarding which applied
strategies might be the most beneficial to their students. Approximately six weeks into the
semester, Dr. Godbey became concerned after reviewing the teaching assistants’ weekly
reporting forms that the instructional objectives of the course were not being measured. Dr. Godbey dedicated a large portion of the October 12, 2009 monthly meeting with the teaching assistants to these concerns and to the importance of ensuring that the instructional objectives were being met. Following this meeting, Dr. Godbey distributed an additional list of suggested
dossier assignments that corresponded with the core text and would specifically address five of
the instructional objectives that appeared to be overlooked up to that point. She also requested
that the teaching assistants include a list of the instructional objectives that were addressed
during each class session in their weekly reports. Additionally, the teaching assistants were asked
to report the specific instructional objectives being measured through the administration of
quizzes. Table 6 below identifies which objectives were measured by either dossier assignments
or quizzes by the six teaching assistants.
Table 6.
Measurement of instructional objectives for Reading Labs

Instructional Objective                                        Dossier   Quizzes
1. Identify main and supporting details                           6         6
2. Identify common writing patterns in reading*                   4         5
3. Differentiate fact and opinion*                                4         4
4. Develop reading study strategies                               6         3
5. Formulate definitions using context clues*                     2         2
6. Build vocabulary                                               6         5
7. Develop techniques for summarizing/paraphrasing*               5         4
8. Apply critical reading skills*                                 5         5
9. Use technology to enhance reading and research skills          4         6
10. Create a reading dossier                                      6         6
11. Synthesize information from different sources*                2         2
12. Recognize alternate viewpoints in texts*                      4         4
13. Use research strategies and information technology            4         6
*Denotes instructional objective that was not directly measured by all six teaching assistants
Although only 6 of the 13 instructional objectives were measured through dossier assignments or quizzes by all six teaching assistants, a review of the topics addressed in the weekly reports submitted by the teaching assistants indicates that all of the instructional objectives were addressed during class sessions.
Limitations
One major drawback of the dossier assignment was that minimal information was
communicated to the students enrolled in the Reading Lab regarding the expectations of this
assignment, thus the assignment was left open to interpretation by both the teaching assistants
and the students. The only details provided on the course syllabus (see Appendix A) included the
percentage (50%) of the final grade that the assignment was worth and a notation that “dossier
assignments are due weekly and are worth 10 points each” followed by a restatement of the
grading percentage. Although the teaching assistants were provided with additional details
regarding the dossier assignments during the summer training workshop and in their instructional
materials, it is not clear how much detail was passed on to the students. Additionally, there was
not enough emphasis placed on how each dossier assignment addressed the instructional
objectives of the course during the training workshop.
Indicator 5. Fall Semester Grade Point Average (GPA)
The Office of Undergraduate Education provided each student’s Fall 2009 semester
Grade Point Average (GPA). This data was submitted to the evaluator at the same time that final
grades were supplied in the form of Microsoft Excel spreadsheets via an email attachment.
Data Analysis and Findings
Basic frequency counts were computed for each student’s Fall 2009 semester Grade Point
Average (GPA). Nearly half of the students (47.8 percent) had a semester GPA of 2.5 or above
for the Fall 2009 semester. Chart 5 provides a data display of the Fall 2009 semester GPA ranges
for the students enrolled in the Reading Labs broken out by the percent of students earning each
GPA range (i.e. 1.0 = GPA range from 1.0-1.9, 2.0 = GPA range from 2.0-2.9, etc.).
Chart 5.
Fall 2009 semester GPA for Reading Lab students
[Pie chart of semester GPA ranges: 4.0 = 2%, 3.0-3.9 = 16%, 2.0-2.9 = 53%, 1.0-1.9 = 19%, <1.0 = 2%, no GPA available = 8%.]
*Note: 8 percent of the student population did not have a GPA available in the data set as identified by “none” in the chart above
The majority of students (53%) obtained a GPA for the Fall 2009 semester in the range of 2.0-2.9. Nearly three-fourths (71%) of the students earned a GPA of 2.0 or higher, with 16 percent in the 3.0-3.9 range and 2 percent earning a 4.0 GPA.
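The GPA ranges reported above can be reproduced by binning each student’s semester GPA; a sketch with hypothetical file and column names follows.

    import pandas as pd

    gpa = pd.read_excel("fall_gpa.xlsx")          # hypothetical: student_id, gpa

    # Bin semester GPA into the ranges used in Chart 5 (a 4.0 falls in its own bin)
    bins = [0.0, 1.0, 2.0, 3.0, 4.0, 4.001]
    labels = ["<1.0", "1.0-1.9", "2.0-2.9", "3.0-3.9", "4.0"]
    gpa["range"] = pd.cut(gpa["gpa"], bins=bins, labels=labels, right=False)
    print(gpa["range"].value_counts(normalize=True).mul(100).round(1))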
Limitations
Although the University of Kentucky utilizes a standard, traditional grading scale for
undergraduate students (i.e. A=90-100, B=80-89, etc.), assigning grades is a subjective act.
Different professors or instructors weight course requirements differently, thus a grade of “A” in
one course or course section may differ in content from another. For example, student
participation is often allocated a percentage of the final grade in a course. This percentage may
range from 5 percent to 25 percent of a student’s final grade based on the structure of the course.
One professor might base this on mere attendance while another may evaluate actual
participation and the quality of that participation. Similarly, different course requirements and
assignments may be weighted differently or have a more or less subjective scoring method than
others.
Indicator 6. Weekly Reports from Teaching Assistants
Teaching Assistants submitted weekly reports that included student enrollment
information (e.g. absenteeism, whether students had dropped course, any students reported on
early alert), instructional information (e.g. topics addressed, instructional materials and methods
used), as well as reflective feedback about the class session. Basic descriptions from the data are
provided for each element of the weekly reports with the exception of the reflective feedback.
Reflective feedback was analyzed using a basic constant comparative analysis (Glaser, 1965) to
identify themes that emerged across the data points from the 104 reports submitted by the six
teaching assistants.
Student Attendance and Early Alert Reporting
Absenteeism records submitted as part of the teaching assistants’ weekly reports indicate
that a total of 136 absences were recorded during the Fall 2009 meetings of the Reading Labs. At
most 18.3 percent of the students enrolled in the Reading Labs were absent during a given week
with a total of 17 students recorded as absent across the 12 sections. It should be noted that this
was the week prior to the Thanksgiving break. There were fewer absences during the first and
final few weeks of the semester with the most absences reported during the middle of the
semester (weeks 5 through 13). On average, 9.1 percent of the student population was absent in a given week, with no absences reported during the final week of classes. Students who provided the teaching
assistants with a medical note, prior notification of their absence, or other documentation were
not reported in the early alert system.
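The weekly absence rates reported above are straightforward to compute from the weekly report data; the sketch below assumes a hypothetical tabulation with one row per section per week.

    import pandas as pd

    reports = pd.read_csv("weekly_reports.csv")   # hypothetical: week, section, enrolled, absent

    # Absence rate per week, pooled across all sections
    weekly = reports.groupby("week")[["absent", "enrolled"]].sum()
    weekly["rate"] = weekly["absent"] / weekly["enrolled"] * 100
    print(weekly["rate"].round(1))                # peak week vs. other weeks
    print("average weekly rate:", round(weekly["rate"].mean(), 1), "percent")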
The teaching assistants submitted a total of 52 early alert reports for 22 students. These
included 15 reports for 3 students who either dropped or withdrew from the Reading Lab course
before the end of the semester. After removing these three students from the early alert data,
there were 37 early alert reports made for the remaining 19 students. One student was reported
on 9 separate occasions, 3 students were reported on 3 separate occasions, 4 students were
reported on 2 separate occasions, and 11 students were reported only once.
Instructional Materials, Style and Topics Addressed
The teaching assistants reported using a wide range of instructional materials and
resources in the Reading Labs. These materials included the core textbook, handouts (e.g.
syllabus, worksheets, articles, poems, assignment sheets, graphic organizers, and quizzes), core
content texts, popular media publications (e.g. Time magazine and Sports Illustrated),
PowerPoint, videos, primary source texts, and the library website. They utilized several different
instructional styles and methods including lecture, small group work, discussion, guided practice,
reading passages aloud, question and answer sessions, and demonstrations. Weekly topics
corresponded with the 16 chapters in the core textbook for the Reading Labs as outlined in the
course syllabus (see Appendix A). Most of the teaching assistants made modifications to the
outlined schedule of topics to address the needs of the students, which included additional
emphasis on how to take notes from their text books, how to pull main ideas out of class lectures,
and different study strategies for upcoming tests in their linked content course.
Reflective Feedback from Teaching Assistants
The reflective feedback provided by the teaching assistants through the weekly reports was analyzed using a basic constant comparative method (Glaser, 1965) to document themes
across these qualitative data points. Nine main themes emerged from the data, including: 1)
instructional time, 2) PLATO references, 3) Reading Lab meeting locations, 4) AV and/or
technology related issues, 5) student engagement, 6) student commitment to success, 7) concepts
learned, 8) application of concepts, and 9) reading texts.
Instructional time. In relation to instructional time, the teaching assistants made several
comments regarding the difficulty of fitting the required content into a 50-minute class period.
The following excerpts from the teaching assistants’ feedback highlight these concerns:
Example 1: It is becoming increasingly apparent that an hour a week is not
sufficient time to effectively teach the material. The simple fact is that by the time
I have given a quiz, gone over that quiz, collected dossier assignments, discussed
dossier assignments and address any other questions students have about their
101 class or other things, time to cover new material in ways OTHER than lecture
simply isn’t there. I am going to attempt next week to integrate ‘activities’ in our
work to see if it is plausible, but as of right now it simply seems we have a
problem of time.
Example 2: Only being three students, I think group activities should be quite
easy in this section, but fitting them in when I have only 50 minutes a week is still
a challenge for me.
Example 3: Because there are some more outgoing students in this section who
like to talk a lot and ask a lot of questions, I sometimes feel as if I’m rushing
through some of the material in order to get it all in time.
As can be seen from these data, there is concern about fitting everything in. There were a total of
six comments made in relation to this concern.
PLATO references. There was widespread dissatisfaction regarding the use of PLATO
and difficulties that students were experiencing trying to log onto this online system throughout
the semester. Some of these frustrations are apparent in the following excerpts:
Example 1: They have, however, been frustrated by Plato issues and somewhat
frustrated about setting aside the time to go to library and do Plato only to not be
able to do it when they get there. The frustration was mild however, and they
were generally cooperative about keeping at it until it gets done.
Example 2: We talked about why students had still not finished PLATO training
and they said that they found it frustrating and “useless”. Many said if they were
expected to complete the system they would simply guess at the answers and
others admitted they had guessed when they logged on the first time. They felt
that the system was tedious and made them feel “stupid” since they felt like they
had to memorize a lot of information that was not helpful. Although this is
probably not the response anyone wants. I do think students’ refusal to
participate in the system is also a useful form of data.
Example 3: There was widespread and strong negativity expressed to the idea of
Plato post-test. They seemed quite unhappy about it and were actively unwilling
to try to understand the importance or point behind having to go through that
again.
Although Dr. Godbey, Signe Dunn (PLATO representative), and Chela Kaplan (Office of
Undergraduate Education) went to great lengths to provide login information, detailed
instructions for the students, and IT personnel to assist in the Hub at the W. T. Young Library,
students continually experienced difficulties and frustration with accessing the PLATO system.
There were 16 separate comments made by the teaching assistants that documented difficulties
with the PLATO online system.
Reading Lab meeting locations. Several of the teaching assistants had problems with the
meeting space that was originally reserved for the facilitation of the Reading Labs that resulted
in changes to the meeting location. One teaching assistant who had a large class (17 students)
complained of the smallness of their meeting space. The following excerpts address these issues:
Example 1: It went well considering we had to move locations
Example 2: Our normal meeting space is becoming too busy for group discussion
(is distracting) will be moving location in following weeks.
Example 3: I have been having a couple of issues with this class. Because of the
size (17 students), the smallness of the room, and personalities of the students,
there are sometimes distracting individuals in this class.
When the meeting location needed to be changed, the teaching assistants were proactive in
finding a workable alternative so they could successfully conduct their sections of the Reading
Labs. Six separate comments from the teaching assistants identified problems with the meeting
location.
AV and/or technology related issues. This theme was related to the previous theme
regarding the meeting location but was specific to AV or technology problems that were
encountered. The following excerpts provide an illustration of the different problems that the
teaching assistants faced:
Example 1: The classroom in which it is held is going to present a problem. The
room lacks ability to have PowerPoint equipment. There is no screen, no
projector, and TASC is unable to bring a smart cart into the room.
Example 2: The video cart I had reserved was not in the room, thus making it
impossible to use the PowerPoint and online items that I had planned.
Example 3: There are still ongoing a/v frustrations though, making me feel bad
for them that I sometimes end up having to do more lecturing and talking than I
plan on, but they always seem quite willing to go with the flow.
Because PowerPoint presentations had been prepared to coincide with each chapter of the core
text, the availability of a computer and projector to facilitate instruction was essential to the
implementation of the Reading Labs. Unfortunately, the teaching assistants encountered many
problems with the availability of the necessary equipment. A total of 10 comments from the
teaching assistants documented difficulties with AV equipment (or the lack of it) that impeded
instruction.
Student engagement. The teaching assistants regularly documented comments related
to student engagement and participation in the Reading Labs. These comments included
concerns about engagement and participation, positive remarks when students were eagerly
involved in class, and observed changes in engagement as the semester progressed. The
following excerpts provide examples of each of these elements:
Example 1. This class lacked enthusiasm; only one of the 5 students seemed really
engaged with discussion and seemed interested in the course itself.
Example 2. The students were very active and interested in the material.
Example 3. They’ve become much more outgoing and talkative over the course of
the semester, making the class/group activities a lot more productive and
instructional.
The teaching assistants also documented instances of end-of-semester fatigue among their
students, as highlighted in the following excerpt:
Example 4. Students are starting to get tired. During a conversation before class
they said they were feeling overwhelmed by the end of the semester and how much
work they had to do. Many were ready for break since they had not seen their
families since August.
There were a total of 34 comments related to this theme, indicating the teaching assistants'
overall concern with the engagement of their students.
Student commitment to success. This theme was distinct from the previous theme on
student engagement in that it focused on the students' level of commitment to fulfilling the
requirements of the course. The theme presented a dichotomy between a lack of
commitment by some students and positive efforts by others to fulfill the course
requirements and make up material missed due to absences. The following excerpts provide
evidence of these two opposing perspectives on student commitment:
Example 1. I am a little concerned that they aren’t taking homework seriously
enough, and this isn’t extensive homework if they let it slide what are they doing
in their other classes?
Example 2. This class is worrying me a bit. Today there were four absences and
only five students turned in their dossier assignment. I am not satisfied with these
numbers and will have to remind them the consequences of such behavior in the
next class.
Example 3. The two absent students both contacted me fairly quickly via email
about being ill and requesting the assignment for the week.
Example 4. Those who have missed any classes continue to be very proactive in
getting the assignments and make up work done.
The teaching assistants made a total of 22 comments related to student commitment in which
they noted concerns about their classes as a whole as well as individual student success.
Concepts learned. The teaching assistants made comments related to concepts in both the
Reading Labs and the paired content courses. The following excerpts highlight student
learning related to various concepts introduced in their courses:
Example 1. It seems that they are struggling with the different concepts they are
learning in sociology, and I think it was good for them to talk about how to read
the texts for their class.
Example 2. We had really good discussion about author’s perspectives and how
our own social perceptions shape interpretation, especially in social science texts.
Overall, I was especially impressed with their understanding of the ideas
presented this week.
Example 3. One student seemed to catch onto everything right away, the other
two struggled at first but seemed to understand much better by the end of the class
activity.
A total of 12 comments related to students' learning of concepts in both the Reading Labs
and the paired content courses. As the examples show, the teaching assistants made direct
connections between the content of the Reading Labs and the concepts in the paired content
courses.
Application of concepts. The teaching assistants made many comments related to the
application of concepts learned in the Reading Lab to the students’ paired content courses. Both
positive and negative responses by the students were documented as shown in the following
excerpts:
Example 1. The feedback I got from the class about the lecture note chapter was
that it was basically useless to them. They said they’d already heard most of it
and didn’t get a lot out of it.
Example 2. The majority of the students were having trouble keeping up with the
reading in their content class. We spent today reading their content textbook and
comprehending each passage. This is challenging because some students are in
different sections of the content course, therefore their reading assignments vary.
Their dossier assignment asked them to summarize a few main ideas of the
chapter.
Example 3. This week we thought through how to use texts as a guide to write a
paper. Students were asked to outline and/or come up with a graphic organizer
that would help them for a final paper that they would have to write for one of
their classes.
A total of 13 comments by the teaching assistants related to the application of Reading Lab
concepts to the paired content courses. For the most part, the application activities that the
teaching assistants had their students complete were described as positive experiences.
Reading texts. The final theme derived from the teaching assistants' feedback
specifically addressed reading activities. Several of these comments related to students'
difficulties with reading their texts and with the reading load, as illustrated in the following
excerpts:
Example 1. Their concerns were how to take notes from their textbooks and how
to pull the main ideas out of class lecture.
Example 2. One area of frustration they all shared in reading their textbooks was
identifying key points in a book that had been heavily highlighted.
Example 3. The majority of the students were having trouble keeping up with the
reading in their content class. We spent today reading their content textbook and
comprehending each passage.
As the above statements show, linking a content course with the Reading Labs provided a
platform for addressing specific student concerns. There were a total of ten comments
specific to reading texts. Other comments addressed elements of teaching reading skills,
including vocabulary development, learning word roots, comprehension through the
summarization of key points, and how social perceptions shape interpretations of texts.
Limitations
The information documented through the weekly reports prepared by the teaching
assistants is, for the most part, observational data. One major limitation of these data is that the
teaching assistants were not trained to document events accurately and completely, and their
knowledge and skill in research methods are unknown. The data are limited to descriptions of
what happened during each section of the Reading Labs. Because the teaching assistants were
responsible for the instruction, conversations and discussions with the students were documented
after they occurred and may not have been recorded completely or accurately. An additional
limitation is potential bias: the teaching assistants felt partially responsible for the students'
success in the Reading Labs, and the evaluator held the dual role of supporting and monitoring
the program while also evaluating it. These issues limit the reliability of the data and the ability
to generalize the results.
Indicator 7. Focus Group Interviews
Dr. Godbey conducted focus group interviews with approximately 30 percent of the
students enrolled in the Reading Labs (n=28) during the final three weeks of the Fall 2009
semester. Students were selected through stratified randomization to ensure that each section was
represented. The focus groups ranged in size from four to six students. A five-question
interview protocol was followed (see Appendix E). The focus group interviews lasted
approximately 20 minutes, were conducted anonymously, and were audio recorded to ensure the
student responses were documented accurately for analysis. Dr. Godbey provided a summary
table of the student responses from the focus group transcripts in the form of a Microsoft Excel
spreadsheet sent as an email attachment.
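For illustration only, the sketch below (in Python, with an entirely hypothetical roster; the report
does not describe the actual selection mechanics beyond stratification by section) shows one way
such a stratified random selection could be carried out:

import math
import random

# Hypothetical roster: section labels and student identifiers are invented
# for illustration; the actual Reading Lab rosters are not reproduced here.
roster = {
    "Section 001": ["s001_%02d" % i for i in range(17)],
    "Section 002": ["s002_%02d" % i for i in range(9)],
    "Section 003": ["s003_%02d" % i for i in range(12)],
}

def stratified_sample(sections, fraction=0.30):
    """Randomly draw roughly `fraction` of the students from each section
    (stratum) so that every section is represented in the sample."""
    selected = []
    for section, students in sections.items():
        k = max(1, math.ceil(fraction * len(students)))
        selected.extend(random.sample(students, k))
    return selected

participants = stratified_sample(roster)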
Data Analysis and Findings
Students responded to five questions during focus group interviews facilitated by
Dr. Godbey. The transcripts from these interviews were analyzed using a basic constant
comparative method (Glaser, 1965). Each question was analyzed separately to determine main
themes.
Q1. How do you feel the reading strategies you learned in A&S 100 affected your
learning in your content course? Three main themes were identified from the student responses
to this question: 1) note taking strategies, 2) reading strategies, and 3) study strategies. The
excerpts below provide an example of each theme:
Note Taking Strategies
[Response 11] It made me be aware of what I am reading and to take notes on
what I'm reading.
[Response 13] She gave us hints about what to write down when taking notes.
Reading Strategies
[Response 15] It just helped me in general because I don't like to read, but I
learned to pick out the main ideas and important details and understand it.
[Response 21] It helped prepare me to be able to read better as far as interpreting
what the readings meant in sociology and being able to answer questions I had
for myself.
Study Strategies
[Response 14] She told us about different kinds of exams and how to better
prepare for those.
[Response 19] I feel it helped me study and get ready for exams. What we learned
in here helped me study sociology. If I had questions about sociology, he would
help clear it.
There were a total of 21 documented responses to this question. Of those responses, only three
had negative connotations; these indicated that the students felt the class was "useless" because
many of the strategies had been learned previously. It should be noted that these three students
were in the same focus group interview.
Q2. Which specific strategies do you think were the most helpful to you? There were four
main themes identified from student responses to this question: 1) reading strategies, 2) note
taking, 3) organizing information, and 4) time management. The excerpts below provide an
example of each theme:
Reading Strategies
[Response 4] The comprehension strategies because before I had a hard time
comprehending and he taught us to underline stuff and take notes in the margin.
[Response 8] Trying to pick apart hard readings with different methods.
Note Taking
[Response 16] The Cornell Notes helped a lot. Reading and highlighting and
taking notes.
[Response 27] Mine was the note taking skills cause…I was writing down
everything word for word… but now I only write down the important facts.
Organizing Information
[Response 17] When you have a project, to outline first.
[Response 21] Graphic organizers. And learning our learning styles.
Time Management
[Response 5] Time management and memory tricks. He taught us a lot of memory
tricks and how to manage our time.
[Response 7] Spreading my time out as far as studying goes.
A total of 28 responses were documented for this question, only one of which was a negative
comment: [Response 19] It was stuff I already knew before, so nothing really never changed.
Q3. Do you think you would have been as successful in your content course without
attending A&S 100? The responses to this question documented two main themes: 1) successful
without A&S course and 2) not as successful. The excerpts below provide an example of each
theme:
Successful without A&S Course
[Response 3] I think I would have done pretty good if I hadn't taken this course. I
eventually would have gotten it and would have done pretty good in the class.
[Response 7] I think I would have been more successful without the class because
it would have been less homework to worry about. One less reading and one less
class to worry about and more time with my history class.
[Response 12] I think I still would be successful. This class was just a reminder
and review kind of thing.
Not as Successful
[Response 10] I don't think I would have done as well.
[Response 20] I think A & S helped a lot, but it helped more in a reading aspect - my study habits haven't changed a lot, but as far as reading, it helped a lot.
[Response 21] No. I don't think I would have been as successful. Because it
helped me analyze the reading and it helped me in other classes, not just the
reading.
There were 23 responses to this question, 11 of which indicated the students felt they would have
been successful in their content course without the A&S 100 Reading Lab course. Two responses
showed uncertainty by the students regarding their success without the Reading Lab, and the
remaining 10 responses indicated that the Reading Lab helped them to be more successful in
their other classes.
Q4. What were the main benefits of the A&S course? What were the weaknesses? The
analysis of this two-part question was broken out to identify themes related to benefits of the
course and those related to weaknesses of the course. Three themes emerged related to course
benefits: 1) course instructor, 2) learning various strategies, and 3) college preparedness. Two
additional themes emerged related to course weaknesses: 1) textbook, and 2) redundancy of
content. Examples of each of these themes are provided in the excerpts below:
Course Instructor (benefit)
[Response 4] I like that [TA’s name removed] is a current student and knows a lot
and is able to help us. He is awesome! He is helpful. It is good to know that if I
have any questions he will help me. He is so knowledgeable.
[Response 10] Our instructor helped us understand if we had a question about
our Anthropology class. She would help us with any subject if she could but
especially Anthropology. She knew all the instructors that teach Anthropology, so
she could help us in any shape or form.
[Response 18] I learned a lot just because of my instructor being a college.
Learning Various Strategies (benefit)
[Response 6] It helps just in general with studying and understanding readings.
[Response 8] I have more strategies to look at. I think I'm a better note taker.
[Response 17] Learning how to analyze the readings and the book itself.
College Preparedness (benefit)
[Response 18] just basically getting a freshman ready for college [in a] Writing
aspect and reading aspect.
[Response 20] I don't think there is any weaknesses. But there was a lot of good
things about it that helped prepare me for college.
Textbook (weakness)
[Response 3] The readings. I didn't think they were hard enough for us. I feel that
if we could have gotten some harder articles we would have done a lot better.
[Response 10] The book. Almost all of the stuff could have been accessed on the
Internet. It was pricey.
[Response 13] Weakness: the book. This book is what we used in high school.
Redundancy of Content (weakness)
[Response 2] The weaknesses were that we stayed on them too much, the
comprehension strategies.
[Response 14] Repetition. We learned everything over and over.
[Response 16] Weakness? The repetitiveness of doing it every week.
Overall, 20 responses to this question were documented. Within those responses, the students
noted 15 benefits and 12 weaknesses. The students seemed to view the teaching
assistants as role models and connected with them based on their status as college students. The
strategies that were identified led to comments regarding college preparedness. The weaknesses
regarding the textbook and redundancy of content should be noted for future planning.
Q5. What did you think about using PLATO online courses? About how much time did
you spend on PLATO each week? What were the main benefits? Weaknesses? Due to the
difficulties with PLATO documented early in the semester, this question was added to the
focus group interview protocol in an attempt to uncover the specific issues that students
experienced. The majority of students made negative comments about PLATO, labeling it
pointless and time consuming and identifying it as a source of stress. This main theme was
threaded through all but two responses. Two additional themes emerged: the need for an
instructor to assist with PLATO and the recognition that students applied strategies
learned in the Reading Lab. The following excerpts illustrate these three themes:
Pointless and Time Consuming
[Response 6] PLATO was a killer. It took me 3 hours in one day to get through
one part. Busy work, busy work, busy work equals PLATO.
[Response 9] It was a huge stress. I felt like we could have done something in a
more useful way to not get so overwhelmed, I guess.
[Response 15] It's just time consuming and pointless. It had nothing to do with
school. I would go through everything and at the end it would never give me a
score.
Need for Instructor
[Response 25] Me personally, I didn't really like the PLATO. It's a system like
where it needs to have an instructor…It wasn't for a student to just go do it
themselves…there was a lot of steps…there was a lot of errors with the program.
And it took a while…My recommendation for next year is for an instructor or
someone who has knowledge about the test to give it.
[Response 27] First of all, I think we should all go as a class and take it. Our
instructor could be there and a helper, whatever…The questions frustrate you
because they are questions you would ask a four year old. I got to the point where
I was just marking anything.
Application of Strategies
[Response 17] What we learned in class like picking out the main idea - I put into
play.
[Response 24] I noticed a big difference between the first time I took it and the
second time. I could pick out the main ideas.
There were 28 comments related to the use of PLATO, only two of which lacked negative
connotations. Three students specifically indicated that their stress levels had increased because
of the difficulties they had with PLATO and the time they needed to commit to completing the
assessments.
Limitations
Although focus group interviews provide an opportunity to obtain a rich body of
information, there are also limitations to this data collection technique. The responses of each
participant are not independent of the other responses, and a few dominant focus group
participants can skew the session in either a positive or negative direction. Focus groups should be
conducted by a skilled and experienced moderator to probe for additional information and should
last for 40-60 minutes (Rossman & Rallis, 2003). The focus groups for this evaluation were
conducted by an untrained moderator (although an interview protocol was followed) and were
limited to 20 minutes. Finally, the results from the focus group interviews are limited to the
expressed reactions to the questions by a small group of people, thus the findings cannot be
generalized to the entire population of students enrolled in the Reading Labs.
Additional Analyses
Correlation coefficients were computed among the pre- and post-COMPASS
assessments, pre- and post-PLATO assessments, two midterm grades, two final grades, and
overall GPA for the Fall 2009 semester. Using the Bonferroni approach to control for Type I
error across the nine correlations, a p value of less than .006 (.05/9 ≈ .0056) was required for
significance. The results of the correlation analysis showed that the correlations between midterm
grades, final grades, and overall GPA were statistically significant, as would be expected, and
were greater than or equal to .41. All other correlations were nonsignificant.
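As a minimal illustration of the adjustment described above (Python, using the values reported
here; this is a sketch, not the evaluation's actual analysis code):

# Bonferroni adjustment: divide the family-wise alpha by the number of tests.
family_alpha = 0.05   # desired family-wise Type I error rate
n_tests = 9           # number of correlations examined
per_test_alpha = family_alpha / n_tests  # 0.05 / 9, approximately .0056

def is_significant(p_value):
    """A correlation is declared significant only if its p value falls
    below the Bonferroni-adjusted threshold."""
    return p_value < per_test_alpha

print(round(per_test_alpha, 4))  # prints 0.0056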
CONCLUSIONS
The results of the evaluation are somewhat inconclusive in regard to the main objectives
of the Reading Labs. Each objective is discussed separately below. Additional conclusions that
consolidate the overall findings of the evaluation are also provided.
Objective 1: To increase students’ comprehension and reading level
Because there were flaws in both the COMPASS and PLATO reading test scores, as
previously documented, it is not possible to draw conclusions regarding a measurable increase
in students’ comprehension and reading level. However, several comments made by students
during the focus group interviews indicate that they experienced increased success in reading
texts, identifying main ideas, and applying reading strategies to their content texts.
Objective 2: To foster an increased motivation to learn in the areas of literacy and paired
content course (i.e. anthropology, history, sociology, agriculture)
The results of the Student Motivation Scale were relatively flat in regard to reading, with a
small decrease related to the students’ motivation toward their paired content course. Because the
scale was administered anonymously, the data are not available at the individual level, and it is
not clear why motivation remained stagnant or may have decreased. It is also not possible to
determine whether motivation correlated with overall success in the Reading Lab or the content
course. Turning to the qualitative data for some documentation of motivation, one might interpret
the positive comments students made about succeeding in their content course as a result of their
enrollment in the Reading Lab, as well as the rate of students earning a grade of B or better in
their content course, as indications of increased motivation. However, this interpretation is made
with caution, since no additional data points specifically measuring motivation are available to
triangulate or corroborate it.
Objective 3: To foster independent literacy and study skills
More legitimate conclusions may be drawn in relation to this final objective. As
documented through both the weekly reports by the teaching assistants and the focus group
interviews with the students, the students were reportedly successful in applying the reading and
study strategies they learned in the Reading Labs. The data documented the independent
application of these skills in the paired content courses, in other content courses, and when using
the PLATO online system.
Additional Conclusions
• Program personnel were successful in planning and implementing the summer training
workshop to train teaching assistants to facilitate the Reading Labs.
• Students may not have viewed the COMPASS assessments as important to their overall
success, resulting in a decrease in scores between the pre- and post-administration.
• Students did not like the PLATO online system, which may have led to increased stress
levels.
• Students held negative views of both reading and their paired content course, as measured by
the Student Motivation Scale.
• The majority of students (90%) were successful in the Reading Lab course by earning a final
grade of C or better.
• The majority of students (79%) were successful in their paired content course by earning a
final grade of C or better.
• The majority of students (71%) experienced academic success by earning a GPA of 2.0 or
higher.
• The meeting time for the Reading Labs was too short, and meeting locations were at times
problematic (in relation to both space and AV/technology issues).
• The students viewed having teaching assistants who were content experts as positive and
very helpful. They also saw them as role models.
• Teaching assistants did not address all of the instructional objectives related to the Reading
Lab content.
• Many students stated that their participation in the Reading Labs helped them to be more
successful in their content courses.
• Many students reported using specific reading and study strategies in their content courses.
RECOMMENDATIONS
The following recommendations are provided to address the main objectives of the Reading
Labs. These are followed by additional recommendations based on the results of this evaluation.
Objective 1: To increase students’ comprehension and reading level
• Develop a streamlined, more accurate procedure for documenting and providing data
collected by the Office of Undergraduate Education (e.g. COMPASS scores, grades, GPA).
• Place greater emphasis on the importance of the reading assessments to the students.
Consider an alternate reading assessment that could be administered during the midterm
and/or final exam periods.
Objective 2: To foster an increased motivation to learn in the areas of literacy and paired
content course (i.e. anthropology, history, sociology, agriculture)
• Administer the Student Motivation Scale through an online platform to reduce human error
when recording scores. Include a student identifier so results of the motivation scale can be
further analyzed for individual students.
Objective 3: To foster independent literacy and study skills
• Conduct follow-up interviews at the end of the Spring 2010 semester with students who were
enrolled in the Reading Labs to determine whether they continued to use the reading and
study strategies they learned.
Additional Recommendations
• Revise the summer training workshop according to the feedback provided by the teaching
assistants regarding length and other recommendations.
• Revise the summer training workshop to place additional emphasis on the instructional
objectives.
• Revise the course syllabus to include additional details regarding the dossier assignment.
Include a reflective paper, completed by students at the end of the semester, focused on the
application of the learned strategies as documented through the dossier assignments. Develop
a common rubric or scoring metric to assess the dossier content.
• Remove the requirement related to the use of the PLATO online system. Consider the use of
alternate support services available through other units at the university. If PLATO use is
continued, provide on-site supervision and support.
• Compare data collected on Reading Lab students (e.g. attendance rates, grades, and GPA) to
data on other students of similar standing not enrolled in the Reading Labs to better determine
overall academic success.
• Increase the class time for the Reading Labs to a minimum of 75 minutes per week to allow
additional emphasis on the practice and application of strategies, and increase the credit hours
earned accordingly to elicit additional buy-in from the students.
• Consider the use of teaching assistants from the College of Education who are content
experts but also possess foundational knowledge in curriculum and instruction, to ensure that
learning activities and assessments address all instructional objectives.
References
Christophel, D. M. (1990). The relationships among teacher immediacy behaviors, student
motivation, and learning. Communication Education, 39, 323-340.
Crockett, L. J., Schulenberg, J. E., & Petersen, A. C. (1987). Congruence between objective and
self-report data in a sample of young adolescents. Journal of Adolescent Research, 2, 383–392.
Cronbach, L. J. (1970). Essentials of psychological testing (3rd ed.). New York: Harper & Row.
Glaser, B. G. (1965). The constant comparative method of qualitative analysis. Social
Problems, 12, 436–445.
Rossman, G. B., & Rallis, S. F. (2003). Learning in the field: An introduction to qualitative
research (2nd ed.). Thousand Oaks, CA: Sage Publications.
Appendix A
READING LAB SYLLABUS
Fall, 2009, Section ____, Day ____, Time ________, Bldg. _______, Room ____
Instructor:
Department: College of Education
Phone:
Mailbox:
Email:
Office:
Coordinator: Dr. Ellen Godbey
Phone: 859-857-5627
Email: Ellen.godbey@uky.edu
Office: CCLD 212
TEXTBOOK
Van Blerkom, Dianna L. & Mulcahy-Ernt, Patricia I. (2005). College Reading and Study
Strategies. Belmont, CA: Thomson Wadsworth. ISBN # 0-534-58420-9
COURSE DESCRIPTION
This Academic Readiness Reading Lab is designed to improve proficiency in learning strategies,
study strategies, and reading strategies in correlation to a specified content course. Strategies
taught in this lab are applied to college level reading materials. One hour per week.
INSTRUCTIONAL OBJECTIVES:
1. Identify main and supporting details
2. Identify common writing patterns in reading
3. Differentiate fact and opinion
4. Develop reading study strategies (skimming, scanning, note-taking, outlining, mapping,
highlighting, annotating texts, reviewing information)
5. Formulate definitions using context clues
6. Build vocabulary
7. Develop techniques for summarizing and paraphrasing reading without plagiarism
8. Apply critical reading skills
9. Use technology to enhance reading and research skills
10. Create a reading dossier
11. Synthesize information from different sources
12. Recognize alternate viewpoints in texts
13. Use research strategies and information technology to perform research and locate
readings in Online Library Database
GRADING
Class work/participation: 10%
Quizzes: 40%
Reading Dossier in correlation with paired content course: 50%
GRADING SCALE
A =
B =
C =
D =
E =
*Dossier assignments are due weekly and are worth 10 points each. As a whole, these
assignments are 50% of your grade.
*Beginning in the second week, lab time will begin with a short quiz over something learned
from the previous week. These are very short quizzes, taking no more than 5 minutes, and
are worth 10 points each. As a whole, these quizzes are 40% of your grade.
*Midterm and Final Exams are not required for this class.
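To make the weighting above concrete, the short sketch below (Python, with hypothetical
scores; it is not part of the syllabus) computes a final percentage from the three category
averages:

# Weighted course grade: 10% class work/participation, 40% quizzes,
# 50% dossier assignments, with each category averaged on a 0-100 scale.
def course_grade(participation_pct, quiz_pct, dossier_pct):
    return 0.10 * participation_pct + 0.40 * quiz_pct + 0.50 * dossier_pct

# Hypothetical example: 90% participation, 8/10 quiz average, 9/10 dossier average.
print(course_grade(90, 80, 90))  # prints 86.0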
ATTENDANCE
Class attendance and participation: Attendance is an essential ingredient of class participation.
Each student is expected to attend all class sessions and to participate in class discussions and
exercises. Attendance will be taken at the beginning of each class. If you come in late, it is your
responsibility to sign the attendance clipboard upon entering class. This will ensure that you are
marked “present” that day. Students who miss a class will be required to complete two labrelated sections on PLATO (the online course designed to improve strategies in reading) prior to
the next class meeting. The make-up work on PLATO will be assigned by the instructor in
accordance with the content of the missed work. Documentation must be submitted to the
instructor for credit by the next class period.
Students are entitled to an excused absence for the purpose of observing major religious
holidays. However, the instructor must be notified in writing by the second week of class.
EXCUSED ABSENCES
As stated in Student Rights and Responsibilities (5.2.4.2), the following are defined as excused
absences: (1) illness of the student or serious illness of a member of the student’s immediate
family; (2) the death of a member of the student’s immediate family; (3) trips for members of
student organizations sponsored by an academic unit, trips for university classes, and trips for
participation in intercollegiate athletic events; (4) major religious holidays; and (5) other
circumstances found to be “reasonable cause for nonattendance.” For all excused absences, the
instructor has the right to request appropriate verification or formal written notification. In the
event of an excused absence, the student is allowed the opportunity to make-up out-of-class work
and/or quizzes.
LATE WORK POLICY
Late work consists of any Lab requirement not completed on time. Typically, Lab assignments
are due the following week. The grade on a late assignment will be reduced by 10%.
CODE OF STUDENT CONDUCT
All rules and regulations set forth in the current edition of the Student Rights and Responsibilities
of the University of Kentucky will be followed in this lab. It is the student’s responsibility to
obtain a copy of the Student Rights. Please refer to the Student Rights and Responsibilities
available at http://www.uky.edu/StudentAffairs/Code/
CLASSROOM BEHAVIOR, DECORUM, AND CIVILITY
• Arrive on time. If you are late, take a seat near the door. If there are no seats near the
door, you may stand or sit on the floor. DO NOT walk in front of classmates or the
instructor once class has begun.
• Remain present for the full class period. (Leaving class early will result in a recorded
absence). Notify the instructor (in advance) if you have to leave class early.
• Remove all hats and caps upon entering the classroom.
• Turn cell phones off upon entering the classroom. Students who disregard this request
will lose some of their class points for the day.
• No food is to be eaten during class sessions.
• Show respect for others by your speech, behavior, and body language.
ACADEMIC DISHONESTY, CHEATING, AND PLAGIARISM
The University of Kentucky College of Education expects academic honesty and regards
plagiarism and other forms of cheating as absolutely unacceptable. The minimum penalty for
either of these offenses is an “E” in the course, with suspension and dismissal also possibilities.
REASONABLE ACCOMMODATION
If you need an accommodation for a disability that limits your ability to participate fully and
meet the requirements and expectations of this class, you must first go through the Disability
Resources Center, located at Room #2, Alumni Gymnasium. To contact the DRC by phone,
please call V/TDD 859-257-2754. Please do not request accommodations directly from the
instructor. It is important that you do so within the first two weeks of the semester so you can be
approved for any accommodations you may need. Read the syllabus and course schedule
carefully and determine if you will have difficulty with any assignment because of a disability. If
so, the Disability Services Office will issue a letter for the instructor(s) specifying
accommodations for which you will have been approved. Students with disabilities should
inform the instructor of any accommodations or modifications needed the first week of class.
http://www.uky.edu/StudentAffairs/DisabilityResourceCenter/
EOA
The University of Kentucky is an Equal Opportunity Institution.
TENTATIVE READING LAB OUTLINE
WEEK #1: Chapter One: Motivation; Values; Goals; Learning styles
WEEK #2: Chapter Eight: Taking Lecture Notes
WEEK #3: Chapter Three: Introduction to College Reading Strategies
WEEK #4: Chapter Four: General Reading Strategies for College Textbooks
WEEK #5: Chapter Ten: Preparing for Exams
WEEK #6: Chapter Eleven: Taking Exams
WEEK #7: Chapter Five: Using Vocabulary Strategies
WEEK #8: Chapter Six: Improving Concentration
WEEK #9: Chapter Seven: Improving Memory
WEEK #10: Chapter Two: Managing Your Time
WEEK #11: Chapter Nine: Taking Text Notes
WEEK #12: Chapter Twelve: Comprehending Main Ideas
WEEK #13: Chapter Thirteen: Locating Specific Details
WEEK #14: Chapter Fourteen: Analyzing Key Points
WEEK #15: Chapter Fifteen: Summarizing and Synthesizing Texts
WEEK #16: Chapter Sixteen: Evaluating Information
Appendix B
Summary of Training Workshop Feedback and Instructor Reflection
What Worked Well
Overall, the workshop went very well from the viewpoint of the trainers and the Reading Lab
instructor team (see attached responses to evaluation questions). According to the instructors,
the strongest features of the workshop were
 The enthusiasm and preparedness of the training team
 The brainstorming sessions
 The organization of the materials and daily objectives
 The PowerPoints and presentations for each topic
 Learning what resources are available to them and how to access those resources
 The food
Instructors also enjoyed not being locked into a set curriculum. We valued their input and
feedback, and changes were made to the curriculum as we reviewed the chapter and dossier
components.
Things That Could Be Improved
 Allow more time for speakers on the first day: Disability Resource Center, The Writing
Center, and the Provost’s office regarding student services (15 minutes each instead of 10).
 Take instructors to the web site where students are required to complete training during
the first 3 weeks of school (instead of just providing the web site).
 Allow more time for downloading and exploring PLATO.
 Take instructors to the COMPASS site and discuss the test components/sample questions.
 Shorten the workshop by one hour each day, or shorten it to four days.
 Provide instructors with a list of action items (with time frames) for them and their Lab
students.
Things That Changed Chapter-By-Chapter
Chapter One: Instructors narrowed the subject matter to learning styles for the first day. By the
time the instructors go over the syllabus, give the 5-minute motivation survey, teach a few things
about learning styles, and let the students take the learning styles inventory in their textbook, the
class time will mostly be used up.
Chapter Three: Instructors were provided a PowerPoint with activities/interaction with
students.
Chapter Four: Instructors elected to focus on graphic organizers and brain dominance theory
and omit the information about study systems. Reading study systems will be addressed in an
online seminar.
Chapter Nine: It was decided that this content can be combined with the Chapter Eight
note-taking chapter or replaced with critical reading skills such as fact and opinion, author’s
purpose, tone, point of view, etc.
Chapter Sixteen: This chapter is full of good information, but there is not time to cover it all.
Instructors have the option to use a PowerPoint presentation/activities that address
propaganda OR fallacies.
Reading Lab instructors agreed to share the PowerPoint presentations they developed for their
mini lessons during the workshop. These are posted on the Ning site along with other
curriculum Lab resources.
Instructors decided to include the evaluation/citing of sources as needed throughout the semester.
Dossier: Chapter Seven was changed to developing a mnemonic device for remembering
something for their content course.
Instructors agreed that they would select a passage from one of the content course textbooks and
make copies for the Lab students when applying one of the strategies learned in the Lab. It will
be easier to monitor and score the dossier component if students in a particular Lab section are
applying the specific strategy to the same content passage.
Instructors will develop a weekly quiz to be given during the first five minutes of each class.
Some of these quizzes will be objective (e.g. T/F, Matching, Multiple Choice), and others will be
subjective (e.g. short answer, fill-in-the-blank).
Lunches/Breaks
Instructors expressed that they really enjoyed our lunches, which included a variety of
sandwiches, cookies, fruit, and veggies put together daily from Kroger and Wal-Mart purchases.
One day we ordered pizza and had it delivered. The instructors were given the freedom to leave
during lunch and run errands, etc., but they chose to stay together each day. I was very pleased
that they stayed together and got to know the TAs outside of their department. They laughed a
lot and seemed to have a great time.
Appendix C
PLATO 4.1 FASTRACK Advantage Reading Assessment/Curriculum
Reading Level I
Identifying the Main Idea 2
Identifying the Main Idea When It is Implied
The Title as the Main Idea 2
Details That Support the Main Idea
Chronological and Logical Order
Comparison and Contrast
Cause and Effect – Intermediate
Illustration and Example
Implied Meaning
Implying a Title
General Reading Strategies
Rhyme Scheme
Understanding Meter
Symbolic Meaning
Figurative Language
How to Read a Poem
Literal Meaning of Drama
Interpretation of Drama
What is a Review?
How to Read a Review
Commentary on Literature
Commentary on the Arts
Reading Level J
Understanding Plot
Implied Meaning of Plot
Setting
Implied Setting
Tone
Style
Kinds of Writing
Characterization
What’s a Formal Essay?
What’s an Informal Essay?
Biography and Autobiography
Finding Word Meanings
Appendix D
The Student Motivation Scale
Instructions: Please circle the number toward either word which best represents your feelings
about READING.
1. Motivated             1  2  3  4  5  6  7   Unmotivated
2. Interested            1  2  3  4  5  6  7   Uninterested
3. Involved              1  2  3  4  5  6  7   Uninvolved
4. Not Stimulated        1  2  3  4  5  6  7   Stimulated
5. Don’t want to study   1  2  3  4  5  6  7   Want to study
6. Inspired              1  2  3  4  5  6  7   Uninspired
7. Unchallenged          1  2  3  4  5  6  7   Challenged
8. Uninvigorated         1  2  3  4  5  6  7   Invigorated
9. Unenthused            1  2  3  4  5  6  7   Enthused
10. Excited              1  2  3  4  5  6  7   Not excited
11. Aroused              1  2  3  4  5  6  7   Not aroused
12. Not fascinated       1  2  3  4  5  6  7   Fascinated
13. Dreading it          1  2  3  4  5  6  7   Looking forward to it
14. Important            1  2  3  4  5  6  7   Unimportant
15. Useful               1  2  3  4  5  6  7   Useless
16. Helpful              1  2  3  4  5  6  7   Harmful
Instructions: Please circle the number toward either word which best represents your feelings
about _________________________ (e.g. history, anthropology, sociology, or agriculture).
1. Motivated             1  2  3  4  5  6  7   Unmotivated
2. Interested            1  2  3  4  5  6  7   Uninterested
3. Involved              1  2  3  4  5  6  7   Uninvolved
4. Not Stimulated        1  2  3  4  5  6  7   Stimulated
5. Don’t want to study   1  2  3  4  5  6  7   Want to study
6. Inspired              1  2  3  4  5  6  7   Uninspired
7. Unchallenged          1  2  3  4  5  6  7   Challenged
8. Uninvigorated         1  2  3  4  5  6  7   Invigorated
9. Unenthused            1  2  3  4  5  6  7   Enthused
10. Excited              1  2  3  4  5  6  7   Not excited
11. Aroused              1  2  3  4  5  6  7   Not aroused
12. Not fascinated       1  2  3  4  5  6  7   Fascinated
13. Dreading it          1  2  3  4  5  6  7   Looking forward to it
14. Important            1  2  3  4  5  6  7   Unimportant
15. Useful               1  2  3  4  5  6  7   Useless
16. Helpful              1  2  3  4  5  6  7   Harmful
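A possible scoring sketch for this instrument follows (Python). The reverse-keyed items are
inferred from the instrument itself (items 4, 5, 7, 8, 9, 12, and 13 place the negative anchor on
the left); the convention that a higher total indicates greater motivation is an assumption, not
something stated in the report:

# Items whose LEFT anchor is the negative pole (e.g., "Not Stimulated ... Stimulated"),
# inferred from the scale above; on these items a high circled value is positive.
REVERSE_KEYED = {4, 5, 7, 8, 9, 12, 13}

def score_motivation(responses):
    """responses maps item number (1-16) to the circled value (1-7).
    Returns a total where higher values indicate greater motivation
    (an assumed convention)."""
    total = 0
    for item, value in responses.items():
        if item in REVERSE_KEYED:
            total += value       # a high value already means the positive pole
        else:
            total += 8 - value   # flip so a high score always means positive
    return total

# Hypothetical example: circling 2 ("Motivated") on item 1 contributes
# 8 - 2 = 6 points; circling 6 on reverse-keyed item 4 contributes 6 points.
print(score_motivation({1: 2, 4: 6}))  # prints 12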
Appendix E
Focus Group Interview Protocol
*Introduce yourself as the supervisor for the project.
Script
As you may know, the A&S 100 course is part of a pilot project here at UK to help incoming
freshmen adjust better to the demands of college reading. One of my responsibilities is to
determine the effectiveness of the Reading Labs so we can make any necessary changes in order
for these classes to provide the best possible assistance to our students. I have a series of
questions that I would like you to respond to. It is important that you provide open and honest
responses to these questions. Your names will not be associated with your responses or
documented in any way. I would like to record our conversation for analysis purposes only and
to ensure my notes accurately reflect your comments. The audio recording will be used solely by
the evaluator on this project, who is preparing a formal report for the Office of Undergraduate
Education. Do any of you have questions before we begin?
1. How do you feel the reading strategies you learned in A&S 100 affected your learning in your
(history, anthropology, sociology, agriculture) course?
2. Which specific strategies do you think were the most helpful to you?
3. Do you think you would have been as successful in your (history, anthropology, sociology,
agriculture) course without attending A&S 100? Why or why not?
4. What were the main benefits of the A&S course? What were the weaknesses?
5. What did you think about using the PLATO online courses? About how much time did you
spend on PLATO each week? What were the main benefits (if any) using PLATO? What were
the weaknesses?