Capstone- 2013- Charles, Choi

Toward an Approach to Identify Effective Practices
for Faculty Teaching Community-Based Learning
(CBL) Courses
Robiaun Charles and YuKang Choi
Vanderbilt University
Peabody College of Education and Human Development
Capstone Report
April 2013
A Study Project for Rhodes College and Members of the Teagle-funded Consortium
Acknowledgements
We would like to acknowledge the faculty at Vanderbilt University, with whom we
have had the opportunity to engage both in and out of class, for their expertise
and wisdom. We are especially grateful to Dr. John Braxton (capstone project faculty
advisor), Dr. Suzanne Bonefas (capstone project client, Rhodes College and Teagle
Consortium), Dr. Bob Johnson (capstone project client, Rhodes College and Teagle
Consortium), Dr. Claire Smrekar (capstone project faculty consultant, Vanderbilt
University), and the faculty and staff representatives from the Teagle-funded
Consortium member colleges and universities.
Lastly, we dedicate our work to our family, friends and colleagues for their enduring
support and encouragement. You made this possible.
About the Authors
Robiaun Rogers Charles is the Associate Vice President and Executive Director of
Development for the Division of Diversity and Community Engagement (DDCE) at The
University of Texas at Austin. In this role, she provides strategic leadership to foster
relationships and secure philanthropic support for DDCE and its programs, projects
and initiatives. A native of Atlanta, GA, Charles earned her BA in philosophy from
Rollins College, where she was honored as an Algernon Sydney Sullivan Scholar, as
well as a certificate in international cultural study from Tver State University in Russia.
She later earned her MPA from The Andrew Young School of Policy Studies at Georgia
State University, and completed the Management Program for Higher Education
Professionals at Harvard University. She and her husband (Preston) have one daughter
(Camille).
YuKang (YK) Choi, a practitioner and scholar, began his EdD journey in 2009.
Before his study at the Peabody College of Education and Human Development, Choi
studied at the John F. Kennedy School of Government at Harvard University, where
he received his MPP degree. Choi also received his BA from Handong Global University
in Korea. Currently, he is the founding CEO of Teach For All Korea. In
this role, Choi helps low-income students, North Korean refugee students, and foreign
workers’ children.
Table of Contents
Executive Summary……………………………………………………………………….. 5
Introduction………………………………………………………………………………... 8
Background…………………………………………………………………… 9
Project Approach…………………………………………………………….. 10
Project Questions…………………………………………………………….. 11
Report Organization………………………………………………………..... 12
Project Question One……………………………………………………………………. 13
Conceptual Framework…………………………………………………….. 14
Method……………………………………………………………………….. 16
Process Used to Construct the Data Set…………………………………... 16
Data Collection Instrument and Protocol………………………………... 21
Data Analysis………………………………………………………………... 24
Findings……………………………………………………………………… 25
Summary…………………………………………………………………….. 34
Project Question Two........................................................................................................ 35
Conceptual Framework…………………………………………………….. 35
Method……………………………………………………………………….. 35
Findings……………………………………………………………………… 35
Summary…………………………………………………………………….. 45
Project Question Three………………………………………………………………….. 46
Conceptual Framework…………………………………………………….. 47
Method……………………………………………………………………….. 47
Findings……………………………………………………………………… 47
Summary…………………………………………………………………….. 53
Limitations………………………………………………………………………………... 54
Recommendations……………………………………………………………………….. 57
Conclusion………………………………………………………………………............... 64
References………………………………………………………………………................ 66
Appendices……………………………………………………………………….............. 70
Appendices
Appendix A: Teagle-funded Consortium Member Institutions………………………. 71
Appendix B: Community-Based Learning Scorecard For Students…………………... 76
Appendix C: Community-Based Learning Scorecard For Faculty/Instructors……… 79
Appendix D: Community-Based Learning Scorecard For Community Partners…… 82
Appendix E: Courses Eligible for Qualitative Study………………………………........ 85
Appendix F: Courses Ineligible for Qualitative Study……………...………………….. 87
Appendix G: Scorecard to Domains of Practice Key……..…………………………….. 89
Appendix H: Recruitment Email Script…………………………………………………. 91
Appendix I: Confirmation of Interest Email Script…...………………………………... 92
Appendix J: Confirmation of Interview Email Script….……………………………….. 93
Appendix K: Qualitative Interview Questions and Protocol………………………….. 94
Appendix L: Interview Code Key………………………………………………………... 97
Appendix M: Teagle Scorecard based Key Performance Indicators (KPIs)………….. 98
Appendix N: KPI Scorecard………………………………………………………………. 99
List of Figures
Figure 1: Project Questions………………………………………………………………... 12
Figure 2: Consortium Scorecard Group One Sample Questions……………………… 17
Figure 3: Consortium Scorecard Group Two Sample Questions……………………… 17
Figure 4: Consortium Scorecard Group Three Sample Questions…………………….. 17
Figure 5: Community-Based Learning (CBL) Courses by Semester…………………... 18
Figure 6: Community-Based Learning (CBL) Courses by College and University….. 18
Figure 7: Scorecard Questions by Domains of Practice………………………………… 19
Figure 8: Conversion Methodology by Class……………………………………………. 20
Figure 9: Conversion Methodology by Domain………………………………………… 20
Figure 10: Conversion Methodology across All Domains……………………………... 20
Figure 11: Courses Eligible for Qualitative Study………………………………………. 21
Figure 12: Selection of Participants……………………………………………………….. 22
Figure 13: Qualitative Interview Questions……………………………………………... 23
Figure 14: Teagle Scorecard Future Use…………………………………………………. 61
Figure 15: Ongoing Assessment Model for Faculty…………..………………………… 62
Executive Summary
Many scholars agree that one of the most critical indicators of an institution’s
performance is its ability to improve the quality of student learning (Astin, 1991; Banta,
2002; Ewell, 2002; Pascarella, 2005). Thus, it is important to investigate how
out-of-class influences and in-class courses lead to better student learning. One
example of such investigation has occurred with community-based learning.
Community-based learning is a pedagogical approach that aims to help students learn
academic and community concepts through active community service (O’Grady, 2000).
In a community-based learning or service learning course, faculty combine academic
study and community service with the goal of enhancing academic learning while
developing a student’s critical thinking skills. As an extensive and growing body of
research continues to investigate the learning outcomes of service learning, it has
become evident that service learning can result in enhanced learning outcomes for
students. Yet questions remain about what actually happens in the community-based
learning classroom to produce such results. There exists limited research
assessing the instructional design of community-based learning courses or the
preparation, training, and practices of faculty teaching such courses.
In 2007, a collaborative effort was initiated to explore the confluence of student
learning, assessment, and community-based learning within the liberal arts college
context. Rhodes College, Franklin and Marshall College and Niagara University
received a planning grant of $25,000 from the Teagle Foundation for
College/Community Partnerships Consortium: A Planning Grant to Explore Systematic
Assessment of the Impact of Community Partnerships on Student Civic Engagement
and Learning. The purpose of the funding was to extend the Foundation's Outcomes
and Assessment Initiative, which explored the potential of faculty-led value-added
assessment. With this funding, the grantees’ overarching goal was to create a
preliminary draft of a methodology for evaluating community-based learning (CBL)
courses. The goal was achieved in the form of a preliminary scorecard that was
administered voluntarily by institutions during the 2008-09 academic year. At the
conclusion of the planning grant, the Teagle Foundation awarded Rhodes College,
Niagara University and Franklin and Marshall College with a grant of $280,713 over a
period of 36 months to assess the added value of community-based learning courses
and programs on student learning and civic engagement. These three colleges decided
to name their working group the Teagle-funded Consortium (referred to as the
Consortium).
Over a four-year period (an extension year was added), the Consortium expanded to eight
liberal arts colleges and universities. A key result of their work was the refinement and
implementation of the Consortium’s scorecard (referred to as the Teagle Scorecard or
Scorecard) which was validated by Dr. John Braxton (Vanderbilt University) and Mr.
Willis Jones (Vanderbilt University). Through their work, including an extensive
literature review, Braxton and Jones arrayed or grouped the success factors into four
overarching domains of practice: placement quality, application, reflection and
feedback, and community voice. From the Spring of 2010 – Fall of 2011, the Scorecard
was administered voluntarily by Consortium members, and the raw data was collected
and maintained in Survey Monkey by Rhodes College.
In the Summer of 2012, the project team met with Dr. Suzanne Bonefas and Dr. Robert
Johnson of Rhodes College, the principal for the Consortium (also referred to as the
client), to discuss a project focused on analyzing the above-mentioned raw data. The
purpose was to identify effective practices for community-based learning courses. With
minor adjustments to the project questions they presented, the following were agreed
upon as the questions for this project study:
1. What are the practices responsible for high performing community-based
learning courses based upon the Teagle Scorecard?
2. What are recommendations that can make other CBL courses better?
3. What is an appropriate process/protocol for assessing community-based
learning courses on an ongoing basis?
This study sought to extend the work of the Consortium by providing a systematic
research and evidence-based process for faculty to identify and employ effective
practices in the community-based learning classroom. To achieve this, the project team
conducted a mixed-methods study designed to build upon the pre-existing Scorecard
data collected by the Consortium and construct a sample of high performing courses.
This sample was used to glean insights on effective practice through interviews and
document observation. The team’s findings revealed several effective practices of high
performing community-based learning courses both within and across the domains of
practice.
As a result of these findings, a total of 17 recommendations specific to
institutional policy makers, community-based learning faculty, research funders and
the project client were presented for consideration.
Introduction
Despite increased acknowledgment that higher education must define student learning
as a key indicator of institutional quality, the challenge remains for institutions to move
beyond input measures (such as class SAT scores, student-faculty ratios, reputational
survey scores, and dollars spent to support students) to the output measure of student
learning. Moreover, in a time of unremitting college costs and compelling suggestions
of diminishing returns, taxpayers, legislators, parents, and students themselves want
the value of college explicitly articulated. In response, there has been a proliferation of
student learning assessment systems specifically for higher education.
Many scholars agree that one of the most critical indicators of an institution’s
performance is its ability to improve the quality of student learning (Astin, 1991; Banta,
2002; Ewell, 2002; Pascarella, 2005). To accomplish this goal, institutional measures
need to be established and linked to provide evidence of a positive impact on students
(Ewell, 2002). Thus, it is important to investigate how out-of-class influences and
in-class courses lead to better student learning. One example of such investigation has
occurred with community-based learning.
Community-based learning is a pedagogical approach that aims to help students learn
academic and community concepts through active community service (O’Grady, 2000).
In a service learning course, faculty members have students actively engage in
community service that is related to academic concepts. Through faculty-guided
reflection, students learn academic concepts as they learn about community issues.
Service learning combines academic study and community service with the goal of
enhancing academic learning while developing students’ critical thinking skills. Desired
outcomes of academic service learning are for students to experience active learning
regarding course content, and gain knowledge and understanding about the students’
roles in the community (Eyler & Giles, 1999). Interestingly, the desired outcomes of
academic service learning align with elements known to enhance depth of
understanding in the learning process; Marchese (1997) stated that these elements
include active learning, frequent feedback, collaboration, cognitive apprenticeship, and
practical application.
An extensive and growing body of research continues to investigate the learning
outcomes of service learning. It is becoming evident that service learning can result in
enhanced academic learning. Yet a question remains: what actually happens in a
community-based learning classroom to produce such results? Limited research
exists that provides details or a critique of the actual instructional design of service
learning methods or the preparation of faculty for instruction of service learning
courses. Most studies on service learning courses entail qualitative interviews with
students regarding learning outcomes. The interviews examine student comments on
learning outcomes, but they do not provide information on, or a critique of, the
instructional components of service learning courses.
In 2007, a collaborative effort was initiated to explore the confluence of student
learning, assessment, and community-based learning within the liberal arts college
context. Rhodes College, Franklin and Marshall College and Niagara University
received a planning grant of $25,000 from the Teagle Foundation for
College/Community Partnerships Consortium: A Planning Grant to Explore Systematic
Assessment of the Impact of Community Partnerships on Student Civic Engagement
and Learning. The purpose of the funding was to extend the Foundation's Outcomes
and Assessment initiative, which explores the potential of faculty-led value-added
assessment.
"If colleges and universities are to bring student learning to the
highest level possible, they must build systematically on what they
have already achieved," said W. Robert Connor, president of the
Teagle Foundation. "With this latest round of grants, we aimed to
provide targeted support for the information gathering that is
crucial to the development of liberal education, for the assessment
instruments and processes that let us know how well we're serving
students and how we can do still better, and for the fresh thinking
that energizes the teaching and learning process as a whole.
(Teagle Foundation)
During the planning grant period, Rhodes, Franklin and Marshall, and Niagara 1)
shared strategic and operational practices; 2) reviewed curriculum and community
engagement activities at each institution, reviewed existing instruments for assessment
of community-based learning, reviewed and discussed existing literature; and 3)
developed a pilot assessment protocol. The overarching goal, to create a
preliminary draft of a methodology for evaluating community-based learning (CBL)
courses, was achieved in the form of a preliminary scorecard that was administered
voluntarily by a few Consortium members during the 2008-09 academic year.
The planning grant was an essential step in laying the foundation
for understanding the impact of assessment in the context of
sharing best practices and increasing faculty involvement in the
process of assessing student learning. (Rhodes College Report to
the Teagle Foundation)
At the conclusion of the planning grant, the Teagle Foundation awarded Rhodes
College, Niagara University and Franklin and Marshall College with a grant of $280,713
over a period of 36 months to assess the added value of community-based learning
courses and programs on student learning and civic engagement. These three colleges
decided to name their working group the Teagle Consortium (referred to as the
Consortium in this report).
Over a four-year period (an extension year was added), the Consortium expanded to eight
liberal arts colleges (see Appendix A), and made great gains in creating a sustainable
infrastructure for the assessment and sharing of effective practice for community-based
learning in liberal arts colleges. A key result of their work was the refinement and
implementation of the Consortium’s Community-based learning scorecard. The
Consortium worked with Dr. John Braxton (Vanderbilt University) and Mr. Willis Jones
(Vanderbilt University), who undertook an extensive literature review on community
based learning and the development of best practices for community-based learning
programs in higher education to validate the instrument. Through their work, Braxton
and Jones arrayed or grouped the success factors into four overarching domains of
practice: placement quality, application, reflection and feedback, and community voice.
They weighted a list of indicators for each practice based upon input from experts in the
field and these indicators became the basis for a set of survey items that comprise the
Teagle Scorecard (hereafter referred to as the Scorecard). It was administered
voluntarily by Consortium members during the following academic semesters: Spring
2010, Fall 2010, Spring 2011 and Fall 2011.
In the Summer of 2012, the project team met with Dr. Suzanne Bonefas and Dr. Robert
Johnson of Rhodes College, the principal for the Consortium (also referred to as the
client, see Appendices B, C, and D), to discuss a project focused on analyzing the
above-mentioned raw data. The purpose was to identify effective practices for
community-based learning courses. With minor adjustments to the project questions they presented,
the following were agreed upon as the questions for this project study:
1. What are the practices responsible for high performing community-based
learning courses based upon the Teagle Scorecard?
2. What are recommendations that can make other CBL courses better?
3. What is an appropriate process/protocol for assessing community-based
learning courses on an ongoing basis?
[Diagram: Project Question One, Identification of Effective Practices; Project Question Two, Recommendations for Courses; Project Question Three, Process/Protocol for Assessing Courses.]
Figure 1: Project Questions
This project sought to provide the Consortium a systematic research and
evidence-based process for faculty who teach community-based learning classes to
identify and employ effective classroom practices. In this report, the capstone project
team details each project question and its related conceptual framework, method, and
limitations. This is followed by an explanation of the findings within the context of the
identified conceptual framework. The report concludes with a summary of the findings
and our resulting recommendations.
Note: For the purposes of this written report, the terminology of service-learning and
community-based learning will be used interchangeably.
Project Question One
Project Question #1: What are the practices responsible for high performing community-based learning courses based upon the Teagle Scorecard?
Conceptual Framework
The number of community-based learning courses on college campuses in the United
States has grown substantially yet there remains opportunity for additional research
and scholarship on the topic (Eyler & Giles, 1999; Giles & Eyler, 1998). However, there
have been notable contributions to the community-based learning body of scholarship
during the last 20 years, including, but not limited to Boyer’s (1996) seminal work on
the scholarship of engagement. A service-learning course is a
Course-based, credit-bearing educational experience in which students
(a) participate in an organized service activity that meets identified
community needs and (b) reflects on the service activity in such a way as
to gain further understanding of course content, a broader appreciation
of the discipline, and an enhanced sense of civic responsibility (Bringle &
Hatcher, 1995, p. 112).
While the community experience and service component is central to a community-based
learning course, it is important to emphasize that academic credit is earned for
the learning that occurs as a result of the experience, not simply for participating in the
experience (Howard, 1993). Zlotkowski (1998) asserts that service activities should be
beneficial to community stakeholders and meet the educational objectives of faculty.
The result is a mutually beneficial relationship between community and the university
via the community-based learning course. When considering the relationship between
community-based learning and community service, it is important to know that while
they are related, they are also distinctly different. Furco (1996) developed a continuum
of experiential education with service learning located in the middle,
volunteer/community service on one end (because of its focus on serving recipients),
and internships/field education on the other end (because of its focus on student career
development).
To elaborate on Bringle and Hatcher’s definition, there are key components that
characterize a course as a community-based learning course, such as a service
experience linked to course objectives and deliberate, intentional reflection on the
experience (Eyler & Giles, 1999; Mitchell, 2008; Rosenberger, 2000). This experience
should be directly related to learning objectives and offer an opportunity to engage with
the community by way of a community partner and achieve specific goals (Jacoby, 1996;
Mitchell, 2008).
Informed by such research as that cited above, the Scorecard is grounded in the
following four research-based domains of practice: placement quality, application,
reflection and feedback, and community voice.
Placement Quality
Placement quality refers to the quality of the college-community partnership. High
quality partnerships “provide productive situations for students as well as genuine
resources useful to the community” (Eyler & Giles, 1999, pp. 168-169). Eyler and Giles
(1999) refer to the “anchoring of learning in community experience” and note that
placement quality “provides a context in which students can exercise initiative, take
responsibility, and work as peers with practitioners and community members” (p. 169).
Application
Application refers to the connection to the academic content and the extent to which the
instructor links the community-based learning experience to the classroom learning
(Eyler & Giles, 1999, p. 170). The frequency of connecting academic content to the
community-based learning experience was shown by Astin et al. (2000) to be “an
especially important determinant of whether the academic material enhances the
service experience, and whether the service experience facilitates understanding of
the academic material” (p. iii). Astin et al. (2000) found “strong support for the notion that
service learning courses should be specifically designed to assist students in making
connections between the service experience and the academic material.” Clearly defined
learning objectives, a clear connection between educational methodology and
learning objectives, and the appropriateness of the educational method for the subject
matter were also discussed in the context of application.
Reflection and Feedback
Reflection and feedback is the third domain of practice, identified in the literature
as a key instructional factor associated with positive learning outcomes for students.
Most studies conducted on service learning focus on the reflective experience and its
related course design (Astin et al., 2000; Eyler & Giles, 1999; Mayhew & Fernandez,
2007). The reflection experience is the process in which the student integrates academic
content with the service experience and personal belief and identity systems (Eyler &
Giles, 1999; Jacoby, 1996; Mayhew & Fernandez, 2007). Reflection refers to the activity
(individual writing and group discussion) that provides students the opportunity to
process their community-based learning experience. Feedback refers to the quality of
feedback to students from faculty and community members and, conversely, feedback
from students and community members about the course.
Community Voice
The final domain of practice is community voice, which refers to the alignment between
the community-based learning work done and the needs and desires of the community.
Factors of community voice include the existence of a community partnership rather
than a client-provider relationship, the strength of the community partnership, and the
degree and breadth of community input.
In this section the capstone project team addresses project question #1 within the
context of the theory and research presented in the conceptual framework.
Method
The inherent assumption with project question #1 is that the Scorecard, as a valid
protocol instrument, captures the data necessary to identify effective practices. The
capstone project team’s strategy to address this question was to employ a mixed
methods approach designed to (1) build upon the pre-existing quantitative data
collected by the Consortium to construct a sample for qualitative study and to (2) glean
insights from the qualitative sample on effective practice.
Process Used to Construct Sample
Maintained in Survey Monkey (an online survey tool), the selected pre-existing
quantitative data resulted from student, faculty, and community member responses to
28 questions on the Scorecard, separated into three groups (see Figures 2-4). Each
question offered a five-item Likert-scale response: 1) strongly disagree, 2) disagree, 3)
agree, 4) strongly agree, and 5) I do not have enough information to comment. For this
project, the capstone project team removed I do not have enough information to comment
responses from the analysis to avoid skewing the data; skewing would otherwise result
from the way that Survey Monkey translated the information.
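This exclusion step can be sketched in a few lines. The numeric Likert coding and the shape of the raw export below are assumptions for illustration, not the Consortium's actual Survey Monkey format:

```python
from statistics import mean

# Assumed Likert coding for one Scorecard question: 1-4 are the
# substantive agreement levels; 5 stands for "I do not have enough
# information to comment" and is dropped before averaging so that it
# cannot skew the mean.
NO_INFO = 5

def question_mean(responses):
    """Mean of substantive responses, ignoring 'not enough info' codes."""
    substantive = [r for r in responses if r != NO_INFO]
    return mean(substantive) if substantive else None

# Illustrative raw export for one question of one course.
raw = [4, 3, 5, 4, 2, 5, 3]
print(question_mean(raw))  # the two 5s are excluded before averaging
```

Returning `None` when every respondent selected the "not enough info" option keeps such questions visibly distinct from a genuine low score.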
16
The following questions seek to learn about your perceptions of the
experiences you have encountered during your current community-based
learning (CBL) course/program. Please indicate your level of
agreement with each statement.

• Students perform a variety of activities while carrying out their community-based learning (CBL) project.
• There is a clear connection between the specific tasks students perform in the community and the goals of the course.
• CBL projects require the use of course knowledge and skills to address real problems in the community.
• The course entails the application of theories covered in the course to CBL projects.

Figure 2. Consortium Scorecard Group One Sample Questions
The following questions seek to learn about your perceptions of the
experiences you have encountered during your current community-based
learning (CBL) course/program. Please indicate how often
these behaviors have occurred during this CBL course/program.

• The instructor works with students in the community during their CBL project.
• The instructor asks students to identify alternative ways of viewing issues arising from their CBL experience.
• The instructor provides feedback on individual student reflections.
• The instructor uses student reflections to focus class discussions.

Figure 3. Consortium Scorecard Group Two Sample Questions
The following questions seek to learn about your perceptions of the
experiences you have encountered during your current community-based
learning (CBL) course/program. Please indicate your level of
agreement with each statement.

• CBL projects take place over a sustained period of time.
• Students are involved in the planning of their CBL project.
• Students work directly with community partners in their CBL project.
• Students have important responsibilities in their CBL project.
• CBL projects performed as part of this course/program are useful to the community.

Figure 4. Consortium Scorecard Group Three Sample Questions
The Survey Monkey database (referred to as the consortium database) listed a total of 90
courses. However, only 48 (see Appendix E) of the 90 courses in the Consortium database
contained responses to the 26 questions on the Scorecard. As a result, the other 42
courses (see Appendix F) were omitted from the study. The 48 courses included in the
project research were held over a period of two years, from the Spring of 2010 through
the Fall of 2011 with the highest participation during the spring semesters (see Figure
5). Seven different Consortium colleges and universities were represented (see Figure
6).
48 Courses by Semester
[Bar chart: Spring 2010, 18 courses; Fall 2010, 7; Spring 2011, 18; Fall 2011, 5.]
Figure 5. Community-Based Learning (CBL) Courses by Semester
48 Courses by College and University
Allegheny College (PA)
Franklin and Marshall College (PA)
Hobart and William Smith Colleges (NY)
Ithaca College (NY)
Niagara University (NY)
Rhodes College (TN)
Stonehill College (MA)
Figure 6. Community-Based Learning (CBL) Courses by College and University
In accordance with the project question’s call to focus on “high-performing” courses,
the team employed theoretical sampling, a process described by Patton (1990) as the
selection of information-rich cases, or “those from which one can learn a great deal
about issues of central importance to the research” (p. 169). A system of theoretical
sampling was employed by developing a conversion methodology based upon the
responses to the 26 questions for 48 courses both across and within the four domains of
practice that we described in the Conceptual Framework. A key was provided by the
Consortium and used to match each question to one of the four domains of practice (see
Appendix G).
26 Questions by Domains of Practice
[Bar chart: Placement Quality, 6 questions; Application, 4; Reflection & Feedback, 12; Community Voice, 6.]
Figure 7. Scorecard Questions by Domains of Practice
The capstone project conversion methodology included the following steps. First, we
exported from Survey Monkey responses to the 26 scorecard questions, by course. This
was followed by the selection of a question and the calculation of the corresponding
mean from all question respondents for that specific question. This step was repeated
until a mean score had been calculated for each of the 26 questions. The questions were
then grouped according to domain of practice, with placement quality associated with
six questions, application with four questions, reflection and feedback with 12
questions, and community voice with six questions. When this was completed, all of
the above steps were repeated for each of the 48 classes (see Figure 8).
[Figure 8 is a flowchart of the per-class conversion: (1) extract responses to the relevant Consortium Scorecard questions from Survey Monkey; (2) for each individual question, calculate the mean across all respondents (students, faculty, and community partners); (3) sort responses by domain of practice and calculate an average score for each domain; (4) calculate an average score across all domains.]

Figure 8. Conversion Methodology by Class
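The per-class conversion described above can be sketched in a few lines of code. This is a minimal illustration under assumed data structures (a mapping of question IDs to respondent scores, plus the Appendix G question-to-domain key), not the team's actual tooling:

```python
from statistics import mean

def class_domain_scores(responses, domain_key):
    """Convert one class's scorecard responses into domain scores.

    responses:  {question_id: [scores from all respondents]}
    domain_key: {question_id: domain}  # the Appendix G mapping (assumed shape)
    Returns ({domain: mean score}, overall mean across domains).
    """
    # Steps 1-2: mean across all respondents for each question.
    question_means = {q: mean(scores) for q, scores in responses.items()}
    # Step 3: group question means by domain, then average within each domain.
    by_domain = {}
    for q, m in question_means.items():
        by_domain.setdefault(domain_key[q], []).append(m)
    domain_scores = {d: mean(ms) for d, ms in by_domain.items()}
    # Step 4: average across domains for the class-level score.
    return domain_scores, mean(domain_scores.values())
```

Repeating this for each of the 48 classes produces the per-domain and across-domain means that feed the subsequent steps.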
Following these steps for each of the classes, the mean was then calculated for each domain and then across all of the domains (see Figure 9).
[Figure 9 is a flowchart of the per-domain steps: calculate the average for each domain across all classes; calculate the standard deviation for each domain; calculate a set standard deviation above the mean.]

Figure 9. Conversion Methodology by Domain
After high-performing classes were identified within each domain of practice with the steps above, the team identified high-performing classes across all of the domains using the same process (see Figure 10).
[Figure 10 repeats the same steps across all domains: calculate the average for each domain across all classes; calculate the standard deviation; calculate a set standard deviation above the mean.]

Figure 10. Conversion Methodology across All Domains
After completion, the above steps yielded a total of 13 courses with mean scores at least one standard deviation above the mean in one or more domains of practice. Of these, six courses had mean scores one standard deviation above the mean across all domains. Given the small number of courses, the capstone project team lowered the cut point to one-half of a standard deviation above the mean. This modification resulted in a total of 30 courses we then categorized as “high-performing.” Of the 30 courses, 10 were “high-performing” in the placement quality domain, 11 in the application domain, 14 in the reflection and feedback domain, and 14 in the community voice domain. A total of 12 courses qualified as “high-performing” across all domains.
When examining performance unique to a single domain, one course was “high-performing” in the placement quality domain only, one in the application domain only, two in the reflection and feedback domain only, and three in the community voice domain only.
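The cut-point logic above – flagging a course in a domain when its mean score sits at least a set fraction of a standard deviation above the cross-course mean – can be sketched as follows. This is an illustrative reconstruction with assumed data structures, not the team's actual code:

```python
from statistics import mean, pstdev

def high_performing(scores_by_course, cut=0.5):
    """scores_by_course: {course_id: {domain: mean score}} (assumed shape).

    Returns {domain: [course_ids]} listing courses whose score is at
    least `cut` standard deviations above the cross-course mean for
    that domain.  The team first used cut=1.0, then relaxed to cut=0.5.
    """
    domains = next(iter(scores_by_course.values()))
    flagged = {}
    for d in domains:
        vals = [s[d] for s in scores_by_course.values()]
        # Threshold: cross-course mean plus `cut` standard deviations.
        threshold = mean(vals) + cut * pstdev(vals)
        flagged[d] = [c for c, s in scores_by_course.items()
                      if s[d] >= threshold]
    return flagged
```

Lowering `cut` from 1.0 to 0.5 widens the net, which is how the pool grew from 13 to 30 courses.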
[Figure 11: of 90 courses, 42 had incomplete data, leaving 48; 18 of those fell below the theoretical sampling threshold, leaving 30 courses eligible for the study.]

Figure 11. Courses Eligible for Qualitative Study
Data Collection Instrument and Interview Protocol
The faculty members teaching the 30 “high-performing” courses were invited to participate in the project study. The purpose of the interviews was to gain a deeper understanding of the practices of the faculty members teaching these courses. Although the scorecard provided information about what was happening in the selected courses, we conducted the interviews to learn how, why, and when it was happening. “Qualitative findings in evaluation illuminate the people behind the numbers and put faces on the statistics . . . to deepen understanding” (Patton, 1990).
Qualitative research designs tend to work with a relatively small number
of cases. Generally speaking qualitative researchers are prepared to
sacrifice scope for detail. Moreover, even what counts as detail tends to
vary between qualitative and quantitative researchers. The latter
typically seeks details in certain aspects of correlations between variables.
By contrast, for qualitative researchers, detail is found in the precise
particulars of such matters as people’s understanding and interactions.
(Silverman & Marvasti, 2008, p. 14)
Schostak (2006) stated that the interview should be a pleasant experience in which the
participant can connect with the researcher, creating a space of trust where inner
thoughts and emotions can be shared in a safe and comfortable exchange. This was
even more important for us because, due to budget and time constraints, the project
team was unable to conduct interviews in person. Rather, the team conducted the
interviews by telephone. Invitations were extended to instructors of the 30 courses; in a few instances one instructor taught more than one of these courses. In the end, the capstone project team interviewed faculty for 21 courses – a sample representing 70% of the population of faculty teaching high-performing courses (see Figure 12). Interviews ranged from 60 to 90 minutes, and the protocol for each interview included an invitation email, an informed-consent request email, and an interview-confirmation email (see Appendices H, I and J).
[Figure 12: of the 30 courses (13 at one standard deviation and 17 at one-half standard deviation above the mean), nine were omitted for various reasons, leaving 21 courses represented in the qualitative study.]

Figure 12. Selection of Participants
The protocol for each interview also included a brief description of the project and study questions. The capstone project team began each interview with background questions such as “What is your current title?” and “Are you tenured, tenure-track, or non-tenure-track?” Each interviewee was then asked to describe their “high-performing” community-based learning course. This was followed by a series of semi-structured questions about their planning process for the course, their decision process for selecting a community partner, why they organized or structured their course in a particular manner, and the domains of practice (see Figure 13). These questions were followed by a series of questions about their impression of their institution’s commitment to community-based learning as well as their department’s commitment (see Appendix K). Faculty members were also asked to share their thoughts on the Scorecard and its effectiveness as an assessment tool. The interviews ended with a restatement of the project’s purpose and a request, or reminder, to share any relevant documents (e.g., course syllabus).
Qualitative Interview Questions

Placement Quality
- How were the activities that students participated in during the CBL course project determined/identified?
- In what ways were students involved in planning for their CBL project?

Application
- What was done to establish an explicit connection between the tasks students performed in the community and the goals of the course? Were there any challenges in doing this? If so, how did you overcome them?
- How did you go about creating a link between the course and the community problem?

Reflection and Feedback
- How did you facilitate and foster in-class discussion and reflection?
- What opportunities were provided for community partners to provide feedback?

Community Voice
- Who articulated the goals of the community to the students? How were the goals articulated?
- What was the relationship between the university and the CBL project/project site and/or community partner? Was there a pre-existing relationship?

Figure 13. Qualitative Interview Questions
Themes began to emerge after the first few interviews, which prompted additional questions during later interviews. For example, the capstone project team began to ask faculty why they used community-based learning, and as themes emerged early the team homed in on them during later interviews. Interviews were transcribed within a few days of completion, with coding beginning immediately thereafter. At the conclusion of interviewing, the project team was satisfied that the data acquired were sufficient for the data analysis.
It is important to note that this study was approved by the Vanderbilt University Institutional Review Board. It was determined that data acquired for the study would be reported anonymously and that the identity of individual interviewees would remain confidential. Safeguarding participant confidentiality in the collection, storage, and recording of the data was a high priority. Interviews were digitally recorded, and the recordings were housed by a third-party provider with password-restricted access limited to the research team. These recordings were sent to a third-party provider for transcription under a pseudonym.
Data Analysis
The capstone project team’s analysis involved breaking down the interview data into
manageable parts to allow for the emergence of concepts and categorical information.
According to Silverman and Marvasti (2008), the process of breaking down data for purposes of analysis involves four phases: the first is open coding; the second, axial coding, identifies the key concepts; the third breaks these concepts down further; and the fourth identifies patterns.
Based upon Silverman and Marvasti (2008), the capstone project team began with open coding of transcripts. On average, interviews were transcribed within two to three days of the interview, which allowed for timely open-coding analysis, although in some instances the timeframe between interview and transcript was longer. Topics that emerged in connection with the open coding were captured as well. After the fourth interview, while open coding continued, the capstone project team concurrently moved into the second and third phases and began axial coding. Through this process of analysis across interviews, 12 within- and across-domain concepts emerged. The last phase, pattern identification, resulted in seven core themes.
Findings for Project Question 1: What are the practices responsible for high-performing community-based learning courses based upon the Teagle Scorecard?

The capstone project team’s goal was to identify the practices responsible for “high-performing” community-based learning courses. The following provides a summary of the project team’s findings at the classroom level and the institutional level, within and across each domain of practice. Many quotations have been taken from the data and presented here to support the core themes; the quotes are single-spaced, indented, and italicized. The capstone project team created a pseudonym for each faculty member/instructor which indicates their faculty rank (see Appendix L).
Placement Quality Domain
According to Eyler and Giles (1999), for a student’s efforts to be considered service learning, the proposed activity should promote learning through active participation. These authors note that placement quality refers to the extent to which students in their community placements are challenged, are active rather than observers, feel that they are making a positive contribution, and have an important level of responsibility (p. 33). Campus Compact (2000) identifies positive, meaningful, and real activities as the common characteristics of authentic service learning. Accordingly, it is essential for instructors to identify and secure appropriate placements in order to foster student learning and growth through proactive and responsible CBL experiences. The placement quality domain includes questions such as “How were the activities that students participated in during the CBL course project determined/identified?”, “In what ways were students involved in planning activities for their project?”, and “How often, and in what manner, did students interact with their community partner?”
Through its interviews, the capstone project team delineated two categories of best practices for “high-performing” community-based learning courses in the placement quality domain: faculty knowledge and expertise, and active student involvement and engagement.
Faculty Knowledge and Expertise
The capstone project team’s interview results show that in many cases the instructor’s knowledge and experience are essential factors for “high performance” in the domain of placement quality. As Pascarella and Terenzini (2005) noted, an instructor’s pedagogical skills also influence student learning outcomes, which are directly related to course success. Considering that no class can exceed its instructor’s capacities, it is reasonable to think that an instructor’s experience and knowledge greatly influence CBL course results. At the same time, we found that student involvement in the beginning stage of planning a CBL project is not a critical factor for successful CBL courses. The excerpts from faculty interviews below show that instructors play a fundamental role in initiating CBL courses.
It developed as a consequence of my personal experience as an expert
witness in political asylum cases, and recognizing the power and the
feeling of empowerment that it gave me in helping someone win asylum
and not be sent back to their country of persecution to be killed…(5T2)
Another interviewee said:
Their area of specialty is community-based art, so that’s what she does
her research on. And one of my teaching specialties is public history. So
we were really interested in sort-of combining those two experiences.
How art can enrich public history projects, how public history can enrich
community art projects.(3N3)
A third interviewee stated:

So my idea was that I’m going to teach content. I can’t teach the whole
science, but I can pick one topic as it is and teach the content the way I
want them to teach – my pre-service teachers to use those instructional
strategy.(2TT4)
These excerpts show how instructors’ background knowledge and experience equipped them to develop or actively initiate CBL course content and the relationship with the community partner. In some cases, the very existence of a specific CBL course depends on the instructor’s experience and willingness, since instructors decide to offer a CBL course based on their personal experience.
Well, I have a friend . . . and she teaches a community-based learning
course. And she was familiar with my – one of my responsibilities as
judge, as I am the drug court, problem solving judge. And she was
familiar with the drug court concept, and she suggested that I put
together a community-based learning type course to teach students more
about drug courts and problem solving courts addiction and recovery, so
that –then one thing led to another, and I proposed it. (7O01)
These examples suggest that instructors should build on their strengths when creating CBL courses, since familiarity and expertise are consequential.
On the other hand, and contrary to the team’s initial expectations, the interview results reveal that student involvement in the planning stage of a CBL project is not a critical factor for high performance within the placement domain. It is possible that the majority of instructors did not recognize the importance of student involvement at this stage; no interviewee mentioned having students participate in the planning stage or considered such involvement critical. In most cases, instructors and community partners initiated the project and offered students already-framed CBL courses. One interviewee said:
So the planning, overall course planning was done before I have new
students. Because it was during the summer, I almost finished up my
syllabus. (2TT7)
This shows that the academic calendar imposes a logistical limit on involving students in CBL planning from the beginning stage.
Required and Expected Involvement and Engagement
As Eyler and Giles (1999) note, it is essential for students to be challenged, to be active rather than observers, and to feel they are making a positive contribution. Although students’ active involvement in the beginning stage is difficult given the logistics of course scheduling and structure, the capstone project team’s interview results show that high-performing CBL instructors make students active participants rather than passive ones whenever possible. For example, one instructor mentioned:
One of the things we did early on in the fall is my students crafted the
project’s mission statement… So they looked at a lot of mission
statements for not necessarily similar types of groups but for museums
that really worked very closely with the communities they served. So I
think that the process of doing that not with something that was edited
and written by everyone. They were different drafts, there were
discussions what do we include, what do we not include in that mission
statement. From the beginning of the writing process to the finalized
product, it was a group write. And I think that really helped them define
and articulate some of the goals for the course… I think one of the things
we made very clear through even just you’ll see in the syllabus and I
think verbally and through some of our actions is what this was really
their project. We wanted them to have a role in articulating what was
important about this and what they wanted to get out of it. (3N13)
One instructor interviewed every student who wanted to take his CBL class before accepting them into it. This barrier to entry gave students a sense of belonging and responsibility, which seemed to produce active participation.
I have found that it is a value to all of the students if I sort of craft the
makeup of the class. Because we have so many students who are
interested in taking the class and because we have a limitation on size
and space available, we limit the class… And the reason I have the
interviews is because, number one, I want students who are serious
about taking the class and are – who are prepare to do the work and
who understand that this is not intended to be a freshmen level course
where I just talk at you and then you spit it back out on a piece of
paper, that you're going to have to have the – take the initiative on
your own to go out and do the work…, by sort of crafting the makeup
of this class, so we get lots of different ideas and concepts and
philosophies. So the interview process is geared more towards are they
serious about the class, are they prepared for the class, do they know
what the class – the course requires and tell me why you want to take
this class. (7O14)
Application Domain
Service learning has been promoted in higher education to foster social responsibility,
moral health, active community participation and “deep understanding of one’s self
and respect for the complex identities of others, their histories, and their cultures”
(Mayhew & Fernandez, 2007). According to Eyler and Giles (1999), for a student’s efforts to be considered service learning, the proposed activity should provide an opportunity to use skills and knowledge in real-life situations, extend learning beyond the classroom, and foster a sense of caring for others. Campus Compact (2000) identifies addressing complex problems in complex settings, rather than simplified problems in isolation, as a common characteristic of authentic service learning. In this sense, it is critical for instructors to have clear goals and objectives for the course. The application domain includes questions such as “What was done to establish an explicit connection between the tasks students performed in the community and the goals of the course?” and “How were the community problems identified, and who identified them?”
Through its interviews, the capstone project team delineated two categories of best practices for high-performing CBL courses in the application domain: intentional course design and a preexisting connection with the community.
Intentional Course Design
Eyler and Giles (1999) note that community service adds value, and well-designed service learning adds even more value (p. 32). Our study demonstrates the importance of intentional course design for high performance in the application domain; this was evident in the review of community-based learning course syllabi as well as in the participant interviews. One interviewee said:
I think it was very closely related because I want my students to develop
science lessons. And then – teaching it with real students. And so they
had real chance to see how their instruction works with the students...So
that they are coupled with individual students. And each group taught
about three to four lessons. And they were supposed to assess the
individual partner before and after each lesson, so they can see how the
students learn in progress. So that was pretty closely related to our
objectives. And also another objective was that my students to work with
students from different background. And that was clearly there. (2ATT8)
As Eyler and Giles (1999) also note, service learning can reduce negative stereotypes, and we can reasonably assume that instructors can intentionally design CBL classes to reduce negative stereotypes by exposing students to specific environments. The quote below is a good example:
My idea was that if we bring our students to school under the school
structure, there’s very little room for them to implement their own plan
in the classroom, especially with all the testing going on. And not many
teachers allow pre-service teachers to do whatever they want to do. So
first of all, I wanted to work with a little bit more informal setting. And
also, these days the pre-service teachers need exposure to diverse
students. Because when they graduate, their classroom will be pretty
much diverse. And our school especially has very homogeneous
population themselves…So that was my idea that I really want to go out
to the community and working with diverse group of students and
involving them.(2TT2)
Pre-Existing and Fostering Connection with Community
An instructor’s intentional design is not by itself enough for success in the application domain, as a CBL course inherently requires interaction between the class and community partners. Therefore, not surprisingly, a pre-existing connection with a community partner emerged as an essential factor for success within this domain. One interviewee said:
I contacted community partners, formed a relationship, explained to
them what the goals of the course were, and we work together to figure
out what kind of opportunities they had available that undergraduates
would be well suited to. (9A4)
In some cases, instructors of the CBL class had either cooperated with the partner organization or had worked there as staff members. We can reasonably assume that these instructors have a deep understanding of what the community partner wants.
These are all organizations that I have a connection with or
have worked with. I – a little bit unique for most of the
placements that I work with. I have that direct relationship
with service learning sites.(14A5)
This quotation shows that understanding the community partner’s needs is key to success in application, and that this understanding comes from a pre-existing connection with the community. Another interviewee noted:
Much of the way the actual courses laid out was due to the
conversations and came about through the conversations that
we had with our community partner.
Reflection & Feedback Domain
Reflection exercises in service learning courses have been shown to be a key factor in
student learning outcomes. Eyler and Giles (1999) note that learning occurs through a
cycle of action and reflection, not simply through being able to recount what has been
learned through reading and lecture (p. 7-8).
Through its interviews, the capstone project team delineated one category of best practice for high-performing CBL courses in the reflection and feedback domain: systematic reflection.
Systematic Reflection and Feedback (Multi-level and Ongoing)
For a student’s efforts to be considered service learning, the proposed activity should provide structured time for students to reflect (Eyler & Giles, 1999). The capstone team’s interview results show that the high-performing CBL courses have systematic reflection processes. These instructors intentionally designed their classes to include ample reflection time – written, discussion-based, or both – and in most cases decidedly more than the minimum.
Students do a weekly reflection online or in a journal…We’d also have
in-depth reflections more on the philosophical or political or social,
cultural implications of policies or practices. We even used a young
adult’s children’s novel at one point in the semester that really gets to
issues around supports for students with much more significant support
needs and gets to even issues about euthanasia and the larger context of
the rights of people with disabilities and communication. (14A13)
The interview below also shows that “high-performing” CBL courses take a well-organized approach to reflection, with regular reflection time. This instructor asked students to keep weekly journal notes, participate in class discussion, and write a final paper.
They were asked to take weekly notes in a journal of their experiences in
the field to reflect individually, and then when we would have
workshops, we would often do part of that time individually per group.
And I would sit down with each group and as a group we would reflect
on the work that they were doing, and that was probably once every week
or two. And then they’re responsible for a final paper, which in many
ways was bringing together all of these reflections with all the readings
that they had done to make a kind of final report. (4A5)
It’s very important for students to be able to reflect, collectively, on that
experience. (5A15)
There is consensus that reflection and feedback are also key to student learning and to a course being identified as high-performing. One interviewee said:
What I do is I really read them almost as if I was getting a letter from
somebody who was working here, and if they ask me questions, I respond,
or if I think they need some help on something, either academic or CBL, I
tell them to come see me if it’s something can’t answer in the
reflections…probably the reflection’s a key thing and the discussion. The
reflections – because they have to make a connection between what
they’re doing at – in the classroom and what we’re reading, well, that’s –
they have to do that, so they do it. (6A12)
In sum, well-designed reflection and feedback is arguably the most important factor for
a high-performing CBL course.
Community Voice Domain
A CBL course is designed to meet community needs as well as to foster student growth. For this reason, it is essential for the instructor and students to understand what the community really wants. According to Eyler and Giles (1999, p. 47), community voice – where students feel that the work they did was shaped by input from the community – predicted that students would feel more connected to the community. The authors note that for a student’s efforts to be considered service learning, the proposed activity should foster a sense of caring for others. The community voice domain includes questions such as “Who articulated the goals of the community to the students? How were the goals articulated?” and “What was the relationship between the university and the CBL project/project site and/or community partner?” Through its interviews, the capstone project team delineated two categories of best practices for high-performing CBL courses in the community voice domain: understanding community needs and community empowerment.
Understanding Community Needs
The instructor of one successful high-performing course mentioned:
Well, it was determined by community need, essentially. It was I
consulted with, in the case of xxx, with our community partner, which
became one lead organization, xxx, and we determined what the
students needed to do in order to put together a complete asylum
package so it was kind of predetermined based on what the needs were of
the asylum seekers and of the community partner so for example the
students were not pre-law students or law school students so they
might not be experts in writing a legal belief but they could do research
and learn about case that could be given to the attorney working on the
case on behalf of the asylum seeker and help them write their legal brief
so it wasn’t actually submitted to immigration court, in that case, but it
would be used to help the immigration attorney…It was a combination
of me and the community partners because they would or in the
training they would talk about what the issues were and what the needs
were. (5A9)
This interviewee clearly stated that his CBL course content was determined by community need and that he consulted continually with the community partner about the students’ role in service. The instructor makes it clear that “it was a combination of me and the community partners” who identified and determined what the community needed. The instructor below asked the community partner a very direct question to identify its needs:
We asked the community partner, “What do you need?” Because the
community-based learning, the part of it is mutual benefit, right? So I
asked them what they need.(2A4)
I contacted community partners, formed a relationship, explained to
them what the goals of the course were, and we work together to
figure out what kind of opportunities they had available that
undergraduates would be well suited to. (9A4)
Some instructors actively contact community partners, form relationships, and invite partners to the classroom in order to identify their needs and give students a better understanding of the community organization. One interviewee mentioned:
I invited the community director to one of my earlier class. And then he
introduced the community center, and how we met and what he was
thinking. So that’s how it was introduced. (2A16)
Community Empowerment
Another aspect of high-performing CBL courses is letting the community speak. By letting the community speak, the instructor can make the course objectives clear.
Well the one – the principal really was the person who formulated the
group that it would be with, and so her goal was that those children
would continue to have dance and to be involved with (school) because,
as first graders, they had been on campus and she saw the positive effects
that that had had, so then her goal was to try to spread that to the other
children who had missed out as first graders on that experience. So when
I came to them my idea was sort of just, "Hey, I've got some
students…?" So she really formulated those goals. And she also – when
she met with my students, she was very clear that she wanted them to see
that they were going to be learning from her children as much as they
were teaching her children, and that they should see these low-income
students as being a population that they could learn from as well
as…(1A16)
Summary
Through our analysis of the interviews with community-based course faculty, we found several commonalities of high-performing CBL courses within and across the four domains. First, the course instructor’s knowledge and expertise are essential factors for success in the placement quality domain. Second, instructors must make the link between course objectives and CBL activities explicit. Third, the majority of high-performing CBL courses have a systematic reflection process. Lastly, school-level commitment to CBL influences the success of these courses.
Project Question Two
Question 2: What are recommendations that can make other CBL courses better?
Conceptual Framework
Higher education institutions have heavily focused on the scholarship of discovery. Ernest Boyer (1990), in his book Scholarship Reconsidered: Priorities of the Professoriate, argued that the definition of scholarship should be broadened beyond the heavy emphasis on the scholarship of discovery to embrace the scholarships of integration, application, and teaching (Braxton, Luckey, and Helland, 2002). By broadening the definition of scholarship, faculty members are able to invest more time in student learning through frequent interaction. Terenzini et al. (1994) note that close relationships and frequent interaction between faculty and students have implications for students’ general intellectual-cognitive development. Also, the greater the class emphasis on collaborative learning and the lower the emphasis on grades, the more likely students were to use the higher-order learning strategies of elaboration, comprehension monitoring, and critical thinking (Terenzini et al., 1994).
Institutionalization is also central to the discussion about CBL. Many institutions use
the Furco Rubric, and regardless of the stage at which consortium members found
themselves, the overall consensus was that community-based learning is, in practice, a
relatively isolated or small-scale practice that is not widely accepted by faculty on their
campuses. Thus, consortium members are continually faced with the need to demonstrate
the validity of CBL to colleagues, even where administrative support is relatively strong.
Method
To address this question, the team analyzed interviews with faculty and institutional
administrators of community-based learning classes that scored one half of a standard
deviation above the mean. Of particular interest to this analysis are the findings from
project question #1, the broadened definition of scholarship, and the
institutionalization of CBL. The team also focused on the roles of institutions and
faculty members in high-performing CBL courses.
Findings for Project Question 2: What are recommendations that can make other CBL
courses better?
Through its findings for question #1, the capstone project team identified several
potential “within domain” and “across domain” factors that lead to high-performing
CBL courses at the institutional and faculty levels. Many quotations have been taken
from the data and presented here to support the core themes. The quotes are single
spaced, indented, and italicized. The capstone project team created a pseudonym for
each faculty member/instructor that indicates their faculty rank (see Appendix L).
The capstone project team’s analysis resulted in findings “across all domains” as well as
findings “within domains.” The following are the “across all domains” findings, which
were identified at both the institutional level and the faculty level.
Institutional Level
Legitimizing of Scholarship other than Discovery
Boyer (1990) proposes that the definition of scholarship be broadened beyond the
predominant emphasis on the scholarship of discovery to encompass the scholarship of
integration, the scholarship of application, and the scholarship of teaching (Braxton,
Luckey, & Helland, 2002, p. 1). Boyer (1990) argues that the current faculty promotion
system fails to reflect the range of professional activities that faculty members perform,
since it counts research, or the scholarship of discovery, as the most legitimate and
preferred type of scholarship (Braxton, Luckey, & Helland, 2002, p. 11). When it comes
to community-based learning, the overall consensus was that it is in practice a relatively
isolated or small-scale practice that is not widely accepted by faculty on their campuses.
This means that faculty members’ efforts on community-based learning classes are not
well recognized by the promotion system. Considering that the very mission of liberal
arts colleges and universities is to educate sound citizens, this reality is problematic.
For this reason, liberal arts colleges and universities are strongly encouraged to broaden
the definition of scholarship to include integration, application, and teaching, so that
faculty members can justify spending more time and energy on community-based
learning courses. One interviewee described her experience:
I have been very active, and when I presented myself for promotion to
Associate before getting tenured, I based my self-evaluation on
Boyer's Four Domains and tried to show how teaching incorporated
service and that my work – it was service in all areas, I guess you can
say. And so just to make a long story short, when our faculty review
committee read my self-evaluation and the work that I'd done, they
said, "Hey, not only should she get promoted" which has happened to
non-tenured people at (school) in the past –that she – "her position
should also be tenure track and she should get tenure." So then this
year I was just awarded tenure.(1A2)
At another institution, however, the situation was entirely different. The instructor felt
isolated and left alone, and the institution had not adopted a definition of scholarship
broader than discovery. The interviewee noted:
My department – the history department is – to be honest could care
less about, you know, community-based learning. So it’s not really
anything that they’re – if I want to teach it that’s fine, but it’s not
something that they’re particularly supportive of. The reason that we
didn’t continue with the course was because the department needed me
to teach two sections of our American history survey course. And that
was something that the department needed me to do. And it’s like,
“Well, you did your thing for a while and now it’s time to do this other
stuff that we needed.” And it’s certainly not seen as an area of
specialty, if that makes sense. When I talk about what I do in my areas
of research and teaching, and part of this is also I consider myself a
scholar of teaching and learning and my community-based work comes
under that. So this is not just teaching, this is my research. But my
department will probably tell you I’m a historian and my area of
specialty are the dates that I did my dissertation work on, from 1860 to
1920 America. That’s my area of specialty… (3A17)
As the interviewee’s remarks show, at this institution the CBL class was not perceived
as the department’s collective responsibility, reflective of the institution’s mission, but
rather as the respective faculty member’s “thing.” This highlights the absence of a
connection between community-based learning and the mission of liberal arts colleges.
Every interviewee viewed community-based learning as an exemplar of their
institutional mission.
The College is very clear in terms of mission, that we are a small,
residential liberal arts college. We are not primarily doing pre-professional training. We are exposing students broadly to a
spectrum of disciplines which teach them critical thinking skills and
problem solving. . . We want our students to be adaptive and careful,
critical, appreciative thinkers so that whatever they’ve learned here
they can use in whatever they do next. . . So community-based
learning is seen as a really promising way of helping student take
those theories that they’re learning, those critical reading skills for
instance and learn to apply them to a context. (10TT11)
Yet most indicated that community-based learning had not been embraced or widely
received as such, and lacked credibility.
. . .a traditionalist in terms of curriculum so xxx has been very, very
slow to accept community-based learning as a valid academic
pedagogy. . . it’s a perception of what our job is as academics and
professors and I think there’s a very different perspective and, again,
an old-school view of what’s rigorous and what’s important. (5T12)
Few can deny that earning tenure is a top priority for most faculty members, and these
two quotations illustrate how an institution’s evaluation system can either motivate
faculty members to work on CBL courses or discourage an instructor from adopting
them. The faculty interviewees persevere regardless, whether because of their passion
for the subject matter or their own personal experience. By sending a signal that the
institution genuinely recognizes the importance of CBL courses, however, a liberal arts
college can motivate more faculty members to participate eagerly in teaching and
creating CBL courses.
Aligning Institutional Mission with Organizational Practice: Centralizing community-based
learning to support de-centralized activity
The findings from this study indicate that faculty of community-based learning classes
went above and beyond in giving their time and effort to their courses. All spoke of the
additional commitment required by community-based learning courses in comparison
to other courses. For example, logistical limitations constrain student involvement
during the planning stage because of the structure of the academic calendar: faculty
members prepare CBL courses between academic terms, while students register for the
course during the registration period at the end of one semester or the beginning of the
next. This affects all domains of practice.
As Eyler and Giles (1999) note, successful CBL classes make students feel that they are
making a positive contribution and are active participants. This is part of placement
quality, and a significant predictor of tolerance over the course of the semester and of
reduced stereotyping, with students believing that the people they worked with in the
community were “like me” (p. 33). One possible option is to have students apply to a
CBL class before the semester break begins, so that instructors can arrange either online
or offline participation in the planning process. Facebook, Google, or SkyDrive might
be used for collecting data or discussing issues related to course planning.
Aligning Institutional Mission with Organizational Practice: Institution-wide curriculum
Another important factor in students’ proactive participation and in successful CBL
courses is an institution-level roadmap of all the CBL courses offered to students. Some
courses may ask students for a multi-semester commitment, and the college should
show students the big picture of its overall CBL offerings. For example, one institution
required community-based learning classes under the name of a Learning Community.
All students are required to take the learning community. And then
most of them take them in their sophomore year. That’s when they’re
sort of strongly encouraged to take them. And the learning community
is made up of three courses, and they’re team-taught. (3A3)
Creating an administrative office focused on community-based learning
Such an office can take charge of educating CBL faculty members, including guiding
the reflection process during CBL courses.
We have the office of Community-Based Learning…He (director) set a
workshop, kind of summer institute for I think it was two weeks…So he
brought faculty members who are interested in creating new course, and
he identified several community partners that have some possibility to
work with the faculty members…The faculty members are supposed to
bring our initial idea, bring syllabus but it was very how should I say
input from the community partner was very emphasized. And we used
their ideas in tandem in creating our syllabus…I think I went from there.
(2A3)
From this interview, the capstone project team could see that the instructor was
motivated by the initiative of the Office of Community-Based Learning. The initiative
also gave her a starting point for her CBL class; as she put it, “I think I went from
there.” By setting up the Office of Community-Based Learning and offering workshops,
this institution sent a clear signal that it would stand behind faculty members who
adopt community-based learning classes.
Making Additional Resources and Support Available
Our interviews reveal that a school’s commitment to CBL is another critical factor in
producing high-performing CBL courses. Many CBL instructors mentioned that their
school’s mission and commitment had influenced their CBL course performance. One
instructor mentioned:
I think (school’s) commitment is growing and so they recognize that
this is the way higher ed is moving…, it's definitely growing and
becoming a much stronger component at (school). (1A24)
Another interviewee said:
I think people see it as part of the mission of the college and built into its
service and social justice initiatives that come out of Catholic teachings.
(4A7)
Institutional commitment to CBL leads to the allocation of extra resources and support.
Hosting CBL workshops is another important way to motivate instructors to initiate
CBL courses. The interview below shows the importance of offering CBL workshops to
faculty members. Schools regularly open new courses or develop existing ones, whether
or not those courses involve CBL; at that point, active institutional encouragement such
as a CBL workshop can lead faculty members to adopt the approach.
I was in charge of creating a new course. So because my background is
in science, I was more interested in science. And while I was thinking
about the course, the opportunity for CBL – community-based learning
workshop was announced in college level. So I thought that would be a
great opportunity to start with. (2A2)
Faculty Level
Frequent and Effective Interaction with Students
Frequent interaction between faculty members and students is one of the most powerful
factors in positive college outcomes for students (Pascarella and Terenzini, 1991). Eyler
and Giles (1999) note that students are more likely to report close relationships with
faculty members if there is more and higher-quality reflective discussion in class, and if
there is ample written work. The authors point out that these factors all suggest a high
degree of faculty involvement, and that the link between service and learning would
seem to offer many opportunities for student-faculty interaction (p. 51). This is directly
connected with what the team found through interviews: many successful instructors
fostered this interaction through written reflection, in-class discussion, or both.
Written Reflection
As Eyler and Giles (1999) note, journal writing is a very helpful tool for student
learning: through journaling, students create a permanent record of the service-learning
process. The team’s interview results also support this claim. The team found that
many high-performing CBL classes required written reflections, including journals and
term papers. Interviewees described written reflection as follows:
So I have some formal ways that kind of establish that in terms of students’
weekly reflecting about their experiences in a written way. (14A/15)
Students do a weekly reflection online or in a journal… (14A13)
Well I have them writing these essays and I have them participating on this
message board… (13A8)
My students have assignments for – they wrote what I call Connections
Essays and one of the essays was – they select a course concept and apply it
to the setting and one of the essays is an observations essay where they
describe something that happens at the setting and they are required there
also to use course vocabulary in describing it… (13A5)
So they had to write about – and then we also did some sharing and
reflecting out of that as well… (12A16)
…writing their journals, they’re required, periodically, through the
course… (7A11)
Requiring written reflection may demand extra work from instructors, since they must
comment on every writing assignment, but written reflection is clearly a useful process
for successful CBL courses. For these reasons, instructors are strongly encouraged to
employ regular written reflection to make a CBL class better.
In-Class Discussion
Another important factor in effective interaction is in-class discussion. This is another
form of reflection, and it is the link that ties student experience in the community to
academic learning (Eyler and Giles, 1999, p. 171). The team found that many high-
performing CBL classes adopted in-class discussion as a form of reflection. The
high-performing CBL course instructors noted:
We discuss each week . . . (2A14)
I think we provided oral feedback as well. A lot of the course structured
around discussions. (3A16)
They got verbal feedback during class time as to what they were doing
and what their problems and questions were and concerns and in our
discussions. (1A13)
We have debates periodically, three debates throughout the semester,
where I assign various issues or topics on - that are germane to the drug
court concept, and they have to argue and debate among themselves in
class about the issue.(7A4)
Probably the reflection's key thing and the discussion. . . (6A12)
So we talk about their observations, and some of these students write
very dramatically, and they talk about the smells and the
observations and the lighting. (7A12)
Based upon the team’s findings, in-class discussion is recommended for high-performing
CBL classes. Furthermore, in most cases the high-performing CBL courses had both
written reflection and discussion; only one course had discussion alone. Instructors may
find it better to adopt both written reflection and discussion to make the CBL course
more effective.
Increasing Social Capital through Proactive Network
There is a consensus among service-learning experts about what constitutes good
practice, and this concurs with student surveys (Honnet and Poulsen, 1989; Sigmon,
1979; Owens and Owen, 1979). As we discussed previously, pre-existing relationships
between faculty members and community partners are important. The interview below
shows the importance of such a relationship: once there is mutual trust between faculty
members and a community partner, it is much easier to build a CBL program.
When I went to talk to the principal about this idea originally, part of that
was because there was a first grade teacher who I had worked with the year
before and some of her students had come up and performed in the same
concert the year before, so the principal was very excited to keep something
going because these children didn't have any more dance now that they were
second graders and out of that one teacher's first grade classroom… (1A7)
Through ongoing association with a community or neighborhood, one can not only
create social capital (Eyler and Giles, 1999) but also figure out exactly what the
community partners want. With this knowledge of the community, an instructor can
place students in an appropriate community organization, which enhances students’
participation and satisfaction.
The following are the “within domain” findings, identified within the placement
quality, application, reflection and feedback, and community voice domains of practice.
Placement Quality
The faculty of “high-performing” courses taught CBL courses that drew on their subject
matter expertise and knowledge, and this familiarity translated into high-quality
placements. Additionally, they each engaged in some form of pre-selection of students
for their CBL course; for example, some conducted a pre-interview of sorts before
students were allowed to enroll.
Application
Despite the challenge it presented, faculty of “high-performing” courses self-reported
that they spent a substantial amount of time preparing and planning the CBL course by
necessity. This seemingly resulted in course objectives that were explicitly related to
students’ out-of-class activity. The capstone project team also found that these faculty
understood their community partners’ goals and deliberately designed their course
activities to respond to those goals.
Reflection and Feedback
Frequent interaction with students was pervasive among the faculty of
“high-performing” courses, both in the classroom and outside of it. Interestingly, the
interaction with students was multi-purposed, serving both reflection and feedback.
Community Voice
A strong relationship with the community partner was of key significance to the faculty
of “high-performing” courses. When the faculty members themselves did not have that
relationship, they referenced the office or person on campus that did and expressed
great appreciation for their assistance. Regardless of whether the relationship with the
community partner was fostered by the faculty member or by someone else, there was
consensus about having an agreement of mutual expectations with community
partners. For example, a couple of faculty had a written memorandum of
understanding with their community partner.
Summary
The capstone project team’s findings suggest that two perspectives are instrumental in
“high-performing” community-based learning courses. First, at the institutional level,
institutions are strongly encouraged to adopt a broader definition of scholarship. By
reflecting faculty members’ CBL efforts in promotion evaluations, an institution can
send a strong signal that it recognizes the importance of CBL courses. Institutions are
also encouraged to provide extra resources and support for CBL courses and a flexible
academic calendar. Second, at the faculty level, frequent and effective interactions with
students are strongly recommended. Given the mission of liberal arts colleges and
universities, offering students community-engaged activities is fundamental. Faculty
members should also be proactive in networking so that they can offer high-quality
placements to students.
3
Project Question Three
Project Question 3: What is an appropriate process and protocol for assessing
community-based learning courses on an ongoing basis?
Conceptual Framework
“A growing number of post-secondary institutions in the United States have
become actively engaged in encouraging undergraduates to involve themselves in some
form of voluntary service (Astin, 1996; Astin, Sax & Avalos, 1999; Eyler and Giles, 1997;
Eyler, Giles & Braxton, 1995, 1996)” (Pascarella and Terenzini, 2005, p. 128). This
increased engagement is complemented by an increasing number of individual and
institutional research efforts focused on service learning in higher education.
“Paralleling growth in postsecondary education’s interest in service experiences has
been an interest in assessing the educational impacts of service” (Pascarella and
Terenzini, 2005, p. 128). As a result, dynamic and creative methods are being employed
at colleges and universities across the country, with assessment serving as the catalyst
for curricular improvement as faculty seek to improve their teaching. This does not
mean, however, that assessment tools are without challenges, including but not limited
to institutionalizing assessment as a core practice of an organization or program.
The British scientist Lord Kelvin is often credited with the remark, “If you cannot
measure it, you cannot improve it.” The starting point, however, is not the
measurements; rather, it is the statement of strategy.
Method
To address this question, the capstone project team analyzed the interviews of faculty
members, students and community partners of community-based learning classes that
scored one half of a standard deviation above the mean. Of particular interest to this
analysis is the examination of the Scorecard as an assessment tool of effective practices
explored in project questions #1 and #2.
Findings
Six themes emerged from the data analysis: 1) Light Bulb Moments; 2) What Exactly
Do the Results Mean?; 3) How Are Scorecard Results Going to Be Used?; 4) Be Mindful
of the Community; 5) Administration and Execution of the Scorecard Needs to Be
Standardized; and 6) A Necessary Evil. Many quotations have been taken from the
data and presented here to support the core themes. The quotes are single spaced,
indented, and italicized. The faculty letter pseudonym is indicated after each quotation.
Light Bulb Moments
Faculty participants in this study commented frequently on their appreciation of the
scorecard as a way to think intentionally and deliberately about what they were doing,
or not doing, in their classrooms. Rarely did a faculty member share that the scorecard
provided new information; rather, they commented that it served as a reminder and
guide.
I mean it’s been extremely beneficial, absolutely. It’s helpful on the
front end about how you think about how I have thought about
shaping a course to meet those criteria. So know that there’s going to
be this assessment stuff, it really does help shape the way you’re
thinking about a course, rather than, ‘Oh, my gosh. I gotta get this
together and it’s a narrow sense. Whatever – ‘ you think more
broadly about course design because you have a scorecard as a
framework to operate out of. (12A19)
But how it (scorecard) had functioned for myself and other faculty
here, who were teaching, it helped you think more comprehensively
about what we wanted for our course - how we wanted our course.
So that intentionality really helped. (12A19).
I think it was great. I mean the great thing about assessments – and I
think I will agree with the vast majority of the Teagle participants
that the scorecard has strength, but it also has weaknesses. It wasn’t
a cure-all. But I think the great thing about assessments is the type. I
always like to read over what assessments I’m going to be using
because they help me articulate my goals. So you’re, like, reading
over them in the beginning of the semester and you’re like, ‘Did the
professor establish good communication with the community
partner?” And I’m sitting here thinking, ‘Okay, how am I going to
establish good communication with the community partner? How am
I going to build reflection into the course?” So it becomes a tool for me
in some ways to say, ‘Okay, here are the things that my students are
going to be, in some ways, judging this course on. So how can I make
these things explicit in this course?” So I think it’s helpful in that
regard, certainly. (3N23)
This is what you wanna look for when you’re putting together your
syllabus, when you’re creating your course or when you’re working
with an institute or center or whomever you are working with in
putting it together. (5T21)
But I think the scorecard for me articulated some key things I already
was doing or knowing or felt like I knew. (3N24).
. . .it helped me realize what I wasn’t doing that I needed to do. I
kinda knew what it was, it’s just in order to do it you really need to
invest a lot more time than what I had but it was helpful in that way
to clarify for me knowing what the best practices were in the literature
and in the field that I wasn’t hitting all those best practices. (5T19)
Yeah, I think it would be good feedback. For example, for me, I made a
really good connection between what I want to accomplish through
this course and put those projects together. But maybe students
didn’t see it. And then maybe that can be a good feedback from me to
make a better connection when I teach next time. Yeah, so in that
sense it can be helpful. But honestly, I didn’t use it that way at the
time. (2TT24)
Well I think it helped me to see the – almost like the – developmental
stages that, even if a course is not meeting best practices, that it can
still be an introductory level at the service-learning or community-based learning class, and I think it is just also was a good reminder
for me about how I interact with the community partner, how I
continue to reinforce for the students why they’re doing what they’re
doing – you know that kind of thing. In some ways, in some of my
classes I’ve thought, ‘Oh, this is not going to score very high at all on
the scorecard,’ but that reminder of what best practices are useful.
(1NT26)
What Exactly Do The Results Mean?
While the scorecard served as a guide for action, many faculty members directly
commented on or alluded to a desire for it, as an assessment tool, to provide more
easily interpretable information about how they were doing. They want to understand
what the scorecard feedback means and to have it translated into what they should do
in practice. This would prevent the scorecard from being rendered useless.
So the scorecard idea was a nice one from the beginning because it
looked like it might be neat and useful. But in the end I just don’t
know really how useful it is. (4TT12)
How Are The Results Going To Be Used?
Faculty expressed some reticence about the use of institutional assessment
tool results. There was almost complete agreement among interviewees that
the scorecard should not be used as an evaluation tool for faculty.
Now I think the scorecard is good, and we should still have as part of
our conversation here sorta the move forward – but there’s a place
for it. At the same time, there’s gotta be – it cannot be in and of itself,
an instrument of assessing tenure and promotion for example because
if that’s the case, then faculty won’t do it. (12A21).
I think there’s a danger to use these scorecards for an evaluation of
faculty teaching effectiveness because it really is the ideal and no one’s
gonna hit the best on all those levels so it can be used in a punitive
way especially for teaching evaluation of non-tenured faculty or even
for merit increases or whatever so I would shy away a little bit from
that just knowing that on some campuses, colleges and universities it
would be used as a tool not for benefiting teaching but rather as a
punitive tool.(5T21).
And I think it is up to the individual instructor to see how they might
improve their own work. I don’t think it’s useful as an evaluation
tool for instructor, but I think for them to recognize, you know,
‘Okay, well maybe this is just a first-time introductory class, but I
could do this or that better. So I think it’s more a personal use.
(1NT27)
Be Mindful of the Community
Faculty welcomed a tool that provided feedback from both students and community
partners about their performance in the community-based classroom. However, they
found it equally important that community partners and their particular circumstances
be fully considered in both the development and execution of the scorecard or any
other assessment tool.
It was just – the card itself was devised or imagined in this world
where students are sitting down in desks and filling it out or have
computers and are filling it out. Not in the world where they- you
know, afterschool, non-profit, where students come and go randomly.
Not everyone has access to computers. Not everyone is perhaps
invested as much as the students – our students who are getting a
grade are. (3A26).
Because I think we were in the field so much, getting the students to
sit down and do it (complete the scorecard) was bit of a thing. And
especially from the community end, it just wasn’t super convenient.
(3N25).
We are exploring how to involve our community partners more in
what we do, and so it’s (scorecard) very valuable (to gather
information from community partners) I think we have to be careful
about abusing their time. So a lot that’s on the scorecard might not
apply to their particular experience or work. So I think again, it’s a
good reminder that we need to involve them, but we also have to be
respectful of their time. (1NT26)
I love that the scorecard goes out to community members. I have
always complained about this. I think it’s really, really important to
have community voice. I do think however, just because we have the
scorecard and we can give to community members, to me we have
access problems. (3N25)
Administration and Execution of the Scorecard Needs to Be Standardized
The execution and administration of the scorecard was a recurring area the
faculty identified as an opportunity for improvement. Closely related to the
fourth theme, faculty shared the challenge of having the same scorecard for
faculty and students, with questions not relevant to each perspective. Many
participants also were not clear on the who, what, when, where, and how of
the scorecard’s execution and administration in their classes.
One survey – the same survey for community partners and for faculty
and students, didn’t work. (12A26).
Yeah, I don’t know if this questionnaire was even given to my
community partners or not back in 2010 and I don’t know why – I
mean I know that I filled it out and I don’t know why I had only four
students. I can’t remember what about the administration would
account for the fact that I had only four students to do it. But
something must have gone afoul there because ordinarily my students
would be more cooperative. (13T13)
A Necessary Evil
While there was general praise for the Scorecard, some interviewees viewed
it as a “sign of the times” rather than a fundamental necessity.
I’m not a big fan of all this assessment stuff. I understand why we do
it, measuring outcomes and things, but I feel like we’ve measured a
lot of this stuff to death, and we now have more people doing it, and I
guess it’s important to figure out who’s doing it well and who isn’t
and what that means. But I don’t know that the scorecards are the
best way to get at that. (4TT12).
But this is back when Bill Readings wrote The University in Ruins in
the mid-90s, he argued that the problem with excellence was that
nobody knew what it was and that we come up with measurements
because we think if we have numbers, then we’re proving that it’s
excellent, but we don’t know what the excellence is anyway except
that we have high numbers. So I think we can make these tools more
effective to prove to people who need that proof that it does what we
say it does. But it’s kind of an inherent tautology in developing
these measurement scales to prove what good outcomes are when
they’re developed to prove what good outcomes are. We know what
the out – we get the numbers we want ultimately, but do we really
know what the impact is? No because this is one class and one
student, and what does a student think six months later when
suddenly something kind of makes sense to them? What does it mean
if they continue to work with an organization for four years? That’s
not measured in the scorecard. So there’s so many ways to try to
understand the impact of these courses that we teach and these
experiences that we generate, that the idea that we would somehow
capture them in a scorecard to me seems limited at best. (4TT1213)
Summary
The six themes that emerged from the team’s study indicate that the faculty participants
are receptive to assessment and that the Scorecard should be recognized as a starting
effort. However, the participants shared some concerns that warrant consideration for
future iterations of scorecard use. These themes were instrumental in developing a set of
recommendations and a practical model to identify effective practices for faculty teaching
community-based learning courses.
Limitations
Limitations of Data Analysis
Despite the project team’s efforts to anticipate and mitigate weaknesses with our project
design and methodology, we fully acknowledge the presence of limitations. These
limitations include 1) limited generalizability, 2) limited data integrity, 3) the absence of
weighting, 4) the time gap between the course being taught and the faculty being
interviewed, and 5) absence of community partner and student interviewees. The
capstone project team elaborates on these limitations below.
Limited Generalizability
One limitation is the limited generalizability of the results of this study to other faculty
members or institutions that have community-based learning courses on their campuses.
The study was based on a select group of faculty members who are Teagle-funded
Consortium participants at liberal arts institutions.
Limited Data Integrity
The raw data in the consortium database resulted from varied procedures of
administering the scorecard, collecting scorecard responses, and entry (if applicable) of
responses into the system. This variance resulted in a lack of consistency and
incomplete data. Almost 50% of the courses that administered the scorecard were
eliminated from the study because of incomplete data. Also, the analysis did not
account for the type of respondent, e.g., student, faculty, or community partner;
responses were grouped in aggregate. Both of these factors greatly impact the study
because the integrity of the data becomes questionable.
Due to limitations with manipulating the raw data in the consortium database, the
methodology to identify high-performing classes included the elimination of the I do not
have enough information to comment responses. While this action was accounted for in the
conversion to a 4-point scale, collapsing a 5-point Likert scale to a 4-point scale can
inflate results. For example, the elimination of the I do not have enough information to
comment responses reduces the size of the sample and potentially increases the share of
responses in one particular category.
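This effect can be sketched numerically. The response counts below are hypothetical, not data from the study; the sketch only illustrates how dropping the I do not have enough information to comment option shrinks the sample and inflates each remaining category’s share:

```python
from collections import Counter

# Hypothetical responses to one scorecard item; "NI" stands for
# "I do not have enough information to comment."
responses = ["Agree", "Agree", "Strongly Agree", "Disagree", "NI", "NI"]

def category_shares(resps):
    """Return each category's share of the responses given."""
    counts = Counter(resps)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

before = category_shares(responses)                           # "Agree": 2/6
after = category_shares([r for r in responses if r != "NI"])  # "Agree": 2/4
```

Dropping the two “NI” responses raises the “Agree” share from roughly 0.33 to 0.50 without any change in how respondents actually answered.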
Absence of Weighting
The average scores for questions were not weighted to account for differences in course
size or variance in the percentage of respondents in relation to the number of potential
respondents. For example, two courses may have the same average score on a question
and one course may have only 4 respondents while the other course has 20 respondents.
Due to averaging, in this scenario, the course with only 4 respondents would have an
advantage.
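The difference weighting makes can be seen in a small sketch. The per-course averages and respondent counts below are hypothetical, chosen only to show how an unweighted mean of course averages diverges from a respondent-weighted mean:

```python
# Hypothetical per-course average scores and respondent counts.
courses = [
    {"avg": 3.8, "n": 4},   # small course: 4 respondents
    {"avg": 3.2, "n": 20},  # large course: 20 respondents
]

# Unweighted mean treats both courses equally, so the 4-respondent
# course pulls the result up as much as the 20-respondent course does.
unweighted = sum(c["avg"] for c in courses) / len(courses)   # 3.5

# Respondent-weighted mean counts each course in proportion to
# how many people actually answered.
weighted = (sum(c["avg"] * c["n"] for c in courses)
            / sum(c["n"] for c in courses))                  # 3.3
```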
Potential Time Lag between Course Delivery and Project Interview
The data in Survey Monkey was collected over a period of six academic semesters. As a
result, it is possible that a faculty member who was interviewed may have taught the
course five semesters earlier. This can mean selective memory at best, and lost memory at
worst, on the part of the faculty member about what happened in their courses.
Faculty Interviews Only
The project team only conducted interviews with faculty. The study would have been
richer if interviews were conducted with students and community partners as well.
Recommendations
For institutional (liberal arts) policymakers:
Recommendation 1: Recognize and acknowledge the pedagogy and scholarship of community-based learning and the connection to the mission of liberal arts institutions.
Overwhelmingly, faculty of high performing courses indicated that a relationship
existed between their institutional mission and the community-based learning approach
in the classroom. However, the capstone team also found that, despite the
aforementioned relationship, there is an opportunity for community-based learning to
be explicitly acknowledged as a credible pedagogy. This could be achieved through
institutional policies, procedures, programs, and practices that support community-based learning.
Recommendation 2: Invest and Provide Resources and Support to Faculty of Community-Based
Courses
The capstone team’s findings indicated that faculty of “high-performing” community-based learning courses required additional resources, primarily time.
Specifically, the findings indicated that teaching these courses required more time for
both planning and execution. To support the work of CBL faculty, institutions should
invest and provide the resources needed for CBL courses.
This could be in the form of direct support for CBL faculty by way of course release
time or indirect support through an investment in a centralized community-based
learning office that coordinates and manages these courses.
Recommendation 3: Align Institutional Mission and Organizational Practice through
Institution-wide Curriculum and Creation of an Administrative Office Focused on Community-Based Learning.
For community-based learning faculty members:
Recommendation 4: Have Frequent, Deliberate and Intentional Interaction with Students and
Foster a Classroom Environment of Student as Both Learner and Teacher
For community-based learning to be effective, faculty should incorporate
opportunities for faculty and student interaction throughout the entire
course.
Recommendation 5: Foster relationships with Community Partners
In addition to interaction between students and faculty, the project findings highlighted
the importance of a relationship with community partners. The findings suggest that
faculty of “high-performing” classes are keenly aware of the impact on the course when
community relationships are fostered, as well as when they are not.
Recommendation 6: Focus on Your Passion
As stated in a previous recommendation, the project team found that teaching
community-based learning requires additional time and work. However, this was not a
deterrent or barrier for the faculty of “high-performing” courses. The findings suggest
that a reason for this is the faculty member’s passion for the subject matter and/or the
community endeavor.
Recommendation 7: Develop a Network with Other Community-Based Learning Faculty and
Administrators
In addition to bringing their passion into the classroom, CBL faculty of “high-performing”
courses identified the support gained from their network of other CBL faculty. The
findings suggest that this is especially important for faculty at institutions where CBL is
perceived as less than credible.
For research funders:
Recommendation 8: Fund Longitudinal Research to Measure the Effectiveness of Assessment
The findings of this study suggest that there are opportunities for an extended study to
determine if effective practices in the CBL classroom are bound by time.
Recommendation 9: Fund Research on the Impact of CBL on Community-University
Partnerships and Relationships
With increased attention on community-university partnerships, the nature of
community-based learning courses suggests that they may be instrumental in the
success of such partnerships.
Recommendation 10: Fund Research on the Impact of Rhetoric on, and a Rhetorical Strategy for,
the Acceptance of CBL within the Academy
The findings of this study indicated a concern about the receptivity of certain
terminology as compared to other terminology for community-based learning.
For Teagle Consortium (client):
Recommendation 11: Develop Separate Scorecards for Faculty, Students and Community
Partners.
Findings suggest that the lack of applicability of some of the questions on the scorecard,
especially for community partners, was problematic and may have served as a barrier to
feedback.
Recommendation 12: Create a Standard Protocol for Administering the Teagle Scorecard and
Consider Using Technology.
Recommendation 13: Refine and Improve Data Collection and Entry into, and Management of,
the Teagle Scorecard Database.
Recommendation 14: Explicitly Identify the Teagle Scorecard as a Tool for Improvement, not
Evaluation.
Faculty members expressed concern about the Scorecard being used for evaluation
purposes. As a result, it is recommended that efforts be made to promote it as a tool for
improvement and confirmation of practice.
Recommendation 15: View the Teagle Scorecard as a Tool to Inform Key Performance Indicators
(KPIs) for Effective Practice in CBL Courses (See Figure 14).
The Teagle Scorecard is not a scorecard as the term is traditionally defined. The findings
support this: high-performing CBL instructors were unclear about what the Scorecard
results meant in terms of their classroom practices.
Recommendation 16: Create Teagle Scorecard Informed KPIs and a related KPI Scorecard (See
Appendix M and N).
The capstone project team recommends that KPIs and a related KPI Scorecard be
developed to provide community-based learning faculty with the easily interpretable
information they desire. The capstone team developed KPIs and a KPI Scorecard for the
Consortium informed by the findings from project questions one, two, and three.
Figure 14. Teagle Scorecard Future Use. The figure depicts three stages: (1) the Teagle
“Scorecard”; (2) Teagle Scorecard-based KPIs; (3) a KPI Scorecard.
Recommendation 17: Implement the Project Team’s Research-Based Ongoing Assessment Model
for Faculty of Community-Based Learning Courses (see Figure 15).
The capstone project team developed the Research-Based Ongoing Assessment Model for
Faculty of Community-Based Learning Courses as a sequential, evidence- and research-based
process for assessment (see Figure 15). The model is informed by the findings from
project questions one, two, and three. During the capstone project team’s interviews,
faculty members expressed concerns that they were not clear on the who, what, when,
where, and how of scorecard execution and administration in their classes. For this
reason, a sub-recommendation is that training be provided for faculty on how to execute
the Community-Based Learning Ongoing Assessment Model.
Figure 15. Ongoing Assessment Model for Faculty of Community-Based Learning
Courses. The figure depicts a five-step cycle: (1) Review KPIs; (2) Design Course;
(3) Teach Course; (4) Complete KPI Scorecard; (5) Convert Results to Dashboard.
Step One: Review KPIs
As a precursor to designing or reviewing a community-based learning course, faculty
would review the Community-Based Learning KPIs.
Step Two: Design Course
The KPIs would inform course design and be explicitly embedded in the course design
process.
Step Three: Teach Course
Faculty teach their CBL course.
Step Four: Administer KPI Scorecard
At the end of the course, the KPI scorecards would be administered to students and
community partners.
Step Five: Convert Scorecard Results to Dashboard
The scorecard results would be converted into a dashboard that illustrates the
course’s success.
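As a rough illustration only: the KPI names, averages, and thresholds below are hypothetical, not drawn from the study. The sketch shows one simple way scorecard averages could be mapped to a traffic-light dashboard status.

```python
# Hypothetical KPI averages on the scorecard's 4-point scale.
kpi_averages = {
    "Student-faculty interaction": 3.6,
    "Community partner relationship": 2.9,
    "Reflection practices": 3.2,
}

def status(avg, good=3.5, fair=3.0):
    """Map a 4-point average onto a traffic-light status.

    The cutoffs are invented for illustration; a real dashboard
    would set thresholds with the Consortium's input.
    """
    if avg >= good:
        return "green"
    if avg >= fair:
        return "yellow"
    return "red"

dashboard = {kpi: status(avg) for kpi, avg in kpi_averages.items()}
```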
Conclusion
This study sought to extend the work of the Consortium by providing a systematic
research and evidence-based process for faculty to identify and employ effective
practices in the community-based learning classroom. The goal was to gain an
understanding of the most effective practices in community-based learning courses.
The capstone project team’s qualitative methodology combined interviews and
document observation, and this form of triangulation lent itself to generating a deeper
understanding of community-based learning.
The team’s findings revealed several effective practices of high-performing community-based
learning courses within and across domains of practice. Through the study
questions, the project identified and unpacked these practices, offered related
recommendations, and developed a process and protocol for ongoing examination.
With full acknowledgment of the identified limitations, it is our intent that this
approach to identify effective practices for faculty in the community-based learning
classroom will serve both a practical and a strategic purpose. This study demonstrates
the value of faculty-led assessment and provides an example of how such assessment can
be done. Additionally, given the increased attention on, and presence of, community-based
learning courses on college campuses, an understanding of the most effective practices for
faculty is essential. This study provides a foundation for future studies to build upon in
exploring factors of effective practice.
References
Astin, A. W., & Sax, L. J. (1998). How undergraduates are affected by service
participation. Journal of College Student Development, 39(3), 251-263.
Astin, A. W., Sax, L. J., & Avalos, J. (1999). Long-term effects of volunteerism during
the undergraduate years. Review of Higher Education, 22(2), 187-202.
Bowen, G., & Kiser, P. (2009). Promoting innovative pedagogy and engagement through
service learning faculty fellows program. Journal of Higher Education Outreach and
Engagement, 13(1), 27-43.
Bringle, R. G., & Hatcher, J. A. (2000). Institutionalization of service-learning in higher
education. Journal of Higher Education, 71(3), 273-290.
Bringle, R. G., Phillips, M. A., & Hudson, M. (2004). The measure of service learning (2nd
ed.). Washington, DC: American Psychological Association.
Butin, D. W. (2010). Service-learning in theory and practice: The future of community
engagement in higher education. New York, NY: Palgrave Macmillan.
Butin, D. W., & Seider, S. (Eds.). (2012). The engaged campus. New York, NY: Palgrave
Macmillan.
Campus Compact. (2000). Highlights and trends in student service and service learning:
Statistics from the 1999 member and faculty survey. Providence, RI: Author.
Dewey, J. (1938). Experience and education. New York, NY: Touchstone.
Eyler, J. S., & Giles, D. E. (1999). Where’s the learning in service-learning? San Francisco,
CA: Jossey-Bass.
Eyler, J. S., Giles, D. E., & Braxton, J. (1997). The impact of service-learning on college
students. Michigan Journal of Community Service Learning, 4, 5-15.
Giles, D. E., & Eyler, J. S. (1994). The impact of a college community service learning
laboratory on students’ personal, social and cognitive outcomes. Journal of
Adolescence, 17, 327-339.
Gray, M. K., Ondaatje, E. H., Fricker, R., Geschwind, S., Goldman, C. A., Kaganoff, T., &
Klein, S. P. (1998). Coupling services and learning in higher education: The final report
of the evaluation of the Learn and Serve America, Higher Education Program. Santa
Monica, CA: RAND Corporation.
Jacoby, B. (1996). Service-learning in today’s higher education. In Service-learning in
higher education: Concepts and practices. San Francisco, CA: Jossey-Bass.
Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action.
Administrative Science Quarterly, 24, 602-611.
Langseth, M., Plater, W. M., & Dillon, S. (Eds.). (2004). An academic administrator’s guide
to civic engagement and service-learning. Bolton, MA: Anker.
MacKay, V., & Rozee, P. (2004). Characteristics of faculty who adopt community service
learning pedagogy. Michigan Journal of Community Service Learning, 11(1), 21-33.
Mayhew, M., & Fernandez, S. (2007). Pedagogical practices that contribute to social
justice outcomes. The Review of Higher Education, 31(1), 55-80.
Miles, M. B., & Huberman, A. M. (1984). Qualitative data analysis: A source book of new
methods. Beverly Hills, CA: Sage.
O’Grady, C. (2000). Integrating service learning and multicultural education in colleges and
universities. Mahwah, NJ: Lawrence Erlbaum Associates.
Pascarella, E., & Terenzini, P. (2005). How college affects students: Vol. 2. A third decade of
research. San Francisco, CA: Jossey-Bass.
Patton, M. Q. (1990). Qualitative research and evaluation (3rd ed.). Newbury Park, CA:
Sage.
Rhodes College (2011). Rhodes College 2010 Annual Report to Teagle Foundation.
Memphis, TN: Dr. Suzanne Bonefas.
Saltmarsh, J., & Hartley, M. (Eds.). (2011). To serve a larger purpose. Philadelphia, PA:
Temple University Press.
Sandmann, L., Kiely, R., & Greiner, R. (2009). Program planning: The neglected
dimension of service-learning. Michigan Journal of Community Service Learning,
15(2), 17-33.
Silverman, D., & Marvasti, A. (2008). Doing qualitative research: A comprehensive guide.
Thousand Oaks, CA: Sage.
Southern Regional Education Board. (1969). Atlanta service-learning conference report.
Atlanta, GA: Author.
Spradley, J. P. (1979). The ethnographic interview. New York, NY: Holt Rinehart and
Winston.
Stanton, T., Giles, D., & Cruz, N. (1999). Service-learning: A movement’s pioneers reflect on
its origins, practice, and future. San Francisco, CA: Jossey-Bass.
Taylor-Powell, E., & Renner, M. (2003). Analyzing qualitative data. Madison: Board of
Regents of the University of Wisconsin System.
The RAND Corporation. (1998). Coupling services and learning in higher education: The
final report of the evaluation of the Learn and Serve, Higher Education Program. Santa
Monica, CA: Author.
Van Maanen, J. (1979). The fact of fiction in organizational ethnography. Administrative
Science Quarterly, 24, 539-550.
Van Maanen, J. (1988). Tales of the field: On writing ethnography. Chicago, IL: University
of Chicago Press.
Vogelgesang, L. J., & Astin, A. W. (2000). Comparing the effects of service-learning and
community service. Michigan Journal of Community Service Learning, 7, 25-34.
Zlotkowski, E. (1998). Successful service-learning programs: New models of excellence in
higher education. Bolton, MA: Anker.
Appendices
Appendix A: Teagle-funded Consortium Member Institutions
Allegheny College
Allegheny’s undergraduate residential education prepares young adults for successful,
meaningful lives by promoting students’ intellectual, moral, and social development
and encouraging personal and civic responsibility. Allegheny’s faculty and staff
combine high academic standards and a commitment to the exchange of knowledge
with a supportive approach to learning. Graduates are equipped to think critically and
creatively, write clearly, speak persuasively, and meet challenges in a diverse,
interconnected world.
Allegheny students and employees are committed to creating an inclusive, respectful
and safe residential learning community that will actively confront and challenge
racism, sexism, heterosexism, religious bigotry, and other forms of harassment and
discrimination. We encourage individual growth by promoting a free exchange of ideas
in a setting that values diversity, trust and equality. So that the right of all to participate
in a shared learning experience is upheld, Allegheny affirms its commitment to the
principles of freedom of speech and inquiry, while at the same time fostering
responsibility and accountability in the exercise of these freedoms.
Franklin and Marshall College
Franklin & Marshall was established in 1787 with a gift of 200 British pounds from
Benjamin Franklin, and is located in historic Lancaster, a dynamic city with a thriving
arts scene. The College enrolls 2,324 students. The average class size is 19 students, and
the student-faculty ratio is 9:1. Our students receive more than $500,000 in research
grants every year. At Franklin & Marshall, we emphasize the life of the mind and
provide opportunities for learning while doing.
All students are lifelong members of a College House, five distinct hubs of academic,
extracurricular and social engagement in a residential setting. Guided by faculty dons
and administrative prefects, students govern their houses, develop leadership skills,
and create their own social and intellectual programs.
Students may join one or more of the College’s 115 clubs and organizations, ranging
from anime to Ultimate Frisbee. More than three-quarters of students participate in
community service, and about one-third belongs to one of 12 Greek organizations. Our
scholar-athletes compete in the NCAA Division III Centennial Conference. The College
fields 27 athletic teams—13 for men and 14 for women. Students may study abroad in
any of 200 locations around the world. Each year, one-third of our students goes abroad
or enrolls in a travel course. On campus, 87 percent of students have studied at least one
of the 11 foreign languages we offer. Our students learn by doing. They embrace the
opportunity to work side by side or in small groups with faculty members on research
projects that have real-world applications. And when given the choice of being a
scholar, an athlete, an artist, a leader or a volunteer, they are most apt to choose “all of
the above.”
Hobart and William Smith Colleges
Located on 195 acres in the heart of New York State’s Finger Lakes Region, Hobart and
William Smith are independent liberal arts colleges distinctive for providing highly
individualized educations. Guided by an interdisciplinary curriculum grounded in
exploration and rigor, the Colleges prepare students to think critically. In partnership
with the Geneva and global communities and through robust programs in career
development, study-abroad, service, leadership and athletics, the Colleges foster an
environment that values global citizenship, teamwork, ethics, inclusive excellence, and
cultural competence.
Under the mentorship of faculty, Hobart and William Smith students gain the necessary
clarity to be competitive when seeking employment. They win prestigious fellowships
like the Rhodes, Gates Cambridge, Udall, Fulbright and Goldwater. They gain
admittance to the best graduate programs in the country. They go on to lead lives of
consequence.
The Colleges enjoy a rich heritage based on a two-college system rooted in
interdisciplinary teaching and research. Originally founded as two separate colleges
(Hobart for men in 1822 and William Smith for women in 1908), HWS now operates
under a coordinate college system. All students share the same campus, faculty,
administration and curriculum. Each college maintains its own traditions, deans,
student government and athletic department. Men graduate from Hobart College.
Women graduate from William Smith College.
Ithaca College
Coeducational and nonsectarian, Ithaca is a nationally recognized comprehensive
college of 6,700 students. In the center of the Finger Lakes region of New York State,
Ithaca College's campus is 50 miles north of Binghamton and 60 miles south of
Syracuse. Ithaca, a city of 47,000, is served by US Airways and Northwest, Continental,
and United Airlines, and by Greyhound Bus Lines, Short Line, and other bus
companies.
Founded in 1892 as the Ithaca Conservatory of Music, the College was located in
downtown Ithaca until the 1960s, when the present campus was built on South Hill
overlooking Cayuga Lake.
Undergraduate enrollment is approximately 6,200: 2,700 men and 3,500 women.
Another 470 students are enrolled in graduate programs. Over 70 percent of
undergraduates reside on campus. Nearly every state in the U.S. and 73 other countries
are represented in the student population.
Niagara University
Niagara University was founded in 1856 as the College and Seminary of Our Lady of
Angels, which began with six students and two faculty. The founders of the university,
Vincentian priests, the Most Rev. John Timon, C.M., and Rev. John J. Lynch, C.M.,
purchased two adjoining farms, the Vedder and De Veaux farms, on Monteagle Ridge.
Over the next 25 years, the college and seminary grew and prospered producing
graduates that entered such fields as the priesthood, law and medicine, teaching,
journalism and many others. Indeed, by the spring of 1863, the college had become so
successful that the New York Legislature granted a charter empowering the college and
seminary to award degrees to its graduates.
Twenty-five years after its founding, on August 7, 1883, Grover Cleveland, then
governor of New York, gave permission to the college and seminary to change its name
to Niagara University. The seminary remained a full and vibrant part of the university
community until 1961 when it was moved to Albany, New York. The university has
evolved over its long history into an institution that offers degree programs in the Arts
and Sciences, Business and Teaching, and Hospitality and Tourism.
Throughout its long history, Niagara has remained true to the Vincentian principles of
preparing students for personal and professional success while remaining committed to
the values of its namesake, St. Vincent de Paul, as well as to its Catholic heritage.
Rhodes College
Founded in 1848, Rhodes College provides an outstanding liberal-arts education. The
Rhodes experience combines the best of the classroom and the real world – through
internships, service, research and other opportunities in Memphis and far beyond.
Students learn, play and serve others with a determination to grow personally and to
improve the quality of life within their communities.
The collegiate-gothic campus sits on a 100 acre, wooded site in the heart of historic
Memphis. In this beautiful, supportive environment, our students and faculty comprise
a community unmatched in its dedication to learning and a life of honor both on and off
campus. In fact, for more than a century, Rhodes has placed its Honor System at the
forefront of student life.
St. Mary’s College
St. Mary's University, founded in 1852 by Marianist brothers and priests, is the first
institution of higher learning in San Antonio and the oldest Catholic university in Texas
and the Southwest. Personal attention and powerful academic programs have made St.
Mary's, located on 135 acres northwest of downtown San Antonio, a nationally
recognized liberal arts institution. With a diverse student population of nearly 4,000 of
all faiths and backgrounds, St. Mary's is home to five schools: Humanities and Social
Sciences, Bill Greehey School of Business, Science, Engineering and Technology,
Graduate, and Law.
The University provides a Catholic education experience that evokes academic
excellence while integrating liberal studies, professional preparation and ethical
commitment. St. Mary's 192 full-time faculty members, 94 percent of whom hold
doctoral or terminal degrees in their fields, are committed to student success in and out
of the classroom. St. Mary's has approximately 70 undergraduate and graduate majors
and offers over 120 degree programs, which include two doctoral and two law
programs. The student/faculty ratio of 13-to-1 permits small classes and promotes active
learning.
The Marianists who came to San Antonio in 1852 responded to the call of their superiors
to establish an educational institution to regenerate the people of the city. Through their
work and the work of those who followed them, St. Mary's University has maintained
its reputation as "a noble institution destined to be a great education center of the
Southwest."
St. Mary's serves the various communities of San Antonio, the Southwest, the nation
and the world through the intellectual, spiritual, moral and professional leadership of
its faculty, administration, staff and students.
Stonehill College
Stonehill is a selective Catholic college located near Boston on a beautiful 384-acre
campus in Easton, Massachusetts. With a student-faculty ratio of 13:1, the College
engages over 2,300 students in 80+ rigorous academic programs in the liberal arts,
sciences, and pre-professional fields. The Stonehill community helps students to
develop the knowledge, skills, and character to meet their professional goals and to live
lives of purpose and integrity.
At Stonehill, our ‘one purpose’ is to educate “the whole person so that each Stonehill
graduate thinks, acts, and leads with courage toward creating a more just and
compassionate world.”
Appendix B: Community-Based Learning Scorecard for Students
COMMUNITY-BASED LEARNING SCORECARD FOR STUDENTS
Your College or University:
Course Number and Title:
The following survey (“CBL Scorecard”) is being used at multiple colleges and universities that are part of a
consortium funded by the Teagle Foundation. Our goals are to build a common assessment tool to gauge the
impact of service-learning/community-based learning on student learning and collect data about optimal
practices for service learning at small colleges. The data will also be used for individual course and program
improvement.
Please note that by completing this survey, you are indicating your willingness to have your responses used for
research purposes. Answers will be confidential and only group data will be reported in written summaries of the
findings.
S1. What is your current academic status?
First-Year
Sophomore
Junior
Senior
Grad Student
S2. Approximately how many hours do you spend PER WEEK on community-based learning projects?
None
1-2 hours
3-4 hours
4-5 hours
More than 5 hours
S3. What is your age?
S4. What is/was the community site where you did most of your work for this course or program?
The following questions seek to learn about your perceptions of the experiences you have
encountered during your current community-based learning (CBL) course/program. Please
indicate your level of agreement with each statement.
(Response options: Strongly Disagree | Disagree | Agree | Strongly Agree | I do not have enough information to comment)
Students perform a variety of activities while carrying out their
community-based learning (CBL) project.
There is a clear connection between the specific tasks students
perform in the community and the goals of the course.
CBL projects require the use of course knowledge and skills to
address real problems in the community.
The course entails the application of theories covered in the course
to CBL projects.
Students are required to keep written reflections of their experiences
with their CBL project.
Instructor feedback encourages students to critically reflect on the
observations they have made in their written reflections.
The instructor encourages students to make written reflections that
may express controversial thoughts or observations about their
experiences.
The course provides opportunities for students to reflect on their
expectations before their CBL project begins.
CBL projects are central to the day-to-day discussions and written
work of the course.
CBL projects take place over a sustained period of time.
Students are involved in the planning of their CBL project.
Students work directly with community partners in their CBL project.
Students have important responsibilities in their CBL project.
CBL projects performed as part of this course/program are useful to
the community.
The instructor has a clear understanding of what students are doing
in the community.
In making written reflections, students are encouraged to explore
their own assumptions and/or perceptions about the organization of
society.
Community partners have a clear sense of what CBL projects will
accomplish for them.
Community partners do not view CBL projects as a patronizing
charity.
The goals of CBL projects carefully consider the traditions/culture of
the local community.
The following questions seek to learn about your perceptions of the experiences you have
encountered during your current community-based learning (CBL) course/program. Please
indicate how often these behaviors have occurred during this CBL course.
(Response options: Never | Occasionally | Frequently | Very frequently | I do not have enough information to comment)
The instructor works with students in the community
during their CBL project.
The instructor asks students to identify alternative ways of
viewing issues arising from their CBL experience.
The instructor provides feedback on individual student
reflections.
The instructor uses student reflections to focus class
discussions.
Class discussions focus on the connections between the
subject matter of the course and CBL projects.
Students relate CBL experiences to course readings and
concepts in written reflections.
Community partners provide feedback on students’ work
on the project.
Additional comments (optional)
Appendix C: Community-Based Learning Scorecard for
Faculty/Instructors
COMMUNITY-BASED LEARNING SCORECARD FOR FACULTY/ INSTRUCTORS
Your College or University:
Course Number and Title:
The following survey (“CBL Scorecard”) is being used at multiple colleges and universities that are part of a
consortium funded by the Teagle Foundation. Our goals are to build a common assessment tool to gauge the
impact of service-learning/community-based learning on student learning and to collect data about optimal
practices for service learning at small colleges. The data will also be used for individual course and program
improvement.
Please note that by completing this survey, you are indicating your willingness to have your responses used for
research purposes. Answers will be confidential and only group data will be reported in written summaries of the
findings.
F1. What year did you obtain your highest educational degree?
F2. What is your current academic rank?
Instructor | Asst. Prof. | Assoc. Prof. | Professor | Emeritus/a | Staff Instructor/Program leader
F3. What is your primary teaching discipline?
F4. Does this Community-Based Learning (CBL) course/program have multiple project opportunities? Yes | No
F5. Does this CBL course/program have project opportunities at multiple sites? Yes | No
F6. Does your institution provide funding for CBL projects? Yes | No
F7. Do instructors of courses with CBL projects typically receive release time for the planning and
implementation of their CBL projects? Yes | No
F8. Does your institution conduct evaluations of CBL projects? Yes | No
What is the enrollment of the CBL course or program that you are evaluating?
The following questions seek to learn about your perceptions of the experiences you have
encountered during your current community-based learning (CBL) course/program. Please
indicate your level of agreement with each statement.
(Response options: Strongly Disagree | Disagree | Agree | Strongly Agree | I do not have enough information to comment)
Students perform a variety of activities while carrying out their
community-based learning (CBL) project.
There is a clear connection between the specific tasks students
perform in the community and the goals of the course.
CBL projects require the use of course knowledge and skills to
address real problems in the community.
The course entails the application of theories covered in the course
to CBL projects.
Students are required to keep written reflections of their experiences
with their CBL project.
Instructor feedback encourages students to critically reflect on the
observations they have made in their written reflections.
The instructor encourages students to make written reflections that
may express controversial thoughts or observations about their
experiences.
The course provides opportunities for students to reflect on their
expectations before their CBL project begins.
CBL projects are central to the day-to-day discussions and written
work of the course.
CBL projects take place over a sustained period of time.
Students are involved in the planning of their CBL project.
Students work directly with community partners in their CBL project.
Students have important responsibilities in their CBL project.
CBL projects performed as part of this course/program are useful to
the community.
The instructor has a clear understanding of what students are doing
in the community.
In making written reflections, students are encouraged to explore
their own assumptions and/or perceptions about the organization of
society.
Community partners have a clear sense of what CBL projects will
accomplish for them.
Community partners do not view CBL projects as a patronizing
charity.
The goals of CBL projects carefully consider the traditions/culture of
the local community.
The following questions seek to learn about your perceptions of the experiences you have
encountered during your current community-based learning (CBL) course/program. Please
indicate how often these behaviors have occurred during this CBL course/program.
(Response options: Never | Occasionally | Frequently | Very frequently | I do not have enough information to comment)
The instructor works with students in the community during their CBL project.
The instructor asks students to identify alternative ways of viewing issues arising from their CBL experience.
The instructor provides feedback on individual student reflections.
The instructor uses student reflections to focus class discussions.
Class discussions focus on the connections between the subject matter of the course and CBL projects.
Students relate CBL experiences to course readings and concepts in written reflections.
Community partners provide feedback on students’ work on the project.
Additional comments (optional)
Appendix D: Community-Based Learning Scorecard for Community
Partners
COMMUNITY-BASED LEARNING SCORECARD FOR COMMUNITY PARTNERS
College or University:
Course Number and Title:
The following survey (“CBL Scorecard”) is being used at multiple colleges and universities that are part of a
consortium funded by the Teagle Foundation. Our goals are to build a common assessment tool to gauge the
impact of service-learning/community-based learning on student learning and to collect data about optimal
practices for service learning at small colleges. The data will also be used for individual course and program
improvement.
Please note that by completing this survey, you are indicating your willingness to have your responses used for
research purposes. Answers will be confidential and only group data will be reported in written summaries of the
findings.
Approximately how many hours do you spend PER WEEK supervising students from
Community-Based Learning courses/projects?
None | 1-2 hrs | 3-4 hrs | 4-5 hours | More than 5 hours
Special Note for our Community Partners:
Please note that the following survey is designed to measure student learning in this course
or program. We understand that there will be items in the scorecard about aspects of the
student learning experience which you may not have enough information to answer. We
are very grateful for your time in considering these questions and helping us to improve the
service learning experience, and for your role in making this experience possible.
The following questions seek to learn about your perceptions of the experiences you have
encountered during your current community-based learning (CBL) course/program. Please
indicate your level of agreement with each statement.
(Response options: Strongly Disagree | Disagree | Agree | Strongly Agree | I do not have enough information to comment)
Students perform a variety of activities while carrying out their
community-based learning (CBL) project.
There is a clear connection between the specific tasks students
perform in the community and the goals of the course.
CBL projects require the use of course knowledge and skills to
address real problems in the community.
The course entails the application of theories covered in the course
to CBL projects.
Students are required to keep written reflections of their experiences
with their CBL project.
Instructor feedback encourages students to critically reflect on the
observations they have made in their written reflections.
The instructor encourages students to make written reflections that
may express controversial thoughts or observations about their
experiences.
The course provides opportunities for students to reflect on their
expectations before their CBL project begins.
CBL projects are central to the day-to-day discussions and written
work of the course.
CBL projects take place over a sustained period of time.
Students are involved in the planning of their CBL project.
Students work directly with community partners in their CBL project.
Students have important responsibilities in their CBL project.
CBL projects performed as part of this course/program are useful to
the community.
The instructor has a clear understanding of what students are doing
in the community.
In making written reflections, students are encouraged to explore
their own assumptions and/or perceptions about the organization of
society.
Community partners have a clear sense of what CBL projects will
accomplish for them.
Community partners do not view CBL projects as a patronizing
charity.
The goals of CBL projects carefully consider the traditions/culture of
the local community.
The following questions seek to learn about your perceptions of the experiences you have
encountered during your current community-based learning (CBL) course/program. Please
indicate how often these behaviors have occurred during this CBL course/program.
(Response options: Never | Occasionally | Frequently | Very frequently | I do not have enough information to comment)
The instructor works with students in the community
during their CBL project.
The instructor asks students to identify alternative ways of
viewing issues arising from their CBL experience.
The instructor provides feedback on individual student
reflections.
The instructor uses student reflections to focus class
discussions.
Class discussions focus on the connections between the
subject matter of the course and CBL projects.
Students relate CBL experiences to course readings and
concepts in written reflections.
Community partners provide feedback on students’ work
on the project.
Additional comments (optional)
Appendix E: Qualitative Study Eligible Courses
No. | ID | Class | Year
1 | A1S10 | Allegheny: ER/RS 360: Religion & Ecology | Spring 2010
2 | A2S10 | Allegheny: FS 102 W4: Death | Spring 2010
3 | A3S10 | Allegheny: FS 102: The 21st Century and the Challenge to Education | Spring 2010
4 | A4S10 | Allegheny: FS102: Quilts, Stories & Social Change | Spring 2010
5 | A5S10 | Allegheny: INTDS 160: Introduction to Social Action | Spring 2010
6 | F1S10 | F&M: GOV 425: Human Rights-Human Wrongs | Spring 2010
7 | H2S10 | HWS: EDUC 306: Technology and Disability | Spring 2010
8 | H4S10 | HWS: PSY 370: Topics in Developmental Psychology | Spring 2010
9 | H5S10 | HWS: PHIL 235: Morality and Self-Interest | Spring 2010
10 | H6S10 | HWS: WRRH 100: Writer's Seminar | Spring 2010
11 | H7S10 | HWS: WRRH 322: Adolescent Literature | Spring 2010
12 | I1S10 | Ithaca: WRTG 31700: Proposals and Grant Writing | Spring 2010
13 | I2S10 | Ithaca: ENVS 12100 Environmental Science II: Science and Technology | Spring 2010
14 | I5S10 | Ithaca: ENVS 20200: Topics in Sustainability | Spring 2010
15 | R1S10 | Rhodes: RS 233: Pain, Suffering and Death (Jordan) | Spring 2010
16 | R2S10 | Rhodes: RS 460 (Hotz) | Spring 2010
17 | R3S10 | Rhodes: GEO 214: Environmental Hydrogeology (Houghton) | Spring 2010
18 | R4S10 | Rhodes: Psych 229: Developmental Psychology (Walton) | Spring 2010
19 | A6F10 | Allegheny: Eco 100(1) Intro to Micro: Wealth Poverty and Power | Fall 2010
20 | A7F10 | Allegheny: Educ 220: Social Foundations of Education | Fall 2010
21 | A8F10 | Allegheny: FS 101 | Fall 2010
22 | A9S10 | Allegheny: DMS 201 | Fall 2010
23 | F2F10 | F&M: GOV 472: Citizenship seminar | Fall 2010
24 | I6F10 | Ithaca: WRTG 31700 01: Proposal and Grant Writing | Fall 2010
25 | I9F10 | Ithaca: Course 4 | Fall 2010
26 | A10S11 | Allegheny: ECON 238: Economics of Poverty and Inequality | Spring 2011
27 | A11S11 | Allegheny: INTDS 201/202: Service-learning I/II | Spring 2011
28 | A12S11 | Allegheny: INTDS 560: VESA Capstone Seminar | Spring 2011
29 | A13S11 | Allegheny: RS/ES 360: Religion and Ecology | Spring 2011
30 | F3S11 | F&M: INT371: VITA: Social In/Justice and the Vulnerable in Lancaster | Spring 2011
31 | F4S11 | F&M: PBH303: Problem Solving Courts | Spring 2011
32 | F5S11 | F&M: SOC 472: Sociology of Adolescence | Spring 2011
33 | F6S11 | F&M: SOC384: Urban Education | Spring 2011
34 | H11S11 | HWS: Econ 213: Urban Economics | Spring 2011
35 | H12S11 | HWS: Educ 202: Human Growth and Development | Spring 2011
36 | H13S11 | HWS: Rel 213: Death and Dying | Spring 2011
37 | I9S11 | Ithaca: ENVS 20200: Community Skills for a Sustainable Future | Spring 2011
38 | I10S11 | Ithaca: HIST 27000: History of American Environmental Thought | Spring 2011
39 | I11S11 | Ithaca: WRTG 31700 01: Proposal and Grant Writing | Spring 2011
40 | N9S11 | Niagara: REL 343: Women in Church | Spring 2011
41 | S1S11 | Stonehill: EDU 333: Topics in Education: Energy Playground - Teaching Children Science | Spring 2011
42 | S3S11 | Stonehill: LC 292: Art & Civic Culture in Urban Neighborhoods | Spring 2011
43 | S4S11 | Stonehill: SOC 328: Community Organizing: people, power & change | Spring 2011
44 | A14F11 | Allegheny: VESA 160: Intro to VESA | Fall 2011
45 | H14F11 | HWS: Soc. 100: Intro to Sociology (J. Harris) | Fall 2011
46 | H15F11 | HWS: Econ 122: Economics of Caring (W. Waller) | Fall 2011
47 | H16F11 | HWS: Ed 203: Children with Disabilities (M. Kelly) | Fall 2011
48 | H17F11 | HWS: Soc 290: Soc. of Community (J. Harris) | Fall 2011
Appendix F: Qualitative Study Ineligible Courses
No. | ID | Class | Year
1 | H1S10 | HWS: SOC 465: Senior Seminar: Research Practicum | Spring 2010
2 | H3S10 | HWS: EDUC 407: Special Education Practicum | Spring 2010
3 | H8S10 | HWS: GCIP 401 (Research Project) | Spring 2010
4 | I3S10 | Ithaca: RLST 22200: Spiritual Journeys | Spring 2010
5 | I4S10 | Ithaca: SPCM 14000: Small Group Communication | Spring 2010
6 | N1S10 | Niagara: ACC 224: Intermediate Accounting | Spring 2010
7 | N2S10 | Niagara: PHI 206: Ethics | Spring 2010
8 | N3S10 | Niagara: HIS 390: Introduction to Public History | Spring 2010
9 | N4S10 | Niagara: PSY 492: Practicum in Psychology | Spring 2010
10 | H9F10 | HWS: FSEM - You Are Here: Geneva 101 | Fall 2010
11 | H10F10 | HWS: REL 213 - Death and Dying | Fall 2010
12 | I7F10 | Ithaca: ENVS 35000 01: Conservation Biology: Topics in Natural Resources | Fall 2010
13 | I8F10 | Ithaca: ENVS 11200 01: Sustainability Principles and Practice | Fall 2010
14 | I10F10 | Ithaca: Course 6 | Fall 2010
15 | N1F10 | Nazareth: Course 1 | Fall 2010
16 | N2F10 | Nazareth: Course 2 | Fall 2010
17 | F7S11 | F&M: WGS 399A: Gender & Violence | Spring 2011
18 | N5S11 | Niagara: ACC 366: Govt. and Non-Profit Accounting | Spring 2011
19 | N7S11 | Niagara: HIS 374: Modern Africa | Spring 2011
20 | N10S11 | Niagara: SWK 210: Diversity and Social Justice | Spring 2011
21 | R5S11 | Rhodes: Community Development Fellowship | Spring 2011
22 | R6S11 | Rhodes: Crossroads Fellowship | Spring 2011
23 | S2S11 | Stonehill: LC 230: Through the Looking Glass | Spring 2011
24 | A15F11 | Allegheny: VESA 201/202: Service-learning I/II (K. Joshua) | Fall 2011
25 | F8F11 | F&M: GOV425: Human rights-Human Wrongs | Fall 2011
26 | F9F11 | F&M: PBH303: Problem Solving Courts | Fall 2011
27 | N11F11 | Niagara: ACC 366: Govt. and Non-Profit Accounting | Fall 2011
28 | N12F11 | Niagara: CMS 120: Media Writing | Fall 2011
29 | N12F11 | Niagara: HIS 374: Modern Africa | Fall 2011
30 | N14F11 | Niagara: REL 343: Women in Church | Fall 2011
31 | R7F11 | Rhodes: Community Development Fellowship | Fall 2011
32 | R8F11 | Rhodes: Crossroads Fellowship | Fall 2011
33 | S1F11 | St. Mary's: SC 1311, B: Introductory Sociology | Fall 2011
34 | S2F11 | St. Mary's: SC 1311, D: Introductory Sociology | Fall 2011
35 | S3F11 | St. Mary's: SMC 2302, C: Foundations of Practice, Civic Engagement and Social Action | Fall 2011
36 | S4F11 | St. Mary's: SMC 2302, D: Foundations of Practice, Civic Engagement and Social Action | Fall 2011
37 | S5F11 | St. Mary's: SMC 2302, E: Foundations of Practice, Civic Engagement and Social Action | Fall 2011
38 | S5F11 | St. Mary's: SMC 2302, F: Foundations of Practice, Civic Engagement and Social Action | Fall 2011
39 | N6S11 | Niagara: CMS 120: Media Writing | Spring 2011
40 | N8S11 | Niagara: PHI 206: Ethics | Spring 2011
41 | N13F11 | Niagara: PHI 206: Ethics | Fall 2011
42 | N15F11 | Niagara: SWK 210: Diversity and Social Justice | Fall 2011
Appendix G: Scorecard to Domains of Practice Key
Yellow = Placement Quality | Pink = Application | Blue = Reflection/Feedback | Green = Community Voice
(Response options: Strongly Disagree | Disagree | Agree | Strongly Agree | I do not have enough information to comment)
Students perform a variety of activities while carrying out their
community-based learning (CBL) project.
There is a clear connection between the specific tasks students
perform in the community and the goals of the course.
CBL projects require the use of course knowledge and skills to
address real problems in the community.
The course entails the application of theories covered in the
course to CBL projects.
Students are required to keep written reflections of their
experiences with their CBL project.
Instructor feedback encourages students to critically reflect on the
observations they have made in their written reflections.
The instructor encourages students to make written reflections
that may express controversial thoughts or observations about
their experiences.
The course provides opportunities for students to reflect on their
expectations before their CBL project begins.
CBL projects are central to the day-to-day discussions and written
work of the course.
CBL projects take place over a sustained period of time.
Students are involved in the planning of their CBL project.
Students work directly with community partners in their CBL
project.
Students have important responsibilities in their CBL project.
CBL projects performed as part of this course/program are useful
to the community.
The instructor has a clear understanding of what students are
doing in the community.
In making written reflections, students are encouraged to explore
their own assumptions and/or perceptions about the organization
of society.
Community partners have a clear sense of what CBL projects will
accomplish for them.
Community partners do not view CBL projects as a patronizing
charity.
The goals of CBL projects carefully consider the traditions/culture
of the local community.
The following questions seek to learn about your perceptions of the experiences you have encountered
during your current community-based learning (CBL) course/program. Please indicate how often these
behaviors have occurred during this CBL course/program.
(Response options: Never | Occasionally | Frequently | Very frequently | I do not have enough information to comment)
The instructor works with students in the community during their CBL project.
The instructor asks students to identify alternative ways of viewing issues arising from their CBL experience.
The instructor provides feedback on individual student reflections.
The instructor uses student reflections to focus class discussions.
Class discussions focus on the connections between the subject matter of the course and CBL projects.
Students relate CBL experiences to course readings and concepts in written reflections.
Community partners provide feedback on students’ work on the project.
Appendix H: Recruitment Email Script
Hello, my name is [researcher] and I am a doctoral student attending Vanderbilt
University’s Peabody College of Education. I am working on my doctoral
capstone project under the supervision of my advisor Dr. John Braxton to
identify the most effective practices of community-based learning courses.
You have been selected as a potential study participant. If you volunteer to
participate in this study, you will be agreeing to an individual interview
conducted via Skype. Interview topics include your academic or administrative
discipline, your perceptions of community-based learning, and student learning
and assessment. The interview should take approximately one hour of your time
and will be audio recorded with your permission. As a study participant, you
will also be asked to share course- or work-related materials for our study.
I would like to assure you that this study has been reviewed and approved by
the Vanderbilt Institutional Review Board. However, the final decision about
participation in this study is entirely yours. Your participation is entirely
voluntary, and you may skip any questions that you do not want to answer. Any
personally identifiable information collected during the survey will be kept
strictly confidential. I will only use aggregated data in my research study
report.
If you are interested in participating, please complete the attached Interest in
Participation form and email it to [researcher email]. We will follow up with
you accordingly.
Thank you.
Appendix I: Confirmation of Interest Email Script
Thank you for your initial interest in participating in our qualitative study to
identify the best practices for community-based learning. To confirm your
interest, please carefully review, complete, sign, and email the attached
informed consent form to [researcher’s name and email]. After receipt and review
of your consent form, you will be notified via email of your confirmed interview
date and time.
If you have any questions, you may contact [researcher’s name and email].
Thank you.
Appendix J: Confirmation of Interview Email Script
Thank you for submitting an informed consent form and agreeing to participate
in our qualitative study to identify the best practices for community-based
learning. You are scheduled to interview with [researcher name] on [date and
time].
As a reminder, the interview will last approximately 60 minutes and will be
conducted via Skype. Please email your Skype address to [researcher’s email]
and add [researcher’s Skype address] to your contact list.
We look forward to speaking with you soon. If you have any questions, you
may contact [researcher’s name and email].
Appendix K: Qualitative Interview Protocol
Introduction
Thank you for your time. We are conducting research to identify effective practices of
community-based learning courses. Research suggests that placement quality, application,
reflection and feedback, and community voice are the key domains of effective practice. You
have been invited to participate in this study because of your resulting “score” on the Teagle
Scorecard in one or more of these domains.
We have your name as INSERT and INSERT as the name of your institution. Is this correct?
Background Information
 What is your role/title?
 What is your faculty rank?
 How long have you been with the university?
 How long have you served in this current role?
 How long have you been in academia as faculty/administrator?
 What is your academic/administrative discipline?
Description Questions
 Can you tell me about your course?
 How long have you been teaching this course? Other CBL courses?
 Why do you use CBL?
 How did you structure your lesson(s)?
 How did you set up the course so that students learn/think about the community
issue?
 What were your desired learning outcomes for your students through this CBL
experience? What did you expect them to do?
 May I have a copy of your syllabus and other course materials?
Structural Questions
Placement Quality
 How were the activities that students participated in during the CBL course project
determined/identified?
 How many activities did your CBL course offer/require for students?
 What was the duration of CBL projects in your class?
 In what ways were students involved in planning for their CBL project?
 In what ways were students involved in planning activities for their project?
 How often, and in what manner, did students interact with their community partner?
 Describe your level of involvement “in the field” with the students, including frequency.
Application
 What was done to establish an explicit connection between the tasks students
performed in the community and the goals of the course? Were there any challenges
in doing this? If so, how did you overcome them?
 How did you go about creating a link between the course and the community
problem?
 How were the community problems identified and who identified them?
 Describe how the community was selected.
 How were learning objectives developed for the course? How many learning
objectives did the course have?
 Describe the structure of your class, e.g. lecture, location, etc.?
 How often did your class meet?
 How did you monitor/remain abreast of what students were doing in the community?
Reflection/Feedback
 How did you facilitate and foster in-class discussion and reflection?
 What types of opportunities were provided for students to reflect?
 What type of written reflection did you have students do? How often did it have to be
done? When did it have to be done?
 What type of feedback did you provide to students? When did you provide feedback?
 What opportunities were provided for community partners to provide feedback?
How often and when?
Community Voice
 Who articulated the goals of the community to the students? How were the goals
articulated?
 What was the relationship between the university and the CBL project/project site
and/or community partner? Was there a pre-existing relationship?
 In what ways was the relationship with the community fostered after the CBL project?
Scorecard Specific
 How was administering the scorecard beneficial to your teaching?
 How did the results of the scorecard contribute to your understanding of effective
community based learning?
 What do you think about the ability to tailor the scorecard to specific terminology utilized
on your campus, i.e., service-learning or community-based learning?
 What do you think about the ability to gather data from students, faculty, and
community partners?
 What do you think about the categorization of scorecard results according to four
domains?
 How can the scorecard and results be made more useful to instructors of CBL courses?
Contrasting Questions
 How is this teaching different from other courses where you may not have an
experiential community project as part of the academic/learning experience?
Institutional Mission
 What is your impression of your institution’s/department’s commitment to
community-based learning courses in the curriculum? How did this influence your
course?
 How would you describe the relationship between community-based learning and the
university’s mission?
 What administrative area is responsible for coordinating CBL courses?
Resources (financial, time)
 How did access to, or lack of, resources impact your course design?
 Did you receive release time for the planning and implementation of your
community-based learning course? If yes, describe the process for acquiring this
release time and its impact on the course development.
 Was this a pre-existing course? If yes, what type of support did you receive to make
the curricular changes necessary to add a service-learning component to your course,
e.g., release time?
Appendix L: Interviewee Pseudonym Faculty Type Key
A: University Administrator
N: Non-Tenured Faculty
NT: Newly Tenured Faculty
O: Other (Non-University Employee)
T: Tenured
TT: Tenure Track
Appendix M: Teagle Scorecard Based KPIs
Teagle Scorecard Based KPIs
Placement Quality
# of times community partner communicates the community goals and project activities, alone or in conjunction with faculty, to the
students before project begins (tq11)
% of in-class time dedicated to communicating goals and project activities and placement training
# of times faculty joins student at CBL project site (tq20)
# of times faculty and community partner meet before project begins to plan and develop an MOU (tq1)
# of times community partner and students meet during project (tq1, tq12)
# of activity reports faculty required from students (tq1)
Application
% of in-class time spent discussing tasks performed by students in the community (tq2)
% of in-class time dedicated to teaching course material useful to CBL project (tq3, tq4, tq24)
# of times faculty communicates to community partner course goals and objectives before course begins (tq2, tq3)
# of times faculty solicit community partner goals and objectives before course begins (tq15)
% of in-class time allocated to student tasks and activities at project site
Reflection and Feedback
# of in-project written reflection assignments required of students (tq5 and tq6)
% of in-class time dedicated to discussing student written reflections (tq5 and tq6)
# of times faculty uses prompts for in-class discussion and out-of-class writing assignments (tq7, tq16, tq21, tq23, tq25)
# of pre-project written reflection assignments required of students (tq8)
# of post-project written reflection assignments required of students (tq8)
# of post-project discussion and reflection activities between faculty and community partners (tq26)
# of individual faculty meetings with students to discuss reflections (tq22)
Community Voice
# of times faculty meet with community partner to foster a relationship of trust prior to the project and gain an understanding of the
community (tq14, tq17, tq18, tq19)
# of times faculty and community partner meet before the project begins to plan and agree on expectations (tq14, tq17, tq18, tq19)
Appendix N: KPI Scorecard
KPI Scorecard
For each KPI, the scorecard provides Actual, Target, and Status columns; only the Target values are pre-filled, and the Actual and Status columns are left blank for completion.

Placement Quality
# of times community partner communicates the community goals and project activities, alone or in conjunction with faculty, to the students before project begins (tq11); Target: 3
% of in-class time dedicated to communicating goals and project activities and placement training; Target: 5%
# of times faculty joins student at CBL project site (tq20); Target: 1
# of times faculty and community partner meet before project begins to plan and develop an MOU (tq1); Target: 1
# of times community partner and students meet during project (tq1, tq12); Target: 5
# of activity reports faculty required from students (tq1); Target: 10.5

Application
% of in-class time spent discussing tasks performed by students in the community (tq2); Target: 30%
% of in-class time dedicated to teaching course material useful to CBL project (tq3, tq4, tq24); Target: 30%
# of times faculty communicates to community partner course goals and objectives before course begins (tq2, tq3); Target: 1
# of times faculty solicit community partner goals and objectives before course begins (tq15); Target: 1
% of in-class time allocated to student tasks and activities at project site; Target: 5%

Reflection and Feedback
# of in-project written reflection assignments required of students (tq5, tq6); Target: 10.5
% of in-class time dedicated to discussing student written reflections (tq5, tq6); Target: 30%
# of times faculty uses prompts for in-class discussion and out-of-class writing assignments (tq7, tq16, tq21, tq23, tq25); Target: 10.5
# of pre-project written reflection assignments required of students (tq8); Target: 1
# of post-project written reflection assignments required of students (tq8); Target: 1
# of post-project discussion and reflection activities between faculty and community partners (tq26); Target: 1
# of individual faculty meetings with students to discuss reflections (tq22); Target: 2

Community Voice
# of times faculty meet with community partner to foster a relationship of trust prior to the project and gain an understanding of the community (tq14, tq17, tq18, tq19); Target: 3
# of times faculty and community partner meet before the project begins to plan and agree on expectations (tq14, tq17, tq18, tq19); Target: 1
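As a rough illustration of how an instructor might use the scorecard, the sketch below fills in the Actual column for a few KPIs and derives a Status by comparing each actual value to its target. The sample actual values and the "met when actual reaches target" rule are assumptions made for illustration only; the scorecard itself does not prescribe a scoring rule.

```python
# Illustrative sketch: derive a Status for each KPI by comparing Actual to Target.
# The KPIs shown, their sample actual values, and the met-at-or-above-target rule
# are assumptions for this example, not part of the original scorecard.

def kpi_status(actual, target):
    """Return 'Met' if the actual value meets or exceeds the target, else 'Not met'."""
    return "Met" if actual >= target else "Not met"

scorecard = [
    # (KPI description, actual, target) -- percentages expressed as fractions
    ("# of times faculty joins student at CBL project site (tq20)", 2, 1),
    ("# of times faculty and community partner meet before project (tq1)", 0, 1),
    ("% of in-class time discussing student written reflections (tq5, tq6)", 0.25, 0.30),
]

for name, actual, target in scorecard:
    print(f"{name}: actual={actual}, target={target}, status={kpi_status(actual, target)}")
```

An instructor could extend this with a partial-credit status (e.g., "Approaching" when the actual value falls within some fraction of the target), but the simple met/not-met comparison keeps the example minimal.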