Academy for the Assessment of Student Learning
Results Forum Impact Report
Metropolitan Community College
May 1, 2014
We called our Academy Project “Outcomes and Student Learning – Defining, Assessing, and Improving,” and we sought to achieve the following goals:
1. Change our culture of assessment
2. Simplify and clarify our outcomes
3. Begin an inclusive, systematic, cyclic, and sustainable assessment process
4. Create opportunities for assessment efforts to be shared
5. Support use of assessment reports to drive improvement of student learning
1. Change the Culture of Assessment
We modified a Culture of Assessment survey developed by the National Center for
Postsecondary Improvement and administered it to all faculty, staff, administrators, and officers
at the beginning of the fall 2011 semester. We planned to re-administer the survey in fall 2013
to measure our progress in increasing awareness and support of assessment. By the time we re-administered the survey, we had lost a large number of faculty, staff, and administrators to a large retirement incentive as well as normal turnover, so our population was quite different from the original one. We gave the survey anyway and saw very little change from our previous results. After discussing the results with our district assessment committee, we concluded that the survey instrument may have been too long and that not all of the questions were appropriate to all populations. We therefore drilled down into our priorities for improvement and selected three items on which to focus over the next period.
The items we chose are:
1. From your perspective, how important does MCC consider dissemination of student
assessment reports and studies for encouraging student assessment activities?
2. To what extent does MCC use student assessment information in making decisions or
changes in Career and Technical programs?
3. MCC College personnel have a common understanding of the meaning of the term
student assessment.
We have already drafted a definition of student assessment that we will work on refining and
promoting throughout our institution as we develop more consistent terminology:
Assessment is “gathering, interpreting, and acting on information to improve student learning.”
Additionally, we are focusing on communicating how often student assessment reports are being
presented and discussed across the district. There has already been an emphasis on using assessment information in MCC’s CTE programs, but we will place additional emphasis on this going forward.
2. Simplify and Clarify Outcomes
We planned to simplify our General Education Outcomes by including only outcomes that were
truly general and assessable in all general education disciplines. We began by reviewing our
then-current general education outcomes. The outcomes were numerous and complex, and some were not even assessable. There was no process for making sure the outcomes were assessed and no point of accountability for assessing them. At the time, we discussed our then-current
outcomes in our District Steering Committee for Institutional Assessment (DSCIA) meetings and
in related task forces. Next, we looked at best practices from other sources and adapted some
models for circulation. These models were presented to all faculty, staff, and administrators at
our first annual Summer Symposium on Student Learning. In a World Café setting, faculty,
staff, and administrators were encouraged to make comments, suggestions, and revisions to the
models. The modified models were then sent out to all general education faculty for comment
and further revision in an open and anonymous electronic survey. The information from this
survey was gathered and analyzed by the newly formed district assessment committee, the
District Assessment Coordinating Committee (DACC), who shared the results with campus
assessment committees for further comment and revision. In February of 2012, the DACC
finally voted to adopt these outcomes and the rubrics that had been created or adapted along with
them. The new General Education Outcomes are as follows:
1. Critical Thinking - The student will be able to evaluate and apply information
gathered from observation, experience, reflection, reasoning, or communication.
The student will be able to:
1. Evaluate the validity and soundness of scientific, mathematical, logical, or
other formal disciplinary arguments.
2. Analyze and synthesize information from a variety of sources and apply the
results to resolve complex situations and solve problems.
3. Defend conclusions using relevant evidence and scientific, mathematical,
logical, or other formal disciplinary argument(s).
2. Information Literacy – The student will be able to access and apply information
from multiple sources, evaluating the accuracy and credibility of each, with
appropriate documentation. The student will be able to:
1. Extract information from a variety of sources, using appropriate technology to
access and manage the information efficiently.
2. Evaluate information for its currency, relevance, bias, and accuracy.
3. Document sources utilizing the correct format and articulate the legal and
ethical implications of information use.
4. Interpret and apply quantitative and/or qualitative information embedded in
text, real-life situations, tables, or graphs to analyze complex situations and/or
solve quantitative or qualitative problems.
3. Communications – The student will be able to use receptive and productive skills to
interpret, synthesize, and integrate ideas of others and their own to communicate.
Receptive Skills – The student will be able to:
1. Demonstrate understanding of context of material, including cultural
framework, audience, and purpose of communication.
2. Determine the main idea and significant details.
3. Analyze and interpret the parts of a communication to comprehend the
relationship between them and to deepen understanding of the meaning.
Productive Skills – The student will be able to:
4. Use knowledge of audience expectations and context to shape a
communication.
5. Create and develop an effective controlling idea using appropriate details.
6. Organize material coherently into a meaningful whole.
7. Synthesize and integrate ideas of their own with those of others.
During this same time, we divided the general education disciplines into four cohorts and tasked
them in turn with writing three to five discipline-specific outcomes. The first cohort of five
disciplines (Biology, Economics, Engineering, Physics, and Foreign Languages) began their
work in fall 2011. The discipline faculty met three times during the semester to determine their
discipline-level outcomes, and each discipline chose one discipline-level outcome to assess
during the following semester along with one of the general education outcome attributes.
Cohorts two, three, and four met in spring 2012, fall 2012, and spring 2013, respectively, to complete the same tasks. We have better faculty buy-in to the process because faculty participated in creating the outcomes and their assessments. We also have accountability, because discipline faculty are responsible for assessing their own discipline outcomes as well as the general education outcomes.
This process has led to powerful faculty discussions about adopting common terminology, the
relevance and appropriateness of certain assessments to measure specific outcomes, professional
development and mentoring opportunities for adjunct and dual-credit faculty, and curricular
changes that could improve student learning. The process has also provided a means by which
we can compare student performance on the same General Education Outcome Attribute as
measured by tools developed separately by different disciplines but scored using the same rubric.
As faculty continue to assess and interpret their data, there may be opportunities to further refine
our outcomes to yield clearer information for use in improving student learning.
MCC recently completed a revision of the Associate in Arts degree. The goal is that every course required for the AA degree will be assessed. This assessment information will come back to the District Instructional Coordinating Committee (DICC) to use as evidence supporting the need for these courses. Additionally, we are reporting assessment information on our district course information forms to ensure tracking of assessment results. To make sure assessment remains a focus for the district, the Vice Chancellor of Academic Affairs appointed the Chair of the District Assessment Coordinating Committee as a voting member of the DICC.
3. Begin an Inclusive, Systematic, Cyclic, and Sustainable Assessment Process
Our previous assessment structure was made up of a district committee, the District Steering
Committee for Institutional Assessment (DSCIA). This committee only had representation from
a couple of campuses and maybe one or two disciplines at any given time. It was not a voting
body and had no connection to our curriculum committees or curriculum process. Participation
in institutional assessment was limited to only a few faculty, mostly on one or two campuses and
in one or two disciplines. Assessment pilots were usually expensive and labor-intensive and
stayed on one campus or came to an end with no feedback loop. We transformed our DSCIA
into the District Assessment Coordinating Committee (DACC). The DACC was charged by our Vice Chancellor of Academic Affairs and Technology to be a voting body with a faculty majority, with representation on the District Instructional Coordinating Committee (our district curriculum committee), and with representation from each campus assessment committee. The
DACC also has faculty representation from each of our campuses as well as representation from
a variety of disciplines including developmental and career and technical as well as general
education. In our five-campus system, the membership comprises the dean of instruction from each campus; three voting faculty members from each campus, drawn from a variety of disciplines; and one non-teaching member from each campus, who may be non-teaching faculty, staff, or an administrator from student services. Our Director of Institutional Research and
Assessment and Director of Career and Technical Development sit on the committee as voting
members as well. The meetings are open to all, and many directors attend as resources. Restructuring and revitalizing the committee have made it possible for us to move beyond our project to assess general education. We have also begun to work on assessment outcomes and processes for Human Diversity and co-curricular activities, while conducting Hope and Grit assessments and coordinating and analyzing surveys such as Noel-Levitz.
All full-time faculty from each discipline are encouraged to participate in the assessment process,
although we do not force participation. We are pleased with the voluntary participation rate of
93% of eligible full-time faculty. Since not all faculty are comfortable with institutional assessment, rubrics, norming, and other assessment activities, we have developed notebooks of
resources for faculty to use during their meetings when they are creating or adapting outcomes
and assessment tools. Our Director of IR has also created online classes, Assessment 101 and
Rubrics 101, for new faculty or for faculty who would like to learn more about assessment. All
faculty are expected to be involved in the process from determining discipline-level outcomes
and deciding which outcomes to assess first to creating/adapting the assessments, administering
them, and interpreting the results with the Director of Institutional Research and Assessment and
the Chair of the DACC. Disciplines were given latitude over when to begin involving adjunct and dual-credit faculty. Discipline faculty assess one discipline outcome and one general education outcome beginning in their second semester of participation in the project. In the third semester, they meet with the Director of IR and the DACC Chair to look at their assessment data with added demographics; a brief illustration of this kind of disaggregation follows the list. The 28 demographics are:
1. Instructor type
2. Section number
3. Campus
4. Class grade
5. Class time
6. Class day
7. Class format
8. Student cumulative GPA
9. Student term GPA
10. Gender
11. Ethnicity
12. Credits completed
13. Age
14. High School attended
15. Pell Grant eligibility
16. Read Compass scores
17. Write Compass scores
18. Numerical Compass scores
19. Algebra Compass scores
20. College Algebra Compass scores
21. Trigonometry Compass scores
22. English ACT scores
23. Math ACT scores
24. Composite ACT scores
25. Performance on individual questions
26. Previous classes taken
27. Program of record
28. Level
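To illustrate what this kind of disaggregation might look like, the brief sketch below groups rubric scores by a few of the dimensions listed above. It is a hypothetical example only: the column names, the 0-4 rubric scale, and the sample records are assumptions made for illustration and do not reflect MCC’s actual data model or tooling.

    import pandas as pd

    # Hypothetical assessment records: one row per student artifact scored
    # with the shared rubric (a 0-4 scale is assumed here for illustration).
    records = pd.DataFrame({
        "rubric_score": [3, 2, 4, 1, 3, 2, 4, 3],
        "class_format": ["online", "on-campus", "online", "on-campus",
                         "online", "on-campus", "online", "on-campus"],
        "class_day":    ["TR", "MWF", "TR", "MWF", "TR", "MWF", "TR", "MWF"],
        "campus":       ["A", "B", "A", "C", "B", "A", "C", "B"],
    })

    # Disaggregate mean rubric scores (with counts) by one demographic
    # dimension at a time, the kind of cut faculty and the IR office
    # might review together when looking for trends.
    for dimension in ["class_format", "class_day", "campus"]:
        summary = records.groupby(dimension)["rubric_score"].agg(["mean", "count"])
        print(f"Mean rubric score by {dimension}:")
        print(summary)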
At this meeting, faculty are encouraged to discuss the assessment results, look for trends, interpret them, and talk about possible plans for improvement based on the assessment data and
analysis. The faculty are considered the experts in interpreting the data since they have the
classroom experience to put the data in context. At this meeting, faculty will choose whether to
implement improvements to curriculum or teaching, change the assessment instrument to better
measure outcomes, re-assess the same outcomes during the next semester to gather more data, or
choose another discipline and/or general education outcome to assess in the following semester.
This means that every discipline is involved in some phase of the assessment process every semester, making the process systematic and cyclic. To keep the process sustainable, faculty have been encouraged to use embedded assignments to assess discipline and general education outcomes. This has not only made the process more manageable but has also helped promote buy-in from faculty and students: faculty feel their assessments are valued, and students feel they are simply working on assignments for the class in which they are enrolled. There is not as much resistance as with standardized tests or standardized prompts, which may bear no relevance to the class in which they are used. Although institutional assessment does require more work of the faculty, the extra work is minimized by using embedded assignments, briefly norming, and then trading papers to score.
4. Create Opportunities for Assessment Efforts to be Shared
At the conclusion of a cohort’s first semester, faculty from the disciplines in that cohort are
expected to attend the last DACC meeting of the semester to report out on their discipline
outcomes, assessment tools, and assessment plans for the next semester. It is at this meeting that
they may request assistance for their upcoming assessment.
We also started the Summer Symposium on Student Learning to create another venue for
assessment efforts and results to be shared. At our first annual summer symposium in 2011, the
disciplines of the first and second cohorts reported out on their progress in flash-five
presentations to MCC’s faculty, staff, and administrators. Cohorts 3 and 4 were eager for the same opportunity at the following Summer Symposium. This summer, our fourth symposium will include not only assessment results from general education disciplines but also results from co-curricular efforts, Career and Technical programs, and the Hope and Grit surveys. We have also been using this forum to share other survey results, such as Noel-Levitz results. Great strides have been made in promoting transparency and trust, as well as a culture that expects assessment results to be shared, discussed, and used to drive improvement in student learning rather than typed, bound, and stored on a shelf.
Assessment efforts are also being shared in campus assessment committees and at the Regional Assessment Conferences for Community Colleges, which we have hosted for two of the last four years, alternating with Johnson County Community College. This has provided an opportunity for faculty to present their assessment work to regional colleagues from 7 states and 22 colleges.
5. Support Use of Assessment Reports to Drive Improvement of Student Learning
By the end of fall 2013, we had assessed over 10,000 students and 471 sections. Twenty-four
general education disciplines and 93% of eligible faculty have participated in this project so far.
(We are still in the process of gathering and analyzing spring 2014 assessment data.)
Throughout the process, faculty have been encouraged to use their assessment results to generate
ideas for improvement of student learning. Following are some examples of improvements
based on our assessment analyses:
1. Chemistry and Geology faculty discovered that they used slightly different terminology
for certain processes. Once they recognized this, they agreed to work on using a common
vocabulary while exposing students to other terminology so it would not be new to the
students if they encountered it in later courses. While this change could have a
significant impact, it does not require curricular change.
2. Foreign Languages faculty realized that their students could pass courses, making them
eligible for the next course in the sequence, without achieving the outcomes that were
expected of them in the next course. This caused the faculty to reconsider their
assignments and exams, the nature and the weight of each. However, this did not require
curricular change.
3. Sociology faculty found that students in their Tuesday/Thursday sections performed
better than students in their Monday/Wednesday/Friday sections. In their analysis, they
concluded that a particular paper required for TR sections but not MWF sections was
making a significant impact on student learning. Now, the paper is required in all
sections.
4. Philosophy faculty determined that students with low Read Compass scores performed
worse than students with higher Read Compass scores. This has led the faculty to pursue
a reading prerequisite.
5. Music faculty discovered that younger students were performing better on their
assessments. Further investigation revealed that these higher-performing students were
online students from a particular campus. This led to the conclusion that perhaps the
assessment results revealed a technology issue that the faculty need to address rather than
a content issue. They are working on this.
6. History faculty found that students are meeting the objectives they chose to assess first.
Therefore, faculty are moving forward with assessing a different objective. In the course
of their discussions on outcomes and assessment, history faculty have, after many years
of debate, decided to adopt a common textbook for all common sections across the district.
7. Anthropology faculty have likewise found common ground through their debates and
discussions on common discipline outcomes and assessment and are changing course
content and assignments to reflect a new alignment across the district.
8. Business faculty have made some of the greatest strides in finding common ground.
For common sections across the district, they have decided to adopt common texts and assignments and, for online courses, common course shells.
9. Economics faculty are addressing adjunct orientation and training issues due to grade
inflation discovered during analysis of assessment data.
10. Physics faculty determined that students in non-science programs were a population too
different from students in science or science-related programs to use a common
assessment tool. Therefore, they are revising their assessment tools to assess common
outcomes in ways more appropriate for each population.
We have tried to document as many of these changes to improve student learning as we can, but
we are hoping this documentation will become part of something more substantial. From the
beginning of our project, we planned for the cohorts to become involved with a process of
assessment that would become a part of discipline review. We envisioned an improvement piece
as well as an assurance piece, of sorts, in our discipline reviews. This semester, spring 2014, we
saw the realization of that part of our plan. The five disciplines of the first cohort prepared
discipline reviews that contained financials, demographics, SWOT analyses, and the many other
components associated with discipline review, but in addition, each discipline review included
information about the discipline’s assessment efforts and results. Each discipline included their
outcomes, information about which outcomes they had assessed over the last four years (2.5
years for the first cohort, this time), and results of the assessments. Then, faculty discussed their
interpretations of the results of the assessments and any changes they made or were possibly
going to make, curricular or otherwise, based on assessment results. As these presentations were made to the chancellor, vice chancellor, campus presidents, and administration, faculty also reported future assessment efforts and improvements to student learning they may make and, in some cases, requested support for these continued efforts to assess and improve.
The officers were very attentive and asked many questions about the disciplines, their
assessments, and the institution’s assessment progress overall. They seemed to understand and
be sympathetic to the difficulties of launching this kind of large-scale project, but they were also supportive and practical, taking notes and pointing out ways they might be able to help with things we have been unable to do ourselves. In fact, the Vice Chancellor for Academic Affairs and
Technology will be meeting with each of the first cohort disciplines to discuss concrete
contributions that the institution may be able to make to assessment and improvement efforts.
We were told to choose something big but something that needed to be done, and we did. In
fact, we chose a project so big that even though we have made tremendous progress, there is still
much to do before we will be completely satisfied with it. Certainly, we intend to continue the
cycle of assessment, review, and improvement while staying committed to sustainability and inclusivity, bringing more faculty on board with every cycle. We also intend to keep insisting that assessment data always be made available to the faculty whose students generated it, for discussion and interpretation that leads to improvement of student learning. We hope to make strides in better documenting the improvements we make so we can recognize their effects as we continue to assess, and we will continue to encourage even broader involvement and the generation of outcomes and assessments beyond general education. We have already begun to rely on our more robust assessment structure and experience to include more sharing of co-curricular and career and technical assessment efforts in our meetings and symposiums.
Next Steps
Some of our next steps crucial to keeping the flywheel going include:
1. As referenced in the first section above, determining our priorities from the Culture of Assessment Survey and working on them in manageable chunks until we are satisfied. This should
continue to move us toward continuous improvement of the culture of assessment we are
evolving.
2. Continuing to have disciplines assess outcomes, meet to analyze the results and determine ways to improve student learning, and then move on to assess different outcomes.
3. Encouraging faculty to include more sections, more adjunct faculty, and more dual-credit
faculty in the assessment cycle. Currently, Reading is the only discipline that is assessing
all sections and has 100% participation of full-time and part-time faculty, even though
some other disciplines are getting close.
4. As mentioned above, continuing to provide opportunities for disciplines to discuss
and share assessment results by arranging discipline meetings and annual summer
symposiums and hosting/participating in the Annual Regional Community College
Assessment Conference.
5. As mentioned above, continuing and improving on the new process of discipline
review as Cohorts 2, 3, and 4 prepare, submit, and present their reviews with a focus on
assessment and improvement along with the other components of the regular reviews.
Establishing this cycle of reviews along with their connection to assessment and
improvement will further improve our culture of assessment.