Outcomes Assessment Task Force Minutes
November 16
12:00 – 2:00 PM
Members in attendance: Dave Hodapp, Dan DuBray, Dana Wassmer, Cori Burns, Pat
Blacklock, Marybeth Buechner, Travis Parker, Margaret Woodcock, Sue Palm, Norv Wellsfry,
Rich Shintaku, Estella Hoskins, Brad Brazil, Kathy McLain, Jeanne Edman, Chris Hawken
Members absent: Winnie LaNier
1. Dana led a discussion of the web site review. Web sites reviewed and key observations
included:
 Santa Rosa – their site had a handbook and reporting forms that were simple, easy
to use, and consisted of filling in the blanks. Their web site was fairly well
developed and had many examples as well as source documents. However, their
assessment strategy is pretty much one-dimensional, although it is tied nicely to
learning.
 Skyline – offered a helpful PE example that included a rubric and a
pre- and post-test design. However, the reporting seemed very extensive, particularly
since it covered only one SLO. The site had templates and examples, but
their assessment cycle was not very well defined. The site had a good emphasis on
the why of SLO assessment, including its ties to accreditation as well as its
benefits to learning.
 Fresno City College – had a nice handbook which focused on writing SLOs – had
good examples and some general assessment methods. It did have a good set of
links to other resources. The web page was placed under the curriculum tab – which
was a good idea because it clearly linked it to a faculty driven process.
 Bakersfield – it seems that their SLOs (at least the ones in chemistry) did not meet
our definition of SLOs and were really just objectives with new names.
 Pasadena City College – this was a good web site – it had links, examples, and a
guidebook. They had a timeline for implementation, had held division workshops,
and had created SLO mentors to help with implementation. In addition, meeting
minutes of their SLO assessment committee were on the site. There was no
assessment plan or process on the site; however, there was a pre-established
rubric that was to be used for all assessments. The web site also had a
discussion board and a good analysis of tools that were considered to support
assessment implementation. The site was easy to navigate, and they had
assessment examples by type as well.
 Diablo Valley College – good resource sharing device that had minutes of meetings
and good links. There was not much information about program SLOs, however.
 Dan indicated that he thought the APA.org site sent earlier should be included on
our web page. He also recommended http://www.cai.cc.ca.us (an older site that may
have been migrated to the RP web site).
2. Discussion of pilot projects – Marybeth checked in with folks about the status of their pilot
projects and/or questions or observations they had made thus far. Marybeth shared that
her plan means that the SLO data for earlier outcomes in the semester will be lower
because students who are struggling have not yet dropped. Sue Palm identified the same
issue. The merit of having all SLOs assessed on the final was also
discussed. Margaret asked for advice about the best method of assessing her SLO –
and her question led to the realization that the assessment method depended on the
research question and what the teacher wanted to learn. For example, if the faculty
member wants to assess the value of a particular activity related to an SLO, then the
assessment should be tied to that activity. If the faculty or staff member wants to address
the effectiveness of an instructional or program unit, the assessment should be tied to
that unit. If it is learning that is to accumulate over the course of the semester or program,
it should be done at the end of the course or program duration.
3. Discussion of key questions – Dana led a discussion of the key questions.
1. Should faculty and staff be required to report something every year? (who, how,
what, to whom, how connected with PRoF?) Yes, but in an indirect manner for some.
Full time faculty will report their assessments during a faculty meeting – and the
results of this meeting will be recorded by a form that is provided in the tool kit.
Meeting minutes can also be used as desired, although they will need to address
certain things. Some mechanism (a brief reporting form) will need to be developed to
collect information from adjuncts. Faculty will also be asked to keep copies of
exemplary work samples for later review during the PRoF cycle and to provide
evidence for accreditation. Note: there may also be a need to include a brief
outcomes assessment as a component of FLEX that collects information about
the assessment practices and impacts of faculty so that we are collecting
data for the WASC annual report.
2. Should programs write assessment plans? If so, how and on what type of cycle?
Yes – on an annual basis using time that is allocated for this purpose during Friday
Flex. This discussion and plan will be documented via meeting minutes or via the
completion of a form that is included in the tool kit.
3. How do we cycle through courses, programs and faculty? Program faculty will meet
during the Friday of FLEX to identify key learning issues and the related course and
program learning outcomes that will be assessed that year. The assessments will be
done in the fall and then discussed in the spring. Note: it may be prudent to
include a form in PRoF that indicates the program outcomes that have been
assessed over the last two PRoF cycles which can be reviewed by the planning
committee to ensure that over a period of time all program outcomes are
assessed.
4. How do we structure this so intermittent reporting is combined with ongoing
assessment? By integrating assessment with existing structures such as FLEX,
department meetings, and PRoF.
5. How do we integrate this process with the assessment of college-wide and GE
outcomes, and PRoF? During the years when programs are doing PRoF, faculty will
review the previous cycles of assessment data as part of this process and will not be
required to formally assess anything. See below for the other components of this
question.
6. How should we assess GE and College wide outcomes and on what type of cycle?
One GE area will be identified and assessed each year using embedded assessments
in classes in that GE area. The tool kit will contain tools that can be utilized to
facilitate the assessment of GE outcomes (rubrics, student self assessments, test
items, project ideas, etc.) but these will not be prescriptive. At least one program in
each academic unit that has courses in the particular GE area will need to identify GE
as one of their program learning issue(s) for the year. College wide outcomes will be
assessed on a rotational basis by the college research office using student self
assessments and stratified random sampling. In addition, teachers will have the
opportunity to identify specific learning or program activities they think address the
college wide SLO.
7. How do we ensure that this process includes dialogue? This is done by incorporating
it with existing activities such as PRoF, department meetings, and convocation.
8. How do we ensure the workload is reasonable? This is done by providing time for the
dialogue and having the reporting consist of meeting minutes and/or a brief form. An
agenda for the convocation department meetings and the format for the meeting
minutes should be included in the tool kit.
9. Should special processes be applied to the vocational programs that have outside
accrediting agencies? Yes – faculty in this area still need to document the dialogue
but can reference their accreditation reports and do not need to do separate
assessments unless they decide to do so.
4. The discussion of the draft reporting plan did not occur because a new draft reporting plan was
developed as the previous questions were discussed. The key components of this plan
are reflected in the answers to the questions in item 3.
5. Committee members were asked to review the PRoF documents prior to the next
meeting. The overview of the process is available at
http://research.crc.losrios.edu/PrOF%20guidelines-refined.pdf and forms are available at
http://research.crc.losrios.edu/PrOF%20Form-Program%20Review-refined.doc
In particular, think about the cycle and the parts of this process particularly related to SLO
assessment and development.
6. Other