CENTRAL WASHINGTON UNIVERSITY 2009-2010 Assessment of Student Learning Report

Academic Affairs: Assessment
July 2010
Feedback for Interdisciplinary Asia Pacific Studies
Degree Award: BA Asia Pacific Studies
Program: Undergraduate
1.
What student learning outcomes were assessed this year, and why?
Guidelines for Assessing a Program's Reporting of Student Learning Outcomes (Target = 2)

Program Score: 2

Value 4: Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. All outcomes are linked to department, college and university mission and goals.
Value 3: Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. Some outcomes are linked to department, college and university mission and goals.
Value 2: Outcomes are written in clear, measurable terms and include knowledge, skills, or attitudes. Outcomes may be linked to department, college and university mission and goals.
Value 1: Some outcomes may be written as general, broad, or abstract statements. Outcomes include knowledge, skills, or attitudes. Outcomes may be linked to department, college and university mission and goals.
Value 0: Outcomes are not identified.
Comments: The program evaluated 2 student learning outcomes at the undergraduate level. Both outcomes included clear and measurable knowledge and skills. In future assessment planning, the program may want to include the attitudes or dispositions necessary for success in the profession. This addition may require updating the program assessment plan and exit surveys. Measuring the degree to which students are developing attitudes consistent with the philosophy of the program and campus (possibly through the exit survey) would be valuable for future planning. This is similar to the feedback provided in earlier reviews.
2.
How were they assessed?
a. What methods were used?
b. Who was assessed?
c. When was it assessed?
Guidelines for Assessing a Program's Reporting of Assessment Methods (Target = 3)

Program Score: 3

Value 4: A variety of methods, both direct and indirect, are used for assessing each outcome. Reporting of assessment method includes population assessed, number assessed, and when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
Value 3: Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
Value 2: Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
Value 1: Each outcome is assessed using a single method, which may be either direct or indirect. Some assessment methods may be described in terms of population assessed, number assessed, and when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
Value 0: Assessment methods are nonexistent, not reported, or include grades, student/faculty ratios, program evaluations, or other "non-measures" of actual student performance or satisfaction.
Comments: The program goals were assessed with a variety of indirect measures (e.g., exit survey results, required course-level grades, and study abroad participation) to measure student learning and performance. These results provide some indication of student learning and goal competency. Each measure has an established standard of mastery (a very positive factor), and the population (number of students assessed) was also provided. The report notes an increase in the return rate of the exit survey; the efforts of the director should be commended! From this the program identified strengths and weaknesses. A possible process improvement would be to use direct measures to gauge outcome achievement: if more measurable skills or knowledge areas could be assessed through specific activities or tests, it might facilitate the development of more focused skills, knowledge points, and attitudes. Again, caution should be used in treating course grades as measures of student learning. Although grades are an important indicator of whether students have successfully met a faculty member's requirements and expectations for a course, they are generally too broad and general to function as effective assessment measures. Specifically, grades may take into account elements such as student improvement, effort, or even attendance, which can obscure the learning outcomes. Additionally, course content and management (including grading structure, weighting, number of tests, etc.) may, and often do, vary among faculty members teaching the same course. Specific assignments, on the other hand, can indicate attainment of specific skills. If faculty agree on how an assignment is to be evaluated (including articulating its minimum performance standards), a test assigned within a course may be a very appropriate program assessment measure. This should be an area of focus and improvement for next year. It should be noted, however, that this program has improved immensely in the last two years in its methods and effort.
3.
What was learned (assessment results)?
Guidelines for Assessing a Program's Reporting of Assessment Results (Target = 3)

Program Score: 4

Value 4: Results are presented in specific quantitative and/or qualitative terms. Results are explicitly linked to outcomes and compared to the established standard of mastery. Reporting of results includes interpretation and conclusions about the results.
Value 3: Results are presented in specific quantitative and/or qualitative terms and are explicitly linked to outcomes and compared to the established standard of mastery.
Value 2: Results are presented in specific quantitative and/or qualitative terms, although they may not all be explicitly linked to outcomes and compared to the established standard of mastery.
Value 1: Results are presented in general statements.
Value 0: Results are not reported.
Comments: Results were presented in specific quantitative terms and were compared to an established standard of mastery. The interpretation and discussion were particularly informative. Incorporating the direct measures and attitudes recommended in the earlier sections would further develop this section. Overall, the results presented are clearly linked to outcomes, and this practice should be continued in future reports!
4.
What will the department or program do as a result of that information (feedback/program improvement)?
Guidelines for Assessing a Program's Reporting of Planned Program Improvements (Target = 2)

Program Score: 2

Value 2: Program improvement is related to pedagogical or curricular decisions described in specific terms congruent with assessment results. The department reports the results and changes to internal and/or external constituents.
Value 1: Program improvement is related to pedagogical or curricular decisions described only in global or ambiguous terms, or plans for improvement do not match assessment results. The department may report the results and changes to internal or external constituents.
Value NA: Program improvement is not indicated by assessment results.
Value 0: Program improvement is not addressed.
Comments: The review notes deficiencies from the previous review. It addresses improvements related to pedagogical decisions directly associated with assessment results. The program has a clear sense of what it needs to do to improve its assessment processes. Based on the high level of student achievement (grades, participation in study abroad), there was less need for change from a curricular or pedagogical perspective. The program is improvement-minded and is interested in enhancing its standing and its ability to meet student needs. The resulting actions of those discussions were reported in this year's review along with specific changes to curricula. Sharing this process with students may elicit higher response rates on the exit surveys as students come to understand the impact their voice has on future planning and assessment.
5.
How did the department or program make use of the feedback from last year's assessment?
Guidelines for Assessing a Program's Reporting of Previous Feedback (Target = 2)

Program Score: 2

Value 2: Discussion of feedback indicates that assessment results and feedback from previous assessment reports are being used for long-term curricular and pedagogical decisions.
Value 1: Discussion of feedback indicates that assessment results and feedback from previous assessment reports are acknowledged.
Value NA: This is a first year report.
Value 0: There is no discussion of assessment results or feedback from previous assessment reports.
Comments: This is the second yearly report.
The program has made significant steps toward addressing assessment standards and practices since the initial year of filing. Assessment results and feedback are being incorporated into long-term pedagogical and curricular decisions. If this continues, the program will be better able to meet student needs and adapt in ways that support program strengths and address program weaknesses or challenge areas. The program should continue to foster communication among the departments that offer courses for the major and work with individual faculty to collect data, analyze them, and provide feedback for the program to consider.
Please feel free to contact either of us if you have any questions about your score or comments supplied in this feedback
report, or if any additional assistance is needed with regard to your assessment efforts.
Dr. Tracy Pellett & Dr. Ian Quitadamo Academic Assessment Committee Co-chairs