CENTRAL WASHINGTON UNIVERSITY 2009-2010 Assessment of Student Learning Report

Academic Affairs: Assessment
July 2010
Feedback for the Department of Geological Sciences
Degree Awards: B.A., B.S. Geology / B.S. Environmental Geology; Programs: Environmental Geology, Geology
1. What student learning outcomes were assessed this year, and why?
Guidelines for Assessing a Program’s Reporting of Student Learning Outcomes (Target = 2)

Program Score: 2

Value 4: Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. All outcomes are linked to department, college, and university mission and goals.
Value 3: Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. Some outcomes are linked to department, college, and university mission and goals.
Value 2: Outcomes are written in clear, measurable terms and include knowledge, skills, or attitudes. Outcomes may be linked to department, college, and university mission and goals.
Value 1: Some outcomes may be written as general, broad, or abstract statements. Outcomes include knowledge, skills, or attitudes. Outcomes may be linked to department, college, and university mission and goals.
Value 0: Outcomes are not identified.
Comments:
The department evaluated six (6) student learning outcomes at the undergraduate level for the BA and BS
Geological Sciences and BS Environmental Geological Sciences programs.
While attitudinal outcomes were still not included, the report did note a workshop event based on the previous year’s student feedback. The ability to apply to graduate school in a timely fashion is a definite prerequisite for student success in geology. The further inclusion of employers in the workshop is another positive step toward attitudinal outcomes.
The program has specific, measurable skill and knowledge outcomes that are clearly aligned to departmental, college, and university goals.
2. How were they assessed?
   a. What methods were used?
   b. Who was assessed?
   c. When was it assessed?
Guidelines for Assessing a Program's Reporting of Assessment Methods (Target = 2)

Program Score: 3

Value 4: A variety of methods, both direct and indirect, are used for assessing each outcome. Reporting of assessment methods includes population assessed, number assessed, and when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
Value 3: Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
Value 2: Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
Value 1: Each outcome is assessed using a single method, which may be either direct or indirect. Some assessment methods may be described in terms of population assessed, number assessed, and when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
Value 0: Assessment methods are nonexistent, not reported, or include grades, student/faculty ratios, program evaluations, or other “non-measures” of actual student performance or satisfaction.
Comments:
The program includes the use of various direct measures (e.g., writing rubrics, oral presentations,
competency exam, targeted assignments). The number of students assessed was reported as was the
standard of mastery.
The program should be commended for the refinement of its rubrics. In general, the framework used to
assess program effectiveness was effective and refined over the past year. It was clear from the report that
program faculty and the chair are working together on some level to consider how to best assess the
program.
The program is encouraged to use other direct measures, such as SEOI feedback as it relates to curricular/course quality, other existing surveys, employer surveys of recent graduates, or evaluation of the graduate programs of placed students. Other possible assessment points might include pre- and post-tests, or even entry-to-major and capstone reevaluation.
The inclusion of rubrics is encouraged.
3. What was learned (assessment results)?

Guidelines for Assessing a Program’s Reporting of Assessment Results (Target = 2)

Program Score: 4

Value 4: Results are presented in specific quantitative and/or qualitative terms. Results are explicitly linked to outcomes and compared to the established standard of mastery. Reporting of results includes interpretation and conclusions about the results.
Value 3: Results are presented in specific quantitative and/or qualitative terms and are explicitly linked to outcomes and compared to the established standard of mastery.
Value 2: Results are presented in specific quantitative and/or qualitative terms, although they may not all be explicitly linked to outcomes and compared to the established standard of mastery.
Value 1: Results are presented in general statements.
Value 0: Results are not reported.
Comments:
The report provided quantitative and qualitative assessment results and compared them to established standards of mastery. There was discussion of addressing areas where standards of mastery were not met.
The report included significant discussion of results and future implications.
4. What will the department or program do as a result of that information (feedback/program improvement)?

Guidelines for Assessing a Program’s Reporting of Planned Program Improvements (Target = 2)

Program Score: 2

Value 2: Program improvement is related to pedagogical or curricular decisions described in specific terms congruent with assessment results. The department reports the results and changes to internal and/or external constituents.
Value 1: Program improvement is related to pedagogical or curricular decisions described only in global or ambiguous terms, or plans for improvement do not match assessment results. The department may report the results and changes to internal or external constituents.
Value NA: Program improvement is not indicated by assessment results.
Value 0: Program improvement is not addressed.
Comments:
The program has clearly considered assessment results, with detailed analysis and proactive plans for
action. The program is to be commended for demonstrating a commitment to reflective practice to improve
student learning. The program is encouraged to continue to reflect on assessment results and to improve
pedagogy and/or program curriculum as necessary using the effective framework currently in place.
The report notes an action plan, which is very beneficial in establishing subsequent steps for assessment. The review also notes the program’s need to focus on the Science Phase II project as a limiting factor on the time and attention available for the areas of concern noted in this review.
The department is also encouraged to make results accessible to internal and external groups (e.g., advisory boards, faculty, students, and the public) that may have an interest or might provide the department with feedback for additional improvement.
5. How did the department or program make use of the feedback from last year’s assessment?

Guidelines for Assessing a Program’s Reporting of Previous Feedback (Target = 2)

Program Score: 2

Value 2: Discussion of feedback indicates that assessment results and feedback from previous assessment reports are being used for long-term curricular and pedagogical decisions.
Value 1: Discussion of feedback indicates that assessment results and feedback from previous assessment reports are acknowledged.
Value NA: This is a first-year report.
Value 0: There is no discussion of assessment results or feedback from previous assessment reports.
Comments:
The review notes the continued and deepening discussion of assessment and the incorporation of
new/improved outcomes for future planning. The program clearly demonstrates a desire and commitment
toward the assessment cycle of continued data collection, analysis and possibilities for improvement. As
with the previous report, this program’s report should be noted for its efforts and dedication to reflective
program improvement, and is encouraged to continue building on the foundation laid in the previous year.
Please feel free to contact either of us if you have any questions about your scores or the comments supplied in this feedback report, or if additional assistance is needed to support your ongoing assessment efforts.
Dr. Tracy Pellett and Dr. Ian Quitadamo, Academic Assessment Committee Co-chairs