2013-2014 Annual Program Assessment Report
Please submit report to your department chair or program coordinator, the Associate Dean of your College and the assessment
office by Tuesday, September 30, 2014. You may submit a separate report for each program which conducted assessment activities.
College: Health and Human Development (HHD)
Department: Recreation and Tourism Management (RTM)
Program:
Assessment liaison: Veda E. Ward
1. Overview of Annual Assessment Project(s). Provide a brief overview of this year’s assessment plan and process.
During the 2013-14 academic year the department was awaiting final recommendations from the external COAPRT
accreditation review while simultaneously undergoing Program Review. Assessment was an integral component of both
processes, and assessment activities were both summative and formative as faculty responded to comments from national
and state reviewers.
In general, assessment activities dealt with:
 consistency of learning outcomes in core courses
 identification of key assignments (for the capstone portfolio) on the course outline/syllabus
 review of rubrics (review, modification, adoption)
 design of an exit exam for majors based on core texts and assignments linked to course and department learning outcomes
 establishment of an electronic database to house longitudinal assessment data, and identification of an individual to manage and analyze those data
2. Assessment Buy-In. Describe how your chair and faculty were involved in assessment related activities. Did department
meetings include discussion of student learning assessment in a manner that included the department faculty as a whole?
Because the Department’s tenure-track faculty is small (6, plus 2 assigned to non-instructional responsibilities outside the
department) and the Department was completing a 3-year commitment to regain external accreditation through a national
professional affiliation, with concurrent Program Review, faculty meetings were largely devoted to discussions of consistency
in course content, rigor, assessment of learning, and faculty commitment to this process. Throughout the process the
Assessment Liaison provided relevant information gained from campus assessment liaison meetings. Overall, the RTM
department was actively engaged in “closing the loop” in many ways. The most consistent commitment among both full- and
part-time faculty members was at the course delivery level, since the major core (required) courses build on one another.
3. Student Learning Outcome Assessment Project. Answer items a-f for each SLO assessed this year. If you assessed an additional
SLO, copy and paste items a-f below, BEFORE you answer them here, to provide additional reporting space.
3a. Which Student Learning Outcome was measured this year?
A. Emotional Intelligence
3b. Does this learning outcome align with one or more of the university’s Big 5 Competencies? (Delete any which do not apply)
 Critical Thinking
 Oral Communication
 Written Communication
 Quantitative Literacy
 Information Literacy
Our academic college (HHD) and the RTM department focus on preparing students for full-time (if desired) professional
employment across the diverse industries within parks, recreation, hospitality and tourism. A major factor in predicting
on-the-job success is described in the popular literature as “Emotional Intelligence” (EI). Three years ago the department
began administering the short form of an EI instrument to students in the RTM 490 capstone course, but the data were not
captured in a consistent database, which became a point of discussion and an opportunity this year.
Emotional Intelligence is arguably associated with all of the Big 5, since students must identify and recognize issues and solve
problems while communicating both issues and solutions effectively through oral and written communication. Convincing
communication is often grounded in the presentation of supporting facts and data, which demonstrates quantitative and
information literacy. Majors also participate in a 1.5- to 3-hour library instruction session with content specialist
Marcia Henry during one or more courses in the department, which exposes them to databases they can use to support and
document their positions on various issues, drawing on peer-reviewed published research as well as popular literature and
media.
Department majors are also required to take a course in which they learn about and apply quantitative and qualitative
methodology in evaluating issues, performance and events pertinent to their professional sector/niche.
Department faculty believe that these approaches, interwoven throughout the curricular experience, prepare students with the
foundations to be “emotionally intelligent” while building on the university’s Big 5 competencies.
3c. What direct and/or indirect instrument(s) were used to measure this SLO?
 The Emotional Intelligence (EI) short-form instrument, available online
3d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different
points) or was a cross-sectional comparison used (Comparing freshmen with seniors)? If so, describe the assessment points used.
This was the first year that longitudinal comparisons were considered, since the Department had not previously been
systematically storing assessment data.
3e. Assessment Results & Analysis of this SLO: Provide a summary of how the results were analyzed and highlight findings from the
collected evidence.
The EI instrument is self-rating, and overall, students believe they are above average on most of the measures. In the past,
only the mean scores on the instrument were reviewed, but because there is a high “social desirability” factor, the data were
analyzed on two additional levels: (1) examination of items on which the mean score indicated “average” or below, and (2)
whether there was a relationship between EI and internship supervisor evaluations (since the internship is the context in
which the student is most consistently challenged to demonstrate emotional intelligence in a professional setting).
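Purely as an illustrative sketch of how this two-level analysis might be scripted once the data reside in the planned electronic database (the file name, column names, and the 3-point “average” threshold below are all hypothetical, not the Department’s actual instrument coding):

    # Hypothetical sketch of the two-level EI analysis described above.
    import pandas as pd

    df = pd.read_csv("ei_capstone_scores.csv")  # assumed data export

    # Level 1: flag items whose mean self-rating is "average" (3) or below.
    item_cols = [c for c in df.columns if c.startswith("ei_item_")]
    item_means = df[item_cols].mean()
    print("Items at or below 'average':")
    print(item_means[item_means <= 3.0])

    # Level 2: relate overall EI score to internship supervisor evaluations.
    r = df["ei_total"].corr(df["supervisor_rating"])  # Pearson correlation
    print(f"EI vs. supervisor evaluation: r = {r:.2f}")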
3f. Use of Assessment Results of this SLO: Describe how assessment results were used to improve student learning. Were
assessment results from previous years or from this year used to make program changes in this reporting year? (Possible changes
include: changes to course content/topics covered, changes to course sequence, additions/deletions of courses in program, changes
in pedagogy, changes to student advisement, changes to student support services, revisions to program SLOs, new or revised
assessment instruments, other academic programmatic changes, and changes to the assessment plan.)
The findings on this SLO were reported at the last faculty meeting of the Spring semester, and discussion centered on
course-level changes, primarily content changes focusing on the areas in which student mean scores were “average” or below,
as well as on designing methods to connect EI with internship evaluations, probably at the cohort level.
3a. Which Student Learning Outcome was measured this year?
B. Portfolio
3b. Does this learning outcome align with one or more of the university’s Big 5 Competencies? (Delete any which do not apply)
 Critical Thinking
 Oral Communication
 Written Communication
 Quantitative Literacy
 Information Literacy
3c. What direct and/or indirect instrument(s) were used to measure this SLO?
RTM has required a portfolio for many years as part of its overall assessment process. Since the contents include a synopsis of
all courses with standardized key assignments, the portfolio (similar to EI) represents cumulative evidence of student
learning from the beginning to the end of the program. Department faculty collect and grade portfolios using a common rubric
in the RTM 490 “capstone” course. During accreditation and Program Review, faculty members who did not teach the course
also reviewed the portfolios using the same rubric, to determine whether students were meeting the SLO and to make
suggestions for improving the process.
3d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different
points) or was a cross-sectional comparison used (Comparing freshmen with seniors)? If so, describe the assessment points used.
Portfolios are still submitted in hard copy, although the Department will gradually transition to an e-version in the near
future. A majority of students pick up their portfolios at the end of the semester, but an increasing number have all of their
documents and photos in electronic formats as well. This has allowed department faculty to compare portfolios over time, in
both formal and informal ways, including “new faculty” examining portfolios to see the process and product.
3e. Assessment Results & Analysis of this SLO: Provide a summary of how the results were analyzed and highlight findings from the
collected evidence.
 Portfolios were successfully submitted by roughly 99% of students, because submission is a course requirement in RTM 490
(capstone)
 Faculty members instructing the capstone course in which the portfolio is submitted have seen improvements in
understanding of the purpose, content and quality of the portfolio
 Faculty members instructing RTM 490 have gotten “on the same page” in terms of academic standards for content and use
of a common rubric, and converse frequently about issues, student areas of “confusion”, etc.
 Practitioner feedback has been generally positive because the portfolios are concrete evidence of job-related
competence
3f. Use of Assessment Results of this SLO: Describe how assessment results were used to improve student learning. Were
assessment results from previous years or from this year used to make program changes in this reporting year? (Possible changes
include: changes to course content/topics covered, changes to course sequence, additions/deletions of courses in program, changes
in pedagogy, changes to student advisement, changes to student support services, revisions to program SLOs, new or revised
assessment instruments, other academic programmatic changes, and changes to the assessment plan.)
The accreditation reviewers felt that the portfolio represented double-assessment, and suggested that the Department adopt
the accreditation standards as the Department SLOs. The portfolio serves several functions for our students, however. First
and foremost, it encourages them to retain course products. They often state that they were unaware how much they had
done or learned until they put the portfolio together. Next, it provides an opportunity to reflect on both course SLOs and
Department (program) SLOs. Students are under a lot of pressure to get through their programs quickly, and time to reflect on
learning is difficult if not structured into the learning experiences. A third factor is that students often use products from their
courses as evidence that they can perform a certain skill or have the requisite competence to serve in a specific role, and so
forth.
Continued discussion about grading versus assessment is anticipated, along with acknowledgment of the complexity of the
multiple levels of assessment now required: external accreditation standards, campus Big 5 competency alignment, college
SLOs, and Department (program) SLOs, as well as individual courses. Many important questions arise as to duplication,
overlap, and independence, based on student academic needs, societal changes, professional directions, etc.
Results from the portfolio assessment suggest that consistency is the most powerful dynamic when moving toward
achievement of a learning outcome, regardless of level.
3a. Which Student Learning Outcome was measured this year?
C. Senior Internship
3b. Does this learning outcome align with one or more of the university’s Big 5 Competencies? (Delete any which do not apply)
 Critical Thinking
 Oral Communication
 Written Communication
 Quantitative Literacy
 Information Literacy
The response here is similar to the two SLOs above. The nature of the profession, as well as the design of the core (required)
courses, builds on the university’s Big 5 competencies. Senior Internship is designed to allow students to test their knowledge
in a professional setting, and requires a practitioner to actively assess student ability to meet agreed-upon learning outcomes
(an individualized learning plan) in a professional work setting, while demonstrating competency consistent with that of
entry-level full-time employees.
3c. What direct and/or indirect instrument(s) were used to measure this SLO?
The student intern is evaluated at the mid-point and at the conclusion of a 400-hour internship. It must be noted here that
these hours are the minimum for meeting national accreditation standards, but are often insufficient to meet industry
standards for entering a management track. Students are informed of this “experience gap” and are encouraged to seek other
avenues to gain more hours in the professional setting.
The internship evaluation form is standardized, with the same criteria for every student.
3d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different
points) or was a cross-sectional comparison used (Comparing freshmen with seniors)? If so, describe the assessment points used.
This Spring (2014) the first round of data were harvested from the interns. The intent is to maintain this database over time,
which will enable the Department to produce longitudinal comparisons with SLO achievement. To summarize the methodology:
 Internship is currently an exit experience with no baseline at the beginning of the program, due to campus-imposed
limitations on field experiences that receive university credit.
 Collection of data occurs at 2 points in the 400-hour internship process: mid-point and final.
 Data are submitted electronically to the internship instructor or to an online repository.
 Data are analyzed by semester.
As noted, data will be analyzed longitudinally, but those results will not be reported until the end of 2015 (about 6
3e. Assessment Results & Analysis of this SLO: Provide a summary of how the results were analyzed and highlight findings from the
collected evidence.
In general, RTM students perform well during the Senior Internship. Faculty members have discussed the nature of the prior
agreement between interns and practitioner-supervisors, since the student and agency supervisor design the learning
experience together. As a result, students complete the experience and can identify strengths and weaknesses (see the EI
discussion above), but may only be able to generalize their preparedness to a single type of environment (e.g., hotel, cruise
ship, municipal park, outdoor adventure, tourism agency, etc.). Addressing this possibility in the capstone course may help
students exit the program with realistic expectations about the potential to be employed in multiple industry sectors.
The faculty has also proposed convening a practitioner panel to assist with the review of outcomes for the Senior Internship,
which will also inform the direction of any desired or necessary changes in Department- and course-level SLOs associated
with Senior Internship.
3f. Use of Assessment Results of this SLO: Describe how assessment results were used to improve student learning. Were
assessment results from previous years or from this year used to make program changes in this reporting year? (Possible changes
include: changes to course content/topics covered, changes to course sequence, additions/deletions of courses in program, changes
in pedagogy, changes to student advisement, changes to student support services, revisions to program SLOs, new or revised
assessment instruments, other academic programmatic changes, and changes to the assessment plan.)
4. Assessment of Previous Changes: Present documentation that demonstrates how the previous changes in the program resulted in
improved student learning.
Almost all tenure-track faculty report fewer student complaints about not understanding assessment activities (“Is this part of my
grade?”), better understanding of key assignments to be included in the portfolio, expectation of student accountability for content on
the required exit exam, and increased consistency in course outline structure, outcomes and assignments.
Overall, the assignment of tenured/tenure-track faculty was recommended during the review process, and has been gradually
implemented over the past year. Discussion of assessment during faculty meetings has greatly improved consistency among faculty in
terms of “norming” expectations, and even discussions about grading criteria have spontaneously arisen.
Most importantly, the shift to a “culture of assessment”, while not as embedded as morning coffee or exercise, is becoming a more
common part of faculty exchanges because the commitment to student success is shared. While some faculty members have opted out
of courses that require formal assessment, email survey responses indicate that a majority of faculty (both full- and part-time) earnestly
believe they are using student performance indicators to improve instruction toward achieving identified learning outcomes. Most
felt that this was an inherent aspect of instruction and that they assess intuitively. Documentation beyond anecdotal sharing is more
difficult to achieve on a consistent basis.
5. Changes to SLOs? Please attach an updated course alignment matrix if any changes were made. (Refer to the Curriculum
Alignment Matrix Template, http://www.csun.edu/assessment/forms_guides.html.)
While changes have been recommended based on feedback from external accreditation visitors and Program Review visitors, no formal
action has been taken to date. The two review processes (accreditation and Program Review) did not conclude until late in the Spring
2014 semester, so changes were discussed but not voted on that late in the semester. Assessment remains a sensitive area, and no one
wishes to feel forced into a decision without adequate discussion.
Assessment is ongoing and iterative. Points where shared behaviors change and corporate strategies are embraced are difficult (if not
impossible) to pinpoint. Following the two reviews, faculty feel that making a commitment to a direction consistent with reviewer
recommendations, as well as with the nature of our university and profession, will direct changes in both the scope and direction of SLOs.
6. Assessment Plan: Evaluate the effectiveness of your 5 year assessment plan. How well did it inform and guide your assessment
work this academic year? What process is used to develop/update the 5 year assessment plan? Please attach an updated 5 year
assessment plan for 2013-2018. (Refer to Five Year Planning Template, plan B or C,
http://www.csun.edu/assessment/forms_guides.html.)
Now that the Department has achieved re-accreditation (10 years) and has had an informative Program Review, we will use this year to
further revise department SLOs and streamline data storage and analysis to make this process less onerous in the future. A member of the
faculty has taken on this responsibility and the Department will shift to an “Assessment Committee” structure in 2014-15, rather than a
single assessment liaison.
Revisions to the 5-year plan have been proposed, but will not be presented for faculty review and vote until later this semester (Fall
2014). The department is considering adding “exit interviews” to compare with the results of the exit exam for a more complete picture of
student learning.
7. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
No, but Professor Ward has presented on “closing the loop” for the campus assessment liaison group and at a professional
conference. Quite frankly, assessment coordination itself takes up a lot of what might otherwise be writing time (not to
mention the time spent writing this report). Under the “committee” format, opportunities for future publication will occur. For
example, an accepted 2015 WASC proposal could turn into a publication, as could analyses of some of the areas for which data
solutions have been generated by faculty. More opportunities to publish are desired, but with a small faculty the day-to-day
activities take precedence, including the lack of Full Professors to conduct Search & Screen and personnel functions, often for
multiple departments. We seem constantly “in process”, so it is difficult to identify at what point and to what extent an
outcome has been achieved. It is more likely in the future that publications will be based on the more quantitative measures
(i.e., evaluation data from internship supervisors, pre- and post-tests on the “exit exam”, and EI) as opposed to the qualitative.
8. Other information, assessment or reflective activities or processes not captured above.
The Department of Recreation and Tourism Management has been in the process of review and realignment for three years. This has
required an almost constant, and at times quite fatiguing, commitment to reflection on our program, the learning outcomes we and our
extremely diverse profession expect, and how to increase documentation of our progress toward stated goals.
Members of the 2014-15 Assessment Committee include Jan Tolan (Liaison), Veda Ward, and Jimmy Xie (data).
Submitted by Veda E. Ward, Assessment Liaison 2012-2014