The CTL’s conceptual framework is centered on a constructivist philosophy, supported by professional dispositions, a strong content knowledge base, competency in facilitating a diverse society, commitment to field experiences, and assessment. The University’s mission similarly states that this community should support the emotional, personal, and professional growth of its learners.
The Reading Advanced Program’s outcomes, as implemented, are committed to this framework and mission. Program outcomes are based on the Washington State 2007 Reading Endorsement Standards and the 2003 International Reading Association (IRA) Standards. Using these standards, a framework for assessment was created. During the fall of 2007, the reading faculty revised the assessment system for the Reading Specialist degree to emphasize each candidate’s knowledge base and competencies for the endorsement. A protocol to be used during the final oral exam was developed.
This protocol provides the expectations for program completion for each candidate. During the oral exam, the candidate will not only discuss her/his thesis/project but also note how s/he has met the standards set forth by Washington State and the IRA. Candidates graduating after the Spring of 2008 will use this system as part of their oral examination; therefore, several candidates will use the system in the next year. As those candidates complete their program and use the system, the reading faculty will discuss the process and make recommendations for adjustments.
The outcomes for these candidates are gleaned from each of the six Washington State common core standards for Reading Professionals. They include:
The candidate will describe understanding of language/literacy development and processes.
The candidate will describe understanding of the assessment, diagnosis, and evaluation process as it relates to her/his current teaching position.
The candidate will describe understanding of the teaching of literacy.
The candidate will describe the literacy environment of her/his own classroom.
The candidate will share professional development goals.
The candidate will describe her/his own process for teaching literacy.
The MAEd-Reading Specialist program uses an end-of-program assessment. At completion of the thesis/project, the candidate will meet with her/his committee in order to discuss the thesis/project.
Additionally, during this final defense, the candidate will demonstrate competency in the six standard areas for the reading endorsement. In order to be fully prepared for the oral defense, the candidate will collect physical evidence toward each of the endorsement standards. Using the protocol developed (on file in the Advance Program-Reading Assessment site), the candidate’s committee will document this evidence. For each standard, the candidate will be rated as “meets,” “meets with assistance,” or “does not meet.”
The program outcomes assessment is administered when a candidate applies for the defense. This final assessment is completed through a meeting with the candidate’s committee. Prior to this, the candidate has been collecting physical evidence toward the defense. This evidence may be collected through course projects, classroom experiences, or other professional development opportunities. Candidates are encouraged to meet with their committee members to discuss the process of collecting physical evidence.
The assessment program in this report reflects consideration of previous data. The end-of-program protocol was developed as a way to standardize the final defense process. The checkpoints in courses have also helped to keep consistent documentation on those progressing through the program.
In the past year, only one student completed the defense for the Advance Program-Reading. Since this candidate had entered the program prior to the implementation of the revised assessment system, her defense was similar to those in the past. She was responsible for defending only her final project.
However, the grades she received for her courses would be considered evidence of her level of competency in content knowledge, as the standards have been cross-listed with the course content in the course syllabi.
As previously stated, in order to devise a system more consistent with and accountable to the program outcomes, the assessment system has been revised to include the assessment protocol. Candidates now enrolled in the program have been advised of their responsibility for showing competency toward the standards. They will share their knowledge during the defense of their program. Because the candidates will be responsible for showing competency in each of the standard areas, they should achieve a higher level of performance in Content Knowledge.
The candidates in the Advance Program-Reading have an opportunity to meet standards in Pedagogical Content Knowledge and Skills through their enrollment in an Advanced Practicum. Participation in this practicum meets the standard for the ability to teach content in multiple ways, drawing on students’ cultural backgrounds and experiences. Each candidate develops a reading program tailored to the needs of the student enrolled. The students enrolled come from varying backgrounds, providing a diversity of experiences for the candidates.
The candidates enrolled in the Advance Program-Reading meet the standards for Professional and Pedagogical Knowledge and Skills from a historical, economic, sociological, philosophical, and psychological perspective through their enrollment in Advanced Foundation courses. Each candidate is required to enroll in 10 credits of Educational Foundations courses, including EDF 510 for 4 credits. Additionally, each candidate in this program enrolls in a course concerning the psychology of reading, a course concerning reading in schools, and a course that surveys reading research.
The candidates enrolled in this program participate in a practicum experience. Through this practicum, they keep extensive data on the achievement of the students they tutor. The final requirement for this practicum is the development of a case study. The purpose of this case study is to compile, summarize, and report the data collected during the practicum experience. Additionally, the physical evidence the candidates collect toward their final program defense will document how student learning was impacted by their teaching.
Because the assessment system is in the process of revision, improvements have been made which will help the candidates achieve a higher level of Pedagogical Content and Professional Knowledge and Skills.
Beginning Summer 2009, the candidates enrolled in the practicum will complete a Professional Dispositions Evaluation. This self-evaluation is intended to help the candidate set targets for continued growth in Pedagogical Content Knowledge and Skills.
The changes to the assessment system have already been made. Two students in the upcoming year will participate in the assessment process described above. Additionally, the Literacy Program is working to develop an on-line version of its MA Ed. Literacy Program. This change in delivery model will somewhat alter the assessment program as described.
Central Washington University (2007-2008)
Assessment of Student Learning Report: Target Levels
Feedback for the Department of
Degree Award: Program:
1. What outcomes were assessed this year and why?
Value Guidelines for Assessing a Program’s Reporting of Student Learning Outcomes (Target = 2)
4 Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. All outcomes are linked to department, college and university mission and goals.
3 Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. Some outcomes are linked to department, college and university mission and goals.
2 Outcomes are written in clear, measurable terms and include knowledge, skills, or attitudes. Outcomes may be linked to department, college and university mission and goals.
1 Some outcomes may be written as general, broad, or abstract statements. Outcomes include knowledge, skills, or attitudes. Outcomes may be linked to department, college and university mission and goals.
0 Outcomes are not identified.
Comments: Reports that obtain higher scores are characterized by increasingly specific student learning outcomes that relate to multiple domains of student development (knowledge, skills, and attitudes). In addition, higher-scoring reports will clearly articulate the relationship between program outcomes and department, college and university mission and goals.
2. How were they assessed?
a. What methods were used?
b. Who was assessed?
c. When was it assessed?
Value Guidelines for Assessing a Program's Reporting of Assessment Methods (Target = 3)
4 A variety of methods, both direct and indirect, are used for assessing each outcome. Reporting of assessment method includes population assessed, number assessed, and when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
3 Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
2 Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
1 Each outcome is assessed using a single method, which may be either direct or indirect. Some assessment methods may be described in terms of population assessed, number assessed, and when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
0 Assessment methods are nonexistent, not reported, or include grades, student/faculty ratios, program evaluations, or other “non-measures” of actual student performance or satisfaction.
Comments: Reports that obtain higher scores are characterized by increasingly clear information about how the assessment took place and the use of a standard of mastery. In addition, higher-scoring reports will include a greater number of methods in assessing each outcome.
3. What was learned (assessment results)?
Value Guidelines for Assessing a Program’s Reporting of Assessment Results (Target = 3)
4 Results are presented in specific quantitative and/or qualitative terms. Results are explicitly linked to outcomes and compared to the established standard of mastery. Reporting of results includes interpretation and conclusions about the results.
3 Results are presented in specific quantitative and/or qualitative terms and are explicitly linked to outcomes and compared to the established standard of mastery.
2 Results are presented in specific quantitative and/or qualitative terms, although they may not all be explicitly linked to outcomes and compared to the established standard of mastery.
1 Results are presented in general statements.
0 Results are not reported.
Comments: Reports that obtain higher scores are characterized by increasingly clear information about what was learned from the assessment, particularly in relation to a standard of mastery.
4. What will the department do as a result of that information (feedback/program improvement)?
Value Guidelines for Assessing a Program’s Reporting of Planned Program Improvements (Target = 2)
2 Program improvement is related to pedagogical or curricular decisions described in specific terms congruent with assessment results. The department reports the results and changes to internal and external constituents.
1 Program improvement is related to pedagogical or curricular decisions described only in global or ambiguous terms, or plans for improvement do not match assessment results. The department may report the results and changes to internal or external constituents.
NA Program improvement is not indicated by assessment results.
0 Program improvement is not addressed.
Comments: Reports that obtain higher scores are characterized by specific curricular and pedagogical improvement information. In addition, the department reports the results and changes to internal and external constituents.
5. How did the department or program make use of the feedback from last year’s assessment?
Value Guidelines for Assessing a Program’s Reporting of Previous Feedback (Target = 2)
2 Discussion of feedback indicates that assessment results and feedback from previous assessment reports are being used for long-term curricular and pedagogical decisions.
1 Discussion of feedback indicates that assessment results and feedback from previous assessment reports are acknowledged.
NA This is a first year report.
0 There is no discussion of assessment results or feedback from previous assessment reports.
Comments: Reports that obtain higher scores are characterized by specific curricular and pedagogical improvement information from previous years.