2004-2005 Enhancing Educational Effectiveness Awards Program

Proposal Cover Page

(MSWord version available online: Academic Personnel, Forms, or by calling the FCPD at x55287)

Please complete a Pre-Proposal Cover Page for the award program to which you are applying

Name: Kristin Powers__________

Appointment Status (check one): ___ Tenured _X__Probationary ___Temporary ___Other

(Please Specify) Summer [ ] Fall [ X ] Spring [ ]

Department: ___Educational Psychology Administration and Counseling (EDPAC)____

Phone: _______985-9287

E-mail: ______kpowers@csulb.edu__________

(For collaborative projects, please provide the above information for each member of the team)

Award program (check one):

Program I: Individual Course Development and Assessment Program

___General Education Courses

___Courses in Majors or Programs

Program II: Program/Major Development and Assessment Program

___General Education Foundation Area Studies

_X_ Courses of Study in Majors or Programs

___Student Services Programs

Area of Emphasis: Technology [ ] Service Learning [ ] International Ed [ ] Other [ X ]

Title of Proposal: Reviewing, Rethinking and Retooling the School Psychology Program Assessment Sequence

Statement of Commitment:

If I receive an award through this program, I/we will comply with all requirements as described in this request for proposals. This proposal does not duplicate activity currently underway or under review by another internal or external award program.

Applicant(s) signature(s): _______________________________________________ Date: ______________

(see attached signatures for this collaborative project)

The Department Chair is aware of the proposal and supports its submission.

Chair’s signature: ____________________________________ Date: ______________________

The Dean is aware of the proposal and supports its submission.

Dean’s signature: _____________________________________ Date: ____________________

Reviewing, Rethinking and Retooling the

School Psychology Program Assessment Sequence

Statement of Additional Participants

We commit to participate fully in the seminar planned to support our work on the proposed project.

Further, we attest that this proposal does not duplicate activity underway or under review by another internal or external award program (included on cover page).

____________________________________________________

Kristi Hagans-Murillo, Part-time Faculty

____________________________________________________

Jim Morrison, Part-time Faculty

____________________________________________________

Frank Tocco, Part-time Faculty

____________________________________________________

Norma Elias, Student Member

____________________

Date

____________________

Date

____________________

Date

____________________

Date

Reviewing, Rethinking and Retooling the

School Psychology Program Assessment Sequence

1. Abstract

The focus of the proposed project is to revise the current assessment sequence within the school psychology credential program. The school psychology program is a 60-unit post-baccalaureate credential program that prepares students to function as school psychologists in public schools. Assessment activities have always been a major responsibility of school psychologists; accordingly, 30% of the program coursework is devoted to assessment. However, this coursework is in need of revision due to changes in assessment policies and trends within the field of professional school psychology. The proposed project would involve collaboration among the faculty who teach the assessment courses to develop a more state-of-the-art, coordinated and comprehensive assessment sequence for students beginning AY 2005-2006.

2. Project Goals, Objectives, and Predicted Outcomes

The goal of the proposed project is to modify the assessment sequence within the College of Education’s school psychology program to better reflect changes in assessment practices within the field of school psychology. To achieve this goal, we have three objectives:

1. Identify assessment knowledge and skills school psychologists will need to master under the new assessment paradigm (i.e., “assessment linked to intervention”).

2. Map the skills and knowledge bases identified in objective #1 onto the current assessment curriculum. Identify assessment skills and knowledge that are insufficiently addressed, as well as redundancies within the sequence. Modify the assessment courses to attain a scope and sequence that is aligned, comprehensive, and based on current assessment research and policies.

3. Evaluate the modifications made to the assessment sequence to determine whether school psychology students are better prepared to meet the assessment demands of the field. Evaluations will be based on year-end program satisfaction surveys, end-of-course instructor evaluations, an assessment component added to the program portfolio, and a statewide survey of graduates.

Predicted Outcomes:

a. Most or all of the assessment courses will be modified.

b. The instructors of the assessment courses will have an improved understanding of how their courses fit into the entire assessment sequence. Collaboration on the topic of assessment during project meetings will further the project participants’ assessment knowledge.

c. A component on assessment will be developed to reflect the assessment skills the students gain across the individual courses and will be added to the program’s student portfolio requirement.

d. Publication of an article co-authored by the project team in a state or local school psychology journal or newsletter.

e. Presentation at national (National Association of School Psychologists), state (California Association of School Psychologists) and/or regional (Long Beach Association of School Psychologists) conferences.

f. Successful application to the National Association of School Psychologists for program approval.

g. School Psychology students will have more advanced assessment skills and knowledge than students who did not benefit from the revised assessment sequence. As a result, these students will be more likely to become leaders in their districts on assessment policies and practices.

3. Statement of Need

Federal guidelines for determining special education eligibility for learning disabilities will change dramatically with the reauthorization of the Individuals with Disabilities Education Act (IDEA). Proposed changes to IDEA are based on extensive research indicating that the current medical-model approach to identifying and serving students with disabilities is ineffectual. In response to this research, the National Association of School Psychologists (NASP) revised its Blueprint for Training and Practice in 1997 and released the fourth edition of Best Practices in School Psychology in 2002. Both of these documents clearly recommend a shift in assessment practice from psychometrically driven tests of intelligence, processing and academic achievement to more functional and curriculum-based assessment practices. In short, assessment practices in school psychology are moving from defining student failure in terms of within-child deficits to identifying conditions that support student learning. The latter model is based on assessment technologies, including assessments linked to intervention, single-subject designs, resistance-to-intervention models, dual discrepancy criteria, and locally developed curriculum-based norms, that represent a significant departure from the content historically addressed in our assessment sequence (see Table 1). Some modifications have been made to these courses over the past three years, but the updates have been rather piecemeal. During Spring 2003, a meeting of all the school psychology faculty who teach an assessment-related course produced some preliminary topics for future discussion. However, more resources are required to complete a thorough review of the entire assessment sequence, redefine it as an assessment-linked-to-intervention sequence, and develop clear articulation among the following courses: EDP 420, 524, 525, 527, 560 and 579A, so that our students graduate with the assessment skills and knowledge base needed to meet the changing demands of the field.

Table 1. Two models of assessment: the Medical Model of Assessment (Current) versus the Ecological or Assessments-Linked-to-Intervention Model (Future)

Current: Tests of cognitive, psychological processing and achievement (e.g., WISC-IV, WJ-III).
Future: Measures of behavior and academic achievement (e.g., curriculum-based measurement, functional analysis).

Current: Tests are commercial, standardized, norm-referenced and require substantial inference.
Future: Assessment is more direct, based on local norms or intra-individual comparisons.

Current: Battery of tests administered in a few testing sessions.
Future: Repeated measures to monitor progress over time.

Current: Learning disabilities defined by an IQ/achievement discrepancy.
Future: Learning disabilities defined by a dual discrepancy definition (i.e., below average performance and resistance to intervention).

Current: Focus on determining special education eligibility.
Future: Focus on identifying empirically based interventions.

Current: Nomothetic assessment.
Future: Ipsative assessment procedures.

Current: Report writing focuses on describing and interpreting scores to match special education eligibility criteria.
Future: Report writing focuses on summarizing assessment results and linking those results to interventions.

4. Description of Planned Activities

The assessment sequence currently includes six courses, which are taught by one full-time and four adjunct faculty members. The project would form a team, including faculty and a student representative, to review and revise the courses to better reflect changes in assessment practices within the field of school psychology.

The project will be conducted during the Fall 2004 and Spring 2005 semesters. Kristin Powers, coordinator of the School Psychology program and instructor of two of the assessment courses (EDP 579A Curriculum-based assessment and academic interventions, and EDP 560 Behavioral analysis and positive behavioral interventions), will serve as Project Coordinator. Team members will include the instructor of EDP 420 Tests, measurements and evaluations (Frank Tocco), the instructor of EDP 524 Psychoeducational assessment and EDP 525 Psychoeducational diagnosis in multicultural settings (Jim Morrison), and the instructor of EDP 527 Clinical practices in school psychology (Kristi Hagans-Murillo). In addition, an advanced student in the program (Norma Elias) will serve as student representative.

The project will begin during the Fall semester with Kristin Powers conducting a literature review on current school psychology assessment practices (see Table 2). Keyword searches will include “resistance to intervention,” “dual discrepancy,” and “assessments linked to intervention.” Articles on advances in assessment technology, such as Personal Digital Assistant (PDA)-assisted observation systems, will also be retrieved and reviewed. Peer-reviewed articles describing efforts to validate and implement innovations in assessment will be read and summarized. In addition, the National Association of School Psychologists’ standards for training and practice will be consulted, as will school psychology trainers from other programs. A list of core assessment skills and knowledge bases will be generated based on this review. Finally, the Project Coordinator will also make note of any literature on effective strategies for teaching school psychology assessment and intervention courses. These topics might include effective use of clinical and classroom settings, strategies for promoting report writing skills, integrating technology into assessment and intervention courses, and teaching assessment and intervention skills related to transition planning.

Beginning in November, the Project Coordinator will be responsible for convening monthly meetings of the project team to conduct the following activities: (a) review curricula of the six courses identified; (b) map the content and skills addressed in the six courses onto the skills identified by the literature and field review described above; (c) review student and graduate evaluations of the program, with particular attention to their reported competencies and knowledge of assessment and intervention practices; (d) modify the curricula and/or instruction of the six courses to match the NASP domains, reflect students’ perceived training needs, integrate best practices in teaching assessment and intervention courses, and provide a logical scope and sequence; and (e) develop a portfolio component, reflecting the assessment skills taught across the six courses, to be added to the students’ program portfolios. Curriculum proposals will be developed as appropriate. The Project Coordinator will be responsible for submitting and presenting the curriculum proposals that result from this project.

5. How Assessment Will Be Embedded into the New or Restructured Courses and Programs

School psychology program students annually complete a detailed survey on their satisfaction with the program in terms of fostering specific skills. Changes in the students’ responses to this survey over time (i.e., the 2002, 2003 and 2004 results will be compared to the 2006 and 2007 survey results) will provide information on whether the project improved the assessment sequence as intended. In addition, end-of-course teaching evaluations will be monitored to ensure that student satisfaction remains high. Finally, the project team will develop an assessment component for the program portfolio. All school psychology credential students are required to develop a school psychology professional portfolio. The current project will assist students in selecting and evaluating portfolio entries that demonstrate their advanced assessment skills.

In addition, beginning in Spring 2005, one student focus group will be held each year for three years with the fieldwork students (who are in the final year of the program) to collect qualitative information on the quality of the assessment sequence.

Because we are a relatively small program and our graduates work all over the state, it is difficult to assess the impact of our curriculum changes on our graduates’ assessment practices. Therefore, we will collaborate with our state organization to develop and disseminate a statewide survey of school psychologist practitioners. The Project Coordinator will collaborate with trainers around the state to implement a web-based survey of school psychologists to be administered to the California Association of School Psychologists (CASP) membership (over 3,000 members). Respondents will be asked to rate their current skills and training needs in various assessment and intervention practices. Respondents will also indicate which training program they graduated from, allowing us to identify the responses of our graduates and compare them to those of the total group of respondents. The survey will also record length of time since graduation, which will allow comparisons between recent graduates who completed the revised assessment/intervention sequence and those who completed the current or past assessment sequence. The statewide School Psychologist Practitioner survey is scheduled to be administered every two years, allowing us to gather data on our graduates over a significant length of time.

Finally, the results of our project will be assessed by NASP’s professional standards committee, which will review program documents that reflect the changes and determine whether the program meets NASP standards.

6. Timeline of Activities

If funded, the project is expected to be an iterative process, whereby course modifications will be continuously considered as new information is examined. The team will meet once a month to conduct the project activities described in 4(a)-4(e). Modifications that do not require formal curriculum revision may be implemented on a trial basis as soon as Spring 2005, though most substantive changes would be implemented in the 2005-2006 academic year. The Project Coordinator will present survey data collected from the 2004 CASP survey of graduates and the 2004 program survey of students at the first meeting. The Project Coordinator will also present the results of her field and literature review (completed by the end of October). By the end of the first meeting (November), the team will try to reach consensus on which assessment components (skills, knowledge, etc.) are priorities for inclusion in the assessment sequence. The curriculum and instruction of the current courses will be reviewed during the second meeting (January). The team will then map the assessment priorities identified during the first meeting onto the current courses to identify possible areas for modification. The third meeting (February) will result in a draft plan for curricular and instructional changes. The fourth meeting (March) will finalize the course changes and develop portfolio entry criteria that reflect the changes made to the courses. A fifth meeting will be held in April if we deviate from the timeline. During Fall 2005, the Project Coordinator will submit proposals to the Department of Educational Psychology, Administration, and Counseling Curriculum Committee and the College of Education Curriculum Committee as appropriate. The program portfolio will be submitted for NASP approval by February 1, 2005. Formal implementation of the model will occur in 2005-2006.

7. Detailed Prioritized Budget

Two (2) units of assigned time for Spring 2005 for the Project Coordinator.

Four (4) $400.00 stipends for the part-time faculty and student members ($1,600.00 total).
