CSU Assessment Readiness

Vicki:
This is not a formal analysis; rather, it summarizes some trends in the data from these three
campuses. I have had a hard time getting all of the data for each program at each campus
because the separation among the various programs on any given campus means that data
collection is segregated by program as well. This is one of the trends that cropped up in my
conversations with all three of the campus representatives.
I’ve organized my remarks in terms of the major data categories on our survey.
All four campuses (CSU Stanislaus, Sonoma State, East Bay, and Cal Poly, SLO) offer all three
teacher credential programs.
The summary is based upon data about the following campuses and programs (MS = Multiple
Subject, SS = Single Subject, ES = Education Specialist):
CSU Stanislaus (CSUS) – SS
CSU East Bay (EB) – MS, SS
Cal Poly, SLO (SLO) – MS, SS, ES
COORDINATION OF ASSESSMENTS:
None of the campuses have a centralized system or single person or office that coordinates
assessments for all programs. Each of the campuses uses a distributed model with data
collection and analysis spread out among program coordinators, department chairs, technical
staff or credential analysts, and/or faculty who collect data for specific programs and who may or
may not do any analysis of the data collected. It is not clear what percentage of any person’s job
is devoted to the coordination of assessments.
Overall, coordination of assessments seems to be lacking at each campus. The biggest
disconnect is between the general education teacher preparation programs (MS and SS) and the
special education programs.
DATA COLLECTION:
Candidate Demographics:
The most consistent demographic data collected is age, gender, and subject matter specialization
(for SS programs only). Analysis of this data for two campuses (CP SLO and EB) consists of
descriptive statistics on changes in counts. Although we collect data on ethnicity at CP SLO
through our admissions process (the CSUMentor application), we neither extract nor make use of
that data for any analysis at this time.
The ES program at CP SLO does collect data on candidates' previous experience with children;
it is the only program across the three campuses that does so.
Progress Monitoring:
Midpoint assessments by master teachers and supervisors, along with GPA, are consistently
collected for all programs. Only one program (EB) does any systematic analysis of this data.
EB's analysis consists of descriptive statistics showing change over time in all of the assessments
listed, plus individualized analysis of qualitative data focusing on the growth of individual candidates.
Admission Check Points:
Only one program (CP SLO, ES only) reported collecting data on candidate dispositions prior to
admission to the program.
All of the programs reported collecting data on GPA, CBEST/RICA passage, prerequisite course
completion and SMPP versus CSET for the SS programs.
As is true for other measures, EB reports analysis of at least two data items, RICA/CBEST
passage and SMPP/CSET choice. CP SLO, like EB, does describe rate of passage for
RICA/CBEST, but no comparative analyses are reported (e.g., comparing rate of passage by
cohorts, types of candidates, etc.).
Candidate Educational, Professional, and Personal History
Only the CP SLO ES program collects the most data about its candidates' personal histories; all
of the categories were checked. CP SLO (MS and SS) and CSUS collected undergraduate degree
and institution from their candidates. None of the programs collecting data on candidates' personal
histories reported doing any analysis of this data.
Surprisingly, EB, which has typically done the most systematic data collection and analysis across
all of the categories, does not report collecting any candidate personal history data.
Fieldwork Experience and Placement
These data types showed the most variation among the three campuses in what is collected.
CP SLO (all three programs) collected the most types of data in this category, including school
demographics for all three programs. EB also collected school demographic data.
All three campuses collected data on the number of supervisor observations, but only CP SLO
(ES only) and CSUS collected data on the actual number of hours teaching or the types and hours
of field experiences.
Program Completion Check Points and Performance Assessment Scores
None of the campuses have yet adopted the TPA or PACT for their programs. Thus, none
reported collecting data on these performance assessment measures.
All three campuses reported collecting data from master teachers’ assessments, and all but CP
SLO ES collected data from the final assessment done by university supervisors.
Program Completer Competence and Retention
Only CP SLO ES reported collecting data on its program completers separate from the
system-wide exit surveys. CP SLO ES collected, but did not analyze, data on employment
records and graduate education attained. The graduate education data for CP SLO ES is collected
in conjunction with the ES credential that SLO's candidates acquire, so the collection of graduate
education data is part of how they manage both their credential and graduate programs. In talking
with my colleagues in the ES program, I sensed that the collection of employment data was not
systematically done but depended instead on word of mouth or on continuing contact with those
program completers who remained in the local area as ES teachers.
PROCESS NARRATIVES ABOUT DATA COLLECTION
Only CP SLO indicated an interest in collecting more data than it currently obtains. The ES
faculty were particularly interested in getting the data listed on Part 3 that they report not
obtaining at this time. CP SLO is also working to develop a more comprehensive system that
will allow it to access admissions data and other candidate data, such as course grades for
credential courses, that is typically collected and stored by the university but not necessarily
readily accessible for the programs to use or to correlate with other program data.
EB reported that they feel inundated with data and fear the amount of time and effort required to
do data analysis if they were to collect much more than they already do.
Electronic data is collected for licensure requirements by the credential office at all three
campuses; however, the amount of data actually collected on candidates, especially in terms of
performance data, depends upon whether the campus has elected to use a third-party vendor,
such as TaskStream, to manage its data collection electronically.
It is probably more true than not that many of the data items reside as paper forms kept either in
the credential office or, as is the case for the CP SLO ES program, in the offices of the faculty
who coordinate the programs.
Letters of recommendation – CP SLO (SS and MS) and CSUS report using a standard letter of
recommendation for their candidates. CP SLO (ES) has no standard recommendation letter but
does use a rubric to evaluate the nonstandard letters that it receives.
Alignment of data collection with transition points – all three campuses collect data on
candidates as they progress through the program at the key transition points: at admission, at the
end of each academic term in which candidates are enrolled, and at exit from the program. Those
programs organized into cohorts (CP SLO ES and EB) are able to monitor candidate progress
more efficiently because the cohorts move through the programs at the same pace.
Both EB and CP SLO (SS and MS) are moving towards the development or use of systematic,
electronic data collection systems, whether homegrown (CP SLO) or from a third-party
vendor (EB). CSUS does not report any movement towards a third-party vendor.
Barriers and Assets for Data Collection:
A dedicated staff person to collect and/or analyze data is either a valuable asset (EB and CP SLO
SS & MS) or, where lacking, a detriment to the program (CP SLO ES and CSUS). Those programs
without a dedicated person report that the burden of this work falls on the shoulders of faculty,
usually those who play some role as coordinator or administrator of the credential program.
However, this responsibility seems to come on top of what is already expected of these faculty; it
is not clear that the faculty who take on these roles have either the desire or the expertise to do
the type of data collection and analysis that will soon be expected of all credential programs
under the new CCTC accreditation (biennial report) requirements.
Useful Efforts at Data Collection
Each campus interpreted this question differently. EB and CSUS interpreted it to mean what data
was most useful to credential candidates, whereas CP SLO (SS, MS, and ES) interpreted it to
refer to how useful the data was to the program itself. Because of these differing interpretations,
answers across the campuses cannot be directly compared.
K-12 Pupil Performance Data
None of the campuses collect or currently use K-12 pupil performance data. Both EB and CP
SLO cite the difficulties of accessing and making sense of these data.
DATA ANALYSIS
Comments about overall data analysis are similar to those revealed in the reports about types of
data collected. Little actual analysis is done beyond an examination of change elements,
descriptive statistics (at best), and the use of data to inform program processes (based not upon
systematic analysis but upon more informal views of the data).
The question about alignment was also interpreted differently across campuses. CSUS
interpreted alignment to mean alignment of assessment instruments to the TPEs rather than
alignment of data elements or the alignment of different types of data systems.
EB and CP SLO do use some of the data to compare programs. At CP SLO, the comparisons
across programs have tended to focus on enrollment trends and FTE/SCU generation. This
analysis is being conducted in response to calls from our provost to increase enrollment in CP
SLO’s credential programs.
DATA STORAGE
The responses to the data storage questions are similar to those about electronic forms of data.
For those campuses beginning to use automated or third-party systems, data storage will be
moved from paper copies housed in faculty offices or in the credential offices to online servers
maintained by external providers.
DATA REPORTING AND USE
Campuses use either systemwide surveys (EB, CP SLO SS & MS), college or departmental
surveys (CP SLO ES and EB), or candidate assessment data (from end-of-quarter/term
assessments) to make decisions about instruction or programs. There was no sense of an
organized or systematic approach to data analysis or reporting that ran across all campuses.
Again, these efforts seem highly dependent on whether a particular program or campus has
people with both the ability and the time to do this type of reporting and analysis.
At CP SLO, most external reports (Title II and Chancellor's Office or campus-specific reports) are
provided by our technical support person (who also maintains our student information system) or
by the various coordinators or chairs for programs and departments.
The ES program at CP SLO also makes use of an external grant to collect data. That is, the
external funders provide some support for the ES faculty to collect data on their candidates. The
candidate data plays a role in the external funders’ reporting responsibilities.
EB did state that they had difficulties collecting or reporting data on candidate dispositions.
DATA-DRIVEN DECISION-MAKING
All campuses use GPA and test data, as well as prerequisite course-taking data, to inform
decisions about admission. Although some mention is made of how program improvement is
driven by data, it is not clear from the reports exactly how these data are used in program
improvement efforts.
Both CP SLO and EB also require candidates to complete the exit surveys for the Chancellor's
Office as part of the credential application process.
In summary, although each campus makes some efforts to collect data, most data seems tied to
credential/licensure requirements. Thus, licensure data is kept within the credentialing office.
However, it is not clear whether there is yet a culture for data-driven decision-making on any of
the campuses reporting.