ASCRC Minutes 4/26/11
Members Present: M. Beebe-Frankenberger, B. Borrie, D. Dalenberg, M. Grimes, C.
Henderson, K. Hoggatt, C. Knight, P. Muench, E. Uchimoto, L. Tangedahl, K. Zoellner,
A. Williams
Members Absent/Excused: S. Greymorning, S. O’Hare, J. Sanders, K. Spika, J. Staub
Ex-Officio Present: B. Holzworth, E. Johnson, A. Walker-Andrews
Guests: M. Semanoff (Writing Committee Chair), M. Medvetz (Writing Committee
Member), K. Webster (Writing Center Director), K. Ryan (Composition Program
Director), R. van den Pol, J. Frederickson, M. Hoell
Montana Law Enforcement Academy (MLEA) Guests: Peter Bovington, JD, lead
instructor; Ali Bovington, JD
Attorney General’s Office Guests: Jennifer Dewey, MLEA training officer, and Mike
McCarty, MLEA instructor.
Chair Knight called the meeting to order at 2:13 p.m.
The minutes from 4/19/11 were amended and approved.
Discussion Items:
Writing Committee Recommendations on Writing Assessment (appended)
• The Writing Committee Chair, Professor Semanoff, the Writing Center Director,
Kelly Webster, and the Composition Program Director, Kate Ryan, provided
background information regarding the recommendations. Last year the Writing
Committee presented a report to ASCRC and was directed to prioritize the
alternatives to the UDWPA. That report described the history of the exam.
Originally the UDWPA was
to serve the following purposes:
o Bring constructive attention to writing through community-based, local
conversations across disciplines;
o Improve writing instruction;
o Improve writing proficiency;
o Provide continuous assessment throughout a student’s academic career;
o Serve as a mid-career gate meant to assess preparedness for writing in upper-division courses.
According to the Writing Committee there are several structural and theoretical
problems with the current exam. Students do not take the exam when it is intended so
it does not function as a gate. It is not in line with best practices, is disconnected from
the curriculum, does not improve teaching or learning, and does not reflect learning
outcomes. Therefore, the data does not provide an accurate picture of student writing.
It is an outdated model disassociated from student writing experiences. Continuing to
devote time (2500 essays/yr), resources ($60,000/yr), and energy into a tool that is not
valid is irresponsible. Meaningful assessment is needed that can be used to help
students become better writers.
The recommendation of the Writing Committee is to eliminate the UDWPA and
focus on program assessment in order to gather meaningful data that can be used to
improve students’ writing. The definition of program assessment is:
Program assessment is “the systematic and ongoing method of gathering,
analyzing and using information from various sources about a program and
measuring program outcomes in order to improve student learning” (UFC
Academic Program Assessment Handbook 3). In short, program assessment
allows for the gathering of available, relevant information in response to locally
constructed questions about student writing or writing instruction that will
influence decisions about how programs and student learning can be improved.
Program assessment is formative. It focuses on studying aspects of programs in order
to make improvements. It is contextualized and allows for assessment that is
responsive to the needs of the institution and diverse programs. It is aligned with the
curriculum because it focuses on learning outcomes.
There were numerous comments and concerns. Program assessment does not hold
students accountable. It would take more faculty effort and cooperation. It will only
be effective if there is follow through and resources devoted to writing improvement.
It does not address structural problems. Writing courses focus on content, not on
learning to write.
ASCRC was not clear how program assessment would take place or who would do
the work, or how it would demonstrate progress/success. The Writing Committee
won’t know the outcome of program assessment until it is conducted. It should
identify patterns of strengths and weaknesses. One suggestion was that random
samples of capstone course papers, syllabi and writing assignments be reviewed.
ASCRC needs to see data, perhaps from a pilot project, in order to make an informed
decision. According to Academic Affairs the UDWPA cannot be eliminated without
another form of assessment in its place.
The Writing Committee was hoping that ASCRC would endorse the
recommendation. ASCRC will continue deliberations next week.
Police Science
• Professor Dalenberg summarized the status of the Police Science Proposal. The
subcommittee has not yet had time to review the latest response to its concerns
regarding credits, syllabi/assessment criteria, and instructors’ credentials. Another
issue has surfaced regarding the Certificate of Applied Science approach. BOR
Policy 301.12 states that general education coursework comprise no more than 1/3 of
the certificate credits. Exceptions must be requested through the deputy
commissioner for academic and student affairs, citing a compelling reason for the
variation. The subcommittee will need to see this justification. There are several
implementation issues that need to be addressed.
The guests provided information regarding instruction and assessment. The Montana
Law Enforcement Academy (MLEA) is taught as a unified curriculum and is
performance intensive, involving a lot of field training. It includes exams and practical
evaluations. The instructors are certified by Police Officers Standards and Training
(POST). The MLEA has a legal obligation to educate officers and adhere to the
guidelines in Montana Code Annotated, Title 44. Montana wants well-trained
officers protecting its citizens. The long-term goal is to provide an incentive for police
officers to pursue additional education. The program would be unique in that
students admitted into the law academy are employed and the training is paid for by the
employer. Students have been asking about how they may receive academic credit
for the program. Agencies pay officers more if they have degrees and degreed officers
are more likely to be promoted. Approximately 150 students go through the program
each year. There are three cohorts which average 45 students. Classes start in
September and January.
The proposal will need to be revised. It can be submitted to the Board of Regents as a
Level I proposal with Level II documentation in September. This will allow it to be
voted on at the same meeting.
The meeting was adjourned at 4:00 p.m.
ASCRC Writing Committee Recommendation on Writing Assessment Practice
at The University of Montana
Based on the findings of the Spring 2010 ASCRC Writing Committee Report on Writing
Assessment Practice at UM, and at the request of ASCRC to make a specific
recommendation based on our study, the Writing Committee (WC) offers the following
recommendation regarding the Upper-Division Writing Proficiency Assessment (UDWPA)
at The University of Montana. The WC recommends discontinuing the UDWPA and
implementing writing program assessment in its place. Program assessment is a
contextualized form of assessment that can be scaled and shaped locally to address questions
and issues that matter to faculty. This recommendation endorses a proven method for
studying writing instruction at UM and for effectively devising ways to address it through
student learning opportunities.
Rationale for Discontinuing Large-Scale Individual Writing Assessment
The UDWPA is classified as large-scale individual student assessment. A student’s individual
performance on a test is used to make a high-stakes decision about his or her academic
progress. We recommend discontinuing this kind of writing assessment altogether because it
lacks validity and efficacy as an assessment tool. The use of UDWPA test scores to make
decisions about a student’s progress is not grounded in a current, sound theoretical
foundation regarding the teaching and learning of writing. More specifically, the UDWPA
does not
• Help students to produce rhetorically effective writing.
• Accurately reflect a student’s overall writing ability.
• Improve teaching or learning. It focuses on gating students, not guiding student
learning.
• Align with writing course outcomes at UM (including WRIT 095, WRIT 101,
Approved Writing Courses, or the Upper-Division Writing Requirement in the
Major).
• Align with our accrediting body’s focus on using assessment to evaluate and improve
the quality and effectiveness of our programs (see
http://www.umt.edu/provost/policy/assess/default.aspx).
In addition, large-scale individual student assessments that might more accurately reflect the
complexity of writing and the conceptual framework that informs UM’s writing course
outcomes, such as portfolio assessment, are quite simply cost-prohibitive.
Program Assessment
We offer a brief definition and description of program assessment to introduce this method
of assessment to members of ASCRC and the wider campus community. The overall aim of
program assessment in the context of writing instruction at UM is to improve the quality of
student writing by improving the writing program (note: We define writing program here as
the writing-related instruction that the Writing Committee (WC) oversees. The WC is
charged with designing and assessing the Approved Writing Courses and the Upper-Division
Writing Requirement in the Major, and with supporting the Writing Center.).
Definition
Program assessment is “the systematic and ongoing method of gathering, analyzing
and using information from various sources about a program and measuring program
outcomes in order to improve student learning” (UFC Academic Program Assessment
Handbook 3). In short, program assessment allows for the gathering of available, relevant
information in response to locally constructed questions about student writing or writing
instruction that will influence decisions about how programs and student learning can be
improved.
The characteristics of program assessment valued by the WC include the following:
• Because program assessment is formative, it focuses on studying (aspects of)
programs to improve and modify them accordingly. Focused on answering specific
questions, program assessment results in qualitative and/or quantitative data to
shape appropriate next steps.
• Because program assessment is contextualized, it can be scaled and shaped locally to
address questions and issues faculty care about. This allows for assessment practices
that are responsive to the values and expectations defined not only by the institution
but also by varied academic departments.
• Because program assessment focuses on studying the efficacy of learning outcomes,
it aligns with the current writing course guidelines for Approved Writing Courses
and the Upper-Division Writing Requirement in the Major.
Program assessment is a recursive process:
• Articulate a program’s mission and goals,
• Define relevant student outcomes and select outcome(s) for study,
• Develop assessment methods that address the outcome(s),
• Gather and analyze data (qualitative or quantitative),
• Document the results,
• Use the results to improve student learning by strengthening the program.
Writing Program Assessment at UM
As a contextualized form of assessment that can be scaled and shaped locally to address
questions and issues faculty value, program assessment at UM could take several forms. This
flexibility means that faculty would articulate their writing-related values and expectations in
particular contexts and would shape questions that could be answered through the
systematic collection of quantifiable data. In all of these contexts, program assessment
practices would be ongoing opportunities to promote faculty engagement in conversations
about writing instruction.
Starting with an inventory of what assessment-related information and processes already are
in place, writing program assessment at UM would take advantage of existing tools and
processes. For example, UM’s laudable writing curricula that require students to write
throughout their academic tenures are currently positioned for program assessment. The
Approved Writing Courses and the Upper-Division Writing Requirement in the Major now utilize
sets of carefully defined learning outcomes. In addition, WRIT 095, WRIT 101, and WRIT
201 (under the guidance of the Basic Writing Director and the Director of Composition and
with the support of their respective departments) also utilize carefully defined learning
outcomes and are likewise poised to embark on program assessment projects. Conducting
program assessments of outcomes-based writing courses across campus could provide the
basis for better understanding the varied ways in which teaching supports student writing
and of the extent to which students are meeting these outcomes as demonstrated in their
written work. Assessment methods may include:
• Studying culminating assignments in capstone courses,
• Conducting content analysis of student writing, such as final research papers or
reflective essays, to assess student writing samples,
• Analyzing curriculum, including reviewing course syllabi, textbooks, and writing
assignments, to assess the effectiveness of instructional materials,
• Organizing focus groups of department faculty and/or students to collect data about
the beliefs, attitudes and experiences of those in the group to gather ideas and
insights about student writing and writing instruction,
• Collecting institutional data on writing courses or using other university assessments,
like NSSE (National Survey of Student Engagement), to consider writing data.
Such program assessments would allow us to articulate and reinforce discipline-specific
expectations and would enable us to learn about our students’ patterns of writing strengths
and weaknesses, identifying them using collected evidence rather than relying on anecdotes.
Ultimately, this gathered information would shape future steps to support instructional
development and student learning.
Potential Options for Improving the Quality of Student Writing through Writing
Instruction at UM
Formative program assessment at UM would allow us to better understand how we can
improve the quality of student writing through instruction. Program assessment’s primary
value, then, would be in its ability to gather and analyze data in order to make decisions
about appropriate strategies for improving student writing. For example, the WC imagines a
number of options that might grow out of program assessment:
1. Create a 100- or 200-level writing course as a second general education writing requirement
to replace the current Approved Writing Course. Such a writing course could give students
an opportunity to learn strategies for writing in the disciplines (broadly conceived as social
sciences, humanities, technical writing) by reading in the genres. In addition, such a course
would serve as a bridge between WRIT 101 College Writing I and the Upper-Division
Writing Requirement in the Major.
2. Create more rigorous writing requirements for the Approved Writing Course and
Upper-Division Writing Requirement in the Major.
3. Require students to take more than one Approved Writing Course or Upper-Division
Writing Requirement in the Major.
4. Offer additional writing-related workshops and resources tailored to faculty teaching goals
and student learning needs.
5. Create a Center for Writing Excellence to support faculty and students in writing
instruction and learning to write in different contexts at UM.