Writing Committee Annual Report 2013-2014
Membership
Faculty Members
Beverly Chin, English (Chair) 2014
Gene Burns, HHP 2015
Cathy Corr, Applied Arts & Science 2014
Megan Stark, Mansfield Library 2014
John Glendening, English 2015
Sherrill Brown, Pharmacy 2016
Irene Appelbaum, Linguistics 2016
Marcia Kmetz, Applied Arts & Science 2016
G.G. Weix, Anthropology 2015
Douglas Raiford, Computer Science 2016
Student Members
Mark Triana
Jill Melcher
Additional Representatives (Ex-Officio)
Arlene Walker-Andrews, Associate Provost
Joe Hickman, Interim Registrar
Kelly Webster, Director, Writing Center
Amy Ratto-Parks, Interim Director, Composition Program
Grace Harris, Academic Advisor, University Athletics
Business items:
Writing Assessment Motion
http://www.umt.edu/facultysenate/committees/writing_committee/WritingAssessmentMotion.php
The Writing Assessment Motion was presented for a first reading at the September Faculty
Senate meeting. Writing Committee members reached out to senators to address questions, and additional clarifying information was added for the October Faculty Senate meeting. The
motion was debated and approved by the Faculty Senate on October 10th.
Review of Approved Writing and Upper-division Writing Requirement Courses
A total of 38 writing course forms were submitted to renew the writing designation as part of the
rolling review of current Humanities and Fine Arts Writing Courses. The consent agenda is
available in Appendix 1.
The Writing Course Forms were revised to address inadequate responses to the information
literacy and learning outcomes sections, as well as syllabi not including writing learning
outcomes. A section regarding the University-wide Program-Level Writing Assessment was also
added to the Writing Course Form to inform faculty of the requirement for students to upload
papers to the Moodle site. The Committee looks forward to a fully electronic form either in eCurr or Banner Workflow.
The rubric worksheet created during the Writing Assessment Pilot was used to review the forms
for Approved Writing Courses (see http://www.umt.edu/facultysenate/committees/writing_committee/rubricWorksheet.xlsx). The results
were entered electronically into a Google form for ease of data collection.
General Findings
Overall, a majority of the 2013 Approved Writing Courses satisfactorily met the
requirements for approval when initially reviewed. Only a small number of course
submissions needed revision to meet the Approved Writing Course criteria.
Strengths
Survey data indicate the following strengths in the 2013 Approved Writing Course
submissions:
• 62% provide students with detailed requirements for the writing assignments in the course syllabus, while 38% offer detailed requirements in separate handouts.
• 92% meet the minimum requirement of 16 pages of writing.
• 100% base at least 50% of the course grade on writing assignments.
• 85% demonstrate the type of instruction that will support students as they learn to synthesize new concepts, while 23% demonstrate that students will be challenged to synthesize new concepts (instructional support not explained).
• 100% provide students with an opportunity to formulate and express opinions and ideas in writing; 77% demonstrate the instructional support used to accomplish this.
• 100% require that students consider purpose and audience when writing; 85% demonstrate the instructional support used to accomplish this.
Areas for Improvement
Survey data show several areas in which Approved Writing Course submissions could be
improved.
• 69% fail to list the Approved Writing Course learning outcomes on the course syllabus.
• 69% show opportunities for multiple revisions with instructor feedback; however, 31% show minimal opportunities for revision, sometimes in the form of peer feedback only.
• 54% do not demonstrate whether the course addresses the information literacy skills appropriate to the objectives outlined in the Mansfield Library's information literacy rubric and do not expose students to a subject-area librarian. 23% fail to demonstrate how students will become information literate.
• 69% demonstrate how students will be supported in developing an awareness of appropriate English language usage; however, 23% do not outline instructional support used to help students meet this objective, and 8% do not demonstrate that appropriate English language usage will be a course expectation at all.
Recommendations
Based on the survey data, the Writing Committee recommends the following:
• Support faculty in developing strategies for providing feedback on student writing and requiring revision in response to that feedback.
• Support faculty in developing strategies to provide guided opportunities for students to learn how to find, evaluate, and use information effectively.
• Clarify for faculty the need for and importance of including the Approved Writing Course Learning Outcomes on the course syllabus.
Writing Center Annual Report
Director Webster presented the executive summary of the Writing Center's Annual Report (see
http://www.umt.edu/writingcenter/aboutus/annulreports/annualreport1213.pdf).
Review of Upper-division Writing Assessment
At the request of the Geography Department, a subcommittee of the Writing Committee reviewed the department's new Upper-division Writing Assessment and provided feedback.
Transfer student analysis
Professor Raiford completed a data analysis comparing transfer and non-transfer students' performance in upper-division writing courses. Non-transfer students perform statistically significantly better in upper-division writing courses; their mean GPA is slightly higher, 3.22 compared to 3.13 (Appendix 2).
Petition for Exemption from the Approved Writing Course Requirement
A subcommittee of the Writing Committee reviewed and approved a petition for a student to use
a transfer course to satisfy the approved writing course requirement. The policy was updated to
clarify who would review the petition.
University-wide Program-level Writing Assessment
The new assessment provides relevant information about student writing proficiency by assessing and scoring students' revised papers from Approved Writing Courses using a Holistic Scoring Rubric. The assessment process offers professional development opportunities for faculty and staff who are committed to improving student writing proficiency at UM. The first Writing Assessment took place in Spring 2014, during which the ASCRC Writing Committee collected student papers from 58 Approved Writing Courses enrolling 1,300 students. Only 348 writing samples were submitted to Moodle by the deadline. Students were also asked to respond to survey questions (Appendix 3) regarding revision. A total of 385 surveys were completed, for a 26% response rate. In the future, students' final revised papers will be collected at the end of the semester to ensure the revision component is captured and to ease processing. There will also be enhanced efforts to create a culture of participation on campus through discussions in departmental meetings and increased communication regarding the University-wide Program-level Writing Assessment.
On April 25th, 38 volunteer faculty, staff, and graduate students from UM-Missoula and
Missoula College scored a representative sample of student papers at the Writing Assessment
Retreat. There were representatives from a variety of disciplines including English,
Composition, Social Work, Geography, Linguistics, Business, Economics, Applied Arts and
Sciences, Accounting, the Mansfield Library, the Writing Center, Education, Health and
Biomedical Sciences, and Chemistry. Participants learned how to apply the Holistic Scoring
Rubric accurately, consistently, and efficiently to student papers.
The graph below shows the scoring results for the randomly selected student papers. The process for pulling data from Banner for future analysis is described in Appendix 5. The detailed report is available at Assessment Report.
Score Compared to Number of Revisions (graph):
[Bar graph: x-axis shows the holistic score (1, 1.5, 2, 2.5, 3, 3.5, 4); y-axis shows the number of papers by score (0-60); series: Revised Once, Revised Twice, Revised More than Twice, Not Revised.]
Types of Feedback Used in Revisions:
Most papers that were revised at least once used more than one type of feedback. Of the revised papers, 100% reported using written feedback from instructors during revisions and, of those:
• 45% also used the grading criteria/rubric
• 51% also used line editing
• 46% also used in-person discussion
• 23% also used email feedback
• 43% also used group discussion
Participants at the retreat used the 4-point scale described in the Holistic Scoring Rubric to assess
student papers. A score of 1 is Novice, a 2 is Nearing Proficiency, a 3 is Proficient, and a 4 is
Advanced.
Scoring Results: 8-Point Scale/4-Point Scale Comparison (161 papers total; each 4-point score spans two adjacent 8-point scores):

8-Point Score   4-Point Score   Total Papers   8-Point %   4-Point %
      1               1               3            2%          7%
      2                               8            5%
      3               2              30           19%         50%
      4                              50           32%
      5               3              18           11%         39%
      6                              44           27%
      7               4               8            5%          5%
      8                               0            0%
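For readers who want to verify the arithmetic, the short sketch below collapses the 8-point tallies into the 4-point bins. It assumes, as the paired rows in the table indicate, that each 4-point score covers two adjacent 8-point scores; it is illustrative only and not part of the committee's process.

```python
# Sketch: collapse the 8-point paper counts from the table above into
# the 4-point bins (8-point 1-2 -> 1, 3-4 -> 2, 5-6 -> 3, 7-8 -> 4).
paper_counts = {1: 3, 2: 8, 3: 30, 4: 50, 5: 18, 6: 44, 7: 8, 8: 0}

total = sum(paper_counts.values())  # 161 scored papers
four_point = {}
for score8, count in paper_counts.items():
    score4 = (score8 + 1) // 2      # two adjacent 8-point scores per 4-point score
    four_point[score4] = four_point.get(score4, 0) + count

for score4, count in sorted(four_point.items()):
    print(f"4-point {score4}: {count} papers ({count / total:.0%})")
# Prints 1: 11 (7%), 2: 80 (50%), 3: 62 (39%), 4: 8 (5%),
# matching the 4-Point % column in the table.
```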
The evaluations of the Writing Assessment Retreat were highly positive (see Appendix 6). A
majority of participants strongly agreed that the retreat helped them understand and apply
the Holistic Scoring Rubric to students' writing. A majority of participants also strongly agreed
that the retreat was a valuable professional experience that they would recommend to
colleagues. The participants also agreed that the retreat helped them assess students' writing
accurately and efficiently. Participants expressed the need for more participation from faculty
and students in Approved Writing Courses as well as more faculty development workshops on
integrating rubrics into writing instruction.
Pending
Long-Term Planning for Writing in the General Education Framework
• The committee will investigate a structural solution for programs lacking an upper-division writing course. There are several majors that allow students to take a course from the approved upper-division writing course list to satisfy the requirement.
• Clarify the separate components of the general education writing requirements. There is still confusion regarding approved writing courses and the upper-division writing required by the major. This most likely stems from the fact that 36% of approved writing courses are upper-division. Most universities require both a lower-division and an upper-division writing course. Changing the titles is not going to solve the structural problem; the committee's attempt to create new labels was tabled.
• Create a flowchart of writing expectations from freshman to senior year and across disciplines. Faculty share the responsibility for teaching writing literacy; it cannot be accomplished in two intensive writing courses. More courses are needed that teach writing. Some faculty are frustrated with the level of students' writing upon entering the upper-division writing courses required by the major. Some students do not translate skills from composition to other courses. Faculty may not understand that it is not unusual for students to "backslide" when learning new concepts and ideas. The Writing Committee should consider professional development workshops that help faculty understand the difference between "learning to write" and "writing to learn."
…………………………………………………………………………………………………..
Appendix 1
Writing Course Rolling Review, 12/5/13
(Humanities and Fine Arts)
Approved Writing Courses
AAS / HSTR 347   Voodoo, Muslim, Church: Black Religion
AAS 372          African American Identity
ANTY 310         Human Variation
ARTH 250         Introduction to Art Criticism
ARTH 425         Renaissance Art
ARTH 434         Latin American Art
CLAS 251L        The Epic
CLAS 252L        Greek Drama: Politics On Stage
HSTR 300         Writing for History
HSTR 315         The Early American Republic, 1787-1848
LSH 151/152      Intro to Humanities
MUSI 302         Music History II
NASX 235         Oral and Written Traditions in Native America
PHL 210E         Moral Philosophy
THTR 330H        Theatre History I
HSTA 401         The Great Historians
Upper-division Writing required by the Major
AAS / HSTA 415   The Black Radical Tradition
AAS / HSTA 417   Prayer and Civil Rights
ANTY 408         Advanced Anthropological Statistics
ARTH 350         Contemporary Art and Art Criticism
DANC 494         Seminar/Workshop
HSTA / WGSS 471  Writing Women's Lives
HSTA 418         Women and Slavery
HSTA 419         Southern Women in Black and White
HSTA 461         Research in Montana History
HSTR 400         Historical Research Seminar
HSTR 418         Early Modern Britain, 1500-1800
HSTR 437         US Latin American Relations
JPNS 311         Classical Japanese Literature in English Translation
JPNS 312         Japanese Literature Medieval to Modern in English Translation
MART 450         Topics in Film and Media
MUSI 415         Music of the 20th Century, to the Present
MUSI 416         Historical Topics in Music
MUSI 417         Cultural Studies in Music
NASX 494         Reading Seminar in Native American Studies
PHL 499          Senior Seminar
RUSS 494         Seminar in Russian Studies
THTR 331Y        Theatre History II
…………………………………………………………………………………………………..
Appendix 2
Assessment of Transfer Student Writing Performance
Overview
A recent assessment of student writing proficiency performed by the Geography Department
(UM Geography Assessment Report: 2013) found highly variable performance by students in
their program. They posited that a cause could be the large number of transfer students in their
program. The ASCRC Writing Committee decided a follow-up assessment was in order, and
pulled student grades for all writing courses (including WRIT 101, Approved Writing Courses,
and Upper Division Writing Courses) for all students, both transfer and non-transfer. The data
included records for a total of 16,334 students, of which 3,913 were transfer students. Of the total, 1,655 transfer students had taken upper-division writing courses, while 2,374 non-transfer students had. The specific question at hand became: "Do transfer students perform as well in upper-division writing courses as non-transfer students do?"
Summary of Findings
A subcommittee composed of Camie Foos, Amy Ratto-Parks, Grace Harris, and Doug Raiford acquired the writing-course grade data and met on Monday, 2/24/2014, at 2:00 PM in UH 221 to discuss how to approach the analysis. In this meeting it was decided that a comparative analysis of transfer and non-transfer students would be performed, and that grades achieved in upper-division writing courses would be taken as a measure of the success of previous, foundation-building courses.
A Welch two-sample t-test (unpaired) was performed on the two distributions of grades in upper-division writing courses for transfer and non-transfer students. Non-transfer students do perform significantly better than transfer students (α = 0.05, p-value = 0.008; mean transfer-student grade 3.13; mean non-transfer-student grade 3.22). While statistically significant, non-transfer students' GPA performance is only 0.088 points better than transfer students', making it difficult to conclude that transfer students are the root cause of any highly variable writing performance.
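For context, a comparison of this kind can be reproduced with a few lines of Python using SciPy. The sketch below is illustrative only; the grade lists are hypothetical stand-ins for the actual Banner grade extract.

```python
# Illustrative Welch's unpaired two-sample t-test, as in the analysis above.
# The grade lists are hypothetical placeholders, not the real Banner data.
from scipy import stats

transfer_grades = [3.0, 3.3, 2.7, 3.7, 3.0, 2.3, 4.0]      # hypothetical
non_transfer_grades = [3.3, 3.7, 3.0, 4.0, 2.7, 3.3, 3.7]  # hypothetical

# equal_var=False selects Welch's t-test (no equal-variance assumption)
result = stats.ttest_ind(non_transfer_grades, transfer_grades, equal_var=False)
print(f"t = {result.statistic:.3f}, p = {result.pvalue:.3f}")

# At the alpha = 0.05 level, a p-value below 0.05 would indicate a
# statistically significant difference in mean grades between the groups.
```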
This is supported by the fact that transfer students who take their foundation-building courses at their transfer institution perform better (though not significantly so) in their upper-division writing courses than do transfer students who take those courses here.
Detailed Analysis
The data included student grades from 2008 to 2013. It included the performance of 1,655 transfer students who have taken upper-division writing courses, and 2,374 non-transfer students who have taken upper-division writing courses. Figure 1 shows the similarity in grade distributions between the two populations.
Figure 1 Grade distributions for transfer and non-transfer students
Breakdown of Transfer Student Composition
Among transfer students, upper-division writing course grades were slightly better for those who took WRIT 101 at their previous institution than for those who took it here (not statistically significant, α = 0.05, p-value = 0.5; mean GPA of 3.04 for those who took 101 here and 3.12 for those who took it elsewhere). Figure 2 shows the similarity in distributions (though the group sizes are clearly disproportionate).
Figure 2 Distributions of Upper Division Writing Course Grades of Transfer Students that Took
WRIT 101 Here and Abroad
Figure 3 clearly shows this disparity in the numbers of transfer students who took 101 here and abroad.
Figure 3 Proportion of Transfer Students that Took WRIT 101 Here and Abroad
Similar results are found for those who took Approved Writing Courses at their previous institution (slightly better performance for those who took them elsewhere; 3.23 vs. 3.09; not significantly different, α = 0.05, p-value = 0.19). Figure 4 depicts the similarity in distribution, and Figure 5 shows that those who transfer in Approved Writing Course credit are outnumbered by those who do not.
Figure 4 Distributions of Upper Division Writing Course Grades for Transfer Students that Took
Approved Writing Courses Here or at their Transfer Institution
Figure 5 Proportion of Transfer Students that took Approved Writing Courses Here and Abroad
………………………………………………………………………………………………………
Appendix 3
Student Survey
How many times did you revise this paper in response to your instructor’s feedback?
• Once
• Twice
• More than two times
• I did not revise this paper in response to my instructor's feedback
If you did revise this paper, what kind of instructor feedback helped you revise? (check all that
apply)
• Written comments
• Comments related to the grading criteria/rubric
• Line by line editing
• In-person discussion with the instructor
• Email discussion with the instructor
• Small-group or whole-class discussion of assignment
• Other (describe)
If you did revise this paper in response to your instructor's feedback, what level of revision did you do? (check all that apply)
• Major changes (for example: reshaped the paper entirely, changed my thesis, changed my topic, started over)
• Mid-level changes (for example: reorganized the ideas, further developed existing points, revised use of source materials)
• Minor changes (for example: corrected typos; corrected grammatical, punctuation, and spelling mistakes; fixed my citation formatting)
……………………………………………………………………………………………………..
Appendix 5
Process for pulling data AFTER EACH RETREAT (once Banner table is created)
1. Get 790s of our scored samples (CSV file from Nancy).
2. I upload the 790 CSV file via WinSCP and request that Judy Grenfell upload these 790s to our Banner table (same process as our WC database upload).
3. Once I get the email confirmation that the load is complete, send a data extract job request to Judy Grenfell.
   • This will result in a CSV file that includes the scored sample 790s and the Banner data we have requested.
   • This CSV file will have both an activity date (when I pulled it) and a term code so that each sample set is associated with a term.
4. We will need to merge this CSV file with our CSV file from Moodle, which will provide:
   • a score for each sample/790
   • survey answers for each sample/790
5. At some point, we'll need to strip the 790s in another version of our merged CSV file.
[Flow diagram: 790s from the scored samples uploaded into the Banner table → Banner data with 790s extracted into a CSV file → Banner CSV file merged with our CSV file that includes 790s, scores, and survey answers.]
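A minimal sketch of steps 4 and 5 in Python with pandas is shown below; the file names and the column names (including "student_790") are assumptions for illustration, not the committee's actual schema.

```python
# Hypothetical sketch of steps 4-5: merge the Banner extract with the
# Moodle scores/survey file on the 790 identifier, then strip the 790s.
import pandas as pd

banner = pd.read_csv("banner_extract.csv")         # assumed to include student_790
moodle = pd.read_csv("moodle_scores_surveys.csv")  # 790s, scores, survey answers

# Step 4: join the two files on the shared 790 identifier
merged = banner.merge(moodle, on="student_790", how="inner")
merged.to_csv("merged_with_790s.csv", index=False)

# Step 5: write a second, de-identified version with the 790s stripped
merged.drop(columns=["student_790"]).to_csv("merged_deidentified.csv", index=False)
```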
Later:
We need to define our assessment questions (these must be answerable with the data we are collecting!).
Consider what type of tool we will use to be able to answer these questions:
• Sort our big Excel spreadsheet?
• Use Info Griz?
………………………………………………………………………………………………….
Appendix 6
Evaluation Summary:
Your name (optional) ___________________________________________
Please respond to this evaluation. Your comments will help the Writing Committee write its
2014 report and will assist in our implementation of next year’s University-wide Program-Level
Writing Assessment. Thank you.
A. Please check the statement that best reflects your knowledge and experience with writing
assessment before this retreat.
29_ 1. I have created and used rubrics to assess students’ writing.
3_ 2. I knew about rubrics, but have not used them regularly in my assessment of students’
writing.
____ 3. I did not know about rubrics for assessment of students’ writing.
B. Please place a check in the column that represents your opinion.

1. This retreat helped me understand and apply a holistic rubric to students' writing.
   Strongly Agree: 22   Agree: 9   No opinion: 1   Disagree: 0   Strongly Disagree: 0
2. This retreat helped me assess students' writing accurately and efficiently.
   Strongly Agree: 16   Agree: 14   No opinion: 1   Disagree: 1   Strongly Disagree: 0
3. This workshop was a valuable professional development experience for me.
   Strongly Agree: 24   Agree: 8   No opinion: 0   Disagree: 0   Strongly Disagree: 0
4. I would recommend this retreat to my colleagues.
   Strongly Agree: 26   Agree: 6   No opinion: 0   Disagree: 0   Strongly Disagree: 0
C. Please write your responses to these 2 items. Feel free to continue your responses on the back of this page.
• Practice applying the rubric; multiple rounds were crucial
• Hearing opinions/reasoning from colleagues in other disciplines
• Table conversation and consensus building
• Discussion with others interested in writing
• The two most important aspects were reading widely varying pieces and getting to discuss the scores for genres with which we were unfamiliar. These helped me see the range and purpose of writing courses differently than I had before; a great experience
• It is always so fruitful to see what writing at the 200-level looks like across disciplines
• Thinking about assessing writing in a holistic way
• The discussion of rankings among people at the table and coming to a consensus. Also the rubrics themselves
• Meeting/connecting with other instructors, sharing thoughts on what makes writing effective/ineffective
• Cross-discipline collaboration and discussion
• Reinforced my assessment practices; gave me a better understanding of ASCRC requirements
• Meeting colleagues, having the opportunity to share tools and challenges with other educators. Sky club was a great space. Food service was wonderful… event ran smoothly
• Lovely! This year's retreat seemed to run even more smoothly than last year's. Enjoyable and productive. Thank you
• Team building and developing consensus. Also helped to reassure me that I am fairly and effectively evaluating student writing
• Communication with colleagues
• Food was fabulous
• What a great day! Loved the holistic approach at the first section of the retreat where we trained and "normed" with the blue (anchor) papers
• It was nice to see the range of writing tasks and the capabilities of our students. It was also great to talk through writing assessment with new colleagues
• Discussing assessment with colleagues
• By exposing myself to the rubric I am able to evaluate and re-evaluate my relationship to writing instruction, both in relation to this campus' objectives and the trajectory of writing outside the academy
• The excellent discussions about the application of the holistic rubric
• Sharing thoughts on writing quality with my colleagues
• Hearing from other campus faculty that I otherwise don't get to hear from
• Listening to colleagues' views
• The discussions that occurred re: consensus
• Beverly's leadership/organization
• Megan's leadership at our table
• Table discussion concerning various themes of writing as applied to the rubric
• Evaluating/assessing the writing, using the rubric. I'd like to use it with my students
• Discussing papers assessed together as a group; hearing others' particular biases in weighing certain outcomes more than others
• Professional conversations: I valued the opportunity to learn from my colleagues in other disciplines. I learned how we read differently, value different features, and therefore assess student writing differently
• I felt that the breadth of papers (both in their variety and numbers) was great for looking at writing across campus. My table was very strong and thoughtful when talking through disagreements
• Actual practice applying rubric to student submissions, table discussions, examples at start of session of 1-4 submissions
• Having people evaluate and grade the same paper, and discussing why or why not.
• I really liked the "test" papers that we read ahead of time. It prepared me for what we were actually going to be doing. Additionally, I love the table leaders! What a fabulous idea. I am already looking forward to next year and I am also looking forward to the results of this assessment. It was a wonderful experience and I was so happy with it.
• I really enjoyed learning about UM's holistic rubric and feel all UM faculty should be encouraged to learn about it and to be trained to use it in their courses (including non-W courses)
2. What might be changed to improve this retreat?
• The retreat was excellent. Maybe give each table a copy of the agenda/schedule
• Final papers instead of drafts
• More faculty
• I thought the retreat was great, but I would recommend doing one version of this departmentally for all instructors teaching writing courses. I think it would be extremely helpful.
• Bonus for staying the whole time.
• It might be helpful to change seating order halfway through, so we could work with more participants.
• Expand the time available to go through the anchor papers and the training papers. It felt a little rushed as compared to the consensus papers.
• Include assignments, continue to recruit more tenure-track faculty, slightly slower pacing with training papers
• Earlier in the semester would be better
• A broader range of discipline-specific assignments would also be helpful
• Perhaps try to score only revised material from completed portfolios
• I didn't find the section with the purple (training) papers to be helpful to my understanding of the process
• Some group discussion about the rubric, rather than explanation, would be great. We could discuss the purposes, varied tasks, campus writing goals, etc., and the assessment of such pieces with the rubric
• Preview distinctions between expressing and supporting opinion and persuasive/argumentative composition
• The initial steps of the retreat were extremely helpful but very rushed. I think it would be infinitely valuable to have more time to acquaint ourselves with these guidelines.
• Greater variety of genres in general in training/anchor papers
• Well organized. Thank you
• We need to know the prompt; it is hard to apply a neutral rubric when reading papers from some assignment. The training paper scored 4 seemed off and we had wide scores on that one. But our "real" papers were much closer in scoring
• None
• Need to spend more time on training
• Keep improving on the diversity of the papers across disciplines. Figure out a way to share the results of the writing assessment to help instructors improve their teaching. The PD model of workshops sounds great.
• Well done! Maybe a bit more time for the initial value models
• The first part of the rubric addresses purpose, but how can we assess with such little information?
• Moments during the training sets felt rushed with little time for discussion. I think whole-group discussions about the training papers would be helpful (but I realize time-intensive, too!)
• I think perhaps handing out a written set of instructions before starting with the anchor papers might help facilitate the work we do at our tables.
• How do the actual assignments of the papers impact assessment?
• Atmosphere/location of retreat excellent; conducive to a productive retreat
• More relevant/interesting dynamic for me to review scholarship in my discipline in such a retreat
• Food service was great
• We need such a session held for our professors at the School of Business if possible
• It was awesome, I really enjoyed it. It was also a good place for networking with fellow faculty.
• The timing was tough; we did not have enough time in our group to go through all the test papers.
• I thought the overall approach was terrific! A big strength was having a table leader to help instruct, facilitate and guide.