ASCRC Writing Committee Minutes, 4/7/14, TODD 204

Members Present: S. Brown, G. Burns, B. Chin, J. Glendening, J. Melcher, D. Raiford, M. Stark, M. Triana, G. Weix
Ex-Officio Members Present: G. Harris, A. Ratto-Parks, K. Webster
Members Absent/Excused: I. Appelbaum, C. Corr, N. Kimbell
Guests: N. Clouse, Teya Tietja

Chair Chin called the meeting to order at 10:10 a.m. The minutes from 3/3/14 were amended and approved.

Communication Items:
Professor Brown introduced her guest Teya Tietja, a resident in Pharmacy Practice, who will attend the Writing Retreat in Professor Brown's place.
Graduate student member Triana was congratulated on his acceptance and teaching assistantship in the Rhetoric and Composition PhD Program at the University of Washington.
Associate Provost Walker-Andrews sent correspondence that the Northwest Commission on Colleges and Universities (NWCCU) was impressed with our writing assessment efforts and the holistic scoring rubric.

Business Items:
Assessment Coordinator Kimball was unable to attend the meeting but provided a written update. Many instructors do not require the final revised paper until the end of the semester. Nancy Clouse informed the committee of the Moodle submission progress: to date, 117 surveys have been completed and 108 papers have been submitted. Some students discover after completing the survey that the paper should be one that has been revised; Nancy or Naomi can clear the survey so students can resubmit it with their paper. Students can resubmit papers at any time. The deadline for paper submission is Friday, April 11th. This spring the retreat will have to use what is submitted, but in the future papers should be collected the semester prior to the retreat, which will allow the final paper to be uploaded into Moodle.

Chair Chin asked members to think about the timeline for the next Writing Assessment Retreat. Although the criteria for writing courses require revision, which ideally should occur throughout the semester, there is no requirement for when the revised papers are due. The key issue with regard to full implementation is communication: some faculty are still unclear about the function and process of assessment. One recommendation was to create a guide or website for Writing Course instructors that lists expectations, including syllabus language, with resource links, dates, and FAQs. This could help create a community and improve the image of adjunct writing instructors. Professors Brown, Raiford, and Ratto-Parks will work on the website/guide. There could also be an FAQ page for students. Communication about the assessment requirement could be made clearer through the formal academic structure, not just emails to writing instructors; the issue should be discussed at faculty meetings. Student participation could be required to complete the course, or faculty could give a participation grade; currently there is no consequence attached to non-participation. Members should think of template language to be included on syllabi. Part of the confusion could be the legacy of the UDWPA.

The confirmation email will be sent to the 56 faculty who responded to the save-the-date invitation. The communication includes directions to the venue and a request for any dietary restrictions. Once the final participant list is confirmed, Chair Chin will work on the table assignments and invite faculty to be table leaders. Director Webster distributed the final list of data fields and explained the process (appended below).
The student petition to use a transfer course to satisfy the approved writing course requirement was more complicated than anticipated. An issue that may require further consideration is whether the Committee should grant exemption or only equivalence. In this case the dual credit course was acceptable as an equivalent to WRIT 201, but the student was asking for an exemption because she already had enough credits to graduate. The Committee may see additional petitions as more students are admitted with dual credit courses. If it becomes a problem, the policy can be revisited. Camie and Amy will update the policy with review language and a note that notification will occur via the student's umontana email account.

The Mansfield Library is offering a one-credit experimental course next fall, LSCI 391 Advanced Research Literacies. The Academic Standards and Curriculum Review Committee (ASCRC) is communicating its support for the course at the Faculty Senate meeting on April 10th. There have been complaints that upper-division students do not know how to conduct research. It is hoped that faculty, specifically instructors of upper-division writing courses, will encourage students to take the one-credit course.

The Committee revised and approved the assessment language on the form below. It will be expanded with the syllabus template information for final approval next month.

VIII. Assessment
I will participate in the University-wide Program-level Writing Assessment by requiring students in this course to upload a sample paper to the designated Moodle location. Yes
IX. Syllabus: Paste syllabus below or attach and send digital copy with form. The syllabus must include the list of Writing Course learning outcomes above.

Professor Raiford completed the data analysis comparing transfer and non-transfer students' performance in upper-division writing courses. The study showed that non-transfer students perform statistically significantly better in upper-division writing courses; their mean GPA is slightly higher (3.22 vs. 3.13). The committee was very appreciative of Professor Raiford's analysis. Chair Chin asked that the research question and population size be included in the overview (see appended report). The data will be provided to the Geography Department and Associate Provost Walker-Andrews, included in assessment materials, and appended to the Writing Committee's annual report. The Committee will need to approve the annual report at the May meeting. Professor Weix and Director Webster volunteered.

Several majors do not have a specific upper-division writing course. The Committee may want to offer some suggestions for these majors in terms of preferred courses, or perhaps a creative option for meeting the requirement with a co-requisite course such as the Advanced Research Literacies course offered by the Library.

Chair Chin asked committee members to consider whether the Committee should plan for a writing retreat next fall and spring, or perhaps a professional development workshop on writing. Some members felt that two retreats would be too much and would rather host a half-day workshop. The committee was asked to think of possible dates in September and of topics; the review of the writing course forms could suggest a workshop topic. The Committee may also consider a date during winter break for the writing retreat, as other faculty development workshops have been well attended during this time.
In addition, it was suggested that the writing retreat be included in the Faculty Development Office marketing materials and scheduling system. Camie will follow up with Director Amy Kinch regarding the possibility.

Chair Chin asked that members think about recruiting for next year's committee. In particular, the graduate students might recommend other students in their programs; the students' contribution to the committee has been very much appreciated. It would also be helpful to have representation from the School of Business, Forestry, and additional sciences to include more voices in the generative discussion and work.

Good and welfare
Professor Weix attended a workshop on accessibility. This might be a good topic for a workshop.
In July, Chair Chin will be sharing our University-Wide Program-Level Assessment in her Keynote Address at the Institute of Critical Issues: Assessment for School Leaders and Teachers, sponsored by the Conference in English Leadership.

Adjournment
The meeting was adjourned at 12:00 p.m.

University-wide Program-level Writing Assessment Steps and Process

Steps to create data capabilities
1. Meet with Julie to work out data needs and Banner names for each data point
2. Julie files a JIRA request for creation of the Banner table that will house our data
3. Test the new table (use a sample of 790s in a csv file, which I will upload to the Banner table)

Process for pulling data AFTER EACH RETREAT (once the Banner table is created)
1. Get the 790s of our scored samples (CSV file from Nancy)
2. I upload the 790 csv file via WINSCP and request that Judy Grenfell upload these 790s to our Banner table (same process as our WC database upload)
3. Once I get the email confirmation that the load is complete, send a data extract job request to Judy Grenfell. This will result in a csv file that includes the scored sample 790s and the Banner data we have requested. This csv file will have both an activity date (when I pulled it) and a term code, so that each sample set is associated with a term.
4. We will need to merge this csv file with our csv file from Moodle, which will provide a score for each sample/790 and the survey answers for each sample/790
5. At some point, we'll need to strip the 790s in another version of our merged CSV file (see the illustrative sketch at the end of this appendix)

In summary:
790s from the scored samples uploaded into the Banner table
Banner data with 790s extracted into a CSV file
Banner CSV file merged with our CSV file that includes 790s, scores, and survey answers

Later: We need to define our assessment questions (they must be answerable with the data we are collecting!). Consider what type of tool we will use to be able to answer these questions: sort our big Excel spreadsheet? Use Info Griz?
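For reference, steps 4 and 5 above might look roughly like the following in R, the language already used for the committee's analysis scripts. This is a minimal sketch under assumptions: the file names (banner_extract.csv, moodle_scores.csv) and the shared key column name (id_790) are illustrative placeholders, not the actual names used in the extract.

# Sketch of steps 4-5: join the Banner extract with the Moodle export on the
# shared 790 ID, then write a second, de-identified copy with the 790s stripped.
# File and column names here are assumptions for illustration only.
banner <- read.csv("banner_extract.csv", stringsAsFactors = FALSE)  # Banner data keyed by 790
moodle <- read.csv("moodle_scores.csv", stringsAsFactors = FALSE)   # scores and survey answers keyed by 790

merged <- merge(banner, moodle, by = "id_790")                      # step 4: one row per scored sample
write.csv(merged, "merged_with_790s.csv", row.names = FALSE)

deidentified <- merged[, setdiff(names(merged), "id_790")]          # step 5: strip the 790s
write.csv(deidentified, "merged_no_790s.csv", row.names = FALSE)

Note that merge() keeps only the 790s present in both files by default; unmatched rows would need all = TRUE to be retained.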
Assessment of Transfer Student Writing Performance

Overview
A recent assessment of student writing proficiency performed by the Geography Department (UM Geography Assessment Report: 2013) found highly variable performance by students in their program. They posited that a cause could be the large number of transfer students in their program. The ASCRC Writing Committee decided a follow-up assessment was in order and pulled student grades for all writing courses (including WRIT 101, Approved Writing Courses, and Upper-Division Writing Courses) for all students, both transfer and non-transfer. The data included records for a total of 16,334 students, of which 3,913 were transfer students. Of the total, 1,655 transfer students had taken upper-division writing courses, while 2,374 non-transfer students had. The specific question at hand became "Do transfer students perform as well in upper-division writing courses as non-transfer students do?"

Summary of Findings
A subcommittee composed of Camie Foos, Amy Ratto-Parks, Grace Harris, and Doug Raiford acquired the writing-course grade data and met on Monday, 2/24/2014, at 2:00 p.m. in UH 221 to discuss how to go about the analysis. In this meeting it was decided that a comparative analysis of transfer and non-transfer students would be performed, and that achieved grades in upper-division writing courses would be taken as a measure of the success of previous, foundation-building courses. A Welch two-sample t-test (unpaired) was performed on the two distributions of grades in upper-division writing courses from transfer and non-transfer students. Non-transfer students do perform significantly better than transfer students (α = 0.05, p-value = 0.008; mean transfer-student grade 3.13; mean non-transfer-student grade 3.22). While statistically significant, non-transfer students' GPA performance is only 0.088 grade points higher than that of transfer students, making it difficult to conclude that transfer students are the root cause of any highly variable writing performance. This is supported by the fact that transfer students who take their foundation-building courses at their transfer institution perform better (though not significantly so) in their upper-division writing courses than do transfer students who take those courses here.

Detailed Analysis
The data included student grades from 2008 to 2013: the performance of 1,655 transfer students who have taken upper-division writing courses and 2,374 non-transfer students who have taken upper-division writing courses. Figure 1 shows the similarity in grade distributions between the two populations.

Figure 1. Grade distributions for transfer and non-transfer students

Breakdown of Transfer Student Composition
When upper-division writing course grades for transfer students are examined, those who took WRIT 101 at their previous institution earned slightly better grades than those who took it here (not statistically significant, α = 0.05, p-value = 0.5; mean GPA for those who took 101 here 3.04; mean GPA for those who took 101 elsewhere 3.12). Figure 2 shows the similarity in distributions (though the two groups are clearly disproportionate in size).

Figure 2. Distributions of Upper Division Writing Course Grades of Transfer Students that Took WRIT 101 Here and Abroad

Figure 3 clearly shows this disparity in the numbers of transfer students who took 101 here and elsewhere.

Figure 3. Proportion of Transfer Students that Took WRIT 101 Here and Abroad

Similar results are found for those who took Approved Writing Courses at their previous institution (slightly better performance for those who took them elsewhere; 3.23 vs. 3.09; not significantly different, α = 0.05, p-value = 0.19). Figure 4 depicts the similarity in distribution, and Figure 5 shows that those who transfer in Approved Writing Course credit are outnumbered by those who do not.

Figure 4. Distributions of Upper Division Writing Course Grades for Transfer Students that Took Approved Writing Courses Here or at their Transfer Institution

Figure 5. Proportion of Transfer Students that took Approved Writing Courses Here and Abroad
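The comparisons above are unpaired Welch two-sample t-tests. A minimal sketch of how they might be run in R follows; the grade vectors are assumed to have been extracted from the grade data beforehand, and their names simply mirror those in the raw output in the appendix below.

# Welch (unequal-variance) two-sample t-tests, as described in the Summary of Findings.
# The grade vectors are assumed to exist already; names mirror the raw output in the appendix.
t.test(upOneHere, upOneThere, var.equal = FALSE)  # transfer students: WRIT 101 here vs. elsewhere
t.test(upAppHere, upAppThere, var.equal = FALSE)  # transfer students: Approved course here vs. elsewhere
t.test(combined, upNonX, var.equal = FALSE)       # transfer vs. non-transfer upper-division grades

At α = 0.05, only the last comparison (p ≈ 0.008) is significant, consistent with the findings reported above.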
Appendix—Raw Results

From perl script:
there are 3913 students that transferred with greater than 27 credits
2852 of them received credit for 101; this represents a percentage of 72.89; their average score was 3.27
of the transfer students, 203 took 101 once they got here; this represents a percentage of 5.19; their average score was 3.30; 11 of them took it and withdrew
306 students received credit for 201; this represents a percentage of 7.82; their average score was 3.00
of the transfer students, 25 took 201 once they got here; this represents a percentage of 0.64; their average score was 3.10; 2 of them took it and withdrew
344 students received credit for Approved; this represents a percentage of 8.79; their average score was 3.04
of the transfer students, 904 took Approved once they got here; this represents a percentage of 23.10; their average score was 2.74; 0 of them took it and withdrew
0 students received credit for Upper; this represents a percentage of 0.00; their average score was 0.00
of the transfer students, 1660 took Upper once they got here; this represents a percentage of 42.42; their average score was 3.13; 141 of them took it and withdrew at least once

From R script: Transfer Students Only

        Welch Two Sample t-test

data:  upOneHere and upOneThere
t = -0.6675, df = 101.022, p-value = 0.506
alternative hypothesis: true difference in means is not equal to 0
95 percent confidence interval:
 -0.3467311  0.1721418
sample estimates:
mean of x mean of y
 3.037912  3.125207

        Welch Two Sample t-test

data:  upAppHere and upAppThere
t = -1.3261, df = 156.958, p-value = 0.1867
alternative hypothesis: true difference in means is not equal to 0
95 percent confidence interval:
 -0.31780300  0.06248625
sample estimates:
mean of x mean of y
 3.098770  3.226429

From R script: Transfer and Non-Transfer Students

        Welch Two Sample t-test

data:  combined and upNonX
t = -2.651, df = 3383.481, p-value = 0.008063
alternative hypothesis: true difference in means is not equal to 0
95 percent confidence interval:
 -0.15349437 -0.02297637
sample estimates:
mean of x mean of y
 3.134177  3.222413
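For quick verification, the percentages in the perl-script tallies are simple proportions of the 3,913 transfer students, and the "0.088 points" cited in the summary is the difference between the two means in the final t-test. A small arithmetic sketch, using only the counts and means quoted above:

# Arithmetic check of the appendix figures, using counts and means quoted above.
n_transfer <- 3913
round(100 * 2852 / n_transfer, 2)  # 72.89: transfer students with credit for WRIT 101
round(100 * 1660 / n_transfer, 2)  # 42.42: transfer students who took upper-division writing here
3.222413 - 3.134177                # 0.088: gap between non-transfer and transfer mean grades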