REPORT ON THE GENERAL EDUCATION GOAL OF ORAL COMMUNICATION DEMONSTRATED AT THE 2013 UNDERGRADUATE RESEARCH & SERVICE CELEBRATORY SYMPOSIUM

Center for Teaching and Assessment of Learning
Kevin R. Guidry, Senior Research Analyst
Kathleen Langan Pusecker, Director of Educational Assessment
August 2013

Undergraduate Research Symposium Results, 2011-2013

This report examines the University of Delaware (UD) General Education goal of Oral Communication as demonstrated by undergraduate students at the fourth Annual Undergraduate Research and Service Celebratory Symposium, held at the University of Delaware on August 8, 2013, and compares these scores with those collected during the previous two years. The Center for Teaching and Assessment of Learning (CTAL) collects these data to evaluate Summer Scholar students' oral communication skills, an essential UD General Education competency. We also collect data on this competency to examine the effect of the electronic portfolio used by Summer Scholars to document their undergraduate research experience and improve their oral communication skills.

Data Collection and Analysis

The 2013 data in this report were collected by CTAL's Senior Research Analyst, who rated 19 of the more than 220 poster presentations given by undergraduate students from various disciplines at UD and by students from outside UD who worked with university faculty to conduct research. The same rubric (see Appendix) used in the previous two years was used to evaluate this summer's poster sessions: a developmental rubric adapted from the Association of American Colleges and Universities (AAC&U) Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics. The rubric assesses students' ability to present material along five criteria for oral presentations: 1) Central message; 2) Organization; 3) Language use; 4) Delivery; and 5) Supporting materials. These five criteria were assessed on a scale of 1 to 4, with 4 indicating that a student successfully demonstrated the qualities expected of a sound oral presentation and 1 indicating that a student failed to demonstrate those qualities. The rubric and some identifying information (i.e., evaluator, poster number, and poster category) were entered into a Qualtrics Web-based survey for data collection and analysis.

As shown in Table 1, 14 poster presentations were rated this year,[1] for a total of 85 poster presentations rated across all three years.

Table 1: Number and types of poster presentations rated each year

                      2011        2012        2013
  Arts                1 (6%)      0 (0%)      0 (0%)
  Sciences            12 (75%)    49 (89%)    12 (86%)
  Social Sciences     3 (19%)     4 (7%)      2 (14%)
  Humanities          0 (0%)      2 (4%)      0 (0%)
  Total               16          55          14

[1] In total, 19 posters were evaluated. Posters were not systematically selected prior to the symposium, and five of the posters were removed from further analysis because subsequent investigation revealed that they were presented by students from other universities or, in one case, a local high school. Two Delaware Technical Community College (DTCC) students were retained in the analysis because there are close ties between DTCC and UD, particularly through UD's Associate in Arts program, with significant overlap in those student bodies.

2013 Ratings

Overall, the 14 UD and DTCC students assessed during the 2013 symposium were found to have less mastery of the criteria expected for oral presentations than desired. The overall average score and the scores for each criterion are shown in Table 2. Across all criteria on the four-point scale, the mean rating was 2.7. Half of the UD students in this sample are rising seniors and the other half are rising juniors or sophomores, so this score indicates that the students are performing slightly below the expected level.

Table 2: Student poster session ratings

                      Central    Organization   Language   Delivery   Supporting
                      message                   use                   materials
  Sessions rated 1    1 (7%)     0 (0%)         3 (21%)    0 (0%)     0 (0%)
  Sessions rated 2    5 (36%)    4 (29%)        6 (43%)    2 (14%)    4 (29%)
  Sessions rated 3    7 (50%)    8 (57%)        5 (36%)    7 (50%)    9 (64%)
  Sessions rated 4    1 (7%)     2 (14%)        0 (0%)     5 (36%)    1 (7%)
  Average rating      2.6        2.9            2.1        3.2        2.8

Longitudinal Comparisons: 2011-2013

This is the third consecutive year that CTAL has rated student poster presentations using the same rubric, so we can make longitudinal comparisons. As shown in Table 3, ratings in 2013 were the lowest of this three-year period on every criterion, and consequently the overall rating was also lowest in 2013.

Table 3: Longitudinal comparison of poster presentation ratings, 2011-2013

                          2011    2012    2013
  Central message         2.9     3.3     2.6
  Organization            3.3     3.3     2.9
  Language use            3.2     3.2     2.1
  Delivery                3.4     3.5     3.2
  Supporting materials    2.9     3.3     2.8
  Overall                 3.1     3.3     2.7

Discussion

Ratings for this year's poster presentations are noticeably lower than in the previous two years. The criterion on which students scored the lowest, "Language use," reflects that many students used technical language that was not appropriate for laypersons. The middling average score for the "Central message" criterion has a similar origin: students focused on technical details without placing appropriate emphasis on the underlying meaning and purpose of their research. In most presentations, students only expressed the central theme of their research near the end of the poster presentation or slipped it in as an aside instead of leading with it and reinforcing it throughout the discussion. Scores for the "Organization" and "Supporting materials" criteria show that although many students were somewhat mired in technical language, their posters and discussions were organized and set up to support those technical details.

The criterion on which students received the highest score, "Delivery," reflects a commendable level of poise and confidence in their work. Students with whom we interacted were uniformly enthusiastic about their research. As in previous years, this tremendous enthusiasm at times led students to provide too much information, which often clouded the central message. However, some students appeared more eager to converse with their peers than with other visitors, and other students appeared to actively avoid engaging with anyone (e.g., conspicuously avoiding eye contact or turning their backs to visitors). Although we expect some students to be nervous and somewhat withdrawn, it may be helpful to impress on all of the students attending the symposium their role as eager hosts ready to share information with their many guests.

There was also a conspicuous lack of research related to art. The program lists only two students whose artwork was exhibited, and there were no poster presentations by art students. To the best of our knowledge, all Summer Scholars were required to participate in this symposium, so this is puzzling: it implies either that art students were not funded to conduct research or that they did not participate in this event with their classmates.
Limitations

This assessment has several limitations. First, time and personnel constraints make it difficult for us to speak with more than a handful of students. Second, although we have done our best each year to speak with a wide variety of students, there is no guarantee that the students with whom we spoke are representative of all students who presented posters. Finally, this assessment occurs only once each year, so there is little opportunity for us to calibrate our evaluators and instrument; in other words, some of the differences in scores, particularly from year to year, may be due to differences in how raters used the instrument rather than differences between poster presentations.

Conclusion

This was the fourth consecutive year that the Undergraduate Research Program used an electronic portfolio system designed to enhance UD Summer Scholars' ability to attain the learning goals of oral communication, critical thinking, ethical reasoning, and creative thinking. In 2012, Summer Scholars who used the electronic portfolio also engaged in other activities to enhance their oral presentation skills: they attended communication workshops, reviewed tapes of their own performances, reflected on strategies to improve their performance, and received feedback from their peers and group leader about how to improve these skills. In 2013, these activities may not have been emphasized as heavily, and we believe this may have contributed to the lower ratings for the 2013 presentations compared to the 2011 and 2012 presentations.

Appendix: Poster Presentation Rubric

2013 Oral Presentations Rubric - Undergraduate Research Symposium

Who was the judge?  __ Kathy Pusecker  __ Kevin R. Guidry
# of Poster: _____
Indicate presentation / poster area:  __ Arts  __ Sciences  __ Social Science  __ Humanities

Oral presentation and poster rubric

Each criterion is scored on a 4 (highest) to 1 (lowest) scale; descriptors are given below for levels 4 and 1.

Central Message
  4: Central message is compelling. (Precisely stated, appropriately repeated, memorable, and strongly supported.)
  1: Central message can be deduced but is not explicitly stated in the presentation.

Organization
  4: Introduction, sequenced material, transitions, and conclusion lead to a logical organizational pattern.
  1: Organization is not evident; the presentation lacks an introduction, conclusion, and transitions.

Language Use
  4: Word choice is appropriate for the audience. Words emphasize the central message and enhance the effectiveness of the presentation.
  1: Word choice is inappropriate for the audience and detracts from the effectiveness of the presentation.

Delivery
  4: Voice volume, posture, gestures, and eye contact make the speaker appear polished and confident. Delivery supports the main message and enhances the effectiveness of the presentation.
  1: Voice volume, posture, gestures, and eye contact make the speaker appear unpolished and uncomfortable. Delivery detracts from the central message and makes the presentation ineffective.

Supporting Materials
  4: Speaker uses supporting materials (examples, graphs, visual aids, meaningful quotations, statistics) to establish project credibility. Supporting materials or poster visuals are appropriate to the main message.
  1: Speaker lacks supporting materials or provides inappropriate materials (examples, graphs, visual aids, meaningful quotations, statistics); project credibility is questionable.