MSTA Journal
FALL 2015 • VOLUME 60.2
A Publication of the Michigan Science Teachers Association

Articles
1	Evidence-Based Learning for the Student and the Teacher: A Mutations Capstone Activity
15	American Wilderness Leadership School (AWLS), Jackson, WY: An Urban Educator's Perspective of and Experience in the Wild, Wild West
20	Informative Curriculum on Antibiotic Resistance Inspires High School Biology Students to Use Antibiotics Wisely
22	Tips to Save Your High School Science Teaching Sanity
28	Assessing Students' Abilities to Use Cross-Cutting Literacy Skills and Scientific Practices
41	Byron Center CERR Rubric 2015
44	Holy Ichythoplankton Batman! Science Research as a Teacher at Sea
48	Gender Effect (In the Zone, Same Treatment) in a Mixed Gender Classroom, Part Three: As it Relates to Superior Content Retention

Classroom Activities
54	Seasonal Visibility of Stars, and Visibility of Planets in 2015-2017, from Positions of Planets in Their Orbits
61	Rethinking the Egg Drop with NGSS Science and Engineering Practices
68	Twitter in the NGSS Classroom
70	Seasonal Analysis: A 5E Lesson on Michigan Weather and the "Reason for the Seasons"

MSTA Board Members

Staff
Editor – Chris Chopp
Graphic Design & Layout – Shawn Detlor

Article Submission
Articles for publication in the MSTA Journal are invited on a contribution basis and are subject to editorial review. Please submit articles via email to Chris Chopp. Every attempt will be made to publish within a year after approval for publication.

Chris Chopp, MSTA Journal Editor
E-mail: cchopp@gmail.com

Other publications are hereby granted permission to reproduce articles from the MSTA Journal provided the publication and author are properly credited and a copy of the publication is forwarded to the Association for its records. Copyrighted articles are noted, and permission to use them should be requested directly from the authors.
The MSTA Journal is published two times per year and sent to approximately 2,000 MSTA members. Inquiries should be sent to MSTA Office, 1390 Eisenhower Place, Ann Arbor, Michigan 48108. Phone (734) 973-0433. Fax (734) 677-2407. Membership information is available on our website: http://www.msta-mich.org

Executive Director – Robby Cramer
President – Charles Bucinenski
President Elect – Jennifer Arnswald
Past President – Michael Sampson
Secretary – Betty Crowder
Treasurer – Mike Klein
Parliamentarian – Marlenn Maicki
Directors at Large – Jeff Conn, Diane Matthews, June Teisan
Higher Education Director – Charles Dershimer
Elementary Director – Crystal Brown
Middle Level Director – Yonee Bryant-Kuiphoff
High School Director – Kathy Mirakovits
Curriculum Director – Holly McGoran
Executive Editor – Cheryl Hatch
Journal Editor – Chris Chopp
Newsletter Editor – Wendy Johnson
Historian – Vacant
Awards – Marlenn Maicki
Membership Chair – Paul Drummond
Technology Chair – Robert Bacolor
Special Education – Vacant
Evolution Committee – Greg Forbes
Science Matters Network – David Bydlowski
Director Under Represented Groups – Deborah Peck-Brown

Regional Directors
Region 1 – Donna Hertel
Region 2 – Rachel Badanowski
Region 3 – Derek Sale
Region 4 – Susan Tate
Region 5 – Conni Crittenden
Region 6 – Brian Peterson
Region 7 – Pete Peterson
Region 8 – David Brown
Region 9 – Jennifer Richmond
Region 10 – Carolyn Mammen
Region 11 – Vacant
Region 12 – Jackie Huntoon
Region 13 – Carolyn Lowe
Region 14 – Lynn Thomas

State Organization Representatives

Experiments, laboratory activities, demonstrations, and other descriptions of the use of chemicals, apparatus, and instruments are illustrative and directed at qualified teachers. Teachers planning to use materials from the MSTA Journal should consider procedures for laboratory and classroom safety to meet their local needs and situation. The MSTA Journal cannot assume responsibility for uses made of its published material.
MSTA Journal • FALL 2015

MABT – Cheryl Hach
MAEOE – Vacant
MCCB – LuAnne Clark
MCTA – Mary Jordan McMaster
MDE – Stephen Best
MDSTA – Erica Ballard
MEA – Vacant
MESTA – Timothy Neason
MIAAPT – Alex Azima
MSELA – Marlenn Maicki
MSO – Vacant
SCST – Sandra Yarema
DMAPT – Jeff Conn

Evidence-Based Learning for the Student and the Teacher: A Mutations Capstone Activity
Anne Jeannette LaSovage, Southfield-Lathrup High School

Background
This school year I have been making a deliberate effort to include more explicit emphasis on the NGSS science and engineering practices in my 9th grade biology lessons. While some units afforded very natural amendment opportunities, the content of other units seemed more fact- or process-based and required additional effort to more obviously include NGSS elements. One unit in particular that I felt could benefit from incorporating changes was the DNA/Protein Synthesis Unit. Specifically, I wanted students to move beyond factual recall and demonstrate a richer understanding of the big ideas, ideally internalizing the big ideas through the process. I also wanted to find a way to incorporate the argumentation and communication skills we had been working on all year. The following describes an activity that was added as a brief capstone to the existing unit. Prior to implementation of this lesson, students had completed learning activities about DNA structure and function and about protein synthesis, ending with the topic of mutations. The capstone required students to construct and defend evidence-based claims regarding mutations. A specific process goal of this activity was to get students using, talking about, and evaluating evidence. Regarding the format of the lesson, my students have been having good success utilizing "C-E-R Boards" (large whiteboards on which they present a team argument using a claim, evidence, and reasoning format). Students have also developed skills with structured discussion.
This was initially begun using "Talk Prompts/Science Talk" and has been built on throughout the year. (For more information, see references at the end of this article.) This activity incorporates the C-E-R format as a starting point, and then takes students to the next level as they use a rubric to evaluate their own use of evidence and the evidence and arguments of others.

The Lesson

Day 1: Preparation
At the conclusion of the notes and activities about mutations (the last topic of protein synthesis), students were given this task:

END OF CLASS/HOMEWORK: Answer at least 3 of these questions thoughtfully in your notebook. Use evidence and examples to support your answers.

Questions
1. Which kind of mutation usually messes up the message more: a point mutation or a frame shift mutation? (Explain!)
2. What happens if a mutation makes the code for the STOP codon? (Elaborate!)
3. Is there redundancy in the genetic code? (Can a mutation accidentally code for the same amino acid anyway?) (Give examples!)
4. Does the answer to any of these change if the location of the mutation changes (e.g. if the mutation occurs at the beginning of a gene vs. the middle or the end of the segment, does it make a difference)? (Explain!)

This reflection activity gave students an opportunity to synthesize and apply their learning about mutations, allowed choice (a frequent aspect in my class), and began the thought process needed for the next portion of the lesson.

For homework that night, students were given 6 strands of mRNA to translate into amino acid sequences. These were:

Translate these into their amino acid sequences:
STRAND 1: A U G C C C U A U A U A G G C A U U
STRAND 2: A U G C C A U A U A U A G G C A U U
STRAND 3: A U C C A U A U A U A G G C A U U
STRAND 4: A U G G C C C U A U A U A G G C A U U
STRAND 5: A U G C C C U A A A U A G G C A U U
STRAND 6: A U G C C C U A U A U A G G C A U U U

Teacher notes: The differences between these strands were intentionally chosen to represent a variety of mutation possibilities:
Strand 1: original strand
Strand 2: early point mutation with neutral result
Strand 3: early deletion
Strand 4: early insertion
Strand 5: point mutation resulting in a stop codon
Strand 6: late insertion with neutral result
[Note: This planning information was not shared with students at this time.]

*Disclosure note: In the initial delivery of the lesson, I had a typo on the homework, resulting in two strands being identical. I fixed this by adding a strand in the journal for the following day. For clarity purposes, I am including the corrected strands.

Day 2: Construction of C-E-R Boards
On the next class day, students began class with a journal which asked the following: "Which of the strands [from the homework] is the most different from the original strand (Strand 1)? Explain your reasoning."

We spent time reviewing the correct answers to the homework (see below) and discussing students' different answers to the journal prompt. Strands and their respective amino acid sequences were manipulated on the Smartboard during this discussion. Some students commented on the differences in the nitrogen bases between strands, including the number of bases. However, most contrasted the amino acid sequences.
Of those, students were divided into two major groups: those who thought having a premature STOP codon made a strand most different and those who felt that the total number of differing amino acids in the sequence was a more significant measure.

Translations of Strands
STRAND 1: met – pro – tyr – ile – gly – ile
STRAND 2: met – pro – tyr – ile – gly – ile
STRAND 3: ile – his – ile – STOP – (ala)
STRAND 4: met – ala – leu – tyr – arg – his
STRAND 5: met – pro – STOP – (ile) – (gly) – (ile)
STRAND 6: met – pro – tyr – ile – gly – ile

Students, in teams, were then given the prompt for the C-E-R: "Which is worse: a point mutation or a frameshift mutation, and is that always the case?" The phrasing of the question was purposefully open. I did not want to influence students too much by indicating that the location of a mutation or other factors might influence its relative impact. [Hopefully many groups would discover this on their own, or the concept could be drawn out during the discussion portion of the lesson.] However, I did want to leave the door open to students who were already at that sophistication level, or who were engaged by the "intrigue" of the implication that the answer might somehow vary.

Students used the remainder of the hour for work time on their C-E-Rs. I instructed teams that the evidence portion of their work would be the most important part of today's assignment and reminded them that evidence is not just a rephrasing of the claim, but actual data or examples that support it. I made it explicit that quality evidence would be the primary evaluation factor on these boards.

At the end of the hour, I digitally photographed each team's C-E-R board prior to erasing in order to capture the data for the next class period. (As I only have one set of large boards, this is common practice for lessons when construction and discussion occur on separate days.) Sample student responses are included as an appendix to this article.
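For readers who want to double-check the strand translations above, the homework strands can be run through a short script. This is a minimal sketch, not part of the lesson materials; the codon table is pared down to only the codons that occur in these six strands, and translation halts at the first STOP codon (so the parenthesized amino acids listed after a STOP above are omitted).

```python
# Minimal sketch (not part of the lesson materials): translating the
# six homework mRNA strands. The codon table below covers only the
# codons that actually appear in these strands.

CODONS = {
    "AUG": "met", "CCC": "pro", "CCA": "pro", "UAU": "tyr",
    "AUA": "ile", "AUU": "ile", "AUC": "ile", "GGC": "gly",
    "GCA": "ala", "GCC": "ala", "CAU": "his", "CUA": "leu",
    "AGG": "arg", "UAA": "STOP", "UAG": "STOP",
}

def translate(mrna: str) -> list:
    """Read the strand in triplets, stopping at the first STOP codon.

    Trailing bases that do not form a full codon are ignored, which is
    how the late-insertion strand (Strand 6) ends up neutral.
    """
    amino_acids = []
    for i in range(0, len(mrna) - 2, 3):
        aa = CODONS[mrna[i:i + 3]]
        amino_acids.append(aa)
        if aa == "STOP":
            break
    return amino_acids

strands = {
    1: "AUGCCCUAUAUAGGCAUU",   # original
    2: "AUGCCAUAUAUAGGCAUU",   # early point mutation, neutral result
    3: "AUCCAUAUAUAGGCAUU",    # early deletion (frameshift)
    4: "AUGGCCCUAUAUAGGCAUU",  # early insertion (frameshift)
    5: "AUGCCCUAAAUAGGCAUU",   # point mutation creating a STOP codon
    6: "AUGCCCUAUAUAGGCAUUU",  # late insertion, neutral result
}

for number, strand in strands.items():
    # Strand 1 prints: 1 met-pro-tyr-ile-gly-ile
    print(number, "-".join(translate(strand)))
```

The output matches the Translations of Strands list: the point-mutation strand (2) and the late insertion (6) translate identically to the original, while the frameshifts (3 and 4) scramble everything downstream.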
Day 3: Evaluation Day
Students began the hour with a journal:

A student makes the CLAIM that a teacher has entered the wrong grade into MiStar [our school's grading system] for an assignment he has done. Which of the following EVIDENCE would be most convincing to make the teacher change the grade?
• The student elaborating on what the assignment is
• The student showing the teacher another student's copy of the assignment
• The student getting another student to confirm his story
• The student showing the teacher the paper with the other grade on it
• The student repeating the request to change the grade every day for a week

It should be noted that the journal intentionally had an obviously correct answer. The purpose of this opening activity was to make it overt that evidence is not just a repeat of the claim or an explanation of the problem but must consist of examples or other support that convinces an observer. Each of the incorrect journal answers reflects a common tactic I have seen my students use when attempting to present an "argument." During journal discussion, students were able to identify these real-life examples as ineffective argumentation.

After the journal, each team received a packet containing numbered images of the C-E-R whiteboards from their class on the previous day. (These were printed from the digital images.) Each group also received one team rubric. (A copy of the rubric is available at the end of this article.) As a class, we reviewed the rubric expectations and clarified the purpose and use of rubrics (setting expectations and ensuring consistency). Student teams then evaluated each of the C-E-Rs from their class, including their own. Comments were also required for each of the whiteboards. Scores and comments were recorded on the group rubric. In addition, after reviewing all the boards, each team was asked to write one recommendation to improve their own answer.
*Students were allowed to assign ½ points on the rubric scores.

When the class was finished with this task, we resumed a whole-class format. We reviewed our expectation that if we all used the same rubric on the same boards, our scores of each board should be reasonably close. To see if this indeed occurred, I projected an image of the first C-E-R onto the Smartboard and had students hold up fingers to represent the score their team had assigned this answer. (Finger-holdup is a common strategy in my classroom. It allows me and the students to visually get a pulse on the status of the class.) When student ratings differed significantly among the teams, I encouraged intentional discussion. [See references for more details on strategies for this.] At the conclusion of the discussion for each board, I revealed to students how I had rated that particular board and elaborated on my rationale as needed.

After reviewing all the boards, I introduced (for students who had not already come to this understanding themselves) that location and character of a mutation do matter. I then proposed and modeled my own C-E-R answer, claiming that while frameshift is generally a worse type of mutation, there are circumstances when a point mutation is worse. For my evidence, I revisited the homework strands, highlighting and contrasting in particular the strand with the point mutation resulting in a STOP codon and the strand containing the neutral-result insertion. During the subsequent discussion, we also contrasted neutral-effect point mutations (made possible by redundancy in the genetic code) with point mutations that result in different amino acids or lead to inappropriate STOP codons.

To close the hour, students briefly summarized their own personal answer for themselves. They also reflected on their successes and areas for improvement throughout the C-E-R and rubric process.
Conclusions and Teacher Reflection
Overall, students indicated that they found value in the exercise. More importantly for instructional purposes, the format of this lesson afforded me the opportunity to collect a great deal of information about my students and their understanding. For example, I was not only able to evaluate the whiteboard responses, but also the team scores and comments from the rubrics, each team's self-assessment, and the quality of individual oral argumentation.

The actual board responses do show some range of mastery of the subjects, although few groups, if any, were totally off the mark. In groups where some details were in error, there was still evidence of understanding of broader concepts from the unit. In looking at the boards individually, I was able to see areas where students may still have misconceptions or may not have mastered material. I may need to revisit or review content in some classes. For example, in first hour, Team 6 has a correct point example shown, but their intended frameshift is more of a series of transversions or simply randomized letters. A few other groups among the classes also discussed frameshifts, but their examples did not show shifting. Instead they wrote a random sequence or a series of point mutations. This difference between the "spirit" of some teams' evidence and their actual examples indicated that the results might benefit from a little more preparation on definitions. At least one team had a claim that one mutation type was worse, but all their evidence referred to the other type of mutation. This may have been due to a lack of understanding or simply an oversight error. [During the rubric assessment and discussion, some teams did notice and acknowledge their own mistakes like this.] Team 7 in first hour has appropriate (but unlabeled) examples of each kind of mutation, but could have been more specific with the reasoning. (They also included examples containing multiple point mutations.)
They earned a high mark on the rubric because the focus of the activity was evidence, not rationale. However, they could improve their skills of explaining and identifying their evidence. Given that this board was one with inconsistent scoring, it led to good discussions. In fact, on at least two boards, this one included, I reconsidered my initial assessment after the student discussion.

The range of examples used by the teams was interesting to me. Examples from all classes ranged from word sentences and RNA strands (which had been used in class previously) to individual variations of class examples, but also to some unique examples. One team counted the number of codons in a mutated and nonmutated strand and included that in their evidence. Another indicated that the DNA proofreading mechanism might miss a small (e.g. point) mutation, but would likely catch and fix/discard a significant (e.g. frameshift) mutation, eliminating the problem.

A small number of teams submitted products that indicate they are still novices at providing evidence and constructing arguments. Based on the team composition, I was anticipating this to occur with a few groups. Having the data helps me identify strategies to help these particular teams of students.

Besides the whiteboards, I could also make inferences about my students' understanding of content from the comments written on the team rubric sheets. Further, the comments provided me with evidence of how well students were using the evaluation tool. I do see room for improvement in the sophistication level of their understanding of the argument, but there is strong evidence that students were striving to use the rubric correctly to identify the best evidence.
Examples of student comments on other teams' answers:
“I didn’t fully see how its [sic] worse”
“They said like a definition and in the reasoning part was an example”
“Flip the R & E and be more specific”
“Say why your evidence is relevant”
“They explained their evidence well and they also gave a visual that help [sic] a lot”
“They have explain so its [sic] easy to understand and they compared and contrasted the 2 mutations”
“Give examples instead of just facts”
“They left out some info and the reasoning just defines frameshift mutation”
“Could have used a little more table legs” [refers to class use of a table model of supporting evidence]
“need examples about frameshift as well”
“Their example didn’t add or delete anything”
“needs evidence, not just the definition”
“give examples or compare”
“There [sic] evidence is coming along. Try to explain the example more”
“I’m lost in the evidence”
“The evidence sounds like reasoning”
“really good evidence, confusing reason”
“In the evidence they could have explain how a frameshift insertion could be neutral”
“They could have moved most of their claim to their evidence or reasoning to create better table legs”
“We had a great example for our evidence and we had sturdy support in our evidence.”

Examples of student comments for improvement on their own boards:
“I could have gave [sic] a visual example.”
“to improve our own data by adding a reasoning”
“give examples and compare frameshift mutation to point mutation, so people can understand how frameshift is worse”
“Add examples and no definitions”
“great evidence and the evidence matches the claim”
“We could have given one more example to further bring the point across.
The Claim should have been shortened to what we think, not the explanation”
“Evidence isn’t completely true and reasoning isn’t finished”
“We could have added more than just an example to our evidence”
“The evidence does not support claim”

Using the team rubric sheets, I was also able to compile quantitative data and get a formative view of the class as a whole. Questions like the following were able to be answered:
• How well did individual teams' scores relate to the teacher's score?
• How well did the overall student scores relate to the teacher's scores?
• How consistently did students answer as a class?
• How accurately did teams rate their own products?

Evaluation of each class's data also opened new questions to consider. (See below.)

For boards where peer grading may have produced inconsistent results, I could use the data to identify areas where there may be potential misconceptions or low skill in constructing an argument or using a rubric. For example, a team that assigned a majority of scores outside the average range may need individual assistance from the teacher. A board that was assigned a range of different scores was looked at for clues about possible misconception triggers distracting students.

Table 1: Compiled data of student and teacher rubric scores (data is from 1st hour)

Peer-assigned scores by team:

Whiteboard #    A     B     C     D     E     F     G    Student avg  Teacher  Diff   Stdev   stdev.p
     1         1.5    1     1     1     1     2     2       1.3          1       0.3   0.418   0.382
     2          1     1     2     1    1.5    2     1       1.4          1       0.4   0.492   0.449
     3          2    1.5    2     2     2     2     2       1.9          2      -0.1   0.204   0.186
     4          1     2     1     1     2     1     1       1.3          2      -0.7   0.516   0.471
     5          3     2     2     3     3     2     3       2.5         2.5      0.0   0.548   0.500
     6         2.5    2     1     2     2     1     3       1.8         2.5     -0.8   0.612   0.559
     7          2     1    0.5    2     3    1.5    1       1.7          3      -1.3   0.876   0.799

*Bold indicates this is the team's own board. Diff = student average - teacher score.

Questions for instructor to consider (based on table information):
• Why was #4 scored low by students? What was not picked up on? (Also #6)
• Why was there such variation in #7?
• Why were scores on #3 so consistent?
• Why did Teams C and F seem to have several boards marked much lower/higher than the average?

Modifications:
For my last hour of the day, I had students view the whiteboards directly, since the boards did not have to be erased for use by the next class period. For this class hour, instead of receiving a packet of images, the boards were spaced around the room. Teams were assigned a board to start with and rotated around the room to view each board. For management purposes, I set a timer to keep track of the duration at each station so all groups moved at the same time.

For groups with special education students or those who were otherwise struggling, I suggested that they could use some or all of the strand examples from the homework as evidence for their claim. This scaffolding gave struggling groups a starting point, yet still required that they select appropriate strands for their examples.

Summary
All in all, I would repeat this lesson. In a future class, I might consider assessing the mutations fact-level knowledge (definitions/examples) prior to the lesson to ensure that a gap there does not interfere with the higher-level activity. For a first-time lesson, however, I was pleased with my results. The lesson did offer the engagement and opportunities to construct and defend arguments that I was looking for. I also got a lot of data about my students to use in crafting the next series of learning opportunities.
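The summary columns of Table 1 can be reproduced with Python's `statistics` module. This is a sketch, not part of the original materials, and it rests on one inference of mine: the published averages work out exactly only if each board's student average excludes the authoring team's own self-score (six peer scores per board), which the article does not state explicitly. Board 7's numbers are used here.

```python
# Sketch: reproducing Table 1's summary statistics for whiteboard #7.
# The six scores below omit the authoring team's own rating; that
# exclusion is inferred from the published averages, not stated.
import statistics

peer_scores = [2, 1, 0.5, 2, 3, 1.5]  # board 7 peer ratings
teacher_score = 3

student_avg = statistics.mean(peer_scores)      # 1.7 when rounded
sample_sd = statistics.stdev(peer_scores)       # the table's "Stdev"
population_sd = statistics.pstdev(peer_scores)  # the table's "stdev.p"
diff = student_avg - teacher_score              # -1.3 when rounded

print(round(student_avg, 1), round(sample_sd, 3),
      round(population_sd, 3), round(diff, 1))
```

Looping the same computation over all seven boards regenerates the Student average, Stdev, stdev.p, and difference columns, which makes the instructor questions above (e.g. the high spread on board #7 versus the near-zero spread on #3) easy to spot automatically.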
References (with annotations)
Talk Science Primer (inspiration for my "Science Talk Prompts"): http://inquiryproject.terc.edu/shared/pd/TalkScience_Primer.pdf
The Inquiry Project (source site for the Talk Science Primer): http://inquiryproject.terc.edu/
NGSS Appendix F (describes the Science and Engineering Practices; a PDF is available from the left tab of the link below): http://www.nextgenscience.org/next-generation-science-standards
A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (2012) (digital access; describes the rationale of the NGSS Science and Engineering Practices): http://www.nap.edu/openbook.php?record_id=13165&page=42

References (formal style)
Michaels, Sarah, & O'Conner, Cathy. (2012). Talk Science Primer. TERC. Retrieved from http://inquiryproject.terc.edu/shared/pd/TalkScience_Primer.pdf
The National Academies Press. (2012). A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. http://www.nap.edu/openbook.php?record_id=13165&page=42
NGSS Lead States. Next Generation Science Standards: For States, By States (Appendix F).

Appendix
Student whiteboard responses from first hour, and additional examples

Teacher comments and examples of student discussion points
Team 1 is repetitive through the claim and evidence. They are one of only a few groups to refer to effects on amino acids, but they lack actual evidence or examples.

Team 2's response indicates knowledge of protein's importance and that mutation can affect proteins, but like Group 1, there is no tangible evidence relating to the claim.

Team 3 appears to have good evidence, but their example is not a frameshift.

Team 4 shows the before and after of an insertion. This example is counted as evidence. However, they did not connect the mutation to a negative consequence to explain it.
Additional student answers from first hour
Students show varying mastery of definitions of mutations by their selected examples. Some groups demonstrate understanding of consequences but not corresponding examples.

Team 5 uses strands similar to class examples, and shows both types of mutation. They also mention the idea of amino acids. Including a translation of the strands would have strongly linked the two concepts and made a complete argument. As is, the link must be inferred by the reader.

Team 6 has a point example shown, but due to labeling, it might seem to indicate that it is a frameshift. Their intended frameshift example is more of a series of transversions or randomization. From conversations with the team, they expressed that they needed more time to elaborate and include their reasoning. This was one of the few teams among all hours who indicated in their claim that the answer might vary based on conditions, but they did not elaborate on that point.

Team 7 has appropriate (but unlabeled) examples of each kind of mutation, but could have been more specific with the reasoning. They earned the high score because the focus of this particular rubric was evidence. A case could be made for a deduction for lack of labeling or for having multiple point mutations in a single strand. In terms of their progress with constructing an argument, conversations with this group focused on connecting the reasoning with the example, as in their current state they do not easily connect.

Sample responses from other classes
On the board labeled #4 (top left), many evaluating teams did not identify an example when scoring. However, on closer evaluation, the authors of this board do refer to a specific action ("insert and delete bases") and a location on the gene ("especially earlier on"), which I considered an example. Although it did not look like other examples, it was specific.
The board immediately below it (lower left) also indicates a deletion "in the middle," which demonstrates understanding of the process even though no sample strand is written.

On the board labeled #1 (top right), it looks like there is an example, but what is listed as an example has no relevance to the claim. There is no indication that the strand contains a mutation.

The lower right board on this page is another example of a team with correct logic but an incorrect example of a frameshift. They understand that location matters, but did not correlate it with a correct example. Even though they show a gap in knowledge, the larger idea has come across.

First board on this page (upper left): the claim indicates the answer is not always the same, but no evidence is included to support that. The team shows very novice ability at developing an argument.

Top right board: "hanour vs. hangout" is an example of a point mutation. This team understands that mutations mess up the message, but does not distinguish between the two types discussed in the day's lesson. They do understand that evidence involves examples. It might be noted that all members of this group were special education students.

Lower left: A unique reasoning: a point mutation is a small change and less likely to be caught and fixed during DNA proofreading. The evidence and reasoning cover more than one point and could be more cohesive, but I was impressed with the novel answer from this team.

Lower right: Another possible misconception about the frameshift definition; the example seems like missing bases instead of moving over, and the reasoning does not dismiss this interpretation.

Upper left: This team referred to the HW strands, and included both the RNA and the corresponding amino acid sequence.

Upper right: This team also referred to the HW strands; however, they included a change in the number of bases/codons as evidence.
The consequence could be elaborated on, but the team does understand the concept of a frameshift.

Bottom: Refers to neutral mutations, though does not explain frameshift. They got commendation for thorough examples of one aspect, but suggestions to elaborate on the frameshift portion of the answer.

Strength of Evidence Ranking Activity
Team member names:			Date:		Hour:

Part 1: Use this chart to rank the use of EVIDENCE by groups on yesterday's C-E-R whiteboards. Be fair and keep to the rubric. It is acceptable to give a score that is halfway between two levels. Remember: We want to be convinced because the evidence says so, not because the student says so!

Level 1: Evidence written on board is only words WITHOUT specific examples. Evidence is simply a restatement or elaboration of the claim with no additional support. Information in evidence and reasoning may or may not obviously support claim. Note: Definitions of mutations are NOT examples.

Level 2: Evidence includes a specific example (e.g. an RNA strand, a sentence with and without a mutation) but may not elaborate on why the example is relevant, OR may show a change but not the effect of the change. May indicate a change is bad but not fully elaborate on why. May start a line of reasoning but not finish. Information in evidence and reasoning supports claim.

Level 3: Evidence includes a specific example (e.g. an RNA strand, a sentence with and without a mutation) AND elaborates on why the example is relevant. Shows a change AND the effect of the change. Indicates a change is bad AND why; explains a negative outcome or consequence beyond simply the error itself. Information in evidence and reasoning supports claim.

For each whiteboard (#1 through #7), record an Evidence Score (use the above ranking system) and Comments.

Part 2: Look at your own team's whiteboard response from yesterday. What could you do to improve your response?
American Wilderness Leadership School (AWLS), Jackson, WY: An Urban Educator's Perspective of and Experience in the Wild, Wild West
By Zakiya A. Jackson, Special Education Teacher, Ralph J. Bunche Preparatory Academy, Detroit Public Schools

What is AWLS?
American Wilderness Leadership School is a phenomenal learning experience for educators that takes place in the Bridger-Teton National Forest located in Jackson, Wyoming. Established and sponsored by the Safari Club International Foundation, the American Wilderness Leadership School provides educators with activities and lessons that can be brought back to their local schools to enhance the learning experiences of their students. AWLS is the perfect opportunity for teachers to receive a hands-on learning experience that will enhance their various science, math, social studies, and reading curricula.

AWLS is an eight-day professional development that addresses wildlife ecology, outdoor survival, stream ecology, hunting ethics, conservation education, landscape plant life, and more. Todd Roggenkamp is the Deputy Director of Education for SCI Foundation-AWLS. He and his staff do an outstanding job of presenting and training educators on the best practices in outdoor education. The versatility of the staff to teach and flow throughout the many different activities is amazing!

The AWLS campgrounds are inviting and engaging, and the professional, hospitable, and welcoming staff is second to none. From the friendly service of AWLS Clerk Marj Barter to the delicious cuisine and accommodating kitchen staff (Aaron Widom, DeAnn Roggenkamp, and Bailee Roggenkamp), you will know that the entire AWLS staff is passionate about the advancement of our students through training and equipping educators to bring the ideals of conservation education to the many students that we reach daily in our classrooms.
My Opportunity to Attend As an educator, I am always looking for opportunities to grow as a professional. I actively seek out professional development and training that will enrich lessons for students with special needs. A number of my recent workshop opportunities have focused on ways to expose inner-city students to the outdoors. Since 2012, I have incorporated the outdoors as a classroom. Tree identification, stream clean-up, gardening at school, bird watching, water investigations, and insect life cycles are just a few activities that my students have been exposed to. [Photo: AWLS 2015 Workshop Session #6 Class Photo] As a teacher, I realize that I am always a lifelong learner. When I continue to learn and grow, I will always have new information to teach my students. My growth means my students' growth!! One aspect of my thinking that has changed is this: if I had so much fun learning through the engaging, enriching, and enjoyable activities that we did at AWLS, then my students would enjoy them even more. I have found that outdoor education awakens a spirit of curiosity, exploration, and desire for discovery in students. Learning should challenge you to go beyond your present level. Learning should be engaging, enriching and fun. Learning should be relevant. Learning should be a memorable experience. Learning should leave you wanting to know more. Learning should be impactful. Instruction that inspires the mind's creativity and awakens the gifts, talents, and abilities in a pupil will set the course for their lives. As a teacher, I feel that I have a part to play in that process. I have the honor of helping to shape the lives of the next generation. These are things that I desire to accomplish as an educator. My week at AWLS allowed me to be a kid at heart again. It allowed me to put myself in my students' shoes.
Going through the lessons, playing the outdoor games and teaching each other helped me to think of how my students would benefit from the activities we did. As a special education teacher, I am always asking myself: how do I differentiate instruction and provide the adaptations, accommodations and modified lessons needed to meet the individual needs and learning styles of my students? This can be a very challenging task. However, when I can incorporate movement, manipulatives, singing and other hands-on procedures into a lesson, my students will learn and remain engaged. AWLS gave me a fresh perspective on ways that I can meet those needs for my students. To a city girl, born and raised in Detroit, Jackson, Wyoming was a whole new world. I was pleasantly awed by the grandeur of the mountains. Jackson, Wyoming's trees, plants and wildlife presented nature at its finest, and did I mention the grandeur of the mountains!! Wildlife including moose, mule deer, pronghorn antelope, bison, beavers, short-tailed weasels, sandhill cranes, bald eagles, osprey, and hummingbirds are some of the animals that you are bound to see. The wilderness was a refreshing getaway from the busyness of city life. [Photo: White Water Rafting on the Snake River in Jackson, Wyoming] In addition to learning and gaining all these new skills, being in Jackson, Wyoming at AWLS presented the blessing of just breathing fresh air and rejuvenating the sense of why I do what I do every day. I am excited about taking all that I have learned back to Detroit and exposing my students to the great outdoors. The outdoors presents the wonders of wildlife, trees, insects and so much more. It is my hope that my students will develop an interest in the STEM activities presented to them and ultimately consider pursuing careers in STEM-related fields. Why you should attend: Top Five Reasons 1.
Receive Continuing Education Units or College Credit in Conservation Education At AWLS, you will learn from the best of the best! AWLS presents a dynamic group of instructors and staff, all of whom are experts in their fields. Educators who attend will receive a wealth of knowledge and resources needed to begin, build upon or sustain an outdoor education curriculum. Each day got better and better! 2. Obtain STEM Curriculum and Activities • Outdoor Survival Skills Gary Gearhart and George Oberstadt taught us survival techniques for staying safe in the wilderness. We learned why animals attack and reviewed ways to protect yourself from animal attacks. • Conservation Information Todd Roggenkamp, Dr. Fidel Hernandez, George Oberstadt and Gary Gearhart reviewed national and state rules and regulations for hunting. Hunting responsibilities and ethics were addressed during this teaching. Educators learned about the sportsmen's code and ethical dilemmas in hunting. • Fly Tying Dave Brackets and Dr. Fidel Hernandez were the instructors showing us how to use fly tying to mimic insects. Learning how to make the caddis fly (a dry fly) and the woolly bugger (a wet fly) was a phenomenal hands-on activity. • Stream Ecology John Valley was an awesome instructor who explained to us how to conduct water investigations. Our water investigations were conducted in Granite Creek, located right on the AWLS campgrounds. Mr. Valley presented an excellent lesson on how to determine the speed and flow rate of the waterway. During our hands-on lesson, educators were also able to locate macroinvertebrates in the creek to determine the health of Granite Creek. 3. Firearm Safety, Shooting Sports and Archery 4. Exciting Field Trips • Teton Park Visitor Center Interpretive displays of Jackson, Wyoming wildlife, with pelts for touching to feel the fur of different animals. A wonderful place to explore and connect wildlife to information about them.
• Pinedale Learn about the gas fields located there, the conservation issues of the area, the animals that live in this habitat (including the sage-grouse) and what is being done to sustain their way of living. • National Elk Refuge Maintained and preserved by the U.S. Fish and Wildlife Service, the National Elk Refuge is a beautiful space of land set aside for preservation of the elk population. Learn about elk and why they are of public interest. Learn about their migratory patterns and how the collection of their antlers each year by the local Boy Scouts is a big event. • An Evening in Town (Jackson, Wyoming) Shopping, phenomenal sights, a mock shootout, good food and good fun!! • White Water Rafting on the Snake River White water rafting on the Snake River was one of the most thrilling and exhilarating experiences of the week. It was an outstanding closure to a week filled with so many other wonderful experiences. [Photo: Bunche Academy teachers at AWLS: Zakiya A. Jackson (left) and Diana Koss (right)] 5. Expand Your Professional Learning Community Attendees at AWLS Workshop Session #6 were diverse, including educators as well as pastors and staff from The Salvation Army from all over the country. Building a professional learning community as educators can inspire us to make positive changes in our teaching styles and methods. It is often beneficial to hear about and see what other teachers are doing effectively in their classrooms. I greatly cherished the collaboration of the educators, the AWLS staff and the pastors and staff from the Salvation Army. There was unity and a sense of community among us. The fellowship was inspiring. We all laughed together, broke bread together, helped each other, played games together, encouraged one another, cheered for each other, enjoyed each other's company and shared ideas. By the end of the week, we were all one big family!!
We continue to be in contact with each other to share ideas and keep each other informed about changes in education. My introduction to the Wild, Wild West at AWLS was phenomenal! Life-changing, engaging, enriching and impactful are a few words that come to mind. I left Jackson, Wyoming with a fresh fire burning in me for my love of outdoor education and teaching stewardship of the earth. AWLS delivered on its promise to renew an educator's energy and enthusiasm for teaching! As I think back on what this experience meant to me, it was a defining moment in my life not only as an educator but also as a person. AWLS was more than just a professional development opportunity. Participating in AWLS helped me to step out of many personal comfort zones, dispel limitations that I'd placed on myself and move forward by faith to becoming a better educator. The program helped me to stretch myself to grow as a person. Participating in AWLS was life-changing. My horizons have been expanded and I have reached new heights in understanding about life, personal growth, enhancing the learning experience and the joys of outdoor education. My heart is filled with joy and gratitude for being afforded this awesome opportunity.

AWLS Summer Workshop Dates for 2016 have been scheduled:
June 9 – June 16: Educator Workshop 1
June 19 – June 26: Educator Workshop 2
June 29 – July 5: Student Session, High School, for ages 16-18 (limited to 30)
July 8 – July 15: Educator Workshop 3
July 21 – July 28: Educator Workshop 4
July 31 – August 7: Educator Workshop 5
August 10 – August 17: Educator Workshop 6

I am tremendously blessed to have been sponsored to participate in this most amazing experience. I am humbled and honored to have been chosen to attend. I know that many people have donated time, money and valuable resources to provide educators like myself this opportunity. My greatest repayment will be pouring all that I have received and learned into the lives of my students.
I have so many new skills and so much knowledge to impart unto them. Thank you Safari Club International, SCIF Education Sables, the AWLS staff and every organization who donated towards making all this possible. AWLS was life-changing!! AWLS was impactful!!! I have left this experience with a fresh fire burning in me for my love of outdoor education and teaching stewardship of the earth. This experience has inspired me to seek out new avenues to engage my students in new outdoor activities. It has also presented me with the wonderful opportunity to collaborate with other educators across the country. My professional learning community has expanded tremendously because of AWLS. The instructors at AWLS were knowledgeable and enthusiastic experts in their fields who desired to impart all that they knew to help us be successful in educating our students. I have no doubt that after my AWLS experience I am going to be a more effective, efficient and excellent teacher. Educators!!! If you have not taken advantage of this wonderful opportunity, I highly encourage you to step out of your comfort zone and discover the beauty of Jackson, Wyoming. American Wilderness Leadership School is all it promises to be and so much more!! You will be most proud of the credentials that you will attain. The credentials will be proof that you have enhanced yourself as an educator. Your students will benefit from the new lessons and skills that you will learn at AWLS. AWLS is the greatest professional development opportunity for bringing your classroom outdoors. To access an application please visit: http://safariclubfoundation.org/education/american-wilderness-school Scan and email completed applications and educational background statement to: Karen Crehan, KCrehan@safariclub.org

INFORMATIVE CURRICULUM ON ANTIBIOTIC RESISTANCE INSPIRES HIGH SCHOOL BIOLOGY STUDENTS TO USE ANTIBIOTICS WISELY Elaine M.
Bailey, PharmD, MARR Coalition Advisory Council Antibiotic resistance may be the most important infectious disease threat of our time. Are we losing our battle against resistant bacteria? In 2013, the Centers for Disease Control and Prevention (CDC) issued a report that summarized the health and economic threats related to antibiotic resistance. Among the data quoted, it was estimated that every year more than 2 million people in the U.S. get infections that are resistant to antibiotics, resulting in at least 23,000 deaths. The costs associated with antibiotic resistance are staggering, with annual U.S. expenditures, including loss of worker productivity, estimated at approximately $35 billion. For nearly two decades, the mission of the Michigan Antibiotic Resistance Reduction Coalition (MARR), which has been mainly supported by a competitive grant from the CDC, has been to promote appropriate antibiotic use through educational programs, with an ultimate goal of reducing antibiotic resistance. In 2003, MARR received an award for Excellence in Community Education for a curriculum called "Antibiotics & You" that has been used to educate thousands of elementary students and adults. Recognizing a need to inspire high school students to be ambassadors of appropriate antibiotic use, in 2013 MARR collaborated with their counterparts in Oregon to develop a two-day curriculum for high school students. (The MARR curriculum on bacteria, viruses, antibiotics and bacterial resistance to antibiotics is available for FREE on the MARR website, mi-marr.org.) Excellent support was provided by Susan Codere and Kevin Richard, Science Consultants of the Michigan Department of Education, as well as Cheryl Hach of the Kalamazoo Math and Science Center, to ensure that the curriculum met the Michigan Science content standards and expectations for Biology in four of the five main areas.
This past May, Juliane Cody at Regina High School in Warren, MI presented the curriculum to her 11th grade students and shared the feedback she received from her students directly with one of the content developers, Dr. Stephen Lerner, Professor of Medicine at Wayne State University. The student evaluations uniformly commended the content for being very informative. Since her students were looking for additional interactive content, Ms. Cody found a number of YouTube videos and an exercise from Flinn Scientific that simulated the development of antibiotic resistance. Ms. Cody appreciated all the resources that MARR provided and emphasized the utility of the student worksheets. The scores on the posttests reflected that the students grasped the material. Ms. Cody felt that she needed at least three days to present the curriculum, partially to incorporate the additional content outside of the packaged materials provided by MARR. MARR encourages teachers to visit mi-marr.org to learn more about antibiotic resistance. In addition to the high school curriculum, the "Antibiotics & You" curriculum can also be downloaded for presentation to community groups or elementary students. High school students might enjoy presenting the "Antibiotics & You" program to younger students as a volunteer opportunity, or teachers could consider giving students extra credit if they were to present the program to a lay audience. A great time to do this would be during Get Smart About Antibiotics Week, November 16-20, 2015! This annual observance is a key component of CDC's efforts to increase awareness of the threat of antibiotic resistance and the importance of appropriate antibiotic prescribing and use. Again, more information on Get Smart Week and links to the CDC can be found on the MARR website. Together we can win the battle against antibiotic resistance!! Curriculum Ideas: What does MARR hope to achieve through their curriculum?
MARR's goal is to provide applicable knowledge for students about the seriousness of antibiotic resistance that answers these questions:
• What are microbes, and what are the differences between viruses and bacteria?
• How do bacteria and humans interact? When is bacterial colonization beneficial to the human host?
• Why were antibiotics developed?
• How do antibiotics work against bacteria, and how do bacteria become resistant?
• What are some strategies for overcoming antibiotic resistance in the future?

The MARR curriculum for high school students includes the following materials:
• Instructional video for teachers
• PowerPoint presentations for students
• Pre- and posttests for students
• Worksheet for students
• Resource information for interactive activities
• Report/evaluation

Tips to Save Your Teaching Sanity Laura Kaye Harris, Faculty, Science Laboratory Coordinator, Davenport University

Abstract Teaching can be stressful. Tight deadlines, frustrated students, and curriculum that is not completely functioning can turn a passion for teaching into a desire to quit. Any tip that reduces instructor stress results in more productive and happier students and teachers alike. This article provides techniques to reduce stress from grading, through the use of answer keys and grading rubrics to analyze student work efficiently, along with several other tips to improve grading and classroom management when challenges arise. Having one of those stressful semesters filled with challenges? You are not alone. In April 2015, Irwin Horwitz, a Strategic Management professor of 20 years at Texas A&M in Galveston, attempted to fail his entire class after reaching a "breaking point" (Fearnow, 2015). Dr. Horwitz claimed this class was a unique group of students that displayed a "complete lack of maturity and general incompetence," though it is clear that stress contributed to his reaction to these common teaching challenges.
Besides teaching, instructors at all levels face a growing list of additional job responsibilities. According to Washington Post author Francie Alexander, in 2012 the average teacher worked 53 hours per week, including 95 minutes at home preparing classroom activities, grading, and doing other job-related duties (Alexander, 2012). With the implementation of major curriculum changes, such as Common Core, teachers are spending even more time making things work in the classroom, thus increasing their stress levels. This increases teacher burnout rates; already, 40-50% of new teachers quit teaching within their first five years (Seidel, 2014). Reducing instructor stress and time spent on administrative tasks such as grading is important for happy, productive students and teachers.

Answer Keys For the past couple of years, I taught highly grade-motivated students in a competitive undergraduate nursing program. My courses ran concurrently with nursing courses such as pharmacology and mental health that had a reputation for lowering grade point averages. Students anxiously awaited feedback on all grade-bearing assignments, especially exams, and the faster you could get grades to them, the happier they were with you. The nursing department had a Scantron machine, which several faculty members shared. Being from another department and unable to use it, I was jealous. It seemed like it would save grading time. From my office next door, I could hear nursing faculty trying to figure out how to use, maintain, and fix various errors from the machine. They also dealt with stray marks causing false answers and thinner Scantron sheets causing jams and tears, leading the instructors to fill out new sheets for already existing exams. I, on the other hand, had a paper-printed answer key with no line of faculty members to wait through and no technical difficulties.
My exams were typically 40-50 multiple-choice questions with two to five short answers covering several weeks of material, while most nursing exams were 20-30 multiple-choice questions only covering that week's content. My answer key would also include bullets listing which details earned which points in the short answers. I would start grading exams immediately upon receipt, and as they piled up, I formed an assembly line of sorts. I would stack earlier exams, left turned to the page I last completed, and put them in respective piles based on the degree of completion. Then I would line up to four exams with my answer key on top and grade them, circling only incorrect answers. At the bottom of each page, I would put the total number of points missed on that page. As I reached exams already in progress, I incorporated them into the larger pile until all exams were completed. A quick tally of the points at the bottom of each page gave me the final grade. The entire process took about 30 minutes for a class of 12 students, and students frequently got their grades back after a 15-minute break starting when the last student completed their exam. For my largest classes of 30 students, I needed a couple of hours to return exam grades via electronic media, which I did from a spectator's seat at my kids' dojo during their practices. Within the first month of each new term, students commented on my speed in returning exam grades. I was faster than the nursing faculty's Scantron machine! Students were also able to look at the questions with their individual testing notes and answers in one document. This facilitated better discussion on missed points. Students also did not have to worry about having the proper pencil, erroneous marks, or proper answer alignment on Scantron sheets, reducing their stress level during the tests and increasing their confidence.
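The arithmetic behind the per-page tally reduces to summing the points missed at the bottom of each page and subtracting from the exam's total point value. A minimal sketch of that calculation (the function name and point values here are illustrative, not from the article):

```python
# Tally-based exam scoring: the grader writes the number of points
# missed at the bottom of each page; the final grade is the exam's
# total point value minus the sum of those per-page tallies.

def exam_score(total_points, missed_per_page):
    """Return (points earned, percent score) for one exam."""
    missed = sum(missed_per_page)
    earned = total_points - missed
    return earned, 100.0 * earned / total_points

# Example: a 50-point exam with per-page tallies of 2, 0, 3, and 1.
earned, percent = exam_score(50, [2, 0, 3, 1])
print(earned, percent)  # 44 88.0
```

The appeal of the method is exactly this simplicity: one addition pass per exam, with no machine to jam or mis-read stray marks.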
Even though it would seem the Scantron machine would be a more efficient approach, old-fashioned paper answer keys have led to fewer student frustrations and less overall stress for me.

No Answer Keys? Use Top Student Papers! When I started teaching as full-time science faculty at a university, I taught 16 credit hours in six courses at one time; three of these were laboratory courses I had not taught before. The weekly assignments, taken from the course textbook, were comprised mostly of fill-in-the-blank or matching questions, averaging five pages long. I had no answer keys from the textbook publisher and quickly found myself overwhelmed with weekly lab assignments to grade. At first, I would generate an answer key by attempting to answer the homework questions myself. I would then go back and verify the answers to any questions that did not match most of the students' answers. This process, in addition to actually grading their assignments, took hours, and I was going crazy trying to keep up! Fortunately, the top students separate themselves from the rest of the class within the first few weeks of any class, and if the instructor is observant, those top students can be identified early in the grading period. After a couple of weeks, I identified three students whose work was always high quality and who provided intelligent questions and answers during class. For the rest of the term, I would fish out the assignments from those three students and compare them to each other. If they agreed on a particular question, that was the answer that went on my answer key. If they disagreed, I would examine the question further. They rarely disagreed. This allowed me to save time developing answer keys. There are some challenges to using this approach. Students naturally form study groups, and those study groups can stay intact throughout the academic career of those students.
This is beneficial to the learning process, and collaborative learning increases student retention, learning, and graduation rates (Carnegie Mellon, 2015). However, if your three top students are from the same study group, they may reflect the same content misconceptions, thus throwing some incorrect answers into your answer key. There are two ways to avoid this issue: 1) use top students who do not study together, or 2) review the newly created answer key against the rest of the class and pay particular attention to questions where more answers were wrong than right. The latter solution involves more time on behalf of the instructor, but may be best if class sizes are small.

Selective Grading While I was struggling to keep up with a heavy course load without textbook-provided answer keys, a colleague recommended an alternative solution: selective grading. Rather than using top student work as an answer key, I would grade only a subset of the assignment questions that I felt best reflected the main content points of the assignment. This idea of selective grading invoked an initial feeling of disgust. After all, do not all questions asked fit this requirement? I remembered when I started teaching and received a test bank of multiple-choice questions from a textbook publisher to write my class exams. The recommendation was that assignments and exams be comprised of 60-70% memory recall, 20-30% mid-level analysis, and 10-15% high-level analysis questions. If I focused my attention on the questions that were not memory recall, I was more effective in my use of this technique of selective grading. I also quickly found myself using this technique to ignore textbook questions in assignments that I did not immediately know the answer to and did not have time to figure out. There are some issues with this approach.
The instructor does not evaluate 100% of the work; ergo, it is possible for a student to earn full credit for work containing wrong answers. This would cause inaccurate grades. Furthermore, content misconceptions likely generated those wrong answers. Those uncorrected misconceptions lead to compounded misconceptions in later, more difficult material. Students will also catch on that you are not reviewing their assignments entirely and tend to lose confidence in your teaching relationship. I would not recommend using this technique for grading high point value works, such as exams, but for regular textbook work, this technique is handy in a pinch.

Grading Rubrics Grading rubrics are excellent for evaluating student performance on a piece of work clearly and consistently. While generally used at universities to standardize scores among several graders, grading rubrics exist for a wide variety of assignments including papers, presentations, and individual and group projects. The use of grading rubrics has increased student retention, particularly for high dropout risk students such as minority, first generation, and/or non-native English speaking students (Stevens & Levi, 2005). Since rubrics spell out explicit expectations for the assignment, they carefully describe hidden and unspoken assumptions of academic culture, such as proper punctuality and use of citations, which helps high dropout risk students significantly (Stevens & Levi, 2005). I give my grading rubrics alongside the assignment directions so students have no confusion about criteria, level of quality, and number of points for each work. So what makes a good grading rubric? Having served on several course development committees, I find the consensus is that content related to the questions asked comprises 60-70% of the overall grade, with 10-20% for grammar and references, and 10-20% for assignment-specific format (i.e.,
sentence and paragraph structure for papers, heading or bullet format on slides, or eye contact with the audience for presentations, etc.). Using succinct labels for each evaluation criterion and keeping criteria specific and measurable are grading rubric best practices according to Blackboard Customer Success Advocates Connie Weber and Amber Goularte (Weber & Goularte, 2015). Since all of the rubric's evaluative criteria must be included in the assignment instructions, I find it easiest to develop a grading rubric after writing the assignment instructions. In return, developing a grading rubric is helpful in evaluating the clarity and detail of assignment directions. Detailed assignment directions lead to detailed grading rubrics and fewer grade complaints and audits. The design of a grading rubric is personal. I was frequently annoyed working as an adjunct using prepared rubrics with vague descriptions or several evaluative criteria lumped under one large point allocation with a large comment box. While this style of grading rubric made grading faster, since only a couple of comments were needed to justify loss of significant points toward the overall grade, I always felt that more detail on how many points would be earned for each specific evaluative criterion was important. I would frequently redesign the prepared rubrics to fit my level of detail. According to Weber and Goularte, matching the length of the rubric to your personal tolerance for detail is another best practice (Weber & Goularte, 2015). Carnegie Mellon's Eberly Center has great examples of grading rubrics for paper assignments, projects, oral presentations, and class participation/contributions at https://www.cmu.edu/teaching/designteach/teach/rubrics.html.

Teach to the Test Not all teaching lessons are learned in the classroom. My two grade-school-aged girls and I are into the Korean martial art of tae kwon do.
As we grew, so did the physical requirements to reach the next belt level, and I soon found myself working with a personal trainer outside of the dojo to reach my goals. One of my biggest challenges was pushups, of which I had to complete 30 to pass a newly required fitness test scheduled at the next belt testing in six weeks. Immediately, I went to my personal trainer and explained the new requirement. He instantly rewrote my training plan to focus solely on my upper back and arms. He called it "teaching to the test," a strategy that his old college football coaches employed. It worked: I doubled my pushing power within those six weeks and passed my fitness test! The following semester, I taught my first pathophysiology course. Like most universities with a high adjunct teacher pool, mine provided a prepared course complete with assignments, exams, grading rubrics, study guides, syllabus, and textbook-publisher-generated PowerPoint presentations. Feeling confident in what I had, without spending enough time evaluating everything, I modified the syllabus with my personal information, posted the presentations and rubrics, and focused on reading the textbook, since this was not a topic I was fluent in. It was not until the first exam loomed that I realized that the study guides covered more textbook chapters than the syllabus stated or than we had covered in class. Fortunately, the study guides matched the exams, and I did not feel comfortable changing the exams based on my limited content knowledge. To complicate matters further, I was unable to get the presentations for the missing chapters due to a recent textbook edition change. I walked into the class, explained what happened, and began to cover the missing chapters on the whiteboard. We then threw out the syllabus schedule and ran every class period by going through the study guides. If I had the presentations, we went through them, and if no presentations were available, I lectured at the whiteboard.
Exams and assignments remained the same, and "teaching to the test" allowed the class to get the material and grades students needed with less stress. The biggest issue with this approach is that students focus solely on the material that the exam requires, often overlooking broader concepts or other details of interest they would have otherwise attempted to master. Furthermore, as I learned from student evaluation comments, mastery-oriented students do not appreciate this approach, as it limits students' creative thinking. While this is not the ideal teaching strategy, it is handy in some situations, such as standardized testing. In an ideal teaching environment, instructors would have small class sizes, standardized and functioning curriculum, ample time for grading and other classroom-preparation activities, and a relaxed attitude. Unfortunately, instructors have far from ideal teaching conditions and have to make do with the time and physical resources available. This causes teachers and students much stress. I hope that these techniques will help you avoid a breaking point when stressed by tight deadlines, frustrated students, and other teaching challenges.

References
Alexander, Francie. Survey: Teachers work 53 hours per week on average. Washington Post, The Answer Sheet. March 16, 2012. http://www.washingtonpost.com/blogs/answer-sheet/post/survey-teachers-work-53-hours-per-week-on-average/2012/03/16/gIQAqGxYGS_blog.html
Brown University. Grading Criteria & Rubrics. 2015. http://www.brown.edu/about/administration/sheridan-center/teaching-learning/assessing-student-learning/grading-criteria-rubrics
Carnegie Mellon Eberly Center. Grading and Performance Rubrics. 2015. https://www.cmu.edu/teaching/designteach/teach/rubrics.html
Carnegie Mellon Eberly Center. What are the benefits of group work? 2015. https://www.cmu.edu/teaching/designteach/design/instructionalstrategies/groupprojects/benefits.html
Fearnow, Benjamin. Texas A&M Galveston Professor Hits "Breaking Point," Fails Entire Class. CBS Houston News. April 27, 2015. http://houston.cbslocal.com/2015/04/27/texas-am-galveston-professor-hits-breaking-point-fails-entire-class/
Seidel, Aly. The Teacher Dropout Crisis. NPR Ed. July 18, 2014. http://www.npr.org/blogs/ed/2014/07/18/332343240/the-teacher-dropout-crisis
Stevens, Dannelle D., Levi, Antonia. Leveling the Field: Using Rubrics to Achieve Greater Equity in Teaching and Grading. The Professional & Organizational Development Network in Higher Education, Essays on Teaching Excellence. 2005-06. http://podnetwork.org/content/uploads/V17-N1-Stevens_Levi.pdf
Weber, Connie, Goularte, Amber. Grading, Rubrics, and Retention. Blackboard Inc. 2015. http://huteachinglearningcenter.weebly.com/uploads/1/8/3/0/18308719/_herzing_grading_rubrics_retention.pdf

Looking for a content-strong, online program? Lawrence Technological University can help!

Master of Science Education
• $1,320 per course scholarship for all K-12 educators (DI or non-DI endorsements) covers nearly 42 percent of tuition.
• Most courses offered online and asynchronous, with a science experiment component to be completed using science kits and activities.
• Science content developed by Lawrence Tech in partnership with the Detroit Zoological Institute, Cranbrook Institute of Science, Aquinas College, and the University of Detroit Mercy.
• Courses aligned with the Michigan Department of Education 2015 requirements for Science and the DI (Integrated Science) Endorsement.

Master of Educational Technology
• $1,320 per course scholarship for all participants covers nearly 42 percent of tuition.
• 100 percent online and asynchronous format.
• This practice-oriented program is offered by Lawrence Tech in partnership with Marygrove College. Courses cover up-to-date technologies in instruction, Web-based learning tools, streaming video, electronic communication, and software and hardware options.
• Complete the seven required courses of the Master of Educational Technology degree and be eligible for the NP endorsement on your existing teaching certificate.
• Some curriculum requirements will be tailored individually based on the candidate's goals. Instructional Technology graduate certificates (12 credits) are also available.

Explore over 100 undergraduate, master's, and doctoral programs in Colleges of Architecture and Design, Arts and Sciences, Engineering, and Management.

For more information on these and other science programs, visit: www.ltu.edu/sciences
Waive your application fee at www.ltu.edu/applyfree

Lawrence Technological University | Office of Admissions
21000 West Ten Mile Road, Southfield, MI 48075-1058
800.225.5588 | admissions@ltu.edu | www.ltu.edu

Assessing High School Science Students' Abilities to Use Cross Cutting Literacy Skills and Scientific Argumentative Writing Skills in a Michigan School District

Ellen M.
Karel, Ed.S., Western Michigan University, 2015

The following information is the result of a study conducted in 2013-2014 that sought to determine to what extent a centrally focused initiative, one concentrated on teaching students not just to write but to think, read, and speak about real-world problems in a persuasive manner based on multiple sets of data related to science concepts, increased scientific argumentative writing proficiency among high school students. A secondary aim of the study was to explore the correlation between the processes implemented in the initiative and high school students' scientific argumentative writing proficiencies. The study was conducted in a Michigan high school (population 1,088) with a select group of 9th grade chemistry-physics students (N = 98). The students experienced evidence-based cross cutting literacy strategies and scientific argumentative writing strategies over the course of one academic year. The quasi-experimental, empirical study was designed to determine whether there was any significant difference in students' argumentative writing proficiency based on an analysis of pre- and post-assessment scores. The descriptive measures used in the study measured the correlations between the results and the initiative. Findings in this study suggest that the strategies implemented caused student scientific argumentative writing to increase significantly at a 95% confidence level. The outcome of this study shows promise that evidence-based skills can be transferred to more advanced science classes and increase student proficiency on state science assessments.

An example of anecdotal evidence occurred at the end of the 2014-2015 school year, one year beyond the analytical study. Two teachers were standing in the hallway reflecting on the year. One teacher, a chemistry and 9th grade science teacher, said to the other, "Wow!
I can't believe how much my students' thinking and writing have improved since the beginning of the year! You were right, all of this does work!" The evidence-based skills that were used during the research study continued to be implemented in all of the 9th grade science courses. Over the school year the two teachers would reflect during PLCs. Routinely, open and honest communication occurred about what to do, how to do it, and how the students were or were not improving relative to the practices implemented in the study the previous year. At times the teachers would express frustration as some students would regress or other students simply would not try. By the end of the school year, however, students' thinking was transformed as much as, or more than, that of students the previous year. To hear a somewhat skeptical teacher express with enthusiasm how transformational the work is, just as one is walking out for summer break, is refreshing and motivates continuing the hard work.

Why Conduct the Study? Problem Statement

As mentioned, this study explored to what extent and in what ways a centrally focused program, concentrated on teaching students not just to write but to read, write, and speak about real-world problems in a persuasive manner based on multiple sets of data related to science concepts, increased student scientific argumentative writing proficiencies. The Michigan high school implementing the program was experiencing a recurring problem: the number of students scoring at the proficient level in science dropped significantly once MME cut scores were changed to reflect college readiness standards. The long-term desired outcome of this program is an overall increase in student proficiencies on the MME and future science performance assessments (such as the SAT); however, results on these assessments cannot be measured immediately.
Consequently, to address the problem, pre- and post-assessments were used to measure a student's ability to analyze data and answer an essential question through scientific argumentative writing.

What Are Contributing Factors to the Problem?

The factors addressed here are specific to the Michigan high school, and to 9th grade in particular; however, the researcher notes the problem is a K-12 systems problem. Many developmental issues may contribute to the problem in 9th grade science, and these factors should be considered for further study, especially in the areas of curriculum, instruction, and assessment. That said, over the course of several journal articles the factors that will be addressed are:

(1) the need for change due to legislated policy in the form of the new cross-cutting Common Core standards and state assessments (e.g., SAT + Writing, the proposed NGSS, and other proposed state assessments such as M-STEP);
(2) the sheer difference between the philosophy that literacy skills, inclusive of reading, writing, speaking, listening, and viewing, are to be embedded into all science courses, and actual practice in the science classroom;
(3) weak instructional plans and practice concerning how to teach students literacy skills;
(4) limited use of formative data to change instructional practices for literacy and data analysis skills in the classroom;
(5) the lack of student reflection and metacognition about scientific concepts and about how students demonstrate their understanding of real-world problems through argumentative writing; and
(6) the lack of professional development on how to integrate literacy skills, specifically argumentative writing, in the context of curriculum, and how to plan for the shift in the curriculum.

These six factors are impediments in the learning environment, and when they occur simultaneously they exacerbate the problem in this study, making its resolution much more difficult.
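The study reports a significant pre/post gain at the 95% confidence level on a 16-point rubric, but does not publish its raw scores. As an illustration only, a paired t-test is one conventional way such a pre/post difference might be evaluated; the scores below are hypothetical, not the study's data:

```python
import math

def paired_t_statistic(pre, post):
    """t statistic for a paired (pre/post) comparison of rubric scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical 16-point rubric scores for 10 students (NOT the study's data)
pre  = [2, 3, 1, 4, 2, 5, 3, 2, 4, 3]
post = [10, 12, 8, 14, 9, 15, 11, 10, 13, 12]

t = paired_t_statistic(pre, post)
# With n - 1 = 9 degrees of freedom, the two-tailed 95% critical value is
# about 2.262; |t| above that indicates a significant pre/post difference.
print(round(t, 2), abs(t) > 2.262)
```

With the study's N = 98 the degrees of freedom and critical value would differ (roughly 1.98 two-tailed), but the logic of the comparison is the same.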
Common Core, NGSS, and State Assessment Standards

The first factor that contributes to the problem in this study is the need for change due to legislated policy in the form of new Common Core standards, the proposed NGSS, new state assessments (i.e., SAT + Writing and SAT), and M-STEP. The Common Core standards and the cross cutting skills proposed in the NGSS emphasize the need for deep thinking, analysis of data, analysis of argumentative writing, and the ability to write a persuasive/argumentative piece. Students are required to explore problems that require scientific understanding, as well as understanding of how engineers would respond to the same problem to obtain scientific results. Also, students are required to develop and use models or simulations in science to predict and test outcomes, all while they learn how engineers use models and simulations to test solutions based on strengths and limitations. Students must analyze and interpret scientific data to derive meaning in a given problem. Moreover, students must demonstrate how to use statistical models to identify significant features and patterns. Finally, students must respond to questions or problems as a scientist would, by looking for sources of error, and must respond to the same question or problem as an engineer would, to determine the strengths and constraints of solutions (Bybee, 2011, p. 18).

The new learning expectations set forth by the Common Core and NGSS standards are not just content related, but also focused on literacy skills. Specifically, they are based on the premise that "…science cannot advance if scientists are unable to communicate their findings clearly and persuasively or learn about the findings of others" (Bybee, 2011). Scientific reasoning and argument are essential for clarifying and communicating strengths and weaknesses, and require analytical literacy strategies.
Students, therefore, must (a) be able to formulate explanations based on evidence, (b) examine their understanding in light of the evidence and comments by others, and (c) collaborate with peers while searching for the best explanation for the outcome of an investigation. Moreover, students must use careful analysis to draw conclusions and determine the best solution to a problem (which may lead to revision of the original design and a better solution) (Bybee, 2011, p. 20). The discourse that occurs as part of this type of reasoning helps students derive the best solution to a problem by using thought processes similar to those of an engineer.

In a practical sense, the science portion of the ACT or SAT is as much a reading test as it is a science test, as it measures a student's ability to read and comprehend select scientific passages and data sets in tables or graphic forms. This implies that students need highly developed literacy skills in order to pass the test. Moreover, the SAT science test is designed to distract students with data that is not relevant to the question; thus students must be able to decipher quickly what is and is not significant across multiple data sets, all while using graph-reading skills along with solid comprehension skills built on a foundation of good scientific vocabulary (QuotEd, 2014).

Teachers can also consider future performance assessments based on student exemplar models. An exemplar model includes a writing prompt, a scientific text to build background understanding, a rubric, and an explanation of the type of writing to be submitted by the student. The writing is defined as having a claim with a citation from the text, evidence with a citation from the data, reasoning with a citation from the text, and personal connections (Smarter Balanced Assessment Consortium, 2013).
These exemplars help teachers frame their own writing prompts to include close reading and argumentative writing skills based on analytical literacy.

Instrumentation and Results

Based on the known features of the ACT + Writing (similar to the SAT), the researcher created two assessments similar in nature to questions that might appear on a state or national science assessment. Because the research group consisted of 9th grade students, the complexity of the writing prompts was developed with students' abilities in mind. The assessment tools included two writing prompts of different complexity: the pre-assessment was less complex than the post-assessment, based on the amount of evidence provided to students for analysis.

Pre-Assessment

The pre-assessment was given to students in 9th grade science in the first full week of school, September 2013. The content related to this writing prompt was taught in both 7th and 8th grade; therefore, no review of the content was given prior to the pre-assessment. The prompt was, "Write a scientific explanation that states whether any of the liquids are the same substance. The writing should be at least a paragraph in length." Students were given only one set of multi-column data as evidence to answer the question. The writing prompt and student examples are in Appendix I. Students were given as much class time as they needed to answer the question and were encouraged to put forth their best effort. Students were also given the following rationale for demonstrating their best effort: "If you can prove that you can analyze data and write in an argumentative way, we will modify the instruction of the course so that we can focus on other skills. If you prove that you are not proficient with these skills, we will help you develop the skills throughout the course." Most student answers to the prompt were only one to two sentences in length.
The average number of words was approximately 21, and the writing was completed, on average, in less than 10 minutes. In general, the answers rarely explained whether any of the liquids were the same substance. Had students explained whether any of the liquids were the same substance, they would have demonstrated reasoning; this was lacking in most writing samples. Most students listed two substances. Some used one number as evidence from the data table. Rarely were the answers correct.

Post-Assessment

The post-assessment was given to 9th grade science students during the last full week of school in May 2014. Most of the content related to this assessment's writing prompt, "What affects the speed of a wave?", was taught in the previous semester; the content related to waves was taught three weeks prior to the post-assessment. On the post-assessment, students were given a much more complex set of evidence, which included three sets of multi-column data to analyze. The writing prompt and data, with student writing samples, can be found in Appendix I. Students were given sufficient time to complete the writing assignment: 45 minutes were given in class, and if needed students could finish the writing outside of school. The assignment was given on Friday and was due on Monday if additional time was necessary. Students were encouraged to put forth their best effort and given the following rationale: "This writing is evidence to prove that you can analyze data and write in an argumentative way, as we have been working on these skills for a full year." Most student answers to the writing prompt were at least two paragraphs long. At least 35 of the 98 students wrote multiple pages of argumentative writing to address the complex problem presented to them. The average number of words was approximately 1,200.
In general, most students made sure they had a claim, used the evidence provided, and used reasons associated with both the current learning and learning from the semester. Most of the writing demonstrated that students could answer the question correctly. This question was chosen because it was complex, data rich, required much reasoning, and was not something directly taught to students. In the appendix are both the pre- and post-assessments with random samples of student work, along with an updated rubric used to evaluate student work. The rubric has been modified since the study; the modified rubric is included because students found it more understandable as feedback was provided many times throughout the year. The original rubric included only claim, evidence, reasons, and conventions. Because students' writing has improved greatly, the high school teachers involved in the study continue to stretch the 9th grade science students' analytical writing ability by incorporating additional writing skills, as seen in the attached rubric.

Final Conclusion

Considering the findings for the two-fold problem faced by science educators at the Michigan high school in this study, in relation to analytical thinking, reading, speaking, and writing skills, together with the initiative implemented and the significant difference found between the pre- and post- argumentative writing assessments, one can conclude that the initiative as implemented has made a significant difference in student proficiency. The initiative design focused on the six research-based contributing factors of the problem. The first contributing factor identified the need to change due to legislative policy in the form of the new Common Core cross cutting standards, state assessments, the proposed NGSS, and proposed new state assessments for science.
Understanding that there is a problem and a need to change helped to initiate a solution to the problem. The second focus of the initiative was to change the current analytical literacy philosophy into evidence-based best practice. Most of this was done through conversation during PLCs and through ongoing discussion about the best ways to help students learn. The third factor addressed during the initiative was determining how to implement these strategies in the science classroom. This took time, feedback, conversation, review, and research in an ongoing, systematic manner. There were times when students did not seem to be making improvements, and yet the teachers were persistent, developing and implementing instructional plans based on evidence-based literacy strategies to help all students close their learning gaps. Once the instructional practices were implemented, the use of formative data caused the teachers to realize that additional time and skill development were necessary to have a direct impact on argumentative thinking and writing. Use of formative data was minimal prior to the initiative; thus it was the fourth contributing factor. There was a risk involved in taking more time to develop the skill because it left less time to focus on content; however, the evidence thus far suggests the impact has been only positive. Meanwhile, to address the fifth contributing factor, students were provided opportunities to reflect on their own understanding and learning related to scientific concepts, and on how they demonstrate their understanding of real-world problems through argumentative writing. This metacognitive practice allowed for student ownership of the cross cutting analytical skills and deepened students' desire to continue improving their scientific thinking and writing.
Lastly, teachers were given opportunities to use professional learning time and outside professional development to enhance their own understanding and their own pedagogical skills in analytical thinking and writing. This professional learning addressed the sixth contributing factor. Subsequent articles will address how these contributing factors were addressed; work samples, student samples, and process steps will be used to help illustrate the importance of "how to" make changes in the science classroom that effectively improve student writing proficiencies.

Appendix I

Pre-Assessment: September 2013

Examine the following data table:

            Density      Color     Mass   Melting Point
Liquid 1    0.93 g/cm³   no color  38 g   -98 °C
Liquid 2    0.79 g/cm³   no color  38 g   26 °C
Liquid 3    13.6 g/cm³   silver    21 g   -39 °C
Liquid 4    0.93 g/cm³   no color  16 g   -98 °C

Write a scientific explanation that states whether any of the liquids are the same substance. The writing should be at least a paragraph in length.

[Random examples of student work appear here in the original; the samples shown were scored between 1 and 5 out of 16.]

Post-Assessment: May 2014

What affects the speed of a wave?

This writing is your final writing for the year. Be sure to make a clear claim, use the evidence provided below, and then be sure you include reasons that make connections to states of matter, speed-motion, waves, and energy. When you write your reasoning, be sure to state your vocabulary word, define it, and then make the connection. (Say it, define it, and use it.) This writing should be at least 2 paragraphs long.

Speed of a Wave Lab - Sample Data for Tension, Frequency, Wavelength
Trial   Tension (N)   Frequency (Hz)   Wavelength (m)   Speed (m/s)
1       2.0           4.05             4.00             16.2
2       2.0           8.03             2.00             16.1
3       2.0           12.30            1.33             16.4
4       2.0           16.2             1.00             16.2
5       2.0           20.2             0.800            16.2
6       5.0           12.8             2.00             25.6
7       5.0           19.3             1.33             25.7
8       5.0           25.5             1.00             25.5

Speed of a Wave Lab - Sample Data for Different Mediums and Coil Thickness

Medium               Wavelength (m)   Frequency (Hz)   Speed (m/s)
Zinc 1 inch coil     1.74             2.01             3.49
Zinc 1 inch coil     0.90             3.9              3.51
Copper 1 inch coil   1.19             2.11             2.51
Copper 1 inch coil   0.60             4.18             2.50
Zinc 3 inch coil     1.82             2.2              4.004
Zinc 3 inch coil     0.96             4.17             4.003

[A third data set, "Speed of Sound in Various Substances" from the CRC Handbook, appears here in the original.]

Example 4/16: At-Risk Student, New to Michigan High School

A wave moves along a medium from one end to another, if you watch a lake wave move along a medium (the lake water), you can see the crest of the wave moves from one location to another. A crest is seen to cover distance, the speed of a wave or object goes back to how fast a wave or object is going and is expressed as distance traveled per timed travel. Speed is distance traveled by a given point on a wave. So we come back to what affects the speed of a wave? Frequency or wavelength of a wave could affect its speed? The speed of a wave is unaffected by the changes in the frequency. The wave speed depends upon the medium through which the wave is moving. Only an alteration in the properties of the medium will cause change in the speed. Site source: http://www.physicsclassroom.com/class/waves/Lesson-2/The-Speed-of-a-Wave

Example 8/16

The medium is the reason for the effect on wave speed. The medium could be anything (for the electromagnetic spectrum). If the electromagnetic waves are traveling through empty space, then it can go at top speeds of 300,000,000m/s. But if going through something like water, it would take longer because then the waves (energy) would now have to pass through each molecule like a mechanical wave. When waves are traveling through a medium, different mediums can have different tensions and densities, like Zinc coils vs. Copper, the Zinc less dense, so the energy has a harder time going from one molecule to another due to the spacing of the molecules while the copper molecules are right next to each other, so the energy can flow through the copper easier, but it would take longer due to the fact that the energy would have to pass through more molecules than the energy would have to for the Zinc. (https://www.schoology.com/assignment/115369845/info) While the mediums density has an effect, but on the other hand tension can play a big role in a way that when the tension is more loose, the speed is down, and when the tension is high, the speed is faster; but why?
When the tension is looser the energy has to flow through a longer distance, when the displacement could be under a half of what the energy did not need to flow. When the tension is high, so is the Amplitude (the height of the crest or trough from the origin).

Example 8/16: Special Education Student

The medium in which a wave travels through changes the speed of the wave. When a wave travels through different mediums weather it's a gas, liquid or a solid. For example the speed of sound goes through different thing and it takes longer to go through some object or substance compared to others. The speed of sound in Carbon Dioxide is 259m/s and the speed of sound in Hydrogen is 1284m/s. The speed of a wave even travels faster in different types of water (sea water and regular water). The speed of sound in water is 1493m/s and seawater is 1535m/s. The speed of sound is 5969m/s in Iron and 3240m/s in Gold. Speed can travel through any thing and its speed it's different for each thing, there for the medium is the resin the speed changes.

Example 15.5/16: ELL Student, 6 months in the USA

One of the things that affects the speed of a wave is the density of a wave. The density of an object affects the speed of a wave. How close the particles are like a how close the particle are in a solid object compared to a liquid. In a solid object the wave or energy moves faster and according to the graph the highest speed was about 120,000 and in a liquid it moves slower but its not the slowest through because its fastest speed was about 1,904. On the other hand in a gas the speed of the wave is the slowest and its fastest speed is about 1,284. The farther apart the particles the less speed there is and the less spread out the faster the speed of the wave. In the chart in the solid part of it the speed was faster than the liquid or gas. For example Diamond and Glycerol Diamond has a sound speed, according to the graph, is 12,000, and the Glycerol has a sound speed of 1,904.
Or another example would be s liquid and a gas like Water and Helium, the water has the sound speed of about 1493m/s and the Helium has the sound speed of about 965m/s. The particles in a solid are closer together making it the speed easier to move. This shows that density is an important factor for the speed of a wave. The tension of a wave affects the speed of the wave also. In the chart 1 it shows that when the tension goes up the speed of the wave’s speed goes up as well. For example when the tension was at 2N then the speed was at 16.2m/s and to all for tiles the speed stays somewhat constant the speed increase a little but they are no significant changes, but when the tension increased at 5.0N the speed of the wave also increased to about 25m/s and the same happen it stayed constant to all the rest of the tiles, the speed did change a little but they where no significant changes as well. Therefore you could conclude that when the tension is higher the speed is higher as well. This also shows you that tension those affect a waves speed. Another important factor that changes the waves speed is the medium of the wave. The medium of the wave is the most important factor of them all because it’s the one that changes the waves speed the most. For example in the second graph they are four factors of a wave Medium, Frequency, Wavelength and the Wave’s Speed. 
And the Medium of the first two tiles are about Zinc one-inch coil and on both of the tiles the waves speed is about 3.49m/s-3.51m/s therefore they stayed constant when the medium stayed constant, however when the medium decreases to Copper one-inch coil the Speed of the Wave decreases as well, the speed decreases to about 2.51m/s-2.50m/s, but when you use wavelength at the first tile the Wavelength is only about 1.74m and the speed is 3.49, but when the wavelength decreases significantly to about 0.90m the speed only changes to 3.51m/s which it is actually an increase therefore according to this graph the wavelength has absolutely no effect on the waves speed. However the Frequency in the third tile is about 2.11Hz and the waves speed is about 2.51m/s, but in the forth tile the tile increases significantly to about 4.18Hz and the speed only increases to about 2.50m/s, therefore frequency doesn't change the waves speed either. This proofs that the Medium of the wave significantly affects the wave's speed. This is why the Medium, Tension, and Density are most important factors that affect the wave's speed. Example 16/16 The previous data sets connect back to the states of matter, speed/motion, waves, and energy. They first relate back to the states of matter because the speed through each matter will be close to constant. For example, the speed of sound is about 340.29 m/s. Sound obviously travels through air, but also travels through solids and liquids. The speed of light differs because this travels through air most easily. The speed of light is much faster than sound, and is 300,000,000 m/s. The data sets also connect back to energy because the Law of Conservations state that energy cannot be created or destroyed only transferred. This supports the statement that only the medium and tension affect the wave speed. A person might say that the amount of energy imputed into a wave will determine the speed of the wave.
This is not true because the person is never creating more energy, only transferring. The Law of Conservations also comes true on the subject that if the medium changes, then the speed is able to change. As looked at in the data, you could have slight differences in the wave speed between two sets of data in the same medium. But this is not a drastic change, and this is because energy cannot be created. The data also relates back to speed/motion because as states in the opening paragraph, speed refers to wavelength x frequency. If you are calculating two sets of speed in the same medium, the wavelength and the frequency will only change at the slightest because they are the same medium. This is also because frequency and wavelength are inverse properties, which means that if frequency is high then the wavelength is low, and vise versa. Lastly, the data refers back to waves. This is quite obvious because energy is transferred through molecules in a wave. A wave can either be transverse or longitudinal. For the case of sound waves, the wave will be longitudinal, meaning the particles are displaced parallel to where the energy is first imputed. As one individual particle is disturbed, it transfers to the next particle, and the disturbance continues. This rate at which the particles are disturbed relates back to speed, and also proves why in a constant medium, the speed of that wave will be the same. Through looking at data and referring the data back to states of matter, speed/motion, waves, and energy, one is able to determine that the only thing that affects wave speed is the medium in which the wave is traveling through, and the tension of that wave. Example 16/16 The only thing that varies speed is the medium in which the waves are traveling through, and the medium's tension.
Prior to the assignment, the concept was brought to us in the examples that ask “if two Slinkys are attached to a wall and each person moves the slinky at a different amplitude, which of the two pulses would take a shorter amount of time to reach the wall?” The answer was always neither. Neither would get there faster because in this problem the tension did not vary and the medium that they are traveling through is a plastic slinky for both, so they are constant. Speed = wavelength x frequency. Wavelength being one complete wave cycle, and frequency being how often that wave occurs. The previous statements can be proved through looking at examples of data, and connecting the data back to states of matter, speed/motion, waves, and energy. In the first set of data, the first five trials were constant, and the tension was set to 2.0 n. Out of the first five trials, the speed varied at the slightest amount ((16.2, 16.1, 16.4, 16.2, 16.2 (m/s)). The sixth trail’s tension was then changed to 5.0n. The speed of this wave was 25.6 m/s. The 7th and 8th trail’s were both set at a tension of 5.0 n, and both of their speeds were quite similar to the first, being 25.7 and 25.5 m/s. The next set of data shows “different mediums and coil thickness”. This isn’t so much tension that varies, as it is a different medium in which the waves are traveling through. A medium is simply the substance or material that carries the wave. For example, the first medium that is shown is Zinc. Zinc’s mediums in this case are a 1-inch coil, another 1-inch coil, a 3-inch coil, and another 3-inch coil. For the 1-inch coil, the first set’s speed is 3.49 m/s. The next set’s is 3.51 m/s. This is a very small difference. This is also shown in the 3-inch coil, the first set is 4.004 m/s, and the second is 4.003 m/s. And finally, this is shown through both of the medium Copper 1-inch coil. The first set is 2.51 m/s, and the second is 2.50 m/s. Again, this is the slightest difference. 
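The tension trials quoted in the sample above track the standard string-wave relation v = √(T/μ) (T the tension, μ the linear mass density), which is background physics rather than part of the student assignment. A minimal sketch checking whether the quoted speeds scale as √T:

```python
import math

# Tension trials quoted in the student sample: tension (N) -> measured speeds (m/s)
trials = {
    2.0: [16.2, 16.1, 16.4, 16.2, 16.2],
    5.0: [25.6, 25.7, 25.5],
}

def mean(xs):
    return sum(xs) / len(xs)

v_low = mean(trials[2.0])
v_high = mean(trials[5.0])

# If v = sqrt(T / mu) with mu fixed, the speed ratio should equal sqrt(5.0 / 2.0).
predicted = math.sqrt(5.0 / 2.0)
observed = v_high / v_low
print(f"predicted ratio {predicted:.3f}, observed ratio {observed:.3f}")
```

The two ratios come out within a few percent of each other, consistent with the students' conclusion that raising the tension raises the wave speed.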
Although there is a slight difference between each set, the difference shown here is the medium in which the wave is traveling through. The last set of data shows the "speed of sound in various substances". To compare what this is getting at, imagine hitting a plate with a fork, and then imagine hitting a piece of Jell-O with a fork. The speed of sound in an object depends on the medium in which the waves are traveling through. For example, in the data set it is shown that the speed of sound in air that's 20*C is 344 m/s. It is then shown that the speed of sound in air at 0*C is 331 m/s. These are very close numbers concluding that the medium changes the wave speed, and other factors have very slight impacts. Now, comparing the speed of sound in air to the speed of sound in a solid. The speed of sound in a diamond is 12,000 m/s. This is a giant change compared to the speed of sound in that of a gas. The difference between a gas and a solid is the density. It is more difficult for a wave to travel through a gas and this is due to the closeness of the particles. For example, in a solid, the particles are very crystalized, and compact. Because of the closeness of the particles, molecules can be transferred through a solid more easily because the energy does not need to travel very far to be transferred. This differs from a gas's case because the greater the density of the particles of a medium, "the less responsive they will be to the interactions between neighboring particles and the slower that the wave will be" ("The Speed of Sound." The Speed of Sound, n.d. Web. 01 June 2014).

REFERENCES
ACT. (2015). Science test description. Retrieved from http://www.actstudent.org/testprep/descriptions/scidescript.html
Ary, D., Jacobs, L., & Razavieh, A. (1990). Introduction to research in education (4th ed.). Fort Worth, TX: Holt, Rinehart and Winston, Inc.
Boston Public Schools. (2010). Common writing assignment: Science rubric.
Boston, MA: Author.
Bybee, R. W. (2011). Writing in science: Scientific and engineering practices in K-12 classrooms: Understanding a framework for K-12 science education. NSTA's Journal. Retrieved from http://nstahosted.org/pdfs/ngss/resources/201112_Framework-Bybee.pdf
Common Core State Standards. (n.d.). English language arts & literacy in history/social studies, science, and technical subjects. Appendix B: Text exemplars and sample performance tasks. Retrieved from https://docs.google.com/a/bcpsk12.net/viewer?url=http://www.corestandards.org/assets/Appendix_B.pdf&chrome=true
Dimitrov, D., & Rumrill, P. (2003). Pretest-posttest designs and measurement of change. Retrieved from http://www.phys.lsu.edu/faculty/browne/MNS_Seminar/JournalArticles/Pretestposttest_design.pdf
ELA common core state standards resource packet. (2014). Retrieved from http://commoncore2012.homestead.com/Grade_Level_Files/First/Reading/ELA_Page/Michigan_Units/Unit_7_Writing_Like_a_Scientist_Resources.pdf
Elliott, P. (2013). AP: The big story. Retrieved from http://bigstory.ap.org/article/act-only-quartergrads-ready-all-subjects
McNeill, K. L., & Krajcik, J. (2012). Base rubric for claim, evidence, reasoning, rebuttal (CERR). SEPA Science Education Partnership Award Project Neuron.
Nicolette, J. (2013). Help! How do I teach the next generation science standards? Grand Rapids, MI: Van Andel Education Institute.
QuotEd. (2014). 3 types of ACT science strugglers. Retrieved from http://www.quotedapps.com/2014/03/16/3types-of-act-science-strugglers/
Regents of the University of California. (n.d.). Relevant-supporting evidence description. Retrieved from http://sciencearguments.weebly.com/uploads/2/6/4/3/26438648/rse_description_v1.pdf
Smarter Balanced Assessment Consortium. (2013). Grade 11 performance task.
Retrieved from http://www.smarterbalanced.org/wordpress/wp-content/uploads/2012/09/performance-tasks/nuclear.pdf

Claims Evidence Reasoning Rubric

Claim: A statement or conclusion that answers the original question or problem. A cause and effect are included (ordered pairs).
4, Excellent (A): Clear, scientifically accurate claim that is placed correctly in the context of the answer. There is complexity to the sentence structure and thought. The variables are easily identified (cause and effect are clear). The answer can be linked to all of the evidence, reason, and counter argument.
3, Good (B): Scientifically accurate claim. The sentence structure is simple and the answer clear. The variables are identifiable (cause and effect are clear). The answer can be linked to most of the evidence, reason, and/or counter argument.
2, Satisfactory (C): Claim reveals partial understanding and includes both accurate and inaccurate details, or omits important details. The sentence structure is simple or somewhat clear. The variables (cause or effect) are not clear (cause and effect are missing in most cases). The answer can be linked to some of the evidence, reason, or counter argument, but not all.
1, Needs Improvement (D): The claim is inaccurate or off topic. The claim does not answer the question in a scientific way because no variables are considered (no cause or effect, or inaccurate use of variables).
0, No evidence: No claim; there is no clear answer to the question to begin the scientific argument.

Evidence: Scientific data or information that supports the claim. The scientific data/information needs to be relevant (appropriate), accurate, and sufficient to support the claim (cause and effect are logically related).
4: Choice of evidence is relevant (appropriate), accurate, and sufficient to support the claim and to connect to the reasons. Quantitative evidence includes extremes, patterns, and other data pairs that are reliable and relevant. Qualitative evidence is descriptive and clearly enhances the argument. All evidence (cause and effect) is clearly mentioned as ordered pairs.
3: Choice of evidence is mostly relevant (appropriate), mostly accurate, and mostly sufficient to support the claim and/or to connect to the reasons. Most of the quantitative evidence includes extremes, patterns, and other data pairs that are reliable and relevant. Qualitative evidence is descriptive and mostly enhances the argument. Most evidence (cause and effect) is clearly mentioned as ordered pairs.
2: Choice of evidence is somewhat relevant (appropriate), somewhat inaccurate, and/or somewhat sufficient to support the claim and/or to connect to the reasons. Or: there is enough evidence provided, but it contains both accurate and inaccurate statements and has limited connection to the claim/reasons. Or: some evidence is used inaccurately because there is no cause and effect relationship (cause and effect are missing in most cases). Some of the quantitative evidence includes extremes, patterns, and other data pairs that are reliable and relevant. Only qualitative evidence is used to enhance the argument, with some connections to the claim/reasons.
1: Choice of evidence is not relevant (appropriate), not accurate, and not sufficient to support the claim and/or to connect to the reasons. Or: evidence is used inaccurately because there is no cause and effect relationship (no ordered pair of evidence). Only qualitative evidence is used to enhance the argument, with no connections to the claim/reasons.
0: No qualitative or quantitative evidence is provided.

Reasoning: A justification that links the claim and evidence through the logical application of cause and effect. It shows why the data and information count as evidence by using relevant, accurate, and sufficient scientific principles.
4: Explicit, accurate, relevant, and sufficient reasoning is provided that connects all evidence to the claim. All of the appropriate scientific principles/terms/vocabulary are described or defined and are used to justify why the data/information counts as evidence. The response describes an application of the scientific principles beyond the context of the prompt (additional research and connections were made to outside sources/real-world context).
3: A clear connection is maintained throughout the reasons to show how or why most of the evidence supports the claim. Most of the appropriate scientific principles/terms/vocabulary are described or defined and then are used to justify why most of the data/information counts as evidence. And/or relevant and accurate prior knowledge/life experience(s) are used to support how/why the evidence supports the claim.
2: Some connections are made in the reasons to show how or why some of the evidence supports the claim. The reasoning does not link all relevant evidence to the claim. An appropriate scientific principle/term/vocabulary is described or defined and then is used to justify why some of the data/information counts as evidence. Or there is both accurate and inaccurate logic used.
1: The appropriate scientific principles are not fully described or accurately used to justify why the data/information counts as evidence. The reasons are inaccurate due to errors or lack of connections; basically, there is no logical cause/effect relationship described. The reasoning does not support the claim, and/or the reasons may be off topic, and/or the reasons may be lacking connections to the evidence, and/or the reasoning is flawed with many errors because the reasons are not relevant, accurate, or sufficient, and/or no attempt was made to include any scientific principle/term/vocabulary to support the reasons.
0: No reasoning is provided.

Counter Argument: Recognizes and describes alternative explanations (using the same evidence or by providing counter evidence) and the reasoning for why the alternative explanation is not appropriate.
4: An explicit counter argument(s) or alternative explanation(s) is provided that includes relevant, accurate, and sufficient (counter) evidence and reasoning. Provides multiple connections to the claim and evidence used in the argument. Continues to use evidence as an ordered pair. Provokes thoughts about how others may view the same evidence.
3: There is a clear counter argument or alternative explanation that includes mostly relevant, accurate, and sufficient evidence and reasoning, or introduces counter evidence and reasoning. Provides some connection to the claim and/or reasons. Uses most of the evidence as an ordered pair.
2: There is a counter argument or alternative explanation that includes some relevant, accurate, and sufficient evidence and reasoning (insufficient, with inaccuracy). Provides a connection to the claim or reasons, but not both. Uses some of the evidence as an ordered pair, or includes only one cause or effect.
1: There is a counter argument or alternative explanation that includes inaccurate evidence or logic, which makes the argument irrelevant; or the counter argument or alternative explanation is off topic.
0: Does not recognize that alternative explanations exist.

Conclusion: A summary of the scientific argument that highlights the key points and draws an end to the argument in a clear and concise manner.
4: Includes a confidence level about the findings, provides suggestions for further research, and explains how errors may impact the findings cited in the scientific argument. The concluding statement emphasizes the most significant, relevant, and accurate evidence and reasons and/or counter argument that support the claim. Strong and appropriate signal words are used. A clear, concise, and appropriate concluding statement ends the paragraph.
3: Includes most of the following: a confidence level about the findings, suggestions for further research, and an explanation of how errors may impact the findings cited in the scientific argument. The concluding statement emphasizes the significant, relevant, and accurate evidence and reasons and/or counter argument that support the claim. Appropriate signal words are used. An appropriate concluding statement ends the paragraph.
2: Includes some of the following: a confidence level about the findings, suggestions for further research, and an explanation of how errors may impact the findings cited in the scientific argument. The concluding statement emphasizes evidence or reasons or counter argument (not always relevant, significant, or accurate). Some signal words are used, whether appropriate or inappropriate. A somewhat appropriate concluding statement ends the paragraph.
1: Includes one of the following: a confidence level about the findings, suggestions for further research, or an explanation of how errors may impact the findings cited in the scientific argument. No signal words are used, and a somewhat appropriate concluding statement ends the paragraph.
0: No concluding statement; or the concluding statement does not emphasize evidence and reasons and/or counter argument that support the claim.

Conventions: The overall flow and grammar of the paper.
4: Control of sentence structure, grammar and usage, and mechanics. Choice of signal words increases the integrity of the argument. Length and complexity of the response provide the opportunity for the student to show control of standard English conventions. The paragraph flows because the claim, evidence, reasons, and counter argument are logically connected.
3: Errors do not interfere with communication and/or clarity, or there are few errors relative to the length of the response or the complexity of sentence structure, grammar and usage, and mechanics. Signal words are used. The paragraph flows most of the time because the claim, evidence, reasons, and counter argument are logically connected.
2: Errors interfere somewhat with communication and/or clarity; or there are too many errors relative to the length of the response or the complexity of sentence structure, grammar, and mechanics; or appropriate signal words are lacking; or the response is not written as a paragraph and lacks paragraph integrity and complexity (separate claim, evidence, reason).
1: Errors seriously interfere with communication and clarity; or there is little control of sentence structure, grammar and usage, and mechanics; or there is an attempt to make an argument but it is not a sufficient writing sample to evaluate CERCC. If you wrote something, you earn at least a "D" in the conventions category.

Score: ___ of 24. Overall comments:

***In the study, FCAs and Conclusion were in the same category. FCAs (Conventions): the overall flow and grammar of the paper.
***Anything in bold print was not used in the study but will be incorporated in the 4th year of implementation (academic year 2015-2016).

Holy Ichthyoplankton Batman! — Science Research as a Teacher at Sea
June Teisan, Education Outreach and Program Specialist, National Oceanic and Atmospheric Administration (NOAA)

How do STEM educators stay abreast of cutting-edge practices and programs? In what ways can they renew and refresh skills that go beyond pedagogy and instead ground them in real-life research in their primary STEM major? I have found that sending myself to "science teacher boot camps" each summer has done exactly that for me.
Whether it was spending a week studying zoonosis at the Centers for Disease Control in Atlanta, a stint focused on the science of food safety with the US Department of Agriculture in Washington, D.C., or time on Lake Superior with scientists from the Environmental Protection Agency investigating water quality, I flourished as a biologist in these immersive science settings and was then able to transform the experience into deeper learning for my students. So imagine my excitement when an email in early 2015 alerted me to my selection as a Teacher at Sea with the National Oceanic and Atmospheric Administration. I’ve known for a long time that NOAA’s Teacher at Sea Program is a premier educator training experience that launches an educator on an authentic research expedition to work side-by-side with world-class scientists in the field. The teacher can, in turn, share this adventure with students in their classroom. In my case, I would be sharing my ocean experiences with educators and students across the country! As an Albert Einstein Distinguished Educator Fellow placed in NOAA’s Office of Education in Washington, D.C., I was spending the year presenting to teachers at professional development conferences nationwide, so I would be able to bring a vibrant, first-hand account of the Teacher at Sea program to my audiences of educators. And although I don’t have a class of my own right now, I’m also in touch with K-12 students through my home district and in the classrooms of my former student teachers. I set sail May 1st for a two-week voyage on the Gulf of Mexico aboard the Oregon II, a 170-foot research vessel from NOAA’s Fisheries facility in Pascagoula, Mississippi. The purpose of the cruise was to measure water quality parameters and gather ichthyoplankton samples, specifically targeting the larvae of Bluefin tuna. 
With my two noon-to-midnight teammates, and the invaluable Oregon II deck crew to operate the winches, I learned to draw samples from the Gulf with specially developed equipment at the surface, sub-surface, and at depths in excess of 200 meters! The data collected on NOAA's plankton cruises provides one piece of the complex puzzle of the regulation of commercial and recreational fishing. Ichthyoplankton data is added to findings from trawl teams catching juvenile sizes of certain species, analysis of gonads and spawn from adult fish caught on other cruises, and other stock assessment information. Data analysis and modeling examine these information streams, and serve as the basis of stock assessment recommendations brought to policy makers. Spring ichthyoplankton surveys have been conducted for over 30 years, and my Teacher at Sea time was an amazing glimpse behind the scenes of NOAA's critical work maintaining the health of our fisheries. Onboard ship I quickly learned that there's a unique rhythm to a working research vessel, and it takes a while to acclimate. During the 12-hour shift to which I was assigned, noon to midnight, there could be a variable number of hours of waiting to get to the next testing site at particular coordinates in the Gulf, then 2-3 hours of intense physical activity as we deployed the scientific gear and processed the samples, all on a rolling ship deck. And oh, once you settle into that pace and schedule… BAM! …weather delays or ship repairs upend the plans and we'd be back to uncertainty. What was a big help in all the ambiguity was working with such a positive, cheerful, professional team, which made it easier to roll with the changes. I was most certainly out of my comfort zone on the Oregon II.
While I have sailed on boats and ships of various sizes, this is an intense working vessel on a tight schedule, collecting the 2015 sample set of a thirty-year ongoing data collection effort, so the focus does not vary and the pace is rigorous. So as the waves continued to build (we started with calm seas and experienced 8-9 foot swells), a trust in the ship's crew and my science team helped me dig in and contribute to the work in spite of my lack of training and my newbie status. It was in all a stellar scientific research experience for a geeky biology major who spends most of her days in a classroom filled with exuberant 11-year-olds! While on board the Oregon II for fifteen days, I was deeply impressed by many facets of this scientific journey.
• The level of dedication, professionalism, and passion of the NOAA science team: This work is high-caliber data gathering in sometimes grueling conditions, with monotonous waiting periods in close quarters, but the good humor, dedication to best-practice field science, and mutual respect and support among the team is always evident.
• The complexity of running a working research vessel: From the commanding officer down the chain, each crew member has their jobs and each person is vital to the success of the excursion.
• The importance of the work: Our fisheries are a vital food source; to manage the stocks and avoid overfishing we need data to make management decisions that ensure a healthy ecosystem.
• The beauty and jaw-dropping magnificence of the Gulf: This vast expanse of water, teeming with life, driving weather patterns, supplying us with food and fuel, is a sight beyond words.
• The pathways to STEM: Always curious about why people choose the careers they do (at what point did a door open, who pointed the way, when did the proverbial light bulb go on), I asked members of our science team and ship's crew the when, how, and why behind their chosen careers and learned so much from them that I can in turn share with other educators about creating STEM opportunities for students.
It is the oceans, lakes, rivers, and streams that give Earth its stunning blue hue and foster life on our planet. I am passionate about protecting our water resources, and equally passionate about sharing this stewardship mission with students and peer educators. So I was beyond excited to be chosen for a STEM adventure with the National Oceanic and Atmospheric Administration's Teacher at Sea Program. If you or someone you know would like to apply, please visit teacheratsea.noaa.gov.

Same Gender Effect (In the Zone Treatment) in a Mixed Gender Classroom, Part Three: As It Relates to Superior Content Retention
J. Gail Armstrong-Hall, Ph.D.

Abstract
By having gender separation, both vertically and horizontally, in a treatment called "in the zone," seventh grade students at a middle school were able to more than double their content knowledge acquisition. Further, the classes engaged in this treatment exhibited lower decibel level readings than the regular coed class (no treatment) and the same gender classes. This longitudinal study suggests that coed public schools with the "in the zone" treatment should be able to outperform private and charter schools that rely on same gender classes.

At the beginning of this school year, I was explaining to the parent of a new student that my same-gender classes from the previous year retained twice as much information as the mixed gender classes.
I suddenly realized that his daughter was destined to be in a 'mixed class' for this school year and I had just announced that she would be learning half as much as her counterparts in a same gender class. Whoops! How can any teacher in good conscience allow this to happen? There must be some way to simulate the same gender effect in a mixed class. Perhaps a reorganization of our science classrooms might offer a solution. We now have the potential to have students separated in the vertical plane since I now have lab tables on two different height levels. At the back of the room I have higher lab tables and at the front I have lower lab tables. Believing that girls are talkers and boys are pokers, I decided to put the girls up front where the tables were in rows that did not face each other. This arrangement minimized gossip and helped the girls focus on my instructions. The boys were at the taller lab tables at the back of the room and were spatially "enticed" to look at me for instruction. This vertical and horizontal separation of the sexes was dubbed the "In the Zone Treatment". See the diagram in Figure 1 for the layout. Students knew that being 'in the zone' was meant to help them focus on my instructions; afterward, students could move about the room. I selected one class to be the "control" class, and its students were mixed in their seating arrangement. A second class was all the same gender (male). All students were pre- and post-tested to determine how many of the essential science concepts they learned. A random sample was taken from each class, pulling data from every 5th person. The results of the data collection are found in Tables 1 and 2. To be able to compare current data with those collected in past years and establish better controls over the testing, the same essential questions and grading rubrics were used. (See Appendices 1-3 to view last year's data.)
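The sampling and scoring procedure just described (pre- and post-tests scored against 60 possible points, with data pulled from every 5th student) can be sketched as follows. The score lists below are invented placeholders for illustration only; the study's actual data appear in Tables 1 and 2.

```python
# Hypothetical pre/post scores out of 60 points for one class, in roster
# order; placeholders only, not the study's data.
pre_scores  = [12, 20, 15, 18, 22, 10, 25, 17, 14, 19]
post_scores = [30, 41, 33, 39, 44, 28, 50, 36, 31, 40]

TOTAL = 60  # points possible on the essential-questions test

# Systematic sample: pull data from every 5th student, as in the study.
sampled = list(range(4, len(pre_scores), 5))

# Content gain for each sampled student, as a percentage of possible points.
gains = [100.0 * (post_scores[i] - pre_scores[i]) / TOTAL for i in sampled]
mean_gain = sum(gains) / len(gains)
print(f"sampled students: {sampled}, mean content gain: {mean_gain:.1f}%")
```

A treatment-versus-control comparison would repeat this calculation for each class and compare the resulting mean gains (for example, a treatment-class gain against the control-class gain).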
Figure 1: IN THE ZONE TREATMENT

Discussion
Data from this year were supported by comparison with the previous year: in the all-boy class, students performed almost twice as well (25%) as the control class (15%). By having a gender separation both vertically and horizontally in a treatment called "in the zone," 7th grade students at a middle school were able to more than double their content knowledge acquisition. The treatment classes had 40%, 35%, and 38% content improvement, compared to the control group, which had a 15% content gain, and the all-boy group, which had a 25% content improvement. The overall content retention between years was higher in the first year (maximum of 57 points out of 60) and lower in the second (maximum of 41 points out of 60). The problems picked for analysis were based on three types of pedagogy: a field trip, an experiment, and independent reading formats. The field trip had to be deleted the second year due to lack of funding. Field trips really can make a difference. Decibel readings further support the "in the zone" technique, as they give us evidence of lower sound levels (at least 10 decibels lower) in the treatment classes as opposed to the control or all-boy groupings.

History/Conclusions
Focused communication in the classroom is essential to success for both the student and teacher. Spatial orientation, usually in the form of traditional rows, horseshoe, and modular tabling arrangements, can influence the effectiveness of this kind of communication. Sommer explains that the traditional seating of rows first evolved as a way to best make use of light from side windows (Sommer, 1969). Despite advancements in lighting, this configuration persists, with a recent survey showing over 90 percent of the classrooms at one university continue to use this arrangement.
Smaller groups tend to employ the horseshoe arrangement, and modular or cluster grouping is most often found in specialized classrooms like those teaching home economics or science. Lower elementary classrooms also frequently utilize this sort of seating plan. It has long been known that seating organization matters, depending on the type of communication you wish to foster: seating in rows is best for disseminating information, since it limits student interaction and places the primary focus on the teacher; the horseshoe arrangement is effective for both student-student and student-teacher interaction; and modular seating is most effective for student-student interactions. Despite our knowledge about different seating arrangements, many educators continue to use only one seating mode. Teachers cite reasons such as janitorial preference, having been chastised by colleagues or administrators for a ‘messy room’ when they changed, or simply never having thought about changing their seating as reasons for using traditional row-based seating. Change in education is very slow. Teachers are conservative and resistant to permanent change. They make short-term changes in teaching pedagogy but often end up going back to what they feel works best, making lasting change in education very difficult to effect. I am not aware of anyone looking at vertical/gender separation in the classroom, only horizontal variation. Another factor that makes this arrangement particularly powerful is that it takes into account bullying behavior by gender. According to our school administrator, Dr. Kocenda, girls bully by gossiping, and with their backs to each other in this seating arrangement, it is hard for them to do this. Boys tend to bully by physical horseplay and have a very large lab table separating them, thus reducing this problem. The “in the zone treatment” is recommended for giving instruction only.
It creates a focused environment for the teacher to introduce the lesson of the day. After the introduction of the lesson, students can either stay in their seats and work or gravitate to group work within the room, depending on whether the assignment calls for student-student interaction.

References

Armstrong-Hall, J. G. (2008). Same Gender Instruction Part Two: As It Relates to Content Retention. Submitted to the Troy Public Schools.
Galvin, K., & Book, C. (1976). Growing Together: Classroom Communication. Columbus, Ohio: Charles E. Merrill.
Sommer, R. (1969). Personal Space. Englewood Cliffs, NJ: Prentice-Hall.
McCroskey, J. C., & McVetta, R. W. (2009). Classroom Seating Arrangements: Instructional Communication Theory Versus Student Preferences. www.jamescmccroskey.com/publicaitons/82.htm
Hurt, H. T., Scott, M. D., & McCroskey, J. C. (1978). Communication in the Classroom. Reading, Mass.: Addison-Wesley Publishing Company.
Kocenda, D. (2008). Personal communication, June 26, 2008. Middle school assistant principal.

Artifacts and Data Tables: Essential Science Learnings for 7th Grade Pretest and Posttest

1. Can you describe and draw the water, carbon and rock cycles?
2. Can you identify mechanical and chemical weathering and give examples of each?
3. What is the relationship between erosion and weathering?
4. What is the difference between succession and evolution?
5. What causes the seasons on the earth?
6. What is the composition of the air and how does it move?
7. How is matter affected by temperature change?
8. What is the difference between a null and a directed hypothesis, and between an independent and a dependent variable?
9. How do you write an abstract and a conclusion?
10. How does one analyze data?
Pre-Test and Post-Test Data for Classes, 2008

            First Hour   Second Hour   Third Hour       Fifth Hour       Sixth Hour
            All Boys     All Girls     Boys and Girls   Boys and Girls   Boys and Girls
            Points / %   Points / %    Points / %       Points / %       Points / %
Pretest     29 / 48      15 / 25        9 / 15          30 / 50          23 / 38
Posttest    57 / 95      42 / 70       22 / 37          27 / 45          27 / 45

Pre-Test and Post-Test Data for Classes, 2009

            First Hour   Second Hour   Third Hour    Fourth Hour   Fifth Hour
            In the Zone  Control       In the Zone   All Boys      In the Zone
            Points / %   Points / %    Points / %    Points / %    Points / %
Pretest      9 / 15      10 / 17       20 / 33       11 / 18        4 / 7
Posttest    33 / 55      19 / 32       41 / 68       26 / 43       27 / 45

Percentage Increase/Decrease in Knowledge between the Classes, 2008

1st hour: 47%   2nd hour: 45%   3rd hour: 22%   5th hour: -5%   6th hour: 7%

Percentage Increase/Decrease in Knowledge between the Classes, 2009

1st hour: 40%   2nd hour: 15%   3rd hour: 35%   4th hour: 25%   5th hour: 38%

Loudness by Class Period, Measured in Decibels for Ten Days in February and March, 2009

1st hour treatment: 72.7   2nd hour control: 82.8   3rd hour treatment: 73.3   4th hour all boys: 79.6   5th hour treatment: 73.4

The Decibel Difference

Private school results in a public school setting: that is what Dr. Hall’s science class offers students at middle school. Recently Dr. Hall tested the decibel difference between a regular class setup, with students seated at random, and her “in the zone treatment” setup (see paper above). Here are the results:

Hour     Random Seating Decibel Level   In The Zone Seating Decibel Level
First    84                             70
Second   82                             70
Third    87                             72
Fifth    81                             71
Sixth    89                             72

It is clear that there is a significant difference in decibel level when students are placed in the zone as opposed to their regular random seating in most classes. This year’s group has had a great deal of success with academics, and their behavior and focus have been much improved thanks to this classroom design.
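The percentage increase/decrease rows are simply posttest percent minus pretest percent for each class period; a minimal Python sketch (values transcribed from the 2009 table above):

```python
# Knowledge gain per class period: posttest % minus pretest %,
# reproducing the 2009 "Percentage Increase/Decrease" row above.
pretest_pct = {"1st": 15, "2nd": 17, "3rd": 33, "4th": 18, "5th": 7}
posttest_pct = {"1st": 55, "2nd": 32, "3rd": 68, "4th": 43, "5th": 45}

gains = {hour: posttest_pct[hour] - pretest_pct[hour] for hour in pretest_pct}
print(gains)  # {'1st': 40, '2nd': 15, '3rd': 35, '4th': 25, '5th': 38}
```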
Seasonal Visibility of Stars, and Visibility of Planets in 2015-2017, from positions of planets in their orbits

Robert C. Victor, Abrams Planetarium, Michigan State University

These orbit charts and accompanying data table can be used for plotting the positions of the six inner planets, and determining any planet’s visibility as seen from Earth. In addition to doing the problem set below as a desktop activity, students can “act out” each problem’s situation in the classroom by having one student represent the Sun, another the Earth, and others the five other planets. Be sure to have all students take a turn at representing the Earth. That student will do more than just stand in place: he or she will rotate as well, to determine planet visibility at dusk, in the middle of the night, and at dawn. These two charts of the orbits of the planets, one showing Mercury through Mars and the other Mercury through Saturn, depict the view as seen from the north side, or “above,” the solar system. In these views, the direction of revolution of the planets about the Sun is counterclockwise. The outer circular scale is labeled with values of heliocentric longitude, measured from the Vernal Equinox, the apparent direction of the Sun as seen from Earth at the beginning of northern hemisphere spring. That scale also indicates the directions of the thirteen zodiacal constellations (those in the plane of the Earth’s orbit) from the Sun. The directions of the five first-magnitude stars Aldebaran, Pollux, Regulus, Spica, and Antares, as well as the Pleiades star cluster, are also indicated. The outer circular scale should be imagined to be much larger than shown: Earth is one astronomical unit, or 8-1/3 light minutes, from the Sun, compared to stellar distances of many light-years. One light-year is approximately 63,000 astronomical units. On a chart where the Sun-Earth distance (one a.u.) would be represented by one inch, a light-year would be represented by one mile.
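The chart-scale arithmetic in the paragraph above can be checked directly; a short Python sketch using the standard defined values of the astronomical unit and light-year:

```python
# Scale check for the orbit charts: one light-year in astronomical units,
# and the inch-to-mile analogy from the text.
AU_M = 1.495978707e11        # astronomical unit, meters (IAU definition)
LY_M = 9.4607304725808e15    # light-year, meters

ly_in_au = LY_M / AU_M
print(round(ly_in_au))       # ~63241, i.e. "approximately 63,000 AU"

# If 1 AU is drawn as 1 inch, a light-year lies ly_in_au inches away:
INCHES_PER_MILE = 63360
print(round(ly_in_au / INCHES_PER_MILE, 2))  # very nearly one mile
```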
On both orbit charts, the Earth’s orbit is exactly in the plane of the sheet of paper. For all the other orbits, the portion drawn as a solid curve lies north of, or above, Earth’s orbit plane. The dotted part of the orbit lies south of, or below, Earth’s orbit plane. Viewed from the north side of the solar system, the Earth’s rotation on its axis also appears counterclockwise. But the axis of Earth does not point at right angles to the plane of the orbit; rather, it tips away from the perpendicular, leaning by about 23.4° toward the top edge of the chart, beyond the 90° mark of the circular scale. Using both orbit charts and the data table, try working out the answers to these questions:

1. Why is the Pleiades star cluster visible all night each year around November 20? Where (in what direction in the sky) would you expect to see it at nightfall? In the middle of the night? At dawn’s first light? Why can’t you see the cluster for several weeks around May 20?

2. On what approximate date each year is Aldebaran visible all night? Give the approximate date of all-night visibility for Pollux; Regulus; Spica; Antares.

3. On what approximate date each year is Earth heading toward Antares and away from Aldebaran? On that date, Antares is visible (at dusk or at dawn?) about 90 degrees from the Sun, while Aldebaran is visible (at dusk or dawn?), also about 90 degrees from the Sun.

4. In which month would a First Quarter Moon appear near the star Spica? Hint: The First Quarter Moon occurs when the Moon appears 90 degrees, or a quarter-circle, east (counterclockwise in this top view) of the Sun.

5. Northern Hemisphere summer?

6. Describe the arrangement of Sun, Venus, and Earth that occurred on August 15, 2015. The arrangement, with Venus passing between Earth and Sun, is called an inferior conjunction of Venus. Notice Venus was located in the portion of its orbit plotted as a dotted curve, rather than solid. During the alignment on Aug. 15, 2015, did Venus pass north, or south, of the Sun’s disk? Before Aug. 15, 2015, the previous time Venus passed between Earth and Sun occurred just over 19 months earlier, on Jan. 11, 2014. On that occasion, did Venus pass north, or south, of the Sun’s disk? Just over 19 months before that, on June 5, 2012, Venus appeared as a small black dot moving across the Sun’s disk. This rare event was a transit of Venus, which won’t happen again until December 10, 2117. From the orbit diagram, can you explain why transits of Venus can happen only in early June or early December? After Aug. 15, 2015, the next inferior conjunctions of Venus will occur at intervals of just over 19 months, on Mar. 25, 2017 (north or south of Sun’s disk?), Oct. 26, 2018 (north or south of Sun’s disk?), and Jun. 3, 2020 (narrowly N of Sun’s disk). For several weeks before and after each of these events, what will be the phase of Venus? In 2016, Mercury passes inferior conjunction on Jan. 14, May 9, Sept. 12, and Dec. 28. During one of these events, Mercury will transit the Sun’s disk. On what date?

7. Which brilliant planets will form a close pair on Oct. 25 and 26, 2015? (Use the Outer Planets Chart.) When will the event be seen, at dusk or at dawn? The two planets will be easily seen within the same telescopic field. Describe their appearances through the telescope. Another planet, not as bright, will fit within the same 5° binocular field as the bright pair, forming a trio with them for eight mornings, Oct. 22-29, 2015. Which planet? For much of October 2015, yet another planet will be seen at the same time of day as the preceding three planets, but closer to the Sun and lower in the twilight glow. Which planet?

8. Using the Inner Planets Chart, find which two planets will appear close together in our sky on Nov. 3, 2015? On Feb. 13, 2016? On Jul. 16, 2016? For each pair, determine the time of day it will be seen, at dusk or at dawn.
9. From late January through most of February 2016, all five naked-eye planets will be simultaneously seen in twilight. On Feb. 1, 2016, the Moon will appear half full and close to one of the five planets. Plot all the planets’ positions for that date on the orbit diagrams, and determine: (a) When can you see all five planets, at dusk or at dawn? (b) What are the names of the planets in order of their apparent positions in the sky, from the eastern to the western horizon? (c) What is the name of the planet near the “half Moon” on Feb. 1? (d) Note that in early February, no planets will be visible at dusk, in twilight after sunset. As the Earth rotates, which planet rises first in the evening, after sunset? Which rises last, shortly before sunrise?

10. Which planet will be at opposition, visible all night, on March 7-8, 2016? In which constellation will it appear? Which bright star will appear about 18° west of that planet?

11. In which constellation will the Full Moon appear on May 21, 2016? Which bright planet will appear near the Moon that night? As the Earth rotates on its axis, the Moon and the planet, near opposition that night, will move together across the sky all night.

12. Saturn will be at opposition, visible all night, a day after the start of what month in 2016? In which constellation will it appear? Which bright star will appear near Saturn?

13. Using the Outer Planets Chart, find which two planets will appear close together in our sky on these dates in 2016: On Jan. 9? On Aug. 24? On Aug. 27? On Oct. 11? On Oct. 29? For each pair, determine the time of day it will be seen, at dusk or at dawn.

14. In what month in 2017 will Venus reach its greatest angular separation from the Sun in the evening sky? In what phase will Venus appear then? Follow Venus’ phases through a telescope on evenings until late March that year.
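For teachers who want to check answers, the chart-and-straightedge procedure can be approximated numerically. A minimal Python sketch, assuming circular orbits at mean radii (an assumption; the printed charts draw the true orbits, so treat results as approximate):

```python
import math

# Assumed mean orbital radii in AU (circular-orbit approximation).
RADII = {"Mercury": 0.39, "Venus": 0.72, "Earth": 1.00,
         "Mars": 1.52, "Jupiter": 5.20, "Saturn": 9.55}

def chart_xy(planet, longitude_deg):
    """Heliocentric longitude (degrees, from the data table) -> x/y chart
    coordinates in AU: Sun at the origin, 0 deg toward the Vernal Equinox,
    counterclockwise positive."""
    r = RADII[planet]
    t = math.radians(longitude_deg)
    return r * math.cos(t), r * math.sin(t)

def elongation(planet, planet_lon, earth_lon):
    """Apparent angle between a planet and the Sun as seen from Earth.
    Positive = east of the Sun (evening sky, dusk); negative = west (dawn)."""
    px, py = chart_xy(planet, planet_lon)
    ex, ey = chart_xy("Earth", earth_lon)
    sun_dir = math.atan2(-ey, -ex)              # direction Earth -> Sun
    planet_dir = math.atan2(py - ey, px - ex)   # direction Earth -> planet
    return (math.degrees(planet_dir - sun_dir) + 180) % 360 - 180

# Example: two weeks before the Aug. 15, 2015 inferior conjunction
# (table values for 8/2015: Venus 299 deg, Earth 309 deg) -- a small,
# shrinking evening (east) elongation.
print(round(elongation("Venus", 299, 309), 1))
```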
This data table and accompanying orbit charts can be used for plotting the positions of the six inner planets, and then determining any planet’s visibility as seen from Earth. For a set of questions about coming events, see the activity sheet, Seasonal Visibility of Stars, and Visibility of Planets in 2015-2017.

To plot Earth or another planet on any date of interest, first place a ruler or straightedge on the orbit diagram. Lay it along a line from the center of the Sun dot to the appropriate degree mark on the circular scale matching the longitude of the planet on that date, as given in the table below. Next, make a tick mark where the straightedge crosses the planet’s orbit.

Robert D. Miller, who provided the orbit charts, did graduate work in Planetarium Science and later astronomy and computer science at Michigan State University. He remains active in research and public outreach in astronomy.

Data table for the activity sheet, Seasonal Visibility of Stars, and Visibility of Planets in 2015-2017
Heliocentric Longitudes of the Planets on the first day of each month, July 2015-July 2017

Date      Mercury   Venus   Earth   Mars   Jupiter   Saturn
7/2015      350      250     279     91      149       243
8/2015      163      299     309    106      151       244
9/2015      264      348     338    121      154       245
10/2015       9       36       8    134      156       246
11/2015     180       86      38    148      159       247
12/2015     272      135      69    161      161       248
1/2016       31      185     100    174      163       248
2/2016      196      235     132    188      166       249
3/2016      281      281     161    201      168       250
4/2016       48      330     192    216      170       251
5/2016      203       18     221    231      172       252
6/2016      293       67     251    247      175       253
7/2016       67      116     280    263      177       254
8/2016      216      166     309    281      179       255
9/2016      305      216     339    300      182       256
10/2016      92      264       8    318      184       257
11/2016     228      313      39    338      186       258
12/2016     315        0      69    357      189       259
Date      Mercury   Venus   Earth   Mars   Jupiter   Saturn
1/2017      117       50     101     16      191       260
2/2017      239      100     132     35      193       260
3/2017      323      145     161     51      195       261
4/2017      129      196     191     68      198       262
5/2017      242      244     221     83      200       263
6/2017      338      293     251     98      202       264
7/2017      145      340     279    112      205       265

[Chart of Planetary Orbits, Mercury through Mars: orbits viewed from north of the solar system, with the outer circular scale marking heliocentric longitude, the directions of the zodiacal constellations and of Aldebaran, Pollux, Regulus, Spica, Antares, and the Pleiades, and the equinox/solstice dates (Mar 19-20, Jun 20-21, Sept 22-23, Dec 21-22). Scale: 0-3 astronomical units; 1 AU = 93 million miles. Robert D. Miller, Sept. 26, 2011.]

[Chart of Planetary Orbits, Mercury through Saturn: same view and labeling, extended to Jupiter and Saturn. Scale: 0-10 astronomical units; 1 AU = 93 million miles. Robert D. Miller, Sept. 26, 2011.]

Rethinking the Egg Drop with NGSS Science and Engineering Practices

Joshua Ellis, Assistant Professor of STEM Education, and Emily Dare, Assistant Professor of STEM Education, Michigan Technological University; Matthew Voigt, Graduate Student in Math and Science Ed, San Diego State University; Gillian Roehrig, Professor of STEM Education, University of Minnesota

The Next Generation Science Standards (National Research Council, 2013) call for instruction that weaves science, technology, engineering, and mathematics (STEM) concepts into a coherent instructional plan. Engineering is relatively new to K-12 classrooms, but it has been shown to benefit students by contextualizing math and science content and developing students’ problem-solving and teamwork skills (Brophy et al., 2008; Hirsch, Carpinelli, Kimmel, Rockland, & Bloom, 2007; Koszalka, Wu, & Davidson, 2007). However, integrating engineering into a science classroom proves challenging for many teachers when students choose to “tinker” in order to solve a given problem or challenge instead of meaningfully applying science and mathematics concepts (Dare, Ellis, & Roehrig, 2014).
Many of these engineering-integrated science activities result in students not actively reflecting on the science content that would help them address the challenge effectively; instead, students resort to manipulating variables or design elements at random until the challenge is met. A favorite classroom activity that is evocative of this engineering-integrated approach to science is the famous Egg Drop. The objective for students is to design a device that protects a chicken egg from breaking during a fall, usually from the height of a building. Students may use the given materials to either cushion the egg on impact or break the fall of the egg with a parachute-type mechanism. In either case, the students succeed or fail depending on the state of the egg after the drop. While this activity is often enjoyable and memorable for the students, it is often performed without students applying science concepts to the creation of their egg containers or analyzing any data to reinforce science or mathematics concepts. As suggested above, students make their design decisions based on common knowledge and trial-and-error. However, this activity is ripe for the application of not only science concepts but also practices from mathematics and engineering. We sought to reimagine the classic Egg Drop activity in a way that guides students through an application of science, mathematics, and engineering while still retaining the fun of smashing eggs. One could argue that the Egg Drop is in fact more engineering than science; students are presented with a problem and must design, create, and test a device that will solve the problem. However, one could also argue that engineering requires the application of science and mathematics concepts, in which case the Egg Drop as typically practiced is nothing more than a fun, art-like project. Moore et al.
(2014) note that engineering can provide an engaging context for science or mathematics learning, and the egg drop activity certainly seems to lend itself well to that approach. The Egg Drop is often associated (albeit loosely) with the physics concepts of force and motion, inertia, and impulse, and the activity is almost always presided over by a science teacher. Our approach to the reimagined Egg Drop activity, presented to science teachers during a professional development workshop, called upon students (science teachers in this case) to explore an engineering design challenge that required the understanding and application of scientific concepts to achieve the broad goal of protecting the egg. We allowed participating teachers to define a more specific goal for their students that would be applicable in their classroom; some examples included minimizing the force at impact, calculating the deceleration of the egg, and simply determining the maximum impact velocity that would still leave the egg intact. Success was assessed by the teachers’ use of mathematical data analysis and measurement techniques in their experiments and testing. Our target audience for this activity was physical science classrooms ranging from elementary to high school. The traditional Egg Drop involves constant acceleration (usually by dropping the egg from a given height), and a complete understanding of force and motion in this context requires knowledge of quadratic functions. Since this would not be appropriate for many of our teachers’ students, we chose instead to “drop” the egg horizontally, allowing us to limit the motion of the egg to a constant velocity in a single dimension. This level of mathematical knowledge is relevant for solving many problems, and we wanted our new activity to situate students’ mathematical learning in the context of realistic life scenarios, as described by Doorman and Gravemeijer (2009).
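The mathematical simplification described above can be made concrete: vertical free fall requires a quadratic model, while a car moving at constant speed needs only a linear one. An illustrative Python sketch (the numbers are hypothetical, not from the workshop):

```python
# Vertical drop: distance grows with the square of time (quadratic model).
def fall_distance(t, g=9.8):
    """Distance fallen from rest after t seconds, in meters."""
    return 0.5 * g * t**2

# Horizontal "drop": distance grows linearly with time (linear model).
def roll_distance(t, v=2.0):
    """Distance traveled at constant speed v (m/s) after t seconds."""
    return v * t

print(fall_distance(1.0))  # quadratic: meters fallen in 1 s
print(roll_distance(1.0))  # linear: meters rolled in 1 s
```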
With this in mind, we focused on answering this question: What real-world problem is analogous to protecting an egg traveling in a straight line? With motion restricted to one dimension, we decided that a train would be an appropriate mode of transport for the egg. This decision helped guide our development of an engineering design challenge with a realistic problem. At the time, rail transport was experiencing a huge boom as a result of oil mining in the Upper Midwest, and many of our participants were well aware of the increase in rail traffic. Some were even aware of devastating recent derailments (see below for more detail). This real-world problem provided us with a compelling context for our engineering design challenge: the egg was an analogue for hazardous cargo being transported by rail cars. The challenge for the students was to design a rail car that would protect the cargo (the egg) in the event of a collision. The challenge was named “Runaway Train,” and teachers were first presented with the following problem:

Your client is a rail transport company that is experiencing a boom in business. They need to transport toxic and/or dangerous cargo, but all of their rail cars are too old and unreliable. Design a new rail car that will protect the cargo in the event of a collision.

To provide context for the problem, we discussed what happens when a train collides or derails. A recent and tragic example of derailment had occurred on July 6, 2013 in the town of Lac-Mégantic, Quebec, Canada. The brakes on a 73-car freight train carrying crude oil failed, causing a derailment and massive explosion (Transportation Safety Board of Canada, 2014). The 1 kilometer blast radius killed 42 persons and destroyed over half of the buildings in the downtown area. Harrowing eyewitness statements (Crary, 2013) were also shared with the teachers in order to further personalize and contextualize the problem.
For students, the reaction would not be much different. For our professional development purposes, the content focus was limited to force and motion, though teacher participants were encouraged to explore other related areas, such as momentum and energy. Table 1 depicts the NGSS standards related to our presentation of Runaway Train. One of the prompts that we provided regarding force and motion content is reproduced below:

Collisions typically involve both forces and motion. The position, velocity, and acceleration of an object play a large role in the outcome of a collision, as does the interplay of balanced and unbalanced forces. Other concepts, such as force over time (impulse), may be relevant as well.

Variations of this prompt might be useful for different applications of this engineering design challenge.

Table 1. Potential NGSS Connections to Runaway Train

Grade level: Elementary
Possible science content to focus on: forces (including friction)
NGSS DCIs:
PS2.A: Each force acts on one particular object and has both strength and direction. An object at rest typically has multiple forces acting on it, but they add to give zero net force on the object. Forces that do not sum to zero can cause changes in the object’s speed or direction of motion. (3-PS2-1) The patterns of an object’s motion in various situations can be observed and measured; when that past motion exhibits a regular pattern, future motion can be predicted from it. (3-PS2-2)
PS3.A: The faster a given object is moving, the more energy it possesses. (4-PS3-1)
PS3.B: Energy is present whenever there are moving objects, sound, light, or heat. When objects collide, energy can be transferred from one object to another, thereby changing their motion. In such collisions, some energy is typically also transferred to the surrounding air; as a result, the air gets heated and sound is produced. (4-PS3-2, 4-PS3-3)
PS3.C: When objects collide, the contact forces transfer energy so as to change the objects’ motion. (4-PS3-3)

Grade level: Middle
Possible science content to focus on: position vs. time graphs, forces, and potential and kinetic energy
NGSS DCIs:
PS2.A: For any pair of interacting objects, the force exerted by the first object on the second object is equal in strength to the force that the second object exerts on the first, but in the opposite direction. (MS-PS2-1) The motion of an object is determined by the sum of the forces acting on it; if the total force on the object is not zero, its motion will change. The greater the mass of the object, the greater the force needed to achieve the same change in motion. For any given object, a larger force causes a larger change in motion. (MS-PS2-2) All positions of objects and the directions of forces and motions must be described in an arbitrarily chosen reference frame and arbitrarily chosen units of size. In order to share information with other people, these choices must also be shared. (MS-PS2-3)
PS3.A: Motion energy is properly called kinetic energy; it is proportional to the mass of the moving object and grows with the square of its speed. (MS-PS3-1)
PS3.B: When the motion energy of an object changes, there is inevitably some other change in energy at the same time. (MS-PS3-5)
PS3.C: When two objects interact, each one exerts a force on the other that can cause energy to be transferred to or from the object. (MS-PS3-2)

Grade level: High
Possible science content to focus on: energy/work/power, acceleration in 1D, forces (with vector diagrams), momentum
NGSS DCIs:
PS2.A: Newton’s second law accurately predicts changes in the motion of macroscopic objects. (HS-PS2-1) Momentum is defined for a particular frame of reference; it is the mass times the velocity of the object. (HS-PS2-2)
PS3.B: Mathematical expressions, which quantify how the stored energy in a system depends on its configuration and how kinetic energy depends on mass and speed, allow the concept of conservation of energy to be used to predict and describe system behavior.

Our teacher participants were tasked with choosing a specific challenge to address when designing their rail car. Though our broad directions, outlined above, provided teachers with the goal of creating a new design for a rail car to protect the cargo, they chose how to test their design in order to convince the client that their prototype was the best. Possible challenges that were presented to teachers were to create rail cars that would: have minimum damage or deformation after a collision, be able to withstand the greatest impact force or deceleration, or safely stop at a certain distance from a potential “crash site.” Our teachers put themselves in the role of the student and tackled their challenges in creative ways. Their solutions were based on their knowledge of force and motion, and they assessed their designs through data analysis and measurement. For example, one team of teachers used Vernier force sensors to measure the relationship between the velocity of the rail car and the impulse it experiences when striking a barrier. Another team related the total mass of the train car to the force it experiences upon collision, a relationship often expressed in science textbooks as Newton’s 2nd Law. These groups then modified their designs in order to better protect the egg being transported by their rail car. These scientifically grounded changes were based not only on their own group’s findings, but on those of other groups as well. This iterative engineering process allowed teachers to engage in redesign in order to generate more robust rail cars that were better able to protect their cargo. Figure 1 depicts the final designs of four teams.
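The physics behind the padded rail-car designs is the impulse-momentum theorem: the same change in momentum spread over a longer stopping time means a smaller average force. A minimal Python sketch with illustrative numbers (assumptions for demonstration, not measurements from the workshop):

```python
# Impulse-momentum estimate: F_avg = m * dv / dt for a car brought to rest.
def avg_impact_force(mass_kg, velocity_ms, stop_time_s):
    """Average force (N) stopping mass_kg from velocity_ms in stop_time_s."""
    return mass_kg * velocity_ms / stop_time_s

m, v = 0.5, 2.0                      # a 0.5 kg car+egg moving at 2 m/s
print(avg_impact_force(m, v, 0.01))  # rigid car, stops in 10 ms: large force
print(avg_impact_force(m, v, 0.10))  # padded car, stops in 100 ms: 1/10 the force
```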
As our professional development drew to a close, we encouraged the teachers to think of different applications of this engineering design challenge. For those who were interested in exploring how the egg was affected by a crash, analysis of the collision via slow-motion video was suggested. Others were interested in higher-level physics in which students would be able to use motion sensors to explore the train car’s acceleration down a ramp. Energy was another potential connection that teachers were interested in exploring; one example involved investigating the sources of friction within the rail car system itself. For those who were concerned about their students’ economical use of materials, teachers proposed a budgeting scheme in order to encourage students to carefully plan out their prototype designs. Teachers suggested many other variables to manipulate, such as the speed and size of the train car as well as the number of eggs to transport and protect. Others wondered about the physical placement of the egg within the car and hypothesized how their designs would fare if the egg was mounted to the front of the train rather than “cradled” in a cargo hold.

Figure 1. Examples of rail cars.

Compared to the typical Egg Drop activity used in many classrooms, this rebuild of the activity offers multiple opportunities for science and mathematics connections. Students engaged in this activity are called upon to consider a timely and relevant problem to solve through a scientific understanding of motion with engineering design processes. Runaway Train is a flexible vehicle for learning in a number of specific content areas and meaningfully integrates data analysis and measurement into the design and redesign of the rail car. Specifically, technologies such as probeware and video analysis tools hold the potential to aid students’ ability to connect mathematical concepts to scientific practices.
Through mathematical analysis of the train’s motion, students can apply their knowledge of discrete graphs in order to reason about the motion of their rail car. It is our hope that teachers find unique and applicable ways to bring engineering-integrated science learning to their students through Runaway Train.

Acknowledgements

This study was made possible by National Science Foundation grant DRL-1238140. The findings, conclusions, and opinions herein represent the views of the authors and do not necessarily represent the views of personnel affiliated with the National Science Foundation.

References

Brophy, S., Klein, S., Portsmore, M., & Rogers, C. (2008). Advancing engineering education in P-12 classrooms. Journal of Engineering Education, 97(3), 369–387.

Crary, D. (2013, July 3). Lac-Mégantic’s resilience tested after ‘le train d’enfer’. The Kennebec Journal. Retrieved from http://www.centralmaine.com/2013/07/13/lac-megantics-resilience-tested-after-letrain-denfer

Dare, E. A., Ellis, J. A., & Roehrig, G. H. (2014). Driven by beliefs: Understanding challenges physical science teachers face when integrating engineering and physics. Journal of Pre-College Engineering Education Research, 4(2), 5.

Doorman, L. M., & Gravemeijer, K. P. E. (2009). Emergent modeling: Discrete graphs to support the understanding of change and velocity. ZDM Mathematics Education, 41(1-2), 199-211.

Hirsch, L. S., Carpinelli, J. D., Kimmel, H., Rockland, R., & Bloom, J. (2007). The differential effects of pre-engineering curricula on middle school students’ attitudes to and knowledge of engineering careers. Published in the proceedings of the 2007 Frontiers in Education Conference, Milwaukee, WI.

Koszalka, T., Wu, Y., & Davidson, B. (2007). Instructional design issues in a cross-institutional collaboration within a distributed engineering educational environment. In T. Bastiaens & S.
Carliner (Eds.), Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2007 (pp. 1650–1657). Chesapeake, VA: AACE.

Moore, T. J., Stohlmann, M. S., Wang, H.-H., Tank, K. M., Glancy, A., & Roehrig, G. H. (2014). Implementation and integration of engineering in K-12 STEM education. In J. Strobel, S. Purzer, & M. Cardella (Eds.), Engineering in precollege settings: Research into practice. West Lafayette, IN: Purdue University Press.

National Research Council. (2013). Next generation science standards. Retrieved from http://www.nextgenscience.org/next-generation-science-standards

Transportation Safety Board of Canada. (2014). Lac-Mégantic runaway train and derailment investigation summary (Railway Investigation Report R13D0054). Gatineau, Quebec: Transportation Safety Board of Canada.

Twitter in the NGSS Classroom

Saundra Rathburn, Science Department Chair, Lake Shore High School

Science is different from any other discipline in its rate of change and advancement. This change is due to rapidly changing technology, the changing environment, and even changing societal norms and values. As science educators, it is crucial that we stay informed of current research and teach our students that what we teach them today may be different tomorrow. Along with these changes in scientific knowledge comes a change in how we teach students to think about science. We are moving away from teaching a list of ideas and toward teaching students how to figure out these ideas on their own, even challenging them to contribute to new scientific discoveries. To challenge and advance student thinking, why not use a communicative tool that many students are already using?

Twitter as a Tool for Teachers

Many of us face a dilemma with how much time we have in a day. We try to figure out how we can get the most “bang for our buck,” if you will.
As an educator who likes to “unplug” at times, I do believe that using Twitter has saved me hours of classroom instruction and time spent communicating with parents. Twitter is a micro-blogging site that limits each post to only 140 characters. You simply can’t spend too much time! Ideas on how a science educator could use Twitter can be found in Figure 1. As you read these over, remember that the possibilities are endless!

Figure 1: Ideas on using Twitter
• Manage class discussions
• Classroom bulletin board
• Teaching small pieces of information
• Real-life photos and examples of science
• Foreign correspondents
• Challenge problems
• Current science information
• Networking
• Collaboration with other educators

In a virtual class discussion, students may feel more comfortable speaking out, more personally attached to their ideas, and more likely to take risks than they would in class. As a teacher, you can send out ideas for students to discuss and facilitate their discussion, even after classroom hours. This keeps students actively thinking about your subject after they have left the brick and mortar of the classroom itself. Twitter can become a classroom bulletin board. You can send reminders to wear appropriate lab attire or to do homework, or spread school announcements. Are your students struggling with homework? You can send out small, bite-size pieces of information such as equations, food for thought, and/or inspirational quotes. One of my goals as a science teacher is for students to see the world around them through a scientific “lens.” I can model this “lens” by sending out pictures of phenomena and having them do the same. For example, Twitter can be used to explain a rainbow, identify household chemical uses, or identify a transfer of energy. Scientists all over the world are researching and discovering. Students can personally reach out globally to other students, teachers, and scientists.
It’s up to you whether students communicate with others locally and globally. Challenge students outside of the classroom by sending out posts on Twitter. They can be thinking about and solving these problems before class the next day. Many people use Twitter as their primary source of news. Information travels fast, and it’s current. Students can obtain current events, publications, charts, and data to analyze.

What do students get out of using Twitter?

Many of our students are already familiar with forms of social networking such as Facebook and Twitter. Teaching with one of these technologies not only impacts the content that students learn, but also allows them to learn how to connect with professionals, how to express themselves, and how their knowledge impacts others in a far-reaching way. Ideas on how students can use Twitter are found in Figure 2.

Figure 2: Student Uses of Twitter
• Sharing knowledge
• Learning
• Helping others
• Stopping student isolation
• Expressing independent thinking
• Networking
• Community involvement
• Technology skills

How to Set up a Twitter Feed

Once you have created an account, you will need to complete a couple of simple steps to get the most out of this micro-blog. You will need to find a list of educators and scientific professionals to follow; such lists can easily be found online. You will also decide on a hashtag (#). I find it helpful to have one hashtag per class. When you upload anything for students or parents of a particular class, include the hashtag that you have created. Students will use the same hashtag, and it then becomes a tool to gather information in one place. By searching the hashtag, you can gather all of the “tweets” in the same place.

Conclusion

We have to use every tool in our toolbox in order to teach this next generation of science students. Not only can we teach technological advances through the use of technology itself, but we can also engage students when they are out of our classrooms and out in the world.
What could be a better way for them to see the world wearing science “lenses”?

References

“Can Tweeting Help Your Teaching?” NEA. Web. 9 Sept. 2015. <http://www.nea.org/home/32641.htm>.

Lasic, Tomaz. “Twitter Handbook for Teachers.” 8 Apr. 2009. Web. 9 Sept. 2015. <http://www.scribd.com/doc/14062777/twitter-handbook-for-teachers#scribd>.

Seasonal analysis: A 5E lesson on Michigan weather and the “reason for the seasons”

Julie Henderleiter, Chemistry Department, Grand Valley State University

Overview

Many upper elementary and middle school students, as well as many adults, have a hard time understanding the “reason for the seasons” (Schnepps and Sadler, 1989). Providing students with a concrete connection between the concept and real-life experiences involving weather patterns, with the support of models, helps solidify understanding of the reason for the seasons. Understanding the motion of the Earth around the sun, and how this phenomenon gives rise to the seasons, addresses Michigan content expectations as well as NGSS standards. This lesson provides 4th–6th grade students with weather data from the first full week of each month. Data are from the Kalamazoo/Battle Creek International Airport weather station, as recorded on the Weather Underground website from November 2013 through October 2014. Students work in groups of 2-3 to analyze the data and look for seasonal variations in a weather event. Based on the data, students determine which weather events vary with the seasons, then explain why. Students finish the lesson by critiquing the way they looked at weather patterns: was sampling the first full week of each month a good approach? This lesson should be sequenced after a lesson or lessons modeling the motion of the Earth around the sun. Students should have experience with models or animations showing how the angle of sunlight changes with the seasons.
The Seasons and Ecliptic simulator from the Astronomy Education Group at the University of Nebraska-Lincoln (2009) is a good simulation.

Lesson Objectives
1. Find patterns in weather data.
2. Evaluate weather patterns to determine which can be explained by seasonal variations in light levels on Earth.
3. Comment on the appropriateness of the data collection method.

Benchmarks

Michigan Grade Level Content Expectations addressed (4th and 5th grade):
• S.IA.04.11 Summarize information from charts and graphs to answer scientific questions.
• S.RS.04.15 Use evidence when communicating scientific ideas.
• E.ST.04.25 Describe the apparent movement of the sun and moon across the sky through day/night and the seasons.
• S.IA.05.11 Analyze information from data tables and graphs to answer scientific questions.
• S.IA.05.13 Communicate and defend findings of observations and investigations using evidence.
• E.ES.05.61 Demonstrate and explain seasons using a model.

NGSS/Proposed Michigan Standards addressed (middle school):
• MS-ESS1-1 Develop and use a model of the Earth-sun-moon system to describe the cyclic patterns of lunar phases, eclipses of the sun and moon, and seasons.

Advanced Preparation
• Copy one of the seven weather patterns (Appendix 1) for each group, along with the first and last sheet of the activity (Figures 1 and 2).
• Students work in groups of 2-3. Two or more groups can analyze the same weather pattern.

Engage
• Ask students about the weather over the past week or month. How has it changed? What types of weather patterns have they noticed? What weather variations typically occur over the current month? Over the season? Over a year? Ask students why the weather is changing. Record student ideas for future reference.
• Ask students to assign each month of the year to a season. To simplify analysis, guide students to assign three months to each season.
Students may disagree on how best to group the months into seasons—winter may be November, December, and January or December, January, and February. Ask students to justify their groupings. Depending on prior experiences or lessons, students may refer to equinoxes or solstices as reference points for assigning months to seasons. Accept groupings that are supported by reasonable explanations and evidence.

Explore
• Introduce the activity to students with the first page of the student materials. The information about median values may be new to your students. Guide them through the first page of the lesson (Figure 1), where students learn about selecting the median value. We can all think back to an 80-degree day in October or a 50-degree day in July. Selecting the median, or middle, values for high temperature, low temperature, humidity, wind speed, and wind gust speed helps compensate for extremes in the data. If your students have learned about averaging, use average values instead.
• Divide students into groups and allow them to work with their weather data. Circulate while students work; guide and prompt students to sort through the data and explain their results. Students working with temperature data may need help deciphering what is meant by the hottest high temperature, coolest high temperature, hottest low temperature, and coolest low temperature. Consider explaining these as the “hottest hot day” or “coolest hot day.”
• As groups finish their analysis, have a student record their group’s results in a central location such as the board or a copy of page 2 of the student handout (Figure 2). Each group should record results from peers.

Figure 1. Student handout page 1.

Weather and seasons in Michigan: Is there a pattern?

We are going to look at weather patterns and try to connect them to the seasons. The patterns are from the first full week of each month from November 2013 – October 2014.
To help find patterns, we will use the median value for some information. The median value is just the middle value. Median values can be used instead of high or low values. Using median values helps make sure very high or very low values don’t affect results. For example, May is often a warm month in Michigan, not a hot month. Sometimes we have a few very hot days in May. Picking the middle (median) temperature instead of the high temperature keeps May as a warm month, not a hot month.

Write down a list of 5 numbers in the space below. Then rewrite the numbers in order from smallest to largest. Circle the median (middle) value. After a short class discussion, explain median in your own words.

Put your weather information on the board. As you wait for all groups to respond, look for patterns in the results. In your group, discuss what might cause these patterns.

Figure 2. Student handout page 2.

Weather event | Season with the most or highest result | Season with the least or lowest result
Precipitation (inches) | |
Daytime high temperature (°F) | |
Daytime low temperature (°F) | |
Humidity (percent) | |
Wind speed (miles per hour) | |
Highest wind gust (miles per hour) | |
Cloud cover (sunny or partly sunny) | |

Answer the questions below in writing. Work individually on your answers.
1. Choose two weather patterns and tell how the position of the Earth and sun helps explain each pattern. Look carefully at your data. Think about what causes the seasons. Not all of the weather patterns we looked at can be explained by the position of the Earth and sun.
2. We looked at one week’s weather for each month over the year to find weather patterns. Was this a good way to look at weather patterns? Defend your answer.

Explain
• Once groups have recorded their results, discuss patterns that emerge. Ask students which results seem to make sense—should it be warmer in summer? Windier? More humid? Why or why not?
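For teachers who want to check medians quickly, or to extend the lesson with a computing connection, the median selection taught in the Figure 1 handout can be sketched in a few lines of Python. This sketch is not part of the lesson materials; the sample week is the April column of daily high temperatures from Weather Pattern #2.

```python
# Sketch of the handout's median procedure: sort a week of values and
# take the middle one (works directly for an odd-length list like a 7-day week).

def median(values):
    """Middle value of an odd-length list of numbers."""
    ordered = sorted(values)
    return ordered[len(ordered) // 2]

april_highs = [55, 53, 58, 59, 68, 68, 74]  # deg F, April week, Weather Pattern #2
print(median(april_highs))                  # prints 59
```

Python's standard library also provides `statistics.median`, which additionally handles even-length lists by averaging the two middle values.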
Ask students if any results seem strange or if any results do not seem to fall into a pattern. Challenge students to explain their pattern, or explain why there is no pattern. Ask students what varies across the seasons. What causes the variation?
• Students may notice that there is very little variation in median wind speed, wind gusts, and humidity. Ask them to think about what this might mean in terms of seasonal variation. Can there be windy or humid days in any season? Students will likely notice that the number of sunny/partly sunny days seems to vary by the seasons—there are zero in winter and ten in the summer in this data set. Ask students if they can think of a reason related to the seasonal motion of the Earth around the sun that could explain the data. It is OK for students not to have an explanation of the number of sunny and cloudy days. This lesson is not intended to explore climate or weather concepts in detail, though the information could be used in other lessons if desired.
• Have students explain, and model if needed, the motion of the Earth around the sun. Focus student attention on the angle of the light and the relative length of the day. The Seasons and Ecliptic simulator from the Astronomy Education Group at the University of Nebraska-Lincoln (2009) shows the motion of the Earth around the sun and the angle of sunlight falling on the Earth, and provides a moveable reference point on the Earth that can be set to your location. Use the orbit view, drag the stick figure to an appropriate latitude, select view from the side, then select sunbeam angle to see these features.

Elaborate
• Ask students what weather patterns would be different if the Earth’s axis did not tilt. Have students design a model to show how the angle of the sunlight would be different if the Earth’s axis had no tilt.

Evaluate
• Students complete the questions on the second page of the student handout.
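For the teacher's own reference, the "angle of the light" can be made concrete with a rough noon-elevation estimate. This calculation is not part of the lesson materials; the latitude below (roughly Kalamazoo's) and the simplified formula are assumptions for illustration only.

```python
# Rough sketch of why summer sunlight is "more direct": at local noon the
# sun's elevation is approximately 90 - latitude + declination (for northern
# mid-latitudes), and the solar declination swings between about +23.5 deg
# (June solstice) and -23.5 deg (December solstice).

LATITUDE = 42.2  # deg N, approximately Kalamazoo (assumed for illustration)

def noon_elevation(declination_deg, latitude_deg=LATITUDE):
    """Approximate solar elevation angle at local noon, in degrees."""
    return 90.0 - latitude_deg + declination_deg

summer = noon_elevation(+23.5)   # about 71 degrees: high, direct sunlight
winter = noon_elevation(-23.5)   # about 24 degrees: low, slanting sunlight
```

With no axial tilt, the declination would stay near zero all year, so the noon sun angle (and with it the strongest seasonal weather patterns) would barely change.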
They should explain the seasonal variation in high and low daily temperatures by the amount and directness of sunlight during the seasons, mainly winter and summer. There are more hours of sunlight, and the angle at which the sun strikes Michigan is larger (more direct, or more perpendicular to Earth’s surface), during the summer. The increased duration and angle of sunlight lead to warmer high and low temperatures. The shorter hours of sunlight and smaller angle of sunlight during the winter lead to less warming, so lower high and low temperatures.
• Student critiques about using the first week of each month to collect weather patterns should focus on the reasonableness of sampling the weather by this method. Students might suggest looking at averages for a month (or week) instead of the actual measures for individual days to account for outliers. Students may question why the first full week of each month was selected. They may wonder if the results would be different if the last week of each month were selected instead. Students may suggest that the middle month of each season would be most representative of the weather during the season. Accept student suggestions accompanied by a solid reason.

My experience teaching this lesson is positive. My fourth grade students were interested in finding weather patterns. Prior to this lesson, we had modeled the motion of the Earth around the sun using balls and lamps. We also looked at the Seasons and Ecliptic Simulator and drew models of the Earth’s motion. We had good discussion about which weather patterns really related to the seasons. Dealing with the “red herring” data was a struggle for some students, though it brought up some great questions for future learning about weather patterns.

References

Michigan Department of Education (2007). Grade Level Content Expectations: Science. East Lansing, MI.

Nebraska Astronomy Applet Project (2009, 19 March).
Seasons and Ecliptic Simulator. Available from http://astro.unl.edu/classaction/animations/coordsmotion/eclipticsimulator.html

NGSS Lead States. (2013). Next Generation Science Standards: For States, By States (Disciplinary Core Ideas). Retrieved from http://www.nextgenscience.org/

Schnepps, M. H., and Sadler, P. M. (1989). A Private Universe—Preconceptions that Block Learning [Video]. Cambridge, MA: Harvard-Smithsonian Center for Astrophysics. Retrieved from http://www.learner.org/resources/series28.html

Weather Underground. (2015). Historical Weather Data, Kalamazoo/Battle Creek International Airport [Data file]. Retrieved from http://www.wunderground.com/history/

Appendix I. Weather pattern data sheets

Weather Pattern #1: Precipitation in Michigan

This is a chart with the amount of precipitation that fell for one week each month in Michigan. Snow and rain are combined and measured as liquids. Add up the total precipitation for each week. “T” means there was a trace of precipitation. “Trace” means the street and sidewalk looked wet, but there was not enough water to measure.

Precipitation in Michigan (to nearest tenth inch)
[Table: daily precipitation, in inches with “T” for trace, for Days 1–7 of the sampled week in each month, January through December, with a Total row for each week.]

Which was the wettest season? Which was the driest? Tell how you broke ties if you had one.

Wettest season: ______  Driest season: ______

Weather Pattern #2: High Temperature in Michigan

This is a chart with the high daytime temperatures in Michigan during a year. Circle the median high temperature for each week.
High temperature each day in Michigan (in °F)
Month | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7
January | 30 | 17 | 0 | 18 | 22 | 41 | 41
February | 15 | 23 | 24 | 23 | 16 | 12 | 16
March | 20 | 14 | 21 | 25 | 31 | 42 | 39
April | 55 | 53 | 58 | 59 | 68 | 68 | 74
May | 55 | 58 | 67 | 70 | 87 | 76 | 71
June | 85 | 83 | 73 | 66 | 76 | 81 | 83
July | 79 | 84 | 78 | 76 | 77 | 81 | 83
August | 85 | 85 | 79 | 81 | 83 | 84 | 82
September | 77 | 78 | 78 | 79 | 61 | 57 | 61
October | 57 | 61 | 65 | 61 | 58 | 57 | 59
November | 51 | 51 | 54 | 56 | 47 | 46 | 55
December | 44 | 41 | 45 | 58 | 58 | 29 | 23

Which season had the hottest high daytime temperature? Which had the coolest high daytime temperature? Tell how you broke ties if you had one.

Season with hottest high daytime temperatures: ______  Season with coolest high daytime temperatures: ______

Weather Pattern #3: Low Temperature in Michigan

This is a chart with the low daytime temperature in Michigan during a year. Circle the median low temperature for each week.

Low temperature each day in Michigan (in °F)
Month | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7
January | 17 | -12 | -11 | -13 | 2 | 22 | 33
February | -5 | -7 | 0 | 11 | 2 | -1 | 3
March | 3 | -6 | 1 | 14 | 8 | 10 | 21
April | 24 | 30 | 36 | 32 | 40 | 30 | 38
May | 43 | 42 | 36 | 47 | 51 | 54 | 49
June | 57 | 67 | 57 | 55 | 49 | 50 | 52
July | 62 | 66 | 65 | 60 | 52 | 53 | 62
August | 53 | 61 | 65 | 59 | 54 | 55 | 57
September | 50 | 51 | 60 | 57 | 51 | 49 | 43
October | 40 | 45 | 42 | 43 | 38 | 32 | 34
November | 33 | 37 | 50 | 42 | 33 | 29 | 41
December | 28 | 25 | 31 | 40 | 29 | 19 | 12

Which season had the hottest low daytime temperature? Which had the coolest low daytime temperature? Tell how you broke ties if you had one.

Season with hottest low daytime temperatures: ______  Season with coolest low daytime temperatures: ______

Weather Pattern #4: Humidity in Michigan

This is a chart with the average humidity in Michigan during a year. Humidity tells about water vapor in the air. Humid days can make us feel sticky if the temperature is high. When it’s cold and humid, the day feels even colder. Circle the median humidity for each week.
Humidity each day in Michigan (in percent)
Month | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7
January | 85 | 79 | 69 | 78 | 79 | 93 | 84
February | 74 | 75 | 74 | 73 | 78 | 73 | 74
March | 70 | 62 | 58 | 65 | 63 | 69 | 74
April | 54 | 59 | 61 | 55 | 57 | 54 | 51
May | 52 | 55 | 63 | 65 | 59 | 68 | 56
June | 47 | 78 | 66 | 80 | 63 | 56 | 60
July | 74 | 69 | 72 | 67 | 0 | 69 | 74
August | 65 | 69 | 78 | 65 | 64 | 63 | 66
September | 68 | 66 | 69 | 83 | 78 | 76 | 72
October | 62 | 80 | 75 | 61 | 65 | 74 | 64
November | 70 | 64 | 75 | 77 | 64 | 71 | 59
December | 67 | 76 | 81 | 88 | 74 | 73 | 69

Which season had the highest humidity? Which had the lowest humidity? Tell how you broke ties if you had one.

Most humid season: ______  Least humid season: ______

Weather Pattern #5: Average wind speed in Michigan

This is a chart with the average wind speed in Michigan during a year. Circle the median average wind speed for each week.

Average wind speed each day in Michigan (in miles per hour)
Month | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7
January | 9 | 18 | 15 | 5 | 5 | 7 | 12
February | 4 | 2 | 3 | 12 | 10 | 11 | 6
March | 8 | 3 | 5 | 7 | 6 | 7 | 9
April | 5 | 4 | 9 | 7 | 13 | 4 | 10
May | 11 | 5 | 7 | 11 | 10 | 14 | 8
June | 8 | 13 | 11 | 3 | 5 | 2 | 4
July | 10 | 9 | 10 | 8 | 3 | 3 | 8
August | 4 | 5 | 5 | 4 | 2 | 3 | 4
September | 3 | 4 | 5 | 11 | 11 | 5 | 7
October | 13 | 8 | 9 | 11 | 5 | 3 | 1
November | 5 | 9 | 8 | 14 | 9 | 5 | 16
December | 4 | 3 | 6 | 8 | 14 | 9 | 7

Which season had the highest average wind speed? Which had the lowest average wind speed? Tell how you broke ties if you had one.

Windiest season: ______  Calmest season: ______

Weather Pattern #6: Wind gusts in Michigan

This is a chart with the highest wind gust speeds in Michigan during a year. Circle the median wind gust speed for each week.

Highest wind gust each day in Michigan (in miles per hour)
Month | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7
January | 30 | 38 | 28 | 13 | 14 | 25 | 28
February | 12 | 13 | 17 | 27 | 25 | 26 | 13
March | 19 | 15 | 14 | 17 | 14 | 16 | 18
April | 17 | 19 | 26 | 23 | 34 | 22 | 29
May | 35 | 19 | 22 | 26 | 28 | 42 | 27
June | 34 | 36 | 31 | 16 | 16 | 16 | 15
July | 29 | 30 | 31 | 21 | 6 | 16 | 26
August | 18 | 19 | 19 | 25 | 19 | 16 | 18
September | 17 | 15 | 21 | 33 | 27 | 14 | 24
October | 28 | 18 | 24 | 38 | 22 | 15 | 14
November | 17 | 26 | 17 | 31 | 28 | 16 | 36
December | 14 | 16 | 17 | 25 | 33 | 18 | 20

Which season had the highest median wind gust?
Which had the lowest median wind gust? Tell how you broke ties if you had one.

Highest wind gust season: ______  Lowest wind gust season: ______

Weather Pattern #7: Cloud cover in Michigan

This is a chart with the cloud cover in Michigan during a year. Count the number of sunny (s) and partly sunny (ps) days each week. Count the cloudy (c) and partly cloudy (pc) days.

Cloud cover in Michigan
Month | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7
January | c | c | c | c | c | c | c
February | pc | pc | c | c | c | c | c
March | c | ps | pc | c | s | c | c
April | s | c | c | ps | c | s | c
May | ps | pc | ps | c | ps | c | s
June | c | c | ps | c | s | s | s
July | pc | c | c | pc | ps | c | c
August | ps | ps | c | pc | s | ps | s
September | ps | c | ps | c | c | c | s
October | pc | c | c | ps | s | s | c
November | pc | ps | pc | c | c | c | c
December | pc | pc | c | c | pc | c | c

Total (s) and (ps): ______  Total (c) and (pc): ______

Which was the sunniest season? Which was the cloudiest? Tell how you broke ties if you had one.

Sunniest season: ______  Cloudiest season: ______
Michigan Science Teachers Association
1390 Eisenhower Place
Ann Arbor, Michigan 48108

MSTA members! Access all MSTA publications by visiting www.msta-mich.org! Click on “publications” to get the information that interests you.