LIVE INTERACTIVE LEARNING @ YOUR DESKTOP
Preparing for NGSS: Planning and Carrying Out Investigations
Presented by: Rick Duschl
October 9, 2012, 6:30 p.m. – 8:00 p.m. Eastern time

NSTA Learning Center
• 9,500+ resources – 3,200+ free!
  – Add to "My Library" to access later
• Community forums
• Online advisors to assist you
• Tools to plan and document your learning
• http://learningcenter.nsta.org

Developing the Standards
A Framework for K-12 Science Education (July 2011) leads to the Next Generation Science Standards (2011–2013), which in turn will guide Assessments, Curricula, Instruction, and Teacher Development.

NGSS Development Process
In addition to a number of reviews by state teams and critical stakeholders, the process includes two public reviews.
• 1st Public Draft was released in May 2012
• 2nd Public Draft will take place in the Fall of 2012
• Final Release is expected in the Spring of 2013
IT'S NOT OUT YET!

A Framework for K-12 Science Education
• Released in July 2011
• Developed by the National Research Council of the National Academies
• Prepared by a committee of scientists (including Nobel laureates) and science educators
• Three dimensions: Scientific and Engineering Practices, Crosscutting Concepts, Disciplinary Core Ideas
• Free PDF available from The National Academies Press (www.nap.edu)
• Print copies available from NSTA Press (www.nsta.org/store)

Scientific and Engineering Practices
1. Asking questions (for science) and defining problems (for engineering)
2. Developing and using models
3. Planning and carrying out investigations
4. Analyzing and interpreting data
5. Using mathematics and computational thinking
6. Constructing explanations (for science) and designing solutions (for engineering)
7. Engaging in argument from evidence
8. Obtaining, evaluating, and communicating information

Crosscutting Concepts
1. Patterns
2. Cause and effect: Mechanism and explanation
3. Scale, proportion, and quantity
4. Systems and system models
5. Energy and matter: Flows, cycles, and conservation
6. Structure and function
7. Stability and change

Disciplinary Core Ideas
Life Science
• LS1: From Molecules to Organisms: Structures and Processes
• LS2: Ecosystems: Interactions, Energy, and Dynamics
• LS3: Heredity: Inheritance and Variation of Traits
• LS4: Biological Evolution: Unity and Diversity
Physical Science
• PS1: Matter and Its Interactions
• PS2: Motion and Stability: Forces and Interactions
• PS3: Energy
• PS4: Waves and Their Applications in Technologies for Information Transfer
Earth & Space Science
• ESS1: Earth's Place in the Universe
• ESS2: Earth's Systems
• ESS3: Earth and Human Activity
Engineering & Technology
• ETS1: Engineering Design
• ETS2: Links Among Engineering, Technology, Science, and Society

Closer Look at a Performance Expectation
Construct and use models to explain that atoms combine to form new substances of varying complexity in terms of the number of atoms and repeating subunits. [Clarification Statement: Examples of atoms combining can include hydrogen (H2) and oxygen (O2) combining to form hydrogen peroxide (H2O2) or water (H2O).] [Assessment Boundary: Restricted to macroscopic interactions.]
Performance expectations combine practices, core ideas, and crosscutting concepts into a single statement.
Taking Science to School

For States, By States
A Framework to guide changes in K-12 science: Assessments, Curricula, Instruction, Teacher Development

4 Strands of Science Proficiency
• Understanding Scientific Explanations – understand central concepts and use them to build and critique explanations.
• Generating Scientific Evidence – generate and evaluate evidence as part of building and refining models and explanations of the natural world.
• Reflecting on Scientific Knowledge – understand that doing science entails searching for core explanations and the connections between them.
• Participating Productively in Science – understand the norms for presenting scientific arguments and evidence, and practice productive social interactions with peers around classroom science investigations.
(NRC, 2008, Ready, Set, Science!)

Science & Engineering Practices
1. Asking questions (for science) and defining problems (for engineering)
2. Developing and using models
3. Planning and carrying out investigations
4. Analyzing and interpreting data
5. Using mathematics and computational thinking
6. Constructing explanations (for science) and designing solutions (for engineering)
7. Engaging in argument from evidence
8. Obtaining, evaluating, and communicating information

Planning and Carrying Out Investigations
• Scientists and engineers plan and carry out investigations in the field or laboratory, working collaboratively as well as individually. Their investigations are systematic and require clarifying what counts as data and identifying variables or parameters.
• Engineering investigations identify the effectiveness, efficiency, and durability of designs under different conditions.
• Planning and carrying out investigations may include elements of all of the other practices.

Webinar Outline
• Generating Evidence
• Designing Experiments
• Evaluating Evidence
• Two Broad Themes:
  – The role of prior knowledge in scientific thinking at all ages
  – The importance of experience and instruction

Generating and Evaluating Evidence and Explanations (TSTS Chapter 5)
Major Findings in the Chapter:
• Children are far more competent in their scientific reasoning than first suspected, and adults are less so. Furthermore, there is great variation in the sophistication of reasoning strategies across individuals of the same age.
• In general, children are less sophisticated than adults in their scientific reasoning. However, experience plays a critical role in facilitating the development of many aspects of reasoning, often trumping age.
• Scientific reasoning is intimately intertwined with conceptual knowledge of the natural phenomena under investigation. This conceptual knowledge sometimes acts as an obstacle to reasoning, but often facilitates it.
• Many aspects of scientific reasoning require experience and instruction to develop. For example, distinguishing between theory and evidence and many aspects of modeling do not emerge without explicit instruction and opportunities for practice.

Poll 1 – Familiarity with the NRC Reports Taking Science to School and Ready, Set, Science!
A. I have read both reports and understand the main messages and recommendations.
B. I have only read Ready, Set, Science! and understand the main messages and recommendations.
C. I have read Ready, Set, Science! and I am familiar with the main messages.
D. I have heard about Ready, Set, Science! but have not examined the report.
E. I have not heard about Ready, Set, Science!

Generating and Evaluating Evidence and Explanations (TSTS Chapter 5)
Two Major Shifts from Current Curriculum/Instruction:
– A shift from an image of science as the 'lone' scientist in an isolated laboratory to an image of science as both an individual and a deeply social enterprise. (Talk & Argument) (Critique & Communication) (Models and Representations)
– A shift from treating scientific reasoning as a highly developed form of logical thinking that cuts across scientific domains to studying scientific thinking as the interplay of general reasoning strategies, knowledge of the natural phenomena being studied, and a sense of how scientific evidence and explanations are generated. (Building & Refining Models, Mechanisms, and Theories) (Problematize the Evidence)

Poll 2 – Generating Evidence
The evidence-gathering phase of inquiry includes planning and designing the investigation as well as carrying out the steps required to collect the data. Which of the statements below do you think is NOT a part of Generating Evidence?
A. asking questions
B. deciding what to measure
C. developing measures
D. collecting data from the measures
E. structuring the data

Generating Evidence
Generating evidence entails all of the following:
– asking questions,
– deciding what to measure,
– developing measures,
– collecting data from the measures,
– structuring the data,
– systematically documenting outcomes of the investigations,
– interpreting and evaluating the data, and
– using the empirical results to develop and refine arguments, models, and theories.

Asking Questions and Formulating Hypotheses
• An iterative cycle – not a one-time event
• Begin with exploratory study of the natural world, with structured observations that lead to specific questions and hypotheses
• Collection of data could lead to new questions and revision of hypotheses, and perhaps another round of data collection
• Asking questions is also about formulating the goals of the activity and generating predictions

Rick Duschl and Ted Willard: Submit your questions via the chat.
REMINDERS
• To turn off notifications of other participants arriving, go to Edit -> Preferences -> General -> Visual notifications
• You can minimize OR detach and expand the chat panel
• Continue the discussion in the Community Forums: http://learningcenter.nsta.org/discuss
Brynn Slate

Collecting and Structuring Data: Exercise for a Healthy Heart
1. Intro Unit and Lab 1 (Day 1)
   – Conduct prelab, including a demonstration of the STEP test and taking a pulse. Students collect data for Lab 1 – Resting Heart Rate at 6, 10, 15, and 60 seconds.
2. Data Collection Labs 2 & 3 (Days 2 & 3)
   – Lab 2 – Activity Level (slow/fast stepping) and Heart Rate
   – Lab 3 – Weight (with/without hand weights) and Heart Rate
3. Data Analysis for Labs 2 & 3 (Days 4 & 5)
   – Knowledge Forum Activity "What Matters in Getting Good Data"
   – Determining Trends and Patterns of Data
   – Developing and Evaluating Explanations for the Patterns of Data

Poll 3 – Exercise for a Healthy Heart
Agree/Disagree with the following statements. ✔ = Agree, ✖ = Disagree
1. It matters where you take a pulse (wrist, neck, thigh)
2. It matters how long you take a resting pulse (6, 10, 15, or 60 seconds)
3. It matters how long you take an exercising pulse (6, 10, 15, or 60 seconds)
4. It matters who takes a pulse
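To make the counting-duration questions in Poll 3 concrete, here is a minimal sketch (not part of the webinar materials; the pulse counts are hypothetical) showing how counts taken over 6, 10, 15, and 60 seconds convert to beats per minute, and how much a single miscounted beat shifts each estimate.

```python
# Hypothetical pulse counts for one student; not data from the webinar.
counts = {6: 7, 10: 12, 15: 18, 60: 72}  # counting window (s) -> beats counted

for seconds, beats in counts.items():
    scale = 60 / seconds          # factor that converts the raw count to beats per minute
    bpm = beats * scale
    # A miscount of one beat changes the estimate by the full scale factor,
    # so shorter counting windows amplify counting error.
    print(f"{seconds:>2}-second count: {beats:>2} beats -> {bpm:.0f} bpm "
          f"(one miscounted beat shifts the estimate by {scale:.0f} bpm)")
```

Under these made-up counts, the 6-second estimate carries 10 bpm of uncertainty per miscounted beat versus 1 bpm for the full 60-second count, which is one reason the lab has students compare 6-, 10-, 15-, and 60-second measurements.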
[Chart: Resting Heart Rates (6, 10, 60 sec) – each student's resting heart rate per minute from the 60-second count; rates range from roughly the high 30s to the low 90s of beats per minute.]

Designing Experiments
Experimentation can be designed to:
– Generate observations/measurements that induce a hypothesis to account for a pattern (Discovery Context)
– Test an existing hypothesis under consideration (Confirmation/Verification Context)
– Isolate variables – control of variables is a basic strategy that allows for valid inferences and constrains the number of possible experiments to consider.

Prior Knowledge
• At all ages, prior knowledge of the domain under investigation plays an important role in the formulation of questions and hypotheses.
• Time engaging with the phenomena is very important; in some domains students have this experience, in others it must be built into the classroom events.

Prior Knowledge & Benchmark Activities
• Tasks that are given to students at the beginning of a unit, prior to any instruction
• Students can choose how to respond – drawing, labeled drawing, storyboard, symbols, writing
• Used by teachers to target instruction and identify learners'
  – Commonsense understandings (misconceptions)
  – Productive intuitions

What does the child seem to understand? What does the child appear to confuse? What is the student ready to learn?
[Drawing 1 and Drawing 2: student drawings about floating and sinking]

What differences did you see?
• Use of arrows – S1 as lines of force; S2 as pointers
• Force concept – S1 uses the word 'force'; S2 does not
• Confusions
  – S1 has 'weight of air' acting as a downward force, a frequent commonsense idea; gravity arrows point sideways
  – S2 has buoyancy > gravity to explain sinking
• Guiding conception
  – S1 uses density to explain floating/sinking
  – S2 uses gravity = buoyancy to explain floating/sinking
• Productive intuitions
  – S1 uses buoyancy arrows to show water pressure acting in all directions

Designing Experiments
Domain-general – minimizes the role of prior knowledge (knowledge lean)
– Example – Law of the Pendulum: isolate the 3 variables (length of string, size of weight, height from which the weight is released) to determine which variable influences the period/time of swing. One lesson.
Domain-specific – infuses the role of prior knowledge (knowledge rich)
– Example – Build a 1-second timer using the data set gathered from class investigations examining varying lengths of string; find out if the 1-second length works with wooden sticks and/or metal pipes, i.e., will it give the same results for a 1-second timer? A sequence of lessons. (See the worked sketch below.)
Sequence matters! Sustained engagement with the phenomena is essential! "Get a grip on nature!"

What's the range for a normal heart rate?
[Chart: the class resting heart rates (per minute, 60-second count) plotted by student, repeated from above.]

[Charts: Growth distributions for First Grade, Third Grade, and Fifth Grade.]
Shifts in distribution signal transitions in growth processes.
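As a reference point for the 1-second-timer task above: the webinar's approach is to derive the length empirically from class data, so the sketch below is an illustration only. It uses the standard small-angle formula T = 2π·sqrt(L/g), which is not from the slides, and estimates the string length for the two common readings of "1-second timer."

```python
import math

g = 9.81  # m/s^2, standard gravity

def pendulum_length(period_s):
    """Length (m) of an ideal simple pendulum with the given full period,
    using the small-angle approximation T = 2*pi*sqrt(L/g)."""
    return g * (period_s / (2 * math.pi)) ** 2

# Two readings of "1-second timer":
full_period = pendulum_length(1.0)   # one complete back-and-forth takes 1 s
single_swing = pendulum_length(2.0)  # each one-way swing (half period) takes 1 s

print(f"Length for a 1 s full period:  {full_period * 100:.1f} cm")   # ~24.8 cm
print(f"Length for a 1 s single swing: {single_swing * 100:.1f} cm")  # ~99.4 cm
```

A rigid wooden stick or metal pipe pivoted at one end has its mass distributed along its length rather than concentrated in a bob, so the "1-second length" found with a string and weight should not be expected to transfer directly, which is presumably part of what the sequence of lessons is meant to surface.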
Epistemic (What Counts?) Discourse & Data Texts
Data Texts
– Selecting/Obtaining Raw Data
– Selecting Data for Evidence
– Patterns & Models of Evidence
– Explanations of Patterns & Models

Data Transformations for Epistemic Dialog
– T1 – what data count, are worth using
– T2 – what patterns & models to use
– T3 – what explanations account for patterns & models

Evaluating Evidence that Contradicts Prior Beliefs
Chinn and Brewer propose that there are eight possible responses to anomalous data. Individuals can:
(1) ignore the data;
(2) reject the data (e.g., because of methodological error, measurement error, or bias);
(3) acknowledge uncertainty about the validity of the data;
(4) exclude the data as being irrelevant to the current theory;
(5) hold the data in abeyance (i.e., withhold a judgment about the relation of the data to the initial theory);
(6) reinterpret the data as consistent with the initial theory;
(7) accept the data and make a peripheral change or minor modification to the theory;
(8) accept the data and change the theory.

Rick Duschl and Ted Willard: Submit your questions via the chat.

PRACCIS – Promoting Reasoning and Conceptual Change in Science
Clark A. Chinn, Richard A. Duschl, Ravit G. Duncan – Principal Investigators

Learning Targets: The scientific strategies
1. Reasoning about methodological strengths and weaknesses of studies
   • E.g., sample size; reliability and accuracy of measures; alternative interpretations of data; the adequacy of controls.
2. Interpreting data
3. Constructing models or explanations that fit complex patterns of data from multiple studies
4. Resolving conflicts among studies with seemingly incompatible results
5. Deciding the extent to which one can generalize

Lesson 2: Modeling Cellular Transport
Overview: In this lesson students develop several models for how materials cross cellular membranes. Each of these models will be explored in more detail over the week. Students view the results of the iodine experiment, which demonstrates the viability of the 'Squeeze model' of cellular transport (i.e., simple diffusion into the cell). Students set up the egg experiment, which will test the squeeze model in more detail in Lesson 3 – it is essential that the egg experiment be set up on this day – students must at the very least complete Rows A and B (from which they can easily calculate C, time permitting). Finally, time permitting, students discuss criteria for evaluating models (this can be moved to the next day if necessary).
Driving Question: How could things get inside cells?
Learning Objectives: Students will learn that the very basic 'Squeeze model' (i.e., simple diffusion into the cell) is a viable model of cellular transport. Students will learn more about how models work and how to build and justify them.
Materials:
• Handouts: Egg experiment directions, egg experiment data sheet.
• Overheads: Student models (drawn by teacher from discussion); '3 Kinds of Models'
• Egg Experiment: Per group – 2 deshelled eggs, 4 cups, balance, 100 ml syrup, 100 ml water, plastic wrap, soap, paper towel, 2 plastic spoons.
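The lesson above requires students to complete at least Rows A and B of the egg-experiment data sheet, "from which they can easily calculate C." The data sheet itself is not reproduced here, so the sketch below is an illustration only: it assumes Rows A and B hold the starting and ending masses of each egg and that Row C is the resulting mass change, with made-up values.

```python
# Hypothetical egg-experiment records; the row meanings and masses are
# assumptions for illustration, not taken from the PRACCIS data sheet.
eggs = [
    {"condition": "egg in water", "row_a_start_g": 82.0, "row_b_end_g": 95.5},
    {"condition": "egg in syrup", "row_a_start_g": 81.0, "row_b_end_g": 62.0},
]

for record in eggs:
    start, end = record["row_a_start_g"], record["row_b_end_g"]
    change_g = end - start              # "Row C": mass change in grams
    percent = 100 * change_g / start    # percent change, comparable across egg sizes
    print(f"{record['condition']}: {start:.1f} g -> {end:.1f} g, "
          f"change {change_g:+.1f} g ({percent:+.1f}%)")
```

Reporting the change as a percent of starting mass is one way to make results from eggs of different sizes, and from different groups, comparable.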
Data Table
[data table shown on slide]
Data Table
[data table shown on slide]

High
[student response shown on slide]
Medium/High
[student response shown on slide]
Medium
"I think the less the density of the substance the easier for smaller things to get into / through something small like the cell membrane. Something else is water goes from when there is more molecules to where theres less molecules"
Low
"I know that lead can get into people's blood stream. I don't think it can do anything besides eat the cell so that is why I think that. Then I think it takes over the cell so that it is dead."

"Setting up a model of the world to study the world does not come easy to children" – Leona Schauble, Vanderbilt University
• Prolonged experience with phenomena
• Posing and revising questions – working over time to make explicit and refine criteria for good questions
• Parsing objects and events into attributes that bear on the question
• Considering/debating means of measuring attributes in ways that support an initial model of the phenomenon (considering the measure properties of those attributes)
• Generating/creating data (observing its measure qualities, reliability, etc.)

Continued . . .
• Structuring data (patterns are made, not found)
• Interpreting data as evidence – model construction
• Model testing against the original phenomenon & new cases
• Generation/entertainment of alternative models
• Evaluation of model fit
• Model selection/revision . . . which usually results in theoretically deeper questions
Lehrer, R., Schauble, L., & Lucas, D. (2008). Supporting development of the epistemology of inquiry. Cognitive Development, 23, 512-529.

Rick Duschl and Ted Willard: Submit your questions via the chat.

NSTA Website (nsta.org/ngss)

Upcoming Web Seminars on Practices
1. 9/11 – Asking Questions and Defining Problems – Brian Reiser
2. 9/25 – Developing and Using Models – Christina Schwarz and Cindy Passmore
3. 10/9 – Planning and Carrying Out Investigations – Rick Duschl
4. 10/23 – Analyzing and Interpreting Data – Ann Rivet
5. 11/6 – Using Mathematics and Computational Thinking – Robert Mayes and Bryan Shader
6. 11/20 – Constructing Explanations and Designing Solutions – Katherine McNeill and Leema Berland
7. 12/4 – Engaging in Argument from Evidence – Joe Krajcik
8. 12/18 – Obtaining, Evaluating, and Communicating Information – Philip Bell, Leah Bricker, and Katie Van Horne
All take place on Tuesdays from 6:30-8:00 pm ET.

Next Web Seminar: October 23 (two weeks from today) – Analyzing and Interpreting Data
Presenter: Ann Rivet
Teachers will learn more about:
• scientific investigations that produce data;
• the range of tools scientists use for scientific investigations – including tabulation, graphical interpretation, visualization, and statistical analysis – to identify the significant features and patterns in the data;
• how modern technology makes the collection of large data sets much easier, providing secondary sources for analysis;
• engineering investigations that include analysis of data collected in the tests of designs; and
• the range of tools engineers use to identify patterns within data and interpret the results.

Graduate Credit Available
Shippensburg University will offer one (1) graduate credit to individuals who attend or view all eight webinars.
Participants must either:
• Attend the live presentation, complete the survey at the end of the webinar, and obtain the certificate of participation from NSTA, or
• View the archived recording and complete the reflection question for that particular webinar.
In addition, all participants must complete a 500-word reflection essay. The total cost is $165. For information on the course requirements, as well as registration and payment information, visit www.ship.edu/extended/NSTA

Community Forums

NSTA Area Conferences
These three conferences will include a number of sessions about the K–12 Framework and the highly anticipated Next Generation Science Standards. Among the sessions will be an NSTA-sponsored session focusing on the Scientific and Engineering Practices.

NSTA Print Resources
• NSTA Reader's Guide to the Framework
• NSTA Journal Articles about the Framework and the Standards

Thank you to the sponsor of tonight's web seminar.
This web seminar contains information about programs, products, and services offered by third parties, as well as links to third-party websites. The presence of a listing or such information does not constitute an endorsement by NSTA of a particular company or organization, or its programs, products, or services.

National Science Teachers Association
Gerry Wheeler, Interim Executive Director
Zipporah Miller, Associate Executive Director, Conferences and Programs
Al Byers, Ph.D., Assistant Executive Director, e-Learning and Government Partnerships
Flavio Mendez, Senior Director, NSTA Learning Center

NSTA Web Seminars
Brynn Slate, Manager
Jeff Layman, Technical Coordinator