October 31, 2012
Grant # 12-4057, Carver Charitable Trust, awarded January 20, 2012
Iowa Physics Modeling Professional Development Program
Grant amount: $87,225
FINAL REPORT to the Carver Charitable Trust Board of Trustees
Overview
This report covers two years of Iowa Physics Modeling Workshops funded through the Carver Charitable Trust: a January 2011 award supporting a workshop at Iowa State University and a January 2012 award supporting a workshop at the University of Iowa. It communicates the final evaluation of the Modeling workshop held at ISU in the summer of 2011. It also includes a partial report of participant data and reflections from the summer 2012 workshop at the University of Iowa; student impact information takes the school year to gather, so a final report on that component will be sent next fall. Text repeated from last year's report has been lightened to reduce redundancy and clarify additions to previously submitted reports.
2011 Profile of ISU Participants
The 2011 Modeling Workshop at ISU hosted 20 funded educators and two self-funded educators. A month prior to the workshop there were three participants on a waiting list, and inquiries continued well after the participant list had been finalized. Participants covered a wide range of experience and physics content knowledge. They included one practicing educator with research-level experience in physics, physics majors, out-of-field science majors pursuing or holding physics endorsements, several math educators who were their schools' only staff members qualified to instruct physics and were pursuing endorsement, and two recent undergraduates seeking employment in education. Teaching experience was just as varied, averaging roughly 10 years. Content knowledge was similarly diverse: a number of participants achieved maximum pre-test scores, several scored well below the identified threshold for conceptual understanding, and the majority fell in between.
2012 Profile of U of I Participants
The 2012 Modeling Workshop at the University of Iowa hosted 24 funded participants; one withdrew after the first week because of a conflict with training required for a position accepted after committing to the U of I workshop. The workshop also included two self-funded participants from a school district in Omaha, NE. The participant number was allowed to increase because the majority of participants chose to commute to the workshop, freeing housing funds for additional participants. As with the previous year, there were several candidates on a waiting list after the participant list was finalized. The make-up of participants was diverse, though similar in nature to the 2011 workshop. Participants were evenly distributed in experience: about a fourth were entering their first year of teaching or were within their first five years, half were between their fifth and 15th year, and the remaining fourth had 15 or more years of experience. Academic backgrounds were just as diverse as those of the ISU participants. Applicants ranged from those pursuing endorsement, to out-of-field (non-physics) majors with endorsements, to previously practicing engineers who had entered the teaching profession, to two middle school teachers wishing to deepen their content knowledge and prepare for future physics openings in their districts. As at the ISU workshop, participant FCI scores ranged from below the threshold for conceptual understanding to mastery of Newtonian physics.
Impact Assessment:
Participant FCI Data
The 2012 participants produced a data set similar to the 2011 group's. Demonstrating conceptual understanding on the FCI is a goal for participants, and there is a correlation between instructor performance and their students' performance on the FCI. The effectiveness of Modeling workshops is typically judged by normalized gain on the FCI, a measure of how much participants improve relative to how much they could possibly improve. The ASU Modeling workshops established a standard normalized gain of .5 over the duration of NSF funding and through ASU's MSN program. FCI scores are interpreted with 18 as the established threshold for conceptual understanding of Newtonian concepts and 24 or greater as mastery. Of the 25 participants completing the 2012 workshop, 11 entered the workshop already demonstrating mastery, with scores greater than 26 out of 30. As at the ISU workshop, this group consisted of participants with degrees in engineering, physics, or chemistry, or who had been teaching physics for a significant period of time. The remaining participants were distributed above and below the Newtonian threshold similarly to the ISU group. Both the 2011 ISU and 2012 U of I participant data are presented in appendix 1; reference to the table may be helpful for the evaluation commentary that follows.
Several methods were used to determine normalized gains for both groups. Initially, an average of participants' individual normalized gains was computed; it appears in the same row as the average pre- and post-FCI scores for the groups. This yielded an average normalized gain of .48 for the 2011 workshop and .40 for the 2012 workshop. This method proved problematic because it includes the roughly twenty percent of participants with perfect pre-test scores. Those participants could at best hold steady, registering a zero or negative normalized gain; none produced a negative gain, and averaging in their zeros was determined to be the primary reason the gains fell short of the ASU standards.
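The arithmetic above can be sketched as follows. The scores are hypothetical, not actual participant data, and the convention of recording a zero gain for a perfect pre-test mirrors the averaging described above.

```python
# Illustrative sketch of averaging individual normalized gains.
# Scores below are hypothetical, not actual participant data.
MAX_SCORE = 30  # the FCI has 30 items


def normalized_gain(pre, post, max_score=MAX_SCORE):
    """Hake's normalized gain: actual gain over maximum possible gain."""
    if pre == max_score:
        # A perfect pre-test leaves no room to improve; the gain is
        # undefined, so it is recorded as zero, as described above.
        return 0.0
    return (post - pre) / (max_score - pre)


# Hypothetical group: three improving participants plus one perfect pre-test.
scores = [(12, 24), (18, 27), (20, 28), (30, 30)]
gains = [normalized_gain(pre, post) for pre, post in scores]

avg_gain = sum(gains) / len(gains)
# The perfect pre-test contributes a zero, dragging the average below
# the gain of every participant who actually had room to improve.
```

Even though each of the three improving participants gained at least two thirds of what was possible, the averaged-in zero pulls the group figure well below any individual gain, which is the distortion discussed above.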
Normalized gains were also computed from the average pre-test and post-test scores of all participants. This method produces a normalized gain reflective of all participants while avoiding the problem of averaging in zeros for participants with perfect pre-test scores. With this method, the 2011 workshop produced a normalized gain of .51 and the 2012 participants a normalized gain of .58. Compared against the ASU standard for Modeling workshops, the 2011 result is in line with expectations and the 2012 result exceeds them.
Individual normalized gains were also averaged while excluding participants with perfect pre-test scores. Averaging only participants who could produce a net positive gain gave mixed results for the two groups. For the 2011 group, this method produced a gain of .58, compared to the normalized gain of .51 from the pre- and post-test averages. The 2012 group produced a gain of .50, compared to .58 from the pre- and post-test averages. However, the 2012 group did include one high-scoring individual with a 28 pre-test and a 27 post-test, a likely statistically insignificant slip that nonetheless produces a -.5 normalized gain. Factoring this individual out yielded a .55 gain. An additional participant in the 2012 group was a clear statistical anomaly, scoring 12 on the pre-test and 4 on the post-test for a -.44 normalized gain. Factoring out both individuals led to a .6 normalized gain, comparable to the .58 figured from the average pre- and post-test scores.
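The sensitivity described above can be illustrated with a minimal sketch contrasting the two computations: averaging individual gains versus computing one gain from the average pre- and post-test scores. The scores are hypothetical.

```python
# Contrast two ways of summarizing normalized gain (hypothetical scores).
MAX_SCORE = 30  # the FCI has 30 items


def normalized_gain(pre, post, max_score=MAX_SCORE):
    return (post - pre) / (max_score - pre)


pres = [12, 18, 20, 28]
posts = [24, 27, 28, 27]  # the last participant slips from 28 to 27

# Method 1: average of individual gains. A high pre-test scorer who slips
# by a single point contributes (27 - 28) / (30 - 28) = -0.5, dragging
# the average down despite a statistically insignificant change.
individual = [normalized_gain(p, q) for p, q in zip(pres, posts)]
avg_of_gains = sum(individual) / len(individual)

# Method 2: one gain computed from the mean pre- and post-test scores,
# which is far less sensitive to a single near-ceiling participant.
gain_of_avgs = normalized_gain(sum(pres) / len(pres), sum(posts) / len(posts))
```

With these hypothetical numbers the average of individual gains lands around .43 while the gain of averages is about .67, illustrating why the report examines both methods and the effect of excluding near-ceiling outliers.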
As expected, the workshops generated aggregate participant data consistent with published results from the ASU NSF and MSN Modeling workshops. To further evaluate participant data against the grant goal of producing well-qualified physics instructors, participants entering the workshop below mastery were evaluated separately. The 2011 group included 10 such participants, six of whom finished at or above mastery on the post-test. This 2011 subgroup produced normalized gains of .53 and .49 when figured with the methods described above. The 2012 group included 13 participants below mastery, twelve of whom finished at or above mastery on the post-test. This subgroup produced an average normalized gain of .62 when including the statistical anomaly described previously, and gains of .71 and .75 when excluding the anomaly and using the methods described previously.
Impact Assessment:
2011 ISU Participant workshop reflections
In addition to the preliminary evaluation of participants' conceptual understanding, this section presents testimonial evidence drawn from participant reflections. Selected segments from weekly reflections are provided below; the complete communications can be made available upon request. The selections aim to highlight the workshop's effectiveness, while weaknesses identified in reflections are discussed in relation to adjustments for future workshops. Three excerpts from weekly reflections as well as two participants' overall workshop reflections are provided.
“After five days at the workshop I have learned an uncountable number of
things. At least once per day we have covered a specific idea that I have
struggled with teaching every semester. Specific examples would include
the derivation of the v² equation, the weight-mass relationship for a static
body, impetus misconception. I cannot wait to get into the classroom next
school year.
I also love how the modeling curriculum tends to lend itself toward a more
meaningful system of assessment. This is important when you are
working towards a deep meaningful understanding of nature from your
students.”
–Eric Grebe, Newton
“When I stop and think about the past week it is difficult to not first talk
about where I was as a teacher before beginning this course. I knew
before coming to this course that I had serious deficiencies in teaching
physics. It was not until Tuesday of this week that I realize just how
serious those deficiencies are. The reasons are not important, but my
personal responsibility to address them is mandatory as an educator.
The first week of this course has been a week of exponential growth for
me in content and pedagogy alike. Knowing the math behind physics is
all I thought I needed to teach it. My approach to understanding physics
concepts was to go to the back of the chapter, learn how to do the math in
the problems, and teach that to my students. This week I had many aha
moments concerning the science behind the math. I had never taken
the time to do qualitative problems before and had many of the
misconceptions that others talked about “students” having.” – Jamey
Smith, Central Lee
“My predominant feeling from week 2 was, man, my head hurts. That
might have just been due to the humidity in the room, but I felt like my
brow was furrowed a good bit of the time on Monday and Tuesday.
That’s good though, in my book, frustration and confusion are very
important parts of the learning process. Ultimately, I came around
because we put a lot of effort in the workshop into exposing more of the
things that we probably worried about when we first start teaching, but
over time we let slide: Circular definition of work, obvious misuses of a
consistent energy concept by students, obvious failures on our part to
convey the utility of energy as an explanatory theme in physics. I feel like
I had good activities for kinematics that exposed the ideas organically, but
my energy stuff always came off feeling really contrived. The articles
relating to the issues in Energy were very helpful. I can tell that I really
need to put some thought into how I’m going to approach energy and
being sure I’m thinking a couple of moves ahead in the energy unit.”
– Matt Harding, Iowa City West
Modeling Workshop Reflection: Overall
I can say without reservation that this workshop was the greatest professional development experience that I
have participated in.
Among the things that made this workshop worthwhile:
-Exposure to and practice with advanced physics teaching techniques
-The ability to interact and collaborate with like-minded individuals
-A deeper appreciation for the methods of physics
-A deeper understanding of fundamental physics concepts
-Exposure to resources for further professional development (i.e. Physics Education Research)
-A broader appreciation for what physics can offer the general population of students
-Exposure to a more accurate and descriptive method of grading (i.e. Standards-Based Grading)
-Learning how to apply many of the ideas that are taught in teacher preparation courses
Ryan Johnson, Ottumwa
From the instructors' perspectives, these selections highlight some of the perceived successes of the ISU Modeling workshop. The reflections were purposefully selected from participants diverse in content knowledge and pedagogical knowledge. Despite that diversity of background and experience, the reflections indicate that Modeling workshops are conducted in a manner that allows all participants to engage in high-quality professional development promoting deep reflection and professional growth.
2012 Participant Reflection
To contrast the instructor selected reflections from a variety of 2011 participants, a single 2012 participant
reflection is being pasted below. The instructors felt this reflection articulated ideas many participants chose to
reflect on, yet the reflection provided was exceptionally clear and thorough as to how a modeling workshop
could resonate with a single individual. We thought it might be more effective than providing similar selections
from several individuals. With the exception of bolding the weekly headings the reflection is as the participant
sent it at the end of the workshop.
Participant Will Swain’s Reflections
Week one reflection:
One week ago (before the course began) I thought my physics teaching was OK with room for improvement.
After listening to the modelers I see my 1st year teaching physics was a year of wasted opportunities to teach a
group of kids what they really needed to know. My physics course will completely change next year.
The ground work for this transformation had been laid for years. I knew my students (in all subjects) were not
thinking as much as I wanted. I knew from test data (both standardized and my own) that my students were
weak in interpreting data of any type, including graphs. I had already incorporated a fragmented version of a
modeling approach for much of my astronomy course and I had already incorporated much discussion and
writing about the Nature of Science in 10th grade biology. I was ready to put it all together for physics… but I
wasn’t sure how…
What I needed from this workshop were specific ideas as to how to incorporate the modeling approach into my
class room and that is exactly what I am getting. The systems that Shannon, Brad, and many participants use
are simple and intuitive and I feel like an idiot for not thinking of at least some of this stuff on my own. White
boards, genius. What a simple perfect system.
Speaking of feeling like an idiot, in addition to methodology I also needed to enhance my own content
knowledge. I thought I had excellent graphing skills, I didn’t, I thought I had mastery of basic Newtonian
concepts; it became apparent that some of my knowledge was fragmented. I was teaching and emphasizing the
things I thought I understood well and glossing over things I felt less comfortable with. I rationalized that the
things I didn’t teach were not that important. This workshop helped me see that these lazy rationalizations were
negatively impacting my teaching and more importantly student learning. Whereas graphing seemed like silly
busy work that I never bothered with before, I now see it as a valuable tool in the construction of the models of
motion and as valuable problem solving tool. After seeing how the modelers use it, it is now difficult to
imagine not using it.
Week two Reflection
I wrote in my week one reflection about the many teaching epiphanies that were the logical fallout from a non-algebraic approach to teaching physics. Week two had fewer ah-ha moments as the approach (which was no
longer new to me) did not change from week one, we were now just applying this approach to other Physics
models.
The assessment talk was the highlight. Two years ago I heard Sean speak …one day after I went to a 21st
century PD course, (I needed the credit and it fit my schedule). In the 21st century skill course they talked about
“soft skills” and behaviors that students would need in the workforce; the importance of teaching those
behaviors, responsibility, getting assignments in on time, etc... Then I went to a formative assessment PD
where Sean was a guest speaker. The juxtaposition of these two PD courses with back to back with widely
differing views as to what to assess was quite interesting. I went back to school thinking about changing my
assessment practices (leaning towards SBG) but I had a few doubts and some specific concerns as to how to
pull this off. In biology (I wasn’t teaching physics at that time) it was difficult to even come up with the
Standards. In the end dogmatic Inertia won the day and I changed little. We know from learning theory the
importance of revisiting ideas… Well here we are again two years later. And this time there is no conflict in my
mind regarding SBG. I will do standards based assessment for my physics class next year.
Sean’s ideas about assessment will work in almost any course, but they really complement the modeling
approach to Physics perfectly. I believe the reason this course is such a logical fit is that in a modeling
approach it is clear to you and to your students what you are trying to accomplish. Because the goals/ standards
are so clear, standard based assessment (SBA) becomes a powerful approach to guide the learning of the
students.
If I do the thought experiment: “WRIAEB how well a course is taught and the appropriateness of SBA for that
course,” I would predicted positive relationship. If in a modeling approach to Physics the transition to SBA
seems simple and in my other courses it seems difficult, that makes me wonder about the quality of my other
courses. In the end, the modeling approach to physics will likely cause me to approach the teaching of my other
courses differently. My Biology, Human Biology, and Astronomy courses will likely improve as a result of me
attending this workshop. The Carver folks are getting more out of a modeling approach to Physics than they
realize.
Week 3 reflection
The workshop has value because we get an opportunity to not only learn, but to pay attention to how we feel as
we learn.
At the beginning of Tuesday morning’s challenge lab neither my partner nor I had a clear idea of how to
approach the problem. I felt frustrated. I didn’t want to be rude but I felt his idea was needlessly complex. It
took some convincing but we went my route. That was about the last good decision we made as a team. After
that he lost me with his approach I lost him with my approach. We worked in isolation. When we compared
numbers we were not even close. Turned out my number was close while his was way off. After the test my
partner, who is not one to let things drop, kept at his calculations. He used a graphic approach just for fun and
showed it to me. It was solid, there were no flaws that I could see and yet the answer was slightly different
from mine. I revisited my calculations and I found that I had two offsetting errors. After the activity was over,
we worked together much better than we did while we were under a time constraint, which brings me to my
take home points from analyzing how I felt and how I learned.
#1 When people are under a tight time constraint they behave/learn differently. Not everyone processes
information or solves problems at the same speed. Slow processing has little to do with intelligence but we
assume they are synonymous so we set up tasks to reward the fast thinkers. These hurry up activities, which are
common (and fun!), might do a disservice to the slow processing kids. Poor learning behaviors like shutting
down, low confidence, not asking questions, are potentially reinforced with these timed activities for a child (or
adult) that processes a concept slowly.
#2 My partner and I would have worked better together if we felt there was time to LISTEN. Usually
communication problems are a result of listening issues not speaking issues. We had difficulty articulating our
thoughts because we were in the process of forming them and trying to communicate them while someone else
was talking to us. This was a set up for communication failure. (Part of the fun as a teacher is to watch the
Mayhem, but it is likely not the best way to foster good listening skills) A better approach that is more
conducive to teaching communication skills might be to give everyone the problem to solve, have them work on
it silently to collect their thoughts, and then group them.
#3 Having students explain how they got their wrong answer and teasing out where in the process they went
wrong, is more illuminating than the correct solution. It is better to find your own error than to have others
point it out, so this should be the first approach. Conversations about; multiple approaches, standard error
ideas, worst possible angles to choose are all important things to discuss after the lab. Taking time to do this is
important.
#4 In my Physics course last year I taught (covered) more than just mechanics (optics, Electromagnetic energy,
nuclear energy, thermodynamics, etc…) In hindsight there is no way that I gave students enough time with the
activities and concepts to learn mechanics and all of these topics. Hammering out fragmented understanding
of a concept takes time and not giving enough time leads to frustration.
Impact Assessment:
2011 Participant mid-year updates
Of the 22 workshop participants, nine have indicated fully implementing Modeling curricular resources with fidelity. One other participant implemented the majority of materials, leaving room for open-ended explorations. Of the remaining 12: one completed Modeling activities but supplemented with other research-based activities rather than Modeling curricular materials; two implemented to some extent but had to compromise fidelity for curricular constraints; one abandoned implementation as a result of mandated curricular assessments; two implemented parts with populations with specific needs; one could not implement due to course schedule; three pursued educational advancement and are not currently teaching; and two did not respond with a mid-year update but indicated in previous communication that they are implementing.
In addition to addressing level of implementation, participants were also required to address several other questions. Most importantly, participants commented on how their participation has helped them make progress toward the goals they have for their students. In general, participants' perceptions indicated significant increases in understanding of physics concepts, improved collaboration and teamwork, a far greater degree of self-efficacy and self-regulation, improved student data analysis and interpretation skills, and improvements in effective communication and representation of concepts. Several respondents also noted gains in problem-solving ability, the ability to pose effective questions, a deeper understanding of the nature of science (NOS), and a greater understanding of how to learn. Regarding how attending the workshop has improved their practice, participants overwhelmingly indicated an increased ability to diagnose their students' understanding and appropriately address student misunderstandings. Other insights related to practice include a renewed enthusiasm for teaching, improved understanding of how students learn, greater insight into how to appropriately scaffold concepts, improved ability to guide students to greater understanding with effective questioning, improved ability to explicitly teach the nature of science, and improved ability to hold students accountable. These testimonials provide insight into the effectiveness of the ISU Modeling workshop and are to a great extent directly related to broader educational reform goals.
Impact Assessment:
2011 Participant TPI Results
The Teaching Perspectives Inventory (TPI) was administered prior to the workshop and again at the conclusion of the first year of implementation; the delay was intentional given the short duration of the workshop. All participants except one completed an initial TPI, but several individuals never returned a final TPI at the conclusion of the school year, and some participants forwarded only a partial TPI summary addressing the perspectives alone. Fortunately, the sample includes teachers distributed reasonably well across FCI performance. The TPI identifies five perspectives, each scored in three areas: beliefs, actions, and intentions. For any given perspective the highest possible score is 45 and the lowest is 9; averages for any category are generally in the mid-thirties, and dominant perspectives are those with higher scores. A general descriptor of each perspective follows:
Transmission- Effective teaching requires a substantial commitment to subject matter or content.
Apprenticeship- Effective teaching enculturates students into a set of social norms and habits of mind.
Developmental- Effective teaching is planned and conducted from a learner’s point of view.
Nurturing- Effective teaching assumes long term persistent effort to achieve comes from the heart as well as the
mind.
Social Reform- Effective teaching seeks to change society in substantial ways.
As we might expect, among participants providing both a pre and post TPI with beliefs, actions, and intentions, the intention category most frequently showed a significant positive change. We can speculate this results from participants reflecting on their first year of implementation and recognizing there is much work to be done to achieve competency with Modeling methods and curricular materials. There were also positive changes in participants' beliefs, of similar frequency but smaller magnitude than the changes in intentions. Perhaps the smaller change reflects participants holding similar beliefs prior to the workshop and becoming more convinced as a result of implementation. Finally, the action category showed the least change and the lowest frequency of positive change. This might be expected given that intentions changed the most and action would follow intentions for change. This leads us to believe that the frequency of belief change can be largely attributed to the workshop's effect on participants, since one would naturally expect belief changes to occur after action has taken place.
In relation to the teaching perspectives, most participants were fairly well balanced among the perspectives, with many having dominant developmental and/or apprenticeship perspectives entering the workshop and most having social reform as their lowest-scoring perspective. We would expect to see changes in the developmental and apprenticeship categories given the nature of Modeling instructional practices and their theoretical underpinnings. Most participants experienced a positive change in one of these categories, and many in both. Developmental gains can be largely attributed to an instructional design that provides concrete experiences first and relies heavily on representational strategies that make abstract concepts more concrete. Modeling instructional design can also help explain positive changes in apprenticeship perspectives: the majority of Modeling instruction emphasizes the collection and validation of empirical data, argumentative discourse, evaluating claims, evidence-based reasoning, and the testing and validation of basic scientific models. This activity closely models authentic scientific practice, making an apprenticeship perspective practical.
Negative changes were most frequent in the nurturing category. Decreases in nurturance are likely attributable to strategies that place heavy demands on the learner and to clear standards for proficiency, which help teachers separate personal and emotional relationships from the goals of the course. Interestingly, positive changes in transmission were not expected, and several individuals had significant positive changes in social reform. The positive changes in transmission seem to contradict the changes in the developmental perspective, yet Modeling instructional strategies tend to make explicit the tacit decisions an expert might otherwise leave unstated; perhaps these gains reflect transmitting science as a way of knowing rather than as a delivery of content. The large positive changes in social reform are likely attributable to participants able to promote Modeling-based practices as broader, more generalizable habits of mind, viewed as transferable across many domains and lending themselves to informed social decision making. While no clear correlation between TPI scores and FCI scores can be identified, it is worth recognizing that participants with developmental scores closer to 40 and transmission scores in the lower thirties all had reasonably high FCI gains. The one participant with a transmission score of 40 and a developmental score in the mid-thirties had the lowest FCI gains.
Impact assessment:
Student FCI Data and VASS Profiling
To be consistent with first-year modelers completing a Modeling workshop at ASU, participants' students would be expected to produce an average post-test percentage of 52% or a normalized gain of 0.35. This corrects a previous report indicating that first-year modelers typically produced an average normalized gain of .42. The graphic below summarizes ASU's findings for over 7,500 students.
[Figure: FCI mean scores under different instruction types. Traditional instruction: 26% pre-test, 42% post-test; novice Modelers: 26% pre-test, 52% post-test; expert Modelers: 29% pre-test, 69% post-test.]
A table of ISU participant student data is provided in Appendix 2 to supplement the commentary that follows. Sixteen participants sent FCI results for their students, with only five returning VASS survey data. The low response for VASS data was likely due to the request occurring at the end of the school year and the labor-intensive nature of compiling the results. The 2011 participants' student data yielded the results expected from a three-week modeling workshop. Average student normalized gain, figured first as the average of teacher gains and then from the average pre- and post-test data, yielded normalized gains of .40 and .39, respectively. These gains are slightly better than what is expected from novice modelers after attending a three-week workshop. More importantly, two of the 16 participants reporting data could not fully implement, due either to curricular constraints or to district-mandated tests. Three other participants implemented six or seven of the nine units in the mechanics curriculum, leaving out an important unit addressed by at least five items on the FCI. Determining an average normalized gain for participants who indicated full implementation, or a majority implementation that still completed the mechanics units, yields normalized gains of .46 and .45, parallel to the .40 and .39 reported for all participants. For comparative purposes, these gains have been superimposed over a graphic from Hake's extensive study comparing interactive engagement methods (like modeling) to traditional instruction in high schools, colleges, and universities. The overall participant student average is plotted as an orange star, and the blue star represents only the students of participants who fully implemented.
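The two cohort summaries above can differ because the normalized gain, g = (post − pre) / (30 − pre), is nonlinear in the pre-test score: averaging per-class gains weights classes equally, while computing the gain from average scores does not. A minimal sketch of both calculations, using made-up class scores rather than report data:

```python
# Normalized gain (as used with the 30-item FCI): g = (post - pre) / (max - pre).
# Two cohort summaries appear in the report: the average of per-class gains,
# and the gain computed from average pre/post scores.

FCI_MAX = 30  # the FCI is scored out of 30 items

def normalized_gain(pre, post, max_score=FCI_MAX):
    """Fraction of the possible improvement that was actually achieved."""
    return (post - pre) / (max_score - pre)

# (pre, post) class-average scores for two hypothetical teachers
classes = [(5.0, 25.0), (25.0, 27.0)]

# Method 1: average the per-class normalized gains
avg_of_gains = sum(normalized_gain(pre, post) for pre, post in classes) / len(classes)

# Method 2: normalized gain of the cohort's average pre and post scores
mean_pre = sum(pre for pre, _ in classes) / len(classes)
mean_post = sum(post for _, post in classes) / len(classes)
gain_of_averages = normalized_gain(mean_pre, mean_post)

print(round(avg_of_gains, 2), round(gain_of_averages, 2))
```

With these illustrative scores the two methods give 0.60 and 0.73, showing why the report quotes both figures (e.g., .40 and .39 for the full cohort).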
Demographic information is provided with the FCI results, with only one clear pattern: student performance appears unrelated to rural, mid-sized, or urban population settings. Trends in percent free and reduced lunch and percent minority could not be clearly established. Had all participants been able to fully implement, it might have been possible to infer some relations for these demographics. ASU findings indicate Modeling's effectiveness regardless of SES or minority demographics. Perhaps a similar trend will surface after the 2012 U of I data are compiled with the 2011 ISU student FCI results.
Two other findings supplement the average student performance for the 2011 ISU participants. First, of the 487 students taking the FCI, 73 students (15%) scored at or above the mastery level. Second, one participant who was familiar with interactive engagement methods, including modeling, and who regularly administered the FCI provided longitudinal data for his AP and regular physics courses. The longitudinal data clearly indicate an effect on this teacher's student performance in both courses.
The VASS profiles obtained are consistent with what would be expected. The VASS survey probes students along scientific and cognitive dimensions, profiling them into the following views: folk, low transitional, high transitional, and expert. Folk views are associated with naïve realism and passive learning; expert views are associated with scientific realism and active, critical learning. No students were classified with an expert profile, and the individual VASS scores reported correlated well with performance on the FCI. Students performing well on the FCI post-test typically exhibited a high transitional profile. Average VASS scores support the same generalization: for the small sample available, the higher the students' average VASS profile, the greater the normalized gain.
Summary of Findings
The findings presented provide solid evidence that the 2011 ISU Modeling workshop was a success. Both participant and student data are consistent with published findings from the highly recognized ASU Modeling Instruction Program. Participant FCI data demonstrate that the majority of candidates who performed below mastery level on the pre-test can demonstrate mastery after attending a three-week workshop. TPI data indicate positive changes in the expected developmental and apprenticeship categories, with most participants establishing positive changes in their intentions after the first year of implementation. Finally, student data indicate performance on the FCI that is clearly better than traditional instruction and slightly above what is expected from interactive engagement methods. Second-year participant FCI data demonstrate improvement when compared to the 2011 ISU data, and we are hopeful this trend will continue through the second-year student data.
Regards,
Iowa Modeling Workshop Consortium
Craig Ogilvie, Ph.D., Professor of Physics, Iowa State University
Mary Hall Reno, Ph.D., Professor and Department Chair, Physics, University of Iowa
Shannon McLaughlin, Physics Instructor, Norwalk High School
Jeff Weld, Ph.D., Director, Iowa Math & Science Education Partnership
Appendix 1
2011 ISU Modeling Workshop Participant FCI Scores

Number   FCI Pre-Test   FCI Post-Test   Normalized Gain
1        30             30              0
2        30             30              0
3        30             30              0
4        30             30              0
5        29             29              0
6        28             28              0
7        18             18              0
8        11             14              0.16
9        15             19              0.27
10       20             23              0.3
11       26             28              0.5
12       20             26              0.6
13       14             24              0.63
14       27             29              0.67
15       23             28              0.71
16       23             28              0.71
17       18             29              0.92
18       19             30              1
19       27             30              1
20       28             30              1
21       29             30              1
22       29             30              1
Averages  22.44         26.28           0.48
Normalized gain with average pre and post: 0.51

2012 U of I Modeling Workshop Participant FCI Scores

Number   FCI Pre-Test   FCI Post-Test
1        22.00          NA
2        30.00          30.00
3        30.00          30.00
4        30.00          30.00
5        30.00          30.00
6        12.00          4.00
7        28.00          27.00
8        29.00          29.00
9        29.00          29.00
10       23.00          24.00
11       25.00          26.00
12       23.00          25.00
13       28.00          29.00
14       26.00          28.00
15       22.00          27.00
16       20.00          27.00
17       26.00          29.00
18       18.00          27.00
19       21.00          28.00
20       19.00          28.00
21       12.00          27.00
22       23.00          29.00
23       16.00          28.00
24       18.00          29.00
25       16.00          29.00
26       29.00          30.00
Averages  23.27         27.16

Average normalized gain, N=5 through N=22: 0.58
Normalized gain with average pre and post:
Average gain, N=6 through N=26:
Average gain, N=7 through N=26:
Average gain, N=8 through N=26:
ISU Participants Starting Below Mastery on FCI

FCI Pre-Test   FCI Post-Test   Normalized Gain
18             18              0
11             14              0.16
15             19              0.27
20             23              0.3
20             26              0.6
14             24              0.63
23             28              0.71
23             28              0.71
18             29              0.92
19             30              1
Averages: 18.1  23.9           0.53
Normalized gain from averages: 0.49
Appendix 2
U of I Participants Starting Below Mastery on FCI

FCI Pre-Test   FCI Post-Test   Normalized Gain
23.00          24.00           0.14
23.00          25.00           0.29
22.00          27.00           0.63
20.00          27.00           0.70
18.00          27.00           0.75
21.00          28.00           0.78
19.00          28.00           0.82
12.00          27.00           0.83
23.00          29.00           0.86
16.00          28.00           0.86
18.00          29.00           0.92
16.00          29.00           0.93
12.00          4.00            -0.44
Averages: 18.69  25.54         0.62
Without the highlighted anomaly (12.00 → 4.00): 19.25  27.33  0.71
Normalized gain from averages: 0.75
2011 ISU Student FCI, VASS, and Demographic Data

Teacher Number   N    FCI Pre   FCI Post   % Gain   N Gain   VASS   Implementation
14598 AP         60   11.93     24.47      41.80    0.70     150    Full
56742            41    8.00     20.00      40.00    0.55            80% w/open inquiry
78574             6   10.80     21.00      34.00    0.53            Full
84747            16    7.89     18.61      35.74    0.49     150    Full
14598 Regular    61    8.61     18.43      32.73    0.48            Full
14879             4    6.00     17.50      38.30    0.48            80%-competitions
45668            12    8.33     17.25      29.79    0.43            Full
57832             4    8.00     17.75      32.50    0.44     150    Full
32585            22   10.41     17.09      22.27    0.37     147    Full
42789            49    6.35     14.20      26.19    0.34     144    Full
87888            73    6.73     12.45      19.06    0.25            Full
85784             9   10.22     15.00      15.93    0.29            Full through Unit 6
97845            11   10.91     17.73      22.73    0.37            Full through Unit 7
98766            15    6.53     11.27      15.78    0.19            Full through Unit 6
56788            11    7.00     13.91      23.02    0.31            Curricular constraints
68522            93    7.92     12.10      14.56    0.20            Low time/curriculum

Cohort averages, all participants (N = 487): FCI Pre 8.48, FCI Post 16.80, % Gain 27.78, normalized gain from averages 0.39, average of N gains 0.40
Cohort averages, full implementers: FCI Pre 8.46, FCI Post 18.07, % Gain 32.04, normalized gain from averages 0.45, average of N gains 0.46
Percent of students achieving mastery on FCI = 15%
Additional demographic values whose column alignment was lost in extraction: 57, 14, 6, 0, 2, 9, 3, 3, 5
% minority (in reported order): 25, NA, 0, 44, 1, 25, NA, 7, 1, 0, 18, 5
Appendix 3

Iowa Regents Modeling Workshop Budget Proposal

Participant Independent Costs
Number   Item                                                            Unit Cost   Total
1        IMSEP Administration Costs                                      $4,125      $4,125
0        UoI admin support, registration, email, logistics, & benefits   $2,500      $0
         Other                                                                       $0
         Other                                                                       $0
1        Evaluation costs                                                $2,500      $2,500
         Subtotal                                                                    $6,625

Management Costs
4        Regents Professor Visits                                        $150        $600
0        University Faculty
2        Facilitator Stipend                                             $4,500      $9,000.00
2        Facilitator Expenses (Housing, Travel, Meals)                   $1,000      $2,000.00
         Facilitator Costs                                                           $11,000.00

Participant Dependent Costs (20 participants)
20       Stipend @ $150/day                                              $2,250.00   $45,000.00
20       Housing, $50/day for 20 days                                    $1,000.00   $20,000.00
20       Meal Stipend                                                    $150.00     $3,000.00
20       Teacher Supplies (Binder and Printed Materials)                 $50.00      $1,000
         Participant total cost                                          $3,450.00   $69,000.00

Total expenditure                                                                    $87,225.00