AN ABSTRACT OF THE THESIS OF
Ben U. Sherrett for the degree of Master of Science in Mechanical
Engineering presented on March 15, 2012.
Title: Characterization of Expert Solutions to Inform Instruction and
Assessment in an Industrially Situated Process Development Task
Abstract approved:
Milo D. Koretsky
Robert B. Stone
What constitutes a quality solution to an authentic task from industry? This
study seeks to address this question through the examination of two expert
solutions to an authentic engineering task used in the Chemical, Biological
and Environmental Engineering curriculum at Oregon State University. The
two solutions were generated by two teams of expert engineers with varying
backgrounds. The experts solved a process development problem situated in
the semiconductor manufacturing industry. Transcripts of audio recordings,
design notebooks, and other work products were analyzed to identify
common features in the two expert solutions. The study found that both
expert teams focused heavily on information gathering, modeling before
experimentation, and fine-tuning of the process. These solution features
define a core set of expert competencies and facilitate understanding of high
quality solution traits. An additional goal of the study was to identify
competencies unique to each expert solution. It was observed that the expert
teams used different proportions of first principles modeling and statistical
experimental design to solve the problem. This proportion was dependent on
the problem solver’s background and therefore should be expected to vary
among student solutions. Implications of the work regarding instruction and
assessment in engineering education are discussed.
©Copyright by Ben U. Sherrett
March 15, 2012
All Rights Reserved
Characterization of Expert Solutions to Inform Instruction and Assessment in
an Industrially Situated Process Development Task
by
Ben U. Sherrett
A THESIS
submitted to
Oregon State University
in partial fulfillment of
the requirements for the
degree of
Master of Science
Presented March 15, 2012
Commencement June 2012
Master of Science thesis of Ben U. Sherrett presented on March 15, 2012.
APPROVED:
Co-Major Professor, representing Mechanical Engineering
Co-Major Professor, representing Mechanical Engineering
Head of the School of Mechanical, Industrial and Manufacturing Engineering
Dean of the Graduate School
I understand that my thesis will become part of the permanent collection of
Oregon State University libraries. My signature below authorizes release of
my thesis to any reader upon request.
Ben U. Sherrett, Author
ACKNOWLEDGEMENTS
There are numerous people who have contributed greatly to my development
and ultimately to my ability to complete this work. First, my family on both
my mother’s and father’s side has always been supportive of my educational
and research endeavors. My father nurtured in me a curious and questioning
intellect and my mother dedicated herself to ensure that I had a solid
academic and moral foundation.
During my time in college I have had the great privilege of developing and
enjoying friendships with many amazing people. These friends have helped
me gain a greater understanding of the world and have given me reprieve
from the labor of graduate school when needed. These special people are
invaluable to me and have helped motivate me in countless ways. A subset
of these friends is composed of my colleagues within the Engineering
Education group at Oregon State University: Erick Nefcy, Bill Brooks, and
Debra Gilbuena. This group has served the role of both friends and research
collaborators. Without them, this research would have been much more
difficult and much less fun.
I have also had the pleasure of working with several past and present faculty
members who have offered guidance and provided me with opportunities, for
which I am extremely thankful. These faculty members include Dr. John
Parmigiani, Dr. Edith Gummer, Deanna Lyons, and my Co-Advisor, Dr.
Robert Stone.
This study would not have been possible without the participation of expert
engineers. I recognize that it is uncomfortable to be observed, recorded, and
analyzed, and I appreciate the willingness of the experts to expose their
thought processes as they worked to solve this complex and time-consuming
problem.
There are two people who have made especially large contributions to the
research and my personal development during my time in graduate school.
First, my co-advisor, Dr. Koretsky, has been integral to my graduate studies.
In addition to advising the research presented in this thesis, Dr. Koretsky has
provided guidance regarding writing, teaching, civil service, and life in
general. He is someone who works extremely hard for something that he
feels passionately about and I admire him for that. The lives of countless
students have been enriched because of his efforts.
Finally, I would like to acknowledge my wife, Jen. She has continuously
supported me during my time in graduate school, serving as both a thoughtful
listener and a spirited motivator. I greatly appreciate her.
Funding for the study was provided by the Intel Faculty Fellowship Program and
the National Science Foundation’s CCLI Program (DUE-0442832, DUE-0717905).
CONTRIBUTION OF AUTHORS
Dr. Milo Koretsky, Erick Nefcy and Dr. Edith Gummer assisted with analysis
and writing for both manuscripts. Debra Gilbuena assisted in writing the
second manuscript.
TABLE OF CONTENTS
1. Introduction
2. Manuscript 1: Characterization of an Expert Solution to Inform Assessment in a Computer Simulated and Industrially Situated Experimental Design Project
2.1. Introduction
2.2. Learning and assessment in computer enhanced learning environments
2.2.1. Types of computer enhanced learning environments
2.2.1.1. Computer enhanced learning environments placing the learner in the role of student
2.2.1.2. Industrially situated computer enhanced learning environments
2.2.2. Evidence Centered Design as an assessment framework
2.3. Development of the VCVD Learning System
2.3.1. The task model
2.3.2. The evidence model
2.3.3. The need for a competency model: motivation of the expert–novice study
2.4. Method
2.4.1. Expert Participant
2.4.2. Novice Participants
2.4.3. Task
2.4.4. Data Sources
2.4.5. Analysis Methods
2.5. Results
2.5.1. The expert’s solution
2.5.1.1. Information gathering
2.5.1.2. Formulating the problem
2.5.1.3. Iterative modeling and experimentation to solve the problem
2.5.1.4. Summary of the expert’s solution
2.5.2. Investigation of the application of the VCVD Learning System assessment framework: comparing the expert modeling competencies to student solutions
2.5.2.1. Evaluation of Modeling in Team A’s Solution (low performing team)
2.5.2.2. Evaluation of Modeling in Team B’s Solution (low performing team)
2.6. Discussion
2.6.1. Reflections on the utility of the competency model
2.6.2. Limitations of the study and future work
2.7. Conclusions
3. Manuscript 2: An Expert Study of Transfer in an Authentic Problem
3.1. Context and Research Questions
3.2. Expertise and Transfer
3.3. Methods
3.4. Findings
3.4.1. The Expert FPM Solution
3.4.2. The Expert SED Solution
3.5. Discussion
3.6. Acknowledgements
4. Conclusion
4.1. Informing instruction and assessment of the VCVD task
4.2. Informing instruction and assessment of other engineering design tasks
4.3. Providing a transferable assessment framework
5. Bibliography
LIST OF FIGURES
2.1. A diagram showing information gathering and the iterative cycle of model development, use, and refinement
2.2. The logical flow suggested by Backwards Design and ECD is shown compared to the framework used in the VCVD learning system
2.3. The expert Model Map
2.4. Two excerpts from the expert’s notebook showing information gathering
2.5. Two excerpts from the expert’s notebook showing iterative modeling
2.6. Student Team A Model Map
2.7. Student Team B Model Map
3.1. A triangle representing three possible solution approaches
3.2. The two expert Model Maps
LIST OF TABLES
2.1. A legend showing the meaning of various symbols used in the Model Maps analysis technique
2.2. Competencies demonstrated in the expert’s solution
2.3. Evaluation of student team solutions
3.1. A key for the representations used in the Model Maps
3.2. Comparison of the FPM and SED solution approaches and reactor performance
Characterization of Expert Solutions to Inform Instruction and Assessment in
an Industrially Situated Process Development Task
1. Introduction
Ill-structured tasks are at the core of engineering practice and yet they often receive little
attention in the undergraduate curriculum, especially in the first three years. Instead,
instruction tends to favor deterministic problems with converging and limited solution
paths that result in a single correct solution. While this incongruence might seem
alarming, there are several plausible explanations for why instruction focuses on more
constrained, deterministic types of problems and why this type of thinking is so heavily
entrenched in universities. First, constrained problems can adequately educate students
on the basic declarative and procedural knowledge associated with an engineering
domain (Shavelson, Ruiz-Primo, Li & Ayala, 2003). For example, there is a set of correct
steps for many of the basic engineering calculations such as for many mass, momentum,
and energy balance problems. Second, these types of problems are appealing to
educators because they do have a correct answer as well as a correct or favorable solution
path. These aspects aid instruction and assessment. The premise of the research presented
in this thesis is as follows: In order to facilitate integration of authentic and ill-defined
problems, a better understanding is needed of both the knowledge structures required to
perform these types of tasks and the characteristics of "high-quality" solution paths.
Observation and analysis of how experts undertake such tasks can help elucidate both
these elements.
This work seeks to contribute to the advancement of engineering education through
offering a framework for classifying high-quality solutions to a specific authentic,
industrially situated, process development problem. The problem is referred to as the
Virtual Chemical Vapor Deposition (VCVD) process development task. It was created to
develop students’ abilities to apply fundamental engineering principles to solve real-world problems. It requires that the students propose and complete an experimental
design project situated in the semiconductor manufacturing industry. The objective of
the students’ project is to determine optimal settings for a high-volume manufacturing
process. The problem is complex and solutions typically take on the order of 10-30 hours.
The findings presented in this work are based on the classification of two expert
engineers’ solutions to the same process development problem that students are given. We
identify competencies demonstrated in the expert solutions related to the use of strategy
and modeling. The intent is that (i) these competencies may be used to inform instruction
and assessment of student solutions to the VCVD task, (ii) the competencies may be
interpreted in the larger context of engineering practice to identify the transferable
knowledge and skills that are useful in solving process development problems, and (iii)
our framework may be useful for other engineering educators classifying quality
solutions in learning systems based on real-world engineering tasks.
This thesis is composed of two manuscripts focusing on characterizing expert solutions to
the VCVD process development task. The study began by first observing and
characterizing the solution of a single expert chemical engineer. This initial
characterization is reported in the first manuscript. The second manuscript describes our
work observing and characterizing the solution of a Mechanical Engineering team and
comparing it to the solution generated in the first part of the work. Further details
regarding the manuscripts follow.
Chapter 2 presents the first manuscript, which has been submitted to the journal
Computers and Education. The work presented in that paper is outlined as follows: First,
an argument is made for the need to develop assessment methods in situated and
computer enhanced learning environments. Next, we review methods employed by other
researchers in the design and assessment of such learning environments. The paper then
introduces the framework employed in the development and assessment of the Virtual
CVD learning system. This introduction is accomplished through describing and
establishing connections between: (i) the task (the VCVD process development task), (ii)
the methods used in capturing evidence of student performance (Model Maps), and (iii)
the target competencies to be developed and assessed in students. The three components
of the framework are based on Evidence Centered Design, a well-established framework
for assessing context dependent, computer-enabled learning systems (Mislevy, Almond
& Lukas, 2003).
Prior work is then discussed regarding the development of the task and evidence
components of the VCVD framework. A list of target competencies is then populated
through examination of a single expert’s solution to the VCVD task. The list of target
competencies is referred to as the competency model and is used to evaluate two student
team solutions in order to examine the utility of the assessment framework.
The second manuscript constitutes Chapter 3. It was presented at the 2011 Research on
Engineering Education Symposium. It focuses on examining transfer of expertise in
expert solutions to the VCVD task. Transfer is important in engineering education
because engineers of the future will work in increasingly complex environments with
continuously advancing technologies. Thus, developing in students the ability to apply
core engineering principles to solve novel problems is essential.
The two experts examined in this manuscript had markedly different backgrounds within
the field of engineering and neither had prior experience with developing CVD
manufacturing processes. The paper identifies three similarities in the strategic
approaches taken by both the experts. These similarities are important as they represent
general strategies that were applied independent of the expert engineers’ backgrounds.
Additional strategic approaches which differed between expert solutions are identified
and discussed. Implications for engineering education are presented.
2. Characterization of an Expert Solution to Inform Assessment in a Computer
Simulated and Industrially Situated Experimental Design Project
Authors
Ben U. Sherrett
008 Gleeson Hall, Oregon State University
Email: sherretb@onid.orst.edu
Erick Nefcy
008 Gleeson Hall, Oregon State University
Email: nefcye@onid.orst.edu
Edith Gummer
Education Northwest, Portland, Oregon
Email: Edith.Gummer@educationnorthwest.org
Milo D. Koretsky
103 Gleeson Hall, Oregon State University
Email: milo.koretsky@oregonstate.edu
Submitted to Computers and Education, Elsevier
Abstract
Context dependent, computer enhanced learning environments are a relatively new
instructional approach. Such environments place students in roles not typically found in
the classroom, and the computer is intended to enhance learning. Because of their context
dependent nature, these learning environments often focus on eliciting complex learning
related to the application of knowledge and skill. While initial measures of learning and student
performance have been positive, assessment in these computer enhanced environments is
difficult and more work is needed.
This paper presents the study of an expert solution to an engineering process
development task. The task, situated in industrial practice, was developed for presentation
to students in a computer enhanced learning environment. From examination of the
expert’s solution, 14 competencies are identified. These competencies are then used to
assess two student solutions to the same task.
2.1. Introduction
The potential for computers to positively impact educational practice is reflected in
several recent studies characterizing student learning in computer enhanced learning
environments (Quellmalz, Timms & Schneider, 2009; Podolofsky, 2010; Clark, Nelson,
Sengupta & D’Angelo, 2009; Gredler, 2004, Vogel, Vogel, Cannon-Bowers, Bowers,
Muse & Wright, 2006; Podolefsky, 2010), and summarized in a recent National Research
Council panel “Learning science: computer games, simulations, and education” (Honey
& Hilton, 2011). In this paper, we investigate a learning system that places students in the
role of real-world engineers to solve a complex problem. The learning system is
enhanced by a three-dimensional computer simulation that provides the students with
authentic data. Other authors have referred to similar learning environments using terms
such as “technology-enhanced student-centered learning environments” (Hannafin &
Land, 1997), “computer simulations and games” (Honey & Hilton, 2011), and “computer
enhanced learning environments” (Jacobson, 1991). In this paper, we utilize the term
computer enhanced learning environment as a general descriptor.
In the past decade, significant effort has been devoted to developing and implementing
computer enhanced learning environments at both K-12 and university levels; however,
systematic assessment practices have not kept pace with this rapid development (Honey
& Hilton, 2011; Buckley, Gobert, Horwitz, O’Dwyer, 2010; Quellmalz et al., 2009). In
this paper we describe an assessment framework used to investigate student learning in
one such computer enhanced learning environment, which we term the Virtual Chemical
Vapor Deposition (VCVD) Learning System. The VCVD Learning System is built on the
situated perspective of learning (Johri & Olds, 2011; Collins, 2011). It places students in
the role of process development engineers working in teams to complete an industriallysituated and authentic process development task. The experiments that student teams
design are performed virtually using the computer simulation. Previous research in our
group has demonstrated that student teams are afforded opportunities to practice the
complete, iterative cycle of experimental design where they develop and refine solutions
through experimentation, analysis, and iteration (Koretsky, Amatore, Barnes & Kimura,
2008). Integral to their success is the ability to develop and operationalize models and to
create appropriate strategies (Koretsky, Kelly & Gummer, 2011).
One challenge in assessing this complex, authentic task is the broad spectrum of
knowledge and skills that this activity elicits. While we acknowledge there are many
other abilities that are necessary to complete the task, our current focus is to specifically
examine modeling competencies - the ability to apply knowledge and skill to develop and
use models to complete the task. We use the definition of a scientific model proposed by
Schwarz et al. (2009, p. 633) that a model is “a representation that abstracts and
simplifies a system by focusing on key features to explain and predict scientific
phenomena.” We extend this definition to the context of engineering by claiming that
models allow engineers to better develop possible solutions to a design problem.
Furthermore, we assert that a key element of engineering practice is the ability to apply
and operationalize models.
Modeling has been identified as a critical element of engineering and science practice
(Cardella, 2009; Gainsburg, 2006; Clement, 1989). Several innovative STEM learning
systems have explicitly attempted to develop modeling skills in students, such as Model-Based Learning (Buckley, 2000) and Model Eliciting Activities (Lesh & Doerr, 2003;
Yildirim, Shuman & Besterfield-Sacre, 2010; Diefes-Dux & Salim, 2009). Theories
regarding modeling in STEM fields contend that models are constructed from prior
knowledge and newly gathered information and that they are refined in an iterative cycle
of creation, use, evaluation, and revision (Buckley, et al., 2010).
Figure 2.1 is adapted from the work of Buckley et al. (2010) and hypothesizes the
iterative cycle of the modeling process as might be used by a student team in our process
development task. The team first uses prior knowledge and information from the
textbooks, journals, and/or web sites to develop model components useful in
understanding the system.
[Figure 2.1 appears here: a flow diagram in which information gathering from literature and prior knowledge feeds the generation of new model components; the model is used to make predictions, experiments are performed, data are analyzed, and the model is confirmed, revised, or abandoned.]
Figure 2.1. A diagram showing information gathering and the iterative cycle of model development, use, and refinement commonly implemented in experimentation in science and engineering.
The process of generating new model components may also involve the mathematization
of concepts. Mathematization is the process of using mathematical constructs to afford
greater understanding of real-world systems through activities ranging from “quantifying
qualitative information, to dimensionalizing space, and coordinatizing locations” (Lesh &
Doerr, 2003, pp. 15-16).
These model components are then assimilated into an overall model of the system which
is used to determine input parameters for an experiment. After running the experiment
and collecting data, the team compares the data against their model, resulting in
confirmation, revision, or abandonment of the model. Additionally, analysis of data may
lead to the development of a new model component. The newly refined model is then
used to predict future runs. In this iterative process, the model informs input parameters
and data from the process performance informs modeling efforts.
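To make the cycle in Figure 2.1 concrete, the following minimal Python sketch walks through the loop using a deliberately simplified process in which film thickness grows linearly with deposition time at an unknown rate. The sketch is illustrative only; the simulated experiment, the update rule, and all names (perform_experiment, iterate, TRUE_RATE) are hypothetical stand-ins for the activities in the figure, not part of the VCVD Learning System.

    import random

    TRUE_RATE = 2.0  # nm/min; the "real" process behavior, hidden from the modeler

    def perform_experiment(time_min):
        """The 'Perform Experiment' step: return a noisy thickness measurement."""
        return TRUE_RATE * time_min + random.gauss(0, 1.0)

    def iterate(target_thickness=100.0, runs=5):
        rate_estimate = 1.0  # initial model component from information gathering
        for run in range(1, runs + 1):
            # Use the model to make a prediction: choose a deposition time.
            time_min = target_thickness / rate_estimate
            # Perform the experiment and collect data.
            thickness = perform_experiment(time_min)
            # Analyze the data and revise the model component (a simple re-fit).
            rate_estimate = thickness / time_min
            print(f"run {run}: time={time_min:.1f} min, "
                  f"thickness={thickness:.1f} nm, rate={rate_estimate:.2f} nm/min")

    iterate()

In this toy version the model is a single quantitative component (the rate estimate); each pass through the loop corresponds to one traversal of the predict, experiment, analyze, and confirm/revise cycle in the figure.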
This paper explores the development and implementation of an assessment framework to
determine the extent that student teams engage in modeling in the VCVD Learning
System. The elements of the framework are based on Evidence Centered Design
(Mislevy, Almond & Lukas, 2003) and include the outcomes or competencies that are
desired, the evidence that is obtained from student teams that demonstrate these
competencies, and the task itself. The objective of this study is to characterize a set of
competencies related to modeling through investigation of an expert engineer’s solution
to the VCVD process development task.
The specific research questions guiding this study are as follows:
1. What modeling competencies manifest as an expert completes the VCVD task?
2. Do the competencies demonstrated by the expert inform the assessment of student
learning as verified by the examination of student work?
2.2. Learning and assessment in computer enhanced learning environments
2.2.1. Types of computer enhanced learning environments
Computer enhanced learning environments have been created to address a wide range of
content. STEM examples range from reinforcing conceptual understanding of core
biology (Horwitz, Neumann & Schwartz, 1996), chemistry (Tsovaltzi, Rummel,
McLaren, Pinkwart, Scheuer, Harrer et al., 2010), and physics (Wieman, Adams &
Perkins, 2008) to understanding complex biological systems (Buckley, Gobert, Kindfield,
Horwitz, Tinker, Gerlits, Wilensky, et al. 2004).
Alternatively, computer enhanced learning environments can be examined with regard to
the role and context within which they intend to place students. From this perspective,
two primary classifications emerge. In the first type, the learner is placed in the role of a
student in a typical classroom or laboratory. In this role, the learner performs tasks
common in traditional classrooms or laboratories, except that they are performed on a
computer. Many virtual laboratories fit into this classification.
In the second type, the software and instructional design attempts to remove the learner
from the role and context of a traditional classroom or laboratory and place them into an
alternative role and context. We term such cases situated computer enhanced learning
environments. Examples include learning environments which place students in the role
of a wolf, hunting in the wilderness (Schaller, Goldman, Spickelmier, Allison-Bunnell &
Koepfler, 2009), a video gamer traversing a challenging obstacle (Davidson, 2009), or a
pioneer navigating the perils of the Oregon Trail (Chiodo & Flaim, 1993).
For the purpose of this paper, we focus primarily on a particular branch of situated
learning environments in which the learner is placed in the role of a practicing professional.
We term this case industrially situated. To set the context for the study, we next review
literature describing development and assessment of computer enhanced learning
environments placing the learner in the role of traditional student and environments
placing the student in the industrially situated role of practicing professional.
2.2.1.1. Computer enhanced learning environments placing the learner in the role of
student
A large body of work describes computer enhanced learning environments that place
learners in the role of traditional student. A common form of this type of learning system
is the virtual laboratory created to replicate the physical laboratory (Sehati, 2000; Shin,
Dongil, En Sup Yoon, Sang Jin Park & Euy Soo Lee, 2000; Wiesner & Lan, 2004; Pyatt
& Sims, 2007; Mosterman, Dorlandt, Campbell, Burow, Bouw, Brodersen et al., 1994;
Hodge, Hinton & Lightner, 2001; Woodfield, Andrus, Waddoups, Moore, Swan & Allen
et al., 2005; Zacharia, 2007; Abdulwahed & Nagy, 2009). In this context, the computer
can be used to reduce the resources required or to help students better prepare for the
physical laboratory. With the intent of enhancing learning, virtual laboratories may be
supplemented with visual cues or alternative representations not possible in physical
laboratories (Wieman et al., 2008; Dorneich & Jones, 2001; van Joolingen & de Jong,
2003; Finkelstein, Adams, Keller, Kohl, Perkins & Podolefsky et al., 2005; Corter,
Nickerson, Esche, Chassapis, Im & Ma, 2007).
When assessing student learning in such computer enhanced learning environments, the
presence of an analogous physical learning environment supports the use of pre-existing
assessment tools. Additionally, systematic studies to assess and compare learning
between the virtual and physical modes are often performed. When such learning is
compared directly, research shows equivalent and often greater learning gains in the
virtual mode (Wiesner & Lan, 2004; Pyatt & Sims, 2007; Zacharia, 2007; Finkelstein et
al., 2007; Corter et al., 2007; Campbell, Bourne, Mosterman & Broderson, 2002; Lindsay
& Good, 2005; Vogel et al., 2006).
2.2.1.2 Industrially situated computer enhanced learning environments
Industrially situated computer enhanced learning environments place students in the role
of a practicing professional, wherein students complete tasks commonly found in the
work place by interacting with simulated systems and instruments representative of real,
industrial-scale devices. Development of these situated learning environments is
motivated by the widely accepted claim that mastery of technical content alone does not
adequately equip students for professional practice. Rather, students should also engage
in learning activities that promote the development of knowledge and skills associated
with the application of this technical content to solve real-world problems (Bransford,
Brown & Cocking, 2000, p. 77; Litzinger, Lattuca, Hadgraft & Newstetter, 2011;
Brown, Collins & Duguid, 1989; Herrington & Oliver, 2000).
Via tabulated data or mathematical simulation, computers can rapidly generate data
representative of real-world systems. Such data enables learners to engage in solving
problems representative of those found in industry. For instance, a computer enhanced
learning environment tasks civil engineering students with testing the dynamic responses
of multi-story structures to earthquakes, providing them with realistic data (Sim, Spencer
& Lee, 2009). In chemical engineering, learning environments based on full-scale
industrial processes of styrene-butadiene copolymerization and hydrogen liquefaction
have been reported (Kuriyan, Muench & Reklaitis, 2001; Jayakumar, Squires, Reklaitis,
Andersen & Dietrich, 1995). One capstone environmental engineering design project uses
a computer simulation to allow students to perform as field engineers at a virtual
hazardous waste site (Harmon, Burks, Giron, Wong, Chung & Baker, 2002).
In addition to a realistic task, some industrially situated computer enhanced learning
environments seek to provide a social context representative of industry. In these learning
environments, students interact with peers and the instructor(s) as they would colleagues
and project mentors. The social context is meant to encourage students to enter a
community of practice (Lave & Wenger, 1991) where they begin to reframe their identity
in order to think, act, and value as do practicing professionals in their field (Shaffer,
2008). For example, a suite of computer enhanced learning environments offering both an
industrially situated task and a social context has been developed and termed Epistemic
Games (Shaffer, 2005). The games put students in the role of professionals engaging in
tasks in the fields of journalism, urban planning, and engineering (Rupp, Gushta, Mislevy
& Shaffer, 2010). In one such Epistemic Game, Nephrotex, learners act as new hires at a
high-tech bioengineering firm. The students interact with a computer simulation to design
an optimal dialyzer for a hemodialysis machine. The computer provides both the platform
for experimentation and interactive correspondence with simulated co-workers and a
graduate student acting as project mentor (Chesler, D’Angelo, Arastoopour & Shaffer,
2011). Chesler and colleagues describe the instructional goal of the Epistemic Games as
ranging far beyond the “conceptual understanding” pursued in many learning
environments. Rather, Epistemic Games focus on developing students’ skills,
knowledge, identity, values, and epistemology common to the community of practice
within which they are participating (Rupp et al., 2010).
Studies of industrially situated computer enhanced learning environments commonly
describe the use of computers in delivering an innovative instructional design. However,
studies that assess and evaluate student learning in these environments are far less
common. One likely reason is the challenge inherent in assessing students’ acquisition of
the rich set of instructional objectives typically put forth in these learning environments
(e.g. thinking, valuing, and identifying as a member of a professional community as
mentioned by Rupp et al.). Assessment strategies must be developed that align with the
specific instructional objectives and several different approaches have been reported.
Hmelo-Silver (2003) investigated cooperative construction of knowledge in a clinical
trial design task that was presented to medical students via a computer simulation. She
used think-aloud protocol with fine and coarse grain discourse analyses to argue that the
learning environment promoted a joint construction of knowledge. While protocol
analysis offers detailed insight into the actions and cognition of study participants, it is
time consuming and impractical for the routine assessment of a large number of student
solutions to a complex task.
Chung, Harmon, and Baker (2001) studied student learning in the capstone
environmental engineering learning system cited above using pre- and post-concept
maps. They compared student concept maps to one(s) generated by an expert in the field.
They found that the students’ concepts and connections between concepts became more
aligned with the expert after the learning experience. However, a pre- and post-test
assessment approach poses potential misalignment for industrially situated learning
systems. Such learning systems are typically designed to develop the knowledge and skill
of a student within an authentic context; pre- and post-tests are often given in a sequestered
context removed from the authentic task.
2.2.2. Evidence Centered Design as an assessment framework
While the studies above show evidence of student learning in industrially situated
computer enhanced learning environments, several challenges limit the systematic and
widespread assessment of student performance and learning in such environments
(Honey & Hilton, 2011; Buckley, et al., 2010; Quellmalz et al., 2009; Gulikers,
Bastiaens & Kirschner, 2004). These challenges include: (i) capturing complex
performances that depend heavily on the authentic context within which they are
presented, (ii) capturing student performance in authentic ways during performance of the
task rather than before and after, and (iii) assessing student performance related to
application of knowledge and skills rather than solely possession of such competencies
(i.e., doing versus knowing). Recently, researchers have addressed these challenges using
the framework of Evidence Centered Design (ECD) (Bennet, Jenkins, Persky, & Weiss,
2003; Rupp et al., 2010; Quellmalz et al., 2009; Bauer, Williamson, Mislevy & Behrens,
2003; Shute & Kim, in press).
The ECD framework (Mislevy, Almond & Lukas, 2003) is built on the logical argument
posited by Backwards Design; that is, in order to effectively design learning
environments, educational designers should first identify learning outcomes, then
determine what student actions will provide evidence of their achievement of those
outcomes, and finally, develop learning activities that will elicit those student actions
(Wiggins & McTighe, 2005; Pellegrino, Chudowsky & Glaser, 2001). ECD builds on this
logical progression by linking observations of student performance on a task with the
likelihood that a student has achieved a given learning outcome using statistical methods
such as Bayesian or social network analysis.
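As a concrete illustration of this linkage, the short Python sketch below applies Bayes’ rule to update the probability that a student possesses a single competency after one observed action. The prior and likelihood values are hypothetical, and real ECD implementations typically use full Bayesian networks over many competencies and observables rather than this single-node update.

    def bayes_update(prior, p_obs_given_competent, p_obs_given_not):
        """Posterior P(competent | observation) for one observed student action."""
        numerator = p_obs_given_competent * prior
        evidence = numerator + p_obs_given_not * (1.0 - prior)
        return numerator / evidence

    # Hypothetical numbers: the observation is that a student operationalizes a
    # quantitative model component. Suppose competent students do this 80% of
    # the time, others 30%, and the prior belief in competence is 50%.
    posterior = bayes_update(prior=0.5, p_obs_given_competent=0.8,
                             p_obs_given_not=0.3)
    print(f"P(competent | observation) = {posterior:.2f}")  # prints 0.73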
In ECD, the three curricular design components referred to above are termed models¹.
When ECD is applied to computer enhanced learning environments, there is a typical
progression in the development of the three models, as illustrated in Figure 2.2a.
[Figure 2.2 appears here. Panel a, the process typically followed in Evidence Centered Design: state/national standards, task analysis, or expert survey → competencies → evidence → task. Panel b, the process of developing the VCVD assessment framework: industrial practice “in the wild,” together with cognitive demand and university constraints → task (VCVD LS) → evidence (Model Maps) → competencies (study of expert solution).]
Figure 2.2. The logical flow suggested by Backwards Design and ECD is shown in a. The flow of the assessment framework implemented in the VCVD learning system is shown in b. The italicized text shows the artifact used to satisfy each of the three ECD models in the VCVD learning system framework.
¹ While the ECD framework uses the term ‘model’ to describe its three curricular design components, this paper also uses the term ‘model’ to describe a representation created with the intent of affording greater understanding. When referring to ECD, competency, evidence, or task will always precede model and the text will be italicized. All other instances of the term ‘model’ refer to its more general description.
First, the competency (or student) model is developed. The competency model contains a
list of desired knowledge, skills, or other attributes to be developed and subsequently
assessed. These competencies may be identified by the instructor, by mandated learning
standards (Mislevy, Almond & Lukas, 2003), by a task analysis of experts (Bauer et al.,
2003), or by a survey of experts (Shute & Kim, in press). Second, the evidence model is
developed by asking what observable student response will be able to provide proof that
the student possesses the desired competencies. Third, the task model is developed by
defining the specific tasks and actions that the students will be asked to complete in order
to elicit responses that will in turn inform the evidence model.
Bauer, Williamson, Mislevy & Behrens (2003) describe the use of ECD to assess student
learning. The task in the study is a web-based instructional system designed to elicit
essential competencies required of workers in the field of computer networking. The
learning environment “uses realistic scenarios to set the stage for authentic design,
configuration and troubleshooting tasks” (Bauer et al., 2003, p. 3). In the design process,
the authors developed a competency model of the knowledge, skills, and abilities to be
assessed based on a task analysis of computer networking professionals and experts
within the field. The model included categories of “network proficiency,” “network
modeling,” and “domain disciplinary knowledge.” An evidence model was then
developed using Bayes Nets to analytically link specific observable actions of the
students to the possession of competencies. The intent of the work was to develop a
computer based assessment tool to characterize students’ acquisition of the complex
knowledge and skills required of computer networking professionals, such as system
design and complex problem solving.
Akin to Bauer et al., the goal of our research is to be able to make valid claims regarding
the students’ demonstration of competencies required of professionals in the field of
chemical engineering. In our assessment framework, we focus on assessing the students’
application of their knowledge and skills as they engage in an authentic engineering task,
the VCVD process development task. In order to accomplish such an assessment, we
focus on the process of developing the three assessment models suggested by ECD: the
competency model, the evidence model, and the task model. However, in developing the
VCVD Learning System, we have taken a fundamentally different approach compared to
Backwards Design and ECD. We do not begin by defining a set of competencies to
develop in students. Rather, we intentionally choose to first define an engineering task
“in the wild” and then render it to the academic setting. This approach naturally inverts
the form that the ECD framework takes. We elaborate on the justification for our
approach and then describe the process of developing the three ECD models in the
following sections.
2.3. Development of the VCVD Learning System and associated assessment
framework
The VCVD Learning System was developed based on the idea that thinking and learning
are inextricably tied to the context of the given activity or task by which the thinking and
learning are prompted (Bransford et al., 2000; Ambrose, Bridges, DePietro, Lovett,
Norman & Meyer, 2010; Lave & Wenger, 1991; Shaffer, Squire, Halverson & Gee, 2005;
Litzinger et al., 2011; Johri & Olds, 2011). The premise for our approach is that the
cultivation of the ability of students to apply their knowledge and skills “in the wild” is
most effectively developed through engagement in tasks that reflect the authentic
environments of professional practice as closely as possible (Fortus, Krajcik, Dershimer,
Marx & Mamlok-Naaman, 2005; Prince & Felder, 2006; Bransford et al., 2000).
However, while providing an authentic context, we also must be mindful of the limits of
the cognitive resources possessed by students and the other constraints of the academic
environment.
With these ideas in mind, we focus first on an authentic task “in the wild.” Consequently,
the ECD construct is essentially inverted, as shown in Figure 2.2b. In the evolution of the
VCVD Learning System, it is the task model that was developed first, based on
characterization of typical tasks identified from interactions with practicing professional
engineers in industry. Second, an evidence model was developed. As discussed above,
development of modeling skills in students is a primary emphasis of the VCVD Learning
System. Thus, the evidence model must characterize the ability of students to develop
models and identify strategies to employ these models to complete the task. Finally, a
competency model is developed. Our framework relies on a task analysis of an expert
engineer’s solution to the same task given to students in the VCVD Learning System.
In essence, instead of asking, “What competencies do we want students to demonstrate
and how can we build a task that elicits those competencies?” we first ask, “What are
elements of complex tasks that practicing engineers routinely perform?” and then, “Given
an authentic task that embodies many of those elements, what competencies are
necessary to engage in the task?”
While they are described briefly below to provide context, the framework for the task
model and the evidence model used in developing the VCVD learning system is based on
more extensive work that is described elsewhere (Koretsky et al., 2008; Seniow, Nefcy,
Kelly & Koretsky, 2010). The focus of this paper is to identify their role in terms of an
assessment framework and then to develop the third component, the competency model.
In order to achieve this last objective, we examine the solution of an expert.
2.3.1. The task model
The process of developing the task model began with direct input from practicing
professionals in the form of informal interactions with engineers during industrial
research projects, with student interns and their project supervisors, and with our home
university’s Industrial Advisory Board. Based on these interactions, we developed a list
of common traits of complex engineering tasks. This list was confirmed by characteristics
of tasks in which practicing engineers engage that are described in the literature
(Herrington & Oliver, 2000; Todd, Sorenson & Magleby, 1993). Such tasks:
1. Are commonly completed using iterative cycles of design and analysis.
2. Involve systems that function according to complex phenomena that are not easily
understood and cannot be classified with absolute certainty due to variations in
process and measurement.
3. Are completed in a social context and typically involve interacting with
teammates and a project supervisor or a customer.
4. Have ambiguous goals and competing performance metrics requiring engineers to
make difficult decisions about tradeoffs in performance in order to produce the
“best” possible solution.
5. Are completed with an emphasis on budgetary and time constraints.
In the VCVD Learning System, a specific task was chosen from industry that aligned, as
much as possible, with these general traits. Students are placed in teams and play the role
of process development engineers. They are tasked with determining a recipe of input
parameters for a chemical vapor deposition reactor so that it can be released to high
volume manufacturing. Students evaluate their input parameters by designing and
performing experiments and analyzing the results.
The task was designed to be as authentic as possible within the constraints of a class
setting. The experiments that the student teams design are performed virtually within a
three dimensional user interface intended to replicate a semiconductor processing facility.
Data for students are generated from an advanced mathematical model of the process, and
random and systematic process and measurement error are added to the output. Data are
only provided for the process parameters the students choose to run at the wafer locations
that they choose to measure. As in industry, student teams are charged virtual money for
each experimental run and each measurement. Research of students’ perceptions has
shown that students believe that the VCVD task is representative of a real-world
engineering task (Koretsky, et al., 2011).
“Good” runs are chosen according to a trade-off between two performance metrics,
product quality (uniformity of the thin film deposited to a target thickness) and process
cost (amount of chemicals used and time of process). Additionally, the student teams
seek to complete the task while keeping their total experimental costs as low as possible.
Optimizing performance in one metric typically results in decreasing performance in the
other metric, requiring student teams to evaluate the relative importance of both
performance metrics and their total experimental cost. Due to the complex interactions
between input parameters and performance metrics, an iterative problem solving
approach is necessary. Since the data are collected easily through the computer, student
teams are afforded opportunities to practice the complete, iterative cycle of experimental
design within the time frame and resources available to an undergraduate course
(Koretsky, Amatore, Barnes & Kimura, 2008; Koretsky, Barnes, Amatore & Kimura,
2006).
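The actual VCVD scoring scheme is not reproduced here, but the trade-off between the performance metrics can be illustrated with a toy objective function. In the Python sketch below, uniformity is taken as the coefficient of variation of thickness measurements across a wafer; the weighting and cost figures are hypothetical choices for illustration, not the grading used in the course.

    import statistics

    def uniformity_penalty(thicknesses):
        """Coefficient of variation of measured film thickness (lower is better);
        a hypothetical stand-in for the VCVD product-quality metric."""
        return statistics.stdev(thicknesses) / statistics.mean(thicknesses)

    def run_score(thicknesses, chemical_cost, process_time, w_quality=0.5):
        """Toy composite score (lower is better) with an illustrative weighting
        of product quality against process cost."""
        cost_penalty = (chemical_cost + 2.0 * process_time) / 100.0
        return (w_quality * uniformity_penalty(thicknesses)
                + (1 - w_quality) * cost_penalty)

    # Two hypothetical recipes: the more uniform one consumes more chemicals and time.
    print(run_score([98, 101, 100, 99, 102], chemical_cost=40, process_time=20))
    print(run_score([90, 110, 95, 105, 100], chemical_cost=25, process_time=10))

Because improving one metric degrades the other, no recipe minimizes both penalties at once; choosing the weighting is exactly the kind of trade-off decision the task requires of student teams.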
While we report the assessment framework for this specific task in the domain of
chemical engineering, the list of traits associated with authentic engineering tasks is
general and this approach could be used in a similar way to identify and develop
authentic tasks for a wide variety of engineering disciplines.
2.3.2. The evidence model
One central desired outcome of the VCVD Learning System is to have student teams
engage in modeling to help them complete the experimental design task. The objective is
to have students recognize and use their engineering science background to more
effectively pursue the experimental design process rather than being explicitly instructed
to use a specific model. Through our experiences with this learning system, we have
observed a wide spectrum of model components that students have developed in their
efforts to complete the task. These model components include quantitative predictive
analytical model components, qualitative descriptive conceptual model components, and
empirically based statistical model components. In addition, there is a range of
sophistication in both the engineering science model components themselves and the
ways in which the students operationalize the model components to complete the task.
In order to capture evidence of the modeling activity demonstrated by student teams, we
have developed an analysis framework called Model Development and Usage
Representations, or, in short, Model Maps (Seniow et al., 2010). Model Maps provide a
succinct and graphical representation and are used as the evidence model in our
framework. The Model Maps analysis technique looks in vivo at the solution process
followed by student teams, focusing on the model components that they develop and how
they use their model to complete the VCVD task. The construction of a Model Map
involves transforming information from student work products into an information rich,
chronological representation of the solution path that was followed. In this way, a given
team’s solution to the VCVD task, which typically takes a team of 2 to 4 students 10 to
60 hours, is reduced to a 1 page graphical representation. This condensed representation
reduces the amount of time it takes to evaluate many student teams regarding their
modeling competencies and enables direct comparisons between the student teams.
The Model Maps analysis framework graphically displays information regarding a
student team’s model components and experimental runs by associating characteristics
with specific symbols. Table 2.1. shows the symbols used and their corresponding
description. Model components are represented according to their type (qualitative or
quantitative) and their action (operationalized, abandoned, or not engaged). Additionally,
the impact of modeling activity for each run is identified by the shape associated with the
run markers. Finally, additional descriptors provide further description of the modeling
process such as identification of model components that are clearly incorrect or sources
used in information gathering. All of these features are organized along a line
representing the chronological progression of the student team’s solution process.
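The information encoded in a Model Map can be pictured as a small chronological data structure. The Python sketch below is one possible reading of Table 2.1 expressed as types; the field names and example entries are hypothetical illustrations, not a data format used by the researchers.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ModelComponent:
        description: str
        kind: str        # "qualitative" (circle) or "quantitative" (rectangle)
        action: str      # "operationalized", "abandoned", or "not engaged"
        primary: bool    # on the central problem line vs. peripheral to it
        runs: List[int] = field(default_factory=list)  # runs where applied

    @dataclass
    class Run:
        number: int
        run_type: str    # "model directed", "statistically defined",
                         # "qualitative verification", or unclassified

    # A hypothetical fragment of one team's map, in chronological order:
    components = [
        ModelComponent("As pressure decreases, uniformity increases",
                       kind="qualitative", action="operationalized",
                       primary=True, runs=[1, 2]),
    ]
    runs = [Run(1, "model directed"), Run(2, "qualitative verification")]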
Model Maps are constructed by researchers based primarily on interpretation of
information contained in the experimental design notebooks that students are required to
keep. While notebooks have been used as a primary source of data in other studies aiming
to characterize student cognition in engineering design problems (Sobek, 2002; Ekwaro-Osire & Orono, 2007), we acknowledge they are limited by the varying degree to which
participants record their cognitive processes and by our researchers’ ability to correctly
identify and interpret the intended meaning of the student inscriptions.
Table 2.1. A legend showing the meaning of various symbols used in the Model Maps analysis technique.

Type of Model Component
- Qualitative model component (circle): a relationship that does not rely on numbers, e.g., “As pressure decreases, uniformity increases.”
- Quantitative model component (rectangle): a relationship that relies on numbers, typically in the form of equations, e.g., “Film Deposition Rate = Reaction Rate Constant x Concentration of Reactant.”

Modeling Actions
- Operationalized: a model component that is developed and then used throughout the solution process.
- Abandoned: a model component that is developed and then clearly abandoned.
- Not Engaged: a model component that is clearly displayed in the notebook but for which no evidence of use exists.

Run Type
- Model directed run: a quantitative model is used by the group to determine the parameter values.
- Statistically defined run: a statistical approach is used to determine the parameter values (e.g., DOE).
- Qualitative verification run: teams analyze the data provided by the run to qualitatively verify models that they have developed.
- Unclassified run: no explicit reason is given or deducible for runs not explicitly related to modeling; often a “guess and check” or “fine tuning” run.

Additional Descriptors
- Incorrect model component: an X is placed over any clearly incorrect model component.
- Problem Scoping / Sources: a box is shown at the beginning of the map, signifying the Information Gathering stage; all sources listed in the notebook are displayed.
- Primary model components: placed along the central problem line; used repeatedly and essential to the overall solution.
- Secondary model components: connected to the central problem line; peripheral to the overall solution.
- Run reference number: denotes the run(s) in which a given model component was explicitly applied.
To address these issues, we take two approaches. First, students are explicitly requested
to keep detailed experimental notebooks describing the experimental design that they
execute and justifying the reasoning behind strategic decisions. The student teams
participating in this study complete a physical laboratory project immediately before the
VCVD process development task. During the physical laboratory, they have been
instructed and provided feedback on keeping detailed experimental notebooks. The
notebooks in both the physical laboratory and the VCVD task are graded based on their
content and level of detail. Second, the Model Maps coding process relies on additional
sources of data, including: (i) the database of student run and measurement parameters
cataloged by the VCVD computer program, (ii) the memoranda that teams are required to
bring to the design strategy and project update meetings, and (iii) the team’s final report
and final oral presentation slide show. The information from these sources serves to
confirm, explain, or expand upon the notebook content.
The construction of Model Maps is based on a reliable and valid coding method that is
described in greater detail in Seniow et al. (2010). Recently, our group has performed a
study comparing Model Maps constructed using the methods described in this study to
Model Maps constructed using protocol analysis of full-length transcripts of think-aloud
protocol of solutions to the VCVD task. We have found that while Model Maps
constructed based on the finer grain approach provide greater detail, the fundamental
characteristics of the solution process are well represented by the Model Maps technique
used in this study (Nefcy, Gummer & Koretsky, 2012).
2.3.3. The need for a competency model: motivation of the expert–novice study
In this study, we seek to develop an appropriate competency model based on the task
model and the evidence model described above. To achieve this objective, our approach is
to observe an expert chemical engineer’s performance while he completes the VCVD
task and to characterize his solution using Model Maps. This approach derives from
studies of expertise in the learning sciences (Bransford et al., 2000; Litzinger et al., 2011;
Atman, Kilgore & McKenna, 2008) and is intended to capture the modeling
competencies. Once developed, the competency model may be used together with the
evidence model (Model Maps) to assess student solutions to the VCVD task.
As mentioned above, our focus in this work is directed at characterizing expert
competencies associated with modeling. To frame our investigation of the expert’s
solution, we focus on three fundamental stages of the modeling process as discussed in
literature (Buckley et al., 2010) and as interpreted in the context of modeling in the
VCVD Learning System in Figure 2.1. We define each of the stages below and discuss
prior findings from literature.
• Information Gathering: As shown in Figure 2.1, information gathering is one of
the primary ways that the modeling process begins. It has previously been
identified as a competency critical to modeling (Maaß, 2006). Correspondingly,
information gathering is a common focal point in many studies of expert solutions
to problems in a variety of fields (Robinson, 2011; Delzell, Chumley, Webb,
Chakrabarti & Relan, 2009; Kirschenbaum, 1992). Studies in engineering
education have focused on information gathering in the context of design
problems and have found that experts exhibit significantly different patterns in
information gathering when compared to students. In a study performed by Atman
et al. (2007), experts spent more time gathering information, requested
information more often, and requested information from more categories than
students. These findings are consistent with other studies in engineering which
have identified information gathering as a critical stage in the design process (Jain
& Sobek, 2006; Ennis & Gyeszly, 1991).
• Formulating the Problem: As shown in Figure 2.1, model components are
generated before the first experiment based on gathered information and prior
knowledge. These initial model components are generated so that the problem
solver may understand the problem and frame it so that action (in this case,
performing experimental runs of the VCVD reactor) may ensue. Such activity is
referred to as problem scoping, problem structuring, or problem framing. Studies
of engineering designers have shown that such activity is an important stage in
design (Restrepo & Christiaans, 2004), albeit one that varies based on context of
the problem and the designer’s past experience (Cross & Cross, 1998; Cross,
2003).
• Iterative modeling and experimentation: Iteration of modeling and
experimentation is shown in Figure 2.1 as the clockwise arrows, denoting the
cyclic progression of the solution process. The process involves using a model to
predict input parameters for an experiment and then revising that model based on
analysis of the results of the experiment. Iterative modeling is not only beneficial
in the development of conceptual understanding in the modeler (Lesh & Harel,
2003), but is also considered an essential ability of scientists (Buckley et al.,
2010). Likewise, iteration is commonly referred to as a critical aspect of the
engineering design process (Ullman, 2009). When investigated in the context of
engineering, iterative design practices have been shown to vary based on
experience and to correlate with success in design projects (Adams, 2001).
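
To make the cycle just described concrete, the following minimal sketch shows the shape of the loop: a model predicts the next input, an "experiment" returns an observation, and the model is revised from the data until the output converges on a target. The toy one-variable process and all of its numbers are invented for illustration and bear no relation to the VCVD physics.

    # Minimal sketch of the model -> experiment -> revise cycle on an
    # invented one-variable process (unrelated to the VCVD physics).
    def experiment(temperature):
        """Hidden 'true' process response; stands in for a reactor run."""
        return 0.4 * temperature - 250.0

    target = 100.0          # desired output (e.g., a film thickness)
    model_slope = 0.5       # initial model guess, revised from data below
    T, output = 800.0, experiment(800.0)

    for run in range(2, 8):
        T_next = T + (target - output) / model_slope         # model-directed input
        output_next = experiment(T_next)                     # perform the run
        model_slope = (output_next - output) / (T_next - T)  # revise the model
        T, output = T_next, output_next
        print(f"run {run}: input={T:.1f}, output={output:.1f}")
        if abs(output - target) < 0.5:                       # converged on target
            break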
2.4. Method
2.4.1. Expert Participant
The expert in the study is a highly qualified chemical engineer. He received his
undergraduate and doctorate degrees in chemical engineering from top tier programs and
has eighteen years of industrial experience working in both high-tech and pulp and paper
industries. His positions in industry have included roles in microfluidic design,
mechanical design, reaction engineering, and research and development management. He
has been promoted regularly, culminating in a management position and a designation of
“master engineer” in a global high-tech company. The expert is generally acknowledged
by his peers for his technical mastery and he holds over 20 patents.
While the expert was educated as a chemical engineer and has extensive and varied
experience with both chemical engineering project work and experimental design, he did
not have any specific experience working with CVD. The choice of an expert with this
background was deliberate as we sought an expert who would need to activate his
foundational domain knowledge to complete the task and not rely on any specific
practical experience from similar tasks.
In some tasks, experts primarily use their specific practical experience from similar tasks
as opposed to activating foundational domain knowledge to develop a solution. For
example, Johnson (1988) found that experts used task specific prior knowledge to
troubleshoot a generator and Hmelo-Silver, Nagarajan & Day (2002) found that experts
used specific prior knowledge of the drug trial design process in the design of a clinical
trial for a new drug. Alternatively, in order to evaluate how experts approach novel
design tasks, researchers have created problems where the expert does not have prior
experience, such as sand box design (Atman et al., 2007) or the design of “imaginary
worlds” with unique design criteria (Smith & Leong, 1998). Since the expert in this study
has general disciplinary knowledge but no specific experience with CVD or the related processes upon which the Virtual CVD laboratory is based, his situation is analogous to the second
approach. The lack of process specific experience aligns his cognition with that of the
students; both must complete the task using the activation of their foundational domain
knowledge.
One limitation of the study is that the expert worked alone while the students worked in teams as part of a cohort. In this way, we believe the expert was at a disadvantage since he
did not have the rich socio-cultural environment experienced by students. The results
presented here should be considered with this difference in mind. In the future, we intend
to study teams of experts so that we can further discern important socio-cultural elements
in the solution process.
2.4.2. Novice Participants
The novice participants included two student teams selected from a cohort of senior
chemical, biological, and environmental engineering students at a large public university.
Both teams contained three students. The teams were selected based on the project
supervisor’s perception that the solutions were representative of a low performing team
and a high performing team. The selection and discussion of student solutions perceived
as representative of the general student population is seen in other studies of problem
solving in engineering (Atman, Bursic & Lozito, 1996).
The students were assigned the Virtual CVD process development task as one of three
laboratory projects in the first term of the senior capstone laboratory sequence. Prior to
this course, they had completed courses addressing core chemical engineering science
content such as material balances, reaction kinetics, mass transfer, and statistical analysis
of data including Design of Experiments (DOE). Students self-selected into teams in their
laboratory sections. The project was approved by the Institutional Review Board and all
participants signed informed consent forms.
2.4.3. Task
Both the expert and student participants performed the same VCVD Learning System
process development task as described above and were then given three weeks to
complete the project. The participants were first given a project introduction that included
a presentation containing background information regarding thin films manufacturing, the
CVD process, and the basic principles contributing to CVD.
During the project, they were required to meet with the faculty member serving as
“project supervisor” twice. In the first of these project update meetings, the participants
were asked to present their experimental strategy, including first experimental run
parameters and overall projected budget. In the second meeting the participants were
required to give the project supervisor an update regarding their progress and to outline
the experimental strategy to be used in completing the task. In both of these meetings, effort was placed on maintaining a situated environment for the project. Students were
required to bring typed memoranda to each of the meetings while the expert was asked to
verbally describe the items listed above.
The participants performed all experiments using the VCVD reactor and used the user interface to submit their final recipes of input parameters. After the submission of a final
recipe, the students were required to deliver final oral and written reports while the expert
was not. These activities both develop communication skills and promote reflection and
metacognition. However, they would have minimal impact on our analysis of the expert
modeling competencies and we simply could not ask the expert to spend the time these
activities would require.
2.4.4. Data Sources
Data have been collected from five sources for this study. First, participants were
instructed to thoroughly document their work in an experimental notebook. They were
instructed to keep track of the run parameters, a summary of output, and their data analysis, and to explain what they inferred from the analysis and what they planned to do next. After
the project, the notebook was collected for analysis. Second, experimental information
was gathered from the instructor interface of the Virtual CVD laboratory software and
database. Using the instructor interface, the researchers were able to view the parameters
and results of the runs and measurements that each team completed as they progressed
through the project. Third, for the students only, work products were collected including
the two typed memoranda and their final written report and final presentation slides.
Fourth, following think-aloud protocol, the expert was instructed to verbalize his thought
process as he solved the problem (Ericsson & Simon, 1996). The expert was audio
recorded as he worked and during his two project meetings with the project supervisor.
All audio recordings were then transcribed. Fifth, the expert was interviewed after the
project using a semi-structured format with both open ended and directed questions. The
interview was audio recorded and transcribed.
2.4.5. Analysis Methods
The primary analysis method implemented in this study is the development and
interpretation of Model Maps for each participant, as described in the ‘evidence model’
section above. As described there, the design notebook, additional work products, and
experimental run data from the instructor interface of the VCVD Learning System are the
data sources used by researchers to construct Model Maps. Further analysis using Model
Maps was performed in two stages in order to address each of the two research questions.
We addressed the first research question regarding the expert’s modeling competencies
by interpreting the information in his Model Map in the context of commonly identified
features of expertise and commonly accepted expert practices in modeling as reviewed in
Section 2.3.3. This examination provides evidence of the manifestation of such expert
traits in the specific context of the VCVD process development task. To gain greater
insight into the expert’s modeling, traits identified in the expert Model Map were
investigated further by searching the think-aloud and interview transcripts, and revisiting
the expert’s notebook. In this fine grain analysis, transcripts and the notebook were
reviewed in their entirety and a researcher parsed out segments of these documents that
were related to the traits identified in the analysis of the Model Map. In the case of longer
sections of parsed data, our research team interpreted the statement and a discussion of
our interpretation is presented. In cases of shorter and more direct parsed segments, the
data is presented directly along with a discussion.
To address the second research question, we analyzed Model Maps for the two
student teams. This analysis involved examining the Model Maps for both teams
regarding the expert modeling competencies previously identified.
2.5. Results
The results section includes the presentation of the expert’s solution as represented by a
Model Map, including the identification and discussion of the modeling competencies
demonstrated in the three stages of his solution. We then assess the solutions of two student teams based on the competencies of the expert.
2.5.1. The expert’s solution
The Model Map developed from analysis of the expert’s work products is shown in
Figure 2.3. The Model Map can be interpreted using the three stages of modeling
described in Section 2.3.3. The expert began the solution process to the VCVD task with
information gathering, citing six sources. The expert then developed 11 model
components in formulating the problem. These model components are seen before the
first experimental run (shown by the small filled black square run marker labeled ‘1’).
Next, he entered the solution stage of iterative modeling and experimentation. In this
stage, he used information from eight experimental runs to revise two of his previously
developed model components and develop three new model components. The expert
concludes his solution by submitting a final recipe of inputs (shown by ‘Final
Parameters’ in Figure 2.3.) for release to high-volume manufacturing.
We next identify the competencies that are demonstrated by the Model Map associated
with each of the three stages. Model Map analysis is also supplemented through analysis
of the expert’s notebook, the think-aloud transcript, and the post project interview.
2.5.1.1. Information gathering
The process of information gathering is intentionally designed into the VCVD task and
forms an important foundation for the modeling that each team performs. While students
are given a brief overview of CVD technology and associated first-principles during the
project introduction, they must engage in self-directed information gathering to
investigate the usefulness and application of relevant theories and strategies. Information
gathering is also used to determine reasonable input parameter values for experiments.
This stage of the solution process typically involves searching texts, journal articles, and
websites.
[Figure 2.3 appears here in the original: a Model Map beginning with an Information Gathering box listing six sources (Welty, Wicks, Wilson and Rorrer; Teasdale; Wolf and Tauber; May and Sze; Xiao; Wikipedia), followed by model components including material balances, kinetic models, the Ideal Gas Law, qualitative relationships between pressure, DCS/NH3 ratio, and reaction rate, a 3-factor DOE (P, NH3 flow, DCS flow), and a utilization component, with run markers 1-8 and two project update meetings leading to the Final Parameters.]

Figure 2.3. A Model Map displaying the information gathering and modeling demonstrated by the expert in his solution process. The dashed hexagon is included to illustrate the four interlocking model components that form the core of the model.
Examination of the Model Map depicting the expert’s solution indicates that six
references were listed in his notebook. Three of these sources were chemical engineering
texts, two were journal articles, and one was a website. Direct examination of the expert’s
notebook yielded insight into the competencies demonstrated during this stage of his
solution. For example, from the Teasdale et al. journal article (the second source shown
in Figure 2.3. in the ‘Information Gathering’ box), he noted the CVD process input
parameter values reported (Teasdale, Senzaki, Herring, Hoeue, Page & Schubert, 2001).
Additionally, he noted that the reactor volume reported in Teasdale et al. differed from the
reactor for which he was developing input parameters. This note is interpreted as a
reminder to use caution when translating the values reported in Teasdale et al. to the
VCVD process.
A second example of the expert’s information gathering competency was found in his
citation of the Wolf & Tauber text (Wolf & Tauber, 1986). Here, similar to his citation of
Teasdale et al., he recorded the range of input parameters that Wolf & Tauber list for
CVD reactors. Additionally, in the Wolf & Tauber citation, he noted one of the essential
constraints limiting the maximum temperature in CVD processes, i.e., the typical
temperature range the reactor can operate under before the chemical reaction transitions
into a regime wherein product uniformity dramatically decreases. This transition point in
CVD reactors became a focal point in the expert’s efforts later in formulating the
problem.
Using Wolf & Tauber, the expert also noted the qualitative effect that the input parameters
of pressure and flow ratios of the two reactants have on the deposition rate of the film.
Figure 2.4a. shows the expert’s notes regarding this relationship. Examination of the
expert Model Map shows that this information was later assimilated and became two
qualitative model components (denoted by the two circular symbols in Figure 2.3.) which
were useful in formulating the problem and performing iterative experimentation and
modeling (used to determine input parameters for runs 1 and 5).
Figure 2.4. Two excerpts from the expert's notebook showing (a) the formulation of a qualitative model component relating deposition rate to pressure (P) and input flow ratios and (b) an explicit comparison between deposition rates (dep rates) found in two literature sources.
The expert’s information gathering continued in this pattern; he cited sources and
extracted material which he identified as useful. To make judgments regarding the
validity and applicability of the information he found, the expert compared inputs and
relationships reported in various sources amongst one another and to the process on
which he was to perform experiments. A sample of this activity is found in Figure 2.4b.
In this example, the expert explicitly compares a deposition rate reported in Wolf &
Tauber to a deposition rate reported in Teasdale et al.
In summary, the expert’s information gathering activity demonstrated the following
competencies: (i) collection of multiple (six) sources from texts, journals, and the
internet; (ii) evaluation of information credibility and applicability via cross referencing
sources and comparison of reported systems to the system being optimized; (iii)
assimilation of information thought to be reliable and applicable to form the foundation
for model development in future solution stages.
2.5.1.2. Formulating the problem
In the expert Model Map shown in Figure 2.3., evidence of modeling activity in the
problem formulation stage is shown between the box signifying ‘Information Gathering’
and the first experimental run, indicated by the filled black square run marker labeled
with a ‘1.’ During problem formulation, the expert placed an emphasis on modeling,
developing 11 of the 16 total model components represented on the Model Map.
Furthermore, ten of the 11 model components are based directly on his fundamental
knowledge of chemical engineering first principles (all but the 3-Factor DOE in the
dashed box). Finally, we notice the core of his modeling activity is represented by four
clustered model components. These components are indicated by the added dashed
hexagon in the Model Map. We term them clustered since they are integrated and become
applied as a whole.
The modeling activity in the problem formulation stage culminates in a first experiment
which we term model directed, as indicated by the filled black square run marker. In a
model directed run, the model is used to predict exact numerical values for the input
parameters for the experimental run. Such behavior is aligned with the overall goal of the
VCVD Learning System - to develop in students the ability to apply fundamental
chemical engineering principles to solve real-world engineering problems. The project
supervisor has witnessed student teams performing similar model directed runs; however,
in experience with over 100 teams, he has never observed a student team use modeling to
determine their first experimental run parameters. We argue below that the expert’s
performance in this regard is facilitated by a broad set of competencies.
Key to the expert’s modeling strategy is the identification of a critical element to the
reactor performance that focuses and directs his modeling activity. Identification is
followed by conceptualization, mathematization, simplification, and solution. This
interpretation is based on evidence in the expert’s design notebook and the think-aloud
transcript, and is confirmed through the post-project interview. Below we elaborate on
this sequence of observed modeling activity.
To understand the expert’s modeling approach, some engineering science background is
needed for context. CVD reactors can be operated in two regimes, which are termed
‘mass-transport limited’ and ‘reaction-rate limited.’ The input parameters determine the
regime in which the VCVD reactor operates. However, the different “zones” along the
reactor may operate in different regimes since the conditions change as the feed gases
react to form products (the film and waste gases). The operating regime is critical since
one regime enables high film uniformity (one of the quality metrics) and the other does
not. However, the desired regime also results in slower film growth rates and
correspondingly longer process times (one of the cost metrics) which is not desirable. As
a result, a good solution strategy is to investigate the transition from one regime to the
other and select input parameters so that all of the zones operate just barely within the
desired regime. This strategy results in a film that is grown as fast as possible, but is still
uniform.
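
The internal model of the VCVD simulation is not reproduced here, but the tradeoff can be pictured with the standard Grove treatment of CVD, in which the surface reaction and gas-phase mass transport act like resistances in series. The sketch below uses invented parameter values purely to show how an Arrhenius-activated surface reaction coefficient overtakes a nearly temperature-independent transport coefficient as temperature rises, moving the process from the uniform-but-slow reaction limited regime into the fast-but-nonuniform transport limited regime.

    import math

    # Grove-style series-resistance picture of CVD growth (illustrative
    # sketch only; all parameter values are invented, not VCVD settings).
    k0 = 1.0e9      # pre-exponential factor, surface reaction (cm/s)
    Ea = 1.7        # activation energy (eV)
    kB = 8.617e-5   # Boltzmann constant (eV/K)
    hg = 5.0        # mass transport coefficient (cm/s), nearly T-independent

    for T in (900, 1000, 1100, 1200):          # temperature (K)
        ks = k0 * math.exp(-Ea / (kB * T))     # surface reaction coefficient
        k_eff = (ks * hg) / (ks + hg)          # series combination of the two
        regime = "reaction limited" if ks < hg else "transport limited"
        print(f"T = {T} K: effective coefficient {k_eff:.2f} cm/s ({regime})")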
In the process of formulating the problem, the expert first identified that reactor regime is
critical to performance. This identification was based on a combination of information
gathered as mentioned above, and his understanding of first principles. Next, through a
qualitative conceptualization of the processes occurring along the length of the reactor,
the expert correctly recognized that it is the last zone of the reactor that is most
susceptible to transition to the undesired regime. Direct evidence of this
conceptualization was found in his think-aloud transcripts. However, due to the technical
nature of the utterances, we omit them from this discussion.
The expert's notebook reveals that he mathematized his conceptual understanding by generating a set of four quantitative model components based on first
principles. These model components were identified using Model Maps analysis and are
shown in Figure 2.3. in the dashed hexagon. He used this quantitative model along with
the needed conditions at the last zone of the reactor to determine what conditions were
required in each of the prior zones. In other words, the expert developed his model in a backwards manner: starting with the conditions needed at the critical last zone of the reactor, he worked zone by zone toward the inlet to determine the input parameters to the first zone.
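
A minimal sketch of this backwards bookkeeping, assuming a simple per-zone depletion balance with invented numbers (not the expert's actual equations), is:

    # Backwards zone-by-zone balance sketch (illustrative only; the
    # per-zone consumption and required concentration are invented).
    n_zones = 4
    c_required_last = 2.0e-6     # mol/cm^3 needed entering the last zone
    consumed_per_zone = 5.0e-7   # mol/cm^3 consumed by deposition per zone

    # March from the critical last zone back toward the inlet, adding
    # back the amount each upstream zone consumes.
    concentrations = [c_required_last]
    for _ in range(n_zones - 1):
        concentrations.append(concentrations[-1] + consumed_per_zone)
    concentrations.reverse()     # now ordered inlet -> outlet

    for zone, c in enumerate(concentrations, start=1):
        print(f"zone {zone}: required concentration {c:.2e} mol/cm^3")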
In solving equations to determine numerical values, the expert purposefully simplified
some of the quantitative model components. That is, the expert’s notebook shows
quantitative model components of greater complexity than those he ultimately used to
determine input parameters. The expert did not directly state why he chose to use the
simplified model components. However, he was readily able to apply the simplified
model components and achieved expected results on his first experiment. The more
complex model components may represent an alternate approach to be used by the expert
if the simplified modeling approach proved insufficient.
Once his model was formulated, the expert incorporated appropriate values for model
parameters (constants) which he had previously identified during information gathering.
The expert then solved the resulting equations to numerically calculate the input
parameters for his first run (the model directed run). Such actions of developing,
simplifying, and solving mathematical equations to model a real-world system
demonstrate a high level of mathematical procedural competence.
We propose that development and deployment of the expert’s modeling strategy was
facilitated by his retrieval of connected chunks of knowledge regarding fundamental
chemical engineering concepts such as reaction kinetics, properties of gases, and material
balances. Such a structure of connected bodies of knowledge is often referred to as
schema and experts are generally acknowledged as having more developed schemas than
novices (Bransford et al., 2000). The expert’s schema is evidenced in two ways in his
solution. First, to develop his model in a backwards manner, the expert needed to
conceptually characterize the reactor behavior - both to identify the desirable regime and
then to determine which end of the reactor was most susceptible to transition out of this
regime. The expert would not have been capable of employing this strategy if he
considered fundamental principles in a piecemeal fashion. Rather, this conceptualization
required the expert to simultaneously apply multiple fundamental chemical engineering
concepts. Second, in formulating the quantitative model, the expert again demonstrated
his interconnected knowledge structure; he mathematized his conceptual understanding
of how the reactor worked through complementary equations, such that all equations
could be solved simultaneously.
Analysis of the expert’s post-project interview provides evidence of the intentionality of
his model intensive approach in problem formulation. When prompted to simply ‘reflect
on the project,’ the expert mentioned his focus on the initial part of the solution several
times. He discussed the effort he spent learning about the mechanisms that affect CVD, “I
really wanted to understand what I was trying to do before I started.” Later in the
interview, he articulated two reasons for this emphasis, both situated in his industrial
experience. He first mentioned a focus on understanding the problem, “as a seasoned
professional, if I don’t understand my objectives, I can’t make good engineering
decisions.” He also mentioned his desire to maintain his credibility in industry, “You got
to have your ducks in a row before you walk up to someone (an operator or supervisor)
because that first impression of credibility is really important.” The expert then
mentioned the role of understanding and application of chemical engineering first
principles in formulating the problem, “What you don’t want to do is go up to equipment
and start running things, building things and not think about the basic principles at play.”
He continued to state, “I believe in the application of our (chemical engineers’) training
in scientific fundamentals to solve problems” and “So you look at something (referring to
a new project) and you immediately go to the physical principles that you were trained in
and begin to apply them.”
In summary, while formulating the problem, the expert demonstrated the following list of
competencies: (i) he identified that operating regime was a critical feature of the problem;
(ii) he conceptualized the reactor’s operation to identify where and how the operating
regime issue would manifest in the reactor; (iii) he transferred his conceptual
understanding of how the reactor worked into a series of mathematical equations; (iv) he
chose to simplify the mathematical model components which he then solved to determine
input parameters; and (v) strategically, he placed a distinct emphasis on modeling to
understand the problem before performing experiments.
2.5.1.3. Iterative modeling and experimentation to solve the problem
During the iterative modeling and experimentation stage, the expert iteratively used his model to predict experimental input parameters. He also revised his model based on
analysis of experimental results and on discussions with the project supervisor during
project update meetings. His first experimental run’s input parameters were determined
based on his quantitative modeling. Additionally, Model Maps analysis reveals that, of
the eight experimental runs performed by the expert, there was explicit evidence of the
application of model components to determine input parameters in all but two runs (run 3
and run 6). Likewise, the Model Maps analysis method classified six of the expert’s
model components as ‘primary’; these are located directly along the solution path line in
Figure 2.3. Model components are classified as primary when they are interpreted as
essential to the overall solution path by the research team.
Regarding the revision of his model, although the majority of the expert’s model
components were developed before his first experimental run, he generated five model
components while performing experiments, three of which were explicit revisions of
earlier model components (‘Material Balance 3’, ‘Kinetic Model 2’, and ‘T2=…’). These
model components were developed by the expert based on analysis of data from his
experimentation and feedback during project update meetings with the project supervisor.
Further insight into the iterative nature of the expert’s modeling and experimentation and
his metacognition regarding strategy was found in his notebook. Figure 2.5. depicts two
excerpts from the expert’s notebook explicitly demonstrating his iterative plan of
modeling and experimentation. In Figure 2.5a., the expert has defined his first
experimental run input parameters and is describing his future plans after performing the
first experiment and analyzing the data. The cyclical diagram shows that after running his
first experiment, he plans to verify the ‘coherence’ of his theory. If found coherent, he
will transition to experiments guided by a ‘factorial’ experimental design, a commonly
used experimental design from the Design of Experiments methodology.2 If not
'coherent', he will modify his theory and run another experiment. Examination of think-aloud data revealed that this plan was further iterated upon: the expert did verify his theory, but chose to apply a tuning process guided by first principles knowledge instead of a factorial experimental design.
Figure 2.5b. shows a similar diagram from the expert’s notebook which he inscribed after
his fourth experimental run. In this diagram, the expert shows his solution path forward
after analyzing run 4. The notebook excerpt reveals that if the ‘characteristic shape’ of
film deposition is the ‘same’ (signifying good film uniformity), he will proceed to
perform experiment 5. If not, he will ‘think.’ A similar reactive process is proposed to
follow run 5. The think-aloud transcript reveals the expert’s cognition regarding the
factors contributing to his future strategic choice; specifically, whether to apply qualitative modeling to increase flow rates or to move back towards a more first principles quantitative approach of tuning temperature zones.

"So it is interesting, I need to start thinking about what I am going to do
depending on what these results look like. If I bring up that curve, it is still
monotonic, but if I start to bring it up, then I'm going to go back in and
bump the DCS flow up. If I go back to my concave up curve, I think that I
will go to a temperature modification strategy."

[Footnote 2: Design of Experiments (DOE) is an experimental design methodology that lays out patterns for determining experimental inputs so that empirical data may be gathered and analyzed using statistical principles, to develop quantitative models predicting the relationship between inputs and process performance.]
Figure 2.5. Two excerpts (a and b) from the expert's notebook describing his iterative approach to modeling in his solution to the VCVD task.
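
For readers unfamiliar with the factorial designs referenced in the footnote above, the run pattern is mechanical to generate. The sketch below enumerates a two-level, three-factor full factorial over the same factors named in the expert's DOE model component (pressure and the two reactant flows); the low/high level values are invented placeholders, not recommended settings.

    from itertools import product

    # Two-level, three-factor full factorial design (2^3 = 8 runs).
    # Factor names mirror the expert's DOE model component; the level
    # values are invented placeholders.
    levels = {
        "pressure_mtorr": (100, 300),
        "nh3_flow_sccm": (50, 150),
        "dcs_flow_sccm": (20, 60),
    }

    for run, combo in enumerate(product(*levels.values()), start=1):
        settings = dict(zip(levels, combo))
        print(f"run {run}: {settings}")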
The expert Model Map and think-aloud transcript show additional evidence of dynamic interaction between his solution approach and the feedback he received from analyzing
experimental results. As mentioned in the problem formulation section above, the expert
began the solution process implementing a quantitative first principles modeling
approach. However, in the middle portion of the solution process he transitioned to an
approach more reliant on qualitative modeling (used in runs 2-6) and finally to a fine
tuning approach (runs 7-8). During runs 2-6, the expert adjusted parameters using
primarily qualitative model components based on first principles, such as ‘As DCS/NH3
Ratio Increases, Reaction Rate Increases.’ During the tuning runs, the expert used
primarily a quantitative model component (the model component in Figure 2.3 denoted by the run numbers '7,8') to determine input parameters based on empirical data
from analyzing past runs. Inspection of the think-aloud transcript gave insight into why
the expert considered alternative strategies:
“I think you have to wait and see how much change there is along the
length and within the zones (referring to uniformity of the film thickness)
to decide if I am ready to start my zonal temperature tuning, which I have
always imagined is like the tuning variable.”
This quotation and further examination of the corresponding text suggest that the expert attempted to level any monotonic increase or decrease of deposition along the length of the reactor using qualitative model components. His strategy then transitioned to using first principles to quantitatively 'tune' the zonal temperatures. This tuning strategy resulted in the expert arriving at a recipe of inputs that he submitted as his final parameters.
We conjecture that the expert managed his strategies in an effort to continuously
converge on an optimal recipe of input parameters. Examination of recorded data from
the VCVD computer interface confirms that the expert did indeed converge on an optimal
set of input parameters. That is, he was able to use his model to determine input
parameters for subsequent experiments so that, over the course of his experimentation,
the reactor’s performance improved. As a result, the expert submitted a final recipe of
input parameters that was the same as those used in his last experimental run, except for
an inadvertent 0.5 °C deviation in one of the temperature zone inputs.
The expert summarized the iterative nature of his solution approach in one of the last
comments captured in his think-aloud transcript. After completing his eighth and final
experiment and selecting his final recipe of input parameters for submission to high-volume manufacturing, the expert reflected on the solution process: "Pretty happy with that result and I am going to stop there. Kind of neat to see how [my solution process] iterated to the answer."
In summary, the expert’s activity during this stage of the solution demonstrated the
following list of competencies: (i) the use of modeling to determine experimental input
parameters and to converge on high performing input parameters; (ii) the iterative
evaluation and revision of his overall model; (iii) the use of first principles throughout the
solution process; and (iv) the strategic transition between three different approaches for
determining input parameters.
2.5.1.4. Summary of the expert’s solution
A summary of the modeling competencies the expert demonstrated in his solution to the
VCVD task is shown in Table 2.2. The competencies listed represent our efforts to
address our first research question: What modeling competencies manifest as an expert
completes the VCVD task? They also constitute a preliminary competency model for the
VCVD Learning System regarding modeling. These competencies may now be compared
to the competencies demonstrated in student solutions as represented by Model Maps.
2.5.2. Investigation of the application of the VCVD Learning System assessment
framework: comparing the expert modeling competencies to student solutions
In order to address our second research question, Do the competencies demonstrated by
the expert inform the assessment of student learning as verified by the examination of
student work?, we have investigated the use of the modeling competencies shown in
Table 2.2. to assess two sample student teams’ solutions. To expose the framework to the
spectrum of potential student solution quality, we selected two student teams, one of
whose solution was perceived to be of low quality and the other of high quality.
Table 2.2. Competencies demonstrated in the expert's solution. Evidence from: *Model Map analysis, **notebook, †think-aloud, ‡post project interview.

Information gathering:
1. Identification of multiple sources of information from texts, journals, and the internet. Evidence: six sources cited, including three chemical engineering texts, two journal articles, and one website.*, **
2. Evaluation of information credibility and applicability via cross referencing sources and comparison of reported systems to the system being optimized. Evidence: explicit comparison amongst sources and between sources and the VCVD process.**
3. Assimilation of information to form foundation for future model development. Evidence: themes identified in information gathering developed into model components in later solution stages.**, †

Formulation of the problem:
4. Identification of a critical aspect of the problem (reaction regime). Evidence: identification of operating regime as critical to reactor performance.**, †
5. Conceptualization of the reactor's operation. Evidence: verbalization of relevant principles at play as the reaction occurs along the VCVD reactor.†
6. Mathematization of conceptual understanding. Evidence: generation of multiple interconnected model components.*, **, †
7. Simplification of mathematical equations; and 8. Solution of mathematical equations. Evidence: generation and solution of simplified mathematical equations predicting reactor performance.*, **, †
9. Strategic emphasis on modeling before experimentation. Evidence: 11 model components developed before the first run*, **; discussion of importance of understanding applicable first principles before experimentation.‡

Iterative modeling and experimentation:
10. Use of modeling to select input parameters. Evidence: a quantitative model based on first principles used to determine run 1 input parameters*, †; model components explicitly linked to the determination of input parameters for six of the eight experiments.*, **
11. Evaluation and revision of model based on analysis of experimental data. Evidence: three revisions to model components present*, **; five model components developed after the first experiment.*, **
12. Use of iterative modeling and experimentation to converge on high performing input parameters. Evidence: six model components deemed 'primary'*, **; expert submitted a final recipe of input parameters nearly identical to his last run.*
13. Strategic changes in approach to determining input parameters. Evidence: three distinctly different approaches guided modeling in the beginning, middle, and end of the project.*, **, †
2.5.2.1. Evaluation of Modeling in Team A’s Solution (low performing team)
The Model Map depicting student Team A’s solution is shown in Figure 2.6. When the
Team A Model Map is assessed relative to the competencies identified in the expert
Model Map, several differences are readily apparent. First, during the information
gathering stage, the Team A Model Map indicates that they did not cite any literature.
Second, in an effort to formulate the problem, Team A developed only two model
components before their first experimental run.
[Figure 2.6 appears here in the original: a Model Map with an Information Gathering box citing no sources, model components including a Material Balance, Utilization, Reaction Limited vs Diffusion Limited, D_AB ∝ T^(3/2), Material Balance: Depletion, the Ideal Gas Law, an Economic Analysis, and R = kC_DCS, two project update meetings, and run markers 1-11 leading to the Final Parameters.]

Figure 2.6. The Model Map representing the solution of student Team A (low performing).
Regarding iterative modeling and experimentation, none of the team’s model components
were developed in an integrated fashion and none were ‘primary model components.’
That is, no model components are connected to each other and none lie directly along the
central solution line. The team developed a model to predict reaction rate (labeled
'R = kC_DCS'), but only after the second project update meeting. Likewise, while the team
developed this reaction rate model, according to their notebook, they did not explicitly
use it to inform any of their experimental runs (no run reference number is shown at the
bottom right corner of the model). However, aligning with the expert solution, it can be
seen that the team developed a qualitative “Material Balance: Depletion” model before
their seventh experimental run that is a revision of the first model that they developed in
the solution process. Additionally, they developed five model components which were integrated into their solution process while performing experiments. This iterative behavior aligns with that of the expert.
Further investigation of the Team A Model Map reveals, however, that the student team did not reference any of their model components when justifying experimental run input parameters in their notebook. The team submitted a final process recipe of input parameters identical to their sixth experimental run, which was performed roughly half-way through their experimentation. While performing multiple experimental runs that do not improve the solution is not in itself a sign of a low level of competence, in this case it is most likely a result of the loose connection between the model components that Team A developed and their selection of input parameters. That is, Team A appears to have lacked either the fundamental knowledge of chemical engineering first principles or the ability to apply this understanding to solve the VCVD task. Evaluation of Team A's solution using the competency model is shown in Table 2.3.
2.5.2.2. Evaluation of Modeling in Team B's Solution (high performing team)
Student Team B’s Model Map is shown in Figure 2.7. During information gathering the
team explicitly noted two sources from literature in their notebook. While still less than
the six sources referenced by the expert, it is an improvement over the information
gathering activity reported by Team A.
[Figure 2.7 appears here in the original: a Model Map with an Information Gathering box citing Pierson, H. (1992) and Sivaram, S. (1995); model components including a 5-factor 2-level DOE (T, P, NH3 flow, DCS flow, time), a 3-factor 2-level DOE (T, P, NH3 flow), material balances, the Ideal Gas Law, a reaction rate cluster (R = kC_A and k_s = k_0 e^(-Ea/RT)) in a dashed rectangle, an Arrhenius plot, statistical analysis, ellipsometer variation, and qualitative temperature-uniformity relationships; two project update meetings; and run markers 1-12 (run 12 a rerun of run 11) leading to the Final Parameters.]

Figure 2.7. The Model Map representing the solution of student Team B (high performing).
While formulating the problem, Team B’s Model Map shows the development of seven
model components before the first experimental run, all but one of those coming before
the first project meeting. This activity is evidence of Team B’s early modeling effort and
is reflective of the expert’s approach to solving the problem. Interestingly, two of the
team's model components in this stage signify their pursuit of solving the VCVD task using a Design of Experiments (DOE) approach. Correspondingly, the triangular run markers
marking runs 1 and 4-6 show that the team implemented their second DOE strategy,
although only partially. The expert also considered a DOE approach, but did not execute
it.
After run 6, the team developed a reaction rate model cluster based on first principles, identified with the dashed rectangle in Figure 2.7. The two model components in the cluster (R = kC_A and k_s = k_0 e^(-Ea/RT)) are the same as two of the four model components in the expert's model cluster discussed above. As mentioned in the discussion of the expert's solution, such clusters suggest that the students in Team B possess an interconnected and functional body of knowledge regarding fundamental chemical engineering concepts.
Additionally, Team B developed three other clusters of two model components each, further demonstrating the competency of developing interconnected model components.
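
One way to see the functional value of this two-component cluster: with concentration held fixed so that the rate ratio equals the rate-constant ratio, deposition rates measured at two temperatures determine an apparent activation energy from ln(k2/k1) = -(Ea/R)(1/T2 - 1/T1), which is the information an Arrhenius plot (one of Team B's later model components) encodes. The sketch below uses invented numbers; the values are not data from either team's solution.

    import math

    # Estimate an apparent activation energy from deposition rates at two
    # temperatures, assuming R = k*C_A with C_A fixed (invented values).
    R_gas = 8.314            # J/(mol K)
    T1, rate1 = 900.0, 12.0  # K, nm/min
    T2, rate2 = 950.0, 30.0  # K, nm/min

    Ea = -R_gas * math.log(rate2 / rate1) / (1.0 / T2 - 1.0 / T1)
    print(f"apparent activation energy: {Ea / 1000:.0f} kJ/mol")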
Team B also explicitly noted the use of model components to inform experimental input
parameters. Inspection of Team B’s Model Map reveals that all but two of their 11
experimental runs (runs 2 and 9) were influenced by their model. The highlight of this
modeling effort came in run 7, wherein the team used their reaction rate modeling cluster
to determine the input parameters of a model directed run (filled black square run marker
labeled ‘7’).
Like the expert, student Team B practiced iterative modeling and experimentation. That
is, Team B refined their overall model of how the CVD process worked by integrating
information from analyzing experimental data and feedback from project update
meetings. Team B’s Model Map shows that the team refined their modeling cluster
related to ‘Material Balance.’ They also developed four qualitative model components
after beginning to perform experiments. These model components, such as 'Increasing temperature decreases radial uniformity,' are likely the result of the team's analysis of run
data. Further evidence of Team B’s advanced application of iterative modeling and
experimentation is provided by the team converging on an optimal solution. That is, they
submitted final process input parameters that were identical to those used to perform their
last experiment. Evaluation of Team B's solution using the competency model is shown
in Table 2.3.
Table 2.3. Evaluation of Team A and Team B's solutions using the list of modeling competencies demonstrated in the expert's solution. '-' represents little or no evidence found in the student solution, '+' represents significant evidence found in the student solution, and 'n/a' signifies competencies that are unknown based solely on Model Maps analysis and therefore are not applicable.

Information gathering:
1. Identification of multiple sources. Team A (low performing): -; Team B (high performing): +
2. Evaluation of information credibility and applicability. Team A: -; Team B: n/a
3. Assimilation of information. Team A: n/a; Team B: n/a

Formulation of the problem:
4. Identification of a critical aspect. Team A: n/a; Team B: n/a
5. Conceptualization via interconnected model components. Team A: -; Team B: +
6. Mathematization, 7. Simplification, and 8. Solution. Team A: n/a; Team B: +, application of quantitative model components
9. Modeling before experimentation. Team A: 2 model components; Team B: 7 model components

Iterative modeling and experimentation:
10. Modeling to select input parameters. Team A: -; Team B: +
11. Evaluation and revision of model. Team A: +; Team B: +
12. Convergence on high performing input parameters. Team A: -; Team B: +
13. Changes in strategic approach. Team A: n/a; Team B: +
14. Use of first principles throughout the solution process. Team A: +; Team B: +
2.6. Discussion
2.6.1. Reflections on the utility of the competency model
The competency model shown in Table 2.2 contains modeling competencies identified from analyzing the expert's solution at both the coarse and fine grain levels and constitutes the desired knowledge and skills to be developed and assessed in the students. In this paper
we have shown the assessment of student solutions against the competency model using
Model Maps analysis of the student solutions. This coarse level analysis aligns with the
grain size of assessment that is typical in large scale assessments.
Our examination of student Teams A and B's solutions using the competency model successfully served as an initial demonstration of the utility of our assessment
framework. We were able to examine the student team Model Maps in light of the
competencies demonstrated by the expert in each of the three solution stages and arrive at
conclusions regarding the degree to which each solution demonstrated the expert
modeling competencies. Our evaluation is summarized in Table 2.3. and shows that
Team B’s solution demonstrated more of the competencies demonstrated by the expert
than did Team A’s solution and therefore constituted a higher quality solution.
Information at a fine grain size supplementing the Model Maps can be seen in Table 2.3.,
represented by the competencies evaluated as ‘n/a’ (not applicable) in the student teams.
This additional information, although not directly applicable in the large scale assessment
of all student solution work products, offers a more detailed understanding of the expert’s
modeling competencies. This additional information is useful for the VCVD learning
system project supervisor as he interacts with students personally in the project update
meetings and the final presentation. In each of these interactions, discussions focus on
specific aspects of the students' modeling strategies. By understanding the expert's
modeling competencies at a fine level of detail, the project supervisor may offer feedback
to student teams to guide them toward a more ‘expert like’ solution.
2.6.2. Limitations of the study and future work
There are limitations of this study which will guide future work. First, the expert worked
alone while the students worked on a team. When working to solve complex problems,
teams have been shown to be beneficial and change the nature of the solution process
(Brophy, 2006; Wiley & Jolly, 2003; Ullman, 2009).
Additionally, we have studied the solution of only one expert. The VCVD process
development task is complex and has a large solution space. Thus we expect to see
notable differences in modeling strategies as we observe additional expert teams from a
variety of backgrounds. For example, the expert in this study had received a doctorate
and therefore, most likely had been trained in the practice of thoroughly reviewing
literature. Experts without a doctorate degree could approach the problem by placing a
lesser emphasis on reviewing literature.
A second example of expected variety in future expert solutions is in the use of DOE to
solve the VCVD task. Such a DOE approach was considered by the expert in this study
and is indicated in his Model Map by a non-operationalized DOE model component
developed while formulating the problem. Likewise, examination of the expert's think-aloud transcript and post project interview indicated that he considered DOE a common
approach used in industry to solve similar process development problems.
Examination of the high quality solution generated by Student Team B also provides
insight into such alternative approaches. For example, Team B implemented a DOE
approach in four of their experiments. Additionally, Team B's Model Map shows that
they performed statistical analysis to characterize variation in the VCVD process and in
the tool used for measurement. We view these solution features as likely representing a
high quality solution, but they were not present in the expert’s solution. Thus, in future
work, we will seek to gain a more holistic list of modeling competencies that incorporate
a wide range of expert solution approaches to the VCVD process development task.
An additional limitation to this study is the heavy focus on characterizing modeling
competencies. While such a focus was necessary in the early stages of development of
our assessment framework, we have witnessed many other competencies demonstrated in
student solutions to this complex task that are not captured by the Model Maps technique.
A sub-set of these includes intra- and interpersonal competencies, project management
competencies, and metacognitive competencies. Future work will focus on developing
ways to capture evidence enabling us to make claims regarding students’ demonstration
of such competencies. In order to accomplish this holistic investigation of expert
solutions, we will first seek to form teams of experts, rather than the single expert who
was used in this study, and second, work to develop a more systematic and finer grain size protocol analysis framework to analyze expert think-aloud transcripts on an utterance-by-utterance basis.
Finally, while examination of the student teams' solutions demonstrated the utility of the
competency model and our framework, more work is still needed to streamline the
assessment framework so that it may easily be applied to a large number of student
solutions in the VCVD Learning System to yield numerical scores. These efforts could
include the integration of the statistical method typically used in the ECD framework.
2.7. Conclusions
Industrially situated computer enhanced learning environments present students with an
opportunity to practice as professionals within the constraints of the typical classroom.
However, student performance in such environments tends to be difficult to assess. In this
paper we have described a framework for the assessment of student learning in one such
learning environment, the VCVD Learning System. The assessment framework is
composed of an authentic task situated in industry; analysis of evidence of student
performance, as represented in Model Maps; and modeling competencies identified from
the examination of an expert’s solution to the same task as student teams are given. This
paper described our effort to develop a list of target modeling competencies using the
study of an expert’s solution to the VCVD process development task. These
competencies were framed within the solution stage in which they were demonstrated and
the organized list became the competency model for the VCVD task (as shown in Table
2.2.).
The competency model was then used to qualitatively assess two student teams’ solutions
as evidenced by their respective Model Maps. Table 2.2. provided a framework to clearly
identify modeling competencies in the student teams’ solutions that were aligned with the
expert’s competencies, and to identify aspects of the student solutions that appeared to be
lacking in modeling rigor. Such information can inform both instruction and assessment
practices.
Chapter 3: An Expert Study of Transfer in an Authentic Problem
Authors
Ben U. Sherrett
008 Gleeson Hall, Oregon State University
Email: sherretb@onid.orst.edu
Erick Nefcy
008 Gleeson Hall, Oregon State University
Email: nefcye@onid.orst.edu
Debra Gilbuena
008 Gleeson Hall, Oregon State University
Email: gilbuend@onid.orst.edu
Edith Gummer
Education Northwest, Portland, Oregon
Email: Edith.Gummer@educationnorthwest.org
Milo D. Koretsky
103 Gleeson Hall, Oregon State University
Email: milo.koretsky@oregonstate.edu
Proceedings of the Research in Engineering Education Symposium
October 4th-7th, 2011, Madrid, Spain
Abstract: As engineering educators, we seek to equip students with the ability to apply
their knowledge and understanding to solve novel problems. But how is this ability best
developed? This study investigates such transfer by examining two different expert
solutions to a complex process development project. The experts have different domain
knowledge - one in chemical engineering processes, the other in mechanical engineering
design. Solutions are examined using an emerging methodology termed “Model Maps”.
3.1. Context and Research Questions
Process development is a common and critical task for chemical engineers in industry;
however, it is difficult to create activities at the university to give students practice where
they are directly engaged in this task. Using a computer-based simulation, we have
created a process development project for students, the Virtual CVD Project. The project
has been specifically designed to provide the students an authentic, industrially-situated
task which they can solve using the fundamental knowledge and skills that they have
learned as undergraduates. This ability to transfer core understanding to solve novel
problems is central to the practice of engineering, but challenging to develop and assess.
This Virtual CVD Project tasks students with designing and performing experiments to
develop a ‘recipe’ of input parameter values for a chemical vapor deposition (CVD)
reactor. The reactor deposits thin films of silicon nitride on polished silicon wafers, an
initial step in the manufacture of transistors. The process development project is
completed using a virtual laboratory. Based on the input parameters that are chosen, the
computer simulation generates data for film thicknesses at locations that the students choose to measure. The output values incorporate random and systematic process and
measurement error and are representative of an industrial reactor. The students use
results from successive runs to iteratively guide their solution and are encouraged to
apply sound engineering methods since they are charged virtual money for each reactor
run and each measurement. They continue until they have found a 'recipe' of input
parameter values that they believe yields acceptable reactor performance. There are four
metrics that define favorable reactor performance: the students should perform as few
experiments as possible to develop a recipe that deposits films as uniform as possible,
with the highest reactant utilization, and the lowest reaction time. This project is complex
and participants typically spend between 15 and 25 hours to complete it. More
information may be found in Koretsky et al. (2008).
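To make the cost structure concrete, the following sketch shows one way the virtual
budget and a uniformity metric could be tallied. It is a minimal illustration only; the
prices, function names, and thickness values are assumptions for exposition, not the
actual simulation's interface.

    # Hypothetical tally of the Virtual CVD budget and a uniformity metric.
    # RUN_COST and MEASUREMENT_COST are assumed values for illustration;
    # the real simulation's charges may differ.
    RUN_COST = 5000.0        # assumed virtual dollars per reactor run
    MEASUREMENT_COST = 50.0  # assumed virtual dollars per thickness measurement

    def project_cost(num_runs: int, num_measurements: int) -> float:
        """Total virtual money spent, charged per run and per measurement."""
        return num_runs * RUN_COST + num_measurements * MEASUREMENT_COST

    def uniformity(thicknesses: list[float]) -> float:
        """Relative spread of measured film thicknesses; lower is more uniform."""
        mean = sum(thicknesses) / len(thicknesses)
        var = sum((t - mean) ** 2 for t in thicknesses) / len(thicknesses)
        return (var ** 0.5) / mean

    # Example: 8 runs with 5 wafer measurements each (illustrative numbers).
    print(project_cost(8, 40))
    print(uniformity([101.0, 99.5, 100.2, 98.8, 100.5]))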
In this project, there is no one 'correct' solution path. Through examination of over
sixty student solution paths, we see that although no two solutions are the same, they
can be classified by three fundamental approaches. These approaches are labeled by the
vertices of the triangle in Figure 3.1. In a sense, the triangle can be considered analogous
to a ternary phase diagram: solution paths may lie anywhere in the triangle, representing
either a single method approach (at the vertices) or a mixed methods approach (inside
the triangle).

[Figure 3.1. A triangle representing the three possible methods for solving experimental
design problems: First Principles Modeling (FPM), Statistical Experimental Design
(SED), and Guess and Check/Tuning.]

First, students may attempt to model the relevant physical
and chemical phenomena affecting the process using first principles modeling (FPM). We
define a model as the representation of a phenomenon that facilitates understanding and
promotes further inquiry. These analytical models can be used to predict input parameter
values that will result in desirable reactor performance - what we call a model directed
run. Second, students may adopt a statistical experimental design (SED) approach. This
method relies on empirical data to develop statistical models which relate input
parameters to process performance metrics. The statistical model may then be used to
predict how the input parameters impact process performance. SED includes such
methods as Design of Experiments (DOE) and Robust Design using Taguchi Methods
(Phadke, 1989). Finally, students may proceed in the solution path by 'intuitively'
guessing and checking or, later in the project, tuning. This strategy relies, at best, on a
qualitative understanding of the mechanisms driving the reactor. In this study, we focus
on solution paths guided by FPM and SED.
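To illustrate what a model directed run might involve, the sketch below combines an
Arrhenius rate constant with first-order deposition kinetics, relations of the kind used in
the expert solutions described in Section 3.4, to predict the time needed to reach a target
film thickness. The parameter values and function names are illustrative assumptions,
not values calibrated to the Virtual CVD reactor.

    import math

    R_GAS = 8.314  # universal gas constant, J/(mol K)

    def rate_constant(k0: float, ea: float, temp: float) -> float:
        """Arrhenius rate constant: k = k0 * exp(-Ea / (R*T))."""
        return k0 * math.exp(-ea / (R_GAS * temp))

    def deposition_rate(k0: float, ea: float, temp: float, c_dcs: float) -> float:
        """Rate assumed first order in dichlorosilane concentration: R = k * C_DCS."""
        return rate_constant(k0, ea, temp) * c_dcs

    def time_to_thickness(target: float, k0: float, ea: float,
                          temp: float, c_dcs: float) -> float:
        """If thickness grows linearly with time, t = target thickness / rate."""
        return target / deposition_rate(k0, ea, temp, c_dcs)

    # Illustrative, uncalibrated inputs: 100 nm target, Ea = 1.5e5 J/mol, 1100 K.
    print(time_to_thickness(100e-9, 2.0e3, 1.5e5, 1100.0, 0.04))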
Although both FPM and SED solution approaches are used in practice, they are seldom
directly compared. The curriculum in chemical engineering is foundationally constructed
to develop skills in FPM. Certainly, understanding the fundamental phenomena governing
processes is valuable; however, many practicing engineers resort to SED. Such input has
led to recent integration of these methods into the curriculum (Koretsky, 2010), albeit
often in a standalone manner. To confound matters, textbooks in DOE and Robust
Design (Phadke, 1989) typically lack any discussion of FPM. To our knowledge, the
literature lacks a description of the tradeoff between these two approaches and a
discussion of what role each might provide in the solution to a process development
project such as that presented in this study. It is unclear when, where and why each
approach should be used on a given project. A balanced analysis that simultaneously
considers both these approaches is needed to enable systematic curriculum design and to
inform instructor feedback on development projects.
To address this issue, we observe and analyze the solutions of two experts, each with
expertise in a different domain, as they complete the project. Neither expert has direct
experience with the type of process upon which the Virtual CVD Project is based. One's
expertise aligns with an FPM approach and the other's with an SED approach. In this
way, our study is reductionist; we have selected individuals to intentionally investigate
these two possible solution approaches and compare the similarities and differences.
This work is part of a larger investigation which seeks to answer the following research
questions:
1. What characterizes the two different approaches (SED and FPM) these experts take as
they complete the Virtual CVD process development project? Which components of
their solutions are similar and which are different?
2. What approach leads to a higher quality solution? Are both methods suited for this
project, or is either method preferred? Why?
3. Based on these observations, what conjectures can we make about engineering
curriculum design, instruction, and feedback?
3.2. Expertise and Transfer
A clear goal of education is to help students move towards expertise. Following this goal,
many studies across a wide range of domains have sought to characterize expertise (Chi,
1989; Cross, 2004). It has been found that experts possess well connected and rich
knowledge structures regarding topics within their field of expertise. These knowledge
structures, often referred to as schemata, facilitate a rich understanding of the problem
and rapid recall of relevant “chunks” of information during the solution process
(Bransford, 2000). When studying solutions to engineering design projects, it has also
been found that experts tend to spend more time problem scoping than novices do
(Atman et al., 2007). However, once experts choose a solution concept, they often stick
with the concept throughout the solution process, whether it is good or bad (Ullman et al.,
1988).
In the field of engineering where new technologies emerge daily, graduates need the
ability to apply familiar domain content in new situations. This application of core
knowledge to solve novel real world problems requires transfer. In order to understand
transfer in this context, a new branch of expert studies has emerged, classified by
Hatano and Inagaki (1984) as 'adaptive expertise.' Adaptive expertise
contrasts with ‘routine expertise’ in that it is flexible and may be applied to increase
learning and performance in a wide range of new situations. Recently, efforts have been
reported in the engineering education literature discussing how to increase adaptive
expertise in students (McKenna et al., 2001) using pedagogies such as "challenge-based
instruction, brainstorming in groups to generate ideas, receiving input from multiple
experts in the field, and formative assessment integrated into powerful computer
simulations” (Pandy, 2004, p. 9).
Our expert study is informed by this literature. We hypothesize that the experts will both:
(i) execute their solution path in a sophisticated and effective manner, and (ii) rigidly
adhere to a path that aligns with their foundational domain knowledge. By contrasting the
solution paths of experts with different domain knowledge, we seek to identify and
compare a FPM approach and a SED approach. This knowledge can be used to develop
strategies for increasing adaptive expertise in students.
3.3. Methods
We selected two experts based on the characteristics of their expertise and observed them
as they completed the Virtual CVD Project as described above. The first expert has
a doctorate in chemical engineering (ChE) and eighteen years of industrial experience. He has
been promoted regularly, culminating in a management position and a designation of
“master engineer” in a global high-tech company. This expert has a robust understanding
of the fundamental phenomena that govern the CVD reactor (e.g. diffusion, reaction
kinetics) and has self-reported being “known in industry for his ability to apply
fundamental engineering first principles to solve problems.” The second expert has a
doctorate in mechanical engineering (ME) and over twenty years’ experience in design
research and teaching at the university. In addition, he served as lead engineer and
president of an outdoor recreation company whose primary product line was developed
utilizing his design skills. This expert is a certified ‘Taguchi Master’ and has developed
and taught courses on Taguchi’s methods of Robust Design.
While these experts had general disciplinary foundational knowledge, they lacked
specific experience with CVD or related processes upon which the Virtual CVD Project
is based. This lack of process specific experience allows for the study of adaptive
expertise where the experts need to activate foundational domain knowledge to complete
the project and cannot rely on specific experience from similar tasks. This research
design provides an opportunity to study a project where the resources the experts need to
solve the problem approximate the resources that can be developed in students.
Our research group has developed a methodology termed Model Maps intended to
summarize the complete solution paths into a diagram showing critical solution
components. Model Maps are created by researchers who transform the data contained in
participants’ notebooks and other work products and the virtual laboratory database into
information-rich, chronological visual representations which illustrate the development
and use of models as the participants complete the project (see Seniow et al., 2010).
Model Maps display the types of model components employed (quantitative or
qualitative), their degree of utilization (operationalized, abandoned, or not engaged), and
their correctness along a central problem line that also contains numbered experimental
runs. A key for the different model map components is shown in Table 3.1. The Model
Maps analysis method was initially developed to analyze student solutions; 29 such
solutions have been examined using the method. These solutions are not presented in this
paper but we will allude to them in discussion of the experts’ solutions.
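To make these components concrete, one possible encoding of a single Model Map entry
is sketched below. The field names and the encoding itself are our illustrative
assumptions; the published method (Seniow et al., 2010) presents the maps
diagrammatically.

    from dataclasses import dataclass
    from enum import Enum

    class ModelType(Enum):
        QUALITATIVE = "circle"      # e.g. "As pressure decreases, uniformity increases"
        QUANTITATIVE = "rectangle"  # e.g. an equation relating rate to concentration

    class Utilization(Enum):
        OPERATIONALIZED = "used throughout the solution"
        ABANDONED = "developed, then clearly abandoned"
        NOT_ENGAGED = "in the notebook, but no evidence of use"

    @dataclass
    class ModelMapEntry:
        name: str                 # e.g. "Kinetic Model 2"
        model_type: ModelType
        utilization: Utilization
        correct: bool             # an X marks a clearly incorrect model
        primary: bool             # on the central problem line vs. peripheral
        directed_runs: list[int]  # experimental runs the model directed, if any

    # Hypothetical entry, not transcribed from either expert's actual map:
    entry = ModelMapEntry("Kinetic Model 2", ModelType.QUANTITATIVE,
                          Utilization.OPERATIONALIZED, True, True, [5, 6])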
3.4. Findings
Truncated Model Maps of the two expert solutions are shown in Figure 3.2. and each
expert's solution is summarized briefly below. Both solutions are detailed; given the
limited space available, we focus this discussion on aspects of the solutions that (i)
address the research questions and (ii) will stimulate fruitful discussion and feedback at
REES.
3.4.1. The Expert FPM Solution
The expert ChE approached the Virtual CVD Project by first performing a literature
review to identify which fundamental principles could be applied to the process and to
find typical input parameter values. This activity is shown in the box labeled
“information gathering” in Figure 3.2a. He also used information from the literature to
bound the design space, identifying potential combinations of input parameters that
would result in unfavorable results. Based on this information, the ChE developed
several first principles qualitative and quantitative models to help him understand the
process. These models can be seen along the upper solution path line in Figure 3.2a. and
they culminated in a single analytical model constructed of several FPMs shown in the
dotted hexagon in Figure 3.2a. He then used this analytical model to predict the input
parameter values for his first run, denoted by the square in Figure 3.2a., which signifies
model directed run. The expert performed seven more experimental runs during his
solution. He used runs 2-4 to qualitatively verify his model and to further develop it (as
illustrated by 5 new models). He fine-tuned his input parameter values in runs 5-8 and
then submitted the final recipe.
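One plausible reconstruction of the temperature fine-tuning relation implied by the
Arrhenius-based models in this solution (assuming the deposited thickness X scales with
the rate constant k(T) = k_0 e^{-E_A/RT} at fixed time and reactant concentration) is:

\[
\frac{X_2}{X_1} = \frac{k(T_2)}{k(T_1)}
= \exp\!\left[-\frac{E_A}{R}\left(\frac{1}{T_2}-\frac{1}{T_1}\right)\right]
\quad\Longrightarrow\quad
T_2 = \left[\frac{1}{T_1}-\frac{R}{E_A}\,\ln\frac{X_2}{X_1}\right]^{-1},
\]

that is, given a measured thickness X_1 at temperature T_1, the temperature T_2 expected
to yield a target thickness X_2 can be estimated directly.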
3.4.2. The Expert SED Solution
The expert ME immediately framed the Virtual CVD Project as an SED problem, and
searched 8 sources from the literature for suitable ranges of input parameter values (see
the 'Information Gathering' box). This information led to the development of a Taguchi
L18 experimental design model and a complementary signal-to-noise ratio model. The
oval shown in Figure 3.2b. shows these as the first models operationalized by the ME
and one of the team's only primary models. The Taguchi method was used to set
parameter values for 18 experimental runs in order to test the effects of six input
parameters at three levels (a full factorial design would require 729 runs). The Model
Map shows 13 additional 'secondary models,' not on the primary solution path line.
These models were used by the ME to attempt to understand interactions between and
relative effects of the input parameters. They were based on first principles but were
developed in a qualitative and fragmented fashion which revealed a lack of fundamental
domain expertise. For example, he had difficulty determining which of the many
equations identified in the literature review were applicable and how they might be
applied. Once the ME had designed his experiment using Taguchi methods and defined
the ranges of input parameters, he conducted all 18 experiments in the design without
developing further models (shown by the triangle and "1-18" in Figure 3.2b.). He then
took the best performing run and used the model generated by Taguchi methods to
improve the reactor's performance in two subsequent 'fine tuning' runs. The expert ME
voiced his desire to run another designed experiment but was constrained by time and
submitted his 'final recipe'.

Table 3.1. A key for the representations used in the Model Maps

Type of Model
- Circles represent Qualitative Models: relationships that do not rely on numbers, e.g.,
  "As pressure decreases, uniformity increases."
- Rectangles represent Quantitative Models, which allow teams to quantitatively relate
  numbers (typically in the form of equations), e.g., "Film Deposition Rate = Reaction
  Rate Constant x Concentration of Reactant."

Modeling Actions
- An Operationalized Model is one that is developed and then used throughout the
  solution process.
- Abandoned Models are developed and then clearly abandoned.
- A model is classified as Not Engaged if it is clearly displayed in the notebook but no
  evidence exists of its use.

Run Type (each with its respective run # or #'s)
- Model Directed Runs: a quantitative model is used by the group to determine the
  parameter values.
- Statistically Defined Runs: a statistical approach (e.g., DOE) is used to determine the
  parameter values.
- Qualitative Verification Runs: teams analyze the data provided by these runs to
  qualitatively verify models that they have developed.
- Runs Not Explicitly Related to Modeling: no explicit reason is given or deducible;
  often they represent a "guess and check" or a "fine tuning" run.

Other Map Components
- An X is placed over any Clearly Incorrect Model.
- A box is shown at the beginning of the map, signifying the Information Gathering
  stage. All sources listed in the notebook are displayed.
- Primary Models lie along the central problem line, are used repeatedly, and are
  essential to the overall solution.
- Secondary Models are connected to the central problem line and are peripheral to the
  overall solution.
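A minimal sketch of the signal-to-noise calculation that underlies such an analysis is
given below, assuming the nominal-the-best form of the ratio described by Phadke
(1989). The thickness values are illustrative and are not data from this study.

    import math

    def sn_nominal_the_best(values: list[float]) -> float:
        """Taguchi nominal-the-best signal-to-noise ratio,
        SN = 10*log10(mean^2 / variance); a higher SN indicates less
        variation relative to the mean response."""
        n = len(values)
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / (n - 1)
        return 10.0 * math.log10(mean ** 2 / var)

    # For each of the 18 rows of an L18 array, the SN ratio of the measured
    # film thicknesses is computed, and the factor levels that maximize the
    # average SN are selected (illustrative numbers only):
    print(round(sn_nominal_the_best([99.0, 101.5, 100.2, 98.7]), 2))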
3.5. Discussion
As predicted, each expert utilized the approach that corresponded to his expertise. As
compared to student teams who pursue FPM or SED approaches, the experts enacted
their solution approaches earlier, more fluently, and in a more sophisticated manner. For
example, no student teams have used a model to predict the parameters for their very first
run or have used Taguchi methods to design and analyze experiments. The solution
approaches employed by each expert had distinct advantages and disadvantages. The
FPM approach allowed the ChE to converge on a solution quickly (i.e. in a low number
of runs). The FPM approach also allowed the ChE to focus on the performance metric
that he thought was most important (film uniformity) and achieve high performance
regarding that metric. However, the FPM approach covered a smaller portion of the
design space and relied primarily on the expert ChE’s robust understanding of first
principles and his ability to apply them in an effective way. The SED approach pursued
by the ME facilitated a broader survey of the solution space and allowed the ME to solve
the problem with little technical domain knowledge. However, this solution approach
required many more experiments. The final recipe of the ChE had better performance
with regard to uniformity and development cost while the ME had favorable results
regarding reaction time and reactant utilization. These findings are summarized in Table
3.2.
[Figure 3.2. Model Maps showing the solution paths followed by the two experts:
(a) the expert chemical engineer's First Principles Modeling (FPM) approach;
(b) the expert mechanical engineer's Statistical Experimental Design (SED) approach.]
Both overall solution paths pursued by experts in this study relied heavily on surveying
the literature and developing models early in the solution process to facilitate an
understanding of the project. Both ended with ‘fine tuning’ runs. We argue that these are
universal solution strategies that may be transferred to many process development
problems. Accordingly, such front end ‘problem scoping’ skills and back end ‘tuning’
should be explicitly taught and modeled in engineering classrooms. Additionally,
students should understand the relative merits and drawbacks of the FPM and SED
approaches. The appropriate method to choose depends on both the expected behavior of
the process and the knowledge that the designer possesses. Although the experts in this
study rigidly chose one method, from experience observing student teams, we feel that a
hybrid FPM/SED approach is also a viable option for solving the problem. More
investigation is needed to examine the potentially fruitful integration of these two
approaches into one solution path.
Table 3.2. Comparison of the FPM and SED solution approaches and reactor performance.

First Principles Modeling (FPM)
- Advantages: minimizes the number of experimental runs; does not require DOE/Taguchi
  methods knowledge.
- Disadvantages: requires a robust understanding of first principles; relevant first
  principles must be identifiable.
- CVD Results: higher uniformity / lower experimental cost.

Statistical Experimental Design (SED)
- Advantages: requires minimal understanding of the underlying phenomena; can map a
  large area of the solution space.
- Disadvantages: experimentally costly; difficult to capture complex, nonlinear input
  interactions.
- CVD Results: lower reaction time / higher reactant utilization.
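The run-count contrast summarized above can be checked directly; the snippet below
enumerates a full three-level, six-factor factorial and compares it with the 18 rows of an
L18 orthogonal array.

    from itertools import product

    # Full factorial over six factors at three levels each: 3**6 combinations.
    full_factorial = list(product(range(3), repeat=6))
    print(len(full_factorial))  # 729 runs
    # A Taguchi L18 orthogonal array covers the same six factors in 18 runs.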
3.6. Acknowledgements
The authors are grateful to the two experts who participated in the study and for support
from the Intel Faculty Fellowship Program and the National Science Foundation’s CCLI
Program (DUE-0442832, DUE-0717905). Any opinions, findings, and conclusions or
recommendations expressed in this material are those of the authors and do not
necessarily reflect the views of the NSF.
4. CONCLUSION
The objective of this study is to contribute to the engineering education community and
to further develop best practices, specifically in situated learning environments. The
work focused on contributing in three ways: (i) informing instruction and assessment of
the VCVD task, (ii) informing instruction and assessment of other engineering design
tasks, and (iii) providing an assessment framework that may be used in other learning
systems.
4.1. Informing instruction and assessment of the VCVD task
First, through examination of an expert chemical engineer’s solution, this work provides
an initial set of competencies that can be used for instruction and assessment in the
VCVD learning system. In the first manuscript, we identified a detailed list of 13
competencies associated with modeling to solve the VCVD task. The second manuscript
reaffirmed general findings from the first manuscript and additionally examined the
variation in solution approaches employed by the expert from the first manuscript and an
expert mechanical engineer.
This information can directly inform both instruction and assessment in the VCVD
learning system. Specifically, the list of competencies demonstrated by the experts, and
knowledge of the two solution approaches can be used by instructors to guide teams
during the project. In this way, instructors can help students close the gap between
current performance and expert performance with confidence. Additionally, the list of
competencies can be used by instructors to quantitatively assess student solutions. Such
expert-based assessment adds objectivity, transparency, and justification to the
assessment process and reinforces the authenticity of the project.
The variability in expert solutions also informs instruction as it demonstrates that there is
no single correct solution path to the VCVD problem. Rather, the experts in the study
chose largely different approaches based on their prior knowledge, skills, and experience.
Understanding that there are many approaches affords a more dynamic and differentiated
approach to instruction, where individual student aptitudes can help determine the
feedback each student team receives.
This work also facilitates the use of the VCVD task at new institutions. In the past six
years, the VCVD task has been used in a variety of settings, from high schools to
graduate schools. However, the VCVD task is complicated, and it has previously been
difficult to communicate the traits of an 'ideal' student solution to instructors who desire
such information. This work represents progress in defining the traits associated with
high-quality solutions, in a manner that is clear, well-founded, and may be effectively
communicated to other institutions.
4.2. Informing instruction and assessment of other engineering design tasks
We conjecture that the competencies identified in this study may also be used to inform
instruction and assessment in other engineering design tasks given in engineering
curricula. The experts’ emphasis on information gathering and modeling to understand
the problem aligns with other studies that have found such traits to result in high-quality
solutions to engineering design problems (Atman, Chimka, Bursic & Nachtmann, 1999).
Additionally, there are many engineering design problems where empirical methods (e.g.
statistical experimental design) and first principles modeling may both be used to
accomplish similar goals. Examination of the expert solutions in this study provided
insight regarding the factors that contributed to expert engineers’ selection and
implementation of the two approaches. We also identified relative merits and drawbacks
of each approach in the context of process development.
4.3. Providing a transferable assessment framework
The first manuscript in this thesis outlined a framework for identifying target
competencies in order to inform instruction and assessment of student solutions to
real-world engineering tasks. This framework uses the study of expert solutions and the
components of task, evidence, and competencies as suggested by Evidence Centered
Design (Mislevy, Almond & Lukas, 2003). Our framework is different in that it inverts
the order in which these components are typically developed and, in doing so, places a
large emphasis on ensuring that the task is authentic. Our framework can be used by
educators to develop learning systems from the ground up, as it was in the case of the
VCVD learning system. Additionally, it may be implemented in a retroactive fashion to
define target competencies for industrially situated learning systems which have
previously been developed. In this role, the framework can be used to add objectivity,
transparency, and justification to the instruction and assessment practices already in
place.
The goal of this work is to advance the field of engineering education by outlining a path
for developing objective and defensible instruction and assessment tools for use in
industrially situated learning environments. By furthering instruction and assessment
practices in this field, our objective is to promote the widespread use of authentic tasks
situated in industrial practice to educate engineers.
Bibliography
Abdulwahed, M., & Nagy, Z. K. (2009). Applying Kolb’s experiential learning cycle for
laboratory education. Journal of Engineering Education, 98(3), 283–293.
Adams, R.S. (2001). Cognitive Processes in Iterative Design Behavior. Thesis. Seattle,
WA: University of Washington.
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., Norman, M. K., & Mayer,
R. E. (2010). How learning works: Seven research-based principles for smart
teaching. Jossey-Bass.
Atman, C. J., Adams, R. S., Cardella, M. E., Turns, J., Mosborg, S., & Saleem, J. (2007).
Engineering design processes: A comparison of students and expert practitioners.
Journal of Engineering Education, 96(4), 359.
Atman, C. J., Bursic, K. M., & Lozito, S. L. (1996). An application of protocol analysis
to the engineering design process. ASEE Annual Conference Proceedings.
Atman, C. J., Chimka, J. R., Bursic, K. M., & Nachtmann, H. L. (1999). A comparison of
freshman and senior engineering design processes. Design Studies, 20(2), 131–152.
Atman, C. J., Kilgore, D., & McKenna, A. (2008). Characterizing design learning: A
mixed-methods study of engineering designers’ use of language. Journal of
Engineering Education, 97(3), 309–326.
Bauer, M., Williamson, D., Mislevy, R., & Behrens, J. (2003). Using evidence-centered
design to develop advanced simulation-based assessment and training. World
Conference on E-Learning in Corp., Govt., Health., & Higher Ed (pp. 1495–1502).
Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn. National
Academy Press Washington, DC.
Brophy, D. R. (2006). A comparison of individual and group efforts to creatively solve
contrasting types of problems. Creativity Research Journal, 18(3), 293–315.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of
learning. Educational Researcher, 18(1), 32–42.
Buckley, B. C. (2000). Interactive multimedia and model-based learning in biology.
International Journal of Science Education, 22, 895–935.
Buckley, B. C., Gobert, J. D., Kindfield, A. C. H., Horwitz, P., Tinker, R. F., Gerlits, B.,
Wilensky, U., et al. (2004). Model-based teaching and learning with BioLogicaTM:
what do they learn? How do they learn? How do we know? Journal of Science
Education and Technology, 13(1), 23–41.
Buckley, B. C., Gobert, J. D., Horwitz, P., & O’Dwyer, L. M. (2010). Looking inside the
black box: assessing model-based learning and inquiry in BioLogicaTM. International
Journal of Learning Technology, 5(2), 166–190.
Campbell, J., Bourne, J., Mosterman, P., & Brodersen, A. (2002). The effectiveness of
learning simulations for electronic laboratories. Journal of Engineering Education,
91(1), 81–87.
Cardella, M. E. (2009). Mathematical modeling in engineering design projects. In Lesh,
R., Galbraith, P. L., & Haines, C. R. (2009). Modeling students’ mathematical
modeling competencies (pp. 87–98). Springer Verlag.
Chesler, N., D’Angelo, C., Arastoopour, G., and Shaffer, D.W. (2011). Use of
Professional Practice Simulation in a First-Year Introduction Engineering Course.
Paper presented at the American Society for Engineering Education Conference
(ASEE), Vancouver, BC.
Chiodo, J. J., & Flaim, M. L. (1993). The link between computer simulations and social
studies learning: Debriefing. The Social Studies, 84(3), 119–121.
Chung, G. K. W. K., Harmon, T. C., & Baker, E. L. (2001). The impact of a simulation-based learning design project on student learning. Education, IEEE Transactions on,
44(4), 390–398.
Clark, D., Nelson, B., Sengupta, P., & D’Angelo, C. (2009). Rethinking science learning
through digital games and simulations: Genres, examples, and evidence. Learning
science: Computer games, simulations, and education workshop sponsored by the
National Academy of Sciences, Washington, DC.
Clement, J. (1989) ‘Learning via model construction and criticism: protocol evidence on
sources of creativity in science’, in J.A. Glover, R.R. Ronning and C.R. Reynolds
(Eds.): Handbook of Creativity: Assessment, Theory and Research, pp.341–381,
Plenum Press, New York.
Collins, A. (2011). Situative View of Learning. Learning and Cognition, 64.
Corter, J. E., Nickerson, J. V., Esche, S. K., Chassapis, C., Im, S., & Ma, J. (2007).
Constructing reality: A study of remote, hands-on, and simulated laboratories. ACM
Transactions on Computer-Human Interaction (TOCHI), 14(2), 7.
Cross, N. (2004). Expertise in design: an overview. Design Studies, 25(5), 427–441.
Cross, N. (2003). The expertise of exceptional designers. Expertise in Design, Creativity
and Cognition Press, University of Technology, Sydney, Australia, 23–35.
Cross, N., & Cross, A. C. (1998). Expertise in engineering design. Research in
Engineering Design, 10(3), 141–149.
Davidson, D. (2009). From experiment gameplay to the wonderful world of Goo, and
how physics is your friend. Well Played 1.0 (pp. 160–176).
Delzell, J. E., Chumley, H., Webb, R., Chakrabarti, S., & Relan, A. (2009). Information-gathering patterns associated with higher rates of diagnostic error. Advances in health
sciences education, 14(5), 697–711.
Diefes-Dux, H. A., & Salim, A. (2009). Problem Formulation during Model-Eliciting
Activities: Characterization of First-Year Students’ Responses. Proceedings of the
Research in Engineering Education Symposium 2009, Palm Cove, QLD.
Dorneich, M. C., & Jones, P. M. (2001). The UIUC virtual spectrometer: A Java-based
collaborative learning environment. Journal of Engineering Education, 90(4), 713–
720.
Ekwaro-Osire, S., & Orono, P. O. (2007). Design notebooks as indicators of student
participation in team activities. Frontiers In Education Conference-Global
Engineering: Knowledge Without Borders, Opportunities Without Passports, 2007.
FIE’07. 37th Annual (p. S2D–18).
Ennis, C. W., & Gyeszly, S. W. (1991). Protocol analysis of the engineering systems
design process. Research in Engineering Design, 3(1), 15–22.
Ericsson, K. A., & Simon, H. A. (1996). Protocol Analysis: Verbal Reports as Data
(revised edition). Cambridge, MA: MIT press.
Finkelstein, N., Adams, W., Keller, C., Kohl, P., Perkins, K., Podolefsky, N., Reid, S., et
al. (2005). When learning about the real world is better done virtually: A study of
substituting computer simulations for laboratory equipment. Physical Review Special
Topics-Physics Education Research, 1(1), 010103.
Fortus, D., Krajcik, J., Dershimer, R. C., Marx, R., & Mamlok-Naaman, R. (2005).
Design-based science and real-world problem-solving. International journal of science
education, 27(7), 855–880.
Gainsburg, J. (2006). The mathematical modeling of structural engineers. Mathematical
Thinking and Learning, 8(1), 3.
Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A five-dimensional
framework for authentic assessment. Educational Technology Research and
Development, 52(3), 67–86.
Gredler, M. E. (2004). Games and simulations and their relationships to learning.
Handbook of research on educational communications and technology, 2, 571–581.
Hannafin, M. J., & Land, S. M. (1997). The foundations and assumptions of technology-enhanced student-centered learning environments. Instructional Science, 25(3), 167–
202.
Harmon, T. C., Burks, G. A., Giron, J. J., Wong, W., Chung, G. K. W. K., & Baker, E.
(2002). An interactive database supporting virtual fieldwork in an environmental
engineering design project, Journal of Engineering Education, 91(2), 167–176.
Herrington, J., & Oliver, R. (2000). An instructional design framework for authentic
learning environments. Educational technology research and development, 48(3), 23–
48.
Hatano, G., & Inagaki, K. (1984). Two courses of expertise. Research and Clinical
Center For Child Development Annual Report, 6, 27–36.
Hmelo-Silver, C. E. (2003). Analyzing collaborative knowledge construction: multiple
methods for integrated understanding. Computers & Education, 41(4), 397–420.
Hmelo-Silver, C. E., Nagarajan, A., & Day, R. S. (2002). “It’s harder than we thought it
would be”: A comparative case study of expert–novice experimentation strategies.
Science Education, 86(2), 219–243.
Hodge, H., Hinton, H. S., & Lightner, M. (2001). Virtual circuit laboratory. Journal of
Engineering Education, 90(4), 507–511.
Horwitz, P., Neumann, E., & Schwartz, J. (1996). Teaching Science at multiple levels:
the GenScope Program. Communications of the ACM 39, 100.
Honey, M., & Hilton, M. (2011). Learning science: computer games, simulations, and
education. National Academies Press.
Jacobson, M. J. (1991). Knowledge acquisition, cognitive flexibility, and the instructional
applications of hypertext: A comparison of contrasting designs for computer-enhanced
learning environments.
Jayakumar, S., Squires, R.G., Reklaitis, G.V., Andersen, P.K., & Dietrich, B.K. (1995).
The Purdue-Dow Styrene Butadiene Polymerization Simulation. Journal of
Engineering Education. 84:271-277.
Jain, V. K., & Sobek, D. K. (2006). Linking design process to customer satisfaction
through virtual design of experiments. Research in Engineering Design, 17(2), 59–71.
Johri, A., & Olds, B. M. (2011). Situated engineering learning: Bridging engineering
education research and the learning sciences. Journal of Engineering Education,
100(1), 151–185.
Johnson, S. D. (1988). Cognitive analysis of expert and novice troubleshooting
performance. Performance Improvement Quarterly, 1(3), 38–54.
Kirschenbaum, S. S. (1992). Influence of experience on information-gathering strategies.
Journal of Applied Psychology, 77(3), 343.
Koretsky, M. D., Kelly, C. & Gummer, E. (2011). Student Perceptions of Learning in the
Laboratory: Comparison of Industrially Situated Virtual Laboratories to Capstone
Physical Laboratories. Journal of Engineering Education, 100(3), 540–573.
Koretsky, M. D., Amatore, D., Barnes, C., & Kimura, S. (2008). Enhancement of student
learning in experimental design using a virtual laboratory. Education, IEEE
Transactions on, 51(1), 76–85.
Koretsky, M. D., Barnes, C., Amatore, D., & Kimura, S. (2006). Experiential learning of
design of experiments using a virtual CVD reactor. American Society of Engineering
Education Conference Proceedings.
Kuriyan, K., Muench, W., & Reklaitis, G.V. (2001). Air Products Hydrogen Liquifaction
Project: Building a Web-Based Simulation of an Industrial Process. Computer
Applications in Engineering Education. 9:180.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation.
Cambridge: Cambridge University Press.
Lesh, R. A., & Doerr, H. M. (2003). Beyond constructivism: Models and modeling
perspectives on mathematics problem solving, learning, and teaching. Lawrence
Erlbaum.
Lesh, R., & Harel, G. (2003). Problem solving, modeling, and local conceptual
development. Mathematical Thinking and Learning, 5(2-3), 157–189.
Litzinger, T., Lattuca, L. R., Hadgraft, R., & Newstetter, W. (2011). Engineering
education and the development of expertise. Journal of Engineering Education,
100(1), 123–150.
Lindsay, E. D., & Good, M. C. (2005). Effects of laboratory access modes upon learning
outcomes. IEEE Transactions on Education, 48(4), 619–631.
Maaß, K. (2006). What are modeling competencies? ZDM, 38(2), 113–142.
McKenna, A. F., Colgate, J. E., Olson, G. B., & Carr, S. H. (2001). Exploring adaptive
Expertise as a target for engineering design education. Proceedings of the
International Design Engineering Technical Conferences and Computers and
Information in Engineering Conference (pp. 1–6).
Mislevy, R. J., Almond, R. G., & Lukas, J. F. (2003). A brief introduction to evidence-centered design. Research Report RR-03-16, Educational Testing Service, Princeton, NJ.
Montgomery, D. C. (2010). Design and analysis of experiments (7th ed.). John Wiley &
Sons Inc.
Mosterman, P. J., Dorlandt, M. A. M., Campbell, J. O., Burow, C., Bouw, R., Brodersen,
A. J., & Bourne, J. (1994). Virtual engineering laboratories: Design and experiments.
Journal of Engineering Education, 83(3), 279–285.
Nefcy, E. J., Gummer, E. & Koretsky, M. D. (2012). Characterization of Student
Modeling in an Industrially Situated Virtual Laboratory. To be published in 2012
American Society of Engineering Education Conference Proceedings.
Podolefsky, N. (2010). Research on Games and Simulations in Education: Among Great
Diversity, The Journal of Science Education and Technology Stands Out. Journal of
Science Education and Technology, 1–2.
Pandy, M. G., Petrosino, A. J., Austin, B. A., & Barr, R. E. (2004). Assessing adaptive
expertise in undergraduate biomechanics. Journal of Engineering Education, 93, 211–222.
Phadke, M. S. (1989). Quality engineering using robust design. Prentice Hall, New
Jersey.
Prince, M. J., & Felder, R. M. (2006). Inductive teaching and learning methods:
Definitions, comparisons, and research bases. Journal of Engineering Education,
95(2), 123.
Pyatt, K., & Sims, R. (2007). Learner performance and attitudes in traditional versus
simulated laboratory experiences. In ICT: Providing choices for learners and learning.
Proceedings ascilite, Singapore 2007.
Quellmalz, E., Timms, M. J., & Schneider, S. (2009). Assessment of student learning in
science simulations and games. Retrieved from the National Academies website:
http://www7.nationalacademies.org/bose/Schneider_Gaming_CommissionedPaper.pdf
Reimann, P., & Chi, M.T.H. (1989). Human expertise. In K.J. Gilhooly (Ed.), Human
and machine problem solving (pp. 161-191). New York: Plenum.
Robinson, F. E. (2011). The Role of Deliberate Behavior in Expert Performance: The
Acquisition of Information Gathering Strategy in the Context of Emergency Medicine.
Wright State University.
Restrepo, J., & Christiaans, H. (2004). Problem structuring and information access in
design. Journal of Design Research, 4(2), 1551–1569.
Rupp, A. A., Gushta, M., Mislevy, R. J., & Shaffer, D. W. (2010). Evidence-centered
design of epistemic games: Measurement principles for complex learning
environments. Journal Of Technology Learning And Assessment, 8(4), 1–45.
Schaller, D. T., Goldman, K. H., Spickelmier, G., Allison-Bunnell, S., & Koepfler, J.
(2009). Learning in the wild: What Wolfquest taught developers and game players.
Museums and the web.
Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L. O., Acher, A., Fortus, D.,
Shwartz, Y., Hug, B., & Krajcik, J. (2009). Developing a learning progression of
scientific modeling: Making scientific modeling accessible and meaningful for
learners. Journal of Research in Science Teaching. 46:6, 632-654.
Sehati, S. (2000). Re-engineering the practical laboratory session. International journal
of electrical engineering education, 37(1), 86–94.
Seniow, K., Nefcy, E., Kelly, C., & Koretsky M. (2010). Representations of Student
Model Development in Virtual Laboratories based on a Cognitive Apprenticeship
Instructional Design. American Society of Engineering Education Conference
Proceedings.
Shaffer, D. W. (2005). Epistemic games. Innovate. 1(6).
Shaffer, D. W., Squire, K. A., Halverson, R., & Gee, J. P. (2005). Video games and the
future of learning. Phi Delta Kappan, 87(2), 104-111.
Shute, V. J., & Kim, Y-J. (in press). Does playing the World of Goo facilitate learning?
In D. Y. Dai (Ed.), Design research on learning and thinking in educational settings:
Enhancing intellectual growth and functioning. New York, NY: Routledge Books.
Shin, D., Yoon, E. S., Park, S. J., & Lee, E. S. (2000). Web-based interactive virtual
laboratory system for unit operations and process systems engineering education.
Computers & Chemical Engineering, 24(2-7), 1381–1385.
Sim, S. H., Spencer, Jr., B. F., & Lee, G. C. (2009). Virtual laboratory for experimental
structural dynamics. Computer Applications in Engineering Education, 17(1): 80–88.
Smith, R., & Leong, A. (1998). An observational study of design team process: A
comparison of student and professional engineers. Journal of Mechanical Design, 120,
636.
Sobek, D. K. (2002). Use of journals to evaluate student design processes. Proceedings of
the American Society for Engineering Education Annual Conference & Exposition.
Teasdale, D., Senzaki, Y., Herring, R., Hoeye, G., Page, L., & Schubert, P. (2001).
LPCVD of silicon nitride from dichlorosilane and ammonia by single wafer rapid
thermal processing. Electrochemical and Solid-State Letters, 4, F11.
Todd, R. H., Sorensen, C. D., & Magleby, S. P. (1993). Designing a senior capstone
course to satisfy industrial customers. Journal of Engineering Education, 82(2), 92–
100.
Tsovaltzi, D., Rummel, N., McLaren, B. M., Pinkwart, N., Scheuer, O., Harrer, A., &
Braun, I. (2010). Extending a virtual chemistry laboratory with a collaboration script
to promote conceptual learning. International Journal of Technology Enhanced
Learning, 2(1), 91–110.
Ullman, D. G. (2009). The mechanical design process (Vol. 4). McGraw-Hill New York.
Ullman, D. G., Dietterich, T. G., & Stauffer, L. A. (1988). A model of the mechanical
design process based on empirical data. Artificial Intelligence for Engineering, Design,
Analysis and Manufacturing, 2(01), 33–52.
van Joolingen, W. R., & De Jong, T. (2003). SIMQUEST: Authoring Educational
Simulations. In Murray, T., Blessing, S., & Ainsworth, S. Authoring Tools for
Advanced Technology Learning Environments: Toward cost-effective adaptive,
interactive, and intelligent educational software (Ch. 1). Springer.
Vogel, J. J., Vogel, D. S., Cannon-Bowers, J. A. N., Bowers, C. A., Muse, K., & Wright,
M. (2006). Computer gaming and interactive simulations for learning: A meta-analysis. Journal of Educational Computing Research, 34(3), 229–243.
Wieman, C. E., Adams, W. K., & Perkins, K. K. (2008). PhET: Simulations that enhance
learning. Science, 322(5902), 682–683.
Wiesner, T. F., & Lan, W. (2004). Comparison of student learning in physical and
simulated unit operations experiments. Journal of Engineering Education, 93(3), 195–
204.
Wiggins, G. P., & McTighe, J. (2005). Understanding by design. Association for
Supervision & Curriculum Development.
Wiley, J., & Jolly, C. (2003). When two heads are better than one expert. 25th Annual
Meeting of the Cognitive Science Society.
Woodfield, B., Andrus, M., Waddoups, G.L., Moore, M.S., Swan, R., Allen, R., Bodily,
G., Andersen, T., Miller, J., Simmons, B., Stanger, R. (2005). The virtual ChemLab
project: A realistic and sophisticated simulation of organic synthesis and organic
qualitative analysis. Journal of Chemical Education, 82(11), 1728–1735.
Wolf, S., & Tauber, R. N. (1986). Silicon Processing for the VLSI Era,vol. 1, Process
Technology. Lattice Press.
Yildirim, T. P., Shuman, L. J. & Besterfield-Sacre, M. (2010). Model eliciting activities:
assessing engineering student problem solving and skill integration processes.
International Journal of Engineering Education, 26(4), 831–845.
Zacharia, Z. C. (2007). Comparing and combining real and virtual experimentation: An
effort to enhance students’ conceptual understanding of electric circuits. Journal of
Computer Assisted Learning, 23(2), 120–132.