Chen, Y.-L., Hong, Y.-R., Sung, Y.-T., & Chang, K.-E. (2011). Efficacy of Simulation-Based Learning of Electronics Using
Visualization and Manipulation. Educational Technology & Society, 14 (2), 269–277.
Efficacy of Simulation-Based Learning of Electronics Using Visualization and
Manipulation
Yu-Lung Chen, Yu-Ru Hong, Yao-Ting Sung¹ and Kuo-En Chang*
Graduate Institute of Information and Computer Education, National Taiwan Normal University, Taipei, Taiwan //
¹Department of Educational Psychology and Counseling, National Taiwan Normal University, Taipei, Taiwan //
ylchen@ice.ntnu.edu.tw // carenna@ice.ntnu.edu.tw // sungtc@ntnu.edu.tw // kchang@ntnu.edu.tw
*Corresponding author
ABSTRACT
Software for simulation-based learning of electronics was implemented to help learners understand complex and
abstract concepts through observing external representations and exploring concept models. The software
comprises modules for visualization and simulative manipulation. Differences in learning performance of using
the learning software either with or without the simulative manipulation module were investigated in 49 college
sophomores. The learning performance was higher for the learning software that combined simulative manipulation and
visualization than for the version lacking simulative manipulation, which suggests that learning performance can
be enhanced if visualized learning appropriately integrates simulative manipulation activities. An analysis of
the learning process revealed that the use of simulative manipulation activities to verify and clarify existing
knowledge is crucial to improving learning performance.
Keywords
Simulations, Interactive learning environments, Applications in electronics
Introduction
Visualized simulation provides external representations of complex and abstract concepts, and helps learners to
understand their nature. Gordin and Pea (1995) and Ainsworth (2006) found that visualized simulation helps learners
to achieve a higher level of cognition by facilitating their interactions with multiple external representations and
reflection on phenomena, as observed when learning a given abstract concept. Moreover, visualized learning also
motivates learners and helps them to transfer concepts into long-term memory (Colaso et al., 2002; Naps et al.,
2003).
The many studies on computer-based visualized simulation learning have covered almost all subjects of science
education (Jensen et al., 2002; Khoo & Koh, 1998; Luo et al., 2005). However, studies on how computer
visualization improves learning performance have produced diverse results, with some of them revealing positive
influences of computer visualization on learning (Catalano & Tonso, 1996; Colaso et al., 2002; Jensen et al., 2002;
Luo et al., 2005; Meyer & Krzyzkowski, 1994; Naps et al., 2003; Wallace & Mutooni, 1997), with others finding
that visualization does not enhance learning performance (Reamon & Sheppard, 1997; Regan & Sheppard, 1996).
Many researchers have attempted to understand why visualization does not always have a positive impact on learning, and
our review of these research findings revealed that learning performance can be enhanced if a visualized learning
environment promotes learner interactions and gives them opportunities for manipulation (Colaso et al., 2002; Jensen
et al., 2002; Korhonen & Malmi, 2000; Naps et al., 2003; Tversky et al., 2002). These findings agree with learning
theory, which holds that interactive learning activities increase motivation and that enhanced motivation improves
learning outcomes (Prensky, 2002). Computer simulation allows learners to
readily manipulate parameters and observe the resulting changes in a given phenomenon, which helps the process of
higher-level reasoning (Gallagher, 1987).
A common problem faced by learners of electronics is being unable to fully understand the abstract concepts that
underlie the system responses predicted by theoretical models. This often results in learners being unable to see the
link between models and actual circuits (Ronen & Eliahu, 2000). Learners frequently cannot understand the abstract
concepts underlying the microscopic world of electrical circuits since they cannot directly observe the flow of electric
current, and some teachers are also unable to clearly explain the details of these complex concepts. Moreover,
learners may feel frustrated and discouraged if there is no real-time feedback about a learning problem when they are
attempting to understand complicated abstract concepts (Oakley, 1996). Ronen and Eliahu (2000) believed that
simulation has great potential as a supplementary tool, with simulation as a medium helping learners to repair
missing links between theories and actual electronics processes.
Interactive simulation learning is thus an area worthy of study; however, previous studies often fail to provide the
information necessary to determine whether interactive simulation is indeed helpful (Vogel et al., 2006). For this study we implemented
electronics simulation learning software aimed at helping learners to understand the nature of the abstract electronics
concepts. The software contains two modules: (1) the concept visualization module, which provides a visualized
representation with corresponding narrations and descriptions of texts and symbols to help learners understand the
complicated abstract concepts, and (2) the simulative manipulation module, which guides learners to observe
changes resulting from manipulations of relevant parameters, thereby facilitating cognitive reflection and integration
of the relevant abstract concepts. This study investigated the effectiveness of the learning activities of visualization
and manipulation, and also whether the learning results of using the learning software differed depending on whether
or not it contained the simulative manipulation module.
Simulation-based Learning Activities
The simulation-based learning model constructed in this study contains three phases: concept learning, simulative
manipulation, and concept clarification (see Figure 1). In the concept-learning phase, the learner works through the
visualized representation with corresponding narrations and relevant texts and symbols, which help the learner
understand complicated abstract concepts and facilitate reflection through the learning-reflection path. After the
basic concepts are understood, the learner then manipulates parameters and observes the corresponding changes in the
simulated output in order to iteratively reflect on the concepts and understand them in more detail through the
manipulation-reflection path. If the learner finds that learned concepts conflict with each other during
manipulation, he/she can return to the concept visualization module to further clarify them through the learning-manipulation-reflection path.
Figure 1: The three phases of the simulation-based learning model
Software for simulation-based learning of electronics was implemented to help learners understand complex and
abstract concepts through the simulation-based learning model. The software comprises modules for visualization
and simulative manipulation in the “half-wave rectifier”, “half-wave rectifier with filter”, “full-wave rectifier”, and
Zener diode” units. A “half-wave rectifier” is a diode circuit that converts AC to pulsating DC by simply allowing
only half of each AC cycle to pass; in contrast, a “full-wave rectifier” is a diode circuit that converts AC to
pulsating DC by passing the entire AC cycle. A “Zener diode” is a type of diode that permits current not only in the
forward direction, but also in the reverse direction if the applied voltage is larger than the breakdown voltage.
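To make these behaviours concrete, the following is a minimal numerical sketch (not part of the learning software described here) of idealized half-wave rectification, full-wave rectification, and Zener clamping; the 0.7 V forward drop and the 5.1 V breakdown voltage are assumed illustration values.

    import numpy as np

    # Idealized behaviour over one cycle of a 60 Hz, 30 V sine wave.
    t = np.linspace(0, 1 / 60, 1000)          # one AC period (seconds)
    v_in = 30 * np.sin(2 * np.pi * 60 * t)    # AC input voltage (V)

    # Rectifiers (the diode forward drop is ignored for clarity):
    v_half = np.maximum(v_in, 0)              # half-wave: only positive half-cycles pass
    v_full = np.abs(v_in)                     # full-wave: both half-cycles folded positive

    # Voltage at a node protected by an idealized Zener clamp (series resistor assumed):
    # forward conduction limits negative swings to about -0.7 V, and reverse breakdown
    # limits positive swings to the assumed 5.1 V breakdown voltage.
    v_zener = np.clip(v_in, -0.7, 5.1)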
Concept learning
When conducting concept-learning activities, a learner clicks on the concept visualization module that divides the
learning unit into several topics, such as “half-wave rectifier circuit”, “theories of the half-wave rectifier circuit”,
“the positive cycle when power is on”, “after reaching the peak value of the positive cycle, the input voltage is less
than the output voltage”, and “when the voltage input is again greater than the output voltage”. For example, the
learning screen shown in Figure 2 is displayed when a learner clicks on the topic of “theories of the half-wave
rectifier circuit”. In the concept-learning activity, the upper-left and lower-left corners of the screen show the
waveforms of the input voltage and the output voltage, respectively. At the same time, the right side of the screen
shows the circuit, allowing the learner to observe how it is affected by the input voltage and the changes in electric current, thus imparting a
gradual understanding of the principles of operation of the half-wave rectifier circuit. This process is facilitated by
narrations that accompany the visual demonstration. During the process, the learner can choose to pause or repeat the
material, relearn it, or go to the next topic at his/her own speed.
Figure 2: Concept learning: the waveforms of the input and output voltages are shown on the screen
Simulative manipulation
Based on the learned concepts, a learner observes the changes of the outputs resulting from manipulating parameters
in a simulative context in order to verify whether he/she has correctly understood the concepts. The simulative
manipulation module asks a series of questions that guide the learner to manipulate the parameters and observe the
results. The question shown at the top of the example shown in Figure 3 is as follows: “For a sine wave voltage
amplitude of VP = 30 V, a frequency of f = 60 Hz (Hz is the unit of frequency), and a load resistance of R = 10 kΩ (Ω is
the unit of resistance, 1 kΩ = 1000 Ω), please adjust the capacitance to 20 μF (μF is a unit of capacitance, 1 μF = 10⁻⁶ F)
and observe the amplitude of the ripple voltage on the oscilloscope”. The question prompts the learner to adjust the
value of the capacitance, set the values of the circuit components, and connect the oscilloscope to the two ends of
resistance R to observe whether the value of the ripple voltage is as expected. The result shows that when the
capacitance was 20 μF, the ripple voltage was about 2.5 V.
Figure 3: Simulative manipulation: a question prompts the learner to adjust the values of circuit components and
observe whether the value of the ripple voltage is as expected
The simulative manipulation module guides learners to conduct manipulations, and each set of questions guides
learners toward understanding a complicated abstract concept. Each set typically comprises two to four questions. For
example, for the “half-wave rectifier circuit” unit, the topic addressed by a set of four questions was the inverse
relationship between capacitance and ripple voltage. As shown in Figure 3, a learner conducts manipulations under the
guidance given in question 1, and when the capacitance was 20 μF, the ripple voltage was 2.5 V. Question 2 prompts
the learner to observe the value of the ripple voltage when all conditions remain the same except for the capacitance
decreasing to 10 μF. After the manipulation, the enlarged display of the oscilloscope showed that the ripple voltage
had increased to about 5 V. Question 3, “what is the value of capacitance that will produce the minimum ripple voltage
amplitude?”, then guides the learner to reflect on the results from questions 1 and 2. For example, based on the
experimental results from questions 1 and 2, a learner may continuously adjust the capacitance and observe the ripple
voltage, finding that the ripple voltage changes to 3.2 V when the capacitance is set to 17 μF but increases to 4.2 V
when the capacitance drops to 13 μF. Through this continuous manipulation of the capacitance and observation of the
voltage waveforms, the learner gradually discovers how changes in the capacitance affect the ripple voltage, as
targeted by question 4: “please adjust the capacitance, observe the changes in waveforms, and decide which of the
following descriptions regarding the relationship between the capacitance and ripple voltage is correct”.
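The inverse relationship targeted by this question set follows from the standard first-order ripple approximation for a rectifier with a filter capacitor, Vr ≈ Vp / (f·R·C). The short sketch below (an illustration, not part of the learning software) evaluates this approximation for the capacitance values used in the questions; it reproduces the 2.5 V and 5 V readings exactly, while the intermediate values reported above (3.2 V and 4.2 V) differ slightly, presumably because the simulator uses a more detailed circuit model.

    # First-order ripple estimate for a half-wave rectifier with a filter capacitor:
    # Vr ≈ Vp / (f * R * C), with Vp = 30 V, f = 60 Hz, and R = 10 kΩ as in question 1.
    def ripple_voltage(c_farads, vp=30.0, f=60.0, r=10e3):
        return vp / (f * r * c_farads)

    for c in (20e-6, 17e-6, 13e-6, 10e-6):    # capacitance values in farads
        print(f"C = {c * 1e6:.0f} uF -> ripple = {ripple_voltage(c):.2f} V")
    # Prints 2.50 V for 20 uF and 5.00 V for 10 uF: the larger the capacitance,
    # the smaller the ripple voltage.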
Concept clarification
Based on the guidance provided by the questions, a learner manipulates parameters in order to understand the
relationships between parameters and outputs, and cross-examines them with the concepts learned during concept
learning. If the manipulation outputs match the learned concepts, the learner can draw his/her own conclusions based
on the simulative results. However, if the outputs conflict with the learned concepts, the learner will return to the
concept learning phase and review the relevant topics in order to resolve the conflict. For example, if the learner
originally assumes that capacitance is directly proportional to the ripple voltage, but the simulative results suggest
otherwise, he/she can return to the concept visualization module and review the descriptions of the function of
capacitance in a rectifier circuit (Figure 4). When the learner clicks on the finger icon next to the capacitance in
Figure 4, the system explains the functions of the capacitance in order to help him/her reflect on how the value of
the capacitance influences the ripple voltage.
Figure 4: Concept clarification: review the descriptions of the function of capacitance in a rectifier circuit
Experiments
In this study, we investigated differences in the learning performance when using electronics simulation learning
software that contained either only the concept visualization module (providing the learning-reflection path only) or
both the concept visualization and simulative manipulation modules (providing the learning-reflection, manipulation-reflection, and learning-manipulation-reflection paths), in order to determine the efficacy of including a
manipulation mechanism in the learning software. The learners in the experimental group used software that
contained both modules in order to perform the following three activities: concept learning, simulative manipulation,
and concept clarification; whereas those in the control group only performed concept learning activities through the
concept visualization module.
Subjects
The research subjects were 49 sophomore students in Taipei (37 males and 12 females), all of whom had learned
about diodes in an electronics course provided before this study. They were randomly divided into the control group
(16 males and 7 females) and the experimental group (21 males and 5 females).
Experimental design
We adopted a quasi-experimental design for the study, in which the independent variables were the groups
(experimental and control groups) and the test phases (pre- and posttests). The dependent variables were the posttest
results of the four units on electronics topics related to diode circuits: half-wave rectifier, half-wave rectifier with
filter, full-wave rectifier, and Zener diode. In order to avoid experimental errors due to the use of different
instructional methods and learning materials, the same instructor and materials were used for both the experimental
and control groups. The mid-term electronics examination scores of the participants were used as the covariate to
eliminate the influence of prior knowledge of electronics on the learning results. The mid-term examination covered
the four units related to diode circuits. Pearson's correlation coefficient between the mid-term examination and
pretest scores was 0.63 (p = .006) in the experimental group and 0.61 (p = .008) in the control group, indicating a
strong positive correlation between the mid-term examination and pretest scores in both groups.
Tools
The tools used in the experiments were (1) electronics simulation learning software as described above and (2) pre- and posttests constructed based on the electronics material that the participants studied. The test contents were based
on college-level electronics courses, and the tests were examined and amended by two experienced instructors in this
subject area. Seventy-two college students participated in the pilot study, in which 36 copies of pre- and posttests
were randomly given out. The Kuder-Richardson reliabilities of the pre- and posttest were .78 and .69, respectively.
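For reference, the Kuder-Richardson 20 coefficient reported above is defined for dichotomously scored items as KR-20 = (k / (k − 1)) · (1 − Σ p_i q_i / σ²), where k is the number of items, p_i is the proportion of respondents answering item i correctly, q_i = 1 − p_i, and σ² is the variance of total scores. A minimal sketch follows; the item-level pilot data are not published, so the score matrix here is a hypothetical stand-in.

    import numpy as np

    def kr20(items):
        # KR-20 reliability for a (respondents x items) matrix of 0/1 scores.
        k = items.shape[1]                          # number of items
        p = items.mean(axis=0)                      # proportion correct per item
        q = 1 - p
        total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
        return (k / (k - 1)) * (1 - (p * q).sum() / total_var)

    # Hypothetical example: 36 respondents answering 25 dichotomous items.
    rng = np.random.default_rng(0)
    scores = (rng.random((36, 25)) < 0.7).astype(int)
    print(round(kr20(scores), 2))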
Procedures
All learners underwent a 30-minute-long pretest prior to the commencement of the experiment, with the
experimental treatment beginning in the following week. The simulation learning activities lasted 3 weeks. The
“half-wave rectifier” and “half-wave rectifier with filter” units were given in the first week, in which both groups
underwent a 20-minute system introduction before undergoing 60-minute-long learning activities. The “full-wave
rectifier” unit was given in the second week, in which both groups underwent a 30-minute-long learning activity. The
“Zener diode” unit was given in the third week, and both groups again underwent a 30-minute-long learning activity.
After the learning activities ended, both groups received a 30-minute-long posttest and a 15-minute-long
questionnaire survey.
Results
We used a two-way mixed ANCOVA to evaluate the learning performance in both groups and compare the
differences between them. After eliminating the influence of prior knowledge on the learning performance of
learners, we analyzed whether there were significant differences between (1) the posttest scores in the experimental
and control groups and (2) the pre- and posttest scores in each of the experimental and control groups. Table 1
summarizes the pre- and posttest scores in the two groups.
Table 1: Mean and standard deviation (SD) values of pre- and posttest scores in the two groups

Group          N    Pretest Mean   Pretest SD   Posttest Mean   Posttest SD
Experimental   26   74.90          18.55        84.82           10.87
Control        23   71.17          17.37        71.17           19.94
Tests of the homogeneity of the regression coefficients revealed that the interaction between the independent
variables and the covariate was not significant (F(2,45) = .68, p > .05), confirming the assumption of homogeneity of
the regression coefficients.
The mid-term examination scores were used as the covariate to test the significance of differences in the changes from
pretest to posttest in the experimental and control groups. Table 2 indicates that there was an interaction between
the pre-/posttest and group factors (F(1,46) = 4.90, p < .05), so a test of simple main effects was conducted.
Table 3 lists the results of a simple main effect test on the factors of group and pre-/posttest. There was no
significant difference between the experimental and control groups at the pretest (F(1,94) = .593, p > .05), whereas the
posttest scores were significantly higher in the experimental group than in the control group (F(1,94) = 7.933, p <
.05). This indicates that the learning performance was significantly better in the experimental group than in the
control group. Moreover, there were no significant differences between the pre- and posttest scores in the control
group (F(1,47)=.000, p > .05), whereas posttest scores were significantly higher than pretest scores in the
experimental group (F(1, 47) = 10.620, p < .05). This indicates that the experimental group improved significantly
whereas the control group did not.
Table 2: Summary data from ANCOVA

Source of variation                    SS          df   MS        F        p
Between subjects
  Group                                1920.70     1    1920.70   5.616*   .022
  Error                                15733.38    46   342.03
Within subjects
  Pre-/posttest                        678.670     1    678.67    5.57*    .023
  Interaction (pre-/posttest × group)  595.993     1    595.99    4.90*    .032
  Error                                5606.259    46   121.88
*p < .05
Table 3: Summary of simple main effects in pre- and posttests in the two groups

Source of variation    SS          df   MS         F         p
Pre-/posttest
  Experimental group   1279.033    1    1279.03    10.620*   .002
  Control group        .000        1    .000       .000      .992
  Error                5660.027    47   120.426
Group
  Pretest              169.952     1    169.952    .593      .443
  Posttest             2274.150    1    2274.150   7.933*    .005
  Error                26945.913   94   286.659
*p < .05
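For readers who wish to reproduce this style of analysis, the sketch below runs a one-way ANCOVA on the posttest scores with the mid-term score as the covariate, a simplification of the two-way mixed ANCOVA used here (it omits the pre-/posttest within-subject factor). The scores are synthetic placeholders, since the raw data are not published.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Synthetic stand-in data: 26 experimental and 23 control learners,
    # each with a mid-term score (covariate) and a posttest score.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "group": ["experimental"] * 26 + ["control"] * 23,
        "midterm": rng.normal(70, 15, 49),
        "posttest": rng.normal(78, 15, 49),
    })

    # ANCOVA: group effect on the posttest, adjusted for the mid-term covariate.
    model = smf.ols("posttest ~ C(group) + midterm", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))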
Discussion
It is commonly believed that visualization technologies have positive impacts on learning. However, the educational
benefits of such technologies would be impaired if they do not help learners become active in the learning process
(Naps et al., 2003), and many studies suggested that increasing interactions and opportunities for manipulation
improves learning (Colaso et al., 2002; Jensen et al., 2002; Korhonen & Malmi, 2000; Naps et al., 2003; Tversky et
al., 2002). Therefore, we incorporated interactive operations in computer-based visualized simulations and
constructed simulation-based learning models aimed at helping learners to conduct concept explorations and
verifications through concept learning, simulative manipulation, and concept clarification activities. Simulation-based learning activities help learners to acquire knowledge through the process of observation, exploration,
experiencing, and reflection.
The posttest scores in our empirical study were significantly higher in the experimental group than in the control
group. Moreover, the posttest scores were significantly higher than the pretest scores in the experimental group, but
they did not differ in the control group. These results indicated that the learning performance was higher when
visualization was integrated with manipulation than when visualization was used alone, and that performance was not
improved when visualization lacked an interactive manipulation mechanism.
A detailed analysis of the study revealed that learners whose grades improved in the experimental group often
returned to previous topics to review concepts after completing the explorations in manipulation activities in order to
clarify relevant concepts and revise previous concept models, and that this behavior was absent in those who showed
no improvements. In contrast, there was no such difference between those who improved and those who did not
improve in the control group. Although the learners in the control group also reviewed relevant concepts, they lacked
a concept clarification process and did not gain a better understanding of the concepts or construct a comprehensive
knowledge structure. This resulted in most of them not improving, and also performing worse than the subjects in the
experimental group. A limitation of this study is that it focused on a qualitative exploration of the learning
activities; detailed research is needed to analyze learning behavioral patterns in such learning environments from a
quantitative perspective.
Doulai (2001) stated that the use of computer simulation software improves the motivation of learners, with the
resulting exploration activities improving their learning performance. Matching this idea with the questionnaire
findings, we found that 66% of the learners in the experimental group stated that the manipulations helped to
increase their interest in electronics, with 77% of them stating that manipulation benefited their learning. In contrast,
only 30% of those in the control group stated that concept visualization motivated them, with only 48% stating that it
benefited their learning.
Conclusions
Based on the electronics topic of diode circuits, learning models that included concept learning, simulative
manipulation, and concept clarification phases were formulated in this study in order to realize learning software that
contained concept visualization and simulative manipulation modules. Applying this learning software to 49 college
sophomores showed that the learning performance was higher for those who utilized manipulative tools, which
indicates that integrating visualization with an appropriate manipulation context benefits the learning achievements
of learners. Our analysis of the learning process revealed that the important factors for an enhanced learning
performance are whether learners can follow the simulation-based learning model, construct knowledge through
concept learning, simulative manipulation, and concept clarification activities, make use of simulative practice to
identify the nature of concepts, and integrate this with their existing knowledge.
Even though some previous studies have reported negative results regarding whether educational technology enhances
learning, other studies have revealed that learning performance can be enhanced when the pedagogy is sound (Kadiyala &
Crynes, 2000). Meaningful learning takes place when these technologies allow learners to engage in knowledge
construction, conversation, articulation, collaboration, authenticity, and reflection activities (Jonassen et al.,
2000). An important aspect of the pedagogical model presented in this paper is the idea that authentic manipulation in
a simulation-based learning environment stimulates deeper reflection about electronics concepts while learners
construct knowledge through the learning-manipulation-reflection path of the simulation-based learning model. Another
finding is that the problem-solving strategy used in the manipulative environment facilitates critical reflection
through the manipulation-reflection path. Interactive simulation helps learners engage in authenticity, reflection,
and knowledge-construction activities and thereby achieve a higher level of cognition. Moreover, interactive learning
also motivates learners and is useful for improving learning in complex domains (Spector, 2000).
To increase and refine our knowledge of this subject, it would be interesting to extend the study to other learning
domains, and to conduct quantitative studies involving large numbers of students using these environments in real
learning contexts. It would also be worth investigating learning behavioral patterns, in either qualitative or
quantitative studies, in the simulation-based learning environment. Furthermore, learners exhibit diverse types
of misconceptions when learning about electronics (Ronen & Eliahu, 2000). We have demonstrated the efficacy of
applying simulative manipulation to improve learning performance, and future studies should attempt to elucidate whether
simulative manipulation can clarify learner misconceptions.
References
Ainsworth, S. (2006). DeFT: A conceptual framework for considering learning with multiple representations. Learning and
Instruction, 16(3), 183-198.
Catalano, G. D., & Tonso, K. L. (1996). The Sunrayce '95 idea: adding hands-on design to an engineering curriculum. Journal of
Engineering Education, 85(3), 193-199.
Colaso, V., Kamal, A., Saraiya, P., North, C., McCrickard, S., & Shaffer, C. (2002). Learning and retention in data structures: A
comparison of visualization, text, and combined methods. Paper presented at the Proceedings of ED-MEDIA 2002, June 24-29,
Denver, Colorado, USA.
Doulai, P. (2001). The role of computer simulation in electric energy systems education. In Tomorrow's Education in Electrical
Technologies: Revisited Methods and Tools for Renewed Motivation, European Power Electronics and Drive Association.
Gallagher, J. (1987). A summary of research in science education- 1985. Science Education, 71(3), 307-457.
Gordin, D. N., & Pea, R. D. (1995). Prospects for scientific visualization as an educational technology. Journal of the Learning
Sciences, 4(3), 249-279.
Jensen, D., Self, B., Rhymer, D., Wood, J., & Bowe, M. (2002). A rocky journey toward effective assessment of visualization
modules for learning enhancement in engineering mechanics. Educational Technology & Society, 5(3), 150-162.
Jonassen, D., Peck, K., & Wilson, B. (2000). Learning with Technology: A Constructivist Approach. Prentice Hall.
Kadiyala, M., & Crynes, B. L. (2000). A review of literature on effectiveness of use of information technology in education.
Journal of Engineering Education, 89(2), 177-189.
Khoo, G. S., & Koh, T. S. (1998). Using visualization and simulation tools in tertiary science education. Journal of Computers in
Mathematics and Science Teaching, 17(1), 5-20.
Korhonen, A., & Malmi, L. (2000). Algorithm Simulation with Automatic Assessment. Paper presented at the 5th Annual ACM
SIGCSE/SIGCUE Conference on Innovation and Technology in Computer Science Education (ITiCSE 2000). Helsinki, Finland.
Luo, W., Stravers, J. A., & Duffin, K. L. (2005). Lessons learned from using a web-based interactive landform simulation model
(WILSIM) in a general education physical geography course. Journal of Geoscience Education, 53(5), 489-493.
Meyer, D. G., & Krzyzkowski, R. A. (1994). Experience using the video jockey system for the instructional multimedia delivery.
Paper presented at the ASEE Frontiers in Education Conference, November 6-9, San Jose, CA, USA.
Naps, T. L., Rößling, G., Almstrum, V., Dann, W., Fleischer, R., Hundhausen, C., et al. (2003). Exploring the role of visualization
and engagement in computer science education. ACM SIGCSE Bulletin, 35(2), 131-152.
Oakley, B. (1996). Implementation of interactive tutorials for an introductory circuit analysis course, Retrieved May 15, 2010,
from http://www.ewh.ieee.org/soc/es/Aug1996/002/cd/tutorial.htm.
Prensky, M. (2002). The motivation of gameplay or, the real 21st century learning revolution. On the Horizon, 10(1).
Reamon, D., & Sheppard, S. (1997). The Role of Simulation Software in an Ideal Learning Environment. Paper presented at the
ASME Design Engineering Technical Conference, September 14-17, Sacramento, CA.
Regan, M., & Sheppard, S. (1996). Interactive multimedia courseware and the hands-on learning experience: an assessment.
Journal of Engineering Education, 85(2), 123-131.
Ronen, M., & Eliahu, M. (2000). Simulation — a bridge between theory and reality: the case of electric circuits. Journal of
Computer Assisted Learning, 16(1), 14-26.
Spector, J. M. (2000). System dynamics and interactive learning environments: Lessons learned and implications for the future.
Simulation & Gaming, 31(4), 528-535.
Tversky, B., Morrison, J. B., & Betrancourt, M. (2002). Animation: Can it facilitate? International Journal of Human-Computer
Studies, 57(4), 247-262.
Vogel, J. J., Vogel, D. S., Cannon-Bowers, J. A., Bowers, C. A., Muse, K., & Wright, M. (2006). Computer gaming and
interactive simulations for learning: a meta-analysis. Journal of Educational Computing Research, 34(3), 229-243.
Wallace, D. R., & Mutooni, P. (1997). A comparative evaluation of world wide web-based and classroom teaching. Journal of
Engineering Education, 86(3), 211-219.