ABSTRACT
We developed a construction toolkit for teachers to create visual educational simulations. Because teachers are subject matter experts in the classroom, they are excellent candidates to develop educational software that meets their own pedagogical goals. We report an evaluation of nineteen teachers creating educational simulation microworlds for physical science using this environment.
Teachers were trained in visual simulation programming, working from a minimalist instructional tutorial and an interaction guide. Additional programming support was provided in the form of reusable template objects that contain basic object functionality derived from analysis of our prior work with teachers [11]. The current study helps us further refine this set of reusable components, facilitating ease of use for our novice educational simulation developers during simulation creation. The significance of this work lies in providing teachers with content-specific, interactive, and modifiable educational software, and thus in facilitating visual programming by this group of novice programmers.
Author Keywords
Usability evaluation, visual programming, reuse, toolkit
ACM Classification Keywords
Graphical user interfaces, evaluation, interaction styles
INTRODUCTION
The use of computers in the classroom has not yet become standard practice, for several reasons. One of these is that teachers have little time and few resources to investigate new technologies that they might integrate into their curriculum. The current research seeks to address this issue by introducing teachers to a software environment that enables them to build their own software, helping them to make the transition from novice programmers to "software developers".
Another obstacle to having more specialized software in the classroom is that teachers are rarely involved in the process of designing software intended for their use. Generally, educational software developers obtain only high-level requirements from teachers. Others create new educational software applicable only to grade-specific curricula (e.g., see SuperKids' software review of JumpStart™).
Educational software developers create software anticipating that families and teachers will find it useful; the review cited above, for example, describes "a well-designed program, packed with practical and fun activities applicable to a typical kindergarten curriculum."
Many pieces of educational software are very colorful, use popular TV characters (e.g. School House Rock, Sesame Street, Arnold), and incorporate multimedia with music, animation, and helpful earcons, but their built-in functionality cannot be modified or augmented.
As a result, even though computers are beginning to appear in classrooms all over the country, their use has been rather unimaginative. In many cases, software for the classroom is created to reinforce basic knowledge using drill and practice routines. Such practice can be beneficial, but it does not promote more open-ended activities, and these environments are generally neither malleable nor suited to meet specific needs. They do little to develop problem-solving skills [6], and subsequent studies have shown that students learn substantially more from exploratory learning than from drill and practice routines [8].
One promising technology for exploratory learning is the development and use of educational simulations. Many systems for building educational simulations are currently available on the software market (e.g. ActivChemistry, AgentSheets, LabVIEW, SimCity, Logo), but there are limitations to their usability. A simulation construction kit like SimCity is simple to use and fun, but its functionality cannot be modified. Software toolkits like LabVIEW have very rich functionality, but that functionality can only be modified by varying the input; furthermore, these are professional toolkits that are too expensive for many schools to afford. Many other systems have attractive features but either require the user to start from scratch or restrict them to prepackaged functionality. Our goal is to offer teachers the choice of building their own educational software or reusing and adapting existing functionality.
Most customizable software is not very user-friendly, and few teachers have the necessary time and motivation to craft their own software. Building a simulation from scratch is like an artist starting with an empty canvas: at times the blank slate stifles creativity rather than inspiring it.
We argue that if computer simulations are to be effectively integrated into classroom teaching, the simulations and associated programming environments must be adaptable.
This raises another set of issues, namely end-user development and adaptation of software. Teachers need to be able to modify software to meet their needs, but learning to program requires more time and effort than most teachers can afford, and when learning new techniques most users get stuck in the production paradox [4]. We are promoting visual programming as an educational medium, hoping that the same interactive techniques that teachers employ for their everyday computing needs can be used in support of more constructive (programming) activities. In the remainder of the paper we present and discuss our evaluation of our own visual simulation creation environment, SimBuilder.
SimBuilder and Squeak
SimBuilder is a project created in the Squeak programming environment. Squeak is a version of Smalltalk, an object-oriented language that has been in use for decades and is well suited to the type of educational simulations we are planning to create. It supports object creation, basic drawing functionality, user interface creation, and multimedia.
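For readers unfamiliar with Squeak, the sketch below shows the kind of expression that can be evaluated in a Squeak workspace to create and display a graphical object; it is our own minimal illustration of the underlying environment, not SimBuilder code, though the class and selectors shown (EllipseMorph, color:, position:, openInWorld) are standard Squeak Morphic.

    "Create a simple graphical object and place it in the current Morphic world"
    | ball |
    ball := EllipseMorph new.
    ball color: Color blue.
    ball position: 100@100.
    ball openInWorld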
The SimBuilder interface opens with two windows with left-hand scroll bars enabled (the "Welcome To… Squeak" and "The Worlds of Squeak" examples) and three project models (WaterCycle, Starter World, and Ozone World), as illustrated in Figure 1, our initial view of the Squeak Morphic projects and scripting area. The scripting area contains many objects, all of which can be manipulated through their Morphic halo, their scripts, or their instance variables.
Like Squeak, SimBuilder assumes an object-oriented construction metaphor. The simulation objects built in SimBuilder are autonomous agents that encapsulate their detailed object information and behavior (i.e. scripted rules) unless "opened up" by the user. In the microworlds that teachers create and modify, these agents behave by reacting to messages they receive: they accept requests and respond to them, and on occasion they interact with other agents to complete their tasks.
Users create objects that are candidates for a particular simulation or microworld; each microworld can be seen as a small project that might convey a lesson, for example the water cycle project seen in Figure 2. For each object, rules and behaviors can be created. In order to transition from viewing mode to programming mode, the user accesses an object’s encapsulated rule by selecting the viewer icon.
Once an object’s viewer is open, all of the object’s rule palettes can be seen (we refer to these as scripting menus).
Teachers begin building SimBuilder simulations using direct manipulation techniques [13] and specify each character’s behavior utilizing the rule creation toolkit.
Users simply select a rule that represents the behavior they anticipate for their object and drag it from the object's scripting palette; the rule expands into a rule scripting tile as in Figure 3, and multiple rules can be added if desired.
Basic behaviors are represented as traditional object-oriented concepts, but presented visually as a receiver object and a message. More detailed behaviors can incorporate conditional structures in the form of a Boolean and/or conditional test. With conditional assignments, one or more preconditions are specified; the condition must be satisfied in order for the statement to execute. To elaborate SimBuilder "programming" in terms of more traditional programming constructs, consider each rule scripting tile as a procedure containing lines of code. The "!" indicates that the script should run once, and is mainly used to test rules. "Mover move" is the title of the script, the clock indicates the speed of execution, the "normal" button indicates the type of execution, the next (i.e. beige) tile contains conditionals, and the "X" closes the script. The second section is a classic Smalltalk expression <receiverObject> <message>, but it is rendered visually in two tiles as <Mover> <forward by 5>. Each object keeps track of its values and state internally (e.g. current x, y position, forward direction, color) unless these are revealed.
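To make the mapping to textual Smalltalk concrete, the sketch below shows roughly what the "Mover move" tile of Figure 3 corresponds to; the selectors and the temperature variable are illustrative assumptions on our part, not SimBuilder's exact generated code.

    "Unconditional rule: the script named 'move' sends one message to its receiver"
    Mover>>move
        self forward: 5

    "Conditional rule: the precondition must be satisfied before the statement executes"
    Mover>>moveWhenWarm
        temperature > 0
            ifTrue: [self forward: 5]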
Figure 1: Squeak with SimBuilder
Figure 2: Water Cycle Microworld in SimBuilder
Figure 3: Rule Scripting Tile
The direct manipulation style of programming greatly simplifies the creation of individual rules and removes the burden of having to memorize textual syntax; users just have to become familiar with the reusable toolkit, which presents them with a core set of rule scripts on a rule palette. When the teacher adds behavior to an object, s/he selects an appropriate rule and drags it from the palette to an open space in the system window; the rule opens with the object's detailed information shown in its current state. The user can then test the rule, make changes if necessary, and run and save the simulation.
STUDY OF EXAMPLE-BASED REUSE
In earlier work [11, 12] we studied reuse in visual simulation programming, and identified difficulties experienced by users that prevented them from effectively reusing the given examples. For example, we observed that the visual language VAT (used in Agentsheets [7]) allows only one world to be active at a time; this prevents the user from getting confused by too many windows, but at the same time reduces programming flexibility. VAT allows users to copy an agent's behavior through a clone method; however, the behaviors of the clone are limited to the behaviors of the base class. Our SimBuilder tool built in Squeak has addressed some of the problems observed for VAT. For example, several worlds can be opened simultaneously, which facilitates reuse through copy-paste techniques, the method of choice for most novice programmers. Furthermore, each agent's behavior, as well as its clone's behavior, is locally modifiable.
SimBuilder also includes a set of reusable components that we feel will encourage our teachers to reuse bits of context, through objects that instantiate a class of simulation activity. We want these component objects to support the basic functionality that a novice programmer would want to include in a basic physical or earth science simulation. The initial set of components identified includes the following objects: a base object (for backgrounds and stationary objects), a mover, an eraser (an object that consumes other objects), an emitter (an object that creates other objects), and a transformer (an object that transforms into another object).
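To illustrate what such a template can encapsulate, the sketch below shows how an emitter-style component might be expressed in Squeak; the class name, the spawnClass variable, and the use of the Morphic stepping protocol are our own assumptions rather than SimBuilder's actual implementation.

    Morph subclass: #Emitter
        instanceVariableNames: 'spawnClass'
        classVariableNames: ''
        poolDictionaries: ''
        category: 'SimBuilder-Templates'

    "On each animation tick, create a new object of the configured class at the emitter's position"
    Emitter>>step
        | offspring |
        self world ifNil: [^ self].
        offspring := spawnClass new.
        offspring position: self position.
        self world addMorph: offspring

    "Milliseconds between animation ticks"
    Emitter>>stepTime
        ^ 1000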
One goal of the current study was to validate that this set of objects will support teachers’ simulation needs.
Learning and Reuse Sessions
SimBuilder was evaluated as a tool for novice-programmer teachers by comparing its effectiveness during learning and reuse to that of Agentsheets. The evaluation consisted of a learning session and a reuse session separated by a fifteen-minute break.
Participants
The participants were all preservice K-12 teachers, ranging in age from 20 to 45; the group was 78% female. They were recruited from a graduate class in educational applications of technology. All had basic classroom experience (they had either taught in the past or were currently placed in classrooms). They were compensated with a $15 stipend and extra course credit.
Apparatus
The usability laboratory consisted of an experiment room and an evaluation room connected with a two-way mirror.
The user, seated in the experiment room, worked on a Gateway 2000e PC; the evaluator was located in the evaluation room to reduce user distraction. In episodes of serious breakdown, the evaluator was able to communicate with the user via an intercom. Video, audio, and screen activity were recorded, and critical incidents observed were noted.
Procedures and Learning Materials
Participants completed a background questionnaire before entering the experiment room. They were instructed how to "think aloud" during the session, so that we could study users' thought processes [5]. During the simulation programming sessions, users were given a minimalist tutorial to guide their work. In the design of this tutorial, efforts were taken to minimize the amount of reading and passive observation expected of learners, and to provide open-ended questions that stimulate discussion and exploration. To help learners avoid or recover from errors, images of the proper ending state for many programming steps were provided, and checkpoint explanations and tips for error recovery were inserted at key points [1,3,9,10]. In general, the minimalist character of the tutorial can be summarized as follows:
Learners' first activity is exploration of a complex simulation.
Learners immediately begin augmenting and creating within this simulation by investigating objects, the objects' encapsulated state information, and scripted behaviors.
To reduce errors, learners are given visual cues of the manipulatives to be utilized during their task.
To reduce verbiage, users were given example simulations to learn from by analogy and were expected to reuse through recall and/or copy-paste techniques.
The manual for the learning session was 10 pages long and included an interaction guide giving users helpful hints to remind them of frequently used tasks based on what they had learned. The learning tutorial begins with the user watching the water cycle microworld (Figure 2) and answering a few questions about the simulation. Once they have explored, their next task is to modify objects and create their own object to interact within this learning example.
Their second task is to create their own earth science example from scratch: the user creates a Volcano model, demonstrating their understanding or level of mastery of the environment by producing a tangible artifact. The reuse tutorial consists of 9 pages, and the user's task is to create two simulations, one from an example-based reuse model and one from a component-based reuse model. Using the provided reusable simulation models, teachers created both a photosynthesis model and an ocean model (see Table 1 for a synopsis of both tutorial activities).
Learning Tutorial | Reuse Tutorial
Watch, answer questions about simulation | Two-part session in which the user is introduced to both Generic and Specialized models
Explore and modify behaviors of objects | Study objects with behavior analogous to the ones you will be building
Create a new object | Study objects with behavior analogous to the ones you will be building
Based on training, create physical science simulation | Reuse the behaviors of the analogous objects, through recall or copy/paste
Test volcano simulation | Create two new simulations utilizing reuse models
Table 1: Tutorial Activities
Once the users completed the two sessions, they filled out a final questionnaire assessing their reactions to the two environments. Finally, they participated in a retrospective interview to ensure that the evaluator collected any remaining comments.
General Observations
During the experiments users completed the learning and reuse activities with little intervention from the facilitator.
Each teacher was able to create three simulations; the average time to complete the SimBuilder sessions was 81 minutes, with Agentsheets sessions averaging 112 minutes.
Most participants spent 2 hours or less in learning and creating simulations. We find it generally promising that all of these users were able to complete these end-user programming tasks.
Participant Background
The background surveys revealed that most of the preservice teachers felt comfortable with computers, had considerable computer experience (averaging 12 years), and had used both PC and Macintosh computers. They had good ideas for roles of computers in the classroom, e.g.:
good for cause/effect relationships & to integrate technology into the curriculum
use in science experiments
supplement to class activities, to enrich lessons.
use of computers and technology is making it easier to set up scenarios for students to learn.
Provides students with real-life situations they may otherwise not be able to experience.
Computers help to enhance the student's educational experience--assisting student's evolution into independent learners
Participants also had many ideas for the use of educational simulations in the classroom: high and low tides; the events of 9-11-2001; microevolution; plant or animal growth and development; metabolic processes; virtual tours of places; battles and wars; volcanoes erupting; destruction from hurricanes or earthquakes; growth of cities; building collapses; tidal waves; dramatizations of plays, novels, and other literary works; and historical moments.
Learning Sessions
Participants began their simulation learning with a water cycle simulation. The learners had little trouble running and exploring this example simulation, although one user had trouble finding the SimBuilder object halo (learners were using a 3-button mouse, and the middle button was required for selecting the object halo).
Learners were guided to create a new object to add to the example, and the flexibility of the drawing toolkit became apparent during this task. Agentsheets users complained that its drawing tools were too rigid. In contrast, SimBuilder users were more satisfied with the drawing tools and created more realistic depictions as characters. For example, compare Figure 4 (created with Agentsheets) and Figure 5 (created in SimBuilder). Both depict volcanic activity, but the second is a more realistic rendition, with arrows, rocks, and other details that show the pressure building, the explosion occurring, and other life forms (birds, etc.) fleeing the chaos.
Figure 4: Agentsheets Volcano
Figure 5: SimBuilder Volcano
Reuse Sessions
After the learning sessions, users attempted to create new simulations by reuse of two sample environments, one offering generic components, and the other a specific example world [11].
Each section of the reuse session began with a reuse example, either the component-based model (Figure 6) or the example-based model (Figure 7). These examples are visual representations of the action/interaction classes identified in earlier work as a possible basis for simple simulations.
Figure 6: Component-Based Reuse Technique
We will give a brief example of reuse task performance in both experiment environments.
Agentsheets users could reuse by recall when creating a new simulation; they could change the look of an agent in an existing world, but they could not modify or extend an object's behavior. In contrast, SimBuilder users could select a reusable object, make a copy of it or place it into their new project, and make any changes they desired (e.g. name, behaviors, depiction).
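As a sketch of this copy-then-modify style of reuse in Squeak (using the hypothetical Emitter template from our earlier sketch rather than SimBuilder's actual objects), a reused object might be produced as follows:

    "Take a template object from an example world, copy it, then adapt the copy locally"
    | template wave |
    template := Emitter new.
    wave := template veryDeepCopy.      "an independent copy of the template morph"
    wave color: Color blue.
    wave position: 120@240.
    wave openInWorld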
A good solution developed in this fashion by an Agentsheets user is shown in Figure 8, where the user has created five objects. The action of the simulation is the following: the sun creates sunrays, which travel southeast; once a sunray shines on or contacts a small tree, it grows or transforms into a large tree. A good SimBuilder solution is shown in Figure 9, an ocean model. In this model the user has also created five objects. The actions of the simulation include clouds that float, an ocean that produces waves, and a tropical island that the waves crash against.
Figure 7: Example-Based Reuse
The participants' task was to first design a quick paper rendition of an educational simulation, either a photosynthesis model or an ocean world. After planning their basic objects and behaviors, their next task was to run an example simulation, either the ozone depletion model or the starter component model. Each reuse model contains five components with the same behaviors; we expect that when an object's visual form indicates its function, users do not need to analyze its meaning by opening and studying its behavioral rules. This should allow faster and more in-depth understanding of an object's functionality.
After studying a reuse example, participants were to assess where it offered objects that could be good candidates for reuse in creating their new simulations. They began with investigation and exploration of the environment. Next, they would create a background if they desired. Then, if they identified an agent that would be a good reuse candidate, they would reuse that object either by recall or by copy-paste.
Figure 8: Agentsheets Photosynthesis Model
Each participant created two artifacts, one from each reuse model. We were pleased to see that users were able to reuse both types of material; however, we believe that the quality of simulations will improve as our library of reusable simulations grows. We will have to study the possibility of a user finding a simulation whose context is very near to that of their desired artifact. This raises further questions to address in future research: will this affect the usefulness of our generic starter with reuse components, and will it still be an attractive form of reuse for our teachers?
Figure 9: SimBuilder Ocean World Model
If it proves not to be as attractive for reuse because more near-context simulations are available, we are still confident that it will be a good instrument for training teachers in the major components of visual programming of earth and physical science simulation models.
Detailed Results
In the learning sessions we observed that participants got started quickly: the average time spent on an Agentsheets volcano simulation was 29 minutes, with an average of 4.63 objects and 2.54 rules created. SimBuilder learning sessions averaged 23 minutes, involving an average of 4.85 objects and 4.85 rules. Although there was a small positive difference in favor of SimBuilder for speed and number of objects created, this was not significant.
As they began their reuse activity, users seemed more comfortable with the environment. During their initial reuse activity (in either Agentsheets or SimBuilder), Agentsheets reuse sessions averaged 24 minutes, involving creation of 3.27 objects with 0.54 objects reused, and development of 2.63 rules with 0.36 rules reused. SimBuilder reuse sessions averaged 20 minutes, involving creation of 3.71 objects with 0.85 objects reused, and 2.71 rules with 0.85 reused. Again there were small differences in favor of SimBuilder, but they were not significant.
During the second reuse activity, Agentsheets users averaged 23 minutes, with 3.63 objects created and 2.36 objects reused. SimBuilder users averaged 18 minutes, with 3.57 objects created and 3.14 reused (see Table 2). Once again, the small positive differences in favor of the SimBuilder environment were not statistically significant.
To examine differences between the systems more carefully, we examined the complexity of the rules developed within each environment. We defined three levels of rule complexity. Basic Rules include only simple movement (e.g. object forward by five); Intermediate Rules include basic movement but also incorporate more sophistication (e.g. objects using multimedia and creating sounds, or objects interacting by creating, testing, or deleting another object); Expert Rules include all of these activities plus more advanced programming activities (e.g. incorporating basic variables, global variables, function calls, or switch statements).
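In Smalltalk terms, the three levels might look roughly like the sketches below; the class names, variables, and selectors are our own illustrations, not rules taken from participants' actual scripts.

    "Basic: simple movement only"
    Cloud>>drift
        self forward: 5

    "Intermediate: movement plus interaction with another object (here, deleting it)"
    Eraser>>consume
        target delete.
        self forward: 5

    "Expert: a rule that also manipulates a variable and branches on its value"
    Volcano>>erupt
        pressure := pressure + 1.
        pressure > 10
            ifTrue: [self color: Color red]
            ifFalse: [self color: Color gray]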
In the first reuse activity users tended to reuse significantly more basic and intermediate rules in SimBuilder than
Agentsheets (see Table 2). In the second reuse activity
SimBuilder users reused more of the basic rules.
We observed that during a user's first reuse session fewer objects were created than in the learning session, but with each subsequent activity (i.e. as a user becomes more familiar with reuse programming) we expect that they will create a greater number of more useful objects.
Reuse activity | Significance | AS | SB
(Raw numbers of artifacts created; AS = Agentsheets mean, SB = SimBuilder mean)
Reuse Session 1 | | |
R1 Objects Reused | 0.4 | 0.54 | 0.85
R1 Number of Rules Reused | 0.1*+ | 0.36 | 1.00
R1 Reused Rules: Basic | 0.1*+ | 0.27 | 0.85
R1 Reused Rules: Intermediate | 0.08*+ | 0.09 | 0.57
R1 Proper Working Rule | 0.1*+ | 0.36 | 1.00
Reuse Session 2 | | |
R2 Rules | 0.08*+ | 2.45 | 3.14
R2 Rules: Basic | 0.01* | 1.45 | 2.57
R2 Rules: Intermediate | 0.3 | 1.00 | 0.57
R2 Objects Reused | 0.3 | 2.36 | 3.14
R2 Number of Rules Reused | 0.6 | 0.90 | 1.14
R2 Reused Rules: Basic | 0.2 | 0.54 | 1.14
R2 Reused Rules: Intermediate | 0.3 | 0.27 | 0.57
Mean values indicate the calculated mean of the raw number of rules or objects. Significant p values: *+ <= .1, * <= .05, ** <= .01, *** <= .001
Table 2: ANOVA Table of Reuse Activities
User Satisfaction
In the post-session questionnaire, we found several user ratings suggesting a preference for SimBuilder: “easy to get started”, “easy for novices to use”, “I understand how to use the environment”, “creating behaviors was not complicated”, and “my simulation works logically”.
Detailed results appear in Table 3.
We believe that these factors are very important: when users feel confident in their abilities, motivational obstacles to using an environment are reduced. For example, responses to "fun for building simulations" revealed a significant difference; we believe that perceived fun will be an important predictor of future use.
Participants also felt that they were able to create working simulations, and reported that they would be willing to create simulations in the future.
Subjective User Satisfaction | Properties, Significance | Agentsheets Mean | SimBuilder Mean
Easy to get started | ME | 3.17 | 3.57
Easy for novices | E | 2.58 | 2.71
Fun for building sims | M** | 3.58 | 4.43
I understand how to use (approaches *) | EM | 3.42 | 4.00
I created a working sim | EMP** | 3.75 | 4.57
Creating behaviors was not complicated | PE | 3.42 | 3.86
Rule ordering was easy to perform | PM*+ | 2.58 | 3.43
My simulation works logically | P | 2.45 | 3.29
Created working simulation | EM** | 3.75 | 4.57
Rules creation was easy to perform | EM* | 2.50 | 3.49
I am enthusiastic about creating sims | M*+ | 3.08 | 3.86
Properties: E = Ease of Use, M = Motivation & Fun, P = Programming Style, R = Reuse; significant p values: *+ < .1, * <= .05, ** <= .01, *** <= .001
Table 3: ANOVA Table of User Satisfaction
Expert Comparative Evaluation
As a final evaluation measure, we recruited five experts to judge the quality of users' simulations. These experts were recruited on the basis of their highest degree in computer science, experience in teaching and learning or related fields, experience in software engineering or design, and experience in usability engineering, user interface design, interface evaluation, or graphic design.
Each evaluator was emailed a user interface evaluation set that included background characteristics and rating scales for simulation action (Table 4) and visual representation (Table 5). Evaluators were instructed that the evaluation was to ascertain the aesthetic quality of these artifacts and that each artifact was to receive two individual ratings, one for its visual representation and one for its characters' actions.
A third section provided rating guidelines, with examples of simulations illustrating different levels of quality. Raters were given 40 samples of user-created simulation microworlds in random order. We instructed evaluators to print the document (or at least the rating guidelines).
Our experts were told what type of information we were seeking with this analysis; we also supplied them with a rationale for the evaluation: "We are not trying to find out the quality of the artists, but the perceived quality of the artifact (i.e. being a good representation of the desired model)".
Rating | Action Representation
5 Excellent | A rating of Excellent indicates that the user models the phenomena with a variety of relevant working objects (5 or more).
4 Above Average | A rating of Above Average indicates the user has a fairly realistic depiction with at least 4 working relevant objects.
3 Average | A rating of Average indicates the user has a realistic depiction with at least 2-3 objects working.
2 Below Average | A rating of Below Average indicates the user has at least one object working.
1 Poor | A rating of Poor indicates that the user model has no working objects.
Table 4: Ratings for Action Representation
Rating | Visual Representation
5 Excellent | A rating of Excellent indicates that the user had a good, realistic drawing that models the phenomena with good use of color and 5 or more objects.
4 Above Average | A rating of Above Average indicates the user has fair use of color and a realistic depiction with at least 4 relevant objects.
3 Average | A rating of Average indicates the user has fair use of color and a realistic depiction with at least 3 relevant objects.
2 Below Average | A rating of Below Average indicates the user has fair use of color and realism with at least 2-3 objects.
1 Poor | A rating of Poor indicates that the user vaguely conveyed the model with 2-4 objects.
Table 5: Ratings for Visual Representation
Experts' ratings generally favored simulations built with SimBuilder: the visual ratings of the volcano, ocean, and photosynthesis simulations were significantly in favor of SimBuilder at p < .001. We attribute this to the greater ease of use of the tools, the greater variety of drawing tools, and the ability to create objects of any size, giving users more freedom of expression.
The action ratings were also significantly in favor of SimBuilder, at p <= .01 for the volcano actions and p < .001 for the ocean and photosynthesis actions. We attribute this to the greater variety of rules available in the rule creation toolkit and to the fact that users could more easily reuse rule behavior from one simulation to the next: a user could simply copy a rule, which is placed in the paste buffer for reuse in a subsequent simulation, or change the name and look of an object while reusing its predefined behaviors.
CONCLUSIONS AND FUTURE WORK
As in our previous work with teachers, we saw considerable positive outcomes and feedback from participants: all of the preservice teachers were able to create new simulations using SimBuilder and Agentsheets. They were able to program with the support of direct manipulation techniques, rule palettes, and reusable template objects.
The complexity of their creations varied widely based upon the users’ understanding of the tool and task, but 64% of the users indicated a higher satisfaction with SimBuilder.
Our expert user evaluations revealed that the best six models were all created with SimBuilder.
During the users' training experience, we tried to exploit the reuse strategies of expert programmers to support novice programmers in software reuse. We introduced them to the usage context and encouraged them to assess the similarity of simulation objects and rule behaviors to their assigned task. Then, after studying these bits of context, they would reuse the objects and their behaviors by analogy, borrowing the context through recall or with copy-paste techniques. As a final test of their creation, they analyzed their modified objects through incremental testing [2].
Several issues will be addressed in future studies examining direct manipulation techniques for exploring and accessing an object’s encapsulated information. We also identified opportunities for improved error detection and recovery, and error notification. The next iteration of the tutorial will attempt to address these learning problems. We also plan to generalize and perform this study with a population of inservice teachers, and study teacher-student collaboration during simulation programming, as ultimately the learning experience will involve both populations. Finally, we are organizing a longitudinal study, where teachers and their classes contribute ideas and simulations to populate a repository of salient examples of simulation microworlds, hoping to support reuse and rapid customization.
ACKNOWLEDGMENTS
This work was supported by NSF REC-9554206 and NSF-
GAANN. We are grateful to Alex Repenning for introduction and access to Agentsheets software, to Mark
Guzdial for introduction and access to Squeak software and sponsoring our visit to SqueakEnd at Georgia Tech. Also many thanks to John Burton and Paulette Goodman for help in identifying teachers to participate in our studies and to user interface experts who did final analyses of the two environments.
REFERENCES
1. Anderson, C.L. Knussen, and M.R. Kibby, "Teaching teachers to use HyperCard: a minimal manual approach", British Journal of Educational Technology, 24(2), 1993.
2. J.M. Carroll and M.B. Rosson, "Scaffolded Examples for Learning Object-Oriented Design", Communications of the ACM, 39(4), 1996, pp. 46-47.
3. J.M. Carroll, Ed., Minimalism Beyond the Nurnberg Funnel, MIT Press, Cambridge, MA, 1998.
4. J.M. Carroll and M.B. Rosson, "Paradox of the Active User", in Interfacing Thought: Cognitive Aspects of Human-Computer Interaction, ed. J.M. Carroll, MIT Press, Cambridge, MA, 1987, pp. 80-111.
5. C. Lewis, Using the "thinking-aloud" method in cognitive interface design, Technical Report RC 9265, IBM Watson Research Center, Yorktown Heights, NY, 1982.
6. S. Papert, Mindstorms: Children, Computers, and Powerful Ideas, Basic Books, New York, 1980.
7. C. Perrone and A. Repenning, "Graphical Rewrite Rule Analogies: Avoiding the Inherit or Copy & Paste Reuse Dilemma", Proceedings of the 1998 IEEE Symposium on Visual Languages, Nova Scotia, Canada, IEEE Computer Society, 1998, pp. 40-46.
8. C. Rader, C. Brand, and C. Lewis, "Degrees of Comprehension: Children's Understanding of a Visual Programming Environment", 1997.
9. M.B. Rosson, J.M. Carroll, C.D. Seals, and T.L. Lewis, "Community Design of Community Simulations", Proceedings of Designing Interactive Systems, London, ACM Press, July 2002.
10. M.B. Rosson, J.M. Carroll, and R.K.E. Bellamy, "Smalltalk Scaffolding: A Case Study in Minimalist Instruction", Proceedings of CHI'90, New York, ACM, 1990, pp. 423-430.
11. M.B. Rosson and C.D. Seals, "Teachers as Simulation Programmers: Minimalist Learning and Reuse", Proceedings of CHI 2001, 2001, pp. 237-244.
12. C.D. Seals and M.B. Rosson, "Fun Learning Stagecast Creator: An Exercise in Minimalism and Collaboration", Proceedings of the IEEE Symposia on Human Centric Computing Languages and Environments, 2002.
13. B. Shneiderman, "Direct Manipulation: A Step Beyond Programming Languages", IEEE Computer, 16(8), 1983, pp. 57-69.
14. E. Soloway, M. Guzdial, and K. Hay, "Learner-Centered Design: The Challenge for HCI in the 21st Century", interactions, 1(2), 1994, pp. 36-48.