IDE Tools for Novice Programmers

-Arthur Lewis (alew525)
Contents
 Background and Introduction
 Overview of Popular Environments
 Empirical Studies
 Borland Delphi v/s SimplifIDE
 Gild v/s Eclipse
 BlueJ
 Eclipse and other IDEs
 Discussion
Introduction
 Programming is not easy for newcomers to grasp
 Moreover, exposure to a professional development
environment/IDE can be an added burden
 A number of pedagogical IDEs have been designed to
address this issue
Overview of Popular Environments
 According to current studies, BlueJ and DrJava are the most popular pedagogical environments
 Both have been designed for novice programmers picking up Java
 DrJava focused on providing a less “intimidating”
interface as compared to Eclipse
 BlueJ focused on simplifying the learning of OO
concepts via a visual interface
Overview of Popular Environments
 DrJava’s prominent feature is a read-eval-print loop (REPL); a minimal sketch of the idea follows below
 Intuitive testing and debugging capabilities are also supported, along with error detection
 However, no research has been done to test its effectiveness
 Empirical studies are needed to ascertain its actual impact on learning
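The sketch below illustrates the read-eval-print idea using the standard JShell API (Java 9+). It is only an illustration of the interaction style, not DrJava's actual implementation; the class name TinyRepl is a hypothetical example.

    import java.util.Scanner;
    import jdk.jshell.JShell;
    import jdk.jshell.SnippetEvent;

    // Minimal read-eval-print loop: read a line, evaluate it as a Java
    // snippet, print the result, and repeat until the user types "exit".
    public class TinyRepl {
        public static void main(String[] args) {
            try (JShell shell = JShell.create();
                 Scanner in = new Scanner(System.in)) {
                System.out.print("repl> ");
                while (in.hasNextLine()) {                        // READ
                    String line = in.nextLine();
                    if (line.equals("exit")) break;
                    for (SnippetEvent e : shell.eval(line)) {     // EVAL
                        if (e.exception() != null) {
                            System.out.println("Error: " + e.exception());
                        } else if (e.value() != null) {
                            System.out.println(e.value());        // PRINT
                        }
                    }
                    System.out.print("repl> ");                   // LOOP
                }
            }
        }
    }

Typing an expression such as 2 + 3 at the prompt returns its value immediately, which is the kind of immediate feedback DrJava's Interactions pane offers to beginners.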
Overview of Popular Environments
 BlueJ’s visual interface was designed to make OO concepts in Java easier for novices; a small example class of the kind used with BlueJ follows below
 Another study compared the features of BlueJ with DrJava
[Figure] Main window of the BlueJ interface (Source: Kölling, M., Quig, B., Patterson, A., & Rosenberg, J. (2003). The BlueJ system and its pedagogy. Computer Science Education, 13(4), 249-268.)
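As a concrete illustration, the class below is the kind a first BlueJ exercise might use; BankAccount is a hypothetical example, not taken from the BlueJ paper. In BlueJ each class appears as a box in the main window, and students can right-click the box to create an object and invoke its methods interactively, without writing a main method.

    // A typical first-semester class: BlueJ shows it as a box on the class
    // diagram, lets students instantiate it by right-clicking, and displays
    // the balance field in its object inspector.
    public class BankAccount {
        private int balance;

        public BankAccount(int initialBalance) {
            this.balance = initialBalance;
        }

        public void deposit(int amount) {
            balance += amount;
        }

        public int getBalance() {
            return balance;
        }
    }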
Overview of Popular Environments
 Other tools included Penumbra, DrScheme and Gild
 Penumbra was designed for Eclipse but never really
took off
 DrScheme’s purpose was to simplify the functional
programming language Scheme
 Researchers made claims that require empirical evidence to support their validity
 Gild was made to create a “Student Perspective” for Eclipse; a rough sketch of the mechanism follows below
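To give a sense of what a “Student Perspective” involves, the sketch below uses Eclipse’s standard IPerspectiveFactory mechanism to lay out a deliberately minimal workbench. The class name StudentPerspectiveFactory and the choice of views are hypothetical assumptions for illustration; Gild’s actual plug-in is more elaborate.

    import org.eclipse.ui.IPageLayout;
    import org.eclipse.ui.IPerspectiveFactory;

    // Sketch of a simplified perspective: only a project explorer and a
    // console are shown next to the editor, hiding the many views a
    // professional developer would normally see.
    public class StudentPerspectiveFactory implements IPerspectiveFactory {
        @Override
        public void createInitialLayout(IPageLayout layout) {
            String editorArea = layout.getEditorArea();

            layout.addView(IPageLayout.ID_PROJECT_EXPLORER,
                    IPageLayout.LEFT, 0.25f, editorArea);
            layout.addView("org.eclipse.ui.console.ConsoleView",
                    IPageLayout.BOTTOM, 0.75f, editorArea);

            // Keep the layout fixed so novices cannot accidentally
            // rearrange or close the remaining views.
            layout.setFixed(true);
        }
    }

Such a factory would be registered through the org.eclipse.ui.perspectives extension point in the plug-in's plugin.xml.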
Borland Delphi v/s SimplifIDE
 Two related studies (part of the same work) compared
a pedagogical environment (SimplifIDE) with a
professional one (Borland Delphi)
 Studied two groups: treatment and control
 Students’ academic abilities were taken into account, based on past performance, by implicitly classifying them into two further categories
Borland Delphi v/s SimplifIDE
 One study focused on academic performance and
programming behavior of students
 The other study considered students’ perceived sense of learning
 Results:
 Weaker students benefited more from the pedagogical
environment
 Improved programming behavior observed in weaker
students
 Overall academic performance remained unaffected
Borland Delphi v/s SimplifIDE
 Categorizing the students as weak or strong was a crucial aspect of this study
 The sample size was adequately large and the study
was conducted in a naturalistic environment
 Some students opted out
 The choice of pedagogical environment did not affect academic performance
Gild v/s Eclipse
 Study considered metrics such as efficiency,
effectiveness, understanding and satisfaction
 Hypothesized that Gild would perform better than Eclipse
 Consisted of problems followed by qualitative feedback to measure understanding and satisfaction
 Students preferred Gild over Eclipse
Gild v/s Eclipse
 Study sample was small (N=6)
 Participation was voluntary
 Some students were familiar with other IDE tools
 Students had trouble using some of Eclipse’s more complex features, such as the debugger
 Customizing Eclipse’s interface to help students would undermine its standing as a professional IDE
Empirical Studies: BlueJ
 One study tested BlueJ in an academic setting
 An initial evaluation using a beta version
 Students were reported to have a higher passing rate compared to previous years
 However, this was the first time Java was taught and BlueJ was used, so the improvement cannot be attributed to BlueJ alone
 A second evaluation assessed students’ understanding of abstract OO concepts
Empirical Studies: BlueJ
 Students pursuing their second programming unit
were asked to complete a survey
 Results were positive
 The study claimed that students had a better comprehension of the course material
 More evidence is needed to support these claims
Empirical Studies: BlueJ
 Another study compared BlueJ with TextPad
 The sample population comprised students from different disciplines
 Opting out wasn’t an option
 Course was taught in two sections, one using BlueJ and
the other with TextPad
 Two different samples were used for each section
Empirical Studies: BlueJ
 Student performance was assessed
 BlueJ didn’t have a significant impact on student
performance
 Students liked some features of the environment
 Sample sizes were small, in the 10-17 range
 The sample included students from non-computing disciplines
Eclipse and Other Professional IDEs
 Some researchers argue that Eclipse or other professional IDEs are suitable for classroom use
 However these studies had limitations:
 Small sample size
 Mental manipulation of students
 One of these studies also agreed that academic performance is independent of the IDE
Discussion and Conclusion
 Most of the existing pedagogical environments support OO languages, predominantly Java
 More empirical analysis needed to ascertain their
limitations and actual impact on learning
 An individual’s programming aptitude is strongly related to their analytical skills
 Tools may affect programming behaviors
Thank You!!!
Questions