PH Grade Assist: Homework in the Twenty-First Century
Gregory H. Nail, Ph.D., P.E.1
Abstract
PH Grade Assist (PHGA) is an internet-based application supported by Prentice Hall (© 2005 Prentice Hall, Inc., a
Pearson Education Company, Upper Saddle River, New Jersey 07458). It is accessible by instructors and their
students through a web site. Instructor access includes many hundreds of problems lifted from previous editions of
the texts, converted to electronic form. The instructor can select from among these files to form an assignment, which
is then available for student access. An unusual feature of PHGA allows some, or all, of the numerical data given
with a problem statement to take on different values each time the problem is viewed or attempted.
This paper describes an implementation of PHGA in an undergraduate Dynamics course at The University of
Tennessee at Martin during the fall 2004 and spring 2005 semesters. Initial difficulties as well as subsequent
adjustments and successes are discussed.
Keywords: PHGA, dynamics, software, internet-based, instructional
INTRODUCTION
Many trends continue to influence engineering education. Recent advances in computer, information, and related
technology have resulted in availability of a myriad of audio, visual, and multimedia aids. Many of these devices or
computer programs offer complex and innovative features that were the subject of science fiction only a decade ago.
Research into the way in which people learn has resulted in a large variety of non-traditional instructional techniques.
Contemporary pedagogy within engineering has come to include many modes and styles of delivery for course
content. Despite all the progress and advances, it is still generally recognized that practice, or repetition, is as
important as anything else in the course of learning something new.
The traditional approach to promote practice is to periodically assign, collect, and mark an adequate number of word
problems. These problems are specifically designed to incorporate the principles or theory within the subject matter
being learned. After receiving the marked solutions, students are then able to incorporate feedback, learning from their
errors. The desired result is that students learn and gain understanding, progressively, as they complete more of these
exercises. This approach has a long history of success in many fields, not just engineering.
However, the author has observed students bypassing the learning that is built into the practice. In some cases
students will simply not complete the assigned problems. Reasons for this include the perception that the time used
in completing such exercises is not efficient or wisely spent. Perhaps the most common breakdown in the traditional
model occurs when students are able to easily obtain answers to assigned problems from other students or class files.
1 Assistant Professor, Engineering Department, The University of Tennessee at Martin,
Martin, TN, 38238. E-mail: [email protected]
In this way students are able to fulfill requirements for homework assignments while learning very little. The
common availability of solution manuals is another problem, and one that is occurring with increasing frequency. A small but
growing number of students take advantage of this opportunity. Possession of a solution manual potentially
enables a student to simply copy from it, with the result that little is learned. The immediate result of this situation is
that students are again able to fulfill requirements for homework assignments while bypassing the learning obtained
through practice. The author has experienced both of these situations as an instructor. Overall, the end result is often
high homework scores followed by low test scores.
BACKGROUND
One obvious solution to the problems outlined above is to simply discontinue the collecting and marking of assigned
homework problems. Another approach is to utilize technological advancements to provide the same opportunity for
practice to students. The security level possible within computer programs or software has the potential to offer
practice to students in a secure environment. Software accessed via internet, CD, or network can do exactly this. The
history of software applications in instructional settings is not long, in terms of the total history of college level
instruction. Clearly, this is due primarily to the fact that it is only within the past ten years or so that computing
platforms and accompanying software capabilities have reached the point where complex instructional
applications are possible. A review of the literature reveals that a number of packages and programs are available
and more are being developed.
Almost ten years ago Jack [7] utilized extensive on-line software in the engineering program at Grand Valley State
University. He describes the conversion of a typical undergraduate statics course over to a format where paper-based
assignments were entirely eliminated. This involved the conversion of all course materials, not just homework
assignments, over to electronic format. Objectives centered around exploring usage of the internet and software more
than overcoming any existing problems with the integrity of homework assignments. At present the interest in on-line
course content is not considered novel. However, one must consider that in 1996 this mode of delivery was still
being explored for the first time. Regarding integrity of homework, he points out that the very nature of the
electronic files made detection of copying much easier.
A similar experience is reported by Eberts and Cheng [2] at Purdue University in 1997. They converted a traditional
statistics course over to a mostly on-line format. All lectures were presented over the internet via web pages
incorporating conventional text, figures, tables, graphs, and multimedia lessons. The primary mode of
communication with the instructor was e-mail. This was followed by a hybridized offering of the same course which
featured a mixture of traditional and on-line delivery of content. Tests and homework assignments were done,
however, with paper and pencil for all three course offerings. Student grades were tallied (a typical weighted average
of all assignments and tests for the entire term), totaling 200 points for a perfect score. A plot depicting overall class
performance for each of the three terms is shown. This plot shows that the initial scores for the traditional class are
followed by a relative decline in performance for the internet-based course. Scores then rebound for the hybridized
version. Integrity of homework is not reported per se. The authors report students having difficulty with self-pacing
required by the internet-based course. They offer anecdotal evidence to support the hypothesis that significantly less
homework was completed by students taking the internet-based version of the class – leading ultimately to lower test
scores and a drop in overall performance.
Woit and Mason [10] documented a case at Ryerson Polytechnic University (Toronto, Canada), during 2000, where
weekly on-line quizzes were implemented into a course in which on-line midterm and final examinations had already
been established. These quizzes were marked electronically as well. The authors give some background,
revealing why and how this decision was reached by the instructors. Over a period of time they consistently observed
frequent copying of homework assignments (evidently from other students) accompanied by low performance levels
on the examinations. They summarized that if homework assignments were not collected and graded, students tended
not to do them, but that if they were collected and graded, then students tended to copy them. It was hypothesized
that this habit of copying in order to fulfill homework requirements was leading to low performance, for a large
number of students. Overall, the intention was to target this group of failing and underperforming students by
providing accountability via weekly on-line quizzes.
The on-line quizzes eventually led to improved performance, which was documented. The total percentage of
students failing the final examination dropped from approximately 25% to 7% as a result of implementation of the
on-line quizzes. However, students resisted the change in policy to the point of circulating a petition, the purpose of
which was to halt the weekly tests. The level of improvement in performance varied among the students. In general,
it was approximately a letter grade higher in the final analysis. Also, some student survey results were presented,
providing additional evidence supporting the conclusion that the on-line quizzes led to improved scores.
Pardue and Darvennes [9] describe a more recent (2002) experience, at Tennessee Tech, where the WebCT web-based educational system was used to administer significant amounts of material in an on-line environment. This was
successfully accomplished for two undergraduate mechanical engineering courses: Dynamics and Vibrations. A time
line was provided, detailing the conversion of course content, which occurred over a two-year period beginning early
in 2000. Many features of this extensive software were utilized. Among the features were organizational,
communication, quizzing, and archival tools. Although student performance improvements were not quantified,
many of the on-line features are described with a fair amount of detail. Of note are the on-line examinations, which
were in multiple-choice format – allowing for essentially no partial credit. Student concern over this implementation
was reported.
A series of software packages were individually evaluated for their effectiveness as instructional tools by Hall et al. [3] during
2002. Five studies, with corresponding results, are presented, four of which were conducted at The University of
Missouri at Rolla. The fifth study was essentially a repeat of the previous four, except at other institutions outside
Missouri, including some outside the United States. Various types of software were evaluated, including interactive
movies, games, and multi-dimensional demonstration movies. These studies consisted of carefully assembled trials, most
of which had a control group of students and an experimental group of students.
Of particular interest is the use of the interactive movies. They were used and evaluated in a way such that they were
essentially electronic homework exercises. No data are available for comparing the effectiveness of the movies against
the test results of students who did not view them. However, student performance is presented
quantitatively in terms of performance on tests versus the number of movies viewed, time spent viewing them,
and number of interactive exercises completed. No statistically significant improvement in test scores was reported
as a result of the number of movies viewed, time spent on scenes, or number of exercises. However, the data do
reveal student interest in the particular portions of these movies which combine equations with three-dimensional
graphics. Qualitative results are presented in terms of student responses to survey questions.
The remaining studies report varying improvements in the test scores which were used to obtain quantitative results.
Increases in performance were not great in an absolute sense. However, they were found to be statistically
significant.
A recent report by Henderson et al. [4] describes implementation of a CD-based educational tool called The
Homework Laboratory® (HWL) at Tennessee Technological University, The University of Texas, and Monterey
High School. The HWL was used in engineering classes over a three year period, leading up to 2005. By way of
introduction, the authors revisit the oft-repeated observation of students defeating the learning inherent in traditional
homework assignments by copying from other students. The HWL software is Windows-based, menu-driven, and
comprehensive in its features. It allows all aspects of a typical engineering course to be accommodated including
homework assignments, virtual classroom lectures, grades, and more.
Of interest is the homework feature, which allows randomization of variables. The result is that each individual
student is required to work from a different set of numerical parameters on any given homework assignment. Partial
credit is simulated through a grading scheme which reduces the total score possible with each additional attempt at
computing the correct answer.
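The source does not give the HWL's actual grading formula; the short Python sketch below is only an illustration of the idea described above, in which the maximum score still obtainable shrinks with each additional attempt at the correct answer. The 25% penalty per attempt is a hypothetical value chosen purely for illustration.

```python
def score_after_attempts(max_score: float, attempts: int,
                         penalty_per_attempt: float = 0.25) -> float:
    """Illustrative diminishing-credit scheme: each attempt beyond the
    first lowers the maximum score still obtainable. The penalty value
    is an assumption, not the HWL's published rule."""
    remaining = max_score * (1.0 - penalty_per_attempt * (attempts - 1))
    return max(remaining, 0.0)

# Example: a 10-point problem answered correctly on the third attempt
print(score_after_attempts(10.0, attempts=3))  # -> 5.0
```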
Evaluation of the HWL was carried out utilizing a modified version of a cyclic assessment model devised by Ayers
and Collins [1]. Variables examined included final examination and test scores of experimental and control groups.
Results show that students with grade point averages (GPA) between 2.25 and 3.49 (4 point scale) performed better
on their final examinations after using the HWL. This corresponded to 66% of all students tested. Students with
GPAs outside this range performed better without the HWL.
PH GRADE ASSIST
Overview
PHGA is an internet-based software package designed to complement certain Prentice-Hall texts. It has many
features that support most class instructional activities. The overall structure of the web site incorporates homework
assignments, tests or quizzes, grades, and individual or class communication. One, some, or all of the features can be
utilized, at any level, for any particular class. Help for instructors is available via e-mail. The entire package is menu-driven from within the web browser in the Windows environment. An instructor using PHGA first sets up a class
home page. Once set up, this home page is essentially a web site accessible only by login and password, providing a
secure environment set apart for only the instructor and students enrolled in that particular course. This paper reports
primarily on the use of only one of the many features of PHGA.
Menus
Figure 1 shows the main menu, visible immediately upon logging into the class home page on the PHGA web site.
Note the assignment editor, question bank editor, gradebook, and system tools links. All activity on PHGA can be
handled through these four primary links. The assignment editor opens up a series of pages wherein a new
assignment can be generated, or an existing one can be modified. Assignments can also be hidden or deleted from the
system. The question bank editor link provides access to a series of web pages containing links to homework
problems in the form of electronic files. The selection of homework problems is taken directly from previous
editions of Prentice Hall texts. In this case the Dynamics text by Hibbeler [5] was used. The gradebook and system
tools links provide access to pages on which records of student performance on assignments are kept. Administrative
functions can also be performed.
When the question bank editor is accessed, all available question banks appear. Figure 2 shows the resulting view
when the question bank editor is accessed, and a particular electronic homework file is selected for viewing. The
particular file visible was taken from chapter 15 of the text. PHGA has options for instructors to generate their own
question banks. The file shown in Figure 2 originated from one of the question banks supplied by PHGA, which
corresponds to chapter 15 of the text. Note the panel display on the left portion of Figure 2. Various homework
problems are visible as separate links. Also note the values given with this particular problem. The train engine is 70
Mg, followed by three cars, 20 Mg each. Figure 3 shows another view of the same file when refreshed by the
browser. Note that everything appears identical to Figure 2 with the exception of the numerical values present in the
problem description. Instead of a mass of 70 Mg the train engine now has a mass of 90 Mg. The cars, which had a
mass of 20 Mg in Figure 2, are now 30 Mg in Figure 3. A script embedded within the file causes a different set of
values to appear each time the view is refreshed or accessed. This is the simple but effective algorithmic feature. The
calculated numerical answers are entered in the dialogue boxes appearing near the bottom of the screen, as shown in
Figures 2 and 3. Scoring occurs when the grade button is clicked. Note the links to the problems appearing in the
panel on the left side of Figures 2 and 3. Problem numbers ending with an "A" suffix indicate algorithmic problems;
the remaining problems do not exhibit the algorithmic feature.
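PHGA's embedded scripts are not visible to the instructor, so the Python sketch below only illustrates the principle of the algorithmic feature described above: each access draws a fresh set of given values (here, engine and car masses mirroring Figures 2 and 3), and the accepted answer is recomputed from the same parameterized solution the student is expected to derive. The value ranges, the quantity computed, and the grading tolerance are all assumptions for illustration.

```python
import random

def generate_instance(rng=random):
    """Draw a fresh set of given values, as happens on each refresh of an
    algorithmic problem. The candidate values are assumptions chosen to
    mirror the 70/90 Mg engine and 20/30 Mg cars of Figures 2 and 3."""
    return {
        "engine_mass_Mg": rng.choice([70.0, 80.0, 90.0]),
        "car_mass_Mg": rng.choice([20.0, 25.0, 30.0]),
        "n_cars": 3,
    }

def reference_answer(inst):
    """Recompute the accepted answer from the same parameterized solution
    the student derives. Total train mass is used here only as a stand-in
    for whatever quantity problem 15.9A actually asks for."""
    return inst["engine_mass_Mg"] + inst["n_cars"] * inst["car_mass_Mg"]

def grade(inst, student_answer, rel_tol=0.01):
    """Mark the submission correct if it falls within a small relative
    tolerance of the recomputed answer (the tolerance is an assumption)."""
    correct = reference_answer(inst)
    return abs(student_answer - correct) <= rel_tol * abs(correct)

instance = generate_instance()
print(instance, grade(instance, reference_answer(instance)))  # a fresh instance, graded True
```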
The assignment editor is shown in Figure 4. As can be seen from the buttons across its upper portion,
new assignments can be assembled or existing ones can be modified, restored, or printed. The first dialogue box (not
shown) simply prompts for the name of the new assignment. The next screen, shown in Figure 5, presents the
instructor with a specific list of links from which to choose when generating a new assignment. Each of the links, such as
14.1, etc., is a link to a homework problem in electronic form. As questions are selected from those available on the
left panel the right panel is populated with the chosen homework problems. When the desired problems have been
selected the select policies link can then be accessed. Options available include the number of attempts, time limits,
weighting of scores, and order of access. A particularly useful option allows the instructor to disable PHGA from
displaying the correct answer or final grade. The particular combination of policies selected will determine how the
assignment is presented to student users. The author has had success using the homework option almost exclusively,
with multiple attempts within a specified time limit. Another screen, review and finish (not shown), completes the
assignment editor process. The newly assembled assignment then appears to students as they log in on the class page.
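PHGA exposes these policy choices only through its menu screens; the sketch below simply gathers the options named above (attempts, time limit, weighting, question order, and suppression of answers and grades) into a hypothetical Python data structure so that a particular combination can be stated concretely. All field names and default values are illustrative, not PHGA's.

```python
from dataclasses import dataclass

@dataclass
class AssignmentPolicy:
    """Hypothetical stand-in for the options on the 'select policies'
    screen; PHGA itself offers no public API like this."""
    max_attempts: int = 1              # number of attempts permitted
    time_limit_minutes: int = 0        # 0 means no time limit
    weight: float = 1.0                # relative weighting of the score
    fixed_question_order: bool = True  # present questions in a fixed order
    show_correct_answer: bool = False  # instructor may suppress the answer
    show_final_grade: bool = False     # ...and the final grade

# Roughly the configuration described in the text: homework mode with
# multiple attempts and a specified time limit (the numbers are illustrative).
homework_policy = AssignmentPolicy(max_attempts=3, time_limit_minutes=60)
print(homework_policy)
```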
Figure 1. Main Menu of PHGA showing the four primary links.
Figure 2. Question Bank Editor of PHGA showing initial view of problem 15.9A.
Figure 3. Question Bank Editor showing refreshed view of problem 15.9A.
Figure 4. Assignment Editor screen showing various assignments and options.
Figure 5. Assignment Editor screen showing assembly of an assignment.
From the gradebook link, visible in Figure 1, another series of screens can be accessed (not shown). The gradebook
page allows viewing of results, and other information, regarding all attempts by all student users on all assignments. A
particularly useful feature allows the instructor to request only the best result from each student, on specified
assignments, and convert the results into spreadsheet format ready for download to a local hard drive. The author
has utilized this particular combination of features with encouraging results.
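The best-result download is driven entirely from the gradebook pages; the Python sketch below merely illustrates the underlying operation: retain each student's best score on each specified assignment and write the result in spreadsheet-ready CSV form. The record layout, the names, and the file format are assumptions for illustration only.

```python
import csv
from collections import defaultdict

def best_scores(attempt_records):
    """attempt_records: iterable of (student, assignment, score) tuples,
    one per attempt. Returns {student: {assignment: best score}}."""
    best = defaultdict(dict)
    for student, assignment, score in attempt_records:
        previous = best[student].get(assignment)
        if previous is None or score > previous:
            best[student][assignment] = score
    return best

def export_csv(best, assignments, path="phga_grades.csv"):
    """Write one row per student with the best score on each assignment,
    in a layout ready to open in a spreadsheet (illustrative, not PHGA's)."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["student"] + list(assignments))
        for student in sorted(best):
            writer.writerow([student] + [best[student].get(a, "") for a in assignments])

# Hypothetical attempt records: (student, assignment, percent score)
records = [("Adams", "HW1", 60), ("Adams", "HW1", 85), ("Baker", "HW1", 70)]
export_csv(best_scores(records), ["HW1"])
```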
Implementation
The PHGA software was implemented at The University of Tennessee at Martin, in a Dynamics class, by the author
initially during the fall 2004 semester. The primary reasons for embarking on the implementation were essentially the
same as those cited in the background section above. Specifically, PHGA was to be used in an attempt to address the
known problem of homework integrity.
Before proceeding with implementation it was decided that no control group of students would be formed or tested
separately. The reasons for this are inter-related. The typical enrollment in Dynamics at UT Martin is relatively small
(10-20 students). Enrollment during the first two semesters of implementation was 14 (fall 2004) and 18 (spring
2005). Fall 2005 enrollment was 10. Other authors had already demonstrated that software such as PHGA can
improve student performance (Huddleston [6], Ohlsson [8], and Woit [10]). Judgment and experience indicated that
any practice obtained via PHGA would be more beneficial for the students than the existing scheme entirely
dependent on conventional homework papers. Therefore, it was decided that our enrollment was too small to justify
the isolation of a control group. The author’s plan was to introduce PHGA for purposes of homework practice, and
re-evaluate after two full semesters.
Initial student reaction was negative. However, as became evident later, this may have been due to a number of
factors rather than any problems inherent in PHGA. The primary difficulties encountered involved students and the
instructor learning the features and how to use them correctly. Also, due to some delays, PHGA was not actually
introduced at the beginning of the semester. It was begun about one third of the way through. A total of 6 homework
assignments were conducted using PHGA until each student in the class was able to access and work one
successfully. Two assignments on PHGA were actually counted towards the grade. As expected, the overall course
evaluation was 3.045 (on a 5-point scale), a relatively low value. Student comments were generally negative.
Concerns were initially expressed about the lack of partial credit available in the PHGA scoring.
Spring 2005 was actually the first full semester of implementation. Experience gained during the previous semester
was utilized to good effect. The instructor assembled a series of PHGA problems, complete with worked solutions,
ready in advance. The entire class met in a computer lab so that PHGA could be introduced to all students at
once. This arrangement also facilitated registration for the students. One of the PHGA problems was worked
beforehand so that the students could use and observe the algorithmic feature in operation.
Despite the initial improvements, student resistance continued, as evidenced by the slow response in obtaining PHGA
registration materials. The existence of PHGA as a requirement for the course had been made known to the students
from the very first day of class. However, at the first meeting in the computer lab (one week into the semester) only 8
of 20 students were ready with their login password for PHGA. After two weeks only 12 of 20 were ready. The
instructor decided to begin assignments in the hope that this would motivate the remainder of the students to act. At
the mid-term the average on PHGA assignments was 56%. A survey at that point revealed that much of the reason
for the low average was a lingering reluctance to accept the on-line assignments as required. The instructor offered
encouragement in class by introducing, or previewing, each assignment in some detail as it was made. By the end of
the semester the overall average had risen to 71%. A total of 10 assignments were administered. Student evaluation
of the course rose substantially, to 4.175. The first positive comments regarding PHGA appeared with student
evaluations. Concerns about the lack of partial credit remained, but seemed to be somewhat mitigated by the larger
number of assignments available for grade.
The fall 2005 semester has been completed with PHGA having been used in essentially the same way as spring 2005.
The overall average on the 10 PHGA assignments was 80%. Minor adjustments are still being made with the
implementation. Plans for the spring 2006 semester include expanding the number of PHGA assignments.
OBSERVATIONS AND CONCLUSIONS
An implementation of PHGA has been documented in a Dynamics course at UT Martin. The algorithmic feature was
used to address the problem of homework integrity. The implementation was not without difficulties but was
successful in the end. A control group was not isolated for comparison purposes. Student reaction was initially
negative but improved significantly later.
During the first two semesters of implementation the author changed the way in which the PHGA assignments were
presented to the class. Initially the assignments were simply announced with little additional commentary. Care was
taken to select electronic assignments that closely paralleled the classroom lecture material. However, no specific
help or direction was given with regard to the PHGA assignment itself. It was observed that student reception
improved significantly after this format was gradually changed to one of previewing, or leading into, each PHGA
assignment. This was done by first presenting a similar in-class exercise. The author has observed students
incorporating the underlying theory and practice as they work through the PHGA assignments, when introduced in
this way. Specifically, drawing of free-body diagrams, organizing of equations, and other basics are being practiced.
Of particular interest is the algorithmic feature. It is evidently this feature, combined with encouragement by in-class
example, that is leading students to solve the PHGA problems in terms of variables. This has been observed
by the author. Although not quantified, these observations indicate that PHGA assignments are causing students to
gain at least some additional practice in the skills needed for success on traditional written exams.
The somewhat low averages on PHGA assignments, ranging from 56% to 80%, are evidence that students are unable
to copy each other's work as they could with conventional homework. The author has observed improvement in
student performance on partial-credit problems occurring on examinations. Student course evaluations trended
downward initially but recovered significantly following the second semester of implementation, rising from 3.045 to
4.175. During fall 2004 only 2 assignments were actually completed for grade. Spring and fall 2005 classes each had
10 assignments administered. Implementation continues, with the next goal to expand the number of assignments
offered via PHGA.
REFERENCES
[1] Ayers, J.B., and Collins, A.S., “An Evaluation Model for Curricula Innovation,” IEEE Transactions on
Education, Vol. E-28, 1985, pp. 52-55.
[2] Eberts, R., and Cheng, K., “Learning, or Not Learning, Through Multimedia,” Proceedings of the Human
Factors and Ergonomics Society 42nd Annual Meeting - 1998, pp. 606-610.
[3] Hall, R.H., Philpot, T.A., Hubing, N., Flori, R.E., and Yellamraju, V., “A Model-Driven Multi-Year Assessment
of a Software Development Project for Engineering Instruction,” Proceedings of the 2004 American Society for
Engineering Education Annual Conference and Exhibition, ASEE, Salt Lake City, UT, Jun. 20-23, 2004.
[4] Henderson, R.C., Tindall, W.R., Lowhorn, D.E., and Larimore, D.L., “Helping Students Become Proficient at
Solving Fundamental Engineering Problems through Practice – The Homework Laboratory,” Proceedings of ASEE
Southeast Section 2005 Annual Meeting, ASEE, Chattanooga, TN, Apr. 3-5, 2005.
[5] Hibbeler, R.C., Engineering Mechanics Dynamics, Tenth Edition, Prentice-Hall, Upper Saddle River, NJ, 07458,
2004.
[6] Huddleston, D.H., and Walski, T.M., “Using Commercial Software to Teach Hydraulic and Hydrologic Design,”
Computers in Education Journal, Vol. 13, No. 3, 2003, pp. 43-52.
[7] Jack, H., “A Paperless (almost) Statics Course,” Proceedings of the 1998 American Society for Engineering
Education Annual Conference and Exhibition, ASEE, Seattle, WA, Jun. 28-Jul. 1, 1998.
[8] Ohlsson, L., and Johansson, C., “A Practice Driven Approach to Software Engineering Education,” IEEE
Transactions on Education, Vol. 38, No. 3, Aug. 1995, pp. 291-295.
[9] Pardue, S., and Darvennes, C., “Dynamic and Resonating Use of WebCT,” Proceedings of the 2002 ASEE
Annual Conference & Exhibition, ASEE, Montreal, Quebec, Canada, Jun. 16-19, 2002.
[10] Woit, D., and Mason, D., “Enhancing Student Learning Through On-Line Quizzes,” SIGCSE 2000: 31st
Technical Symposium on Computer Science Education, Austin, TX, Mar. 8-12, 2000.
Gregory Howard Nail
Gregory H. Nail is currently an Assistant Professor on the faculty of the Department of Engineering, The University of
Tennessee at Martin. He was employed as a Research Hydraulic Engineer by The United States Army Corps of
Engineers from 1991-2002. Nail holds M.S. and Ph.D. degrees from the Mechanical Engineering Department at
Texas A&M University, earned in 1986 and 1991, respectively. He earned a B.M.E. (Bachelor of Mechanical
Engineering) from Auburn University in 1984.