Using Handheld Electronic Responders in the Classroom to Provide Immediate
Feedback and Enhance Student Learning.
Final Report
LITRE Project
Gary Moore
Department of Agricultural Education
Project Overview
This project employed handheld electronic responders in selected undergraduate classes in the College of Agriculture and Life Sciences and the College of Natural Resources. Specifically, the Classroom Performance System (CPS) manufactured by E-Instruction (http://einstruction.com/) was used. The system consists of wireless handheld responders that students use to transmit information to a wireless receiving unit. Each responder is numbered, and a student uses the same responder over the course of the semester. The wireless receiving unit transmits student information to the instructor's computer, which has the E-Instruction software installed. All of this occurs in a second or less.
The system can be used for a variety of purposes, ranging from taking class attendance to administering exams. We used the system to determine whether:
A. Providing immediate feedback to students during class enhanced their learning
of the materials.
B. Student attitudes toward the course were enhanced by use of the response
system.
In this study, students were asked 5-10 multiple-choice questions either at the start or
end of the lecture or interspersed throughout the lecture. The instructor of the course
determined which process to use. The questions related to the content of the lecture.
These questions were projected on a screen using an LCD projector. The students
responded using their handheld responders. As soon as the students responded, the
aggregated responses were displayed on the screen. Each student was then able to
determine if his/her response was correct.
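To illustrate how this kind of immediate, aggregated feedback can be produced, the following minimal Python sketch tallies one question's responses and prints the percentage of students choosing each option. It is only an illustration of the idea; it is not E-Instruction's actual software, and the function and data names are invented.

    from collections import Counter

    def summarize_responses(correct_choice, responses):
        """Tally clicker answers and report the percentage choosing each option.

        `responses` maps a responder number to the choice it transmitted,
        e.g. {1: "A", 2: "C", ...}. Only the aggregate is displayed, so
        individual students remain anonymous, as in the CPS sessions above.
        """
        counts = Counter(responses.values())
        total = len(responses)
        for choice in sorted(set(counts) | {correct_choice}):
            pct = 100 * counts[choice] / total if total else 0
            marker = " <-- correct answer" if choice == correct_choice else ""
            print(f"{choice}: {pct:5.1f}% ({counts[choice]} students){marker}")

    # Hypothetical class of 23 responders answering one multiple-choice question
    answers = dict(zip(range(1, 24), "A" * 14 + "B" * 5 + "C" * 3 + "D"))
    summarize_responses("A", answers)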
It was hypothesized that students would learn more and have a more favorable attitude toward the class as a result of using the handheld responders.
The initial plan was to compare the grades of the students in the class during the
semester when the responders were used with the grades of the students in the same
class the previous semester. Students were queried at the end of the semester to
gauge their reaction to the responders. Finally, the instructors were interviewed to
ascertain their reaction to the technology.
Because of some implementation issues that arose during the study (described later)
and the fact that we were using volunteers who had differing levels of commitment to
the study and different levels of usage of the system, one additional phase of research
was added to the project. The additional phase was to test the system in another
educational setting (a local high school) where strict experimental controls could be
employed.
Implementation Experiences
Six different instructors initially agreed to participate in this exploratory project. However, as the semester progressed, two of the instructors could not find the time to actually implement the system. A number of attempts were made to work with these two instructors and help them implement the system, but these attempts were not successful. One instructor was added after she saw the system being used in a colleague's class and wanted to become involved in the study. Therefore, the results are based upon six classes taught by five different instructors.
The actual names of the instructors are not used in this report; instead, the instructors are referred to by letters of the alphabet. A brief discussion of the classes taught and experiences with the system follows. The classes are listed in alphabetical order.
Instructor A: Twenty-three students in ANS 309 Livestock Evaluation participated in the study. There were some initial delays in setting up the system and getting it to work. The room did not have a dedicated computer, so a laptop was used. It took some time to set up the system and distribute the responders each time it was used.
The primary mode of instruction in this course involved numerous field trips, so the
system was used on an intermittent basis.
Instructor B: The CPS system was used with one section of ARE 201 Introduction to
Agricultural and Resource Economics. There were 51 students in this class. The course
was taught in a classroom in Nelson Hall. The instructor was a faculty member in CALS
but the classroom was under the control of the College of Management.
The first problem encountered was getting the software installed on the classroom computer. The technical support staff who maintained the classroom wanted six months of lead time to install the software, and they did not want to attach the receiving unit to the computer (it uses a USB connection) because they were afraid it might interfere with the operation of the computer and other programs. The instructor became very frustrated trying to work with the technical support staff in the College of Management. To say the support staff was uncooperative would be an understatement.
After trying for several weeks to get the software installed on the classroom computer, the instructor resorted to using a laptop with the software installed. However, that laptop had an older operating system, which presented some problems; the system would lock up. Eventually, a new laptop was obtained and the system was finally implemented.
Instructor C: BIT 360 (Manipulation of Recombinant DNA), Section 001 was the third
class used in the study. There were 18 students in the class. A dedicated classroom
computer was used. The computer technician had no problems installing the software
and there were no technical issues. The instructor used the system on a regular basis.
She also used the system with a graduate class. At the end of the semester, this
instructor purchased the CPS system for use in the biotechnology program.
Instructor D: BIT 360 (Manipulation of Recombinant DNA), Section 002 was also
involved in this study. When the instructor learned that the other section of the course
was using the CPS system, she wanted to use the system also. There were no
technical problems in this class. There were 17 students in this section.
Instructor E: ET 252 Introduction to Spatial Technologies was the final class used in the
study. A dedicated classroom computer was used and there were no technical
problems. The instructor did not provide the researchers with the final grades of the students; thus, the researchers do not know the number of students involved in this class. This instructor was very positive about the technology but did not complete the
instructor assessment instrument.
Findings
Student Achievement. The first hypothesis was that providing immediate feedback to
students during class enhanced their learning of the materials. This hypothesis was
examined by comparing the grades made by the students in the class during the spring
semester of 2005 (the semester in which the CPS was used) with the grades of the
students during the previous semester (when the CPS system was not used). Final
grades were provided by four of the five instructors and are shown in Table 1.
Table 1
Final Mean Grades of NCSU Classes Taught Using the CPS

Class     Mean Grade in   Mean Grade in        N of Students   N of Students in
          CPS Class       Comparison Class     in CPS Class    Comparison Class
                          (No CPS)
ANS 309   86.5            84.5                 23              12
ARE 201   86.0            85.2                 51              46
BIT 360   87.1            91.2                 18              16
BIT 360   91.2            89.7                 18              17
The overall grand mean of the students in the control group was 86.97 while the overall
grand mean of the students in the CPS group was 87.10. Because of the
implementation problems in one class and the lack of rigorous experimental control
involved in this exploratory study (the use of volunteer instructors, differences among
classes in the use of the system, the comparison groups used and the small N), it would
not be appropriate to perform a sophisticated statistical analysis of these data.
However, a visual inspection of the data reveals essentially no difference in the final grades of the two groups. Because of the problems in quantitatively measuring the outcomes of
this phase of the study, it was decided to add another phase to the project. The
researchers decided to implement the system in another educational setting where they
would have more control of the process.
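As a rough check on the grand means reported above, the weighted averages can be recomputed in a few lines of Python from the rounded class means and enrollments in Table 1; the values obtained agree with the reported grand means to within a few hundredths of a point (the small gap reflects rounding in the table).

    # Class means and enrollments taken from Table 1
    cps_classes = [(86.5, 23), (86.0, 51), (87.1, 18), (91.2, 18)]
    comparison_classes = [(84.5, 12), (85.2, 46), (91.2, 16), (89.7, 17)]

    def weighted_mean(groups):
        total_points = sum(mean * n for mean, n in groups)
        total_students = sum(n for _, n in groups)
        return total_points / total_students

    print(f"CPS grand mean:        {weighted_mean(cps_classes):.2f}")         # about 87.1 (reported: 87.10)
    print(f"Comparison grand mean: {weighted_mean(comparison_classes):.2f}")  # about 87.0 (reported: 86.97)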
Phase II – The researchers obtained permission to use the CPS in the Agricultural
Education Department at Southern Nash High School in Nash County. This school has
three agriculture teachers who all teach the same class. Because the teachers teach
the same content and were willing to cooperate with NCSU, it was possible to
implement the system in a more highly controlled manner. This phase of the study used
a non-equivalent, quasi-experimental design. Three sections of Agriscience Applications, taught by three different teachers at Southern Nash High School, were used in the research. For the first instructional unit, two classes served as treatment groups and the other class served as a comparison group; during the second instructional unit, the group roles were switched. This resulted in a modified switching-replications design. Students in the treatment group used handheld electronic responders to receive feedback. Students in the control group received traditional verbal feedback. At the end of each instructional unit, an achievement test was administered to each group. The experimental design for this study is presented in Table 2.
Table 2
Experimental Design: Agriscience Applications

Class Number            Instructional Unit 1:    Instructional Unit 2:
                        FFA History              Leadership Development
Class 1: Instructor 1   Traditional Feedback     Electronic Feedback
Class 2: Instructor 2   Electronic Feedback      Traditional Feedback
Class 3: Instructor 3   Electronic Feedback      Traditional Feedback
Class one had an enrollment of 23 students, class two had an enrollment of 20
students, and class three had an enrollment of 18 students. A total of 61 students were
involved in the study. Each student experienced both the audience response system feedback and the traditional verbal feedback.
The researchers provided six hours of training to teach the instructors how to use the handheld audience response system. At the end of daily lessons, instructors were directed to question students on the course material using multiple-choice questions. The students in the treatment group responded to each question using the handheld electronic responders. The correct answer was then displayed along with a summary of the percentage of students who chose each response. The students immediately knew how they did and were able to see the aggregated responses of the class. All the data were
grouped and there was no way for the students to identify how other students had
responded. Based upon the student responses, the instructor could discuss or clarify
any issues that needed to be addressed. In the control group, the instructor also asked
questions but called on individual students and provided them verbal feedback about
the adequacy of their response.
The three class instructors developed the two achievement tests used in the study. The
test items were based upon instructional objectives and competencies drawn from a
statewide test bank maintained by the state Department of Public Instruction. The tests
were examined for both validity and reliability. A panel of experts verified content
validity. To assess reliability, the Kuder-Richardson coefficient was computed, yielding coefficients of 0.798 for test one (FFA) and 0.760 for test two (Leadership). Test scores from each student were collected for units one and two. The scores were then combined to form the treatment group (Classes 2 & 3, Unit 1, and Class 1, Unit 2) and the comparison group (Classes 2 & 3, Unit 2, and Class 1, Unit 1). An independent-samples t-test was used to test for significant differences between the two groups.
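The item-level test data are not reproduced in this report, but for readers unfamiliar with the statistic, the Kuder-Richardson (KR-20) coefficient is computed from dichotomously scored (0/1) items as in the sketch below. The score matrix shown is invented purely for illustration; it is not the Southern Nash data.

    def kr20(score_matrix):
        """Kuder-Richardson formula 20 for dichotomous (0/1) item scores.

        `score_matrix` is a list of students, each a list of 0/1 item scores.
        KR-20 = k/(k-1) * (1 - sum(p_i * q_i) / variance of total scores),
        where p_i is the proportion answering item i correctly and q_i = 1 - p_i.
        """
        k = len(score_matrix[0])                       # number of test items
        n = len(score_matrix)                          # number of students
        totals = [sum(student) for student in score_matrix]
        mean_total = sum(totals) / n
        var_total = sum((t - mean_total) ** 2 for t in totals) / n  # population variance
        sum_pq = 0.0
        for item in range(k):
            p = sum(student[item] for student in score_matrix) / n
            sum_pq += p * (1 - p)
        return (k / (k - 1)) * (1 - sum_pq / var_total)

    # Hypothetical 5-student, 4-item test, for illustration only
    print(round(kr20([[1, 1, 1, 0],
                      [1, 0, 1, 1],
                      [1, 1, 0, 0],
                      [0, 0, 1, 0],
                      [1, 1, 1, 1]]), 3))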
Achievement Findings of Phase II
The treatment group (handheld electronic responders) had a mean achievement score of 89.98 on a 100-point scale, with a standard deviation (SD) of 8.817 and a standard error of the mean (SE) of 1.116. The comparison group (traditional verbal feedback) had a mean score of 84.41 on a 100-point scale, with a standard deviation (SD) of 12.618 and a standard error of the mean (SE) of 1.616. These data are presented in Table 3.
Table 3
Achievement Scores by Group

Group        N    M       SD      SE
Treatment    61   89.98   8.72    1.12
Comparison   61   84.41   12.62   1.62
An independent sample t-test was performed to test the null hypothesis “There is no
difference in student achievement between students who received feedback through the
audience response system and students who received only verbal feedback.” The value
of t was 2.835, which is statistically significant at the 0.005 probability level.
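The reported t value can be reproduced directly from the summary statistics in Table 3. The short sketch below uses the standard error of the difference between two independent means; with equal group sizes the pooled and unpooled formulas coincide, and the result matches the reported 2.835 to within rounding of the table values.

    import math

    # Summary statistics from Table 3
    m1, sd1, n1 = 89.98, 8.72, 61     # treatment group (electronic responders)
    m2, sd2, n2 = 84.41, 12.62, 61    # comparison group (verbal feedback)

    # Standard error of the difference between the two independent means
    se_diff = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    t = (m1 - m2) / se_diff
    print(f"t = {t:.2f}")   # about 2.84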
Student Attitudes. It was hypothesized that student attitudes toward the technology
would be favorable. In the NCSU phase of the study, three to six students in each class
were asked to complete a brief survey about their experiences with the CPS. Data were
collected from a total of 28 students in five classes (ANS 309, ARE 201, BIT 360 (two sections), and ET 252). Generally, students were very positive about the technology, with the exception of students in ARE 201, where there were technical problems. A complete
list of student comments is found in Appendix 1. Students liked the system. They found
that it made the class more interesting and they appreciated the fact that their
responses were anonymous. They believed the best use of the system would be to ask
questions throughout the lecture. They would like to see this technology used in other
classes.
In the high school phase of the study, focus group sessions were held with the students.
A random sample of seven students from each class participated in focus group
sessions. The student comments were recorded, transcribed and then analyzed using
TextSTAT 2.0. To analyze the content, word frequency tables were constructed. The
following words were used the most by the students in describing the CPS – helped,
easy, participate, fun, learn, pay attention, and understand. The frequency of each
utterance is found in Table 4.
Table 4
High School Student Focus Group Word Frequency Totals

Relevant Word or Word String   Frequency on a Per Utterance Basis
Helped                         65
Easy                           51
Participate                    38
Fun                            38
Learn                          25
Pay Attention                  17
Understand                     10
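TextSTAT 2.0 produced the counts in Table 4. For readers who want to replicate this kind of word-frequency tally, the few lines of Python below count whole-word and whole-phrase occurrences in a transcript file; the file name is illustrative, and the actual focus group transcripts are not distributed with this report.

    import re

    # Words and phrases tallied in Table 4
    phrases = ["helped", "easy", "participate", "fun", "learn", "pay attention", "understand"]

    # Illustrative file name; substitute the actual transcript file
    with open("focus_group_transcripts.txt", encoding="utf-8") as f:
        text = f.read().lower()

    for phrase in phrases:
        count = len(re.findall(r"\b" + re.escape(phrase) + r"\b", text))
        print(f"{phrase}: {count}")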
Instructor Reaction. The instructors, even the one with technical difficulties, saw the
value of the technology. They indicated they would use the technology in other classes
and believed that it improved student learning of the material. Overall, they were
positive about the CPS. The comments of the instructors are found in Appendix 2. The
high school teachers involved in phase II of the study had similar reactions.
Conclusions
Using a classroom response system, such as CPS, holds potential for enhancing
student learning. Students liked the system and thought it made class more interesting. They also believed it helped them learn the material better. While the collegiate grade comparisons did not support this belief, additional research using a more rigorous research design is needed. When such a design was employed with a high school group, learning was found to be enhanced by the use of an audience response system.
Dissemination
As a result of this project the following presentations have been made:
Looney, S. E., Stair, K. S., Moore, G. E., Conoley, J., Croom, D. B., & Wilson, E. B. (2005). Using Hand Held Electronic Responders to Induce Active Learning in the Classroom. Poster presented to the Southern Agricultural Education Research Conference, Little Rock, Arkansas, February.

Uricchio, C. K., Looney, S. E., Stair, K. S., Moore, G. E., Conoley, J., Croom, D. B., & Wilson, E. B. (2005). Using Hand Held Electronic Responders to Induce Active Learning in the Classroom. Poster presented to the American Association for Agricultural Education, San Antonio, May.

Conoley, J., Croom, D. B., Moore, G. E., & Flowers, J. F. (2006). Impacts of an Audience Response System on Student Achievement in High School Agriscience Courses. Paper presented to the Southern Agricultural Education Research Conference, Orlando, Florida, February.

Conoley, J., Croom, D. B., Moore, G. E., & Flowers, J. F. (2006). The Use of Audience Response Systems to Improve Student Achievement in High School Agriscience Courses. Paper to be presented at the National Agricultural Education Research Conference, Charlotte, May.
A journal article has been submitted to the Journal of Agricultural Education and one is
under development for the NACTA Journal (North American Colleges and Teachers of
Agriculture).
The technology has been demonstrated at two different teaching workshops conducted
in CALS.
Appendix 1: Undergraduate Student Evaluations of CPS
1. What are your overall impressions of the CPS Handheld Electronic Responder
System? Why?
ANS 309
My overall impression was a good one. It was more hands on instead of
constant note taking.
They are a waste of time…they don’t work half the time. It would have been
better if the range was better as well.
At least we didn’t have to pay for it this time.
ARE 201
This system was a pain, the program took a long time to set up and it seemed to
always have problems. The sensor that you point to was in good placement for
me because I sit in the front row but otherwise the teacher had to hold it up for
those in the back.
I didn’t really like it that much, but I think the reason for that is because the
teacher had so many problems with it. It seemed like it was more of a hassle
than a help.
I think that they could be beneficial if they worked correctly every time. They are
easy and interactive.
They are stupid and waste time. They take too long to set up and then they don’t
work once they are set up.
I think they have the potential of being useful for attendance and reviews, but the
process of going and getting the clickers and the time it takes to set up the
program wastes a lot of time.
I really enjoy using them – they are awesome for reviewing. I just wished they
worked more!
BIT 360, Section 1
It is a useful teaching tool because it allows us to review material and answer
questions as if they were on a test.
Helpful in class to make sure you understood the material.
It seems to work, the UFO thing (receiver) is pretty cool.
A very good system to keep people paying attention and listening
We liked it!
It is helpful and encourages class participation.
BIT 360, Section 2
They are very helpful. I especially like them for this type of class. It helps me
focus on the important points.
It is a nice alternative to typical question asking in a class setting because it gets
the entire class to answer the question anonymously with instant and graphical
feedback. This is great for a no-pressure learning atmosphere.
I enjoyed the responder system because it allowed me to interact directly with the
instructor while not feeling singled out. However, the instructor also did a great
job preparing appropriate questions that complemented the responder system.
ET 252
I like using the system because it quickly gives you feedback on course material
and it provides an alternative way to learn.
My overall impressions are very good. I like instant feedback.
It’s a good idea because it’s a useful way of teaching.
I like it because it helps me learn better.
Innovative use of technology.
I enjoyed it because it gave you an option to answer something without having to
deal with everyone’s responses.
I liked it because it gave a little variety and more interaction to the class.
It was good. It added another dimension to the lectures.
I thought at times the responder system was useful.
Class was much more organized.
2. Do you think there is any value in using this system in the classroom setting?
Why or why not?
ANS 309
Yes, because it will keep the students more active in a classroom setting.
Maybe in huge classes, but I think it is a waste of time in small ones.
Yes…if you keep fresh batteries and not make it too hard.
ARE 201
I think this could be a very good idea. Extra questions could be added within a
lecture to review what the teacher has gone over. However, this will only be
helpful if there isn’t too much of it to overwhelm the class.
I think there is value in using this system, but it was just not very effective for our
class.
I think that there is some educational benefit to them if used correctly.
No because they waste valuable time that we could use learning the material.
Yes, because the teacher doesn’t need to take roll and it’s an easy way to
review what was covered in the previous class(es).
Absolutely, it refreshed the student with newly learned material.
BIT 360, Section 1
Yes, because it helps in learning the material.
Yes, the questions to go along with the lecture were good.
Yes, but limited to how good the questions are.
Yes, but more so for smaller classes. In large classes it would take too much
time.
Yes, it helps you learn the material.
Yes, because it encourages participation and keeps students attentive and
awake.
BIT 360, Section 2
Yes, it helps you get an idea of questions that may be asked in the future or what
to study.
Yes, because it is a great way for the instructor to gauge the class’s knowledge
without written assignments.
Yes, they allow immediate classroom interaction so that the professors can
change their lectures according to students’ learning needs.
ET 252
Yes, instant feedback and alternative way to learn…having different ways to
learn makes it more interesting.
Yes, we can quiz ourselves in class.
Yes, because it is interactive and makes you want to pay attention so you can
play.
Yes, because it makes students pay attention more.
Yes, everyone has to be involved.
Yes, gives the instructor a good way to review.
Yes, it takes less time to review material using this system.
Yes, I remembered the questions I had to answer more than the straight lecture.
I do. Writing down the questions and answers gave me good notes.
Yes, because it gets all the students involved.
3. In your opinion, what would be the most effective way of using this system in
the classroom? For example, would it be more effective to ask questions strictly
at the beginning of the class, interspersed throughout the class, or strictly at the
end of the class as a review?
ANS 309
Use it throughout, a system of all three would be great.
Interspersed throughout.
I think it would serve better to be done in the beginning for a review of the
previous class or at the end of class for review of the day’s material.
ARE 201
Interspersed throughout, just one or two every once in a while.
I think it would be better if used at the end of class, to recap all of the information.
I think that questions interspersed throughout would be most effective by keeping
the student engaged during class.
Well, I don’t think they should be used at all but I guess at the end of class. It
doesn’t really matter.
I think asking questions at the beginning of class to review the previous class
material is the best.
From experience, it was best used at the beginning for review from the last
lecture, and at the end for test reviews, etc.
BIT 360, Section 1
Interspersed or strictly at the end of class. I think it (the material) would be
remembered more easily this way.
Interspersed throughout the class. (4 students)
Can be used for attendance at the beginning of class, then use a combination of
interspersed and at the end.
BIT 360, Section 2
I think it is most helpful to ask questions all along throughout the lecture…to
incorporate the questions into the whole lecture.
I prefer interspersed if the transition between lecture/question is quick.
I enjoyed having questions interspersed throughout the class so material that
wasn’t understood could be reviewed immediately.
ET 252
I would think that it could be used in all of these manners and be effective, just
depends on the situation.
I bet it would be fun to have some random questions through class, only problem
is it might make it harder to go to sleep!
I think interspersed throughout the class so you pay attention throughout the
class.
I enjoyed at the beginning, but it would probably be more effective interspersed.
Works well for a review, but with some creative thinking, it would work for any
part of the class.
Any would help.
I think it works well at the beginning of class as a review. I don’t think it would
work as well during the lecture because it would disrupt the flow of the lecture.
Throughout the class: keeps the students involved.
If you do it throughout the class, you can make sure students are awake and
paying attention.
Ask questions throughout the class as a review and introduction to information.
4. Would you like to see this technology used in other classes?
ANS 309
Yes I would.
Yes, if it is free to students.
Yes, if it was done better.
ARE 201
If the technology worked properly, I believe it would be an “OK” addition.
No (2 students).
I think it has potential in a variety of classes so yes.
Only in smaller classes.
BIT 360, Section 1
Yes (all 6 students)
BIT 360, Section 2
In longer classes such as this one, it helps to keep the class involved and I think
it is very good! But for shorter classes, I don’t think it would be as effective.
Sure.
Yes, it is already used in physics classes on campus.
ET 252
Yes. (5 students)
Sure, why not.
Possibly, I think it could work but for certain curriculums, it may not work as well.
Possible…depends on the class.
Sure, if I’m not graded!
I would depending on the class.
5. Did the use of this technology help you learn the material better? Do you consider it a
helpful learning tool? Why or why not?
ANS 309
Yes, because it refreshes your memory of the material.
No, teachers can just ask questions.
Somewhat…it made it better to see you were one of only five students that got it
right or wrong.
ARE 201
Yes, students can see questions as how they might appear on homework, tests,
and quizzes.
It could, but many of the times we reviewed it didn’t work very well.
No, it wasn’t helpful because it hardly ever worked.
I think that interspersed questions throughout the class are a useful learning tool.
No, not really. It was too much of a hassle to be a help.
It did not help this class any because we did not use it enough to see any real
positive/negative outcomes.
BIT 360, Section 1
Yes, because it reviewed the material after we learned it and helped make sure
we understood the material.
Yes, it clarified issues we had with the material.
Yes, it was helpful (4 students).
BIT 360, Section 2
Yes, this technology helped me learn the material better by jump starting my
brain at certain intervals and helping to identify more important topics.
I think it was helpful to me in organizing the lecture info. Knowing what I needed
to work on or study more also helped.
Yes! It was a very helpful learning tool especially since you could answer
candidly and not worry about the consequences of being wrong. It also helped to
keep me involved with the class.
ET 252
Yes, I was able to take useful notes off the questions.
It helped me take better notes.
Yes, I had to think about things more during lecture; made it more interesting.
I would consider it a good review tool, but not necessarily a good tool to learn
new material.
Yes, it gave us a way to review everything each class and have the teacher
explain anything we got wrong.
No, there is no incentive to get the right answer.
Yes, because it makes me learn and pay attention.
Yes, we can quiz ourselves and have instant feedback as to if we were right.
It changes the way you learn so it keeps you on your toes.
6. Did the use of this technology give you a more positive or more negative
attitude toward the class? Why?
ANS 309
Neither (2 students).
It gave me a more positive attitude knowing that it wasn’t just going to be just
note taking.
ARE 201
More positive, if I answered questions correctly that’s awesome. If I didn’t, then I
learned the right answer. Plus, no one’s embarrassed if you’re wrong because
only the teacher will know.
Neither, really. They didn’t have much of an effect on the class.
Negative because yet again it never worked properly.
I think if they would have worked correctly, it would have given me a more
positive attitude.
It didn’t really change my attitude toward the class.
No, it really didn’t make that big of a difference. It was just another learning tool.
BIT 360, Section 1
More positive because we were better able to understand the material.
More positive because everyone gets to participate.
More positive because it kept me more interested.
BIT 360, Section 2
More positive, it was a nice change of pace. The interactivity held my attention
better.
I enjoyed interacting throughout the class period through the use of this
technology. Therefore, I had a more positive attitude toward the class. However,
if my grade were dependent on using this technology, I would not have such a
positive attitude.
Positive. Without them it would have been harder for me to follow the lecture.
ET 252
Positive because it kept me motivated to learn more.
My attitude did not change because of this class.
N/A…we didn’t use it enough to change my overall impression.
A more positive attitude.
Positive…made the class more entertaining.
Slightly positive, made part of the class like a trivia game.
Positive because the questions were not too difficult.
More positive. It’s kind of fun when you only do it every once in a while.
Positive. Considering this class has technology in the name, having used
technology to learn is always a plus.
7. Please list the one thing you liked best about this system:
ANS 309
Get results right then and there.
Nothing.
ARE 201
Review of questions.
It was effective in keeping up with attendance.
Interactive.
Nothing.
The review questions.
The format and the system were easy to use. Everything is displayed clearly and
largely for all to see.
BIT 360, Section 1
Made class more interesting.
I liked how everybody got their own handheld responder.
Instant feedback on understanding.
BIT 360, Section 2
The questions being spread throughout the lecture and the supplemental
questions the lecture questions generated.
The ability to interact throughout the lecture!
Anonymous answering.
ET 252
Doesn’t tell the individual people whose answers are wrong.
Attendance.
Anonymous users.
Reviewing each class.
No speaking and no professor waiting for spoken answers.
Multiple-choice questions.
Instant feedback, we can write down what the correct answer is and its great for
studying.
Alternative way to learn.
8. Please list the one thing you disliked most about this system:
ANS 309
Liked it all.
Getting it set up.
ARE 201
Had a lot of problems.
It had too many problems.
Sometimes it doesn’t work.
It didn’t work.
The time it took to get the program working (started in the beginning of class).
It didn’t work a lot of times.
BIT 360, Section 1
Took too long for students to buzz in.
Putting the responders back in order in the case at the end of class.
After you answer, there is a wait time (for everyone else to answer) which delays
class.
If you are in the back of the room, the remote may not work well.
BIT 360, Section 2
None
I really did not dislike any part of the system.
The software seemed a little tricky for the teacher sometimes.
ET 252
None.
No complaints.
Someday students will have to purchase the remote.
I would dislike it if it was actually used as a grade.
New technologies sometimes have kinks in them which creates down time.
Appendix 2: Instructor Evaluations of CPS
1. What are your overall impressions of the CPS Handheld Electronic Responder
System? Why?
The system has merit in the right circumstances but it has its limitations. (ANS 309)
I like the concept and the interaction that it gives the students and the instructor for
specific types of questions. Sometimes it's hard to judge when students get what
you're saying, and this is an alternate channel of communication that's worth merit.
(ARE 201)
I love it. Students seemed more actively engaged in lecture. They really enjoyed
using them and told me that. (BIT 360, Section 1)
Excellent, the students greatly enjoyed it (they complained when we didn’t use it in
lecture). The students felt more involved in the class, and ALL students, not just one
or two who always spoke up. So it forced everyone to think. For the instructor, it
helped me gauge understanding better. (BIT 360, Section 2)
2. Do you think there is any value in using this system in the classroom setting?
Why or why not?
Yes, generates direct student involvement in material being presented. (ANS 309)
Yes, because those students who are quiet and don't speak up get a chance to
interact when they wouldn't in any other way. (ARE 201)
Yes, keeps students engaged and students can let you know anonymously if they do
not understand. (BIT 360, Section 1)
Definitely, the teacher can see where students are missing something conceptually
and go over it on the spot. Also helps keep students awake.  (BIT 360, Section 2)
3. In your opinion, what would be the most effective way of using this system in
the classroom? For example, would it be more effective to ask questions strictly
at the beginning of the class, interspersed throughout the class, or strictly at the
end of the class as a review?
Used sporadically (interspersed). (ANS 309)
I used it most to start and end classes to introduce and review material presented. I
think having more time to actually put questions into the lecture and having more
confidence in the system working would create better uses than I've personally seen.
(ARE 201)
Definitely interspersed. (BIT 360, Section 1)
Interspersed, definitely. (BIT 360, Section 2)
4. Would you use this technology in other classes?
Yes, I would consider using this in my 2-year class as an incentive to stay focused.
(ANS 309)
Yes. (ARE 201)
Yes, I would use it for every class. We ended up buying a set, in fact. (BIT 360,
Section 1)
You bet. (BIT 360, Section 2)
5. Do you feel the use of this technology helped your students learn the material
better? Do you consider it a helpful learning tool? Why or why not?
Yes, my limitation in this class was too many class meetings outside the classroom.
(ANS 309)
I think in the way I used it, it helped solidify info in their minds about what they
should pay attention to. I'm not sure that it actually increased information as much
as increasing retention. (ARE 201)
Yes, they paid attention better because they were actively engaged. Also, they
could let me know when they didn’t understand without feeling singled out. (BIT 360,
Section 1)
Absolutely! Again, instead of sitting back, not thinking, and letting other students
answer, everyone had to think. Plus they had to switch from listening/sleeping mode
to thinking/applying mode, which is always a good thing. (BIT 360, Section 2)
6. Do you think the use of this technology increased your students’ attitudes
toward the class? Why or why not?
N/A (ANS 309)
No, because the technology was a major flop for such a long time that they basically
gave up on it. (ARE 201)
I think it did. (BIT 360, Section 1)
That I’m not sure, but they sure were disappointed when we didn’t use it. (BIT 360,
Section 2)
7. Please list the one thing you liked best about this system:
Focused student attention on lecture material. (ANS 309)
Free for students rather than having to pay the fee. (ARE 201)
Students paid attention better because they were actively engaged. (BIT 360,
Section 1)
It encouraged classroom activities. (BIT 360, Section 2)
8. Please list the one thing you disliked most about this system:
Set-up time required!!! (ANS 309)
Technical glitches and lack of IT support. (ARE 201)
Entering questions into software was clunky. (BIT 360, Section 1)
The games options (team activities) could be improved – allow many questions per
category on “challenge”. For “there it is”, allow electronic versions of the questions
to be asked. Also, it was time consuming to program but not too bad. I disliked that
you were limited how large the question box could be (it covered the lecture slides
from the power point). (BIT 360, Section 2)
9. Please list any other comments here:
This semester was very trying and shouldn't have been. I wish the IT folks were
more receptive to alternate uses of their equipment. (ARE 201)
I would like to be able to make the question box smaller. (BIT 360, Section 1)
None. (BIT 360, Section 2)