Advanced Usability
FINAL REPORT
Thuy Nguyen
Table of Contents

1. Introduction
2. Methodology
   2.1 Test procedures
   2.2 Test participants
   2.3 Description of learning module
   2.4 Assessment metrics
   2.5 Analysis method
3. Analysis of Testing Results and Feedback
   3.1 Technical usability
   3.2 Pedagogy
   3.3 Learning experience
4. Issues Identified and Suggestions for Next Steps
5. Limitations and Concerns
6. Bibliography
Appendices
   Appendix A  Consent Form
   Appendix B  Pre-Test Survey (August testing)
   Appendix C  Post-Test Survey (August testing)
   Appendix D  Pre-Test Survey (November testing)
   Appendix E  Post-Test Survey (November testing)
   Appendix F  Completed Surveys
1. Introduction
This usability study is part of an ongoing two-year research project on pedagogical design for technology education of the construction workforce. In this case study, the users are construction students and workers; we also call them "learners". The research has three main objectives: to understand the learning styles of the learners, to assess their current technology skills and knowledge, and to recommend design guidelines for technology-enhanced teaching tools that suit the learning characteristics of the learners. Comprehensive self-assessed surveys are used to achieve the first two goals. To shed light on how technology-based teaching tools can support the learners' cognitive preferences and help them improve their technology skills, a learning module was developed as a simple stand-alone software program installed on a tabletPC. The suitability and effectiveness of this software as a teaching tool for the targeted audience are tested through usability testing of the learning module, which is the focus of this report. Findings from this study will help the project team refine the learning module into a better teaching tool and, more importantly, develop design guidelines for future teaching tools. As this design process requires continuous iterations and improvements, it is still in progress. This report presents what we have learned so far in this iterative process.
The study used the heuristic evaluation method of usability testing for the first two design
iterations of the tool. Testing was carried out in two phases: one on the original version of the
learning module, and one on the improved learning module. There were four users participating
in each of these two phases. In both cases, users were asked to use the software to perform a
material management exercise. A pre-test and a post-test survey were also answered by the
test participants to gain data on their demographic information, their learning styles or
preferences, their understanding of basic concepts introduced in the learning module as well as
the feedback they had from going through the exercise. The collected data were then examined
and analyzed to identify the gaps or mismatches between the preferred learning styles and the
design. Recommendations for improvements were made based on the data collected as well as
the personal judgment and expertise of the investigator.
Evaluation of the tool was based on the two main components of this usability study: the achievement of the learning objectives (pedagogical objectives) the exercise was designed to deliver, and the effectiveness of the hardware and software in supporting the learners to achieve their goal. More specifically:
- Learners should be able to substantially complete the material management exercise within 30 minutes with minimal support from the instructor (who was also the investigator).
- Learners should demonstrate a basic understanding of the applications of RFID (radio frequency identification) technology on construction jobsites, which is the main subject of the learning module.
- Learners should be aware of and able to use all the main features of the software based on intuition, and should not struggle to figure out how certain features work.
- Learners should be able to reach the plateau of their learning curve by the end of the exercise. This will be explained in more detail in the task description section.
Within the construction workforce, there are two distinct groups of learners with significantly different learning characteristics. The first group is the civil engineering students or construction professionals who have college degrees and/or managerial experience. The second group consists of construction workers who might have a lot of practical experience but do not have much formal education. These two groups also differ in their computer experience, especially with new advanced technologies and their applications. All these factors are critical in the design of teaching tools. To date, we have been able to test participants who fall into the first group. Most of the participants have up to five years of construction work experience; only a few have never worked in construction before. In the next stage of the project, testing will be expanded to field workers to cover all sub-groups of the targeted audience.
The data collected from the testing revealed that the original design of the learning module was lengthy and lacked some logical features needed to perform the task efficiently. The participants were generally satisfied with the original interface; however, the tabletPC was a little heavy to carry around for the duration of the task. Based on the feedback from the first testing, the learning module was redesigned to reduce the number of activities to be performed and to incorporate additional supporting features so that these activities can be carried out more efficiently. The results from testing the second version confirmed that these modifications were useful in enhancing the user experience and the teaching effectiveness of the tool. However, problems still exist in this second version that need to be addressed in the next round of design iterations.
The participants were generally appreciative of the interactivity of the teaching tool, especially the immediate feedback of the program to the actions taken. They also liked the technology component of the learning module, where live communication occurred between the tabletPC and a network of sensors installed on the jobsite. The general comment was that the technology they were exposed to in this learning module had great application potential on real construction jobsites.
Based on the users' performance data, the users' feedback, and the investigator's own judgment and expertise, recommendations for next steps were made as follows: 1) reduce the length of the exercise and/or introduce tasks of a different nature, 2) review questions in the post-test survey to avoid repetition and ambiguity, 3) resize the Activities pane and make pane dimensions fluid, and 4) use clicks to replace the drag-and-drop features.
Section 2 presents the research methodology, with a focus on the test procedures, the background of the test participants, and the types of data collected and how they were collected. Section 3 describes and analyzes the testing results, the observations made during testing, and the feedback from the participants about the learning module. Section 4 integrates the users' feedback and the investigator's judgment to recommend solutions for the problems identified. Section 5 discusses the limitations of this study and related concerns about the research outcomes.
2. Methodology

2.1 Test procedures
The usability evaluation of the learning module was conducted by Thuy Nguyen in Austin, Texas, in August 2007 and November 2007. Four participants took part in the first testing, and four in the second.
For the August testing of the original version, each participant spent 1 hour to 1 hour and 15
minutes using the tabletPC and answering pre- and post-test questionnaires. During this time,
the participants:
- Were informed about the testing procedures and signed the consent form (Appendix A). The same form was used for the second testing of the refined learning module in November.
- Answered a pre-test survey about personal demographic information and construction experience (Appendix B).
- Answered the Index of Learning Styles questionnaire to help identify their learning preferences. This is part of the pre-test survey (Appendix B).
- Listened to the instructor's introduction to the interface of the software and the learning objectives of the module. The duration of this "talk" varied across participants, ranging from 5 to 10 minutes, as the instructor was sometimes interrupted to answer questions or address concerns raised by the participants. This led to differences in the depth of training for each participant and was considered a factor that might have affected their performance.
- Carried out the material management exercise (30 minutes to 1 hour).
- Answered a post-test survey about their experience using the learning module (Appendix C).
Based on the findings from this testing, the module was redesigned to solve the issues identified in the first version and to introduce some new features. The second version was tested in the last week of November 2007. For this testing, the pre-test questionnaire was slightly modified (Appendix D), and the post-test questionnaire was extensively revised (Appendix E) to include a more in-depth and specific assessment of how many of the learning objectives had been achieved by the learners. The 10-minute "training session" was also redesigned as a more formal PowerPoint presentation so that the amount and content of information given to participants were more standardized. The testing took 1.25 to 1.75 hours to complete, which was longer than the investigator anticipated. The time spent using the tabletPC ranged from 25 minutes to 65 minutes. The additional time needed for the whole testing was attributed to the longer time required to answer the questionnaires.
2.2 Test participants
The background information about the participants was obtained through a pre-test survey that consisted of a demographic information section and the Index of Learning Styles (ILS) questionnaire. The ILS is a tool developed by Felder et al. (year) at North Carolina State University to determine the learning preferences of engineering students along four scales: active – reflective, sensing – intuitive, visual – verbal, and sequential – global. For each scale, a score of 1 to 11 is assigned toward one of its two dimensions. A score of 1-3 indicates that the learner is fairly balanced between the two dimensions of that scale. A score of 5-7 shows a moderate preference toward one of the two dimensions. A score of 9-11 reflects a very strong preference for one dimension. For example, participants 1 and 2 in Table 1 have very strong and moderate preferences for visual learning methods (their scores are 11 and 7, respectively, toward the Visual dimension of the Visual – Verbal scale). All this background information might be important when interpreting data, as it might help explain some bias or avoid overgeneralization of trends. More importantly, it helps evaluate the usability and effectiveness of the teaching tool for every individual. The completed surveys are included in Appendix F, and a summary is given in Tables 1 and 2 below.
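The ILS scores reported in Tables 1 and 2 (for example "5INT" or "11VIS") follow this scoring scheme. As an illustration only, the following minimal Python sketch assumes the standard ILS scoring of 11 forced-choice items per scale, with the item-to-scale grouping visible in the questionnaire reproduced in Appendix D; the function and variable names are hypothetical and are not the project's actual scoring code.

```python
# Minimal sketch of ILS scale scoring (assumed scheme: 11 items per scale,
# score = margin of "a" over "b" answers, reported toward the majority pole).
SCALES = {
    "ACT-REF": ("ACT", "REF"),   # items 1, 5, 9, ..., 41
    "SEN-INT": ("SEN", "INT"),   # items 2, 6, 10, ..., 42
    "VIS-VRB": ("VIS", "VRB"),   # items 3, 7, 11, ..., 43
    "SEQ-GLO": ("SEQ", "GLO"),   # items 4, 8, 12, ..., 44
}

def ils_scores(answers):
    """answers: dict mapping item number (1-44) to 'a' or 'b', all items answered."""
    scores = {}
    for offset, (scale, (a_pole, b_pole)) in enumerate(SCALES.items(), start=1):
        items = range(offset, 45, 4)            # every 4th item belongs to this scale
        count_a = sum(1 for i in items if answers[i] == "a")
        count_b = 11 - count_a
        pole = a_pole if count_a > count_b else b_pole
        scores[scale] = f"{abs(count_a - count_b)}{pole}"   # e.g. "5INT", "11VIS"
    return scores
```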
Table 1. Profile of participants in August testing

Participant # | ID  | Age   | Gender | Work experience | Work area
1             | 227 | 18-25 | Male   | < 2 years       | Management
2             | 766 | 35-45 | Male   | > 5 years       | Management
3             | 208 | 18-25 | Female | None            | Support
4             | 656 | 25-35 | Male   | 3-5 years       | Management

Participant # | English proficiency | ACT-REF | SEN-INT | VIS-VRB | SEQ-GLO
1             | Fluent              | 1REF    | 5INT    | 11VIS   | 9GLO
2             | Fluent              | 3ACT    | 1SEN    | 7VIS    | 1SEQ
3             | Fluent              | 3REF    | 7SEN    | 1VRB    | 1GLO
4             | Fluent              | 5REF    | 9SEN    | 3VIS    | 5GLO

ACT = Active, REF = Reflective, SEN = Sensing, INT = Intuitive, VIS = Visual, VRB = Verbal, SEQ = Sequential, GLO = Global
Table 2. Profile of participants in November testing

Participant # | ID     | Age   | Gender | Work experience | Work area
1             | SSE238 | 18-25 | Male   | None            | Management
2             | SY3578 | 25-35 | Male   | < 2 years       | Support
3             | LN3733 | 18-25 | Female | < 2 years       | Management
4             | SSB668 | 18-25 | Male   | < 2 years       | Management

Participant # | English proficiency | ACT-REF | SEN-INT | VIS-VRB | SEQ-GLO
1             | Proficient          | 5ACT    | 3INT    | 9VIS    | 3GLO
2             | Sufficient          | 3ACT    | 1SEN    | 3VIS    | 1SEQ
3             | Fluent              | 7ACT    | 3SEN    | 7VIS    | 5GLO
4             | Sufficient          | 3REF    | 3SEN    | 3VIS    | 5SEQ

(Learning style abbreviations as in Table 1.)
It should be noted that the testing sample is a special group of people. All eight participants across the two tests were graduate students in construction engineering and project management at the time of testing. This group of users is not representative of all learners in the construction industry (in particular, it does not represent construction workers with limited education well). However, most of these students had worked in the field before and therefore had some practical sense when approaching problems. Furthermore, none of these participants had used a tabletPC before; in this respect, they were close to the average of the whole population. These users can safely be classified as the most sophisticated group within the construction workforce, and hence any technical problems and most of the learning issues they encountered would likely be an issue for the average user of the tool.
2.3 Description of the Learning Module
The original version of the Learning Module
The learning module developed in this project is essentially a material management exercise. On modern construction jobsites, material pallets are attached with RFID tags that contain information about the materials. When a person walks through the jobsite with a handheld device equipped with an RFID receiver, the device can communicate with all the active RFID tags and obtain material information to display on its screen. Based on this information, one can plan construction activities. In this learning module, a tabletPC is used as the central device that receives information from the RFID tags, and the learning module is the user interface for processing the data and managing the construction activities based on the availability and location of materials.
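To make the data flow concrete, here is a minimal sketch of how tag reads received by the tabletPC could be turned into the material records displayed in the module. The payload format, field names, and types below are assumptions made for illustration; the module's actual data format is not described in this report.

```python
# Hypothetical sketch: turning raw RFID tag reads into material records.
# The "name;qty=...;zone=..." payload format is an assumed example only.
from dataclasses import dataclass

@dataclass
class Material:
    tag_id: str
    name: str
    quantity: int
    zone: str
    located: bool = False   # becomes True once the pallet is pinned on the map

def parse_read(tag_id: str, payload: str) -> Material:
    name, *attrs = payload.split(";")
    fields = dict(attr.split("=", 1) for attr in attrs)
    return Material(tag_id=tag_id, name=name,
                    quantity=int(fields["qty"]), zone=fields["zone"])

# Example read from one of the sensors that mimic RFID tags:
print(parse_read("TAG-0412", "Rebar #4;qty=120;zone=B2"))
```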
Figure 1 captures the user interface of the learning module. There are four interactive panes on
the screen. Pane 1 displays the data received through live communication with the RFID tags
(in this project, we use sensors to mimic RFID tags for learning and teaching purposes). Pane 2
is the map of the construction jobsite. Pane 3 is the current construction schedule with activities
and their start and finish dates as well as durations. Pane 4 is a supporting feature to Pane 3:
when an activity in pane 3 is clicked on, Pane 4 displays the material required for that activity to
be carried out.
There are three groups of tasks the learners (or users) are supposed to perform in this learning
module: to locate the materials (shown in Pane 1) on the map, to associate the found materials
in Pane 1 (those that have been located on the map) with the corresponding activities in Pane 3,
and to validate the schedule based on the availability of the materials. As illustrated in Figure 2,
the users can carry out task groups 1 and 2 in sequence or in parallel, as the fluidity of the
program allows the users to explore the program in a flexible way.
Figure 1. User interface of the learning module. 1 – Data received from RFID tags. 2 – Map of the jobsite. 3 – Schedule of activities. 4 – Materials required for each activity (supporting information for Pane 3).
Figure 2 – Task sequence: Locate materials → Associate materials with activities → Validate schedule → Identify conflicts.
In this first testing, the participants had to carry a tabletPC around a (virtual) jobsite, locate 34 material pallets on the map, associate these 34 material items with 14 activities in the schedule, and validate or make changes to the schedule if need be.
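As a rough illustration of the three task groups (locate, associate, validate), the sketch below models the exercise state in Python. The class, method, and field names are hypothetical and only mirror the workflow described above; they are not taken from the module's implementation.

```python
# Hypothetical sketch of the exercise workflow: locate pallets on the map,
# associate them with schedule activities, then validate the schedule.
class Exercise:
    def __init__(self, pallets, activities):
        self.pallets = pallets        # {pallet_id: {"located": False, "activity": None}}
        self.activities = activities  # {activity_id: {"required": [pallet_id, ...]}}

    def locate(self, pallet_id, x, y):
        """Task group 1: drop a pin on the map for a detected pallet."""
        self.pallets[pallet_id].update(located=True, position=(x, y))

    def associate(self, pallet_id, activity_id):
        """Task group 2: link a located pallet to the activity that needs it."""
        if not self.pallets[pallet_id]["located"]:
            raise ValueError("pallet must be located on the map first")
        self.pallets[pallet_id]["activity"] = activity_id

    def validate(self):
        """Task group 3: flag activities whose required pallets are not associated."""
        conflicts = []
        for act_id, act in self.activities.items():
            missing = [p for p in act["required"]
                       if self.pallets[p]["activity"] != act_id]
            if missing:
                conflicts.append((act_id, missing))
        return conflicts
```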
The second version of the Learning Module
In terms of structure and content, the second version is essentially the same as the first one,
except for the following additional features:
- The ability to list RFID data by IDs or alphabetically.
- The ability to remove a material from the map or re-locate it if previously misplaced.
- The ability to lock or unlock the schedule to avoid accidental changes.
Figure 3 – The refined learning module. Callouts in the figure: RFID data are received from live communication with the tags; a pin is dragged and dropped on the map to locate a material; a note can be added by clicking if needed; a green dot indicates that a material pallet has been found and located; a chain icon means the material has been associated with an activity; clicking on an activity shows its required materials; schedule bars can be moved to reschedule activities; a lock/unlock control stops/starts changes to the schedule; activity status is color coded (green = all required materials available, gold = some required materials missing, purple = all required materials missing).
Compared to the first module, the second version of the learning module has fewer material
items and activities (21 and 9, as opposed to 34 and 14). The most critical change incorporated
in this refined version is the extensive modifications to the post-test questionnaire to include
more specific measures of what the learners learned. These will be discussed in more detail in
the section on assessment metrics.
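The activity color coding summarized in Figure 3 amounts to a simple status rule; the sketch below only illustrates that rule with assumed names and data shapes, and is not taken from the module's code.

```python
# Illustrative sketch of the activity status colors described in Figure 3:
# green = all required materials available, gold = some missing, purple = all missing.
def activity_color(required_ids, available_ids):
    missing = [m for m in required_ids if m not in set(available_ids)]
    if not missing:
        return "green"
    if len(missing) == len(required_ids):
        return "purple"
    return "gold"

# Example: an activity needing three pallets with only one of them found is shown gold.
print(activity_color(["P-01", "P-02", "P-03"], ["P-02"]))   # -> "gold"
```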
The training module
Another change in the design of the second testing is the inclusion of a more formal training module in the form of a presentation. This training takes around 10 minutes, with the instructor delivering the presentation to the participants in person. The training module includes a brief introduction to basic tabletPC skills, the application of RFID technology in construction, the main features of the learning module, the sequence of tasks in the exercise, and the expected learning objectives. This provides participants with consistent information and training before they start using the tabletPC, so that their performance is comparable with one another. Ultimately, this training module will be turned into an instructional video for future testing.
Personas
As discussed earlier, learners in the construction workforce can be categorized into two groups: group 1 consists of those with a college degree (or currently in college) or in managerial positions, and group 2 consists of construction workers with a limited educational background. These can be considered the two main personas for this tool. In this study, however, we have been able to test only participants from group 1. Extending usability testing to group 2 participants is the next step of the project. Findings from testing this first group will help establish a benchmark and a research framework for approaching the second group.
2.4 Assessment metrics
The post-test survey is in fact an assessment survey that provides data for the evaluation metrics of the learning module. There are three groups of metrics that align with three main goals: pedagogical metrics to evaluate the level of achievement of the learning objectives, technical usability metrics to evaluate the suitability of the hardware and software features to the learners' physical and cognitive preferences, and learning experience metrics to determine whether or not the design of the learning module supports the users' learning styles (determined through the ILS questionnaire). The metrics described below, and the results of these evaluations, correspond to the post-test questionnaire used in the second testing (which was substantially improved from the first testing). A summary of findings from the first usability testing is discussed briefly in the results section.
Pedagogical metrics
The ultimate learning objective of this exercise is to make learners aware of the applications of RFID and other wireless technologies on construction jobsites. The exercise is also expected to give them some insight into the nature of real-time planning and scheduling of construction activities. Another objective is for learners to become familiar with the tabletPC and the software program installed on it. The metrics used to evaluate users' performance with regard to these goals include:
- Percentage of task completion (the number of material items found, located, and associated with activities); a minimal sketch of this computation is given after this list.
- Ability to diagnose conflicts (missing materials, misplaced materials, space conflicts, schedule conflicts).
- Understanding of the RFID technology applied in this learning module. This metric and the previous one are evaluated through a short test (built into the post-test survey) with problem-solving, multiple-choice, short-answer, and true/false questions.
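As referenced in the first bullet above, the completion percentage can be computed directly from the exercise state. The sketch below is a minimal, assumed formulation; the report does not specify the exact weighting of found/located versus associated items, and the example numbers are hypothetical.

```python
# Assumed sketch of the task-completion metric: the share of pallet-level steps
# (locating and associating) that the learner actually completed.
def completion_percentage(pallets):
    """pallets: list of dicts with boolean 'located' and 'associated' flags."""
    total_steps = 2 * len(pallets)               # each pallet: locate + associate
    done = sum(p["located"] + p["associated"] for p in pallets)
    return 100.0 * done / total_steps if total_steps else 0.0

# Example with the second version's 21 pallets: 18 located, of which 15 also associated.
pallets = ([{"located": True, "associated": True}] * 15
           + [{"located": True, "associated": False}] * 3
           + [{"located": False, "associated": False}] * 3)
print(round(completion_percentage(pallets), 1))  # -> 78.6
```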
Technical usability metrics
To evaluate the technical usability of the tools, the following questions are asked:
- Was the interface visually appealing?
- How comfortable were you working with this device in general?
- How comfortable were you working with the stylus?
- How comfortable were you with the lighting of the screen?
- According to you, the size of the screen was: Too small / Rather small / Just right / Rather big / Too big.
- How often did you find what you wanted to find? Never / Rarely / Occasionally / Often / Very often.
- Did the technology make the exercise more interesting or less interesting? A lot less interesting / Somewhat less interesting / No impact / Somewhat more interesting / A lot more interesting.
- What technical problems did you encounter when using the devices? (Check all that apply.) Unable to read data from a sensor / Touch screen not sensitive / Unable to see screen clearly / Difficult to use stylus / Unable to find wanted functions / Difficult to navigate the site / Battery failure / Unable to load plan, schedule or material list / Difficult to switch views / Other problems (please specify).
More questions on the usability of the graphical interface are also included in the learning
experience section of the questionnaire.
Learning experience metrics
Learners are asked to indicate whether or not they agree with 16 statements about their
learning experience with the exercise, on a scale from 1 to 5. The answers to these questions
provide data on the matching of the users’ learning styles with what the module has to offer.
Sample statements are:
- The interactive features of the exercise made me feel engaged throughout the whole exercise.
- I believe the exercise promoted active interactions and thinking that facilitated good long-term retention of the material.
- I need more sequential instructions to avoid getting lost and not knowing what to do next.
- The instructional presentation was helpful in introducing the concepts I would learn more about in the actual exercise.
The complete list of statements (and the answers from participants) will be presented in the results section. Participants are also asked to give their opinion on the overall design of the exercise, such as:
- The task descriptions were clear.
- The flow of tasks was logical and easy to follow.
- The expectations were communicated clearly and you understood what you were supposed to do.
- How was the amount of instruction given to you before the task?
- How often did you need extra instruction from the instructor when you carried out the task?
- Was the task easy or challenging? Rate your experience.
- Was the length of the exercise appropriate?
- Did you enjoy the experience? Rate your experience (1 being "did not enjoy it at all", and 5 being "enjoyed it very much").
2.5 Analysis method
The evaluation of the pedagogical and technical usability metrics is in general straightforward, as the data obtained from the answers for these two aspects are quite direct and tangible. For example, technical problems can easily be identified when users are unable to operate the device or use a feature, and the achievement of learning objectives can be determined from the results of the assessment test. However, analyzing the match between a user's learning styles and the design of the learning module is much more difficult, as it cannot be directly extracted from the answers. This analysis requires abstract interpretation of the survey answers to determine their meaning in relation to the learning preferences.
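One way to make this matching analysis concrete is to average the agreement scores of the statements tagged with each learning-style dimension (as later reported in Table 4) and compare them against each participant's ILS profile. The sketch below only illustrates that idea; the statement-to-dimension tags are a subset taken from Table 4, and the data structures are assumptions rather than the analysis procedure actually used.

```python
# Illustrative sketch: mean agreement (1-5) with the statements tagged for each
# learning-style dimension, as a rough indicator of how well the module supports it.
STATEMENT_TAGS = {36: "INT", 38: "ACT", 39: "ACT", 40: "ACT",
                  45: "VRB", 46: "VIS", 47: "GLO", 49: "GLO", 50: "GLO"}

def dimension_support(responses):
    """responses: {statement number: score 1-5} for one participant."""
    sums, counts = {}, {}
    for q, score in responses.items():
        dim = STATEMENT_TAGS.get(q)
        if dim:
            sums[dim] = sums.get(dim, 0) + score
            counts[dim] = counts.get(dim, 0) + 1
    return {d: sums[d] / counts[d] for d in sums}

# Example: participant 3's scores on the three ACT-tagged statements (Table 4).
print(dimension_support({38: 4, 39: 3, 40: 5}))   # -> {'ACT': 4.0}
```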
3. Analysis of Testing Results and Feedback

3.1 Technical Usability
The feedback from the first testing revealed that, in general, the participants found the interface visually appealing. Three out of four found the tabletPC, the stylus, and the screen comfortable or very comfortable to work with. All four participants thought the screen size was just right. They often or very often found what they wanted to find on the screen and on the jobsite. The consensus was that the technology made the exercise more interesting; however, the exercise was too long and the tabletPC was rather heavy to carry around for the duration of the exercise.
Based on this feedback, modifications were made so that the exercise now has fewer materials and activities. However, nothing could be changed regarding the physical properties of the device. The second testing found that, although the time taken to complete the exercise was reduced, the exercise was still long and some participants still found the device heavy. The feedback on the comfort of the device was even more negative.
Other comments include:
- It was hard to drag and drop push pins on the map.
- Materials had to be dropped right on the bars; otherwise they could not be associated with the activities.
- The Activities window was small. I would rather scroll the list of items than the schedule.
- The size of the window should be adapted to the screen.
- The stylus was difficult to use.
3.2 Pedagogy
Table 3 summarizes the performance on the pedagogical metrics for the three participants of the second testing (one participant did not turn in the post-test survey). It can be seen that participants #1 and #3 did an excellent job, while participant #2 seemed to have struggled and also took the longest time to complete the exercise. Overall, the learning objectives were successfully achieved for two out of three participants.
Table 3. Performance on the pedagogical metrics in November testing

Metric                                                   | Participant 1 | Participant 2            | Participant 3
Time on task                                             | 40 min        | 1 hour                   | 25 min
Task completion                                          | Completed     | Substantially completed  | Completed
Understanding of RFID communication                      | 4/4           | 2/4                      | 3/4
Representations of material availability and status      | 3/3           | 3/3                      | 3/3
Conflict diagnosis                                       | 4/4           | 1/4                      | 2/4
True/False questions on RFID and wireless communication  | 5/8           | 8/8                      | 8/8
It was realized that the wording of some questions in this assessment test might have been the
reason for the poor performance of participant #2 (and #1 occasionally). Assessment needs to
be carefully designed when conceptual knowledge is to be tested.
3.3 Learning Experience
In general, the participants found the exercise long. However, they all found that the task descriptions were clear, the flow of tasks was logical and easy to follow, expectations were communicated effectively, and the amount of instruction given before carrying out the task was just enough. Participant #1 thought the task was very easy, participant #2 found it challenging, and participant #3 thought it was normal. In general, they enjoyed the experience somewhat (except for #2, who was neutral).
A summary of responses to the learning experience questions is provided in Table 4. The last column indicates the specific learning preference(s) corresponding to a high score on that question.
Table 4. Responses to learning experience questions in November testing
(1 – Strongly disagree, 2 – Disagree, 3 – Neutral, 4 – Agree, 5 – Strongly agree)

#  | Statement | #1 | #2 | #3 | Average | High score
35 | The interactive features of the exercise made me feel engaged throughout the whole exercise. | 3 | 4 | 3 | 3.33 |
36 | The design of the exercise was flexible and interactive enough for me to freely explore different ways to do things. | 4 | 4 | 4 | 4 | INT
37 | The range of things I could do at a time was too broad, and I got lost during the exercise. | 2 | 4 | 4 | 3.33 | SEN/SEQ
38 | The flexibility of the program and the repetitiveness of some tasks helped me correct the mistakes I had made and reinforce my previous learning. | 3 | 5 | 4 | 4 | ACT
39 | The exercise motivated me to learn more about the topic of RFID/wireless technology and its application in construction. | 3 | 4 | 3 | 3.33 | ACT
40 | I believe the exercise promoted active interactions and thinking that facilitated long-term retention of the material. | 4 | 4 | 5 | 4.33 | ACT
41 | The number of repetitive tasks was just enough for me to understand how the exercise works and perform the action smoothly without getting bored. | 4 | 3 | 2 | 3 | INT
42 | There was not enough structure to the learning module. I want a specific procedure to follow so that I don't have to think about what to do next. | 2 | 3 | 2 | 2.33 | SEQ
43 | The learning module was flexible enough for me to actively use my own judgment and intuition to make decisions. | 3 | 3 | 4 | 3.33 |
44 | The design of the learning module represented well the physical and conceptual relationships in the real world. I can relate the virtual representations in the module with the physical relationships in the real world. | 3 | 4 | 3 | 3.33 | SEN
45 | The learning module had too many graphics without enough text or audio instructions to help me understand. | 1 | 3 | 1 | 1.67 | VRB
46 | The graphical representations (such as push pins, color codes, chain links) were helpful in improving my understanding about the consequences of the activities I was performing. | 4 | 3 | 4 | 3.67 | VIS
47 | The design of the learning module was comprehensive and fluid enough to give me the big picture of the ultimate task at every stage. | 4 | 3 | 2 | 3 | GLO
48 | I need more sequential instructions to avoid getting lost and not knowing what to do next. | 2 | 3 | 2 | 2.33 | SEQ
49 | The instructional presentation was helpful in introducing the concept that I would learn more about in the actual exercise. | 5 | 3 | 5 | 4.33 | GLO
50 | The flowcharts and list of learning objectives helped me see the big picture and made learning more effective. | 4 | 4 | 4 | 4 | GLO
To better interpret these responses, let us revisit the results of the learning style questionnaire (Table 2 in Section 2.2). All three participants (1, 2, 3) are active learners who prefer to actively do things in order to understand them, as opposed to just reading and reflecting. Two of them prefer global thinking to sequential thinking, and one is balanced. They strongly prefer visual learning to verbal learning, and do not have a clear preference between the sensing and intuitive methods of learning.
Going back to Table 4, we can see that the participants gave quite high scores for questions 38 to 40, which indicates that the learning module effectively supported their active learning preference. Responses to questions 45 and 46 also confirmed that they felt comfortable with the graphical user interface and did not feel the need for more text information. The last four questions also revealed that the flexibility of the program and the content of the instructional presentation supported the participants' preference for global thinking.
It was found that some questions in this section are somewhat repetitive (for example #37 and #48, or #42 and #48). When such questions are ambiguously worded, a learner may give two conflicting answers about the same concept (this happened with participant #3 in this testing).
For this small sample of three participants, the design of the learning module effectively supports the learners' characteristics. However, the sample is not representative of the whole construction workforce. Further study of the tool with both students and workers will reveal the gaps between the design and the cognitive needs of the learners and help improve the learning module's versatility for both audiences.
4. Issues Identified and Suggestions for Next Steps
Issue #1: The exercise is long and somewhat repetitive. This makes the learning experience
less enjoyable and causes physical tiredness among learners due to the weight of the tabletPC.
Most participants commented that there were more material items than needed to understand
the concept. This means they could reach the learning curve plateau quite early in the process
and did not need too much repetition of the same task for reinforcement.
Suggestions:
- While nothing can be done (within the scope of this project) about the weight, size, and other physical properties of the tabletPC, the design of the learning module can definitely be improved by reducing the number of activities and material items. This would cut down the time required to complete the exercise while still ensuring that the learners understand the main concepts introduced. It would also induce more interest and avoid boredom.
- Another solution is to introduce other tasks (beyond material locating and associating) that reinforce the same concepts but are different in nature or operations.
Issue #2: Some questions in the Learning Module Recap (assessment test) section of the post-test survey were unclear and ambiguous. Some other questions were essentially factual recall and too easy. Some questions in the Learning Experience section were repetitive and ambiguous, which might lead to conflicting answers from the same participant.
Suggestions:
- For the assessment metrics, redesign the questions so that benchmarks for the answers can be established. Also avoid trivial questions.
- Eliminate repetitive questions.
- Include questions that reveal preference for, or resistance to, all eight dimensions of learning. The current questions are effective only if learners have very strong preferences for the active, visual, and global learning styles.
Issue #3: The Activities pane was too small; users had to scroll down to see most of the activities. The size of the whole window was fixed, so when the map was smaller or bigger than the allocated slot, there was either unused blank space or users had to scroll to see everything.
Suggestions:
- Use fluid widths and heights for the panes.
- Increase the height of the Activities pane to make visible at least 6 out of 9 activities without having to scroll.
Issue #4: The stylus was difficult to use. This made it hard to drag and drop. Furthermore,
materials had to be dragged and dropped exactly on top of the schedule bars to associate. This
made it even harder.
Suggestions:
- Use clicks, double clicks, and pop-up menus to perform this function instead of drag and drop. This would help solve the problem of the insensitive stylus and touch screen.
- Make it possible to associate a material with an activity when it is dropped anywhere on the row of that activity (see the sketch after this list).
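To illustrate the second suggestion, the sketch below contrasts the current exact-bar hit test with a row-level hit test for the drop target. The coordinates, field names, and the fixed row height are hypothetical, purely to show the difference in behavior.

```python
# Hypothetical sketch: accept a dropped material anywhere on an activity's row,
# rather than only when it lands exactly on the schedule bar.
ROW_HEIGHT = 24   # assumed pixel height of one schedule row

def hit_bar(drop_x, drop_y, bar):
    """Current behavior: the drop must land on the activity's bar."""
    return (bar["x"] <= drop_x <= bar["x"] + bar["width"]
            and bar["y"] <= drop_y <= bar["y"] + bar["height"])

def hit_row(drop_y, row_index):
    """Suggested behavior: any drop within the row's vertical band counts."""
    return row_index * ROW_HEIGHT <= drop_y < (row_index + 1) * ROW_HEIGHT

# Example: a drop 5 px to the left of the bar still associates with row 2.
bar = {"x": 120, "y": 48, "width": 80, "height": 16}
print(hit_bar(115, 52, bar), hit_row(52, 2))   # -> False True
```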
5. Limitations and Concerns
Some of the factors that raised concerns and can potentially hinder insights or bias judgments are:
- The sample size was only four users per testing. This is too small to draw reliable conclusions about the performance of the population, especially when learning styles differ for every learner.
- All participants came from the same background of college or graduate education, with good abstract thinking ability. When extending this study to the workers, some questions will have to be structured and/or worded differently.
- There is a need to isolate the effectiveness of the instructional presentation from that of the actual software program. The general idea is that abstract concepts are introduced in the instructional (or training) module, and the learners then use the software program to actively carry out the task, reinforcing the concepts introduced in the training module. However, if not properly designed, the training module might contain too much information, and the assessment would then not accurately reflect the effectiveness of the learning module itself as a teaching tool.
- The task was long. This might have caused boredom and/or frustration among the participants, which might have impacted their performance.
6. Bibliography

Usability.gov (2007). Usability Test Report Template. URL: http://www.usability.gov/templates/docs/long_test_rep.doc. Date accessed: May 1, 2007.
Felder et al. (2007). Index of Learning Styles. URL:
Appendices
Appendix A
Informed Consent
Title: Evaluating Education Technologies for the Intelligent Job Site
IRB PROTOCOL #:
Conducted By: Dr. William J. O’Brien
Of University of Texas at Austin: Civil, Architectural, and Environmental Engineering
Telephone: 512 471 4638
You are being asked to participate in a research study. This form provides you with information about the
study. The person in charge of this research will also describe this study to you and answer all of your
questions. Please read the information below and ask any questions you might have before deciding
whether or not to take part. Your participation is entirely voluntary. You can refuse to participate without
penalty. You can stop your participation at any time and your refusal will not impact current or future
relationships with UT Austin or participating sites. To do so simply tell the researcher you wish to stop
participation. The researcher will provide you with a copy of this consent for your records.
The purpose of this study is to evaluate the effectiveness of novel learning technologies and techniques
for education about the Intelligent Job Site (IJS). An IJS consists of sensors, wireless networks, and
handheld and desktop computing devices to aid construction productivity and safety. The specific
learning tool investigated in this study is a construction management learning module. The learning
module is designed as a self-contained computer application. The participants will use this application on
a TabletPC to complete a construction scheduling exercise. The total duration of this study is expected to
be approximately 60 minutes, with 15 to 30 minutes of direct interaction with the application, and 30
minutes of feedback and questionnaires. The information collected will be analyzed to propose ways to
improve the usability and usefulness of the application. No sensitive or private information will be
collected or recorded as part of the study. This research study is part of a construction workforce
education research project conducted at the University of Texas at Austin.
If you agree to participate in the study, you will be asked to provide some basic information on your
demographic background and your familiarity with mobile computing tools as well as your experience
with construction work and educational experience. You might also be asked to answer a questionnaire to
determine your learning preference. You will then use a TabletPC to explore the application interface and
features, and then complete the exercise. During the exercise, the supervisor will make observations of
your interaction with the learning module. If you have any technical questions during this time, you may
address them to the supervisor. After you complete the exercise, you will be asked to turn it in and answer
a questionnaire to give feedback on your experience with the learning module. The purpose of this
research is to study the usability of the application, and not to examine your performance in the task.
The physical risk associated with this study is no greater than that involved in everyday life. As you will
interact directly with researchers, and as researchers may observe your utilization of IJS technologies,
there is a minimal risk of loss of privacy and confidentiality. Specific protections are discussed below.
The potential benefit of being in the study is an increased knowledge of IJS technologies, which may aid
your future job performance. There is no compensation for this study.
Privacy and confidentiality protections: Your participation in this study will be kept confidential to the
extent allowed by law. Your name will only appear on this form, which will be kept in a secure location.
All additional data collected during the study will contain no identifying information that could associate
you with it, or with your participation in the study.
The records of this study will be stored securely and kept confidential. Authorized persons from The
University of Texas at Austin (principally, members of the research team), as well as members of the
Institutional Review Board, have the legal right to review your research records and will protect the
confidentiality of those records to the extent permitted by law. All publications will exclude any
information that will make it possible to identify you as a subject. Throughout the study, the researchers
will notify you of new information that may become available and that might affect your decision to remain
in the study.
The only exception to the confidentiality of your participation is if you request (below) that your employer
can verify your participation in the study. In this case, the only information released to your employer is that
you participated; all other data relating to your participation will remain confidential.
Contacts and Questions: If you have any questions about the study please ask now. If you have questions
later, want additional information, or wish to withdraw your participation call the researchers conducting
the study. Their names, phone numbers, and e-mail addresses are at the top of this page. If you have
questions about your rights as a research participant, complaints, concerns, or questions about the
research please contact Lisa Leiden, Ph.D., Chair of The University of Texas at Austin Institutional
Review Board for the Protection of Human Subjects, (512) 471-8871 or email: orsc@uts.cc.utexas.edu.
You will be given a copy of this information to keep for your records.
Statement of Consent:
I have read the above information and have sufficient information to make a decision about participating
in this study. I consent to participate in the study.
Signature:___________________________________________ Date: __________________
___________________________________________________ Date: ___________________
Signature of Person Obtaining Consent
Signature of Investigator:______________________________ Date: __________________
As a study participant, I agree that my participation in this study can be revealed to my employer if they
ask for verification of my participation. No other information about my participation will be revealed.
Signature: __________________________________________ Date: ___________________
Name of employer (firm): _____________________________
Your signature below indicates that you have read the information above and agree to participate in this
study. You will receive a copy of this signed document.
Signature of Participant
Date
Signature of Supervisor
Date
Appendix B
Pre-Test Questionnaire
Instructions:
The purpose of this study is to look at an integrated learning environment using mobile
devices to read sensor data. You will be assigned with a randomly generated ID. This
ID is used purely to classify participants’ background and is not linked to any of your UT
profile or record. You may choose to not answer any questions.
The data gathered in this study will be reviewed by Kathy Schmidt, Director of the
College’s Faculty Innovation Center. Should you have concerns please contact the
Office of Research Support and Compliance at 471-8871.
ID ___________________
Demographic information
1. How old are you?
□ 18-25
□ 25-35
□ 35-45
□ Over 45
2. Highest education level
□ Lower than high school
□ High school degree
□ College degree
□ Graduate degree
3. What is your fluency in English?
□ Native
□ Non-native, but communicate
well both orally and in writing as
near native
□ Communicate well only orally or
in writing
□ Understand but cannot
communicate
□ Know no English
4. How do you categorize yourself within
your organization? (or what do you think
would be your potential position?)
□ Management (executives)
□ Support (technical, estimating, sales,
accounting, etc.)
□ Supervision (foremen and
superintendents)
□ Labor (skilled and unskilled)
5. Gender
□ Male
□ Female
Pre-assessment questions

6. Please indicate the types of devices you have worked with before. Check all that apply.
□ A desktop computer
□ A laptop computer
□ A mobile phone
□ A PDA
□ A tablet
□ A GPS

7. How comfortable are you with touch screens?
□ Very uncomfortable
□ Uncomfortable
□ Neutral
□ Comfortable
□ Very comfortable

8. How many years of construction work experience do you have?
□ None
□ Less than 2 years
□ 3 to 5 years
□ More than 5 years

9. Have you taken coursework or training in construction safety/productivity?
□ No, never.
□ Yes, vocational training.
□ Yes, at the undergraduate level.
□ Yes, at the graduate level.
Appendix C
Post-Test Questionnaire
Instructions:
The purpose of this study is to look at an integrated learning environment using mobile
devices to read sensor data. You will be assigned a randomly generated ID. This
ID is used purely to classify participants’ background and is not linked to any of your UT
profile or record. You may choose to not answer any questions.
The data gathered in this study will be reviewed by Kathy Schmidt, Director of the
College’s Faculty Innovation Center. Should you have concerns please contact the
Office of Research Support and Compliance at 471-8871.
ID ___________________
Learning experience – qualitative questions

10. The task descriptions were clear.
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree

11. The flow of task was logical and easy to follow.
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree

12. The expectations were communicated clearly and you understood what you were supposed to do.
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree

13. How was the amount of instruction given to you before the task?
□ Too little
□ Just enough
□ Some redundant
□ Overwhelming

14. Do you think more background reading was needed before you started the task?
□ No
□ Yes, some more
□ Yes, a lot more

15. How often did you need extra instruction from the instructor when you carried out the task?
□ Never
□ Rarely
□ Occasionally
□ Often
□ Very often
16. Was the task easy or challenging? Rate your experience.
□ Very easy
□ Easy
□ Normal
□ Challenging
□ Very challenging

17. Was the length of the exercise appropriate?
□ Too long
□ Long
□ Just right
□ Short
□ Too short

18. Did you enjoy the experience? Rate your experience (1 being “did not enjoy it at all”, and 5 being “enjoyed it very much”)
1   2   3   4   5

19. Would you be willing to participate in this kind of experiment next time?
□ No
□ Maybe
□ Yes

20. Would you recommend a friend to participate in the experiment?
□ No
□ Maybe
□ Yes

21. Do you think this technology shows promise for future application to live construction sites?
□ Yes. (Please explain why)
□ No. (Please explain why)
Active learning

22. You felt engaged throughout the whole exercise.
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree

23. You actively thought about/reflected on what you were doing (as opposed to just following the instructions).
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree

24. You actively associated what you were going through with your previous experiences.
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree

25. The exercise motivated you to learn more about the domain.
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree

26. You got to iteratively reinforce your understanding about the problem as you explored the program.
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree

27. You believe the exercise facilitated good long-term retention of the material.
□ Strongly disagree
□ Disagree
□ Neutral
□ Agree
□ Strongly agree
Technology usability

28. Was the interface visually appealing?
1 (not appealing at all)   2   3 (neutral)   4   5 (very appealing)

29. How comfortable were you working with this device in general?
1 (very comfortable)   2   3 (neutral)   4   5 (very uncomfortable)

30. How comfortable were you working with the stylus?
1 (very comfortable)   2   3 (neutral)   4   5 (very uncomfortable)

31. How comfortable were you with the lighting of the screen?
1 (very comfortable)   2   3 (neutral)   4   5 (very uncomfortable)

32. According to you, the size of the screen was
□ Too small
□ Rather small
□ Just right
□ Rather big
□ Too big

33. How often did you find what you wanted to find?
□ Never
□ Rarely
□ Occasionally
□ Often
□ Very often

34. Did the technology make the exercise more interesting or less interesting?
□ A lot less interesting
□ Somewhat less interesting
□ No impact
□ Somewhat more interesting
□ A lot more interesting

35. What technical problems did you encounter when using the devices? Please check all that apply.
□ Unable to read data from a sensor
□ Touch screen not sensitive
□ Unable to see screen clearly
□ Difficult to use stylus
□ Unable to find wanted functions
□ Difficult to navigate the site
□ Battery failure
□ Unable to load plan, schedule or material list
□ Difficult to switch views
□ Other problems (please specify)

36. Do you have any other comments? What would you suggest that we do to improve the users’ experience?
Appendix D
Pre-Test Questionnaire
Instructions:
The overall purpose of this study is to look at an integrated learning environment using
mobile devices to read sensor data. This specific questionnaire is designed to capture
1) your demographic and background information, and 2) your preferred methods of
learning.
You will be asked to provide your UTEID (or student ID for non-UT students). This ID is
used purely to classify participants’ background and is not linked to any of your UT
profile or record. You may choose to not answer any questions.
The data gathered in this study will be reviewed by Kathy Schmidt, Director of the
College’s Faculty Innovation Center. Should you have concerns please contact the
Office of Research Support and Compliance at 471-8871.
ID ___________________
Demographic and Background Information

Age group: 18-25 / 25-35 / 35-45 / Over 45
Gender: Male / Female
Current academic standing: Not in college / Freshman / Sophomore / Junior / Senior / Graduate school
Current academic major area (be specific; if you do not yet have a major, please specify your intended major): Structural Engineering / CM/CEM/CEPM / Geotechnical Engineering / Environmental & Water Resources Engineering / Architectural Engineering / Transportation Engineering / Building Construction / Architecture / Other (please specify):
Targeted job location after graduation: USA / Others
How many years of construction work experience do you have? None / Less than 2 years / 2 to 5 years / More than 5 years
Current or intended (after graduation) work area: Management (executives) / Support (technical, estimating, sales, accounting, etc.) / Supervision (foremen and superintendents) / Labor (skilled and unskilled) / Other (please specify):
English proficiency (Oral / Written): No skill / Limited / Sufficient / Fluent / Proficient
Index of Learning Styles Questionnaire
(Copyright © Barbara A. Soloman & Richard M. Felder, North Carolina State University)
ID ___________________
Directions
This questionnaire is designed to identify your preferred style (styles) of learning. For each of the
44 questions below select either "a" or "b" to indicate your answer. Please choose only one
answer for each question. If both "a" and "b" seem to apply to you, choose the one that applies
more frequently.
1
I understand something better after I
(a) try it out.
(b) think it through.
2
I would rather be considered
(a) realistic.
(b) innovative.
3
When I think about what I did yesterday, I am most likely to get
(a) a picture.
(b) words.
4
I tend to
(a) understand details of a subject but may be fuzzy about its overall structure.
(b) understand the overall structure but may be fuzzy about details.
5
When I am learning something new, it helps me to
(a) talk about it.
(b) think about it.
6
If I were a teacher, I would rather teach a course
(a) that deals with facts and real life situations.
(b) that deals with ideas and theories.
7
I prefer to get new information in
(a) pictures, diagrams, graphs, or maps.
(b) written directions or verbal information.
8
Once I understand
(a) all the parts, I understand the whole thing.
(b) the whole thing, I see how the parts fit.
9
In a study group working on difficult material, I am more likely to
(a) jump in and contribute ideas.
(b) sit back and listen.
10 I find it easier
(a) to learn facts.
(b) to learn concepts.
11 In a book with lots of pictures and charts, I am likely to
(a) look over the pictures and charts carefully.
(b) focus on the written text.
12 When I solve math problems
(a) I usually work my way to the solutions one step at a time.
(b) I often just see the solutions but then have to struggle to figure out the steps to get
to them.
13 In classes I have taken
(a) I have usually gotten to know many of the students.
(b) I have rarely gotten to know many of the students.
14 In reading nonfiction, I prefer
(a) something that teaches me new facts or tells me how to do something.
(b) something that gives me new ideas to think about.
15 I like teachers
(a) who put a lot of diagrams on the board.
(b) who spend a lot of time explaining.
16 When I'm analyzing a story or a novel
(a) I think of the incidents and try to put them together to figure out the themes.
(b) I just know what the themes are when I finish reading and then I have to go back
and find the incidents that demonstrate them.
17 When I start a homework problem, I am more likely to
(a) start working on the solution immediately.
(b) try to fully understand the problem first.
18 I prefer the idea of
(a) certainty.
(b) theory.
19 I remember best
(a) what I see.
(b) what I hear.
20 It is more important to me that an instructor
(a) lay out the material in clear sequential steps.
(b) give me an overall picture and relate the material to other subjects.
21 I prefer to study
(a) in a study group.
(b) alone.
22 I am more likely to be considered
(a) careful about the details of my work.
(b) creative about how to do my work.
23 When I get directions to a new place, I prefer
(a) a map.
(b) written instructions.
24 I learn
(a) at a fairly regular pace. If I study hard, I'll "get it."
(b) in fits and starts. I'll be totally confused and then suddenly it all "clicks."
25 I would rather first
(a) try things out.
(b) think about how I'm going to do it.
26 When I am reading for enjoyment, I like writers to
(a) clearly say what they mean.
(b) say things in creative, interesting ways.
27 When I see a diagram or sketch in class, I am most likely to remember
(a) the picture.
(b) what the instructor said about it.
28 When considering a body of information, I am more likely to
(a) focus on details and miss the big picture.
(b) try to understand the big picture before getting into the details.
29 I more easily remember
(a) something I have done.
(b) something I have thought a lot about.
30 When I have to perform a task, I prefer to
(a) master one way of doing it.
(b) come up with new ways of doing it.
31 When someone is showing me data, I prefer
(a) charts or graphs.
(b) text summarizing the results.
32 When writing a paper, I am more likely to
(a) work on (think about or write) the beginning of the paper and progress forward.
(b) work on (think about or write) different parts of the paper and then order them.
33 When I have to work on a group project, I first want to
(a) have "group brainstorming" where everyone contributes ideas.
(b) brainstorm individually and then come together as a group to compare ideas.
34 I consider it higher praise to call someone
(a) sensible.
(b) imaginative.
35 When I meet people at a party, I am more likely to remember
(a) what they looked like.
(b) what they said about themselves.
36 When I am learning a new subject, I prefer to
(a) stay focused on that subject, learning as much about it as I can.
(b) try to make connections between that subject and related subjects.
37 I am more likely to be considered
(a) outgoing.
(b) reserved.
38 I prefer courses that emphasize
(a) concrete material (facts, data).
(b) abstract material (concepts, theories).
39 For entertainment, I would rather
(a) watch television.
(b) read a book.
40 Some teachers start their lectures with an outline of what they will cover. Such outlines are
(a) somewhat helpful to me.
(b) very helpful to me.
41 The idea of doing homework in groups, with one grade for the entire group,
(a) appeals to me.
(b) does not appeal to me.
42 When I am doing long calculations,
(a) I tend to repeat all my steps and check my work carefully.
(b) I find checking my work tiresome and have to force myself to do it.
43 I tend to picture places I have been
(a) easily and fairly accurately.
(b) with difficulty and without much detail.
44 When solving problems in a group, I would be more likely to
(a) think of the steps in the solution process.
(b) think of possible consequences or applications of the solution in a wide range of
areas.
Appendix E
Post-Test Questionnaire
Instructions:
The overall purpose of this study is to look at an integrated learning environment that uses
mobile devices to read sensor data. This assessment questionnaire asks you to provide
feedback on your experience with the learning module as part of the data that support the
study.
The assessment is designed to determine how well the learning module served as a
learning/teaching tool. Your performance is not relevant.
You will be asked to provide your UTEID (or student ID for non-UT students). This ID is
used purely to classify participants’ backgrounds and is not linked to your UT profile or
records. You may choose not to answer any question.
The data gathered in this study will be reviewed by Kathy Schmidt, Director of the
College’s Faculty Innovation Center. Should you have concerns, please contact the
Office of Research Support and Compliance at 471-8871.
ID ___________________
Learning module recap
1.
How often do you use a tabletPC?
Never
Rarely (a few times a year)
Occasionally (once or twice a month)
Often (weekly basis)
2.
How did this exercise improve your tabletPC skills?
Did not help
A little
Considerably
Significantly
Not applicable, as I’ve already mastered the skills.
3.
Were you aware of RFID technology before?    Y    N
4.
Were you aware of any RFID applications on construction jobsites?    Y    N
5.
You are holding the tabletPC (with the receiver attached) and walking through the site when
you see a material pallet a few yards away. However, you cannot find this material item on
the list under RFID data. Which of the following might be the reason for this? (check all that
apply)
This pallet does not have an RFID tag attached to it.
This pallet might have an RFID tag, but the tag is not working and therefore is not
detected and shown.
This pallet does have an RFID tag. The tag is working (radiating radio waves), but the
receiver (attached to the tabletPC) is too far away, so the tag is out of range.
There might be too much obstruction, which reduces the working range of the RFID tag.
6.
Refer to figure 1:
 List the material pallets (IDs only) that have been found and located on the map:
 List the material pallets (IDs only) that have been associated with some activities:
 List the material pallets (IDs only) that have been detected but not yet found and
located:
Figure 1: RFID Data Panel
Figure 2: Sample material locations and schedule status
7. For the map and schedule shown in figure 2, what are the potential problems? (The figure
shows a map with many pins clustered in one area and a schedule with many activities lacking
materials.)
8. Refer to figure 3 (a figure showing the required materials for an activity). What are the
materials required for the activity Overhead Electrical Rough?
Figure 3: Required materials for Overhead Electrical Rough
For each of the statements below (9 to 16), please indicate whether it is true or false, or that you
don’t know, based on your learning from the exercise. (True / False / Don’t know)
9. An RFID tag is a little pre-coded piece of hardware attached to an item to be located.
10. An RFID tag communicates with a central device (a receiver) via radio waves.
11. An RFID tag has a unique ID that can contain, or can be mapped to, information on the
item it is attached to.
12. An RFID tag has to be wired to a central device for communication.
13. In this learning module, real RFID tags were used.
14. In this learning module, sensors were used to generate RFID-like data.
15. Data broadcast by sensors or RFID tags can be detected equally easily by a receiver in
any environment, rain or shine.
16. Data broadcast by sensors or RFID tags are not affected by obstructions such as walls,
furniture and other devices. They are detectable in the same range whether or not
obstructions are present.
General
17. The task descriptions were clear.
Strongly disagree
Disagree
Neutral
Agree
Strongly agree
18. The flow of tasks was logical and easy
to follow.
Strongly disagree
Disagree
Neutral
Agree
Strongly agree
19. The expectations were communicated
clearly and you understood what you were
supposed to do.
Strongly disagree
Disagree
Neutral
Agree
Strongly agree
20. How was the amount of instruction
given to you before the task?
Too little
Just enough
Somewhat redundant
Overwhelming
21. How often did you need extra instruction
from the instructor when you carried out
the task?
Never
Rarely
Occasionally
Often
Very often
22. Was the task easy or challenging?
Rate your experience.
Very easy
Easy
Normal
Challenging
Very challenging
23. Was the length of the exercise
appropriate?
Too long
Long
Just right
Short
Too short
24. Did you enjoy the experience? Rate
your experience.
Did not enjoy at all
Did not enjoy it
Neutral
Enjoyed it somewhat
Enjoyed it very much
25. Do you think this technology shows
promise for future application to live
construction sites?
Yes. (Please explain why)
No. (Please explain why)
Technology usability
For each of the questions below (26 to 29), please rate your experience on a scale of 1 to 5.
1 – Not at all    3 – Neutral    5 – Very
26. Was the interface visually appealing?
27. How comfortable were you working with this device in general?
28. How comfortable were you using the stylus?
29. How comfortable were you with the lighting of the screen?
30. In your opinion, the size of the
screen was
Too small
Rather small
Just right
Rather big
Too big
31. How often did you find what you
wanted to find?
Never
Rarely
Occasionally
Often
Very often
32. Did the technology make the
exercise more interesting or less
interesting?
A lot less interesting
Somewhat less interesting
No impact
Somewhat more interesting
A lot more interesting
33. What technical problems did you
encounter when using the devices?
Please check all that apply.
Unable to read data from a sensor
Touch screen not sensitive
Unable to see screen clearly
Difficult to use stylus
Unable to find wanted functions
Difficult to navigate the site
Battery failure
Unable to load plan, schedule or
material list
Difficult to switch views
Other problems (please specify)
34. Do you have any other comments? What would you suggest that we do to improve the
users’ experience?
Learning Experience
For each of the following statements (35 to 50), please indicate whether or not you agree.
1 – Strongly disagree 2 – Disagree 3 – Neutral
4 – Agree 5 – Strongly agree
35 The interactive features of the exercise made me feel
engaged throughout the whole exercise.
36 The design of the exercise was flexible and interactive
enough for me to freely explore different ways to do things.
37 The range of things I could do at a time was too broad, and I
got lost during the exercise.
38 The flexibility of the program and the repetitiveness of some
tasks helped me correct the mistakes I had made and
reinforce my previous learning.
39 The exercise motivated me to learn more about the topic of
RFID/wireless technology and its application in construction.
40 I believe the exercise promoted active interactions and
thinking that facilitated long-term retention of the material.
41 The number of repetitive tasks was just enough for me to
understand how the exercise works and perform the action
smoothly without getting bored.
42 There was not enough structure to the learning module. I
want a specific procedure to follow so that I don’t have to
think about what to do next.
43 The learning module was flexible enough for me to be
actively using my own judgment and intuition to make
decisions.
44 The design of the learning module represented well the
physical and conceptual relationships in the real world. I can
relate the virtual representations in the module with the
physical relationships in the real world.
45 The learning module had too many graphics without enough
text or audio instructions to help me understand.
46 The graphical representations (such as push pins, color
codes, chain links) were helpful in improving my
understanding about the consequences of the activities I was
performing.
47 The design of the learning module was comprehensive and
fluid enough to give me the big picture of the ultimate task
at every stage.
48 I need more sequential instructions to avoid getting lost and
not knowing what to do next.
49 The instructional presentation was helpful in introducing the
concept that I would learn more about in the actual exercise.
50 The flowcharts and list of learning objectives helped me see
the big picture and made learning more effective.
Thank you for your participation!