feature

Human Factors Evaluation of Energy Visualization Dashboards

By Ashish D. Nimbarte, Nathaniel Smith, & Bhaskaran Gopalakrishnan

FEATURE AT A GLANCE: Energy visualization systems provide information about energy use in real time to assist users with energy efficiency. In this study, three energy visualization dashboards for small businesses were developed and tested. Performance measurement, NASA Task Load Index (TLX) workload assessment, and a posttest survey were used to conduct the usability testing. Compared with the dashboards that were designed using line charts and tables, a dashboard designed using visuals such as gauges, pie charts, and flashing lights produced quicker response times; lower mental demand, temporal demand, and effort ratings; and higher ratings of engagement, interest, and trustworthiness.

KEYWORDS: visualization, energy, analytical visuals, engaging visuals, small businesses
INTRODUCTION
Energy efficiency plays a pivotal role in
reducing the ever-increasing energy demand
(U.S. Energy Information Administration,
2016). The process to enhance energy efficiency is multifaceted and involves various
changes/upgrades. Among such changes are
the use of energy management and visualization systems. An energy management system
is defined as a system that employs microprocessors, sensors, and other devices that
are configured into a network to monitor
and control the use of various energy sources
that include but are not limited to electricity,
natural gas, and water (Panke, 2001). Energy
visualization systems, on the other hand,
enable users to visualize their energy use in
real time. Energy visualization systems, when coupled with energy management systems (often referred to simply as energy visualization systems), enable users to visualize patterns of use, peaks in energy use, signals of anomalies, analytics, and so on, in order to manage short-term and long-term energy use. This provides further context for users to better manage their overall energy use in real time, instead of waiting for a monthly utility bill.
In recent years, residential and large
industrial sectors have benefited from energy
visualization solutions. However, limited
energy management system and energy
visualization options are available for light or
small commercial businesses. A “small
business” is defined as a company that is
staffed with 500 or fewer employees (U.S.
Department of Agriculture, n.d.). Such
businesses are typically housed in buildings
that are less than 50,000 square feet, which
comprise 90% of the total number of
buildings in the United States (Barnes &
Parrish, 2016).
Several approaches have been considered for providing energy visualization solutions to small commercial businesses, such as downsizing the energy visualization systems used for large commercial or industrial applications, using smart thermostats that can control lighting or other loads, and upsizing residential systems (Ehrlich, 2015). However,
residential energy visualization systems
cannot accommodate the many pieces of
equipment that a commercial facility has or
the amount of energy used. At the other end of the spectrum, large-scale facility energy visualization systems are excessive for the light commercial industry, both economically and in complexity.
Thus, there is a technology gap wherein
small commercial buildings are not currently
being serviced in the energy visualization
realm (Lehrer et al., 2014; Lock et al., 2016).
The overarching goal of this research was to
develop and evaluate a new energy visualization system for the light commercial sector.
There were two main objectives:
1. Develop a framework with regard to operational, technical, budgetary, and other constraints specific to light commercial businesses. The framework was developed adhering to the criteria of an open database-driven system and was tested for economic feasibility (Smith et al., 2019).
2. Perform human factors testing of the energy visualization system, which is the topic of this article. Once the system was developed, the energy visualization dashboards were evaluated through analytical and subjective testing by a group of participants consisting of energy managers (individuals responsible for optimizing the energy performance of a facility, building, or industrial plant), engineers, and other typical dashboard users.

Table 1. Key Performance Indicators Used in the Design of Energy Visualization Dashboards

Key performance indicator | Data (sensor)
Electricity used by major equipment | Current (current transducer)
Electricity used by lighting | On/off status (occupancy sensor)
Natural gas used by HVAC system | Operational characteristics (natural gas meter, temperature set points)
Water consumption | Water flow (water meter)

Note. HVAC = heating, ventilation, and air conditioning.

Figure 1. Dashboard 1 designed using line charts and tables.
THE DESIGN APPROACH
Data visualization helps users to understand data by perceiving it from different viewpoints (Mizuno et al., 1997). Data
visualization has two main aspects: the content of the data and
the graphical representation of the data. Regarding the content
of energy data, it is important to identify the energy performance metrics, also known as key performance indicators. A
case study was performed to identify the key performance
indicators for small commercial businesses (Smith et al., 2019).
Table 1 identifies the key performance indicators, the relevant
data set, and the sensors used to obtain the data set.
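For concreteness, the Table 1 mapping of key performance indicators to data and sensors can be expressed as a simple lookup structure. The following Python sketch is purely illustrative; the key names are ours, not part of the published system.

```python
# Key performance indicators mapped to (data set, sensor), per Table 1.
# Names are illustrative only, not taken from the published system.
KPI_SENSORS = {
    "electricity_major_equipment": ("current", "current transducer"),
    "electricity_lighting": ("on/off status", "occupancy sensor"),
    "natural_gas_hvac": ("operational characteristics",
                         "natural gas meter, temperature set points"),
    "water_consumption": ("water flow", "water meter"),
}

for kpi, (data, sensor) in KPI_SENSORS.items():
    print(f"{kpi}: {data} via {sensor}")
```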
Regarding the graphical representation of the data, a term that has been trending in recent years is the dashboard. A dashboard is a single screen showing multiple visuals using
several data points to display information in an efficient
manner (Few, 2006). Energy dashboards should allow for quick visualization of patterns of use, peak use events, anomalies in energy use, opportunities for energy savings, and so on (Lehrer et al., 2010).
To identify the preferred visuals for different key performance indicators, several energy managers were consulted for
this study. There was no consensus among the energy managers regarding the preferred visual type. Therefore, an exploratory approach was used to design three energy visualization
dashboards by combining line charts, bar graphs, gauges, pie
charts, alternating lights, and tables.
Dashboard 1 (Figure 1) offered a simple view with line
charts. The charts displayed the current, natural gas flow,
and pulses from the energy sources. Total energy use data
along with a categorical percentage of each energy source
were presented using a tabular format. The black background and the white visuals used in the dashboard followed
the gestalt principle of figure-ground (Condly, 2003). The visuals for electric versus natural gas were placed close together based on the gestalt principles of proximity and common region.

Figure 2. Dashboard 2 designed using line charts, a light indicator, a pie chart, and tables. NG, natural gas.

Figure 3. Dashboard 3 designed using gauges, a light indicator, alternating stoplights, a pie chart, and tables. NG, natural gas.
Dashboard 2 (Figure 2) offered simple tables and line charts and also included engaging visuals, such as a light indicator and a pie chart breakout for total energy use. Inclusion of these visuals followed the gestalt principle of focal point.
Dashboard 3 (Figure 3) presented the data using gauges,
alternating stoplights, lights, and a pie chart. Different shapes
of gauges for electric and natural gas (with arcs of 180° vs.
270°, respectively) followed the gestalt principle of similarity.
In addition, each dashboard included a button to display
monthly usage of electricity, natural gas, and water. To obtain
monthly data, a user would click on the button for a particular
dashboard and the data would appear in a pop-up window
(Figure 4).
Figure 4. Visuals used to display historical data in Dashboard 1: (A) monthly electric usage and demand, (B) monthly natural gas (NG) usage, (C) monthly water usage.
Figure 5. Dashboard testing procedure flow diagram. The three phases in blue were repeated for each of the three dashboards,
per participant. NASA TLX, Task Load Index.
The human–machine interface for the energy visualization system was designed to fit a small 14- to 16-inch touchscreen laptop/tablet computer. The dashboards were developed using the Visual Basic .NET programming language along with AdvancedHMI software (AdvancedHMI, 2018).
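The article reports the platform but no source code. As a rough illustration of the polling-and-refresh pattern that a real-time dashboard of this kind relies on, the following Python sketch uses hypothetical sensor-reading stubs (the actual system was built in Visual Basic .NET with AdvancedHMI); the 1.5-second interval matches the polling rate reported later in the Discussion.

```python
import time

# Hypothetical stubs standing in for the study's sensors (current
# transducers, occupancy sensors, natural gas meter, water meter).
def read_sensors() -> dict:
    return {
        "equipment_current_amps": 12.4,  # current transducer
        "lighting_on": True,             # occupancy sensor
        "natural_gas_flow": 85.0,        # natural gas meter
        "water_flow_gpm": 3.2,           # water meter
    }

def update_dashboard(readings: dict) -> None:
    # Stand-in for refreshing gauges, charts, and indicator lights.
    print(readings)

POLL_INTERVAL_S = 1.5  # display polling rate reported in the Discussion

for _ in range(10):  # bounded here for demonstration; a real HMI loops forever
    update_dashboard(read_sensors())
    time.sleep(POLL_INTERVAL_S)
```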
DATA COLLECTION PROCEDURE
The procedure used to conduct the human factors testing is
shown in Figure 5.
Participant Orientation
Twenty-five participants were recruited for data collection
from a pool of undergraduate and graduate students. The
majority of the students were employed by the Department of
Energy–funded Industrial Assessment Center. The Industrial
Assessment Center students routinely participate in the
energy audits performed at small businesses and are familiar
with their energy use. There were 18 males (M age = 24 years,
SD = 1.6) and 7 females (M age = 25 years, SD = 7.5).
Thirteen of the males and four of the females had engineering backgrounds. Nearly 20 participants had a background and/or training in energy efficiency. All participants were experienced with using dashboards for various applications. All participants read and signed a consent form approved by the local institutional review board. A script adapted from Brown (2002) and Flowers (2005) was read to greet participants and to explain the purpose of the study, the system functionality, and the different dashboards. The script also encouraged participants to think aloud so the researcher could understand their thought processes.

Figure 6. Questions used to perform the performance test and survey.
Phase 1: System and Dashboard Training
In this phase, the participants were given 5 minutes per dashboard to view and interact with it and to simulate data that mimicked business operations. Using the knobs and switches, the participants viewed how the dashboard changed as they managed their energy use. The major energy sources, the corresponding key performance indicators, and the different visuals offered by each dashboard were shown to the participants to help with identification.
Phase 2: Simulation and Performance Measurement
In Phase 2 of the study, performance measures were gathered in a usability test. Usability tests of this sort are typically
used to discover user interface issues of a product in early
design phases (Jókai, 2009). A series of questions were asked
(Figure 6) and participants’ verbal responses were recorded.
The order in which the dashboards were presented and the
questions that were asked were randomized. Participants were
asked to answer the questions as quickly as possible.
Phase 3: NASA TLX
The NASA Task Load Index (TLX), consisting of the dimensions of mental demand, physical demand, temporal demand, performance, effort, and frustration, was used to measure mental workload (Cao et al., 2009; Hart & Staveland, 1988). Each participant rated the dimensions on a scale from 0 to 100. After the participants assigned their ratings, weights were determined based on pairwise comparisons between the different dimensions. A final weighted, or overall, workload score was calculated from the ratings and weights; a higher score represents a higher mental workload.
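The article does not print the scoring formula, but under the standard Hart and Staveland (1988) procedure, each dimension's weight is the number of times it is selected across the 15 pairwise comparisons (so the weights sum to 15), and the overall score is the weighted mean of the ratings. A minimal Python sketch with invented numbers:

```python
# Standard weighted NASA TLX score; ratings and weights here are
# invented for illustration, not taken from the study's data.
ratings = {"mental": 60, "physical": 10, "temporal": 55,
           "performance": 30, "effort": 50, "frustration": 20}  # 0-100 scales
# Weight = times each dimension was selected across the 15 pairwise comparisons.
weights = {"mental": 4, "physical": 0, "temporal": 3,
           "performance": 2, "effort": 5, "frustration": 1}

assert sum(weights.values()) == 15  # 6 dimensions -> 15 comparisons
overall = sum(ratings[d] * weights[d] for d in ratings) / 15
print(f"Overall weighted TLX score: {overall:.1f}")  # higher = more workload
```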
Phase 4: Posttesting Survey
Finally, a survey was conducted to evaluate the dashboard
visuals, the potential for different dashboards to be used in a
light commercial business, and other pertinent issues. Open-ended questions used in the survey are shown in Figure 6.
DATA ANALYSIS
Data analyses were performed using repeated-measures analysis of variance (ANOVA). The independent variable, dashboard, contained three levels (Dashboard 1, Dashboard 2, Dashboard 3). Separate ANOVA analyses were performed for the following dependent variables:

- Total normalized response time: The voice-recorded responses of the participants (Phase 2) were analyzed to determine the response time per question and the accuracy of the responses. Each response time was normalized with respect to that participant's maximum response time to meet the normality and equality-of-variance assumptions of ANOVA. The maximum response time ranged between 6 and 25 seconds, with a mean of 13.2 seconds. The normalized response times for all questions were summed to determine the total normalized response time (see the sketch after this list).

- NASA TLX ratings for mental, physical, and temporal demand, performance, effort, and frustration, and the overall weighted score (Phase 3).

- Analytical, engaging, interesting, and trustworthy ratings (Phase 4).
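As a minimal sketch of the normalization and summation just described (the per-question times below are invented, though the study's maxima ranged from 6 to 25 seconds):

```python
# Normalize each participant's per-question response times by that
# participant's own maximum response time, then sum across questions.
response_times = {  # participant -> per-question times in seconds (invented)
    "P01": [6.1, 8.4, 12.9, 7.3],
    "P02": [9.8, 14.2, 11.0, 16.5],
}

for participant, times in response_times.items():
    max_t = max(times)                       # participant's maximum time
    normalized = [t / max_t for t in times]  # each value now in (0, 1]
    total = sum(normalized)                  # total normalized response time
    print(f"{participant}: total normalized response time = {total:.2f}")
```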
Statistical significance was set at α = .05, and significant main effects were further evaluated using paired-samples t tests with Bonferroni correction.
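The article does not state which statistical software was used; the following sketch reproduces the described pipeline, a repeated-measures ANOVA followed by Bonferroni-corrected paired t tests, using common Python libraries and invented data:

```python
# Repeated-measures ANOVA with Bonferroni-corrected paired t tests,
# mirroring the analysis described above. All data here are invented.
from itertools import combinations

import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# Long-format data: one row per participant x dashboard condition.
df = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "dashboard":   ["D1", "D2", "D3"] * 3,
    "total_nrt":   [3.9, 3.5, 3.0, 4.2, 3.8, 3.1, 3.7, 3.6, 2.8],
})

anova = AnovaRM(df, depvar="total_nrt", subject="participant",
                within=["dashboard"]).fit()
print(anova.anova_table)

# Pairwise follow-up: paired t tests with Bonferroni correction.
pairs = list(combinations(["D1", "D2", "D3"], 2))
alpha_corrected = 0.05 / len(pairs)  # Bonferroni: alpha / number of comparisons
for a, b in pairs:
    t, p = stats.ttest_rel(df.loc[df.dashboard == a, "total_nrt"].values,
                           df.loc[df.dashboard == b, "total_nrt"].values)
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f}, "
          f"significant at corrected alpha: {p < alpha_corrected}")
```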
RESULTS
Total Normalized Response Time (Phase 2)
The effect of dashboard on the total normalized response time was statistically significant (p = .024). Pairwise comparison showed that the mean total normalized response time for Dashboard 1 was longer than that for Dashboard 3 (Figure 7).

Figure 7. Mean total normalized response time for the three dashboards. Error bars represent a 95% confidence interval. Columns marked with a bracket and "*" are significantly different from each other. NG, natural gas.
Most participants answered all questions correctly. No question was repeatedly answered incorrectly, with one exception: when using Dashboard 2, 32% of participants gave incorrect answers for Question 7 ("Which natural gas meter had more consumption in February?"). As seen in Figure 8 for the February column, Meter 1 was the correct answer, but some participants indicated that because the orange bars (Meter 2) were stacked on top of the blue bars, Meter 2 had more consumption.

Figure 8. Natural gas (NG) monthly usage for Dashboard 2.
NASA TLX Ratings and Score (Phase 3)
The effect of dashboard on the NASA TLX overall score (p = .001) and on the ratings for mental demand (p = .034), temporal demand (p = .024), and effort (p < .001) was statistically significant. Pairwise comparisons showed that the means for the overall score and for the mental demand, temporal demand, and effort ratings were higher for Dashboard 1 than for Dashboard 3 (Figure 9). The mean effort rating for Dashboard 2 was also higher than that for Dashboard 3.

Figure 9. Mean NASA TLX scores and ratings for the three dashboards. Higher scores and ratings represent higher workload. Error bars represent a 95% confidence interval. Columns marked with a bracket and "*" are significantly different from each other. NASA TLX, Task Load Index.
Posttesting Survey (Phase 4)
In response to Question 1, on dashboard visuals that provided the best information, 56%, 16%, and 4% of participants
directly identified Dashboards 3, 2, and 1, respectively. The
remaining 16% stated that gauges provided the best information and nearly 8% identified multiple features from various
dashboards.
In response to Question 2, on how realistic the dashboard
would be if used in a light commercial business, 44% of
participants responded “very realistic,” 20% of participants
responded “realistic enough/moderately realistic,” 28% of
participants responded “fairly practical/good option/could be
used," and 8% of participants responded "not sure."
In response to Question 3, on preference between the engaging
visuals (gauges, lights, alternating light water meter, etc.) versus
the analytical visuals (bar charts, line graphs, etc.), 64% preferred
engaging visuals, 16% preferred analytical visuals, and 20%
preferred a combination of engaging and analytical visuals.
In response to Question 4, on what participants enjoyed about the energy visualization experience, 36% of the participants liked how easy the dashboards made it to read and understand the information, 40% liked the real-time data or the interactivity such data produced, and 24% liked that the information could be found quickly or was right in front of the user.
In response to Question 5, on what participants would add to or change about the dashboard designs, 20% of the participants stated that they would not change anything. The remaining 80% made several recommendations: use more or different colors to differentiate between types and sources of energy; use larger titles, graphics, or images; include warnings for unusual energy consumption; and combine features from the dashboards to develop an ideal dashboard.
The effect of dashboard on the design ratings (analytical, p < .001; engaging, p < .001; interesting, p < .001; and trustworthy, p = .029) was statistically significant. Pairwise comparisons showed that the mean analytical rating for Dashboard 1 was higher than that for Dashboard 2, which in turn was higher than that for Dashboard 3 (Figure 10). The mean engaging and interesting ratings for Dashboard 2 were higher than those for Dashboard 1, and those for Dashboard 3 were higher than those for Dashboard 2. Additionally, the mean trustworthy rating for Dashboard 3 was higher than that for Dashboard 1.

Figure 10. Mean design ratings for the three dashboards. Error bars represent a 95% confidence interval. Columns marked with a bracket and "*" are significantly different from each other.
DISCUSSION
There is limited academic research on the effects of different energy visualization dashboards on user performance.
Users rely on working memory with any visualization. As this
type of memory is temporary and limited in nature, the design
should utilize “preattentive attributes,” allowing users to
quickly identify differences in data without much cognitive
processing (Few, 2013). Dashboard 3, which was designed using visuals such as gauges, pie charts, and flashing lights, produced a significantly lower response time than Dashboard 1. The quicker response time suggests more efficient perceptual and cognitive processing when using Dashboard 3, and it further suggests that, for energy visualization purposes, the visual attributes used in the design of Dashboard 3 may be more strongly preattentive than the line charts used in Dashboard 1.
Dashboard 3 was preferred over Dashboards 1 and 2 in
terms of NASA TLX scores. Dashboard 3 was designed
predominantly with gauges, and Dashboard 1 was designed
predominantly with line charts. On the gauges, the moving
component (pointer) and the data/readings (scale) were next
to each other. Therefore, reading the real-time data from a
gauge was easier than reading it from a line chart. In the line
chart, not only was the moving point at some distance from
the data axes but there was more to look at to find necessary
data, that is, a reduced signal-to-noise ratio. Thus, the participants were able to obtain the same information with less effort
by reading the gauges in Dashboard 3 compared with reading
the line chart in Dashboard 1, further making Dashboard 3
more engaging, interesting, and trustworthy.
Based on general comments collected after the experiment, all participants rated the energy visualization dashboards favorably for use in a light commercial business. A summary of comments that could guide further improvement of energy visualization dashboard designs included the following: (1) the black background and white font of the dashboard could be changed to distinguish between different energy sources by color; (2) instead of the dashboards appearing on a traditional 14-inch laptop screen, which may have shrunk some of the visuals, larger visuals and a larger display monitor could be used; (3) the update frequency of the displayed data could be reduced from its polling rate of 1.5 seconds to allow more time for analysis and processing; and (4) owing to a space constraint and the autoscaling option used on the monthly usage charts, the y-axes of the line charts were not optimally scaled for easy reading, and participants suggested intervals of 5 or 10.
Enhanced visuals such as load duration curves, stacked bar
charts, and double y-axes charts were found to be highly
successful in prior studies (Energy Efficiency & Demand
Management & AEE Northern Ohio Chapter, 2014; Kirk,
2012). However, 32% of this study’s participants incorrectly
answered a question regarding the stacked bar chart. It was determined that participants had trouble interpreting the total overall usage rather than the breakdown of the overall bar into sections (Figure 8). As this was only one example of the many possible enhanced visuals, one cannot conclude that such visuals, including the stacked bar chart, should not be used. Instead, proper understanding of, and training on, a visual may be required when using enhanced visualizations in energy dashboards, as these visuals may not be immediately intuitive.
The findings of this study are a function of the design and
experimental conditions tested and are therefore subject to
several limitations:
- The proposed energy visualization system and dashboards were designed to view real-time energy use data. Historical (monthly use) data were also presented using pop-up windows; however, short-term (hourly or daily) trends were not considered.

- The historical data were plotted using line graphs with autoscaling options, since the data range varied substantially between the sources of energy. The autoscaling option produced nonstandard y-scale values. Such values should be avoided; scale intervals should be numbered in intervals of 5, 10, 100, and so on, depending on the application (McCormick & Sanders, 1982).

- The questions in the performance segment of the usability study focused on assessing participants' ability to read and understand real-time data. Such an approach may have favored certain types of visuals (gauges) over others (line charts).

- While most of the participants possessed some level of training in energy efficiency, the possible end users of the product, that is, small business owners, were not tested in this study.
In conclusion, the proposed energy visualization dashboards may have the potential to assist the small business
owners with their energy efficiency endeavors. Among the
energy evaluation dashboards tested in this study, dashboards
with simple visuals such as gauges, pie charts and flashing
lights produced encouraging results in terms of reduced
response time and higher ratings. Gathering additional
information of users’ requirements and additional studies
involving target users would help to further refine the ultimate
design of energy visualization dashboards.
An energy visualization system cannot achieve the desired energy efficiency if it fails to make a connection with its users. This user evaluation study examines the design features that would make energy visualization dashboards desirable to users.
REFERENCES
AdvancedHMI. (2018). Homepage. https://www.advancedhmi.com/
Barnes, E., & Parrish, K. (2016). Small buildings, big impacts: The role of
small commercial building energy efficiency case studies in 2030 districts.
Sustainable Cities and Society, 27, 210–221. https://doi.org/10.1016/j.scs.2016.05.015
Brown, C. (2002). Usability testing in internet time. Performance Improvement, 41(10), 40–46.
Cao, A., Chintamani, K. K., Pandya, A. K., & Ellis, R. D. (2009). NASA TLX:
Software for assessing subjective mental workload. Behavior Research
Methods, 41(1), 113–117. https://doi.org/10.3758/BRM.41.1.113
Condly, S. (2003). The psychologies of gestalt principles of visual perception
and domain expertise: Interactions and implications for instructional
practice. Florida Association of Teacher Educators Journal, 1(3), 1–17.
Ehrlich, P. (2015, May). Building automation for small commercial buildings.
Engineered Systems, 22.
Energy Efficiency & Demand Management & AEE Northern Ohio Chapter.
(2014). Advanced Excel for energy efficiency professionals. http://www.aeeohio.com/ADVANCED%20EXCEL%20FOR%20Energy%20Engineers_AEE_CLE_11142014.pdf
Few, S. (2006). Clarifying the vision. In Information dashboard design: The effective visual communication of data (p. 23). O'Reilly Media. https://flylib.com/books/en/2.412.1/clarifying_the_vision.html
Few, S. (2013). Information dashboard design: Displaying data for at-a-glance
monitoring. Analytics Press.
Flowers, J. (2005, June). Usability testing in technology education. The Technology Teacher, 64(8). http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.464.6538&rep=rep1&type=pdf
Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task
Load Index): Results of empirical and theoretical research. In P. A.
Hancock & N. Meshkati (Eds.), Advances in psychology: Vol. 52. Human
mental workload (pp. 139–183). North-Holland.
Jókai, E. (2009). Usability testing methods on e-learning environment. Periodica Polytechnica, Social and Management Sciences, 17(2), 79–87. https://doi.org/10.3311/pp.so.2009-2.04
Kirk, A. (2012). Data visualization: A successful design process. Packt Publishing.
Lehrer, D., Kaam, S., & Vasudev, J. (2014). A usability study of a social media prototype for building energy feedback and operations [PowerPoint slides]. https://www.cbe.berkeley.edu/research/pdf_files/Lehrer-2014-slides-social-media-energy.pdf
Lehrer, D., Vasudev, J., & Webster, T. (2010). Visualizing building information [PowerPoint slides]. Center for the Built Environment. https://www.cbe.berkeley.edu/research/pdf_files/Lehrer2010-slides-visualizing-info.pdf
Lock, L. E., Shyan, T. E., & Hartman, T. (2016). Improving commercial building energy performance. ASHRAE Journal, 40–45.
McCormick, E. J., & Sanders, M. S. (1982). Human factors in engineering and
design (pp. 184–185). McGraw-Hill.
Mizuno, H., Mori, Y., Taniguchi, Y., & Tsuji, H. (1997). Data queries using
data visualization techniques. 1997 IEEE International Conference on Systems, Man, and Cybernetics. Computational Cybernetics and Simulation,
3, 2392–2396. https://doi.org/10.1109/ICSMC.1997.635285
Panke, R. A. (2001). Energy management systems and direct digital control.
Fairmont Press.
Smith, N., Nimbarte, A. D., Gopalakrishnan, B., & Wuest, T. (2019). Real-time energy visualization system for light commercial businesses. Sustainable Energy Technologies and Assessments, 34, 68–76. https://doi.org/10.1016/j.seta.2019.04.005
U.S. Department of Agriculture. (n.d.). Small business. https://nifa.usda.gov/topic/small-business
U.S. Energy Information Administration. (2016). Annual energy outlook 2016 with projections to 2040. https://www.eia.gov/outlooks/aeo/pdf/0383(2016).pdf
Ashish Nimbarte is a professor in the Department of Industrial and Management Systems Engineering at West Virginia University. He received a BS (Production Engineering) from Nagpur University (India) and MS and PhD degrees (Industrial Engineering) from Louisiana State University. He works in the general areas of ergonomics and energy efficiency. His research interest is motivated by the need to better characterize factors that impede the sustainability of industrial operations so that effective control strategies can be implemented. ORCID iD: https://orcid.org/0000-0002-1059-9112
Nathaniel Smith is a partner at Thermdex Engineering, where he designs and engineers steam boiler control systems for various manufacturing applications. He received BS and MS degrees in industrial engineering from West Virginia University in 2016 and 2018, respectively. His research focuses on energy visualization, the internet of things (IoT), and industrial integration and automation.
Bhaskaran Gopalakrishnan is a professor in the Department of Industrial and Management Systems Engineering at West Virginia University and the director of its Industrial Assessment Center. He is a Certified Energy Manager, a Professional Engineer, a LEED Green Associate, and a U.S. Department of Energy qualified specialist in DOE Best Practices software tools such as AIRMaster+ (compressed air), PHAST (process heating), FSAT (fans), PSAT (pumps), and SSAT/SSST (steam).
Copyright 2021 by Human Factors and Ergonomics Society. All rights reserved.
DOI: 10.1177/10648046211028693
Article reuse guidelines: sagepub.com/journals-permissions