Finding and designing a better
white cane/walking stick alternative
Word count: 4022
Introduction
More than 4.2 million Americans aged 40 years and over are blind, having a visual acuity
of 20/200 or less or a visual field of 20 degrees or less. Other estimates put the number of people
with "vision problems" as high as 21 million, with an estimated 80 million Americans suffering
from potentially blinding eye diseases. Cataracts, age-related macular degeneration, diabetic
retinopathy, and glaucoma are the most common causes of visual loss (Prevent Blindness
America, 2012). Depression, diabetes, hearing loss, stroke, falls, cognitive decline, and early
mortality are all more common in those who have lost their eyesight. Reduced vision makes it
difficult to drive, read, keep track of finances, or travel in foreign regions, putting one's quality of
life at risk (Welp et al., 2016). The use of mobility aids benefits
persons with blindness and visual impairments by decreasing fall risk, enhancing confidence,
increasing autonomy, and overall improving their quality of life, yet many who might benefit
from using mobility aids do not use them (Resnik et al., 2009). According to Resnik et al. (2009),
societal pressures and perceived stigma keep people from using mobility aids, but greater
physician involvement, positive peer models, and affordable, safe, and visually appealing
devices would promote greater acceptance of mobility aids. The number of blind and visually
impaired people increases as the elderly population grows and diabetic retinopathy and other eye
diseases become more prevalent (Saaddine et al., 2008). It is
therefore imperative to design an effective and versatile mobility aid for the blind and visually
impaired suffering from the effects of vision loss.
To my knowledge, studies concerning the development of a better mobility aid, and even
commercial products, do exist, such as the CaBot by Guerreiro et al. at Carnegie Mellon
University and the LIDAR Assist Spatial Sensing (LASS) system by Ton et al. The CaBot is a
cart-like device with a handle similar to that of a suitcase; the user receives tactile feedback, in
the form of vibrations, driven by a Light Detection and Ranging (LIDAR) sensor and an
image-recognition camera that together tell the user where obstacles are and what they are. Like
a guide dog, the CaBot warns the user of incoming obstacles, whether walls or people, and can
avoid obstacles on its own through image recognition. The LASS system developed by Ton et al.
also uses LIDAR, in a way similar to the CaBot, but provides feedback through stereo sound
with varying pitches. The pitch corresponds to the user's orientation and horizontal distance to
an obstacle, granting the user enhanced spatial perception of their surroundings and of potential
hazards. This study builds on that pre-existing research to help create a better mobility aid. What
differs from earlier designs is that instead of committing to sound or vibration for sensory
feedback, this project focuses on finding alternative methods of sensory feedback while testing
the effectiveness and efficacy of LIDAR itself. Given how severely vision loss reduces the
quality of life of the blind and visually impaired, and knowing that navigation robots have the
potential to further increase their mobility and independence, evaluating LIDAR and alternative
methods of sending and receiving data will help the medical and technological fields expand on
the traditional white cane and guide dog. I hypothesize that LIDAR is an effective and
efficacious means of using technology to support and improve the quality of life of the blind and
visually impaired, and that this can be done affordably, increasing accessibility, through
entry-level LIDAR equipment such as Slamtec's RPLIDAR A1 and other accessible devices
such as an Arduino Uno R3 and LEDs.
Methods
To test my hypothesis of using an affordable, accessible, entry-level LIDAR as a mobility
aid, I needed a device that effectively takes and interprets variable distance data. An effective
way of measuring and visualizing the performance of the entry-level LIDAR is to use
light-emitting diodes (LEDs), which let us, in a way, see what the LIDAR sees. A plan for the
wiring and assembly of a device that can test the application of LIDAR can be found in
Slamtec's software development kit (SDK) and is a good fit for this research. The mobility aid
uses an RPLIDAR A1, an Arduino Uno R3, and an LED. These components work together to
create a mobility aid that can detect the location and varying distances of surrounding obstacles
and provide feedback in the form of light of varying brightness. The RPLIDAR A1 was chosen
because it is cheap, accurate, and affordable for a casual hobbyist interested in building one for
themselves or someone else. The RPLIDAR A1 uses low-power infrared light and can take
distance measurements about 16,000 times per second, so it efficiently scans its environment and
produces variable distance data that the Arduino microcontroller can interpret. The Arduino is an
open-source microcontroller board with many inputs and outputs that can be interfaced with
many other components. The Arduino Uno R3 works like the brain in our bodies: it takes and
interprets the data from the RPLIDAR and turns it into visual output for users. The motivation
for using an Arduino rather than anything else is that it is a simple microcontroller doing a
simple task; something like a Raspberry Pi, a full computer capable of far more complex tasks,
would be an inefficient choice for controlling a LIDAR and driving an LED. So the Arduino
Uno R3 is a good fit for this use case because of its simplicity and efficiency. Finally, the LED is
the main means of outputting the LIDAR's data to the user. Depending on direction, the LED
turns a certain color at a certain brightness as the LIDAR approaches a wall or obstacle. For
example, if the LIDAR nears a wall, person, or obstacle to the north, the LED turns blue at 100
percent brightness, but if an obstacle is somewhat far to the south, the LED turns red and dim.
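The feedback scheme just described, direction mapped to color and distance mapped to brightness, can be sketched as two small host-testable C++ functions. The sector boundaries, the east/west colors, and the 2 m cutoff are illustrative assumptions, not values taken from the actual device; only blue-north and red-south come from the text above.

```cpp
#include <cstdint>

// LED output as red/green/blue channel intensities (0-255).
struct LedOutput { uint8_t r, g, b; };

// Map distance (mm) to brightness 0-255: the nearer the obstacle, the
// brighter the LED. Readings past maxRangeMm (assumed 2 m) are ignored.
uint8_t distanceToBrightness(float distanceMm, float maxRangeMm = 2000.0f) {
    if (distanceMm <= 0.0f || distanceMm >= maxRangeMm) return 0;
    return static_cast<uint8_t>((1.0f - distanceMm / maxRangeMm) * 255.0f);
}

// Map bearing (degrees, 0 = north/forward) to a quadrant color:
// north = blue and south = red as in the text; east/west are assumed.
LedOutput angleToColor(float angleDeg, uint8_t brightness) {
    while (angleDeg < 0.0f)    angleDeg += 360.0f;
    while (angleDeg >= 360.0f) angleDeg -= 360.0f;
    if (angleDeg < 45.0f || angleDeg >= 315.0f) return {0, 0, brightness}; // north: blue
    if (angleDeg < 135.0f) return {0, brightness, 0};                      // east: green
    if (angleDeg < 225.0f) return {brightness, 0, 0};                      // south: red
    return {brightness, brightness, 0};                                    // west: yellow
}
```

On the Arduino itself, the three channel values would simply be written out with analogWrite() on the three PWM pins driving the RGB LED.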
Fig. 1: The wiring diagram for the RPLIDAR A1, the Arduino Uno R3, and the LED.
Wiring:
I sourced some of the wiring from a video and accompanying blog by DroneBot
Workshop explaining the process of properly wiring the LIDAR to the Arduino. Initially that
wiring did not work very well, so with a mixture of the plans from Slamtec and from DroneBot
Workshop, I was able to create a working LIDAR. To connect the RPLIDAR to the Arduino,
power and ground wires run from the Arduino into the breadboard and then into the LIDAR
through the 5-pin connector included with the RPLIDAR kit. Arduino pins 0 (RX), 1 (TX), and 3
(motor controller) connect to their respective lines, along with connector pins 1 and 5 for ground
on the LIDAR and pins 4 and 7 for power. The RX and TX lines must be wired inversely
between the Arduino and the LIDAR, since RX and TX are input and output: the LIDAR
interprets variable distance data and outputs it to the Arduino's input pin. Pins 0 and 1 on the
Arduino Uno R3 communicate with the LIDAR, taking its input and passing it on to the LED.
Pin 3 of the Arduino connects to the LIDAR's motor controller, powering and controlling the
motor that spins the LIDAR so it can scan its environment with full 360-degree omnidirectional
laser range scanning. Pins 9, 10, and 11 control the brightness of the LED; they connect to three
of the LED's four legs, and the fourth leg plugs into ground to complete the circuit. The 5V pin
powers the LIDAR and the LED connected to the Arduino, and the GND pin connects to every
part of the LIDAR and LED that requires a ground.
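The pin assignments described above can be summarized as constants; a sketch targeting this wiring would presumably define something like the following (the names are mine, not Slamtec's):

```cpp
// Pin map for the wiring described above (Arduino Uno R3).
// Pins 0/1 are the Uno's only hardware serial port, which is why the
// LIDAR's TX line crosses over to the Arduino's RX and vice versa.
const int PIN_LIDAR_RX    = 0;   // Arduino RX  <-  LIDAR TX
const int PIN_LIDAR_TX    = 1;   // Arduino TX  ->  LIDAR RX
const int PIN_LIDAR_MOTOR = 3;   // PWM line to the LIDAR's motor controller
const int PIN_LED_RED     = 9;   // pins 9/10/11 are PWM-capable and drive
const int PIN_LED_GREEN   = 10;  // three legs of the RGB LED; the fourth
const int PIN_LED_BLUE    = 11;  // leg goes to GND to complete the circuit
```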
Fig. 2: Schematic diagram of the wiring of the Arduino Uno R3, RPLIDAR A1, and LED.
Software:
Now that the brain (the Arduino Uno R3) and body (the RPLIDAR and LED) are
connected, the mobility aid can be made to work. Downloading the necessary program onto the
Arduino Uno R3 requires the Arduino IDE, an integrated development environment that acts as
the mediator between the code and the actual board: it compiles the program and uploads it onto
the Arduino Uno R3. The IDE can be found at https://www.arduino.cc/en/software. The code
used to program the Arduino is an example project downloaded from Slamtec, the company that
manufactures the RPLIDAR A1. To get the example project, go to the Slamtec website and
download the software development kit (SDK), either directly or through its GitHub repository,
a service that hosts source code. With the SDK, which contains the example project and
everything else needed to program the RPLIDAR A1, we can verify and upload the code to the
Arduino Uno R3. With the Arduino Uno R3, RPLIDAR A1, and variably controlled LEDs, we
can test the effectiveness of the LIDAR by moving the mobility aid in different horizontal
directions to check proper detection of obstacles such as holes, dips in the ground, or walls.
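The per-revolution logic that the uploaded example effectively performs, collecting (angle, distance) points and keeping the nearest valid one to drive the LED, can be simulated off-device like this. The struct and function names are my own; the Slamtec SDK example differs in its details.

```cpp
#include <vector>

// One LIDAR reading: bearing in degrees and range in millimetres.
struct ScanPoint { float angleDeg; float distanceMm; };

// Return the nearest valid point in one full revolution of scan data.
// If no reading is valid, the returned distance stays at -1.
ScanPoint nearestObstacle(const std::vector<ScanPoint>& scan) {
    ScanPoint best{0.0f, -1.0f};
    for (const ScanPoint& p : scan) {
        if (p.distanceMm <= 0.0f) continue;            // skip invalid readings
        if (best.distanceMm < 0.0f || p.distanceMm < best.distanceMm)
            best = p;
    }
    return best;
}
```

On the device, each revolution's nearest point would then set the LED's color from its angle and its brightness from its distance.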
Results
To measure and visualize the effectiveness and efficacy of LIDAR and its applications to
helping blind and visually impaired people, I built a device that takes variable distance data from
the LIDAR and turns it into light, using the brightness of light-emitting diodes to help us
visualize that data. The reasoning for using LEDs instead of developing a more tactile system is
that the LIDAR's effectiveness is harder to qualitatively assess with a vibrating-motor feedback
system or a tactile-sensitivity-based arm than with brightness and light. The wiring and the
schematic for the LIDAR device and mobility aid are shown in figures 1 and 2, and the wiring
for the device is shown in figure 3. The mobility device was used to test the effectiveness and
efficacy of Slamtec's RPLIDAR and to see how applicable LIDAR is to helping the blind and
visually impaired. Upon completion, the LIDAR device proved to be only somewhat effective: it
fails in its accuracy to actually detect walls or other obstacles. The mobility aid is capable of
detecting objects in certain directions but not in others, and although it assigns certain colors to
certain angles, it mistakenly assigns the same color to other angles, making the aid not very
effective at notifying the user where, and in which direction, an object or obstacle is. Using
separate colored LEDs did not fix the problem of the LIDAR missing certain obstacles; instead,
all of the LEDs either stayed bright, not working as intended, or all dimmed when resistors were
applied to lower the current going to them. The only workable solution was to go back to a
single LED that functioned in one respect while malfunctioning in another. The effectiveness of
this LIDAR as a whole is not very useful to blind people, who might not be able to see the LED,
but hopefully the result of this study could inspire other researchers and even hobbyists to
attempt another wrist-mounted LIDAR system that, of course, works better and does not rely on
light. If the user is only visually impaired and able to see color, the mobility aid could still be
useful for telling whether something is in the way or a safe distance away through the brightness
of the LED. A future variant of this device should also detect farther than a couple of feet to be
effective, and should take into consideration different groups of handicapped people, such as
those with limited motor function and the elderly, whose perception of tactile feedback may not
be sensitive enough to feel variably changing distance.
Though not very effective at its goal of helping the blind and visually impaired, this
sample mobility aid does raise questions that can guide other researchers in their own research
and engineering prototypes. LIDAR could be an excellent form of perception, providing
feedback that could replace or supplement existing vision and increase safety for individuals
who are blind or otherwise handicapped. Overall, pairing LIDAR with other safety and mobility
aids would make them much safer.
Discussion
The aim of my research was to test the hypothesis that LIDAR would be effective at
measuring and interpreting variable distance data that could be used in a mobility aid for the
blind and visually impaired. To test this hypothesis, I created a simple LIDAR device and
mobility aid that takes LIDAR data and translates it into something users can see, so that the data
can be assessed efficiently. Being able to see the data, rather than feel it, lets us look at the data
broadly and understand more clearly how the LIDAR interprets its surroundings. Figure 1 shows
the necessary wiring for the mobility aid to work, and figure 2 shows how the RPLIDAR,
Arduino Uno R3, and LEDs are connected to function as a mobility aid that can potentially be
used and built upon. My mobility aid lacked accuracy in measuring and interpreting distance
data, and it lacked many of the basic features that would make it usable or functional. What I
could have done was write my own program for the Arduino and LIDAR so the code and
mobility aid would work together more cleanly. Perhaps another researcher with more electrical
engineering skill, preferably someone with a degree, should take up the task and build a better
version, one that does not rely on light and brightness to tell the user how far away an object is.
What did function was that some angles could variably sense distance while others failed; the
brightness would change in one or two directions but merely blink in others. It is safe to say this
device would not be usable in a real-world situation: the LIDAR, and the system for passing its
readings on to the user, is nowhere near usable enough for a blind person to tell a person apart
from a flying piece of paper. A higher-resolution LIDAR with far better engineering, or a more
detailed way of informing the user of what is around them, would be superior to what I have
now.
The point of my research was to find out whether a hand- or wrist-mounted LIDAR
mobility aid, built cheaply and accessibly, was better than a waist-mounted one or a whole
separate system that interprets data in another way. The result was that, done quickly and
roughly, it is not better. But there is the possibility that it would be effective if the research were
done by an electrical engineer with the understanding needed to handle LIDAR technology and
tactile feedback. I did not effectively fill any gap in my field, except to show that enthusiast- or
entry-level LIDAR technology is not as effective as higher-level, more expensive LIDAR that
performs 360-degree omnidirectional laser range scanning. Another potential way of creating a
more effective but still affordable 360-degree LIDAR would be to purchase a stationary,
one-direction rangefinder and mount it on a motor that spins 360 degrees. The main drawback
would be having to calculate the angle at which the laser is emitted and returns to the sensor,
using the speed at which the LIDAR is spinning, in order to attribute each distance reading
accurately. Using a stationary LIDAR this way could add more difficulty to the back-end
development, requiring more knowledge of Arduinos, programming, and computer literacy
overall.
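The angle bookkeeping that makes this spun-rangefinder idea tricky is simple to state: each sample's bearing follows from the motor's rotation speed and the time since the revolution began. A sketch, with the RPM value and the timing source left as assumptions about such a build:

```cpp
#include <cmath>

// Bearing in degrees [0, 360) of a sample taken tSeconds into a revolution,
// given the motor's speed in revolutions per minute. In practice the motor
// speed would have to be measured (e.g. with an encoder), not assumed.
double bearingDeg(double rpm, double tSeconds) {
    double degPerSec = rpm * 360.0 / 60.0;
    return std::fmod(degPerSec * tSeconds, 360.0);
}
```

At 300 RPM, a sample taken 50 ms into a revolution lies at a 90-degree bearing; the hard part in practice is keeping the rotation speed steady enough for this arithmetic to hold.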
My findings are not groundbreaking in the field of electrical engineering or in helping the
blind and visually impaired. What they do show is that a LIDAR does work and does return
variable distance data that can be given to the user in some form that is possibly accurate enough
to aid them. The problem with my design, and with similar entry-level LIDAR alternatives, is
that it lacks detail; there is not enough information for the user to tell things apart. Even with a
traditional cane or walking stick the user can feel what is beneath them and sense textures such
as the bumps at crosswalks, while a basic LIDAR is not detailed enough to provide comparably
rich feedback.
I most likely wired the mobility aid incorrectly in some way, or there was an
incompatibility with my Arduino that I probably could have fixed with programming that I did
not do. Proper knowledge of C++ and overall literacy with Arduinos and electronics would be
very beneficial to producing a superior LIDAR device for a mobility aid.
Another issue concerning the use of a futuristic LIDAR mobility aid is that older users
could struggle with tactile or sensitivity-based feedback devices such as the LIDAR Assist
Spatial Sensing (LASS) system by Ton et al. Older users might not feel the sensitivity-based
feedback these devices rely on. Compared with such devices, older users and the elderly might
be more accustomed to and comfortable with a traditional walking stick or white cane instead.
My mobility aid, using an entry- and enthusiast-level LIDAR along with easily accessible
devices such as the Arduino Uno R3, Slamtec's RPLIDAR, and an LED, was shown to be not
very effective at obtaining variable distance data that could accurately be turned into feedback
for the user. A more promising way of designing something effective would be to make a
mobility device similar to a self-driving car. Since a self-driving car can tell different objects
apart and understand signs and people, a device that relies on image recognition, either instead of
or together with LIDAR, would probably be more beneficial to the user and to the development
of tactile feedback. Knowing what is around them, paired with a walking stick or aided by
LIDAR, would be more impactful than relying solely on LIDAR. The way self-driving cars
navigate the roads could be applied to people in the real world: image recognition and sensors
could detect hallways and signs, informing the user of what is around them and how to avoid it.
For example, the device could read which light a traffic signal is showing and whether the cars
have stopped, so the user could walk safely across the street. Another alternative would be a
high-resolution stationary LIDAR that can scan more than one point and digitally map points as
usable data, along with camera and position tracking. Luckily, this technology is already built
into modern iPhones and iPads such as the iPhone Pro, iPhone Pro Max, and the iPad Pro
(apple.com). An example of a LIDAR mobility aid for the blind was made by the YouTuber
Stuff Made Here using a new-generation iPad Pro along with some motors for tactile feedback;
it is a great representation of what is possible using something readily available and in many
people's pockets. Stuff Made Here created an iPad app that uses the iPad Pro and its built-in
LIDAR camera to translate pixels and the distances of objects into tactile feedback: a motor
controls metal bumps that extend when the iPad senses something in the distance and retract
when the camera is pointed away (https://www.youtube.com/watch?v=8Au47gnXs0w). In the
future, a high-quality, higher-resolution LIDAR camera, perhaps paired with image recognition
and machine learning, could prove far more effective in helping people who are blind or visually
impaired, and could potentially benefit people with other disabilities. The drawback of a
high-resolution, high-quality LIDAR is that it would make mobility aids more expensive for
people in lower economic situations, though the cost could be covered by health insurance
companies and the government.
Looking at other viewpoints, LIDAR is limited in that it does not give as much feedback
or detail as eyesight would, but it could offer the best alternative to eyesight when compared
with the tactile feedback of a walking cane or other mobility aids. Considering also that LIDAR
must be accurate for the safety of the mobility aid's user, an efficient and properly efficacious
mobility aid could require a higher-cost LIDAR of high enough quality and detail. But that
higher cost could be paid, in whole or in part, by state and local governments as well as by
corporations as part of a group health plan.
1. Vision Problems in the U.S.: Prevalence of Adult Vision Impairment and Age-Related
Eye Disease in America. Schaumburg, IL: Prevent Blindness America, 2012. Available
from URL: http://www.visionproblemsus.org. [accessed January 25, 2022]
2. National Academies of Sciences, Engineering, and Medicine; Health and Medicine
Division; Board on Population Health and Public Health Practice; Committee on Public
Health Approaches to Reduce Vision Impairment and Promote Eye Health; Welp A,
Woodbury RB, McCoy MA, et al., editors. Making Eye Health a Population Health
Imperative: Vision for Tomorrow. Washington (DC): National Academies Press (US);
2016 Sep 15. 3, The Impact of Vision Loss. Available from:
https://www.ncbi.nlm.nih.gov/books/NBK402367/
3. Software. www.arduino.cc/en/software. Accessed 1 May 2022.
4. Saaddine JB, Honeycutt AA, Narayan KMV, Zhang X, Klein R, Boyle JP. Projection of
Diabetic Retinopathy and Other Major Eye Diseases Among People With Diabetes
Mellitus: United States, 2005-2050. Arch Ophthalmol. 2008;126(12):1740–1747.
doi:10.1001/archopht.126.12.1740
5. Resnik, L., Allen, S., Isenstadt, D., Wasserman, M., & Iezzoni, L. (2009). Perspectives on
use of mobility aids in a diverse population of seniors: implications for intervention.
Disability and health journal, 2(2), 77–85. https://doi.org/10.1016/j.dhjo.2008.12.002
6. C. Ton et al., "LIDAR Assist Spatial Sensing for the Visually Impaired and Performance
Analysis," in IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol.
26, no. 9, pp. 1727-1734, Sept. 2018, doi: 10.1109/TNSRE.2018.2859800.
7. DroneBot Workshop. "Getting Started with LIDAR." YouTube,
www.youtube.com/watch?v=VhbFbxyOI1k&t=2003s. Accessed 1 May 2022.
8. How to Scan and Analyze Surroundings with the LIDAR Scanner on IPad Pro - Discover
- Apple Developer. https://developer.apple.com/news/?id=qwhaoe0x. Accessed 2 May
2022.
9. “See in Complete Darkness with Touch.” YouTube, 20 June 2020,
www.youtube.com/watch?v=8Au47gnXs0w. Accessed 1 May 2022.