Design of an Improved Electronics
Platform for the EyeRing Wearable
Device
by
Kelly Ran
S.B. E.E., M.I.T., 2012
Submitted to the Department of Electrical Engineering and Computer Science
in Partial Fulfillment of the Requirements for the Degree of
Master of Engineering in Electrical Engineering and Computer Science
at the Massachusetts Institute of Technology
September 2013
© Massachusetts Institute of Technology. All rights reserved.
The author hereby grants to M.I.T. permission to reproduce and to distribute
publicly paper and electronic copies of this thesis document in whole and in part
in any medium now known or hereafter created.
Author: [Signature redacted]
Department of Electrical Engineering and Computer Science
September 27, 2013
Certified by: [Signature redacted]
Pattie Maes, Alexander W. Dreyfoos Professor of Media Technology, MIT Media Lab
Thesis Supervisor
September 27, 2013
Accepted by: [Signature redacted]
Albert R. Meyer
Chairman, Masters of Engineering Thesis Committee
Design of an Improved Electronics
Platform for the EyeRing
Wearable Device
by Kelly Ran
Submitted September 27, 2013 in Partial Fulfillment of the
Requirements for the Degree of
Master of Engineering in Electrical Engineering and Computer Science
Supervisor: Professor Pattie Maes
Abstract
This thesis presents a new prototype for EyeRing, a finger-worn device equipped with a camera and other peripherals. EyeRing is used in assistive technology applications, helping visually impaired people interact with uninstrumented environments. Applications for sighted people are also available to aid in learning, navigation, and other tasks. EyeRing is a wearable electronic device with an emphasis on natural gestural input and minimal interference. Previous prototypes used assemblies of commercial off-the-shelf (COTS) control and sensing solutions. A more custom platform, consisting of two printed circuit boards (PCBs), peripherals, and firmware, was designed to make the device more usable and functional. Firmware was developed to improve the communication capabilities, microcontroller functionality, and system power use. The improved features allow the pursuit of previously unreachable application spaces. In addition, the smaller form factor increases usability and device acceptance. The new prototype improves power consumption by X, volume by Y, and throughput by Z. Video input is now available, among other new capabilities.
Acknowledgements
I would like to thank Pattie Maes for the wonderful opportunity and resources to
work on EyeRing. Thank you for welcoming me into your group and sharing your
advice and time. I truly value the huge learning experience that you have enabled
and the environment you have created in the Fluid Interfaces group.
Thanks to Roy Shilkrot for guidance and encouragement. I deeply appreciate the
energy that you've put into helping me. Thanks to the entire EyeRing team for your
previous work and current advancements. Thanks to Cassandra Xia for problem-solving advice and for being a superlative officemate.
Thanks to Steve Leeb and the Course 6 department for providing a TAship from
which I learned a lot.
Thank you to my family and friends, whose undying love and support have sustained
me through my time at MIT. Thank you to my labmates, solar car teammates, and
wolf pack buddies for the good times. Thanks to the MIT community for teaching
me the meanings of IHTFP.
Contents

Abstract
Acknowledgements
List of Figures
List of Tables

1 Introduction
1.1 Motivation
1.2 EyeRing
1.2.1 EyeRing motivation and basic system
1.2.2 Applications
1.3 Project objectives
1.4 Chapter summary

2 Background
2.1 Previous work in wearable electronics
2.1.1 Existing ring-like and seeing projects
2.2 Current solutions for the visually impaired
2.3 EyeRing background
2.4 Chapter summary

3 Prototype design
3.1 Design preliminaries
3.2 Electrical design
3.2.1 Form factor
3.2.2 Energy budget
3.2.3 Data acquisition and transfer
3.2.4 Peripheral inputs and outputs
3.2.5 Control and firmware
3.3 Mechanical design
3.4 Design phase conclusions
3.5 Chapter summary

4 Prototype implementation
4.1 Debug board
4.2 Final prototype
4.3 A note on component selection
4.4 Chapter summary

5 Firmware implementation and discussion
5.1 Functional overview
5.2 More firmware details
5.3 Chapter summary

6 Evaluation
6.1 Performance and Discussion

7 Conclusions and suggestions
7.1 Conclusions
7.2 Suggestions for further work
7.3 Summary

A EyeRing schematics and layouts

Bibliography
List of Figures

1.1 1st-generation EyeRing
1.2 1st-gen EyeRing system block diagram
2.1 Honeywell 8600 Ring Scanner [1]
2.2 Sesame RFID ring [2]
2.3 Fingersight prototype [3]
2.4 OrCam product with glasses [4]
2.5 Close-up of refreshable Braille display [5]
2.6 Software architecture for first-gen EyeRing [Shilkrot]
3.1 2nd-gen EyeRing system block diagram
3.2 1st-gen white EyeRing
3.3 Clay model (target volume)
3.4 EyeRing battery
3.5 1st-gen device block diagram
4.1 Debug board
4.2 Omnivision OV9712 camera
4.3 Front view of camera module
4.4 Rear view of camera module
4.5 Side view of the new boards
4.6 Other side view of the new boards
4.7 Top PCB ("A") stack-up
4.8 Bottom PCB ("B") stack-up
5.1 Functional block diagram for firmware
6.1 Prototype on finger
List of Tables

3.1 Sample battery energy densities
3.2 Selected components
3.3 Theoretical power losses of selected components
6.1 Comparison of EyeRings
Chapter 1
Introduction
This chapter covers the project's motivation and objectives.
1.1 Motivation
This project is motivated by the potential impact of wearable electronics. As electronic components have diminished in size and technology has advanced, the use of mobile devices has exploded. Wireless device penetration in the United States (US) is 102% [6], so it is reasonable to assume that most of the US population uses wireless mobile devices (as some people own multiple devices). As widespread as they are, mobile phones have a computing interface whose main modalities are left over from the personal desktop computer: users type on a keypad interface for input and see information on a visual display. Modern technology has enabled other interesting paradigms, and we explore that of gestural input for wearable electronics.
In the past two decades, intentionally wearable devices (such as headsets, smart watches, and augmented glasses) have been prototyped and made available as consumer devices. Currently, a host of wearables exist that bring to life the artifacts¹ that have been hypothesized for decades [7]. These devices have myriad applications and fall under the overarching goal of improving human capabilities. Many areas of exploration exist in the design space. We are motivated by the search for a wearable that is relatively seamless to use. We also are motivated by the natural gestural language that humans use.

¹Tools or devices that augment human intelligence.
Qualities that we seek in a wearable are: natural, using gestures and commands that humans naturally intuit; immediate, meaning instantly on-demand as opposed to accessible only after unlocking a screen and navigating to an app; non-disruptive, so that we may use our senses and bodies unhindered, especially our sight and our hands; and discreet, to increase portability, mobility, and user acceptance.
This project was carried out in the Fluid Interfaces group of the MIT Media Lab.
The goal of the group is to explore new ways that people can interface and interact with their everyday environments. Areas of interest include wearable devices,
augmented reality, and intellect augmentation. The EyeRing project is creating a
wearable device that has both universal and niche applications. Our original prototype was designed as assistive technology for the visually impaired. Shilkrot and
Nanayakkara were motivated to help the blind "see" visual information that sighted
people take for granted.² Reports show that many VI individuals cannot perform
tasks that are required to function as independent adults. For instance, many VI
people are not able to manage their finances independently or even withdraw cash
from an ATM. Worldwide, there are 285 million visually impaired people. Of them,
39 million are blind [8] and could benefit from our work.
There are two main areas of text-based information that the blind must interpret.
The first area is reading published text in paper or digital form. The second area
of interest is reading text in everyday contexts, like in public transportation maps,
food & medicine labels, and restaurant menus. In both areas, studies have shown
that blind people overwhelmingly have difficulties in accessing relevant information.
Text-based interpretation tools exist, but many either are cumbersome to use, are
not mobile, or cannot parse all of the types of information available. The Royal
National Institute of Blind People found that out of the 500 visually impaired people surveyed, only 23% managed their finances completely independently. Various
activities eluded their grasp: for example, 45% of the people "experienced difficulties with distinguishing between bank notes and coins." One person reported that
"because [ATMs] are different therefore I'm unsure what service I am asking for
because I cannot see what I am pressing on the machine" [9].
²"Visually impaired" (VI) refers to those with varying degrees of eyesight degradation. Our assistive technology research is relevant mainly to the completely blind and those with severe sight impairment. In this paper, the terms "visually impaired" and "blind" are interchangeable and refer to the subset of the population who could benefit from EyeRing.
Not only do blind people have disadvantages in reading text, but they also have
issues with interpreting other visual information like colors, depth, and object locations. For example, a 2012 interview with a blind user revealed that in order to
select a shirt to wear, the user needed to unlock his mobile phone, pick the right
application, and use the phone's camera to find out the color of the shirt [10].
Thus, we see the need for a device to aid the blind, because the world requires much visual sense to interpret. Existing solutions can be very difficult to use. For example, many VI people have difficulty aiming devices correctly at text, such that their results are subpar due to poor focusing or centering. A finger-mounted, discreet, feedback-enabled device could help the blind interface with their environments more easily and parse more information that would otherwise be unavailable to them.
Finger pointing has been shown to be a natural gesture in many cultures [11].
Despite the universality of the gesture, relatively few finger-worn user interfaces
have been developed.
EyeRing addresses the opening in this design space. The
EyeRing project also investigates how devices can provide just-in-time information
without being obtrusive and without impeding the user's natural experience.
This author was motivated by the idea that custom hardware could greatly increase
the functionality and user acceptance of EyeRing, and could erode the disadvantages
that the blind face.
1.2 EyeRing
The EyeRing project aims to use pointing gestures to make multimodal input more
intuitive and natural. EyeRing was originally developed to aid the blind in reading
visually communicated information, like price tags and product labels. We are now
exploring applications for sighted people as well.
1.2.1 EyeRing motivation and basic system
The basic EyeRing system, first conceived in 2012, consists of the EyeRing device and a host that performs application processing. The EyeRing device consists of a microcontroller, a wireless communication module, peripheral I/O, and energy storage. The previous implementations ("first-generation") have used the Teensy 2.0 Arduino-based microcontroller, the RN-42 Bluetooth module, an Omnivision-based camera module with image compression engine, a push button, and a lithium-polymer battery with a Sparkfun control board. A side view is shown in Figure 1.1.
FIGURE 1.1: A side view of a first-generation EyeRing implementation. (Labeled in the figure: camera; battery, power management, and microcontroller.)
The push button and camera are input peripherals whose data are sent via Bluetooth
to the host, which can be a mobile phone or a desktop computer. The host then
uses the input data to gather information and send relevant output to the user. In
some applications, the host also uses its microphone and parses spoken inputs from
the user. The host typically outputs audio speech that the user can hear. A system
block diagram is shown in Figure 1.2.
1.2.2 Applications
EyeRing's assistive technology applications have been demonstrated to help the visually impaired, especially the completely blind, do everyday tasks more easily. Applications have been developed for identifying currency and supermarket items, reading price tags, and copying and pasting text. User studies showed that these applications helped the blind in shopping and using computers without the need for assistance [10]. Currently the EyeRing team is also working on a text reading application for published documents and everyday items (like printed books and financial statements). We believe that a two-pronged approach (object recognition & text reading) will be most useful to the blind.
FIGURE 1.2: EyeRing system block diagram for previous implementations.
Because EyeRing's technology can be useful to many subsets of the population,
we are developing applications for sighted people. People use the pointing gesture
when querying objects, as in "what is this?" "how does it work?" or when drawing
attention to an extraordinary object or event: "hey, look over there at the sunset!"
Pointing is also used to represent spatial reasoning: "move that box to the left."
When the pointing gesture is combined with EyeRing's computational abilities, we
can imagine useful applications. The host processor could hypothetically perform
object and character recognition, mine databases, acquire and upload information
from the cloud, store inputs into folders or documents, communicate with other
people, and more.
We reasoned that learning is a core application of EyeRing, because pointing is
a canonical inquiring gesture, and retrieving educational information can be done
with the host processor. We also identified navigation and "enlightened roaming"
as another key application. Learning about local landmarks and buildings could
be useful for both tourists on a mission and inquisitive explorers who just want to
play. Finally, we recognized life and health logging as other potential applications.
Taking images and recording motion, users could capture important life events as
well as mundane meals to compose a portfolio of their lives.
We found the learning and navigation applications to be more technically interesting, so we are pursuing those. Earlier in 2013, an application was developed for learning to read musical scores: the application reads sheet music, plays the notes audibly, and shows the user which piano keys correspond to the music. We are now working on an application for indoor and outdoor navigation, which will supply just-in-time information based on location and gestures.
1.3 Project objectives
Due to limitations in hardware, first-generation EyeRing devices can be improved upon greatly. The Teensy's 8-bit microcontroller has limited functionality and is a bottleneck in data acquisition and transfer. In addition, the Teensy implementation has unreliable latency in running firmware. The RN-42 Bluetooth module ships with a software stack that limits the throughput of the communications. Additional peripherals, such as motion capture sensors, could be added to track the user's finger speed and acceleration and to expand the gestural input types. The device could be made smaller to increase user acceptance, and its battery life (and general energy budget) could be improved. Many of the limitations described can be resolved with custom hardware.
The primary objective of this project was to create a more custom hardware implementation ("second-generation") of EyeRing. Many of EyeRing's hypothesized
use cases would be furthered and enabled by a second-generation device. Project
benchmarks were: to demonstrably improve upon EyeRing's size, energy budget,
and data capability; to develop a robust firmware suite; and to assist in the development of user-side applications. The EyeRing improvements could be made by
substituting some or all of the COTS modules with custom PCBs. These custom
boards would use components that were chosen to fulfill EyeRing's performance and
efficiency needs.
1.4 Chapter summary
Wearable electronics have huge potential to influence human society, and we are
particularly motivated by creating an intuitive, gestural wearable. Such a device
has many possible uses to help us learn new skills, navigate areas and discover
places, and record life & health metrics. It could also greatly ameliorate the lives of
blind people and narrow the gap between what sighted and what blind people can
"see."1
EyeRing, this project's device, has been prototyped before and shown as a proofof-concept for assistive technology for the blind. It has also been implemented as
a learning tool for sighted people. EyeRing's full potential could be explored with
a better design, which is the goal of this research. We envision EyeRing as a tool
for both organized and whimsical learning. One could use EyeRing in a classroom
setting to learn to read music, and then go outdoors to explore and learn about
nature. In both scenarios, EyeRing could snap memories and salient tidbits of one's
experiences.
Chapter 2
Background
Relevant wearable electronics projects are presented.
Devices that are similar in
form and function to EyeRing are listed. Limitations and tools pertaining to the
visually impaired are covered. Previous work on EyeRing is explained.
2.1 Previous work in wearable electronics
As discussed in the previous chapter, we are interested in creating a gestural wearable device with the following qualities: natural, immediate, non-disruptive, and discreet. The purpose of creating such a wearable is to further the cause of intelligence-augmenting devices. Our dream for such devices is to empower humans with joy, purpose, and experiential fulfillment.

As EyeRing sits at the intersection of several fields (i.e. gesture-based, finger-worn, wearable, intellect-augmenting devices), we will explore previous work in those related categories from the recent past.
The pointing gesture was used in Bolt's Put-That-There, an MIT project where a user could create and manipulate shapes by speaking and pointing. Put-That-There would parse the user's speech, recognize the pointing direction, and display the environment with a room-size projection and several computer monitors [12].
In the 1990's, emerging technologies enabled researchers to create the first wave of
wearable devices. MIT's Starner and Rhodes explored augmented memory with the
Remembrance Agent, which used a field-of-view display to show search results of
emails and past documents. Adding a head-mounted camera to this display allowed
the user to point at display options with her finger [13]. Thus, a mobile, wearable
device to show relevant just-in-time information was born. A decade later, MIT's
Maes and Merrill created the Invisible Media project, which provided a hands-free
audio "teacher" or "recommender" to the user, who could point at or manipulate
queried objects [14]. The user could wear a ring and point at instrumented scenarios
or objects. One application was My-ShoppingGuide, where users could experience
personalized suggestions for food and health products. Another application was
Engine-Info, where users could learn about internal combustion engine components.
Invisible Media used Rekimoto's two guidelines that wearable technologies should
be hands-free and socially acceptable.
Currently, augmenting mental and physical intelligence is manifested in many ways:
enforcing habits, providing just-in-time information, recording health metrics, and
life logging. Some well-known recent devices use augmented reality to display information to users. Mistry's Sixth Sense uses projected images along with a camera to
capture hand gestures, and Google Glass uses a heads-up display. Smart watches
are gaining in popularity and include Pebble and the Samsung Galaxy Gear. These
watches do away with the inconveniences of mobile phones. Health monitoring devices include Misfit Shine, Fitbit products, and Nike Plus. Life logging projects
include Sensecam and Memoto.
2.1.1 Existing ring-like and seeing projects
We have surveyed solutions that are similar to EyeRing in form and/or function.
Existing ring-like devices typically perform single functions. Of these devices, the
optical sensing types use the pointing gesture. Logisys, Mycestro, and Brando are
among those who produce optical finger mice for interfacing with computers. Typically the sensor camera faces distally, and a scrollwheel is mounted on the mouse's
side for thumb scrolling.
Various manufacturers produce finger-mounted optical
barcode scanners for package handling and inventory needs. These devices are usually quite large and include a thumb-actuated button. See Motorola's RS-1 Ring
or Honeywell's 8600 Ring Scanner (shown in Figure 2.1). Sesame Ring, from the
MIT-based startup Ring Theory, uses a finger-mounted RFID tag to interface with
public transportation card readers.
This device allows seamless subway (MBTA
"T" trains) access by tapping the ring against a subway entrance kiosk. See Figure
2.2 for a product rendering.
FIGURE 2.1: Honeywell 8600 Ring Scanner [1].
Ubi-finger was a Japanese project that used a sheath-like finger-worn device to
remotely control objects. By pointing and clicking at light switches, audio speakers,
and other devices, the user could turn on/off and adjust the device controls.
Fingersight used a camera and a vibration motor to help the visually impaired sense
and identify objects in a 3-dimensional space. Explored by researchers at Carnegie
Mellon University, Fingersight provided haptic feedback to users when the device
"saw" object edges or features from a database. Fingersight could also allow the user
to control a simple interface, like a light switch, after identifying it [3]. Fingersight
used a harness to connect to a host processor (as seen in Figure 2.3).
OrCam is an Israeli product that parses visual information from a head-mounted
camera (see Figure 2.4). A host processor, wired to the headpiece, performs object
recognition and generates speech, which is then conveyed to the user through a
bone-conduction earpiece [4]. OrCam uses an object database and has implemented machine learning so the device can add to its database.

FIGURE 2.2: Sesame RFID ring [2].
FIGURE 2.3: Fingersight prototype [3].
FIGURE 2.4: OrCam product with glasses [4].
2.2 Current solutions for the visually impaired
As explained in the previous chapter, blind people need more ways to access visually conveyed information. This is due to the following limitations:

- Blind-friendly publishing standards for digital multimedia are not universally used.
- Not all digital media can be converted into blind-friendly formats anyway, due to legal issues.
- Even when media are converted to accessible formats, solutions (like software or devices) can be cumbersome and difficult to use.
- Not many solutions exist to tackle other forms of visual data, like colors, depth, object proximity, etc.
Many popular e-books and documents are available in blind-accessible formats such
as Braille, large text, and audio description (AD). The advent of electronic readers
and portable Braille readers has helped disseminate such formats. According to the
Royal National Institute of Blind People (RNIB), an advocacy and research group
in the U.K., 84% of the top ten thousand most popular e-books were fully accessible
to the blind [15]. However, the World Blind Union estimates that only 1%-7% of
worldwide published books are converted to blind-accessible formats [16]. Thus, less popular books, textbooks, scholarly articles, and other documents are largely unavailable to the blind.
Copyright restrictions are a hindrance to converting documents into blind-accessible formats. For instance, the 1996 Chafee Amendment allows copyright-protected literary documents to be converted by non-profit organizations. However, this is still
limiting because such organizations have relatively low bandwidth, and converted
documents cannot be shared with people in other countries (save for inter-library
loan agreements) [17]. More recently, The World Intellectual Property Organization (WIPO), an agency of the United Nations (UN), passed a treaty that bypassed
copyright law to benefit blind access. Some nations (including the United States)
opted not to sign the treaty [18]. Even if the US were to sign and ratify the treaty, it
is not guaranteed that the treaty would be implemented fairly, given the anti-treaty
stance taken by powerful lobbies like the Motion Picture Association of America
(MPAA) [19].
Existing solutions, like scanners and software to read documents, are useful. However, surveyed blind users commented that many such programs can be inflexible, reading from the top to the bottom of the document while the user only wants to scan sentences throughout the page. In addition, some books are not available online. Furthermore, the RNIB reports that the elderly blind tend not to use computers at all (12% use computers in the age group over 75) [20].
Format conversion solutions exist and can help. However, as explained previously, many documents cannot be converted to blind-accessible formats due to legal restrictions. Text conversion methods are also limited by publishing standards and technologies. One of these standards is the DAISY (Digital Accessible Information System) Digital Talking Book,
which incorporates MP3 and XML data with electronic multimedia, like books or
magazines. DAISY files can be easily parsed into audio, large text, or refreshable
Braille. EPUB3 is a digital publishing standard that, among other things, includes
DAISY-compatible features. For example, EPUB3 calls for mathematical equations
to be embedded in XML format instead of as images. This way, the equations can
be read out to blind users. EPUB3 has been hoped to be a universal publishing
format for digital multimedia. However, many publishers prefer to use proprietary
formats to lock in customers [21]. In addition, application and device manufacturers don't always comply: among Adobe Digital Editions, Amazon Kindle, Apple
iBooks, CourseSmart, Google E-Books, Nook, Safari Books Online, and many other
readers and apps, not one has complete EPUB3 compatibility [22]. Only the future
will tell if EPUB3 will catch on universally. Until then, format compatibility will
be a recurring thorn in blind users' sides.
When blind-friendly file formats are used, they often work in tandem with screen
readers or Braille devices. Screen readers, which are programs that use speech synthesizers to read text, include JAWS, HAL, Window-Eyes, and Blio. Some e-readers
and mobile devices have built-in speech synthesizers, like Apple iOS' VoiceOver.
Plugging into computers or mobile devices, refreshable Braille displays can convert
digital text into Braille. These displays use moveable pins to reconfigure the Braille
surface that users can read.
Figure 2.5 shows the array of pins that form the Braille characters.
paper. Braille labelers allow users to mark everyday items for identification (e.g.
to mark prescription medications for differentiation). Labelers like 6dot can work
with a keyboard interface, allowing ASCII-to-Braille translation, or can work with
a 6-button interface to explicitly call out the Braille dot configuration.
FIGURE 2.5: Close-up of refreshable Braille display [5].
Many natural environments contain visual information that has not been addressed widely. The RNIB reports that, of blind people who do not go shopping, 77% avoid it because they have "difficulty seeing prices or reading labels" [20]. An RNIB survey on medicine labels found that a majority of surveyed VI people would use a hypothetical pen-like device that could read labels, as Braille labelling can be inconsistent in quality. (For example, 73% of the 120 surveyed said that they had experienced Braille labels that had been covered up by other labels, making the Braille unreadable [23].) OrCam, mentioned earlier in this chapter, is a new product that aims to recognize objects. Fingersight, the CMU research project, can identify depth and textures, but no existing commercial product has similar functionality. Finally, no known products combine object recognition, text reading, and sensory information.
2.3 EyeRing background
As described in the previous chapter, EyeRing came about by thinking of finger-pointing scenarios. With the first-generation EyeRing, several applications were
developed. An EyeRing ShoppingAssistant application had two modes to help blind
people shop: CurrencyDetector to identify bills of money, and TagDetector to read
UPC barcodes on product labels.
A user group of 29 blind people commented,
almost unanimously, that the device was very easy to use and would solve some of
their everyday challenges. In a supermarket, a blind user could use the EyeRing to
choose a package of Cheez-Its out of shelves of morphologically similar foodstuffs.
The software architecture for ShoppingAssistant is shown in Figure 2.6. Another
application, DesktopAssistant, aided sighted people in "copying" images and text
from the environment and then pasting the visual data into a digital document.
Most of the surveyed users agreed that EyeRing was useful in their figure and text
insertion.
FingerDraw, a project spun off from EyeRing, is a joint research project from MIT
and the Singapore University of Technology and Design (SUTD). Using the EyeRing hardware, FingerDraw can capture colors and textures from the physical world.
Users can later use the captured palette in their tablet-drawn artwork, thus keeping natural environments connected to digital creations.
FingerDraw recalls the
selection and mixing of oil paints for artwork and also draws inspiration from finger-painting. FingerDraw is mainly used as educational technology for children to stimulate creativity and to encourage natural, multisensory experiences [24].

FIGURE 2.6: Software architecture for first-gen EyeRing [Shilkrot].
More recently, joint research from MIT and SUTD explored feedback modalities
for EyeRing prototypes. This ongoing research analyzes the effects of using haptic
feedback (vibromotors) in different spatial and temporal configurations.
Although EyeRing has received overwhelmingly positive feedback, many obstacles
stand in its way to becoming a truly usable wearable.
The hardware prototype
could use revision, and the host application software could also be expanded and
improved.
2.4 Chapter summary
In this chapter, we first discussed wearable electronics and intelligence-augmenting technology. A brief history of wearable electronic devices similar to EyeRing was covered. The state of affairs and technologies for blind people were described. Finally, previous work in the EyeRing project was presented.
Chapter 3
Prototype design
This chapter presents the design considerations for the new EyeRing prototype. Electrical, mechanical, and firmware specifications are discussed, and design decisions are reviewed.
3.1 Design preliminaries
Our high-level concept for the second-generation EyeRing is to support powerful and valuable applications for sighted and blind people. The requirements for such applications drive the electrical specifications. Many of the current EyeRing applications could be improved with better electrical characteristics, and previously unreachable applications could be implemented. The main improvement fields are form factor, energy budget, and data transfer rate.
Additional desired features are gesture and motion capture, audio output, and directional (compass) capabilities. With previous EyeRing prototypes, users reported that it was difficult to aim the EyeRing camera properly. To address that concern, we added two feedback modalities to the second-generation EyeRing. First, a set of vibration motors guides blind users in text reading. Additionally, a laser diode beam shows sighted users where they are aiming the camera. We also added a motion processing unit (MPU), consisting of a 3-axis accelerometer and a gyroscope. This unit captures gestural input.
FIGURE 3.1: EyeRing system block diagram for new design.
The new EyeRing system is much like the previous design. Note in Figure 3.1 that
the new design includes haptic and visual feedback, and that the EyeRing device
captures gestural data in addition to images and button presses.
3.2 Electrical design

3.2.1 Form factor
Size and shape are major factors when users decide whether to use a wearable technology. Previous EyeRing iterations have been roughly 3.25 cubic inches in volume and have assumed both block-like (Figure 3.2) and aerodynamic shapes.
The battery is one of the most volumetric components of the electrical system. There is a clear and inherent tradeoff: the smaller the battery, the less charge it holds. Thus, given a certain battery chemistry, a smaller and more compact design would have less energy available to power the electronics. The design assumes an energy budget for the entire device, calculates a necessary battery charge capacity,
FIGURE 3.2: A first generation EyeRing with white housing.
and then chooses the smallest possible battery. The energy budget is discussed in
more detail in Section 3.2.2.
The peripherals (sensors and input) and microcontroller components were chosen to
be as small as possible. For shape and usability, the obvious requirements are that
the camera must face a useful direction and the input devices must be intuitive
and easy to trigger. We found that actuating input buttons with our opposable
thumbs was the best method. It was determined that a forward-facing camera and
side-mounted input button would be best. It suffices to say that care was taken to
minimize volume and area when selecting components and laying out the 6-layer
PCB.
We used previous mock-up clay models to figure out a reasonable target volume.
The clay model is shown in Figure 3.3.
FIGURE 3.3: Clay model (target volume).

3.2.2 Energy budget
Currently, users expect to be able to use personal electronics for at least a few hours
on a single battery charge. The first generation EyeRing design lasts for about one
Chapter 3. Prototype design
32
hour with moderate camera use (i.e. roughly one camera shot per second). We aim
for the second generation to last an order of magnitude longer (and thus be able to
support moderate use for a full work day).
The energy budget accounts for: stored energy (battery), consumed energy (by
the electronics), and gained energy (through harvesting techniques).
Harvesting
techniques such as photovoltaic, thermal gradient, piezoelectric, and radio frequency
harvesting were investigated and deemed suitable for perhaps future generations of
EyeRing. The added bulk and complexity of such harvesting systems are not worth
the amount of energy available for harvest.
Stored energy can be approximated by battery charge capacity (amp-hours) multiplied by nominal battery voltage (volts) to yield watt-hours. This is an approximation because the battery voltage follows a nonlinear curve through its discharge. Watt-hour energy divided by required usage time (in hours) gives the maximum average power that can be supplied by the battery. Because EyeRing is a wearable device, we seek a battery that is energy-dense in terms of volume and mass. In addition, the battery must be able to deliver enough current to supply the Bluetooth module's power-hungry transmissions. Typical characteristics for different battery chemistries are shown in Table 3.1. As shown, lithium-ion (Li-ion) and lithium polymer (LiPo) battery cells have relatively high energy density.
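As a worked example of this arithmetic, using the SparkFun cell chosen below (its tabulated 0.4 W-h energy implies a capacity near 110 mAh; the exact rating is an assumption here) and the ten-hour usage target discussed above:

\[
E \approx Q \, V_{\mathrm{nom}} \approx 0.11\ \mathrm{Ah} \times 3.7\ \mathrm{V} \approx 0.4\ \mathrm{Wh},
\qquad
\bar{P}_{\max} = \frac{E}{t_{\mathrm{use}}} \approx \frac{0.4\ \mathrm{Wh}}{10\ \mathrm{h}} = 40\ \mathrm{mW}.
\]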
TABLE 3.1: Sample battery energy densities.

Chemistry | Manuf. or vendor | Part number | Mass (g) | Volume (cm³) | Energy (W-h) | Energy/mass (W-h/g) | Energy/volume (W-h/cm³) | Notes
Lithium ion | Panasonic | NCR18650B | 47.5 | 15.5 | 12.06 | 0.25 | 0.78 | Cylindrical. Too large for EyeRing.
Lithium polymer | SparkFun | PRT-00731 | 2.65 | 1.9 | 0.4 | 0.15 | 0.21 | Pouch. Chosen for EyeRing.
Lithium | Panasonic | CR2032 | 2.81 | 1 | 0.75 | 0.27 | 0.75 | Coin cell. Primary, insufficient current sourcing.
Lithium thin-film | Infinite Power Solutions | MEC202 | 0.98 | 0.23 | 0.01 | 0.01 | 0.04 | Thin-film. High current capabilities, good for energy harvesting.
Lithium thin-film | Infinite Power Solutions | "High Energy Cell" | N/A | 0.1 | 0.34 | N/A | 3.4 | Available 2014; coin cell form factor.
The Sparkfun lithium polymer batteries were chosen due to their electrical characteristics, ease of acquisition, and rectangular form factor (for easy packing into a
ring housing). The Panasonic 18650 form factor cells have high energy density and
are used in laptops and high-performance electric cars, but are too large to use in EyeRing.

FIGURE 3.4: The lithium polymer battery used for both 1st- and 2nd-generation EyeRing devices.

The Panasonic CR2032 and other similar coin cells are not rechargeable
and cannot source enough current. The IPS MEC202 is great for energy harvesting
(due to its ability to withstand large voltages and charge at high currents) and may
be a good candidate for future EyeRing prototypes. The IPS High Energy Cell is
a coin cell replacement that is extremely energy-dense. As of summer 2013, it was
not available, but may be good to keep in mind for the next EyeRing generation.
We did consider other battery chemistries, like zinc-air batteries for hearing aids,
but chose to stay with lithium polymer.
In the final 2nd-generation prototypes, we chose Linear Technology's LTC3554
Power Manager IC for charging the battery via USB and for controlling two buck
converters that produced 1.8V and 3.3V. We also added a resistor divider as an input to one of the microcontroller's ADC pins. This divider allowed the measurement
of the battery voltage for telemetry purposes.
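The telemetry conversion itself is simple arithmetic on the ADC reading. The following is a minimal sketch, assuming a 2:1 divider (two equal resistors), a 10-bit ADC, and a 3.3 V reference; none of these values are specified here, so treat them as placeholders:

    #include <stdint.h>

    /* Assumed telemetry parameters (placeholders, not from this design). */
    #define ADC_FULL_SCALE  1023u   /* 10-bit ADC */
    #define ADC_VREF_MV     3300u   /* ADC reference, millivolts */
    #define DIVIDER_RATIO   2u      /* V_batt = 2 * V_pin for equal resistors */

    /* Convert a raw ADC reading of the divider midpoint to battery millivolts. */
    static uint16_t battery_mv_from_adc(uint16_t raw)
    {
        uint32_t pin_mv = ((uint32_t)raw * ADC_VREF_MV) / ADC_FULL_SCALE;
        return (uint16_t)(pin_mv * DIVIDER_RATIO);
    }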
The following sections describe the selection of other ring components, and the
theoretical energy budget for these components is at the end of this chapter.
3.2.3 Data acquisition and transfer
Previously, in first-generation devices, EyeRing was capable of sending VGA-resolution images at a maximum rate of one per minute. As seen in Figure 3.5, potential bottlenecks were: camera processing rate, camera-microcontroller SPI rate, microcontroller processing, microcontroller-Bluetooth UART rate, and Bluetooth module limitations. The theoretical data capability of the RN-42 Bluetooth module is 240 kbps, using the Serial Port Profile (SPP).¹

¹Over-air throughput is roughly 1-2 Mbps.
FIGURE 3.5: First-generation device block diagram.
For the new EyeRing, we aim for VGA-resolution video at 10 fps or higher. This gives us a required data rate of 9.21 megabytes per second, uncompressed.² JPEG compression typically reduces image data by an order of magnitude, so we aim for roughly 1 megabyte per second. We also require an embedded system that can move large chunks of data at this rate or higher. (Thus, a microcontroller or microprocessor with Direct Memory Access (DMA) capabilities is desirable.)

²RGB data is 3 bytes per pixel, and VGA resolution is 640 × 480 pixels.
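Spelled out with the footnoted assumptions (3 bytes per RGB pixel, 640 × 480 pixels per frame):

\[
640 \times 480\ \mathrm{px} \times 3\ \mathrm{B/px} \times 10\ \mathrm{fps} = 9{,}216{,}000\ \mathrm{B/s} \approx 9.21\ \mathrm{MB/s},
\]

and an order-of-magnitude JPEG reduction brings the requirement to roughly 1 MB/s.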
Many applications require data rates on the order of 1-10 Mbps. Technologies that allow this include WLAN, Bluetooth, Zigbee, and NordicRF. Many COTS modules limit packet size and are not customizable. The data transmission technology must be broadly applicable and easily ported. Therefore, proprietary technologies like Zigbee and NordicRF were ruled out because they require the host processor to be equipped with a non-standard transceiver. WLAN and Bluetooth were compared in terms of power consumption, device range, and complexity. Ultimately, Bluetooth was chosen. Although WLAN has a much higher range, EyeRing is meant to be paired with a nearby host device, so the required range is no more than a couple of meters. Between WLAN and Bluetooth, Bluetooth has favorable power consumption characteristics [25]. Bluetooth is a personal area network (PAN) technology, which is more aligned with the purpose of EyeRing. In addition, the EyeRing device, like many wearables, is paired to a personal cell phone. Bluetooth is well suited to such applications where an "accessory" device is tethered to a host. Finally, the Bluetooth Special Interest Group (SIG) is continually working on the standard to better suit
application needs. For example, Bluetooth Low Energy (BLE), the newest Bluetooth version, has extremely low data rates but allows devices to run on very low
power. This use model could work for some EyeRing packets, like simple telemetry
(containing battery voltage and user motion).
Among the smallest available Bluetooth modules, some came with "plug-n-play"
Bluetooth software stacks, and others required the host microcontroller to run a
Bluetooth stack. For maximum packet flexibility, we chose one of the latter Bluetooth modules, the PAN1326. The PAN1326 was also the smallest Bluetooth module we could find, and is capable of BLE as well as classic Bluetooth.
3.2.4 Peripheral inputs and outputs
The added MPU can capture 3-axis rotational velocity and 3-axis acceleration. The chip can be configured to provide microcontroller interrupts upon the following events: recognized gestures, panning, zooming, scrolling, free-fall, high-G acceleration, zero-movement conditions, tapping, and shaking.
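As an illustration of this configuration step, the sketch below arms the MPU's motion interrupt. The register addresses follow the InvenSense MPU-6000 register map (verify against the datasheet); mpu_write_reg() and the threshold/duration values are hypothetical placeholders, not this project's firmware:

    #include <stdint.h>

    #define MPU_REG_PWR_MGMT_1  0x6B  /* power management */
    #define MPU_REG_MOT_THR     0x1F  /* motion detection threshold */
    #define MPU_REG_MOT_DUR     0x20  /* motion detection duration */
    #define MPU_REG_INT_ENABLE  0x38  /* interrupt enable */
    #define MPU_INT_MOT_EN      (1u << 6)

    void mpu_write_reg(uint8_t reg, uint8_t val);  /* bus write, provided elsewhere */

    void mpu_arm_motion_interrupt(void)
    {
        mpu_write_reg(MPU_REG_PWR_MGMT_1, 0x00);            /* wake the device */
        mpu_write_reg(MPU_REG_MOT_THR, 20);                 /* threshold (device units) */
        mpu_write_reg(MPU_REG_MOT_DUR, 40);                 /* duration (device units) */
        mpu_write_reg(MPU_REG_INT_ENABLE, MPU_INT_MOT_EN);  /* fire INT on motion */
    }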
The thumb-actuated button was kept.
The camera module was initially replaced by a different camera and compression engine set, as described in the Debug Board section of Chapter 4, but due to time constraints, we stayed with the C329 camera module. The camera module itself is capable of capturing up to 15 fps.
Haptic and visual feedback modalities were added. We designed a simple FET circuit to control two vibromotors. We also have a similar FET circuit to control a laser diode beam. More elaborate feedback mechanisms, like LED displays or projectors, were considered and discarded as too large and power-hungry to be practical.
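Driving these FET gates from firmware reduces to toggling GPIOs. A minimal sketch using the Atmel Software Framework GPIO driver for the UC3, assuming low-side N-FETs whose gates turn the loads on when driven high (pin assignments are hypothetical):

    #include "gpio.h"   /* ASF GPIO driver for AVR32 UC3 */

    /* Hypothetical gate-drive pin assignments. */
    #define VIBRO_FET_PIN  AVR32_PIN_PA07
    #define LASER_FET_PIN  AVR32_PIN_PA08

    static void haptics_on(void)  { gpio_set_gpio_pin(VIBRO_FET_PIN); }  /* gate high: motor runs */
    static void haptics_off(void) { gpio_clr_gpio_pin(VIBRO_FET_PIN); }
    static void laser_on(void)    { gpio_set_gpio_pin(LASER_FET_PIN); }
    static void laser_off(void)   { gpio_clr_gpio_pin(LASER_FET_PIN); }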
3.2.5 Control and firmware
The previous EyeRings used the TeensyDuino 2.0, which is based on the Atmel
AVR architecture and uses the Arduino IDE and programming language. Previous
endeavors also explored the Teensy 3.0, which uses an ARM Cortex-based Freescale
MK20DX128. Based on our experience we desired the new control system to have
the following attributes:
" Direct Memory Access (DMA) to allow speedy and concurrent data transfers.
" Relatively high-speed communication modules (like UART and SPI)
" Low power consumption
" Fast clocking ability
" Accessible programming software and support
" Relatively understandable firmware
/
programming options to allow the project
to be passed on to other people
For second-generation EyeRings, we considered the following solutions:

- 8-bit microcontrollers, e.g. Atmel AVR
- 32-bit microcontrollers, e.g. Atmel UC3
- ARM-based chips
- Programmable Logic Device-based chips, e.g. Cypress Semiconductor's Programmable System-on-Chip (PSoC)
- FPGAs, e.g. Lattice iCE
We eliminated 8-bit microcontrollers due to their lack of desirable features (e.g. no DMA and slow clocking). PLDs and FPGAs were eliminated due to the desire to enable project longevity and inheritance. Among the remaining microcontrollers, the 32-bit Atmel UC3 was chosen due to its reasonably extensive support community, its provided firmware suite, and its free development tools.
3.3 Mechanical design
The mechanical aspect of EyeRing can be broken into two components: the ring
housing and the PCB component selection, geometry, and assembly. Due to time
constraints, the ring housing was not covered in the scope of this thesis and will be
designed and fabricated at a later date. The relevant PCB components (buttons,
connectors, and mounting clips) are discussed in the next chapter, as their details
are more relevant in the implementation stage.
3.4 Design phase conclusions
This chapter's discussion is summarized in Table 3.2, which shows the considered
components for the new EyeRing design. The chosen components are highlighted,
and their energy requirements are tabulated in Table 3.3.
TABLE 3.2: Selected components.

Function | Component | Manufacturer
Control | AT32UC3B1256 microcontroller | Atmel
Bluetooth | PAN1326 | Panasonic (based on TI chipset)
Image sensor | C329 module | based on Omnivision components
Motion capture | MPU-6000 | Invensense
For the power consumption table, we used component datasheet information. The
battery voltage, which ranges from 4.2V to 3.6V, was approximated as 4V in places
where it was used. For instance, the PAN1326 radio runs off of the battery voltage,
so we calculated its power loss by using 4V and the datasheet-given current draw.
Its worst case power, 160mW, should be averaged over time to determine the actual
power consumption. The radio's effective duty cycle depends on how much data
needs to be transmitted, sniff interval, and packet type.
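In equation form, with D the effective transmit duty cycle and P_sniff the (much lower) sniff-mode power:

\[
\bar{P}_{\mathrm{radio}} = D \, P_{\mathrm{tx}} + (1 - D) \, P_{\mathrm{sniff}},
\qquad P_{\mathrm{tx}} = 160\ \mathrm{mW}.
\]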
The 1.8V-3.3V level shifter draws 5 µA from each of its voltage rails, so we simply summed 1.8V and 3.3V for its calculations. The power manager draws varying amounts of quiescent current from the battery, depending on conditions, so we took the worst-case current, 341 µA, for our calculations.
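Written out, these two entries of Table 3.3 are:

\[
P_{\mathrm{shifter}} = 5\ \mu\mathrm{A} \times (1.8 + 3.3)\ \mathrm{V} \approx 0.03\ \mathrm{mW},
\qquad
P_{\mathrm{manager}} = 341\ \mu\mathrm{A} \times 4\ \mathrm{V} \approx 1.36\ \mathrm{mW}.
\]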
TABLE 3.3: Theoretical power losses of selected components.

Device | Voltage (V) | Peak current (mA) | Quiescent current (mA) | Worst case power (mW) | Best case power (mW)
AT32UC3B1256 micro | 3.3 | 15 | N/A | 49.5 | N/A
PAN1326 Bluetooth module, power | 4 | 40 | N/A | 160 | N/A
PAN1326 Bluetooth module, logic | 1.8 | 1 | N/A | 1.8 | N/A
TXB0106 level shifter | 5.1 | N/A | 0.005 | N/A | 0.03
C329 camera | 3.3 | 64 | 20 | 211.2 | 66
MPU-6000 | 3.3 | 4.1 | N/A | 13.53 | N/A
Power Manager | 4 | N/A | 0.341 | 1.36 | N/A
Total (with radio power averaged; see text) | | | | 130.86 |
The buck converters, which supply all non-battery-sourced currents, have inefficiencies that we estimate at no more than 20% of input power, so the worst-case buck converter efficiency is 80%. The chosen Sparkfun LiPo battery contains about 400 mW-h of energy. When we include dc-dc converter efficiency and Bluetooth transmission power, this means we can theoretically run the ring for two hours on a single battery charge. We can extend this battery life slightly by using microcontroller sleep modes. Although this battery life does not meet our target, it is an improvement.
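As a rough check of the two-hour figure, using the Table 3.3 total and the 80% converter efficiency (sustained Bluetooth transmission draws more power and pulls the result down toward two hours):

\[
t \approx \frac{E_{\mathrm{batt}} \times \eta}{\bar{P}} = \frac{400\ \mathrm{mWh} \times 0.8}{130.86\ \mathrm{mW}} \approx 2.4\ \mathrm{h}.
\]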
3.5 Chapter summary
First, overarching design goals and ideas for the second-generation EyeRing were presented. Then, design considerations were discussed in the areas of form factor, energy budget, wireless communication, control, mechanical assembly, and firmware. Possibilities were shown, and the "winning" candidates were explained.
Chapter 4

Prototype implementation

Because the final second-generation prototype needed to be extremely small, we required a multilayer PCB with small form-factor components (mostly quad-flat no-leads (QFN) and thin quad flat (TQFP) packages). This construction is typically difficult to correct if there are errors, especially because many traces are internal. Therefore, a preliminary "debug" board was created to test the circuits. This chapter covers the implementation of the debug board and the final prototype.
4.1 Debug board
For quick design and turnaround, this two-layer board was designed in Advanced
Circuits' PCBartist software. All surface mount components were broken out to
through-hole vias. See Figure 4.1.
FIGURE 4.1: The debug board used to test 2nd-generation circuits.
FIGURE 4.2: Omnivision OV9712 camera.
Included in this PCB were:

- AT32UC3 microcontroller (B family). This part was a TQFP and had all pins broken out. 8 GPIOs were used with LEDs as status & indication lights.
- PAN1326 Bluetooth module with broken-out pins. An SOIC level shifter was used to translate signals between 3.3V and 1.8V logic levels.
- Hardware for an Omnivision OV9712 camera (pictured in Figure 4.2): a flexible flat cable (FFC) connector and an image compression engine. The compression IC, the OV538-B88, comes in a BGA package and has a USB interface. Due to time constraints, we ended up not using this setup and instead used the C329 camera module that previous EyeRings used. The C329 interface (SPI) is much less complex than the USB interface that the OV538-B88 uses.
- Micro-USB connector for programming and power.
- Jumper headers to allow modular testing. (Parts of the board could be electrically isolated from each other.)
- Buttons for triggering and resetting the microcontroller.
- Buck converters to go from the battery voltage to voltage rail levels. Unfortunately, these circuits produced sagging voltages and did not quite work; we used different components for the final board.
4.2 Final prototype
Based on the successes and failures of the debug board, we chose components for the final board:

- We kept the microcontroller the same. Bypass capacitors were sized down to 0402, 0603, and 0805 (Imperial) chips.
- We kept the Bluetooth module and level shifter. The level shifter is offered in a QFN package (as well as the SOIC used previously), so we chose QFN to save board space.
- Due to ease of use, we chose the C329 camera module used in first-generation EyeRings.
- Jumper headers were removed, and buttons were sized down.
- We chose a Linear Technology power management IC, the LTC3554. This QFN chip has battery charging capabilities and also has two adjustable buck converter controllers for creating 3.3V and 1.8V rails.
- A resistor divider was added to measure the battery voltage with an ADC on the microcontroller.
- We added the MPU-6000 (Motion Processing Unit), a combination gyroscope-accelerometer.
The final second-generation EyeRing was laid out in Altium Designer and fabricated
by PCBFabExpress. See Appendix A for the schematics and layouts. Because we
wanted the footprint to be small, we designed two double-sided PCBs to be stacked
vertically. The top board is known as board "A," and the bottom board is known
as board "B." We decided on 6-layer boards because they were a good compromise between cost and enough layers for shielding/routing. The board component
placement constraints were as follows:
- The camera module needed to face forward. The camera also ideally needed to connect to board B (instead of board A) to keep the ring's profile as low as possible.
- The laser diode needed to point forward. The laser beam needed to be aligned as closely as possible to the camera's line of sight. Because of the C329 board's geometry, we could either mount the laser diode module to the side of the board (which would result in a laser beam that was very offset from the camera) or we could mount the laser diode inside the "sandwich" of the two custom PCBs (such that the beam exited through a mounting hole in the C329 board). We chose the latter solution, pictured in Figure ??.
- A 2-row connector between boards A and B was chosen for its geometry so that the boards would be the correct distance apart for laser mounting purposes.
- The PAN1326 Bluetooth module needed to be placed on board A, on an edge as specified in documentation, to minimize RF interference.
- The high-speed (USB) signal traces needed to be kept short, and preferably sandwiched between copper planes to keep electromagnetic interference contained.
- The battery needed to be placed below all boards and components to lower the center of gravity of the device. In addition, this allows easier access to the boards for debugging.
FIGURE 4.3: Front view of camera module.
As seen in Figures 4.3 and 4.4, we hacked the camera lens mount by removing a screw-hole feature so that the laser beam could shine through the mounting hole. We also needed to replace the C329's original connector (not shown) with a part that we spec'ed ourselves, due to a lack of documentation on the C329 connector and its mating parts.
FIGURE 4.4: Rear view of camera module.
FIGURE 4.5: Side view of the new boards.
Given the above constraints, we determined layer stack-up and routing details as
follows. The stack-ups can be seen in Figures 4.7 and 4.8.
FIGURE 4.6: Other side view of the new boards.
* External layers (top and bottom) were reserved for signal routing due to ease of access to component pins. Grounded polygon fills were used on these layers as well. Grounded through-hole vias were placed throughout to connect copper islands and to reduce current loops.
* A copper keep-out region was imposed on all layers of both boards underneath the Bluetooth antenna.
* The power manager and buck converters were placed on the bottom of board B to be near the battery. Keeping the battery voltage traces short reduces the resistance in the high-current path (a rough calculation follows this list).
* The microcontroller was placed on the top layer of board B to ease routing of its lines to the camera module.
* As discussed before, the Bluetooth module was placed on the top layer of board A.
* Board A did not have as many signals to route, and had the luxury of 4 layers
of copper planes, as only 2 layers were needed for routing.
* Board B required 3-4 routing layers, so we chose the 2 external layers and the 2 most internal layers for routing. The 2 internal routing layers were sandwiched between 2 layers of copper planes.
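To make the trace-length point above concrete, here is a back-of-envelope check; the dimensions are illustrative, not measurements from our boards. A trace of resistivity $\rho$, length $L$, thickness $t$, and width $w$ has resistance

$$R = \rho \frac{L}{t\,w}, \qquad P_{\mathrm{loss}} = I^2 R.$$

For 1 oz copper ($t \approx 35$ um, about 0.5 mOhm per square), a battery trace 10 mm long and 0.25 mm wide is 40 squares, or roughly 20 mOhm; at 200 mA that is a 4 mV drop and 0.8 mW of loss. Short, wide power traces keep these penalties negligible.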
FIGURE 4.7: Top PCB ("A") stack-up.
FIGURE 4.8: Bottom PCB ("B") stack-up.

4.3 A note on component selection
We aimed to keep the boards as small as possible, so we chose very small components. We realized that in doing so, we traded usability for space: 0402 chip components were very fiddly to work with, and the QFN chips required extra care in soldering. Probing test points on the boards was difficult because the test points (surface pads) were so small. Moreover, the buttons that we chose were too small to actually be usable; we will need to add a button cap to the user input button, and the microcontroller reset button is extremely difficult to press, so we may replace it with a larger one. Future developers should buy such components ahead of time to evaluate them hands-on.
4.4 Chapter summary
In this chapter, two second-generation EyeRing prototypes were discussed. The first was a PCB for debugging and development. The second consisted of two PCBs to be finger-worn. Implementation constraints and decisions were reviewed.
Chapter 5
Firmware implementation and discussion
Firmware was a large portion of the work involved in this project. This chapter covers an overview of the firmware and its implementation details.
5.1 Functional overview
The firmware was written to support the desired ring functions: image capture,
video capture, motion and gesture data, and low power consumption.
The functions and decisions shown in Figure 5.1 are described below; a minimal control-flow sketch follows the list.
* init: When the ring is turned on, this function runs. First, it sets up the microcontroller's GPIO and communication modules (SPI, UART). The direct memory access (DMA)¹ modules are set up for moving information between the camera, the micro, and the Bluetooth module. The microcontroller's SPI module syncs with the camera, and the UART module sets up the Bluetooth module. The PAN1326 requires a TI service pack to be sent every time it boots up, so the microcontroller sends that. The service pack consists of HCI-level commands that configure the PAN1326. After init, the microcontroller goes to sleep.
¹Atmel UC3 documentation refers to DMA as PDCA, the Peripheral DMA Controller. This is because the UC3 micros are only capable of moving data between peripherals and internal memory; they cannot transfer data between blocks of internal memory.
FIGURE 5.1: Functional block diagram for firmware.
* sleep: The microcontroller enters "frozen" mode. This mode allows external interrupts and peripheral bus (PB) interrupts to wake up the micro. The CPU and high-speed bus (HSB) are stopped to reduce power consumption.
" telemetry timer: A microcontroller timer is used to wake up the micro at
pre-set time intervals.
" telemetry: The microcontroller polls an ADC pin to determine the battery
voltage. It also gets velocity and acceleration data from the MPU. These
telemetry data are sent to the Bluetooth module, which then relays the information to the host. The microcontroller then goes back to sleep.
" button press: This external interrupt wakes the microcontroller when the user
presses the input button.
" take picture: The microcontroller requests an image from the camera.
" success: If the camera sends an entire image without glitch, the microcontroller sends the image data to the Bluetooth module. Otherwise, the microcontroller will make 60 attempts to grab an image from the camera.
* video: If video mode has been selected, the microcontroller then requests another image. If video mode is not enabled, the microcontroller goes back to sleep.
* MPU interrupt: The MPU can be set to interrupt the microcontroller when specific actions have been detected. The MPU can recognize movements such as tapping, shaking, free-falling, and high-G acceleration.
* poll MPU: When the MPU interrupts the microcontroller, the micro polls all available information from the MPU and sends it to the Bluetooth module. Then the microcontroller goes to sleep.
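The control flow above can be condensed into code. The following is a minimal sketch only: the helper functions are hypothetical stand-ins for our ASF-based routines (they are not real ASF API calls), and the only number taken from the text is the 60-attempt capture limit.

    #include <stdbool.h>

    #define MAX_CAPTURE_ATTEMPTS 60   /* retries before giving up on a frame */

    enum wake_source { WAKE_BUTTON, WAKE_TELEMETRY_TIMER, WAKE_MPU_IRQ };

    /* hypothetical helpers wrapping the routines described above */
    static void init_gpio_spi_uart_dma(void);
    static void send_ti_service_pack(void);
    static enum wake_source sleep_until_interrupt(void);  /* "frozen" mode */
    static void report_battery_and_motion(void);
    static void forward_mpu_event(void);
    static bool capture_jpeg_frame(void);
    static void send_frame_over_bluetooth(void);
    static bool video_mode_enabled(void);

    int main(void)
    {
        init_gpio_spi_uart_dma();  /* GPIO, SPI, UART, and DMA setup */
        send_ti_service_pack();    /* HCI patch the PAN1326 needs at each boot */

        for (;;) {
            switch (sleep_until_interrupt()) {
            case WAKE_TELEMETRY_TIMER:
                report_battery_and_motion();   /* ADC + MPU data to the host */
                break;
            case WAKE_MPU_IRQ:
                forward_mpu_event();           /* tap/shake/free-fall report */
                break;
            case WAKE_BUTTON:
                do {
                    bool ok = false;
                    for (int i = 0; i < MAX_CAPTURE_ATTEMPTS && !ok; i++)
                        ok = capture_jpeg_frame();    /* request over SPI */
                    if (ok)
                        send_frame_over_bluetooth();  /* DMA to the PAN1326 */
                } while (video_mode_enabled());
                break;
            }
        }
    }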
5.2 More firmware details
Atmel has released the Atmel Software Framework (ASF) with sample code for its microcontroller families, including the AT32 "UC3" family. At a low level, we use ASF's routines to set up and use modules like the UART and SPI. We found the learning curve steep due to a lack of documentation.
The PAN1326 module uses the TI CC2564 baseband control chip, which has its own quirks. This chip requires a firmware upgrade to be loaded upon every power cycle; thus, we had to send a TI-written "service pack" to the chip via HCI UART². TI provides a Stonestreet One-written Bluetooth stack, which we did not use due to its platform-specific implementation. The chip also has its own vendor-specific HCI commands that must be used for changing radio strength, UART baud rate, etc. We spent a lot of time writing firmware to take care of the CC2564 and to customize the service pack, until we realized that the btstack firmware performed most of those tasks.
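For concreteness, this is roughly what a single command looks like on the wire. The framing below is the standard HCI UART ("H4") transport from the Bluetooth specification; uart_write_blocking() is a hypothetical stand-in for our send routine, and no claim is made here about TI's specific vendor opcodes.

    #include <stdint.h>
    #include <stddef.h>

    static void uart_write_blocking(const uint8_t *buf, size_t len);

    /* Frame one HCI command for the H4 UART transport:
       [0x01][opcode low][opcode high][param length][params...] */
    static void hci_send_cmd(uint16_t opcode, const uint8_t *params, uint8_t len)
    {
        uint8_t hdr[4];
        hdr[0] = 0x01;                      /* H4 packet type: HCI command */
        hdr[1] = (uint8_t)(opcode & 0xFF);  /* opcode, little-endian */
        hdr[2] = (uint8_t)(opcode >> 8);
        hdr[3] = len;                       /* total parameter length */
        uart_write_blocking(hdr, sizeof hdr);
        if (len)
            uart_write_blocking(params, len);
        /* then wait for the Command Complete event before sending the
           next service-pack entry */
    }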
Our Bluetooth stack is adapted from btstack, an open-source Bluetooth stack written by Matthias Ringwald and optimized for portability. Earlier on we used smalltooth, a different stack, but abandoned it due to its early development stage. Similar projects that we investigated for reference were TinyStack and Dean Camera's stack [26, 27].
²Host Controller Interface, a Bluetooth communication layer between the host stack (on the microcontroller) and the radio controller (CC2564).
We encountered many struggles in trying to get the Bluetooth stack to work. First, we wrote our own stack for the HCI layer. This firmware modified the TI service pack to our configuration and also used vendor-specific commands to change the Bluetooth HCI UART baud rate. We realized that writing firmware for the other stack layers (RFCOMM, L2CAP, SDP, SPP) would be a massive undertaking for which there was no time. Thus, we tried our hand at porting the open-source smalltooth stack. That attempt was misguided: smalltooth did not support the Serial Port Profile (SPP) layer and was poorly organized, and the person in charge of smalltooth was inactive.
Then, we ported btstack to our platform and had limited success: we could pair with a Windows 7 computer, but the ring would promptly drop the L2CAP connection. Attempts to connect to an Android device (using applications like BlueTerm and Bluetooth Terminal) ran into the same dropped-connection issues. We pursued these bugs but did not have enough time to work them out. Thus, we can verify our UART speed between the microcontroller and the Bluetooth module, but we cannot verify the over-air data rate. Further work on the firmware will hopefully resolve the problem.
The C329 camera has a somewhat fiddly SPI module, and syncing with the camera sometimes requires multiple attempts. However, after syncing the camera with the microcontroller over SPI, picture-taking went relatively smoothly. We found that power cycles tended to confuse the camera and cause it to become unresponsive; thus, upon turn-on, the microcontroller resets the C329 (which usually takes a few tries to do successfully). We based our code on Shilkrot's C++ code for the same purpose, which he ported from Voisen.
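The reset-and-sync dance looks roughly like the sketch below. c329_reset(), c329_try_sync(), and delay_ms() are hypothetical wrappers around our SPI command routines, and the retry counts and delays are illustrative rather than tuned values.

    #include <stdbool.h>

    static void c329_reset(void);      /* issue a reset command to the camera */
    static bool c329_try_sync(void);   /* send SYNC; true if the camera ACKs */
    static void delay_ms(unsigned ms);

    static bool c329_bring_up(void)
    {
        for (int reset = 0; reset < 5; reset++) {  /* reset often needs retries */
            c329_reset();
            delay_ms(50);                          /* let the module settle */
            for (int attempt = 0; attempt < 60; attempt++) {
                if (c329_try_sync())
                    return true;                   /* camera is responsive */
                delay_ms(5);
            }
        }
        return false;                              /* give up; report an error */
    }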
5.3 Chapter summary
In this chapter, the firmware's design and structure were presented.
Chapter 6
Evaluation
6.1 Performance and Discussion
First, it is quite clear that the second-generation ring is still quite large. Without a housing, it measures approximately 1.75 cubic inches in volume, which is a modest improvement over previous rings, but it will probably be comparable in size once a housing is made.
FIGURE 6.1: Prototype on finger.
We found that JPEGs produced by the C329 were between 10 and 50 kilobytes. For 10 frames per second at the larger image size, we would need 4 megabits per second (the baud rate for SPI communications). When we actually tried a baud rate of 4,000,000, we found that we captured 5.5 images per second. This is because there is some latency in the C329: it usually takes about 3-4 microcontroller requests before it actually takes a picture. Increasing the SPI frequency did not make a large difference in image frequency, and around 7,000,000 baud the achievable frame rate actually decreased due to communication errors.
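The 4 Mbps figure follows directly from the worst-case frame size:

$$50\ \mathrm{kB/frame} \times 8\ \mathrm{bits/byte} \times 10\ \mathrm{frames/s} = 4\ \mathrm{Mbps}.$$

At the observed 5.5 frames per second, the effective payload rate at that frame size is only about 2.2 Mbps, with the rest of the bus time lost to the request latency described above.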
Again, the Bluetooth wireless part of the project has not been resolved yet. However, we were able to successfully run HCI UART communications between the microcontroller and the Bluetooth module at over 1 Mbps. We were not able to conduct extensive battery life testing, but taking continuous video did not exhaust the battery over the better part of an hour.
TABLE 6.1: Comparison of EyeRings

Feature                           1st-gen EyeRing   2nd-gen EyeRing
SPI data rate (Mbps)              8 theoretically   3
UART data rate (Mbps)             6                 4
Bluetooth packet size (bytes)     127               1600
Energy storage (mW-h)             400               400
Battery life (hours)              1                 3.25
Volume (in³, without housing)     2                 1.75
As seen in Table 6.1, the second-generation EyeRing has improved upon the previous generation. One of the most striking advantages of the new system is the
maximum Bluetooth data (ACL) packet length. Of course, this packet length is
also limited by the capabilities of the host processor, but assuming a host with
reasonable packet sizes, we will be able to achieve much higher throughput with the
PAN1326.
Chapter 7
Conclusions and suggestions
7.1 Conclusions
A hardware platform for EyeRing was designed, and firmware was written. The firmware was mostly functional, achieving almost 6 fps video and integrating new feedback modalities. Hardware was added for a laser diode module and for vibration motors. Together, these feedback modes will help blind and sighted people point and capture relevant information accurately. A motion processing unit was added for velocity and acceleration data. Once the Bluetooth stack is perfected, we anticipate being able to tackle applications that we have been dreaming of.
Unfortunately, the boards were not ready to be integrated with applications and
have yet to be tested in a user study. However, progress is promising and shows
that data transfer can be fast enough for video streaming applications.
7.2 Suggestions for further work
We encountered many struggles with time allocation for design and implementation work. We suggest that future developers weigh more carefully the tradeoff between theoretical gains and actual usability. For instance, we spent too much time on the Bluetooth firmware and not enough time testing and perfecting circuits for the PCBs. We recommend that future developers of EyeRing decide how much time they can allocate to custom firmware stacks and design accordingly. We would recommend that they investigate using a Bluetooth module with an on-board stack that has large packet size capabilities. If that isn't possible, we would recommend using a module whose stack can be modified and re-flashed. Either way, they would save a lot of firmware development time.
Another avenue would be to use a design house to make the boards. This would
allow much more time for application development and user studies. We considered
this approach, but decided that the cost was prohibitive with a certain vendor.
Perhaps other companies have lower prices.
For the existing hardware, there are many aspects that could be expanded. The PAN1326 module has Bluetooth Low Energy capabilities, which were not utilized in this research. This mode could be enabled to send device telemetry (battery state of charge, user motion, etc.) to the host. The PAN1326 can also use a deep sleep mode called HCILL, which is proprietary to Texas Instruments. In HCILL mode, the PAN1326 module goes to sleep and only responds to HCILL wake-up commands over UART. We also did not perform radio setting tests with the PAN1326. Different parameters, like sniff period and radio power, could be tuned to find the best operating point.
Currently, we use a 2-element ring buffer to handle video frame data transfer; a sketch of the idea follows. It may be better to use dynamic memory allocation.
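Below is a minimal sketch of the current scheme, assuming two fixed slots that ping-pong between the camera DMA and the Bluetooth UART DMA; the slot size and all names are illustrative, not the exact buffers in our firmware.

    #include <stdint.h>
    #include <stdbool.h>

    #define SLOT_BYTES 2048   /* per-slot chunk size; illustrative */

    struct slot {
        uint8_t  data[SLOT_BYTES];
        uint32_t len;
        volatile bool ready;  /* set when the camera DMA fills the slot,
                                 cleared when the slot is claimed for UART */
    };

    static struct slot slots[2];
    static unsigned fill_idx;   /* slot the camera DMA writes next */
    static unsigned drain_idx;  /* slot the Bluetooth DMA reads next */

    /* called from the camera-DMA-complete interrupt */
    static void on_camera_chunk_done(uint32_t bytes)
    {
        slots[fill_idx].len = bytes;
        slots[fill_idx].ready = true;
        fill_idx ^= 1;          /* ping-pong to the other slot */
    }

    /* called from the UART-DMA-complete interrupt (or the main loop) */
    static bool start_bluetooth_drain(void)
    {
        if (!slots[drain_idx].ready)
            return false;       /* nothing to send yet */
        /* uart_dma_start(slots[drain_idx].data, slots[drain_idx].len);
           hypothetical call that kicks off the UART DMA transfer */
        slots[drain_idx].ready = false;  /* mark the slot as in flight */
        drain_idx ^= 1;
        return true;
    }

With dynamic allocation, slot sizes could instead track the actual JPEG sizes (10-50 kB), at the cost of fragmentation risk on a small heap.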
There is a trade-off between speed and power use when determining the best clocking frequency for the system. We suggest a thorough analysis of the system, including the factors that change with the clock frequency: microcontroller power consumption, SPI and UART baud error, and usable SPI and UART speeds. The baud-error factor is outlined below.
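A starting point for the baud-error part of that analysis (assuming the common 16x-oversampled UART; the UC3 USART also offers a fractional divider that shrinks this error) is

$$B_{\mathrm{actual}} = \frac{f_{\mathrm{clk}}}{16\,\mathrm{DIV}}, \qquad \epsilon = \frac{B_{\mathrm{actual}} - B_{\mathrm{target}}}{B_{\mathrm{target}}},$$

where DIV is the integer divisor nearest $f_{\mathrm{clk}}/(16\,B_{\mathrm{target}})$. Errors beyond a few percent corrupt UART framing, so candidate clock frequencies should be screened against every baud rate the system needs.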
Future developers may investigate the use of FPGAs to perform the JPEG compression from the camera. Such a solution could potentially be faster and more energy-efficient. The FPGA could be configured to pass serial data straight to the Bluetooth module, replacing or bypassing the microcontroller. Usability concerns both the user and the designers: to ease firmware development, future members of the EyeRing project may look into porting the ASF code into Arduino code. Arduino has a code suite and IDE for 32-bit systems and can be configured to run on custom PCBs.
We also suggest that future developers consider new and exciting products that have entered the market: the TI Bluetooth SensorTag, a Bluetooth Low Energy device with built-in sensors; the Limberboard, a small and flexible microcontroller platform for wearables; the IOIO, an Android-specific development board which has been used with a USB Bluetooth dongle; and the Trinket, an Arduino-compatible board similar to the Teensy. The Teensy 3.0 may be a good option as well; initial tests with the Teensy 3.0 showed that it is capable of fast communications and data transfer, and the bottleneck in that case was the RN-42 Bluetooth module.
Taking a small step back towards COTS parts may enable future designers to iterate prototypes more quickly and create more user applications. We envision the
C329 camera module, a COTS microcontroller board, and a custom power management board (with connectors for the C329 and feedback peripherals) as a good
compromise between functionality and time spent. The custom board would have
connectors to eliminate the messy wiring that would result from solely COTS parts,
and the microcontroller board and C329 would drastically decrease development time
and costs.
7.3 Summary
New hardware for EyeRing is promising and will enable myriad applications once its Bluetooth firmware is debugged. One of the main advantages of this system is its fast SPI, UART, and DMA capabilities. The DMA allows the microcontroller to perform tasks like polling battery voltage and controlling haptic and visual feedback while communications run in the background. In this new hardware, size and battery life are improved modestly, and new capabilities include motion detection and multimodal feedback.
Appendix A
EyeRing schematics and layouts
This appendix contains the circuit schematics for the second-generation EyeRing
prototypes. EyeRing Top is the schematic that is laid out on board A, the top
board, whose layout follows the schematic. EyeRing Micro and EyeRing Power
are laid out on board B, the bottom board, whose layout is shown last. All layout
polygon fills have been removed for ease of display.
[Schematic: EyeRing Top (board A)]
[Layout: board A]
[Schematic: EyeRing Micro (board B)]
[Schematic: EyeRing Power (board B)]
[Layout: board B]
Bibliography
[1] 8600 ring scanner. URL http://www.honeywellaidc.com/en-US/Pages/Product.aspx?category=wearable-scanner-and-mobile-computer&cat=HSM&pid=8600.
[2] Sesame ring. URL http://ringtheory.net/.
[3] George Stetten, Roberta Klatzky, Brock Nichol, John Galeotti, Kenneth Rockot, Samantha Horvath, Nathan Sendgikoski, David Weiser, and Kimberly Zawrotny. Fingersight: Fingertip visual haptic sensing and control. Ottawa, Canada, October 2007.
[4] Rich Shea. The OrCam device: Giving back functionality, July 2013. URL http://www.blindness.org/blog/index.php/the-orcam-device-giving-back-functionality/.
[5] Refreshable braille display. URL http://w301math-atresources.wikispaces.com/file/view/braille-4.jpg/33019009/braille-4.jpg.
[6] Wireless quick facts, December 2012. URL http://www.ctia.org/advocacy/research/index.cfm/aid/10323.
[7] Cassandra Xia and Pattie Maes. A framework and survey of artifacts for augmenting human cognition. 2013.
[8] Visual impairment and blindness. Technical Report 282, World Health Organization, June 2012. URL http://www.who.int/mediacentre/factsheets/fs282/en/.
[9] Angela Edwards. Barriers to financial inclusion: Factors affecting the independent use of banking services for blind and partially sighted people. Technical report, Royal National Institute of Blind People, March 2011. URL http://www.rnib.org.uk/aboutus/Research/reports/money/Pages/barriersfinance.aspx.
[10] Suranga Nanayakkara, Roy Shilkrot, and Pattie Maes. EyeRing: a finger-worn assistant. In CHI '12 Extended Abstracts on Human Factors in Computing Systems, Austin, TX, May 2013.
[11] David McNeill. Language and Gesture. Cambridge University Press, New York, NY, 2000.
[12] Richard A. Bolt. "Put-That-There": voice and gesture at the graphics interface. Volume 14(3), pages 262-270. ACM, 1980.
[13] Thad Starner, Steve Mann, Bradley Rhodes, Jeffrey Levine, Jennifer Healey, Dana Kirsch, Rosalind Picard, and Alex Pentland. Augmented reality through wearable computing. Technical Report 397, MIT Media Lab, 1997.
[14] David Merrill and Pattie Maes. Augmenting looking, pointing and reaching gestures to enhance the searching and browsing of physical objects. In Proceedings of the 5th International Conference on Pervasive Computing, Toronto, Ontario, Canada, May 2007.
[15] Claire Creaser. Accessibility of top 1,000 titles of 2012. Technical report, Loughborough University and Royal National Institute of Blind People, April 2013. URL http://www.rnib.org.uk/professionals/Documents/Top_1000_books_2012.pdf.
[16] Barbara Martin and Dan Pescod. June 17 press release for WIPO book treaty. Technical report, World Blind Union, June 2013. URL http://www.worldblindunion.org/English/news/Pages/June-17-Press-Release-for-WIPO-Book-Treaty.aspx.
[17] NLS factsheets: Copyright law amendment, 1996, 1996. URL http://www.loc.gov/nls/reference/factsheets/copyright.html.
[18] Knowledge Ecology International. 28 June 2013: 51 signatories to the Marrakesh treaty, July 2013. URL http://keionline.org/node/1769.
[19] Mike Masnick. How the MPAA fought to keep audiovisual materials out of WIPO treaty for the Blind/Deaf; and how that's a disaster for education, June 2013. URL http://www.techdirt.com/articles/20130608/13101023381/how-mpaa-fought-to-keep-audiovisual-materials-out-wipo-treaty-blinddeaf-how-thats.shtml.
[20] John Slade. Update on inclusive society 2013. Technical report, Royal National Institute of Blind People, April 2013. URL http://www.rnib.org.uk/aboutus/Research/reports/2013/Update-inclusivesociety_2013.doc.
[21] Nate Hoffelder. Bill McCoy is wrong, epub3 isn't ready, May 2013. URL http://www.the-digital-reader.com/2013/05/28/bill-mccoy-is-wrong-epub3-isnt-ready/.
[22] EPUB3 support grid, March 2013. URL http://www.bisg.org/what-we-do-12-152-epub-30-support-grid.php.
[23] Graeme Douglas, Heather Cryer, Ebru Heyberi, and Sarah Morley Wilkins. Impact report on braille standard for medicine packaging. Technical report, University of Birmingham and Royal National Institute of Blind People, July 2013. URL http://www.rnib.org.uk/aboutus/Research/reports/2013/Impact_reportonBrailleonMedicines_(22-07-2012_CAI-RR18).doc.
[24] Anuruddha Hettiarachchi, Suranga Nanayakkara, Kian Peen Yeo, Roy Shilkrot, and Pattie Maes. FingerDraw: more than a digital paintbrush. In Proceedings of the 4th Augmented Human International Conference (AH '13), Stuttgart, Germany, March 2013.
[25] Rahul Balani. Energy consumption analysis for bluetooth, WiFi and cellular networks. Technical report, University of California at Los Angeles, 2007. URL http://nesl.ee.ucla.edu/fw/documents/reports/2007/poweranalysis.pdf.
[26] Martin Leopold. Tiny bluetooth stack for TinyOS. Technical report, University of Copenhagen, 2003.
[27] Dean Camera. Embedded bluetooth stack. Technical report, La Trobe University, November 2011.