Final report for entire project period

NSF Year-3 and Final Report
Grant No. IIS-0083347
Summary of Research and Education Activities
• During the course of the research and development in Year-3, two graduate students continued
critical development of the prototype system. Specifically, they developed the code modules (in
C++) for the microcontroller, built the custom multi-fingered linear tactile displays, designed and
fabricated the integrated camera & lens mounting system, and the electrotactile (ET) displays that fit
into the GraphiGlove prototype system. They also developed the algorithm and software to create an
adjustable linear pixel-to-tactile electrode mapping, allowing the users to adjust the field of view of
the optical image. The students learned CAD programs to design the flexible multi-fingered
electrotactile arrays, and to model the prototype 'glove', and used industrial PC board layout
programs for all of the electronic systems in the device. They learned how to operate in a cooperative
and coordinated technical and clinical environment where both hardware and software design and
development must integrate with human psychophysical testing to evaluate the performance of a
particular design, recommend modifications, and iterate the design. This clinical testing necessitated
learning how to design and run experiments using human subjects, including writing IRB testing
protocols, receiving NIH-approved human subjects (HS) training, analyzing experimental data, and reporting the results for review and integration into the evolving R&D plan.
To perform the psychophysical tests, the students also developed a set of shapes, gratings, contrast gradients, and patterns in the JEDA Dynamic Image Development environment. These graphic images were used for calibrating the GraphiGlove system and video display monitor, and for testing subjects' ability to detect and interpret the haptic information in both discrimination and object identification tasks. The students also developed and ran pilot studies to create the experimental protocol for training and testing human subjects in the haptic exploration and identification tasks.
In the course of this human subject training and testing, both students were exposed to issues of
human disability, and to the underlying motivation for this project, which was to develop an
electronic haptic aid for viewing computer graphics by people with visual impairment.
The graduate students were also given responsibility for identifying commercial products that would
satisfy the technical requirements of the design, and for making modifications as necessary to
accommodate changes in specifications. They were also encouraged to identify commercial vendors,
initiate bids for services, and develop drafts of contracts to obtain materials and services.
Two of the key personnel who initially worked on the project were subsequently drawn away by
other teaching or R&D opportunities. Steve Haase, Ph.D., left in July of 2002 for an associate
professor position at Shippensburg University, PA. Mitchell Tyler, M.S., P.E., left the project in
August of 2003 to assume his responsibilities as P.I. on two NIH SBIR Phase-2 research grants
awarded to Wicab, Inc. (a small, hi-tech spin-off company established to commercialize a brain-machine interface developed in this laboratory). The awards were to continue development of a
tongue-based electrotactile display system for applications to postural instability and falling due to
balance loss, and to spatial orientation and navigation by people who are blind. In addition, one of
the graduate students originally supported on this project was also hired by the company, based on
his experience and performance, to lead the technical development efforts of the tongue-display
system.
• Undergraduate and graduate coursework completed:
Medical Instrumentation [BME/ECE 462 & 762]
Therapeutic Medical Devices [BME 515]
Occupational Ergonomics and Biomechanics [IE 564]
Design for Human Disability and Aging [IE 662]
Rehabilitation Psychology Research [RPSE 690]
Computers In Medicine [BME/ECE 463]
Seminar in ECE [ECE 600]
Intro to MEMS Engineering [ECE 601]
Master's Research and Thesis [BME 790]
Advanced Independent Study [ECE 999]
Activities & Findings
Technical developments:
The prototype system developed in Year-3 employs much of the same hardware utilized in the
system developed in the first year of the project. The prototype device shown in Figure 1.a is
configured as a palm-sized base unit with five articulated finger trays. The palm and fingers are
placed in an open-cell foam padded 'tray' (a molded depression on the upper surface of the device
'fingers') within which the 1 x 24 arrays of gold-plated electrodes (1.50 mm diameter, on 2.32 mm
centers – the same specification as a standard Braille cell) are mounted. The electrode strips are
created by using photolithographic techniques to generate the circuit topography that is then
fabricated using a modified flexible-circuit process to create the conformable, 240 µm thick Mylar
(polyester) electrode matrix. Each array is fixed at the base of each finger, contacting the glabrous
skin, while the articulating finger 'tray' can be adjusted proximally or distally so that the camera and
tactile arrays are coincident.
Once adjusted and set for the user's hand, the design allows for normal adduction / abduction of the
fingers and prevents possible rotation of the camera / lens about the long axis of the finger due to the
moment caused by friction at the lens-display interface. Additionally, the friction coefficient at the
interface with the video display is kept to a minimum by using a high-density velvet buffer around
the linear lens array. This also eliminates ambient light intrusion that would otherwise reduce the brightness and contrast of the visual image and consequently degrade the sensor signal. (See Figure 2)
The imaging system employs six 1 x 512 pixel CCD array cameras (Texas Advanced Optoelectronic Solutions (TAOS) Model TSL-208), one mounted axially on each finger and one on the medial aspect of the palm (Figure 2). Each has a 65 mm linear imaging area [3.9 lines/mm resolution], and is
fitted with a SELFOC Lens-Array (Model SLA-20D, focal length: 9.1 mm) from Nippon Sheet Glass
(NSG). Each camera / lens package is mounted on the inferior side of the articulated 'fingers' that
connect to the base or 'palm' unit (containing the system boards) by a sliding joint. Each finger is designed both to rotate about the pivot point on the base unit and to be adjusted to accommodate hand sizes (5th percentile women's to 75th percentile men's, based on anthropometric data for North Americans).
Figures 1.a & b. (a) The GraphiGlove; (b) the device in use on the LCD video monitor.
Figure 2. Close-up of a single finger tray with electrode array and linear camera.
The microcontroller (a Z-World RCM3200 "Rabbit", running custom compiled Dynamic C++ code at 44.2 MHz) converts the summated and averaged video data from each camera into discrete electrode address and amplitude data, which are sent to the Tactile Display Unit (TDU) serial port. The TDU converts the data from the Rabbit into a 200 Hz train of three sequential 0-30 V pulses, each 25 µs wide, which are subsequently converted to 0-14 mA current-controlled pulses for stimulating the electrodes mounted on the finger arrays.
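To make this data path concrete, the following minimal sketch (in C++, the language used for the project's code modules) shows how an averaged pixel brightness value might be turned into the per-electrode stimulation parameters described above. The structure and identifier names are illustrative assumptions rather than the actual firmware; only the numeric ranges (200 Hz, three 25 µs pulses, 0-30 V, 0-14 mA) come from the text.

    #include <cstdint>

    // Illustrative sketch only (not the project firmware): per-electrode stimulation
    // parameters and a brightness-to-stimulus scaling consistent with the ranges above.
    struct TduPulseTrain {
        float rate_hz          = 200.0f;  // pulse-train repetition rate
        int   pulses_per_burst = 3;       // three sequential pulses per cycle
        float pulse_width_us   = 25.0f;   // each pulse is 25 microseconds wide
        float voltage_v        = 0.0f;    // commanded TDU output, 0-30 V (unloaded)
        float current_ma       = 0.0f;    // current-controlled level, 0-14 mA
    };

    // Map an 8-bit averaged pixel brightness onto a stimulation command.
    TduPulseTrain brightnessToStimulus(std::uint8_t brightness) {
        TduPulseTrain p;
        const float frac = brightness / 255.0f;  // normalized image intensity
        p.voltage_v  = frac * 30.0f;             // scale into the 0-30 V command range
        p.current_ma = frac * 14.0f;             // scale into the 0-14 mA current range
        return p;
    }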
The linear imaging arrays sample the optical image at 120 frames/s, and employ a scheme that
effectively allows a digital shift register to output analog values representing the brightness of each
pixel. The electronics were designed and implemented along with embedded software to capture the
analog values from each pixel in the linear arrays, select the imaging array size over which to
average the signal amplitude, and then scale it for display on the corresponding tactors via the TDU
mapping algorithm developed for this application.
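A hedged sketch of this pixel-to-tactor averaging step is given below; the function and variable names are assumptions, while the 512-pixel line length and 24-electrode array size come from the hardware described above.

    #include <array>
    #include <cstdint>

    constexpr int kPixels  = 512;  // TSL-208 linear array length
    constexpr int kTactors = 24;   // electrodes per finger strip

    // Average a selectable, centered window of one line scan onto the 24 tactors.
    // 'ratio' is the number of pixels averaged per tactor (see the FOV discussion below).
    std::array<std::uint8_t, kTactors>
    averageLineScan(const std::array<std::uint8_t, kPixels>& line, int ratio) {
        std::array<std::uint8_t, kTactors> tactors{};
        const int window = kTactors * ratio;        // total pixels actually used
        const int start  = (kPixels - window) / 2;  // center the field of view
        for (int t = 0; t < kTactors; ++t) {
            int sum = 0;
            for (int p = 0; p < ratio; ++p)
                sum += line[start + t * ratio + p];
            tactors[t] = static_cast<std::uint8_t>(sum / ratio);  // mean brightness
        }
        return tactors;
    }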
The linear camera pixel-to-stimulation tactor mapping ratio is designed to range from 9:1 (a narrow field-of-view (FOV) for high spatial resolution) to 21:1 (a wide FOV that allows for an initial holistic or 'big picture' view of the image to help the user get oriented and decide how to haptically explore the object). This range of settings was determined in a controlled set of experiments in which subjects were asked to identify the preferred pixel:tactor mapping ratio for accurate and reliable performance on a variety of computer graphic patterns (both gratings and solid shapes). This information will be important to the eventual development of a practical device for classroom use by students who are blind or have low vision and may lack a priori knowledge or understanding of the complexity of the image that they are trying to 'see'.
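To give a sense of the geometry these settings imply (assuming all 24 electrodes of an array are driven from a contiguous, centered window of the 512-pixel, 65 mm cameras described above): at the 9:1 setting each array samples 24 × 9 = 216 pixels, or roughly (216/512) × 65 mm ≈ 27 mm of the imaging line, while at the 21:1 setting it samples 24 × 21 = 504 pixels, essentially the full 65 mm line.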
System architecture.
The system consists of a voltage-controlled tactile display unit (TDU, previously developed under separate funding) and a voltage-to-current converter to generate a current sufficient for stimulating the fingers (see Figure 3). The linear cameras mounted in the fingers of the glove capture image data from the video display for input to the central 'palm unit' containing the programmable microcontroller, which manages all control, communications, and data processing for pixel-to-tactor image conversion and electrotactile stimulation intensity. In the future, it could also store and recall user-specific presentation settings and provide automatic image quality and intensity compensation.
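As a rough sketch of what such stored presentation settings might include (the field names and groupings are illustrative assumptions, not a specification):

    // Illustrative sketch only: user-specific presentation settings the palm-unit
    // controller could store and recall in a future version of the system.
    struct UserSettings {
        float stimulusIntensityPct;  // 0-100 % of maximal TDU output
        int   pixelToTactorRatio;    // 9 (narrow FOV) through 21 (wide FOV)
        float pulseRateHz;           // nominally 200 Hz
        float imageContrastGain;     // display contrast / brightness compensation
    };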
Figure 3. GraphiGlove system architecture.
Perceptual Experiments
In Year 2, human psychophysical studies were conducted to characterize the Dynamic Range of
perceived stimulus intensity by quantifying both the threshold of sensation and the maximal intensity
without discomfort at each point (i.e. electrode) along each of the fingers on both hands. We
anticipated significant differences in both the magnitude and quality of sensation depending on
location along the finger, potentially necessitating subsequent development of a stimulation intensity
compensation algorithm. Using repeated measures on six adult human subjects, we collected blocks
of data for all 24 electrodes on all fingers of both hands. Stimuli were randomly presented to any one
of the 120 electrodes on both hands, and subjects were instructed to adjust the stimulus intensity to
both sensation threshold and then maximum intensity without discomfort. Overall, the results indicate that while the mean Dynamic Range varied between subjects, the within-subject differences in mean sensation threshold across all 5 fingers of the left hand and the first 3 fingers of the right hand were not statistically significant. However, we found that both the sensation threshold and maximum intensity values for the ring and pinkie fingers of the right hand (innervated by the ulnar nerve) were consistently and significantly elevated and more variable. It appears that the left hand (i.e. the non-dominant hand in our subject cohort) is more consistent and generally better suited for this type of
tactile stimulation. We consider this to be an advantageous discovery in that it allows the operator to
use their non-dominant hand for haptic exploration of the image while the dominant hand is free to
operate the user controls for the device, e.g. stimulation intensity, dynamic range, frequency, and
image contrast, zoom, etc.
The useful dynamic range of electrotactile sensation is admittedly still quite limited when compared
with the ranges of any other normal sensory modality. The mean current ratio (maximum intensity
without discomfort / sensation threshold) is approximately 2.4 (SE = 0.23) and ranged from 2.1 to 2.8 for 5 of the 6 subjects tested. Based on these results, in Year-3 we concluded that it was not necessary to create a user-specific, or even a generalized, iso-intensity mapping for the GraphiGlove prototype to be functional. The consistency of the dynamic range results on the fingers of both hands demonstrates that although variability in both sensation threshold and dynamic range exists at the electrode-skin interface, it is relatively small and predictable, and does not significantly interfere with GraphiGlove system performance in image scanning and interpretation tasks. This realization bodes well for a
commercial device that will not require customized control features. Rather, we envision a future
GraphiGlove that will be relatively generic and easily adaptable to a wide range of visually impaired
users seeking access to videographic information.
In Year 3, based on the results of the Dynamic Range study conducted in Year-2, a study to
characterize the output stage of the GraphiGlove system was performed. Single electrodes situated at
the midpoint of each of the three segments on the left-middle finger of 5 subjects were randomly
selected and stimulated. Subjects were instructed to adjust the stimulus amplitude (i.e. the "TDU Intensity") in approximately 5% (1.5 V) increments, and both the TDU output voltage and the delivered
stimulus current to the electrode were measured. This procedure was repeated approximately 10
times at each position. The results can be seen in Figure 4. The graph shows the mean values of
current (mA) for all 161 trials, and a linear approximation of the relationship between the user-controlled intensity setting, as a percentage of the maximal output of the TDU (nominally 30 V,
unloaded), and the measured current delivered to the electrode in contact with the skin.
The results show that the least variability occurred at the lowest intensity levels (usually below
sensation threshold), whereas at higher commanded levels, the effects of individual skin resistance affect the ability of the system to deliver current. This is evident in the wide dispersion of data at the maximal TDU setting (100%). At that setting, individuals with relatively low skin impedance received the maximum potential output of the current source (values of ~12-13 mA), while others did not (e.g. the cluster at ~6-8 mA), indicating that the system becomes current-limited at high skin resistance. We observed that subjects with the highest skin resistance typically had thicker cornified (stratum corneum) layers and drier skin than those with lower resistance (and hence the highest current measurements). These results indicate that the present system has some limitations, and appears to be dependent on preparation of the electrode-skin interface (e.g. oil- and callus-free), and especially
hydration. For the system to perform adequately for a broad range of skin characteristics, a
subsequent redesign will require a higher maximal current output capability, on the order of 20 mA.
[Figure 4 plot: "All Subjects, All Locations"; average current (mA) vs. TDU Intensity (%), with a linear fit over all data: y = 0.0964x + 0.2286.]
Figure 4. Characterization of the current output stage of the GraphiGlove system.
Graph of the relationship between the user-controlled intensity setting, "TDU Intensity"
(as a percentage of the maximal output of the TDU, nominally 30 V, unloaded), and the
measured current delivered to the electrode in contact with the skin.
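For reference, evaluating the fitted line at representative settings gives a mean delivered current of approximately 0.0964 × 50 + 0.23 ≈ 5.0 mA at the 50% intensity setting and 0.0964 × 100 + 0.23 ≈ 9.9 mA at the 100% setting; the latter falls between the ~12-13 mA received by low-resistance subjects and the ~6-8 mA cluster, as expected for an average over subjects who were and were not current-limited.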
Significant delays in completing the Year-3 technical objectives, due to the major changes in personnel, prevented completion of the rigorous psychophysical testing of the GraphiGlove system. However, the students developed an experimental protocol for training and testing human subjects in the haptic exploration and identification tasks, and ran pilot studies to test some of their
hypotheses. In the course of this planning and initial subject testing, the students were exposed to
issues of human disability, and to the underlying motivation for this project, which was to develop an
electronic haptic aid for viewing computer graphics by people with visual impairment. Samples of
the graphical display images developed in the JEDA Dynamic Image Development environment for
this phase of the study are shown in Figures 5.a-h.
Figure 5. Samples of static images generated for presentation on the video display.
Images a & b: low and high spatial frequency gratings, respectively; c & d: high
contrast letters; e & f: high and low contrast step-gradients, respectively; g & h:
solid & open shapes of the same geometry.
Contributions
The accomplishments reported here represent, to the best of our knowledge, the first U.S.
development of a haptic display using electrotactile (ET) stimulation for portrayal of videographic
information (see Kajimoto, et al., 2003). The experimental results on electrotactile dynamic range on the fingers directly informed the development and subsequent funding of an NIH R01 research grant. The work by our group and others has demonstrated the feasibility of ET stimulation for haptic display on the fingertips, complementing our studies on the abdomen and tongue. This recent work
establishes a new level of understanding of how the brain perceives such stimuli, and gives some
insight as to how the sensation is elicited.
At the conclusion of the project, we have developed and tested a functional prototype "tactile glove",
and the rudiments of a testing and training protocol for use in subsequent perceptual research anticipated for the device. A formal curriculum intended for use in conjunction with a commercial prototype of the device was not realized due to delays in technical development. Nonetheless, the present
system, and the knowledge derived from the research conducted with it will serve as the basis from
which a training system can be developed for use in the classroom so that children who are blind or have low vision will have access to computer graphical information comparable to that of their sighted classmates. Ultimately, we hope that this will be a significant objective for future potential
collaborations.
The work supported here directly contributed to the successful development of the tongue-based
electrotactile display system with applications to postural instability and falling due to balance loss,
and to spatial orientation and navigation by people who are blind. The former of these two
developments, the BrainPort Balance device (Wicab Inc., Middleton, WI), is currently in clinical
trials for FDA approval, and is already in clinical use in Canada, Latin America, and Europe.
Invention Disclosure
"A Multidimensional Immersive Tactile Information Display" This invention is a method and device
to present multidimensional spatial-temporal information to the hand(s) of a human operator. Each
hand is equipped with a flexible glove incorporating available sensors that measure the position,
orientation, and shape (POS) of the hand, and also with a series of stimulators that produce touch
sensations which change according to the hand’s POS. The stimulators are ideally surface electrodes
delivering controlled current pulses to stimulate touch nerves (electrotactile or electrocutaneous
stimulation).
An Invention Disclosure Report based on the work reported here was submitted on 11/26/03 to the Wisconsin Alumni Research Foundation (WARF), the intellectual property and commercialization organization for the UW-Madison research community. After thorough review, WARF elected not to pursue a patent on the technology.
Publications
Bach-y-Rita, P., Tyler, M. E. and Kaczmarek, K. A. (2003). "Seeing with the brain." Int. J. on
Human and Comp. Interaction 15(2), 287-297.
Bach-y-Rita, P., Kaczmarek, K. A. and Tyler, M. E. (2003). "A tongue-based tactile display for
portrayal of environmental characteristics." In Virtual and Adaptive Environments (Hettlinger, L.
J. and Haas, M. W., Eds.), pp. 169-186. Erlbaum, Mahwah, NJ.
Bach-y-Rita, P. (in press). "Late post-acute neurologic rehabilitation: neuroscience, engineering and clinical programs." Arch. Phys. Med. Rehab.
Web sites
http://bme.wisc.edu/
http://kaz.med.wisc.edu/
Collaborations
Because of the intensely technical nature of the work in Years-2 and 3, we were unable to fulfill our
plans to collaborate with Terrence Millar, PI of the K-Through-Infinity (KTI) program at NISE at
UW-Madison.
External seminars / presentations
Tyler, M.E., Haase, S.J., Kaczmarek, K.A., & Bach-y-Rita, P. (2002). "Development of an
electrotactile glove for display of graphics for the blind: Preliminary results." Poster presented at: 2nd
Joint Conference of the IEEE Engineering in Medicine and Biology Society and the Biomedical
Engineering Society; Houston TX; October 24-27 2002.
Project described during Seminar for Brain Awareness Week, UW-Madison, October 22, 2002.
(Bach-y-Rita)
Coulter Prize Lecture of the American Congress of Rehabilitation Medicine, October 2002. (Bach-y-Rita)
Project described in BME-515: Therapeutic Medical Devices Seminar, UW-Madison, February
2003-2006. (Tyler)