Wireless Sensor Networks for Minimally Invasive Surgery

Hugo Furtado
International Postgraduate School, Jožef Stefan Institute

Roman Trobec
Department of Communication Systems, Jožef Stefan Institute
Minimally invasive surgery (MIS) is growing in popularity, and with it grows the demand for better technological support. A number of applications based on augmented reality technology already support MIS. Another topic of growing interest is wireless sensor networks. Being a highly multidisciplinary field, this research is fragmented in very different directions, and there is currently a gap on the application side: applications do not build on the sum of results but rather evolve driven by specific subsets of them. We propose to combine advances in augmented reality with advances in low-level WSN research to build an application supporting MIS.
Minimally invasive surgery is a young discipline strongly supported by multidisciplinary technological advances in augmented reality, robotic technology, and advanced instrumentation and measuring equipment. In the last decade, wireless sensor networks have become increasingly promising because of their ability to address the crucial issues of minimally invasive surgery: safety, security and flexibility. In this paper, a review of already implemented solutions and new ideas for future developments are presented. The main focus is on minimally invasive methods and wireless sensors, as such solutions are the most promising for future use because of their suitability for patients and medical personnel.
1. Introduction
Minimally invasive surgery (MIS) is one of the most important trends in modern medicine. It offers less trauma for the patient, smaller risk of infection and shorter hospital stays. The latter clearly benefits the patient, but also the health system, where the cost per patient is reduced and the ability to treat more patients increased. The surgeon, however, is faced with performing this kind of intervention without direct visual guidance, without direct tactile feedback and with impaired dexterity due to the small manoeuvre space. Medical imaging equipment such as ultrasound or fluoroscopy is used intra-operatively to help navigation. This intra-operative guidance is often complemented by a previous study of pre-operative images acquired from other, more precise modalities, like Computed Tomography (CT) or Magnetic Resonance Imaging (MRI). In this case, surgeons or other interventionists mentally correlate the studied images with the less precise but real-time intra-operative modalities.
Augmented Reality (AR) applications can support the interventionist by enhancing real-time intra-operative imaging with computer-generated information. Such systems can automatically relate pre-operative images to real-time intra-operative modalities and, in this way, provide guidance cues about the directions to follow. This has the potential benefit of not only leading to better treatment accuracy but also of reducing the cognitive load of the interventionist.
AR applications are heavily dependent on information gathering. Basically, they can be seen as a pipeline of sensing (object tracking, head tracking, pressure measurement), processing (registration, collision detection, etc.) and display (on a screen, on a head-mounted display, on top of the "real" world, etc.). Wireless sensor networks (WSN) share some segments of this "pipeline", e.g., sensing and processing. So, even though AR applications are nowadays applied to solve the problems associated with MIS - they augment reality so the surgeons recover their "lost" senses - they often rely on a set of dedicated sensors which are not wireless. Herein lies the opportunity to use WSN strengths to feed the sensing demands of AR applications. The advantages are clear: unobtrusive mini-sensors, which can even be implanted inside the human body, providing real-time data that would otherwise be cumbersome or even impossible to obtain with wired sensors.
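As a loose illustration of this pipeline view, the three stages can be sketched as composed functions. All stage contents below (stage names, values, the 50 mm proximity check) are our own hypothetical placeholders, not taken from any real AR framework:

```python
# Sketch of the sensing -> processing -> display pipeline described above.
# All values and stage contents are illustrative placeholders.

def sense():
    """Sensing stage: e.g., tool tracking and pressure measurement."""
    return {"tool_position_mm": (12.0, 3.5, 40.0), "pressure_mmHg": 85}

def process(readings):
    """Processing stage: e.g., registration or collision detection.
    Here: flag whether the tool is within 50 mm of the origin."""
    x, y, z = readings["tool_position_mm"]
    readings["near_target"] = (x * x + y * y + z * z) ** 0.5 < 50.0
    return readings

def display(state):
    """Display stage: overlay a cue on a screen or head-mounted display."""
    return f"tool near target: {state['near_target']}"

print(display(process(sense())))  # tool near target: True
```

WSNs naturally slot into the first two stages, which is exactly the overlap argued for above.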
To achieve this, research results and experience gained in different areas have to be analyzed and combined.
This paper brings two main contributions: first, we present an overview of existing WSN medical applications, with an analysis of their strengths and weaknesses. Second, we present a set of design requirements for a typical WSN-supported AR demonstration and an initial proposal of an architecture fulfilling these requirements. We propose a client-server architecture based on a middleware server that manages the network in terms of reliability, security and connectivity. Using such an approach, we plan to achieve plug-and-play-like reliable wireless sensing, with security mechanisms to prevent data sniffing and data corruption by third parties. We propose a conceptual solution and also discuss possible implementation technologies.
The rest of the paper is organized as follows: in the next section, we summarize the relevant background in each of the mentioned areas. Then we present our WSN-supported AR architecture proposal in detail. We continue by presenting some envisioned scenarios and finally end with discussion and conclusions.
2. Background
There are many applications of WSN to medicine, from very different perspectives. They range from Body Sensor Networks (BSN), where a collection of wireless on-body sensors communicates with external servers with the aim of supporting rehabilitation [1], preventing ulcers [2], or simply informing doctors at remote locations of the state of patients at home [3], to applications spanning whole hospitals, where medical staff and equipment are tracked to increase the efficiency of the medical care process [4]. The former are normally based on custom-developed wireless sensor boards that measure physiological quantities, perform some local processing and send the data to a sink. The sink can be either a personal device (like a PDA) or a PC. The latter kind of application is normally based on a more complex architecture where local networks (sensors detecting doctors with RFID tags) communicate with central servers over normal LAN infrastructure and manage this information in a central database. In the WILHO project [5], RFID tag readers that can communicate with a central server are used to order clothes in a hospital. The developed framework abstracts the RFID reader (treating it just like a general sensor), so the architecture could be used with other sensing technology.
In [6,7] the authors developed a number of different vital-sign wireless sensors intended to acquire information from patients in case of accidents or of a massive disaster. The sensors relay their information to PDAs or ambulance-based stations using short-range wireless communications. The data can be integrated into a pre-hospital patient record. From another perspective, some applications are dedicated to supporting the medical team in the operating room (OR) [8]. In this case, the patient, the medical team, the tools and the blood bags are equipped with RFID tags. Using a rule-based approach and the presence or absence of readings from these sensors, the system generates warnings in case something is wrong, e.g., when blood bags are of the wrong type.
Augmented Reality is the technology by which computer-generated information is superimposed on a real scene with the purpose of enhancing the experience of this scene. There are also many applications in medicine, and in particular in MIS, where the constraints that surgeons face, especially the lack of visibility, make AR a strong candidate for improvements. Example systems can be found in abdominal [9] and cardiac [10] interventions, minimally invasive lung brachytherapy [11,12] and many other medical treatments. AR systems are also used together with actuators, especially in robotic surgery [13]. Still, to this date, we have found no report on the combination of these two domains: an AR system collecting information from a network of wireless sensors rather than from a conventional sensor setup. Typically, both AR applications and WSN applications are custom-tailored, where either the sensors and their arrangement (in the case of AR) or the display and handling of data (in the case of WSN) are very specific to the application. This motivates our current work: to develop an architecture which interfaces both domains. Still, interfacing these complex domains is not an easy task.
Middleware systems bridge two (or more) layers of software that normally don't interact out-of-the-box. According to [14], middleware is "...software that lies between the operating system and applications running on each node of the system. Generally, middleware is expected to hide the internal workings and the heterogeneity of the system, providing standard interfaces, abstractions and a set of services that depends largely on the application". Faced with the task of interfacing these two worlds, our current approach was inspired by two points of view pointing in the same direction. The first is presented in [15], where the authors describe a complete IPv6-based network architecture for WSN. In most of the reviewed applications, the communication architecture is not IP compatible. In that work the authors show that, whereas it was previously thought that IP is an old standard unsuitable for WSNs, with careful design the performance can be better than that of the custom approaches designed for each application. This leads us to leave the design of a custom network stack aside for now and focus on the middleware. Such a decision is supported by [16], whose review of possible distribution schemes for WSNs shows that an approach via middleware can be an interesting one. For many other authors [17], too, middleware is a good choice to provide high-level abstraction when interfacing heterogeneous sensors and applications. Taking this into consideration, it seems to us that middleware-based architectures are instrumental in combining the power of WSN sensing capabilities with the enhancing capabilities of AR applications, with their already proven benefits to medical interventions.
3. Proposed architecture
We designed an architecture that addresses the OR needs by focusing on three main requirements for a medical application: safety, security and flexibility. We will now discuss these requirements in more detail.
Safety is a key factor in a medical application. The consequences of system failure can be permanent body damage or death, so it is of extreme importance that safety is taken into account when designing a system to support medical interventions. Regarding communications, the aspects to consider are: maintaining the data rate at an acceptable level, avoiding communication breakdown, and the consequences when one or the other cannot be guaranteed. Maintaining the data rate and avoiding breakdown can be seen as a guarantee of a certain level of quality-of-service (QoS). In the literature, many groups discuss this issue with regard to WSNs [18]. In our case, we address the issue by providing a scheme to monitor the connections according to certain criteria. As for the consequences of such problems, for the time being we limit ourselves to providing the means for an application to know that there has been a problem, rather than designing a fault-tolerant system.
When designing a WSN architecture for the OR, it is also important to have security in mind. Data leaking out of the network must be avoided, as privacy-sensitive information about the patient might be disclosed to a third party. On the other hand, the network must be protected from intrusion, where an impersonating node might generate fake data or corrupt real data, potentially compromising the surgery. Encryption schemes can provide the necessary security with regard to data "sniffing". We present a simple authentication mechanism that prevents foreign nodes from taking part in the network as trusted sensors.
Flexibility of the network is of utmost importance in such medical environments, as the types of sensors that might be required can vary greatly. As an example, nowadays a surgical team relies on all kinds of inputs: pressure, ECG, imaging (ultrasound, endoscopic cameras, etc.). In the future, position tracking is expected to move into the OR, as well as wireless sensing of all kinds. To cope with these different possibilities, a system must be designed to be extendable and not rely on only certain kinds of inputs. Also, applications themselves may function as sensors, as their output data can serve as input to an actuator, itself acting as a network node. So, as already stated in [19], a mechanism for easy plug-and-play of sensors and applications into the network must be derived, and one that doesn't compromise the previous requirements (safety and security). This vision is also shared in other efforts like the European research network SENSEI [20], which aims at building the framework for the "internet of things" through the design of a general network architecture.
Table 1 briefly summarizes the requirements of a MIS-supporting application from the network point of view.
Topic        Description
Safety       Sensing must be reliable. Special care must be taken so that:
             - data are accurate
             - communication flow is continuous (at the needed rate)
             - faulty sensors are coped with (evaluation of consequences, actions)
Security     Security must be guaranteed from several points of view:
             - data cannot be corrupted by external agents
             - data cannot be sniffed by external agents
             - only trusted sensors and other resources can be part of the network
Flexibility  Participating sensors may be heterogeneous and must be able to join
             and leave the network in a simple way

Table 1 - Overview of requirements for a WSN medical application
To address these requirements, we propose an architecture based on a middleware component that is responsible for managing the safety, security and flexibility of the connectivity between sensors and applications. The component is in charge of deciding which sensors are allowed to participate in the network, of monitoring the quality of the sensing data and of the connections, and of maintaining the "network context": centralized information on which nodes are part of the network at a certain time and which ones exchange information.
This work is in part inspired by the work presented in [21], used in online gaming. Large-scale gaming has to cope with unreliable communication, heterogeneity and persistence; we are less concerned with scalability and much more with security. The authors use a real-time XML database that can be queried with XPath. Using this technology, they maintain the state of the game in the server: which players are participating, which non-player characters exist, etc. The use of XML as a core technology, both for the database and for network communications, allows for rapid prototyping of applications and eases the proof-of-concept phase. Such a database will allow us to maintain the current state of the network regarding active network nodes, which is useful both to inform client applications of the available sensors and for QoS monitoring. Since our first aim is proof-of-concept and the study of the feasibility and utility of the proposed system, and not efficiency, XML is a good choice because it is a well-established format and easy to implement, at least on the server side and on other nodes with enough computational resources. Apart from the server, a simpler middleware component also exists on the sensor or actuator nodes. Figure 1 shows a schematic block diagram of the proposed architecture.
Figure 1 – Operating room WSN architecture overview
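To make the XML network-context database concrete, the sketch below uses Python's standard library, which supports XPath-style queries over an in-memory XML tree. The element names, attributes and helper function are our own illustrative assumptions, not the schema of [21]:

```python
import xml.etree.ElementTree as ET

# Hypothetical in-memory XML "network context" kept by the middleware server.
network = ET.Element("network")

def register_node(node_id, kind, measurement):
    """Record a newly authenticated node in the XML state database."""
    ET.SubElement(network, "node", id=node_id, kind=kind,
                  measurement=measurement, status="active")

register_node("s1", "sensor", "pressure")
register_node("s2", "sensor", "position")
register_node("a1", "actuator", "pump")

# XPath-style query: all active pressure sensors currently in the network,
# e.g. to answer a client application asking for available sensors.
pressure_sensors = network.findall(
    ".//node[@kind='sensor'][@measurement='pressure'][@status='active']")
print([n.get("id") for n in pressure_sensors])  # ['s1']
```

The same document can be serialized for network communication, which is what makes XML attractive for the proof-of-concept phase despite its runtime cost.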
The middleware component manages the network. It is responsible for accepting or rejecting network members, providing information upon request about the status of the network (how many sensors are connected, of which kind, etc.), connecting two or more nodes together for efficient data exchange and providing QoS (what we see here as safety) mechanisms. These services are described in more detail below:
 Node authentication: authentication is done at a node's request. The node wanting to participate in the network must log in by sending a request to the server. An example of how a sensor can request to join the network is shown in Figure 2. The sensor sends a request to join the network and the server asks for authentication. If authentication is successful, the sensor is accepted as part of the network and an entry is created in the server database. The server keeps this database with the table of participating nodes and their characteristics (type of measurement, reliability, etc.). The server then broadcasts the current sensor pool to all previously participating nodes. After updating, the sensors can accept connections with any other node from the table.
Figure 2 - Example of a node requesting to participate in the network

Connectivity: since the server maintains a list of all connected nodes, it is possible for an application to query the server looking for available sensors. The application can then request to get data from sensors. This provides maximum flexibility: the application can get information about which type of data is generated by the sensors, the location of the sensors, etc., and can thus create its own custom network of information tailored to its needs.

QoS mechanism: when the server receives and accepts requests for connections from applications, it updates a table with the current connections. In this way, the server knows at all times which nodes are holding active data transfer connections. When a data connection is maintained between two nodes (e.g., an application and a pressure sensor), the middleware on these nodes is also responsible for sending messages with timestamps to the server middleware. The server can compare the timestamps and obtain a measure of the latency of the connection. If the latency becomes unacceptable, the server can warn the application. With this mechanism, the server is always aware of the communication status between the connected nodes and can keep statistics on data transfer times, failure counts, etc. These statistics can be used in the future when other applications join and request nodes with certain QoS criteria.
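A minimal sketch tying the three services together is shown below. The class and method names, the pre-shared list of trusted IDs and the 100 ms latency threshold are all illustrative assumptions, not part of the proposal itself:

```python
class MiddlewareServer:
    """Illustrative sketch of the proposed server middleware: node
    authentication, connectivity bookkeeping and a timestamp-based
    QoS check. Names and thresholds are made-up assumptions."""

    def __init__(self, trusted_ids, max_latency=0.1):
        self.trusted_ids = set(trusted_ids)  # pre-shared trusted node IDs
        self.nodes = {}                      # node_id -> characteristics
        self.connections = []                # (source_id, sink_id) pairs
        self.max_latency = max_latency       # seconds before a warning

    def authenticate(self, node_id, characteristics):
        """Accept only trusted nodes; record them in the node table."""
        if node_id not in self.trusted_ids:
            return False
        self.nodes[node_id] = characteristics
        return True

    def connect(self, source_id, sink_id):
        """Register a data connection between two authenticated nodes."""
        if source_id in self.nodes and sink_id in self.nodes:
            self.connections.append((source_id, sink_id))
            return True
        return False

    def report_timestamps(self, sent_at, received_at):
        """Nodes report send/receive timestamps; the server checks latency."""
        latency = received_at - sent_at
        return "ok" if latency <= self.max_latency else "warn"

server = MiddlewareServer(trusted_ids=["pressure-1", "viz-app"])
assert server.authenticate("pressure-1", {"measurement": "pressure"})
assert not server.authenticate("rogue-9", {})  # unknown node rejected
assert server.authenticate("viz-app", {"type": "application"})
assert server.connect("pressure-1", "viz-app")
print(server.report_timestamps(sent_at=0.0, received_at=0.25))  # warn
```

A real deployment would of course add cryptographic authentication and per-connection statistics; this sketch only shows how the three tables (trusted IDs, nodes, connections) interact.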
At the sensor level, the middleware will be a much simpler component, as the only functionalities it has to provide are the implementation of the communication protocol and the database with the IDs of the trusted nodes, which is actually just a simple table. In the sensors, the application layer will be responsible for managing the data acquisition and interfacing with the middleware. On actuators, the application layer will implement the logic to control the hardware according to the data received from the network. In [22] the authors present their own implementation of the middleware as an adaptation of an existing server middleware, which was reduced to achieve the small footprint required to fit in sensor node hardware. Their work shows that tiny middleware implementations compatible with industrial standards (e.g., CORBA) are possible on sensor nodes.
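The node-side component really can be this small. The sketch below (all names are illustrative assumptions) shows the trusted-ID table as a plain set and the application layer as an acquisition callback:

```python
class SensorNodeMiddleware:
    """Sketch of the much simpler node-side middleware: a trusted-ID
    table and a hook for the application layer that acquires the
    measurements. All names are illustrative assumptions."""

    def __init__(self, trusted_ids, acquire):
        self.trusted_ids = set(trusted_ids)  # the "database": a simple table
        self.acquire = acquire               # application-layer acquisition

    def handle_request(self, requester_id):
        """Serve a data request only if the requester is a trusted node."""
        if requester_id not in self.trusted_ids:
            return None
        return self.acquire()

node = SensorNodeMiddleware(trusted_ids=["viz-app"],
                            acquire=lambda: {"pressure_mmHg": 92})
assert node.handle_request("viz-app") == {"pressure_mmHg": 92}
assert node.handle_request("rogue") is None
```

On an actuator node, the `acquire` hook would instead be a command handler driving the hardware.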
4. Example scenarios
To exemplify how such a system could be used we provide two possible scenarios: minimally
invasive cardiac surgery and a context-based application in the OR, which is built in parallel to the
cardiac application.
Scenario 1: A system for automatic catheter placement in minimally invasive cardiac surgery was previously designed and implemented and is described in [auto cite... not sure I like it]. A simple description of the system is shown in Figure 3a). In this system, the application relies on several sensor sources: position tracking and pressure measurements inside the aorta and inside a balloon catheter. The position is used for visualization and, together with the pressure in the aorta, to control the position of the catheter. The position is physically controlled by a mechanical actuator. The pressure inside the balloon is measured and automatically controlled by a mechanical pump. All sensors are currently connected with wires to the acquisition systems. The application is already distributed, with 3 parts on different PCs: tracking, visualization and control. Communication between PCs is done over TCP/IP using application-specific protocols. Wireless pressure sensors available on the market [23] are small enough to be used inside the arteries. They still don't incorporate hardware for radio communications, but this is only a matter of time. The existing system can be updated by using the architecture described before. The pressure sensors could be wireless, with the advantage of easier integration in the OR. Moreover, in this kind of surgery it is also important to monitor other pressures (namely right and left arm radial pressure). This is done now, but with such an architecture it would be extremely easy to do using wireless pressure sensors, as integrating them into the application is seamless. The actuators (catheter and pump) would also be nodes on the network. The control algorithms can be implemented in embedded processors, which can wirelessly get the needed data (position and pressure) through the network. Finally, the same is true for the visualization: the application can simply be designed to register as a node in the middleware server and to collect data from the tracking sensor. As now, this data can be displayed to the user. Moreover, it would then be extremely easy to get data from the other sensors and display information about the pressures on the screen (at the moment this would require new dedicated connections and specific protocols for data transmission). The proposed improvement is depicted in Figure 3b).
Figure 3 - a) existing system for support of minimally invasive cardiac surgery
b) proposed upgrade with our architecture
Scenario 2: here we envision a context-aware application supported by our architecture. This case builds on top of the previous one. Let's imagine that the previous scenario is implemented. Then, we would like to have a system such as the one described in [8], where a higher-level engine, based on information gathered by sensors in the surgery, generates warnings about specific dangerous situations. With our architecture, any application can simply register with the server and request sensor information. Then, a separate computer can implement the context-awareness algorithms based on the sensor information. As an example, in the surgery of scenario 1 it is important to prevent the balloon from occluding a specific artery. When this occlusion occurs, the right radial pressure drops (thus the need to monitor it). A rule-based system can correlate these observations: when the measured position indicates possible occlusion and the pressure has dropped, the system can indicate (with greater confidence) that occlusion has occurred.
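A toy version of such a rule is sketched below. The thresholds (the position zone and pressure floor) are completely made up for illustration; a real system would derive them from clinical data:

```python
def occlusion_warning(balloon_position_mm, radial_pressure_mmHg,
                      occlusion_zone=(40.0, 60.0), pressure_floor=60.0):
    """Rule-based check for the scenario above: warn with higher
    confidence when BOTH the tracked position suggests occlusion and
    the right radial pressure has dropped. Thresholds are assumptions."""
    position_suspect = occlusion_zone[0] <= balloon_position_mm <= occlusion_zone[1]
    pressure_dropped = radial_pressure_mmHg < pressure_floor
    if position_suspect and pressure_dropped:
        return "occlusion likely"   # both cues agree: high confidence
    if position_suspect or pressure_dropped:
        return "check"              # single cue: low-confidence warning
    return "ok"

print(occlusion_warning(50.0, 45.0))  # occlusion likely
print(occlusion_warning(50.0, 80.0))  # check
print(occlusion_warning(10.0, 80.0))  # ok
```

With the proposed architecture, both inputs to this rule would arrive through the middleware as ordinary sensor subscriptions.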
5. Discussion
In our design, we chose to approach the problem in a simplified way. The reasons for this were explained before: it is feasible to approach the problem with middleware only, and it seems like a good idea to use an IP-compatible stack. Moreover, our focus is not on redesigning a new network stack but on the application itself. Our main objective is to test whether this kind of architecture will help the development of new applications thanks to its flexibility. Finally, designing and implementing both a network stack and middleware would be time consuming, and a stepwise approach is more suitable at this time. Nevertheless, if the concept proves valuable and we learn that a dedicated stack would be more suitable, this approach will have to be considered in the future.
In terms of architecture, our design has great flexibility. It allows different kinds of sensors to be plugged in. Sensing data types will be registered in the database and applications will subscribe to them based on their needs. The complexity of the underlying network is hidden by the middleware. The applications only see data that they can subscribe to, and this data will be redirected to them. In this way, applications can be designed to depend on a variable number of sources of information of different kinds. This can even change at runtime, as an application (even through its users) might request new sources of data at any time. The sensors can be combined in any way, so more complex applications can easily be developed on top of this architecture. Applications can feed on sensor data, process it to produce higher-level information (e.g., the context-based application of scenario 2) and in turn output these results as data to the network. Also, several applications can reside side by side, simply profiting from the network infrastructure without interfering with each other. This conceptual separation of the network from the applications provides the necessary modularity, so that development of new applications is much easier.
6. Conclusions and future work
An architecture for easy interconnection of wireless sensors, with the objective of supporting minimally invasive surgery, was presented. So far, only the concept is presented. Our next steps will be to refine the idea and choose the technology carefully so that implementation can be smooth.
We expect our implementation to prove that such an architecture can speed up the development of prototypes of applications supporting MIS and that the architecture addresses, even if still at a basic level, some of the major requirements of surgical support.
Acknowledgement
The authors acknowledge the financial support from the state budget by the Slovenian Research
Agency under grant P2-0095.
Bibliography
[1] E. Jovanov, A. Milenkovic, C. Otto, and P. de Groen, "A wireless body area network of
intelligent motion sensors for computer assisted physical rehabilitation," Journal of
NeuroEngineering and Rehabilitation, vol. 2, no. 1, p. 6, 2005.
[2] "Ubiquitous Personal Assistive System for Neuropathy," ,
http://www.cs.ucla.edu/~alireza/healthnet08.pdf
[3] K. Van Laerhoven, B. P. L. Lo, J. W. P. Ng, S. Thiemjarus, R. King, S. Kwan, H.-W. Gellersen, M. Sloman, O. Wells, P. Needham, N. Peters, A. Darzi, C. Toumazou, and G.-Z. Yang, "Medical Healthcare Monitoring with Wearable and Implantable Sensors," 2004.
[4] S. Seppänen, M. I. Ashraf, and J. Riekki, "RFID Based Solution For Collecting Information
And Activating Services In A Hospital Environment," Proc 2nd Intl Symposium on Medical
Information and Communication Technology (ISMICT'07), 2007.
[5] "ISG WILHO, Wireless Hospital," , http://www.ee.oulu.fi/research/isg/projects/isgwilho
[6] "CodeBlue An Ad Hoc Sensor Network Infrastructure for Emergency Medical Care," ,
http://now.cs.berkeley.edu/~mdw/papers/codeblue-bsn04.pdf
[7] "CodeBlue: Wireless Sensors for Medical Care," , http://fiji.eecs.harvard.edu/CodeBlue
[8] J. E. Bardram and N. Nørskov, "A context-aware patient safety system for the operating room," in Proceedings of the 10th international conference on Ubiquitous computing, Seoul, Korea: ACM, 2008, pp. 272-281.
[9] P. Lamata, A. Jalote-Parmar, F. Lamata, and J. Declerck, "The Resection Map, a proposal for
intraoperative hepatectomy guidance," International Journal of Computer Assisted Radiology
and Surgery, vol. 3, no. 3-4, pp. 299-306, 2008.
[10] C. A. Linte, A. D. Wiles, N. Hill, J. Moore, C. Wedlake, G. Guiraudon, D. Jones, D. Bainbridge, and T. M. Peters, "An augmented reality environment for image-guidance of off-pump mitral valve implantation," 6509 ed., 2007.
[11] I. Wegner, M. Vetter, M. Schoebinger, I. Wolf, and H. P. Meinzer, "Development of a
navigation system for endoluminal brachytherapy in human lungs," in Society of Photo-Optical
Instrumentation Engineers (SPIE) Conference 2006.
[12] I. Wegner, J. Biederer, R. Tetzlaff, I. Wolf, and H. P. Meinzer, "Evaluation and extension of a
navigation system for bronchoscopy inside human lungs," in Society of Photo-Optical
Instrumentation Engineers (SPIE) Conference 2007.
[13] F. Devernay, F. Mourgues, and E. Coste-Maniere, "Towards endoscopic augmented reality for
robotically assisted minimally invasive cardiac surgery," Proc. Medical Imaging and
Augmented Reality, pp. 16-20, 2001.
[14] I. Chatzigiannakis, G. Mylonas, and S. Nikoletseas, "50 ways to build your application: A
survey of middleware and systems for wireless sensor networks," 2007, pp. 466-473.
[15] J. W. Hui and D. E. Culler, "IP is dead, long live IP for wireless sensor networks," in Proceedings of the 6th ACM conference on Embedded network sensor systems, Raleigh, NC, USA: ACM, 2008, pp. 15-28.
[16] M. Kuorilehto, M. Hännikäinen, and T. D. Hämäläinen, "A survey of application
distribution in wireless sensor networks," Eurasip Journal on Wireless Communications and
Networking, vol. 2005, no. 5, pp. 774-788, 2005.
[17] K. Römer, O. Kasten, and F. Mattern, "Middleware challenges for wireless sensor networks," SIGMOBILE Mob. Comput. Commun. Rev., vol. 6, no. 4, pp. 59-61, 2002.
[18] D. Chen and P. K. Varshney, "QoS support in wireless sensor networks: A survey,", 1 ed 2004,
pp. 227-233.
[19] L. Schmitt, T. Falck, F. Wartena, and D. Simons, "Towards Plug-and-Play Interoperability for
Wireless Personal Telehealth Systems," 4th International Workshop on Wearable and
Implantable Body Sensor Networks (BSN 2007), IFMBE Proc, vol. 13, pp. 257-263, 2007.
[20] "SENSEI," , http://www.sensei-project.eu/
[21] D. Wagner and D. Schmalstieg, "Muddleware for prototyping mixed reality multiuser games,"
2007, pp. 235-238.
[22] F. J. Villanueva, D. Villa, F. Moya, J. Barba, F. Rincón, and J. C. López, "Lightweight middleware for seamless HW-SW interoperability, with application to wireless sensor networks," 2007, pp. 1042-1047.
[23] "Cardiomems," , http://www.cardiomems.com/