The RealityGrid PDA and Smartphone Clients:
Developing effective handheld user interfaces for
e-Science
Ian R. Holmes and Roy S. Kalawsky
East Midlands e-Science Centre of Excellence,
Research School of Systems Engineering, Loughborough University, UK
Abstract
In this paper the authors present the RealityGrid PDA and Smartphone Clients: two
novel and highly effective handheld user interface applications, which have been created
purposefully to deliver (as a convenience tool) flexible and readily available ‘around the
clock’ user access to scientific applications running on the Grid. A discussion is
provided detailing the individual user interaction capabilities of these two handheld
Clients, which include real-time computational steering and interactive 3D visualisation.
The authors also identify, along with several lessons learned through the development of
these Clients, how such seemingly advanced user-oriented e-Science facilities (in
particular interactive visualisation) have been efficiently engineered irrespective of the
targeted thin-device deployment platform’s inherent hardware (and software) limitations.
1. Introduction
The UK e-Science RealityGrid project [1] has
extended the concept of a Virtual Reality (VR)
research centre across the Grid. “RealityGrid
uses Grid technologies to model and simulate
very large or complex condensed matter
structures and so address challenging scientific
problems that would otherwise remain out of
reach or even impossible to study” [2]. As part
of the project’s ongoing research initiative, the
authors (in part) conducted a thorough and
comprehensive Human Factors evaluation [3],
closely examining e-Science user interaction
methodologies with a view towards improving
everyday working practices for applications (as
well as computer) scientists. The results of this
investigation highlighted (amongst other notable
findings) the need for a more flexible and
convenient means of supporting ubiquitous
scientific user interaction in Grid computing
environments. This need was discerned from
RealityGrid’s scientific workflows in which it
was identified that at certain stages researchers
could benefit from having a more lightweight
and portable extension/version of their familiar
desk-based front-end applications; also (in some
cases) providing a more attractive and usable
alternative to some of today’s somewhat
cumbersome command line-driven systems.
Creating a more convenient
and mobile form of e-Science user interaction
was deemed beneficial because it would allow
researchers the freedom to interact with their
applications from virtually any location and at
any time of day (or night); especially supporting
(and sustaining) the vitally important interactive
process at times when it may become
inconvenient or otherwise impractical for the
scientist to carry a laptop PC or to connect to
the Grid from an office, laboratory or other
similar statically-oriented working environment.
In this paper the authors describe how
this discerned need for convenient e-Science
usability has been addressed (within the
RealityGrid project) through the creation of the
RealityGrid PDA and Smartphone Clients: two
novel and highly effective handheld user
interface applications, which have been built
purposefully to deliver (as a convenience tool)
flexible and readily available ‘around the clock’
user access to scientific applications running on
the Grid. A discussion is provided detailing the
individual user interaction capabilities of these
two handheld Clients, which include real-time
computational steering and interactive 3D
visualisation. The authors also identify, along
with several lessons learned through the
development of these Clients, how such
seemingly advanced user-oriented e-Science
facilities (in particular interactive visualisation)
have been efficiently engineered irrespective of
the targeted thin-device deployment platform’s
inherent hardware (and software) limitations.
Figure 1. “The central theme of RealityGrid is computational steering” [2] (left). The project has
developed its own Grid-enabled computational steering toolkit [10], comprising lightweight Web
Services middleware and a programmable application library (right).
2. Related Work: The UK e-Science
RealityGrid Project
“Through the linking of massive computational
infrastructure at remote supercomputing centres
as well as experimental facilities, RealityGrid is
providing scientists with access to enormous
resources in a highly efficient manner, which
means researchers in a range of fields can
benefit from the close coupling of high-throughput experimentation, modelling and
simulation, and visualisation” [4]. To give an
insight into the computational scale of the
project: RealityGrid has been central to the
development, deployment and testing of
scientific Grid middleware and applications,
which, as demonstrated by the TeraGyroid [5],
STIMD [6] and SPICE [7] projects, efficiently
utilise a federated Grid environment (or ‘Grid-of-Grids’) built around the UK’s most advanced
computing technology and infrastructure, linked
via trans-Atlantic fibre to the supercomputing
resources of the US’s TeraGrid [8].
2.1 Computational Steering
“The central theme of RealityGrid is
computational steering, which enables the
scientist to choreograph a complex sequence of
operations or ‘workflows’” [2] (Refer to Figure
1). “Using the Grid and a sophisticated
approach called computational steering, the
scientists steer their simulations in real-time so
that the heavy-duty computing can focus where
the dynamics are most interesting” [9].
RealityGrid has developed (as part of a wider
Grid-based problem-solving environment) its
own dedicated computational steering toolkit
[10], comprising lightweight Web Services
middleware and a programmable application
library (Refer to Figure 1). Using the toolkit,
researchers have been able to inject the project’s
computational steering facilities into their
existing (and traditionally non-interactive)
scientific codes and thus become empowered to
steer their experiments (simulations) in real-time on (or off) the Grid.
2.2 Lightweight Visualisation
Extended RealityGrid research has recently led
to the creation of Lightweight Visualisation
(Refer to Figure 2). Developed separately by the
authors, Lightweight Visualisation has provided
a lightweight software platform to support
ubiquitous, remote user interaction with high-end scientific visualisations; as a standalone
system it has also been designed for universal
use both within and outside of RealityGrid.
Through a framework of linked Grid-enabled
components, Lightweight Visualisation provides
the underlying software infrastructure to
facilitate high-level visualisation-oriented user
interaction via a wide range of inexpensive and
readily available commodity user interface
devices, such as PDAs (Refer to Figure 2),
mobile phones and also low-end laptop/desktop
PCs; systems all commonly characterised by their
inherent lack of specialised (expensive)
hardware resources, which would traditionally
have discounted them from being considered as
viable user interaction outlets for
computationally-intensive visualisation. By also
adopting a similar architectural design to that of
the established RealityGrid computational
steering toolkit (Refer to Figures 1 and 2),
Lightweight Visualisation services (as with
steering) can be easily injected into existing
visualisation codes, thus allowing researchers to
interact remotely (via a PDA, for example) with
dedicated user-oriented facilities, which are
indigenous and often unique to their familiar or
preferred visualisation tools/applications.
Figure 2. Lightweight Visualisation (right) has provided a lightweight software framework to support
remote and ubiquitous visualisation-oriented user interaction via a wide range of inexpensive and readily
available commodity devices such as PDAs (left), mobile phones and also low-end laptop/desktop PCs.
3. The RealityGrid PDA and
Smartphone Clients
The RealityGrid PDA and Smartphone Clients
have both been created to allow scientists the
mobile freedom to stay in touch with their
experiments ‘on the move’ and especially at
times when it isn’t always convenient (or
practical) to carry a laptop PC or to connect to
the Grid from a desk-based terminal in an office
or laboratory-type environment. Using either of
these two handheld e-Science Clients the
researcher becomes able to readily access and
freely interact with their scientific applications
(discreetly if necessary) from virtually any
location and at any time of the day (or night);
connecting to the Grid over a standard, low-bandwidth Wi-Fi (802.11) or GSM/GPRS/3G
network and interacting through a user device,
small and compact enough to be easily
transported around ‘hands free’ inside one’s
pocket.
The RealityGrid PDA and Smartphone
Clients have each been individually engineered
for their respective pocket-sized devices using
C# and the Microsoft .NET Compact
Framework; thus allowing them to be deployed
onto any PDA or Smartphone device running on
the Microsoft Windows Mobile embedded
operating system. These two handheld e-Science user interface applications are each able
to offer a comprehensive assemblage of high-level RealityGrid-oriented user interaction
services, collectively allowing researchers to:
find their scientific jobs running on the Grid
(Refer to Figure 3), computationally steer their
simulations in real-time (Refer to Figure 4),
graphically plot parameter value trends (Refer
to Figure 4) as well as explore on-line scientific
visualisations (Refer to Figures 5 and 6).
3.1 Middleware Interactivity
The process by which the handheld RealityGrid
Clients communicate with scientific simulation
and visualisation applications, running on (or
off) the Grid, is achieved via a set of bespoke
proxy classes (internal to the Clients), which
have been developed specifically to support
interactivity with RealityGrid’s lightweight
middleware environment. Earlier versions of the
RealityGrid middleware were built around the
Open Grid Services Infrastructure (OGSI)
specification and implemented using OGSI::Lite
[11] (Perl). In this instance it was possible to
automatically generate the Clients’ internal
communication proxy code using a Microsoft
tool (wsdl.exe), which was fed with each
individual middleware component’s descriptive
Web Services Description Language (WSDL)
document. For the later versions of RealityGrid
middleware, which have been built around the
newer Web Services Resource Framework
(WSRF) specifications and implemented using
WSRF::Lite [11] (again in Perl), the Clients’
internal proxies had to be programmed
manually using bespoke low-level networking
code. This was because the proxy generation
tool could not cope with the latest WSRF
specifications and failed to generate any usable
code. Both of the handheld
Clients (PDA and Smartphone) are each able to
operate transparently with the earlier OGSI-based versions of RealityGrid’s middleware as
well as the latest WSRF-based implementation.
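
Although the Clients’ actual proxy classes are not reproduced here, the following minimal sketch illustrates the kind of hand-written SOAP exchange described above, using standard .NET networking classes; the endpoint URL, SOAP action and message body are placeholders rather than the real RealityGrid WSRF interface.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;
using System.Xml;

// Hypothetical sketch of a hand-rolled SOAP call, standing in for the bespoke
// proxy code described in the paper (the actual RealityGrid messages differ).
public static class SoapProxy
{
    public static XmlDocument Call(string endpointUrl, string soapAction, string bodyXml)
    {
        // Wrap the caller-supplied body in a SOAP 1.1 envelope.
        string envelope =
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body>" + bodyXml + "</soap:Body></soap:Envelope>";

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(endpointUrl);
        request.Method = "POST";
        request.ContentType = "text/xml; charset=utf-8";
        request.Headers["SOAPAction"] = soapAction;

        byte[] payload = Encoding.UTF8.GetBytes(envelope);
        using (Stream requestStream = request.GetRequestStream())
            requestStream.Write(payload, 0, payload.Length);

        // Parse the SOAP response into an XML document for the caller to unpick.
        using (WebResponse response = request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        {
            XmlDocument document = new XmlDocument();
            document.Load(responseStream);
            return document;
        }
    }
}
```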
3.2 Security Provision
Through their respective internal proxy classes, the PDA and Smartphone Clients are both able to consume RealityGrid Web/Grid Services via a two-way exchange of Simple Object Access Protocol (SOAP) messages, which are transmitted over a wireless network using the Hypertext Transfer Protocol (HTTP) or, in the case of the latest WSRF-based middleware, with optional Secure Sockets Layer (SSL) encryption (HTTPS). When SSL is used, both Clients require the user to specify a digital e-Science X.509 user certificate (or chain of certificates) in order to perform the Public Key Infrastructure (PKI) cryptography and thus encrypt/decrypt all of the data on the network. The RealityGrid middleware (WSRF) will also specify a list of authorised users and requires a valid username and optional password to be entered through the Client; these credentials are then inserted into a securely hashed Web Services Security (WSSE) header inside every individually despatched SOAP packet.

Figure 3. The RealityGrid PDA and Smartphone Clients both provide a mobile facility to query a Registry (left) and retrieve a list of currently deployed jobs (right). Once a job has been selected, each Client will then attach (connect) remotely to the appropriate scientific application running either on or off the Grid.

The .NET Compact Framework, which was used to build both of the handheld RealityGrid Clients, provided no built-in support for the required WSSE or SSL/PKI security functionality. As a result these essential elements had to be custom-developed for both Clients: entirely bespoke C# code to implement WSSE, and a relatively inexpensive third-party Application Programming Interface (API), known as SecureBlackBox [12], to implement SSL/PKI. Additional wireless (802.11) security can also be attained through the optional employment of Wired Equivalent Privacy (WEP) or Wi-Fi Protected Access (WPA/WPA2), both of which can be used in conjunction with the Clients’ own security provision (when using Wi-Fi) but are not relied upon and are therefore not discussed in any detail within the scope of this paper.
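
The paper states only that the username and password are carried in a ‘securely hashed’ WSSE header; the sketch below shows the standard WS-Security UsernameToken password-digest construction (Base64 of SHA-1 over nonce, timestamp and password) as one plausible realisation, using desktop .NET cryptography classes that may need substitutes on the Compact Framework.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Sketch of the standard WS-Security UsernameToken password digest; the Clients'
// exact hashing scheme is not specified in the paper and may differ.
public static class WsseHeader
{
    public static string Build(string username, string password)
    {
        byte[] nonce = new byte[16];
        new RNGCryptoServiceProvider().GetBytes(nonce);        // random per-message nonce
        string created = DateTime.UtcNow.ToString("s") + "Z";  // creation timestamp

        byte[] createdBytes = Encoding.UTF8.GetBytes(created);
        byte[] passwordBytes = Encoding.UTF8.GetBytes(password);

        // digest = SHA-1 over (nonce || created || password)
        byte[] toHash = new byte[nonce.Length + createdBytes.Length + passwordBytes.Length];
        Buffer.BlockCopy(nonce, 0, toHash, 0, nonce.Length);
        Buffer.BlockCopy(createdBytes, 0, toHash, nonce.Length, createdBytes.Length);
        Buffer.BlockCopy(passwordBytes, 0, toHash, nonce.Length + createdBytes.Length, passwordBytes.Length);
        byte[] digest = new SHA1Managed().ComputeHash(toHash);

        return
            "<wsse:Security xmlns:wsse=\"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd\">" +
            "<wsse:UsernameToken>" +
            "<wsse:Username>" + username + "</wsse:Username>" +
            "<wsse:Password>" + Convert.ToBase64String(digest) + "</wsse:Password>" +
            "<wsse:Nonce>" + Convert.ToBase64String(nonce) + "</wsse:Nonce>" +
            "<wsu:Created xmlns:wsu=\"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd\">" +
            created + "</wsu:Created>" +
            "</wsse:UsernameToken></wsse:Security>";
    }
}
```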
3.3 Finding Jobs on the Grid
When employing either of the RealityGrid PDA
or Smartphone Clients, the first task for the user
to perform (through the Client), prior to being
able to steer a simulation or explore a
visualisation, is to find their scientific jobs (the
location of which may not be known) on the
Grid. In RealityGrid, information describing
each actively deployed job (its start date/time,
creator ID, software package and most
importantly its location) is published to a central
(optionally secure) Web/Grid Service known as
the Registry (Refer to Figures 1 and 2). The
PDA and Smartphone Clients each provide the
user with a convenient facility to query a
RealityGrid Registry (of which there may be
several) in order to retrieve a list of currently
deployed (registered) jobs on the Grid (Refer to
Figure 3). The process of requesting, retrieving
and displaying job data is performed by both of
the Clients with an average latency of roughly a
quarter of a second, with an additional delay
incurred when an SSL handshake with a
Registry must first be performed prior to
requesting its job data. Once a
list of jobs has been successfully retrieved and
displayed, the researcher is able to then select
specific entries (jobs) to interact with on an
individual basis. The PDA or Smartphone
Client will then attempt to attach (connect)
through the RealityGrid middleware to the
appropriate scientific application (running either
on or off the Grid), in turn unlocking the more
advanced steering and visualisation-oriented
features of the handheld user interface software.
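
The structure of the Registry’s SOAP response is not given in the paper; the following sketch simply illustrates how a Client might hold and parse the returned job entries locally, with all element names hypothetical.

```csharp
using System.Collections.Generic;
using System.Xml;

// Hypothetical in-memory shape of a Registry entry; the real element names in
// the RealityGrid Registry response are not published in the paper.
public class GridJob
{
    public string SoftwarePackage;
    public string CreatorId;
    public string StartTime;
    public string Endpoint;     // location of the job's SWS/SGS or visualisation service
}

public static class RegistryParser
{
    public static List<GridJob> Parse(XmlDocument registryResponse)
    {
        List<GridJob> jobs = new List<GridJob>();
        // Assumed element name "entry": each entry describes one deployed job.
        foreach (XmlNode entry in registryResponse.GetElementsByTagName("entry"))
        {
            GridJob job = new GridJob();
            foreach (XmlNode child in entry.ChildNodes)
            {
                switch (child.LocalName)
                {
                    case "application": job.SoftwarePackage = child.InnerText; break;
                    case "creator":     job.CreatorId       = child.InnerText; break;
                    case "startTime":   job.StartTime       = child.InnerText; break;
                    case "address":     job.Endpoint        = child.InnerText; break;
                }
            }
            jobs.Add(job);
        }
        return jobs;
    }
}
```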
3.4 Real-Time Computational Steering
Once successfully attached to a running
simulation (having discovered its location from
a Registry) the PDA or Smartphone Client will
automatically display its built-in computational
steering user interface (Refer to Figure 4).

Figure 4. The PDA and Smartphone Clients provide a mobile user interface for monitoring and tweaking (steering) simulation parameters in real-time (left), additionally incorporating visual parameter value trend plotting capabilities (right) to aid understanding of how experiments change and evolve as they run.

Through their respective steering interfaces, each of the handheld RealityGrid Clients provides the researcher with an effective mobile facility to monitor and tweak (steer) simulation
parameters in real-time (either individually or as
part of a batch), as the simulation itself runs and
evolves concurrently on a remote computational
resource. In RealityGrid, individual parameters
are exposed from within a scientific application
through the project’s open-source computational
steering API. This level of exposure facilitates
internal simulation/application parameter data to
be accessed and manipulated (steered) remotely
via the RealityGrid PDA and Smartphone
Clients, through a representative middleware
interface known as the Steering Web Service
(SWS) (Refer to Figure 1); or in the case of the
earlier OGSI-based middleware: the Steering
Grid Service (SGS). The RealityGrid PDA and
Smartphone Clients, via remote interactions
through the SWS/SGS, are both additionally
able to provide mobile facilities for invoking
lower-level simulation control procedures such
as Pause/Resume and Stop/Restart.
Whilst attached to a simulation, the
PDA and Smartphone Clients will each poll its
representative SWS/SGS via SOAP message
exchanges at regular one second intervals, to
request and subsequently display ‘up to the
second’ simulation parameter state information;
with a full parameter update (request and
display) taking roughly two tenths of a second to complete.
The information returned to the Clients from the
RealityGrid middleware (in SOAP format)
individually describes all of the parameters that
have been exposed from a simulation using the
steering toolkit. Each parameter description
includes a unique identifier (or handle), current,
minimum and maximum values, data types, and
status flags denoting ‘steerable’, monitored-only
or internal to the application-embedded steering
library. In order to maximise usability during
the steering process, both user interfaces have
been developed to utilise a sophisticated
handheld device multi-threading scheme. This
has enabled each of the Clients to continuously
poll the steering middleware (for updated
parameter information) in an integral and
dedicated background thread, whilst
concurrently remaining active and responsive at
all times to capture user input events
(entering/tweaking parameter values, etc.) in a
completely separate user interface thread. Prior
to the implementation of this handheld device
multi-threading scheme, both Clients would
simply ‘lock up’ their user interfaces and
temporarily block all user inputs throughout the
duration of every regular update polling cycle.
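
A minimal sketch of this multi-threading scheme is given below: a background thread performs the blocking one-second polling cycle, and Control.Invoke marshals each result onto the user interface thread so that the form stays responsive to stylus and keypad input. Class and method names are illustrative, not the Clients’ actual code.

```csharp
using System;
using System.Threading;
using System.Windows.Forms;

// Sketch of the polling/display split: the poll thread never touches controls,
// and the UI thread never blocks on network I/O. FetchParameterXml() stands in
// for the Clients' real SOAP exchange with the SWS/SGS.
public class SteeringForm : Form
{
    private volatile bool polling;
    private string latestParameterXml;

    public void StartPolling()
    {
        polling = true;
        Thread pollThread = new Thread(new ThreadStart(PollLoop));
        pollThread.IsBackground = true;
        pollThread.Start();
    }

    public void StopPolling() { polling = false; }

    private void PollLoop()
    {
        while (polling)
        {
            latestParameterXml = FetchParameterXml();          // blocking SOAP call
            Invoke(new EventHandler(OnParametersFetched));     // hop onto the UI thread
            Thread.Sleep(1000);                                // one-second poll cycle
        }
    }

    private void OnParametersFetched(object sender, EventArgs e)
    {
        // Runs on the UI thread: safe to update list views, plots, etc.
        // e.g. ParseAndDisplay(latestParameterXml);
    }

    private string FetchParameterXml() { return "<Params/>"; } // placeholder
}
```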
3.5 Visual Graph Plotting
To enhance and extend the process of steering a
simulation through either of the handheld
RealityGrid Clients, both user interfaces also
incorporate a visual parameter value trend
plotting facility (Refer to Figure 4). Using a
plotted graph to visually represent parameter
value fluctuation trends was identified in the
initial Human Factors investigation [3] as
greatly improving the scientist’s perception and
understanding of their experiments. The two
handheld RealityGrid Clients will each allow
the researcher to visually plot historical records
for one user-specified parameter at a time. This
visual plotting functionality has been custom-built (for both Clients) using the .NET Compact
Framework’s standard Graphics Device
Interface API (known as GDI+). As with
computational steering, visual graph plotting for
both Clients is also performed in real-time,
allowing the researcher to visually observe how
their simulation changes and evolves
concurrently, either naturally over a period of
time or more often as a direct response to user-instigated computational steering activity.
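
As an illustration of the approach (not the Clients’ actual plotting code), a GDI+ trend plot for a single parameter history can be sketched as follows.

```csharp
using System.Collections.Generic;
using System.Drawing;
using System.Windows.Forms;

// Illustrative GDI+ trend plot: scales a history of parameter values to the
// control's client area and joins successive samples with line segments.
public class TrendPlot : Control
{
    private readonly List<float> history = new List<float>();

    public void AddSample(float value)
    {
        history.Add(value);
        Invalidate();                       // repaint with the new point included
    }

    protected override void OnPaint(PaintEventArgs e)
    {
        base.OnPaint(e);
        if (history.Count < 2) return;

        // Find the value range so the trace fills the vertical axis.
        float min = history[0], max = history[0];
        foreach (float v in history) { if (v < min) min = v; if (v > max) max = v; }
        float range = (max - min) == 0 ? 1 : (max - min);

        using (Pen pen = new Pen(Color.Blue))
        {
            for (int i = 1; i < history.Count; i++)
            {
                float x0 = (i - 1) * (float)Width / (history.Count - 1);
                float x1 = i * (float)Width / (history.Count - 1);
                float y0 = Height - ((history[i - 1] - min) / range) * Height;
                float y1 = Height - ((history[i] - min) / range) * Height;
                e.Graphics.DrawLine(pen, (int)x0, (int)y0, (int)x1, (int)y1);
            }
        }
    }
}
```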
Figure 5. The handheld RealityGrid Clients each offer the researcher an inclusive provision of advanced,
visualisation-oriented interaction facilities, which (up until this point) would only typically otherwise
have been accessible through conventional desktop (or laptop)-based user interface applications.
3.6 Handheld Interactive 3D Visualisation
The most advanced feature of the RealityGrid
PDA and Smartphone Clients is their highly
novel ability to each support user-interactive 3D
visualisation (Refer to Figures 5 and 6). Both of
the handheld RealityGrid Clients have been
developed specifically for use with the authors’
Lightweight Visualisation framework, allowing
them to each offer researchers an inclusive
provision of advanced, visualisation-oriented
interaction facilities, which (up until this point)
would only typically otherwise have been
accessible through conventional desktop (or
laptop)-based user interface applications. Using
Lightweight Visualisation’s middleware and its
accompanying application-embedded module
(Refer to Figure 2), the handheld RealityGrid
Clients are both able to communicate user-captured interaction commands (pan, zoom,
rotate, etc.) directly into a remote visualisation
application, and in response receive visual
feedback through the form of pre-rendered,
scaled, encoded and streamed image (frame)
sequences; processed and displayed locally
within the Clients (using GDI+) at an average
frequency of 7-10 frames-per-second. Thus
Lightweight Visualisation is able to achieve an
advanced level of 3D visualisation interactivity
within each of the Clients by shielding their
respective resource-constrained handheld
devices from having to perform any graphically-demanding operations (modelling, rendering,
etc.) at the local user interface hardware level.
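
The wire format between Server and Client is not specified in the paper; the sketch below assumes, purely for illustration, that each pre-rendered frame arrives as a length-prefixed encoded image that can be decoded into a Bitmap for GDI+ display in the background receive thread.

```csharp
using System.Drawing;
using System.IO;

// Sketch of a frame-receiving loop. The 4-byte big-endian length prefix and the
// encoded (e.g. JPEG) payload are assumptions about the wire format, which the
// paper does not document.
public class FrameReceiver
{
    private readonly Stream stream;     // connected NetworkStream (or SSL-wrapped stream)

    public FrameReceiver(Stream connectedStream) { stream = connectedStream; }

    public Bitmap ReceiveFrame()
    {
        byte[] header = ReadExactly(4);
        int length = (header[0] << 24) | (header[1] << 16) | (header[2] << 8) | header[3];
        byte[] encoded = ReadExactly(length);
        // Decode the served frame into a Bitmap for GDI+ display on the handheld screen.
        return new Bitmap(new MemoryStream(encoded));
    }

    private byte[] ReadExactly(int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read <= 0) throw new IOException("connection closed by server");
            offset += read;
        }
        return buffer;
    }
}
```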
The process by which the handheld
RealityGrid Clients connect and interact
remotely with a visualisation (via Lightweight
Visualisation) is slightly more involved than for
steering a simulation through the SWS/SGS.
Having found and subsequently selected a
visualisation application/job from a Registry
(Refer to Figure 3), the researcher’s PDA or
Smartphone Client will then initiate a two-stage
attaching sequence (Refer to Figure 2): firstly
acquiring a remote host network endpoint from
the Lightweight Visualisation Web Service;
secondly establishing a direct socket channel
(using the acquired addressing information) to
the visualisation’s representative Lightweight
Visualisation Server. The Server is responsible
for managing simultaneous Client connections,
communicating received interaction commands
directly into the visualisation application and
encoding/serving pre-rendered images back to
the Client(s) for display. All communication
between the Client and the Server (message
passing/image serving) is performed using
Transmission Control Protocol/Internet Protocol
(TCP/IP) with optional SSL data protection.
The use of TCP/IP as the base networking
protocol within Lightweight Visualisation was
deemed preferable to the higher-level HTTP, in
order to eliminate undesirable transmission and
processing latencies incurred through the
marshalling of large quantities of binary image
data within SOAP-formatted message packets.
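
A simplified sketch of this two-stage attaching sequence follows; the endpoint-lookup call and the host/port values are placeholders, and the optional SSL layer (provided by SecureBlackBox in the real Clients) is omitted here.

```csharp
using System.IO;
using System.Net.Sockets;

// Sketch of the two-stage attach: (1) ask the Lightweight Visualisation Web Service
// for the Server's network endpoint, (2) open a direct TCP channel to that endpoint.
// GetEndpointFromWebService() stands in for the SOAP lookup described in the paper.
public class VisualisationAttacher
{
    public Stream Attach(string webServiceUrl)
    {
        // Stage 1: endpoint discovery via the Lightweight Visualisation Web Service.
        string host;
        int port;
        GetEndpointFromWebService(webServiceUrl, out host, out port);

        // Stage 2: direct socket connection to the Lightweight Visualisation Server.
        TcpClient client = new TcpClient();
        client.Connect(host, port);
        return client.GetStream();          // hand this stream to the frame receiver
    }

    private void GetEndpointFromWebService(string url, out string host, out int port)
    {
        // Placeholder values standing in for the SOAP request/response exchange.
        host = "visualisation.example.org";
        port = 9000;
    }
}
```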
Once attached to a visualisation, user
interactions through the handheld Clients adopt
similar, standard conventions to those of the
familiar desktop model: dragging the stylus
across the screen (as opposed to the mouse) or
pressing the directional keypad (as opposed to
the keyboard cursor keys) in order to explore a
visualised model, and selecting from a series of
defined menu options and dialog windows in
order to access any application-specific features
of the remote visualisation software (Refer to
Figure 6). As with steering, handheld device multi-threading has again been implemented within the Clients’ 3D visualisation front-ends, enabling them to both continuously receive and display served image streams in a dedicated background thread, whilst constantly remaining active and responsive to capture all user input activity in a separate and concurrently running user interface thread.

Figure 6. When using the RealityGrid PDA or Smartphone Clients (left), researchers who employ VMD [13] on a daily basis are able to access and interact (through their handheld device) with familiar, indigenous user interface facilities from their everyday desktop or workstation application (right).
3.7 Tailor-Made User Interaction
RealityGrid has delivered Grid-enabling support
for a wide range of existing and heterogeneous
scientific visualisation codes. These currently
include commercial and open-source codes that
have been successfully instrumented (adapted)
for RealityGrid using the toolkit, as well as
purely bespoke codes that have been written
within the project to fulfil a specific scientific
role; usually acting as an on-line visualisation
counterpart to a bespoke simulation application.
Each of RealityGrid’s numerous visualisation
applications typically provides its own unique
assemblage of dedicated user interaction
services, which are also often geared
specifically towards a particular, individual field
of scientific research (fluid dynamics, molecular
dynamics, etc.). At the heart of Lightweight
Visualisation is its ability to accommodate this
level of user-interactive diversity by ‘tapping
into’ any given visualisation’s indigenous user
facilities (through a small embedded code
module or script) (Refer to Figure 2) and then
exposing them (similar to steerable parameters)
so that they can be accessed and invoked
remotely by a specially built (or Lightweight
Visualisation-enabled) Client; in this instance
the RealityGrid PDA and Smartphone Clients.
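
Purely as an illustration of this idea, a Client-side call that forwards an application-specific (VMD-style) command over the established channel might look as follows; the command vocabulary and message framing are hypothetical, as the actual Client-Server protocol is not published in this paper.

```csharp
using System.IO;
using System.Text;

// Illustrative only: the framing (one UTF-8 text line per command) and the command
// names are assumptions, intended to show how an exposed, application-specific
// facility could be invoked remotely once a channel has been established.
public static class VisualisationCommands
{
    public static void SendCommand(Stream channel, string commandName, string argument)
    {
        byte[] message = Encoding.UTF8.GetBytes(commandName + " " + argument + "\n");
        channel.Write(message, 0, message.Length);
        channel.Flush();
    }
}

// Hypothetical usage: switch the remote VMD display to a different representation.
// VisualisationCommands.SendCommand(channel, "set-representation", "NewCartoon");
```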
To properly demonstrate the benefits
of this ‘tapping into’ approach, the RealityGrid
PDA and Smartphone Clients have both initially
been developed to provide tailor-made handheld
front-ends for the Visual Molecular Dynamics
(VMD) [13] application (Refer to Figures 5 and
6). VMD is an open-source visualisation code
that has been employed extensively within
RealityGrid projects such as STIMD [6] and
SPICE [7]; it has also provided the initial testbed application for developing the Lightweight
Visualisation framework. When using the 3D
visualisation user interfaces of the RealityGrid
PDA or Smartphone Clients, researchers who
employ the VMD code on a daily basis are able
to access and interact (through their handheld
device) with familiar, indigenous user interface
facilities (in addition to the generic pan, zoom,
rotate, etc.) from their everyday desktop or
workstation application (Refer to Figure 6).
4. Conclusion and Future Directions
In this paper the authors have presented the
RealityGrid PDA and Smartphone Clients;
offering an insight into how these types of
handheld user interface applications (despite
their perceived inherent limitations, both in
terms of hardware and software capability) can
now be effectively engineered to deliver real,
beneficial scientific usability (steering, 3D
visualisation, security, etc.) and provide flexible
‘around the clock’ user access to the Grid by
complementing and extending the more
traditional desk-based methods of e-Science
user interaction. Through the creation of these
two handheld user interfaces, scientists within
the RealityGrid project have been able to take
advantage of a highly convenient means for
readily accessing and interacting with their
Grid-based research experiments, which has
proved to be highly beneficial, particularly when
scientific jobs have to be dealt with outside of
working hours (often due to delayed scheduling
allocations or lengthy compute-time periods) or
away from the conventional desk/office-based
working environment: during a meeting or
conference, at home in the evenings, or away at
weekends.
Future avenues of development for the
RealityGrid PDA and Smartphone Clients
currently include creating additional tailor-made
handheld front-ends for alternative visualisation
codes (and scientists who don’t use VMD).
Development is also currently under way to
integrate a handheld user interface for
RealityGrid’s recently released Application
Hosting Environment (AHE) [14], which will
introduce invaluable new resource selection,
application launching and job management
capabilities. Further endeavours (outside of
RealityGrid) are also currently geared towards
deploying this developed handheld technology
as part of a trial with the University Hospitals of
Leicester NHS Trust (UHL NHS Trust), in
which it will be employed by surgeons and
medical consultants to view and interact with
Magnetic Resonance Imaging (MRI) and
Computed Tomography (CT) scan data, whilst
being away from the conventional office or
desk-based working environment.
Acknowledgement
The authors wish to thank the dedicated teams
of scientists within the RealityGrid project
(University College London, University of Manchester, University of Oxford, Loughborough University, Edinburgh Parallel Computing Centre and Imperial College). Special thanks also go to Simon Nee (formerly of Loughborough University) and to Andrew Porter of Manchester Computing for their invaluable additional support and advice.
References

[1] RealityGrid, http://www.realitygrid.org
[2] J. Redfearn, Ed. (2005, Sept.). “RealityGrid – Real science on computational Grids.” e-Science 2005: Science on the Grid. [On-line]. pp. 12-13. Available: http://www.epsrc.ac.uk/CMSWeb/Downloads/Other/RealityGrid2005.pdf
[3] R.S. Kalawsky and S.P. Nee. (2004, Feb.). “e-Science RealityGrid Human Factors Audit – Requirements and Context Analysis.” [On-line]. Available: http://www.avrrc.lboro.ac.uk/hfauditRealityGrid.pdf
[4] D. Bradley. (2004, Sept.). “RealityGrid – Real Science on computational Grids.” e-Science 2004: The Working Grid. [On-line]. pp. 12-13. Available: http://www.epsrc.ac.uk/CMSWeb/Downloads/Other/RealityGrid2004.pdf
[5] R.J. Blake, P.V. Coveney, P. Clarke and S.M. Pickles, “The TeraGyroid Experiment – Supercomputing 2003.” Scientific Programming, vol. 13, no. 1, pp. 1-17, 2005.
[6] P.W. Fowler, S. Jha and P.V. Coveney, “Grid-based steered thermodynamic integration accelerates the calculation of binding free energies.” Philosophical Transactions of the Royal Society A, vol. 363, no. 1833, pp. 1999-2015, August 15, 2005.
[7] S. Jha, P. Coveney and M. Harvey. (2006, June). “SPICE: Simulated Pore Interactive Computing Environment – Using Federated Grids for “Grand Challenge” Biomolecular Simulations,” presented at the 21st Int. Supercomputer Conf., Dresden, Germany, 2006. [On-line]. Available: http://www.realitygrid.org/publications/spice_isc06.pdf
[8] TeraGrid, http://www.teragrid.org
[9] M. Schneider. (2004, April). “Ketchup on the Grid with Joysticks.” Projects in Scientific Computing Annual Research Report, Pittsburgh Supercomputing Center. [On-line]. pp. 36-39. Available: http://www.psc.edu/science/2004/teragyroid/ketchup_on_the_grid_with_joysticks.pdf
[10] S.M. Pickles, R. Haines, R.L. Pinning and A.R. Porter, “A practical toolkit for computational steering.” Philosophical Transactions of the Royal Society A, vol. 363, no. 1833, pp. 1843-1853, August 15, 2005.
[11] WSRF::Lite/OGSI::Lite, http://www.sve.man.ac.uk/Research/AtoZ/ILCT
[12] SecureBlackBox, http://www.eldos.com/sbb
[13] W. Humphrey, A. Dalke and K. Schulten, “VMD – Visual Molecular Dynamics”, J. Molec. Graphics, vol. 14, pp. 33-38, 1996.
[14] P.V. Coveney, S.K. Sadiq, R. Saksena, S.J. Zasada, M. Mc Keown and S. Pickles, “The Application Hosting Environment: Lightweight Middleware for Grid Based Computational Science,” presented at TeraGrid ’06, Indianapolis, USA, 2006. [On-line]. Available: http://www.realitygrid.org/publications/tg2006_ahe_paper.pdf