ICENI: A Grid Middleware
for the e-Scientist
Dr Steven Newhouse
Technical Director, London e-Science Centre
Department of Computing, Imperial College
London e-Science Centre
• Specialising in applied Grid Middleware
• Centre Projects
– E-Science Portal at Imperial College (Sun)
– Grid Infrastructures (Compusys & Intel)
• High Energy Physics
– GridPP & EU DataGrid testbeds
• EPSRC pilot projects
– Reality Grid
– DiscoveryNet
• Distributed Protein Annotation Grid (BBSRC)
What is an e-Scientist?
• Applied Scientists are becoming e-scientists:
– computational & data services
• HPC, genetic databases, electronic journals
– remote sensors
• Large Hadron Collider, SOHO
– personal devices
• workstations, mobile phones, PDAs
– supporting collaborative working
• ‘collaboratories’: global scientific collaborative communities
Urgent need for an integrated environment
What is Grid Middleware?
• The computational infrastructure that enables e-science
• Provides secure resource sharing & coordinated problem solving in dynamic, multi-institutional virtual organisations (Foster et al.)
ICENI
The Iceni, under Queen Boudicca, united the tribes of South-East England in a revolt against the occupying Roman forces in AD 60.
• IC e-Science Networked Infrastructure
• Developed by the LeSC Grid Middleware Group
• Collects and provides relevant Grid meta-data
• Used to define and develop higher-level services
• Prototypes web service protocols
ICENI Architecture
[Architecture diagram: within a private administrative domain, an Identity Manager, Domain Manager, Policy Manager and Resource Manager govern computational (CR), storage (SR), software and network resources, reached through JavaCoG/Globus. A gateway links the private domain to public computational communities, where a Resource Broker, Application Mapper, Resource Browser, Application Portal and application/component design tools operate, with external access via a Web Services Gateway.]
Contents
• Grid Enabled Component Frameworks
– How e-scientists will create applications on the Grid
• Portals
– How e-scientists will interact with the Grid
• Collaborative Visualisation
– How e-scientists will work together on the Grid
• Grid Middleware
– The software infrastructure that enables this activity
• The Future…
Grid Enabled Component
Framework
• Goals:
– Promote component reuse and sharing
– Simplify application construction
– Enable deployment to diverse Grid resources
• Component Repository
– Browse the meta-data within the component
• Ports – used to connect components
• Implementations – where the component can run
• Partition the roles within scientific computing
– Numerical Developer: Create components
– Scientific Developer: Add domain knowledge
– End User: Visual Programming Paradigm
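The port and implementation meta-data held in the component repository can be sketched as a plain Java class; the names below (ComponentMeta, Port, and so on) are illustrative assumptions, not ICENI's actual API:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the meta-data a component repository might expose
// for browsing: ports (how components connect) and implementations
// (where a component can run). Not ICENI's real interfaces.
public class ComponentMeta {
    public record Port(String name, String dataType, boolean isInput) {}

    private final String name;
    private final List<Port> ports;                    // connection points
    private final Map<String, String> implementations; // language -> resource requirement

    public ComponentMeta(String name, List<Port> ports, Map<String, String> implementations) {
        this.name = name;
        this.ports = ports;
        this.implementations = implementations;
    }

    /** True if one of this component's output ports type-matches an input port of the other. */
    public boolean canConnectTo(ComponentMeta other) {
        return ports.stream().filter(p -> !p.isInput())
                .anyMatch(out -> other.ports.stream()
                        .anyMatch(in -> in.isInput() && in.dataType().equals(out.dataType())));
    }

    public Map<String, String> implementations() { return implementations; }
}
```

A resource browser in this scheme would simply list each component's ports and implementations; a visual programming tool would use `canConnectTo` to decide which components may be wired together.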
Example: Linear Solver
[Component diagram, built up over four slides: a Linear Equation Source component exposes DoF, Matrix and Vector ports; its Matrix and Vector outputs connect to a Linear Equation Solver, whose Vector output connects to a Display Vector component. Later slides reveal the implementations behind each component: the source as an Unsymmetric Matrix generator (C and Java), and the solver as LU (C, ScaLAPACK) or BiCG (Java, ScaLAPACK).]
Higher-level Services
• Component framework provides:
– Rich application meta-data
– Decoupled component definition and implementation
• Application Mapper:
– Exploit performance information to map component
implementation to the ‘best’ resources
• Resource Broker:
– Resource selection through user defined policies:
• Minimise cost using computational economics
• Minimise execution time using the application mapper
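This policy-driven selection can be sketched as follows, assuming each candidate (implementation, resource) pairing carries a predicted cost and execution time; all class and method names here are hypothetical:

```java
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch of a resource broker choosing among candidate
// (implementation, resource) pairings under a user-defined policy.
public class ResourceBroker {
    public record Candidate(String implementation, String resource,
                            double predictedCost, double predictedSeconds) {}

    public enum Policy { MINIMISE_COST, MINIMISE_TIME }

    /** Pick the candidate that best satisfies the policy. */
    public static Candidate select(List<Candidate> candidates, Policy policy) {
        Comparator<Candidate> order = (policy == Policy.MINIMISE_COST)
                ? Comparator.comparingDouble(Candidate::predictedCost)
                : Comparator.comparingDouble(Candidate::predictedSeconds);
        return candidates.stream().min(order)
                .orElseThrow(() -> new IllegalArgumentException("no candidates"));
    }
}
```

In the full system, the predicted times would come from the application mapper's performance information rather than being supplied directly.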
Web Portals
• Handheld wireless devices are becoming ubiquitous
– Personal Digital Assistants
– Mobile Phones
• Interaction with e-science infrastructures
– Any time
– Any place
– Anywhere
• Goal: Provide secure ‘one stop shop’ for e-science
EPIC: e-Science Portal at
Imperial College
• Collaborative LeSC industrial project with
Sun Microsystems
• Develop a secure portal infrastructure providing:
– Access to your own personal environment
– Applications to support day-to-day e-science
– Interaction with other Grid infrastructures
• Allow role based access to resources
– Anonymous: public web pages
– Students: internal pages, email, compute resources
– Staff: restricted pages
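The role-based rules above can be sketched as a simple mapping; the roles and resource names follow the slide, while the class itself is a hypothetical illustration, not EPIC's implementation:

```java
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of EPIC-style role-based access: each role maps
// to the set of portal resources it may reach.
public class PortalAccess {
    private static final Map<String, Set<String>> ROLE_RESOURCES = Map.of(
            "anonymous", Set.of("public-pages"),
            "student", Set.of("public-pages", "internal-pages", "email", "compute"),
            "staff", Set.of("public-pages", "internal-pages", "email", "compute",
                            "restricted-pages"));

    public static boolean mayAccess(String role, String resource) {
        // Unknown roles get no access by default.
        return ROLE_RESOURCES.getOrDefault(role, Set.of()).contains(resource);
    }
}
```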
Accessing e-science applications
• Submarine Design
– Dr Ian Mathews, Imperial College
– Parameter exploration to find optimal design
• Mesoscale Materials Analysis
– Professor Peter Coveney, QMUL, London
– Remote access to Lattice Boltzmann simulation (LB3D)
• Technical Compute Portal
– Wolfgang Gentzsch & Dan Fraser (Sun Microsystems)
– Web based access to Sun’s Grid Engine
Collaborative Visualisation
• Computational simulations are:
– Complex: multi-disciplinary & multi-physics
– Expensive: high-end compute & storage resources
– Lengthy: require many processors for many hours
• Need to maximise the return on investment
– Alter configuration parameters during execution
– Extract results during execution
– Visualise results with remote collaborators
Steering & Visualisation Architecture
[Architecture diagram, built up over four slides: a Visualisation Server registers with a Jini Lookup Service; an Application Component publishes its data to the server; a Visualisation Client discovers the server via the lookup service and connects to it; finally, a Rendering Engine (e.g. VTK or Chromium) renders the data for the client.]
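The discovery pattern in this architecture, where services register with a lookup service and clients find them by type, can be sketched with an in-memory stand-in; the real Jini lookup service is a remote, lease-based federation with a different API:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified in-memory stand-in for a Jini-style lookup service:
// services register themselves, clients look them up by interface type.
// Illustrative only; not the net.jini API.
public class LookupService {
    private final List<Object> services = new ArrayList<>();

    public void register(Object service) {
        services.add(service);
    }

    /** Return the first registered service implementing the given type, or null. */
    public <T> T lookup(Class<T> type) {
        for (Object s : services) {
            if (type.isInstance(s)) return type.cast(s);
        }
        return null; // nothing matching registered yet
    }
}
```

In the diagrammed system, the Visualisation Server registers itself here, and a Visualisation Client would call something like `lookup(VisualisationServer.class)` to find it.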
Distributed Steering & Visualisation
[Diagram: an Application Component sends datasets A & B to a Visualisation Server. Rendering Engine 1 serves dataset A to Visualisation Clients 1 and 2, which share the same view of dataset A; Rendering Engine 2 serves dataset B to Visualisation Client 3. Each component is positioned to minimise communication cost and maximise performance.]
Open Grid Services Architecture
(OGSA)
• Initiated by the Globus team with IBM
• Leverage e-commerce software within e-science
• Utilise web service protocols and tool base:
– UDDI: Universal Description, Discovery and Integration
– WSDL: Web Services Definition Language
– SOAP: Simple Object Access Protocol
• Refactor Globus toolkit to use web services
• OGSA supports transient & stateful web services
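OGSA's key departure from plain web services, transient and stateful service instances created through a factory, can be sketched as follows; the class and method names are hypothetical, not the actual OGSA interfaces:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of the OGSA factory pattern: each create() call
// yields a transient service instance with its own state, identified by
// a handle and destroyable when no longer needed.
public class GridServiceFactory {
    public static class GridServiceInstance {
        private int state = 0;                 // per-instance state
        public int increment() { return ++state; }
    }

    private final Map<String, GridServiceInstance> instances = new HashMap<>();
    private final AtomicInteger nextId = new AtomicInteger();

    /** Create a transient instance; the returned handle identifies it. */
    public String create() {
        String handle = "instance-" + nextId.incrementAndGet();
        instances.put(handle, new GridServiceInstance());
        return handle;
    }

    public GridServiceInstance resolve(String handle) {
        return instances.get(handle);
    }

    public void destroy(String handle) {
        instances.remove(handle);
    }
}
```

A conventional web service is a single stateless endpoint; here, each client conversation gets its own instance, which is what makes per-job or per-session grid services possible.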
Grid Service Contracts
[Diagram, built up over three slides: a DRMAA Resource registers with a Jini Lookup Service, advertising a contract (User:A+B, Duration:1hr). DRMAA Clients for users A and B discover the resource, either directly through the lookup service or via a Resource Browser, and each acquires its own contract with it (e.g. User:A, Duration:10m).]
OGSA & Jini Integration
[Diagram, built up over four slides: a Gateway Manager bridges a GSI-enabled Web Service Hosting Environment and the Jini Lookup Service. It exposes a WSDL Interface to web service clients and a Jini Client Interface to the Jini federation. Incoming GSI + SOAP connections are translated (SOAP to Java) with the user's identity preserved, so that web service clients can take out contracts on DRMAA Resources (e.g. User:A, Duration:10m; User:A+B, Duration:1hr).]
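The Gateway Manager's job, accepting an authenticated external request and translating it into a call on a locally discovered service, can be sketched like this; it is a plain-Java stand-in, whereas the real gateway speaks GSI-secured SOAP on one side and Jini on the other:

```java
import java.util.Map;

// Simplified stand-in for the Gateway Manager: an external request arrives
// with the caller's identity attached and is dispatched to a local service.
public class GatewayManager {
    /** Local service contract; real DRMAA resources are reached via Jini. */
    public interface DrmaaResource {
        String submit(String user, String job);
    }

    private final Map<String, DrmaaResource> resources;

    public GatewayManager(Map<String, DrmaaResource> resources) {
        this.resources = resources;
    }

    /** Translate an external (e.g. SOAP-carried) request into a Java call. */
    public String handle(String user, String resourceName, String job) {
        DrmaaResource r = resources.get(resourceName);
        if (r == null) throw new IllegalArgumentException("unknown resource: " + resourceName);
        return r.submit(user, job); // the user's identity travels with the call
    }
}
```

The essential point the diagram makes survives in the sketch: the gateway is the single crossing point between the public web service world and the private Jini world, and identity information is carried across it.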
Access Grid Credits
• Producer: Stephen McGough
• Node Operator: Marko Krznaric
• Demonstrators:
– Nathalie Furmento
– William Lee
– James Stanton
– Asif Saleem
London e-Science Centre
• Director: Professor John Darlington
• Technical Director: Dr Steven Newhouse
• Research Staff:
– Nathalie Furmento, Stephen McGough
– Anthony Mayer, James Stanton
– Yong Xie, William Lee
– Marko Krznaric, Murtaza Gulamali
– Asif Saleem, Laurie Young
• Contact:
– http://www.lesc.ic.ac.uk/
– e-mail: lesc@ic.ac.uk