e-Science, the Grid
& Southampton University
Andy Keane
Director, Southampton Regional
e-Science Centre
ajk@soton.ac.uk
e-Science and the Grid
‘e-Science is about global collaboration in
key areas of science, and the next
generation of infrastructure that will
enable it.’
John Taylor
Director General of Research Councils,
Office of Science and Technology
Grid Computing
The Grid Problem
“Flexible, secure, coordinated resource sharing
among dynamic collections of individuals,
institutions, and resources”
• Enable communities (“virtual organizations”) to share
geographically distributed resources as they pursue
common goals - assuming the absence of …
– central location
– central control
– omniscience
– existing trust
Why Grids?
• A biochemist exploits 10,000 computers to
screen 100,000 compounds in an hour
• 1,000 physicists worldwide pool resources
for peta-op analyses of petabytes of data
• Climate scientists visualize, annotate, &
analyze terabyte simulation datasets
• Engineers at a multinational company
collaborate on the design of a new product
• An insurance company mines data from
partner hospitals for fraud detection
• An application service provider offloads
excess load to a compute cycle provider
Network for Earthquake
Engineering Simulation
• NEESgrid: national
infrastructure to couple
earthquake engineers with
experimental facilities,
databases, computers, &
each other
• On-demand access to
experiments, data
streams, computing,
archives, collaboration
NEESgrid: Argonne, Michigan, NCSA, UIUC, USC
Grids: Why Now?
• Moore’s law → highly functional end-systems
• Ubiquitous Internet → universal connectivity
• Network exponentials produce dramatic
changes in geometry and geography
– 9-month doubling: double Moore’s law!
– 1986-2001: x340,000; 2001-2010: x4000?
• New modes of working and problem solving
emphasize teamwork, computation
• New business models and technologies
facilitate outsourcing
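The doubling-time figures on this slide can be sanity-checked with simple arithmetic; the sketch below is illustrative only (the quoted x340,000 is an empirical figure, not a clean power of two, so a 9-month model gives the right order of magnitude rather than an exact match).

```python
# Illustrative arithmetic only (not from the original slides): comparing a
# 9-month network doubling time with Moore's-law 18-month doubling.

def growth_factor(years: float, doubling_months: float) -> float:
    """Total multiplicative growth over `years` for a given doubling period."""
    return 2 ** (years * 12 / doubling_months)

# Network capacity, 1986-2001 (15 years), doubling every 9 months:
net = growth_factor(15, 9)    # 2**20 = 1,048,576 -- same order of magnitude
                              # as the quoted x340,000 (real growth was uneven)

# Processor performance over the same 15 years, doubling every 18 months:
cpu = growth_factor(15, 18)   # 2**10 = 1,024

print(f"network: x{net:,.0f}  cpu: x{cpu:,.0f}")
```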
Elements of the Problem
• Resource sharing
– Computers, storage, sensors, networks, …
– Heterogeneity of device, mechanism,
policy
– Sharing conditional: negotiation, payment,
…
• Coordinated problem solving
– Integration of distributed resources
– Compound quality of service requirements
• Dynamic, multi-institutional virtual
organisations
– Dynamic overlays on classic org structures
– Map to underlying control mechanisms
http://www.globus.org/research/papers/anatomy.pdf
The Grid World: Current Status
• Dozens of major Grid projects in scientific
& technical computing/research & education
– Deployment, application, technology
• Some consensus on key concepts and
technologies
– Open source Globus Toolkit™ a de facto
standard for major protocols & services
– Far from complete or perfect, but out there,
evolving rapidly, and large tool/user base
• Global Grid Forum a significant force
• Industrial interest emerging rapidly
UK e-Science Projects
UK e-Science Initiative
• £120M Programme over 3 years
• £75M is for Grid Applications in all
areas of science and engineering
• £10M for Supercomputer upgrade
• £35M ‘Core Program’ to encourage
development of generic ‘industrial
strength’ Grid middleware
– Requires £20M additional ‘matching’
funds from industry
EPSRC e-Science Projects (1)
• Comb-e-Chem: Structure-Property
Mapping
– Southampton, Bristol, Roche, Pfizer, IBM
• DAME: Distributed Aircraft Maintenance
Environment
– York, Oxford, Sheffield, Leeds, Rolls Royce
• Reality Grid: A Tool for Investigating
Condensed Matter and Materials
– QMW, Manchester, Edinburgh, IC,
Loughborough, Oxford, Schlumberger, …
EPSRC e-Science Projects (2)
• My Grid: Personalised Extensible Environments
for Data Intensive in silico Experiments in
Biology
– Manchester, EBI, Southampton, Nottingham,
Newcastle, Sheffield, GSK, AstraZeneca, IBM, Sun
• GEODISE: Grid Enabled Optimisation and Design
Search for Engineering
– Southampton, Oxford, Manchester, BAE, Rolls Royce
• Discovery Net: High Throughput Sensing
Applications
– Imperial College, Infosense, …
UK Grid ‘Core Program’
1. Grid ‘Core Program’ e-Science
Centres → UK e-Science Grid
2. Development of Generic Grid
Middleware
3. ‘IRC’ Grand Challenge Project
4. Support for e-Science Projects
5. International Involvement
6. Grid Network Team
UK e-Science Grid
[Map of UK e-Science Grid sites: Edinburgh & Glasgow (NeSC), Newcastle, Belfast, DL, Manchester, Cambridge, Oxford, Cardiff, RAL, London, Hinxton, Southampton]
e-Science Centre
Middleware Projects
• Each centre has £1M to fund industry
facing middleware projects (NeSC £3M)
• Projects require 50% resourcing from
industry
• Projects should aim to exploit Grid
technology to enable e-Science-based
working
• £5M for ‘Open Call’ projects
IRC ‘Grand Challenge’ Project
• Equator: Technological
innovation in physical and
digital life (led by Notts)
• AKT: Advanced Knowledge
Technologies (led by Soton)
• DIRC: Dependability of
Computer-Based Systems (led
by N’castle)
• MIAS: From Medical Images
and Signals to Clinical
Information (led by Oxford)
Support for e-Science Projects
• ‘Grid Starter Kit’ Version 1.0
- available for distribution from July 2001
- maintain library of Open Source Grid m/w
• Grid Support Centre in operation
- leads Grid Engineering Group
- supports users
• Training Courses
- first courses given
• National e-Science Institute Research Seminar
Programme
• Architecture, database and engineering task
forces
Southampton Regional
e-Science Centre
http://www.e-science.soton.ac.uk
Soton e-Science Centre
Management Board
• Objectives
– Ensure there is
executive attention to
key e-Science Centre
activities
– Set priorities for the e-Science Centre
activities
– Make key decisions on
e-Science centre
process, resourcing and
standards
– Communicate
accomplishments, issues
and intentions to senior
management
• Members
– Andy Keane
– Simon Cox
– Dave de Roure
– Pete Hancock
– Mike Chrystall
• Links
– http://www.escience.soton.ac.uk
– http://www.nesc.ac.uk
Soton e-Science Centre
Management Board
• Scope
– Activities within the
Southampton e-Science
Centre
– Resourcing within the
Southampton e-Science
Centre
– GRID development
supporting Southampton
e-Science
– Regional Outreach
• Initial Agenda
– Communications
– GRID deployment
– Recruiting
– Accommodation
– Access Grid Node
– Establishing Regional Community
Soton e-Science Centre
Focus
• To support a number of industry facing Grid
projects.
• To engage diverse industrial sectors.
• To make use of large scale HPC facilities.
• To make use of knowledge based technologies.
• Involvement with Applications working group at
Global Grid Forum and Industry Forum.
• Advanced Knowledge Technologies IRC led from
Southampton
• Report to University level e-Science committee.
• Support trans-national projects, e.g., GRIA.
Soton e-Science Centre
Facilities
• Dedicated office space to house six staff
initially, conversion of e-Science building to
house up to 20 underway – SRIF funds.
• Small Linux cluster for public use across
the Grid, operated in collaboration with
SUCS, now running.
• Access to a large (324-node) Linux cluster
for partners, in collaboration with SUCS.
• Access grid node operational.
Soton e-Science Centre
TimeLine
• Initial grant agreed August 2001, dedicated
building available June 2002.
• Centre Manager recruited Sept. 2001 and
Technical Manager April 2002.
• Initial test-bed staff recruited for three
projects which are now well established:
– Geodise, Mygrid, Comb-e-chem
• Centre project recruiting underway:
– G-Ship, GEM
• Further Centre projects being developed:
– G-Civil, G-Yacht, G-Risk
Ongoing Activities
• Delivery of G-Ship and GEM projects.
• Development of further project specifications
taking place:
– G-Civil will allow on-site instrumentation to be accessed
via the Grid.
– G-Yacht to provide America’s Cup teams with new ways of
collaborative working.
– G-Risk to give civil engineering contractors web service
based risk analysis tools (Anglo-Dutch consortium).
• University Strategic Plan being approved.
• Conversion of new premises at cost of £1.5M –
available June 2002.
• Refinement of the public Linux service underway.
Southampton Regional
e-Science Centre
G-Ship Project
G-Ship Project
• Working with existing portfolio of
commercial ship motion prediction
software.
• Migrating this software into a Grid
services model.
• Aim to provide consulting naval architects
with state-of-the-art collaborative tools
accessed via the Grid.
G-Ship Project
• Provision of simple sign on, data entry, job
submission and workflow control services.
• CAD geometry input service, capable of accepting
a variety of entry formats.
• Automated mesh generation service based on
design topology.
• Parallel processing of solutions (over distributed
resources, via OGSA compliant technology).
• Post processing and visualisation service to provide
meaningful output to the designer.
• Capability for integration with other service based
systems to provide combined offerings with
greater scope.
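The service chain listed above can be pictured as a simple pipeline, each stage wrapping one Grid service. The sketch below is purely illustrative: the function names, signatures, and numbers are hypothetical stand-ins, not the G-Ship project's actual code or API.

```python
# Hypothetical sketch of the G-Ship service chain; all names and values
# below are illustrative assumptions, not the project's real interfaces.

def geometry_service(cad_file: str) -> dict:
    """Accept a CAD file in one of several entry formats, return hull geometry."""
    return {"source": cad_file, "format": cad_file.rsplit(".", 1)[-1]}

def mesh_service(geometry: dict) -> dict:
    """Generate a computational mesh automatically from the design topology."""
    return {"geometry": geometry, "cells": 250_000}

def motion_solver(mesh: dict, sea_state: int) -> dict:
    """Run the ship-motion prediction code (in the real system, dispatched
    to distributed resources via OGSA-style job submission)."""
    return {"mesh": mesh, "sea_state": sea_state, "rms_heave_m": 1.2}

def postprocess(result: dict) -> str:
    """Turn raw solver output into designer-facing summary/visualisation."""
    return f"RMS heave {result['rms_heave_m']} m in sea state {result['sea_state']}"

# Workflow control: the services compose into a single pipeline.
report = postprocess(motion_solver(mesh_service(geometry_service("hull.igs")), 5))
print(report)
```

Composing the stages as plain functions is what makes the final bullet possible: any stage can be swapped for, or combined with, another service-based system.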
G-Ship Project
• The power of ship motion simulation
becomes available to a wider user base.
• Closer integration with CAD facilities
brings about time and resource efficiencies.
• Brings a ‘plug-in’ approach to motion
models integrated with the CAD design
process.
• Produces Grid based services that can be
integrated into more sophisticated offerings.
Southampton Regional
e-Science Centre
GEM Project
GEM Project
• Working with small start-up company that
designs optimal planar photonic devices.
• Aim to provide Grid services based design
and optimization capability.
• Closely linked to the fabrication of new
devices and existing CAD tools.
• State-of-the-art optimization toolkits.
GEM Project
• Portal driven access to electromagnetic
simulation technologies.
• Distributed computational resources for
EM simulation.
• Service based delivery of optimisation
over the Grid.
• Automated generation and population of
a repository of designs.
• Provide a framework for Grid based
knowledge engineering and exploitation
in this field.
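The optimisation-plus-repository pattern above can be sketched in a few lines. Everything here is a toy stand-in: the merit function, the random-search optimiser, and the repository structure are illustrative assumptions, not GEM's actual simulation codes or optimisation toolkits.

```python
# Illustrative sketch (not GEM's actual code) of service-based optimisation
# that automatically populates a repository of evaluated designs.

import random

repository: list[dict] = []   # each entry: one design and its simulated merit

def em_simulation(design: dict) -> float:
    """Stand-in for the electromagnetic simulation service: score a planar
    photonic device design (here, a toy quadratic merit function)."""
    return -(design["width_um"] - 1.5) ** 2

def optimisation_service(n_trials: int, seed: int = 0) -> dict:
    """Toy random-search optimiser standing in for the Grid-delivered
    optimisation toolkit; every evaluated design is archived."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        design = {"width_um": rng.uniform(0.5, 3.0)}
        merit = em_simulation(design)
        repository.append({"design": design, "merit": merit})
        if best is None or merit > best["merit"]:
            best = {"design": design, "merit": merit}
    return best

best = optimisation_service(50)
print(len(repository), best["design"])
```

Archiving every evaluation, not just the winner, is what turns the repository into raw material for the knowledge-engineering framework mentioned in the last bullet.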
GEM Project
GEM will link to the Geodise test-bed
and demonstrate seamless access to
state-of-the-art optimisation and
search tools, industrial strength
analysis codes, and distributed
computing and data resources.
Summary
• Resource sharing & coordinated problem solving in
dynamic, multi-institutional virtual organizations
• Using clusters, supercomputers & data repositories
• Focus on Grid middleware:
– Globus Toolkit a source of protocol and API
definitions – and reference implementations
– Open Grid Services Architecture represents next step
in evolution
– Condor: high-throughput computing
– Web Services & W3C leveraging e-business
• e-Science Projects applying Grid concepts to applications
• Southampton e-Science Centre emphasis on support for
engineering applications.