Malcolm Atkinson

UK e-Science
National e-Science Centre
Open Day
Prof. Malcolm Atkinson
Director
www.nesc.ac.uk
17th January 2003
e-Science Leadership
Partnerships
- e-Science alliance: Edinburgh + Glasgow
  Physics & Astronomy (2), EPCC, Informatics, Computing Science
- Capability Computing & e-Science: Edinburgh + CCLRC
UK + EU: Research and Training Projects, £70M
- GridPP, European Data Grid, AstroGrid, ENACTS, GRIDSTART, RealityGrid, Neuroinformatics Grid, …
- QCDOC + QCD Grid
- HPC(x) (Edinburgh, IBM, CCLRC: 3.3 TFlops)
Scottish Investment, £6.7M
- ScotGRID, SRIF, eDIKT, Scottish Centre for Genomic Technology and Informatics, …
NeSC set up, launched and running, £8M
- e-Science Institute
- Blue Gene Workshop (Protein Folding & Structure, IBM)
- GGF5 & HPDC11 (900 people; largest GGF, largest HPDC)
- BlueDwarf (IBM p690 server donated for scientific database research)
UK e-Science
e-Science and the Grid
‘e-Science is about global collaboration in key
areas of science, and the next generation of
infrastructure that will enable it.’
‘e-Science will change the dynamic of the
way science is undertaken.’
John Taylor
Director General of Research Councils
Office of Science and Technology
From presentation by Tony Hey
What’s Changing
Collaboration is Growing
Data is Exploding
Growing interdependence of theory, experiment, and computing
UK e-Science Programme (1)
2001 - 2003
- DG Research Councils: E-Science Steering Committee; Grid TAG; Director (awareness and co-ordination role, management role)
- Generic Challenges: EPSRC (£15m), DTI (£15m)
- Academic Application Support Programme: Research Councils (£74m), DTI (£5m)
  PPARC (£26m), BBSRC (£8m), MRC (£8m), NERC (£7m), ESRC (£3m), EPSRC (£17m), CLRC (£5m)
- £80m collaborative projects, with Industrial Collaboration (£40m)
UK e-Science Programme (2)
2003 - 2005
(Same governance structure and funding streams as Phase 1, shown above.)
NeSC in the UK
National e-Science Centre (Edinburgh + Glasgow); HPC(x)
Regional centres: Newcastle, Belfast, Manchester, Daresbury Lab, Cambridge, Oxford, Hinxton, RAL, Cardiff, London, Southampton
Helped build a community:
- Directors' Forum
- Engineering Task Force
- Grid Support Centre
- Architecture Task Force: UK adoption of OGSA, OGSA Grid Market, workflow management
- Database Task Force: OGSA-DAI, GGF DAIS-WG
- e-SI Programme: training, coordination, community building, workshops, pioneering
- GridNet
NeSC Staff
Senior staff:
- Prof. Malcolm Atkinson: Director
- Dr Arthur Trew: Deputy Director
- Dr Anna Kenway: Centre Manager
- Ms Gill Maddy: Event Manager
- Dr Dave Berry: Research Manager
- Dr Richard Sinnott: Technical Director (Glasgow)
- Dr Mark Parsons: Commercial Director
- Mr Stuart Anderson: Regional Director
Research partnerships:
- Dr Bob Mann: Institute for Astronomy
- Dr Richard Baldock: MRC, Human Genetics Unit
Industrial partnerships:
- Dr Andy Knox: IBM Greenock
- Dr Dave Pearson: Oracle
NeSC Related Projects
- SHEFC: ScotGrid (£0.9M), eDIKT (£2.3M), SRIF (£2.3M)
- Wellcome: Cardiovascular Functional Genomics (£5.4M)
- MRC: Neuroinformatics Grid (£1.5M) (Biobank Scottish Spoke)
- PPARC: AstroGrid (£5M), GridPP (£17M)
- EPSRC: e-STORM (£359K), GridNet (£595K), DTA Neuroinf., IRCs (£6M: Equator, AKT, DIRC, Nanotechnology)
- EU IST FP5 projects: GridStart (€1.5M), Enacts (€0.8M), Data Grid (€10M)
- Centre projects: OGSA-DAI (£1.3M), SunGrid (£400K), GridWeaver (£132K)
- Proposed centre projects: Bridges (£372K), OGSA-DAI II (£277K), GridWeaver 2 (£400K), PGPGrid (£312K), MS.NETGrid (£112K), FirstDIG (£90K)
EU Grid Projects
- DataGrid (CERN, ..)
- EuroGrid (Unicore)
- DataTag (TTT…)
- Astrophysical Virtual Observatory
- GRIP (Globus/Unicore)
- GRIA (e-Business, …)
- GridLab (Cactus, …)
- CrossGrid
- EGSO (Solar Physics)
- GridStart
Total: €45 million
NeSC Internationally
e-Science Institute:
- >1000 different participants, from >25 countries
Conferences organised:
- Blue Gene workshop
- Opening by Gordon Brown
- Sun HPC Consortium applications workshop
- Global Grid Forum 5
- HPDC 11
N+N meetings:
- USA, San Francisco, Aug '01
- China bioinformatics, e-SI, June '02
- USA, London, Oct '02
- China, Kunming, Jan '03
Visitors:
- Ian Foster, Steve Tuecke, Greg Riccardi, Roy Williams, Jim Gray ('03), Alex Szalay ('03)
North American visits:
- SDSC & ISI, Nov '01
- SuperComputing 01
- Canarie 7, Toronto
- ANL, Nov '01 (OGSA, GGF5 planning)
- NPACI meeting
- Toronto, GGF4 (OGSA, DAIS & GGF5 planning)
- NCSA, Feb '02
- ANL, Feb '02
- ANL, Early Adopters, June '02
- MAGIC meeting, Sep '02
- GGF6, Chicago
- SuperComputing 02, Baltimore
- GlobusWorld, San Diego, Jan '03
Programme committees: GGF4, GGF5, HPDC11, GGF7, HPDC12
Database chapter in Edition 2 of the Grid book
An X-informatics Grid
Layered architecture, top to bottom:
- X-ologists
- X-informatics Application
- Semantic Infrastructure
- X-informatics common high-level services: Data Mining, …
- Grid services: Monitoring, Diagnosis, Scheduling, Accounting, Logging, Authorisation, Data Access, Data Integration
- Grid Plumbing & Security Infrastructure
- Distributed, structured Data & Compute Resources (Data Providers, Data Curators)
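
To make the layering concrete, here is a minimal Python sketch of how an application call might pass down through the stack. All class and method names are illustrative stand-ins, not from any real Grid toolkit.

    # Illustrative only: toy layers standing in for the stack above.

    class DataAccess:
        """Bottom layer: one structured data resource at one site."""
        def __init__(self, tables):
            self.tables = tables
        def query(self, table):
            return self.tables.get(table, [])

    class DataIntegration:
        """Middle layer: federates several distributed resources."""
        def __init__(self, sources):
            self.sources = sources
        def query(self, table):
            rows = []
            for source in self.sources:
                rows.extend(source.query(table))
            return rows

    class XInformaticsApplication:
        """Top layer: what the X-ologist actually uses."""
        def __init__(self, integrator):
            self.integrator = integrator
        def find(self, table):
            return self.integrator.query(table)

    # Two hypothetical sites, queried transparently through the stack:
    edinburgh = DataAccess({"genes": [("BRCA1", "17q21")]})
    glasgow = DataAccess({"genes": [("TP53", "17p13")]})
    app = XInformaticsApplication(DataIntegration([edinburgh, glasgow]))
    print(app.find("genes"))  # rows from both sites, one call

The point of the layering is exactly this transparency: the X-ologist asks one question and never sees which provider or curator supplied which rows.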
Database Growth
[Chart: growth in the number of PDB protein structures]
More Computation
As computer performance improves, the range of applications increases.
HPCx: £53M, 3 machines, TFlops capability.
[Chart: capability (TFlops) vs. year, showing applications that become feasible as capability grows: whole-earth modelling, organs, climate, solar weather, eddy-resolution oceans, cells, complex materials, multiscale design, whole aircraft, astroplasmas, protein and drug design, nanostructures, structures]
Grid Data Service interaction (SOAP/HTTP; service creation via API interactions):
1a. Client asks the Registry for sources of data about "x".
1b. Registry responds with a Factory handle.
2a. Client asks the Factory for access to the database.
2b. Factory creates a GridDataService (GDS) to manage access.
2c. Factory returns the handle of the GDS to the client.
3a. Client queries the GDS with XPath, SQL, etc.
3b. GDS interacts with the XML or relational database.
3c. Results of the query are returned to the client as XML.
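
As a rough sketch of that three-phase flow (registry lookup, factory creation, query), here is a toy Python rendering. The names are hypothetical stand-ins, not the real OGSA-DAI interfaces, and the real exchanges travel as SOAP messages over HTTP.

    # Toy, self-contained rendering of the registry/factory/GDS flow above.

    class GridDataService:
        def __init__(self, database):
            self.database = database          # dict: query -> rows

        def perform(self, query):             # steps 3a-3c
            rows = self.database.get(query, [])
            body = "".join(f"<row>{r}</row>" for r in rows)
            return f"<results>{body}</results>"   # results return as XML

    class Factory:
        def __init__(self, database):
            self.database = database

        def create_gds(self):                 # steps 2a-2c
            return GridDataService(self.database)

    class Registry:
        def __init__(self, factories):
            self.factories = factories        # topic -> Factory handle

        def find_sources(self, topic):        # steps 1a-1b
            return self.factories[topic]

    # Client walk-through:
    registry = Registry({"x": Factory({"SELECT * FROM x": ["a", "b"]})})
    factory = registry.find_sources("x")      # 1a, 1b
    gds = factory.create_gds()                # 2a, 2b, 2c
    print(gds.perform("SELECT * FROM x"))     # 3a, 3b, 3c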
OGSA-DAI
Release 1 Available
http://www.ogsadai.org.uk
http://www.ogsa-dai.org.uk
http://www.ogsa-dai.org
http://www.ogsadai.org
Access Grid
Access Grid Nodes
- Technology developed by Rick Stevens' group at Argonne National Laboratory
- Access Grid will enable informal and formal group-to-group collaboration:
  - Distributed lectures and seminars
  - Virtual meetings
  - Complex distributed grid demos
- Uses MBONE and multicast Internet technologies
From presentation by Tony Hey
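
Underneath, Access Grid sessions ride on IP multicast. For flavour, a minimal Python sketch of joining a multicast group and waiting for a packet; the group address and port are invented for illustration, not a real Access Grid venue.

    # Join a (hypothetical) multicast group and receive one datagram.
    import socket
    import struct

    GROUP, PORT = "224.2.0.1", 50000   # illustrative venue address

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM,
                         socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Ask the kernel to join the multicast group on all interfaces.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    data, addr = sock.recvfrom(65535)  # blocks until a packet arrives
    print(f"received {len(data)} bytes from {addr}")

Multicast is what lets one video stream reach every participating node without a separate copy per site, which is why Access Grid rooms scale to many-way meetings.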
Wellcome Trust: Cardiovascular Functional Genomics
[Diagram: Glasgow, Edinburgh, Leicester, Oxford, London and the Netherlands exchanging shared data and public curated data]
LHC Computing Challenge
Scale: 1 TIPS = 25,000 SpecInt95; a PC (1999) = ~15 SpecInt95.
- Online System: one bunch crossing per 25 ns; 100 triggers per second; each event is ~1 MByte; ~PBytes/sec off the detector, ~100 MBytes/sec into the offline farm.
- Tier 0: CERN Computer Centre (>20 TIPS), offline farm (~20 TIPS), ~100 MBytes/sec to storage.
- Tier 1 (~Gbits/sec or air freight from CERN): regional centres, e.g. the French, Italian, UK (RAL) and US regional centres.
- Tier 2 (~Gbits/sec): Tier2 centres at ~1 TIPS each, e.g. ScotGRID++.
- Tier 3: institute servers (~0.25 TIPS) with physics data caches. Physicists work on analysis "channels"; each institute has ~10 physicists working on one or more channels, and data for these channels should be cached by the institute server.
- Tier 4: physicists' workstations (100-1000 Mbits/sec).
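
The slide's own conversion factors make the scale easy to check; a quick back-of-envelope in Python using only the figures quoted above:

    # Back-of-envelope check of the LHC figures on this slide.
    SPECINT95_PER_TIPS = 25_000          # 1 TIPS = 25,000 SpecInt95
    PC_1999_SPECINT95 = 15               # a 1999 PC = ~15 SpecInt95

    # Event stream into the offline farm:
    triggers_per_second = 100
    mbytes_per_event = 1
    print(triggers_per_second * mbytes_per_event, "MBytes/sec")  # ~100

    # CERN Tier 0 (>20 TIPS) expressed in 1999-era PCs:
    pcs = 20 * SPECINT95_PER_TIPS / PC_1999_SPECINT95
    print(f"~{pcs:,.0f} PCs")            # roughly 33,000 machines

So the 100 MBytes/sec figure follows directly from the trigger rate and event size, and Tier 0 alone is equivalent to tens of thousands of contemporary PCs, which is why the load is spread across the tiers.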
Global In-flight Engine Diagnostics
In-flight data flows over a global network (e.g. SITA) to the airline ground station and on to the DS&S Engine Health Center; results reach the maintenance centre and data centre via internet, e-mail and pager.
Distributed Aircraft Maintenance Environment: Universities of Leeds, Oxford, Sheffield & York
Comparative Functional Genomics
- Large amounts of data
- Highly heterogeneous: data types, data forms, community
- Highly complex and inter-related
- Volatile
[Image: UCSF and UIUC collaboration]
From Klaus Schulten, Center for Biomolecular Modeling and Bioinformatics, Urbana-Champaign