Applications Requiring
An Experimental Optical Network
Invited Keynote
I-Light Applications Workshop
Indiana Univ. Purdue Univ. Indianapolis
December 4, 2002
Dr. Larry Smarr
Director, California Institute for Telecommunications and
Information Technologies
Harry E. Gruber Professor,
Dept. of Computer Science and Engineering
Jacobs School of Engineering, UCSD
Closing in on the Dream
A High Performance Collaboration Grid
“What we really have to do is eliminate distance between individuals
who want to interact with other people and with other computers.”
― Larry Smarr, Director
National Center for Supercomputing Applications, UIUC
SIGGRAPH 89: Science by Satellite
[Figure: demonstration linking Illinois and Boston via AT&T and Sun]
“Using satellite technology…demo of what it might be like to have high-speed
fiber-optic links between advanced
computers in two different geographic locations.”
― Al Gore, Senator
Chair, US Senate Subcommittee on Science, Technology and Space
http://sunsite.lanet.lv/ftp/sun-info/sunflash/1989/Aug/08.21.89.tele.video
I-WAY: Information Wide Area Year
Supercomputing ‘95
• The First National 155 Mbps Research Network
  – 65 Science Projects
  – Into the San Diego Convention Center
• I-WAY Featured:
  – Networked Visualization Application Demonstrations
  – Large-Scale Immersive Displays
  – I-Soft Programming Environment
[Images: Cellular Semiotics; CitySpace]
http://archive.ncsa.uiuc.edu/General/Training/SC95/GII.HPCC.html
Alliance 1997: Collaborative Video Production
via Tele-Immersion and Virtual Director
Alliance Project Linking CAVE, ImmersaDesk,
Power Wall, and Workstation
Donna Cox, Bob Patterson, Stuart Levy, Glen Wheless
www.ncsa.uiuc.edu/People/cox/
iGrid 2002
September 24-26, 2002, Amsterdam, The Netherlands
• Fifteen Countries/Locations Proposing 28 Demonstrations:
Canada, CERN, France, Germany, Greece, Italy, Japan, The
Netherlands, Singapore, Spain, Sweden, Taiwan, United
Kingdom, United States
• Applications Demonstrated: Art, Bioinformatics, Chemistry,
Cosmology, Cultural Heritage, Education, High-Definition Media
Streaming, Manufacturing, Medicine, Neuroscience, Physics,
Tele-science
• Grid Technologies: Grid Middleware, Data Management/Replication Grids,
Visualization Grids, Computational Grids, Access Grids, Grid Portals
• Sponsors: HP, IBM, Cisco, Philips, Level (3), Glimmerglass, etc.
www.startap.net/igrid2002
iGrid 2002 Was Sustaining 1-3 Gigabits/s
Total Available Bandwidth
Between Chicago and Amsterdam
Was 30 Gigabit/s
The Move to Data-Intensive Science & Engineering: e-Science Community Resources
[Images: ALMA, LHC, Sloan Digital Sky Survey, ATLAS]
Why Optical Networks Are Emerging
as the 21st Century Driver for the Grid
Scientific American, January 2001
A LambdaGrid Will Be
the Backbone for an e-Science Network
[Diagram: LambdaGrid layer stack. Apps and Middleware run on Clusters over Dynamically Allocated Lightpaths and Switch Fabrics at the Physical layer; Monitoring and a Control Plane span all layers]
Source: Joe Mambretti, NU
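To make the layered diagram concrete, here is a minimal Python sketch of the control-plane idea it depicts: middleware requests a dedicated lightpath between two cluster endpoints, and the control plane claims a free wavelength on the switch fabric. All class and function names here are hypothetical illustrations, not actual OptIPuter software.

# Hypothetical sketch of dynamic lightpath allocation on a LambdaGrid.
# A control plane spans the layers: middleware asks it for a dedicated
# lambda, and it reserves a wavelength on the physical switch fabric.

class SwitchFabric:
    """Physical layer: a pool of wavelengths (lambdas) per fiber link."""
    def __init__(self, lambdas_per_link):
        self.lambdas_per_link = lambdas_per_link
        self.free = {}  # (src, dst) -> set of unreserved lambda indices

    def reserve(self, src, dst):
        pool = self.free.setdefault((src, dst),
                                    set(range(self.lambdas_per_link)))
        if not pool:
            raise RuntimeError(f"no free lambdas on {src} -> {dst}")
        return pool.pop()  # claim one wavelength on this link

class ControlPlane:
    """Accepts lightpath requests from middleware and drives the fabric."""
    def __init__(self, fabric):
        self.fabric = fabric
        self.lightpaths = []

    def allocate_lightpath(self, src, dst):
        lam = self.fabric.reserve(src, dst)
        path = {"src": src, "dst": dst, "lambda": lam}
        self.lightpaths.append(path)
        return path  # the application now owns a dedicated channel

cp = ControlPlane(SwitchFabric(lambdas_per_link=32))
print(cp.allocate_lightpath("UCSD-cluster", "UIC-cluster"))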
NSF Defines Three Classes of Networks
Beyond the Commodity Internet
• Production Networks (e.g. Internet2)
  – High-Performance Networks
  – Reach All US Researchers
  – 24/7 Reliable
• Experimental Networks
  – Trials of Cutting-Edge High-Performance Networks
  – Deliver Advanced Application Needs Unsupported by Production Networks
  – Robust Enough to Support Application-Dictated Development:
    – Software Application Toolkits
    – Middleware
    – Computing and Networking
• Research Networks
  – Smaller-Scale Network Prototypes
  – Enable Basic Scientific and Engineering Network Research
  – Testing of Component Technologies, Protocols, Network Architectures
  – Not Expected to Be Persistent
  – Not Expected to Support Production Applications
www.evl.uic.edu/activity/NSF/index.html
Local and Regional Lambda Experimental Networks
Are Achievable and Practical
• Several GigaPOPs and States Are Building
– Multi-Lambda Metropolitan Experimental Networks
– Lighting up Their Own Dark Fiber (I-WIRE, I-Light, CENIC CalREN-XD)
– With Hundreds of Lambdas by 2010
• OptIPuter Funded to Research LambdaGrid
– Middleware and Control Plane
– Application Driven
• Substantial State and Local Funds Can Be Heavily Leveraged by
an NSF Experimental Networks Program
  – Cross-Country Inter-Connection (National Light Rail)
  – Persistent Support of Emerging Experimental Networks
  – First NSF Workshop, UIC, December 2001
  – Second NSF Workshop, UCI, May 2002
  – Expected NSF RFP by Winter 2003
The Next S-Curves of Networking
Exponential Technology Growth
[Figure: networking technology S-curves, plotting technology penetration (0% to 100%) against time. Gigabit Testbeds sit at the research stage (~1990s); Internet2 Abilene and the Connections Program are in production/mass market (~2000); DWDM and Lambda Grids form the next curve, at the experimental/early-adopter stage heading toward 2010]
Cal-(IT)2
An Integrated Approach to the Future Internet
220 UC San Diego & UC Irvine Faculty
Working in Multidisciplinary Teams
With Students, Industry, and the Community
The State’s $100 M
Creates Unique Buildings, Equipment, and Laboratories
www.calit2.net
Data Intensive Scientific Applications
Require Experimental Optical Networks
• Large Data Challenges in Neuro and Earth Sciences
– Each Data Object is 3D and Gigabytes
– Data are Generated and Stored in Distributed Archives
– Research is Carried Out on Federated Repository
• Requirements
  – Computing → PC Clusters
  – Communications → Dedicated Lambdas Over Fiber
  – Data → Large Peer-to-Peer Lambda-Attached Storage
  – Visualization → Collaborative Volume Algorithms
• Response
– OptIPuter Research Project
The Biomedical Informatics Research Network
A Multi-Scale Brain Imaging Federated Repository
BIRN Test-beds:
Multiscale Mouse Models of Disease, Human Brain Morphometrics, and
FIRST BIRN (10-Site Project for fMRI Studies of Schizophrenia)
NIH Plans to Expand
to Other Organs
and Many Laboratories
Microscopy Imaging of Neural Tissue
[Images: Marketta Bobik; Francisco Capani & Eric Bushong]
Confocal image of a sagittal section through rat cortex, triple-labeled for glial fibrillary acidic protein (blue), neurofilaments (green), and actin (red)
Projection of a series of optical sections through a Purkinje neuron, revealing both the overall morphology (red) and the dendritic spines (green)
http://ncmir.ucsd.edu/gallery.html
Interactive Visual Analysis of Large Datasets: East Pacific Rise Seafloor Topography
Scripps Institution of Oceanography Visualization Center
http://siovizcenter.ucsd.edu/library/gallery/shoot1/index.shtml
Tidal Wave Threat Analysis
Using Lake Tahoe Bathymetry
Graham Kent, SIO
Scripps Institution of Oceanography Visualization Center
http://siovizcenter.ucsd.edu/library/gallery/shoot1/index.shtml
SIO Uses the Visualization Center
to Teach a Wide Variety of Graduate Classes
• Geodesy
• Gravity and Geomagnetism
• Planetary Physics
• Radar and Sonar Interferometry
• Seismology
• Tectonics
• Time Series Analysis
Deborah Kilb & Frank Vernon, SIO
Multiple Interactive Views of Seismic Epicenter and Topography Databases
http://siovizcenter.ucsd.edu/library/gallery/shoot2/index.shtml
NSF’s EarthScope
Rollout Over 14 Years, Starting With Existing Broadband Stations
NSF Experimental Network Research Project
The “OptIPuter”
• Driven by Large Neuroscience and Earth Science Data
– NIH Biomedical Informatics Research Network
– NSF EarthScope (UCSD SIO)
• Removing Bandwidth as a Constraint
– Links Computing, Storage, Visualization and Networking
– Software and Systems Integration Research Agenda
• NSF Large Information Technology Research Proposal
– UCSD and UIC Lead Campuses
– USC, UCI, SDSU, NW Partnering Campuses
– Industrial Partners: IBM, Telcordia/SAIC, CENIC, Chiaro
Networks, IXIA
• PI—Larry Smarr; Funded at $13.5M Over Five Years
– Start Date October 1, 2002
www.calit2.net/news/2002/9-25-optiputer.html
From SuperComputers to SuperNetworks: Changing the Grid Design Point
• The TeraGrid is Optimized for Computing
– 1024 IA-64 Node Linux Cluster
– Assume 1 GigE per Node = 1 Terabit/s I/O
– Grid Optical Connection 4x10Gig Lambdas = 40 Gigabit/s
– Optical Connections Are Only 4% of Bisection Bandwidth
• The OptIPuter is Optimized for Bandwidth
– 32 IA-64 Node Linux Cluster
– Assume 1 GigE per Processor = 32 Gigabit/s I/O
– Grid Optical Connection 4x10GigE = 40 Gigabit/s
– Optical Connections Are Over 100% of Bisection Bandwidth
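The bisection-bandwidth arithmetic behind these two design points can be checked directly; a quick Python sketch, using only the figures from the bullets above:

# Back-of-envelope check of the bisection-bandwidth percentages above.
def bisection_pct(nodes, gige_per_node, wan_gbps):
    cluster_io = nodes * gige_per_node    # aggregate cluster I/O, Gbit/s
    return 100.0 * wan_gbps / cluster_io  # WAN share of that I/O

# TeraGrid: 1024 nodes x 1 GigE ~ 1 Terabit/s of I/O vs. 40 Gbit/s of lambdas
print(f"TeraGrid:  {bisection_pct(1024, 1, 40):.0f}%")   # -> 4%

# OptIPuter: 32 nodes x 1 GigE = 32 Gbit/s of I/O vs. the same 40 Gbit/s
print(f"OptIPuter: {bisection_pct(32, 1, 40):.0f}%")     # -> 125%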
OptIPuter Inspiration: Node of a 2009 PetaFLOPS Supercomputer
[Diagram: 4 GB of highly interleaved DRAM feeding a multi-lambda all-optical (AON) crossbar; coherent 96 MB second-level caches with 640 GB/s of bandwidth; 64-byte-wide, 160 GB/s paths into multiple VLIW/RISC cores, each delivering 24 GFLOPS at 6 GHz]
Source: Steve Wallach, Supercomputing 2000 Keynote
Global Architecture of a 2009 COTS
PetaFLOPS System
[Diagram: 64 multi-die multi-processor boxes (128 die per box, 4 CPUs per die) arranged around a central all-optical switch, with I/O to LAN/WAN; 10 meters of separation = 50 nanoseconds of delay]
Source: Steve Wallach, Supercomputing 2000 Keynote
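A back-of-envelope check that this design lands near a PetaFLOPS, assuming the 64 boxes shown around the all-optical switch and the per-core figures from the node slide above:

# Rough capacity check for the 2009 COTS PetaFLOPS design.
gflops_per_core = 24    # VLIW/RISC core at 6 GHz (4 flops per cycle)
cpus_per_die    = 4
dies_per_box    = 128
boxes           = 64    # nodes 1..64 around the all-optical switch

total_gflops = gflops_per_core * cpus_per_die * dies_per_box * boxes
print(f"{total_gflops / 1e6:.2f} PetaFLOPS")  # ~0.79, i.e. order one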
OptIPuter NSF Proposal Partnered with
National Experts and Infrastructure
[Map: OptIPuter partners and optical infrastructure. CA*net4 (Vancouver, Seattle, Portland); Pacific Light Rail down the West Coast (Seattle, San Francisco, Los Angeles, San Diego); Asia-Pacific links; UIC and NU in Chicago; SURFnet and CERN; PSC; NYC; NCSA; USC and UCI in Los Angeles; UCSD and SDSU in San Diego (SDSC); AMPATH via Atlanta]
Source: Tom DeFanti and Maxine Brown, UIC
OptIPuter LambdaGrid
Enabled by Chiaro Networking Router
www.calit2.net/news/2002/11-18-chiaro.html
[Diagram: campus sites (Medical Imaging and Microscopy; Chemistry, Engineering, Arts; San Diego Supercomputer Center; Scripps Institution of Oceanography) connected through edge switches to a central Chiaro Enstara router, carrying Cluster-Disk, Disk-Disk, Viz-Disk, DB-Cluster, and Cluster-Cluster traffic]
Image Source: Phil Papadopoulos, SDSC
The UCSD OptIPuter Deployment: UCSD Campus Optical Network
[Diagram: campus fiber buildout in two phases (Phase I, Fall '02; Phase II, 2003) linking SDSC and the SDSC Annex, JSOE (Engineering), CRCA (Arts), SOM (Medicine), Chemistry, Physical Sciences/Keck, Preuss High School, Sixth College, an undergraduate college, and SIO (Earth Sciences) through collocation points, with the Chiaro router at the Node M collocation and a production router connecting to CENIC; map scale roughly half a mile]
Source: Phil Papadopoulos, SDSC; Greg Hidley, Cal-(IT)2
Planned Chicago Metro
Electronic Switching OptIPuter Laboratory
[Diagram: StarLight as the electronic switching hub, with 16x1 GE and 16x10 GE switching. International GE/10GE circuits reach Canada, Holland, CERN, GTRN, AmPATH, and Asia; national GE circuits reach Illinois, California, Wisconsin, Indiana, Abilene, the FedNets, Washington, and Pennsylvania; metro GE/10GE circuits connect a 16-processor McKinley cluster at University of Illinois at Chicago and a 16-processor Montecito/Chivano cluster at Northwestern, each via 10x1 GE + 1x10GE]
Source: Tom DeFanti
Metro Optically Linked Visualization Walls
with Industrial Partners Set Stage for Federal Grant
• Driven by SensorNets Data
– Real-Time Seismic
– Environmental Monitoring
– Distributed Collaboration
– Emergency Response
• Linked UCSD and SDSU
– Dedication March 4, 2002
[Images: linked control rooms at UCSD and SDSU, connected by 44 miles of Cox fiber. Partners: Cox, Panoram, SAIC, SGI, IBM, TeraBurst Networks, SD Telecom Council]
NTT Super High Definition Video
(NTT 4Kx2K=8 Megapixels) Over Internet2
SHD = 4xHDTV = 16xDVD
• Applications: Astronomy, Mathematics, Entertainment
[Map: streaming from StarLight in Chicago to USC in Los Angeles]
www.ntt.co.jp/news/news02e/0211/021113.html
OptIPanel
5x3 Grid of 1280x1024-Pixel LCD Panels Driven by a 16-PC Cluster; Resolution = 6400x3072 Pixels, or ~3000x1500 Pixels in Autostereo
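The resolution arithmetic checks out; a small Python sketch, assuming autostereo roughly halves the usable resolution in each dimension (consistent with the ~3000x1500 figure quoted):

# Tiled-display resolution for the OptIPanel described above.
cols, rows = 5, 3              # 5x3 grid of LCD panels
panel_w, panel_h = 1280, 1024  # pixels per panel

total_w, total_h = cols * panel_w, rows * panel_h
print(f"mono:       {total_w} x {total_h}")            # 6400 x 3072
print(f"autostereo: {total_w // 2} x {total_h // 2}")  # 3200 x 1536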
The Continuum at EVL and TRECC
OptIPuter Amplified Work Environment
• Passive stereo display
• AccessGrid
• Digital whiteboard
• Tiled display
Source: Tom DeFanti, Electronic Visualization Lab, UIC
OptIPuter Transforms Individual Laboratory
Visualization, Computation, & Analysis Facilities
Fast polygon and volume rendering with stereographics + GeoWall = 3D applications:
Earth Science (underground and surface), Anatomy, Neuroscience
[Images: GeoFusion GeoMatrix Toolkit (Rob Mellors and Eric Frost, SDSU); SDSC Volume Explorer (Dave Nadeau, SDSC, BIRN); Visible Human Project (NLM, Brooks AFB, SDSC Volume Explorer)]
The Preuss School UCSD OptIPuter Facility