“Calit2” Talk
Nortel Visiting Team
Calit2@UCSD
December 12, 2005
Dr. Larry Smarr
Director, California Institute for Telecommunications and
Information Technology
Harry E. Gruber Professor,
Dept. of Computer Science and Engineering
Jacobs School of Engineering, UCSD
Two New Calit2 Buildings Will Provide
Major New Laboratories to Their Campuses
• New Laboratory Facilities
  – Nanotech, BioMEMS, Chips, Radio, Photonics, Grid, Data, Applications
  – Virtual Reality, Digital Cinema, HDTV, Synthesis
• Over 1000 Researchers in Two Buildings
  – Linked via Dedicated Optical Networks
  – International Conferences and Testbeds
UC San Diego: Richard C. Atkinson Hall Dedication, Oct. 28, 2005
UC Irvine
www.calit2.net
The Calit2@UCSD Building is Designed for Prototyping
Extremely High Bandwidth Applications
• 1.8 Million Feet of Cat6 Ethernet Cabling
• UCSD is the Only UC Campus with a 10G CENIC Connection, Serving ~30,000 Users
• Over 9,000 Individual 1 Gbps Drops in the Building (~10 Gbps per Person)
• 150 Fiber Strands to the Building; Experimental Roof Radio Antenna Farm
• Ubiquitous WiFi
Photo: Tim Beach, Calit2
Calit2@UCSD Is Connected
to the World at 10Gbps
iGrid 2005 – The Global Lambda Integrated Facility
Maxine Brown, Tom DeFanti, Co-Chairs
September 26-30, 2005
Calit2 @ University of California, San Diego
California Institute for Telecommunications and Information Technology
www.igrid2005.org
50 Demonstrations, 20 Countries, 10 Gbps/Demo
Nortel 10Gb Line-Speed Security Demo – iGrid@Calit2
Less than 500 nsecs Latency Added
[Network diagram: source-data Linux clusters at StarLight/EVL in Chicago, in Ottawa (via IWIRE), and at NetherLight in Amsterdam feed a visualization cluster and tile display in San Diego over 4 GE and 12 GE links, carried on OC-192 GFP through Force10 switches.]
First Trans-Pacific Super High Definition Telepresence
Meeting in New Calit2 Digital Cinema Auditorium
Lays Technical Basis for Global Digital Cinema
Keio University President Anzai
UCSD Chancellor Fox
Sony, NTT, SGI
The OptIPuter Project –
Creating a LambdaGrid “Web” for Gigabyte Data Objects
• NSF Large Information Technology Research Proposal
– Calit2 (UCSD, UCI) and UIC Lead Campuses—Larry Smarr PI
– Partnering Campuses: USC, SDSU, NW, TA&M, UvA, SARA, NASA
• Industrial Partners
– IBM, Sun, Telcordia, Chiaro, Calient, Glimmerglass, Lucent
• $13.5 Million Over Five Years
• Linking Global-Scale Science Projects to Users’ Linux Clusters
NIH Biomedical Informatics Research Network
NSF EarthScope and ORION
The Optical Core of the UCSD Campus-Scale Testbed – Evaluating Packet Routing versus Lambda Switching
Funded by NSF MRI Grant
Goals by 2007:
  – >= 50 endpoints at 10 GigE
  – >= 32 Packet-switched endpoints
  – >= 32 Switched wavelengths
  – >= 300 Connected endpoints
Approximately 0.5 Tbit/s Arrives at the “Optical” Center of Campus
Switching will be a Hybrid Combination of Packet, Lambda, and Circuit
OOO and Packet Switches Already in Place: Lucent, Glimmerglass, Chiaro Networks
Source: Phil Papadopoulos, SDSC, Calit2
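To make the hybrid packet/lambda idea concrete, here is a minimal Python sketch of the kind of policy such a testbed controller might apply: large, long-lived flows get a dedicated wavelength through the optical (OOO) switches, while everything else stays on the routed packet core. The class names, thresholds, and endpoint labels are illustrative assumptions, not part of the UCSD testbed software.

# Hypothetical sketch: routing a flow over the packet core vs. a dedicated lambda.
# Thresholds and names are illustrative assumptions, not OptIPuter/testbed code.

from dataclasses import dataclass

LAMBDA_THRESHOLD_GBPS = 1.0   # assumed cutoff: big flows get their own wavelength

@dataclass
class Flow:
    src: str
    dst: str
    expected_gbps: float
    duration_s: float

def choose_path(flow: Flow) -> str:
    """Return 'lambda' for large, long-lived flows; 'packet' otherwise."""
    if flow.expected_gbps >= LAMBDA_THRESHOLD_GBPS and flow.duration_s > 60:
        return "lambda"   # set up an end-to-end wavelength via the OOO switch
    return "packet"       # leave it on the routed 10 GigE packet core

if __name__ == "__main__":
    bulk = Flow("sio-cluster", "calit2-viz-wall", expected_gbps=9.5, duration_s=1800)
    web = Flow("lab-desktop", "portal", expected_gbps=0.02, duration_s=5)
    print(choose_path(bulk))  # -> lambda
    print(choose_path(web))   # -> packet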
Toward an Interactive Gigapixel Display
• Scalable Adaptive Graphics Environment (SAGE) Controls the 100-Megapixel Display
• Calit2 is Building a LambdaVision Wall in Each of the UCI & UCSD Buildings
• NSF LambdaVision MRI@UIC
  – 55-Panel Display
• 1/4 TeraFLOP
  – Driven by 30-Node Cluster of 64-bit Dual Opterons
• 1/3 Terabit/sec I/O
  – 30 x 10GE Interfaces
  – Linked to OptIPuter
• 1/8 TB RAM
• 60 TB Disk
Source: Jason Leigh, Tom DeFanti (OptIPuter Co-PIs), EVL@UIC
OptIPuter Software Architecture – a Service-Oriented Architecture Integrating Lambdas Into the Grid
• Distributed Applications / Web Services
  – Visualization, Telescience, Data Services
  – SAGE, JuxtaView, Vol-a-Tile, LambdaRAM
• Distributed Virtual Computer (DVC) API / DVC Runtime Library
  – DVC Configuration, DVC Services, DVC Communication, DVC Job Scheduling, DVC Core Services
  – Resource Identify/Acquire, Namespace Management, Security Management, High Speed Communication, Storage Services
  – GSI, XIO, RobuStore, Globus, PIN/PDC, GRAM, Discovery and Control
• Lambdas
  – Transport Protocols: IP, GTP, CEP, XCP, LambdaStream, UDT, RBUDP
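As a rough illustration of what “integrating lambdas into the Grid” means at the API level, the Python sketch below shows a client that asks a DVC-style runtime for two cluster endpoints and a dedicated lambda between them, then plans a bulk transfer using one of the transports listed above. All class and method names here are hypothetical stand-ins; they are not the actual DVC API.

# Hypothetical sketch of a DVC-style workflow; class and method names are
# invented for illustration and are NOT the real OptIPuter DVC interfaces.

class DistributedVirtualComputer:
    """Toy stand-in for a DVC runtime: tracks endpoints and provisioned lambdas."""

    def __init__(self):
        self.endpoints = []
        self.lambdas = []

    def acquire_endpoint(self, name: str, nics_10ge: int) -> dict:
        ep = {"name": name, "nics_10ge": nics_10ge}
        self.endpoints.append(ep)
        return ep

    def provision_lambda(self, src: dict, dst: dict, gbps: int) -> dict:
        # In a real system this step would go through the discovery-and-control
        # layer to set up a wavelength; here we just record the requested path.
        path = {"src": src["name"], "dst": dst["name"], "gbps": gbps}
        self.lambdas.append(path)
        return path

    def transfer(self, path: dict, dataset: str, transport: str = "UDT") -> None:
        # The transport would be one of the stack's options (GTP, XCP,
        # LambdaStream, UDT, RBUDP, ...); here we only print the plan.
        print(f"Sending {dataset} over {path['gbps']} Gbps lambda "
              f"{path['src']} -> {path['dst']} using {transport}")

if __name__ == "__main__":
    dvc = DistributedVirtualComputer()
    viz = dvc.acquire_endpoint("calit2-viz-cluster", nics_10ge=30)
    data = dvc.acquire_endpoint("evl-data-cluster", nics_10ge=12)
    lam = dvc.provision_lambda(data, viz, gbps=10)
    dvc.transfer(lam, dataset="100-megapixel frame stream", transport="LambdaStream")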
Calit2’s Direct Access Core Architecture
Will Create Next Generation Metagenomics Server
• Data Sources: Sargasso Sea Data, Moore Marine Microbial Project, NASA Goddard Satellite Data, JGI Community Sequencing Project, Sorcerer II Expedition (GOS)
• Core: Web Portal (Request/Response + Web Services), DataBase Farm, Flat File Server Farm, 10 GigE Fabric, Dedicated Compute Farm (100s of CPUs)
• Traditional Users: Request/Response Through the Web Portal
• Direct Access: Lambda Connections from a Local Environment, Local Cluster, or Another Web Service
• TeraGrid: Cyberinfrastructure Backplane for Scheduled Activities, e.g. All-by-All Comparison (10,000s of CPUs)
Source: Phil Papadopoulos, SDSC, Calit2
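To make the two access paths concrete, here is a small hypothetical Python sketch (not Calit2 code) of how such a portal might dispatch work: direct-access users reach the data fabric over their own lambda connections, large scheduled jobs such as all-by-all comparisons go to the TeraGrid backplane, and ordinary interactive requests run on the dedicated compute farm.

# Hypothetical dispatcher for the Direct Access Core idea; names, thresholds,
# and resource labels are illustrative assumptions, not Calit2 software.

from enum import Enum

class Resource(Enum):
    DEDICATED_FARM = "dedicated compute farm (100s of CPUs)"
    TERAGRID = "TeraGrid backplane (10,000s of CPUs)"
    DIRECT_LAMBDA = "direct lambda connection to the data fabric"

def dispatch(request: dict) -> Resource:
    """Pick a back end for a metagenomics request."""
    if request.get("direct_access"):
        # A local cluster reaches the flat-file/database farms over a lambda,
        # bypassing the web portal entirely.
        return Resource.DIRECT_LAMBDA
    if request.get("scheduled") or request.get("cpu_hours", 0) > 1_000:
        # Large scheduled jobs (e.g. all-by-all sequence comparison) go to TeraGrid.
        return Resource.TERAGRID
    # Ordinary interactive portal queries stay on the dedicated farm.
    return Resource.DEDICATED_FARM

if __name__ == "__main__":
    print(dispatch({"query": "BLAST one read", "cpu_hours": 0.1}).value)
    print(dispatch({"scheduled": True, "cpu_hours": 50_000}).value)
    print(dispatch({"direct_access": True}).value)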
Adding Web & Grid Services to Optical Channels
to Provide Real Time Control of Ocean Observatories
LOOKING: Laboratory for the Ocean Observatory Knowledge Integration Grid
http://lookingtosea.ucsd.edu/
LOOKING is Driven by NEPTUNE CI Requirements
• Goal:
  – Prototype Cyberinfrastructure for NSF’s Ocean Research Interactive Observatory Networks (ORION), Building on the OptIPuter
• LOOKING NSF ITR with PIs:
  – John Orcutt & Larry Smarr – UCSD
  – John Delaney & Ed Lazowska – UW
  – Mark Abbott – OSU
• Collaborators at:
  – MBARI, WHOI, NCSA, UIC, CalPoly, UVic, CANARIE, Microsoft, NEPTUNE Canada
Making Management of Gigabit Flows Routine
Partnering with NASA to Combine Telepresence with
Remote Interactive Analysis of Data Over National LambdaRail
www.calit2.net/articles/article.php?id=660
August 8, 2005
SIO/UCSD
OptIPuter Visualized Data
HDTV Over Lambda
NASA Goddard
Calit2/SDSC Proposal to Create a UC Cyberinfrastructure
of OptIPuter “On-Ramps” to TeraGrid Resources
OptIPuter + CalREN-XD + TeraGrid = “OptiGrid”
UC Davis
UC San Francisco
UC Berkeley
UC Merced
UC Santa Cruz
UC Los Angeles
UC Santa Barbara
UC Riverside
UC Irvine
UC San Diego
Creating a Critical Mass of End Users
on a Secure LambdaGrid
Source: Fran Berman, SDSC
Interdisciplinary Groups in Networks, Circuits,
and Information Theory
Circuits and Wireless
Wireless SensorNets Driving an Ultra High Bandwidth
Fiber Optic Backbone Create a Planetary Scale Computer