The OptIPuter and Its Applications
Invited Talk
Cyberinfrastructure for Humanities, Arts, and Social Sciences
A Summer Institute
SDSC
UCSD
July 26, 2006
Dr. Larry Smarr
Director, California Institute for Telecommunications and
Information Technology
Harry E. Gruber Professor,
Dept. of Computer Science and Engineering
Jacobs School of Engineering, UCSD
From “Supercomputer-Centric”
to “Supernetwork-Centric” Cyberinfrastructure
Chart: Bandwidth of NYSERNet Research Network Backbones (Mbps) vs. Computing Speed (GFLOPS), 1985-2005. Optical WAN research bandwidth has grown much faster than supercomputer speed: research backbones went from a T1 line to 32 x 10Gb “Lambdas” (terabit/s scale), while supercomputers went from the 1 GFLOP Cray2 to a 60 TFLOP Altix.
Network Data Source: Timothy Lance, President, NYSERNet
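To make the slide's comparison concrete (simple arithmetic on the figures shown in the chart, taking a T1 as roughly 1.5 Mb/s):

```latex
% Growth factors over the 1985-2005 span shown in the chart
\[
\text{Network: } \frac{32 \times 10\ \text{Gb/s}}{1.5\ \text{Mb/s}} \approx 2 \times 10^{5},
\qquad
\text{Compute: } \frac{60\ \text{TFLOPS}}{1\ \text{GFLOPS}} = 6 \times 10^{4}
\]
```

That is, dedicated research bandwidth grew several times faster than peak supercomputer speed over the same two decades.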
The OptIPuter Project – Creating a “SuperWeb”
for Data Intensive Researchers
• NSF Large Information Technology Research Proposal
– Calit2 (UCSD, UCI) and UIC Lead Campuses—Larry Smarr PI
– Partners: SDSC, USC, SDSU, NCSA, Northwestern, Texas A&M, UvA, SARA,
NASA Goddard, KISTI, AIST, CRC(Canada), CICESE (Mexico)
• Industrial Partners
– IBM, Sun, Telcordia, Chiaro, Calient, Glimmerglass, Lucent
• $13.5 Million Over Five Years—Now In the Fourth Year
NIH Biomedical Informatics Research Network
NSF EarthScope and ORION
What is the OptIPuter?
• Applications Drivers → Interactive Analysis of Large Data Sets
• OptIPuter Nodes → Scalable PC Clusters with Graphics Cards
• IP over Lambda Connectivity → Predictable Backplane
• Open Source LambdaGrid Middleware → Network is Reservable
• Data Retrieval and Mining → Lambda Attached Data Servers
• High Defn. Vis., Collab. SW → High Performance Collaboratory
www.optiputer.net
See Nov 2003 Communications of the ACM
for Articles on OptIPuter Technologies
Dedicated Optical Channels Make
High Performance Cyberinfrastructure Possible
Wavelength Division Multiplexing (WDM): each “Lambda” is a separate wavelength on the fiber, related to its optical frequency by λ = c / f
Source: Steve Wallach, Chiaro Networks
Parallel Lambdas are Driving Optical Networking
The Way Parallel Processors Drove 1990s Computing
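For reference (standard fiber-optics numbers, not from the slide): the wavelength-frequency relation above puts the common 1550 nm telecom band near 193 THz, and the usual 100 GHz DWDM channel spacing is only about 0.8 nm in wavelength, which is how tens of 10 Gb/s lambdas can share a single fiber pair:

```latex
\[
\lambda = \frac{c}{f}, \qquad
f = \frac{2.998 \times 10^{8}\ \text{m/s}}{1550\ \text{nm}} \approx 193.4\ \text{THz}, \qquad
\Delta\lambda = \frac{\lambda^{2}\,\Delta f}{c} \approx 0.8\ \text{nm for } \Delta f = 100\ \text{GHz}
\]
```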
National Lambda Rail (NLR) and TeraGrid Provide the
Cyberinfrastructure Backbone for U.S. Researchers
NSF's TeraGrid Has a 4 x 10Gb Lambda Backbone
Map: the NLR footprint links two dozen state and regional optical networks, with nodes including Seattle, Portland, Boise, Ogden/Salt Lake City, Denver, San Francisco, Los Angeles, San Diego, Phoenix, Albuquerque, Las Cruces/El Paso, Tulsa, Kansas City, Dallas, San Antonio, Houston, Baton Rouge, Chicago (UIC/NW-StarLight, UC-TeraGrid, and international collaborators), Cleveland, Pittsburgh, New York City, Washington DC, Raleigh, Atlanta, Jacksonville, and Pensacola
NLR: 4 x 10Gb Lambdas Initially, Capable of 40 x 10Gb Wavelengths at Buildout
DOE, NSF, & NASA Using NLR
Creating a North American Superhighway
for High Performance Collaboration
Next Step: Adding Mexico to
Canada’s CANARIE and the U.S. National Lambda Rail
OptIPuter Scalable Adaptive Graphics Environment
(SAGE) Allows Integration of HD Streams
OptIPortal: Termination Device for the OptIPuter Global Backplane
• 20 Dual CPU Nodes, 20 24" Monitors, ~$50,000
• 1/4 Teraflop, 5 Terabyte Storage, 45 Mega Pixels--Nice PC!
• Scalable Adaptive Graphics Environment (SAGE), Jason Leigh, EVL-UIC
Source: Phil Papadopoulos SDSC, Calit2
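A rough sanity check on those figures, assuming each 24" panel runs at 1920 x 1200 (a typical resolution for that size; the slide does not state it):

```latex
% Pixel count across 20 panels (assumed 1920 x 1200 each)
\[
20 \times 1920 \times 1200 \approx 4.6 \times 10^{7}\ \text{pixels} \approx 45\ \text{megapixels}
\]
% Per-CPU share of the quoted quarter teraflop across 20 dual-CPU nodes
\[
\frac{250\ \text{GFLOPS}}{20 \times 2\ \text{CPUs}} \approx 6\ \text{GFLOPS per CPU}
\]
```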
The World’s Largest Tiled Display Wall—
Calit2@UCI’s HIPerWall
HDTV, Digital Cameras, Digital Cinema, Zeiss Scanning Electron Microscope
Center of Excellence in Calit2@UCI, Albert Yee, PI
Calit2@UCI Apple Tiled Display Wall
Driven by 25 Dual-Processor G5s
50 Apple 30” Cinema Displays
200 Million Pixels of Viewing Real Estate!
Falko Kuester and Steve Jenks, PIs
Featured in Apple Computer's “Hot News”
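The pixel count checks out: the Apple 30" Cinema Display is 2560 x 1600 (the same panel resolution quoted for the Varrier display on the next slide), so

```latex
\[
50 \times 2560 \times 1600 \approx 2.05 \times 10^{8} \approx 200\ \text{million pixels}
\]
```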
3D Videophones Are Here!
The Personal Varrier Autostereo Display
• Varrier is a Head-Tracked Autostereo Virtual Reality Display
– 30" LCD Widescreen Display with 2560x1600 Native Resolution
– A Photographic Film Barrier Screen Affixed to a Glass Panel
– The Barrier Screen Reduces the Horizontal Resolution To 640 Lines
• Cameras Track Face with Neural Net to Locate Eyes
• The Display Eliminates the Need to Wear Special Glasses
Source: Daniel Sandin, Thomas DeFanti, Jinghua Ge, Javier Girado, Robert Kooima, Tom Peterka—EVL, UIC
How Do You Get From Your Lab
to the National LambdaRail?
“Research is being stalled by ‘information overload,’ Mr. Bement said, because data from digital instruments are piling up far faster than researchers can study them. In particular, he said, campus networks need to be improved. High-speed data lines crossing the nation are the equivalent of six-lane superhighways, he said. But networks at colleges and universities are not so capable. ‘Those massive conduits are reduced to two-lane roads at most college and university campuses,’ he said. Improving cyberinfrastructure, he said, ‘will transform the capabilities of campus-based scientists.’”
-- Arden Bement, the director of the National Science Foundation
www.ctwatch.org
To Build a Campus Dark Fiber Network—
First, Find Out Where All the Campus Conduit Is!
UCSD Campus-Scale Routed OptIPuter with
Nodes for Storage, Computation and Visualization
The New Optical Core of the UCSD Campus-Scale Testbed:
Evaluating Packet Routing versus Lambda Switching
Goals by 2007:
>= 50 Endpoints at 10 GigE
>= 32 Packet Switched
>= 32 Switched Wavelengths
>= 300 Connected Endpoints
Approximately 0.5 Tbit/s Arrives at the “Optical” Center of Campus
Switching will be a Hybrid Combination of Packet, Lambda, and Circuit (OOO and Packet Switches Already in Place)
Funded by NSF MRI Grant
Lucent, Glimmerglass, Force10
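The aggregate figure follows directly from the endpoint goal:

```latex
\[
50\ \text{endpoints} \times 10\ \text{Gb/s} = 500\ \text{Gb/s} \approx 0.5\ \text{Tbit/s}
\]
```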
OptIPuter@UCI is Up and Working
Network diagram of UCI campus OptIPuter connectivity. Elements shown:
– ONS 15540 WDM at the UCI campus MPOE (CPL), with 10 GE and 1 GE DWDM network lines toward the Tustin CENIC CalREN POP, Los Angeles, and the UCSD OptIPuter network
– Wave-1 (1 GE): UCSD address space 137.110.247.242-246, NACS-reserved for testing
– Wave-2 (layer-2 GE): UCSD address space 137.110.247.210-222/28
– Calit2 Building; Engineering Gateway Building (SPDS Viz Lab); HIPerWall; UCInet; ESMF
– Catalyst 6500s on Floors 2, 3, and 4; MDF Catalyst 6500 with firewall in a 1st-floor closet; Catalyst 3750s in the 3rd-floor IDF, the NACS machine room (OptIPuter), and CSI
– Kim-Jitter Measurements This Week!
Diagram created 09-27-2005 by Garrett Hildebrand; modified 11-03-2005 by Jessica Yu
Calit2/SDSC Proposal to Create a UC Cyberinfrastructure
of OptIPuter “On-Ramps” to TeraGrid Resources
OptIPuter + CalREN-XD + TeraGrid =
“OptiGrid”
UC Davis
UC San Francisco
UC Berkeley
UC Merced
UC Santa Cruz
UC Los Angeles
UC Santa Barbara
UC Riverside
UC Irvine
UC San Diego
Creating a Critical Mass of End Users
on a Secure LambdaGrid
Source: Fran Berman, SDSC; Larry Smarr, Calit2
OptIPuter Software Architecture: a Service-Oriented
Architecture Integrating Lambdas Into the Grid
Source: Andrew Chien, UCSD
Layered architecture diagram:
– Distributed Applications / Web Services
– Visualization and Data Services: Telescience, SAGE, JuxtaView, Vol-a-Tile, LambdaRAM
– Distributed Virtual Computer (DVC) API and DVC Runtime Library
– DVC Services: DVC Configuration, DVC Communication, DVC Job Scheduling, and DVC Core Services (Resource Identify/Acquire, Namespace Management, Security Management, High Speed Communication, Storage Services)
– Grid and Storage Services: Globus, GSI, GRAM, XIO, RobuStore, PIN/PDC
– Lambda Discovery and Control
– Transport Protocols over IP and Lambdas: GTP, CEP, XCP, LambdaStream, UDT, RBUDP
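A minimal sketch of the usage pattern this stack implies: an application asks the middleware for a dedicated lambda between two endpoints, then stripes its data across parallel streams. This is illustrative Python only; the names below (LambdaReservation, reserve_lambda, parallel_copy) are hypothetical, not the actual DVC or transport-protocol APIs.

```python
# Hypothetical sketch of an OptIPuter-style workflow: reserve a lambda, then
# stripe a transfer across parallel streams. Names are illustrative only.
import socket
import threading
from dataclasses import dataclass

@dataclass
class LambdaReservation:
    """Placeholder for a reservable end-to-end optical path (hypothetical)."""
    endpoint_a: str
    endpoint_b: str
    gbps: int
    token: str = "demo-reservation"

def reserve_lambda(a: str, b: str, gbps: int) -> LambdaReservation:
    # A real LambdaGrid middleware would negotiate with an optical control plane;
    # here we only record the request so the sketch stays self-contained.
    return LambdaReservation(a, b, gbps)

def stream_chunk(host: str, port: int, payload: bytes) -> None:
    # One TCP stream standing in for a high-speed transport protocol.
    with socket.create_connection((host, port)) as s:
        s.sendall(payload)

def parallel_copy(host: str, ports: list[int], data: bytes) -> None:
    # Stripe the data across several streams, the way large transfers are
    # parallelized over a dedicated lambda.
    n = len(ports)
    chunk = max(1, (len(data) + n - 1) // n)
    threads = [
        threading.Thread(target=stream_chunk, args=(host, p, data[i * chunk:(i + 1) * chunk]))
        for i, p in enumerate(ports)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

if __name__ == "__main__":
    rsv = reserve_lambda("optiportal.ucsd.example", "evl.uic.example", gbps=10)
    print(f"Reserved {rsv.gbps} Gb/s lambda: {rsv.endpoint_a} <-> {rsv.endpoint_b}")
```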
PI Larry Smarr
Announced January 17, 2006
$24.5M Over Seven Years
Marine Genome Sequencing Project
Measuring the Genetic Diversity of Ocean Microbes
CAMERA's Sorcerer II Data Will Double Number of Proteins in GenBank!
CAMERA’s Direct Access Core Architecture
Will Create Next Generation Metagenomics Server
Architecture diagram elements:
– Data sources: Sargasso Sea Data, Moore Marine Microbial Project, NASA Goddard Satellite Data, Community Microbial Metagenomics Data, JGI Community Sequencing Project, Sorcerer II Expedition (GOS)
– Core: DataBase Farm and Flat File Server Farm on a 10 GigE Fabric, plus a Dedicated Compute Farm (1000 CPUs)
– Access: Web Portal (request/response plus Web Services) for the Traditional User; Direct Access Lambda Connections to the user's Local Environment, Web (other service), and Local Cluster
– TeraGrid: Cyberinfrastructure Backplane (scheduled activities, e.g. all-by-all comparison) (10,000s of CPUs)
Source: Phil Papadopoulos, SDSC, Calit2
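To illustrate why the “all-by-all comparison” belongs on a backplane with 10,000s of CPUs: the n(n-1)/2 pairwise comparisons are independent, so they parallelize cleanly across cores. A toy Python sketch (the similarity function is a stand-in; real CAMERA pipelines would run alignment tools, not this):

```python
# Toy partitioning of an all-by-all comparison across worker processes.
from itertools import combinations
from multiprocessing import Pool

def similarity(pair):
    # Stand-in for an alignment score; real pipelines would run BLAST-style
    # comparisons here instead of simple character-set overlap.
    a, b = pair
    return a, b, len(set(a) & set(b))

def all_by_all(sequences, workers=2):
    # Every sequence against every other: n*(n-1)/2 independent pairs,
    # which is why the job maps naturally onto thousands of CPUs.
    pairs = list(combinations(sequences, 2))
    with Pool(workers) as pool:
        return pool.map(similarity, pairs)

if __name__ == "__main__":
    demo = ["ACGTACGT", "ACGTTTTT", "GGGGCCCC"]
    for a, b, score in all_by_all(demo):
        print(a, b, score)
```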
The Future Home of the Moore Foundation Funded
Marine Microbial Ecology Metagenomics Complex
First Implementation of
the CAMERA Complex
Major Buildout of Calit2
Server Room Underway
Photo Courtesy Joe Keefe, Calit2
Calit2 and the Venter Institute Will Combine
Telepresence with Remote Interactive Analysis
Live Demonstration of 21st Century National-Scale Team Science
25 Miles
Venter Institute
OptIPuter Visualized Data
HDTV Over Lambda
UIC/UCSD 10GE CAVEWave on the National LambdaRail
Emerging OptIPortal Sites
OptIPortals: UW, UIC EVL, MIT (NEW!), JCVI (NEW!), UCI, UCSD, SIO, SunLight, SDSU, CICESE
CAVEWave Connects Chicago to Seattle to San Diego…and Washington D.C. as of 4/1/06, and JCVI as of 5/15/06
Borderless Collaboration
Between Global University Research Centers at 10Gbps
Maxine Brown, Tom DeFanti, Co-Chairs
iGrid 2005
THE GLOBAL LAMBDA INTEGRATED FACILITY
www.igrid2005.org
September 26-30, 2005
Calit2 @ University of California, San Diego
California Institute for Telecommunications and Information Technology
100Gb of Bandwidth into the Calit2@UCSD Building
More than 150Gb GLIF Transoceanic Bandwidth!
450 Attendees, 130 Participating Organizations
20 Countries Driving 49 Demonstrations
1- or 10- Gbps Per Demo
CineGrid Leverages OptIPuter Cyberinfrastructure
to Enable Global “Extreme Media” Collaboration
• CineGrid Experiments Aim to Push the State of the Art in:
– Streaming & Store-and-Forward File Transfer Using High-Speed, Low-Latency Network Protocols
– HDTV in Various Formats for Teleconferencing, Telepresence, and Production
– 2K and 4K Digital Cinema Workflows and Distribution
– Stereo in High Resolution (2K, 4K)
– Virtual Reality in Higher Resolution (24 Megapixel)
– Distributed Tiled Displays with 20-100 Megapixels
– Long-Term Digital Archiving
• International Workshops: Tokyo, July 2006; Calit2, Dec 2006
Source: Tom DeFanti, Laurin Herr
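For a sense of the bandwidth these experiments need (assumed parameters, not the slide's): an uncompressed 4K digital-cinema stream at 4096 x 2160, 24 frames/s, and 12-bit 4:4:4 color comes to roughly

```latex
% Assumed: 4096 x 2160 pixels, 24 frames/s, 3 x 12 = 36 bits per pixel
\[
4096 \times 2160 \times 24\ \tfrac{\text{frames}}{\text{s}} \times 36\ \tfrac{\text{bits}}{\text{pixel}} \approx 7.6\ \text{Gb/s}
\]
```

which is why the demos run on dedicated 1-10 Gbps lambdas, with compression when less bandwidth is available.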
OptIPuter 4K Telepresence over IP at iGrid 2005
Demonstrated Technical Basis for CineGrid
New Calit2 Digital Cinema Auditorium Lays Technical Basis for Global Digital Cinema
Keio University President Anzai
UCSD Chancellor Fox
Sony
NTT
SGI
Calit2 Works with CENIC to Provide
the California Optical Core for CineGrid
Partnering with SFSU's Institute for Next Generation Internet
CineGrid™ will Link UCSD/Calit2 and the USC School of Cinema-Television with the Keio University Research Institute for Digital Media and Content
Prototype of CineGrid™
Map labels: SFSU; UCB; Digital Archive of Films; USC (Extended SoCal OptIPuter to USC); Calit2 UCI; Calit2 UCSD
Plus, 1Gb and 10Gb Connections to:
• Seattle, Canada, Japan, Asia, Australia, New Zealand
• Chicago, Canada, Japan, Europe, Russia, China
• Tijuana
Source: Laurin Herr, Pacific Interface, CineGrid™ Project Leader
Calit2 and the Venter Institute Test CineGrid™
with HDTV Movie by John Carter
StarLight Chicago
Sony HDTV JH-3
JCVI: JC Venter Institute, Rockville, MD
Calit2 Auditorium
Live Demonstration of 21st Century Entertainment Delivery, June 14, 2006
iGrid 2005
Kyoto Nijo Castle
Source: Toppan Printing
Interactive VR Streamed Live from Tokyo to Calit2 Over Dedicated GigE and Projected at 4K Resolution
iGrid 2005 Cultural Heritage
China and USA
• Great Wall Cultural Heritage
• International Media Centre, China
• San Diego State University, USA
• Great Wall Society, China
• San Diego Supercomputer Center, USA
• Chinese Academy of Sciences, China
• GLORIAD, USA
• Chinese Institute of Surveying and Mapping, China
• Cybermapping Lab, University of Texas-Dallas, USA
• GEON viz/3D-scanning lab, University of Idaho, USA
• Stanford University, USA
Source: Maxine Brown, EVL UIC
Data Acquisition from Laser Scanning Combined with Photogrammetry
Enables the Construction of Unique Cultural Heritage Images from China.
3D Designs Of The Great Wall Are Combined With 3D Scans Of Physical
Images And Satellite Imagery Stored On Servers In China And San Diego.
www.internationalmediacentre.com/imc/index.html
NSF’s Ocean Observatories Initiative (OOI)
Envisions Global, Regional, and Coastal Scales
LEO15 Inset Courtesy of Rutgers University, Institute of Marine and Coastal Sciences
$300M in President's Budget for OOI
Coupling Regional and Coastal Ocean Observatories
Using OptIPuter and Web/Grid Services
www.neptune.washington.edu
www.mbari.org/mars/
LOOKING: Laboratory for the Ocean Observatory Knowledge Integration Grid
LOOKING Funded by NSF ITR (John Delaney, UWash, PI)
www.sccoos.org/
Using the OptIPuter to Couple Data Assimilation Models
to Remote Data Sources Including Biology
NASA MODIS Mean Primary Productivity
for April 2001 in California Current System
Regional Ocean Modeling System (ROMS)
http://ourocean.jpl.nasa.gov/
Interactive Remote Data and Visualization Services
Visualization Services:
• Scientific-Info Visualization
• AMR Volume Visualization
• Glyph and Feature Vis
• Multiple Scalable Displays
• Hardware Pixel Streaming
• Distributed Collaboration
Data Mining Services:
• Data Mining for Areas of Interest
• Analysis and Feature Extraction
NCSA Altix Data and Vis Server Linking to OptIPuter
National Laboratory for Advanced Data Research: An SDSC/NCSA Data Collaboration
The Synergy of Digital Art and Science
Visualization of JPL Simulation of Monterey Bay
4k Resolution
Source: Donna Cox, Robert Patterson, NCSA
Funded by NSF LOOKING Grant
First Remote Interactive High Definition Video
Exploration of Deep Sea Vents-Prototyping NEPTUNE
Canadian-U.S. Collaboration
Source: John Delaney & Deborah Kelley, UWash
High Definition Still Frame of Hydrothermal Vent Ecology, 2.3 Km Deep (scale bar: 1 cm)
White Filamentous Bacteria on 'Pill Bug' Outer Carapace
Source: John Delaney and Research Channel, U Washington