LUSI
WBS 1.6 Controls and Data Systems
Breakout Session
G. Haller – Sub-System Manager
August 20, 2008
Content
Scope
Cost & Schedule
WBS Organization
Cost
Schedule
Control and Data System Architecture
Control System
Architecture
Devices
Controller Examples
Data System
Data Systems Architecture
Science Data Acquisition & Processing
DAQ Components
High Level Applications, Online Archive
Offline File Management, Meta Data
Summary
LUSI Controls & Data Systems Location
Controls & Data Systems hardware/software located in:
Hutches 3, 4, and 5
Control rooms for hutches 3, 4, and 5
X-Ray tunnel
NEH and FEH server rooms
[Diagram: LCLS feeds the NEH (hutches AMO, SXR, XPP; H3 control room & server room), the X-Ray Transport Tunnel (XRT, 200 m), and the FEH (hutches XCS, CXI imaging, HEDS; H4/5 control rooms & server room); H: hutch]
Common control and data systems design for photon beam-line/instruments (XTOD, AMO, LUSI, SXR)
Scope – WBS 1.6 Control & Data Systems
Included in W.B.S. 1.6
All controls & DAQ (labor and M&S) for XPP, CXI, and XCS instrument
components, with diagnostics/common optics, included in the baseline
Includes controllers, racks, cables, switches, installation
Data-storage and processing for FEH
Initial offline computing (more effort will be on the operating budget)
Input-signals to LCLS machine protection system link-node modules
Provided by LCLS X-Ray End Station controls (CAM is G. Haller)
Personnel protection system
Machine protection system (LCLS modules, fibers)
Laser safety system
Accelerator timing
Femto-second laser timing
Network architecture & security
Data-storage and processing for NEH
User safeguards
Laser controls
CXI 2D detector controls
Interfaces described in
1.1-517 ICD between XES and LUSI (released document)
Performance Requirements
From LUSI Performance Execution Plan (PEP)
This presentation will show that the requirements
will be fulfilled
1.6 WBS to Level 4
1.6 Control & Data Systems
  1.6.1 Integration & Management
  1.6.2 Common Controls
  1.6.3 XPP
  1.6.4 CXI
  1.6.5 XCS
  1.6.6 Offline Computing
Level 4, shown for XPP as the example:
  1.6.3.1 Requirements, Design, Setup
  1.6.3.2 Standard Hutch Controls
  1.6.3.4 Specific Controls
W.B.S 1.6.2 Common Controls
W.B.S. 1.6.2 Common Controls
Photon beam feedback
Electron beam feedback
Hutch environmental measurement
FEH data storage
Data processing
Initial level 2 processing
Racks & cables
Non-hutch racks and cables, mainly FEH
W.B.S 1.6.3, 1.6.4, 1.6.5
W.B.S. 1.6.3 XPP, 1.6.4 CXI, 1.6.5 XCS
Requirements, design, setup
Standard hutch controls
Hutch cables, racks, installation
Workstations
Beamline processor
Channel access gateway
Machine protection system interface
Specific controls
Valve/vacuum controls
Pop-in profile monitor
Pop-in intensity monitor
Intensity position monitor
Slit controls
Instrument specific controls for each section of the instrument
W.B.S 1.6.6 Offline Computing
W.B.S. 1.6.6 Offline Computing
Data-format
API
Data-catalog
Meta-data management
Processing framework
Workflow
Pipeline
Cost Methodology
Basis for agreement on what components need to be controlled and
how
Detailed Engineering Specification Documents (ESD’s) for each
instrument
All ESD’s are approved and released
Two ESD’s for each instrument
Controls ESD
Describing devices to be controlled
E.g. motion, vacuum
EPICS processing to be performed
E.g. scanning
DAQ ESD
Describing devices to be read into DAQ
E.g. 2-D detectors, waveform sampling, some 120-Hz cameras, etc
Online processing to be performed
Plus one ESD for diagnostics
Basis for agreement on who is responsible for what and where the
interface is: Interface Control Documents (ICD’s)
ICD’s to all instruments are approved and released
Example XPP Beam-Line
Start from beam-line, itemize controls
ESD’s and ICD’s
XPP
SP-391-001-21 XPP Controls ESD
SP-391-001-22 XPP Controls & DAQ ICD
SP-391-001-23 XPP DAQ ESD
CXI
SP-391-001-13 CXI Controls ESD
SP-391-001-14 CXI Controls & DAQ ICD
SP-391-001-18 CXI DAQ ESD
XCS
SP-391-001-24 XCS Controls ESD
SP-391-001-25 XCS Controls & DAQ ICD
SP-391-001-26 XCS DAQ ESD
Diagnostics
SP-391-001-19 LUSI Common Diag. & Optics ESD
All documents at
http://confluence.slac.stanford.edu/display/PCDS/LUSI+Document+Page
Cost Methodology
Bottoms-up: supporting Excel spread-sheet organized by WBS, created
from ESD content (agreement between scientist and controls)
Labor
Number of hours and detailed tasks for each WBS
Based on prior experience from previous SLAC experiments
Material
Lists each individual component to be purchased with price under
each WBS
Each item is labeled with reference number
Reference number references component on LUSI Controls Item list
spread-sheet
List of every controls item used for LUSI
> 95% of components supported by quotes or purchase orders
All items on item list supported by quote or purchase order printout
WBS Spread-Sheet Example
Spread-sheet columns: WBS | Activity | BOE | Hours | Cost | Item | Item # | Count | $/each | Total
The Item # references the Controls Item list (see next slide)
LUSI Controls Item List
[Table: first 14 of the ~70 items on the LUSI Controls Item list]
Components in the WBS spread-sheet refer to these items by reference number
Price-support pages containing copies of previous orders or quotes are labeled with this item #
Contingency
Contingency calculated for each element from two
factors
Design Maturity
6 levels for labor
5 levels for M&S
Judgment Factor
Risks, exchange rate, etc
Held at project level
Project Budget
Detailed bottoms-up cost estimate
Labor: number of hours listed for each task
All M&S itemized to the component level
Almost 100% supported by vendor quotes or recent purchase
orders
WBS                  Control Accounts  Work Packages  Total BAC
WBS 1.1                     6                12       $5,461,314
WBS 1.2                    14                49       $5,942,486
WBS 1.3                    11                45       $9,486,460
WBS 1.4                    16                45       $7,715,265
WBS 1.5                    10                39       $6,383,995
WBS 1.6 (G. Haller)        20               289       $7,135,691
LUSI total BAC                                        $42,125,211

WBS 1.6 by resource type:
Labor      $3,409,458
Non-Labor  $3,726,233
Total BAC  $7,135,691
Schedule
All tasks and materials (order, award, receive dates) in P3
1.6 is internally linked with predecessors and successors
“Available” milestones for each deliverable identified and entered
Linked to instrument “Need” milestones
Resources leveled
Milestones
XPP
  XPP Controls PDR: Dec 08
  CD-3A – XPP Instrument Start Construction: Jun 09
  XPP Controls FDR: Sept 09
  XPP Controls available: Mar 10
  CD-4A – XPP Start Operation: Dec 10
CXI
  CXI Controls PDR: Sep 09
  CD-3B – CXI Instrument Start Construction: Apr 10
  CXI Controls FDR: Jun 10
  CXI Controls available: Nov 10
  CD-4B – CXI Start Operation: Dec 11
XCS
  XCS Controls PDR: Nov 09
  CD-3C – XCS Instrument Start Construction: Apr 10
  XCS Controls FDR: Feb 11
  XCS Controls available: Jul 11
  CD-4C – XCS Start Operation: Aug 12
CDA Schedule Critical Envelope
CDA has multiple deliveries to the instruments and is heavily driven
by their needs. The project will monitor the strings of activities
with the least float.
Slow Controls Tasks & Hardware
EPICS
In use at BaBar, APS, ALS
It is the LCLS control system
Basic EPICS Control and Monitoring
Vacuum: Instruments, connecting ‘pipes’
Valve control
Timing/triggering (timing strobe from EVR)
Motion control (‘stages’)
Camera control
Bias voltage supplies
120-Hz (slow) Analog-Digital Converters
Digital IO bits/states
Temperatures
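
As a hedged illustration of the basic EPICS control and monitoring points listed above (the PV names are hypothetical, and a pyepics-style caget/caput client is assumed; the slide does not name a specific client library):

from epics import caget, caput  # assumed channel-access binding

MOTOR = "XPP:STAGE:X"  # hypothetical motion-stage PV (EPICS motor record)

caput(MOTOR + ".VAL", 12.5, wait=True)  # request a move, wait for completion
readback = caget(MOTOR + ".RBV")        # encoder/readback position
temp = caget("XPP:HUTCH:TEMP1")         # slow (120-Hz-class) monitoring point
print("stage at %.3f mm, hutch at %.1f C" % (readback, temp))
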
Hardware
As much as feasible chosen from LCLS repertoire
Added new controllers based on instrument requirements
Common Controls Hardware
Examples
Racks
VME Crates
Motorola CPUs
Timing EVR PMC cards
Cameralink PMC cards
VME ISEG HV supplies
Analog-digital converter modules
Solenoid controllers
PLCs
Network switches
Terminal servers (Ethernet-to-Serial Port)
Example: Motion Systems
Newport XPS motion controller
Common Diagnostics Readout
E.g. intensity and profile monitors, intensity position monitors
E.g. Canberra PIPS or IRD SXUV large-area diodes (single or quad)
Amplifier/shaper/ADC for control/calibration/readout
[Diagram: quad-detector geometry (target, distance L, R1, R2, q1, q2)]
Four-diode design
On-board calibration circuits not shown
Interface to LCLS
Interface to LCLS/X-Ray End-Station Infrastructure
Machine timing (~ 20 psec jitter)
Laser timing (< 100 fsec jitter)
120 Hz beam data
Machine protection system
Hutch protection system
Laser safety system
Networking
EPICS server
120-Hz Data Feedback Loop
Low-latency 120 Hz beam-line data communication
Uses the existing second Ethernet port on the IOCs
No custom or additional hardware required
UDP multicast (transport sketched below)
Raw Ethernet packets
[Diagram: dedicated 120-Hz network linking accelerator, EO timing, and experiment IOCs]
Real-time per-pulse information can be used, e.g., for:
Vetoing of image samples (using accelerator data)
Adjustment of accelerator or photon beam-line components based on instrument/diagnostics results
Compensation of drifts, etc.
Transport of the electro-optic timing result to hutch experiments
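
A minimal sketch of this transport, assuming plain UDP multicast over the dedicated 120-Hz network (group, port, and payload layout are illustrative, not the deployed values):

import socket
import struct

MCAST_GRP, MCAST_PORT = "239.255.24.1", 10148  # hypothetical group/port

# Receiver side (e.g. an experiment IOC): join the multicast group.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
rx.bind(("", MCAST_PORT))
mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY)
rx.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

# Sender side (e.g. an accelerator IOC): publish one pulse's data.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # stay on the local net
tx.sendto(struct.pack("!Id", 12345, 0.0042), (MCAST_GRP, MCAST_PORT))  # pulse id + one beam quantity

pulse_id, value = struct.unpack("!Id", rx.recvfrom(64)[0])
print("pulse %d: value %.4f" % (pulse_id, value))
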
Data Sub-System
Difference from conventional X-ray experiments: high peak rate & large
volume, comparable to high-energy physics experiments such as BaBar at SLAC
Data Rate/Volume of CXI Experiment
(comparable to other LUSI experiments)
LCLS pulse rep rate: 120 Hz
Detector size: 1.2 Megapixel
Intensity depth: 14 bit
Success rate: 30%
Ave. data rate: 0.6 Gigabit/s
Peak data rate: 1.9 Gigabit/s
Daily duty cycle: 50%
Accumulation for 1 station: 3.1 TB/day
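
A quick back-of-envelope check that the table is internally consistent, starting from the quoted peak rate (small rounding differences expected):

peak_gbps = 1.9                       # 1.2 Mpixel x 14 bit x 120 Hz
avg_gbps = peak_gbps * 0.30           # 30% success rate -> ~0.6 Gbit/s
tb_per_day = avg_gbps / 8 * 0.50 * 86400 / 1e3   # 50% daily duty cycle
print("avg %.1f Gbit/s, %.1f TB/day" % (avg_gbps, tb_per_day))  # ~0.6, ~3.1
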
Challenge is to perform data correction and image processing while
keeping up with the continuous incoming data stream
The SLAC Particle Physics and Astrophysics group involved has an
advantage, since it has substantial experience acquiring and processing
large data volumes at high rates
Coherent Imaging of Single Molecules
• Diffraction from a single molecule: a single LCLS pulse gives a noisy diffraction pattern of unknown orientation
• Combine 10^5 to 10^7 measurements into a 3D dataset:
  Classify/sort
  Average
  Alignment
The highest achievable resolution is limited by the ability to group patterns of similar orientation
• Reconstruct by oversampling phase retrieval (Miao, Hodgson, Sayre, PNAS 98 (2001))
Gösta Huldt, Abraham Szöke, Janos Hajdu (J. Struct. Biol., 2003; 02-ERD-047)
Data System Architecture
[Diagram: data flow from detector + ASIC and FEE (detector specific) through the Photon Control Data Systems (PCDS): L0: Control, L1: Acquisition, L2: Processing, L3: Data Cache; beam-line data and timing feed L1]
Detector
  Experiment specific
  May be bump-bonded to the ASIC or integrated with the ASIC
Front-End Electronics (FEE)
  Provide local configuration registers and state machines
  Provide ADC if the ASIC has analog outputs
  FEE uses an FPGA to transmit to the DAQ system
Level 0 Nodes
Level 0: Control
DAQ operator consoles
Provide different functionalities
Run control
Partition management, data-flow
Detector control
Configuration (modes, biases, thresholds, etc)
Run monitoring
Data quality
Telemetry monitoring
Temperatures, currents, voltages, etc
Manage all L1, L2 and L3 nodes in a given partition (i.e. the set
of DAQ nodes used by a specific experiment or test-stand)
Level 1 Nodes
Level 1: Acquisition
Receive 120 Hz timing signals, send trigger to FEE, acquire FEE
data
Error detection and recovery of the FEE data
Control FEE parameters
Calibration (sketched after this list)
Dark image accumulation and averaging
Transfer curve mapping, gain calculation
Neighbor pixel cross-talk calculation
Event-build FEE science data with beam-line data
Image processing
Pedestal subtraction using calibration constants, cross-talk corrections
Partial data reduction (compression)
Rejection using 120 Hz beam-line data
Processing envisioned both in software and firmware (VHDL)
Send collected data to Level 2 nodes over 10 Gb/s Ethernet
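
A minimal NumPy sketch of the dark-image averaging and pedestal subtraction named above (the production version runs in C++/VHDL on the Level 1 nodes; this shows only the arithmetic):

import numpy as np

def average_darks(dark_frames):
    """Accumulate dark images and average them into per-pixel pedestals."""
    return np.mean(np.asarray(dark_frames, dtype=np.float64), axis=0)

def pedestal_subtract(image, pedestal):
    """Subtract per-pixel pedestals from a raw frame, clipping at zero."""
    return np.clip(image.astype(np.float64) - pedestal, 0, None)

# Usage: average 10 dark frames, then correct one beam frame.
darks = [np.random.poisson(50, (512, 512)) for _ in range(10)]
pedestal = average_darks(darks)
frame = np.random.poisson(50, (512, 512)) + 300  # fake signal on top of pedestal
corrected = pedestal_subtract(frame, pedestal)
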
Level 2 & 3 Nodes
Level 2: Processing
High level data processing:
Learn, pattern recognition, sort, classify
e.g. combine 10^5 – 10^7 images into a 3D data-set (toy sketch below)
Alignment, reconstruction
Currently evaluating different ATCA blades for L2 nodes
Send processed data to L3 over 10 Gb/s Ethernet
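
A toy sketch of the classify-and-average step: the real classification of patterns with unknown orientation is the hard research problem noted on the coherent-imaging slide; a plain correlation score stands in for it here.

import numpy as np

def classify_and_average(patterns, references):
    """Assign each pattern to its most-correlated reference class,
    then average within each class."""
    sums = [np.zeros(r.shape) for r in references]
    counts = [0] * len(references)
    for p in patterns:
        scores = [float(np.vdot(p, r)) for r in references]  # crude similarity
        k = int(np.argmax(scores))
        sums[k] += p
        counts[k] += 1
    return [s / max(c, 1) for s, c in zip(sums, counts)]
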
Level 3: Data Cache
Provide data storage
Located in server room in experimental hall
Off-line system will transfer data from local cache to tape staging
system
Tape staging system located in SLAC central computing facilities
Must be able to buffer up data in local storage during downtimes of
staging system
ATCA Crate
ATCA: Advanced Telecommunications Computing Architecture
Based on a backplane serial communication fabric; we use 10-Gigabit Ethernet
2 custom boards:
  Reconfigurable Cluster Element (RCE) module
    Interface to the detector
    Up to 8 x 2.5 Gbit/sec links to detector modules
  Cluster Interconnect Module (CIM)
    Managed 24-port 10-G Ethernet switching
One ATCA crate can hold up to 14 RCE boards & 2 CIMs
  Essentially 480 Gbit/sec switch capacity
Naturally scalable; crates can also be scaled up
[Photos: RCE, CIM, ATCA crate]
Reconfigurable Cluster Element (I)
The RCE is the most challenging among the different Level 1 node types
SLAC custom-made ATCA board
  Used in other SLAC experiments
Based on System-on-Chip (SoC) technology
  Implemented with Xilinx Virtex-4 devices, FX family
Xilinx devices provide:
  Reconfigurable FPGA fabric
  DSPs (200 for the XC4VFX60)
  Generic CPUs (2 PowerPC 405 cores running at 450 MHz for the XC4VFX60)
    The PowerPC is the hard-core CPU of choice in next-generation FPGAs
  TEMAC: Xilinx Tri-Mode Ethernet MAC hard cores
  MGT: Xilinx Multi-Gigabit Transceivers, 622 Mb/s to 6.5 Gb/s (16 for the XC4VFX60)
[Photo: RCE with RTM]
Reconfigurable Cluster Element (II)
System Memory Subsystem
512 MB of RAM
Memory controller provides 8 GB/s
overall throughput
Uses Micron RLDRAM II
Platform Flash Memory Subsystem
Stores firmware code for FPGA fabric
Configuration Flash Memory
Subsystem
128 MB configuration flash
Dedicated file system for storing
software code and configuration
parameters (up to 16 selectable images)
Storage Flash Memory Subsystem
(optional)
Up to 1TB per RCE persistent storage
flash (currently 256GB per RCE)
Low latency/high bandwidth access
through I/O channels using PGP
Uses Samsung K9NBG08 (32 Gb per
chip)
RCE Software
Software
Ported open source Real-Time kernel
Adopted RTEMS: Real Time Operating Systems for
Multiprocessor Systems
BSP (board support package) written mainly in C++
Plus some C and assembly
Written 10Gb Ethernet driver and PGP drivers for bulk
data
1Gb management interface driver
Built interface to RTEMS TCP/IP network stack
Developed specialized network stack for zero-copy
Ethernet traffic
Cluster Interconnect Module
ATCA network card
SLAC custom made board
Based on two 24-port 10Gb Ethernet switch
ASICs from Fulcrum
Up to 480 Gb/s total bandwidth
Managed via Virtex-4 device
Currently XC4VFX12
Fully managed layer-2, cut-through switch
Interconnect up to 14 in-crate RCE boards
(i.e. 28 RCEs)
Interconnect multiple crates for additional
scalability
Fully configurable
Designed to optimize crates populated with
RCE boards
Ability to use ATCA redundant lanes for
additional bandwidth if desired
Ability to use 2.5Gb/s connections in place of
standard 1Gb/s Ethernet
At the same time may be configured to
connect standard ATCA blades
Experiment Front-End Board
Interfaces to detector ASIC
Control signals
Row/column clocks
Biases/thresholds
Analog pixel voltage
Contains
Communication IP core
Local configuration state machine
Local image readout state machine
Example: SLAC board
FPGA with
MGT interfaces, up to 4 x 2.5
Gbit/sec fiber IO
~ 200 digital IO
VHDL programmed
Includes communication IP
core provided by SLAC
CXI 2D-Detector Control and DAQ Chain
[Diagram: Cornell detector/ASIC (in vacuum) → SLAC FPGA front-end board → fiber (ground isolation) → ATCA crate with SLAC DAQ boards]
Each Cornell detector has ~36,000 pixels
  Controlled and read out using a Cornell custom ASIC
  ~36,000 front-end amplifier circuits and analog-to-digital converters
Initially 16 x 32,000-pixel devices, then up to 64 x 32,000-pixel devices
4.6 Gbit/sec average with > 10 Gbit/sec peak
Calibration & Distribution (using SLAC DAQ)
Noise (using SLAC DAQ)
XPP 2D-Detector Control and DAQ Chain
BNL XAMP detector, 1,024 x 1,024 array
Uses 16 64-channel FexAmp BNL custom ASICs
Instantaneous readout: 4 ch x 20 MHz x 16 bit per ASIC, x 16 ASICs ≈ 20 Gbit/sec into the FPGA
FPGA output: 250 Mbytes/s at 120 Hz (1024 x 1024 x 2 bytes x 120)
The FexAmp prototype ASIC has been received at SLAC; configuration and
readout tests using the SLAC LCLS DAQ system have begun
[Diagram: detector → ASIC board with readout ASIC plus ADCs → SLAC standard front-end board → fiber → SLAC LCLS DAQ ATCA crate]
DAQ Waveform Sampling Digitizer
Agilent Acqiris DC282 high-speed 10-bit cPCI Digitizer
4 channels
2-8 GS/s sampling rate
Acquisition memory from 1024 kpoints to 1024 Mpoints
Low dead time (350 ns) sequential recording with time stamps
6U PXI/CompactPCI standard, 64 bit, 66 MHz PCI bus
Sustained transfer rate up to 400MB/s to host SBC
High-Level Applications
To allow commissioners and users of each experiment to:
Use a common interface to both the DAQ system and EPICS
Speed up the development cycle by using a high-level programming
language, while still being able to easily build critical sections in C/C++
Easily develop new applications
Provide a GUI integrated with the programming language
Re-use code developed by other LUSI experiments
Python as high level scripting language
Easy to learn, fast dev cycle, extensible, open-source, powerful,
relatively fast
Qt as graphical user interface toolkit
Framework and support for scientists provided by PCDS
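
A minimal sketch of this application style, Python logic with a Qt GUI reading one EPICS value (the channel-access binding and PV name are assumptions; the actual PCDS framework API is not shown on the slide):

import sys
from PyQt4 import QtGui, QtCore
from epics import caget  # assumed channel-access binding

PV_NAME = "XPP:EXAMPLE:TEMP"  # hypothetical process variable

class PvMonitor(QtGui.QLabel):
    """Label that periodically polls one EPICS PV and displays its value."""
    def __init__(self, pv, period_ms=1000):
        super(PvMonitor, self).__init__("connecting...")
        self.pv = pv
        timer = QtCore.QTimer(self)
        timer.timeout.connect(self.refresh)
        timer.start(period_ms)

    def refresh(self):
        self.setText("%s = %s" % (self.pv, caget(self.pv)))

if __name__ == "__main__":
    app = QtGui.QApplication(sys.argv)
    w = PvMonitor(PV_NAME)
    w.show()
    sys.exit(app.exec_())
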
Online Archive
The online archive has a dual role
Store science and EPICS data for retrieval/monitoring/analysis by
the online system
Allow DAQ and controls to keep operating during downtimes of the
offline staging system
The archive size depends on the average data rate and the estimated downtime
Initial assumptions:
2 MB per image, 120 Hz pulse rate, 30% success rate, 50% daily duty cycle: ~3.1 TB/day
4 days estimated downtime of the offline staging system (eventually up to 7 days)
Will start with 12 TB, going up to 20 TB before all 3 instruments are operating
Must be able to easily scale the size to accommodate larger detectors
Must be able to store initially > 250 MB/s
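
A quick check of the sizing above, using the slide's inputs (TB here is 10^12 bytes; the slide's 12/20 TB figures round these results):

image_mb = 2.0                                  # MB per image
instantaneous = image_mb * 120                  # 240 MB/s -> the "> 250 MB/s" spec
tb_per_day = instantaneous * 0.30 * 0.50 * 86400 / 1e6   # ~3.1 TB/day
for days in (4, 7):
    print("%d-day buffer: %.1f TB" % (days, tb_per_day * days))  # ~12.4 and ~21.8 TB
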
Online Archive Data Format
Online acquires data from instruments as C++ objects
Each class represents instrument data type or instrument
configuration
Classes might also describe processed instrument data or EPICS
data needed for data analysis
Data written to disk in the native DAQ object-oriented format
Data stored in its memory representation
Classes designed for high performance and self-description
Minimize the read/write operations needed to re-create or store an object
Maximize the ability to adapt to changes in the data structures (e.g.
number of pixels for a given detector) without introducing a new class
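
An illustrative sketch of the self-describing idea: each object framed by a (type id, version, payload length) header, so a reader can interpret known types, skip unknown ones, and tolerate size changes without a new class. The real DAQ format's field layout is not specified here.

import struct

HEADER = struct.Struct("<III")  # type id, version, payload length in bytes

def write_obj(f, type_id, version, payload):
    """Frame one object's raw memory representation."""
    f.write(HEADER.pack(type_id, version, len(payload)))
    f.write(payload)

def read_objs(f):
    """Yield (type_id, version, payload); callers skip types they don't know."""
    while True:
        hdr = f.read(HEADER.size)
        if len(hdr) < HEADER.size:
            return
        type_id, version, nbytes = HEADER.unpack(hdr)
        yield type_id, version, f.read(nbytes)
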
Archive and Offline System
Interface between the archive and the offline system consists of two parts:
Files staged on dedicated local disk
10 Gb/s link between NEH and SCCS for bulk data transfer
Replicated MySQL database used to maintain transfer state
MySQL database in PCDS enclave to share meta-data
information
Availability of a file, completion of a file copy operation, etc
1.6-526 Online/Offline ICD (Interface Control Document)
released
Offline will store the data in HDF5 files
Compatible with NeXus standard for X-ray, neutron and muon
data
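
A minimal sketch of writing a run of images to HDF5 with h5py; the group/dataset names follow NeXus conventions loosely and are illustrative, not the released LUSI schema:

import numpy as np
import h5py

images = np.zeros((10, 1024, 1024), dtype=np.uint16)  # placeholder frames

with h5py.File("run0001.h5", "w") as f:
    det = f.create_group("entry/instrument/detector")
    dset = det.create_dataset("data", data=images,
                              chunks=(1, 1024, 1024),  # one frame per chunk
                              compression="gzip")
    dset.attrs["units"] = "ADU"
    f["entry"].attrs["NX_class"] = "NXentry"  # NeXus-style class attribute
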
File Management, Metadata
1.6-118 Offline Data Management System document released
File Management
Central file manager tracks all files [iRODS].
High-performance parallel filesystem used for disk storage
[Lustre].
Tape system used for long-term archiving [HPSS].
Network-based export interface.
Disk-based (e-SATA, USB) export interface.
Metadata
Science metadata database contains user, run, instrument, and
pulse attributes from online system.
Additional user and run information replicated from electronic
logbook.
All metadata may be queried to locate files and portions of files of
interest.
Metadata is exported with data.
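
A hypothetical illustration of the kind of metadata query described above, locating files for an instrument and run range (table and column names are invented; the real schema is defined by the released data-management document):

import MySQLdb  # assumed binding for the replicated MySQL database

conn = MySQLdb.connect(host="psdb", db="lusi_metadata", user="reader")
cur = conn.cursor()
cur.execute(
    "SELECT f.path FROM files f JOIN runs r ON f.run_id = r.id "
    "WHERE r.instrument = %s AND r.run_number BETWEEN %s AND %s",
    ("XPP", 100, 120))
for (path,) in cur.fetchall():
    print(path)
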
Analysis Options
Can use NeXus or HDF5 tools and analysis packages
NeXus API (open-source and commercial tools): Open Genie, LAMP, GumTree, Nathan, Redas, Scilab, Amortool, IDL-HDF5, Matlab, Mathematica, O-Matrix, ViTables, more …
HDF5 File API (open-source tools): Cactus, Chombo, dxhsf5, H5PartRoot, HL-HDF, ParaView, PyTables, VisAD, many more …
Scientific Computing
Scientific Computing for LUSI Science
Opportunities and needs are being evaluated
Very dependent on the detailed nature of the science
Unprecedented size (for photon science) of data sets to be
analyzed
Unprecedented computational needs (for photon science)
Comparable in scale to a major high-energy physics
experiment
Greater need for flexibility than for a major high-energy
physics experiment
Main scientific computing effort not part of baseline
Risk
Risk
IF there are major changes in the scope, performance, existence, or
placement of CXI/XPP/XCS instrumentation due to evolving user
requirements, THEN it might be difficult to meet the schedule and
budget as specified in P3
Mitigation
Release Engineering Requirement documents
Already done
Adhere to BCR process
LCLS requirement
Participate in Experimental Area design process
Already participating
Controls/DAQ Team Leaders
1.6. CAM: G. Haller
Deputy (P. Anthony)
Online (A. Perazzo)
Controls (D. Nelson)
DAQ (C. O’Grady)
Infrastructure (R. Rodriguez)
Offline Computing (S. Luitz)
Technical leaders are also responsible for AMO and XES-provided photon area
controls/DAQ/infrastructure needed by LUSI
Lowers the risk of interface issues, provides high efficiency
Ensures common solutions
No manpower issue, and the instruments are time-phased; LUSI controls
could be accelerated, all driven by budget availability
Scientist
XPP (D. Fritz)
CXI (S. Boutet)
XCS (A. Robert)
Diagnostics/Common Optics (Y. Feng)
Detectors (N. Van Bakel)
CAM: Control Account Manager
Summary
All control and data systems requirements in LUSI Performance
Execution Plan will be met with system presented for W.B.S. 1.6
Technical, cost, and schedule risks are low
Well documented agreements with instruments
Re-use of LCLS controls software and hardware where appropriate
Cost bottoms-up with detailed quotes for each component
Schedule fully linked and resource leveled
Data subsystem concept & architecture are well developed
Use standard interface to all detectors
CXI and XPP/XCS detector ASICs are already being configured and
read out using the LCLS DAQ system
Data management system provides high bandwidth and is scalable
Leverage significant expertise at SLAC in data acquisition and
management
Ready to be approved for cost and schedule baseline
END OF PRESENTATION