Experimental Area Controls and Data-Acquisition for LCLS and LUSI Instruments
Parallel Session Presentation
Gunther Haller
Research Engineering Group
SLAC Particle Physics and Astrophysics Division
Photon Control and Data Systems (PCDS)
30 October 2007
LCLS & LUSI
Common Controls and DAQ System Design for LCLS and LUSI
Common CAM (G. Haller) for LCLS and LUSI Controls & DAQ, including XTOD controls
Group is called the Photon Control and Data Systems (PCDS) group
LCLS Controls & DAQ responsibility
Common services for all hutches: PPS, laser safety, accelerator timing interface, 120-Hz beam-quality data interface, machine-protection-system interface, user safeguards, network interface
AMO experiment (NEH, hutch 2): all controls and DAQ
2-D detector: control and DAQ for the detector itself only
LUSI Controls & DAQ responsibility
X-Ray Pump Probe, XPP (NEH, hutch 3)
X-Ray Photon Correlation Spectroscopy, XCS (FEH, hutch 1)
Coherent X-Ray Imaging, CXI (FEH, hutch 2)
LCLS & LUSI: local data storage in halls
Example: Overall LUSI Specifications
Per-pulse data collection
Experimental
Diagnostic: EO signal, e⁻ and γ beam parameters
Data management
Unified data model
Archiving capacity: 5 PB/year
Analysis staging storage capacity: 20 TB
Raw data rate and volume
2 Gb/s or higher
On-line storage capacity: 20 TB/day (see the check below)
Timing/triggering
EO timing measurement < 1 ps
Detector trigger < 1 ms
Offline analysis
> 1000-node cluster
Pump-laser operation
Vacuum controls
MPS systems
Laser PPS systems
EO system
Real-time analysis
Frame correction, quality control
To the extent possible: binning, sparsification, FFT
Quick view
Quasi-real-time feedback, 5 frames/s
Alignment
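A quick consistency check on these numbers (a minimal sketch; it assumes the 2-Gb/s raw rate is sustained around the clock, which overstates any real duty cycle):

    #include <cstdio>

    // 2 Gbit/s sustained for one day, compared with the 20-TB/day
    // on-line storage figure above.
    int main(void) {
        const double rate_bit_s = 2e9;                        // 2 Gb/s raw rate
        const double bytes_day  = rate_bit_s / 8.0 * 86400.0; // bytes per day
        std::printf("%.1f TB/day\n", bytes_day / 1e12);       // prints ~21.6
        return 0;
    }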
Controls – List of Components
Measurement instrument
Optics
KB mirrors for focusing
Refractive lens for focusing
Monochromator
Collimator
Slits
Attenuators
Split-delay
Pulse picker
Compressor
Diffractometer
e⁻ and ion TOF
Mass spectrometer
EO timing measurement
Laser systems
Pump laser and diagnostics
EO laser
Molecular alignment laser
Sample environment
Vacuum systems
Particle injector
Cryostat
Cryo-EM stage
Precision stages
Turbo pumps
Ion pumps
2D Detectors
Beam Diagnostics
Intensity monitors
Beam positioning monitor
Wavefront sensor
Cornell detector for CXI
BNL detector for XPP
BNL detector for XCS
Basic Motion System Block Diagram
67 Stepper motors
[Block diagram: beam-line devices (stepper motors, LVDTs, in/out viewing paddles, Newport stepper motors) connect through interface/driver controllers (Hytec SMDS4-B driver & encoder via Hytec transition card, Hytec IP card, and VME carrier board; SLAC LVDT breakout chassis feeding a Highland LVDT VME module; SLAC 8-channel solenoid controller via Acromag IP card & VME carrier board; Newport XCS controller) to an EPICS IOC (VME MVME6100) on Ethernet. A minimal client-side sketch follows.]
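Since the IOCs run EPICS, a beam-line device ultimately looks like a set of process variables to clients. Below is a minimal sketch of commanding one stepper axis through the EPICS Channel Access C client API; the PV name is purely hypothetical, and error handling is omitted:

    #include <cstdio>
    #include <cadef.h>   // EPICS Channel Access client API

    int main() {
        double target = 12.5;   // hypothetical move target, in motor units
        chid axis;
        ca_context_create(ca_disable_preemptive_callback);
        // PV name is illustrative only; real names follow facility convention
        ca_create_channel("XPP:MOT:01.VAL", nullptr, nullptr, 0, &axis);
        ca_pend_io(5.0);                       // wait for the connection
        ca_put(DBR_DOUBLE, axis, &target);     // request the move
        ca_pend_io(5.0);                       // flush the request
        ca_clear_channel(axis);
        ca_context_destroy();
        return 0;
    }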
Viewing/Camera System Block Diagram
Cameras: PULNiX 6710CL (648 x 484 pixels, 9 µm x 9 µm), up to 120 Hz
Triggered by a timing signal from the timing-system event-receiver (EVR) PMC card
Centroid-finding software running on the VME IOC (see the sketch below)
[Block diagram: a timing fiber from the accelerator EVG timing master feeds an EVR timing PMC in the VME IOC (MVME6100); the IOC triggers the PULNiX 6710CL camera and reads images through an EDT DV CameraLink frame-grabber PMC card; the IOC connects via Ethernet.]
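As an illustration of the IOC-side centroid finding, a minimal intensity-weighted centroid over one 8-bit frame (frame geometry from the PULNiX 6710CL; a sketch, not the production algorithm):

    #include <cstddef>
    #include <cstdint>

    struct Centroid { double x, y; };

    // Intensity-weighted centroid of a w x h 8-bit frame.
    Centroid centroid(const uint8_t* frame,
                      std::size_t w = 648, std::size_t h = 484) {
        double sum = 0.0, sx = 0.0, sy = 0.0;
        for (std::size_t y = 0; y < h; ++y)
            for (std::size_t x = 0; x < w; ++x) {
                const double v = frame[y * w + x];
                sum += v;
                sx  += v * static_cast<double>(x);
                sy  += v * static_cast<double>(y);
            }
        if (sum == 0.0) return {0.0, 0.0};   // guard against an empty frame
        return {sx / sum, sy / sum};
    }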
Example: Timing Control - Electro-Optic Sampling
[Diagram: LBNL stabilized fiber-optic RF distribution (10 fs) linking the gun laser, electro-optic sampling laser, and pump-probe laser across Sector 20, the LTU, and the NEH.]
Machine timing distribution: ~20 ps jitter (plus longer-term drifts)
Separate fast-timing network to get < 100 fs timing
The beam-to-laser timing difference is measured and used, at 120 Hz, to process images in the DAQ, e.g. sorting/binning of images for pump-probe experiments (see the sketch below)
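A sketch of what such sorting/binning could look like on the DAQ side; the 50-fs default bin width and the bare Image type are assumptions for illustration, not the actual data model:

    #include <cmath>
    #include <cstdint>
    #include <map>
    #include <utility>
    #include <vector>

    using Image = std::vector<uint16_t>;

    // Bin each 120-Hz shot by its EO-measured beam-to-laser delay so a
    // pump-probe scan can be re-sorted into delay classes.
    class DelayBinner {
    public:
        explicit DelayBinner(double binWidthFs = 50.0)
            : binWidthFs_(binWidthFs) {}
        void add(double delayFs, Image img) {
            const long bin =
                static_cast<long>(std::floor(delayFs / binWidthFs_));
            bins_[bin].push_back(std::move(img));
        }
        const std::map<long, std::vector<Image>>& bins() const { return bins_; }
    private:
        double binWidthFs_;
        std::map<long, std::vector<Image>> bins_;
    };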
Data Acquisition/Mgmt Architecture
[Architecture diagram: the 2D detector and detector-specific front-end electronics (FEE, with ADC and FPGA) send data over up to 4 x 2.5-Gbit/s fibers to the SLAC LCLS DAQ box; 10-G Ethernet links it to the online volume-rendering cluster and rendering nodes, quick-view rendering node, detector-control node, data servers, online processors, disk arrays/controller, and the accelerator 120-Hz data-exchange & timing interface; offline storage is SCCS tape drives/robots. Detector-specific parts are experiment-provided; the rest is experiment-common.]
Detector (bump-bonded or integrated)
Detector-specific Front-End Electronics (FEE):
Local configuration registers
State machine to generate low-level detector readout signals
Digitize analog pixel voltages
Organize bits into pixel words (see the sketch below)
Transmit to the DAQ system: IP core in the FPGA for communication, up to 4 x 2.5-Gb/s fibers
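A hedged sketch of the "organize bits into pixel words" step; the 16-bit layout (14-bit sample, range bit, spare flag) is an assumption for illustration, since the real format lives in the FEE firmware:

    #include <cstdint>
    #include <vector>

    // Pack one 14-bit ADC sample plus a range bit and a spare flag
    // into a 16-bit pixel word.
    inline uint16_t packPixel(uint16_t adc14, bool range, bool flag) {
        return static_cast<uint16_t>((adc14 & 0x3FFFu) |
                                     (static_cast<uint16_t>(range) << 14) |
                                     (static_cast<uint16_t>(flag)  << 15));
    }

    // Pack a digitized row before it is queued for fiber transmission.
    inline std::vector<uint16_t> packRow(const std::vector<uint16_t>& adcRow,
                                         bool range) {
        std::vector<uint16_t> words;
        words.reserve(adcRow.size());
        for (const uint16_t s : adcRow)
            words.push_back(packPixel(s, range, false));
        return words;
    }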
Example: Experiment Front-End Board
Interfaces to the detector ASIC:
Control signals
Row/column clocks
Biases/thresholds
Analog pixel voltage
Contains:
Communication IP core
Local configuration state machine
Local image-readout state machine
Example: SLAC development board
FPGA with MGT interfaces, up to 4 x 2.5-Gbit/s fiber IO, and ~200 digital IO
VHDL programmed
Includes the communication IP core provided by SLAC
Every detector system needs such a board to interface to the detector/ASIC, possibly with detector-specific ADCs/DACs integrated on the board (or on a separate board connected to this board)
No additional modules needed to connect to the common DAQ
Data Acquisition/Mgmt Architecture
[Same architecture diagram as before, highlighting Level 1: the SLAC LCLS DAQ box between the detector-specific FEE (4 x 2.5-Gbit/s fibers) and the 10-G Ethernet online systems.]
Level 1 DAQ nodes are responsible for:
Control FEE parameters
Receive machine timing signals
Send trigger signals to the FEE
Acquire FEE data
Merge FEE data with beam-line information
Low-level real-time data processing, e.g. filtering of images based on beam-line data (see the sketch below) and pixel correction using calibration constants
Send collected data to Level 2 nodes
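A minimal sketch of the beam-line-data image filter; the cut variables and limits are hypothetical stand-ins for whatever the 120-Hz beam-quality record actually carries:

    // Decide whether to keep a shot based on its 120-Hz beam-quality record.
    struct BeamQuality {
        bool   beamPresent;
        double charge_nC;     // bunch charge
        double energy_GeV;    // electron beam energy
    };

    inline bool keepShot(const BeamQuality& b) {
        return b.beamPresent
            && b.charge_nC  > 0.1      // hypothetical minimum-charge cut
            && b.energy_GeV > 13.0;    // hypothetical energy cut
    }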
ATCA Crate
ATCA: based on a 10-Gigabit Ethernet backplane serial communication fabric
2 custom boards:
Reconfigurable Cluster Element (RCE) module: interface to the detector, up to 8 x 2.5-Gbit/s links to detector modules
Cluster Interconnect Module (CIM): managed 24-port 10-G Ethernet switching
One ATCA crate can hold up to 14 RCEs & 2 CIMs
Essentially 480-Gbit/s switch capacity (2 switches x 24 ports x 10 Gbit/s)
Naturally scalable; can also scale up crates
Reconfigurable Cluster Element (RCE) Boards
Addresses performance issues with off-the-shelf hardware:
Processing/switching limited by the CPU-memory sub-system, not by CPU MIPS
Scalability
Cost
Networking architecture
Reconfigurable Cluster Element module, with 2 each of the following:
Virtex-4 FPGA with 2 PowerPC processor IP cores
512 Mbyte RLDRAM with 8-Gbyte/s CPU-data memory interface
10-G Ethernet event-data interface
1-G Ethernet control interface
RTEMS operating system, EPICS
Up to 512 Gbyte of FLASH memory
[Photos: Reconfigurable Cluster Element module and rear transition module.]
Cluster Interconnect Module
Network card with 2 x 24-port 10-G Ethernet Fulcrum switch ASICs, managed via a Virtex-4 FPGA
The network card interconnects up to 14 in-crate RCE boards
It also interconnects multiple crates or farm machines
Data Acquisition/Mgmt Architecture
[Same architecture diagram as before, highlighting Level 2: the online volume-rendering cluster, rendering nodes, data servers, and online processors behind the 10-G Ethernet.]
Level 2 DAQ nodes are responsible for:
High-level data processing, e.g.:
Combine 10⁵ to 10⁷ images into a 3-D data set
Learn / pattern recognition / classify / sort images (see the sketch below)
Alignment, reconstruction
Local caching of the data
Generate real-time monitoring information
Different technologies are being evaluated for Level 2 nodes:
ATCA/RCE crates
Linux cluster based on commercial machines (e.g. Dell PowerEdge 1950)
Level 3: SCCS archive/bulk storage
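A heavily simplified sketch of the classify/sort/average step: each pattern joins the class whose running average it correlates with well enough, or starts a new class. This is only a stand-in for the real orientation-classification methods (Huldt, Szöke, Hajdu 2003); the metric and threshold are assumptions:

    #include <cmath>
    #include <cstddef>
    #include <vector>

    using Pattern = std::vector<double>;

    // Normalized cross-correlation between two equally sized patterns.
    double normCorr(const Pattern& a, const Pattern& b) {
        double ab = 0.0, aa = 0.0, bb = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i) {
            ab += a[i] * b[i];
            aa += a[i] * a[i];
            bb += b[i] * b[i];
        }
        return ab / std::sqrt(aa * bb + 1e-30);
    }

    struct OrientationClass { Pattern sum; std::size_t n = 0; };

    // Add a pattern to the first class it matches, else start a new class.
    void classify(const Pattern& p, std::vector<OrientationClass>& classes,
                  double threshold = 0.9) {
        for (auto& c : classes) {
            Pattern avg(c.sum.size());
            for (std::size_t i = 0; i < avg.size(); ++i)
                avg[i] = c.sum[i] / static_cast<double>(c.n);
            if (normCorr(p, avg) > threshold) {
                for (std::size_t i = 0; i < p.size(); ++i) c.sum[i] += p[i];
                ++c.n;
                return;
            }
        }
        classes.push_back({p, 1});
    }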
Real-time Processing – Sorting in CXI
• Diffraction from a single molecule: a single LCLS pulse yields a noisy diffraction pattern of unknown orientation
• Combine 10⁵ to 10⁷ measurements into a 3D dataset: classify/sort (in real time?), average, align, and reconstruct by oversampling phase retrieval
The highest achievable resolution is limited by the ability to group patterns of similar orientation.
Gösta Huldt, Abraham Szöke, Janos Hajdu, J. Struct. Biol. (2003); 02-ERD-047
Miao, Hodgson, Sayre, PNAS 98 (2001)
Computational Alignment
[Figure: experimental data (ALS); difference of pyramid diffraction patterns 10° apart. Gösta Huldt, U. Uppsala.]
[Diagram: scattering geometry with incident wave vector k_in, outgoing wave vector k_out, and momentum transfer q.]
"The number currently used to obtain high-resolution structures of specimens prepared as 2D crystals, is estimated to require at least 10¹⁷ floating-point operations"
R. M. Glaeser, J. Struct. Bio. 128 (1999)
"Computational alignment" requires large computational power that might only be provided by performing offline analysis?
→ Save first, analyze later? To be investigated (see the estimate below).
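For scale, a back-of-the-envelope estimate, assuming the > 1000-node offline cluster from the specifications and (purely as an assumption) ~1 Gflop/s sustained per node:

    \[
      t \;\approx\; \frac{10^{17}\ \text{flop}}
                         {10^{3}\ \text{nodes} \times 10^{9}\ \text{flop/s per node}}
        \;=\; 10^{5}\ \text{s} \;\approx\; 28\ \text{hours}
    \]

That is of order a day per data set, which is consistent with saving first and analyzing offline rather than in real time.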
Data Acquisition System
Example: BNL XAMP detector, 1,024 x 1,024 pixel array
Row-by-row readout: 64 readout IOs
8 eight-channel 20-MHz 14-bit ADCs plus a range bit
Instantaneous readout: 64 ch x 20 MHz x 16 bit ≈ 20 Gbit/s into the FPGA (see the check below)
[Block diagram: the XAMP detector electronics (1,024 x 1,024 pixels) attach to an experiment-specific front-end board & ASIC (the experiment-specific FEE); up to 4 x 2.5-Gb/s fibers carry science data to the SLAC common DAQ (ATCA crate with RCEs, CIM, and output FPGA), which delivers 250 Mbyte/s at 120 Hz (1,024 x 1,024 pixels x 2 bytes x 120 Hz) over 10-G Ethernet to the NEH/FEH hutch-common local data storage and on to SCCS. A VME IOC with an EVR PMC timing module receives the timing fiber from the EVG machine timing system (EVG-EVR protocol with time stamp and beam code; 120-Hz beam and experiment data) and distributes the trigger strobe to the FEE. A controller PC with e-log, an EPICS Channel Access gateway, a dedicated 1-G Ethernet private subnet, and the SLAC WAN complete the controls side.]
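A quick arithmetic check of the readout numbers above (pure arithmetic, no DAQ API implied; a 14-bit sample plus range bit is assumed to travel as a 16-bit word):

    #include <cstdio>

    int main(void) {
        const double instBits = 64.0 * 20e6 * 16.0;     // 64 ch x 20 MHz x 16 bit
        const double frame    = 1024.0 * 1024.0 * 2.0;  // bytes per image
        const double daqRate  = frame * 120.0;          // bytes/s at 120 Hz
        std::printf("instantaneous: %.1f Gbit/s\n", instBits / 1e9);  // ~20.5
        std::printf("sustained:     %.0f Mbyte/s\n", daqRate / 1e6);  // ~252
        return 0;
    }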
Data Acquisition and Processing
Example: Pixel Detectors
Calibration
Without beam (dedicated calibration runs, or in between the ~8-ms-spaced beam pulses)
Images to be used to calculate calibration constants
Examples:
Dark-image accumulation and averaging, pedestal calculation
Transfer-curve mapping, gain calculation
Neighbor-pixel cross-talk correction
Readout of 120-Hz Science Images
Calibration Correction
With beam (see the sketch below):
Pedestal subtraction
Piece-wise linear or polynomial correction
Cross-talk compensation
Other corrections?
Requirements to be determined after prototype detectors/ASICs are evaluated
Investigate C++ RTEMS processing versus VHDL
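A minimal per-pixel correction sketch for the with-beam path above, covering pedestal subtraction and a single linear gain term; piece-wise/polynomial gain and cross-talk compensation would extend it, with the constants coming from the calibration runs:

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    struct PixelCal { float pedestal; float gain; };  // from dark/transfer runs

    // Apply pedestal subtraction and gain to a raw frame, pixel by pixel.
    void correctFrame(const std::vector<uint16_t>& raw,
                      const std::vector<PixelCal>& cal,
                      std::vector<float>& out) {
        out.resize(raw.size());
        for (std::size_t i = 0; i < raw.size(); ++i)
            out[i] = (static_cast<float>(raw[i]) - cal[i].pedestal) * cal[i].gain;
    }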
Data Acquisition and Processing (2)
Event building with 120-Hz accelerator timing & beam-quality data (a sketch follows this slide)
Attach 120-Hz time stamp and beam code to pipelined images
Attach 120-Hz beam-quality data to pipelined images
Dataflow consistency monitoring & diagnostics
Error detection and recovery
Filtering of images
Explore partial online data reduction
Investigate rejection using 120-Hz accelerator beam data
Investigate feature extraction from calibrated images
Investigate filtering using extracted features
Requirements to be determined after prototype detectors/ASICs are evaluated
Investigate C++ RTEMS processing versus VHDL
Real-time Event Monitoring
Monitor quality of images at a ~5-Hz rate
Processing of images
Display on user monitor
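A hedged sketch of attaching the 120-Hz time stamp and beam code to pipelined images; the record layout is illustrative, and a pulse-ID mismatch is surfaced so the dataflow-consistency monitoring can flag it:

    #include <cstdint>
    #include <deque>
    #include <utility>
    #include <vector>

    struct TimingRecord { uint64_t pulseId, timestamp; uint32_t beamCode; };
    struct TaggedImage  { TimingRecord tag; std::vector<uint16_t> pixels; };

    class EventBuilder {
    public:
        void pushTiming(const TimingRecord& t) { timing_.push_back(t); }
        // Tag the next pipelined image; false signals a consistency error.
        bool tagImage(uint64_t pulseId, std::vector<uint16_t> px,
                      TaggedImage& out) {
            if (timing_.empty() || timing_.front().pulseId != pulseId)
                return false;
            out = TaggedImage{timing_.front(), std::move(px)};
            timing_.pop_front();
            return true;
        }
    private:
        std::deque<TimingRecord> timing_;
    };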
Spectrometer Data Acquisition
Agilent Acqiris 8-GHz, 10-bit DC282 cPCI digitizer module:
Up to 8-GHz waveform sampling with a 1-µs record, or a 1-GHz, 500-µs window
At 120 Hz → ~100 Mbyte/s; high performance needed to move or process the data (see the check below)
Each waveform is time-stamped via the EVR (EPICS time, beam code)
The respective beam-quality data is attached to each waveform
[Block diagram: a timing fiber from the accelerator EVG feeds an EVR PMC in the EPICS cPCI IOC, which triggers the Acqiris digitizer attached to the beam-line detector; waveforms are merged with the 120-Hz beam data from the accelerator and sent as science data over 1-G Ethernet to the local cache & SCCS, with a separate slow-controls Ethernet.]
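A quick check of the quoted digitizer rate (arithmetic only; assumes each 10-bit sample is stored in 2 bytes):

    #include <cstdio>

    int main(void) {
        const double samples = 1e9 * 500e-6;   // 1 GHz over a 500-us window
        const double bytes   = samples * 2.0;  // 10-bit sample in 2 bytes
        std::printf("%.0f Mbyte/s\n", bytes * 120.0 / 1e6);  // ~120, i.e. ~100
        return 0;
    }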
Offline Data Management and Scientific Computing
Data Format and API (online/offline)
Data Catalog, Meta Data Management
Electronic Logbook
Processing Framework, Workflow, Pipeline
Mass Storage System
Offline Data Export System
Offline Processing Cluster
Offline Disk Server
Science Tools
Scientific Computing for LUSI Science
Opportunities and needs being evaluated
Very dependent on the detailed nature of the science
Unprecedented size (for photon science) of data sets to be analyzed
Unprecedented computational needs (for photon science)
Comparable in scale to a major high-energy physics experiment
Greater need for flexibility than for a major high-energy physics experiment
Applications
User programs
Endstation operation
Calibration
Alignment
Interface to SW for diffraction/scattering experiments
SPEC
Interface to instrumentation/analysis SW
MATLAB
LabVIEW
User tools
StripTool
Alarm Handler
Summary
Control subsystem based on EPICS, the standard at SLAC
Controller selection in progress, paced by beam-line definition
Starting to assemble test set-ups
Moving and processing science data is the key data-acquisition task
AMO data acquisition via 10-bit, 8-GHz cPCI waveform-sampling modules
2D pixel-detector acquisition via SLAC ATCA DAQ modules
Peak data-rate/volume requirements are comparable to HEP experiments, requiring a separate data-acquisition and management system
Leverage significant expertise at SLAC in data acquisition and management
Prototypes of the ATCA DAQ modules are in hand