ESS Control System Project Status
Timo Korhonen
Chief Engineer, Integrated Control System Division
www.europeanspallationsource.se
October 21, 2014
A European Project
Largest European science project
Sweden, Denmark and Norway: 50% of construction, 15-20% of operations
European partners: 50% of construction
• Sweden 35%
• Denmark 12.5%
• Germany 11%
• United Kingdom 10%
• France 8%
• Italy 6%
• Spain 5%
• Switzerland 3.5%
• Norway 2.5%
• Poland 2%
• Hungary 1.5%
• Czech Republic 0.3%
• Estonia 0.25%
2
Cash vs. In-kind
[Chart: annual funding in M€, 2013–2025, split into cash, planned in-kind and potential in-kind contributions]
3
Potential In-kind
[Figure: ESS in-kind contributions potential]
4
Construction vs. Operation
[Chart: construction vs. operation costs (M€)]
5
The European Spallation Source
• An accelerator-based neutron source to be built in
Lund, southern Sweden
– Material and life sciences research
• Targeted to be the world’s most powerful neutron
source
– 5 MW beam power, 2.5 GeV proton energy, 14 Hz repetition rate, 2.86 ms pulse length at 65 mA beam current (see the pulse-structure sketch below)
– Superconducting linac, rotating tungsten target
– 22 neutron beam lines in construction budget
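As a quick, illustrative cross-check of the pulse structure these parameters imply (using only the numbers quoted above; a sketch, not an official ESS calculation):

```python
# Back-of-the-envelope pulse structure from the parameters quoted above.
# Illustrative only; all inputs are taken from the bullet list on this slide.
REP_RATE_HZ = 14           # pulse repetition rate
PULSE_LENGTH_S = 2.86e-3   # pulse length
PEAK_CURRENT_A = 65e-3     # beam current during the pulse

period_s = 1.0 / REP_RATE_HZ                  # time between pulse starts
duty_factor = PULSE_LENGTH_S * REP_RATE_HZ    # fraction of time with beam on
charge_per_pulse_c = PEAK_CURRENT_A * PULSE_LENGTH_S

print(f"pulse period     : {period_s * 1e3:.1f} ms")            # ~71.4 ms
print(f"duty factor      : {duty_factor * 100:.1f} %")          # ~4 %
print(f"charge per pulse : {charge_per_pulse_c * 1e6:.0f} uC")  # ~186 uC
```

The long pulse and low duty factor are what make ESS a "long-pulse" source and drive many of the timing and protection requirements later in this talk.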
6
The European Spallation Source
• First protons planned by
2019
• Project completion by 2025
– Full beam power, 22 neutron
beam lines
• Accelerator in Lund, Sweden, and data processing center in Copenhagen, Denmark
7
Machine layout
(200 m)
8
Target and the neutron beam lines
Rotating wheel target – beam has to
hit in the middle of the sector.
9
Office and experiment areas – nice…
10
ESS construction has started!
ESS is funded and construction is underway!
This aerial photo was taken a few weeks ago.
11
ESS Control System
• The ESS Integrated Control System Division (ICS) is in charge of building the control systems for the accelerator and the neutron target, and of providing controls for the beamline components (in cooperation with colleagues from the Science Directorate)
• The project scope also includes
– Conventional facilities control integration
– Safety & Protection Systems (Machine, Personnel)
– Global Timing System for site-wide synchronization
• Some parts of the controls will be provided by ESS
partner laboratories as in-kind contributions
– e.g., proton source and LEBT controls by Saclay (France)
12
Our Organization
Garry Trahern
Head of Division
Deputy Head of Division (vacant)
Timo Korhonen – Chief Engineer
Miha Reščič – Deputy Project Manager, Lead Systems Engineer
Team Assistant (50%)
Thilo Friedrich – Systems & Standardization Engineer, PhD
PROTECTION SYSTEMS
SOFTWARE AND SERVICES
Suzanne Gysin – GL
HARDWARE AND INTEGRATION
Leandro Fernandez
Daniel Piso Fernandez
Richard Fearn
Lead Integrator, Accelerator (vacant)
Ricardo Fernandes
Lead Integrator, Target (vacant)
Emanuele Laface
Javier Cereijo, Ph.D. student
Karin Rathsman
Klemen Strniša (C)
Jaka Bobnar (C)
Urša Rojec (C)
Jakob Battellino (C)
Niklas Claesson (C)
Miha Vitorovič (C)
Alexander Söderqvist (C)
Jure Krašna (C)
Gregor Cijan (C)
Annika Nordt – GL
Manuel Zaera-Sanz (MPS)
INFRASTRUCTURE
Remy Mudingay
Angel Monera (MPS)
Riccard Andersson, Ph.D. student
Stuart Birch (PSS)
Denis Paulic (PSS)
Marko Kolar (C)
Miroslav Pavleski (C)
13
How many are we…and how many we’ll be
[Charts: planned headcount 2013–2021 (scale 0–80), split by onsite/offsite, internal/external, and by role: senior engineer, engineer, senior scientist, scientist, senior developer, developer]
14
Project Structure
• Core components
– WP.2 Applications
– WP.3 Software core
– WP.4 Hardware core
– WP.5 Protection core
– WP.7 CS Infrastructure
– WP.8 Physics
• Integration support
– WP.10 Accelerator
– WP.11 Target
– WP.12 Instruments
– WP.13 Conventional Facilities
– WP.14 Test Stands
15
How much?
16
Some technology decisions
• Using EPICS for controls in the whole facility
– From accelerator to neutron beamlines
– Plan to benefit from EPICS version 4 facilities
• pvAccess as the binding communication layer (see the client sketch after this list)
• Services (e.g. physics model as a service) for the accelerator
• Participating in the DISCS (Distributed Information Services for Control Systems)
collaboration
– Databases and applications for facility and device configuration data
– Middle-layer services (see http://openepics.sourceforge.net/)
• CS-Studio as the generic user interface tool
– Control room, subsystem developers, etc.
• Committed to working together with the EPICS and accelerator community
– By combining our resources we can build more and better things
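To illustrate what the pvAccess binding layer looks like from the client side, here is a minimal sketch using the p4p Python bindings (one of several available pvAccess client options); the PV name is a made-up placeholder, not an actual ESS channel:

```python
# Minimal pvAccess client sketch using the p4p bindings (pip install p4p).
# The PV name below is a hypothetical placeholder, not an ESS channel name.
from p4p.client.thread import Context

ctx = Context('pva')                      # open a pvAccess client context

value = ctx.get('EXAMPLE:BPM1:XPOS')      # one-shot read of a PV
print('current value:', value)

# Subscribe to updates; the callback is invoked for every new value.
sub = ctx.monitor('EXAMPLE:BPM1:XPOS', lambda v: print('update:', v))
# ... keep the subscription object alive for as long as updates are needed ...
sub.close()
ctx.close()
```

The same protocol carries structured data for middle-layer services (for example a physics model served over pvAccess), which is why it can act as the common binding layer across the facility.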
17
Technology decisions cont.
• Using MRF event system for global timing
– Synchronization in the whole facility
– Timing supports beam-synchronous applications (in discussion with PSI about co-development)
– Looking at future developments together with some new
projects (ELI-Beams, SwissFEL, others)
• Machine protection is of prime importance
– Ability to switch off the beam mid-pulse (10 µs time range; see the reaction-time sketch below)
• FPGA-based fast interlock
• Slow devices connected to PLC-based system
– Has to be synchronized with beam operation
• Ramping up with beam intensity, pulse length and repetition rate
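To illustrate why the fast interlock path sits in an FPGA while slow devices go through PLCs, the sketch below compares how much beam energy would pass before the beam is actually stopped. The beam parameters are those quoted earlier; the two reaction times are assumptions for illustration, not ESS specifications:

```python
# Illustrative only: energy delivered downstream of a fault before the beam
# is switched off, for two assumed interlock reaction times. Beam parameters
# are from the facility overview; the reaction times are assumptions.
PROTON_ENERGY_EV = 2.5e9    # proton energy in eV
PEAK_CURRENT_A = 65e-3      # beam current during the 2.86 ms pulse

# Energy in eV multiplied by current in A gives power in W (~160 MW here).
peak_power_w = PROTON_ENERGY_EV * PEAK_CURRENT_A

for name, reaction_s in [("FPGA fast interlock (~10 us)", 10e-6),
                         ("PLC-class interlock (~10 ms, assumed)", 10e-3)]:
    energy_kj = peak_power_w * reaction_s / 1e3
    print(f"{name:40s} -> ~{energy_kj:8.1f} kJ into the fault")
```

Even at the 10 µs level, on the order of a kilojoule reaches the fault, which is why the fast path cannot wait for PLC-speed logic.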
18
Controls hardware decisions
• Defining hardware standards – for the whole spectrum
– Standardize on one PLC brand – as far as we can
• To reduce complexity
– Digital platform form factor
• One form factor should cover the segment of fast signal processing
– Mainly beam diagnostics and LLRF
• Going towards MTCA.4 – carefully, though
– Still a new standard with compatibility, etc., issues
• Setting up collaborations for a full framework
– FMC standards, development framework for the whole cycle
(HW&FW&SW)
– Plan to use EtherCAT for the middle range between PLCs and MTCA
• Pulse-synchronized, distributed, kHz-range acquisition (see the tiering sketch below)
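The tiering above (PLCs for slow signals, EtherCAT for the kHz middle range, MTCA.4 for fast diagnostics and LLRF) can be summarized as a simple decision rule. The numeric boundaries below are rough assumptions for illustration, not fixed ESS limits:

```python
# Rough sketch of the signal-speed tiering described on this slide.
# The rate boundaries are assumptions for illustration only.
def pick_platform(sample_rate_hz: float) -> str:
    """Map a required acquisition rate to a hardware tier."""
    if sample_rate_hz <= 100:        # slow process signals (vacuum, temperature, ...)
        return "PLC"
    if sample_rate_hz <= 100_000:    # pulse-synchronized, distributed, kHz-range I/O
        return "EtherCAT"
    return "MTCA.4"                  # fast signal processing: beam diagnostics, LLRF

for rate_hz in (1, 1_000, 50_000, 10_000_000):
    print(f"{rate_hz:>12,} Hz -> {pick_platform(rate_hz)}")
```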
19
Controls platforms
• Evaluating motion controller platforms
– Current designs based on DeltaTau GeoBrick
– Working together with colleagues from science directorate
• Controls network design starting
– A network and servers expert was hired recently
• Server hall (Lund site) design ongoing
• Starting to look at virtualization solutions
– “soft” IOC deployment, development servers
• Collaborative tools in heavy use
– Atlassian toolkit: JIRA (bugtracker), Confluence (Wiki, web
tool) and BitBucket (repository/Git management)
20
Common services
• Configuration databases and related tools
– (Controls) configuration, cabling, naming service, access control
(RBAC) etc.
• IOC development workflow and tools
– Still searching for optimal development and deployment flow
• How to deploy 600+ IOCs in a short time?
• How to manage these after deployment?
• How to support our off-site partners?
– “IOC Factory” – you will hear more about that in the future (a minimal generation sketch follows this list)
• Use tools from the community whenever possible
– CS-Studio, Archiver, Save&Restore, alarm handler
– Contribute to development when appropriate
• Use resources to develop things that do not exist
– Our local infrastructure, for example…
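As a very rough illustration of what an “IOC Factory”-style tool could automate when 600+ IOCs have to be deployed, the sketch below generates minimal EPICS startup files from a device list. The application name, database files, macros and directory layout are hypothetical placeholders, not the actual ESS workflow:

```python
# Hypothetical sketch of "IOC Factory"-style generation of EPICS startup
# files (st.cmd) from a device list. All names, paths and macros below are
# placeholders for illustration; this is not the actual ESS tooling.
from pathlib import Path

DEVICES = [   # in practice this would come from the configuration database
    {"ioc": "vac-ioc-01", "db": "vacuumPump.db", "prefix": "SEC1:VAC:PUMP1"},
    {"ioc": "vac-ioc-02", "db": "vacuumPump.db", "prefix": "SEC2:VAC:PUMP1"},
]

ST_CMD_TEMPLATE = """#!../../bin/linux-x86_64/example
dbLoadDatabase("dbd/example.dbd")
example_registerRecordDeviceDriver(pdbbase)
dbLoadRecords("db/{db}", "P={prefix}")
iocInit()
"""

for dev in DEVICES:
    ioc_dir = Path("generated") / dev["ioc"]
    ioc_dir.mkdir(parents=True, exist_ok=True)
    (ioc_dir / "st.cmd").write_text(ST_CMD_TEMPLATE.format(**dev))
    print("wrote", ioc_dir / "st.cmd")
```

Generating the per-IOC boilerplate from the configuration database is one way to keep hundreds of deployments consistent and manageable after they are rolled out.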
21
Summary
• ESS construction has started
– New accelerator, new institute, new…everything
– Roll up our sleeves and start building systems
• Control system effort is ramping up
– Moving from prototyping to design decisions
– Hiring people (watch our web pages!)
– Working in collaborations and with in-kind contributors in
member states (still looking for partners)
• Lots of things to build
– Many challenges, many opportunities
22