Securing Cyber-Physical Software

C. Warren Axelrod, PhD
Senior Consultant, Delta Risk LLC

Agenda
• Overview
• Security of safety-critical systems in the news
• Safe & Secure Software Systems Engineering (S4E)
• Different outlooks of security and safety software engineers
• What are “cyber-physical systems”?
• Process for securing software systems
• Process for ensuring systems are safe
• Certification of avionics and other safety-critical systems
• Recommendations for application security and safety professionals
• Summary and conclusions

About the Presenter
• Career as a senior executive in IT and InfoSec areas in financial services
• ISE Luminary Leadership Award (2007)
• Computerworld Premier 100 and Best-in-Class Awards (2003)
• Contributed to the National Strategy to Secure Cyberspace (2002)
• Congressional Subcommittee testimony on cyber security (2001)
• Represented financial services over Y2K at the National Information Center
• Co-founder and Board member of the FS-ISAC
• Contributed to the FSSCC Research Agenda
• Published 5 books on IT management and information security
• Published 100+ professional articles and book chapters; posted 200+ blogs
• Moderated and presented at 150+ conferences, seminars, and roundtables
• Ph.D. in Managerial Economics, Cornell University; B.Sc. Honors in Electrical Engineering and M.A. Honors in Economics and Statistics, Glasgow University
• Certifications: CISSP, CISM

Hosted by OWASP & the NYC Chapter

Overview
Security and safety software engineers live in different and often separate worlds. The former worry about protecting information-processing systems and data from attacks; the latter are concerned with the potential harm that malfunctions and failures of computer control systems could inflict. It is not sufficient simply to train software safety engineers in securing control systems.
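The contrast between the two outlooks can be sketched in code. Below is a minimal, hypothetical actuator-command handler; every name, limit, and check is an illustrative assumption, not taken from any real control system. The security checks keep the world (a malicious or malformed request) from harming the system, while the safety check keeps even a legitimate request from harming the world.

```python
# Hypothetical sketch only: all identifiers and limits are illustrative.
MAX_SAFE_TEMP_C = 90.0                      # safety envelope of the physical process
AUTHORIZED_OPERATORS = {"op-17", "op-23"}   # assumed operator whitelist

def handle_setpoint_command(operator_id: str, setpoint_c: float) -> str:
    # Security outlook: the world must not harm the system.
    # Reject unauthorized or malformed input before acting on it.
    if operator_id not in AUTHORIZED_OPERATORS:
        return "rejected: unauthorized operator"
    if isinstance(setpoint_c, bool) or not isinstance(setpoint_c, (int, float)):
        return "rejected: malformed input"

    # Safety outlook: the system must not harm the world.
    # Even a fully authenticated command must stay inside the safe envelope.
    if setpoint_c > MAX_SAFE_TEMP_C:
        return "rejected: setpoint outside safe operating envelope"

    return f"accepted: setpoint {setpoint_c} C"
```

Note that a call such as `handle_setpoint_command("op-17", 120.0)` passes every security check yet is still rejected by the safety check; a purely security-focused review would miss it, and a purely safety-focused review would miss the unauthorized-operator case.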
Information security professionals need to gain a greater understanding of the control systems to which their information systems are increasingly being connected. This two-way exchange of ideas and approaches is crucial if we are to ensure that systems comprising both security-critical and safety-critical components meet the necessary standards and certifications across the board. This presentation addresses the security-safety gap that exists for software.

Hacking a Prius
Forbes’ Andy Greenberg investigates.
Video available at http://www.forbes.com/sites/andygreenberg/2013/07/24/hackers-reveal-nasty-new-car-attacks-with-me-behind-the-wheel-video

Who Is To Blame?
• The issue of LIABILITY may be the greatest hindrance to progress ...
  – What happened?
  – Whose fault was it? (The driver/controller, obviously)
  – Who can we sue?
  – For how much?
• With increasing system complexity, greater interconnectivity, and inadequate monitoring and data collection, the root cause may be very difficult to discern

Hackers Beware! The Barnaby Jack Case
Video available at http://www.youtube.com/watch?v=sjnnB_pJzHU

The Cheney Video: Paranoia or Reality?
Video available at http://www.youtube.com/watch?v=N-2iyUpnUwY

Grid Attacks: The Aurora Project
Video available at http://www.youtube.com/watch?v=rTkXgqK1l9A

Safety vs. Security per Barnes
“Safety and security are intertwined through communication ... in the case of safety, the software [system] must not harm the world; in the case of security, the world must not harm the software [system]. A safety-critical [software] system is one in which the program must be correct ... A security-critical [software] system is one in which it must not be possible for some incorrect or malicious input from the outside [or from an insider] to violate the integrity of the system ...”*
Adapted from: J.G.P. Barnes, “Ada,” in C.R. Spitzer (ed.), Avionics: Elements, Software and Functions, CRC Press, FL, 2007
*Based on definitions of safety and security found in Boehm, 1978

Academic Hole per Weiss
“... the general lack of security for ICSs [industrial control systems] is due to a ‘hole ... in academia’ since ‘security is taught in computer science departments, whereas control systems are taught in various engineering departments.’”
Adapted from: Joseph Weiss, Protecting Industrial Control Systems from Electronic Threats, Momentum Press: New York, 2010

Structure & Hierarchy of S4E Activities
(Diagram: systems engineering elements such as people, facilities, data, hardware, technology, documents, and processes; software characteristics, both functional and nonfunctional, spanning development projects and operations support; and safety, security, and engineering activities including performance, assurance, testing, compliance, reliability, monitoring, management, reporting, responding, and repairing.)
Adapted from: C. W. Axelrod, Engineering Safe and Secure Software Systems, © 2013 Artech House

The 3-Pumpkin Model
(Diagram: safe software systems must not inflict damage on the world; secure software systems must withstand attacks from the world; safe and secure software systems must do both.)
Source: C.W.
Axelrod, Engineering Safe and Secure Software Systems, © 2013 Artech House

NSF Definition of CPS
• National Science Foundation (NSF) definition: the term cyber-physical system refers to the tight conjoining of and coordination between computational and physical resources
• Research advances in cyber-physical systems promise to transform our world with systems that:
  – respond more quickly
  – are more precise
  – work in dangerous or inaccessible environments
  – provide large-scale, distributed coordination
  – are highly efficient
  – augment human capabilities, and
  – enhance societal wellbeing
Source: NSF, Cyber-Physical System (CPS) Program Solicitation NSF 10-515, 2010

Defining Cyber-Physical Systems
(Diagram: on the “cyber” side, an information system in which external end users, internal users, and admins/ops interact with data-processing software, interfaces, utilities, firmware, and hardware; on the “physical” side, an embedded system in which control system admins/operators use control and administrative software, support utilities and firmware, and physical system hardware. The information system reports on and supports the embedded system, which manages the physical process; together they form a “cyber-physical system.”)
Adapted from: C.W. Axelrod, “Mitigating the Risks of Cyber-Physical Systems,” IEEE LISAT Conference, Farmingdale, NY, May 2013, © 2013 IEEE

Security and Safety Risks
• Risks from security-critical systems
  – Economic: fraud, identity theft, lost customers and sales, out-of-business, restitution
  – Legal: criminal activities, regulatory fines/actions, business damage control, lawsuits
  – Social: loss of reputation
• Risks from safety-critical systems
  – Physical harm: loss of life, injuries, radioactivity, chemical and other poisonings
  – Environmental damage: contamination, pollution, destruction and/or abandonment of buildings and transportation paths
  – Economic: costs of recovery/repair/reconstitution, bankruptcy, restitution
  – Legal: regulatory fines/actions, business damage control, lawsuits
  – Social: loss of reputation

Threats & Consequences
(Diagram: internal and external threats and exploits, i.e., external security threats/exploits and external events plus intentional and accidental insider threats/exploits, act on software-intensive systems comprising security-critical information systems and safety-critical control systems; the consequences of malfunction, misuse, or failure include economic impact, social/legal impact, physical harm, and damage to the environment.)
Source: C.W. Axelrod, Engineering Safe and Secure Software Systems, © 2013 Artech House

Securing Information Systems
Source: C.W. Axelrod, Engineering Safe and Secure Software Systems, © 2013 Artech House

Making Software Systems Safe
Source: C.W.
Axelrod, “Mitigating the Risks of Cyber-Physical Systems,” IEEE LISAT Conference, Farmingdale, NY, May 2013, © 2013 IEEE

Table 1: RTCA/DO-178C Standard Applied to Certification Levels for Various Aircraft Systems

System                          Type of System   Certification Level
Flight control                  Control          Level A (Catastrophic)
Cockpit display and controls    Control          Level A (Catastrophic)
Flight management               Control          Level B (Hazardous)
Brakes and ground guidance      Control          Level B (Hazardous)
Centralized alarms management   Information      Level C (Major)
Cabin management                Information      Level D (Minor)
Onboard communications          Information      Level D (Minor)

Verification and Validation
• Safety-critical control systems are generally subjected to intensive internal and/or external verification and validation to meet safety certification standards
• Collection of the data required for V&V is intentionally built into the design and manufacture of safety-critical systems
• WHY ARE VERIFICATION AND VALIDATION SO OFTEN MISSING FROM SDLCs FOR SECURITY-CRITICAL SYSTEMS?
• INCLUDE THEM!

Functional Security Testing
• Functional testing is the norm, i.e., verifying that the system does what it is supposed to do
• Non-functional testing for performance, security, availability, etc.
is often neglected under pressure to deliver software on time
• Software systems often lack basic security because of inadequate testing
• INCLUDE FULL FUNCTIONAL SECURITY TESTING IN SDLCs

Generation of Security Data
• InfoSec practitioners use readily available data to develop the security metrics used in decision-making
• Often the easier-to-collect data are the less useful
• Applications, system software, and networks must generate more useful data (even if doing so is costly and time-consuming), subject to acceptable ROI
• BUILD IN THE CREATION OF SECURITY DATA (safety data collectors are often incorporated into control systems, e.g., black boxes or event recorders are already in aircraft and trains, and increasingly in cars)

Summary
• Software security and safety approaches are outside-in and inside-out, respectively
• We need to address both for cyber-physical systems and systems of systems
• Increasing connectivity between security-critical information systems and safety-critical control systems is resulting in “vulnerable control systems” and “hazardous information systems”

What We Need
• Transfer of knowledge and experience between the security and safety silos through education and training, professional certifications, etc.
• Information sharing about cyber and physical threats, exploits, events, and consequences
• Participation and collaboration among security and safety software professionals at each and every stage of the SDLC
• Building security and safety requirements in, rather than bolting them on
• Sharing responsibility (liability?) for overall software system safety and security

References
• C.W. Axelrod, “Bridging the Safety-Security Software Gap,” 5th International Conference on Safety and Security Engineering (SAFE 2013), Rome, Italy, September 2013
• C.W. Axelrod, “Mitigating the Risks of Cyber-Physical Systems,” IEEE LISAT Conference, Farmingdale, NY, May 2013
• C.W. Axelrod, Engineering Safe and Secure Software Systems, Artech House, 2012
• C.W. Axelrod, “The Need for Functional Security Testing,” CrossTalk, 24(2), 2011
• C.W. Axelrod, “Creating Data from Applications for Detecting Stealth Attacks,” CrossTalk, 24(5), 2011
• C.W. Axelrod, “Applying Lessons from Safety-Critical Systems to Security-Critical Software,” 2011 IEEE LISAT Conference, Farmingdale, NY, May 2011
• J.G.P. Barnes, “Ada,” in C.R. Spitzer (ed.), Avionics: Elements, Software and Functions, CRC Press, 2007
• B.W. Boehm, Characteristics of Software Quality, North-Holland, 1978
• Carnegie Mellon University Software Engineering Institute (CMU SEI), Software Assurance Curriculum Project, Volume I: Master of Software Assurance Reference Curriculum, Technical Report CMU/SEI-2010-TR-005, 2010
• D. Firesmith, Security and Safety Requirements for Software-Intensive Systems, Auerbach Publications, December 2013 (forthcoming)
• National Science Foundation (NSF), Cyber-Physical System (CPS), Program Solicitation NSF 10-515, 2010
• J. Weiss, Protecting Industrial Control Systems from Electronic Threats, Momentum Press, 2010

Contact Information
• C. Warren Axelrod, Ph.D.
• Senior Consultant, Delta Risk LLC
• Telephone: 917-670-1720
• Email: waxelrod@delta-risk.net