DRAFT
White paper #6
Practical Answers: A Practitioner's Role Call of Cybersecurity
Research Opportunities
Mine Altunay, Kirk Bailey, Matt Crawford,
Barbara Endicott-Popovsky, Josh Goldfarb, Anne Schur
Abstract--This paper discusses the near-term operational challenges facing both scientists and the security practitioners who protect the global interconnected computing infrastructure DOE scientists rely upon for creating transformational scientific breakthroughs. We must break the escalation cycle that "locks cyber intruders and their targets in a state where targets are perennially resigned to attacks and intruders are at liberty to exploit and disrupt networks without much risk of suffering consequences" [1], and we must act offensively by directly addressing the "elephant in the living room": malicious threats are the norm, not the exception. Accepting this places us at an advantage, because it provides a new context for looking at the pervasive problem: a) vulnerabilities are addressed with different approaches to providing and measuring security depending on how secure is good enough for each operational context, so cybersecurity is no longer one-size-fits-all; b) assuming that scientists want to protect their research, this view changes the way scientists think about security, shifting their belief toward security as a friend rather than a barrier to doing science; and c) implementing security is no longer the sole territory of cyber-security officers and technologists, but a team sport between the users of the infrastructure (scientists) and the practitioners who provide security (cyber-security technology experts, forensics specialists, educators, nation-state and non-nation-state policy decision-makers, and enforcers). Our approach to finding innovative solutions to cybersecurity is that providing cybersecurity capabilities is about providing information assurance. This creates a solution space that is about risk assessment and balancing perceived risk against mission-critical operational capabilities. No single system is entirely secure, and the right balance of security and functionality is key. This often involves taking a holistic approach to cybersecurity, where security is viewed less as a compliance process and more as a science. This paper identifies new cybersecurity research opportunities within this new framework of thinking about cybersecurity.
How is Cybersecurity Implemented in Today’s Systems?
The McCumber Cube (Figure 1) is a widely accepted definitional model of information assurance (IA). Widely used methods for securing cyber assets are technologies, policies/practices, and human factors (training, vetting employees, daily behaviors, etc.) [2, 3]. These methods are deployed across the three basic information states (transmission, storage, and processing) to provide three services to systems: Confidentiality, Integrity, and Availability, the so-called CIA of cybersecurity.
Figure 1. The McCumber Cube defining information assurance [2, 3]
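To make the cube concrete, the sketch below enumerates its 27 cells as an assessment checklist; the dimension labels come from the model above, while the code itself is purely illustrative.

    from itertools import product

    # Dimensions of the McCumber Cube as described above.
    STATES = ["transmission", "storage", "processing"]
    SERVICES = ["confidentiality", "integrity", "availability"]   # the CIA triad
    MEASURES = ["technology", "policy/practice", "human factors"]

    # Each (state, service, measure) cell is a question an assessor can ask.
    checklist = [
        f"How is {svc} of data in {st} addressed through {m}?"
        for st, svc, m in product(STATES, SERVICES, MEASURES)
    ]

    for item in checklist:
        print(item)   # 27 cells in total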
While this framework has served us well in the past and contains all the right words for addressing cybersecurity, it falls short: building on these fundamentals alone has us developing "state-of-the-art" solutions based on traditional practices, focused on mechanisms and controls rooted in the principles of 50-year-old computing environments, a mindset of putting physical barriers in place, and an era in which a security breach was the exception. The familiar controls-based principles of "access of least privilege" and "separation of duties" and the Bell-LaPadula security model are of less relevance in today's high-stakes cyber-conflict.
Statistics from the FBI reinforce the notion of thinking about cybersecurity in terms of managing breach as a daily occurrence, rather than responding to intrusion after the fact. If FBI estimates are correct that 75% of online traffic is malicious, that organized crime makes more money on the Internet than through drugs, and that the criminal 'take' from the Internet last year almost doubled e-commerce revenues,¹ then the set of circumstances in which our laboratories operate online today is untenable. If just 1% of the World Wide Web's population has criminal intent, then with over 1.2 billion people online, every log-on to the Internet exposes you to roughly 12 million potentially malicious users. It is no longer a question of 'if' you will be attacked, but rather 'when.'

¹ FBI 2007 presentation to the Agora, University of Washington.
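A back-of-the-envelope check of that exposure estimate, using the figures quoted above:

    # Back-of-envelope exposure estimate using the figures quoted above.
    online_population = 1_200_000_000   # roughly 1.2 billion people online
    criminal_fraction = 0.01            # assume 1% have criminal intent

    potential_attackers = online_population * criminal_fraction
    print(f"{potential_attackers:,.0f} potentially malicious users")  # 12,000,000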
To get to new levels of the state of the art, there is a greater need for innovation and proofs of concept for radically new operational practices (strategies) and tactics. Imagine you are living in a huge home with hundreds of rooms and you want to go to bed and get a good night's sleep; before you go to bed, however, you want to make sure that you, the house, and your possessions are safe. Now, if you are living in a world where it is known that your home already has intruders in it, and that others will be coming and going constantly, what can you do to get a good night's sleep?
Given this context, innovative solutions to performing open science throughout DOE must be derived from the operational challenges of providing and sustaining cybersecurity as an interdisciplinary team sport between the users of the infrastructure (scientists) and the practitioners who provide security (cyber-security technology experts, digital forensics investigators, educators, nation-state and non-nation-state policy decision-makers, and enforcers).
Information assurance applied to the domain of cybersecurity must be exercised in a context where no single system is entirely secure, and the right balance of security and functionality is key. This must involve taking a holistic approach to cybersecurity, where security is viewed less as a compliance process and more as a science. Risk assessment and the balancing of perceived risk must be done in the context of our new science missions and their critical operational capabilities, which occur in a compromised world, and from the perspective of scientists "doing science".
The Scientist’s Perspective
Historically, while researchers are accustomed to protecting their research data, they have not focused on broader cybersecurity considerations. Moreover, very little information is available that addresses the specific needs of the Open Science research community from a human-factors perspective, including an understanding of how the IA practitioner and the scientist coexist to perform the enterprise of science. IA should be engaged in enabling, sustaining and maintaining the mission of science.
To improve the security of the Open Science community, we need to investigate and understand how the community does its work, work that brings together two principal users: 1) the IA representative, whose charter is to provide and sustain the infrastructure, tools, and data that scientists, as consumers, use to do their research; and 2) the scientist, whose charter is to do research, i.e., to be a consumer/user. Security and science activities need to be unobtrusively integrated to create an enterprise that enables science across a broad range of use contexts that typically fall into three areas:

• Performing the full experiment life cycle: Scientists can perform their activities at a single site (a closed environment of conduct), remotely (geographically dispersed from their instruments and colleagues), or in some combination.

• Data analysis activities: Data captured from experiments is processed and analyzed. These analysis activities might have the same needs as performing experimentation, such as interoperability and/or data and information sharing.

• Traditional data/information sharing [6]: Data and information are shared in real time and non-real time in many different media forms that enable scientists to perform their experiments and analyses. For example, when working in real time with other geographically dispersed people, scientists might need the capability to monitor and control an experiment, while for analysis activities they may need electronic whiteboards. When not working in real time, tools that facilitate data exchange, such as email and portals, are required to be available 24/7.
To meet the security needs of the open science community, a unique challenge to the cyber technology community is to keep science going 24/7 in an enterprise that has taken on a distributed character; it is an e-campus. No longer do researchers run their jobs on a single local system; now they may use, and be dependent on, networked resources distributed around the country and/or around the world. In this operational environment, even "simple" things, like authentication, can be a challenge. Techniques that protect data while affording efficient science capabilities are critical. Doing science safely in a networked world should be as easy as flipping a light switch.
The Operations Practitioners’ Perspective
This white paper identifies research opportunities from the operating assumption that all networks and systems are, or will be, compromised. This approach sets a new tenet of operating principles upon which to build a research agenda that takes on an offensive, rather than a defensive, posture for very specific scientific-activity mission spaces:

• “All Cyber Conflicts are Fought on Local Networks – in the Cyber-Battle Space Your Network is the High Ground”

• “Local Cyber-Conflict Mandates a Priority for Localized Intelligence Gathering and Information Sharing”

• “Nothing is More Important than Reducing Attack Surfaces and Attack Vectors (you want box canyons)”

• “Preparing the Local Cyber-Battle Space for conflict is more important than Stopping Intrusions”

• “Keep the Playing Field Level – Eliminate Rules of Engagement for the Local Network”

• “Integrate Deception into Local Operations as Much as Possible – Keep the Fight as Unfair as Possible”

• “Weaponized Local Systems and Data is an Option that Should not be Overlooked”

• “On the Local Network, Offensive Measures have a Huge Advantage over Defensive Measures”
To meet these tenets and provide state-of-the-art capabilities, it is imperative that we do not immediately jump into tool building and new architectures. Instead, we propose that radically new operational strategies and tactics be formulated that address the new environments in which open science is performed, associated with the following areas:
o Organization and Authority – Focuses on the roles and responsibilities for providing the required leadership/ownership, objectives and resources for the development and enforcement of all cyber-security programs.

o Policy – Focuses on establishing appropriate policy oversight of IT security policies and supporting IT security efforts that set required standards, guidance and enforcement to meet compliance and risk requirements.

o Audit and Compliance – Focuses on compliance and audits within the organization to provide management and regulators with assurance that controls are adequately designed and operating effectively to meet compliance and risk management requirements.

o Risk Management and Intelligence – Focuses on proactively identifying new threats, vulnerabilities, and risks outside and inside the organization through key strategic alliances, innovative information gathering, and information sharing practices. Also focuses on on-going risk assessments, identification of risk tolerance levels, and implementation of associated risk control programs.

o Privacy – Focuses on information privacy and protection within the organizational business framework and institutional policy and compliance requirements.

o Incident Management – Focuses on response and resolution of information security incidents to minimize the business impact and risk of further incidents, as well as meet all legal and/or contractual requirements.

o Education and Awareness – Focuses on the planning, procedures, documentation and implementation of security awareness, education and related training for employees, service providers, students, faculty, and other users of the computing research and computing resources within DOE and other entities.
o Operational Management – Focuses on appropriate security controls and operational practices for networks, computer systems, applications, and data throughout entities involved in the open science enterprise.

o Technical Security and Access Control – Focuses on the controls that restrict access in compliance with Information Systems Security Policy and its stated operating principles (“Access of Least Privilege” and “Separation of Duties”).

o Monitoring, Measurement and Reporting – Focuses on the controls that define the event information that will be logged and monitored, adequate analysis of the information, and the alert levels that will be triggered for incident response.

o Physical and Environmental Security – Focuses on physical protection of data center/physical assets and data from theft, damage or loss.

o Asset Identification and Classification – Focuses on planning and operational procedures related to inventory, accountability and responsibility, classification, and implementation of associated controls.

o Account Management and Outsourcing – Focuses on the policy and procedures governing the hiring, transfer, termination, and clearance processes for employees, contractors, and vendors.
As a starting point to address the above elements, it might be useful to adapt the model depicted in Table I. This model has the potential to define new operations from a team point of reference. The security strategy addresses holding intruders accountable and gives a framework for thinking about tools from multiple disciplines, such as digital forensics, legal remedies, and active defense.
TABLE I. 4R MODEL FOR ACCOUNTABLE SYSTEMS

  Security Strategy                                     Tools
  ---------------------------------------------------------------------------------------------
  Resistance                                            - Firewalls
    Ability to repel attacks                            - User authentication
                                                        - Diversification

  Recognition                                           - Intrusion detection systems
    1) Ability to detect an attack or a probe           - Internal integrity checks
    2) Ability to react/adapt during an attack

  Recovery                                              - Incident response ("forensics" - the what)²
    1) Provide essential services during an attack      - Replication
    2) Restore services following an attack             - Backup systems
                                                        - Fault-tolerant designs

  Redress                                               - Forensics - the who
    1) Ability to hold intruders accountable in a       - Legal remedies
       court of law                                     - Active defense
    2) Ability to retaliate
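As a minimal illustration, the 4R model of Table I can be captured as a simple lookup structure that a planning tool could build on; the strategy names and tools are taken from the table, while the code organization itself is an assumption.

    # The 4R model of Table I as a simple lookup structure (illustrative only).
    FOUR_R_MODEL = {
        "Resistance":  {"goal": "repel attacks",
                        "tools": ["firewalls", "user authentication", "diversification"]},
        "Recognition": {"goal": "detect an attack or probe; react/adapt during an attack",
                        "tools": ["intrusion detection systems", "internal integrity checks"]},
        "Recovery":    {"goal": "provide essential services during an attack; restore them afterwards",
                        "tools": ["incident response (forensics: the what)", "replication",
                                  "backup systems", "fault-tolerant designs"]},
        "Redress":     {"goal": "hold intruders accountable; retaliate",
                        "tools": ["Forensics (the who)", "legal remedies", "active defense"]},
    }

    def tools_for(strategy: str) -> list[str]:
        """Return the tool set associated with one of the four R's."""
        return FOUR_R_MODEL[strategy]["tools"]

    print(tools_for("Redress"))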
Role Call of Cybersecurity Research Opportunities
First and foremost, the DOE cybersecurity environment exists to enable operations that
support the scientific community. Before any additional security mechanisms/procedures are
implemented, the focus must remain on providing robust services, meeting customer service
expectations, and ensuring customers that the computing infrastructure is reliable. Today this
means that the operational community³ requires certain conditions be met by any cybersecurity
innovation:
1) Any technical security protections provided outside the service operators' own control must be predictable in operation. In other words, service operators must be able to predict the response of technical security controls to any legitimate use or operation of a service. Further, to the greatest extent possible, the configuration and operation of such controls should be open to inspection by the operators of affected services.
2) Host-based security controls must be compatible with the need to make regular patches to
operating systems and software. Patches and updates to other security controls must be made on
schedules consistent with operation of those services supported by affected hosts. (Exceptions
may be made for severe vulnerabilities under current attack.)
² While forensics (small "f") is part of Recovery in the 3R model [1], it is not rigorous enough for capturing evidence admissible in a courtroom, since its focus is on discovering what happened in order to restore network function quickly [1]. Redress requires computer Forensics (capital "F"), which focuses on establishing who is responsible in order to develop suitable evidence to hold them accountable in a court of law.

³ Note that for the purposes of this white paper, "operational community" means those groups within a DOE or Contractor site which operate network-accessible services.
3) Network-based testing and verification of host security must not become a denial-of-service attack in itself. New tests against hosts providing services, and new intensity levels of existing tests, must only be initiated in cooperation with service operators.
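As one hedged illustration of condition 3, a verification probe can be paced against a budget agreed with service operators in advance; the hosts, ports, and budget below are placeholders, not part of any actual scan plan.

    import socket
    import time

    # Hypothetical scan plan agreed with the service operators beforehand.
    AGREED_TARGETS = [("example.org", 443)]   # placeholder host/port
    MAX_PROBES_PER_MINUTE = 6                 # assumed budget set by the operators

    def polite_port_check(host: str, port: int) -> bool:
        """Single TCP connect check; the caller is responsible for pacing."""
        try:
            with socket.create_connection((host, port), timeout=3):
                return True
        except OSError:
            return False

    for host, port in AGREED_TARGETS:
        reachable = polite_port_check(host, port)
        print(f"{host}:{port} reachable: {reachable}")
        time.sleep(60 / MAX_PROBES_PER_MINUTE)   # pacing keeps the test well below DoS levels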
Research Grand Challenge Opportunities
1) Scientists do their science in an interconnected system that is growing dynamically. The number of accounts and passwords associated with these systems is also increasing, and scaling authentication is becoming harder. There is a need for new authentication paradigms that will usably scale to increasingly complex networked environments.
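One possible direction, sketched here only as an illustration, is to replace per-site passwords with short-lived signed assertions that any collaborating site can verify; the federation key, token format, and lifetime below are assumptions, not a proposed standard.

    import base64, hashlib, hmac, json, time

    # Hypothetical shared key distributed to collaborating sites out of band.
    FEDERATION_KEY = b"replace-with-real-key-material"

    def issue_token(user: str, lifetime_s: int = 3600) -> str:
        """Issue a short-lived, signed assertion instead of a per-site password."""
        claims = {"sub": user, "exp": int(time.time()) + lifetime_s}
        body = base64.urlsafe_b64encode(json.dumps(claims).encode())
        sig = hmac.new(FEDERATION_KEY, body, hashlib.sha256).hexdigest()
        return f"{body.decode()}.{sig}"

    def verify_token(token: str) -> dict | None:
        """Any site holding the key can verify, without its own password database."""
        body, sig = token.rsplit(".", 1)
        expected = hmac.new(FEDERATION_KEY, body.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return None
        claims = json.loads(base64.urlsafe_b64decode(body))
        return claims if claims["exp"] > time.time() else None

    token = issue_token("alice@lab.example")
    print(verify_token(token))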
2) Because of the complexity of the networked environment, scientists are no longer able to self-monitor their infrastructure and respond to adverse events. Techniques (such as new visualizations) are needed to help them remain situationally aware of the status of their infrastructure, make decisions, and determine actionable, coordinated responses that mitigate adverse events.
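The kind of aggregation such visualizations would sit on might look like the following toy sketch; the sites, event names, and severity rules are invented for illustration.

    from collections import Counter

    # Hypothetical event stream: (site, severity) pairs pulled from distributed logs.
    events = [
        ("ANL-cluster", "warning"), ("ORNL-storage", "critical"),
        ("ANL-cluster", "critical"), ("NERSC-login", "info"),
    ]

    # Roll events up into a per-site severity summary a scientist could actually read.
    summary: dict[str, Counter] = {}
    for site, severity in events:
        summary.setdefault(site, Counter())[severity] += 1

    for site, counts in summary.items():
        worst = "critical" if counts["critical"] else ("warning" if counts["warning"] else "info")
        print(f"{site}: {dict(counts)} -> overall status: {worst}")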
3) New ways of doing open science may require new methods and metrics to systematically track successes and promote cyber-safe culture behaviors, versus behaviors geared only to meeting auditing requirements. As examples: a) measures that can serve as indicators of how well IA practitioners and scientists are integrating their roles in open science to provide a safe-cyber culture, rather than separate measures for each (these measures should include usability in the areas of deployability, supportability, accessibility, and complexity versus security); and b) measures that enable the assessment of how sufficient the security is for a particular mission. As part of this effort, the research agenda might address a modeling and simulation test-bed that can assess the impacts of policies for all stakeholders: a science-research SimCity, if you will.
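A toy version of such a "science-research SimCity" might look like the sketch below, a Monte Carlo run that scores a candidate policy on both expected incidents and science time lost to controls; every parameter is invented for illustration.

    import random

    def simulate_policy(p_block_attack: float, usability_penalty_hours: float,
                        attacks_per_year: int = 500, runs: int = 1000) -> dict:
        """Score a candidate security policy on incidents vs. lost science time (toy model)."""
        incident_counts = []
        for _ in range(runs):
            incidents = sum(1 for _ in range(attacks_per_year) if random.random() > p_block_attack)
            incident_counts.append(incidents)
        return {
            "mean_incidents_per_year": sum(incident_counts) / runs,
            "science_hours_lost_to_controls": usability_penalty_hours,
        }

    # Compare a heavyweight policy against a lighter, context-tuned one.
    print("strict:", simulate_policy(p_block_attack=0.98, usability_penalty_hours=400))
    print("tuned :", simulate_policy(p_block_attack=0.95, usability_penalty_hours=120))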
4) Policies that are based on how secure you (the scientist, with his/her collaborators) are in your particular environment. Security is based on context rather than a one-size-fits-all standard.
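A minimal sketch of what context-based security could mean operationally; the contexts and control profiles below are invented for illustration.

    # Hypothetical mapping from operational context to a proportionate control set.
    CONTROL_PROFILES = {
        "open-data-analysis": {"mfa": False, "network_isolation": False, "audit_level": "basic"},
        "remote-instrument":  {"mfa": True,  "network_isolation": True,  "audit_level": "full"},
        "export-controlled":  {"mfa": True,  "network_isolation": True,  "audit_level": "full"},
    }

    def controls_for(context: str) -> dict:
        """Pick controls proportionate to the collaboration's context, not one global bar."""
        return CONTROL_PROFILES.get(context, CONTROL_PROFILES["remote-instrument"])  # safe default

    print(controls_for("open-data-analysis"))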
5) Often missed is the idea of containment and distributed incident management. This should include how others (humans and IT systems) should be alerted to prevent the impact of an event from spreading.
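A small sketch of the alerting side of distributed containment; the peer endpoints and message format are assumptions, not an existing protocol.

    import json, time

    # Hypothetical list of peer-site incident-response endpoints.
    PEER_SITES = ["ir@site-a.example", "ir@site-b.example"]

    def containment_alert(indicator: str, action_taken: str) -> list[str]:
        """Build the messages that would notify peers so they can contain spread locally."""
        alert = {
            "time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "indicator": indicator,          # e.g. a malicious IP or file hash
            "local_action": action_taken,    # what this site already did
            "requested_action": "block indicator and report sightings",
        }
        return [f"to {peer}: {json.dumps(alert)}" for peer in PEER_SITES]

    for msg in containment_alert("203.0.113.7", "host quarantined from science network"):
        print(msg)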
6) Weaponizing data for a variety of tactical purposes. The idea is to make data dangerous, instead of encrypting it, because encryption is so easily misused in expensive and unwelcome ways; it is also administratively heavy and expensive. Dangerous data can be far more valuable, and likely more effective, for more nimble, flexible and cheap defensive strategies. This is an offensive capability that not only protects, it provides immediate negative consequences to the perpetrators, removing the "human factor" reward built into our current systems and culture that favors the perpetrators. Thus perpetrators become accountable through this means of deterrence, rather than the victims being the ones held accountable. (NOTE from Kirk: Can one of you scientists please invent this capability tonight for me? You would be pleased to see how well I would use it.)
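The capability asked for above does not yet exist as described. The closest existing relative is the honeytoken, or canary record: decoy data whose mere use marks the actor as an intruder and can trigger attribution work. A minimal sketch along those lines, with all names invented:

    import secrets

    # Plant a decoy record among real data; any use of it is, by construction, illegitimate.
    def make_honeytoken(dataset_name: str) -> dict:
        token_id = secrets.token_hex(8)
        return {
            "dataset": dataset_name,
            "record_id": f"decoy-{token_id}",
            "note": "never referenced by legitimate workflows",
        }

    def check_access(record_id: str, planted: set[str]) -> None:
        """Called from the data service's access path: a decoy hit triggers attribution work."""
        if record_id in planted:
            print(f"ALERT: honeytoken {record_id} accessed; start forensic capture of this session")

    decoy = make_honeytoken("beamline-run-42")
    planted_ids = {decoy["record_id"]}
    check_access(decoy["record_id"], planted_ids)   # simulated intruder touch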
7) Deriving operational practices of engagement that promote the culture of doing science as a team sport. This, in turn, will create new needs for developing metrics, education, and training.
Bibliography
[1] Endicott-Popovsky, B.E. and Frincke, D., "Adding the fourth 'R': A systems approach to solving the hacker's arms race," Hawaii International Conference on System Sciences (HICSS) 39 Symposium: Skilled Human-Intelligent Agent Performance: Measurement, Application and Symbiosis, Kauai, HI, 4 January 2006. Retrieved Feb. 17, 2007 from http://www.itl.nist.gov/iaui/vvrg/hicss39/4_r_s_rev_3_HICSS_2006.doc.
[2] Maconachy, V., Schou, C., Ragsdale, D. and Welch, D., "A model for information assurance: An integrated approach," in Proceedings of the 2nd Annual IEEE Information Assurance Workshop, USMA, West Point, NY, June 2001.
[3] McCumber, J., "Information systems security: A comprehensive model," in Proceedings of the 14th National Computer Security Conference, Washington, D.C., October 1991; reprinted in Proceedings of the 4th Annual Canadian Computer Security Conference, Ottawa, Ontario, May 1992; reprinted in DataPro Reports on Information Security, Delran, NJ: McGraw-Hill, 1992.
[4] Ellison, R.J., Fisher, D.A., Linger, R.C., Lipson, H.F., Longstaff, T.A. and Mead, N.R., "Survivable network systems: An emerging discipline," CMU/SEI-97-TR-013, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, May 1999.
[5] Ryan, D., "New directions in cyber law," paper presented at the CISSE 7th Colloquium, Washington, D.C., June 2003.
[6] Schur, A., Meyers, J.D., Keating, K.A., Payne, D.A., Valdez, T. and Yates, K., "Collaborative suites for experiment-oriented scientific research," ACM Interactions, Vol. 5, No. 3, May/June 1998, pp. 40-47.
See Kirk’s Appendix A, attached.