International Symposium
The University of Texas at Arlington, 28-30 July
Part 1: Mathematical Modeling of Processes in System Life Cycles
in Compliance with the Requirements of the ISO/IEC 15288 Standard
Part 2: Methodical Approach for the
Evaluation of System Vulnerability
in Conditions of Terrorist Threats
Prof. Kostogryzov Andrey, Russia, Moscow, akostogr@ceisoq.ru
Plan (part 1)
1. The Role of Mathematical Modeling. Abstract Idea for
Information Systems Modeling
2. Offered Models and Software Suites CEISOQ
3. Mathematical Modeling for Process Architectures
Analysis, Examples
4. Mathematical Modeling of Items Gathering, Examples
5. Mathematical Modeling of Item Content Analysis, Examples
6. Mathematical Modeling of System Protection, Examples
7. Mathematical Models in Development, Examples
1.1 The Role of Mathematical Modeling. Abstract Idea for
Information Systems Modeling
LIFE CYCLE STAGES AND PURPOSES:
- CONCEPT: identify stakeholders' needs; explore concepts; propose viable solutions.
- DEVELOPMENT: refine system requirements; create the solution description; build the system (including mathematical software models as a system component); verify and validate the system.
- PRODUCTION: produce individually or in quantity; inspect and test.
- UTILIZATION: operate the system to satisfy users' needs.
- SUPPORT: provide sustained system capability.
- RETIREMENT: store, archive or dispose of the system.
MODEL CAPABILITIES ACROSS THE STAGES: substantiation of quantitative system requirements; requirements analysis; investigation of risks and potential threats; evaluation of proposed solutions and expected hazards; tests and evaluations of system operation quality; optimization of parameters; substantiation of rational modes.
Practical solutions are based on system analysis: the foundation for providing high system quality is the rational use of models and methods that allow evaluating and optimizing the different processes existing in the life cycle.
1.2 The Role of System Analysis and Mathematical Modeling: the main actions for the rational achievement of system purposes
For example, in Project Processes:
- for Project Assessment Process - assess project progress, analyze
data and measures to make appropriate recommendations;
- for Risk Management Process - evaluate the risks, determine the risk
treatment strategies;
- for Information Management Process - define information maintenance actions;
in Technical Processes:
- for Requirements Analysis Process - define the functional boundary of
the system, technical and quality in use measures, specify system requirements and
functions;
- for Architectural Design Process - define appropriate logical
architectural designs, evaluate alternative design solutions;
- for Operation Process - monitor the system operation, introduce remedial changes to operating procedures, the operator environment, human-machine interfaces and operator training as appropriate when human error has contributed to failure, etc.
1.3 Abstract Idea for Information Systems Operation
Modeling
[Diagram: the information system within a SYSTEM - higher and subordinate systems, interacting systems, users, operated objects, sources, purposes, resources, use conditions and requirements to the information system]
The general purpose of operation: to meet requirements for the reliable and timely production of complete, valid and confidential information for its subsequent use.
The totality of system characteristics that bears on the ability to satisfy users' needs in output information defines information system operation quality.
1.4 Abstract Idea for Information Systems Modeling
Required information (ideal): reliable, timely, complete, valid and confidential.
Used information (real) may fall short of the required quality:
- non-produced as a result of the system's unreliability;
- untimely;
- non-actual;
- incomplete due to random faults missed during checking;
- with intolerable mistakes due to processing;
- doubtful due to random faults of staff and users;
- with hidden distortions as a result of unauthorized accesses;
- with hidden software distortions;
- non-confidential.
2 Offered Models and Software Suites CEISOQ
CEISOQ was presented at the conferences "Modeling and Certifying of Arms and Military Techniques" (1998-2000), at the International Air-Space Show MAKS (1999, the town of Zhukovsky), at the International Engineering and Technical Management Symposium (2000, the USA), at the International Computation and Information Conference (2000, Kuwait), and at the International Seminar "Mathematical Methods, Models and Architecture for Providing Computer Networks Security" (2001); it was awarded the Golden Medal of the International Innovation and Investment Salon (2001, the chairman of the jury being the Nobel Prize winner Mr. Alfyorov) etc., and is patented and certified.
2.1 Model of Functions Performance in Conditions of System Unreliability (physics)
[Diagram: failure causes 1-3 on a time axis; reliability is provided when the required permanent operation time fits within a failure-free interval (e.g. between t1 and t1+q), and is not provided otherwise. Legend:
- failure-free system operation time (operative conditions);
- failure recovery time (inoperative conditions);
- required permanent time of reliable operation.]
An application allows answering such questions as:
• what requirements to items' (hardware/software units, staff or users) time between failures and repair time are admissible?
• which operation processes should be duplicated?
• what about the reliability of a system architectural design? etc.
2.2 The Models Complex of Calls Processing (physics for information system)
[Diagram: a SERVING SYSTEM with a BUFFER receives users' calls for input information filling, output information producing, message transfer, and calls for technological operations; its outputs are produced output data, performed technological operations, transferred messages and filled data.]
An application allows answering the questions:
- which processing units should be chosen from the productivity point of view?
- which call flows and functional tasks may be the main causes of bottlenecks? etc.
The software tools CEISOQ allow comparing the effectiveness of six dispatcher technologies (a rough simulation sketch follows the list):
• technology 1 for a calls processing without priorities: in consecutive order for unitasking
processing mode; in time-sharing order for multitasking processing mode;
• priority technologies in consecutive processing order:
– technology 2 for calls processing with relative priorities “first in - first out”;
– technology 3 with absolute priorities;
– technology 4 for calls batch processing with relative priorities;
– technology 5 combined by technologies 2, 3 and 4.
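The slides do not show how CEISOQ implements these comparisons. Purely as an illustration of why the choice of dispatcher technology matters, here is a minimal single-server queue simulation contrasting technology 1 (no priorities, consecutive order) with a non-preemptive relative-priority discipline in the spirit of technology 2; the arrival and service rates and the two call classes are hypothetical, not taken from the talk.

```python
import heapq
import random

def simulate(priority: bool, n_calls=100_000, lam=0.9, mu=1.0, seed=7):
    """Single-server queue with two Poisson call classes (0 = urgent,
    1 = routine).  priority=False: plain consecutive order (technology 1);
    priority=True: non-preemptive relative priorities (technology 2)."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_calls):                  # merged Poisson arrival flow
        t += rng.expovariate(lam)
        arrivals.append((t, rng.randint(0, 1)))
    free_at, pending, waits, i = 0.0, [], {0: [], 1: []}, 0
    while i < len(arrivals) or pending:
        # admit every call that has arrived by the time the server frees up
        while i < len(arrivals) and (not pending or arrivals[i][0] <= free_at):
            at, cls = arrivals[i]
            key = (cls, at) if priority else (at,)   # priority sorts by class
            heapq.heappush(pending, (key, at, cls))
            i += 1
        _, at, cls = heapq.heappop(pending)
        start = max(free_at, at)
        waits[cls].append(start - at)
        free_at = start + rng.expovariate(mu)        # exponential service
    return {c: sum(w) / len(w) for c, w in waits.items()}

for prio in (False, True):
    print("priority" if prio else "FIFO", simulate(prio))
```

Under heavy load the priority discipline sharply cuts the urgent class's mean wait at the expense of routine calls - exactly the trade-off technologies 2-5 manage in different ways.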
2.3 The Model of Entering Current Data Concerning New Objects of the Application Domain into a System (physics)
а) Processes of new objects and events (OE) appearance and information delivery: [diagram: OE No. 1-4 appear on a time axis; messages No. 1-4 about them are delivered with delays, illustrating periods of information completeness and incompleteness]
b) Formalization as a modeling queuing system M/G/: [diagram: the real process - OE appearance, information delivery through preparation and transfer units 1 and 2, storage of object features and event parameters in a database - is formalized as a call flow served by the queuing system; served calls correspond to entered information]
An application allows answering such questions as: what productivity of preparation, transfer and input units should be preferred to provide information completeness? and others.
2.4 Models of Items Gathering from Sources (physics)
Unlike Model 2.3, the evaluated items' actuality characterizes the updating processes.
[Diagram: database (DB) updating moments on a time axis; items' actuality is provided while actuality is maintained after the last updating, and is not provided otherwise. Legend: updating moments; items' actuality maintenance after the last updating; non-actuality of items.]
An application allows answering the question: what productivity of preparation, transfer and input units and which gathering technologies should be preferred to provide items' actuality?
2.5 Models of Items Gathering from Sources (mathematics)
The probability of information actuality maintenance until the moment of its use (proved on the basis of the limit theorem for regenerative processes) is:
а) for the mode of source information delivery immediately after an essential change of the object's current state:

P = \frac{1}{\Theta} \int_0^{\infty} B(t)\,[1 - C(t)]\,dt;

b) for the updating mode regardless of whether the object's current state has changed or not (for example, when gathering is regulated):

P = \frac{1}{q} \int_0^{\infty} [1 - Q(t)] \left( \int_0^{\infty} [1 - C(t + \tau)]\,dB(\tau) \right) dt,

where C(t) is the probability distribution function (PDF) of the time between neighboring essential changes of the real state and \Theta is its mean; B(t) is the PDF of the time for information preparing, delivering and inputting; Q(t) is the PDF of the time interval between neighboring updatings and q is its mean.
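A minimal numeric sketch of both formulas, under purely illustrative exponential assumptions: theta, beta and q below are hypothetical means of C(t), B(t) and Q(t). For the exponential case the integrals collapse to closed forms, which the quadrature reproduces.

```python
from math import exp
from scipy.integrate import quad, dblquad

theta, beta, q = 60.0, 5.0, 30.0          # hypothetical means, e.g. minutes
B = lambda t: 1 - exp(-t / beta)          # PDF of preparing/delivering time
C = lambda t: 1 - exp(-t / theta)         # PDF of time between changes
Q = lambda t: 1 - exp(-t / q)             # PDF of the updating interval
b = lambda tau: exp(-tau / beta) / beta   # density dB(tau)

# Mode (a): delivery starts immediately after each essential change.
P_a, _ = quad(lambda t: B(t) * (1 - C(t)) / theta, 0, float("inf"))

# Mode (b): regulated updating, independent of the object's state.
P_b, _ = dblquad(lambda tau, t: (1 - Q(t)) * (1 - C(t + tau)) * b(tau) / q,
                 0, float("inf"), 0, float("inf"))

print(P_a, theta / (theta + beta))                          # both ~0.923
print(P_b, theta / (theta + q) * theta / (theta + beta))    # both ~0.615
```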
2.6 Model of Items Analysis (physics on example of information checking)
[Diagram: the check-up and fault correction process - initially prepared real information (containing faults and fault-free fragments) passes a formalized check for the absence of faults, while the checker's attention concentration is either OK or being restored; in the output information some faults are detected while others remain missed.]
A man is the most valuable system component.
An application allows answering the questions: is a checker able to reveal all existing errors? Moreover, is he able to commit no errors of his own? Is software support needed for effective analysis? What about system faultlessness in real-time operation? etc.
2.7 Models Complex of Dangerous Influences on a Protected System (physics, on the example of computer virus influences)
[Diagram: established regular diagnostics - after validation of the initial system, functional tasks are performed in the established mode while the system is in a state able to operate; a virus may penetrate and be activated (hidden or visible), violating system operation; diagnostics of the state of system resources and transformation of the system to the «pure» state restore its integrity.]
Dangerous influences (for example from computer bugs, viruses or terrorists etc.) cause high system vulnerability not only through the suddenness and incomprehensibility of their influences but mainly on account of their being insufficiently studied.
An application allows answering the questions:
· what about the danger of influences on a protected system, and which safety technologies should be preferred for different environment scenarios? etc.
2.8 Model of an Unauthorized Access to System Resources (physics)
The core of the modeling is barrier overcoming as a random process.
[Diagram: a violator attempts unauthorized actions against stored system resources through a sequence of protection barriers (from the 1st to the k-th barrier, each to be overcome during an unauthorized access); at some barriers the unauthorized access is not realized, while a violator who overcomes all barriers realizes the unauthorized access.]
An application allows answering such questions as:
- what about the required quantity of barriers?
- what barrier decoding time can be tolerated against unauthorized accesses? etc.
2.9 Model with Due Regard to the Objective Value of Resources (physics, on the example of information confidentiality)
[Diagram: the same barrier-overcoming scheme as in Model 2.8 - a violator's unauthorized actions against stored system resources through barriers 1 to k - with one addition: the resources keep their objective value only for a limited period after its beginning, and information confidentiality is maintained if access is not realized within that period.]
The Model differs from the Model of an Unauthorized Access in considering the period of objective value of the kept resources. It allows answering the questions: what about the probability of unauthorized access with this consideration? Is this period valuable? etc.
3 Mathematical Modeling for Process Architectures Analysis
For example, a system project shall implement the following activities:
- define appropriate logical architectural designs
- analyze the system functions identified in requirements
analysis and allocate them to elements of system architecture
- analyze the resulting architectural design to establish
design criteria
- determine which system requirements are allocated to
operators for the most effective, efficient and reliable
human-machine interaction
- evaluate alternative design solutions, modeling them to a
level of detail that permits comparison against the
specifications expressed in the system requirements and the
performance, costs, time scales and risks expressed in the
stakeholder requirements
- specify the selected physical design solution as an
architectural design baseline in terms of its functions,
performance, behavior, interfaces and unavoidable
implementation constraints etc.
Example 3.1.
Let an Air Transport System be developed for making intercontinental flights (see Fig. D.1 of ISO/IEC 15288): Aircraft system (subsystem 1); Airport system (subsystem 2); Fuel distribution system (subsystem 3); Air traffic control system (subsystem 4); Ticketing system (subsystem 5). Let, from the users' point of view, the mean time in an inoperable technical condition for the hypothetical Air Transport System equal 2 hours per year, and let this be admissible. This means the admissible availability equals 0.99977 (1 - 2 hours / (365 days × 24 hours)). System recovery time after an error equals 30 minutes.
It is required to evaluate the availability and the probability of reliable Air Transport System operation during 1 day, assuming equal reliability for all subsystems.
Results: 1) to provide the required availability, the mean time between failures for one subsystem should be more than 1.3 years, and the mean time between failures for the whole system will be about 0.26 years;
2) to provide the required reliability during one day, the mean time between failures for one subsystem should be more than 61 years and for the whole system about 12.2 years - 47 times more than the availability requirement demands (!!!)
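A short check of these figures, assuming exponential failure times and a series structure of the five subsystems (so that failure rates add); under those assumptions the slide's numbers are reproduced to within rounding.

```python
from math import log

HOURS_PER_YEAR = 365 * 24
A_req = 1 - 2 / HOURS_PER_YEAR            # admissible availability ~0.99977
mttr, n_sub = 0.5, 5                      # 30-minute recovery, 5 subsystems

# Availability: A = MTBF / (MTBF + MTTR)  =>  MTBF = MTTR * A / (1 - A)
mtbf_sys = mttr * A_req / (1 - A_req)
print(mtbf_sys / HOURS_PER_YEAR,          # ~0.25 years for the whole system
      n_sub * mtbf_sys / HOURS_PER_YEAR)  # ~1.25 years per subsystem

# Reliability over one day: P = exp(-t / MTBF) >= A_req
mtbf_rel = -24 / log(A_req)
print(mtbf_rel / HOURS_PER_YEAR,          # ~12 years for the whole system
      n_sub * mtbf_rel / HOURS_PER_YEAR)  # ~60 years per subsystem
```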
Example 3.2. An information basis for the efficient use of fighters is the on-board radar complex. It weighs about 1% of the whole airplane and its cost varies in the range of 10-20% of the whole airplane's cost. One of the bottlenecks of any modern fighter is the low reliability of radar stations (RS). A mean time between failures of 200 hours is still considered admissible. For an assigned admissible probability of reliable performance of a set task of not less than 0.95, it is required to define the maximum continuous time of on-board RS use.
Solution. Let the mean time of RS repair after a failure equal 4 hours. The maximum mean time of continuous RS use must not exceed 6 hours 15 minutes. This is enough for the fulfillment of a certain military task by a fighter. If the mean time between failures equals 200 hours, the probability of reliable RS operation while a fighter fulfills a military task cannot exceed the level of 0.97-0.98 (!!!).
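A sketch of one plausible reading of this computation: the RS must be operable at the task's start (steady-state availability A = MTBF/(MTBF + MTTR)) and then survive the task without failure, P(t) ≈ A·exp(−t/MTBF). This gives about 6 h 18 min against the slide's 6 h 15 min, and the 0.98 ceiling as t → 0; the exact model may differ slightly.

```python
from math import log

mtbf, mttr, p_req = 200.0, 4.0, 0.95      # hours
A = mtbf / (mtbf + mttr)                  # steady-state availability ~0.980

# P(t) = A * exp(-t / mtbf) >= p_req  =>  t <= mtbf * ln(A / p_req)
t_max = mtbf * log(A / p_req)
print(A, t_max)                           # ~0.980 and ~6.3 hours
```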
Example 3.3. If any Transport System (see Fig. D.1 of ISO/IEC 15288) is very large, there are hundreds or even thousands of staff members, and solving a given functional system problem requires the efforts of several specialists. Let a problem solution depend on the joint but independent actions of 5 people. Let each of 4 specialists make 1 error a month and the 5th, inexperienced person make 1 error a day. System recovery time after an error equals 30 minutes. It is required to evaluate the faultlessness of such a group's actions within a week.
Results. The probability of faultless joint actions of the first 4 specialists within a 40-hour workweek equals 0.8-0.82, but the low-quality work of the 5th member mocks the whole group's work: the probability of faultless actions decreases to 0.02 (!!!).
These results prove the importance of thorough specialist training, because a man is the main system bottleneck.
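A crude cross-check under the assumption of independent exponential error flows, P = Π exp(−T/MTBF_i), with a calendar month of 730 hours. The first factor reproduces the slide's 0.8; the overall figure depends on how "one error a day" is calendared (24-hour or 8-hour day) and on the recovery model, so this crude product only brackets the slide's 0.02.

```python
from math import exp

WEEK, MONTH = 40.0, 730.0                  # working week / calendar month, h
p4 = exp(-4 * WEEK / MONTH)                # four specialists, 1 error/month
print(p4)                                  # ~0.80, matching the slide
for mtbf5 in (24.0, 8.0):                  # "1 error a day", two readings
    print(mtbf5, p4 * exp(-WEEK / mtbf5))  # ~0.15 and ~0.005
```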
4. Mathematical Modeling of Items Gathering
Example 4.1.
Military aviation effectiveness depends on the use of intermediate-range missiles. Distant air enemy detection and identification is carried out by an on-board radar station (RS).
All locators are divided into 2 groups: with mechanical air surveillance (3rd generation) and with electronic scanning (4th generation). The unavoidable disadvantage of existing mechanical scanning systems is the impossibility of combining two modes: missile guidance on a target and surveillance. This means that a sequentially surveying air RS sets the current positions of several just-detected targets. In the case of fighting non-maneuvering low-speed targets (for example cruise missiles) this method proves to be adequate. When enemy maneuverability is high, the achieved information actuality turns out to be insufficient.
To destroy a maneuvering target, its data should be fixed 5-10 times per second. The thing is that an antenna with mechanical surveillance may define a new target's data only in its next turn, i.e. in about a second. In this case a pilot has no ability to survey the air, which in combat conditions causes an indubitable loss.
What about information quality for locators of the 3rd and 4th generations, quantitatively?
Example 4.1 (computation results)
Input: the mean time between essential changes, the mean preparing time, the mean transfer time and the mean time of entering into the system for each variant i; the gathering mode (in some variants information is gathered without any dependence on changes); qi, the mean time between updatings. Output: Pact.i, the probability of information actuality at the moment of use.
[Table: case of fighting non-maneuvering targets (i = 1-4 are RS with electronic scanning, i = 5-6 with mechanical scanning)]
[Table: case of fighting high-maneuvering enemy targets]
Example 4.1 (summary of modeling)
Computation results show that an information gathering process architecture based on RS with mechanical scanning provides information actuality of not less than 0.88-0.94. The achieved probability of 0.88-0.94 may be considered a standard of effectiveness in fighting non-maneuvering or low-speed targets. For comparison, the use of electronic scanning RS allows increasing this probability to 0.96-0.98.
The difference in the case of high-maneuvering enemy targets is in the frequency of significant changes of target positions - they happen every 1-2 seconds. Computation results prove that mechanical scanning RS may provide information actuality equal to 0.56-0.74.
Electronic scanning RS may provide used information actuality equal to:
- 0.81-0.85 in the case of sharp turn maneuvers within a second;
- 0.90-0.92 in the case of turn maneuvers within 2 seconds.
Conclusion: an actuality increase to the level of 0.9 and higher is a very important and necessary condition for fighter aviation to effectively oppose high-maneuvering targets. Due to such high information actuality a system can obtain a new quality (in practice, a "transition from QUANTITY to QUALITY").
Example 4.2. Information is in the first place among the roots of Wal-Mart's success. To increase productivity, salesclerks were equipped with manual bar-code readers. Information contained in a bar-code is shown on a display. A worker can get a retrospective picture of products sold within a day, a week and 5 weeks. For each article there is everything that may be necessary for ordering. What may be achieved due to an increase of information actuality?
Summary. The actuality of information is not less than 0.992 (i = 1-4). Information read from a bar-code is transferred to the Wal-Mart headquarters; the satellite system is connected with more than 4000 company suppliers. For comparison, in other shops, where information is updated hourly, information actuality equals 0.3-0.7 (i = 5-8), i.e. at the moment of use the information is as likely true as false. Feel the difference!!!
5. Mathematical Modeling of Item Content Analysis
Example 5.1. Let the work order of operators of an air traffic control system (see Fig. D.1 of ISO/IEC 15288) be developed. As the operator's main work is connected with monitors, it is required to substantiate the duration of a continuous operator's work shift under the condition that the probability of obtaining correct results of information analysis is not less than 0.99.
Data for modeling
Let the number of simultaneously tracked objects not exceed 20. Changes of an object's state happen every second, so during an hour of work there happen not more than 3600 changes of each object's state. As a flight is continuous, it is critical for an operator if the frequency of object state changes equals 1 change per 5 seconds.
Let's assume that among these 20 objects there are not more than 4 significant ones preparing for a take-off or landing. Thus within an hour an operator must analyze up to 14400 objects' states (3600/5 × 20; see the check below), within 2 hours - 28800, within 4 hours - 57600, within 5 hours - 72000. The fraction of essential information doesn't exceed 20%. The analysis speed of an experienced analyst equals 4 objects a second. The frequency of type I and type II errors equals 1 error a week.
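The load figures follow from simple arithmetic (20 objects, one state change per 5 seconds each), checked below:

```python
objects, change_period_s = 20, 5
per_hour = 3600 // change_period_s * objects   # 720 per object x 20 objects
for hours in (1, 2, 4, 5):
    print(hours, hours * per_hour)             # 14400, 28800, 57600, 72000
```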
Example 5.1 (summary of modeling)
Input: Vi, the content of analyzed information; the fraction of information essential for analysis; the information analysis speed; ni, the frequency of type I errors (when unimportant information is considered essential); TMTBF i, the mean time between algorithmic type II errors; Tcont.i, the continuous period of an analyst's work; Treq.i, the assigned term (deadline) for analysis. Output: Pafter i, the probability of obtaining correct analysis results; the fraction of unaccounted essential information after analysis.
Here i = 1-3 describe one-hour work and i = 4-8 five-hour work. Owing to the high qualification of the operators, the computed correctness of their work is stably high - 0.94-0.98, but less than required (0.99).
Result: there is no formal solution of the problem, because the assigned probability is almost unachievable; the problem cannot be solved by controlling the shift's work time alone.
Example 5.2. An automated monitoring system is developed for a territorially distributed system critical from the safety point of view (an oil pipeline or gas network, the communications of a chemical enterprise, stores of nuclear industry waste, a regional energy system etc.). Information from automatic sensors is transferred to an integrating center. Though the degree of control is quite high and the control itself is continuous, the main information gathering and processing functions are performed in an automatic mode. An operator receives integral information, makes decisions and develops control actions. It is required to define the minimum speed of data processing such that the probability of obtaining correct integrated results is not less than 0.9999.
Result: only if the data processing speed is not less than 199500 symbols a minute in the automatic mode and not less than 108 graphical results a minute in the manual mode may the required correctness be achieved. The obtained results are system engineering requirements for a monitoring subsystem as well as requirements to the qualification of system staff.
6.1 Modeling of System Protection From Unauthorized Accesses
Example 6.1. At the end of 1941 the allies of the anti-Hitler coalition began to form convoys of transport and military vessels in English naval bases, which were sent to the northern ports of the USSR. In 1942 the convoy PQ-17 was sent to the USSR. Suddenly, halfway, the convoy was attacked by German submarines and bombers. As a result, 24 of 36 vessels were sunk; 3350 trucks, 430 tanks and 100000 tons of freight went to the bottom. The matter is that the Finnish radio interception center had received a telegram in Morse code, decoded it and transferred it to the Germans. It is required to substantiate system requirements to cipher complexity for providing the latency of a vessel convoy's transit with a probability not less than 0.99.
Result: to provide information confidentiality during 20 days, it was required to select a cipher algorithm whose decoding takes more than 3 years (!!!).
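A minimal sketch of the order of magnitude, assuming (only for illustration) an exponentially distributed decoding time with mean T_dec: confidentiality over the 20-day transit then holds with P = exp(−20/T_dec) ≥ 0.99, giving a required mean decoding time of several years - the same order as the slide's "more than 3 years" (the exact figure depends on the decoding-time distribution used in the model).

```python
from math import log

p_req, mission_days = 0.99, 20.0
t_dec_days = -mission_days / log(p_req)   # required mean decoding time
print(t_dec_days / 365)                   # ~5.5 years
```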
6.1 Modeling of System Protecting From Unauthorized Accesses
Example 6.2. Once, at a competition, it was suggested to solve some problems connected with enciphering. An algorithm of 97-rate enciphering on an elliptic curve was compared with an RSA algorithm with a 512-digit key. 195 enthusiasts from 20 countries took part in the competition. For the deciphering, which took 40 days, 740 computers were used. In the end the deciphering took about 16,000 machine-years in terms of computers whose throughput is 1 million operations per second.
It seems that the final results of the competition are of no practical importance for a specialist (it was only revealed that a cipher may be deciphered in 40 days). But in fact the latent results are amazing - not in the form they were drawn in the competition, but in the form of cryptographic protection modeling results (!!!).
Example 6.2 (system analysis)
Let it be required to define the maximum admissible period of objective information confidentiality that may characterize protected information, to provide its confidentiality with a probability not less than 0.995.
If the key is changed once a year, the enciphering will provide confidentiality for information whose period of objective confidentiality is more than 200 million years (!) in the case of 97-rate encoding on the elliptic curve, and not more than 4-5 months (!) in the case of enciphering with the RSA algorithm. If a key is changed once a month (today's practice), encoding by the RSA algorithm provides information confidentiality of not less than 0.999, and the period of objective information confidentiality will also be equal to hundreds of millions of years!!!
How should the following information be estimated???
"...The RSA Data Security firm recommends companies to protect data with more reliable keys, whose length exceeds 768 digits or, better, 1028 digits..."
A brilliant (ancient) answer: an effective system engineering decision should be made only on thorough modeling knowledge of system processes.
6.2 Mathematical Modeling of System Protection Against Dangerous Influences
Technology 1 - preventive diagnostics of system integrity;
Technology 2 - multishift security monitoring;
Technology 3 - security monitoring when system integrity is violated.
Input for modeling: the frequency of influences by which a danger source penetrates; the mean activation time of a penetrated danger source (only for technologies 1 and 3); Tbetw.j, the time between the end of one diagnostic and the beginning of the next; Tdiag.j, the diagnostic time, including the time of system integrity repair; TMTBF j, the mean time between operator's errors; Treq.j, the required period of permanent secure system operation. Output: Pinf.j, the probability of the absence of dangerous influence in the system within the assigned period Treq.j; Ppen.j, the probability of the absence of a penetrated danger source.
Example 6.3. A special medical system has appeared on the market, with sensors built into a man's clothes which immediately give warnings of dangerous changes in the man's physical state. A special portable computer gathers information from the sensors and processes it. Let's evaluate what lifetime without any serious illnesses may be provided for people using such clothes.
System analysis. Let's examine 2 variants of a man's reaction to the signals. The 1st variant implies a visit to a doctor and taking the necessary treatment within 4 hours after the man has received a danger signal from the system. The 2nd variant implies the immediate intervention of a personal doctor after the first danger symptoms appear, with recovery of the organism's integrity.
Results: the probability of the absence of serious illnesses is,
for the 1st variant: within a year, not less than 0.98; within 2 years, not less than 0.92; within 10 years, not more than 0.35;
for the 2nd variant: within 46 years, not less than 0.95.
7. Mathematical Models in Development
In compliance with the requirements of the ISO/IEC 15288 standard, the following have been developed:
- a model of enterprise environment development;
- models of project development;
- models of systems development in the life cycle;
- models of life cycle process development;
- models of customer satisfaction in the system life cycle, and others.
The models may be applied for solving such system problems appearing in a system's life cycle as:
- substantiation of quantitative system requirements to hardware, software,
users, staff, technologies;
- requirements analysis;
- evaluation of project engineering decisions and possible danger;
- detection of bottle-necks;
- investigation of problems concerning potential threats to system operation
and information security;
- testing, verification and validation of system operation quality;
- rational optimization of system technological parameters;
- substantiation of plans, projects and directions for effective system
utilization, improvement and development
Publications and Implementations
[Timeline: publications in 1994, 1996, 1999 and 2001-2003, and their applications]
Summary: the presented models and software tools are the methodological and implementation foundation for customers, enterprises, certification bodies, test laboratories and experts in Russia. They support the Russian standard "GOST RV. Information technology. Set of standards for automated systems. The typical requirements and metrics of information systems operation quality. General principles". CEISOQ has already found wide application in universities for education.
Plan (Part 2)
1. Approach for Evaluating System Vulnerability in Conditions
of Terrorist Threats. General Propositions
2. Modeling complex and software suites “VULNERABILITY”
3. Investigations after September 11
3.1 How effective was the flight safety provision system before September 11 (in Russia and the USA)?
3.2 How may the level of the existing safety be increased, and by what measures?
4. Examples for a sea gas-and-oil producing system
4.1 Risk to become an object of terror
4.2 Risk of suspicious events non-detection and mistaken analytical
conclusions
4.3 Risk of latent penetration and influence
4.4 Risk of protection barriers overcoming
Conclusion
1.1 Approach for Evaluating System Vulnerability in
Conditions of Terrorist Threats. General Propositions
General scheme of terrorist threat development:
1. Choice of a terrorist object
2. Search for system vulnerability
3. Preparation for terrorist acts
4. Latent or obvious actions concerning system vulnerability
5*. Formulation of requirements and threats
6. Realization of threats by means of overcoming system protection barriers
7**. Analysis of the results of terrorist threat realization
* There may be no clear requirements and threats (as on September 11, 2001)
** For a single terrorist act there may be no feedback as a result of the terrorist impact analysis
1.2 Approach for Evaluating System Vulnerability in
Conditions of Terrorist Threats. General Propositions
Chain of logical dependences for an evaluation of system vulnerability risk:
1. Risk to become an object of terror
2. Risk of suspicious events non-detection during admissible time
3. Risk of mistaken analytical conclusions during admissible time
4. Risk of dangerous influence on the system during a given period
5. Risk of protection barriers overcoming during a given period
=> System vulnerability risk during a given period
2 Modeling complex and software suites “VULNERABILITY”
Rvulner, the integral system vulnerability during a given period tgiven (Psafety, the integral system safety during the given period tgiven), depends on:
Robject - the risk to become an object of terror;
Rnon-det - the risk of suspicious events non-detection during admissible time;
Rmist - the risk of mistaken analytical conclusions during admissible time;
Rinfl - the risk of dangerous influence on the system during the given period;
Rover - the risk of protection barriers overcoming during the given period.
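The slide names the dependence but not its functional form. One plausible serial composition, assuming independent stages (an assumption of this sketch, not of the suite): the threat is realized only if the system is chosen as a target, its preparation stays unnoticed (events undetected or conclusions mistaken), and the attack itself succeeds (dangerous influence or barrier overcoming). The numeric values are illustrative only.

```python
def r_vulner(r_object, r_nondet, r_mist, r_infl, r_over):
    p_unnoticed = 1 - (1 - r_nondet) * (1 - r_mist)   # preparation missed
    p_attack = 1 - (1 - r_infl) * (1 - r_over)        # attack gets through
    return r_object * p_unnoticed * p_attack

r = r_vulner(0.9, 0.5, 0.998, 0.93, 0.34)             # illustrative inputs
print(r, 1 - r)                                       # R_vulner, P_safety
```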
3.1 How effective was the flight safety provision system before September 11 (in Russia and the USA)?
System analysis. Barriers from the point of view of terrorists:
the 1st barrier is the pass and inter-object regime at aerodromes and air traffic control centers;
the 2nd barrier is the preflight examination and control of passengers and their luggage during registration;
the 3rd barrier is the preflight examination before boarding;
the 4th barrier is the locked door to the cockpit;
the 5th barrier is the on-line warning about a hijacking.
[Tables: input data and computation results for the case of opposing inexperienced terrorists in Russia]
3.1 How effective was the flight safety provision system before September 11? (computation results for trained terrorists)
[Tables: computation results in Russia and in the USA]
Conclusion: in Russia and the USA the flight safety systems existing before September 11 were ineffective against the planned actions of prepared terrorists (protection, in probability measure, of about 0.52-0.53). The bottlenecks were the weak protection of the cockpit and the absence of active opposing measures on board the airplane.
3.2 How may the level of the existing safety be increased, and by what measures?
The first measure consists in making the cockpit door a real barrier, insuperable for terrorists during a flight.
As soon as the cockpit door has become insuperable, the cockpit may be turned into a center for telemonitoring and control of cabin security. This is the second measure. Thus pilots may get complete and valid information about the situation in the cabin in time.
Terrorists who have revealed themselves are in standing positions; the others remain sitting. The first task of the defending side is to disable these threat subjects at least for a few minutes. Means and ways of non-lethal action are needed, i.e. ones which do not lead to a fatal end, because passengers may also be endangered. The third measure is the use of pointed means of non-lethal action on the revealed terrorists. Such means may be a soporific gas and/or short-period influences, for example dazzling (a terrorist is suddenly dazzled by a light beam), and/or deafening, and/or electro-shocking causing a temporary loss of consciousness. There should be several ways of influence, because any one way may be easily neutralized (gas masks against gases, sunglasses against dazzling etc.). Thus the revealed terrorists may certainly be disabled.
As there may be accomplices in the cabin able to repeat the hijacking after additional preparation, ways must be provided of compulsorily keeping suspicious passengers in their seats till the emergency landing. This is the fourth measure, which may again be provided by soporific actions, jammed belts etc.
3.2 How may the level of the existing safety be increased, and by what measures? (computation results)
Result. Owing to the undertaking of active opposing measures on board an airliner, the probability of flight safety provision against terrorists may essentially increase from 0.52-0.53 to 0.98-0.99.
Note: the first failures will make terrorists analyze their causes and find new bottlenecks of the security system, thus continuing the counteraction. Preventive actions should be based on modeling.
4. Examples for a sea gas-and-oil producing system
Processes of possible terrorist influence and system safety provision (platforms, coastal technological complexes including floating storage and offloading terminals, liquefied natural gas terminals, pipelines and tubing stations) are evaluated for various scenarios and conditions.
4.1 Risk to become an object of terror
Let the mean time for terrorists to reveal interesting information be two days, the mean time of delivering this information to the terrorist analytical center be no longer than 1 hour, and the mean time of making a decision concerning the system's attractiveness as a possible object of terror be about one month. It is assumed that information gathering doesn't obviously depend on system state changes, and the mean time between information updates in the terrorist analytical center is forecast to be about 1 month.
There is an objective danger and a high risk for sea gas-and-oil producing systems to become an object of terror.
4.2 Risk of suspicious events non-detection and mistaken analytical conclusions
Is it in principle possible to solve the problem of detecting suspicious events in time and making correct conclusions from the gathered on-line information?
Results: the analysis carried out on the basis of the facts concerning FBI activity has shown that the risk of erroneous analytical conclusions - and, as a consequence, of not undertaking countermeasures or undertaking inadequate ones - exceeds 0.998 (!!!).
Nowadays it is practically impossible to prevent the realization of terrorist acts aimed at any kinds of systems and objects. It is necessary to conduct profound, purposeful work (based on modeling) on the radical reduction of the risks of non-detection of suspicious events and erroneous analytical conclusions.
4.3 Risk of latent penetration and influence
For existing safety systems the probability that dangerous influences (explosions of fuel-air mix clouds, generation and burning of fireballs, oil spills and burning, separation and spread of technological equipment parts and others) do not occur within 24 hours is above 0.99997, i.e. the risk of emergency realization is about 0.00003. Under conditions of daily failure danger, the risk within a month increases up to 0.001 (see the check below). This high level of sea gas-and-oil producing system protection in emergencies is mainly provided by the application of approved automatic safety technologies (!)
For similar conditions the risks of terrorist threat realization are incommensurably higher (within a month the risk runs up to 0.93). Owing to the insufficient preparedness and technical equipment of operators for timely and valid recognition of terrorist threats against the background of the variety of other technical threats, sea gas-and-oil producing systems are today completely helpless in the face of terrorist dangers (!!!)
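The monthly figures follow from compounding the per-day risk, a one-line check:

```python
print(1 - (1 - 0.00003) ** 30)        # ~0.0009: the slide's "up to 0.001"
print(1 - (1 - 0.93) ** (1 / 30))     # ~0.085 per day behind the monthly 0.93
```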
4.4 Risk of protection barriers overcoming
In practice, the protection from unauthorized access is a sequence of barriers, after successfully overcoming which a violator can get access to the system's resources. Such access is possible from a special command post where an automated workstation of the control service is used.
Barrier | Change frequency of the barrier parameter value | Mean time of barrier overcoming | Possible way of barrier overcoming
1. Protection by a patrol boat | change of guards every 24 hours | 10 hours | latent penetration from the air or under the water, fraud of guards
2. System of passes to the platform with a change of security services | change of guards every 24 hours | 10 minutes | documents falsification, conspiracy, fraud
3. The electronic key to get to the GOPS control unit | 5 years (time between changes) | 1 week | theft, forcible key withdrawal, conspiracy
4. The password to enter the automated GOPS system | 1 month | 10 days | spying, compulsory questioning, conspiracy, selection of a password
5. The password to get access to software devices | 1 month | 10 days | —
6. The password to get access to the required information | 1 month | 10 days | —
7. The registered external information carrier with write access | 1 year | 24 hours | theft, forced registration, conspiracy
8. Confirmation of a user identity during a session of work with the computer | 1 month (time between changes of software devices) | 24 hours | spying, compulsory questioning, conspiracy
9. Telemonitoring of a helipad, a drilling module, energy equipment, a technological module, pipelines and equipment, rescue rooms and boats etc. | — | — | simulation of a failure, false films, dressing up as employees, conspiracy
10. Encoding of the most important information | change of keys every month | 1 year | decoding, conspiracy
4.4 Risk of protection barriers overcoming
1. The first 3 barriers are overcome with a probability of about 0.34. The use of alternating passwords, changed once a month, for the 4th, 5th and 6th barriers allows increasing the protection level from 0.66 to 0.86.
2. Introduction of the 7th and 8th barriers is practically useless.
3. The use of telemonitoring means allows increasing information resource protection from unauthorized access (UA) to 0.998, which still does not meet the stated requirements.
4. Only the use of all 10 barriers provides the required protection from UA with a probability of more than 0.9999, which practically excludes any risk of terrorist access to the control post's information resources and essentially reduces system vulnerability.
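The suite's exact probabilities (0.34, 0.86, 0.998, 0.9999) come from its own model and inputs not shown on the slides. Purely to illustrate the mechanism of the table, here is a Monte-Carlo sketch under stated assumptions: barriers are passed in turn, each attempt takes a random (uniform) time with the table's mean, and a change of the barrier parameter (guard rotation, new password or key) resets the violator's progress on that barrier. The one-month attack period is hypothetical.

```python
import random

# (change period, mean overcoming time) in hours, from the table above
BARRIERS = [(24, 10), (24, 10 / 60), (5 * 365 * 24, 168),
            (730, 240), (730, 240), (730, 240)]         # barriers 1-6

def p_overcome(barriers, t_attack, runs=50_000, seed=5):
    """Probability that a violator overcomes all barriers within t_attack."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(runs):
        t = 0.0
        for period, mean_t in barriers:
            while t <= t_attack:
                dt = rng.uniform(0, 2 * mean_t)   # this attempt's duration
                left = period - (t % period)      # time before the change
                if dt <= left:
                    t += dt                       # barrier overcome
                    break
                t += left                         # progress reset, retry
            if t > t_attack:
                break
        ok += t <= t_attack
    return ok / runs

print(p_overcome(BARRIERS[:3], 720))              # first three barriers
print(p_overcome(BARRIERS, 720))                  # adding passwords 4-6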
4.5 Summary result for sea gas-and-oil producing systems
Protection technologies against terrorist threats are imperfect, and this cannot be compensated by the approved existing safety technologies. The whole system and its components (first of all platforms, coastal complexes including floating storage and offloading terminals, liquefied natural gas terminals, pipelines and tubing stations) are in fact extremely vulnerable (!)
CONCLUSION
Engineering decisions concerning the development of a protection system for preventing terrorist threats should be modeled, quantitatively evaluated and substantiated, taking into account potential scenarios of threat development, the opportunities of protective barrier overcoming during possible acts of terrorism, etc. The offered complex "VULNERABILITY" is capable of system mathematical modeling, risk analysis and investigations of the effectiveness of protection measures.
A good beginning (based on modeling) is half the battle…