The Predictive Perimeter: A New Look at Border Protection

by Don Adams, CTO, Worldwide Public Sector, TIBCO
with Alan Lundberg
With over 100,000 miles of land and coastal borders to protect, and increasing scrutiny over
how taxpayer dollars are spent on border security, innovative technologies will have to play a
major role in protecting our country.
The advent of event-driven architecture (EDA) and sophisticated event
processing software, which allows for the fusion of information from
historical databases with real-time, in-memory sensor data and “events,” is
changing the game in favor of law enforcement and homeland security.
In this paper, we will explain the concepts behind EDA and how it can be used in border
security and counter-terrorism efforts. This new and strategic IT environment represents the
next step in the evolution from guns, guards, and fences, through network-centric systems, to
predictive perimeter defense systems. We will also discuss event processing software concepts
with a focus on distributed processing in the context of a predictive perimeter defense.
http://www.tibco.com
Global Headquarters
3303 Hillview Avenue
Palo Alto, CA 94304
Tel: +1 650-846-1000
Toll Free: 1 800-420-8450
Fax: +1 650-846-1005
© 2010, TIBCO Software Inc. All rights
reserved. TIBCO, the TIBCO logo, The
Power of Now, and TIBCO Software are
trademarks or registered trademarks of
TIBCO Software Inc. in the United States
and/or other countries. All other product and
company names and marks mentioned in
this document are the property of their
respective owners and are mentioned for
identification purposes only.
Introduction
The U.S. Department of Homeland Security faces many daunting challenges these days. Illegal
immigration and air travel restrictions are at an all-time high, and new types of sophisticated
terrorist activity at the borders have been detected. Some believe it's only a matter of time before a
terrorist crosses the border and carries out a mission with terrible and tragic consequences.
With over 100,000 miles of land and coastal borders to protect and with increasing scrutiny over
how taxpayer dollars are spent on border security, the new reality is that more guards, guns,
and fences will not be enough to ensure our security in the coming years. It is clear that new and
innovative technologies will have to play a major role in cost-effectively protecting our country
and our freedoms.
A variety of new technologies will need to be incorporated into the U.S. Homeland Security
arsenal to provide the millions of “eyes and ears” needed to more effectively counter terrorism.
Focus areas include:
• Land-based observation and detection – sensor and video
• Aerial-based observation and detection – including UAV and satellite
• Sea, shoreline, and port security – sensors, cargo manifest correlations
• Biometric detection – facial recognition and thermal
• Cargo detection
• Radiation detection
To enhance the existing security arsenal, new technology must meet the following requirements:
• Adaptability, to change rules or tactics on the fly
• Interoperability with legacy defense, DHS, and agency systems in particular
• Situational awareness and instant response
• Ability to work with a wide variety of sensor data
The last bullet point is important for defense applications and is based, in part, on some established
defense IT architectures, particularly the Joint Directors of Laboratories (JDL) sensor data fusion
concepts combined with innovative complex event processing software from TIBCO.
In his latest book [1], The Power to Predict, Vivek Ranadivé, founder and CEO of TIBCO
Software, discusses how the next evolution of infrastructure software is enabled by the fusion of
historical knowledge with real-time information. The combination of historical databases with
real-time, in-memory “events” is at the heart of TIBCO’s event-driven-architecture (EDA).
JDL Data Fusion Model
The JDL data fusion model is directly applicable to detection theory, where patterns and signatures
discovered by abductive and inductive reasoning processes (for example, data mining) are "fused"
with real-time events. The JDL processing model [2] has survived the test of time as the dominant
functional data fusion model for decades. The vast majority of the most complex real-time event
processing architectures are based on the JDL model. It is worth noting that the JDL data fusion
model is a communications infrastructure that looks remarkably similar to TIBCO’s patented
“information bus” and is based on the concept of a service-oriented architecture (SOA).
In fact, the art and science of multi-sensor data fusion has emerged as the underlying foundation
for the state of the art in enterprise software, including the rapidly growing field of Predictive
Business® , which has produced innovative deployments and applications in telecommunications,
finance, transportation, and, of course, defense. All of these sectors require complex inference
processing and the management of real-time events from distributed sensors, agents, and other
processing components, including historical data-at-rest repositories.
There are three main meta-architectural components of the JDL model. Perhaps the most important
of these are events. Events can be local or external, and can originate from myriad sources in many
formats. There is also the core complex event and/or the so-called event stream processing
architecture, which the JDL model depicts as the “data fusion domain.” We will use the terms data
fusion domain, data fusion, multi-sensor data fusion, complex event processing, and event stream
processing somewhat interchangeably in this paper, as the concepts are independent of vendor
implementation and have similar technical objectives and outcomes. Finally, there is the user
interface, providing operational visibility into every technical and business process depicted in the
model. (Figure 1)
[Figure 1 depicts a predictive defense environment: distributed event sources – local event services, event profiles, databases, and other data – feed event preprocessing and four inference levels (Level One: event refinement; Level Two: situation refinement; Level Three: impact assessment; Level Four: process refinement & specialized analysis), supported by database management of historical data and profiles & patterns, with a user interface spanning all levels.]
Figure 1. Multi-level inference in a distributed event-decision architecture






• Enterprise Business Activity Monitoring (BAM): Human visualization, monitoring, interaction, and situation management
• Level 4 – Adaptive BPM (Dynamic Resource Allocation and Refinement): Decide on control feedback, for example resource allocation, scheduling, sensor and state management, parametric and algorithm adjustment
• Level 3 – Predictive Analytics: Impact assessment, i.e., assess intent on the basis of situation development, recognition, and prediction
• Level 2 – Situation Detection: Identify situations based on sets of complex events, state estimation, etc.
• Level 1 – Event Track and Trace: Identify events and make initial decisions based on association and correlation
• Level 0 – Event Preprocessing: Cleansing of the event stream to produce semantically understandable data
The objectives of homeland security organizations may differ in context, but the overarching
technical goal is the same:
Correlate information gained from abductive and inductive reasoning processes with real-time
events to infer current situational knowledge and predict both opportunities for, and threats to,
security personnel to maximize assets and minimize liabilities.
Predictive Homeland Security Perimeter
This technical goal is at the heart of TIBCO’s real-time sense and respond vision for Predictive
Business and for a Predictive Perimeter environment. To expand the basic precepts of Predictive
Business into a Predictive Perimeter environment, we need to add technology from DHS and
commercial environments. These include sensors of every description; data warehouses; business
process management and workflow; specialized analytical processors (imagery analysis, facial
recognition, license plate readers, thermal imaging, signal processing, and classic
multi-intelligence sources); inference engines supporting rule-sets, state-machine, and temporal
elements; integration with every aspect of legacy DHS and administrative systems; and
distributed secure kiosks.
To illustrate how a predictive homeland security perimeter might work, consider the following
scenario based on a Homeland Security Environment (HSE)¹ and security sensors already in place.
If this seems like science fiction, see examples of large-scale deployments of complex event
processing described by Stanford University professor Dr. David Luckham in his book, The Power
of Events: An Introduction to Complex Event Processing in Distributed Enterprise Systems.
Scenario: The Best Defense is a Predictive Offense
“J” gets a call on his cell phone to let him know his cargo is 30 miles from his location outside
Juarez, Mexico – a desert region across the border from Texas. The area is known for illegal
border crossings and smuggling activity, because only a long fence separates Mexico from the
United States. The fence has been repeatedly broken and vandalized.
J has been pre-paid to transport human cargo across the border. Earlier in the day, he went to a
local cambio to change pesos to U.S. dollars. Yesterday he drove his truck to a local cantina for a
quick drink. He didn’t notice that the bartender made a phone call during his visit. He also didn’t
notice the AeroVironment Global Observer flying high overhead, which caught several images of
him and his truck during routine surveillance flights earlier in the day. All of these “events” –
including the bartender’s phone call – were captured by different surveillance systems. The
challenge is detecting and correlating significant events among all the other images, transactions,
and communications that are being collected every second of every day by law enforcement agencies.
In this case, two of J’s passengers are members of the Al Qaeda network, using the desert outside
El Paso, Texas as their point of entry into the U.S. They board the truck along with other
passengers and begin the 20-minute drive to the crossing point. Everything goes according to plan
until their truck is intercepted at the border by U.S. law enforcement officers who received an alert
from a Predictive Defense system one hour and 45 minutes before the truck reached the border.
The alert was triggered by a pattern of events that matched a pre-defined pattern in a complex
event processing system – sophisticated software from TIBCO that consumes streams of real-time
data and “events” and converts them into actionable information and decision patterns. (See
Figure 2) In this case, the events of interest were sensor input from face recognition software;
flight, bus, and rail information into Mexico; satellite pictures of license plates; thermal activity off
the road near the border – even the phone call from the bartender, who was watching for the arrival
of a certain patron.
¹ See Figure 3 for an architectural drawing of a notional HSE.
Figure 2. Scenario events correlated via CEP software generate intercept event
With complex event processing software, more events can be monitored without human
intervention. The following section provides information on TIBCO’s complex event processing
software, TIBCO BusinessEvents™, and how it enables Predictive Business and Predictive
Perimeter applications.
How it Works
Distributed, coordination-based architectures such as TIBCO Enterprise Message Service™ (EMS) or
TIBCO Rendezvous® provide the underlying communications infrastructure that enables complex event
processing and high-performance, rule-based processing services.
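The role of such a messaging layer can be sketched with a minimal in-process publish/subscribe bus. This is an illustrative sketch only – the class and method names below are our own invention and do not reflect the EMS or Rendezvous APIs:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process topic-based publish/subscribe bus (illustrative)."""

    def __init__(self):
        # Map topic name -> list of subscriber callbacks.
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every handler subscribed to this topic.
        for handler in self._subscribers[topic]:
            handler(event)

# A CEP service subscribes to sensor topics and reacts as events arrive.
bus = EventBus()
seen = []
bus.subscribe("sensor.thermal", seen.append)
bus.publish("sensor.thermal", {"zone": 14, "reading": 38.2})
```

In a real deployment the bus would of course be a distributed, reliable messaging fabric rather than an in-process dictionary; the point is only the decoupling of event producers from event-processing services.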
[Figure 3 depicts a notional HSE: high-resolution, VIQ, and thermal imagery plus human sources feed imagery analysis and multi-int correlation services, cross-referenced against known terror sites, with a watch center, linguist grids, an alert service, and impact assessment.]
Figure 3. Detecting threats and opportunities
Rule-Based Systems and TIBCO BusinessEvents
Rule-based systems (RBS), often referred to as expert systems, are widely used to model the
behavior of domain experts. For this reason, RBS are used extensively in a wide variety of
business applications, such as customer relationship management, fault and medical diagnosis,
mortgage evaluations, credit card authorization, fraud detection, and C4ISR environments. These
systems use declarative programming to tackle problems involving control, diagnosis, intelligent
behavior, and problem solving by describing what the computer should do rather than the exact
procedural instructions on how to do it. RBS contain rules from a specific domain and use these
rules to derive solutions to a wide range of business problems.
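The declarative style can be sketched as rules that pair a condition over a set of facts with an action, applied repeatedly until no new facts are derived (naive forward chaining). The fact and rule names below are invented for illustration:

```python
# Each rule pairs a condition (a predicate over the fact set) with an
# action that may assert new facts. We state *what* should hold, not
# *how* to loop over the data. Domain values are invented.
rules = [
    (lambda f: "border_sensor_tripped" in f and "night" in f,
     lambda f: f.add("possible_crossing")),
    (lambda f: "possible_crossing" in f and "vehicle_detected" in f,
     lambda f: f.add("dispatch_patrol")),
]

def run(facts):
    """Fire rules until no new facts are derived (naive forward chaining)."""
    changed = True
    while changed:
        before = len(facts)
        for condition, action in rules:
            if condition(facts):
                action(facts)
        changed = len(facts) != before
    return facts

facts = run({"border_sensor_tripped", "night", "vehicle_detected"})
```

Note that the second rule fires only because the first rule derived `possible_crossing` – the engine chains conclusions without any explicit control flow from the rule author.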
[Figure 4 depicts input and output channels on either side of a core comprising object management, a state machine, and working memory that holds events, rules, and concepts.]
Figure 4. TIBCO BusinessEvents high-level architecture
Rule-engines² are processing engines designed to follow rules written in their language. These
engines are used to execute complex decision-making logic on data and event streams to make
decisions based on tens, hundreds, or thousands of facts. Rule-engines accomplish this by
decomposing large sets of rules into an efficient network of nodes that can process and react to
facts far more efficiently than they could be processed using procedural programming. Therefore,
rule-engines, if properly designed, scale well for numerous classes of problems, including complex
event processing [7].
The RETE Algorithm³ is a well-known, efficient RBS implementation that also creates a network
of nodes. Each node in the network represents one or more test conditions found on the left-hand
side (LHS) of a rule set. At the bottom of the RETE network are nodes representing individual
rules. When a set of events filters all the way down to the bottom of the network, the set has passed
all of the tests on the LHS of a particular rule and this rule set becomes an activation and
potentially generates one or more events as input to other processing or alerting solutions. The
associated rule may have its right-hand side executed (fired) if the activation is not invalidated first
by the removal of one or more events from its activation set [9].
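The core RETE idea – test conditions become nodes that cache the facts passing them, so shared conditions are evaluated once per fact rather than once per rule, and an activation survives only while every cached match remains – can be sketched as follows. This is a drastically simplified illustration, not the actual algorithm or any TIBCO implementation:

```python
class AlphaNode:
    """A single-condition node that caches the facts passing its test."""

    def __init__(self, test):
        self.test = test
        self.memory = set()

    def insert(self, fact):
        if self.test(fact):
            self.memory.add(fact)

    def retract(self, fact):
        # Removing a fact empties the cache and can invalidate activations.
        self.memory.discard(fact)

# Two condition nodes; any number of rules could share them.
is_thermal = AlphaNode(lambda f: f.startswith("thermal:"))
is_night   = AlphaNode(lambda f: f == "night")

def activations():
    # A rule becomes an activation only while every LHS node holds a match.
    if is_thermal.memory and is_night.memory:
        return [("alert_rule", sorted(is_thermal.memory))]
    return []

for fact in ["thermal:zone14", "night"]:
    is_thermal.insert(fact)
    is_night.insert(fact)

fired = activations()           # rule is activated
is_night.retract("night")
after = activations()           # activation invalidated before firing
```

A real RETE network also contains join (beta) nodes that combine partial matches across conditions; the sketch above shows only the caching and invalidation behavior described in the text.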
TIBCO BusinessEvents, based on the RETE Algorithm, enables organizations to execute the same
complex decision-making logic, without the programming complexity. Using a declarative
programming model, business architects can define rules that will execute on individual events and
combinations of events and facts in working memory. The BusinessEvents high-level runtime
architecture is represented in Figure 4.
The core rule-engine architecture of BusinessEvents is illustrated in Figure 5 [8]. A
BusinessEvents rule has three components. The declaration component is used to specify which
² This section is intended as only a brief review of rule-engines, the RETE Algorithm, and a few key concepts related to TIBCO BusinessEvents. Please refer to the references for a more detailed discussion or explanation.
³ As mentioned earlier, the BusinessEvents rule-engine is based on the RETE Algorithm, an efficient pattern-matching algorithm for implementing rule-based systems designed by Dr. Charles L. Forgy in 1979. Subsequently, RETE became the basis for many expert systems, including JRules, OPS5, CLIPS, JESS, Drools, and LISA.
The Predictive Perimeter
Document
concepts, scorecards, and events⁴ the rule requires (as well as naming attributes). The condition
component is a set of expressions over facts, each of which evaluates to a Boolean value.
Figure 5. BusinessEvents Rules Engine
All conditional statements must be true for the rule’s action to be executed. The action component
of a BusinessEvents rule is a collection of statements to be executed when all the conditions are
true.
A BusinessEvents CEP event represents an instance of an event definition, an immutable activity
that occurred at a single point in time. An event definition includes properties evaluated by the
BusinessEvents rule engine, including TTL and comprehensive information related to the event. A
concept in BusinessEvents is an object definition of a set of properties that represent the data fields
of an application entity and can describe the relationships among entities. Unlike an event, an
instance of a concept is persistent and changeable, whereas an event expires and cannot be changed.
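The distinction can be sketched with two Python classes – an immutable event carrying a time-to-live versus a mutable, persistent concept. The class and field names are assumptions for illustration, not BusinessEvents syntax:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class SensorEvent:
    """An event: an immutable record of something that happened once."""
    sensor_id: str
    reading: float
    timestamp: float
    ttl_seconds: int = 300   # the event expires; it can never be changed

@dataclass
class VehicleConcept:
    """A concept instance: a persistent, updatable application entity."""
    plate: str
    sightings: list = field(default_factory=list)

    def record(self, event: SensorEvent):
        # The concept accumulates state as immutable events arrive.
        self.sightings.append(event)

evt = SensorEvent("thermal-14", 38.2, 1700000000.0)
truck = VehicleConcept("ABC-123")
truck.record(evt)
```

Attempting to assign to a field of `evt` raises `FrozenInstanceError`, mirroring event immutability, while `truck` can be updated throughout its lifetime.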
A scorecard is a BusinessEvents resource that serves as a container for global variables and can be
used to track information across the application. For example, in the JDL data fusion model,
matrices are used for assignments at all levels of the inference processing model. As illustrated in
Figure 6, multiple scorecards can be used, conceptually, for this dynamic assignment at Level 0
through 4 of the JDL model [2]. 5
⁴ TIBCO® defines an event as an immutable object representing a business activity that happened at a single point in time. An event includes information for evaluation by rules, metadata that provides context, and a separate payload – a set of data relevant to the activity.
⁵ Note that some applications of the JDL model show scorecards used for Level 0–4 association and state-estimation variables. Scorecards, from a BusinessEvents implementation perspective, may or may not be the most efficient method to extract features from an event stream. Scorecards are introduced here both as a generic concept from the JDL model, used for explicit associations in performing state estimation, and as the BusinessEvents Scorecard resource. In practice, BusinessEvents implementations of association and state estimation will more than likely use the BusinessEvents Concepts resource.
Figure 6. Scorecards and JDL Model Processing
Figure 6 illustrates how an event stream might be processed into “interesting events” (as in the
terrorist scenario) by extracting events from the event stream. These events are ranked, or scored,
according to an estimation of relevance to a pattern of known terrorist activities. These events,
ranked according to likelihood, may be further processed to infer more complex events and
situations. These complex events and situations are then evaluated for mission impact. The entire
event-processing pipeline should be reconfigurable in real time to provide the capability to
fine-tune all detection and estimation processes.
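One way to sketch this processing: filter each entity's events against a weighted pattern of known activity, score the match as a likelihood-style estimate, and keep running tallies in a scorecard-style container. The event types, weights, and alert threshold below are invented for illustration:

```python
# Weights an analyst might assign to features of a known activity
# pattern (all values here are invented for this example).
PATTERN_WEIGHTS = {
    "currency_exchange": 0.2,
    "informant_call":    0.4,
    "thermal_offroad":   0.3,
    "plate_match":       0.5,
}

def score(event_types):
    """Likelihood-style score: capped sum of weights of matched features."""
    return min(1.0, sum(PATTERN_WEIGHTS.get(t, 0.0) for t in event_types))

# Scorecard-style container tracking state across the whole stream.
scorecard = {"events_seen": 0, "alerts": 0}

def process(stream, threshold=0.5):
    """Rank each entity's events against the pattern; alert above threshold."""
    alerts = []
    for entity, types in stream:
        scorecard["events_seen"] += len(types)
        s = score(types)
        if s >= threshold:
            scorecard["alerts"] += 1
            alerts.append((entity, round(s, 2)))
    return alerts

alerts = process([
    ("truck-1", ["currency_exchange", "informant_call", "thermal_offroad"]),
    ("truck-2", ["currency_exchange"]),
])
```

Here the first entity crosses the alert threshold because several weak signals correlate, while a single low-weight event does not – the same principle as the scenario's bartender call combining with thermal and imagery events.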
Conclusion
TIBCO BusinessEvents defines a complex event as an abstraction that results from patterns
detected among aggregated events. This definition of a complex event [7, 8] corresponds to the
JDL Level 2 inference abstraction referred to earlier as a situation. Figure 7 illustrates how
building inference in a relevant terrorist event detection scenario may be represented by using the
built-in BusinessEvents scorecard resource. As discussed earlier, BusinessEvents uses the RETE
Algorithm, regarded as one of the most efficient algorithms for optimizing mainstream rule-based
systems.
[Figure 7 depicts peer rules engines connected through event services, fed by distributed event sources and data-at-rest, with operational visibility and process coordination layered above.]
Figure 7. Example architecture for Predictive Perimeter
This paper has illustrated that real-time sensor data from any source in the extended value chain,
also known as the “event cloud,” can be processed into event streams using rule-based processing.
From event streams it is possible to extract features in real time based on matching patterns to the
event stream. Features of the raw data or event stream become “events (objects) of interest,” which
can be scored according to likelihood estimates constructed from high-speed pattern-matching
algorithms with almost zero latency.
Figure 8. Detecting events of interest and scoring based on pattern-matching algorithms
Events can be aggregated and correlated at runtime to infer complex events, also referred to as
"situations." Complex events and situations discovered at runtime can be processed with patterns
developed from historical data to predict future business events and situations. This brings us back
to Predictive Business, the theme of TIBCO CEO Vivek Ranadivé’s latest book, The Power to
Predict.
Real-time events, combined with patterns and features extracted from historical data and expert
domain knowledge, are the foundation for businesses to anticipate exceptional situations, estimate
the impact on both the business and the customer, and take corrective actions before exceptional
situations become problems. In other words: Predictive Business leverages business assets to
maximize opportunities and minimize future organizational liabilities.
Relevant terrorist event detection is only one of many examples of how the concepts of Predictive
Business can help DHS minimize threats and liabilities to their personnel, the enterprise
infrastructures that serve them, and the citizens and politicians who place trust in their
organizations.
References
[1] Ranadivé, V., The Power to Predict, McGraw-Hill, NY, NY, 2006.
[2] Hall, D. and Llinas, J. editors, Handbook of Multisensor Data Fusion, CRC Press, Boca
Raton, Florida, 2001.
[3] Bass, T., Service-Oriented Horizontal Fusion in Distributed Coordination-Based Systems,
IEEE MILCOM 2004, Monterey, CA, 2 November 2004.
[4] Bass, T., "Intrusion Detection Systems & Multisensor Data Fusion," Communications of the
ACM, Vol. 43, No. 4, April 2000, pp. 99-105.
[5] Working Group on Rule-based Systems, International Game Developers Association, The 2004
AI Interface Standards Committee (AIISC) Report, 2004.
[6] Wikipedia, "Thomas Bayes," http://en.wikipedia.org/wiki/Thomas_Bayes
[7] Luckham, D., The Power of Events, Addison Wesley, Pearson Education Inc., 2002.
[8] TIBCO BusinessEvents User’s Guide, TIBCO® BusinessEvents, Software Release 1.2,
September 2005.
[9] Jess®, The Rule Engine for the Java™ Platform, The Rete Algorithm, Version 7.0b5,
DRAFT, 5 January 2006.
Additional Reading
Waltz, E. and Llinas, J., Multisensor Data Fusion, Artech House, Boston, MA, 1990.
Hall, D., and Llinas, J., “An Introduction to Multisensor Data Fusion,” Proceedings of the IEEE,
Vol. 85, No. 1, IEEE Press, 1997.
Bass, T., “The Federation of Critical Infrastructure Information via Publish and Subscribe
Enabled Multisensor Data Fusion,” Proceedings of the Fifth International Conference on
Information Fusion: Fusion 2002, Annapolis, MD, 8-11 July 2002, pp. 1076-1083.
Carzaniga, A., Rosenblum, D., and Wolf, A., “Design and Evaluation of a Wide Area Event
Notification Service,” ACM Transactions on Computer Systems, Vol. 19, No. 3, August 2001, pp.
332-383.
Carzaniga, A., “Architectures for an Event Notification Service Scaleable to Wide-area
Networks,” PhD Thesis, Politecnico di Milano, December 1998.