Glossary

A
Adaptive intelligent systems Information systems that are able to change their knowledge base and their structural and functional characteristics during operation in a dynamically changing environment, in order to react to it better.
Adaptive resonance theory (ART) A neural network invented and developed by Carpenter and Grossberg. The network learns to categorize and adopt new patterns ("plasticity") while retaining previously learned patterns ("stability").
Alan Turing's test for artificial intelligence (AI) Definition of AI introduced by the British mathematician and computer scientist Alan Turing. It states, roughly, that a machine system is considered to possess AI if a person communicating with it from behind a "bar" cannot recognize whether it is a machine or a human.
α-cut of a fuzzy set Subset of the universe of the fuzzy set consisting of those values that belong to the fuzzy set with a membership degree greater than (strong cut), or greater than or equal to (weak cut), a given value α from [0, 1].
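As an illustration (not from the original text), a minimal Python sketch of the two kinds of α-cuts over a discrete fuzzy set might look as follows; the fuzzy set "tall" and its membership degrees are hypothetical:

    # Illustrative sketch: weak and strong alpha-cuts of a discrete fuzzy set
    # represented as a dictionary {element: membership degree}.

    def weak_alpha_cut(fuzzy_set, alpha):
        """Elements with membership degree >= alpha."""
        return {x for x, mu in fuzzy_set.items() if mu >= alpha}

    def strong_alpha_cut(fuzzy_set, alpha):
        """Elements with membership degree > alpha."""
        return {x for x, mu in fuzzy_set.items() if mu > alpha}

    tall = {150: 0.0, 165: 0.3, 175: 0.7, 185: 1.0}   # hypothetical fuzzy set "tall"
    print(weak_alpha_cut(tall, 0.7))    # {175, 185}
    print(strong_alpha_cut(tall, 0.7))  # {185}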
Apparent error The error calculated on the basis of the reaction of a neural network to
the data used for
its training. It is usually calculated as a mean square error (MSE) or root MSE (RMS).
Approximate reasoning A process of inferring new facts and achieving conclusions in
an intelligent
information system when inexact facts and uncertain rules are present.
Artificial neural network Biologically inspired computational model consisting of
processing elements
(called neurons) and connections between them with coefficients (weights) bound to the
connections,
which constitute the neuronal structure. Training and recall algorithms are also attached
to the structure.
Automatic speech recognition system (ASRS) A computer system which aims at
providing enhanced
access to machines and information via voice commands (instructions, queries,
communication
messages).
B
Backward chaining Goal-driven inference process which starts after a goal is
identified. A search for a
rule which has this goal in its consequent part is performed, and then data (facts) which
satisfy all the
conditions for this rule are sought in the database. The process is recursive, that is, a
condition in a rule
may be a conclusion in another rule (other rules). The process of searching for facts goes
backward.
Bidirectional associative memory (BAM) A neural network which has the characteristic of a heteroassociative memory. It can memorize pattern associations of the type (a(P), b(P)), where a(P) is an n-dimensional vector (pattern) and b(P) is its corresponding m-dimensional pattern. If at least one of the patterns, even corrupted, is given as input, the network eventually produces the two patterns associated during training.
Brain-state-in-a-box network (BSB) An autoassociative network that is a recurrent but
not fully
connected network, in which a connection from a neuron's output to its input is allowed;
the
interconnection weights are nonsymmetric in general. The state space of the system is
restricted to a
hyper-cube ("box"), the edges being the desired states [previously learned patterns].
C
Catastrophic forgetting Phenomenon in which a network forgets what it has learned from previous examples when these are no longer presented to it and other examples are presented instead.
Center-of-gravity defuzzification method (COG) Method for defuzzification, that is, transforming a membership function B' into a crisp value y' such that y' is the geometrical center of B'. The following formula is used: y' = Σ v·µB'(v) / Σ µB'(v), where v ranges over all the values from the universe V of the fuzzy variable y.
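A minimal sketch of this formula over a discretized universe is given below; the sample universe and membership values are illustrative, not taken from the book:

    # Center-of-gravity (COG) defuzzification over a discretized universe.

    def cog_defuzzify(universe, membership):
        """Return the crisp value y' = sum(v * muB(v)) / sum(muB(v))."""
        num = sum(v * mu for v, mu in zip(universe, membership))
        den = sum(membership)
        return num / den if den != 0 else 0.0

    universe = [0, 1, 2, 3, 4]               # values v of the output variable y
    membership = [0.0, 0.2, 0.8, 0.5, 0.1]   # inferred membership function B'(v)
    print(cog_defuzzify(universe, membership))  # crisp output y'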
Chaos Complicated behavior of a nonlinear dynamical system that nevertheless follows some underlying rules.
Chaotic attractor An area or set of points in the phase space of a chaotic process through which the process passes frequently over time, but without repeating the same trajectories.
Chaotic neuron An artificial neuron whose output is calculated with the use of a chaotic
output
function.
Classification problem A generic AI problem which arises when it is necessary to associate an object with some already existing groups, clusters, or classes of objects.
Comprehensive artificial intelligence The area of integrating ordinary symbolic AI, neural networks, fuzzy systems, and other AI techniques (genetic algorithms, evolutionary programming, chaos, etc.).
Connectionist expert system (CES) An expert system that has its knowledge
represented as a neural
network.
Connectionist production system A connectionist system that implements productions
of the form IF
C, THEN A, where C is a set of conditions and A is a set of actions.
Control Process of acquiring information for the current state of an object and emitting
control signals
to it in order to keep the object in its possible and desired states.
D
Decision-making systems AI systems which choose one among many variants as a
solution to a
particular problem. The solution is then recommended to the user.
Defuzzification Process of calculating a single output numerical value for a fuzzy
output variable on the
basis of an inferred membership function for this variable.
Design Creating objects that satisfy particular requirements following a given set of
constraints.
Destructive learning Technique which gradually destroys (reduces) the initial neural network architecture, for example by removing connections, for the purpose of better learning.
Diagnosis Process of finding faults in a system.
Distributed representation A way of encoding information in a neural network where a
concept or a
value for a variable is represented by a collective activation of a group of neurons.
Dynamical system A system which evolves in continuous or discrete time.
E
Expert system A computer system for solving difficult problems usually solved by
experts; the system
can provide a similar expertise to the one provided by experts in a restricted area, for
example diagnosis
of breast cancer, finding a cheap route for a round-the-world trip, etc.
Expert system shells Systems which have the architecture of an expert system but are "empty" of knowledge. They are tools which facilitate rapid prototyping of an expert system.
Explanation in expert systems Tracing, in a contextually comprehensible way, the
process of inferring
the solution, and reporting it.
Extension principle A method which defines how to find the membership function of a fuzzy set f(A) in the universe V, where A is a fuzzy set in the universe U and the function f: U → V is given.
F
Fast Fourier Transform (FFT) Transformation applied to (mainly speech) data to transform the signal, taken over a small portion of time, from the time-scale domain into a vector in the frequency-scale domain.
Feedforward neural network A neural network in which there are no connections back
from output to
input neurons.
Feedback neural network A network in which there are connections from output to
input neurons.
Fitness See Goodness.
Forecasting See Prediction.
Forward chaining inference An inference method of applying all the facts available at a given moment to all the rules in a rule-based system in order to infer all the possible conclusions, and repeating this process until no more new facts can be inferred.
Fractals Objects which occupy fractions of a standard (integer number of dimensions)
space called the
embedding space.
Frames Information (knowledge) structures that represent structured information for
standard
situations. Frames consist of slots (variables) and fillers (values). The slots represent the
most typical
characteristics of the objects.
Fuzzification Process of finding the membership degree µA(x') to which an input value x' for a fuzzy variable x, defined on a universe U, belongs to a fuzzy set A defined on the same universe.
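As a hedged illustration (not from the original text), fuzzification of a crisp input against a triangular membership function could be sketched as follows; the fuzzy set "warm" and its parameters are invented for the example:

    # Fuzzification of a crisp input x' with a triangular membership function.

    def triangular_mu(x, a, b, c):
        """Membership degree of x in a triangular fuzzy set with peak b on [a, c]."""
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)

    x_prime = 22.5                                # crisp input value x'
    mu_warm = triangular_mu(x_prime, 15, 22, 30)  # membership degree in "warm"
    print(mu_warm)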
Fuzzy ARTMAP Extension of ART1 neural network models (see Adaptive Resonance
Theory) when
input nodes represent not "yes/no" features, but fuzzy features instead, for example, a set
of features
(sweet, fruity, smooth, sharp, sourish) used to categorize different samples of wines
based on their taste.
Fuzzy control In a broad sense, this is application of fuzzy logic to control problems. A
fuzzy control
system is a fuzzy system applied to solve a control problem.
Fuzzy clustering Procedure of clustering data into possibly overlapping clusters, such
that each of the
data examples may belong to each of the clusters to a certain degree, but all its
membership degrees sum
up to 1.
Fuzzy expert system An expert system to which methods of fuzzy logic are applied.
Fuzzy expert
systems may use fuzzy data, fuzzy rules, and fuzzy inference, in addition to the standard
ones
implemented in the ordinary expert systems.
Fuzzy expert system shell A tool that facilitates building and experimenting with fuzzy
expert systems.
It facilitates building the main modules in a fuzzy expert system.
Fuzzy neural network (FNN) Neural network designed to realize a fuzzy system consisting of fuzzy rules, fuzzy variables with fuzzy values defined for them, and a fuzzy inference method.
Fuzzy propositions Propositions which contain fuzzy variables and their fuzzy values.
The truth value
of a fuzzy proposition "X is A" is given by the membership function µA.
Fuzzy query interface Interface to a standard database which allows the users to use fuzzy terms (values) in their queries, even though these values are not stored in the database.
Fuzzy relations A relation which links two fuzzy sets by assigning a number between 0 and 1 to each element of the cross-product U × V of the universes of the two fuzzy sets. It makes it possible to represent vague or ambiguous relationships.
G
General linear transformation Transformation f(x) of data vectors x such that f is a
linear function of
x, for example, f(x) = 2x + 1.
Generalization The ability of an information system to process new, unknown input
data in order to
obtain the best possible solution, or one close to it.
General nonlinear transformation Transformation f of data vectors x where f is a nonlinear function of x, for example, f(x) = 1/(1 + e^(-x·c)), where c is a constant.
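The two example transformations above can be written directly in code; the constant c = 1.0 and the test value are arbitrary choices for illustration:

    # Linear vs. nonlinear (sigmoid) transformation of a data value.
    import math

    def linear(x):
        return 2 * x + 1                        # general linear transformation f(x) = 2x + 1

    def nonlinear(x, c=1.0):
        return 1.0 / (1.0 + math.exp(-x * c))   # sigmoid, f(x) = 1 / (1 + e^(-x*c))

    print(linear(0.5), nonlinear(0.5))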
Genetic algorithms Algorithms for solving complex combinatorial and organizational problems with many variants, by employing analogy with nature's evolution. The general steps a genetic algorithm cycles through are: generate a new population (through crossover), starting initially with a given one; select the best individuals; mutate, if necessary; repeat until a satisfactory solution is found according to a goodness (fitness) function.
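A compressed, illustrative loop following these steps is sketched below; the problem (maximizing the number of 1-bits in a binary string) and all parameters are hypothetical choices, not a specific algorithm from the book:

    # Minimal genetic algorithm sketch: crossover, selection, mutation, repeat.
    import random

    def fitness(chrom):
        return sum(chrom)                        # goodness (fitness) function

    def crossover(a, b):
        cut = random.randint(1, len(a) - 1)      # single-point crossover
        return a[:cut] + b[cut:]

    def mutate(chrom, rate=0.01):
        return [1 - g if random.random() < rate else g for g in chrom]

    population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
    for generation in range(100):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == 20:         # satisfactory solution found
            break
        parents = population[:10]                # select the best individuals
        population = [mutate(crossover(random.choice(parents),
                                       random.choice(parents)))
                      for _ in range(30)]
    print(max(fitness(c) for c in population))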
Goodness criterion A function which evaluates the appropriateness of prospective decisions when solving an AI problem, for example, playing games.
Growing neural network A neural network which has the ability to start training with a smaller number of nodes and, depending on the calculated error, to increase this number if necessary.
H
Hamming network Network which performs the task of pattern association, or pattern classification, based on measuring the Hamming distance, i.e., the number of bits in which two Boolean vectors differ.
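A tiny helper (for illustration only) that computes the Hamming distance used by such a network:

    def hamming_distance(u, v):
        # Number of positions in which two Boolean vectors differ.
        return sum(a != b for a, b in zip(u, v))

    print(hamming_distance([1, 0, 1, 1], [1, 1, 1, 0]))  # 2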
Hebbian learning law Generic learning principle which states that a synapse
connecting two neurons i
and j increases its strength wij if repeatedly the two neurons i and j are simultaneously
activated by input
stimuli.
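A schematic sketch of the Hebbian update follows; it illustrates only the principle (the weight w[i][j] grows when neurons i and j are active together), and the learning rate value is an arbitrary assumption:

    # Hebbian weight update over a vector of simultaneous activations.

    def hebbian_update(w, activations, eta=0.1):
        n = len(activations)
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += eta * activations[i] * activations[j]
        return w

    w = [[0.0] * 3 for _ in range(3)]
    w = hebbian_update(w, [1, 0, 1])   # strengthens the connection between neurons 0 and 2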
Homophones Words with different spelling and meaning but which sound the same, for example, "to," "too," "two" and "hear," "here."
Hopfield network Fully connected feedback network which is an autoassociative
memory. It is named
after its inventor, John Hopfield.
Hybrid connectionist logic programming system A system which consists of a logic
programming
language and neural networks.
Hybrid fuzzy connectionist production system A system which consists of a
production rule-based
system, neural networks, fuzzy inference machine, and, possibly, some other modules
which facilitate
communication between the above, for example, rule extraction module, data processing
(normalization,
fuzzification, etc.).
Hybrid connectionist production system A system which consists of a production
rule-based system
and neural networks.
I
Inference Process of matching current data from the domain space to the existing
knowledge in a
knowledge-based information system and inferring new facts until a solution in the
solution space is
reached.
Information Collection of structured data. In its broad meaning it includes knowledge
as well as simple
meaningful data.
Information retrieval Process of retrieving relevant information from a database by
using a query
language.
Initialization of a neural network The process of setting the connection weights in a
neural network to
some initial values before starting the training algorithm.
Interaction (human computer interaction) Communication between a computer
system on the one
hand and the environment or the user on the other hand, in order to solve a given
problem.
K
Knowledge Concise presentation of previous experience which can be interpreted in a
system.
Kohonen SOM A self-organizing map (SOM) neural network invented and developed by Teuvo Kohonen.
L
Language analysis A process in which a command or enquiry from the user, given in a restricted natural language, is recognized by a computer program with respect to the syntax, the semantics, and the concepts of the language used, before the command is interpreted by the system.
Leaky integrator An artificial neuron which has a binary input, a real-value output, and
a feedback
connection, which keeps the output of the neuron gradually decreasing in value after the
input stimulus
has been removed from the input.
Learning Process of obtaining new knowledge.
Learning vector quantization algorithms LVQ1, LVQ2, and LVQ3 Supervised learning algorithms which are extensions of the Kohonen self-organized network learning algorithm.
Linguistic variable Variable which takes fuzzy values, for example "speed" takes
values of "high,"
"moderate," and "low."
Local representation in neural networks A way of encoding information in a neural
network in which
every concept or a variable is represented by the activation of one neuron.
Lyapunov exponent A parameter which provides the average rate of divergence or
convergence for a
chaotic process.
M
Machine-learning methods Computer methods for accumulating, changing, and
updating knowledge in
an AI computer system.
Membership function Generalized characteristic function which defines the degree to
which an object
from a universe belongs to a fuzzy set.
Memory capacity Maximum number m of patterns which can be learned properly in a
pattern
associator neural network.
Methods for feature extraction Methods used for transforming raw data from one
input domain space
into another; a space of features.
Modular system System consisting of several modules linked together for solving a
given problem.
Monitoring Process of interpretation of continuous input information to an information
system, and
recommending intervention if appropriate.
Moving averages One of the theories used mainly for predicting stock market time series. The theory says that by computing average values over periods of time, the volatility of the time series is smoothed and the trends of the market are indicated. This is also a general method for time-series data processing.
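A simple moving average as a smoothing step can be sketched as follows; the window length and the sample price series are arbitrary illustrative values:

    # Simple moving average over a time series.

    def moving_average(series, window):
        return [sum(series[i - window + 1:i + 1]) / window
                for i in range(window - 1, len(series))]

    prices = [10, 12, 11, 13, 15, 14, 16]
    print(moving_average(prices, 3))   # smoothed series indicating the trend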
N
Neural network See Artificial neural network.
Noise Any small random signal that is added to the function which describes the
underlying behavior of
a process.
Nonlinear dynamical system A system whose next state on the time scale can be
expressed by a
nonlinear function from its previous time states.
Normalization Transforming raw data from its original scale into a predefined scale, for example [0, 1], [-1, 1], etc.
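A minimal min-max normalization sketch into the [0, 1] scale (illustrative only):

    def normalize(values, low=0.0, high=1.0):
        # Map raw values linearly onto the interval [low, high].
        v_min, v_max = min(values), max(values)
        span = v_max - v_min
        if span == 0:
            return [low for _ in values]        # constant data: map everything to low
        return [low + (v - v_min) * (high - low) / span for v in values]

    print(normalize([3, 7, 10, 15]))            # [0.0, 0.33..., 0.58..., 1.0]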
O
Optimization Finding optimal values for parameters of an object or a system which
minimize an
objective (cost) function.
Oscillatory neuron An artificial neuron built up of two elements (or two groups of
elements), one of
them being excitatory and the other inhibitory. Its functioning is described as oscillation,
characterized
by three parameters: frequency; phase; amplitude.
Overfitting Phenomenon which indicates that a neural network has approximated (learned) the training examples too closely, including any noise they may contain, so that the network cannot generalize well on new examples.
P
Pattern matching Matching a feature vector, a pattern, with already existing ones and
finding the best
match.
Phase space of a chaotic process The feature space where the process is traced over
time.
Phonemes Linguistic elements which define the smallest speech patterns that have linguistic representation in a language.
Planning Important generic AI problem which is about generating a sequence of actions in order to achieve a given goal when a description of the current situation is available.
Power set of a fuzzy set Set of all fuzzy subsets of a fuzzy set.
Prediction Generating information for the possible future development of a process
from data about its
past and its present development.
Productions Transformation rules applied to obtain one sequence of characters from another, usually represented in the form: IF conditions, THEN actions.
Production system A symbolic AI system consisting of three main parts: (1) a list of
facts, considered a
working memory (the facts being called "working memory elements"); (2) a set of
production rules,
considered the production memory; (3) an inference engine, which is the reasoning
procedure, the
control mechanism.
Pruning Technique based on gradually removing from a neural network the weak connections (those which have weights around 0) and the neurons connected to them during the training procedure.
Q
Queues Data structures similar to the stack structure, but here two pointers are used: one for the input and one for the output element of the structure. They are also called FIFO structures (first in, first out).
R
Recall phase Phase of using a trained neural network when new data are fed and results
are calculated.
Recurrent fuzzy rule A fuzzy rule that has as condition elements in its antecedent part
one or more
previous time-moment values of the output fuzzy variable.
Recurrent networks Networks with feedback connections from neurons in one layer to
neurons in a
previous layer.
Reinforcement learning, or reward-penalty learning A neural network training method based on presenting an input vector x and looking at the output vector calculated by the network. If the output is considered "good," then a "reward" is given to the network, in the sense that the existing connection weights are increased; otherwise the network is "punished": the connection weights, being considered "not appropriately set," are decreased.
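A very rough sketch of this reward-penalty idea is given below; the update rule and the scaling factor are illustrative assumptions, not a specific published algorithm:

    # Reward-penalty update: scale contributing weights up on reward, down on punishment.

    def reward_penalty_update(weights, inputs, reward, factor=0.05):
        sign = 1.0 if reward else -1.0
        return [w + sign * factor * w * x for w, x in zip(weights, inputs)]

    weights = [0.4, -0.2, 0.7]
    weights = reward_penalty_update(weights, [1, 0, 1], reward=True)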
Representation Process of transforming existing problem knowledge into some known knowledge-engineering schemes in order to process it in a computer program by applying knowledge-engineering methods.
S
Sensitivity to initial conditions A characteristic of a chaotic process in which a slight
difference in the
initial values of some parameters that characterize the chaotic process will result in quite
different trends
in its further development.
Spatio-temporal artificial neural networks Artificial neural networks that represent
patterns of
activities which have some spatial distribution and appear at certain times.
Spurious states of attraction Patterns to which an associative memory neural network can wrongly converge during a recall process. These patterns are not present in the set of training examples.
Stability/plasticity dilemma The problem of balancing the retention of previously learned patterns (stability) against the learning of new patterns (plasticity); the ART neural networks are designed to preserve this balance.
Stack A collection of ordered elements and two operations which can be performed only
over the
element that is currently at the "top," that is, "push" an element on top of the stack, and
"pop" an element
from the stack.
Statistical analysis methods Methods used for discovering the repetitiveness in data
based on
probability estimation.
Supervised training algorithm Training of a neural network when the training
examples comprise
input vectors x and the desired output vectors y; training is performed until the neural
network "learns"
to associate each input vector x with its corresponding and desired output vector y.
Support of a fuzzy set A Subset of the universe U, each element of which has a
membership degree to
A different from zero.
T
Test error An error that is calculated when, after having trained a network with a set of
training data,
another set (test, validation, cross-validation), for which the results are also known, is
applied through a
recall procedure.
Time alignment A process in which a sequence of vectors recognized over time is aligned to represent a meaningful linguistic unit (phoneme, word).
Time-delay neural network (TDNN) Modification of a multilayer perceptron which
uses delay
elements to feed input data through. The input layer is a shift register with delay
elements.
Time-series prediction Prediction of time-series events.
Training error See Apparent error.
Training phase A procedure of presenting training examples to a neural network and
changing the
network's connection weights according to a certain learning law.
Tree Directed graph in which one of the nodes, called root, has no incoming arcs, but
from which each
node in the tree can be reached by exactly one path.
U
Unsupervised training algorithm A training procedure in which only input vectors x
are supplied to a
neural network; the network learns some internal features of the whole set of all the
input vectors
presented to it.
V
Validation Process of testing how good the solutions produced by a system are. The
solutions are
usually compared with the results obtained either by experts or by other systems.
Validation error See Test error.
Variable binding Substituting variables with possible values in an information system.
Vigilance Parameter in the ART network which controls the degree of mismatch
between the new
patterns and the learned (stored) patterns which the system can tolerate.
W
Wavelets transformation Transformation which can represent slight changes of a signal within a chosen "window" from the time scale.