Glossary [5], [11], [12]
a
action potential An action potential is a rapid change in the membrane potential of a biological neuron, induced by depolarization of the neuron above its threshold potential. It is modelled in artificial networks by an activation function.
activation function A mathematical function that a neuron uses to compute its output from its input value. Often the input value must exceed a specified threshold before an output to other neurons is generated. Functions such as the sigmoid are commonly used. Sometimes called the transfer function.
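A minimal sketch of a common activation function, the sigmoid, which squashes any real-valued input into the range (0, 1):

```python
import math

def sigmoid(x):
    # Maps any real input into (0, 1); large positive inputs
    # approach 1, large negative inputs approach 0.
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))  # 0.5
```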
artificial intelligence A research discipline whose aim is to make computers able to simulate human abilities, especially the ability to learn. AI comprises subfields such as neural net theory, expert systems, robotics, fuzzy control systems and game theory.
artificial neural network Artificial neural networks are simulations of how the animal brain is thought to operate. These simulations have been found to possess powerful pattern recognition and prediction abilities, human-like qualities.
axon The output fibre of a biological neural cell. A neuron's outgoing stimulation is transported from the cell's core along the axon to the connections that lead to other cells. Compare dendrite.
b
backpropagation A learning algorithm used by neural nets with supervised learning. Special form of the delta learning rule.
backpropagation net A feedforward type neural net. Has one input layer, one output layer and at least one hidden layer. Mainly used for pattern association.
bias A "pseudo" input of a neuron that is held at a constant non-zero value (often 1) and connected through its own adjustable weight. It shifts the neuron's activation threshold, allowing the net to respond differently to input patterns that would otherwise be indistinguishable.
c
connection A path from one neuron to another along which information is transferred. Also called a synapse; connections are usually associated with weights that determine the strength of the transferred signal.
d
delta learning rule A learning algorithm used by neural nets with supervised learning. Each time the net is presented with a training vector, an error is calculated against the target, and gradient descent is performed on the error considered as a function of the weights. There is a gradient, or slope, for each weight; following these downhill finds the weights that give the minimal error. Also known as gradient descent.
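The rule above can be sketched for a single linear neuron; the function name and the learning rate `lr` are illustrative, not part of any particular library:

```python
def delta_rule_step(weights, inputs, target, lr=0.1):
    # One gradient-descent step: compute the neuron's output,
    # take error = target - output, and move each weight along
    # the negative gradient of the squared error.
    output = sum(w * x for w, x in zip(weights, inputs))
    error = target - output
    return [w + lr * error * x for w, x in zip(weights, inputs)]

# Repeated steps drive the output toward the target.
w = [0.0, 0.0]
for _ in range(100):
    w = delta_rule_step(w, [1.0, 1.0], 1.0)
# the neuron's output for [1.0, 1.0] now approaches the target 1.0
```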
dendrite One of the branched input fibres of a biological neural cell. Dendrites receive electrical stimulation from other cells and carry it toward the cell's core. Compare axon.
e
epoch One complete presentation of the training set to the network during training; one full pass of the learning algorithm (e.g. backpropagation) over all training patterns.
error A value that indicates the "quality" of a neural net's learning process. Used by neural nets with supervised learning, by comparing the current output values with the desired output values of the net. The smaller the net's error, the better the net has been trained. The error is usually a value greater than zero.
error function The performance function that calculates the average squared error between the network output and the target outputs. Also known as the global error.
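The average squared error described above can be sketched as follows (a minimal illustration, not a specific library's implementation):

```python
def mean_squared_error(outputs, targets):
    # Average of the squared differences between the network's
    # outputs and the corresponding target outputs.
    return sum((o - t) ** 2 for o, t in zip(outputs, targets)) / len(outputs)

# Small differences between outputs and targets give a small error.
err = mean_squared_error([0.9, 0.1], [1.0, 0.0])
```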
f
feedforward A specific connection structure of a neural net, where neurons of one neuron layer may only have connections to neurons of other layers. An example of such a net type is the Perceptron.
forwardpropagation The output values of a neural net's neurons are only propagated through the net in one direction, from the input layer to the output layer.
g
global error See error function.
global minimum The lowest point of the error function over all possible weight configurations, i.e. the smallest error the network can achieve. Training reduces the error toward this value, often stopping once it falls below a preset threshold. See also local minimum.
gradient descent see Delta Learning Rule
h
hidden layer A type of neuron layer that lies between a neural net's input and output layers. Called "hidden", because its neuron values are not visible outside the net. The usage of hidden layers extends a neural net's abilities to learn logical operations.
i
input A set of values, called a "pattern", that is passed to a neural net's input neuron layer. The elements of these patterns are often binary values.
input layer The first layer of a neural net, that accepts certain input patterns and generates output values to the succeeding weight matrix.
j
k
l
layer A group of neurons that have a specific function and are processed as a whole. The most common example is in a feedforward network that has an input layer, an output layer and one or more hidden layers.
learning See Training.
learning algorithm A mathematical algorithm that a neural net uses to learn specific problems.
learning rate A changeable value used by several learning algorithms that affects how much the weight values are changed. The greater the learning rate, the larger the weight changes. It is usually decreased during the learning process.
local minimum During training the error in a network is reduced to below a preset threshold. However, if a local minimum is encountered, the network may never train successfully: the training algorithm tries to reduce the error but is prevented from doing so because the current error is only a local minimum. It is difficult to determine whether a minimum is local or global (q.v.), and the usual method of escaping a local minimum is to jog (q.v.) the weights and continue training.
m
mapping Transformation of data from one representation to another.
momentum A common modification to standard backpropagation training; at each step, weight adjustments are based on a combination of the current weight adjustment (as found in standard backpropagation) and the weight change from the previous step.
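A minimal sketch of the momentum update for a single weight; the names `lr`, `beta` and `velocity` are illustrative choices, not part of any particular library:

```python
def momentum_update(weight, gradient, velocity, lr=0.1, beta=0.9):
    # New step = momentum term * previous step + learning rate * gradient.
    # The previous step's direction carries over, smoothing the descent.
    velocity = beta * velocity + lr * gradient
    return weight - velocity, velocity

# First step from rest: only the current gradient contributes.
w, v = momentum_update(1.0, 1.0, 0.0)
```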
multi-layer Perceptron A feedforward type neural net. Built of an input layer, at least one hidden layer and one output layer. Mainly used for pattern association.
n
neuron An element of a neural net's neuron layer.
neuron layer A layer of a neural net. The different layers of a neural net are connected by weight matrices.
node See Neuron.
o
output A value or a set of values (pattern), generated by the neurons of a neural net's output layer. Used to calculate the current error value of the net.
output error see error
output layer The last layer of a neural net, that produces the output value of the net.
p
perceptron Feedforward type neural net. Built of one input layer and one output layer. Mainly used for pattern association.
propagation The passing of values and errors through the different layers of a neural net during its learning process.
propagation function A function that is used to transport values through the neurons of a neural net's layers. Usually, the input values are added up and passed to an activation function, which generates an output.
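The usual scheme described above, a weighted sum of the inputs passed through an activation function, can be sketched as (sigmoid chosen for illustration):

```python
import math

def propagate(inputs, weights, bias=0.0):
    # Sum the weighted inputs, then pass the net input
    # through a sigmoid activation to produce the output.
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-net))
```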
q
r
s
sigmoid activation A specific type of a neuron's activation function.
supervised learning A specific type of a learning algorithm. The output (pattern) of the net is compared with a target output (pattern). Depending on the difference between these patterns, the net error is computed.
t
target The desired output (pattern) of a neural net for a given input pattern. In supervised learning, the net's actual output is compared with the target to compute the error.
testing The process of evaluating a trained neural net by presenting it with input patterns and examining the outputs it produces, typically using patterns not seen during training.
threshold A specific value that must be exceeded by a neuron's activation function, before this neuron generates an output.
topology The way in which the neurons are connected together determines the topology of the neural network. Sometimes referred to as architecture or paradigm.
training Training is the process by which the neural network connection weights are adjusted so that the network performs the function for which it is designed.
training set A neural network is trained using a training set. A training set comprises input patterns describing the problem to be solved, together with (in supervised learning) the corresponding target outputs. Also known as training patterns.
u
unsupervised learning A specific type of learning algorithm, used especially by self-organizing neural nets. Unlike supervised learning, no target patterns exist.
v
w
weight An element of a weight matrix. A connection between two neurons with a value that is dynamically changed during a neural net's learning process.
weight matrix The connection structure between two neuron layers of a neural net. Its elements, the weights, are changed during the net's learning process. Each neural net has at least one weight matrix.
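Propagating a pattern from one layer to the next through a weight matrix can be sketched as a simple matrix-vector product (a minimal illustration; the indexing convention is one of several in use):

```python
def layer_forward(inputs, weight_matrix):
    # weight_matrix[j][i] is the weight of the connection from
    # input neuron i to output neuron j; each output neuron
    # receives the weighted sum of all inputs.
    return [sum(w * x for w, x in zip(row, inputs)) for row in weight_matrix]

print(layer_forward([1.0, 2.0], [[0.5, 0.5], [1.0, 0.0]]))  # [1.5, 1.0]
```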
x
y
z