Handbook of Intelligent Computing and Optimization for Sustainable Development

Author: Group of authors
Publisher: John Wiley & Sons Limited
Genre: Technical literature
ISBN: 9781119792628

      ANN, a domain of artificial intelligence, mimics the biological neural networks of the nervous system discussed above. The connections between neurons in an ANN are modeled computationally and mathematically in much the same way as the connections between biological neurons.

      An ANN can be defined as a mathematical and computational tool for nonlinear statistical data modeling, inspired by the structure and function of the biological nervous system. A large number of densely interconnected processing units, termed neurons, make up an ANN.

      Generally, an ANN receives a set of inputs and produces their weighted sum; the result is then passed to a nonlinear function, which generates the output. Like human beings, an ANN learns by example, and ANN models must be appropriately trained to generate the output efficiently. In the biological nervous system, learning involves adaptations in the synaptic connections between neurons; this idea influences the learning procedure of an ANN, whose system parameters are adjusted according to the I/O patterns. Through the learning process, an ANN can be applied in domains such as data classification and pattern recognition.
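The weighted-sum-plus-nonlinearity behaviour described above can be sketched in Python; the function name and the logistic sigmoid chosen as the nonlinearity are illustrative, not taken from the text:

```python
import math

def neuron_output(inputs, weights):
    """Single artificial neuron: weighted sum of the inputs,
    squashed by a nonlinear (here, logistic sigmoid) function."""
    s = sum(w * x for w, x in zip(weights, inputs))  # weighted sum
    return 1.0 / (1.0 + math.exp(-s))                # nonlinear output in (0, 1)

# Example: two inputs with hand-picked weights
print(neuron_output([1.0, 0.0], [0.8, -0.4]))
```

With zero net input the sigmoid returns its midpoint 0.5; large positive or negative sums saturate toward 1 or 0.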

      Researchers have been working on ANNs for the past several decades; the field was established even before the advent of modern computers. The artificial neuron [1] was first introduced in 1943 by Warren McCulloch, a neurophysiologist, and Walter Pitts, a logician.

      2.3.1 McCulloch-Pitts Neural Model

      The model proposed by McCulloch and Pitts is documented as the linear threshold gate [1]. The artificial neuron takes a set of inputs I1, I2, I3, …, IN ∈ {0, 1} and produces one output, y ∈ {0, 1}. The inputs are of two types: one is the dependent input, termed the excitatory input, and the other is the independent input, termed the inhibitory input. Mathematically, the function can be expressed by the following equations:

      (2.2) S = W1I1 + W2I2 + ⋯ + WNIN = Σ(i=1 to N) WiIi;  y = 1 if S ≥ θ, and y = 0 if S < θ

      W1, W2, W3, …, …, WN ≡ weight values associated with the corresponding input which are normalized in the range of either (0, 1) or (−1, 1);

      S ≡ weighted sum;

      θ ≡ threshold constant.

      This initial two-state model of an ANN is simple but has considerable computational power. Its disadvantage is a lack of flexibility, because the weights and threshold values are fixed. The McCulloch-Pitts neuron model was later improved by incorporating more flexible features to extend its application domain.
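A minimal sketch of the linear threshold gate in Python, under the common convention that any active inhibitory input suppresses the output (the function and parameter names are illustrative):

```python
def mp_neuron(excitatory, weights, theta, inhibitory=()):
    """McCulloch-Pitts linear threshold gate.

    excitatory: binary inputs in {0, 1}
    weights:    weight values for the excitatory inputs
    theta:      threshold constant
    inhibitory: binary inhibitory inputs; any active one forces y = 0
    """
    if any(inhibitory):                                   # absolute inhibition
        return 0
    s = sum(w * i for w, i in zip(weights, excitatory))   # weighted sum S
    return 1 if s >= theta else 0                         # two-state output y

# A two-input AND gate: unit weights, threshold 2
for a in (0, 1):
    for b in (0, 1):
        print(a, b, mp_neuron([a, b], [1, 1], theta=2))
```

Because the weights and threshold are fixed by hand, each logic function needs its own parameter choice, which illustrates the inflexibility noted above.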

      2.3.2 The Perceptron

      (2.3) y = 1 if Σ(i=1 to N) WiIi + b > 0, and y = 0 otherwise

      where

      b ≡ bias value.
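Equation (2.3), a threshold unit with an explicit bias term b, can be sketched as follows (names are illustrative; the OR-gate parameters are one possible choice, not from the text):

```python
def perceptron(inputs, weights, b):
    """Perceptron unit: fires when the weighted sum plus bias is positive."""
    s = sum(w * x for w, x in zip(weights, inputs)) + b  # weighted sum + bias b
    return 1 if s > 0 else 0

# With weights (1, 1) and bias -0.5 this realises a two-input OR gate
for a in (0, 1):
    for c in (0, 1):
        print(a, c, perceptron([a, c], [1, 1], b=-0.5))
```

Unlike the fixed McCulloch-Pitts gate, the weights and bias here are free parameters that a training rule can adjust.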

      2.3.3 ANN With Continuous Characteristics

      S = Σ(i=1 to N) WiIi + T

      where

      T ≡ an extra input, associated with a weight value of 1, which represents the threshold or bias of the neuron.

Figure: Schematic illustration of the linear threshold function and of the linear threshold gate.

      The second stage of the model is the activation function, which takes the sum-of-products value as input and produces the output. The behaviour of this stage determines the characteristics of the ANN model. The function compresses the amplitude of the output so that it lies in the range [0, 1] or [−1, 1]. This compression of the output signal mimics the signal produced by a biological neuron in the form of continuous action-potential spikes.
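Two common squashing activations matching the two output ranges mentioned above can be sketched as (function names are illustrative):

```python
import math

def logistic(s):
    """Logistic sigmoid: compresses any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-s))

def tanh_activation(s):
    """Hyperbolic tangent: compresses any real input into (-1, 1)."""
    return math.tanh(s)

print(logistic(0.0), tanh_activation(0.0))   # midpoints: 0.5 and 0.0
```

Both functions are smooth and monotonic, so small changes in the sum-of-products value produce small, graded changes in the output, giving the neuron its continuous characteristics.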

Figure: Schematic illustration of the ANN model with continuous characteristics; graph of the logistic sigmoid function.