Figure 3.4 Illustration of backpropagation.

      3.2.1 Spatial Temporal Representations

      In engineering practice, the input signals to a neural network have most often passed through some form of filtering before becoming part of the observation set. This also coincides with the form of potential maintained at the axon hillock region of the neural cell. With this in mind, we may modify Eq. (3.1) as

      (3.15) $s(t) = \sum_i \int_0^t w_i(\tau)\, x_i(t - \tau)\, d\tau + w_b = \sum_i w_i(t) \ast x_i(t) + w_b$.

      By adding filtering operations, we have included the equally important temporal dimension in the static model. For our purposes, we will now be interested in adapting the filters. To this end, we assume a discrete finite impulse response (FIR) representation for each filter. This yields

      (3.16) $s(k) = \sum_i \sum_{n=0}^{M} w_i(n)\, x_i(k - n) + w_b = \sum_i \mathbf{w}_i \mathbf{x}_i(k) + w_b$,

      with $k$ being the discrete time index for some sampling rate $\Delta t$, and $w_i(n)$ being the coefficients of the FIR filters. In the following, we write the weight vector as $\mathbf{w}_i = [w_i(0), w_i(1), \ldots, w_i(M)]$ and the delayed states as $\mathbf{x}_i(k) = [x_i(k), x_i(k - 1), \ldots, x_i(k - M)]$. A filter operation can now be written as the vector dot product $\mathbf{w}_i \mathbf{x}_i(k)$, with time implicitly included in the notation.
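      As a concrete illustration, here is a minimal NumPy sketch of Eq. (3.16). The function name fir_neuron_output and the array layout are our own illustrative choices, not notation from the text.

```python
import numpy as np

def fir_neuron_output(w, x_hist, w_b):
    """Eq. (3.16): FIR-filtered inputs summed at one neuron, at time k.

    w      : shape (I, M + 1); row i holds the taps w_i(0), ..., w_i(M).
    x_hist : shape (I, M + 1); row i holds [x_i(k), x_i(k-1), ..., x_i(k-M)].
    w_b    : scalar bias weight.
    """
    # Each row-wise dot product is one filter operation w_i . x_i(k);
    # summing over the inputs i and adding the bias gives s(k).
    return np.einsum('im,im->i', w, x_hist).sum() + w_b
```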

      We use the same approach to network modeling as in the previous section, but each link in the network is now realized as an FIR filter (see Figure 3.5). The neural network no longer performs a simple static mapping from input to output; internal memory has been added to it. At the same time, since there are no feedback loops, the overall network is still FIR [2–5]. The notation now becomes $\mathbf{w}_{ij}^l = [w_{ij}^l(0), w_{ij}^l(1), \ldots, w_{ij}^l(M^l)]$.

      3.2.2 Neural Network Unfolding

      An interesting and more insightful representation of the FIR network is derived by using a concept known as unfolding in time. The general strategy is to remove all time delays by expanding the network into a larger, equivalent static structure.
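To make the equivalence concrete, the following sketch (our own construction, assuming zero initial conditions) computes a single FIR filter output both by streaming samples through a tap-delay line and as static dot products on the unfolded input windows; the two results agree.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 3                        # filter order
w = rng.normal(size=M + 1)   # taps w(0), ..., w(M)
x = rng.normal(size=20)      # input sequence

# Recurrent view: push each sample through a tap-delay line.
delay = np.zeros(M + 1)
s_stream = []
for x_k in x:
    delay = np.roll(delay, 1)
    delay[0] = x_k                      # delay = [x(k), x(k-1), ..., x(k-M)]
    s_stream.append(w @ delay)

# Unfolded (static) view: one dot product per step on a sliding window,
# padding with zeros before the start of the sequence.
s_static = [w @ np.array([x[k - n] if k - n >= 0 else 0.0
                          for n in range(M + 1)])
            for k in range(len(x))]

assert np.allclose(s_stream, s_static)  # the two structures are equivalent
```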

Figure 3.5 Schematic illustration of a finite impulse response (FIR) neuron and neural network. The notation is as follows:

$\mathbf{w}_{ij}^l = [w_{ij}^l(0), w_{ij}^l(1), \ldots, w_{ij}^l(M^l)]$ : weight vector connecting neuron i in layer l − 1 to neuron j in layer l
$w_{bj}^l$ : bias weight for neuron j in layer l
$s_j^l(k) = \sum_i \mathbf{w}_{ij}^l \cdot \mathbf{a}_i^{l-1}(k) + w_{bj}^l$ : summing junction for neuron j in layer l
$a_j^l(k) = \tanh\!\left(s_j^l(k)\right)$ : activation value for neuron j in layer l
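Putting the layer equations from Figure 3.5 together, the following is a minimal sketch of one forward time step through an FIR network. The function fir_network_step and the buffer layout are our own illustrative choices, with the tap-delay buffers assumed to start at zero.

```python
import numpy as np

def fir_network_step(x_k, W, b, buffers):
    """Advance an FIR network with tanh units by one time step k.

    W       : list over layers l; W[l] has shape (N_l, N_{l-1}, M_l + 1),
              with W[l][j, i] holding the taps of w_ij^l.
    b       : list over layers; b[l][j] is the bias weight w_bj^l.
    buffers : list over layers; buffers[l] has shape (N_{l-1}, M_l + 1) and
              stores [a_i^{l-1}(k), ..., a_i^{l-1}(k - M_l)] for each input i.
    """
    a = np.asarray(x_k)
    for Wl, bl, buf in zip(W, b, buffers):
        buf[:] = np.roll(buf, 1, axis=1)          # shift each tap-delay line
        buf[:, 0] = a                             # newest activations a_i^{l-1}(k)
        s = np.einsum('jim,im->j', Wl, buf) + bl  # summing junction s_j^l(k)
        a = np.tanh(s)                            # activation a_j^l(k)
    return a
```

Calling fir_network_step on successive input samples reproduces the streaming behavior of the network in Figure 3.5, with the buffers playing the role of the tap-delay lines on each link.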
