Nonlinear Filters. Simon Haykin.

Author: Simon Haykin
Publisher: John Wiley & Sons Limited
ISBN: 9781119078159
Posterior Cramér–Rao Lower Bound

To assess the performance of an estimator, a lower bound is always desirable. Such a bound quantifies the fundamental performance limit and indicates whether the design criterion is realistic and achievable. The Cramér–Rao lower bound (CRLB) represents the lowest possible mean-square error attainable by any unbiased estimator of a deterministic parameter, and it can be computed as the inverse of the Fisher information matrix. For random variables, an analogous version of the CRLB, namely, the posterior Cramér–Rao lower bound (PCRLB), was derived in [52] as:

(4.19) $\mathbf{P}_{k|k} = \mathbb{E}\big[(\mathbf{x}_k - \hat{\mathbf{x}}_k)(\mathbf{x}_k - \hat{\mathbf{x}}_k)^T\big] \geq \mathbf{F}_k^{-1},$

where $\mathbf{F}_k^{-1}$ denotes the inverse of the Fisher information matrix at time instant $k$. This bound is also referred to as the Bayesian CRLB [53, 54]. To compute it in an online manner, an iterative version of the PCRLB for nonlinear filtering with state-space models was proposed in [55], where the posterior information matrix of the hidden state vector is decomposed at each discrete-time instant by factorizing the joint PDF of the state variables. In this way, an iterative structure is obtained for the evolution of the information matrix. For a nonlinear system with the following state-space model with zero-mean additive Gaussian noise:

(4.20) $\mathbf{x}_{k+1} = \mathbf{f}_k(\mathbf{x}_k) + \mathbf{v}_k,$

(4.21) $\mathbf{y}_k = \mathbf{g}_k(\mathbf{x}_k) + \mathbf{w}_k,$

the sequence of posterior information matrices, $\mathbf{F}_k$, for estimating the state vectors, $\mathbf{x}_k$, can be computed as [55]:

(4.22) $\mathbf{F}_{k+1} = \mathbf{D}_k^{22} - \mathbf{D}_k^{21}\left(\mathbf{F}_k + \mathbf{D}_k^{11}\right)^{-1}\mathbf{D}_k^{12},$

      where

(4.23) $\mathbf{D}_k^{11} = \mathbb{E}\big[\nabla_{\mathbf{x}_k}\mathbf{f}_k(\mathbf{x}_k)\,\mathbf{Q}_k^{-1}\,\nabla_{\mathbf{x}_k}^T\mathbf{f}_k(\mathbf{x}_k)\big],$

(4.24) $\mathbf{D}_k^{12} = -\mathbb{E}\big[\nabla_{\mathbf{x}_k}\mathbf{f}_k(\mathbf{x}_k)\big]\,\mathbf{Q}_k^{-1},$

(4.25) $\mathbf{D}_k^{21} = \left(\mathbf{D}_k^{12}\right)^T,$

(4.26) $\mathbf{D}_k^{22} = \mathbb{E}\big[\nabla_{\mathbf{x}_{k+1}}\mathbf{g}_{k+1}(\mathbf{x}_{k+1})\,\mathbf{R}_{k+1}^{-1}\,\nabla_{\mathbf{x}_{k+1}}^T\mathbf{g}_{k+1}(\mathbf{x}_{k+1})\big] + \mathbf{Q}_k^{-1},$

where $\mathbf{Q}_k$ and $\mathbf{R}_k$ are the process and measurement noise covariance matrices, respectively.
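As an illustrative sanity check (not from the text), consider the linear-Gaussian special case $\mathbf{f}_k(\mathbf{x}) = \mathbf{A}\mathbf{x}$ and $\mathbf{g}_k(\mathbf{x}) = \mathbf{C}\mathbf{x}$, where the matrices $\mathbf{A}$ and $\mathbf{C}$ are our own notation. The Jacobians are then constant, so the expectations in (4.23)–(4.26) drop out and the recursion (4.22) can be simplified:

```latex
% Linear-Gaussian case: f_k(x) = A x, g_k(x) = C x (A, C are illustrative notation).
% Constant Jacobians make the expectations in (4.23)-(4.26) trivial:
\begin{align*}
\mathbf{D}_k^{11} &= \mathbf{A}^T \mathbf{Q}_k^{-1} \mathbf{A}, &
\mathbf{D}_k^{12} &= -\mathbf{A}^T \mathbf{Q}_k^{-1}, &
\mathbf{D}_k^{22} &= \mathbf{C}^T \mathbf{R}_{k+1}^{-1} \mathbf{C} + \mathbf{Q}_k^{-1}.
\end{align*}
% Substituting into (4.22) and applying the matrix inversion lemma:
\begin{align*}
\mathbf{F}_{k+1}
&= \mathbf{C}^T \mathbf{R}_{k+1}^{-1} \mathbf{C}
 + \mathbf{Q}_k^{-1}
 - \mathbf{Q}_k^{-1} \mathbf{A}
   \left(\mathbf{F}_k + \mathbf{A}^T \mathbf{Q}_k^{-1} \mathbf{A}\right)^{-1}
   \mathbf{A}^T \mathbf{Q}_k^{-1} \\
&= \mathbf{C}^T \mathbf{R}_{k+1}^{-1} \mathbf{C}
 + \left(\mathbf{Q}_k + \mathbf{A}\,\mathbf{F}_k^{-1}\mathbf{A}^T\right)^{-1}.
\end{align*}
```

This is the information-form Riccati recursion of the Kalman filter, consistent with the fact that in the linear-Gaussian case the PCRLB is attained by the Kalman filter.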

The general formulation of optimal nonlinear Bayesian filtering leads to a computationally intractable problem; hence, the Bayesian solution is conceptual rather than practical. Settling for computationally tractable suboptimal solutions through different approximation methods has produced a wide range of classic as well as machine-learning-based filtering algorithms, each with its own advantages, restrictions, and domains of applicability. To assess and compare such filtering algorithms, several performance metrics can be used, including entropy, Fisher information, and the PCRLB. Furthermore, the Fisher information matrix defines the natural gradient, which is useful in machine learning.
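The recursion (4.22)–(4.26) can be evaluated numerically by approximating the expectations with Monte Carlo averages over simulated state trajectories. The sketch below does this for a hypothetical scalar model of our own choosing (the functions, noise levels, and prior are illustrative assumptions, not from the text):

```python
import numpy as np

# Hypothetical scalar model for illustration (not from the text):
#   x_{k+1} = 0.9 x_k + 0.1 sin(x_k) + v_k,  v_k ~ N(0, Q)
#   y_k     = x_k^2 / 2 + w_k,               w_k ~ N(0, R)
Q, R = 0.1, 0.5

def f(x):   # state-transition function f_k
    return 0.9 * x + 0.1 * np.sin(x)

def df(x):  # derivative (scalar Jacobian) of f
    return 0.9 + 0.1 * np.cos(x)

def dg(x):  # derivative of the measurement function g(x) = x^2 / 2
    return x

rng = np.random.default_rng(0)
n_mc, n_steps = 5000, 30

# Monte Carlo trajectories approximate the expectations in (4.23)-(4.26).
x = rng.normal(0.0, 1.0, n_mc)   # samples from an assumed prior p(x_0) = N(0, 1)
F = 1.0                          # F_0: inverse of the prior variance
pcrlb = [1.0 / F]

for k in range(n_steps):
    x_next = f(x) + rng.normal(0.0, np.sqrt(Q), n_mc)
    D11 = np.mean(df(x) ** 2) / Q                 # Eq. (4.23)
    D12 = -np.mean(df(x)) / Q                     # Eq. (4.24)
    D21 = D12                                     # Eq. (4.25), scalar case
    D22 = np.mean(dg(x_next) ** 2) / R + 1.0 / Q  # Eq. (4.26)
    F = D22 - D21 * D12 / (F + D11)               # Eq. (4.22)
    pcrlb.append(1.0 / F)
    x = x_next

print(pcrlb[-1])  # lower bound on the MSE of any estimator of the final state
```

Any filter's empirical mean-square error on this model, averaged over the same trajectories, can then be compared against `pcrlb` at each time step to judge how close the filter comes to the bound.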

