Regarding (3.42), this estimate asymptotically approaches the true value of the input [35].
3.6 Concluding Remarks
Observers are dynamic processes used to estimate the states or the unknown inputs of both linear and nonlinear dynamic systems. This chapter covered the Luenberger observer, the extended Luenberger-type observer, the sliding-mode observer, and the UIO. In addition to these observers, high-gain observers have been proposed in the literature to handle uncertainty. Although the high gains deployed in such observers allow for fast convergence and performance recovery, they amplify the effect of measurement noise [40]. Hence, there is a trade-off between fast state reconstruction under uncertainty and measurement-noise attenuation. Because of this trade-off, a relatively high gain is used during the transient period and a relatively low gain in steady state. However, stochastic approximation allows for an implementation of the high-gain observer that can cope with measurement noise [41]. Alternatively, the bounding or interval observer provides two simultaneous state estimates, which serve as an upper bound and a lower bound on the true value of the state; the true state is guaranteed to remain within these two bounds [42].
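To make the gain trade-off concrete, the following minimal sketch simulates a standard high-gain observer for a double integrator with a noisy position measurement. The plant, the input, the gain pattern (2/ε and 1/ε²), and the noise level are illustrative assumptions rather than an example taken from this chapter; decreasing ε speeds up state reconstruction but amplifies the noise-driven estimation error in steady state.

```python
import numpy as np

def simulate(eps, dt=1e-3, T=2.0, noise_std=0.01, seed=0):
    """Euler simulation of a double integrator and a high-gain observer (illustrative)."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    x = np.array([1.0, 0.0])    # true state [position, velocity], unknown to the observer
    xh = np.zeros(2)            # observer estimate
    err = np.zeros(steps)
    for k in range(steps):
        u = -2.0 * x[0] - 3.0 * x[1]                      # a known stabilizing input
        y = x[0] + noise_std * rng.standard_normal()      # noisy position measurement
        # High-gain observer: a copy of the model plus output injection with gains 2/eps and 1/eps^2
        xh_dot = np.array([xh[1] + (2.0 / eps) * (y - xh[0]),
                           u + (1.0 / eps ** 2) * (y - xh[0])])
        x_dot = np.array([x[1], u])
        x, xh = x + dt * x_dot, xh + dt * xh_dot
        err[k] = np.linalg.norm(x - xh)
    return err

for eps in (0.1, 0.01):
    e = simulate(eps)
    # Smaller eps: faster decay of the initial estimation error, larger noise-driven error later
    print(f"eps={eps}: error at t=0.2 s = {e[200]:.3f}, "
          f"mean error over last 0.5 s = {e[-500:].mean():.4f}")
```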
4 Bayesian Paradigm and Optimal Nonlinear Filtering
4.1 Introduction
Immanuel Kant proposed the two concepts of the noumenal world and the phenomenal world. While the former is the world of things as they are in themselves, independent of our modes of perception and thought, the latter is the world of things as they appear to us, which depends on how we perceive them. According to Kant, everything about the noumenal world is transcendental: it exists, but it is not amenable to concept formation by us [43].
Following this line of thinking, statistics aims at interpretation rather than explanation. In this framework, statistical inference is built on probabilistic modeling of the observed phenomenon. A probabilistic model must include the available information about the phenomenon of interest as well as the uncertainty associated with this information. The purpose of statistical inference is to solve an inverse problem: retrieving the causes, which are represented by the states and/or parameters of the developed probabilistic model, from the effects, which are summarized in the observations. Conversely, probabilistic modeling describes the behavior of the system and allows us to predict what will be observed in the future, conditional on the states and/or parameters [44].
The Bayesian paradigm provides a mathematical framework in which degrees of belief are quantified by probabilities. It is the method of choice for dealing with uncertainty in measurements. Using the Bayesian approach, the probability of an event of interest (the state) can be calculated based on the probabilities of other events (observations or measurements) that are logically connected to, and therefore stochastically dependent on, the event of interest. Moreover, the Bayesian method allows us to iteratively update the probability of the state as new measurements become available [45]. This chapter reviews the Bayesian paradigm and presents the formulation of the optimal nonlinear filtering problem.
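As a concrete illustration of this iterative updating, the following minimal sketch applies Bayes' rule recursively to a binary state observed through a noisy binary sensor; the sensor probabilities and the measurement sequence are hypothetical values chosen only for illustration, not quantities from the text.

```python
# Minimal sketch of recursive Bayesian updating for a binary state.
p_state = 0.5                   # prior P(state = 1)
p_hit = 0.8                     # assumed P(y = 1 | state = 1)
p_false_alarm = 0.1             # assumed P(y = 1 | state = 0)

for y in [1, 1, 0, 1, 1]:       # measurements arriving one at a time
    like1 = p_hit if y == 1 else 1.0 - p_hit                   # p(y | state = 1)
    like0 = p_false_alarm if y == 1 else 1.0 - p_false_alarm   # p(y | state = 0)
    evidence = like1 * p_state + like0 * (1.0 - p_state)       # total probability of y
    p_state = like1 * p_state / evidence                       # Bayes' rule; posterior becomes the new prior
    print(f"y = {y}:  P(state = 1 | measurements so far) = {p_state:.3f}")
```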
4.2 Bayes' Rule
Bayes' theorem describes the inversion of probabilities. Let us consider two events A and B with P(B) ≠ 0. Bayes' rule expresses the conditional probability of A given B as

\[
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}. \tag{4.1}
\]
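As a simple numerical illustration with hypothetical values (not taken from the text), suppose a fault A occurs with prior probability P(A) = 0.01, and an alarm B is raised with probability P(B | A) = 0.9 when the fault is present and P(B | A^c) = 0.05 when it is absent. Then P(B) = 0.9 × 0.01 + 0.05 × 0.99 = 0.0585, and (4.1) gives P(A | B) = (0.9 × 0.01)/0.0585 ≈ 0.15, so even after an alarm the fault is present with a probability of only about 15%.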
Considering two random variables x and y, Bayes' rule can be written in terms of probability density functions as

\[
p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}, \tag{4.2}
\]

where p(x) is the prior, p(y | x) is the likelihood, p(y) is the evidence, and p(x | y) is the posterior.
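To illustrate (4.2) numerically, the following sketch evaluates the posterior density on a grid for a Gaussian prior and a Gaussian likelihood; the means, variances, and measurement value are arbitrary choices made for illustration, not quantities taken from the text.

```python
import numpy as np

# Minimal sketch: Bayes' rule (4.2) evaluated on a grid.
# Prior: x ~ N(0, 1).  Likelihood: y | x ~ N(x, 0.5^2).  A single measurement y is observed.

def gaussian(z, mean, std):
    return np.exp(-0.5 * ((z - mean) / std) ** 2) / (std * np.sqrt(2.0 * np.pi))

x = np.linspace(-4.0, 4.0, 2001)   # grid over the state
dx = x[1] - x[0]
prior = gaussian(x, 0.0, 1.0)      # p(x)
y = 1.2                            # observed measurement (arbitrary)
likelihood = gaussian(y, x, 0.5)   # p(y | x) as a function of x
evidence = np.sum(likelihood * prior) * dx      # p(y) = integral of p(y | x) p(x) dx
posterior = likelihood * prior / evidence       # p(x | y)

posterior_mean = np.sum(x * posterior) * dx
print(f"posterior mean ≈ {posterior_mean:.3f}")
```

For this conjugate Gaussian case, the grid-based posterior mean matches the closed-form value y/(1 + 0.5²) = 0.96, which serves as a sanity check on the numerical normalization.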
4.3 Optimal Nonlinear Filtering
The behavior of a discrete-time nonlinear system can be described by the following stochastic state-space model: