Now, the continuous‐time state‐space model in (2.88) and (2.89) can be treated as an LTV system. The discrete‐time equivalent of (2.88) and (2.89) is obtained as:
$$\mathbf{x}_{k+1} = \boldsymbol{\Phi}_k \mathbf{x}_k + \boldsymbol{\Gamma}_k \mathbf{u}_k, \qquad (2.90)$$
$$\mathbf{y}_k = \mathbf{C}_k \mathbf{x}_k + \mathbf{D}_k \mathbf{u}_k, \qquad (2.91)$$
where $\boldsymbol{\Phi}_k = \boldsymbol{\Phi}(t_{k+1}, t_k)$ is the state-transition matrix, which is the solution to
$$\frac{d\boldsymbol{\Phi}(t, t_k)}{dt} = \mathbf{A}(t)\,\boldsymbol{\Phi}(t, t_k), \qquad (2.92)$$
with the initial condition:
$$\boldsymbol{\Phi}(t_k, t_k) = \mathbf{I}, \qquad (2.93)$$
when we set $\mathbf{x}_k = \mathbf{x}(t_k)$ and $\mathbf{u}_k = \mathbf{u}(t_k)$. As before, $\boldsymbol{\Gamma}_k$ is given by:
$$\boldsymbol{\Gamma}_k = \int_{t_k}^{t_{k+1}} \boldsymbol{\Phi}(t_{k+1}, \tau)\,\mathbf{B}(\tau)\, d\tau. \qquad (2.94)$$
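To make the discretization concrete, the following Python sketch numerically evaluates $\boldsymbol{\Phi}_k$ and $\boldsymbol{\Gamma}_k$ over one sampling interval by integrating (2.92)–(2.94). This is only an illustrative sketch, not the text's procedure: the example matrices $\mathbf{A}(t)$, $\mathbf{B}(t)$ and the helper name `discretize_ltv` are assumptions introduced here. The integral in (2.94) is obtained by propagating $\dot{\boldsymbol{\Psi}} = \mathbf{A}(t)\boldsymbol{\Psi} + \mathbf{B}(t)$ from $\boldsymbol{\Psi}(t_k) = \mathbf{0}$, whose value at $t_{k+1}$ equals $\boldsymbol{\Gamma}_k$.

```python
# Minimal sketch (not from the text): discrete-time equivalent (2.90)-(2.94)
# of a continuous-time LTV model over one sampling interval [t_k, t_{k+1}].
import numpy as np
from scipy.integrate import solve_ivp

def A(t):                       # illustrative time-varying system matrix
    return np.array([[0.0, 1.0],
                     [-2.0 - np.sin(t), -0.5]])

def B(t):                       # illustrative time-varying input matrix
    return np.array([[0.0],
                     [1.0 + 0.1 * t]])

def discretize_ltv(A, B, t_k, t_kp1):
    """Return (Phi_k, Gamma_k) for the interval [t_k, t_kp1]."""
    n = A(t_k).shape[0]
    m = B(t_k).shape[1]

    def rhs(t, z):
        Phi = z[:n * n].reshape(n, n)
        Psi = z[n * n:].reshape(n, m)
        dPhi = A(t) @ Phi            # (2.92)
        dPsi = A(t) @ Psi + B(t)     # variation-of-constants form of (2.94)
        return np.concatenate([dPhi.ravel(), dPsi.ravel()])

    z0 = np.concatenate([np.eye(n).ravel(),        # (2.93): Phi(t_k, t_k) = I
                         np.zeros((n, m)).ravel()])
    sol = solve_ivp(rhs, (t_k, t_kp1), z0, rtol=1e-9, atol=1e-12)
    zT = sol.y[:, -1]
    return zT[:n * n].reshape(n, n), zT[n * n:].reshape(n, m)

Phi_k, Gamma_k = discretize_ltv(A, B, t_k=0.0, t_kp1=0.1)
print(Phi_k)
print(Gamma_k)
```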
So far, this chapter has focused on the observability of deterministic systems. Section 2.7 discusses the observability of stochastic systems.
2.7 Observability of Stochastic Systems
Before defining observability for stochastic systems, we need to recall a few concepts from information theory [26]:
Definition 2.3 Entropy is a measure of our uncertainty about an event in Shannon's information theory. Specifically, the entropy of a discrete random vector $\mathbf{x}$ with alphabet $\mathcal{X}$ is defined as:
$$H(\mathbf{x}) = -\sum_{\mathbf{x} \in \mathcal{X}} p(\mathbf{x}) \log p(\mathbf{x}), \qquad (2.95)$$
and correspondingly, for a continuous random vector $\mathbf{x}$, we have:
$$H(\mathbf{x}) = -\int p(\mathbf{x}) \log p(\mathbf{x})\, d\mathbf{x}. \qquad (2.96)$$
Entropy can also be interpreted as the expected value of the term $-\log p(\mathbf{x})$:
$$H(\mathbf{x}) = \mathbb{E}\big[-\log p(\mathbf{x})\big], \qquad (2.97)$$
where $\mathbb{E}[\cdot]$ is the expectation operator and $p(\mathbf{x})$ is the probability density function (PDF) of $\mathbf{x}$. The definition of Shannon's entropy, $H(\mathbf{x})$, shows that it is a function of the corresponding PDF. It is insightful to examine how this information measure is affected by the shape of the PDF. A relatively broad and flat PDF, which is associated with a lack of predictability, has high entropy. On the other hand, if the PDF is relatively narrow with sharp slopes around a specific value of $\mathbf{x}$, which is associated with a bias toward that particular value of $\mathbf{x}$, then it has low entropy. A rearrangement of the tuples
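As a rough numerical illustration of this point (not from the text), the short Python snippet below evaluates (2.95) for a flat versus a peaked discrete distribution, and the differential entropy (2.96) of a scalar Gaussian for a broad versus a narrow standard deviation. The specific distributions are arbitrary choices made only for illustration.

```python
# Illustrative sketch (not from the text): a broad, flat PDF has higher
# entropy than a narrow, peaked one.  Discrete case follows (2.95);
# the Gaussian case uses the differential-entropy form (2.96).
import numpy as np

def discrete_entropy(p):
    """Shannon entropy (2.95) of a discrete distribution, in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # by convention, 0 * log 0 = 0
    return float(-np.sum(p * np.log(p)))

# Broad (uniform) versus narrow (peaked) distributions over 4 symbols.
print(discrete_entropy([0.25, 0.25, 0.25, 0.25]))   # ~1.386 nats (high)
print(discrete_entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.168 nats (low)

# For a scalar Gaussian, (2.96) evaluates to 0.5 * log(2 * pi * e * sigma^2).
for sigma in (2.0, 0.1):              # broad vs. narrow PDF
    print(sigma, 0.5 * np.log(2.0 * np.pi * np.e * sigma**2))
```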