
      As we have seen in previous sections, one of the major problems in modeling epistemic uncertainties resides in determining the probability distribution (or its bounding) that a given physical quantity follows. In a large majority of cases, however, the uncertainty can at least be bounded. A range with very wide bounds can often be obtained by considering physical constraints on the quantities of interest. For example, physical considerations limit the Poisson ratio of a homogeneous isotropic material to between –1 and 0.5. Although it bounds the quantity, such a range is not very informative. The opinion of an expert, on the other hand, can reduce this range significantly further. In our example, an expert could state that, for the material under consideration, the Poisson ratio lies between 0.2 and 0.4. Note that, based on this information alone, we cannot characterize how the uncertainty varies within this range (are the boundary values as plausible as the central values?). Nonetheless, there are situations where a range is the only information available.

      Interval analysis can be used to model such cases, where the uncertainty is provided only by bounds on the quantities of interest. The uncertainty on the quantities x1, …, xn is then described by their lower bounds x̲1, …, x̲n and their upper bounds x̄1, …, x̄n. A lot of research work has focused on the propagation of uncertainties from x1, …, xn to a quantity y = g(x1, …, xn), where g is any function modeling the relation between the input and output variables. Determining the lower and upper bounds on y is equivalent to solving a global optimization problem, and many global optimization algorithms can consequently be applied to it (Hansen and Walster 2003). Note that in cases where the function g is monotonic over the range of variation of the input variables (that is, over the domain defined by their bounds), the lower and upper bounds on y can be determined very efficiently: due to the monotonicity, y merely has to be evaluated at the vertices of the hypercube formed by the bounds of the input variables, and the lower and upper bounds on y are then necessarily the minimum and maximum among these evaluations, respectively. This approach is often known as the vertex method. Other techniques use Taylor expansions to approximate the bounds on the output variable. Sampling techniques, such as Monte Carlo simulation, can also be used to solve this problem, but they quickly become prohibitively time-consuming, and methods using global optimization algorithms are usually more efficient. For a review of the different algorithms for efficient interval analysis, the reader may refer to Kreinovich and Xiang (2008).
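      To make the vertex method concrete, here is a minimal Python sketch; the function g, its bounds and the assumption that g is monotonic over the box are hypothetical and serve only to illustrate the idea, not to reproduce any implementation from the book.

```python
from itertools import product

def vertex_method(g, lower, upper):
    """Propagate interval bounds through a function g assumed monotonic
    in each input over the box defined by [lower_i, upper_i].

    g is evaluated at every vertex of the hypercube defined by the bounds;
    under the monotonicity assumption, the minimum and maximum of these
    evaluations are the bounds on y = g(x1, ..., xn)."""
    vertices = product(*zip(lower, upper))      # all 2^n corner points
    values = [g(*v) for v in vertices]
    return min(values), max(values)

# Hypothetical example: g is monotonic in both inputs over the chosen box.
g = lambda x1, x2: x1 * x2 + 2.0 * x1
y_low, y_up = vertex_method(g, lower=[1.0, 0.5], upper=[2.0, 1.5])
print(y_low, y_up)  # bounds on y over the input box
```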

      A major disadvantage of interval approaches is the lack of a measure of uncertainty, similar to probability within the context of probability theory. In interval analysis, uncertainty is characterized by the boundaries of the interval only, but no information is available on the likelihood of different values within that interval. In probabilistic approaches, the likelihood of different values is characterized by the PDF, which defines the probability that the quantity of interest lies within a certain range (for example, an interval). Such a measure is not available in the interval approach.

      Interval arithmetic and associated uncertainty propagation methods will not be discussed in more detail because of the lack of an uncertainty measure in interval analysis. This reduces the usefulness of the method for reliability- or robustness-based applications where quantification of the risk associated with various decisions is required. Nevertheless, the interval approach is still useful in situations where only the worst case needs to be considered.

      While bounds on an interval may be sufficient to quantify epistemic uncertainty within the context of interval analysis, there are many situations where additional information is available, making it possible to refine uncertainty quantification.

      Triangular membership functions are also frequently used in fuzzy set theory. They allow a most likely value to be specified, together with bounds outside of which the quantity of interest cannot lie. More generally, any kind of membership function can be used, but this is rare in practice because of the difficulty of specifying such functions for concrete problems.

      As with the previous approaches, a fundamental question concerns the propagation of uncertainties through a function g. Since an interval (the α-cut) can be associated with any level α of the membership function, propagation ultimately amounts to performing an interval analysis at each level α, and the methods discussed in the previous section apply directly. Purely algebraic propagation is also possible for simple operations (additions, multiplications, powers). For a more detailed view of fuzzy set theory, the reader may refer to Zimmermann (2011).
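      As an illustration of this level-by-level propagation, the following Python sketch uses hypothetical triangular fuzzy inputs and a hypothetical monotonic function g; each α-cut is propagated with the same vertex-method helper as in the earlier sketch.

```python
from itertools import product

def vertex_method(g, lower, upper):
    # Same helper as in the earlier sketch: interval propagation for monotonic g.
    values = [g(*v) for v in product(*zip(lower, upper))]
    return min(values), max(values)

def triangular_alpha_cut(a, m, b, alpha):
    """alpha-cut of a triangular membership function with support [a, b] and mode m."""
    return a + alpha * (m - a), b - alpha * (b - m)

# Hypothetical triangular fuzzy inputs, given as (support lower, mode, support upper).
x1 = (1.0, 1.5, 2.0)
x2 = (0.5, 1.0, 1.5)
g = lambda u, v: u * v + 2.0 * u  # hypothetical relation, monotonic on these ranges

# Propagation level by level: each alpha-cut defines an interval analysis problem.
for alpha in (0.0, 0.5, 1.0):
    a1, b1 = triangular_alpha_cut(*x1, alpha)
    a2, b2 = triangular_alpha_cut(*x2, alpha)
    lo, up = vertex_method(g, lower=[a1, a2], upper=[b1, b2])
    print(alpha, lo, up)  # alpha-cut of the output membership function of y
```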

      One of the major drawbacks of fuzzy set theory lies, as with interval analysis, in the absence of a measure of uncertainty playing the role that probability plays in probability theory. Uncertainty propagation is carried out level by level in α, but the link to a measure allowing quantification of the risk associated with a given level α is still missing. In order to remedy this, a measure of uncertainty called possibility has been introduced. This has led to possibility theory, which is presented in the next section.

      1.7.1. Theoretical context

      Formally, possibility theory is built on a possibility space Epos, defined as follows:

      DEFINITION 1.16.– Let Ω be a set and E the set of subsets of Ω. A function Π : E → ℝ is called a possibility measure (or possibility distribution function [PoDF]) if it satisfies the following axioms:

       – the function has values between 0 and 1: ∀A ∈ E, 0 ≤ Π(A) ≤ 1;

       – the image of the empty set is 0: Π(∅) = 0;

       – the image of the entire universe is 1: Π(Ω) = 1;

       – the function is monotonic: ∀A, B ∈ E, if A ⊆ B then Π(A) ≤ Π(B);

       – the function is subadditive: ∀A, B ∈ E, Π(A ∪ B) ≤ Π(A) + Π(B).
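      As a side note (not part of the book's development), a common way to construct a measure satisfying these axioms on a finite universe is to take Π(A) as the maximum, over A, of a pointwise possibility distribution π normalized so that max π = 1. The following Python sketch, built on a hypothetical three-element universe, checks the axioms numerically for such a construction.

```python
from itertools import chain, combinations

# Hypothetical possibility distribution pi on a finite universe Omega,
# normalized so that max(pi) = 1.
pi = {"low": 0.2, "medium": 1.0, "high": 0.6}
omega = frozenset(pi)

def Pi(A):
    """Possibility of an event A (subset of omega): max of pi over A, 0 for the empty set."""
    return max((pi[x] for x in A), default=0.0)

def subsets(s):
    # All events, i.e. the power set of the universe.
    return [frozenset(c) for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

events = subsets(omega)
assert Pi(frozenset()) == 0.0 and Pi(omega) == 1.0                       # boundary axioms
assert all(0.0 <= Pi(A) <= 1.0 for A in events)                          # values in [0, 1]
assert all(Pi(A) <= Pi(B) for A in events for B in events if A <= B)     # monotonicity
assert all(Pi(A | B) <= Pi(A) + Pi(B) for A in events for B in events)   # subadditivity
print("all axioms satisfied for this example")
```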

      DEFINITION 1.17.– Let Ω be a set, called the universe,