f_1, f_2, ..., f_n are all nonnegative, and their sum is 1. We may think of f_1, f_2, ..., f_n as observed weights or measures of occurrence of e_1, e_2, ..., e_n obtained on the basis of an experiment consisting of a large number of repeated trials. If the entire experiment is repeated, another set of f's would occur with slightly different values, and so on for further repetitions. If we think of indefinitely many repetitions, we can conceive of idealized values being obtained for the f's. It is impossible, of course, to show that in a physical experiment the f's converge to limiting values, in a strict mathematical sense, as the number of trials increases indefinitely. So we postulate values p_1, p_2, ..., p_n corresponding to the idealized values of f_1, f_2, ..., f_n, respectively, for an indefinitely large number of trials. It is assumed that p_1, p_2, ..., p_n are all positive numbers and that

p_1 + p_2 + ... + p_n = 1    (3.3.1)

The quantities p_1, p_2, ..., p_n are called the probabilities of occurrence of e_1, e_2, ..., e_n, respectively.
Now suppose that E is any event in S that consists of a set of one or more e's, say e_{i_1}, e_{i_2}, ..., e_{i_k}. Thus E = {e_{i_1}, e_{i_2}, ..., e_{i_k}}. The probability of the occurrence of E is denoted by P(E) and is defined as follows:

P(E) = p_{i_1} + p_{i_2} + ... + p_{i_k}

If E contains only one element, say E = {e_i}, it is written as

P(E) = p_i
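As a small illustration of this definition (a hypothetical sketch, not from the text: the die, the labels e1, ..., e6, and the helper `prob` are mine), P(E) is computed by summing the probabilities assigned to the sample points in E:

```python
# Hypothetical example: a fair die with sample points e1, ..., e6,
# each assigned probability p_i = 1/6.
from fractions import Fraction

p = {f"e{i}": Fraction(1, 6) for i in range(1, 7)}

def prob(event):
    """P(E) is defined as the sum of the p_i over the e_i in E."""
    return sum(p[e] for e in event)

E = {"e2", "e4", "e6"}            # the event "an even face turns up"
print(prob(E))                    # -> 1/2
print(sum(p.values()) == 1)       # the p_i sum to 1, as in (3.3.1)
```

Exact rational arithmetic (`Fraction`) is used so the sum in (3.3.1) checks out exactly rather than to floating-point tolerance.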
It is evident that probabilities of events in a finite sample space S are values of an additive set function P(E) defined on events E in S, satisfying the following conditions:
1 If E is any event in S, then

P(E) ≥ 0    (3.3.2a)

2 If E is the sample space S itself, then

P(S) = 1    (3.3.2b)

3 If E and F are two disjoint events in S, then

P(E ∪ F) = P(E) + P(F)    (3.3.2c)
These conditions are also sometimes known as the axioms of probability. In the case of an infinite sample space S, condition 3 extends as follows: if E_1, E_2, ... is an infinite sequence of disjoint events, then

P(E_1 ∪ E_2 ∪ ...) = P(E_1) + P(E_2) + ...    (3.3.2d)
As E and Ē are disjoint events, from condition 3 we obtain

P(E ∪ Ē) = P(E) + P(Ē)    (3.3.3)

But since E ∪ Ē = S and P(S) = 1, we have the following:
Theorem 3.3.1 (Rule of complementation) If E is an event in a sample space S, then

P(Ē) = 1 − P(E)    (3.3.4)
The rule of complementation provides a simple method of finding the probability of an event Ē if E is an event whose probability is easy to find. We sometimes say that the odds in favor of E are

P(E)/P(Ē)    (3.3.4a)

which from (3.3.4) takes the form P(E)/(1 − P(E)). The reader may note that, conversely, P(E) = odds/(1 + odds).
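The two directions of this relation can be sketched as follows (the helper names `odds_in_favor` and `prob_from_odds` are mine, introduced only for illustration):

```python
from fractions import Fraction

def odds_in_favor(p_E):
    """Odds in favor of E: P(E)/P(E-bar) = P(E)/(1 - P(E)), as in (3.3.4a)."""
    return p_E / (1 - p_E)

def prob_from_odds(odds):
    """Invert the relation: P(E) = odds/(1 + odds)."""
    return odds / (1 + odds)

p_E = Fraction(3, 4)
o = odds_in_favor(p_E)
print(o)                   # -> 3, i.e., odds of 3 to 1 in favor of E
print(prob_from_odds(o))   # -> 3/4, recovering P(E)
```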
Example 3.3.1 (Tossing coins) Suppose that 10 coins are tossed and we ask for the probability of getting at least 1 head. In this example, the sample space S has 2^10 = 1024 sample points. If the coins are unbiased, the sample points are equally likely (sample points are called equally likely if each sample point has the same probability of occurring), so that to each of the 1024 sample points the probability 1/1024 is assigned. If we denote by E the event of getting no heads, then E contains only one sample point, and Ē, of course, has 1023 sample points. Thus

P(E) = 1/1024 and P(Ē) = 1 − P(E) = 1023/1024

The odds in favor of E and Ē are clearly 1 to 1023 and 1023 to 1, respectively.
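The arithmetic in Example 3.3.1 can be verified by brute-force enumeration of all 2^10 outcomes (a sketch assuming fair coins, with each outcome written as a string of H's and T's):

```python
from fractions import Fraction
from itertools import product

# All 2^10 = 1024 sample points for 10 coin tosses.
outcomes = list(product("HT", repeat=10))
assert len(outcomes) == 1024

# E: the event of getting no heads (a single sample point, TTTTTTTTTT).
no_heads = [w for w in outcomes if "H" not in w]
p_E = Fraction(len(no_heads), len(outcomes))

print(p_E)        # -> 1/1024
print(1 - p_E)    # -> 1023/1024, P(at least 1 head) by the rule of complementation
```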
Referring to the statement in Theorem 3.3.1 that E and Ē are disjoint events whose union is S, we have the following rule.
Theorem 3.3.2 (General rule of complementation) If E_1, E_2, ..., E_n are events in a sample space S, then we have

P(Ē_1 ∩ Ē_2 ∩ ... ∩ Ē_n) = 1 − P(E_1 ∪ E_2 ∪ ... ∪ E_n)    (3.3.5)
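Since the complement of a union is the intersection of the complements, (3.3.5) follows from Theorem 3.3.1. Here is a quick finite check (a hypothetical four-point sample space of my own choosing, with two overlapping events):

```python
from fractions import Fraction

# Four equally likely sample points and two (overlapping) events.
S = {"e1", "e2", "e3", "e4"}
p = {e: Fraction(1, 4) for e in S}
E1, E2 = {"e1", "e2"}, {"e2", "e3"}

def prob(A):
    return sum(p[e] for e in A)

# Left side of (3.3.5): P(E1-bar intersect E2-bar), complements taken in S.
lhs = prob((S - E1) & (S - E2))
# Right side of (3.3.5): 1 - P(E1 union E2).
rhs = 1 - prob(E1 | E2)
print(lhs, rhs)   # both are 1/4 (only e4 lies outside E1 and E2)
```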
Another useful result follows readily from (3.3.2c) by mathematical induction.
Theorem 3.3.3 (Rule of addition of probabilities for mutually exclusive events) If E_1, E_2, ..., E_n are disjoint events in a sample space S, then

P(E_1 ∪ E_2 ∪ ... ∪ E_n) = P(E_1) + P(E_2) + ... + P(E_n)    (3.3.6)
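A short check of (3.3.6) on a hypothetical collection of pairwise disjoint events (the six-point sample space and the events E1, E2, E3 are illustrative, not from the text):

```python
from fractions import Fraction

# Six equally likely sample points; three pairwise disjoint events.
p = {f"e{i}": Fraction(1, 6) for i in range(1, 7)}
E1, E2, E3 = {"e1"}, {"e2", "e3"}, {"e5"}

def prob(A):
    return sum(p[e] for e in A)

# For disjoint events, the probability of the union equals the sum
# of the individual probabilities, as in (3.3.6).
union = E1 | E2 | E3
print(prob(union))                      # -> 2/3
print(prob(E1) + prob(E2) + prob(E3))   # -> 2/3
```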
Example 3.3.2 (Determination of probabilities