Probability models are mathematical frameworks that help us think about chance processes whose outcomes cannot be predicted with certainty. We begin with one of the simplest types of probability models, in which the set of possible outcomes of a chance process is finite.
The next definition extends the assignment of probabilities to subsets of \(\Omega\text{.}\) Recall that the power set \(\mathcal{P}(S)\) of a set \(S\) is the set of all subsets of \(S\text{.}\)
Comment on notation: For a single element \(\omega\) in \(\Omega\text{,}\) it is common practice to write \(P(\omega)\) as a shorthand for \(P(\{\omega\})\text{.}\)
A finite probability model is a pair \((\Omega,P)\text{,}\) where \(\Omega\) is a finite set, and where \(P\) is a finite probability measure on \(\Omega\text{.}\) The set \(\Omega\) is called the probability space of the model. Elements of \(\Omega\) are called outcomes, and elements of \(\mathcal{P}(\Omega)\) are called events.
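For concreteness, a finite probability model can be sketched in code; the dictionary representation and the particular probability values below are illustrative choices, not prescribed by the definition. Exact fractions avoid floating-point round-off.

```python
from fractions import Fraction

# A finite probability model (Omega, P), sketched as a dict mapping
# each outcome in Omega = {a, b, c} to its probability.
p = {"a": Fraction(1, 6), "b": Fraction(1, 3), "c": Fraction(1, 2)}

# p is a probability function: nonnegative values that sum to 1.
assert all(v >= 0 for v in p.values()) and sum(p.values()) == 1

# The probability measure P assigns to each event A (a subset of
# Omega) the sum of the probabilities of the outcomes in A.
def P(event):
    return sum(p[w] for w in event)

print(P({"a", "b"}))  # prints 1/2
```

Here the event \(\{a,b\}\) receives probability \(\frac{1}{6}+\frac{1}{3}=\frac{1}{2}\text{,}\) as the measure requires.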
Write the probability function for the probability space \(\Omega=\{a,b,c\}\) where outcome \(b\) is twice as likely as outcome \(a\text{,}\) and outcome \(c\) is three times as likely as outcome \(b\text{.}\)
Let \((\Omega,P)\) be a finite probability model. The complement \(A^c=\Omega\setminus A\) of an event \(A\) is also called the opposite of \(A\text{.}\) Disjoint events \(A,B\) (that is, \(A\cap B=\emptyset\)) are also called mutually exclusive. Given events \(U,V\) with \(P(V)\neq 0\text{,}\) the conditional probability of \(U\) given \(V\text{,}\) denoted \(P(U|V)\text{,}\) is defined to be
\begin{equation*}
P(U|V) = \frac{P(U\cap V)}{P(V)}.
\end{equation*}
Events \(U,V\) are called independent if \(P(U\cap V)=P(U)P(V)\text{;}\) otherwise they are called dependent.
Let \(\Omega=\{a,b,c\}\text{,}\) let \(U=\{a,b\}\text{,}\) and let \(V=\{b,c\}\text{.}\) Give an example of a probability function on \(\Omega\) for which \(U,V\) are independent. Give an example of a probability function on \(\Omega\) for which \(U,V\) are dependent.
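One way to check candidate answers to this kind of question is to test the independence condition \(P(U\cap V)=P(U)P(V)\) directly. The two probability functions below are illustrative candidates (many other choices also work):

```python
from fractions import Fraction

U, V = {"a", "b"}, {"b", "c"}

def P(p, event):
    """Probability of an event under probability function p."""
    return sum(p[w] for w in event)

def independent(p):
    """U, V are independent iff P(U ∩ V) = P(U) * P(V)."""
    return P(p, U & V) == P(p, U) * P(p, V)

# Candidate 1: giving outcome a probability 0 forces P(V) = 1, so
# P(U ∩ V) = P(U) = P(U) * P(V): independent.
p_indep = {"a": Fraction(0), "b": Fraction(1, 2), "c": Fraction(1, 2)}

# Candidate 2: the uniform function, where P(U)P(V) = 4/9 but
# P(U ∩ V) = P({b}) = 1/3: dependent.
p_dep = {w: Fraction(1, 3) for w in "abc"}

print(independent(p_indep), independent(p_dep))  # True False
```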
Let \((\Omega,P)\) be a finite probability model with probability function \(p\text{,}\) and let \(n\) be a positive integer. Let \(p_{\Omega^n}\colon \Omega^n\to [0,1]\) be defined by
\begin{equation*}
p_{\Omega^n}(\omega_1,\omega_2,\ldots,\omega_n) = p(\omega_1)p(\omega_2)\cdots p(\omega_n).
\end{equation*}
It is easy to check (see Checkpoint 2.9 below) that \(p_{\Omega^n}\) is a probability function. Let \(P_{\Omega^n}\) denote the corresponding probability measure. The probability model \((\Omega^n,P_{\Omega^n})\) is called the space of (random) samples of size \(n\) taken from the space \(\Omega\text{.}\) The space \(\Omega^n\) models the outcomes that are obtained by \(n\) repetitions of the chance process that produces outcomes in \(\Omega\text{.}\)
Show that \(p_{\Omega^n}\) is a probability function. Let \(E,F\) be events in \(\Omega\text{,}\) and let \(E',F'\) be the events \(E'=\{\vec{\omega}\colon\omega_j\in E\}\) and \(F'=\{\vec{\omega}\colon\omega_k\in F\}\text{.}\) Show that \(E',F'\) are independent in \(\Omega^n\) if \(j\neq k\text{.}\)
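Both claims can be verified by brute force on a small illustrative space; the probability function \(p\) and the events \(E,F\) below are arbitrary choices made for the sake of the example.

```python
from fractions import Fraction
from itertools import product

# An arbitrary probability function on Omega = {a, b, c}.
p = {"a": Fraction(1, 6), "b": Fraction(1, 3), "c": Fraction(1, 2)}
n = 3

def p_n(sample):
    """p_{Omega^n}: the product p(w1) * p(w2) * ... * p(wn)."""
    prob = Fraction(1)
    for w in sample:
        prob *= p[w]
    return prob

samples = list(product(p, repeat=n))  # all of Omega^n

# p_{Omega^n} is a probability function: its values sum to 1.
assert sum(p_n(s) for s in samples) == 1

# Events constrained in different coordinates are independent:
# E' = {samples with w1 in E}, F' = {samples with w2 in F}.
E, F = {"a"}, {"b", "c"}
measure = lambda ev: sum(p_n(s) for s in ev)
E1 = [s for s in samples if s[0] in E]
F2 = [s for s in samples if s[1] in F]
both = [s for s in samples if s[0] in E and s[1] in F]
assert measure(both) == measure(E1) * measure(F2)
print("checks pass")
```

The second assertion is the independence condition \(P_{\Omega^n}(E'\cap F')=P_{\Omega^n}(E')\,P_{\Omega^n}(F')\) with \(j=1\text{,}\) \(k=2\text{.}\)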
Let \((\Omega,P)\) be a finite probability space with \(N=|\Omega|\text{,}\) and with the constant probability function \(p(\omega)=\frac{1}{N}\) for all \(\omega\in \Omega\text{.}\) Let \(\Omega^{n\ast}\) denote the set of all one-to-one sequences in \(\Omega\) of size \(n\text{.}\) (We use the nonstandard notation \(\Omega^{n\ast}\text{,}\) rather than the standard notation \(P_n(\Omega)\text{,}\) to denote the set of permutations of \(n\) elements of \(\Omega\text{,}\) in order to avoid confusion with probability measures, which are also denoted using the capital letter \(P\text{.}\)) An element \((\omega_1,\omega_2,\ldots,\omega_n)\) of \(\Omega^{n\ast}\) is called a simple random sample of size \(n\) taken from the probability space \(\Omega\text{.}\) Let \(p_{\Omega^{n\ast}}\) be given by the constant function
\begin{equation*}
p_{\Omega^{n\ast}}(\omega_1,\omega_2,\ldots,\omega_n)=\frac{1}{N(N-1)(N-2)\cdots(N-n+1)}
\end{equation*}
for all \((\omega_1,\omega_2,\ldots,\omega_n)\in \Omega^{n\ast}\text{.}\) It is easy to check (see Checkpoint 2.10 below) that \(p_{\Omega^{n\ast}}\) is a probability function. Let \(P_{\Omega^{n\ast}}\) denote the corresponding probability measure. The probability model \((\Omega^{n\ast},P_{\Omega^{n\ast}})\) is called the space of simple random samples of size \(n\) taken from the space \(\Omega\) (or the space of samples of size \(n\) taken from \(\Omega\) without replacement).
Show that \(p_{\Omega^{n\ast}}\) is a probability function. Let \(E\) be an event in \(\Omega\text{,}\) and let \(E',F'\) be the events \(E'=\{\vec{\omega}\colon\omega_j\in E\}\) and \(F'=\{\vec{\omega}\colon\omega_k\in E\}\text{.}\) Show that \(E',F'\) are dependent in \(\Omega^{n\ast}\) if \(j\neq k\text{.}\)
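The dependence claim can be illustrated numerically; the choices \(N=3\text{,}\) \(n=2\text{,}\) and \(E=\{a\}\) below are arbitrary small values picked for the example.

```python
from fractions import Fraction
from itertools import permutations

Omega = ["a", "b", "c"]
N, n = len(Omega), 2

# Omega^{n*}: all one-to-one sequences (samples without replacement).
samples = list(permutations(Omega, n))

# The constant value 1/(N(N-1)); here 1/(3*2) = 1/6.
c = Fraction(1, N * (N - 1))
assert len(samples) * c == 1  # p_{Omega^{n*}} sums to 1

# With E = {a}, the events E' (first draw in E) and F' (second draw
# in E) are dependent: without replacement they cannot both occur.
E = {"a"}
P = lambda cond: sum(c for s in samples if cond(s))
pE1 = P(lambda s: s[0] in E)
pF2 = P(lambda s: s[1] in E)
pBoth = P(lambda s: s[0] in E and s[1] in E)
print(pBoth, pE1 * pF2)  # 0 1/9 -- unequal, so E', F' are dependent
```

Note that the claim as stated implicitly needs \(\emptyset \subsetneq E \subsetneq \Omega\text{;}\) if \(E=\emptyset\) or \(E=\Omega\text{,}\) the events \(E',F'\) are trivially independent.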
Bayes' Rule can be viewed as a practical method for finding conditional probabilities. For events \(A,B\) in a probability space \(\Omega\text{,}\) where both probabilities \(P(A),P(B)\) are nonzero, the definition of conditional probability (2.3) gives us two ways to write \(P(A\cap B)\text{,}\) namely
\begin{equation*}
P(A\cap B) = P(A|B)P(B) = P(B|A)P(A).
\end{equation*}
Dividing through by \(P(A)\) yields Bayes' Rule:
\begin{equation*}
P(B|A) = \frac{P(A|B)P(B)}{P(A)}.
\end{equation*}
A use case for this version of Bayes' Rule is a problem in which you know the probabilities in the expression on the right, and you wish to find the probability on the left. In effect, this allows you to use \(P(A|B)\) to find \(P(B|A)\text{.}\)
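A quick numerical sketch of this use of Bayes' Rule; the values of \(P(B)\text{,}\) \(P(A|B)\text{,}\) and \(P(A)\) below are made-up illustrative numbers, not data from the text.

```python
from fractions import Fraction

# Hypothetical known quantities: P(B), P(A|B), and P(A).
P_B = Fraction(1, 100)
P_A_given_B = Fraction(9, 10)
P_A = Fraction(6, 100)

# Bayes' Rule: P(B|A) = P(A|B) P(B) / P(A).
P_B_given_A = P_A_given_B * P_B / P_A
print(P_B_given_A)  # prints 3/20
```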
In a more nuanced form of Bayes' Rule, the event \(B\) is one of a collection of events \(B_1,B_2,\ldots,B_r\) that form a partition of \(\Omega\text{,}\) say, \(B=B_k\text{.}\) In this case, expanding \(P(A)\) over the partition gives \(P(A)=\sum_{i=1}^{r} P(A|B_i)P(B_i)\text{,}\) and we have
\begin{equation*}
P(B_k|A) = \frac{P(A|B_k)P(B_k)}{\sum_{i=1}^{r} P(A|B_i)P(B_i)}.
\end{equation*}
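The partition form can be sketched numerically as well; the three-event partition and all probabilities below are made-up illustrations.

```python
from fractions import Fraction

# Hypothetical partition B1, B2, B3 of Omega: the probabilities
# P(Bi) and the conditionals P(A|Bi) are illustrative numbers.
P_B = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]
P_A_given_B = [Fraction(1, 10), Fraction(1, 2), Fraction(1, 4)]
assert sum(P_B) == 1  # the Bi partition Omega

# Total probability: P(A) = sum over i of P(A|Bi) P(Bi).
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))

# Partition form of Bayes' Rule for B = B2 (index k = 1):
k = 1
P_Bk_given_A = P_A_given_B[k] * P_B[k] / P_A
print(P_A, P_Bk_given_A)  # prints 1/4 3/5
```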