We have proved a theorem often called the “Weak Law of Large Numbers.” Most people’s intuition, and our computer simulations, suggest that if we toss a coin repeatedly, the proportion of heads will really approach 1/2; that is, if S_n is the number of heads in the first n tosses, then we should have
A_n=\frac{S_n}{n}\rightarrow \frac{1}{2}
as n →∞. Of course, we cannot be sure of this since we are not able to toss the coin an infinite number of times, and, if we could, the coin could come up heads every time. However, the “Strong Law of Large Numbers,” proved in more advanced courses, states that
P\left(\frac{S_n}{n} \rightarrow \frac{1}{2} \right)=1 .
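As a quick illustration of the kind of simulation referred to above, here is a minimal Python sketch (an illustration only, not one of the programs used elsewhere in the text) that tosses a fair coin n times and reports the proportion of heads S_n/n for increasing values of n:

import random

def proportion_of_heads(n):
    # Toss a fair coin n times (1 = heads, 0 = tails) and return S_n / n.
    heads = sum(random.randint(0, 1) for _ in range(n))
    return heads / n

# As n grows, the reported proportions tend to settle near 1/2.
for n in (10, 100, 1000, 10000, 100000):
    print(n, proportion_of_heads(n))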
Describe a sample space Ω that would make it possible for us to talk about the event
E=\left\{\omega :\frac{S_n}{n}\rightarrow \frac{1}{2} \right\} .
Take as Ω the set of all infinite sequences of 0’s and 1’s, with 1’s indicating heads and 0’s indicating tails. We cannot determine a probability distribution by simply assigning equal weights to all infinite sequences, since these weights would all have to be 0. Instead, we assign probabilities to finite sequences of tosses in the usual way, and then the probability of an event that depends on an infinite sequence is obtained as a limit of the probabilities of events that depend only on finitely many tosses. (See Exercise 28 of Chapter 1, Section 1.2.)
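As an illustration of this limiting procedure, consider the event mentioned above that the coin comes up heads on every toss. This event depends on the entire infinite sequence, but its probability is determined by the finite-sequence weights:

P\left(\text{every toss is heads}\right)=\lim_{n\rightarrow \infty} P\left(\text{first } n \text{ tosses are heads}\right)=\lim_{n\rightarrow \infty}\frac{1}{2^n}=0 .

So this outcome, while possible, has probability 0, which is consistent with the Strong Law statement above.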