Today I'm starting a new series on Probability. I will be using Durrett and writing out random (heh) thoughts in hopefully some sort of order. Today I want to talk about random variables and the law of large numbers.
Here's something fun: the word random does not appear anywhere in the definition of a random variable. Instead, we first fix a sample space Ω containing all possible outcomes (together with a collection of events and a probability measure, this is a probability space). A measurable function X from Ω to the real numbers R is called a random variable. I know, this surprised me when I first read it too. The randomness is not in the function X, but in which outcome actually occurs. The function itself is completely deterministic. Once ω is fixed, X(ω) is fixed.
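Here's a tiny sketch of that idea in Python. The sample space, the function X, and the random draw are all assumptions for illustration (a finite Ω of three coin flips, with X counting heads), not anything from Durrett's construction:

```python
import random

# A tiny sample space Omega: all outcomes of three coin flips.
# (Illustrative only; for infinitely many flips, Omega would be {0,1}^N.)
omega_space = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

# X is a plain deterministic function on Omega: the number of heads.
def X(omega):
    return sum(omega)

# The "randomness" lives only in which outcome nature picks...
omega = random.choice(omega_space)

# ...because once omega is fixed, X(omega) is just a number.
print(X(omega))
assert X((1, 0, 1)) == 2  # X evaluated at a fixed outcome never changes
```

Notice that `X` contains no randomness at all; you could call it on any tuple you like.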
This is the first conceptual shift: a random variable is not random. It is a way of reading randomness.
When we talk about repeated experiments, say flipping a coin, we often imagine doing it one flip at a time. But mathematically, it's cleaner to imagine that all the flips are already encoded in a single outcome ω, and that every event about the sequence is already written out. Now consider the average of the first n flips, (X₁(ω) + ⋯ + Xₙ(ω))/n. This is also just a function: it takes the entire sequence ω and returns a number. The Law of Large Numbers says something crazy: for "almost every" ω, this average converges to the expected value.
It is not saying that randomness settles down. It is saying that most sequences in Ω already have this regularity built into them. The convergence is not happening in time. It is a statement about the structure of the space of all possible outcomes. The phrase almost every means: except for a set of outcomes with probability zero. So there do exist bizarre sequences where the average doesn’t converge. They’re not forbidden. They’re just so rare that probability assigns them zero weight.
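We can peek at one such typical ω with a quick simulation. This is only a sketch under the assumption of fair coin flips (so the expected value is 1/2); the seed and sample size are arbitrary choices, and of course a finite simulation can only hint at what the theorem guarantees:

```python
import random

random.seed(0)  # fix one particular "omega" so the run is reproducible

# One long outcome omega: a sequence of n fair coin flips.
n = 100_000
flips = [random.randint(0, 1) for _ in range(n)]

# The running average A_n(omega) = (X_1 + ... + X_n) / n is itself
# just a function of the whole sequence.
running_sum = 0
averages = []
for i, x in enumerate(flips, start=1):
    running_sum += x
    averages.append(running_sum / i)

# For this particular omega, the average has already settled near 1/2.
print(averages[-1])
```

The early entries of `averages` swing around wildly; it's the tail that hugs 1/2, which is exactly the "convergence is about the structure of ω, not about time" point above.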
This is the second conceptual shift: probability is not about what is possible, but about what is typical.
Something to think about before you sleep tonight!