# Richard Feynman’s “Probability” – Dillon Carroll

Christiaan Huygens (1629–95) was a Dutch mathematician and physicist. He is credited with publishing the first book on probability, *De ratiociniis in ludo aleae* (“On Reasoning in Games of Chance”).

In chapter 6 of The Feynman Lectures, Richard Feynman once again demonstrates why he was a master teacher. This chapter is on probability, and Feynman essentially encapsulates an entire field in eleven pages. Much detail is left out, true, but in one broad stroke, he paints an accurate picture of the essentials of probability and statistics. Starting with the very basics, Feynman logically guides the reader in a gradually increasing arc of complexity until ending the chapter by explaining one of the most famous concepts of physics, the Heisenberg uncertainty principle. That he can guide a reader from zero understanding of statistics to a decent understanding of such a fundamental component of quantum mechanics is a testament to the efficacy of The Feynman Lectures.

As the chapter begins, Feynman starts on a small scale. He defines chance and goes on to common coin-toss problems. I was worried at this point, thinking the entire chapter would cover such basics. All of us—even the humanities majors out there—have had it drilled into our minds that a coin has a 50% probability of landing on heads and a 50% probability of landing on tails, and that the more times you flip the coin, the closer the outcomes will approach those probabilities. The reader must follow his arguments, however, because the importance of the first section lies in how he presents the information. For example, Feynman puts special emphasis on the fact that probability represents only a best guess—a model—of what will happen. If it’s a good guess, experiments will validate our model of reality. When Feynman presents 100 trials of 30 coin tosses apiece, the total comes to 1493 heads out of 3000 tosses, or 0.498 of the total. Thus, to say that each side of a coin has a 50% chance of landing face-up is a good approximation of what really happens.

There has been nothing too difficult yet. Feynman uses that discussion of coin tosses to segue into the more complicated topic of the binomial distribution. The binomial distribution is a way to model the probability of certain outcomes—in, say, a coin toss—relative to the number of trials (the actual coin tosses). The binomial distribution is something I had never seen explained well before this chapter of Feynman’s lectures. An interesting illustration of “the random walk” is a drunkard’s walk home. In this scenario an inebriated man is attempting to go from point A to point B. However, on each successive step he has an equal probability of going in any direction. The question becomes: what is the most likely distance he will travel before arriving home (point B)? On a graph, A would be the origin (0,0) and B an arbitrary point (x, y).

Section 6.3, titled “The random walk,” begins to be a bit more difficult. Here the idea is to look at the average deviation from the expected outcome. To continue with the easy-to-understand coin example, we want to know how many more or fewer heads we’ve flipped compared with the 50% we should expect. This gives the reader the idea of the deviation. If the deviation from the expectation is much greater than √N (where N is the number of trials), then we can assume that either our expectations or our instruments were flawed (maybe they were “trick” coins, for example). Explaining where that √N comes from is not easy, and here some of Feynman’s logic is hard to follow. Still, the logic is important to understand, as it provides the basis for error calculation, for the Brownian motion of atoms and molecules, and (of course) for the next section of the chapter.

The material here flows seamlessly into a discussion of probability density. Instead of a coin, we have an atom. Not only can this atom fly off in either direction, but the length of each “step” it takes can vary, though it is usually around a certain value (Feynman uses an average step length of one for simplicity). Because of that, it is impossible to say exactly where the atom will be. Instead, the idea of a probability density allows us to know, after a certain number of steps, the probability that the atom will be near a certain location. Feynman ties the probability density in with integral calculus, which will make the material easier to understand for those who know what integrals are. Although we cannot concretely describe how atoms are structured, by gathering statistical data we can approximate their shape based on where their components are likely to be.

From there it is one final, short hop to Feynman’s last subject: the uncertainty principle. Knowing what a probability distribution is, a statement like Δx·Δv ≥ h/(4πm) begins to make sense. We understand that the Δ in Δx refers to how accurate that measurement is, and we appreciate the statistics that underlies such an austere equation. Of course, Feynman couldn’t discuss Heisenberg’s uncertainty principle without explaining what the principle itself means. It describes, in Feynman’s words, “an inherent fuzziness that must exist in any attempt to describe nature.” If this is true, then the study of probability becomes the way in which one must understand the workings of the universe. In this chapter Feynman has given the reader a wonderful tool to begin understanding these workings. What strikes me most about his exposition on probability is how well it flows from one topic to the next, each building on the previous components; it is both remarkably readable and instructive as a result.
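To get a feel for the numbers in Δx·Δv ≥ h/(4πm), we can plug in an electron confined to roughly an atomic diameter. The sketch below uses standard values of Planck’s constant and the electron mass; the choice of Δx = 1 ångström is my own illustrative assumption, not a figure from the chapter:

```python
import math

# Plug numbers into  Delta_x * Delta_v >= h / (4*pi*m):
# if an electron is confined to Delta_x ~ 1 angstrom (roughly an
# atomic diameter), what is its minimum velocity uncertainty?
h = 6.626e-34            # Planck's constant, J*s
m_electron = 9.109e-31   # electron mass, kg
delta_x = 1e-10          # confinement scale, m (assumed)

delta_v_min = h / (4 * math.pi * m_electron * delta_x)
print(f"minimum velocity spread: {delta_v_min:.2e} m/s")
```

The answer is on the order of 10^5 m/s—far from negligible—which is exactly the “inherent fuzziness” Feynman is pointing at: confining a particle in space forces a spread in its velocity.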