
An Introduction to Measure-Theoretic Probability

George G. Roussas
Academic Press
Reviewed by Michael Berg
The first thing you might notice about the book under review is its captivating cover: none other than Constantin Carathéodory stares at you from the shadows, from a formal portrait, with his name signed right above the title. Of course, the next thing you notice is that there appears to be a redundancy in this title: isn’t probability by its very nature measure theoretic? In fact, when I was a student in the dark ages, by no means in the area of probability, my friends (in number theory and algebra, generally) tended to refer to probability as “measure theory with total measure one,” and I fully bought into that perspective. Well, with age comes wisdom, or at least that’s how things are supposed to be, and I guess I can now make the more mature judgment that the flippancy of the foregoing characterization testifies to the goofiness of youth: the flavor of probability is unique, and even if its fundamental tools are analytic, its character is uniquely its own. For one thing, such mainstays as the central limit theorem and the law of large numbers are idiosyncratic probabilistic beasts, and in the present text these huge creatures piggyback: Roussas introduces the weak law of large numbers in an exercise on p. 92, goes at it formally on p. 223 ff. in order to get to the central limit theorem, proceeds on p. 239 to a serious coverage of the broad central limit problem (Ch. 12: the centered case; Ch. 13: (what else?) the non-centered case), and then, on p. 289 (Ch. 14), begins to develop material in preparation for the strong law of large numbers (Theorem 7 of Ch. 14, with credit given to (who else?) Kolmogorov).
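As a quick numerical aside (mine, not the book’s), the two theorems the review keeps circling back to are easy to watch in action. Here is a minimal Python sketch: the sample mean of independent Bernoulli draws settling onto the true mean (the weak law), and standardized sums of uniforms looking like a standard normal (the central limit theorem). The sample sizes and distributions are arbitrary choices for illustration.

```python
import random
import statistics

random.seed(0)

# Weak law of large numbers: the sample mean of n i.i.d.
# Bernoulli(0.5) draws concentrates around the true mean 0.5.
def sample_mean(n):
    return sum(random.random() < 0.5 for _ in range(n)) / n

print(abs(sample_mean(100) - 0.5))      # typically a few hundredths
print(abs(sample_mean(100_000) - 0.5))  # typically well under 0.01

# Central limit theorem: a sum of n i.i.d. Uniform(0, 1) draws
# (mean 1/2, variance 1/12), centered and scaled, is close to N(0, 1).
def standardized_sum(n):
    s = sum(random.random() for _ in range(n))
    return (s - n / 2) / (n / 12) ** 0.5

z = [standardized_sum(1_000) for _ in range(2_000)]
# Empirical mean and standard deviation should be near 0 and 1.
print(round(statistics.mean(z), 2), round(statistics.stdev(z), 2))
```

Nothing here proves anything, of course; it just shows why the two results feel so different in flavor: one is about a single sequence settling down, the other about the shape of fluctuations.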

What this attests to is the fact that Roussas employs a holistic pedagogical style in developing this extensive subject, and this is borne out by his remarks in the book’s preface: “it is an excursion in measure-theoretic probability with the objective of introducing the student to the basic tools in measure theory and probability as they are commonly used in statistics, mathematics, and other areas employing this moderately advanced mathematical machinery.” It is also correspondingly borne out by the arrangement of the book’s chapters. Chapter 1 is foundational: measure theory and random variables, culminating in a theorem providing for pointwise approximation of random variables by sequences of simple random variables. Chapter 2 amplifies measure theory, playing measures against point functions. Chapter 3 hits the important theme of sequences of random variables. Chapter 4 gives us integration of a random variable relative to a measure. Chapter 5 … Well, lest I just present you with a parroting of the preface, please consult the book’s table of contents, below. On the other hand, I do want to stress that the to-be-expected named theorems (Lebesgue’s monotone and dominated convergence theorems, Fatou, Fubini, etc.) appear in Chapter 5, while Chapter 6 gives us the major inequalities (Hölder, Cauchy-Schwarz, Minkowski, for example), and in Chapter 7, we get Radon-Nikodym (and more). The suggestion is that in a way, my friends were right, at least up to a point: so much of the material in this part of the book is concerned with what I encountered once upon a time in Royden and Rudin.

But the game changes measurably (Hah! My friends were wrong!) around Chapter 8, when Roussas hits us with distribution functions, and then, in the next chapters, goes on to develop, first, “the concept of conditional expectation of a random variable in an abstract setting: the concept of a conditional probability then follows as a special case”; second, he hits independence of random variables (“the expectation of the product of independent random variables is the product of the individual expectations” — how often don’t we hammer away at this in the profoundly more prosaic setting of introductory prob/stat courses?!). There is a lot more to follow, of course: there are still a number of chapters to go. But we’ve actually come to the stage where the earlier remarks about the law(s) of large numbers, the central limit theorem, and the looming figure of Kolmogorov, casting a very long shadow, apply. Suffice it to say that we’ve reached some very serious probability theory and Roussas makes no bones about it: he describes his last chapter as “a very brief introduction to an important class of discrete parameter stochastic processes — stationary and ergodic or nonergodic processes — with a view toward proving the … Ergodic Theorem.” And he adds, apropos, that “one direction of the Kolmogorov Strong Law of Large Numbers is a special case of the Ergodic Theorem …” With the advent of the Ergodic Theorem, a pun is de rigueur: we are truly stirring in deep waters!
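That hammered-on identity — for independent X and Y, E[XY] = E[X]·E[Y] — is also pleasant to sanity-check numerically. A minimal Monte Carlo sketch (my own toy example, with arbitrarily chosen uniform distributions, not anything from the book):

```python
import random

random.seed(1)

# For independent X ~ Uniform(0, 1) and Y ~ Uniform(0, 2),
# independence gives E[XY] = E[X] * E[Y] = 0.5 * 1.0 = 0.5.
n = 200_000
xs = [random.random() for _ in range(n)]        # draws of X
ys = [2 * random.random() for _ in range(n)]    # independent draws of Y

e_xy = sum(x * y for x, y in zip(xs, ys)) / n   # empirical E[XY]
e_x = sum(xs) / n                               # empirical E[X]
e_y = sum(ys) / n                               # empirical E[Y]

print(e_xy, e_x * e_y)  # both should land close to 0.5
```

Dependence breaks this immediately: replace `ys` with, say, `[2 * x for x in xs]` and the two printed numbers part company.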

Well, it’s clear, then, that this book is serious probability theory, covering a number of bases: It is a very thorough discussion of many of the pillars of the subject, showing in particular how “measure theory with total measure one” is just the tip of the iceberg, and it is a voyage from a relatively prosaic point of departure (a beginning graduate student might book passage) to some pretty sophisticated destinations replete with visits to Kolmogorov. It’s quite a book.

And, by the way, I particularly like the book’s layout and printing: the pages have very broad margins asking to be filled with notes, doodles, sketches of solutions, queries, questions, and perhaps even an occasional emotional outburst. Furthermore, with Roussas having based the book on a lot of experience in the classroom (Wisconsin at Madison, UC Davis), there are abundant problem sets to be tackled, and, as an additional bit of sound pedagogy, the chapters start with a short description of what’s coming up. This makes the book exceptionally easy to read. To digest it, however, will require hard work — as it should — and that’s what the wide margins are for, as is coffee, of course, to indulge once again in a stirring ergodic pun that’s been percolating for a while. 

Michael Berg is Professor of Mathematics at Loyola Marymount University in Los Angeles, CA.

1. Certain Classes of Sets, Measurability, Pointwise Approximation
2. Definition and Construction of a Measure and Its Basic Properties
3. Some Modes of Convergence of a Sequence of Random Variables and Their Relationships
4. The Integral of a Random Variable and Its Basic Properties
5. Standard Convergence Theorems, The Fubini Theorem
6. Standard Moment and Probability Inequalities, Convergence in the r-th Mean and Its Implications
7. The Hahn-Jordan Decomposition Theorem, The Lebesgue Decomposition Theorem, and The Radon-Nikodym Theorem
8. Distribution Functions and Their Basic Properties, Helly-Bray Type Results
9. Conditional Expectation and Conditional Probability, and Related Properties and Results
10. Independence
11. Topics from the Theory of Characteristic Functions
12. The Central Limit Problem: The Centered Case
13. The Central Limit Problem: The Noncentered Case
14. Topics from Sequences of Independent Random Variables
15. Topics from Ergodic Theory