
December 8, 2019

Random Permutations (Part 8)

Posted by John Baez

Last time we saw a strong connection between random permutations and Poisson distributions. Thinking about this pulled me into questions about partitions and the moments of Poisson distributions. These may be a bit of a distraction — but maybe not, since every permutation gives a partition, namely the partition into cycles.

Here’s a good puzzle to start with:

Puzzle 11. Suppose raindrops are falling on your head, randomly and independently, at an average rate of one per minute. What’s the average of the cube of the number of raindrops that fall on your head in one minute?

To get started we need a bit of probability theory. Suppose random events are happening at an average rate of one per minute, in a nice independent way so that the history so far doesn’t affect the rate of new events now. Let $P(k,t)$ be the probability that $k$ events have happened between time $0$ and time $t$. Then:

$$\frac{d}{d t} P(k,t) = P(k-1,t) - P(k,t)$$

since if $k-1$ events have happened, an extra event will bring the total to $k$, while if $k$ events have already happened, an extra one will make the total stop being $k$.

Now, it takes a bit of work to figure out $P(k,t)$ given this equation and the obvious initial conditions

$$P(k,0) = \left\{\begin{array}{ccl} 1 &\mathrm{if}& k = 0 \\ 0 &\mathrm{if}& k \gt 0 \end{array}\right.$$

But if someone tells you the answer, you just need elementary calculus to check that it’s right:

$$P(k,t) = \frac{t^k}{k!} e^{-t}$$

So, let’s be happy someone has already figured out the answer!
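If you’d rather let a computer do the checking, here is a minimal SymPy sketch (not part of the original argument) verifying that this formula satisfies the equation and the initial conditions above:

```python
# Symbolic check that P(k,t) = t^k e^{-t} / k! satisfies
# dP(k,t)/dt = P(k-1,t) - P(k,t) and the initial conditions.
import sympy as sp

t = sp.symbols('t', nonnegative=True)

def P(k):
    return t**k * sp.exp(-t) / sp.factorial(k)

for k in range(1, 6):
    lhs = sp.diff(P(k), t)
    rhs = P(k - 1) - P(k)
    assert sp.simplify(lhs - rhs) == 0

# For k = 0 there is no k-1 term: dP(0,t)/dt = -P(0,t), with P(0,0) = 1.
assert sp.simplify(sp.diff(P(0), t) + P(0)) == 0
assert P(0).subs(t, 0) == 1
print("P(k,t) = t^k e^{-t} / k! checks out")
```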

Back to our raindrop problem. Here the time $t$ is 1 minute, so the probability of $k$ raindrops landing on your head in this time is

$$\frac{1}{k!} e^{-1}$$

Sanity check: these probabilities sum to 1.

We’re trying to figure out the expected value of the cube of the number of raindrops that land on your head in a minute. We now know this is

$$\frac{1}{e} \sum_{k=0}^\infty \frac{k^3}{k!}$$

So this is the sum we need to do. But being mathematicians, let’s generalize. Let’s figure out the expected value of the $n$th power of the number of raindrops that land on your head in a minute:

$$\frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!}$$

The sum here looks like a mutant version of

$$\sum_{k=0}^\infty \frac{n^k}{k!} = e^n$$

But I mention this only so you don’t get confused and take a wrong turn!

To compute

$$\frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!}$$

it’s nice to express the powers $k^n$ in terms of ‘falling powers’:

$$k^{\underline{n}} = k (k-1) (k-2) \cdots (k - n + 1)$$

The power $k^n$ counts the number of functions from $n$ to $k$, where now I’m identifying a natural number with a set having that number of elements. Similarly, the falling power $k^{\underline{n}}$ counts the number of injections from $n$ to $k$. Why? With an injection, the first element of $n$ has $k$ choices of where to go, but the next one has only $k-1$, and so on.
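If you like checking such counting arguments by brute force, here is a small Python sketch (not from the post) that enumerates injections directly, an injection being the same thing as an ordered choice of distinct elements:

```python
# Brute-force check that the falling power counts injections:
# the number of injections from an n-element set into a k-element set
# is k (k-1) ... (k-n+1).
from itertools import permutations
from math import prod

def falling_power(k, n):
    # empty product = 1 when n = 0
    return prod(k - j for j in range(n))

def count_injections(n, k):
    # each length-n tuple of distinct elements of {0,...,k-1} is an injection n -> k
    return sum(1 for _ in permutations(range(k), n))

for n in range(5):
    for k in range(7):
        assert count_injections(n, k) == falling_power(k, n)
print("falling powers count injections")
```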

There’s a nice formula expressing ordinary powers as linear combinations of falling powers:

$$k^n = \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \; k^{\underline{i}}$$

Here $\left\{ \begin{array}{c} n \\ i \end{array} \right\}$ is just the number of partitions of $n$ into $i$ parts.

This formula has a wonderfully slick proof. Any function $f \colon n \to k$ factors as a surjection followed by an injection:

$$n \stackrel{f}{\to} \mathrm{im} f \hookrightarrow k$$

and the surjection gives a partition of $n$ into $i$ parts, where $i$ is the number of points in the image $\mathrm{im} f$. Conversely a partition of $n$ into $i$ parts together with an injection $i \hookrightarrow k$ determines a function $f \colon n \to k$. So, the number of functions $f \colon n \to k$ is the sum over $0 \le i \le n$ of the number of partitions of $n$ into $i$ parts times the number of injections $i \hookrightarrow k$. And that’s all the formula says!
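Here is a quick numerical check of the identity (again not from the post), computing the Stirling numbers of the second kind by their usual recurrence rather than by the epi-mono argument:

```python
# Check k^n = sum_i S(n,i) * k^(underline i), where S(n,i) counts
# partitions of an n-element set into i nonempty parts.
from functools import lru_cache
from math import prod

def falling_power(k, i):
    return prod(k - j for j in range(i))

@lru_cache(maxsize=None)
def stirling2(n, i):
    # standard recurrence: S(n,i) = i*S(n-1,i) + S(n-1,i-1)
    if n == 0:
        return 1 if i == 0 else 0
    if i == 0:
        return 0
    return i * stirling2(n - 1, i) + stirling2(n - 1, i - 1)

for n in range(7):
    for k in range(7):
        assert k**n == sum(stirling2(n, i) * falling_power(k, i)
                           for i in range(k + 1))
print("k^n = sum_i S(n,i) k^(underline i)")
```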

So, the expected value of the $n$th power of the number of raindrops that fall on your head is

$$\frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!} \; = \; \frac{1}{e} \sum_{k=0}^\infty \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!}$$

But in fact we can simplify this! Since there are no injections $i \hookrightarrow k$ when $i \gt k$, the falling power vanishes in this case, so our sum is just

$$\frac{1}{e} \sum_{i,k=0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!}$$

And now we can do the sum over $k$ first. Since

$$\sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} \; = \; \sum_{k = 0}^\infty \frac{k(k-1)\cdots (k-i+1)}{k(k-1) \cdots 1} \; = \; \sum_{k = i}^\infty \frac{1}{(k-i)!} \; = \; \sum_{j = 0}^\infty \frac{1}{j!} \; = \; e$$

we get

$$\frac{1}{e} \sum_{i,k=0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!} \; = \; \sum_{i = 0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\}$$

which is the total number of partitions of an $n$-element set!

In our original question we were interested in $n = 3$. There are 5 partitions of a 3-element set:

$$\{\{1,2,3\}\}, \quad \{\{1,2\}, \{3\}\}, \quad \{\{2,3\}, \{1\}\}, \quad \{\{3,1\}, \{2\}\}, \quad \{\{1\}, \{2\}, \{3\}\}$$

So, the expected value of the cube of the number of raindrops that land on your head in a minute is 5.
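If you want to double-check that answer numerically (this check isn’t in the post), you can truncate the sum from the puzzle and also just simulate the raindrops:

```python
# Two numerical checks that the third moment of a Poisson(1) variable is 5.
import math
import random

# 1. Truncate the exact sum (1/e) * sum_k k^3 / k!  -- it converges very fast.
truncated = sum(k**3 / math.factorial(k) for k in range(40)) / math.e
print(truncated)                      # 5.000000000...

# 2. Simulate raindrops arriving at rate 1 per minute: add up exponential
#    waiting times until one minute has passed, then average the cube.
def raindrops_in_one_minute(rate=1.0):
    count, elapsed = 0, 0.0
    while True:
        elapsed += random.expovariate(rate)
        if elapsed > 1.0:
            return count
        count += 1

samples = [raindrops_in_one_minute() for _ in range(200_000)]
print(sum(x**3 for x in samples) / len(samples))   # close to 5
```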

Now that we’re done, let’s reflect on what happened, and introduce more jargon so you can read more about the concepts involved.

The function

$$P(k,t) = \frac{t^k}{k!} e^{-t}$$

is called a Poisson distribution. It’s the answer whenever you want to know the probability of a given number $k$ of events occurring in some amount of time $t$ if these events occur with a constant average rate, set here to $1$ for convenience, independently of the time since the last event.

The parameter $t$ is the mean of the Poisson distribution. We worked out the $n$th moment of the Poisson distribution with mean $1$. In other words, we worked out the expected value of $k^n$ in this case:

$$\sum_{k = 0}^\infty k^n P(k,1) \; = \; \frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!}$$

We got the number of partitions of an $n$-element set, which is called the Bell number $B_n$, after the author of a fantasy novel about a planet where all mathematics is done by men.

This result is called Dobiński’s formula:

$$\frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!} = B_n$$
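Here is a small numerical spot-check of Dobiński’s formula (not in the post), with the Bell numbers computed via the Bell triangle:

```python
# Compare the truncated Dobinski sum (1/e) * sum_k k^n / k! with the
# Bell numbers B_n, computed here using the Bell triangle.
import math

def bell_numbers(n_max):
    """Return [B_0, B_1, ..., B_{n_max}]."""
    bells = [1]
    row = [1]
    for _ in range(n_max):
        new_row = [row[-1]]
        for entry in row:
            new_row.append(new_row[-1] + entry)
        bells.append(new_row[0])
        row = new_row
    return bells

def dobinski(n, terms=60):
    return sum(k**n / math.factorial(k) for k in range(terms)) / math.e

for n, b in enumerate(bell_numbers(8)):
    print(n, b, round(dobinski(n), 6))   # e.g.  3  5  5.0
```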

Dobiński published a paper about it in 1877, but it seems to me that he only proved some special cases.

The number of partitions of an $n$-element set into $i$ parts, $\left\{ \begin{array}{c} n \\ i \end{array} \right\}$, is called a Stirling number of the second kind. Given the definitions it’s obvious that

$$\sum_{i = 0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} = B_n$$

(We can stop the sum at $i = n$ without changing its value.)

The identity

$$k^n = \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, k^{\underline{i}}$$

is more interesting, since it connects Stirling numbers of the second kind to falling powers, which are also known as falling factorials, falling sequential products, and various other things. The proof I gave comes from a paper of Gian-Carlo Rota’s.

Category theorists will say it relies on the uniqueness of the epi-mono factorization of a function between finite sets. Rota doesn’t use these terms, but he talks about the ‘kernel’ of a function, which is his name for the epi in this factorization.

In this paper Rota goes ahead and proves Dobiński’s formula, but he does not explicitly discuss its connection to the Poisson distribution. Someone on Wikipedia translated his argument into the language of probability, which I like. Unfortunately the Wikipedia article merely reduces Dobiński’s formula to the identity

$$\frac{1}{e} \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = 1$$

and stops there, remarking that the $i$th factorial moment of the Poisson distribution of mean $1$ equals $1$. Factorial moments are like ordinary moments but with the power $k^i$ replaced by the falling power $k^{\underline{i}}$.
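For a quick numerical look at that claim before seeing why it’s true, here is a short sketch (not part of the post or the Wikipedia argument):

```python
# Check that every factorial moment of the Poisson distribution with mean 1
# equals 1:  (1/e) * sum_k k^(underline i) / k! = 1  for each i.
import math
from math import prod

def falling_power(k, i):
    return prod(k - j for j in range(i))

for i in range(8):
    moment = sum(falling_power(k, i) / math.factorial(k)
                 for k in range(60)) / math.e
    print(i, round(moment, 10))   # prints 1.0 for every i
```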

I was frustrated to see the argument fizzle out before it ended. Then I saw why

$$\sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = e$$

It follows from some facts about species and groupoid cardinality.

If you have a species, namely a functor $F$ from the groupoid of finite sets to the category of sets, you can think of it as describing a type of structure you can put on finite sets. For each natural number $k$, $F(k)$ is the set of structures of this type that you can put on a $k$-element set. The species has a generating function

$$f(x) = \sum_{k = 0}^\infty \frac{|F(k)|}{k!} x^k$$

and the number $f(1)$ has a nice meaning if the sum involved is well-defined. The category of elements $\int F$ has finite sets equipped with the given structure as objects and bijections preserving the structure as morphisms. This category $\int F$ is a groupoid, and the number $f(1)$ is just the groupoid cardinality of $\int F$.

Now fix a finite set $i$. There’s a species $F$ that assigns to each finite set $k$ the set of all injections $i \hookrightarrow k$. The generating function of this species is

$$f(x) = \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} x^k$$

Thus,

$$f(1) = \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!}$$

is the cardinality of the groupoid $\int F$ whose objects are finite sets equipped with an inclusion of the set $i$. But this groupoid is equivalent to the groupoid of finite sets and bijections! Why? Because the morphisms in $\int F$ are bijections that preserve the inclusion of the set $i$. Thus, points in the image of this inclusion have no ‘freedom of movement’, and our morphism boils down to an arbitrary bijection between the remaining points.

Since $\int F$ is equivalent to the groupoid of finite sets, it must have the same cardinality, namely

$$\sum_{k = 0}^\infty \frac{1}{k!} = e$$

So, we get

$$\sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = e$$
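One can also check this termwise with a computer. The sketch below (not from the post) compares the generating function of the injection species with $x^i e^x$, a closed form that follows from the reindexing $k = i + j$ mentioned just below; setting $x = 1$ then gives $e$:

```python
# Symbolic check that sum_k (k^(underline i)/k!) x^k agrees, coefficient by
# coefficient, with x^i * e^x.  (The closed form x^i e^x is not stated in the
# post; it comes from reindexing k = i + j.)  Setting x = 1 gives f(1) = e.
import sympy as sp

x = sp.symbols('x')
N = 12   # compare coefficients up to degree N - 1

for i in range(5):
    lhs = sum(sp.ff(k, i) / sp.factorial(k) * x**k for k in range(N))
    rhs = sum(x**(i + j) / sp.factorial(j) for j in range(N - i))  # truncated x^i e^x
    assert sp.expand(lhs - rhs) == 0
print("f(x) = x^i e^x, so f(1) = e")
```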

Later, Akiva Weinberger pointed out to me that this identity can be shown by a simple reindexing:

$$\sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} \; = \; \sum_{k = i}^\infty \frac{1}{(k-i)!} \; = \; \sum_{j = 0}^\infty \frac{1}{j!} \; = \; e$$

This is precisely a decategorified version of my observation that $\int F$ is equivalent to the groupoid of finite sets.

By the way, Dobiński’s formula, written this way:

$$\sum_{k = 0}^\infty \frac{k^n}{k!} = e B_n$$

says the cardinality of the groupoid of finite sets equipped with a function from a fixed finite set is $e$ times the number of partitions of that set. After a while this starts to seem obvious, if you keep thinking about epi-mono factorizations.

So groupoid cardinality establishes a nice connection between Poisson distributions, partitions, and injections. I doubt I’ve gotten to the bottom of it, since I haven’t really seen why the combinatorics is connected to Poisson processes. I’m also not sure how much all this will help me with permutations. But it was fun.

Posted at December 8, 2019 9:55 PM UTC


10 Comments & 0 Trackbacks

Re: Random Permutations (Part 8)

While probably all your regular audience here knows, it may be appropriate to mention for the benefit of passersby that Bell didn’t realise, or at least didn’t acknowledge, that his work was a fantasy. (I like the description, though!)

Posted by: L Spice on December 9, 2019 2:36 PM

Re: Random Permutations (Part 8)

Sure, I don’t mind clarifying for anyone who doesn’t know it’s a joke. Eric Temple Bell’s famous Men of Mathematics was helpful to me as a boy in the deep pre-internet era, trying to imagine what mathematicians were like. Only later did I realize how over-romanticized and inaccurate it was, and the sexist title came to disgust me. One edition even has “Men” printed in bright red.

The mathematician Clifford Truesdell wrote:

…[Bell] was admired for his science fiction and his Men of Mathematics. I was shocked when, just a few years later, Walter Pitts told me the latter was nothing but a string of Hollywood scenarios; my own subsequent study of the sources has shown me that Pitts was right, and I now find the contents of that still popular book to be little more than rehashes enlivened by nasty gossip and banal or indecent fancy.

Rebecca Goldstein described the mathematician characters in Men of Mathematics thus:

Some of these people sounded as if they had to be changelings, non-human visitors from some other sphere, with powers so prodigious they burst the boundaries of developmental psychology, lisping out profundities while other children were playing with their toes.

Posted by: John Baez on December 9, 2019 6:33 PM

Re: Random Permutations (Part 8)

Typo: the $n$ after the first equation should be a $k$.

Posted by: Blake Stacey on December 9, 2019 4:33 PM

Re: Random Permutations (Part 8)

Thanks! Fixed! Indecision about what letter to use for what, many revisions…

Posted by: John Baez on December 9, 2019 6:33 PM

Re: Random Permutations (Part 8)

Ignacio Larrosa Cañestro points out the charming similarity between Dobiński’s formula

$$B_n = \frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!}$$

and the generating function for partitions

$$e^{e^x - 1} = \sum_{n = 0}^\infty \frac{B_n}{n!} x^n$$

Combining them, we get this:

$$e^{e^x} = \sum_{n,k = 0}^\infty \frac{k^n}{n! k!} x^n$$

What’s the best way to think about all this?

Posted by: John Baez on December 9, 2019 6:44 PM

Re: Random Permutations (Part 8)

I seem to be very typo prone today, but your last identity is just:

$$e^{e^x} = \sum_{k=0}^{\infty} \frac{e^{k x}}{k!} = \sum_{k=0}^{\infty} \frac{1}{k!} \sum_{n=0}^{\infty} \frac{(k x)^n}{n!}$$

Dividing both sides by $e$ and extracting the coefficient of $x^n$ does seem like a slick proof of Dobiński’s formula.
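For anyone who wants to see that coefficient extraction done mechanically, here is a small SymPy sketch (an illustration added here, not part of the comment above), using SymPy’s built-in Bell numbers for comparison:

```python
# Extract the coefficient of x^n in e^(e^x); multiplying by n! and dividing
# by e should give the Bell number B_n, which is Dobinski's formula.
import sympy as sp

x = sp.symbols('x')
N = 8
poly = sp.series(sp.exp(sp.exp(x)), x, 0, N).removeO()

for n in range(N):
    coeff = poly.coeff(x, n)              # [x^n] e^(e^x), a multiple of e
    assert sp.simplify(coeff * sp.factorial(n) / sp.E - sp.bell(n)) == 0
    print(n, sp.bell(n))
```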

Posted by: David E Speyer on December 9, 2019 9:18 PM

Re: Random Permutations (Part 8)

That’s great! I’m kind of glad I didn’t think of it, since the longwinded proof I wrote down here taught me a lot of interesting things. But this is clearly the most efficient way to prove Dobiński’s formula. I’ve added it to Wikipedia. Now I should find a reference, so I don’t get accused of “original work”.

I also fixed up the proof that was already there.

Posted by: John Baez on December 10, 2019 6:54 AM

Re: Random Permutations (Part 8)

I need information about G. Dobiński’s biography, and a photo of him.

Posted by: Olga Belova on December 10, 2019 1:52 PM

Re: Random Permutations (Part 8)

I have no information about G. Dobiński. I’m curious to know who he was. If anyone has information (or a photo), please post a link to it here!

Posted by: John Baez on December 11, 2019 1:48 AM

Re: Random Permutations (Part 8)

For the combinatorics using partitions these might be helpful:

https://rolandspeicher.com/lectures/fpt1819/s2/ (see Thm 2.16 there)
https://rolandspeicher.files.wordpress.com/2019/02/nica-speicher.pdf (no classical case, just as a reference)

I just flew over the post and didn’t think too much about it, but I guess this might help.

Posted by: Alex on December 11, 2019 10:13 AM
