
December 18, 2019

Random Permutations (Part 10)

Posted by John Baez

Groupoids generalize sets in an obvious way: while elements of a set can only be the same or different, objects of a groupoid can be ‘the same’ — that is, isomorphic — in more than one way.

Less obviously, groupoids also let us generalize the concept of cardinality. While the cardinality of a finite set is a natural number, the cardinality of a finite groupoid can be any nonnegative rational number!

This suggests that probabilities in combinatorics could secretly be cardinalities of groupoids. Indeed, Poisson distributions naturally show up this way. For a good self-contained introduction, see:

Now I want to explain how groupoid cardinality can be used to prove some important (and well-known) facts about random permutations.

A while back we saw that many questions about random permutations could be answered using just this one fact: the expected number of cycles of length $k$ in a random permutation of an $n$-element set is $1/k$, at least if $1 \le k \le n$.
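This fact is easy to check by brute force for small $n$. Here's a quick sketch in Python (my own check, not from the original series) that averages the number of $k$-cycles over all of $S_n$ using exact rational arithmetic:

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

def cycle_lengths(perm):
    """Cycle lengths of a permutation of {0,...,n-1} given in one-line notation."""
    n = len(perm)
    seen = [False] * n
    lengths = []
    for i in range(n):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j = perm[j]
                length += 1
            lengths.append(length)
    return lengths

def expected_k_cycles(n, k):
    """Exact average number of k-cycles over all permutations in S_n."""
    total = sum(cycle_lengths(p).count(k) for p in permutations(range(n)))
    return Fraction(total, factorial(n))
```

For instance, `expected_k_cycles(6, 3)` gives `Fraction(1, 3)`, as the fact above predicts.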

Last time we proved a stronger fact. To state this, we treat the number of cycles of length $k$ in a permutation of an $n$-element set as a random variable, say $C_{n,k}$. Then we have this: the first $n$ moments of $C_{n,k}$ match those of a Poisson distribution with mean $1/k$.

We actually did more: we described all the moments! We did this by working with ‘factorial moments’. The $p$th factorial moment of a random variable $x$ is the mean of

x^{\underline{p}} = x(x-1)(x-2) \, \cdots \, (x-p+1)

Then we have this: the first $n$ factorial moments of $C_{n,k}$ match those of the Poisson distribution of mean $1/k$, while the rest vanish. Or if you prefer to leave Poisson distributions out of it:

E(C_{n,k}^{\underline{p}}) = \left\{ \begin{array}{ccl} \displaystyle{\frac{1}{k^p}} & \mathrm{if} & p k \le n \\ \\ 0 & \mathrm{if} & p k \gt n \end{array} \right.

The proof of this was a fairly easy calculation, but it involved some miraculous cancellations and offered no compelling story about why the answer turns out to be so simple.
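The formula itself can be verified directly in small cases. The following Python sketch (my own check, not part of the argument here) computes the $p$th factorial moment of $C_{n,k}$ by brute force over $S_n$:

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

def count_k_cycles(perm, k):
    """Number of k-cycles of a permutation of {0,...,n-1} in one-line notation."""
    n = len(perm)
    seen = [False] * n
    count = 0
    for i in range(n):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j = perm[j]
                length += 1
            if length == k:
                count += 1
    return count

def falling_power(x, p):
    """The falling factorial x(x-1)...(x-p+1)."""
    result = 1
    for i in range(p):
        result *= x - i
    return result

def factorial_moment(n, k, p):
    """Exact value of E(C_{n,k}^{underline p}) for the uniform distribution on S_n."""
    total = sum(falling_power(count_k_cycles(s, k), p)
                for s in permutations(range(n)))
    return Fraction(total, factorial(n))
```

One finds, for example, `factorial_moment(5, 2, 2) == Fraction(1, 4)` (here $p k = 4 \le 5$), while `factorial_moment(5, 2, 3) == 0` (here $p k = 6 \gt 5$).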

I now think I’ve found that story. It doesn’t connect to Poisson distributions — not yet, anyway. But it does explain the meaning of the answer $1/k^p$. It’s the cardinality of a groupoid.

A quick sketch

The idea is quite simple when you know the underlying concepts. We’ll show that $E(C_{n,k}^{\underline{p}})$ is the cardinality of a certain groupoid. When $n \lt k p$ this groupoid has no objects, so its cardinality is zero. Otherwise, this groupoid is equivalent to a product of a groupoid with cardinality $1$ and $p$ copies of the group $\mathbb{Z}/k$, thought of as a one-object groupoid. Thus

E(C_{n,k}^{\underline{p}}) = \frac{1}{k^p}

The nice thing is that each copy of $\mathbb{Z}/k$ is naturally associated to a cycle of length $k$ in our random permutation. It’s no coincidence that $\mathbb{Z}/k$ is a ‘cyclic group’.

What is this groupoid, whose cardinality is $E(C_{n,k}^{\underline{p}})$? Its objects are $n$-element sets equipped with a permutation that in turn is equipped with an ordered $p$-tuple of disjoint cycles of length $k$. There’s an obvious concept of isomorphism between two such gadgets, so we get a groupoid of them. And then we just think about this groupoid a bit, and everything follows.

But let me slow down and start from the beginning.

Review of groupoid cardinality

To compute the cardinality of a groupoid $X$, we go through its isomorphism classes of objects, pick one representative $x$ of each class, count the morphisms from $x$ to itself, take the reciprocal… and add up all these reciprocals. In a formula:

|X| = \sum_{[x] \in \underline{X}} \frac{1}{|\mathrm{Aut}(x)|}

Here $\underline{X}$ is the set of isomorphism classes of $X$, and $\mathrm{Aut}(x)$ is the automorphism group of $x$, an arbitrary representative of some isomorphism class.

To make sure the sum converges, we can play it safe and assume our groupoid is finite. We don’t need to do this: for example, for the groupoid of finite sets the sum converges to $e$. But today I’ll only work with groupoids that are either finite or equivalent to one that’s finite. (Equivalent groupoids clearly have the same cardinality.)
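The groupoid of finite sets is a nice test case for the formula above: its isomorphism classes are indexed by $n = 0, 1, 2, \dots$, a representative $n$-element set has automorphism group $S_n$ of order $n!$, and the partial sums of $\sum_n 1/n!$ converge rapidly to $e$. A quick numerical check of my own, just to illustrate:

```python
from fractions import Fraction
from math import factorial

# Cardinality of the groupoid of finite sets, truncated at sets of size < N:
# one isomorphism class per n, each contributing 1/|S_n| = 1/n!.
N = 20
partial_sum = sum(Fraction(1, factorial(n)) for n in range(N))
print(float(partial_sum))  # very close to e = 2.71828...
```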

One way to get a groupoid is by taking a set $X$ with a group $G$ acting on it. Then we get a groupoid $X/\!/G$ where an object is an element $x \in X$ and a morphism from $x$ to $x'$ is a group element $g \in G$ with $g x = x'$. We compose these morphisms in the obvious way: by multiplication in $G$.

We call $X/\!/G$ the weak quotient of $X$ by $G$. In the ordinary quotient $X/G$, we declare $x$ and $x'$ equal when there’s a group element $g$ with $g x = x'$. In the weak quotient, we merely put in an isomorphism between $x$ and $x'$. Thus, the weak quotient retains more information.

Here’s one of my favorite equations:

|X/\!/G| = |X|/|G|

It says that to understand division, we can use groupoids. If you take a 5-element set and mod out by a 2-element group, you don’t get a set with $5/2$ elements. But if you take the weak quotient, you get a groupoid with cardinality $5/2$.
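Here's that example made concrete, in a sketch of my own (the choice of action is arbitrary: $\mathbb{Z}/2$ acting on a 5-element set by swapping two elements). In $X/\!/G$ the automorphism group of $x$ is its stabilizer, and isomorphism classes are orbits, so the cardinality formula gives $|X|/|G| = 5/2$:

```python
from fractions import Fraction

# Z/2 acting on X = {0,1,2,3,4} by swapping 0 and 1 (an arbitrary choice).
X = range(5)
identity = {x: x for x in X}
swap = {0: 1, 1: 0, 2: 2, 3: 3, 4: 4}
G = [identity, swap]

def orbit(x):
    return frozenset(g[x] for g in G)

def stabilizer_size(x):
    # In X//G, Aut(x) is the stabilizer of x under the action.
    return sum(1 for g in G if g[x] == x)

representatives = {orbit(x): x for x in X}  # one object per isomorphism class
cardinality = sum(Fraction(1, stabilizer_size(x)) for x in representatives.values())
print(cardinality)  # 5/2
```

The orbit $\{0,1\}$ contributes $1$ (trivial stabilizer) and each fixed point contributes $1/2$, for a total of $5/2$.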

Groupoid cardinality and probability theory

If you have a group $G$, how can you find a set that it acts on? One way is to use $G$ itself. $G$ acts on itself in three interesting ways: by left translations, by right translations, and by conjugation. No matter which method you choose, you’ll get

|G/\!/G| = 1

If $G$ acts on itself by left or right translation, every object of $G/\!/G$ is isomorphic to every other object in a unique way. Thus, $G/\!/G$ is equivalent to the groupoid with one object and one morphism. Groupoid cardinality is invariant under equivalence. So it’s no surprise that $|G/\!/G| = 1$ in this case.

It’s more interesting when $G$ acts on itself by conjugation. Then the weak quotient $G/\!/G$ has many different isomorphism classes, one for each conjugacy class of $G$. The fact that $|G/\!/G|$ equals $1$ then says this: if we run through the conjugacy classes of $G$, pick a representative of each one, take the reciprocal of the cardinality of its centralizer, and sum up these reciprocals, we get $1$.
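This identity follows from the orbit–stabilizer theorem (each class contributes $|\mathrm{class}|/|G|$, and the classes partition $G$), and it's easy to confirm by brute force, e.g. for $G = S_4$. A sketch of my own:

```python
from fractions import Fraction
from itertools import permutations

n = 4
Sn = list(permutations(range(n)))

def compose(a, b):
    return tuple(a[b[i]] for i in range(n))

def inverse(a):
    inv = [0] * n
    for i, v in enumerate(a):
        inv[v] = i
    return tuple(inv)

def conjugacy_class(s):
    return frozenset(compose(compose(g, s), inverse(g)) for g in Sn)

def centralizer_size(s):
    return sum(1 for g in Sn if compose(g, s) == compose(s, g))

reps = {conjugacy_class(s): s for s in Sn}  # one representative per class
total = sum(Fraction(1, centralizer_size(s)) for s in reps.values())
print(total)  # 1
```

$S_4$ has five conjugacy classes (one per partition of $4$), and the five reciprocals of centralizer orders do indeed sum to $1$.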

But the fun really starts when we consider a subset $X \subseteq G$ that’s invariant under conjugation. The probability that a random element of $G$ lies in $X$ is $|X|/|G|$. But this equals $|X/\!/G|$. So, we can compute probabilities using groupoid cardinality!

Here, of course, we are treating $G$ as a probability measure space where each element has measure $1/|G|$.

More generally, suppose $\pi \colon X \to G$ is a map of $G$-sets, where $G$ acts on itself by conjugation. Then we get a function $f \colon G \to \mathbb{N}$ that counts the points of $X$ that map to each element of $G$:

f(g) = |\pi^{-1}(g)|

We can think of $f$ as a random variable, since it’s a function on a probability measure space, namely $G$. The expected value of $f$ is then the groupoid cardinality of $X/\!/G$. Here’s why:

\begin{array}{ccl} E(f) &=& \displaystyle{ \frac{1}{|G|} \sum_{g \in G} f(g) } \\ \\ &=& \displaystyle{ \frac{1}{|G|} \sum_{g \in G} |\pi^{-1}(g)| } \\ \\ &=& |X|/|G| \\ \\ &=& |X/\!/G| \end{array}

This may seem like an impractical way to compute expected values. But it really works wonders in some cases, namely when $X/\!/G$ is equivalent to a groupoid whose cardinality we can compute by other means.

And that’s what we’ll see now.

The calculation

The key fact is this: since the function

C_{n,k} \colon S_n \to \mathbb{N}

tells us the number of cycles of length $k$ in a permutation of an $n$-element set,

C_{n,k}^{\underline{p}} = C_{n,k} (C_{n,k}-1)(C_{n,k}-2) \, \cdots \, (C_{n,k}-p+1)

tells us the number of ordered $p$-tuples of distinct cycles of this sort. And since distinct cycles are the same as disjoint cycles, we can replace the word ‘distinct’ with ‘disjoint’ here — and we’ll do that from now on.

The expected value $E(C_{n,k}^{\underline{p}})$ is thus the number of ordered $p$-tuples of disjoint cycles of length $k$ of a permutation, summed over all permutations, divided by $n!$.

But this is the cardinality of a groupoid!

Let $X$ be the set whose elements are of this form: a permutation $\sigma \in S_n$ together with an ordered $p$-tuple $(c_1, \dots, c_p)$ of disjoint cycles of length $k$. The group $S_n$ acts on $X$ in the obvious way: it acts by conjugation on itself, and the cycles go along for the ride. There’s a map

\pi \colon X \to S_n

that forgets the cycles and only keeps the permutation $\sigma$. This is clearly a map of $S_n$-sets. So, we can use the results described in the last section!

We get a function $f \colon S_n \to \mathbb{N}$ that counts the points of $X$ that map to each element of $S_n$:

f(\sigma) = |\pi^{-1}(\sigma)|

And this function is just what I’m calling $C_{n,k}^{\underline{p}}$. For each permutation $\sigma$, it gives the number of ordered $p$-tuples of disjoint cycles of length $k$.

It follows that the expected value we’re trying to compute is a groupoid cardinality:

E(C_{n,k}^{\underline{p}}) = |X/\!/S_n|

But this groupoid $X/\!/S_n$ is equivalent to one I find more charismatic: the groupoid $\mathrm{Cyc}(n,k,p)$ of $n$-element sets equipped with a permutation that is itself equipped with a chosen $p$-tuple of disjoint cycles of length $k$. The morphisms in this groupoid are the obvious ones. (At this point I’m speeding up, assuming you can think like a category theorist.)

In other words, instead of fixing our ‘favorite’ $n$-element set, which is sort of ridiculous, we use all of them.

Now, it’s easy to see that an $n$-element set equipped with this structure is the same as a $p$-tuple of cyclically ordered $k$-element sets together with the ‘remainder’: a set with $n - p k$ elements equipped with an arbitrary permutation. Thus, we have

\mathrm{Cyc}(n,k,p) \simeq \mathrm{Cyc}(k)^p \times \mathrm{Perm}(n-k p)

Here $\mathrm{Cyc}(k)$ is the groupoid of cyclically ordered $k$-element sets, and $\mathrm{Perm}(n - k p)$ is the groupoid of sets with $n - k p$ elements equipped with a permutation.

But $\mathrm{Cyc}(k)$ is equivalent to the groupoid $B(\mathbb{Z}/k)$, which has one object and the cyclic group $\mathbb{Z}/k$ as automorphisms. That’s because all objects of $\mathrm{Cyc}(k)$ are isomorphic, and the symmetry group of a cyclically ordered $k$-element set is $\mathbb{Z}/k$. An easy calculation shows that

|B(\mathbb{Z}/k)| = \frac{1}{k}
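The claim that a cyclically ordered $k$-element set has exactly $k$ symmetries can be checked directly: model the cyclic order on $\{0,\dots,k-1\}$ by its successor map, and count the bijections commuting with it. A small sketch of my own:

```python
from itertools import permutations

def cyclic_automorphism_count(k):
    """Count bijections of {0,...,k-1} preserving the cyclic order i -> i+1 mod k."""
    succ = {i: (i + 1) % k for i in range(k)}
    return sum(
        1
        for p in permutations(range(k))
        if all(p[succ[i]] == succ[p[i]] for i in range(k))
    )
```

For each small $k$ this returns $k$, confirming that the automorphism group is $\mathbb{Z}/k$ and hence that the groupoid cardinality is $1/k$.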

On the other hand,

\mathrm{Perm}(n-k p) \simeq S_{n-k p} /\!/S_{n - k p}

where the symmetric group is acting on itself by conjugation. So, we get

|\mathrm{Perm}(n-k p)| = 1

We thus get

\begin{array}{ccl} E(C_{n,k}^{\underline{p}}) &=& |\mathrm{Cyc}(n,k,p)| \\ \\ &=& |\mathrm{Cyc}(k)^p \times \mathrm{Perm}(n-k p)| \\ \\ &=& |\mathrm{Cyc}(k)|^p \cdot | \mathrm{Perm}(n-k p)| \\ \\ &=& (1/k)^p \cdot 1 \\ \\ &=& \displaystyle{ \frac{1}{k^p} } \end{array}



Experts in category theory will note that I’m vacillating between using various large groupoids, like $\mathrm{Perm}(n-k p)$, and various finite groupoids that are equivalent to these, like $S_{n-k p} /\!/S_{n - k p}$. In my own mind, I barely notice the distinction — I flit back and forth depending on what’s more convenient at the moment. But when I try to explain what I’m doing, I feel some need to talk about this, and then everything becomes more lengthy. It’s possible I could have given a more efficient treatment, but I wanted to show you a way of thinking.

I also introduced a bit more notation than strictly necessary. When you think of a group $G$ as a one-object groupoid, people often call that groupoid $B G$. The reason is that if we turn this groupoid into a space, it becomes a space known as the classifying space of $G$, also denoted $B G$. But the groupoid $B G$ is also a weak quotient:

B G \cong 1/\!/G

where $G$ is acting on the $1$-element set in the only possible way. So, it should be no surprise that

|B G| = \frac{1}{|G|}

Also, we have

\mathrm{Cyc}(k) \cong \mathrm{Cyc}(k,k,1)

In other words, a $k$-element cyclically ordered set is the same as a $k$-element set equipped with a permutation having one chosen cycle of length $k$.

Also, we have

\mathrm{Perm}(m) \cong \mathrm{Cyc}(m,k,0)

In other words, an $m$-element set equipped with a permutation is the same as an $m$-element set equipped with a permutation having no chosen cycles of length $k$. (It can have lots of cycles of length $k$; we’re just not choosing any of them.)

So, I didn’t need to talk about $B(\mathbb{Z}/k)$ or $\mathrm{Cyc}(k)$ or $\mathrm{Perm}(n - k p)$. But I thought that reducing notation to the absolute minimum would make things harder to understand, not easier.

Posted at December 18, 2019 5:43 PM UTC
