
May 31, 2011

Möbius Inversion for Categories

Posted by Tom Leinster

Every small category A has a classifying space BA. It’s a topological space, and it can be constructed as the geometric realization of the nerve of A.

The homotopy type of BA depends on every aspect of A: its underlying directed graph, its composition, and its identities. But the Euler characteristic of BA does not: it only depends on the underlying graph. What’s going on?

Last week I gave a talk about this at a conference in Louvain-la-Neuve, Belgium. I’ll also be doing a shorter version at the major annual category theory meeting in Vancouver. So, I’d be grateful for any comments, suggestions or thoughts. The slides are here.

(If the title slides mystify you, what you need to know is that the conference marked the appointment of Ross Street to a visiting chaire.)

As you’ll have surmised from the title, my answer was all about Möbius inversion. Let me explain…
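To make the claim concrete: Leinster’s Euler characteristic of a finite category can be computed from the zeta matrix ζ(a,b) = |Hom(a,b)|, which records only the underlying graph’s hom-set sizes; when ζ is invertible, χ is the sum of the entries of ζ⁻¹ (its Möbius inverse). The following is a minimal numerical sketch of that recipe, not the content of the talk:

```python
import numpy as np

def euler_characteristic(zeta):
    """Euler characteristic of a finite category, following Leinster:
    zeta[i][j] = |Hom(a_i, a_j)|; chi is the sum of the entries of
    zeta^{-1} (the Moebius matrix), when zeta is invertible."""
    mu = np.linalg.inv(np.array(zeta, dtype=float))
    return mu.sum()

# The interval category (two objects, one non-identity arrow):
# its classifying space is contractible, so chi should be 1.
interval = [[1, 1],
            [0, 1]]

# The discrete category on two objects: BA is two points, chi = 2.
discrete2 = [[1, 0],
             [0, 1]]

print(euler_characteristic(interval))   # 1.0
print(euler_characteristic(discrete2))  # 2.0
```

Note that only the hom-set counts enter the computation, which is exactly why χ(BA) can’t see composition or identities.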

Posted at 9:03 AM UTC | Permalink | Followups (106)

May 18, 2011

An Operadic Introduction to Entropy

Posted by Tom Leinster

Bless British trains. A two-hour delay with nothing to occupy me provided the perfect opportunity to figure out the relationships between some of the results that John, Tobias and I have come up with recently.

This post is intended to serve two purposes. First, for those who have been following, it ties together several of the theorems we’ve found. I hope it will make them seem less like a bunch of distinct but vaguely similar results and more like a coherent whole.

Second, for those who haven’t been following, it’s an introduction to entropy, and our recent results on it, for the categorically minded.

I will not assume:

  • that you know anything whatsoever about entropy
  • that you’ve read any other posts on this blog.

I will assume:

  • that you know the definitions of operad and algebra for an operad
  • that you know a bit of category theory, including roughly what category theorists mean when they use the word lax.
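For orientation before the operadic machinery: the relevant operad has, as its n-ary operations, the probability distributions on n elements, with composition given by feeding a distribution into each slot of another. Shannon entropy interacts with this composition via the classical chain rule, H(w∘(p¹,…,pⁿ)) = H(w) + Σᵢ wᵢ H(pⁱ). A sketch verifying that identity numerically (assuming, as seems safe, that this is the grouping law the post’s results revolve around):

```python
import math

def H(p):
    """Shannon entropy (natural log) of a finite distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def compose(w, ps):
    """Operadic composition: feed distribution ps[i] into the i-th slot
    of w, giving a distribution on the disjoint union of the outcomes."""
    return [w_i * x for w_i, p in zip(w, ps) for x in p]

w = [0.5, 0.5]
ps = [[0.25, 0.75], [1/3, 1/3, 1/3]]
lhs = H(compose(w, ps))
rhs = H(w) + sum(w_i * H(p) for w_i, p in zip(w, ps))
print(abs(lhs - rhs) < 1e-12)  # the chain rule holds exactly
```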

Posted at 5:27 AM UTC | Permalink | Followups (142)

May 12, 2011

∞-Dijkgraaf-Witten Theory

Posted by Urs Schreiber

In another thread Tom Leinster would like to learn what a sigma-model in quantum field theory is. Here I want to explain this in a way that will make perfect sense to Tom, and hopefully even intrigue him. To do so, I will look at the σ-model called Dijkgraaf-Witten theory, which stands out as a canonical toy example: it exhibits precisely those aspects of σ-models that mean something to abstractly-minded people like Tom, and suppresses precisely all the other technical details.

In order to keep things interesting despite the toy model nature of the example, I shall use this occasion to talk about the general class of models that deserve to be called ∞-Dijkgraaf-Witten models. The next step in this hierarchy of models is what is called the Crane-Yetter model.

I’ll post the content here iteratively in small digestible bits in the comment section below.
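For readers wanting a concrete anchor before the abstract story begins: untwisted Dijkgraaf–Witten theory with finite gauge group G assigns to a closed surface Σ the number |Hom(π₁(Σ), G)|/|G|. For the torus this counts commuting pairs in G divided by |G|, which by Burnside’s lemma equals the number of conjugacy classes of G. A brute-force sketch for G = S₃ (my choice of example, not taken from the post):

```python
from itertools import permutations, product

def mult(a, b):
    # Compose permutations given as tuples: (a*b)(i) = a[b[i]].
    return tuple(a[b[i]] for i in range(len(b)))

G = list(permutations(range(3)))  # the symmetric group S_3

# Z(torus) = |{(a, b) in G x G : ab = ba}| / |G|
commuting_pairs = sum(1 for a, b in product(G, G) if mult(a, b) == mult(b, a))
Z_torus = commuting_pairs // len(G)
print(Z_torus)  # 3, the number of conjugacy classes of S_3
```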


Posted at 9:55 PM UTC | Permalink | Followups (77)

Making Things Simpler by Duality

Posted by David Corfield

If you have a moment in your busy day, try out the game of Jam. It shouldn’t take you long to realise that there’s something rather familiar about it. There’s a chance you may lose as you learn to play the game, but when you come to know its secret, this becomes very unlikely. As the title to this post suggests, the secret involves duality.

Something puzzling me at the moment is what to make of the way posing a problem in a dual situation may make it easier to resolve. As Vafa says in Geometric Physics:

Dualities very often transform a difficult problem in one setup to an easy problem in the other. In some sense very often the very act of ‘solving’ a non-trivial problem is finding the right ‘dual’ viewpoint.

Now my question is whether we should count these terms ‘easy’ and ‘difficult’ as concerning our psychology, or whether they concern aspects of the situation themselves.

Posted at 10:32 AM UTC | Permalink | Followups (19)

May 11, 2011

QVEST, Summer 2011

Posted by Urs Schreiber

This May we have the third QVEST meeting:

  • Quarterly seminar on topology and geometry

    Utrecht University

    May 20, 2011

    seminar website

The speakers are

If you would like to attend and have any questions, please drop me a message.

The previous QVEST meeting was here.

Posted at 11:44 AM UTC | Permalink | Followups (28)

May 10, 2011

Entropies vs. Means

Posted by Tom Leinster

If you’ve been watching this blog, you can’t help but have noticed the current entropy-fest. It started on John’s blog Azimuth, generated a lengthy new page on John’s patch of the nLab, and led to first this entry at the Café, then this one.

Things have got pretty unruly. It’s good unruliness, in the same way that brainstorming is good, but in this post I want to do something to help those of us who are confused by the sheer mass of concepts, questions and results—which I suspect is all of us.

I want to describe a particular aspect of the geography of this landscape of ideas. Specifically, I’ll describe some connections between the concepts of entropy and mean.

This can be thought of as background to the project of finding categorical characterizations of entropy. There will be almost no category theory in this post.
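One such connection, stated concretely: the Rényi entropy of order q is minus the logarithm of the power mean of order q − 1 of the probabilities, weighted by the probabilities themselves. A numerical sketch of that standard identity (assuming it is among the connections meant here):

```python
import math

def power_mean(t, xs, ws):
    """Weighted power mean of order t (t != 0)."""
    return sum(w * x**t for x, w in zip(xs, ws)) ** (1.0 / t)

def renyi(p, q):
    """Rényi entropy of order q (q != 1), natural log."""
    return math.log(sum(x**q for x in p)) / (1 - q)

p = [0.5, 0.25, 0.25]
q = 2.0
lhs = renyi(p, q)
rhs = -math.log(power_mean(q - 1, p, p))
print(abs(lhs - rhs) < 1e-12)  # S_q(p) = -log M_{q-1}(p; p)
```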

Posted at 7:10 AM UTC | Permalink | Followups (9)

May 8, 2011

Category-Theoretic Characterizations of Entropy II

Posted by John Baez

We’re having a lively conversation about different notions related to entropy, and how to understand these using category theory. I don’t want to stop this conversation, but I have something long to say, which seems like a good excuse for a new blog post.

A while back, Tom Leinster took Faddeev’s characterization of Shannon entropy and gave it a slick formulation in terms of operads. I’ve been trying to understand this slick formulation a bit better, and I’ve made a little progress.

Tom’s idea revolves around a special law obeyed by the Shannon entropy. I’ll remind you of that law, or tell you about it for the first time in case you didn’t catch it yet. I’ll explain it in a very lowbrow way, as opposed to Tom’s beautiful highbrow way. Then, I’ll derive it from a simpler law obeyed by something called the ‘partition function’.

I like this, because I’ve got a bit of physics in my blood, and physicists know that the partition function is a very important concept. But I hope you’ll see: you don’t need to know physics to understand or enjoy this stuff!
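As a concrete reminder of what the partition function buys you: for a Boltzmann distribution pᵢ = e^(−βEᵢ)/Z with Z(β) = Σᵢ e^(−βEᵢ), the Shannon entropy can be recovered as S = ln Z + β⟨E⟩. A quick sketch of that identity (the energies and β below are made-up numbers, just for illustration):

```python
import math

E = [0.0, 1.0, 2.0]   # made-up energy levels
beta = 1.3            # made-up inverse temperature

Z = sum(math.exp(-beta * e) for e in E)           # partition function
p = [math.exp(-beta * e) / Z for e in E]          # Boltzmann distribution

S_direct = -sum(x * math.log(x) for x in p)       # Shannon entropy directly
mean_E = sum(x * e for x, e in zip(p, E))
S_from_Z = math.log(Z) + beta * mean_E            # via the partition function

print(abs(S_direct - S_from_Z) < 1e-12)
```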

Posted at 12:57 PM UTC | Permalink | Followups (131)

May 1, 2011

Category-Theoretic Characterizations of Entropy

Posted by John Baez

Tobias Fritz, Tom Leinster and I seem to be writing a paper on category-theoretic characterizations of entropy—not only the good old Shannon entropy, but also the Rényi entropy S_q, which depends on a parameter q and reduces to the Shannon entropy as q → 1. Curiously, you can’t get the Shannon entropy by just sticking q = 1 in the definition of Rényi entropy: you really need to sneak up on it, taking a limit as q → 1.
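The need to sneak up on q = 1 is easy to see numerically: the usual formula S_q = log(Σᵢ pᵢ^q)/(1 − q) becomes 0/0 at q = 1, but its values approach the Shannon entropy as q → 1. A quick check:

```python
import math

def shannon(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, q):
    # S_q = log(sum_i p_i^q) / (1 - q), for q != 1
    return math.log(sum(x**q for x in p)) / (1 - q)

p = [0.5, 0.25, 0.25]
for q in (0.9, 0.99, 0.999):
    print(q, renyi(p, q))        # creeps toward the Shannon value
print("Shannon:", shannon(p))    # about 1.0397 in natural-log units
```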

Tobias has worked on the category whose morphisms are stochastic matrices, and a category-theoretic approach to convex sets. Tom likes category-theoretic characterizations of familiar concepts. So it was natural for them to become interested in this topic. But for me, it started by trying to understand Rényi entropy.

A while back I wrote a paper on Rényi entropy and free energy and blogged about it here. I got a bunch of very helpful feedback, which improved the paper immensely and allowed me to state the main result in this way: the Rényi entropy is a ‘q-deformed version’ of the ordinary entropy. The ordinary Shannon entropy can be defined as minus the derivative of the free energy F(T) with respect to temperature T. To get the Rényi entropy, we must use a generalization of the usual derivative, called a ‘q-derivative’:

S_q = - \frac{F(T/q) - F(T)}{1/q - 1}

To be completely honest, this should be called a ‘1/q-derivative’. But that’s no big deal. What’s more interesting is that now it’s obvious why taking the limit q → 1 is a good idea: that’s how q-derivatives reduce to ordinary derivatives!
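For readers unfamiliar with q-derivatives: the standard definition is D_q f(x) = (f(qx) − f(x))/(qx − x), so that D_q(xⁿ) = [n]_q xⁿ⁻¹ with [n]_q = 1 + q + … + qⁿ⁻¹, and letting q → 1 recovers the ordinary derivative. A quick sketch of that convergence:

```python
def q_derivative(f, x, q):
    """The q-derivative D_q f(x) = (f(qx) - f(x)) / (qx - x), q != 1."""
    return (f(q * x) - f(x)) / (q * x - x)

f = lambda x: x**3

# D_q(x^3) = (1 + q + q^2) * x^2, which tends to 3x^2 as q -> 1.
for q in (2.0, 1.1, 1.001):
    print(q, q_derivative(f, 2.0, q))
print("ordinary derivative at x=2:", 3 * 2.0**2)  # 12.0
```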

Even more interesting is that q-derivatives show up throughout the theory of quantum groups, the combinatorics of finite fields, and elsewhere. Does that shed light on why they’re showing up here too? I wish I knew.

Posted at 8:04 AM UTC | Permalink | Followups (126)