

September 17, 2023

Counting Algebraic Structures

Posted by John Baez

The number of groups with $n$ elements goes like this, starting with $n = 0$:

0, 1, 1, 1, 2, 1, 2, 1, 5, …

The number of semigroups with $n$ elements goes like this:

1, 1, 5, 24, 188, 1915, 28634, 1627672, 3684030417, 105978177936292, …

Here I’m counting isomorphic guys as the same.
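
For very small $n$ you can reproduce the start of the semigroup count by sheer brute force. Here is a minimal sketch in Python, purely for illustration: it scans all multiplication tables, keeps the associative ones, and identifies tables that differ by a relabelling.

```python
from itertools import product, permutations

def count_semigroups(n):
    """Number of associative binary operations on an n-element set, up to isomorphism.

    Sheer brute force over all n^(n^2) multiplication tables, so only practical for n <= 3.
    """
    seen = set()
    for flat in product(range(n), repeat=n * n):
        op = [flat[i * n:(i + 1) * n] for i in range(n)]   # op[a][b] is the product a*b
        if not all(op[op[a][b]][c] == op[a][op[b][c]]
                   for a in range(n) for b in range(n) for c in range(n)):
            continue
        # Canonical form: the lexicographically least table among all relabellings.
        canon = min(
            tuple(q[op[p[x]][p[y]]] for x in range(n) for y in range(n))
            for p in permutations(range(n))
            for q in [{old: new for new, old in enumerate(p)}]
        )
        seen.add(canon)
    return len(seen)

print([count_semigroups(n) for n in range(4)])   # [1, 1, 5, 24]
```

Brute force stops being practical around $n = 4$, where there are already $4^{16}$ tables to scan; the later terms of these sequences come from far more sophisticated enumeration methods.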

But how much do we know about such sequences in general? For example, is there any sort of algebraic gadget where the number of gadgets with $n$ elements goes like this:

1, 1, 2, 1, 1, 1, 1, 1, … ?

No! Not if by “algebraic gadget” we mean something described by a bunch of operations obeying equational laws — that is, an algebra of a Lawvere theory.

This follows from a result of László Lovász in 1967:

On Mastodon, Omar Antolín sketched a proof that greases the wheels with more category theory. It relies on a rather shocking lemma:

Super-Yoneda Lemma. Let $\mathsf{C}$ be the category of algebras of some Lawvere theory, and let $A, B \in \mathsf{C}$ be two algebras whose underlying sets are finite. If the functors $\mathrm{hom}(-,A)$ and $\mathrm{hom}(-,B)$ are unnaturally isomorphic, then $A \cong B$.

Here we say the functors $\mathrm{hom}(-,A)$ and $\mathrm{hom}(-,B)$ are unnaturally isomorphic if

$$\mathrm{hom}(X,A) \cong \mathrm{hom}(X,B)$$

for all $X \in \mathsf{C}$. We’re not imposing the usual commuting naturality square — indeed we can’t, since we’re not even giving any specific choice of isomorphism!

If $\mathrm{hom}(-,A)$ and $\mathrm{hom}(-,B)$ are naturally isomorphic, you can easily show $A \cong B$ using the Yoneda Lemma. But when they’re unnaturally isomorphic, you have to break the glass and pull out the Super-Yoneda Lemma.

Given this shocking lemma, it’s easy to show this:

Theorem. Let $A, B$ be two algebras of a Lawvere theory whose underlying sets are finite. If $A^k \cong B^k$ for some natural number $k \ge 1$, then $A \cong B$.

Here’s how. Since $A^k \cong B^k$, we have natural isomorphisms

$$\mathrm{hom}(-,A)^k \cong \mathrm{hom}(-, A^k) \cong \mathrm{hom}(-, B^k) \cong \mathrm{hom}(-,B)^k$$

so for any $X \in \mathsf{C}$ the sets $\mathrm{hom}(X,A)^k$ and $\mathrm{hom}(X,B)^k$ have the same cardinality. Taking $k$th roots of these finite cardinalities, the sets $\mathrm{hom}(X,A)$ and $\mathrm{hom}(X,B)$ have the same cardinality too. This means we have an unnatural isomorphism

$$\mathrm{hom}(-,A) \cong \mathrm{hom}(-,B)$$

The lemma magically lets us conclude that

$$A \cong B$$

Now, how do we use this to solve our puzzle? Let $a(n)$ be the number of isomorphism classes of algebras whose underlying set has $n$ elements. We must have

$$a(n^k) \ge a(n)$$

since we’ve just seen that nonisomorphic algebras with $n$ elements give nonisomorphic algebras with $n^k$ elements. So, for example, we can never have $a(4) < a(2)$, since $4 = 2^2$. Thus, the sequence can’t look like the one I showed you, with

$$a(0) = 1, \; a(1) = 1, \; a(2) = 2, \; a(3) = 1, \; a(4) = 1, \ldots$$
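
As a quick sanity check against the sequences above: for groups, $a(2) = 1$, $a(4) = 2$ and $a(8) = 5$, while for semigroups, $a(2) = 5$, $a(4) = 188$ and $a(8) = 1627672$, so in both cases

$$a(2^2) \ge a(2) \quad \text{and} \quad a(2^3) \ge a(2),$$

just as the inequality $a(n^k) \ge a(n)$ demands.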

Nice! So let’s turn to the lemma, which is the really interesting part.

I’ll just quote Omar Antolín’s proof, since I can’t improve on it. I believe the ideas go back to Lovász, but a bit of category theory really helps. Remember, $A$ and $B$ are algebras of some Lawvere theory whose underlying sets are finite:

Let $\mathrm{mon}(X, A)$ be the set of monomorphisms, which here are just homomorphisms that are injective functions. I claim you can compute the cardinality of $\mathrm{mon}(X, A)$ using the inclusion-exclusion principle in terms of the cardinalities of $\mathrm{hom}(Q, A)$ for various quotients of $X$.

Indeed, for any pair of elements $x, y \in X$, let $S(x, y)$ be the set of homomorphisms $f \colon X \to A$ such that $f(x) = f(y)$. The monomorphisms are just the homomorphisms that belong to none of the sets $S(x, y)$, so you can compute how many there are via the inclusion-exclusion formula: you’ll just need the cardinality of intersections of several $S(x_i, y_i)$.

Now, the intersection of some $S(x_i, y_i)$ is the set of homomorphisms $f$ such that $f(x_i) = f(y_i)$ for all $i$. Those are in bijection with the homomorphisms $Q \to A$ where $Q$ is the quotient of $X$ obtained by adding the relations $x_i = y_i$ for each $i$.

So far I hope I’ve convinced you that if $\mathrm{hom}(-, A)$ and $\mathrm{hom}(-, B)$ are unnaturally isomorphic, so are $\mathrm{mon}(-, A)$ and $\mathrm{mon}(-, B)$. Now it’s easy to finish: since $\mathrm{mon}(A, A)$ is non-empty, so is $\mathrm{mon}(A, B)$, so $A$ is isomorphic to a subobject of $B$. Similarly $B$ is isomorphic to a subobject of $A$, and since they are finite, they must be isomorphic.
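
To see the counting step in action, here is a minimal sketch in Python for one very simple Lawvere theory, that of a single unary operation, so that an algebra is just a finite set equipped with a self-map. It checks on a small example, chosen arbitrarily, that the inclusion-exclusion sum over quotients recovers the number of injective homomorphisms.

```python
from itertools import product, combinations

# Finite algebras for the Lawvere theory of one unary operation: a set {0,...,n-1}
# together with a self-map, stored as a list u with u[i] the image of i.

def homs(uX, uA):
    """All homomorphisms f: X -> A, i.e. functions with f(u_X(x)) = u_A(f(x))."""
    nX, nA = len(uX), len(uA)
    return [f for f in product(range(nA), repeat=nX)
            if all(f[uX[x]] == uA[f[x]] for x in range(nX))]

def monos(uX, uA):
    """Injective homomorphisms, counted directly."""
    return [f for f in homs(uX, uA) if len(set(f)) == len(f)]

def quotient(uX, pairs):
    """Quotient of X by the congruence generated by the given pairs: union-find,
    closed under the unary operation (x ~ y forces u(x) ~ u(y))."""
    n = len(uX)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    work = list(pairs)
    while work:
        x, y = work.pop()
        rx, ry = find(x), find(y)
        if rx != ry:
            parent[rx] = ry
            work.append((uX[x], uX[y]))
    reps = sorted({find(i) for i in range(n)})
    index = {r: k for k, r in enumerate(reps)}
    return [index[find(uX[r])] for r in reps]

def monos_by_inclusion_exclusion(uX, uA):
    """|mon(X,A)| computed only from the numbers |hom(Q,A)| for quotients Q of X."""
    n = len(uX)
    pairs = [(x, y) for x in range(n) for y in range(x + 1, n)]
    return sum((-1) ** k * len(homs(quotient(uX, T), uA))
               for k in range(len(pairs) + 1)
               for T in combinations(pairs, k))

# Example: X is a 2-cycle, A is a 2-cycle plus a fixed point.
uX = [1, 0]
uA = [1, 0, 2]
print(len(monos(uX, uA)), monos_by_inclusion_exclusion(uX, uA))   # both are 2
```

The point mirrored in the code is the one above: the homomorphisms satisfying a batch of equations $f(x_i) = f(y_i)$ are exactly the homomorphisms out of the quotient by the congruence those equations generate, so $|\mathrm{mon}(X,A)|$ is determined by the numbers $|\mathrm{hom}(Q,A)|$.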

Beautiful!

But if you look at this argument you’ll see we didn’t use the full force of the assumptions. We didn’t need $A$ and $B$ to be algebras of a Lawvere theory. They could have been topological spaces, or posets, or simple graphs (which you can think of as reflexive symmetric relations), or various other things. It seems all we really need is a category $\mathsf{C}$ of gadgets with a forgetful functor

$$U \colon \mathsf{C} \to \mathsf{FinSet}$$

that is faithful and has some extra property… roughly, that we can take an object in $\mathsf{C}$ and take a quotient of it where we impose a bunch of extra relations $x_i = y_i$, and maps out of this quotient will behave as you’d expect. More precisely, I think the extra property is this:

Given any $X \in \mathsf{C}$ and any surjection $p \colon U(X) \to S$, there is a morphism $j \colon X \to Q$ such that the morphisms $f \colon X \to Y$ that factor through $j$ are precisely those for which $U(f)$ factors through $p$.

Can anyone here shed some light on this property, and which faithful functors $U \colon \mathsf{C} \to \mathsf{FinSet}$ have it? These papers should help:

but I haven’t had time to absorb them yet.

By the way, there’s a name for categories where the super-Yoneda Lemma holds: they’re called right combinatorial.

And there’s a name for the sequences I’m talking about. If $T$ is a Lawvere theory, the sequence whose $n$th term is the number of isomorphism classes of $T$-algebras with $n$ elements is called the fine spectrum of $T$. The idea was introduced here:

  • Walter Taylor, The fine spectrum of a variety, Algebra Universalis 5 (1975), 263–303.

though Taylor used not Lawvere theories but an equivalent framework: ‘varieties’ in the sense of universal algebra. For a bit more on this, go here.

I’m interested in which sequences are the fine spectrum of some Lawvere theory. You could call this an ‘inverse problem’. The direct problem — computing the fine spectrum of a given Lawvere theory — is already extremely difficult in many cases. But the case where there aren’t any equational laws (except trivial ones) is manageable:

Some errors in Harrison’s paper were corrected here:

I suspect Harrison and Tureček’s formulas could be nicely derived using species, since they’re connected to the tree-like structures discussed here:

  • François Bergeron, Gilbert Labelle and Pierre Leroux, Combinatorial Species and Tree-Like Structures, Cambridge U. Press, Cambridge, 1998.

For all I know these authors point this out! It’s been a while since I’ve read this book.

Posted at September 17, 2023 1:00 AM UTC


24 Comments & 0 Trackbacks

Re: Counting Algebraic Structures

“If the functors hom(−,A) and hom(−,A) are unnaturally isomorphic” Pretty sure they’re naturally isomorphic.

Posted by: Allen Knutson on September 17, 2023 3:15 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

You forgot the letter A has a reflection symmetry!

But yes, I meant B.

Posted by: John Baez on September 17, 2023 4:10 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

Neat! I think the following hypothetical variant of the theorem on powers could give even stronger constraints on such “number of algebras” sequences:

Theorem? Let $A$, $B$, $C$ be algebras of a Lawvere theory whose underlying sets are finite. If $A \times C \cong B \times C$, then $A \cong B$.

The problem with the analogous proof is that trying to cancel $- \times \mathrm{hom}(-,C)$ from an isomorphism may not be possible because $\mathrm{hom}(X, C)$ could be empty for some $X$.

But perhaps one can get this to work with some additional cleverness? Or are there obvious counterexamples?

If it’s true, then one could conclude that a “number of algebras” sequence must increase (non-strictly) not only along powers, but along arbitrary multiples, provided that there is at least one algebra of every positive finite cardinality.

Posted by: Tobias Fritz on September 17, 2023 4:52 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

You might look at

If you have a single constant in the signature and the constant always gives a subalgebra then you never have empty hom sets and you can do this.

Posted by: Benjamin Steinberg on September 18, 2023 3:58 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

This works if every nonempty finite algebra contains a copy of the 1-element algebra. So groups, rings, monoids and semigroups are fine.

Posted by: Benjamin Steinberg on September 18, 2023 4:07 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

Alas, Tobias’s conjectured theorem isn’t true in full generality. Here’s a counterexample.

Take the theory consisting of three constants and no equations. Thus, an algebra is a set $A$ together with distinguished elements $a_1$, $a_2$ and $a_3$, in order, which may or may not be equal.

Let $A = B = C = \{1, 2, 3\}$. Make $A$, $B$ and $C$ into algebras by taking the following constants:

  • $a_1 = 2$, $a_2 = 2$, $a_3 = 3$

  • $b_1 = 1$, $b_2 = 2$, $b_3 = 3$

  • $c_1 = 2$, $c_2 = 3$, $c_3 = 3$.

Then $A \times C$ is the 9-element set $\{1, 2, 3\} \times \{1, 2, 3\}$ with distinguished elements

$$(2, 2), \quad (2, 3), \quad (3, 3),$$

and $B \times C$ is the same 9-element set with distinguished elements

$$(1, 2), \quad (2, 3), \quad (3, 3).$$

Since both $A \times C$ and $B \times C$ are 9-element sets whose three distinguished elements are all distinct, they are isomorphic as algebras. Concretely, the self-map of $\{1, 2, 3\}^2$ that exchanges $(2, 2)$ and $(1, 2)$ while leaving everything else fixed defines an isomorphism of algebras $A \times C \cong B \times C$.

On the other hand, $A$ is not isomorphic to $B$, because the three distinguished elements of $B$ are distinct but those of $A$ are not.
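
If you like seeing such things checked by machine, here is a quick brute-force verification of the counterexample, sketched in Python (the representation of algebras and the helper names are just for illustration):

```python
from itertools import permutations, product

# An algebra of the theory with three constants and no equations: a finite set
# together with an ordered triple of (not necessarily distinct) elements.
def make(elements, constants):
    return (tuple(elements), tuple(constants))

def times(alg1, alg2):
    """Product algebra: pairs of elements, constants taken coordinatewise."""
    (E1, c1), (E2, c2) = alg1, alg2
    return (tuple(product(E1, E2)), tuple(zip(c1, c2)))

def isomorphic(alg1, alg2):
    """Brute force: is there a bijection matching the constants in order?"""
    (E1, c1), (E2, c2) = alg1, alg2
    if len(E1) != len(E2):
        return False
    for p in permutations(E2):
        phi = dict(zip(E1, p))
        if all(phi[x] == y for x, y in zip(c1, c2)):
            return True
    return False

A = make([1, 2, 3], [2, 2, 3])
B = make([1, 2, 3], [1, 2, 3])
C = make([1, 2, 3], [2, 3, 3])

print(isomorphic(times(A, C), times(B, C)))   # True
print(isomorphic(A, B))                       # False
```

Brute force over bijections is wasteful but harmless at this size. For a theory with only constants, two finite algebras are isomorphic exactly when they have the same cardinality and their tuples of constants have the same pattern of coincidences, which is what the check above amounts to.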

Posted by: Tom Leinster on September 19, 2023 8:21 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

Fiendishly clever, Tom!  

Your trick reminds me of how geometric realization of simplicial sets preserves products thanks to the miracle of degeneracies. The simplicial 1-simplex (the simplicial set that looks like an interval) has vertices $0$ and $1$. It has a degenerate 2-simplex with vertices $0, 0, 1$ and another with vertices $0, 1, 1$. These give the product of two simplicial 1-simplexes a nondegenerate 2-simplex with vertices $(0,0), (0,1)$ and $(1,1)$. As we move along from $(0,0)$ to $(0,1)$ to $(1,1)$, at each stage the first coordinate may stay the same, or the second coordinate may stay the same, but they don’t both stay the same.

Was this on your mind?

Posted by: John Baez on September 19, 2023 10:53 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

No, I just guessed the general conjecture would be false and looked for counterexamples in the simplest possible algebraic theories.

Posted by: Tom Leinster on September 20, 2023 4:04 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

That’s very nice indeed! It’s a shame that my conjecture is wrong in general.

In fact, I realize now that my conjectured corollary about every “number of algebras” sequence being non-strictly increasing along multiples is also wrong. For example, the number of Boolean algebras on $2$ elements is one, but the number of Boolean algebras on $2 \cdot 3$ elements is zero. In this case, the problem is not that $\mathrm{hom}(-,C)$ can be empty, which is how my envisioned argument fails in your example, but rather the complete absence of algebras in cardinalities that are not powers of two.

Posted by: Tobias Fritz on September 20, 2023 10:32 AM | Permalink | Reply to this

Re: Counting Algebraic Structures

Oh, that’s a very simple counterexample, Tobias!

Given a set of natural numbers closed under multiplication, is there a Lawvere theory whose finite algebras only have cardinalities in this set?

Posted by: John Baez on September 20, 2023 10:45 AM | Permalink | Reply to this

Re: Counting Algebraic Structures

I take it that what you’re asking is this: given a multiplicative submonoid $S \subseteq \mathbb{N} \setminus \{0\}$, does there exist a Lawvere theory which has algebras precisely in cardinalities in $S$?

Here’s a positive answer in some special cases. The set $S = \{1, p^k, p^{2k}, \ldots\}$, for a prime $p$ and exponent $k \in \mathbb{N}$, can be realized by the theory of vector spaces over the finite field $\mathbb{F}_{p^k}$. Given sets $S_1, \ldots, S_m$ which are of this form for different primes, we can also realize the set of all products $S_1 \cdots S_m$ by using products of Lawvere theories, so that an algebra of such a structure is simply the product of $m$ vector spaces over the respective fields. More generally, this shows that the set of realizable sets is itself a monoid under multiplication!

Looking at other cases seems difficult. Here are two of them which should be at roughly the next level of difficulty:

  • Does there exist a Lawvere theory which realizes the powers of two, except for two itself?

A natural idea would be to try and extend the structure of Boolean algebra or vector space over $\mathbb{F}_2$ in such a way that this extra structure exists on all algebras except on the two-element one, but it’s unclear how to do this.

  • What about a Lawvere theory realizing all the powers of $6$?

This one seems easier, and perhaps one can consider pairs consisting of a vector space over $\mathbb{F}_2$ and one over $\mathbb{F}_3$ and somehow force them to have the same dimension.

Posted by: Tobias Fritz on September 25, 2023 5:26 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

Here’s a positive example on my first bullet point, namely what should be a Lawvere theory which has finite algebras in cardinalities exactly the powers of two except for two itself. Consider first the Lawvere theory whose algebras are product sets $V \times W$ where $V$ is a vector space over $\mathbb{F}_2$ and $W$ is a vector space over $\mathbb{F}_4$; this is the product of the individual Lawvere theories for these kinds of vector spaces. Now throw in two additional unary operations corresponding to maps $s : V \to W$ and $r : W \to V$ and impose the equation $r s = 1_V$; you can also impose $\mathbb{F}_2$-linearity if you like, but it’s not needed. The idea is that $s$ embeds $V$ into $W$, and that such maps exist if and only if $|V| \le |W|$.

In this way, the possible cardinalities of $V \times W$ are clearly of the form $2^m 4^n$ with $2^m \le 4^n$. It’s easy to see that these numbers are exactly the powers of two except for two itself. (It’s enough to use $m \in \{0, 1\}$ to get these.)
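
A tiny enumeration makes that last claim easy to believe, for instance:

```python
# Cardinalities 2^m * 4^n with 2^m <= 4^n, compared with the powers of two
# other than 2, up to the bound N = 2^20.
N = 2 ** 20
realized = sorted({2 ** m * 4 ** n
                   for n in range(11) for m in range(2 * n + 1)
                   if 2 ** m * 4 ** n <= N})
powers_of_two_except_two = [2 ** k for k in range(21) if k != 1]
print(realized == powers_of_two_except_two)   # True
```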

There should be a pretty general idea here on how to build Lawvere theories without models in specific cardinalities, but I’m not quite sure about how to formulate it.

Posted by: Tobias Fritz on September 28, 2023 7:25 AM | Permalink | Reply to this

Re: Counting Algebraic Structures

The powers of 2 except for 2 are of the form $8^m 4^n$, and I’m convinced you can tackle general powers of 2 by just adding more powers into the product.

For powers of 2 and 3, there are uncountably many multiplicative submonoids. (Consider $\{2^m 3^n \,|\, m \le \alpha n\}$, which encodes any positive real $\alpha$.)
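
Here is the quick check that each such set really is a multiplicative submonoid, and that it remembers $\alpha$:

$$m_1 \le \alpha n_1 \ \text{ and } \ m_2 \le \alpha n_2 \ \Longrightarrow \ m_1 + m_2 \le \alpha (n_1 + n_2), \qquad \sup \Big\{ \tfrac{m}{n} \;\Big|\; 2^m 3^n \in S, \ n \ge 1 \Big\} = \alpha,$$

so distinct values of $\alpha$ give distinct submonoids.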

Somewhere in between, we can ask what this gadget can produce.

I’m assuming you can impose inequalities of the form $|V_1 V_2| \le |V_3 V_4|$, which means we can abstract it down to the following.

Let $I$ be a (finite?) set and consider functions $q \colon I \to \mathbb{N}$, $m \colon I \to \mathbb{N}$ and a collection of inequality conditions $C \subseteq P(I)^2$. Then we are asking what sets are of the form

$$S_{I,q,C} = \mathbb{N} \cap \left\{ \prod_{i \in I} {q_i}^{m_i} \,\middle|\, \prod_{c \in C_1} {q_c}^{m_c} \le \prod_{c \in C_2} {q_c}^{m_c} \quad \forall (C_1, C_2) \in C \right\}$$

(We also have to impose the condition that each $q_i$ is a prime power. This is the main restriction we are trying to remove in the second puzzle.)

Posted by: unekdoud on October 4, 2023 4:20 AM | Permalink | Reply to this

Re: Counting Algebraic Structures

The number of outer automorphisms of $S_n$ is $1, 1, 2, 1, 1, 1, 2, 1, \ldots$.

Posted by: James Sheppard on September 19, 2023 2:28 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

Not true. Are you testing us?

Posted by: John Baez on September 19, 2023 10:58 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

I didn’t find that the politest of responses. Of course I mis-wrote: my point was that the automorphism group of $S_n$ is not what one “expects” (i.e. $S_n$ again) precisely when $n = 2, 6$, and that set doesn’t have the property of algebraic structures which is under discussion.

Posted by: James Sheppard on September 20, 2023 8:54 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

Sorry, I was trying to joke around, which is always risky without facial expressions to indicate it. I suspected many people would skim over the sequence and not notice that the first “2” was wrong.

If you expand the scope a bit and look for epimorphisms from $S_n$ to $S_m$ with $m \le n$ then you’ll also find our friend the sign homomorphism from $S_n$ to $S_2$ and one more exotic delight: the nontrivial homomorphism from $S_4$ to $S_3$, which arises from the 3 ways to partition a 4-element set into 2-element subsets.

Posted by: John Baez on September 20, 2023 9:49 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

It is almost surely unrelated, but talk of the nontrivial homomorphism $S_4 \to S_3$ makes me think of the unique non-null-homotopic map $S^4 \to S^3$.

Posted by: David Roberts on September 21, 2023 12:58 AM | Permalink | Reply to this

Re: Counting Algebraic Structures

Actually I think the nontrivial homomorphism $S_4 \to S_3$ is related to the existence of a nontrivial homomorphism $\mathrm{SO}(4) \to \mathrm{SO}(3)$.

If we take the 4-element set $\{1,2,3,4\}$, there are 3 ways to split it:

$$\{1,2\} + \{3,4\}, \qquad \{1,3\} + \{4,2\}, \qquad \{1,4\} + \{2,3\}$$

and $S_4$ permutes these 3 splittings. This gives a nontrivial homomorphism $S_4 \to S_3$.
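
Just to make the action on splittings completely concrete, here is a small Python sketch, purely for illustration, that builds this map and checks it is a surjective homomorphism:

```python
from itertools import permutations

# The three splittings of {1,2,3,4} into two 2-element blocks.
splittings = [frozenset({frozenset({1, 2}), frozenset({3, 4})}),
              frozenset({frozenset({1, 3}), frozenset({4, 2})}),
              frozenset({frozenset({1, 4}), frozenset({2, 3})})]

def act(sigma, splitting):
    """Apply a permutation of {1,2,3,4} (given as a dict) to a splitting."""
    return frozenset(frozenset(sigma[x] for x in block) for block in splitting)

def image(sigma):
    """The induced permutation of the three splittings, as a tuple of indices."""
    return tuple(splittings.index(act(sigma, s)) for s in splittings)

def compose(sigma, tau):
    return {x: sigma[tau[x]] for x in (1, 2, 3, 4)}

S4 = [dict(zip((1, 2, 3, 4), p)) for p in permutations((1, 2, 3, 4))]

# The map hits all 6 permutations of the splittings...
print(len({image(sigma) for sigma in S4}))   # 6
# ...and it is a homomorphism: image(sigma tau) = image(sigma) image(tau).
print(all(image(compose(s, t)) == tuple(image(s)[image(t)[i]] for i in range(3))
          for s in S4 for t in S4))          # True
```

The first check says the image is all of $S_3$; the kernel consists of the identity and the three double transpositions, i.e. the Klein four-group, so this really is a surjection $S_4 \to S_3$.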

Similarly, if we take $\mathbb{R}^4$ with its usual basis $e_1, e_2, e_3, e_4$, the space of self-dual elements of $\Lambda^2 \mathbb{R}^4$ is 3-dimensional with basis

$$e_1 \wedge e_2 + e_3 \wedge e_4, \qquad e_1 \wedge e_3 + e_4 \wedge e_2, \qquad e_1 \wedge e_4 + e_2 \wedge e_3$$

and $\mathrm{SO}(4)$ preserves this 3-dimensional space. This gives a nontrivial homomorphism $\mathrm{SO}(4) \to \mathrm{SO}(3)$.

We can think of these two phenomena as ‘the same thing’ over the field with one element and the field $\mathbb{R}$. But the real case is different because we also have a 3-dimensional space of anti-self-dual elements of $\Lambda^2 \mathbb{R}^4$, with basis

$$e_1 \wedge e_2 - e_3 \wedge e_4, \qquad e_1 \wedge e_3 - e_4 \wedge e_2, \qquad e_1 \wedge e_4 - e_2 \wedge e_3$$

This gives a different nontrivial homomorphism $\mathrm{SO}(4) \to \mathrm{SO}(3)$. But arguably this is because $\mathbb{R}$ lets us use minus signs, which we can’t do in the field with one element.

I should really see how this story plays out in all the finite fields $\mathbb{F}_p$. I get a bit confused about orthogonal groups and exterior algebras in characteristic $p$, especially when $p = 2$.

Posted by: John Baez on September 21, 2023 1:11 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

This stuff is so cool, John! It’s so nice to see categorical tools being useful in such a concrete, ‘hard math’ question.

Re the condition on the functor $U \colon C \to \mathrm{FinSet}$ to prove super-Yoneda, it looks very much like exponentiability for a functor.

Quoting from the nLab:

A functor $p \colon E \to B$ is exponentiable if for any morphism $\alpha \colon a \to b$ in $E$ and any factorization $p a \overset{\beta}{\to} c \overset{\gamma}{\to} p b$ of $p \alpha$ in $B$, we have:

  1. there exists a factorization $a \overset{\tilde{\beta}}{\to} d \overset{\tilde{\gamma}}{\to} b$ of $\alpha$ in $E$ such that $p \tilde{\beta} = \beta$ and $p \tilde{\gamma} = \gamma$, and

  2. any two such factorizations in $E$ are connected by a zigzag of commuting morphisms which map to $\mathrm{id}_c$ in $B$.

You see, the first condition is almost an exact match of yours.

Posted by: Matteo Capucci on September 21, 2023 2:24 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

Thanks! This could be a wiser way of formulating my condition. It will take me a while to understand the precise relation, but I can see they’re similar.

Posted by: John Baez on September 23, 2023 3:32 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

If it helps, I like to express the exponentiability condition (also called the Conduché property) for a functor in a different way (I used the other one to stress the similarity to yours):

Let $F \colon C \to D$ be a functor. $F$ is exponentiable iff the displayed category of its fibers, which is a unitary lax functor $F^{-1} \colon D \to \mathbf{Prof}$, is actually a pseudofunctor.

So exponentiability gives you ‘half’ of the structure of a fibration, namely that composition of cartesian lifts is still cartesian, and vice versa. What it doesn’t guarantee is that all cartesian lifts exist, only that when they do, they compose nicely.

This other guarantee holds for prefibrations, and indeed a Grothendieck fibration is exactly an exponentiable prefibration!

This is all stuff more or less explicitly treated by Bénabou in ‘Distributors at work’.

Posted by: Matteo Capucci on September 25, 2023 12:15 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

Coming to this a bit late, but I don’t see how exponentiability says something about cartesian morphisms if they don’t exist.

If all weakly cartesian lifts do exist, so that the normal lax functor $D \to \mathrm{Prof}$ lands in $\mathrm{Cat}$, then the weakly cartesian arrows are closed under composition if and only if the lax functor is pseudo, hence if and only if the functor $C \to D$ is exponentiable. But if weakly cartesian lifts don’t exist, I don’t see how assuming that they are closed under composition is relatable to exponentiability.

Posted by: Mike Shulman on December 11, 2023 10:53 PM | Permalink | Reply to this

Re: Counting Algebraic Structures

(Sorry to come to this a bit late; this semester was kind of overwhelming for me.)

Given any $X \in C$ and any surjection $p \colon U(X) \to S$, there is a morphism $j \colon X \to Q$ such that the morphisms $f \colon X \to Y$ that factor through $j$ are precisely those for which $U(f)$ factors through $p$.

That looks to me like saying exactly that $p$ has a semi-final lift. In particular, since $j$ factors through itself, $U(j)$ factors through $p$, and this condition together with faithfulness of $U$ implies that it is the initial such factorization.

In particular, therefore, this condition holds whenever $U$ is a solid functor. This holds if:

  1. $U$ is topological concrete, or
  2. $U$ is the restriction of a monadic functor into $\mathrm{Set}$ with the property that quotients of finite algebras are finite.

However, it seems to me that the argument is using a bit more than this assumption. Don’t you need at the end that $U$ is conservative or something?

Posted by: Mike Shulman on December 11, 2023 11:14 PM | Permalink | Reply to this
