
November 29, 2016

Quarter-Turns

Posted by Tom Leinster

Teaching linear algebra this semester has made me face up to the fact that for a linear operator $T$ on a real inner product space,
$$\langle T x, x \rangle = 0 \,\,\forall x \,\,\iff\,\, T^\ast = -T,$$
whereas for an operator on a complex inner product space,
$$\langle T x, x \rangle = 0 \,\,\forall x \,\,\iff\,\, T = 0.$$
In other words, call an operator $T$ a quarter-turn if $\langle T x, x \rangle = 0$ for all $x$. Then the real quarter-turns correspond to the skew-symmetric matrices — but apart from the zero operator, there are no complex quarter-turns at all.
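For anyone who likes to see such claims checked by machine, here is a tiny NumPy sketch (the sizes, seed and example matrices are arbitrary choices): a random real skew-symmetric matrix passes the quarter-turn test on every vector tried, while the nonzero complex skew-adjoint matrix $iI$ visibly fails it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real case: a random skew-symmetric matrix T (T^T = -T).
A = rng.standard_normal((5, 5))
T = A - A.T
for _ in range(1000):
    x = rng.standard_normal(5)
    assert abs(x @ T @ x) < 1e-10      # <Tx, x> = 0 for every x

# Complex case: S = iI is skew-adjoint (S^* = -S) but not a quarter-turn.
S = 1j * np.eye(3)
x = np.ones(3)
print(np.vdot(x, S @ x))               # <Sx, x> = 3i, not 0
```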

Where in my mental landscape should I place these facts?

The proofs of both facts are easy enough. Everyone who’s met an inner product space knows the real polarization identity: in a real inner product space $X$,

$$\langle x, y \rangle = \frac{1}{4} \bigl( \| x + y \|^2 - \| x - y \|^2 \bigr).$$

All we used about $\langle -, - \rangle$ here is that it’s a symmetric bilinear form (and that $\|w\|^2 = \langle w, w \rangle$). In other words, for a symmetric bilinear form $\beta$ on $X$, writing $Q(w) = \beta(w, w)$, we have

$$\beta(x, y) = \frac{1}{4} \bigl( Q(x + y) - Q(x - y) \bigr)$$

for all $x, y \in X$.

The crucial point is that we really did need the symmetry. (For it’s clear that the right-hand side is symmetric whether or not $\beta$ is.) For a not-necessarily-symmetric bilinear form $\beta$, all we can say is

$$\frac{1}{2} \bigl( \beta(x, y) + \beta(y, x) \bigr) = \frac{1}{4} \bigl( Q(x + y) - Q(x - y) \bigr)$$

or more simply put,

$$\beta(x, y) + \beta(y, x) = \frac{1}{2} \bigl( Q(x + y) - Q(x - y) \bigr).$$

Now let $T$ be a linear operator on $X$. There is a bilinear form $\beta$ defined by $\beta(x, y) = \langle T x, y \rangle$. It’s not symmetric unless $T$ is self-adjoint; nevertheless, the polarization identity just stated tells us that

$$\langle T x, y \rangle + \langle T y, x \rangle = \frac{1}{2} \bigl( \langle T(x + y), x + y \rangle - \langle T(x - y), x - y \rangle \bigr).$$

It follows that $T$ is a quarter-turn if and only if

$$\langle T x, y \rangle + \langle T y, x \rangle = 0$$

for all $x, y \in X$. After some elementary rearrangement, this in turn is equivalent to

$$\langle (T + T^\ast)x, y \rangle = 0$$

for all $x, y$, where $T^\ast$ is the adjoint of $T$. But that just means that $T + T^\ast = 0$. So, $T$ is a quarter-turn if and only if $T^\ast = -T$.
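The whole real argument can be checked numerically in a few lines. The sketch below (an illustration only; the dimension and seed are arbitrary) verifies the displayed polarization identity for an arbitrary real $T$, and the fact that the quadratic form $x \mapsto \langle T x, x \rangle$ only sees the symmetric part of $T$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
T = rng.standard_normal((n, n))        # an arbitrary real operator

# Polarization for the (not necessarily symmetric) form beta(x, y) = <Tx, y>.
for _ in range(100):
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    lhs = (T @ x) @ y + (T @ y) @ x
    rhs = 0.5 * ((T @ (x + y)) @ (x + y) - (T @ (x - y)) @ (x - y))
    assert abs(lhs - rhs) < 1e-9

# The quadratic form <Tx, x> only depends on the symmetric part of T,
# so it vanishes identically exactly when T + T^T = 0.
x = rng.standard_normal(n)
sym = 0.5 * (T + T.T)
assert abs(x @ T @ x - x @ sym @ x) < 1e-9
```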

The complex case involves a more complicated polarization identity, but is ultimately simpler. To be clear, when I say “complex inner product” I’m talking about something that’s linear in the first argument and conjugate linear in the second.

In a complex inner product space, the polarization formula is

$$\langle x, y \rangle = \frac{1}{4} \sum_{p = 0}^3 i^p \| x + i^p y \|^2.$$

This can be compared with the real version, which (in unusually heavy notation) says that

$$\langle x, y \rangle = \frac{1}{4} \sum_{p = 0}^1 (-1)^p \| x + (-1)^p y \|^2.$$

And the crucial point in the complex case is that this time, we don’t need any symmetry. In other words, for any sesquilinear form $\beta$ on $X$ (linear in the first argument and conjugate-linear in the second, like the inner product itself), writing $Q(x) = \beta(x, x)$, we have

$$\beta(x, y) = \frac{1}{4} \sum_{p = 0}^3 i^p Q(x + i^p y).$$

So given a quarter-turn $T$ on $X$, we can define a sesquilinear form $\beta$ by $\beta(x, y) = \langle T x, y \rangle$, and it follows immediately from this polarization identity that $\langle T x, y \rangle = 0$ for all $x, y$ — that is, $T = 0$.
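Here is a quick numerical check of that complex polarization identity, using the convention above (linear in the first slot, conjugate-linear in the second). The dimension and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

ip = lambda x, y: np.vdot(y, x)        # <x, y>: linear in x, conjugate-linear in y
Q = lambda w: ip(T @ w, w)             # Q(w) = <Tw, w>

for _ in range(100):
    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    y = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    rhs = 0.25 * sum((1j ** p) * Q(x + (1j ** p) * y) for p in range(4))
    assert abs(ip(T @ x, y) - rhs) < 1e-9   # <Tx, y> recovered from Q alone
```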

So we’ve now shown that over $\mathbb{R}$,

$$T \,\,\text{is a quarter-turn}\,\, \iff T^\ast = - T$$

but over $\mathbb{C}$,

$$T \,\,\text{is a quarter-turn}\,\, \iff T = 0.$$

Obviously everything I’ve said is very well-known to those who know it. (For instance, most of it’s in Axler’s Linear Algebra Done Right.) But how should I think about these results? How can I train my intuition so that the real and complex results seem simultaneously obvious?

Whatever the intuitive picture, here’s a nice consequence, also in Axler’s book.

This pair of results immediately implies that whether we’re over $\mathbb{R}$ or $\mathbb{C}$, the only self-adjoint quarter-turn is zero. Now let $T$ be any operator on a real or complex inner product space, and recall that $T$ is said to be normal if it commutes with $T^\ast$.

Equivalently, $T$ is normal if the operator $T^\ast T - T T^\ast$ is zero.

But $T^\ast T - T T^\ast$ is always self-adjoint, so $T$ is normal if and only if $T^\ast T - T T^\ast$ is a quarter-turn.

Finally, a bit of routine messing around with inner products shows that this is in turn equivalent to

$$\| T^\ast x \| = \| T x \| \,\,\text{for all}\,\, x \in X.$$

So a real or complex operator $T$ is normal if and only if $T^\ast x$ and $T x$ have the same length for all $x$.
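A small numerical illustration of the last equivalence (the particular matrices are just convenient examples): a permutation matrix is normal and satisfies $\|T^\ast x\| = \|T x\|$, while a nilpotent Jordan block is not normal and the two norms differ for a generic $x$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# A normal operator: the cyclic shift (a permutation matrix).
N = np.roll(np.eye(n), 1, axis=0)
assert np.allclose(N.conj().T @ N, N @ N.conj().T)            # N is normal
assert np.isclose(np.linalg.norm(N.conj().T @ x), np.linalg.norm(N @ x))

# A non-normal operator: the nilpotent Jordan block.
J = np.diag(np.ones(n - 1), k=1)
print(np.linalg.norm(J @ x), np.linalg.norm(J.conj().T @ x))   # generally differ
```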

Posted at November 29, 2016 11:14 PM UTC

TrackBack URL for this Entry:   https://golem.ph.utexas.edu/cgi-bin/MT-3.0/dxy-tb.fcgi/2924

24 Comments & 0 Trackbacks

Re: Quarter-Turns

The first thing that jumps out at me is that \mathbb{C} has a scalar quarter-turn already. Such is famously absent over the Reals.

Posted by: Jesse C. McKeown on November 30, 2016 1:42 AM | Permalink | Reply to this

Re: Quarter-Turns

Indeed, I don’t know why I don’t see more often an informal layman’s explanation of what $i$ “is”: it’s a $90^\circ$ turn (in the same way that $-1$ is a $180^\circ$ turn). It ought to remove a lot of mystery of what “imaginary numbers” are.

Posted by: Todd Trimble on November 30, 2016 11:25 AM | Permalink | Reply to this

Re: Quarter-Turns

That’s an interesting point, Jesse. Of course, $\langle i, 1 \rangle$ is not zero in the one-dimensional complex inner product space $\mathbb{C}$. Nevertheless, $\langle i, 1 \rangle = 0$ in the two-dimensional real vector space $\mathbb{C}$ with its obvious real inner product.

Posted by: Tom Leinster on November 30, 2016 12:22 PM | Permalink | Reply to this

Re: Quarter-Turns

There’s also a natural sense in which the “obvious” real inner product on a complex inner product space is the only natural one: it’s the only one that induces the same norm as the complex inner product.

Posted by: Mark Meckes on December 1, 2016 3:49 PM | Permalink | Reply to this

Re: Quarter-Turns

Here’s one way to think about it (in finite dimensions anyway, or with sufficient additional hypotheses in infinite dimensions). The inner product of vectors $\langle T x, x \rangle$ is equal to a Hilbert–Schmidt inner product of operators:
$$\langle T x, x \rangle = \langle T, x \otimes x \rangle_{HS}.$$
Here $x \otimes x$ is the self-adjoint rank-one operator given by
$$x \otimes x (y) = \langle y, x \rangle x.$$
So $T$ is a quarter-turn if and only if it is in the orthogonal complement of the span of self-adjoint rank-one operators.

Now if your scalar field is $\mathbb{R}$, then by the spectral theorem that span consists precisely of the self-adjoint operators. So quarter-turns are the operators which are (Hilbert–Schmidt-)orthogonal to self-adjoint operators; it’s not hard to show this is equivalent to $T^* = -T$.

On the other hand, if the scalar field is $\mathbb{C}$, then it’s still true that the real linear span of self-adjoint rank-one operators is the space of self-adjoint operators, but $S = i x \otimes x$ is not self-adjoint — it satisfies $S^* = -S$. So the complex linear span of self-adjoint rank-one operators is the space of all operators. In particular, it contains $T$ itself. So a quarter-turn on a complex inner product space is Hilbert–Schmidt-orthogonal to itself, hence $0$.
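Both ingredients of this argument are easy to check numerically. The sketch below (sizes and seed arbitrary) verifies $\langle T x, x \rangle = \langle T, x \otimes x \rangle_{HS}$ for a random complex $T$, and, for the real case, that a skew-symmetric matrix is Hilbert–Schmidt-orthogonal to every symmetric one.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
hs = lambda A, B: np.trace(B.conj().T @ A)      # <A, B>_HS = tr(B* A)

T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)

P = np.outer(x, x.conj())                       # matrix of the rank-one operator x ⊗ x
assert np.isclose(np.vdot(x, T @ x), hs(T, P))  # <Tx, x> = <T, x⊗x>_HS

# Real case: skew-symmetric is HS-orthogonal to symmetric.
A = rng.standard_normal((n, n))
S, K = A + A.T, A - A.T
assert abs(np.trace(S.T @ K)) < 1e-10
```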

Posted by: Mark Meckes on November 30, 2016 2:38 PM | Permalink | Reply to this

Re: Quarter-Turns

Thanks, Mark. After nearly a semester of teaching, my thinking has slowed to a crawl, so I’m going to have to take this bit by bit.

You mentioned the Hilbert-Schmidt inner product of operators. Let me try to explain what this is. In brief, I claim that it’s the only reasonable inner product on $Hom(X, Y)$, for inner product spaces $X$ and $Y$.

Start with the category of inner product spaces over $\mathbb{F}$ (either $\mathbb{R}$ or $\mathbb{C}$) and all linear maps between them. To make things easy, let’s stick to finite-dimensional spaces, although I know there are wider settings where you can make this work.

For any two inner product spaces $X$ and $Y$, we have the vector space $Hom(X, Y)$ of linear maps. Can we make it into an inner product space in a sensible way?

Surely any sensible method for doing this has these two properties:

  • The isomorphism of vector spaces $Hom(\mathbb{F}, Y) \cong Y$ preserves inner products. Here $Y$ is any inner product space, and $Hom(\mathbb{F}, Y)$ has the as-yet-unknown inner product on it.

  • The isomorphism of vector spaces $Hom(X_1 \oplus X_2, Y) \cong Hom(X_1, Y) \oplus Hom(X_2, Y)$ also preserves inner products. Here each of the three Hom-spaces has the as-yet-unknown inner product on it. On the right-hand side, we’re using the fact that when $V$ and $W$ are inner product spaces, $V \oplus W$ naturally acquires an inner product: $\langle (v, w), (v', w') \rangle = \langle v, v' \rangle + \langle w, w' \rangle$.

Putting these two requirements together says that for all $n \geq 0$ and inner product spaces $X_1, \ldots, X_n, Y$, the canonical isomorphism

$$Hom(X_1 \oplus \cdots \oplus X_n, Y) \cong Hom(X_1, Y) \oplus \cdots \oplus Hom(X_n, Y)$$

should preserve inner products.

And in fact, this requirement completely determines what the inner product must be. Taking $X_1, \ldots, X_n$ all to be the base field $\mathbb{F}$, it gives us an isomorphism of inner product spaces

$$Hom(\mathbb{F}^n, Y) \cong Y \oplus \cdots \oplus Y.$$

And since every inner product space is isomorphic to one of the form $\mathbb{F}^n$, this determines the inner product on $Hom(X, Y)$ for all inner product spaces $X$.

When you work out what this says explicitly, you find that this inner product is as follows. Given linear maps $\alpha, \beta : X \to Y$ and an orthonormal basis $x_1, \ldots, x_n$ of $X$,

$$\langle \alpha, \beta \rangle = \sum_{i = 1}^n \langle \alpha(x_i), \beta(x_i) \rangle$$

where the inner products on the right-hand side take place in $Y$. A neater way to write this is

$$\langle \alpha, \beta \rangle = tr(\beta^\ast \circ \alpha)$$

where $\beta^\ast : Y \to X$ is the adjoint of $\beta$. And that’s the definition of the Hilbert-Schmidt inner product on $Hom(X, Y)$.

The trace formulation makes clear that the definition does not depend on the choice of basis. It also suggests that the definition should work in any context where trace makes sense, beyond the finite-dimensional setting.
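For what it’s worth, the basis-independence is easy to see numerically too: the orthonormal-basis sum agrees with the trace formula for a randomly chosen orthonormal basis (a sketch, with arbitrary dimensions and seed).

```python
import numpy as np

rng = np.random.default_rng(5)
n, m = 4, 3
alpha = rng.standard_normal((m, n))     # a map X -> Y, dim X = n, dim Y = m
beta  = rng.standard_normal((m, n))

hs_trace = np.trace(beta.T @ alpha)     # tr(beta* alpha)

# Sum over a random orthonormal basis of X (columns of Q).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
hs_basis = sum((alpha @ Q[:, i]) @ (beta @ Q[:, i]) for i in range(n))

assert np.isclose(hs_trace, hs_basis)
```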

The Hilbert-Schmidt inner product has other good categorical properties. First note that the tensor product $V \otimes W$ of two inner product spaces $V$, $W$ naturally carries an inner product:

$$\langle v \otimes w, v' \otimes w' \rangle = \langle v, v' \rangle \, \langle w, w' \rangle.$$

Now it’s not hard to show that for any inner product spaces $X$, $Y$ and $Z$, the canonical isomorphism of vector spaces

$$Hom(X \otimes Y, Z) \cong Hom(X, Hom(Y, Z))$$

is in fact an isomorphism of inner product spaces. Here we’re using the natural inner product on $X \otimes Y$ and the Hilbert-Schmidt inner product on the Hom-spaces.

Now that I’ve figured all this out, I want to not call it the “Hilbert-Schmidt inner product”: it’s just the inner product on hom-spaces, the only one that’s sensible. Would you agree, Mark?

Right, now onto the rest of your comment!

Posted by: Tom Leinster on December 1, 2016 1:52 PM | Permalink | Reply to this

Re: Quarter-Turns

I agree that the Hilbert–Schmidt inner product is the only reasonable inner product on $Hom(X,Y)$ when $X$ and $Y$ are inner product spaces (at least, when there is no other additional structure assumed). But I would justify it in a different way.

As befits a category theorist, for you “an inner product on $Hom(X,Y)$” means “an assignment, for each pair of inner product spaces $X, Y$, of an inner product to $Hom(X,Y)$”, and you then proceed to show that there is a unique such assignment which interacts nicely with the structure of the category of inner product spaces.

But the Hilbert–Schmidt inner product on a single $Hom(X,Y)$ is also uniquely determined (up to normalization) by an invariance property that makes no mention of any other spaces. Namely, it is the unique (up to normalization) inner product on $Hom(X,Y)$ which induces a unitarily invariant norm:
$$\| T \|_{HS} = \| U T V \|_{HS}$$
for any unitary maps (i.e., isomorphisms of inner product spaces) $U: Y \to Y$ and $V: X \to X$. (I bet there’s actually a nice categorial way to state this property, but I haven’t ever bothered to figure out what it is.)
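That invariance is also a one-line numerical check (a sketch; the dimensions, seed and the QR trick for producing random unitaries are incidental choices).

```python
import numpy as np

rng = np.random.default_rng(6)
n, m = 4, 3

def haar_unitary(k):
    """A random k x k unitary matrix (QR of a complex Gaussian)."""
    Z = rng.standard_normal((k, k)) + 1j * rng.standard_normal((k, k))
    Q, _ = np.linalg.qr(Z)
    return Q

T = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
U, V = haar_unitary(m), haar_unitary(n)

hs_norm = lambda A: np.sqrt(np.trace(A.conj().T @ A).real)
assert np.isclose(hs_norm(U @ T @ V), hs_norm(T))    # ||UTV||_HS = ||T||_HS
```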

Now you may ask, how would I choose to justify the normalization of the Hilbert–Schmidt inner product? One answer is that I might not choose to: different normalizations turn out to be more convenient in different situations. Another answer is given by a property you alluded to above, but in slightly different form: with the usual normalization,
$$\langle v \otimes w, v' \otimes w' \rangle_{HS} = \langle v, v' \rangle \langle w, w' \rangle,$$
where I’m using $\otimes$ as in my comment above to denote rank-one operators, making this agree with the natural inner product on $V \otimes W$.

Posted by: Mark Meckes on December 1, 2016 3:23 PM | Permalink | Reply to this

Re: Quarter-Turns

In my first, unsatisfactory, attempt to figure out the “meaning” of the Hilbert-Schmidt inner product, what I actually ended up with was not $tr(\beta^\ast \alpha)$ but
$$\frac{1}{n} tr(\beta^\ast \alpha),$$
where $n = \dim X = tr(1_X)$. I seem to remember that cropping up in free probability theory, in keeping with the analogy between trace and expected value. So I’m guessing that’s the kind of alternative normalization that a random matrix theorist might have in mind.

Posted by: Tom Leinster on December 1, 2016 3:49 PM | Permalink | Reply to this

Re: Quarter-Turns

You’ve got it. More generally, in the context of operator algebras people often like to work with functionals $\varphi$ on an algebra normalized so that $\varphi(1) = 1$. Of course in infinite dimensions the identity map on a Hilbert space doesn’t have a trace, but when one specializes the operator-algebraic machinery to finite dimensions, a natural special case is $\varphi = \frac{1}{n} tr$.

Posted by: Mark Meckes on December 1, 2016 4:00 PM | Permalink | Reply to this

Re: Quarter-Turns

A colleague pointed out yet another way of seeing that the Hilbert-Schmidt inner product is a natural thing. Let $X$ and $Y$ be inner product spaces (finite-dimensional, say). We have
$$Hom(X, Y) \cong X^\ast \otimes Y.$$
As already mentioned, the tensor product of inner product spaces gets an inner product in a natural way. The dual $X^\ast$ of an inner product space $X$ also has a natural inner product. Indeed, we can just transport the inner product of $X$ across the isomorphism $X \cong X^\ast$ defined by $x \mapsto \langle -, x \rangle$. Putting this together gives an inner product on $Hom(X, Y)$, and it’s exactly Hilbert and Schmidt’s.

Posted by: Tom Leinster on December 5, 2016 2:17 PM | Permalink | Reply to this

Re: Quarter-Turns

Right. I meant to be alluding to that above, when pointing out that the Hilbert–Schmidt inner product, applied to rank-one operators, agrees with the natural inner product on a tensor product. What’s lurking in the background there are
$$Hom(X,Y) \cong X^\ast \otimes Y,$$
and $X^\ast \cong X$ naturally for a finite-dimensional inner product space. (Okay, since the whole point of this post is about things working out differently depending on whether the scalar field is $\mathbb{R}$ or $\mathbb{C}$, I should own up to the fact that the latter natural isomorphism is only conjugate-linear over $\mathbb{C}$. But for the application to characterizing quarter-turns, what matters is just orthogonality, which is unaffected by this particular subtlety.)

Posted by: Mark Meckes on December 5, 2016 2:50 PM | Permalink | Reply to this

Re: Quarter-Turns

Right, got it. So when you use “$x \otimes y$” to denote a linear map $X \to Y$, you’re implicitly moving across the isomorphisms

$$X \otimes Y \cong X^\ast \otimes Y \cong Hom(X, Y)$$

(give or take a conjugate). But here’s something I got a bit stuck on: is it obvious that

$$\langle T x, x \rangle = \langle T, x \otimes x \rangle?$$

I can calculate that it’s true… and indeed that

$$\langle T x, y \rangle = \langle T, x \otimes y \rangle$$

for arbitrary $x \in X$, $y \in Y$ and $T: X \to Y$. But how should I think about this?

Posted by: Tom Leinster on December 13, 2016 10:33 AM | Permalink | Reply to this

Re: Quarter-Turns

I have to say I find this description of the Hilbert-Schmidt inner product much more transparent than any of the others. Apparently it’s something I’m perfectly familiar with even though I didn’t know it had a fancy name. (-:

I think one way to think about $\langle T x, x \rangle = \langle T, x \otimes x \rangle$ is with string diagrams: there are just some strings being turned around. I don’t have the time right now to remember how to create such diagrams as images and post them here, but maybe you can recreate them yourself.

Another way to say the same thing is with abstract index notation for tensors: $x^a$ for a vector (a 1-0-tensor), $T^a_b$ for a transformation (a 1-1-tensor), $g_{a b}$ for the inner product and $g^{a b}$ for its inverse, hence $g^{a c} T^b_c$ for the transformation with one index raised to become a 2-0-tensor, and $g_{a c} g_{b d}$ for the induced metric on 2-0-tensors, so we have

$$\langle T x, x \rangle = g_{a b} (T^a_c x^c) x^b$$

$$\langle T, x \otimes x \rangle = (g_{a c} g_{b d}) (g^{a e} T^b_e) (x^c x^d)$$

which are equal since $g_{a c} g^{a e} = \delta_c^e$ (which in string diagrams is “pulling a bent string straight”). Although now that I’ve written it out, I suppose that’s probably essentially the same calculation you already did.
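For anyone who wants to see the index bookkeeping done by machine, here is an einsum version of exactly this computation, with a random positive-definite metric $g$ standing in for the inner product (dimension and seed arbitrary).

```python
import numpy as np

rng = np.random.default_rng(7)
n = 4

A = rng.standard_normal((n, n))
g = A @ A.T + n * np.eye(n)          # a positive-definite metric g_ab
g_inv = np.linalg.inv(g)             # g^ab
T = rng.standard_normal((n, n))      # T^a_b
x = rng.standard_normal(n)           # x^a

lhs = np.einsum('ab,ac,c,b->', g, T, x, x)               # g_ab (T^a_c x^c) x^b
rhs = np.einsum('ac,bd,ae,be,c,d->', g, g, g_inv, T, x, x)  # (g_ac g_bd)(g^ae T^b_e)(x^c x^d)
assert np.isclose(lhs, rhs)
```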

Posted by: Mike Shulman on December 13, 2016 10:53 AM | Permalink | Reply to this

Re: Quarter-Turns

Tom asked:

is it obvious that $\langle T x, x \rangle = \langle T, x \otimes x \rangle$?

What’s obvious is very much a function of what you take as your definitions (among other things). Here it’s better to think of the Hilbert–Schmidt inner product in terms of the tensor product, instead of in terms of the trace. If $T = u \otimes v$ is any rank-one map, then
$$\langle (u \otimes v)(x), y \rangle = \langle \langle u, x \rangle v, y \rangle = \langle u, x \rangle \langle v, y \rangle = \langle u \otimes v, x \otimes y \rangle.$$
(Please excuse sloppiness about conjugates or conventions about order.)

Since an arbitrary $T$ is a linear combination of rank-one $T$s, the identity follows.

That’s still a calculation, but a pretty simple one.

Posted by: Mark Meckes on December 13, 2016 8:18 PM | Permalink | Reply to this

Re: Quarter-Turns

I see. Thanks. Taking apart what you said, I see the three points:

  • The identity $\langle u, x \rangle \langle v, y \rangle = \langle u \otimes v, x \otimes y \rangle$ (for $u, x \in X$ and $v, y \in Y$).

    This is an immediate consequence of the definition of the tensor product of inner product spaces.

  • The fact that every rank-one map $X \to Y$ is of the form $u \otimes v$ for some $u \in X$ and $v \in Y$.

    To prove this, take any rank-one map $T : X \to Y$, let $u$ be any nonzero element of the orthogonal complement of $\ker(T)$, and let $v$ be a suitably-scaled element of the one-dimensional space $\mathrm{im}(T)$.

  • The fact that every linear map $T : X \to Y$ is a linear combination of rank-one maps.

    To prove this, just observe that the identity on $Y$ is a sum of rank-one maps, then compose.

Alternatively, one could just argue that $Hom(X, Y) \cong X^\ast \otimes Y$, so $T$ is a linear combination of maps of the form $u \otimes v$, which is what you wanted anyway. (No need to mention rank at all.)

To round out the picture, under the isomorphism $X \otimes Y \cong Hom(X, Y)$, the adjoint of $x \otimes y$ is $y \otimes x$. Moreover, in the case $X = Y$, we can only have $x \otimes y = y \otimes x$ if one of $x$ and $y$ is a scalar multiple of the other. So the rank-one self-adjoint operators on $X$ are exactly the operators of the form $x \otimes x$ — as you said at the start of this thread!
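These last facts are also easy to confirm numerically in the real case, where $u \otimes v$ is just an outer-product matrix (a sketch with arbitrary sizes and seed):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 5
x, y = rng.standard_normal(n), rng.standard_normal(n)

op = lambda u, v: np.outer(v, u)        # matrix of u ⊗ v : w |-> <w, u> v
assert np.allclose(op(x, y).T, op(y, x))    # adjoint of x⊗y is y⊗x
assert np.allclose(op(x, x), op(x, x).T)    # x⊗x is self-adjoint

T = rng.standard_normal((n, n))
# and the identity from earlier in the thread: <Tx, y> = <T, x⊗y>_HS
assert np.isclose((T @ x) @ y, np.trace(op(x, y).T @ T))
```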

Posted by: Tom Leinster on December 14, 2016 1:22 PM | Permalink | Reply to this

Re: Quarter-Turns

It’s worth mentioning the second most famous quarter-turn after the number ii: the Fourier transform!

$$F^4 = 1$$

The Fourier transform is just the quantization of multiplication by ii, so it’s not really a different example. Nonetheless, learning how they’re the same is quite a fascinating adventure.

Posted by: John Baez on December 5, 2016 6:16 AM | Permalink | Reply to this

Re: Quarter-Turns

That’s quite a striking statement! Pray tell, where may one set off on such an adventure?

Posted by: qf on December 6, 2016 11:47 PM | Permalink | Reply to this

Re: Quarter-Turns

Take a look at John’s lecture notes Quantization and Categorification - Fall 03. The quarter turn is explicitly mentioned on pp. 62-63 and p.68.

Did it crop up elsewhere in the seminars?

Posted by: David Corfield on December 13, 2016 9:24 AM | Permalink | Reply to this

Re: Quarter-Turns

I think it’s common knowledge that second quantization functorially gives a Hilbert space $K(H)$ for any Hilbert space $H$ and a unitary operator $K(U): K(H) \to K(H)$ for any unitary $U: H \to H$; this is what Edward Nelson meant when he said

Quantization is a mystery, but second quantization is a functor.

When we take $H = \mathbb{C}$ there’s a famous isomorphism between $K(H)$ and $L^2(\mathbb{R})$. Using this, and taking the unitary $U$ to be multiplication by $i$, the unitary $K(U) : K(H) \to K(H)$ corresponds to the Fourier transform $F: L^2(\mathbb{R}) \to L^2(\mathbb{R})$. Since second quantization is a functor, $i^4 = 1$ implies $F^4 = 1$.
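The Hilbert-space statement is beyond a quick numerical check, but its finite-dimensional shadow is easy to verify: the unitary discrete Fourier transform matrix also satisfies $F^4 = 1$ (a sketch; the size $n = 8$ is arbitrary).

```python
import numpy as np

n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n))
F = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)   # unitary DFT matrix

assert np.allclose(F.conj().T @ F, np.eye(n))                  # F is unitary
assert np.allclose(np.linalg.matrix_power(F, 4), np.eye(n))    # F^4 = 1
```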

What’s a good place to learn this stuff? I gave a pretty thorough introduction to second quantization in my Fall 2003 seminar and David spotted the place where I explained this. I learned a lot of this material from my thesis advisor Irving Segal, and one can see it spelled out in a painfully general and rigorous way in our book:

But it should be in some easier books. Perhaps

  • Gerald Folland, Harmonic Analysis on Phase Space.

has it — I forget! It deserves to be on Wikipedia but I don’t see it.

Posted by: John Baez on December 16, 2016 6:04 PM | Permalink | Reply to this

Re: Quarter-Turns

Sorry to be late to the party. In a jet-lag fug I thought I had posted a comment last week, but apparently not. Anyway, is the terminology standard? Two things stand out that seem to run against the impression you might get from the name. Firstly, four quarter-turns do not necessarily make a whole turn and, secondly, a rotation by 90 degrees about an axis in 3-space is not a quarter-turn!

Posted by: Simon Willerton on December 16, 2016 10:41 AM | Permalink | Reply to this

Re: Quarter-Turns

No, I just made it up. I agree that it has the disadvantages that you mention. Any better ideas?

Posted by: Tom Leinster on December 16, 2016 1:03 PM | Permalink | Reply to this

Re: Quarter-Turns

Here’s some standard stuff, which overlaps in an interesting way with Tom’s discoveries.

The usual term for an operator $T$ on either a real or complex Hilbert space obeying $T^\ast = -T$ is skew-adjoint, and we should definitely keep that. An operator with $T^\ast = T^{-1}$ is called orthogonal or unitary depending on whether it’s acting on a real or complex Hilbert space; sometimes people want a word that covers both cases, but none has caught on. Let me say ‘unitary’. If $T$ is unitary and skew-adjoint it clearly obeys $T^4 = 1$. If you have a unitary skew-adjoint operator on a real Hilbert space, you can use it as ‘multiplication by $i$’ to get a complex Hilbert space, so it’s often called a complex structure.
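The simplest example of all of this at once: rotation by $90^\circ$ on $\mathbb{R}^2$ is skew-adjoint, orthogonal, a complex structure, a quarter-turn in the sense of the post, and satisfies $J^4 = 1$. A tiny check:

```python
import numpy as np

# The standard complex structure on R^2: rotation by 90 degrees.
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

assert np.allclose(J.T, -J)                                    # skew-adjoint
assert np.allclose(J.T @ J, np.eye(2))                         # orthogonal ("unitary")
assert np.allclose(np.linalg.matrix_power(J, 4), np.eye(2))    # J^4 = 1
x = np.array([3.0, 4.0])
print(x @ J @ x)                                               # <Jx, x> = 0: a quarter-turn
```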

Posted by: John Baez on December 16, 2016 6:14 PM | Permalink | Reply to this

Re: Quarter-Turns

“Skewnitary”

Posted by: Layra Idarani on December 17, 2016 12:56 AM | Permalink | Reply to this

Re: Quarter-Turns

… yes, “Linear Tangent Field” is a bit of a mouthful.

Posted by: Jesse C. McKeown on December 17, 2016 5:16 AM | Permalink | Reply to this
