

December 1, 2007

Astronomers Destroy Universe

Posted by John Baez

On November 12th, Lawrence Krauss and James Dent wrote a paper, ‘The late time behavior of false vacuum decay’, that caused quite a kerfuffle in the media.

Most of the paper is quite technical, but on November 22, the New Scientist took a couple of sentences and blew them out of proportion in a story with a far-out title: Has observing the universe hastened its end?

On November 24, Krauss and Dent changed those sentences to something a bit more reasonable.

The bulk of the paper remains unchanged… but nobody ever talked about that part. It’s a cute example of how sensationalism amplifies the least reliable aspects of science, while ignoring the solid stuff.

Details follow…

For the most part, Krauss and Dent’s paper is an unremarkable analysis of what might happen if the universe were in a ‘false vacuum state’ — that is, a state with a bit more energy density than it would have in its ‘true ground state’.

Astronomers believe the Universe contains about a billionth of a joule of ‘dark energy’ per cubic meter. Could we be in a ‘false vacuum state’? If so, the false vacuum could decay into true vacuum! The universe could be unstable! But, it could last a long time.

A similar but much simpler problem is the decay of a radioactive atom. An atom of uranium-238 has a bit more energy before it decays than afterwards. Why doesn’t it decay right away? Because the nucleus must tunnel through a classically forbidden ‘barrier’ before it can shoot out an alpha particle and decay. Metaphorically, we’re talking about something like this: a potential with a higher local minimum (1), a classically forbidden barrier (2), and a lower minimum (3) on the far side.

The nucleus tunnels from 1 to 3 in a random sort of way, with a half-life of about 4.5 billion years. So, if you occasionally look at an atom of this isotope, the chance of it still being U-238 decreases exponentially… but very slowly.
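Just to put numbers on “very slowly”, here is the exponential survival law in a few lines of Python. The precise half-life plugged in below (4.468 billion years) is an illustrative assumption; the text above only says “about 4.5 billion”.

```python
# Survival probability of a single U-238 nucleus under the (approximately)
# exponential decay law: N(t)/N(0) = 2**(-t / half_life).
import math

half_life = 4.468e9                           # years -- "about 4.5 billion", as in the text
decay_const = math.log(2) / half_life         # decay rate lambda, per year

for t in (1.0, 1.0e6, 1.0e9, half_life, 13.7e9):    # times in years
    survival = math.exp(-decay_const * t)            # same as 0.5 ** (t / half_life)
    print(f"after {t:9.3g} years: probability still U-238 = {survival:.6f}")
```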

In theory, our universe could be in a similar situation. If it ‘decayed’ to a state of lower energy density, that could be really bad. What if the half-life were, say, 15 billion years? Then it might go poof! any day now. Or in just a few billion years!

This is way, way, way down on my list of worries. But, it’s a fun subject for theoretical physics papers.

There are lots of subtleties my simplified description overlooks. For one thing, the ‘decay’ probably wouldn’t happen everywhere at once — we’re talking quantum field theory here, not quantum mechanics. For another, we’re talking about quantum field theory on curved spacetime.

Krauss and Dent focus on another subtlety. I said that radioactive decay proceeds exponentially. But, this is only approximately true! There are some theorems that say there must be deviations from this exponential law. Under some reasonable assumptions, people think the approximate exponential decay switches over to an approximate power-law decay at late times.

This is a little-known and somewhat controversial fact, mainly because the deviations are very small in practice. As far as I know, they’ve never been seen!
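To get a feeling for how the crossover works, here is a numerical toy model — my own illustration, not anything from the Krauss–Dent paper. It uses the standard mechanism: a metastable state whose energy distribution is a pure Breit–Wigner line would decay exactly exponentially, but cutting the distribution off at a lower energy threshold (as a Hamiltonian bounded below forces you to) produces a power-law tail that eventually overtakes the exponential. All numbers are made up for illustration.

```python
# Toy exponential-to-power-law crossover (Khalfin-style mechanism, NOT the
# Krauss-Dent calculation): a Breit-Wigner energy distribution of width Gamma,
# truncated below the threshold E = 0.  Without the truncation the survival
# probability would be exactly exp(-Gamma*t); the threshold supplies the
# power-law tail.  All parameter values are arbitrary.
import numpy as np

E0, Gamma = 1.0, 0.05                         # resonance energy and width (arbitrary units)
E = np.linspace(0.0, 40.0, 400001)            # energy grid, E >= 0 only
dE = E[1] - E[0]
rho = (Gamma / (2 * np.pi)) / ((E - E0)**2 + (Gamma / 2)**2)
rho /= 0.5 * dE * np.sum(rho[:-1] + rho[1:])  # normalize the truncated line shape

def survival(t):
    """|A(t)|^2, where A(t) is the integral of rho(E) * exp(-i E t) over E >= 0."""
    f = rho * np.exp(-1j * E * t)
    amp = 0.5 * dE * np.sum(f[:-1] + f[1:])   # composite trapezoid rule
    return abs(amp)**2

for t in [10, 50, 100, 200, 400, 800]:
    print(f"t = {t:4d}:  with threshold = {survival(t):.3e}   pure exponential = {np.exp(-Gamma*t):.3e}")
```

At early times the two columns agree; by the last entry the exponential has fallen many orders of magnitude below the threshold-induced power-law tail.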

Krauss and Dent give a nice review of these slight deviations and then apply the idea to cosmology — that’s what ‘The late time behavior of false vacuum decay’ means.

They might be right, they might be wrong. Either way, it’s not the sort of thing you’d normally find huge crowds of people arguing about on the internet. You need to understand the Paley–Wiener theorem to really follow this stuff, for god’s sake!

But the last sentence of their abstract was this:

Several interesting open questions are raised, including whether observing the cosmological configuration of our universe may ultimately alter its mean lifetime.

Could just looking at the Universe speed or slow the decay of a false vacuum state? That’d sound completely crazy if you’d never heard of the quantum Zeno effect, where repeatedly observing a quantum system can keep it from changing its state. If you understand this effect, I think you’ll conclude that while it’s not completely crazy, it’s just wrong to worry about human observations of the cosmos affecting the decay of a false vacuum state. For one thing, the relevant concept of “observation” in the quantum Zeno effect is not at all anthropomorphic. The universe was “observing” itself long before we ever showed up.
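For readers who haven’t met the quantum Zeno effect, here is a minimal numerical sketch of the textbook version — a two-level system that would oscillate out of its initial state, frozen in place by frequent projective measurements. It’s purely illustrative: nothing in it models a false vacuum, and every parameter is made up.

```python
# Textbook quantum Zeno effect: a two-level system with H = (omega/2) * sigma_x,
# watched by N equally spaced projective measurements over a fixed total time T.
# Left alone (N = 1), it has completely left its initial state by time T;
# watched often enough, it hardly leaves at all.
import numpy as np

omega = 1.0                                  # Rabi frequency of the unwatched transition
T = np.pi / omega                            # total time; unwatched survival probability at T is zero
psi0 = np.array([1.0, 0.0], dtype=complex)   # the "undecayed" state
sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])

for N in [1, 2, 4, 16, 64, 256]:
    theta = 0.5 * omega * (T / N)            # U = exp(-i H T/N) = cos(theta) I - i sin(theta) sigma_x
    U = np.cos(theta) * np.eye(2) - 1j * np.sin(theta) * sigma_x
    p_survive = 1.0
    for _ in range(N):
        psi = U @ psi0                       # evolve the post-measurement state for time T/N
        p_survive *= abs(psi[0]) ** 2        # probability this measurement still finds it "undecayed";
                                             # the projective measurement then resets the state to psi0
    print(f"N = {N:3d} measurements: survival probability = {p_survive:.4f}")
```

Note that nothing in the effect cares who or what does the “measuring” — which is exactly the point being made above.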

Krauss and Dent don’t say much about this, except in the last two sentences of the paper:

If observations of quantum mechanical systems reset their clocks, which has been observed for laboratory systems, then by measuring the existence dark energy [sic] in our own universe have we reset the quantum mechanical configuration of the universe so that late time will never be relevant? Put another way, could internal observations of the state of a metastable universe affect its longevity?

But then the folks at New Scientist got ahold of this and ran with it. Who cares that nobody knows if we’re in a false vacuum state. And forget the quantum Zeno effect! What if observing the universe made the false vacuum decay faster? Then we could blame cosmologists for hastening the end of the Universe!

Now that’s a story! So, the New Scientist published this:

Has observing the universe hastened its end?

22 November 2007
Marcus Chown

Have we hastened the demise of the universe by looking at it? That’s the startling question posed by a pair of physicists, who suggest that we may have accidentally nudged the universe closer to its death by observing dark energy, which is thought to be speeding up cosmic expansion.

Lawrence Krauss of Case Western Reserve University in Cleveland, Ohio, and colleague James Dent suggest that by making this observation in 1998 we may have caused the universe to revert to a state similar to early in its history, when it was more likely to end. “Incredible as it seems, our detection of the dark energy may have reduced the life-expectancy of the universe,” says Krauss.

[…]

Clearly Krauss shares a lot of the blame if he actually said that sentence.

Then folks at newspapers like the Telegraph read the New Scientist article and processed it into something even more sensational. Instead of a question, the headline is now a bald statement of ‘fact’:

Mankind ‘shortening the universe’s life’

By Roger Highfield, Science Editor

Forget about the threat that mankind poses to the Earth: our very ability to study the heavens may have shortened the inferred lifetime of the cosmos.

[read on - the version you’ll see has been edited to make it less nutty than the original]

Then the blogosphere kicked into action, with Slashdot advertising the story and Peter Woit pouring some much-needed cold water on it.

Krauss got embarrassed, and on November 24 he wrote on Woit’s blog:

Hi. I wanted to chime in with an apology of sorts regarding the confusion in the press regarding our work. Our paper was in fact about late-decaying false vacuum decay and its possible cosmological implications. Needless to say, the explosion of press interest, prompted by the final two sentences of the paper, misrepresented the work, which was not intended to imply causality, but rather to ask the question of whether by cosmological measurements we constrain the nature of the quantum state in which we find ourselves, inferring perhaps that we are not in the late-decaying tail. However, I do take responsibility in part for the flood, as I was undoubtedly glib in talking to the new scientist reporter who read the paper on the arxiv. I have learned that one must be extra careful in order not to cause such misrepresentations in the press, and I should know better. In any case, the last two sentences of the paper have been revised so that it should be clear to the press that causality will not be implied. mea culpa

He changed the last sentence in the paper’s abstract from this:

Several interesting open questions are raised, including whether observing the cosmological configuration of our universe may ultimately alter its mean lifetime.

to this:

Several interesting open questions are raised, including whether observing the cosmological configuration of a metastable universe can constrain its inferred lifetime.

Note how ‘alter’ becomes ‘constrain’. And, he changed the last two sentences in their paper to this:

Have we ensured, by measuring the existence dark energy [sic] in our own universe, that the quantum mechanical configuration of the universe is such that late time decay is not relevant? Put another way, what can internal observations of the state of a metastable universe say about its longevity?

This isn’t terribly clear. But the tempest in the teacup has blown over — apparently with no one the wiser about the actual contents of the paper, or the fascinating issue of deviations from exponential decay.

To begin remedying the last point, here’s an old post of mine on sci.physics.research, written on May 23, 1992:

Carlo Graziani writes:

Exponential decay is in fact a theoretical necessity. It is a generic quantum mechanical feature of problems in which you have a discrete state (e.g. an excited atomic or nuclear state) coupled to a continuum of states (e.g. the atomic or nuclear system in the ground state and an emitted photon flitting around somewhere). There is nothing ad hoc about it. The original paper is Weisskopf & Wigner, 1930, Z. Physik, 63, 54. If you can’t get a translation from German, (or don’t speak German), see Gasiorowicz, “Quantum Physics” 1974 (Wiley), pp 473-480, or Cohen-Tannoudji, Diu, & Laloe, “Quantum Mechanics” 1977 (Wiley), vol 2, pp. 1343–1355.

The essence of the result is the effective modification of the energy of the excited state by a small complex perturbation,

$$E \mapsto E + (dE - i R/2)$$

where $dE$ is the small radiative energy correction (Lamb shift) and $R$ is the decay rate. The time dependent phase factor is thus also modified:

$$\exp(-i E t) \mapsto \exp[-i(E + dE)t]\,\exp[-R t/2].$$

This is the source of the decay; probabilities, which go as the square of the amplitudes, will exhibit a time dependence $\exp[-R t]$.

This is indeed the conventional wisdom. Let me begin by saying:

1) I agree that the exponential decay law is backed up by theory in this sort of situation and is far from an ad hoc “curve fitting” sort of thing.

2) The exponential law is apparently an excellent approximation, and as far as I know no deviations from it have ever been observed. Here I am not talking about the (necessary) deviations due to finite sample size. I am talking about deviations present in the limit as the sample size approaches infinity.

3) If you ever wanted someone to actually calculate a decay rate for you, I’m sure Graziani would do a whole lot better job than I would. What follows has nothing to do with the important job of getting an answer that’s good enough for all practical purposes. It is a matter of principle (my specialty). There’s no real conflict.

Okay. So, Graziani has offered the conventional wisdom, what everyone knows about radioactive decay, that it is a “theoretical necessity”. It’s precisely because this is so well-entrenched that I thought I should point out that one can easily prove that quantum-mechanical decay processes cannot be EXACTLY exponential. There are approximations in all of the arguments Graziani cites.

Let me just repeat the proof that decay processes aren’t exactly exponential. It uses one mild assumption, but if the going gets rough I imagine someone will raise questions about this assumption. It’d be nice to get a proof with even weaker assumptions; I vaguely recall that one could use the fact that the Hamiltonian is bounded below to do so.

This is just the proof that Robert Israel gave a while ago (an improved version of mine).

Let $\psi$ be the wavefunction of a “new-born radioactive nucleus”, together with whatever fields are involved in the decay. Let $P$ be the projection onto the space of states in which the nucleus has NOT decayed. Let $H$ be the Hamiltonian, a self-adjoint operator. The probability that at time $t$ the system will be observed to have NOT decayed is

$$\|P \exp(-i t H) \psi\|^2$$

The claim is that this function cannot be of the form $\exp(-k t)$ for all $t \gt 0$, where $k$ is some positive constant.

Just differentiate this function with respect to $t$ and set $t = 0$. First, rewrite the function as

$$\langle \exp(-i t H)\psi,\; P \exp(-i t H)\psi\rangle,$$

and then differentiate to get

$$\langle -i H \exp(-i t H)\psi,\; P \exp(-i t H)\psi\rangle + \langle \exp(-i t H)\psi,\; -i P H \exp(-i t H)\psi\rangle$$

and set $t = 0$ to get

$$\langle -i H \psi, \psi\rangle + \langle \psi, -i H \psi\rangle = 0$$

Here we are using $P\psi = \psi$. Since we get zero, the function could not have been equal to $\exp(-k t)$ for $k$ nonzero.

That should satisfy any physicist. A mathematician will worry about why we can differentiate the function. This is a simple issue if you know about unbounded self-adjoint operators. (Try Reed and Simon’s Methods of Modern Mathematical Physics vol. I: Functional Analysis, and vol. II: Fourier Analysis and Self-Adjointness.) For the function to be differentiable it suffices that $\psi$ is in the domain of $H$. For physicists, this condition means that $\|H \psi\| \lt \infty$.

[Let me put in a digression only to be read by the most nitpicky of nitpickers, e.g. myself. An excited state $\psi$, while presumably an eigenvector for some “free” Hamiltonian which neglects the interactions causing the decay, is not an eigenvector for the true Hamiltonian $H$, which of course is why it doesn’t just sit there. One might worry, then, that the eigenvector $\psi$ of the “free” Hamiltonian is not in the domain of the true Hamiltonian $H$. This is a standard issue in perturbation theory and the answer depends on how singular the perturbation is. Certainly for perturbations that can be treated by Kato-Rellich perturbation theory any eigenvector of the free Hamiltonian is in the domain of the true Hamiltonian $H$, cf. Thm X.13 vol. II R&S. But I claim that this issue is a red herring, the real point being that any state we can actually prepare has $\|H \psi\| \lt \infty$. Instead of arguing about this, I would hope that any mathematical physicists would just come up with a theorem with weaker hypotheses.]

As Israel pointed out, this argument shows what’s going on: when you are SURE the nucleus has not decayed yet (i.e., it’s “new-born”), the decay rate must be zero; the decay rate then can “ramp up” very rapidly to the value obtained by the usual approximate calculations.
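Here is a quick numerical sanity check of that argument, using a finite-dimensional stand-in: a random Hermitian matrix for $H$, with $P$ the projection onto the initial state. It has nothing to do with any particular decay process; it just exhibits the zero slope at $t = 0$ — the deficit $1 - \|P \exp(-i t H)\psi\|^2$ shrinks like $t^2$, not like $t$, so an exact $\exp(-k t)$ is ruled out.

```python
# Finite-dimensional toy check: for a random Hermitian H and a state psi with
# P psi = psi (here P = |psi><psi|), the survival probability has ZERO slope at
# t = 0, so 1 - P(t) goes like t^2 for small t, whereas an exact exponential
# exp(-k*t) would give 1 - P(t) ~ k*t.  Dimension and seed are arbitrary.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 50
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2                     # a random Hermitian "Hamiltonian"

psi = np.zeros(n, dtype=complex)
psi[0] = 1.0                                 # the "undecayed" state

def survival(t):
    """||P exp(-i t H) psi||^2 with P = |psi><psi|."""
    return abs(np.vdot(psi, expm(-1j * t * H) @ psi)) ** 2

for t in [0.02, 0.01, 0.005, 0.0025]:
    deficit = 1.0 - survival(t)
    print(f"t = {t:6.4f}:  1 - P(t) = {deficit:.3e}"
          f"   (1 - P)/t = {deficit / t:.3f}   (1 - P)/t^2 = {deficit / t**2:.2f}")
```

The middle column heads to zero as $t \to 0$ (no linear term), while the last column settles down to a constant — the variance of the energy in the state $\psi$, which is finite when $\|H\psi\| \lt \infty$, matching the domain condition above.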

Physicists occasionally mistrust mathematicians on matters such as these. Arcane considerations about the domains of unbounded self-adjoint operators probably only serve to enhance this mistrust, which is ironic, of course, since the mathematicians are simply trying to avoid sloppiness. In any event, just to show that this isn’t something only mathematicians believe in, let me cite the paper:

  • Grotz and Klapdor, Time scale of short time deviations from exponential decay, Phys. Rev. C 30 (1984), 2098–2100.

“In this Brief Report we discuss critically whether such quantum mechanically rigorously demanded deviations from the usual decay formulas may lead to observable effects and give estimates using the Heisenberg uncertainty relation.

It is easily seen that the exponential decay law following from a statistical ansatz is only an approximation in a quantum mechanical description. [Gives essentially the above argument.] So for very small times, the decay rate is not constant as characteristic for an exponential decay law, but varies proportional to $t$. [….] Equations (2) and (3) tell us that for sufficiently short times, the decay rate is whatever [arbitrarily - these guys are German] small. However, to make any quantitative estimate is extremely difficult. Peres uses the threshold effect to get a quantitative estimate for the onset of the exponential decay […] Applying this estimate to double beta decay yields approximately $10^{-21}$ sec, which is much too small to give any measurable effect. [They then go on to argue with Peres.]”

This is all I want to say about this, unless someone has some nice theorems about the allowed behavior of the function $\|P \exp(-i t H) \psi\|^2$ when $H$ is bounded below and $\psi$ is not necessarily in the domain of $H$. (This would probably involve extending $t$ to a complex half-plane.)

I sound like quite a showoff in that post — am I still that bad, 15 years older? I hope not. I probably am.

My post was far from definitive. The argument I gave uses the assumption $P\psi = \psi$ — it assumes that at the start of the experiment, we’re sure the atom has not decayed. Do we need that? Also, it treats the situation as a closed system. What if we treat it as an open system? Also, Grotz and Klapdor talk about short-time deviations from exponential decay. What about long-time deviations? How does the Paley–Wiener theorem get into the act?

And: has anyone ever seen deviations from exponential decay for radioactive nuclei or similar systems?

There are lots of interesting questions. You can learn a lot about some of them — though not, alas, the last — from the paper by Krauss and Dent. But you’d never know that from the media kerfuffle.

Posted at December 1, 2007 9:18 PM UTC


16 Comments & 0 Trackbacks

Re: Astronomers Destroy Universe

You need to understand the Paley–Wiener theorem to really follow this stuff, for god’s sake!

Next you’ll be telling us we need to understand Lie theory in order to follow discussions about $E_8$.

And forget the quantum Zeno effect! What if observing the universe made the false vacuum decay faster?

I knew something sounded horribly backwards about the whole kerfuffle. But no, a thousand Slashdotters can’t be wrong.

Hey, maybe we’ve finally found the reason why the aliens build “Bubbles” around solar systems full of impudent astronomers…

Posted by: Blake Stacey on December 2, 2007 3:51 AM | Permalink | Reply to this

Re: Astronomers Destroy Universe

Blake wrote:

Next you’ll be telling us we need to understand Lie theory in order to follow discussions about $E_8$.

You need to understand lie theory to make sense of science reporting on any of these issues.

Posted by: John Baez on December 2, 2007 4:14 AM | Permalink | Reply to this

Re: Astronomers Destroy Universe

Once upon a time, I thought you could trust a journalist to provide a semi-simple representation of a complex issue.

Posted by: Blake Stacey on December 2, 2007 4:42 AM | Permalink | Reply to this

Re: Astronomers Destroy Universe

The proof you gave seems to suggest deviation from exponential form for early times. There are also fascinating issues to do with deviations from pure exponential at (very) late times, quasi-periodic behavior and Poincare recurrences. Good reference (and an all around essential reading) is Maldacena’s “Eternal black holes in anti-de Sitter”, hep-th/0106112.

Posted by: Moshe on December 3, 2007 7:08 PM | Permalink | Reply to this

Re: Astronomers Destroy Universe

The deviations at late times are also what Krauss and Dent are interested in — but not because of something like Poincaré recurrence. Apparently most decay processes switch over from exponential decay to power-law decay at late times. The question is, when do ‘late times’ start? That’s what they’re studying for the decay of false vacuum states.

I would like to understand it for something simpler, like radioactive decay.

Apparently this paper is important:

  • L.A. Khalfin, Soviet Phys. JETP 6 (1958), 1053.

But, I’m too lazy to go dig it out of the library!

This one also looks interesting:

  • R. E. Parrott and J. Lawrence, Persistence of exponential decay for metastable quantum states at long times, Europhys. Lett. 57 (2002), 632–638.

    Abstract: Quantum dynamics predicts that a metastable state should decay exponentially except at very early and very late times. We show through an exactly soluble model that if the decay products can interact weakly with their environment, then the exponential decay regime is prolonged to later times, while the exponential decay constant itself remains essentially unaffected. As the number of environmental degrees of freedom is increased, the asymptotic late-time decay follows higher powers of 1/t and the exponential decay regime is extended without limit.
Posted by: John Baez on December 3, 2007 7:24 PM | Permalink | Reply to this

Re: Astronomers Destroy Universe

Yeah, even power law behavior decays all the way down to zero so has nothing to do with recurrences (those probably have to do with breakdown of semi-classical approximations).

I took a quick glance at PRD 24, 496, which is well written and has the advantage of being easily available… there the power law behavior was related to the existence of an additional saddle point (the “slide”), and more physically to the existence of a continuum of states the system can decay into. The latter makes it all seem consistent with the intuition about QM decay from one isolated local minimum to another. I’d expect such decay (in QM) to be purely exponential at late time, until the semi-classical approximation breaks down and recurrences occur etc.

The case of field theory (with or without gravity) is much more interesting…

Posted by: Moshe on December 3, 2007 8:05 PM | Permalink | Reply to this

Re: Astronomers Destroy Universe

I’m already interested in plain old quantum mechanics. Krauss and Dent speak of semiclassical steepest-descent calculations and say:

This approach, at least in the context of a simple quantum mechanical model, was used to calculate non-exponential decay at large times in [A. Patrasciouiu, Phys. Rev. D24 (1981), 496]. For the simple case of a particle trapped in a square potential well the Euclidean (imaginary time in Minkowski space) kernel was calculated and at large times was shown to exhibit a power law behavior $\sim T^{-3/2}$.

I haven’t had a chance to read Patrasciouiu yet. Is he talking about a single particle in a square well in nonrelativistic quantum mechanics (in which case K&D’s allusions to “Minkowski space” are a bit misleading)? Why do we need fancy steepest-descent approximations to calculate what a particle in a square well will do?

Are you saying that the power law decay at large times here is due to the fact that the particle can tunnel to a continuum of states?

Posted by: John Baez on December 4, 2007 6:30 AM | Permalink | Reply to this

Re: Astronomers Destroy Universe

I’ll try to look at it again today, but your last sentence summarizes my first impression. In nuclear decay the result is a continuum of states, and the power law behavior at late (but not very late) times is intimately related to that. Incidentally, the Patrasciouiu paper claims to reproduce WKB results by steepest descent methods, making an interesting observation that real or imaginary time solutions only are insufficient for that purpose.

Posted by: Moshe on December 4, 2007 3:14 PM | Permalink | Reply to this

Re: Astronomers Destroy Universe

I had some time to look at K+D, and some of their references. The power law decay in all cases is related to the existence of a continuous spectrum, and will not appear otherwise. Notice that all the old references talk about decay of unstable particles, not the more general problem of tunneling, so the assumption of continuous spectrum is sometimes implicit. In the K+D paper this assumption is also implicitly made in stating (following their reference 3) that the resolvent of the Hamiltonian has a branch cut, and not just poles. I am puzzled then why this result is at all relevant to the general case of false vacuum decay.

Posted by: Moshe on December 4, 2007 11:52 PM | Permalink | Reply to this

Re: Astronomers Destroy Universe

One more piece of babble, then I’ll stop. I pieced together a picture of the behavior of the decay at late times. Paley + Wiener theorem guarantees this falls off slower than exponential at late times. I see two ways this happens for an unstable particle:

1. If space is infinite the particle can completely decay, meaning the survival probability goes to zero. In that case the spectrum is continuous, and the decay falls off as power law at late times, and there is no issue with recurrences. The potential relevant for that is not what you drew in your post, but one for which V(x) goes to zero at large x.

2. If space is finite there is never a power law regime. However in that case the particle cannot completely decay: there is always some small but finite probability for the decay products to recombine into the initial state. So in that case the late time behavior cannot be exponential for different reasons, essentially the recurrence story.

Comments welcome; again I am not sure how any of this is related to the story of Coleman and collaborators, which typically has potential with two isolated minima.

Posted by: Moshe on December 5, 2007 1:23 AM | Permalink | Reply to this

Re: Astronomers Destroy Universe

Thanks for your enlightening comments, Moshe. I wish I understood this stuff better… decay processes are clearly an important, basic topic in physics!

It’s sort of scandalous how almost everyone thinks decay goes exponentially. Do any textbooks treat this issue clearly? I imagine Sommerfeld would have done a good job if he’d treated it at all.

Maybe I should restart my series on Gnarly Issues in Physics once I get a wiki up and running. It would be a good way to gradually collect information on questions like this.

Another classic puzzle, which Sommerfeld does treat nicely: can group velocities be faster than light (yes) and does this allow us to transmit information faster than light (no). Amusingly, this too involves a lot of contour integrals, branch cuts and the like.
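Since that puzzle also comes down to dispersion relations and contour integrals, here is a small numerical sketch of the first half of it — an illustration of my own, not Sommerfeld’s analysis. It uses the textbook single-resonance Lorentz-oscillator model of a refractive index: in the band of anomalous dispersion near the resonance, the group index $n_g = n + \omega\, dn/d\omega$ drops below 1 (even below 0), so the group velocity $c/n_g$ exceeds $c$. All parameter values are arbitrary.

```python
# Lorentz-oscillator model of a refractive index, showing anomalous dispersion:
# near the resonance the group index n_g = n + omega * dn/domega dips below 1
# (and goes negative), i.e. the group velocity c/n_g exceeds c.  Parameters
# are arbitrary illustrative values.
import numpy as np

omega0, omega_p, gamma = 1.0, 0.3, 0.05       # resonance, plasma frequency, damping (arb. units)

def n_real(w):
    eps = 1.0 + omega_p**2 / (omega0**2 - w**2 - 1j * gamma * w)   # complex permittivity
    return np.sqrt(eps).real                                       # real part of refractive index

omega = np.linspace(0.8, 1.2, 2001)
n = n_real(omega)
n_group = n + omega * np.gradient(n, omega)   # group index; group velocity is c / n_group

i = int(np.argmin(n_group))
print(f"minimum group index n_g = {n_group[i]:.2f} at omega = {omega[i]:.3f} (in units of omega0)")
print("wherever n_g < 1 (or < 0), the group velocity exceeds c (or points backwards)")
```

The second half of the puzzle — that no signal outruns the front, which still travels at $c$ — is exactly the part that needs the careful contour-integral analysis.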

Posted by: John Baez on December 5, 2007 6:03 PM | Permalink | Reply to this

Re: Astronomers Destroy Universe

Actually, this is a small error in my mind, even for questions of principle. For the potential you drew, or the false vacuum decay discussed by Krauss and Dent, the survival probability is exponential all the way to the recurrence time. For later times the probabilities discussed are so small I am not sure I believe the concept makes sense. It will need a strictly infinite ensemble to discuss probabilities beyond that point, and I have an instinctive distrust of any concept that is well defined only in the strictly infinite case (though I stop short of being a fan of constructive mathematics either).

Posted by: Moshe on December 5, 2007 6:32 PM | Permalink | Reply to this

Re: Astronomers Destroy Universe

You can watch John Horgan and George Johnson talking about the Krauss kerfuffle over at bloggingheads.

They think it’s one of many recent episodes of science journalists jumping the shark.

You gotta feel sort of sorry for Krauss at this point, as more and more vultures keep circling in…

Posted by: John Baez on December 4, 2007 4:02 AM | Permalink | Reply to this

Re: Astronomers Destroy Universe

Horgan and Johnson have criticized Carl Zimmer, among others; in this case, I think they’re in error. Rather than go on at great length, I’ll just say I agree with Zimmer’s own self-defense.

Posted by: Blake Stacey on December 4, 2007 6:20 PM | Permalink | Reply to this

Re: Astronomers Destroy Universe

Just curious, but as I understand the Quantum Zeno effect, it delays transitions by resetting the clock on the half life for the event. Doesn’t that mean we should be forcing the Astronomers to look more so as to delay the “End of the Universe”?

Posted by: Mark Biggar on December 6, 2007 12:18 AM | Permalink | Reply to this

Re: Astronomers Destroy Universe

Mark wrote:

Doesn’t that mean we should be forcing the Astronomers to look more so as to delay the “End of the Universe”?

Well, this is certainly a great way for astronomers to justify their NSF grants: “Give me more telescope time or the Universe is a goner!”

But, as I said, the quantum Zeno effect is most unlikely to come into play in situations like this:

If you understand this effect, I think you’ll conclude that while it’s not completely crazy, it’s just wrong to worry about human observations of the cosmos affecting the decay of a false vacuum state. For one thing, the relevant concept of “observation” in the quantum Zeno effect is not at all anthropomorphic. The universe was “observing” itself long before we ever showed up.

To make matters worse, I have no idea what sort of “reverse quantum Zeno effect” Larry Krauss or Marcus Chown could have been thinking about, which would make a transition happen faster when you observe a system.

James Dolan called it the “quantum Coyote effect”, based on the cartoon where Wile E. Coyote, busily chasing the Roadrunner, runs off a cliff but doesn’t fall down until he observes that he’s running in mid-air.

Unfortunately, I know of no such actual effect! Except, of course, in cartoon physics, where it’s a special case of this law:

Cartoon characters can achieve deity-like powers by simply being ignorant of physical or biological laws, as in Water, Water Every Hare, where Bugs Bunny can breathe (and snore) underwater because he is not awake and therefore unaware that he would drown.

Posted by: John Baez on December 6, 2007 2:44 AM | Permalink | Reply to this
