July 9, 2004

LHC

After Strings 2004, I spent the weekend in Paris, and hopped on the TGV to Geneva, where I’ve been spending the week at CERN. The big excitement here, of course, is the LHC, which is scheduled to turn on in the summer of 2007. The first half of the week was devoted to a Strings Workshop, where Fabiola Gianotti (of the ATLAS Collaboration) gave the lead-off lecture.

Wednesday afternoon, we got to tour the two main experimental facilities (ATLAS and CMS) and the magnet testing facility.

Since some of my readers aren’t physicists, let me reverse the order of things, and talk about the computing challenges involved. High-energy experimental physics has always been on the cutting edge of computing technology (the WorldWide Web, you’ll recall, was invented here at CERN), but the LHC experiments raise the bar considerably.

Because of the extraordinarily high luminosity of the LHC ($10^{34}\,\text{cm}^{-2}\text{s}^{-1}$), each detector will “see” $10^9$ collisions/s. An impressive amount of computation takes place in the custom ASICs in the fast electronic triggers, which whittle those $10^9$ events down to 100 “interesting” events/s. Here, I have already oversimplified the problem. The proton bunches arrive 25 ns apart. Each bunch crossing produces about 25 events. 25 ns is a short period of time, shorter than the time it takes a particle traveling at the speed of light to cross the detector. So there are a large number of events happening all-but-simultaneously, each producing hundreds of charged-particle tracks. ATLAS’s level-1 triggers have the job of disentangling those multiple events and cutting the event rate from $10^9$/s to $10^4$/s, with a discrimination time of 2 μs. To do this, the level-1 calorimetry trigger has to analyse more than 3000 Gbits/s of input data. The level-2 triggers cut the event rate further, down to 100 events/s. These 100 events then need to be “reconstructed”. That’s done “offline” at a processor farm. Each event requires about 1 second of processing on a 1000 MIPS processor. The reconstructed event is whittled down to 0.5 MB of data. You can probably guess where I’m headed. 0.5 MB/event × 100 events/s: in a year of running, you accumulate … 1.5 petabytes (1.5 million gigabytes) of data¹.
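
To make those numbers concrete, here’s a quick back-of-the-envelope sketch in Python. The inelastic pp cross-section (~100 mb) and the use of a full calendar year (~3×10⁷ s) are my assumptions, chosen to match the quoted figures; everything else comes from the numbers above.

```python
# Back-of-the-envelope check of the LHC trigger/storage numbers.
# The inelastic pp cross-section (~100 mb) and the "calendar year"
# of running are assumptions; the other figures are from the post.

LUMINOSITY = 1e34              # cm^-2 s^-1  (LHC design luminosity)
SIGMA_INELASTIC = 100e-27      # cm^2  (~100 mb, assumed)

# Raw collision rate: L * sigma ~ 1e9 events/s
collision_rate = LUMINOSITY * SIGMA_INELASTIC
print(f"raw collision rate: {collision_rate:.0e} events/s")

# Cross-check from the bunch structure: crossings every 25 ns,
# about 25 events per crossing.
crossing_rate = 1 / 25e-9      # 4e7 bunch crossings per second
print(f"from bunch structure: {crossing_rate * 25:.0e} events/s")

# After the trigger chain (level-1: 1e9/s -> 1e4/s, level-2: -> 100/s),
# each surviving event is reconstructed down to 0.5 MB and stored.
STORED_RATE = 100              # events/s written out
EVENT_SIZE = 0.5e6             # bytes per reconstructed event
SECONDS_PER_YEAR = 3e7         # roughly one calendar year (assumed)

data_per_year = STORED_RATE * EVENT_SIZE * SECONDS_PER_YEAR
print(f"data per year: {data_per_year / 1e15:.1f} PB")   # ~1.5 PB
```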

And then begins the data analysis …

[Image: live webcam view of the ATLAS cavern]

The construction engineering challenges are equally impressive. The CMS detector (the smaller, but heavier, of the two) is a 12,500-tonne instrument (the magnetic return yoke contains more iron than the Eiffel Tower), containing a 6 m diameter superconducting solenoid, operating at 4 Tesla. The various pieces need to be positioned with a precision of 0.01 mm, after having been lowered into a pit 100 m underground. The ATLAS cavern, 55 m long, 40 m high and 35 m wide (also located 100 m below the surface) is the largest man-made underground structure in the world, and the ATLAS detector, 45 m long, 12 m in diameter, just barely squeezes inside.

[Image: schematic cross-sectional view of the CMS detector]

Engineering challenges aside, what we really care about is the physics.

The LHC will reach a center-of-mass energy of $\sqrt{s} = 14\,\text{TeV}$, with the aforementioned luminosity of $10^{34}\,\text{cm}^{-2}\text{s}^{-1}$. Most optimistically, this gives them a “reach” for the discovery of new particles of up to $m \sim 6\,\text{TeV}$. Realistically, what one can see depends very much on the decay channel. Leptons and photons are very well discriminated. Fully hadronic final states can only be extracted from the (huge) QCD background with a hard $p_T > 100\,\text{GeV}$ cut, which limits them to only very heavy objects.
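
For orientation, it helps to convert the instantaneous luminosity into an integrated luminosity per year. A minimal sketch, assuming the conventional ~10⁷ s of effective beam time per “year” (my assumption, not a number from the talks):

```python
# Convert the design luminosity into an integrated luminosity per year.

LUMINOSITY = 1e34        # cm^-2 s^-1  (design)
BEAM_SECONDS = 1e7       # effective beam time per year (assumed)

integrated = LUMINOSITY * BEAM_SECONDS      # cm^-2
FB_INV = 1e39                               # 1 fb^-1 = 1e39 cm^-2
print(f"integrated luminosity: {integrated / FB_INV:.0f} fb^-1 / year")

# Expected yield for a process of cross-section sigma: N = sigma * L_int
def expected_events(sigma_fb: float, lumi_fb: float = 100.0) -> float:
    return sigma_fb * lumi_fb

# e.g. a hypothetical 10 fb signal would give ~1000 events in a year
print(f"10 fb signal: {expected_events(10.0):.0f} events")
```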

To see a very light Higgs (say, 115 GeV) at the $5\sigma$ level will require a year of running, as the $H \to \gamma\gamma$ channel requires excellent EM calorimetry. For $m_H > 130\,\text{GeV}$, the $H \to Z Z^* \to 4\,\text{leptons}$ channel opens up, which will be much easier to separate from background. A squark or gluino with a mass less than 1.3 TeV will be seen in less than a month of running. Masses up to 2.5 TeV could be seen in about a year.
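
The “$5\sigma$ in a year” statements can be unpacked with the usual counting estimate: significance ≈ $S/\sqrt{B}$, where the signal $S$ and background $B$ both grow linearly with integrated luminosity, so the significance grows like $\sqrt{L}$. A minimal sketch; the per-fb⁻¹ rates below are placeholders for illustration, not ATLAS numbers:

```python
import math

def significance(s_per_fb: float, b_per_fb: float, lumi_fb: float) -> float:
    """Counting-experiment estimate S/sqrt(B).  S and B both scale
    linearly with integrated luminosity, so this grows like sqrt(L)."""
    return (s_per_fb * lumi_fb) / math.sqrt(b_per_fb * lumi_fb)

# Placeholder rates for a gamma-gamma-style search (NOT ATLAS numbers):
S_RATE, B_RATE = 15.0, 900.0    # events per fb^-1 after cuts

for lumi in (10.0, 30.0, 100.0):  # fb^-1; ~100 fb^-1 is a design year
    print(f"{lumi:6.0f} fb^-1 -> {significance(S_RATE, B_RATE, lumi):.1f} sigma")
```

With these placeholder rates, the significance only crosses $5\sigma$ at about 100 fb⁻¹, i.e. roughly a year at design luminosity, which is the point of the estimate.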

With a year of running, all sorts of other beyond-the-Standard-Model physics (from new $Z'$ gauge bosons to TeV-scale black holes) will be testable too. I’m really very excited about the prospects for our field finally seeing some experimental input again.

¹ The total storage requirement expected for the LHC is about 10 petabytes/year. Currently, the largest database in the world is the BaBar database at SLAC, which holds just shy of a petabyte.

Posted by distler at July 9, 2004 7:42 AM

TrackBack URL for this Entry:   https://golem.ph.utexas.edu/cgi-bin/MT-3.0/dxy-tb.fcgi/395

6 Comments & 2 Trackbacks

Re: LHC

From your report, it seems that higher masses will be more easily seen than lower ones. I am afraid they are not going, after all, to use the gained luminosity to recheck old events. Recently I have learned that a 40 GeV event, profusely reported in 1984 by UA1, was rejected after additional statistics washed it out and, it seems, no recheck was scheduled. Surely the same will happen with the L3 68 GeV event I mentioned some weeks ago, and the ALEPH 115 GeV one, which followed the same pattern: preliminary up, final down, no new experimental input.

By the way, I’d like to collect all such cases, if only out of curiosity. Does anyone know of other singular events?

Posted by: alejandro rivero on July 9, 2004 9:50 AM | Permalink | Reply to this

Strings 2008

If all goes according to schedule then Strings 2008 (CERN) is going to be interesting.

Posted by: Volker Braun on July 9, 2004 12:28 PM | Permalink | Reply to this

Re: LHC

I shouldn’t get too excited. The likelihood of the machine, detectors and electronics working sufficiently well from the very start to fit in 1 year’s design luminosity by July 2008 is extremely small. Plus, once the thing’s started up, every bit of those underground chambers is going to be dangerously radioactive.

At least, by 2007, the TeVatron should be working really smoothly!

What we need is a time capsule to store all the supersymmetry and extra-dimension (and little Higgs, etc. etc.) phenomenologists in for the next 3 years, since they won’t be doing anything useful. As it is, we face the minor inconvenience of having to find postdocs/jobs without any data to write about.

Posted by: Thomas Dent on July 13, 2004 6:02 AM | Permalink | Reply to this

Re: LHC

The likelihood of the machine, detectors and electronics working sufficiently well from the very start to fit in 1 year’s design luminosity by July 2008 is extremely small.

I’ll be amazed if it works at all. But, to be fair, a lot of the numbers the experimentalists are quoting assume a luminosity of $10^{33}$, rather than the design luminosity of $10^{34}$.

As it is, we face the minor inconvenience of having to find postdocs/jobs without any data to write about.

Alas, there’s nothing the least bit novel about that.

Posted by: Jacques Distler on July 13, 2004 11:47 AM | Permalink | PGP Sig | Reply to this

Re: LHC

Alternatively, phenomenologists could be invited to the historical database of 20 years of collision events, so they can test their own models against it. At a minimum, we should get a bunch of well-trained event-trackers by 2007.

Posted by: Alejandro Rivero on July 14, 2004 6:31 AM | Permalink | Reply to this
Read the post CERN
Weblog: Charles Nadeau's Radio Weblog
Excerpt: LHC .
Tracked: July 13, 2004 10:31 PM

Re: LHC

“the WorldWide Web, you’ll recall, was invented here at CERN”

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

you geek.

Posted by: Enigma on August 3, 2004 3:06 AM | Permalink | Reply to this
Read the post Strings 2007, Part 1
Weblog: Musings
Excerpt: Madrid, Madrid...
Tracked: June 25, 2007 10:32 AM
