Old Equations Tell New Stories
By Davide Castelvecchi
Future experiments at the Large Hadron Collider (LHC), at the Tevatron
and at the Linear Collider will hunt for the elusive particles that
could exist beyond the Standard Model. Finding new particles amidst the
barrage of old ones produced by any collision may require calculating
Standard Model predictions with unprecedented precision. A recent paper
by Charalampos ‘Babis’ Anastasiou and Lance Dixon (both of THP) and
their colleagues introduces a method to make such predictions with a
previously inaccessible level of accuracy.
[Figure: Production of Z bosons at the LHC, at three different orders of approximation. The bands represent the chance that a particle shoots out at a given angle. Graphic by Lance Dixon.]
In the paper, soon to appear in Physical
Review D, the researchers also applied their method to a special case,
the copious W and Z particles that will come out of strong-force
interactions at the LHC, plotting curves that predict how such particles
will disperse inside the detector.
“You can’t make experiments that only produce new physics,” Dixon points out.
“The data you are looking for,” he says, “could be masked or mimicked by
Standard Model particles. In many cases, to pin down the new physics you
have to figure out what the Standard Model background is going to be.”
That’s not easy.
If nature were a chess game, Richard Feynman used to say, the
fundamental laws of physics would be the rules for how pieces are
allowed to move.
And just as knowing the rules doesn’t make you a grandmaster chess
player, knowing the basic equations of physics—as encoded in the
Standard Model—is not enough by itself to know what those equations predict.
Making predictions about the strong force is notoriously hard, and it gets
harder still as one tries to improve the level of accuracy, known as the
order of approximation: higher orders quickly lead to impossibly complex
calculations. Until now, the complete curves for any particles
produced by hadron colliders were known only ‘in first approximation,’
known as leading order, and at the ‘first step up’, or next-to-leading
order. Dixon’s team carried out the first calculation at the next level,
known as next-to-next-to-leading order, or NNLO.
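Schematically, such predictions are organized as a power series in the strong coupling constant; a minimal sketch with generic coefficients (not the paper's actual expressions), where each additional order of approximation adds the next power of the coupling:

```latex
\sigma \;=\; \sigma_{\mathrm{LO}}
\;+\; \alpha_s\,\sigma_{\mathrm{NLO}}
\;+\; \alpha_s^{2}\,\sigma_{\mathrm{NNLO}}
\;+\; \mathcal{O}(\alpha_s^{3})
```

Because each extra power of $\alpha_s$ brings in vastly more Feynman diagrams, each step up in accuracy multiplies the difficulty of the calculation.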
The NNLO calculation involves an enormous number of challenging mathematical formulas,
which correspond to the many possible strong-force interactions, or
Feynman diagrams. Brute-force calculations would be too complex even for
advanced computing farms such as SLAC’s, so physicists have had to come up
with clever shortcuts.
The method employed in the new paper was first introduced by Laporta in
2000, building on earlier work by Chetyrkin and Tkachov and others.
Laporta’s technique uses algebra to reduce the calculation to a
manageable number of formulas.
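The flavor of this reduction can be sketched with a toy example. In Laporta-style reductions, large families of integrals satisfy linear identities with exact rational coefficients, and systematic elimination rewrites all of them in terms of a handful of independent “master integrals.” The sketch below uses made-up identities and exact rational arithmetic purely for illustration; it is not the authors' code:

```python
from fractions import Fraction

# Hypothetical linear identities among four "integrals" I0..I3,
# each row encoding sum_j A[i][j] * I_j = 0:
A = [
    [Fraction(1), Fraction(-2), Fraction(0), Fraction(1)],   # I0 - 2*I1 + I3 = 0
    [Fraction(0), Fraction(1), Fraction(-3), Fraction(2)],   # I1 - 3*I2 + 2*I3 = 0
]

def reduce_to_masters(rows, n):
    """Gaussian elimination over exact rationals; unpivoted columns
    correspond to the independent 'master integrals'."""
    rows = [r[:] for r in rows]
    pivots = []
    r = 0
    for c in range(n):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        rows[r] = [x / rows[r][c] for x in rows[r]]   # normalize pivot row
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:            # eliminate column c
                f = rows[i][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        pivots.append(c)
        r += 1
    masters = [c for c in range(n) if c not in pivots]
    return rows, pivots, masters

rows, pivots, masters = reduce_to_masters(A, 4)
print("master integrals:", masters)
for row, p in zip(rows, pivots):
    expr = " + ".join(f"({-row[c]})*I{c}" for c in masters if row[c] != 0)
    print(f"I{p} = {expr}")
```

Here two identities reduce four integrals to two masters; in the real calculation the same idea collapses up to a million interdependent equations onto a manageable set of formulas.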
Calculating the NNLO still required solving between 10,000 and a million
interdependent equations. Dixon’s team devised several methods to reduce
the number of calculations by a factor of 1,000. The results were better
than expected.
To estimate the reliability of their curves, the physicists repeated the
calculation while varying certain parameters, and they plotted bands that
represented how the curves changed in the process. The bands came out
surprisingly thin, meaning that the results were quite reliable, with
less than one percent error. “We were pretty shocked when the bands came
out so thin,” Dixon says.
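The principle behind those bands can be illustrated with a toy model. A truncated perturbative series retains a weak dependence on an unphysical reference scale; varying that scale over a conventional range and taking the spread of the results gives a band that estimates the truncation error. All the numbers below are hypothetical, chosen only to show the mechanics:

```python
import math

# Illustrative scale-variation estimate (toy coefficients, not the paper's):
b0 = 0.61          # hypothetical one-loop running coefficient
a_Q = 0.118        # coupling at the reference scale Q
c1, c2 = 1.0, 2.5  # hypothetical series coefficients at mu = Q

def coupling(mu_over_Q):
    """One-loop running of the coupling away from the reference scale Q."""
    L = math.log(mu_over_Q ** 2)
    return a_Q / (1.0 + a_Q * b0 * L)

def sigma_nlo(mu_over_Q):
    """Series truncated at second order; the explicit log compensates the
    running of the coupling, so mu-dependence cancels at this order."""
    a = coupling(mu_over_Q)
    L = math.log(mu_over_Q ** 2)
    return a * c1 + a ** 2 * (c2 + c1 * b0 * L)

central = sigma_nlo(1.0)
band = [sigma_nlo(x) for x in (0.5, 1.0, 2.0)]
half_width = (max(band) - min(band)) / 2
print(f"central = {central:.5f}, band half-width = {half_width:.5f}")
```

Pushing the series one order higher shrinks the residual scale dependence, which is why the thinness of the NNLO bands signals a reliable prediction.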
“Calculations at the NNLO level are truly daunting,” comments Michael
Peskin (THP). “This is a major piece of work.”
Dixon and his team are making their software publicly available, and
hope that other researchers will find it a valuable tool. Dixon himself
plans to use it in future projects. “The need to do this basic, not very
flashy theoretical work is not always recognized. But there are many
more types of NNLO calculations that can be done over the next few
years,” he says.
NNLO predictions are precise enough for practical purposes – but how
about going to the NNNLO? “That would be almost total insanity,” Dixon
says. “But maybe it can be done.”