The Small Problem with the Large Hadron Collider

by
Professor Alek Samarin, FTSE
“It is a hypothesis that the sun will rise
tomorrow, and this means that we do not
know whether it will rise”.
Ludwig Wittgenstein (1889 – 1951).
A small minority of prominent scientists (we shall call them the “skeptics”) predict the
possible destruction of our solar system as a result of certain conditions arising in the
Large Hadron Collider, such as the creation of a black hole (however small) inside the
Collider during an experiment.
However, the vast majority of scientists (we shall call them the “true believers”), of
whom approximately ten thousand are involved in this project, vehemently deny this
possibility.
I shall attempt to explain the possible reason for this difference of opinion within the
scientific community since, after all, we all presumably accept that science is based on
logic and, most of all, on the results of experimental tests and observations.
In the second half of the XVII century the new notions of the Laws of Nature developed by Sir
Isaac Newton (1643 – 1727) and by Baron Gottfried Wilhelm von Leibniz (1646 – 1716)
presented both mathematics and physics as absolutely precise, exact and infallible
sciences.
Pierre-Simon, Marquis de Laplace (1749 – 1827) expressed his confidence in the
precision of our “clockwork Universe” as follows:
“Given for one instant an intelligence which could comprehend all the forces by which
nature is animated and the respective positions of the beings which compose it, if
moreover this intelligence were vast enough to submit these data to analysis, it would
embrace in the same formula both the movements of the largest bodies in the universe
and those of the lightest atom; to it nothing would be uncertain, and the future as the
past would be present to its eyes”.
Even without casting strong doubt on Laplace’s perception of physics, cosmology and
mathematics as exact sciences, it should be mentioned that the Newtonian view of the
Universe created several paradoxes, which remained unsolved in Newton’s lifetime.
These were the so-called Bentley’s paradoxes, named after Bishop Richard Bentley (1662
– 1742), who brought them to Newton’s attention in a letter questioning the effects of
gravity in either a finite or an infinite stationary universe, and also the Olbers’ (or
Photometric) paradox, named after Heinrich Olbers (1758 – 1840), which implied that in an
infinite stationary universe the night sky would be as bright as the daytime sky.
Newton was unable to give a natural explanation for Bentley’s paradoxes, and suggested the
presence of a divine, rather than physical, force as their solution. The explanation of the
Photometric paradox was provided much later, and not by a scientist, but by the
famous mystic-writer Edgar Allan Poe (1809 – 1849).
It is not unreasonable to suggest that the destruction of the perception of the “clockwork
Universe” commenced on December 14th 1900 when, speaking at a meeting of the
German Physical Society, Max Karl Ernst Ludwig Planck (1858 – 1947) introduced the
fundamental concepts of Quantum Mechanics. I cannot possibly list, within the
constraints of this article, all the new theories and hypotheses of mathematics, physics
and cosmology that led to the devastation of classical scientific thought, and
consequently I shall refer to just two results, which may be considered the most
fundamental in this respect.
I have no doubt that Kurt Gödel’s incompleteness theorem, published in 1931, should
be regarded as the most decisive development contradicting our notion of the
exactness and reliability of mathematics.
To illustrate this, here is a standard modern summary of the essence of the theorem of
Kurt Gödel (1906 – 1978):
“Any effectively generated theory capable of expressing elementary arithmetic
cannot be both consistent and complete. In particular, for any consistent, effectively
generated formal theory that proves certain basic arithmetical truths, there is an arithmetical
statement that is true, but not provable in the theory”.
The deep, philosophical meaning of this theorem was explained by one of the most
eminent scientists of our time, Research Professor of Mathematical Sciences at the
University of Cambridge – John David Barrow:
“If a religion is defined to be a system of ideas that contains unprovable
statements, then Gödel taught us that mathematics is not only a religion, it is the only
religion that can prove itself to be one”.
In my opinion the second most decisive blow to the infallibility of classical physics was
delivered by the Nobel Prize Laureate (in 1932) Werner Karl Heisenberg (1901 – 1976).
In his revolutionary principle of indeterminacy, or uncertainty (published in 1927), he
showed that there is a fundamental limit to the accuracy with which certain pairs of
variables (such as the position and momentum of a particle) can be determined. The
underlying idea of quantum theory is that particles and energy fields exhibit a dual
character: they possess behavioral traits of both waves and particles. An uncertainty must
always exist in the product of the ambiguity in the particle position Δx and the ambiguity
of its momentum (mass times velocity) Δp. The smallest value of this combination that
can be measured is given by Planck’s constant h (which has the dimensions of energy
multiplied by time), divided by 4π.
Thus, we have the second most important relation contradicting classical physics,
which is:

Δx ∙ Δp ≥ h ∕ 4π
The numerical value of Planck’s constant h is extremely small (it is equal to 6.626 ∙ 10⁻³⁴
J∙s), and for large objects the uncertainties in position and momentum are so negligible
that Newtonian physics remains valid for all practical applications.
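To give a sense of scale, here is a minimal Python sketch of this bound; the masses and
the confinement distance are my own illustrative choices, not figures from this article:

    import math

    H = 6.626e-34  # Planck's constant, J*s

    def min_velocity_uncertainty(mass_kg, delta_x_m):
        # Smallest velocity uncertainty allowed once the position
        # uncertainty is fixed: m * dv >= h / (4 * pi * dx).
        delta_p = H / (4 * math.pi * delta_x_m)  # momentum uncertainty, kg*m/s
        return delta_p / mass_kg                 # velocity uncertainty, m/s

    # An electron (~9.11e-31 kg) confined to an atom-sized region (1e-10 m):
    print(min_velocity_uncertainty(9.11e-31, 1e-10))  # ~5.8e5 m/s - enormous
    # A 1 kg object located to the same precision:
    print(min_velocity_uncertainty(1.0, 1e-10))       # ~5.3e-25 m/s - negligible

The same bound that dominates the electron’s behavior is utterly imperceptible for the
1 kg object, which is why Newtonian physics survives at everyday scales.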
The uncertainty principle also has a direct implication for the argument about the safety of
the Large Hadron Collider (LHC). Paradoxically, it relates to the perception of the quantum
vacuum. In classical physics the vacuum was imagined as empty, and therefore pure and
simple. However, the uncertainty principle forbids this notion.
But before establishing a relationship between the quantum vacuum and its impact on the
safe operation of the LHC, I would like to briefly describe the design and purpose of this
world’s largest high-energy particle accelerator. The collider consists of a 3.8-meter-wide,
concrete-lined tunnel, with a circumference of 27 kilometers, located at a depth of
between 50 and 175 meters underground, crossing the border between Switzerland and
France. The LHC experimental physics agenda principally plans for proton–proton
collisions, with shorter running periods (typically one month per year) devoted to
heavy-ion collisions. Note that “hadron” is a collective name for particles
held together by the strong nuclear force. The best-known hadrons are protons and
neutrons.
The main purpose of these experiments is to explore the validity of the Standard Model,
which is currently the most widely accepted theory of particle physics. It is also
expected that the collider will confirm the existence of the Higgs boson, a
hypothetical massive elementary particle predicted by the Standard Model and
in theory responsible for the presence of mass in every elementary particle.
The Standard Model, however, falls short of being a complete theory of all four
fundamental interactions: it includes the electromagnetic, strong and weak nuclear forces,
but fails to include the gravitational force.
The strong and weak nuclear forces are reduced in strength with the increase in
temperature, but the electromagnetic force increases. The Standard Model proposes that,
less than one millionth of a second after the Big Bang, when the temperature was close to
10²⁷ degrees Kelvin, all three forces were unified.
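As a back-of-envelope check of what this temperature means, one can convert 10²⁷
degrees Kelvin into a characteristic particle energy kT; the constants below are standard,
and the temperature is simply the figure quoted above:

    K_B = 1.381e-23   # Boltzmann constant, J/K
    EV  = 1.602e-19   # joules per electron-volt

    kT_joules = K_B * 1e27           # thermal energy per particle at 1e27 K
    kT_gev = kT_joules / EV / 1e9    # the same energy in GeV
    print(f"kT ~ {kT_gev:.1e} GeV")  # ~8.6e13 GeV

That is roughly 10¹⁴ GeV, about ten orders of magnitude above the ~10⁴ GeV collision
energies the LHC itself can reach.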
Now I shall return to the concept of the quantum vacuum. According to quantum
mechanics, all spacetime is perceived as an area of continuous activity, in which pairs of
oppositely charged particles constantly appear and disappear. However, each of these
pairs remains invisible (thus creating the appearance of a vacuum), because the distances
they move between creation and annihilation and their momenta satisfy Heisenberg’s
uncertainty principle. The creation of each pair of particles from “nothing” does not violate
the law of energy conservation, as the uncertainty principle implies that energy can be
“borrowed” to create these transient virtual particles. The “loan” can be as large as one
can imagine, but the larger it is, the shorter is the period of time before “repayment” must
be made by means of annihilation.
Mathematically we can express this phenomenon thus:
(magnitude of the energy loan) ∙ (period of the loan) ≥ h / 4π
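For example, a virtual electron–positron pair must “borrow” at least twice the electron’s
rest energy, which by this relation fixes the longest such a pair may exist. A minimal
Python sketch, using standard constants (the scenario itself is my illustrative choice):

    import math

    H   = 6.626e-34   # Planck's constant, J*s
    M_E = 9.109e-31   # electron mass, kg
    C   = 2.998e8     # speed of light, m/s

    loan = 2 * M_E * C**2                # energy borrowed to create the pair, J
    lifetime = H / (4 * math.pi * loan)  # longest permitted "loan period", s
    print(f"repayment within ~{lifetime:.1e} s")  # ~3.2e-22 s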
By the year 1980 there were several seemingly conflicting concepts (based on
observations) in the mathematical models of the generally accepted scientific perception
of an expanding Universe. These contradictions were mostly due to the hypothesis
proposed by Edwin Hubble (1889 – 1953), suggesting that the galaxies recede at a
uniform rate. In 1981 Alan Guth, presently the Victor F. Weisskopf Professor of
Physics at the Massachusetts Institute of Technology, published a paper: “The
Inflationary Universe: A Possible Solution to the Horizon and Flatness Problems”, Phys.
Rev. D23:347.
He suggested that the above contradictions can be largely dispelled if we accept that
during the first second after the Big Bang our Universe cooled down from 10²⁷ degrees
Kelvin to almost zero. This would have caused the formation of a multiplicity of bubbles of
false (metastable) quantum vacuum; these bubbles would in turn have
spontaneously created bubbles of true quantum vacuum, and the whole system would
then have expanded at the speed of light, increasing in size at least 10²³ times. This
hypothesis became known as the Inflationary Universe.
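In the units cosmologists usually quote, an expansion factor of at least 10²³ corresponds
to roughly 53 “e-folds” of inflation; a one-line check (the factor is simply the lower bound
given above):

    import math
    print(math.log(1e23))  # ~53, since the number of e-folds N = ln(expansion factor)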
However, after publishing his paper Guth realized that there was a problem with the
nucleation of the bubbles, as they would have failed to generate any radiation. The model
of the inflating universe proposed by Guth was further developed by Andrei Dmitriyevich
Linde, a Russian theoretical physicist who became Professor of Physics at Stanford
University. His paper “A New Inflationary Universe Scenario: A Possible Solution of the
Horizon, Flatness, Homogeneity, Isotropy and Primordial Monopole Problems” was
published in 1982, Phys. Lett. B108:389.
It became apparent that this vacuum metastability event, should it be artificially created,
could be considered a potential doomsday scenario. However, according to Guth, the
artificial creation of a new universe should not affect the old one. He anticipated that in
about 10⁻³⁷ seconds it would disconnect from its parent. This should cause the formation
of a small black hole, which presumably would almost instantly be obliterated. On the
other hand, the creation of a new universe should result in the release of a colossal
amount of energy, the impact of which on the parent universe is impossible to estimate
accurately.
And this is what, I presume, concerns the “skeptics”.
In 2002 Guth and Linde received the Dirac Medal of the International Centre for
Theoretical Physics in Trieste, which is recognized as one of the most prestigious awards
in physics, for their work. After receiving the award Andrei Linde said, in an interview with
Jim Holt, a journalist with The New Yorker: “When I invented chaotic inflation theory, I
found that the only thing you need to get a universe like ours is a hundred-thousandth of a
gram of matter. That’s enough to create a small chunk of vacuum that blows up into the
billions of galaxies we see around us… If somebody had told me that twenty-five years
ago, I would have thought he was crazy, but that’s what we’re getting this medal for. It
represents the acceptance of our theory by the general community”.
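As a rough check of the scale of Linde’s figure, the rest energy of a hundred-thousandth
of a gram follows directly from E = mc²; the constants are standard, and the TNT
comparison uses the conventional 4.184 ∙ 10⁹ J per ton:

    M = 1e-5 * 1e-3   # a hundred-thousandth of a gram, expressed in kg
    C = 2.998e8       # speed of light, m/s

    E = M * C**2
    print(f"E ~ {E:.1e} J")                      # ~9.0e8 J
    print(f"  ~ {E / 4.184e9:.2f} tons of TNT")  # ~0.21 tons

The startling point is how small this is: the rest energy of the seed is modest by everyday
standards, yet in Linde’s scenario it suffices to inflate into billions of galaxies.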
I am not sure whether the experiments at the LHC can result in the creation of a new
quantum vacuum, but some “skeptics” apparently anticipate that they might. The grim
irony of their prediction is that, should it happen, they will be deprived of the satisfaction
of stating: “we told you so!”
November 1st 2008.