
The Fine-Tuning of the Universe
for Scientific Technology and
Discoverability
PART I: BACKGROUND
Review of Anthropic Fine-Tuning Evidence
The anthropic fine-tuning refers to the fact that the basic
structure of the universe must be precisely set for life to
exist, particularly embodied conscious agents (ECAs). The
fine-tuning comes in three types:
(1) Fine-tuning of the mathematical form of the laws of physics
(2) Fine-tuning of the fundamental parameters of physics
(3) Fine-tuning of the initial conditions of the universe
Most of the discussion in the literature has been on (2), the
fine-tuning of the fundamental parameters of physics.
Fine-tuning of Fundamental Parameters
Question: “What are the fundamental parameters of
physics?”
Answer: They are the fundamental numbers that occur
in the laws of physics.
Many of these must be precisely adjusted to an
extraordinary degree for ECAs to exist.
Example: Gravitational Constant
The Gravitational constant – designated by G
-- determines the strength of gravity via
Newton’s Law of Gravity:
F = Gm₁m₂/r²,
where F is the force between two masses, m₁ and m₂, that are a distance r apart. Increase or
decrease G and the force of gravity will correspondingly increase or decrease. (The actual
value of G is 6.67 × 10^-11 N·m²/kg².)
Dimensionless Expression of Strength of
Gravity
The gravitational constant G has units (e.g., in the standard
international system it is 6.67 × 10^-11 N·m²/kg²).
Physicists like to use a measure of the strength of gravity
that does not have units. A standard choice is:
αG = Gmp²/ℏc, where mp is the mass of the proton, ℏ is
the reduced Planck constant, and c is the speed of light.
Other parameters are also usually expressed in
dimensionless form.
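A minimal numerical sketch of this definition, using standard textbook values for the constants (the values themselves are not on the slides):

```python
# Rough check of the dimensionless strength of gravity, alpha_G = G * m_p**2 / (hbar * c).
# Standard textbook values, assumed here for illustration:
G = 6.674e-11       # gravitational constant, N m^2 / kg^2
m_p = 1.6726e-27    # proton mass, kg
hbar = 1.0546e-34   # reduced Planck constant, J s
c = 2.9979e8        # speed of light, m/s

alpha_G = G * m_p**2 / (hbar * c)
print(f"alpha_G ~ {alpha_G:.2e}")   # about 5.9e-39: a pure number, consistent with the ~10^-38 quoted later
```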
Example of Fine-Tuning: Dark
Energy Density
The effective dark energy density helps determine the
expansion rate of space. It can be positive or negative.
Unless it is within an extremely narrow range around
zero, the universe will either collapse or it will expand too
rapidly for galaxies and stars to form.
How fine-tuned is it?
Answer:
In the physics and cosmology literature, it is typically
claimed that in order for life to exist, the cosmological
constant must fall within at least one part in 10^120 (that
is, 1 followed by 120 zeros) of its theoretically natural
range.
This is an unimaginably precise degree of fine-tuning.
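A rough sketch of where the often-quoted 10^120 figure comes from. The two energy scales below (the observed dark-energy scale of roughly 2.3 meV and the Planck scale of roughly 1.22 × 10^19 GeV) are standard order-of-magnitude estimates assumed for illustration; they are not taken from the slides:

```python
# Compare the observed dark-energy density with its "natural" (Planck-scale) value.
observed_scale_eV = 2.3e-3      # (dark energy density)^(1/4), roughly 2.3 meV (standard estimate)
planck_scale_eV = 1.22e28       # Planck energy, roughly 1.22e19 GeV (standard estimate)

# Energy densities scale as (energy scale)^4.
ratio_of_densities = (observed_scale_eV / planck_scale_eV) ** 4
print(f"observed / natural ~ {ratio_of_densities:.1e}")   # ~1e-123, i.e. tuned to ~1 part in 10^120 or better
```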
Dark Energy Density: Radio Dial
Analogy
[Radio-dial illustration (station "WKLF"): the dial runs from −15 billion light years to +15 billion light years, and must be tuned to within much less than a trillionth of a trillionth of an inch around zero.]
Summary of Evidence
Biosphere Analogy: Dials must be perfectly set for
life to occur. (Dials represent values of fundamental
parameters. Illustration by Becky Warner, 1994.)
Summary (continued)
Review of Multiverse Explanation
The so-called “multiverse hypothesis” is the most common
non-theistic explanation of the anthropic fine-tuning. According
to this hypothesis, there are an enormous number of
universes with different initial conditions, values for the
fundamental parameters of physics, and even the laws of
nature. Thus, merely by chance, some universe will have the
“winning combination” for life; supposedly this explains why a
life-permitting universe exists.
Observer Selection Effect
The Observer Selection Effect is crucial to the multiverse
explanation. According to this idea, observers can only
exist in universes in which the laws, constants, and initial
conditions are life-permitting. Therefore, it is argued, it is
also no coincidence that we find ourselves in an
observer-permitting universe.
Multiverse Hypothesis
Humans are winners of a cosmic lottery:
Many Planets Analogy
Given that the universe contains a huge number of planets,
it is no surprise that there is a planet which orbits just the
right star and is just the right distance from the star for life
to occur. Further, it is no surprise that we find ourselves on
such a planet, since that is the only kind of planet creatures
like us could exist on.
Testing the Theistic Explanation
against the Multiverse Explanation
Features of the universe that confirm divine purpose over a
naturalistic multiverse will consist of features of the
universe that meet the following conditions:
(1) We can glimpse how they could help give rise to a net
positive moral value – and hence it would not be surprising
that an all-good God would create a universe with these
features;
(2) They cannot be explained by an observer-selection
effect; and
(3) They are very coincidental (surprising, epistemically
improbable) under the non-theistic multiverse hypothesis.
Discoverability
Our ability to discover the nature of the universe, which
prominent scientists such as Albert Einstein and Eugene
Wigner considered to border on the “miraculous,” seems to
meet the three criteria:
Criterion (1): We normally take discovering the nature of
our universe to be of value – either as intrinsically valuable
or because it helps us develop technology. Therefore, it
would not be surprising under theism that the universe
would be structured so that it exhibits a high degree of
discoverability.
Criteria (2) – (3)
Criteria (2) – (3): There seems to be no necessary
connection between a universe being life-permitting and its
being discoverable beyond what is required for getting around
in the everyday world. Thus, if the proportion of life-permitting
universes that are as discoverable as ours is really small, it
would be very improbable under a multiverse hypothesis that,
as generic observers, we would find ourselves in such a
universe. I will provide quantitative evidence that this
proportion is small.
Life-permitting universes that are not highly discoverable.
A life-permitting universe that is highly discoverable.
PART II: CASES OF DISCOVERABILITY
In the following slides, I will focus on the cases of
discoverability involving the fundamental parameters
of physics since we can potentially get a quantitative
handle on the degree of discoverability in these cases.
However, a significant case for discoverability can be
made from the fact that the laws of nature have the
right form so that we can discover them. This has been
pointed out by Eugene Wigner in his famous piece “The
Unreasonable Effectiveness of Mathematics in the Natural
Sciences” (1960), and recently elaborated in some detail by
Mark Steiner in The Applicability of Mathematics as a
Philosophical Problem (1998).
Examples of Fine-tuning of Laws
1. Hierarchical simplicity
2. Quantization Technique
3. Gauge (local phase) invariance technique
4. Structure of quantum mechanics itself (complex
numbers, measurement rule, etc.)
Eugene Wigner: “The miracle of the appropriateness of the
language of mathematics for the formulation of the laws of
physics is a wonderful gift which we neither understand nor
deserve” (1960).
Einstein: “The most incomprehensible thing about the
universe is that it is comprehensible."
Two Types of Fine-tuning for
Discoverability
To get a quantitative handle on how coincidental the
discoverability of the universe is, for each fundamental
parameter of physics, I consider the effects on
discoverability of varying it. By doing this, I have found that
there are two types of fine-tuning for discoverability:
Type 1: Livability/Discoverability-Optimality
Fine-tuning
Livability/Discoverability-Optimality Fine-tuning. This sort of fine-tuning
of a parameter occurs if, given the basic overarching principles of
physics and the current mathematical form of the laws: (i) the
parameter is within its livability-optimality range (the range between the
thin solid vertical lines); and (ii) the parameter falls into that part of the
livability-optimality range that maximizes discoverability (the range between
the thin dashed lines). This is shown in the figure below, with the star
representing the actual value of the parameter in question.
Note: The region between the two thick black lines is the life-permitting
range. As far as I can tell, all fundamental parameters seem to be fine-tuned in such a way as to satisfy Livability/Discoverability Optimality.
Example 1: CMB
The most dramatic case that I have discovered of this kind
of fine-tuning is that of the Cosmic Microwave Background
Radiation (CMB). The CMB is microwave radiation that
permeates space. It was caused by the big bang.
Basic Idea Behind Big Bang
The visible universe
began in an explosion in
which all its matter and
energy was condensed
into a volume less than
the size of a golf ball. It
consisted mostly of very
intense light in the form of
photons and particle/antiparticle pairs.
Since that time, the
universe has been
expanding, causing it to
cool.
Why Microwave Radiation?
 As the universe expands, a photon's wave is stretched because of the expansion of space between the beginning and end of the wave train. This causes the distance between the crests to get longer and longer.
 Thus, if a photon of light starts off with a wavelength (~450 nm) corresponding to blue light, its wavelength will get longer and longer. If the universe expands enough, the wavelength will be stretched into the microwave region of the spectrum (~1 mm – 10 mm).
[Illustration: a blue-light wavelength stretched into a microwave wavelength.]
Significance of the CMB
The CMB tells us critical information about the large scale
structure of the universe:
“The background radiation has turned out to be the
‘Rosetta stone’ on which is inscribed the record of the
Universe’s past history in space and time.” (John Barrow
and Frank Tipler, The Anthropic Cosmological Principle,
1986, p. 380).
Optimizing CMB
• Much of the information in CMB is in very slight
variations in its intensities of less than one part in
100,000 in different parts of the sky.
• Since it is already fairly weak, this implies that within
limits, the more intense it is, the better a tool it is for
discovering the universe.
CMB and Baryon/Photon Ratio
Intensity of CMB depends on baryon to photon ratio: ηbγ =
(#baryons/#photons) = (#protons + #neutrons per unit
volume)/(#photons per unit volume).
Prediction of Livability/Discoverability-Optimality Fine-tuning: Within the range over which ηbγ does not influence
livability or other types of discovery, its value is such as to
maximize the intensity of the CMB, since this would
maximize discoverability.
Prediction Correct!
[Plot: CMB/CMB0 (vertical axis, 0 to 1.2) versus ηbγ/ηbγ0 (horizontal axis, 0.1 to 1000).]
Plot of the intensity of the cosmic microwave background radiation
(CMB) versus the baryon to photon ratio. CMB/CMB0 represents the
intensity of the CMB in the alternative universe compared to our
universe, and ηbγ/ηbγ0 represents the baryon to photon ratio in the
alternative universe compared to that in our universe.
Note that the intensity of the CMB is maximal when ηbγ/ηbγ0 =
1: that is, when the baryon to photon ratio is the same as in
our universe.
*Example 2: Weak Force (αw)
 The primary role the weak force plays in the universe is the interconversion of protons and neutrons. Potassium-40 (K40) and carbon-14 (C14) each form the basis of an important dating technique. Both decay via the weak force, and their decay rates are ∝ αw².
 Thus, increase the weak force ten-fold and the decay rate of potassium-40 would be one hundred times as large, making the amount of K40 in the earth far below the range of detectability; this would render K40 dating useless. Further, the half-life of C14 would be 57 years instead of 5,700 years, which would make C14 dating useless for artifacts much older than 300 – 400 years. (A rough numeric check follows below.)
 Note: Particularly in the case of C14, decreasing the weak force does not allow any new dating technique that could replace C14 to become available.
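A minimal numeric sketch of this scaling. The half-lives used (~5,700 years for C14 and ~1.25 billion years for K40) are standard values assumed here for illustration:

```python
# Decay rate scales as alpha_w**2, so half-life scales as 1/alpha_w**2.
alpha_w_factor = 10.0                    # hypothetical ten-fold increase in the weak force
rate_factor = alpha_w_factor ** 2        # decay rates go up 100-fold

c14_half_life = 5_700.0                  # years (standard value, ~5,730 yr)
k40_half_life = 1.25e9                   # years (standard value)
print(f"C14 half-life: {c14_half_life / rate_factor:.0f} yr")   # ~57 yr, as on the slide
print(f"K40 half-life: {k40_half_life / rate_factor:.2e} yr")   # ~1.25e7 yr

# With a ~12.5-million-year half-life, the K40 present when the earth formed
# (~4.5 billion years ago) would have decayed through ~360 half-lives:
earth_age = 4.5e9
remaining_fraction = 0.5 ** (earth_age / (k40_half_life / rate_factor))
print(f"fraction of primordial K40 remaining: {remaining_fraction:.1e}")   # effectively zero
```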
Weak Force -- Continued
The neutrino interacts with other particles via the weak force.
Because this interaction is so weak, neutrinos are very
difficult to detect. However, neutrinos carry important
information about nuclear processes in the interior of the earth
and stars, information that no other known form of radiation
carries.
Decrease the weak force ten-fold and it would be virtually
impossible to detect neutrinos from the earth, sun, and
supernovae. Even now, detectors are very expensive, require
an enormous amount of fluid, and the number of neutrinos
detected is barely above background noise.
Thus, when both radioactive dating and neutrino detection are
taken into account, the weak force seems to fall into the
discoverability-optimal range.
Type 2: Tool Usability Fine-tuning
The second type of "fine-tuning" is fine-tuning for having enough
usable tools to make the universe as discoverable as our
universe is. Explicating this fine-tuning requires defining
some terms.
Tools and Discoverability Constraints
 A Tool of Discovery is some artifact or feature of
the universe that is used to discover a given physical domain. For
example, a light microscope is a tool used to discover the
structure of living cells.
 A Tool Usability Constraint is a non-anthropic/livability
constraint that must be met in order for the tool to be
usable. These constraints constitute necessary, though
not sufficient, conditions for usability. For example, a
necessary condition for the use of potassium-argon dating
is that there be detectable levels of radioactive potassium
40 in the earth.
Tool Usability Constraint Range and
Bounds
The tool usability constraint range of a parameter is the
range of values that the parameter can have for which the
tool is usable; the endpoints of this range are the tool
usability constraint bounds.
Two Illustrations of Concepts
1. Wood fires and the fine-structure constant.
2. Light microscopes and the fine-structure constant.
Fine-Structure Constant (α)
The fine-structure constant, α, is a physical constant that
governs the strength of the electromagnetic force. If it were
larger, the electromagnetic force would be stronger; if
smaller, it would be weaker.
Example #1: Fires and α
A small increase in α would have resulted in
all wood fires going out . . .
Civilization and Wood Fires
. . . but harnessing fire was essential to the
development of civilization, technology, and
science – e.g., the forging of metals.
Explanation
Why would an increase in α have this result?
Answer: In atomic units, everyday chemistry and the size of
everyday atoms are not affected by a moderate increase, or
any decrease, in α. Hence, the combustion rate of wood
remains the same. In these units, however, the rate of radiant
output of a fire is proportional to α². Therefore, a small
increase in α – around 10% to 40% – causes the radiant
energy loss of a wood fire to become so great that the energy
released by combustion cannot keep up, and hence the
temperature of the fire must decrease to below the combustion point.
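A minimal numeric sketch of the α² scaling in this explanation (it assumes only the proportionality stated on the slide; the percentage increases are the slide's own figures):

```python
# In atomic units, a modest change in alpha leaves everyday chemistry (and hence the
# combustion rate of wood) roughly unchanged, while radiant energy loss scales as alpha**2.
for increase in (0.10, 0.20, 0.40):            # hypothetical 10%-40% increases in alpha
    radiant_loss_factor = (1.0 + increase) ** 2
    print(f"alpha +{increase:.0%}: radiant loss x{radiant_loss_factor:.2f}, combustion output x1.00")
# Once radiative losses outrun the fixed combustion output, the fire cools below
# the combustion point and goes out.
```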
Conclusion for α and Wood Fires
Upper Bound on α: The ability of embodied conscious agents
(ECAs) to build open wood fires, and hence forge
metals, drastically decreases if α is more than about 10% to 40%
greater than its current value. This is represented in the figure below: the
actual value of α is represented by the star. The star must fall
below the dashed vertical line in order to have open wood fires.
Tool = open wood fires for forging metals.
Tool usability constraint – ability to ignite and maintain open
wood fires.
Tool usability constraint range for α – range of α for which it
is possible to have open wood fires (all the values of α below the
thick vertical dashed line).
Example 2: Microscopes and α
A relatively small decrease in α would
decrease the maximum resolving power of
microscopes so they could no longer see cells
– thus severely inhibiting, if not rendering
impossible, advanced medical technology
(such as the development of germ theory).
As it is, α is just large
enough to allow us to see objects
of 0.2 microns, the size
of the smallest living cell.
*Why this Effect?
 In atomic units, the speed of light = c = 1/α. Decreasing α, therefore,
increases the speed of light without affecting everyday chemistry or
the size of atoms. Thus, the world would look mostly the same.
 The energy of a photon = E = hf, where h is Planck’s constant and f
is the frequency of light. (h = 1 in atomic units).
 A photon of visible light cannot have more energy than the bonding
energy of typical biochemical molecules, otherwise it would destroy
the molecules in an organism’s eye. This requires that for light
microscopes, f < 800 trillion cycles per second.
 Wavelength of light = λ = c/f. In our world, the above restriction on
frequency means that λ > 0.35 microns. Since c = 1/α, as α
decreases, c increases, which lengthens the minimum wavelength of
light that can be used without destroying an organism's
eye; this in turn means that the resolving power of light microscopes
will decrease, since their maximum resolving power is half a
wavelength (λ/2).
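A minimal numeric sketch of this chain of reasoning. The 800-trillion-Hz frequency cap and the 0.2-micron cell size come from the slides; the speed of light is the standard value:

```python
# Resolving power of a light microscope ~ lambda/2, where lambda = c/f and the usable
# frequency is capped (f < ~8e14 Hz) by the bonding energy of biological molecules.
c0 = 3.0e8                 # speed of light in our universe, m/s
f_max = 8.0e14             # maximum usable frequency, Hz (from the slide)

for alpha_ratio in (1.0, 0.7, 0.5):        # hypothetical alpha / (actual alpha)
    c = c0 / alpha_ratio                   # in atomic units c = 1/alpha, so c grows as alpha shrinks
    lambda_min = c / f_max                 # minimum usable wavelength
    resolution = lambda_min / 2.0          # best resolving power
    print(f"alpha x{alpha_ratio}: resolution ~ {resolution * 1e6:.2f} microns")

# At alpha's actual value the resolution is ~0.19 microns, just below the ~0.2-micron size
# of the smallest living cells; a modest decrease in alpha pushes it above that size.
```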
A “Second-Order” Coincidence
The only alternative to light microscopes for seeing the
microscopic world is electron microscopes, which can see
objects up to a thousand times smaller than can be seen by
light microscopes. Besides being very expensive and
requiring careful preparation of the specimen, electron
microscopes cannot be used to see living things. Thus, it is
quite amazing that the resolving power of light microscopes
goes down to that of the smallest cells (0.2 microns), but no
further. If it had less resolving power, these cells could not
be observed alive.
Summary
Top Figure: The star represents the current value of α and the thin dashed line
represents the lower bound of α for which the light microscope would be
usable for seeing all living cells. Bottom Figure: Combines the wood-fire
upper bound and the light-microscope lower bound for α. The coincidence is
that the wood-fire upper bound falls above the actual value of α and the
light-microscope lower bound falls below it.
Main Argument
Define {Ti0} as the set of tools that we use in our universe to
discover various physical domains. Given this definition, the
main argument can be summarized as follows:
1. It is highly epistemically improbable (i.e., very surprising)
under naturalism that every member of {Ti0} is usable.
2. It is not surprising under theism that every member of {Ti0}
is usable.
3. Therefore, by the likelihood principle of confirmation
theory, the usability of {Ti0} strongly confirms theism over
naturalism.
FURTHER EXAMPLES
Example 3: Electric Transformers and α
 In atomic units, the strength of a magnetic field
produced by a current or magnetic dipole in a
ferromagnetic substance is proportional to α².
 One Consequence: Decreasing α would require that
transformers be proportionally larger; this would cause
a proportionate increase in loss of energy by hysteresis
– already a limiting factor.
Example 4: Length of Year and α
1. The length of the year determines the length of the seasons.
2. Importance of seasons: (a) instills planning for the future; (b)
allows for dating by means of stratigraphy (tree rings, lake
beds, coral reefs, ice cores, etc.); (c) helps in keeping
historical records.
To be effective in the above ways, the seasons must not be
too short or too long.
Length of Year and α -- continued
L(year) ∝ α^(-11/2) (Lightman, 1984, Eq. 22, p. 213).
Increase α by a factor of 3 and a year becomes less than an
earth day.
Decrease α by a factor of 5 and a year becomes on the order of
10,000 earth years.
Both cases would eliminate the usefulness of the seasons
mentioned above, without giving rise to anything to replace
their role in discoverability. (A rough check of these scalings follows.)
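A rough numeric check of the scaling cited above, using only the α^(−11/2) proportionality from Lightman:

```python
# Length of a planet's year scales as alpha**(-11/2) (Lightman 1984, Eq. 22).
year_days = 365.25

alpha_up = 3.0           # hypothetical three-fold increase in alpha
print(f"alpha x3: year ~ {year_days * alpha_up ** (-11 / 2):.2f} earth days")   # well under one earth day

alpha_down = 1.0 / 5.0   # hypothetical five-fold decrease in alpha
print(f"alpha /5: year ~ {alpha_down ** (-11 / 2):.0f} earth years")            # thousands of earth years,
                                                                                 # the order of magnitude on the slide
```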
*Example 5: Parallax and α
By measuring the parallax angle p″, one can determine the distance
d to a star using the formula:
1 au = distance from the sun to the earth = d·sin(p″).
Therefore, d = 1 au/sin(p″).
*Parallax and α -- continued
• Distance of habitable planet from its star ∝ α-4 (Lightman,
1984, Eq. 21, p. 213).
• With a three-fold increase in α, a habitable planet would be
81 times closer to its star, and thus parallax would be good
only to 1/81 of the distance, given the same atmosphere. With
the best ground-based telescopes on earth, parallax can
only be used for stars within about 100 light years. The
nearest star system, Alpha Centauri, is approximately 4 light years
away.
• Thus a three-fold increase in α would
eliminate the usability of ground-based
parallax measurements for any stars.
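A minimal numeric sketch combining the parallax formula with the α^(−4) scaling of a habitable planet's orbital radius. The 100-light-year reach of ground-based parallax is taken from the slide; the au-per-light-year conversion is a standard value:

```python
import math

AU_PER_LIGHT_YEAR = 63_241.0        # astronomical units per light-year (standard value)

def parallax_arcsec(distance_ly, baseline_au=1.0):
    """Parallax angle (arcseconds) of a star at the given distance for the given baseline."""
    distance_au = distance_ly * AU_PER_LIGHT_YEAR
    return math.degrees(math.atan(baseline_au / distance_au)) * 3600.0

# In our universe, ground-based parallax reaches out to ~100 light-years (per the slide):
smallest_measurable = parallax_arcsec(100.0)
print(f"smallest usable parallax ~ {smallest_measurable:.4f} arcsec")   # ~0.03 arcsec

# Hypothetically tripling alpha shrinks a habitable planet's orbit by 3**4 = 81, shrinking
# the parallax baseline, and hence the reach at fixed angular precision, by the same factor:
alpha_factor = 3.0
new_reach_ly = 100.0 / alpha_factor ** 4
print(f"reach with alpha x3 ~ {new_reach_ly:.1f} light-years")   # ~1.2 ly, closer than Alpha Centauri (~4 ly)
```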
Summary Diagram for α
The star marks the current value of α, which is approximately 1/137.
Usability Constraint Bounds:
Upper Usability (thick dashed): wood fires; parallax; seasons;
Lower Usability (thin dashed lines): light microscopes; electric
transformers; seasons.
Example #6: Natural Radioactivity
Importance of Naturally Occurring Radioactive Elements for
Discoverability.
1. Used in discovering the structure of the atom.
2. Radioactive dating (an irreplaceable means of dating).
Dependence on Strength of Gravity, αG
If αG is decreased more than a billion-fold, the amount of
naturally occurring radioactivity in a planet as livable as
earth would be less than 1/10,000 of what it is in the earth.
This would severely hamper, if not render impossible,
radioactive dating. It would also at least hamper the
discovery of the nature of the atom.
*Why? Basic Explanation
If αG is decreased, to retain an atmosphere, the radius of a
habitable planet must increase. Now, the ratio of volume to
surface area of a planet is proportional to its radius. Since
the amount of heat produced in a planet via radioactive
decay is proportional to the volume of the planet, unless the
density of radioactive elements decreases with a decrease in
αG, the amount of heat energy per unit area going out
through the planet’s surface – and hence the amount of
volcanic activity – will increase as αG decreases; at some
point this would drastically decrease the planet’s livability.
Fairly simple calculations show that for a planet as livable as
ours to have even 1/10,000 as much radioactive material in its
crust requires that αG > 10^-47.
*More Detailed Explanation
If strength of gravity (αG) is decreased, size of planet must increase in
order for planet to retain an atmosphere.
Q = Rate of heat energy generated by radioactivity in planet = [average
density of radioactive elements in the planet] x [average energy released
per radioactive element per unit of time] x [volume of planet]
QS = rate of heat energy from radioactive decay going through a unit of
surface area = Q/surface area.
Since Q is proportional to the volume of the planet, and the volume of the
planet is proportional to [radius of planet]³, whereas the surface area is
proportional to [radius of planet]², QS is proportional to [radius of
planet]³/[radius of planet]² = [radius of planet].
Hence, to keep too much heat energy from flowing through the surface
(and hence causing unlivable levels of volcanic activity), the average
density of radioactive elements must decrease as the radius of a habitable
planet gets larger. If the average density decreases below a certain point,
there will be no significant naturally occurring radioactivity.
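A minimal numeric sketch of this scaling argument. The assumption that a habitable planet's radius scales roughly as αG^(−1/2) (from requiring a fixed escape velocity, at fixed planetary density, to retain an atmosphere) is a gloss added here, not stated on the slides:

```python
# Rough sketch:
#   - Retaining an atmosphere requires a roughly fixed escape velocity, which (assumption)
#     makes the planet's radius scale roughly as alpha_G**(-1/2) at fixed density.
#   - Heat flux per unit surface area from radioactive decay scales as
#     (density of radioactive elements) * (radius), so to keep volcanism at livable levels
#     the radioactive density must scale as 1/radius.
alpha_G_now = 1e-38        # rough actual value (from the slides)
alpha_G_alt = 1e-47        # hypothetical value, a billion-fold weaker gravity

radius_factor = (alpha_G_alt / alpha_G_now) ** -0.5     # ~3e4 times larger planet
radioactive_density_factor = 1.0 / radius_factor        # ~3e-5 of the current density
print(f"planet radius x{radius_factor:.1e}")
print(f"radioactive element density x{radioactive_density_factor:.1e}")   # well below 1/10,000
```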
Optimality
The level of natural radioactivity is about as large as it could
be without causing a health hazard to any carbon-based life
form as complex as us. Since the higher the level the better
for radioactive dating, it seems adjusted to be nearly optimal
for livability while also optimal for discoverability.
Diagram
The star (αG ~ 10^-38) represents the actual value of αG, and
the dashed vertical line (αG ~ 10^-47) represents the lower
usability bound for non-C14 radioactive dating.
[Diagram: αG axis running from 0 to 1.]
Example 7: Dimensions of Space
Using the constraint that the mathematical form of the law
of gravity be maximally discoverable, it is possible to derive
both that gravity obeys an inverse square law and that
space is three dimensional.
We will show this on the next two slides, starting with
Newton’s shell theorem.
*Newton’s Shell Theorem
Discoverability Constraint. The gravitational force obeys Newton’s
shell theorem, which says that (i) for an object located outside a
spherically symmetric shell of mass, all the matter in the shell can be
considered to be at the center of the shell; and (ii) if an object is inside a
spherically symmetric shell of mass, the net force on the object is zero.
This theorem greatly simplifies calculations of gravitational force – for
example, it allows one to consider all the mass of the earth to be at the
center; without the theorem, to calculate the force of gravity on the
surface of the earth, one would have to know how the density of the
earth varies with radius. The theorem is only true if Fg ∝ 1/r^(N-1), where N
is the number of spatial dimensions.
Figure: A uniform shell of mass with its center marked
by the star. The attraction of the shell on an object
inside it is zero; the force of attraction on an object
external to the spherical shell is Fg = G × (mass of
shell × mass of object)/r², where r is the distance
from the object to the center of mass of the shell.
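A compact way to see why the shell theorem singles out F ∝ 1/r^(N−1) is a Gauss-type flux law. This is a standard heuristic sketch, not spelled out on the slides: if the gravitational flux through any closed surface is proportional to the enclosed mass, the field of a spherical shell behaves exactly as the theorem says, and for a point mass the field spreads over a sphere whose area grows as r^(N−1):

```latex
% Gauss-type law in N spatial dimensions (heuristic sketch):
\oint_{S} \vec{g}\cdot d\vec{A} \;=\; -\,C_N\, M_{\text{enclosed}}
\quad\Longrightarrow\quad
g(r)\, S_{N-1}(r) \;=\; C_N\, M ,
\qquad S_{N-1}(r) \propto r^{\,N-1}
\quad\Longrightarrow\quad
F_g(r) \;\propto\; \frac{1}{r^{\,N-1}} .
```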
*Final Step in Derivation
 Anthropic Condition: Given that F ∝ 1/r^(N-1), as required for Newton's
shell theorem, it is well known that stable planetary orbits require that
N − 1 < 3, i.e., N < 4. Setting aside N = 1 (in which there are no orbits at
all), this means that either N = 2 or N = 3.
 A 2-dimensional space is highly unlikely to allow for the kind of
complex neuron-like interconnections that ECAs require, and even if it
did, a 3-dimensional space would be far superior for the existence of
ECAs that could discover the universe. Thus, discoverability requires
that N ≠ 2.
 Therefore: If the form of the law of gravity is to be maximally useful
for discovery, space must be 3-dimensional and gravity must
approximately obey an inverse square (1/r2) law.
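The "well known" stability condition can be sketched with the standard effective-potential argument (added here for completeness): for an attractive central force F(r) = −k/r^p, a circular orbit is stable against small radial perturbations only if p < 3; with p = N − 1 from the shell theorem, this gives N < 4.

```latex
% Stability of a circular orbit at r_0 under F(r) = -k / r^{p} (standard sketch):
V_{\mathrm{eff}}(r) = \frac{L^2}{2 m r^2} + V(r), \qquad F(r) = -\frac{dV}{dr}
\;\;\Longrightarrow\;\;
\left.\frac{d^2 V_{\mathrm{eff}}}{dr^2}\right|_{r_0} > 0
\;\Longleftrightarrow\;
3 - p > 0 .
% With p = N - 1, stable orbits require N - 1 < 3, i.e. N < 4.
```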
Example 8: Low Entropy of the
Universe
The universe started with an exceedingly low entropy. This
low entropy state is a special, highly ordered, state that is
extraordinarily improbable. This improbability is illustrated
in the next slide, taken from a book by one of Britain's
leading theoretical physicists, Roger Penrose:
Cannot be Explained by Multiverse
This low entropy cannot be explained by a multiverse hypothesis
since it is enormously more probable for a random fluctuation to
give rise to a small low-entropy region – such as the size of the
solar system -- in which observers can exist than for it to give
rise to a region of low entropy the size of the universe. Thus,
under the multiverse hypothesis, the vast majority of observers
should expect to find themselves in a small region of low
entropy.
Analogy: If a hundred coins in a row are shaken, it is vastly
more likely that five coins in a row will all come up heads (a
local region of order) than that all the coins will land heads (a
large region of order). In fact, the former is very likely to
happen at least once somewhere in the row, whereas the latter has a
chance of less than one in 10^30 of occurring.
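A minimal sketch checking the two probabilities in the analogy, for fair coins, using an exact dynamic-programming count for the run of five heads:

```python
from fractions import Fraction

N_COINS = 100
RUN = 5

# Probability that all 100 coins land heads:
p_all_heads = Fraction(1, 2) ** N_COINS
print(f"all heads: ~{float(p_all_heads):.1e}")       # ~7.9e-31, i.e. less than 1 in 10^30

# Probability of at least one run of 5 consecutive heads somewhere in the row.
# f[n] = number of length-n head/tail sequences containing no run of 5 heads.
f = [0] * (N_COINS + 1)
f[0] = 1
for n in range(1, N_COINS + 1):
    if n < RUN:
        f[n] = 2 ** n
    elif n == RUN:
        f[n] = 2 ** n - 1
    else:
        f[n] = sum(f[n - j] for j in range(1, RUN + 1))
p_run_of_5 = 1 - Fraction(f[N_COINS], 2 ** N_COINS)
print(f"at least one run of 5 heads: ~{float(p_run_of_5):.2f}")   # ~0.8, vastly more likely than all heads
```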
Explained by Discoverability
Low entropy of entire universe makes it more discoverable
for at least two reasons:
1. A universe that has a low entropy throughout is
necessary for us to observe other stars and galaxies,
and thus to understand the origin and nature of our own
planet and sun. (The existence of stars and galaxies
requires low entropy.)
2. To apply general relativity to the cosmos – which is
central to doing cosmology – one must assume that the
distribution of matter is nearly uniform at large scales.
This would not be true if the universe started in a high
entropy state.
*Other Cases
1. Earth's Magnetic Field – allows for naturally occurring magnets; used
for navigation; helps in determining the past positions of the continents. Puts a lower
bound on the strong nuclear force and a lower bound on the electron-to-proton mass ratio.
2. Existence of supernovas and Cepheid variable stars at time ECAs
exist. (Important for determining distances to other galaxies.) Puts
constraints on several parameters, such as the strength of gravity and
the photon to baryon ratio.
3. Planets not too far away to observe (or to reach and communicate
with by satellites). Places a lower bound on the strength of gravity.
4. Conditions for workable satellites – atmosphere not too thick;
satellite not too far away. Places lower bound on strength of gravity.
5. Existence of enough copper for an advanced civilization that uses
electronics. Puts a lower bound on the strength of the strong nuclear force;
requires the existence of the weak force.
PART III
PHILOSOPHICAL ANALYSIS AND
THEOLOGICAL IMPLICATIONS
Is it Coincidental?
Next I will indicate why we should find it highly coincidental
(epistemically improbable) under naturalism that the tool
usability bounds presented above are met.
Coincidences for α
The star marks the current value of α, which is approximately 1/137.
Discoverability Constraints:
Upper Discoverability (thick dashed): wood fires; seasons; parallax
Lower Discoverability (thin dashed lines): seasons; light microscopes;
electric transformers.
In order for the discoverability constraints to all be met, all the thin
dashed lines must fall below the star and all the thick ones above the
star. Since the upper end of the scale is far above 1, and the star is at
~1/137, using α itself as the natural probability measure, it is extremely
unlikely by chance for all the thin lines to fall below the star and all the
thick ones above the star.
Radioactivity and Strength of Gravity
Coincidence Analysis
The theoretically possible range of the strength of
gravity, αG, is 0 to 1. Let the star represent its actual
value, αG ~ 10^-38. A conservative estimate of the
usefulness of radioactivity requires that αG > 10^-47. This
lower bound is represented by the thin dashed line. If we
think of this lower bound as having an equal probability
of falling anywhere in the theoretically possible range,
the chance of its falling below 10^-38 is one part in 10^38.
[Diagram: αG axis running from 0 to 1.]
Unknown Tools Objection to Argument
This objection is that there are other possible tools for
discovering the physical domains in question. Given
enough other potential tools that would be as good for
discoverability, it is likely by chance alone that one of their
tool usability ranges will overlap the value of the parameter.
This would undermine the argument. To answer this
objection, we first must articulate it by considering a
fictitious illustration.
Illustration
Let the star represent the value of some fundamental
parameter, say αq. Suppose its range is 0 to 10, and the
star is located at 1. Let each dashed line represent the
lower bound on αq for the usability of some possible tool to
probe the microscopic world, and let the first dashed line
represent the lower bound on αq for the usefulness of light
microscopes. One might then say that there was a 1/10
chance that the lower bound for light microscopes would be
below the star. But, because there are so many possible
usable tools, the chance that one of them would be usable is
close to 100%. Hence, we should not be surprised that some
tool as good as a light microscope is usable.
Response
In most cases, there are no other good alternatives to the tool
in question. Consider the case of light microscopes. Within
the types of worlds we are considering, apart from extrasensory
perception, creatures of our size can only gain information via
taste, touch, sound, and light. The first three do not have the
ability to resolve objects the size of cells. That leaves only light,
and hence light microscopes, or something like electron
microscopes that generate an image that can be seen.
Electron microscopes, however, require light microscopes to
construct them; also they are limited – for example, they are
very expensive and you cannot see a living cell with them.
Thus, light microscopes are irreplaceable.
Another Example: Radioactive Dating.
*Fine-tuning of Underlying Laws
If the value of the parameter is almost fixed by the anthropic
range, then the fine-tuning is at the level of the underlying laws. To see
this, note that:
 Since the mathematical form of the laws of nature determines the
tool usability constraint ranges, given the underlying laws it is no
coincidence that the usability constraint ranges fall where they do.
 What is coincidental, however, is that our universe has these
underlying laws instead of laws in which the usability ranges do
not overlap the value of the parameter in question. So, the fine-tuning
is at the level of the fundamental laws, not the values of the parameters.
 Using the parameter itself, or some natural function of it, as a
probability measure for the overlap will allow us to obtain a
quantitative handle on the degree of coincidence.
Return to Main Argument
Define {Ti0} as the set of tools that we use in our universe to
discover various physical domains.
1. It is highly epistemically improbable (i.e., very surprising)
under naturalism that every member of {Ti0} is usable.
2. It is not surprising under theism that every member of {Ti0}
is usable.
3. Therefore, by the likelihood principle of confirmation
theory, the usability of {Ti0} strongly confirms theism over
naturalism.
Discoverability, therefore, gives us evidence in favor of the
theistic explanation of the anthropic fine-tuning over that of
the multiverse hypothesis.
Primary Theological Implication
If the above argument is correct, this provides good
evidence that it was one of God's primary purposes to
create a highly discoverable universe, rather than
discoverability being a by-product of some other aim of God's.
Wartime Analogy: If on a bombing run the bombs landed in
such a way as to maximize the killing of civilians, and this
required considerable fine-tuning, that would be strong
evidence that the civilians were intentionally targeted. Why?
Because if the bombs were dropped to achieve some other goal –
such as destroying military targets – one would not expect
them to land in just the right way to maximize the killing of
civilians, unless achieving that other goal implied
maximizing civilian deaths.
Possible Secondary Theological
Implications
If one of God’s primary purposes was to make a discoverable
universe, that raises the question as to why God would want
such a universe. Some possibilities:
1. Such a universe allows for technology and hence gives
us the resources to make our lives better and help one
another. Can this idea be generalized?
2. Suggests that reality is constructed to be morally and
spiritually discoverable.
Theological Implications -- Continued
3. By being discoverable, the world is structured in such a
way that it both trains our reasoning capacities and
greatly amplifies the value we place on reason. Is there
some important divine purpose for this?
4. Science, which depends on discoverability, also gives us
new conceptual resources for thinking about theological
and spiritual matters. Is there something particularly
important about this?
5. Discoverability raises the question: is there something
spiritually important about discovering the structure of the
universe?
Best of all Possible Worlds?
6. The data so far also indicate that, for the variations we
can look at, the universe seems to be the most conducive
to the flourishing of ECAs. Does this suggest that,
structurally, our world might in some ways be among the best
of all possible worlds?
Final Implication
If technology and discoverability are part of God’s
purposes, it would be a good bet to invest in
technology stocks.
End of main presentation
EXTRA SLIDES
Categorizing Tools
(1) An absolutely critical tool or means is one that is
central to all of advanced science and technology.
The ability to forge metals is an example of an
absolutely critical tool.
(2) An irreplaceable tool is one such that the ability to
discover a domain of physical reality will be
severely degraded, if not rendered impossible,
without the tool. For example, light microscopes
are irreplaceable tools for cell biology.
(3) An important tool is one that plays an important,
but not irreplaceable, role in discovering some
domain. For example, carbon 14 dating is an
important, but probably not irreplaceable, tool for
archeological dating.
Further Hypothesis: Tool-Near-Optimality Hypothesis
The Tool-Near-Optimality Hypothesis is the hypothesis that
the level of usability of the tools is such that cultural factors
– such as the ability of societies to organize and develop
mathematical thought – are the major limiting factor for the
development of science, not the quality of the tools.
Technical Stuff
Odds form of Bayes' Theorem:
P(T|E & k′) / P(N|E & k′) = [P(T|k′) / P(N|k′)] × [P(E|T & k′) / P(E|N & k′)]
T = Theism
N = Naturalism
E = C1 & C2 & . . . & Ck
where Ci is the claim that the i-th tool usability constraint is met.
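A minimal numeric sketch of how the odds form is applied. The likelihood ratios below are made-up placeholders, purely to illustrate the multiplicative structure of the update; they are not estimates from the presentation, and the constraints are treated as independent given each hypothesis only for simplicity:

```python
# Odds form of Bayes' theorem applied to E = C1 & C2 & ... & Ck:
# posterior odds (T vs N) = prior odds x product over i of P(Ci | T & k') / P(Ci | N & k').
prior_odds = 1.0                          # placeholder: no initial preference for T over N

# Hypothetical likelihood ratios for individual tool-usability constraints (illustrative only):
likelihood_ratios = [10.0, 5.0, 20.0, 3.0]

posterior_odds = prior_odds
for ratio in likelihood_ratios:
    posterior_odds *= ratio               # each constraint multiplies the odds in favor of T

print(f"posterior odds of T over N: {posterior_odds:.0f} to 1")   # 3000 to 1 with these placeholders
```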