David Hume (1711-1776) was a Scottish philosopher during the Scottish Enlightenment, an intellectual and scientific boom in Scotland.
(Other Scottish Enlightenment figures include the economist Adam Smith and the poet Robert Burns.)
He’s regularly considered one of the “top 5” most important Western philosophers.
One of Hume’s main concerns was trying to understand how the mind worked, and what it was reasonable to believe.
In his book An Enquiry concerning Human Understanding, Hume argued that it was never reasonable to believe in miracles.
Different claims are confirmed by our experience in differing degrees:
Sometimes they are always confirmed: “An ice cube melts after an hour in the sun.”
Sometimes they are sometimes confirmed: “Birds fly.”
Sometimes they are rarely confirmed: “Thai food is bland.”
Hume says:
“A wise man, therefore, proportions his belief to the evidence.”
You believe things “more” when there’s more evidence for them.
For example, suppose you have three friends: Butter, Candice, and Diana:
• Butter tells the truth very rarely.
• Candice tells the truth often, but not always.
• And Diana only tells the truth, never a lie.
If Diana tells you something, and Butter tells you the opposite, you should believe Diana.
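The rule “believe the more reliable witness” can be put in Bayesian terms. A minimal sketch, where the truth-telling rates are invented for illustration (the text gives no numbers), and each friend is modeled as reporting the true state of affairs with probability equal to their rate:

```python
# Hypothetical truth-telling rates for the three friends (assumed numbers).
RATES = {"Butter": 0.10, "Candice": 0.80, "Diana": 0.99}

def posterior_true(affirms, denies, prior=0.5):
    """P(claim is true | `affirms` says it's true, `denies` says it's false).

    Each friend reports the truth with their rate, and the opposite with
    probability (1 - rate), independently of the other friends.
    """
    p_evidence_if_true = RATES[affirms] * (1 - RATES[denies])
    p_evidence_if_false = (1 - RATES[affirms]) * RATES[denies]
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Diana affirms the claim, Butter denies it: believe Diana.
print(posterior_true("Diana", "Butter"))  # close to 1
```

On these assumed rates, Diana affirming something Butter denies leaves the claim nearly certain, which matches Hume's advice to proportion belief to the reliability of the evidence.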
Sometimes, we have “mixed” evidence: not just testimony, but testimony AND observation AND theoretical prediction AND…
For example, suppose while you’re walking along with a friend, you get robbed. How old was the guy who robbed you?
• Testimony: your friend was with you and he thought the guy was between 38 and 42.
• Observation: to you he seemed younger, between 30 and 34.
• Theoretical Prediction: the police have a suspect; if the suspect is the robber, he was 39.
If you have to guess the robber’s age, you need to use all the available evidence.
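One simple way to combine the three lines of evidence is a precision-weighted average: treat each as an independent estimate with its own uncertainty, and weight more certain estimates more heavily. The standard deviations below are assumptions, chosen so each estimate is the midpoint of the stated range:

```python
# (source of evidence, age estimate, assumed standard deviation)
evidence = [
    ("friend's testimony", 40.0, 4.0),  # "between 38 and 42"
    ("your observation",   32.0, 4.0),  # "between 30 and 34"
    ("police suspect",     39.0, 2.0),  # theory-based; assumed more precise
]

# Precision-weighted average: weight each estimate by 1 / variance
# (the standard way to pool independent noisy estimates).
weights = [1.0 / sd**2 for _, _, sd in evidence]
best_guess = sum(w * age for (_, age, _), w in zip(evidence, weights)) / sum(weights)
print(best_guess)  # 38.0
```

On these assumed uncertainties, the best guess lands at 38: pulled toward the suspect's age of 39 because that line of evidence is treated as the most precise, but dragged down by your own observation.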
Karl von Frisch was a Nobel Prize-winning scientist who studied bees and bee communication.
“If we use excessively elaborate apparatus to examine simple natural phenomena Nature herself may escape us. This is what happened some forty-five years ago when a distinguished scientist, studying the colour sense of animals in his laboratory, arrived at the definite and apparently well-established conclusion that bees were colour-blind…”
“…It was this occasion which first caused me to embark on a close study of their way of life; for once one got to know, through work in the field, something about the reaction of bees to the brilliant colour of flowers, it was easier to believe that a scientist had come to a false conclusion than that nature had made an absurd mistake.”
Suppose now that we get testimony concerning something we have never experienced.
Hume imagines someone from the equatorial regions being told about frost, and snow, and ice. They have never experienced anything like that before.
Hume thinks this person would have reason to disbelieve stories about a white powder that fell from the sky, covered everything by several inches, and then turned to water and went away.
It’s not that they should believe the stories are *not* true, just that they don’t have to believe they *are* true. We need more evidence.
But suppose someone tells us an even stranger story. It’s like the snow-story, in that we’ve never experienced anything like it before. But it’s even stranger, because we have always experienced the opposite before.
For Hume, this is the definition of a miracle. A miracle is a violation of the laws of nature. Every event or process in the world conforms to the laws of nature (for example, the laws of physics like the law of gravity)– except, if there are any, miracles.
There are about 100 billion people who have lived and died in the history of humanity (and around 8 billion more who are alive now).
As far as we know, none of the 100 billion people who have ever died, even those dead for four days, later came back to life. It’s a law of nature that when you die, that’s the end; there’s no more.
There is, however, testimony in at least one religious book (the Christian Bible) that such an event occurred at least once in history: when Jesus raised Lazarus from the dead, after he had been dead for four days.
According to Hume, we should be wise and apportion our belief to the evidence.
Since on the one hand we have 100 billion people who died and never came back, and on the other hand we have an old legend from a book intended to make people believe its religious views, it’s most probable that the raising of Lazarus never happened.
“No testimony is sufficient to establish a miracle, unless the testimony be of such a kind that its falsehood would be more miraculous than the fact which it endeavors to establish.”
So, for example, Hume would even say that if you saw someone die and come back to life, you should not believe that it really happened.
Because it’s always possible that what you saw was a trick, or the person was never really dead, or you were on drugs or… Since none of those suppositions are miraculous, you should believe them instead of believing in the miracle. They’re more likely than a violation of nature’s laws.
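Hume’s point here is a comparison of probabilities. The numbers below are invented; all that matters for the argument is that any ordinary explanation, however unlikely, is vastly more probable than a violation of a law of nature:

```python
# Invented probabilities, for illustration only.
p_miracle = 1e-11  # no confirmed case in roughly 100 billion human lives
p_trick = 1e-4     # stage magic, misdiagnosed death, hallucination, ...

# Believe whichever supposition is more likely.
more_likely = "trick" if p_trick > p_miracle else "miracle"
print(more_likely)  # trick
```

On these assumed figures the mundane explanation is millions of times more probable, so Hume's rule says to believe it instead of the miracle.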
Hume gives several reasons why testimony about miracles is problematic:
1. There has never been a well-established, independently corroborated miracle.
2. People have reasons to lie about miracles, to convince others to believe their religion.
3. We have lots of evidence of fake miracles and forgeries (e.g. the Shroud of Turin).
Finally, Hume argues, if there is at most one true religion (and maybe none) then all of the other religious holy books must contain false testimony of miracles. Therefore, most testimony of miracles is false. So upon receiving testimony of a miracle, the most likely supposition is that it is false.
While Hume was concerned with miracles, we can still learn some general points from him:
First, for each claim that we consider believing, we usually have multiple lines of evidence for and against it: testimony from various sources, observations and past experiences, predictions from our scientific theories, etc.
Second, how much any line of evidence supports a given claim is in part determined by our past experience.
For example, if we have testimony from a source about the weather, our past experience will tell us how reliable in general testimony about the weather is, and how reliable this source is (have they told us lies in the past?).
Third, our goal should be to believe whatever is most likely to be true, given our different lines of evidence and their individual reliabilities.
Sometimes this involves rejecting the evidence, and assuming that people are telling us falsehoods for some reason, or that we had a mistaken perception, or that our scientific theories are wrong.
Finally, “extraordinary claims require extraordinary evidence,” as Carl Sagan said.
If something has a low likelihood of happening, then the evidence must vastly increase this likelihood, or we should not believe the claim that it happened.
Let’s remember Bayes’ Theorem. It says:
P(A / B) = [P(A) x P(B / A)] ÷ P(B)
P(A) and P(B) are the probabilities of A and B being true, respectively. P(A / B) is the probability that A is true if we assume that B is true.
We can write out the theorem in terms of claims and evidence:
P(Claim / Evidence) = [P(Claim) x P(Evidence / Claim)] ÷ P(Evidence)
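The theorem translates directly into code. A minimal sketch, with made-up numbers for the example:

```python
def bayes(p_claim, p_evidence_given_claim, p_evidence):
    """P(Claim / Evidence) = [P(Claim) x P(Evidence / Claim)] / P(Evidence)."""
    return p_claim * p_evidence_given_claim / p_evidence

# Example: a claim with a 1% prior, where the evidence is 90% likely
# if the claim is true and 10% likely overall.
print(round(bayes(0.01, 0.9, 0.10), 2))  # 0.09
```

Even strongly supportive evidence only lifts this claim to a 9% posterior, because its prior was so low: that is the engine of Hume’s argument.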
Notice that as the probability of the claim, P(Claim), goes down toward zero, the probability of the claim given your evidence, P(Claim / Evidence), goes down too!
However, as P(Evidence) goes down toward zero, P(Claim / Evidence) goes up. Evidence counts for more when receiving it is unusual and unexpected: if no one would ever lie about a miracle, then we would expect evidence for miracles about as often as we expect miracles themselves.
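Both effects can be checked numerically. The probabilities below are assumed purely for illustration:

```python
def posterior(p_claim, p_evidence_given_claim, p_evidence):
    # Bayes' theorem: P(Claim / Evidence)
    return p_claim * p_evidence_given_claim / p_evidence

# Lower prior -> lower posterior (evidence held fixed).
assert posterior(0.001, 0.9, 0.5) < posterior(0.01, 0.9, 0.5)

# Rarer, more unexpected evidence -> higher posterior (prior held fixed).
assert posterior(0.01, 0.9, 0.05) > posterior(0.01, 0.9, 0.5)
```

The first check is why testimony about miracles starts at a disadvantage; the second is why testimony that no one would give falsely would carry real weight.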
This is related to our talk of the base rate neglect fallacy:
P(terrorist = True / Evidence) = [P(t = T) x P(Evidence / t = T)] ÷ P(Evidence)
As the probability of being a terrorist goes down, the evidence that someone is a terrorist becomes less reliable.
But as the accuracy of the test goes up (so the probability of getting false positives goes down), P(t = T / Evidence) goes up.
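The base-rate point can be checked with a worked example. The numbers are invented: a very rare trait (1 in 100,000) and a test that is right 99% of the time.

```python
p_t = 1e-5          # base rate P(t = T), assumed
sensitivity = 0.99  # P(Evidence / t = T): test flags a true case
false_pos = 0.01    # P(Evidence / t = F): test flags an innocent person

# Total probability of a positive result (law of total probability).
p_evidence = p_t * sensitivity + (1 - p_t) * false_pos

# Bayes' theorem: P(t = T / Evidence)
post = p_t * sensitivity / p_evidence
print(post)  # under 0.001, despite a "99% accurate" test

# A far more accurate test (1-in-a-million false positives) changes everything.
p_evidence_better = p_t * sensitivity + (1 - p_t) * 1e-6
print(p_t * sensitivity / p_evidence_better)  # now above 0.9
```

With the ordinary test, false positives from the huge innocent population swamp the handful of true positives, so a positive result barely moves the needle; only when the false-positive rate drops near the base rate does the evidence become trustworthy.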
Just because someone is not a liar or has no reason to lie to you doesn’t mean you should always trust them.
The lower the probability of the events or claims that are being described, the less reliable testimony is. Don’t ignore the base rate (“prior probability”) of the claim being true.