Unit 03: Monitoring the System
Welcome to PD20: Engineering Workplace Skills I: Developing Reasoned Conclusions

This downloadable version of Unit 03 contains edited content. To get the full experience of this course and all of its content (including additional interactive content and assignments), you are encouraged to engage with the course on LEARN.
Contents

Information
Biases in Gathering Information
Inattentional Blindness
Attentional Bias
Interpretive Bias
Structural Bias
How to Guard Yourself
Learning Outcomes
1. Recognize that various types of biases affect the collection and examination of
information
2. Differentiate between types of biases in gathering information (inattentional
blindness, attentional, interpretive, structural)
3. Describe the impact of one type of confirmation bias in a workplace or everyday
scenario
4. Illustrate an example of inattentional blindness based on a situation outside of
course content
Main Takeaway
Biases can influence how we gather and interpret data.
THERE ARE ASSESSMENTS WITH THIS UNIT. PLEASE LOG IN TO LEARN TO COMPLETE THEM.
Ring Wearer Experience
We can gain a lot of knowledge from those who have gone before us. Below, read an interview
with Gord Stubley, Associate Dean, Teaching, as he discusses his experiences as a professional
engineer.
THE INTERVIEW REFERS TO PICTURES THAT ARE ONLY AVAILABLE IF YOU WATCH THE
INTERVIEW IN LEARN
Hello, my name is Gordon Stubley, it’s my pleasure today to introduce you to PD20,
Developing Reasoned Conclusions, the first course in this new program.
As you know, PD20, a course in developing reasoned conclusions, is a course about thinking.
And you might be wondering: “Well, I did well in high school, why do I need to know more
about thinking?” What I’d like to do today is to tell you the story of some good engineers in
action, and what happened to them, to help illustrate to you why we think paying attention
to our thinking is a crucial skill for you to be developing as early as possible…
The slide you’re looking at is a picture of the space shuttle
Challenger, just after takeoff on January 28th, 1986.
Figure 1. Screen capture from Unit 03 Ring Wearer Video
The purpose of the O-rings is to provide a seal: when the hot gases flow and arrive at the top of the O-ring, the O-ring squeezes itself and pushes out to seal the passageway, stopping the hot gases from leaving the inside of the rocket and escaping to the outside. For the O-ring to work, it is crucial that the rubber in the O-ring be resilient.
The first thing you need to know is that O-Ring technology is very, very well established. At the
time, it had been established for well over 40 years. Indeed, nowadays, most bathroom sinks and
kitchen sinks in North America use O-Rings to make sure that they don’t leak. The second thing
that you need to know is that on a previous flight a year before (January 1985), when the rockets
were recovered and set up to be reconditioned, it was discovered that a third of the
circumference of the O-ring was scarred. This scarring indicated that there had been a passage of hot gases through the O-rings and that the O-rings had not done their job.
That particular flight had taken off after an overnight temperature of 12 degrees Celsius, which
was the coldest overnight temperature before a launch in the history of the shuttle program.
The manufacturers of the solid-fuel booster rockets, Morton-Thiokol, set up an engineering
team, under the leadership of Roger Boisjoly, to examine what the problem was. Very quickly,
that team realized the importance of temperature in establishing the resilience of the rubber in the O-ring, and the necessity that the temperature be relatively warm for the O-rings to maintain their proper sealing properties.
This brings us now to the night before the launch, on January 27th, 1986. The predicted temperature
at Kennedy Space Center for that night was -6 degrees Celsius. It turned out that the actual
temperature was -3 degrees Celsius, but no matter how you look at it, it was a temperature
significantly colder than any temperature that had existed before a launch.
The engineers (under the leadership of Roger Boisjoly) gave a presentation over the telephone,
the night before the launch, outlining the importance of temperature in terms of the
performance of the O-rings. At the end of their talk, they recommended that the launch be postponed until the temperature was more favorable. Boisjoly was supported by his supervisor, Mr. Bob Lund, the Vice President of Engineering.
Upon hearing about the decision, the management of NASA showed their displeasure over the
telephone. Upon hearing that displeasure, the Senior Vice President in the room in Utah, Mr.
Jerrold Mason, asked if he could go offline. In other words, he wanted to hang up the telephone
and have a discussion with his people, before coming back to NASA and recommending a final
decision on the launch the next day. When the phone was hung up, Mr. Mason said: “I think
we’re going to have to make a tough management decision now.”
Upon hearing those words, Roger Boisjoly jumped back to his feet and made a very passionate
presentation to his supervisors demanding that they hold to the decision and postpone the
launch. After 20 minutes, Mr. Mason thanked Mr. Boisjoly for his input, then turned and said: "We
are going to have to make a management decision, I want the four Vice Presidents to make the
decision.” He turned to Bob Lund and said: “Bob, I want you to make the decision with your
management hat.” Bob Lund took off his engineering hat, put on his management hat, and the
rest is history. Seventy-three seconds into the flight, the Challenger blew up; all seven crew members, including two civilians, were killed.
As we watch this story, clearly we see two good engineers in a tough situation with a lot of
pressure on them. One made an error in failing to communicate well (a relatively minor error,
there’s no question), and one made a very major error - an error that in this course you will see,
is an error that comes about because of the phenomena of structural bias in our thinking. These
engineers made errors, even after a lot of experience and a lot of time in the workplace. They
still, in their thinking, made a mistake that had terrible, terrible consequences.
In putting PD20 together, we’ve tried to set up a course which will give you the opportunity to
practice your thinking in assessing evidence and in drawing reasoned conclusions that are truly supported by that evidence. Most importantly, we want you to be able to
communicate those decisions so that people truly understand the importance and the impact of
what you’ve arrived at. In doing so, we hope that you will be able to do the great work that we
associate with Waterloo Engineering. I turn you over to Professor Andres, your instructional
team, for the rest of this course. I hope you find success and good things in it, and we look
forward to seeing you at the end.
It is suggested that you watch the video found here before continuing with the rest of the unit:
https://www.youtube.com/watch?v=vJG698U2Mvo
Lecture Content: Monitoring the System
Unit 03 is about biases that trip us up when we think about information and evidence.
We want to begin by telling you the story of McArthur Wheeler. It’s an old story, but it’s a good
one!
Information
In 1995, Mr. Wheeler robbed two banks, on the same day, in broad
daylight, with no visible attempt to disguise his face. Police, with
the help of the surveillance tapes and tips from the public, quickly
found Mr. Wheeler and arrested him. Mr. Wheeler expressed
surprise at the police’s ability to identify him so quickly. It turns out
that Mr. Wheeler believed that his face would be invisible to the
surveillance cameras if he wiped lemon juice on his face. One has
to ask: What led him to such an obviously false belief?
Did you ever use lemon juice when you were a kid to write invisible
messages? It is simple and fun. You dip a Q-Tip in lemon juice,
write a message on a piece of white paper, and let the juice dry. To
reveal the message, you simply expose the paper to a heat source.
Figure 2. Screen Capture from Unit 03 Slideshow
According to reports, Mr. Wheeler knew this fun fact about lemon juice. He hypothesized that
the invisible-making properties of lemon juice would also hide his face. He proceeded to wipe
lemon juice on his face. To check that the juice worked, he took a picture of himself with a
Polaroid camera. Remember, this was 1995. There were no cell phones with cameras. According
to reports, Mr. Wheeler later explained to police that his face had not shown up in the photo.
Upon hearing this story, you may have shaken your head in disbelief, rolled your eyes at his
simplicity, or simply thought to yourself, “I would never do something this dumb.” Perhaps not,
but before we move on, confident in our intelligence and abilities, let’s take a look at how Mr.
Wheeler arrived at his unfortunate conclusion. He started with something that is true. You can,
in fact, hide messages using lemon juice. The message is invisible until it is exposed to heat. From
this fact he hypothesized that lemon juice would hide his face from surveillance cameras. He then
tested his hypothesis and his test confirmed his hypothesis. The image of his face did not show
up in the Polaroid photo.
Figure 3. Screen Capture from Unit 03 Slideshow
The reality is, we reason like this every day. Here's a simple example. Fact: the line at the Tim Hortons in South Campus Hall is sometimes annoyingly long. But there are times when there is
no line. Our hypothesis is that there is a small window between the start of class and the end of
class when there is no line. If we go between 15 and 25 minutes after classes have started, there
is never a line. This hypothesis has been confirmed every time we go to the Tim Hortons in
South Campus Hall.
A moment’s reflection should convince you that we and Mr. Wheeler have reasoned in exactly
the same way. How is it that he can be wrong and we can be right? The problem with this
question is that it presupposes that we are right. The reality is, our explanation of the behaviour
of lines at Tim Hortons could be wrong. In fact, we could be wrong in exactly the same way that
Mr. Wheeler was wrong. Mr. Wheeler did not collect enough of the right kind of information to
justify his conclusion that lemon juice would hide his face. Likewise, the observations that we
have made may not be enough to support our beliefs about when we should go to Tim Hortons.
Let’s unpack what this means.
We cannot make good decisions and draw reasonable conclusions without the right kind of
information. It would be nice if we were naturally inclined to impartially gather information, but
we are not. Our intuitions, prior beliefs, and rules of thumb that we have developed have a
profound effect on how much time we spend gathering information. For the most part, these
intuitions and rules of thumb serve us well. But, at times, our biases can trip us up. So what is a
bias?
Biases in Gathering Information
A bias is simply a disposition that we have: a tendency that leads us to a skewed endpoint in
reasoning. We want to talk about biases that affect how we gather information.
Inattentional Blindness
We are assuming that by now you have watched the Gorilla-on-the-court video (see video link at
start of this unit). Did you see the guy in the Gorilla suit? If you didn’t, don’t feel bad. You are in
the majority of people who, when focused on counting passes, miss the guy altogether. What
explains this? Psychologists call this phenomenon Inattentional Blindness. Psychologists have
demonstrated over and over again that when normal people are focused on doing normal things,
it is quite likely that they will not notice unusual events — even if these events happen right in
front of them.
Figure 4. Screen Capture from Unit 03 Slideshow
Another example of Inattentional Blindness comes from a study out of Western Washington
University. For this study, researchers had a polka-dot-dressed clown ride a unicycle in a busy
square on campus. Researchers wanted to see which pedestrians would notice the clown. It
turns out that the people who were least likely to see the unicycling clown were people talking
on their cell phone. Just 25 per cent of the people talking on their cell phone saw the clown
cycling around the square. This statistic says something about us: it says something about our inability to focus on one task while remaining cognizant of our surroundings.
It is easy to find news stories where Inattentional Blindness trips people up:
A man walking and texting in Los Angeles almost walked into a black bear that was roaming the streets.

A woman in Reading, Pennsylvania tripped and fell into a water fountain at a mall. She had been walking and texting.

A street in London, England, now has padded lampposts! This is to minimize the injuries of people who run into the lampposts while walking and texting.
Sadly, not all texting stories are humorous and light-hearted. Back when Pharrell’s song Happy
was still a hit, a woman posted on Facebook saying how much she loved the song. She was
driving and had just heard the song on the radio. Because of the distraction, she lost control of
her vehicle and crashed. She was killed. There are too many stories like this one. Perhaps you’ve
never run into a lamppost, or fallen into a water fountain, or walked into a bear, or been in an
accident caused by texting. How, then, does this apply to the workplace? Here are a couple of
scenarios.
You are meeting with your supervisor and two other co-op students. Your supervisor is
sitting on one side of the table, the three of you are sitting on the other side. Your
supervisor is giving you detailed instructions of what she wants done by the end of the day.
You are diligently writing down the instructions when you see something out of the corner
of your eye. You look at your co-worker’s computer and see a Not-Safe-for-Work post on
reddit. You are momentarily taken aback by the brazen behaviour of your peer. You want to
say something, but think better of it. Now is not the time or the place. You turn your
attention back to your supervisor only to hear her say, “And that’s about it. Any questions?”
You realize you have completely missed the last part of her instructions.
Here is another example. Your supervisor and you are conducting a time-sensitive experiment in
a lab. The sequence of steps must be timed just right. You are in the midst of the experiment
when your supervisor tells you he needs to step out for a second. He instructs you to let him
know when the timer dings. You look over. There are four minutes left on the timer. You
patiently wait, and when the timer dings, you walk over to the door and poke your head out. You
see your supervisor looking in your direction. You tell him the timer dinged. Your supervisor
looks at you, but doesn’t say anything. You assume he heard you, so you step back into the lab. A
couple minutes pass and your supervisor hasn’t returned, so you go looking for him. You find him
sitting at the desk eating a candy bar. He hadn’t heard you.
Think back to the experiment from Western Washington University. Remember the
statistic: only 25 per cent of the people who were talking on their cell phone saw the clown
cycling around the university square. Let’s say you have a friend who just read the report
and claims that they never miss anything when they are on their cell phone. They even
marshal evidence: they have never tripped into a fountain, walked into a lamppost, or been in an accident. They even boldly claim that they can text during staff meetings at work and not
miss a thing. What should you say to your friend? Here is what we would say: if you are
blind to the fact that you have missed something (because of Inattentional Blindness), how
can you accurately judge that you haven’t missed anything? You can’t.
Attentional Bias
We have a dual burden. Not only are we liable to miss things, we often fall prey to Attentional
Bias — a type of Confirmation Bias. This is a bias which affects the degree to which we examine
and remember evidence.
Let’s examine the claim that the best time to go to Tim Horton’s in South Campus Hall is
between 15 and 25 minutes after lectures begin; if you go then, there will be no line. Given what
you know now about attentional bias, you should immediately question how evidence was
gathered to support that claim.
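To make the problem concrete, here is a minimal simulation (our own illustration, not part of the course materials; the numbers are invented). In this toy model, the length of the line has nothing to do with when you visit, yet an observer who notices and remembers only the confirming visits ends up with a body of "evidence" that looks unanimous:

```python
import random

random.seed(42)  # make the toy example reproducible

def line_is_short() -> bool:
    """Assumed toy model: the line is short half the time,
    completely independent of when you visit."""
    return random.random() < 0.5

confirming = 0      # visits consistent with the 15-25 minute hypothesis
disconfirming = 0   # visits that contradict it
remembered = 0      # what an attentionally biased observer retains

for _ in range(1000):
    in_window = random.random() < 0.5   # visited 15-25 min after class starts?
    short = line_is_short()
    # The hypothesis is confirmed by a short line inside the window,
    # or a long line outside it; anything else contradicts it.
    if (in_window and short) or (not in_window and not short):
        confirming += 1
        remembered += 1                 # hits get noticed and remembered
    else:
        disconfirming += 1              # misses quietly slip by

print(f"All the evidence: {confirming} for, {disconfirming} against")
print(f"What gets remembered: {remembered} for, 0 against")
```

The full record is a coin flip, but the remembered record looks like perfect confirmation. That is attentional bias in miniature: the disconfirming half of the evidence was available the whole time; it simply never registered.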
Interpretive Bias
The second type of confirmation bias that we want to talk about is Interpretive Bias. The basic
idea is that an interpretive bias affects the significance we assign to the evidence we examine.
As an example of this, let’s go back to the Challenger disaster of 1986. On January 28th, 1986,
the Space Shuttle Challenger lifted off from the Kennedy Space Center. Tragically, the vehicle
broke apart 73 seconds into the flight. It was later determined that the cause of the disaster was
a failure of an O-ring in one of the solid rocket boosters. This failure allowed hot combustion
gases to leak out, leading to a chain of events which culminated in the disintegration of the
vehicle.
One of the lead investigators of the incident was Richard Feynman, a well-respected American
physicist. Part of Feynman’s investigation looked at the culture of risk taking in NASA. Feynman
asked a very simple question: What are the chances of a vehicular failure and the subsequent
loss of human life? Feynman discovered that engineers had assessed the risk to be one in one
hundred: so if you launch one shuttle a day for 100 days, you can expect to lose one vehicle.
Management, however, assessed the risk as one in one hundred thousand. From the point of
view of management, you can send the shuttle up each day for roughly 300 years and you could
expect to lose one vehicle.
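To see just how far apart these estimates are, it helps to put them on the same time scale (a back-of-the-envelope check, assuming one launch per day as the lecture does):

\[ \text{Engineers: } \frac{1 \text{ failure}}{100 \text{ launches}} \;\Rightarrow\; 100 \text{ days} \approx 3 \text{ months to an expected loss} \]
\[ \text{Management: } \frac{1 \text{ failure}}{100{,}000 \text{ launches}} \;\Rightarrow\; \frac{100{,}000 \text{ days}}{365 \text{ days/year}} \approx 274 \text{ years to an expected loss} \]

The two assessments of the same system differ by a factor of one thousand.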
This is a huge discrepancy. The engineers and management had all the same data in front of
them. How was it possible for management to assess the risk much lower than the engineers?
Feynman concluded that it was management's interpretation of the evidence that led them
to give a much lower assessment of risk.
It was known at the time that gas would leak out when the O-rings failed. It had happened on
previous flights, but no previous flight had resulted in tragedy. Management, according to
Feynman, took this to be evidence of success. But, as Feynman points out, this is exactly the
wrong conclusion: “...erosion and blow-by are not what the design expected, they are warnings
that something is wrong. The equipment is not operating as expected.” Feynman’s point is this:
You design an O-ring to stop leaks. If the O-ring is leaking, there is something wrong, and you
need to fix it. He says further, “The fact that this danger did not lead to a catastrophe before is
no guarantee that it will not the next time.” Getting away with it in the past doesn’t mean that
you will get away with it in the future. This is a very clear instance of an Interpretive Bias that led
to a tragic end.
Structural Bias
The third type of confirmation bias that we want to talk about is called Structural Bias. The basic
idea is that this bias affects the degree to which evidence is made available to us.
Have you ever wondered what it’s like at the bottom of the ocean? Here is how the author of an
Economist article summarizes the typical attitude towards creatures living on the bottom of the
sea: The ocean floor is a domain of exile. It is the place species remain when they have been
pushed out of intensely competitive shallow-water environments. Then, when waves of
extinction rock the planet, such banished animals vanish and their places are filled by another set
of losers from the shallows. A typical explanation for why this is the case sounds entirely
plausible. Yes, there are fewer resources at the bottom of the ocean. But there are also fewer
species with lower population densities, which makes the struggle for scarce resources less
fierce, which in turn allows species which are no longer evolutionarily viable on the surface to
persist.
As the author of the article points out, this evidence is rather one-sided. It is one-sided because
it is exceedingly difficult to study fossils at the bottom of the ocean. In other words, the received
view is born of a structural bias. Without access to fossils, any evolutionary explanation about
how animals end up down there is tenuous at best. If you’re wondering, scientists have recently
found a deep-sea fossil field in a gorge in the Alps. They have uncovered thousands of fossils
from 68 different species. With this information, they are able to challenge the orthodox view
that the ocean floor is for evolutionary losers.
How to Guard Yourself
We have discussed four biases that affect how we collect and examine information. It is very
easy to be tripped up by these biases. In fact, they are structured in such a way as to evade
detection. Even if we are aware of them, it doesn’t guarantee that we will catch them in action.
This is called a bias blind spot. We hinted at it in the first unit. Even if we are aware that we are
prone to error, we can quickly convince ourselves that we are not, at this moment, making a
mistake. Why?
We are storytellers, and we like stories that are consistent. And when it comes to gathering
information and evidence, we have a tendency to stop looking once we are able to tell a
consistent story. Note that we are using the word story very broadly here and mean to include
things like theories and explanations. Go back to the example of Mr. Wheeler. Mr. Wheeler was
convinced that lemon juice would hide his face from security cameras. He even tested his theory.
He took a picture of his face with a Polaroid camera, and the photo came out blank. To Mr.
Wheeler’s mind this confirmed his theory. The information he gathered from the photo was
consistent with his belief that lemon juice has invisible-making properties. So why bother doing any
more testing? And this is where the confirmation bias tripped him up. He should have done
more testing. He should have taken more pictures of his face with different types of cameras. He
should have looked at himself in the mirror and asked, “Why can I still see my face?” The
confirmation bias convinced him that he had all the information he needed.
Everyone is susceptible to the same type of error that Mr. Wheeler made. And if you think
otherwise, that is likely because you have fallen prey to the confirmation bias. So, what are some
ways to avoid being tripped up?
1. Insert yourself into a community of skeptics
This can be as simple as frequenting different websites committed to checking facts and
debunking common myths and misconceptions. Or perhaps you can surround yourself with
friends and colleagues who have developed good behaviours, like fact checking, asking obvious
questions, and examining issues from different perspectives.
2. Listen to the naysayers
It is easy to let group-think take over when working collaboratively. So when working in a group,
be sure to listen to what the naysayer has to contribute. This is not easy to do. If you think
something is so painfully obvious, and should be obvious to anyone with a brain, it is very easy to
dismiss the person who insists on asking pesky questions.
3. Question your assumptions
Ask yourself questions like: What would need to happen for me to be wrong? Under what
conditions would I be wrong? What will happen if I am wrong? What information/evidence am I
overlooking? Finally, who can I ask for impartial feedback?
Learning about biases and taking steps to root out the potential biases in our thinking is a lot of
work, but no one ever said self-betterment was easy. If they did, they were trying to sell you
something! We’re here to tell you that your hard work will pay off. You will become a valuable
person to have around and this will undoubtedly engender future success.
Remember this: biases affect how we collect and examine
information.
Extra Resources
Confirmation Bias: Science Gone Wrong
Author Simon Singh Puts Up a Fight in the War on Science
https://www.wired.com/2010/08/mf_qa_singh/
"Wired: Is science under assault?” Simon Singh: “What shocks me is people who have no
expertise championing a view that runs counter to the mainstream scientific consensus. For
example, we have a consensus amongst the best medical researchers in the world - the leading
authorities and the World Health Organization - that vaccines are a good thing, and that MMR,
the triple vaccine, is a really good thing. And yet there are people who are quite willing to
challenge that consensus - film stars, celebrities, columnists - all of whom rely solely on the tiny
little bit of science that seems to back up their view." (Emphasis is the course instructor’s)
This is an instance of a confirmation bias. Specifically, it is an instance of attentional bias. If you
believe that vaccines cause autism and then only focus on anecdotal evidence that suggests
you’re right, don’t be surprised if your belief that vaccines cause autism is confirmed. Don’t look
for confirming evidence. Look for evidence which suggests you’re wrong.
Here is a link to the Rogers Commission Report. We reference this report in the lecture. Pay
close attention to R. P. Feynman’s contribution in Appendix F.
Appendix F - Personal observations on the reliability of the Shuttle (TXT)
https://science.ksc.nasa.gov/shuttle/missions/51-l/docs/rogers-commission/AppendixF.txt
© University of Waterloo and others