Lecture 27
Social & Legal Issues
(S&G, ch. 14)
5/29/2016
Social Implications of
Artificial Intelligence
Kismet (Brooks’ Lab, MIT)
• Responds to a face
with a happy
expression
• Responds to rapidly
moving face with
disgusted expression
[Video clip: Kismet responding to faces]
Kismet (Brooks’ Lab, MIT)
• Example of three-way
conversational
interaction
[Video clip: three-way conversational interaction with Kismet]
Social Implications of AI
• Elimination of jobs
• Replacing human judgment with machine judgment
• Altering our understanding of ourselves
Consider Computers that are
More Intelligent than Humans
• Suppose it’s impossible
– establishing this would require some kind of scientific
proof that a computer cannot emulate a brain, or that a
brain is not sufficient for intelligence
– in either case, a major scientific breakthrough
• Suppose it’s possible
– could displace us in all intellectual activities
– if we had nothing to do, what would it mean to be
human?
– would/should humans be obsolete?
Computer Enhancements of
Humans
• Computers are already embedded in humans to
correct problems
– pacemakers, cochlear implants, etc.
• Computers have also been embedded to enhance human abilities
• This technology may advance very quickly
• How does this affect our understanding of human ability and accomplishment?
– analogy: use of performance-enhancing drugs
Social & Legal Issues
Promethean Fire
• “All technology is non-neutral”
• Technologies influence
focus & action
• Computer technology has:
– social benefits
– social costs
• Often, issues are not peculiar to computers, but are
amplified by computer technology
• Sometimes problems arise from using computers
as they are intended to be used
Privacy
• Government databases (federal & state)
• Computer matching
• Computer profiling
• Open & closed files
Uses of Personal Data
• Marketing
• Decision making
• Other uses:
– secondary use
– invisible information gathering
– information from children
• “Information underground”
Example: Google Mail
• Google.com is introducing a new, free email
service
• Why?
• By scanning the content of your emails, they will be able to compile a profile of your interests (see the sketch below)
• Then they will be able to present advertisements targeted to you
• A way of bringing you only the information you
are potentially interested in and stimulating
business?
• An invasion of privacy?
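A minimal sketch of the scanning-and-profiling idea, in Python. The categories, keywords, and messages below are invented for illustration; this is not Google's actual method, just a toy keyword-count profiler that picks an ad from the dominant interest.

```python
from collections import Counter

# Invented interest categories and keywords -- illustration only.
INTEREST_KEYWORDS = {
    "travel":  {"flight", "hotel", "itinerary", "vacation"},
    "finance": {"loan", "mortgage", "invest", "stock"},
    "sports":  {"game", "score", "playoffs", "tickets"},
}

def profile_interests(messages):
    """Count keyword hits per category across a user's messages."""
    profile = Counter()
    for text in messages:
        words = set(text.lower().split())
        for category, keywords in INTEREST_KEYWORDS.items():
            profile[category] += len(words & keywords)
    return profile

def pick_ad(profile):
    """Target the ad at the highest-scoring interest category."""
    category, _ = profile.most_common(1)[0]
    return "ad for: " + category

inbox = [
    "Your flight and hotel itinerary for the vacation",
    "Quarterly stock report and earnings summary",
]
print(pick_ad(profile_interests(inbox)))   # -> ad for: travel
```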
Protecting Privacy
• Access control
• Audit trails (see the sketch below)
• Encryption
• Ethical use policies
• Informed consent
• Regulation
• Ownership of data
• Contracts
• Markets, options & consumer pressure
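A minimal sketch of the first two mechanisms (access control and an audit trail), assuming a hypothetical role table and log file; a real system would back these with a database and a tamper-resistant log.

```python
import logging
from datetime import datetime, timezone

# Hypothetical role-to-permission table (illustration only).
PERMISSIONS = {
    "clerk":   {"read"},
    "auditor": {"read"},
    "admin":   {"read", "write"},
}

# Every attempt, allowed or not, is appended to the audit trail.
logging.basicConfig(filename="audit.log", level=logging.INFO)

def access_record(user, role, action, record_id):
    """Enforce access control and log the attempt."""
    allowed = action in PERMISSIONS.get(role, set())
    logging.info("time=%s user=%s role=%s action=%s record=%s allowed=%s",
                 datetime.now(timezone.utc).isoformat(),
                 user, role, action, record_id, allowed)
    if not allowed:
        raise PermissionError(f"{role} may not {action} record {record_id}")
    return f"{action} granted on record {record_id}"

print(access_record("alice", "clerk", "read", 42))   # allowed and logged
# access_record("alice", "clerk", "write", 42)       # logged, then raises
```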
Computer Error
• Overdependence on computers
• Example illustrating issues: computerized
voting
– illustrates both ampliative & reductive aspects
• Major failures
• Lesser failures
• Limitations of computer simulation
Simulation & Models
• Recall our discussion of models:
– a model is intended to address a certain class of
questions or issues
– models make simplifying assumptions
• Recall the omission of aerodynamic
stability from bridge models before the
Tacoma Narrows Bridge disaster
• Nevertheless, computer models &
simulations are essential
Evaluating Models
• How well is the underlying science
understood?
• What sorts of simplifying assumptions are
made?
• How well does the model agree with experimental results? (see the sketch below)
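The last question can be made quantitative. A minimal sketch, using invented numbers and a root-mean-square error as one very simple measure of agreement between model predictions and measurements:

```python
import math

def rms_error(predicted, measured):
    """Root-mean-square disagreement between model output and data."""
    diffs = [(p - m) ** 2 for p, m in zip(predicted, measured)]
    return math.sqrt(sum(diffs) / len(diffs))

# Invented numbers: model-predicted vs. measured deflections (cm).
model_output = [0.0, 1.2, 2.5, 3.9, 5.6]
measurements = [0.1, 1.4, 2.3, 4.4, 6.5]

print(f"RMS error: {rms_error(model_output, measurements):.2f} cm")
```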
Computer Crime
• Often just a new or “better” way to commit
traditional crimes
• Ampliative aspects of computer technology
– may allow committing crimes on a larger scale
• Reductive aspects of computer technology
– may make it harder to detect crimes, identify
criminals, etc.
Some Kinds of Computer Crime
• Fraud, embezzlement & theft
• Stealing (data)
• “Hacking” (system cracking)
– individual systems
– the information infrastructure
– benign hacking?
• Viruses & worms: reproduce themselves
– virus: hides in another program
– worm: an independent program
Constitutional & Civil Liberties
Issues
• 1st Amendment protects freedom of speech
& press
• 4th Amendment protects against
unreasonable search & seizure
• Do they apply to electronic & other new
media?
• So far, the courts generally have held them
to apply, but there is much uncertainty
Communication Media
• Is the operator of a communication medium
responsible for the information it carries?
– analogy: telephone companies & post office
– analogy: a privately-owned meeting room
• Can the operator restrict this information?
– analogy: telephone companies & post office
– analogy: private presses
Encryption & Wiretapping
• Encryption can be used to ensure privacy & authenticity (see the sketch below)
– may be used for legal or illegal purposes
• May be subject to export restrictions
• What provisions should be made for
government interception & recording of
communications?
• What limits should be imposed on the government?
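A minimal sketch of encryption providing both privacy and authenticity, using the third-party Python cryptography package's Fernet recipe (authenticated symmetric encryption); key distribution is simplified for illustration.

```python
# Requires the third-party "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()                    # secret shared by sender & receiver
token = Fernet(key).encrypt(b"meet at noon")   # ciphertext + integrity tag

print(Fernet(key).decrypt(token))              # b'meet at noon' -- privacy: key needed

try:
    Fernet(Fernet.generate_key()).decrypt(token)    # wrong key
except InvalidToken:
    print("cannot read or verify without the key")  # authenticity check fails
```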
These issues are especially relevant since the passage of the USA PATRIOT Act.