Ethics Module - Sacred Heart University

Title: Cyber Ethics: The Ethics of Information Technology
Author: Frances S. Grodzinsky
Professor of Computer Science and Information Technology, Sacred Heart University
Grodzinskyf@sacredheart.edu
Description: This module encompasses topics related to computing and society. They all
deal with what happens at the intersection of society and technology. We are all users of
technology, and as such we should recognize that not only does technology impact
society, but society also impacts technology. To that end, there must be an ethical
dimension both in the development and implementation of technological solutions to
problems and in the use of technology in everyday life. This two-dimensional focus
broadens the discussion of computer ethics to that of cyber ethics, Internet ethics, or
information ethics. Cyber ethics is the broadest of these designations, as it encompasses
computer technology that may not be related to the Internet. Discussions and dialogues
about cyber ethics are important not only for computer science and information
technology majors, but also for the wide spectrum of students and faculty at the
university, all of whom are computer users.
This module will help you to investigate questions at the intersection of computing
technology and society from an ethical perspective. To be an ethical computer
professional or computer user, one has to know the values that underlie one's
decision-making process. While this is largely and ultimately an individual process and
choice, the values one espouses become the foundation for the individual's interaction
with society and technology. Aristotle believed that becoming an ethical person is akin to
an apprenticeship under the tutelage of moral mentors. The role of the teacher/scholar in
cyber ethics is to guide her students toward this goal within the safe boundaries of the
classroom, to develop undergraduate curricula around ethical issues that impact
computer users, and to disseminate these ideas through the incorporation of one or more
modules within the classroom. It is through this incorporation that we can evaluate the
ethical values that should be espoused in the use of computer technologies, and the
policies that should be developed alongside these technologies to ensure their ethical
implementation.
Module:
Relevant Topics for Discussion:
Justice, Autonomy, Freedom, Democracy and Privacy are all values that can be supported
or threatened by computer technology. Considering that computer technologies are the
fundamental infrastructure of the Information Age, ethical questions arise regarding
access and control, privacy, property, identity and professional responsibility. We can
break down these issues into three broad sections:
1. Ethical Process and Conceptual Frameworks
a. Here is a framework for ethical analysis that can be used in oral and written
work for your course. See The Ethical Process by Marvin Brown, published
by Prentice Hall. This little workbook details how one presents ethical
arguments: the proposal, observations, assumptions, and value judgments.
Using worksheets, it teaches how to develop counterarguments and how to
support your position using ethical theories. There is also a good chapter on
philosophical ethics (Chapter 2) in Ethics and Technology by Herman Tavani,
published by Wiley. Tavani also has a good treatment of logical argument and
critical thinking skills, and he gives a detailed analysis and tools for
evaluating cyber ethical issues.
i. Both of these books evaluate Utilitarianism, Deontology, Virtue Ethics
and Rights theories. Discussions of these theories can be found in the
Philosophical Ethics Module in this grouping.
ii. James Moor has developed an interesting theoretical approach called
Just Consequentialism that marries a consequentialist analysis with
theories of justice (see Moor, “Just Consequentialism and
Computing”).
iii. This ethical analysis can be used in your course with any given
question that requires a position. Using the worksheets in groups, or
with the class as a whole, gives students a handle on ethical analysis skills.
b. An issue of importance that could be discussed in this section is whether
Cybertechnology, or any technology, is value laden or value neutral.
i. Questions to ask:
1. Are values embedded in technological artifacts?
2. Are biases evident in the design of technology? Where?
a. Example 1: one could look at the user interfaces to
determine if there is gender bias, or insensitivity to
users with color blindness.
b. Example 2: one could study computer game interfaces
and content for gender bias.
c. Example 3: can be chosen from any technological
artifact, such as the iPod, cell phone, etc.
2. Issues of Access and Control
In this section, relevant topics include: equity of access, computer crime,
democracy, content control and free speech, the Digital Divide, privacy, and
property. Issues around these topics could be raised in sociology, criminal justice,
political science, and history classes. In the sciences, issues of data protection,
especially within the medical community, and biometric indicators would provide
the class with a perspective on some cutting-edge ethical issues. Philosophy
classes could tackle the problem of privacy as a social and/or individual good.
Here are some subtopics that could be used under the more general ones and
associated questions to investigate. The bibliography provides sources for all
these topics:
a. Computer Crime
i. Hackers and Crackers
1. Hacker code of ethics
2. Peer group pressures / reputation
3. Internet crime rings
ii. Cyber crime
1. Trespass
a. Viruses
b. Theft of data and information
2. Piracy
a. Software copying
i. Proprietary software v. Open source
ii. This can also be discussed under Intellectual
Property
b. Movies / CDs / DVDs
c. Identity Theft
3. Predatory crimes
a. Stalking
b. Child pornography
c. Chat Room predators
iii. Security in cyberspace: The Internet and ethical values
1. Protecting computer systems from unauthorized access,
viruses, worms, bots
2. Protecting data that is stored in computer systems.
3. Countermeasures: Do they work? What are we giving up?
a. Firewalls
b. Virus protection software
c. Encryption tools
b. Cybertechnology & Democracy:
i. Is the Internet a democratic medium?
a. What is democracy?
b. Can we find the characteristics of a democracy on line?
c. Is privacy necessary for democracy?
ii. Does the Internet facilitate democracy and democratic ideals?
a. If so, how?
b. Look at the use of the Internet in democratic vs. non-democratic countries such as China or North Korea.
iii. Surveillance: Security using computer technology
a. Biometric markers
b. Cookies: How much do we know about the cookies
stored on our systems?
i. Marketing strategies using cookies: Look at
sites like Amazon.
ii. Tracing our web usage through cookies.
c. RFID tags in clothing
d. CCTV - cameras used in the UK to monitor street
activity
e. Chips implanted in pets.
i. Should we put chips in our children, in case of
kidnapping? This is a provocative question that
generates a lot of discussion.
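For the cookie discussion above, a short hands-on demo can make the tracking question concrete. The sketch below uses only Python's standard library; the header string, cookie names, and values are invented for illustration, not taken from any real site.

```python
from http.cookies import SimpleCookie

# A cookie string like a marketing site might set (names/values are made up).
header = 'visitor_id=abc123; tracker=page%3Dcheckout; session=xyz'

cookie = SimpleCookie()
cookie.load(header)

# Each morsel is a name/value pair the browser sends back on every request
# to the site, which is what makes tracking across visits possible.
for name, morsel in cookie.items():
    print(name, '=', morsel.value)
```

Students can follow this by opening their own browser's cookie viewer and asking which of the stored names they recognize, and who set them.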
iv. An important question: what are we giving up in the name of
security? Often, the answer is privacy.
a. A study of the 4th Amendment and the Patriot Act is
enlightening.
b. Older technology that raises this question is the Clipper
Chip.
c. Newer technologies: Is there what James Moor calls an
“ethics gap,” i.e., technology developed without
consideration of its ethical implications?
i. Body scans
ii. Automobile black boxes
iii. RFID tags in passports
c. Content Control vs. Free Speech
i. Question: Is Free Speech an Absolute or a Conditional Right?
ii. Spam
a. Contrast spam to junk mail. How is it the same? How is it different?
iii. Pornography
a. Should we censor pornographic web sites?
b. Child pornography
i. Look at CIPA: the Children's Internet Protection Act
c. Interesting legal cases and attempts at making laws, most
of which have been overturned (see Tavani text).
iv. Hate web sites and hate speech
a. Should we censor hate web sites?
v. Filtering programs
a. What are filtering programs?
b. How do they work? What is filtered? Who decides?
c. Should libraries have filtering programs on their
computers?
i. There are interesting cases in the courts around
this issue (United States v. American Library
Association, March, 2003).
d. Digital Divide: See Warschauer and Thorseth references
i. Local – within the US (see NTIA references)
a. What is it based on? Economics? Literacy?
ii. Global—across countries
a. The role of the Internet in developing countries
i. Issues of Cultural Differences
ii. Issues of infrastructure
iii. Racial inequities
iv. Gender inequities
a. See the work of Gumbus and Grodzinsky on the role of
gender bias in the workplace.
b. Another interesting topic is to discuss the role of gender in
computer game interfaces.
i. How are men portrayed?
ii. How are women portrayed?
v. Disability inequities
a. Equity of access to computer hardware
b. Equity of access to web sites
i. Are web sites accessible to users with disabilities?
ii. Look at the World Wide Web Consortium web
site
e. Privacy
i. Why is privacy important?
a. Privacy as a social good
i. Is it only a social good for a democracy?
ii. This question could lead to one on a global
perspective on privacy.
1. Look at China, North Korea, countries in
the former Eastern Bloc and Latin
America for examples.
b. Privacy as an individual good
ii. Privacy is not mentioned in the US Constitution
a. From where does our privacy protection come?
i. Look at the 3rd and 4th Amendments of the US
Constitution
ii. Warren and Brandeis's article “The Right to
Privacy,” published in the Harvard Law Review
in 1890.
iii. Discuss the concept of a zone of privacy
iii. Gathering personal information
a. Question: What is the difference between confidential personal
information and non-confidential personal information?
i. Should there be some protection for the latter in
cyberspace?
1. Kinds of information: divorce records
that include bank information, which
thieves can use to order checks
2. Identity theft
3. Google your telephone number and see
what is revealed about you.
4. Do a Google search on yourself.
b. Data mining: Finding patterns based on transaction information
i. Should buyers be informed on this secondary use
of their information?
ii. Should they be allowed to opt out?
c. Data merging: Merging records of multiple databases to create a
more complete source of information
i. Is this illegal?
ii. What are the restrictions on creating these
computer profiles?
d. Matching records
i. Are we innocent until proven guilty or through
matching records, are we guilty until proven
innocent?
ii. Matching fingerprints to lists of suspected
terrorists
iii. Matching pictures of people entering the Super
Bowl to pictures of terrorists
iv. What is the potential for false identification?
1. What is the recourse for those accused?
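The false-identification question above can be demonstrated in a few lines of code. The records below are invented; the point is that matching on a single field (a name) flags people who are not on the watch list at all.

```python
# Toy record matching: compare a passenger list against a watch list.
# All names and dates are fabricated for the classroom example.
watch_list = [{"name": "J. Smith", "dob": "1970-01-01"}]
passengers = [
    {"name": "J. Smith", "dob": "1985-06-15"},  # different person, same name
    {"name": "A. Jones", "dob": "1990-03-02"},
]

# Matching on name alone, as some real screening systems effectively did:
name_matches = [p for p in passengers
                if any(p["name"] == w["name"] for w in watch_list)]

# Matching on name AND date of birth:
exact_matches = [p for p in passengers
                 if any(p["name"] == w["name"] and p["dob"] == w["dob"]
                        for w in watch_list)]

print(len(name_matches))   # 1 -- someone is flagged on name alone
print(len(exact_matches))  # 0 -- the flag was a false positive
```

The follow-up discussion writes itself: what happens to the flagged passenger, and what recourse do they have?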
iv. Privacy policies
a. Private policies: Is there an expectation of privacy
i. At work?
ii. At university?
iii. A good way to frame this discussion is around
email monitoring.
b. Legislation in the US
i. Privacy Act (1974)
ii. Electronic Communications Privacy Act (1986)
iii. Video Privacy Protection Act (1988)
iv. Health Insurance Portability and Accountability
Act (1996)
v. Children's Online Privacy Protection Act (1998)
vi. Real ID Act (2005)
c. It is interesting to contrast this privacy legislation to the privacy
legislation put forth by the European Union.
v. Privacy-enhancing tools
a. PGP: Pretty Good Privacy (there is a web site)
b. PETs: privacy-enhancing technologies
c. Encryption
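A toy cipher can demystify the encryption entry above. The XOR cipher below is emphatically not secure and is only a classroom illustration of the idea that a shared key both locks and unlocks a message; real tools such as PGP rely on vetted algorithms and careful key management.

```python
# A toy XOR stream cipher (NOT secure; for classroom illustration only).
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XORing with the same key twice returns the original bytes,
    # so this one function both encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"Meet at noon"
key = b"secret"

ciphertext = xor_cipher(message, key)
assert ciphertext != message                    # unreadable without the key
assert xor_cipher(ciphertext, key) == message   # round-trips with the key
```

Students can then be asked what breaks when the key must travel over the same network as the message, which motivates public-key systems like PGP.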
f. Property Rights
i. What is Intellectual Property and how is it protected?
a. Copyright
i. Examine the initial intent of copyright law and
how it was modified.
b. Patent
c. Trade Secrets
d. Use a Coke bottle or some other soft drink bottle to illustrate
each of these protection devices.
e. In this discussion it is interesting to emphasize the differences
among the three forms of protection and the motivation for these
laws.
ii. New Models for Digital Intellectual Property
a. Creative Commons License – Larry Lessig
i. Involves intellectual property on the web; what
you can and cannot borrow in building a web site
b. Digital Rights Management (DRM) and Fair Use
i. DRM uses technological measures to prevent the
unauthorized distribution of intellectual property
1. What are the implications for Fair Use?
iii. Open source software (see Grodzinsky, Miller, Wolf)
a. A study of the Free Software Foundation web site will give you
articles by Richard Stallman and the story of the founding of
GNU free software.
b. It will also give you information about the licensing of Free
Software.
c. For articles on open source software and its break from
Stallman and the Free Software Foundation, see Eric Raymond's:
The Cathedral and the Bazaar
Homesteading the Noosphere
The Magic Cauldron
d. The last two are particularly good for economics classes and
classes in social theory.
iv. P2P networks and file sharing
a. The sharing of music files has become a very important issue,
especially for college students
i. The Verizon case is an interesting example of an
Internet Service Provider that refused to give up the
names of its members to the recording industry, and
of how it challenged the provision of the Digital
Millennium Copyright Act that the RIAA tried to
use against it. (See Grodzinsky and Tavani)
ii. Look at the Apple model of distribution of music
over iTunes
iii. Look at other distribution models (see Litman)
3. Impact on Human Life
Issues of interest in this section include: identity, biometrics, professional responsibility
and accountability, the workplace, virtual community, humans and machines, and shaping
information technology for the future. These issues lend themselves nicely to science,
religious studies, sociology, philosophy, and psychology classes. Topics on what it
means to be human and on online identity can generate lively discussion.
a. Cyborgs and Transhumans
i. What does it mean to be human? Nanotechnology, virtual reality and
embedded technology all have ethical dimensions that will impact how we
understand what it means to be “human”.
ii. Artificial Intelligence
a. Do computers think?
i. Is it the same as human “thinking”?
b. What are Transhumans?
i. Nick Bostrom, of Oxford University, has done a
lot of work in this field and has a website.
iii. Robotics
a. Look at films involving robots: Spielberg’s Artificial
Intelligence; Terminator; Star Wars
b. Questions: If we build robots to be like people, should
we give them free will? Should we program them with
pain? If we give them free will and they want freedom,
should they be free?
iv. Nanotechnology
a. Computer chips are getting smaller and smaller
b. Positive uses
i. Artificial limbs
ii. Restoring human functions
c. Negative uses?
i. What should the limitations be?
d. Should we be going in this direction?
v. Who will shape society in the 21st century?
a. Kurzweil has written many books on this
i. See The Age of Spiritual Machines: When
Computers Exceed Human Intelligence
b. Identity and Community
i. Online communities
a. Virtual reality
b. Rape in cyberspace (see LambdaMOO and Julian Dibbell)
ii. Self-expression
a. instant messaging
b. blogs
iii. Sense of self: Who are we in cyberspace?
a. Core identity
b. Exploring identity
c. Multiple windows / multiple selves?
i. Sherry Turkle has done interesting work in this
area.
iv. Questions for discussion:
a. What is our responsibility as users to our online
community?
b. Where are we going in the 21st century?
c. Reorganization of Work
i. Surveillance in the workplace
a. Who owns the networks in the workplace?
b. Should email be monitored?
c. What is your expectation of privacy in the workplace?
d. Is surveillance in the workplace "fair"?
ii. Telecommuting: How has technology changed how we do our work?
a. Job sharing
b. Disability access
c. Outsourcing
d. Repetitive strain injuries and ergonomics in the
workplace
iii. Ethics in the workplace (see Gumbus- Business ethics module on this
site)
d. Whistleblowing / errors
i. What kinds of technological errors deserve to be made public?
ii. Who are whistleblowers? How can they protect themselves and the
public at the same time?
a. There are several good films on this: Norma Rae, The
Insider
b. Several recent cases: Enron, Arthur Andersen, BART,
Therac-25
II. Suggestions for classroom activities:
a. Ethics Quiz: I have developed an Ethics Quiz that can be tailored to your area. It is
posted on the website. In this quiz, students are asked to determine if an issue is
ethical or not. It is a binary decision: yes or no. It takes 5 minutes of class time.
Then we count up the votes for each question, and students give the reasons for their
votes. Two things inevitably happen: 1. Students find it hard to say whether a decision is
ethical or not. They want to say "maybe" or "it depends." I am firm in making them
say yes or no. 2. Ultimately, after all the discussion, students realize that each
question involves an ethical dilemma on some level. The purpose of this quiz is to
sensitize students to ethical issues, i.e., to realize that most decisions in life have an
ethical component associated with them. This activity is a great icebreaker in class.
b. Written Assignments: Using the ethical process, present a proposal stating your
observations, assumptions and value judgments. Present counter arguments and discuss
why you could not accept them. Support your arguments with ethical theory.
i. These can either be short one-two page papers, or longer 3-5 page
essays.
ii. The outlines for the papers can be developed in class.
iii. Rubrics for grading papers are on this website.
c. Small Collaborative Group Activities:
i. Divide the class in pro / con groups around issues. Have each group
write the proposal, e.g. we should abolish the death penalty in this state;
we should not abolish the death penalty in this state. Each group then
develops its arguments and presents them to the class with its opposing
group. The class judges the effectiveness of the arguments.
a. This can also be set up as a courtroom with lawyers, a judge, and a jury.
Each side presents its arguments and the jury returns a verdict.
ii. Chairs: There are three chairs in the front of the room. The professor
writes a position statement on the board. One chair is the pro chair, the
other the con chair, and the third is a metaethics chair. Whoever sits in
a chair can only argue from that position. The metaethicist observes
how the other two are arguing: e.g., this person is offering a utilitarian
argument; or, this person is not supporting his/her argument and is
appealing to emotion. A person can only speak when sitting in the chair,
so there is a lot of shuffling around in this exercise, but it can be a very
energizing way to lead a discussion.
iii. Have students choose a short story or film that depicts an ethical issue
relevant to your course. In small groups, have them design a video or
short scene that depicts the ethical choices that the characters face. The
group should lead a discussion around these choices.
iv. Team Presentations: I have found that having teams introduce the
topic of the day in a PowerPoint, or other presentation format (some use
illustrative film clips), is a very effective means of energizing the class.
This format also makes me aware of what issues in the topic are of
particular interest and/or concern for my students. In these presentations,
they are required to discuss the Readings assigned for that day and to
engage the class in discussion. This format is very popular with my
students who are seniors and can handle the responsibility that this
activity entails. It is difficult for the instructor, who then has to fill in
the important issues on the topic that were not discussed. Instructors
could check presentation outlines in advance to make sure the topic is
covered thoroughly. I have not found this to be necessary. I usually
follow this activity with a small collaborative exercise that applies what
they have learned.
d. Online Activities:
i. Discussion Boards: For each half of the semester, I have each student post at
least one news article concerning an ethical issue involving technology. Other
students are responsible for discussing the posted article. Part of their grade is
based on their participation on the discussion board. This board takes off and
there is a lot of interesting discussion around the articles. Students relate them
back to class discussion as well.
ii. Discussion Boards, Readings, Student Journals: Students who present
readings in class post three questions on the Readings Discussion Board. They
will center their class presentation of the reading on these questions. The rest of
the class responds to the questions in their online student journals after reading
the material. Their journal entries combined with their responses in class count
towards a classroom grade.
iii. Learning Units: I have created learning units around the topics I present in
class. These help the students focus on what is important in the lesson and are
always on the Blackboard website for later use.
Acknowledgement: I would like to thank Professor Steve Lilley, Associate Professor of
Sociology who has taught Computer Ethics, Society and Technology as an Honors course
with me here at SHU. The organization of topics is one that we have fine-tuned together.
Professor Herman Tavani, Professor of Philosophy, Rivier College has been a co-author
and friend.
Bibliography:
Articles and Books on specific topics:
Grodzinsky, Frances S. (2000). “Equity of Access: Adaptive Technology,” Science and
Engineering Ethics, Vol. 6, No. 2.
Grodzinsky, Frances S. (2000). “The Development of the ‘Ethical’ ICT Professional
And The Vision of an Ethical On-line Society: How Far Have We Come and Where are
We Going?” Computers and Society, Vol. 30, No. 1, March.
Grodzinsky, F. and Tavani, H. (2006). “P2P Networks and the Verizon v. RIAA
Case: Implications for Personal Privacy and Intellectual Property,” Ethics and
Information Technology, Vol. 7, No. 4 (special issue: Surveillance and Privacy),
pp. 243-250. Springer Netherlands. ISSN: 1388-1957 (paper), 1572-8439 (online).
DOI: 10.1007/s10676-006-0012-4.
Grodzinsky, F.S., Miller, K., and Wolf, M.J. (2003). “Ethical Issues in Open Source
Software,” Journal of Information, Communication and Ethics in Society, Vol. 1, No. 4,
pp. 193-205. Troubadour Publishing, London.
Grodzinsky, F. and Gumbus, A. (2006). “Internet and Productivity: Ethical
Perspectives on Workplace Behaviour,” Information, Communication and Ethics
in Society, Vol. 3, pp. 249-256.
Gumbus, A. and Grodzinsky, F. (2004). “Gender Bias in Internet Employment: A Study
of Career Advancement Opportunities for Women in the Field of ICT,”
Journal of Information, Communication and Ethics in Society, Vol. 2, No. 3, July.
Johnson, Deborah G. (2001). Computer Ethics. 3rd. ed. Upper Saddle River, NJ: Prentice
Hall.
Johnson, Deborah G. (2000). "Democratic Values and the Internet." In D.
Langford, ed. Internet Ethics. New York: St. Martin's Press, pp. 180-199.
Kurzweil, Ray (1999) The Age of Spiritual Machines: When Computers Exceed Human
Intelligence. New York: Penguin.
Litman, J. (2004), Sharing and Stealing,
http://www.law.wayne.edu/litman/papers/sharing&stealing.pdf.
Moor, James H. (2001) “Just Consequentialism and Computing”, in Readings in
Cyberethics, eds. Spinello and Tavani, Jones and Bartlett.
Moor, James H. (1998). “Reason, Relativity, and Responsibility in Computer Ethics.”
Computers and Society, Vol. 28, No. 1, 1998, pp. 14-21.
Negroponte, Nicholas (1995). Being Digital. New York: Alfred A. Knopf.
National Telecommunications and Information Administration (NTIA). (1995). Falling
Through the Net: A Survey of the Have-nots in Rural and Urban America. Washington,
DC: US Department of Commerce. Retrieved from
http://www.ntia.doc.gov/ntiahome/fallingthru.html.
National Telecommunications and Information Administration (NTIA). (1999). Falling
Through the Net: Defining the Digital Divide. Washington, DC: US Department of
Commerce. Retrieved from http://www.ntia.doc.gov/ntiahome/fttn99.html.
National Telecommunications and Information Administration (NTIA). (2000). Falling
Through the Net: Toward Digital Inclusion. Washington, DC: US Department of
Commerce. Retrieved from http://www.ntia.doc.gov/ntiahome/fttn00/contents00.html.
Nissenbaum, Helen (1994). “Computing and Accountability,” Communications of the
ACM, Vol. 37, No. 1, pp. 73-80.
Perens, B. (2002) “Debian Social Contract.” www.debian.org/social_contract.html.
Perens, Bruce (2006) “The Emerging Economic Paradigm of Open Source”,
http://perens.com/Articles/Economic.html.
Raymond, E.S. (2000) “Homesteading the Noosphere.”
http://www.catb.org/esr/writings/homesteading/homesteading/.
Raymond, E.S. (2001) “The Cathedral and the Bazaar,” in Readings in Cyberethics, eds.
Spinello and Tavani. Sudbury, MA: Jones and Bartlett.
Raymond, E.S. (2002) “The Magic Cauldron” v. 2.0.
http://www.catb.org/~esr/writings/cathedral-bazaar/magic-caulddron/index.html.
Rheingold, Howard (2001). The Virtual Community: Homesteading on the Electronic
Frontier. Revised ed. Cambridge, MA: MIT Press.
Stallman, Richard (2001) “Copyright versus Community in the Age of Computer
Networks” http://www.gnu.org/philosophy/copyright-versus-community.html.
Stallman, Richard (2002) “Linux, GNU, and Freedom,”
http://www.gnu.org/philosophy/linux-gnu-freedom.html.
Stallman, Richard (1985) “The GNU Manifesto” http://www.gnu.org/gnu/manifesto.html.
Stallman, Richard (1998) “Why ‘Free Software’ is Better than ‘Open Source’,”
http://www.gnu.org/philosophy/free-software-for-freedom.html.
Stallman, Richard (1992) “Why Software Should be Free,”
http://www.gnu.org/philosophy/shouldbefree.html.
Stallman, Richard (1994) “Why Software Should Not Have Owners,”
http://www.gnu.org/philosophy/why-free.html.
Shade, L.R. ( 2002). Gender and Community in the Social Construction of the Internet.
New York: Peter Lang.
Spinello, Richard A. (2001). “Internet Service Providers and Defamation: New Standards
of Liability.” In R. A. Spinello and H. T. Tavani, eds. Readings in CyberEthics. Sudbury,
MA: Jones and Bartlett, pp. 198-209.
Sunstein, Cass R. (2002). Republic.com. Princeton, NJ: Princeton University Press.
Tavani, Herman T., and Frances S. Grodzinsky (2002). “Cyberstalking, Personal Privacy,
and Moral Responsibility,” Ethics and Information Technology, Vol. 4, No. 2, pp. 123-132.
Tavani, Herman T. (2000). “Defining the Boundaries of Computer Crime: Piracy,
Break-ins, and Sabotage in Cyberspace,” Computers and Society, Vol. 30, No. 4, pp. 3-9.
Tavani, Herman T. (2003). “Ethical Reflections on the Digital Divide,” Journal of
Information, Communication and Ethics in Society, Vol. 1, No. 2, pp. 99-108.
The Open Source Definition (2006), version 1.9
http://www.opensource.org/docs/definition.php.
Thorseth, M. (2005). “IT, Multiculturalism and Global Democracy—Ethical Challenges,”
Institute for Philosophy, Pedagogy and Rhetoric, Copenhagen University. Papers for a
workshop on positive discrimination, March 18, 2005. Retrieved from
http://trappe13.dynamicweb.dk.
Torvalds, Linus (2006a) “Re: Linux vs. GPL v3—Dead Copyright Holders,” Linux-kernel
mail archives. Jan 25, 2006.
http://www.ussg.iu.edu/hypermail/linux/kernel/0601.3/0559.html.
Torvalds, Linus (2006b) “Re: Linux vs. GPL v3—Dead Copyright Holders,” Linux-kernel
mail archives. Jan 27, 2006.
http://www.ussg.iu.edu/hypermail/linux/kernel/0601.3/1489.html.
Warschauer, M. (2002). “Reconceptualizing the Digital Divide,” First Monday, Vol. 7,
No. 7. Available at: http://www.firstmonday.dk/issues/issue7_7/warschauer/.
Warschauer, Mark (2003). Technology and Social Inclusion: Rethinking the Digital
Divide. MIT Press.
Wheatley, Malcolm (2004) “The Myths of Open Source,” CIO Magazine, March 1, 2004.
http://www.cio.com/archive/030104/open.html.
White, Michelle (2002). "Regulating Research: The Problem of Theorizing Research in
LambdaMOO," Ethics and Information Technology, Vol. 4, No. 1, pp. 55-70.
Wolf, M.J., K. Bowyer, D. Gotterbarn, and K. Miller (2002) Open Source Software:
Intellectual Challenges to the Status Quo, panel presentation at 2002 SIGCSE Technical
Symposium, SIGCSE Bulletin, 34(1), March, pp. 317-318 and
www.cstc.org/data/resources/254/wholething.pdf.
Wolf, M.J. and Grodzinsky, F.S. (2006) “Good/Fast/Cheap: Contexts, Relationships and
Professional Responsibility During Software Development,” Proceedings of the
Symposium of Applied Computing 2006 (April).
World Wide Web Consortium, http://www.w3.org/WAI/Technical/Activity.html
General Texts:
Edgar, Stacey L. (2006) Morality and Machines. 2nd ed. Sudbury: Jones and Bartlett.
Quinn, Michael (2006) Ethics for the Information Age. 2nd ed. Boston: Addison Wesley.
Spinello, R. (2006) Cyberethics. 3rd edition. Sudbury: Jones and Bartlett.
Spinello, R. and Tavani, H., eds. (2001) Readings in Cyberethics. Sudbury, MA: Jones and Bartlett.
Tavani, Herman T. (2006). Ethics and Technology: Ethical Issues in an Age of
Information and Communication Technology. 2nd ed. Hoboken, NJ: John Wiley and Sons.