USE: Summary
Definitions of technology
1. Hardware (restricted definition)
2. Socio-technical system (broad definition, including USE)
Influence of technology (as a socio-technical system) on history
 Technological determinism
 Engineers/scientists create new technological solutions with big impact on users,
societies and enterprises.
 Technology follows its own logic; is autonomous; and causes change
 Co-construction
 Technical changes involve many choices.
 Technological change is shaped by interactions among various stakeholders.
 Engineers, scientists, users, societies and enterprises (USE) co-construct technology.
The learning process of co-construction has three characteristics:
 Explorative, open-ended & uncertain.
 Interrelated, together building up the development of a socio-technical system.
 Contested because participants have to make choices.
Utilitarianism: an act is morally right if and only if its consequences are at least as good as the
consequences of all alternative acts.
 Consequences for everyone who is affected, measured in terms of pleasure or preference
satisfaction
 The consequences often cannot be determined in advance.
Act utilitarianism: evaluate each individual act
Rule utilitarianism: evaluate sets of acts
Hedonistic Utilitarianism: maximize the sum total of happiness
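A minimal formal sketch of this decision rule (the notation $A$, $u_i(a)$ and the summation are added here, not taken from the course): hedonistic utilitarianism selects the act whose total happiness, summed over everyone affected, is at least as great as that of every alternative act:

\[
a^{*} \;=\; \arg\max_{a \in A} \; \sum_{i \in \text{affected}} u_i(a)
\]

where $A$ is the set of alternative acts and $u_i(a)$ is the pleasure or preference satisfaction of person $i$ under act $a$. Act utilitarianism applies this rule to each individual act; rule utilitarianism applies it to sets of acts (rules).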
Duty ethics (Kant): the categorical imperative (universalization and autonomy)
 Universalization: an act is wrong if it cannot be universalized
 One cannot make an exception for oneself or for one’s group.
 Nothing is morally permissible if it cannot be made universal (if universalization
would lead to a self-contradiction or a situation no person could want)
 Autonomie/humaniteit/end-in-itself: act in such a way that you treat humanity, whether in
your own person or in the person of any other, never merely as a means to an end, but
always at the same time as an end-in-itself
Hypothetical imperative: if you want X, choose Y as a means.
 The categorical imperative always applies and thus, unlike the hypothetical imperative, has no condition under which it holds.
Moral responsibility: being held to account for, or having to justify, one’s actions towards others
 Causal contribution
 Foreseeability
 Freedom of action
Types of corporate morality:
 Type 1: Conscience as a guide to self-interest.
 Type 2a: Conscience as a systematic constraint.
 Invisible hand: market forces will automatically lead to the good.
 Type 2b: Conscience as a systematic constraint.
 Visible hand: non-economic forces will set the rules of the game in such a way that
morality is guaranteed.
 Responsibility lies with politicians
 Responsibility of managers is to obey the law
 Type 3: Conscience as an authoritative guide.
 Respect for the rights and concerns of stakeholders is given independent force in the
leader’s operating consciousness.
Key message Enterprise:
 Company strategy is crucial for technical change/design choices.
 Strategies/choices are influenced by actions of other companies, users, governments,
societal groups.
 Strategies/choices are specific to the historical period.
Enterprises can make three types of choices:
 Scale
 Knowledge (not information!!)
 Responsibility
Inventing the modern project:
 Scale: mixed picture, both large-scale and small-scale.
 Knowledge: dominance of informal, applied and bought knowledge production, yet the start of the
research laboratory
 Responsibility: limited; some companies (e.g. Philips) took responsibility for workers:
travel, education & housing.
Contested modernization:
 Scale: mixed picture, both large-scale and small-scale; in this period the build-up of very large
companies.
 Knowledge: Expansion of R&D infrastructure
 Responsibility: Contested
Key message:
 Strategies/choices are influenced by actions of other companies, users, governments,
societal groups (USE contextualization)
 Strategies/choices are specific to the historical period (historical contextualization)
 Company strategy is crucial for technical change/design choices (co-construction)
Technologies are often not neutral tools, but have a deep link to moral values
These values might reflect values of the designer, of the projected user, of the real user or of society.
Mediation: Technology has an influence on the behavior and the worldview of the user.
 Actions (mediation of action): human actions are not only determined by their intentions but
also by their environment.
 Script: a prescription how to act that is built (designed) into an artifact; vision of how
an artifact is supposed to be used by the user.
 Perceptions (mediation of perception): technology changes the way users see the world and
the way in which users interpret reality.
 Representation always works via amplification and reduction
Summary
 Technology / artifacts may influence the interpretation of reality (Mediation of perception)
 Technology may suggest or prescribe a certain (type of) action (Mediation of action)
 Technologies contain a ‘script’ that pre-scribes how it should be used (expected usage)
User-based Knowledge
 Experience-based Knowledge
 Informal Knowledge
 Tacit Knowledge
Conclusions:
 Users
 Co-Produce Technology & Are Essential for Successful Innovations
 Are Real vs. Configured etc.
 Produce Valuable Experience/User-Based Knowledge
 Shape Technologies in History-Dependent Time & Place
Society meaning 1: Technology’s stakeholders - What individuals/groups have a stake in a given
technology choice?
Society meaning 2: Technology’s public interest - What technology choice is best for society?
Regime: dominant mechanisms that govern Society’s choices about the public interest (regarding
technology)
 Regime of distributed responsibilities:
 Limited technical responsibilities of the minimal state
 Technological responsibilities of stakeholders in non-state sectors
 Regime of control and coordination
 Technocratic regime
 Regime of participation
 Participation by protest
 Participation by invitation
 Participation by delegation
Utilitarianism makes our obligations to society calculable; Kantianism makes them responsive to
individuals
Risk is basically an unwanted future event that may or may not occur (involves a probability)
 According to utilitarianism, that risk is acceptable which is brought about by the action that
has the highest expected utility (see the sketch after this list).
 According to Kantianism, a risk is acceptable if and only if (most of) the person(s) affected by
it give their autonomous informed consent to it (or would do so if they were rational).
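A minimal sketch of the utilitarian expected-utility criterion mentioned above (the symbols $P$, $U$ and $o_j$ are assumed notation, not from the course): weigh every possible outcome of an action by its probability, and accept the risk brought about by the action with the highest expected utility:

\[
EU(a) \;=\; \sum_{j} P(o_j \mid a)\, U(o_j),
\qquad
a^{*} \;=\; \arg\max_{a} EU(a)
\]

The “Utilitarian Principle of Risk (expected utility/ cost-risk-benefit analysis)” listed further down follows the same rule, with the utilities $U(o_j)$ typically expressed as costs and benefits.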
Problem of paralysis: If each person has a duty not to put others at any risk, then it will be practically
impossible to do anything (since most activity poses some risks to others).
Known unknown (estimable probability) = risk
Unknown unknown = uncertainty, uncertain danger, uncertain hazard
Virtue:
 good trait of character
 disposition to act
Virtues (the “golden mean”): equilibrium, balance and proportionality of emotions; the virtuous middle lies between two extremes
Examples:
 Generosity: middle between giving too little and too much
 Pride: middle between vanity and humility
 Courage: middle between cowardice and recklessness
Dimensions of responsibility (professional codes for engineers and scientists):
 Self-reflection (virtues)
 Responsibility for employers and clients
 Responsibility for society and the environment
Conclusions:
 Codes: starting point, a checklist, not the final truth
 Make you aware of difficulties and values
 Critical self-reflection and awareness of responsibilities
 Virtue ethics: take responsibility, as persons
 Think for yourself! Remain critical!
Technological determinism should specify 3 of the following 4 points:
 Technology as an autonomous agent of change
 Causal relation between technology and effects
 Internal logic of technological development
 Engineers/scientists create new technological solutions with big impact on users, societies
and enterprises
Ethical principles of risk:
 Kantian Principle of Risk (informed consent)
 Utilitarian Principle of Risk (expected utility/ cost-risk-benefit analysis)
 The Precautionary Principle: principle that prescribes how to deal with threats that are
uncertain and/or cannot be scientifically established. If there is (1) a threat, which is (2)
uncertain, then (3) some kind of action (4) is mandatory. This definition thus has four
dimensions.
 Best available technology: no better alternative.