Moral Intensity and Ethical Decision-Making: A Contextual Extension
Tim Goles, Gregory B. White, Nicole Beebe, Carlos A. Dorantes, Barbara Hewitt
University of Texas at San Antonio
Abstract
This paper explores the role of an individual’s perception of situation-specific issues on decision-making in ethical situations. It does so by examining the influence of moral intensity on a person’s perceptions of an ethical problem, and subsequent intentions. Moral intensity (Jones, 1991) is an issue-contingent model of ethical decision-making based on the supposition that situations vary in terms of the moral imperative present in that situation. An individual’s decision is guided by his or her assessment of six different components that collectively comprise the moral intensity of the situation. The relationship between the components of moral intensity and the decision-making process is tested through the use of scenarios that present IS-related ethical situations. The results indicate that moral intensity plays a significant role in shaping the perceptions and intentions of individuals faced with IS-related ethical situations. The conclusion drawn from this is that, consistent with prior research, the decision-making process is influenced by an individual’s perception of situation-specific issues; that is, the moral intensity of the situation.
ACM Categories: K.4.1, K.7.4
Keywords: Ethics, Ethical Decision-making,
Intentions, Moral Intensity, Perceived Ethical
Problems, PLS
Acknowledgements
The authors would like to thank the anonymous reviewers for their insightful and helpful comments on this paper, and Sue Brown for her advice and support in her role as Guest Editor of this special issue. The paper greatly benefited from their contributions.
Introduction
Concern over ethical decision-making in business
has dramatically increased recently due to incidents
such as the Enron and Global Crossing scandals.
This is paralleled by mounting apprehension over the
misuse of information systems and technology,
leading to calls for more research that explores
ethical decision-making in an information systems
(IS) context (Banerjee et al., 1998; Cappel &
Windsor, 1998; Ellis & Griffith, 2001). This paper
answers that call by extending research investigating
ethical decision-making in the marketing field to
generate new knowledge in the IS arena. It does so
by employing the concept of moral intensity, or “the
extent of issue-related moral imperative in a
situation,” (Jones, 1991, p. 372) to examine an
individual’s perception of IS-based ethical scenarios,
and subsequent behavioral intentions. Moral intensity
suggests that ethical decisions are primarily
contingent upon the perceived characteristics of the
issue at stake, and therefore ethical decision-making
involves the collective assessment of those
characteristics. For example, using a company-owned electronic mail system to send innocuous personal messages (e.g., sending a shopping list to one’s spouse) would be rated low in moral intensity by many people. On the other hand, using that same system to send harmful or malicious messages (e.g., child pornography or hate literature) would probably be ranked high in moral intensity.
The DATA BASE for Advances in Information Systems - Spring-Summer 2006 (Vol. 37, Nos. 2 & 3)
A basic premise of this study is that ethics are largely
situation-specific. This is in line with the argument
that “hypernorms,” or fundamental ethical principles,
cannot effectively address all behaviors because
there are a large number of diverse milieus in which
people live, work, and play, and these milieus often
have different ethical norms (Conger & Loch, 2001).
A second premise is that the proliferation of IS in
societal, business, and personal settings has
spawned new ethical issues, or at least exacerbated
existing ones. Thus the purpose of this paper is to
explore the impact of moral intensity on two
significant components of ethical decision-making in
an IS context: an individual’s perception of an ethical
problem, and ensuing intentions.
Prior Research
Much prior research into ethical decision making has
focused on personal characteristics (gender, age,
education, level of moral development) and social or
organizational factors (corporate ethical climate,
influence of peer groups, codes of ethics). Meta-analyses of this research reveal mixed results and recommend further empirical testing, particularly in
the area of ethical decision-making intentions and
moral intensity (Ford & Richardson, 1994; Loe et al.,
2000). These meta-analyses also suggest that ethical
decision-making is situation-specific, a conclusion
that is seconded by empirical studies in marketing
(Singhapakdi et al., 1996) and IS (Banerjee et al.,
1998).
Moral intensity (Jones, 1991) is often used to
examine ethical decision-making in different
circumstances (Chia & Mee, 2000; Frey, 2000;
Harrington, 1996; Morris & McDonald, 1995; Paolillo
& Vitell, 2002; Singer, 1996; Singhapakdi et al.,
1996). In brief, this theory postulates that moral
issues can be viewed in terms of underlying
characteristics that influence the various stages of
the decision making process.
This paper draws from a previous study that
investigated the impact of moral intensity on two
components of ethical decision-making, ethical
perceptions and behavioral intentions, in a marketing
context (Singhapakdi et al., 1996). Their findings are
consistent with prior research indicating that
situation-specific issues influence the ethical
decision-making process, and that moral intensity
helps to shape perceptions and intentions. We
extend their study by testing the model using a
confirmatory approach in place of the exploratory
approach they used, and by changing the context to
IS-related situations. This is a valid and necessary
step in expanding the ethical decision-making body of
knowledge, due to the emergence of new and
unanticipated issues brought about by the rapid
development and deployment of information systems
and technology (Maner, 1996; Moor, 1985). These
issues arise from factors that add an element of
uniqueness to IS-related situations.
The unique factors that are salient in shaping the
ethics of individuals in an IS setting include cultural
lag (Ogburn, 1966), moral distancing (Rubin, 1994),
and context-specific norm development (Conger &
Loch, 2001). The notion of cultural lag (Ogburn,
1966) argues that the evolution of material culture
(e.g., technological inventions, innovation, and
diffusion) outpaces non-material culture (e.g., ethics,
philosophy, and law). This is reflected in current
debates revolving around the interplay between the
Internet and issues related to privacy, intellectual
property rights, and pornography (Marshall, 1999).
Moral distancing suggests that information systems
and technologies may increase the propensity for
unethical behavior by allowing an individual to
dissociate himself from an act and its consequences
(Rubin, 1994). Development of ethical norms often
varies across different contexts or milieus.
Consensus on ethical norms within each milieu helps
define acceptable behavior for that setting, and may
provide a guide to ethical behavior in an IS context
(Conger & Loch, 2001). Furthermore, it has been
argued that IS ethical issues are philosophically
interesting in their own light, and deserving of
recognition as a legitimate domain of applied ethics
that merits further investigation (Tavani, 2002). Thus
it is appropriate and desirable to evaluate the
applicability of existing theory to ethical decision-making in an IS context.
Model Development
Moral intensity is multidimensional, consisting of six components: (1) magnitude of consequences – the aggregate harm or benefits of the act; (2) probability of effect – the likelihood that the act will cause harm or benefits; (3) temporal immediacy – the length of time between the act and its consequences; (4) concentration of effect – the number of people affected by the act; (5) proximity – the social distance between the decision maker and those affected by the act; and (6) social consensus – the degree to which others think the act is good or evil (Jones, 1991).
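Jones's formative view of moral intensity can be sketched as a weighted composite of the six component ratings. The code below is purely illustrative: the component weights are invented placeholders for exposition, not the weights the paper later estimates with PLS.

```python
# Hypothetical sketch: moral intensity as a formative composite of the six
# Jones (1991) components. The weights below are illustrative placeholders,
# NOT the estimated PLS weights reported in the paper.
WEIGHTS = {
    "magnitude_of_consequences": 0.30,
    "probability_of_effect": 0.10,
    "temporal_immediacy": 0.15,
    "concentration_of_effect": 0.05,
    "proximity": 0.10,
    "social_consensus": 0.30,
}

def moral_intensity(ratings):
    """Weighted sum of the six component ratings (e.g., on a 9-point scale)."""
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)

# A respondent who rates every component 6 on a 9-point scale:
ratings = {k: 6.0 for k in WEIGHTS}
print(round(moral_intensity(ratings), 2))  # weights sum to 1.0, so -> 6.0
```

Because the indicators form (rather than reflect) the construct, each component contributes directly to the composite, which is why the paper interprets item weights like regression coefficients rather than factor loadings.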
Figure 1. Research Model (based on Singhapakdi et al., 1996): Moral Intensity → Perceived Ethical Problem → Behavioral Intentions, with a direct path from Moral Intensity to Behavioral Intentions.
A person’s collective assessment of these
characteristics results in a given situation’s moral
intensity. This influences the individual’s moral
judgment, intentions, and subsequent behavior. In
general, issues with high moral intensity will be
recognized as ethical dilemmas more often than
those with low moral intensity, leading to a positive
relationship between moral intensity and perception
of an ethical problem. Furthermore, issues with high
moral intensity have a positive relationship with an
individual’s intention to behave in an ethical manner
(Singhapakdi et al., 1996). Consistent with existing
ethical theories (Dubinsky & Loken, 1989; Ferrell et
al., 1989; Hunt & Vitell, 1986; Singhapakdi et al.,
1999) we postulate an additional positive
relationship between perception of an ethical
problem and intention to behave ethically. This is
reflected in the research model shown in Figure 1.
Methodology
Sample
Participants were solicited from all sections of the core IS course required for business majors at a large southwestern state university. Participation was voluntary. Details are provided in Table 1.
Operationalization
Data was collected via a scenario-based questionnaire. Scenarios are commonly used to examine ethical judgments and intentions in many different areas, including IS (Banerjee et al., 1998; Cappel & Windsor, 1998; Ellis & Griffith, 2001; Harrington, 1996). Consistent with this approach, scenarios developed by Ellis & Griffith (2001) were adopted (Appendix A). Measures of moral intensity were adapted from prior research, as were items that measured ethical perceptions and intentions (Singhapakdi et al., 1996) (Appendix B).

Factor                         #      %
Gender
  Male                       213   49.9
  Female                     214   50.1
Age (mean = 24.55)
  19-24                      295   68.9
  25-30                       84   19.6
  31-40                       31    7.2
  41+                         18    4.2
Major
  Accounting                  83   19.0
  Economics                   25    5.7
  Finance                     49   11.2
  IS                          39    8.9
  Management                 104   23.8
  Marketing                   75   17.2
  Mgt. Science                10    2.3
  Other                       52   11.9
Employment Status
  Part time                  213   49.2
  Full time                  115   26.6
  Not employed               105   24.2
Number of Surveys
  Distributed                511
  Usable Responses¹          442
  Response Rate                     86%
Table 1. Demographics
Scenario Validation
It is crucial that scenarios used in an empirical test of moral intensity be perceived as presenting an ethical problem (Hunt & Vitell, 1986).
¹ Our predetermined criteria were to reject any survey that answered every question with the same number (e.g., all “5”) or that was missing entire sections of data. One survey was rejected for identical answers, and two for incomplete data, resulting in a total of 442 usable surveys. Not every respondent answered each demographic question, so the totals for each factor may not equal 442.
                                                 Scenario
                                  1      2      3      4      5      6      7
Magnitude of       Mean         6.46   5.50   4.57   4.66   4.54   5.75   5.88
Consequences       Std. Dev.    2.14   2.34   2.40   2.30   2.15   2.21   2.17
Social             Mean         5.66   6.36   6.53   6.33   5.73   6.27   4.74
Consensus          Std. Dev.    2.16   2.24   2.12   2.12   2.20   2.35   2.22
Temporal           Mean         6.76   5.57   4.80   4.81   4.57   6.05   6.03
Immediacy          Std. Dev.    1.90   2.23   2.40   2.21   2.15   2.05   2.02
Probability of     Mean         6.39   5.52   4.59   4.82   4.57   5.90   5.74
Effect             Std. Dev.    2.08   2.25   2.34   2.23   2.16   2.09   2.06
Proximity          Mean         5.22   6.06   5.67   5.94   5.75   5.93   4.85
                   Std. Dev.    2.09   2.36   2.53   2.43   2.36   2.56   2.34
Concentration of   Mean         6.40   5.69   3.94   4.65   4.44   5.42   5.70
Effect             Std. Dev.    1.95   2.29   2.31   2.34   2.22   2.20   2.13
Perceived Ethical  Mean         3.70   3.03   3.47   2.95   3.97   2.58   4.54
Problem            Std. Dev.    2.22   2.09   2.20   1.97   2.32   1.90   2.57
                   t-value     12.23  19.78  14.65  21.90   9.23  26.78   3.72
Intentions         Mean         6.15   6.84   7.02   5.95   5.42   5.88   5.35
                   Std. Dev.    2.35   2.14   2.32   2.58   2.56   2.63   2.45
                   t-value     10.30  18.05  18.33  7.733   3.48   6.99   3.02
For the Moral Intensity components, a higher mean indicates a higher level of moral intensity.
For Perceived Ethical Problem, a lower mean indicates the scenario is perceived to present a greater ethical problem. All the t-values are significant at the .05 level.
For Intentions, a higher mean indicates a greater intention to behave in a different (more ethical) manner than the actor in the scenario. All the t-values are significant at the .05 level.
Table 2. Scenario Means, Standard Deviations, and t-values
Furthermore, to test the underlying premise of moral
intensity, the scenarios should vary in terms of the
different components. To ensure this, the scenarios
were evaluated through a three-stage process. First,
an independent panel of researchers reviewed the
scenarios to determine whether or not they presented
an ethical problem. There was consensus that the first
six did indeed involve an ethical situation. Scenario
seven was judged to be marginal. However, it was
included to determine if respondents would
differentiate between what the panel felt were clear
ethical problems and one that was deemed to be
borderline.
Next, the means and standard deviations of the
responses for each scenario were calculated (Table
2). The results suggest that the respondents perceive
each of the moral intensity components as varying
between scenarios. Likewise, the results varied for
“perceived ethical problem”, again suggesting that the
respondents viewed each scenario as differing in
ethical considerations.
Finally, t-tests were conducted comparing the mean
responses of “perceived ethical problem” and
“intentions” for each scenario to the neutral value of 5
(the midpoint of the 9-point Likert scale). As Table 2
indicates, there was a significant difference between
the perceived ethical problem mean and the neutral
value for each scenario, indicating the respondents
viewed each scenario as involving an ethical problem.
As expected, the results for scenario 7 suggest that
the respondents viewed this scenario as less intense
than the others.
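The validation t-tests described above (comparing each scenario's mean response to the scale midpoint of 5) can be sketched as follows. The responses below are simulated for illustration only; they are not the study's data.

```python
# Sketch of the scenario-validation test: a one-sample t-test of the mean
# "perceived ethical problem" rating against the 9-point scale midpoint (5).
# The responses are SIMULATED for illustration, not the study's raw data.
import math
import random

random.seed(1)
# Simulated 9-point Likert responses clustered below the midpoint,
# roughly mimicking a scenario seen as presenting an ethical problem.
responses = [max(1, min(9, round(random.gauss(3.7, 2.2)))) for _ in range(442)]

n = len(responses)
mean = sum(responses) / n
var = sum((x - mean) ** 2 for x in responses) / (n - 1)  # sample variance
t = (mean - 5) / math.sqrt(var / n)  # one-sample t statistic vs. midpoint

print(f"mean={mean:.2f}, t={t:.2f}")  # |t| far beyond the .05 critical value
```

A mean significantly below the neutral midpoint, as in each of the paper's scenarios, indicates that respondents perceived the scenario as involving an ethical problem.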
Additionally, the means and t-values for intentions vary
according to scenario, and the t-values indicate a
significant difference between the respondents and the
actors in the scenarios. This suggests that the
respondents are ‘ethically sensitive’, in that they can
differentiate between scenarios, and are inclined to
behave differently – that is, more ethically – than the
scenario actors. Again, the differences are much less
for scenario 7.
Analysis
PLS Graph version 3.0 was used to analyze the
research model for two reasons. First, the model
contains both formative constructs (moral intensity)
and reflective constructs (perceived ethical problem;
intentions). Constructs measured by formative
indicators cannot be adequately evaluated using
covariance-based structural equation modeling
techniques such as LISREL or AMOS. PLS has the
ability to deal with both types of indicators (Chin,
1998a). Second, within each individual scenario the
reflective constructs are measured using single-item
indicators. Consequently, their measurement error
cannot be estimated. However, PLS provides results
identical to multiple regression analysis in the case of
single-item indicators (Chin et al., 2003).
Formative indicators are defining characteristics of the
construct; they form or ‘cause’ the construct, as
opposed to reflective indicators that are manifestations
of the construct. Formative indicators are not
necessarily correlated with each other, nor can
unidimensionality be assumed. Thus commonly used
methods to assess individual item reliability and
convergent validity are irrelevant (Chin, 1998a; Gefen
et al., 2000). Instead, item weights are used in place of
loadings and are interpreted similarly to beta coefficients
in multiple regression, indicating the relevance of the
items to the research model (Chin, 1998b). Statistical
significance was determined by using the bootstrap
option with 200 resamples. Each scenario was
analyzed individually in accordance with the research
model (Figure 1). In addition, a “mega-scenario” was
created, consisting of the aggregated responses to all
seven individual scenarios. The weights and t-statistics
for the formative indicators are presented in Table 3.
The structural model is assessed by examining the r²
values, path values, and associated t-statistics, as
shown in Tables 4 and 5.
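The bootstrap procedure used above (resampling cases and re-estimating the model on each resample, 200 times) can be illustrated with a plain regression slope as a simplified stand-in for the PLS estimator; the data are simulated and the variable names are illustrative.

```python
# Simplified sketch of bootstrap significance testing with 200 resamples.
# An OLS slope stands in for the PLS path estimate; data are SIMULATED.
import random

random.seed(42)
n = 442  # matching the study's usable sample size
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.5 * xi + random.gauss(0, 1) for xi in x]  # true slope of 0.5

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

estimate = slope(x, y)

boot = []
for _ in range(200):  # resample cases with replacement, re-estimate each time
    idx = [random.randrange(n) for _ in range(n)]
    boot.append(slope([x[i] for i in idx], [y[i] for i in idx]))

mb = sum(boot) / len(boot)
se = (sum((b - mb) ** 2 for b in boot) / (len(boot) - 1)) ** 0.5  # bootstrap SE
print(f"slope={estimate:.3f}, bootstrap t={estimate / se:.2f}")
```

The bootstrap t (estimate divided by the standard deviation of the resampled estimates) is what PLS packages report for item weights and path coefficients.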
Results and Discussion
Table 3 indicates that, as expected, moral intensity is
situation-specific; that is, the significance of each
component varies by scenario. For example, social
consensus is significant in all seven of the scenarios,
while concentration of effect is significant in only one.
This is not entirely unexpected, nor without precedent.
Previous research suggests that the impact of
individual components on overall moral intensity varies
according to the characteristics of the scenario, both in
IS (Banerjee et al., 1998; Cappel & Windsor, 1998;
Ellis & Griffith, 2001) and non-IS (Chia & Mee, 2000;
Frey, 2000; Morris & McDonald, 1995) contexts.
In addition to differences in scenario characteristics,
there is another possible explanation for the
inconsistent showing of two specific components,
probability of effect and concentration of effect. There
exists a cognitive bias in risk perception that often
leads people to underestimate potential future risky or
negative implications of a situation (Jones, 1991).
Although Jones downplays this bias, it may have
weakened the recognition of these two components.
Nevertheless, this does not invalidate the concept of
moral intensity. Overall, Table 3 reinforces the notion
that moral intensity is truly issue-contingent.
                 Perceived Ethical Problem   Intentions
Scenario 1               0.166                 0.329
Scenario 2               0.228                 0.309
Scenario 3               0.336                 0.325
Scenario 4               0.175                 0.282
Scenario 5               0.260                 0.399
Scenario 6               0.249                 0.195
Scenario 7               0.280                 0.277
Mega-Scenario            0.406                 0.443
Table 4. r² Values
Also of interest are the implications drawn from
Tables 4 and 5. The r² values signify that, overall, the
model explains 41% of the variance in the perceived
ethical problem construct, and 44% of the variance in
behavioral intentions. The variance explained is less
in the individual scenarios, due to the uniqueness of
each scenario. Table 5 indicates that all the paths
from moral intensity to perceived ethical problem, and
from moral intensity to intentions, exceed the
suggested minimum standard of 0.20 (Chin, 1998b),
and are statistically significant. This suggests that
moral intensity does indeed help to explain the
relative influence of situational factors on the ethical
decision-making process. There is, however, less
support for the expected relationship between
perceived ethical problem and intentions. The overall
path coefficient, although statistically significant, falls
below the 0.20 threshold, as do the path coefficients
for scenarios 1, 2, 6, and 7. This is not fully in
accordance with suggestions in the literature that
perceptions of an ethical problem precede intentions
(Dubinsky & Loken, 1989; Ferrell et al., 1989; Harrington, 1996; Hunt & Vitell, 1986). To explore this
further, the indirect effect of moral intensity on
intentions was investigated. The indirect effect is
calculated by multiplying the path coefficient from
moral intensity to perceived ethical problem (0.637)
by the path coefficient from perceived ethical problem
to intentions (0.144). The resulting indirect effect
(0.637 * 0.144 = 0.092), when added to the direct
effect of moral intensity on intentions (0.564) results
in a total effect of moral intensity on intentions of
0.656 (0.092 + 0.564 = 0.656). This suggests that
moral intensity strongly influences intentions, both
directly and indirectly, through an individual’s
perception of the ethical ramifications of a given
situation. One reason for the weak direct effect
between perceived ethical problem and intentions
might be that individuals assess a specific situation in
terms of moral intensity without consciously realizing
that they are going through an ethical evaluation
process. In other words, they base their behavioral
intentions on their own egocentric value system,
oversimplifying the situation and bypassing a formal
assessment of the ethical implications (Cappel &
Windsor, 1998).
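The effect decomposition above reduces to simple arithmetic, reproduced here from the mega-scenario coefficients quoted in the text:

```python
# Path decomposition for the mega-scenario, using the coefficients
# quoted in the text (from Table 5):
a = 0.637       # moral intensity -> perceived ethical problem
b = 0.144       # perceived ethical problem -> intentions
direct = 0.564  # moral intensity -> intentions (direct path)

indirect = a * b            # indirect effect via perceived ethical problem
total = direct + indirect   # total effect of moral intensity on intentions
print(f"indirect={indirect:.3f}, total={total:.3f}")  # indirect=0.092, total=0.656
```

The dominance of the direct path (0.564) over the indirect path (0.092) is what motivates the authors' interpretation that respondents may bypass a formal ethical assessment.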
Conclusion
This paper examined the role of moral intensity in the
ethical decision-making process in an IS context. It
contributes to the body of knowledge by extending
existing theory to a new context, and by employing a
confirmatory analytical approach. The results support
the basic concept underlying moral intensity: the
decision-making process is influenced by the
individual’s perception of situation-specific issues.
                 Moral Intensity →             Moral Intensity →        Perceived Ethical Problem →
                 Perceived Ethical Problem     Intentions               Intentions
                 path coeff.   t-statistic     path coeff.  t-stat.     path coeff.   t-stat.
Scenario 1       0.408 ***       8.633         0.524 ***     9.802      0.103 *        1.818
Scenario 2       0.478 ***      11.430         0.501 ***    10.110      0.099 *        1.962
Scenario 3       0.579 ***      18.674         0.307 ***     5.684      0.335 ***      5.596
Scenario 4       0.418 ***       8.783         0.387 ***     8.564      0.235 ***      4.810
Scenario 5       0.510 ***      10.407         0.490 ***     9.768      0.221 ***      4.306
Scenario 6       0.499 ***      11.860         0.352 ***     5.884      0.144 ***      2.364
Scenario 7       0.529 ***      13.416         0.508 ***    10.483      0.032          0.563
Mega-Scenario    0.637 ***      20.258         0.564 ***    12.095      0.144 ***      2.666
* significant at the .05 level
*** significant at the .01 level
Table 5. Path Coefficients and t-statistics
This is consistent with previous research, and
extends research in other areas into an IS context,
providing additional assurance that moral intensity is
a viable theoretical concept across varying settings.
The results also highlight both the direct and indirect
role of moral intensity in shaping behavioral
intentions.
It is intriguing to speculate as to what extent the
factors (cultural lag, moral distancing, and context-specific norm development) that make IS-related
ethical issues different from other contexts may have
influenced the results of this study. For example,
concentration of effect was significant in only one
scenario. This may be due, at least in part, to a lack
of consensus on ethical norms regarding
technological innovations, arising in turn from the
time lag between technological evolution and norm
development, the ability of technology to diffuse or
spread out the impact of an act, or a combination of
the two. These factors may also have undermined
any direct effect between perceived ethical problem
and intentions. Our opinion is that these factors do in
fact differentiate IS-related ethical issues from those
in other fields. Obviously, however, much work
remains to be done to confirm or refute this belief.
As with any study, this one is subject to certain
limitations, one of which is the use of students as
research subjects. Previous research has shown that
students’ views on ethical dilemmas vary from those of professionals. Students tend to view workplace-based ethical situations from an individual
perspective, while professionals tend to take the
firm’s perspective into account. Professionals are
also more accepting of authority, and generally adopt
a more sophisticated level of moral reasoning in the
ethical decision-making process (Athey, 1993;
Cappel & Windsor, 1998). Other researchers,
however, argue that the use of students is
reasonable in instances where their response can be
linked to their 'real world' context (Sambamurthy &
Chin, 1994). Our position is that the use of students
in this study is justified because we are not claiming
that students are surrogates for professionals.
Rather, we contend that students are IS users, and
as such are suitable subjects for examining IS-related
ethical issues (Ellis & Griffith, 2001). Furthermore, a
strong argument can be made that the respondents
can identify with the scenarios used, based in part on
the fact that over 75% of the respondents work either
full or part time (Table 1). Finally, students are the
employees and managers of tomorrow. Their outlook
towards IS-related ethical issues is at least partially
shaped prior to their entry into the workforce, and is
likely to be carried forward into their business careers
(Cheng et al., 1997; Sims et al., 1996).
Consequently, insights into students’ perceptions and
judgment of ethical issues can provide guidance to
organizations seeking to foster an ethical corporate
climate. Nevertheless, it would be prudent to extend
this study to include IS professionals.
If moral intensity is indeed a key component in ethical
decision-making, as suggested by this study, then a significant implication is that both students and professionals may benefit from a better understanding of moral intensity’s individual components. Educating individuals on potential consequences and implications of ethical problems could sharpen their perception and decision-making skills when they encounter ethically sensitive situations. This may be accomplished through a mix of instruction at the university level, and on-going education and training at the professional level.
Discussion of various scenarios, such as the ones
used in this paper, can serve as a vehicle to aid
individuals in evaluating their ethical reasoning, and
comparing it to others. This can be supplemented with establishing guidelines for individual accountability (Banerjee et al., 1998), publishing and enforcing codes of ethics (Harrington, 1996), and implementing detective, preventive, and deterrence measures (Straub et al., 1993).
In short, the findings from this study furnish insight
into the ethical decision-making process. Initiatives
based on these insights offer an opportunity to
increase the comfort level and decision-making
capability of individuals when confronted by ethically
complex situations.
References
Athey, S. (1993). “Comparison of Experts’ and High
Tech Students’ Ethical Beliefs in Computer
Related Situations,” Journal of Business Ethics,
Vol.12, No.5, pp. 359-370.
Banerjee, D., Cronan, T., and Jones, T. (1998).
“Modeling IT Ethics: A Study in Situational
Ethics,” MIS Quarterly, Vol.22, No.1, pp. 31-60.
Cappel, J. and Windsor, J. (1998). “A Comparative
Investigation of Ethical Decision Making:
Information Systems Professionals versus
Students,” The Database for Advances in
Information Systems, Vol.29, No.2, pp. 20-34.
Cheng, H., Sims, R., and Teegen, H. (1997). "To
Purchase or Pirate Software: An Empirical
Study," Journal of Management Information
Systems, Vol.13, No.4, pp. 49-60.
Chia, A. and Mee, L. (2000). “The Effects of Issue
Characteristics on the Recognition of Moral
Issues,” Journal of Business Ethics, Vol.27,
No.3, pp. 255-269.
Chin, W.W. (1998a). “Issues and Opinion on
Structural Equation Modeling,” MIS Quarterly,
Vol.22, No.1, pp. vii-xvi.
Chin, W.W. (1998b). “The Partial Least Squares
Approach for Structural Equation Modeling,” in
Marcoulides, G.A. (Ed.), Modern Methods for
Business Research, Mahwah, NJ: Lawrence
Erlbaum Associates, pp. 295-336.
Chin, W.W., Marcolin, B.L., and Newsted, P.R.
(2003). “A Partial Least Squares Latent
Variable Modeling Approach for Measuring
Interaction Effects: Results from a Monte Carlo
Simulation Study and an Electronic-Mail
Emotion/Adoption Study,” Information Systems
Research, Vol.14, No.2, pp. 189-217.
Conger, S. and Loch, K. (2001). “Invitation to a Public
Debate on Ethical Computer Use,” The DATA
BASE for Advances in Information Systems,
Vol.32, No.1, pp. 58-69.
Dubinsky, A. and Loken, B. (1989). “Analyzing Ethical
Decision Making in Marketing,” Journal of
Business Research, Vol.19, No.2, pp. 83-107.
Ellis, T. and Griffith, D. (2001). “The Evaluation of IT
Ethical Scenarios Using a Multidimensional
Scale,” The DATA BASE for Advances in
Information Systems, Vol.32, No.1, pp. 75-85.
Ferrell, O.C., Gresham, L.G., and Fraedrich, J.
(1989). “A Synthesis of Ethical Decision
Models for Marketing,” Journal of Macromarketing, Vol.9, No.2, pp. 55-64.
Ford, R. and Richardson, W. (1994). “Ethical
Decision Making: A Review of the Empirical
Literature,” Journal of Business Ethics Vol.13,
No.3, pp. 205-221.
Frey, B. (2000). “The Impact of Moral Intensity on
Decision Making in a Business Context,”
Journal of Business Ethics, Vol.26, No.3, pp.
181-195.
Gefen, D., Straub, D.W., and Boudreau, M.C. (2000).
“Structural Equation Modeling and Regression:
Guidelines for Research Practice,” Communications of the AIS, Vol.4, No.7, pp. 1-77.
Harrington, S. (1996). “The Effects of Codes of Ethics
and Personal Denials of Responsibility on
Computer Abuse Judgments and Intentions,”
MIS Quarterly, Vol.20, No.3, pp. 257-278.
Hunt, S. and Vitell, S. (1986). “A General Theory of
Marketing Ethics,” Journal of Macromarketing,
Vol.8, No.1, pp. 5-16.
Jones, T. (1991). “Ethical Decision Making by
Individuals in Organizations: An Issue-Contingent Model,” Academy of Management
Review, Vol.16, No.2, pp. 231-248.
Loe, T., Ferrell, L., and Mansfield, P. (2000). “A
Review of Empirical Studies Assessing Ethical
Decision Making in Business,” Journal of
Business Ethics, Vol.25, No.3, pp. 185-204.
Maner, W. (1996). “Unique Ethical Problems in Information Technology,” Science and Engineering Ethics, Vol.2, No.2, pp. 137-152.
Marshall, K.P. (1999). “Has Technology Introduced
New Ethical Problems?” Journal of Business
Ethics, Vol.19, No.1, pp. 81-90.
Moor, J.H. (1985). "What Is Computer Ethics?"
Metaphilosophy, Vol.16, No.4, pp. 266-275.
Morris, S. and McDonald, R. (1995). “The Role of
Moral Intensity in Moral Judgments: An
Empirical Investigation,” Journal of Business
Ethics, Vol.14, No.9, pp. 715-726.
Ogburn, W.F. (1966). Social Change with Regard to
Cultural and Original Nature, New York: Dell
Publishing Co.
Paolillo, J. and Vitell, S. (2002). “An Empirical
Investigation of the Influence of Selected
Personal, Organizational and Moral Intensity
Factors on Ethical Decision Making,” Journal of
Business Ethics, Vol.35, No.1, pp. 65-74.
Rubin, R. (1994). “Moral Distancing and the Use of Information Technologies: The Seven Temptations,” Proceedings of the ACM Conference on Ethics in the Computer Age, Gatlinburg, TN, pp. 151-155.
Sambamurthy, V. and Chin, W. (1994). “The Effects
of Group Attitudes toward Alternative GDSS
Designs on the Decision-Making Performance
of Computer-Supported Groups,” Decision
Sciences, Vol.25, No.2, pp. 215-241.
Sims, R., Cheng, H., and Teegen, H. (1996). “Toward
a Profile of Student Software Piraters,” Journal
of Business Ethics, Vol.15, No.8, pp. 839-849.
Singer, M. (1996). “The Role of Moral Intensity and
Fairness Perception in Judgments of Ethicality:
A Comparison of Managerial Professionals and
the General Public,” Journal of Business
Ethics, Vol.15, No.4, pp. 459-474.
Singhapakdi, A., Vitell, S., and Kraft, K. (1996).
“Moral Intensity and Ethical Decision-Making of
Marketing Professionals,” Journal of Business
Research, Vol.36, No.3, pp. 245-255.
Singhapakdi, A., Vitell, S., and Franklin, G.R. (1999).
“Antecedents, Consequences, and Mediating
Effects of Perceived Moral Intensity and
Personal Moral Philosophies,” Journal of the
Academy of Marketing Science, Vol.27, No.1,
pp. 19-36.
Straub, D.W., Carlson, P.J., and Jones, E.H. (1993).
“Deterring Cheating by Student Programmers:
A Field Experiment in Computer Security,”
Journal of Management Systems, Vol.5, No.1,
pp. 33-48.
Tavani, H.T. (2002). “The Uniqueness Debate in
Computer Ethics: What Exactly is at Issue, and
Why Does it Matter?” Ethics and Information
Technology, Vol.4, No.1, pp. 37-54.
The DATA BASE for Advances in Information Systems - Spring-Summer 2006 (Vol. 37, Nos. 2 & 3)
About the Authors
Tim Goles is Assistant Professor at the University of
Texas at San Antonio. He earned his Ph.D. from the
University of Houston. He has over fifteen years of
management experience in the information
technology arena, including evaluating, developing,
and implementing strategic and operational
information systems, outsourcing contract
management, and IS security. His research interests
and publications parallel his work experience.
Dr. Gregory White obtained his Ph.D. in Computer
Science from Texas A&M University in 1995 and has
been involved in computer and network security
research since 1986. He currently serves as the
Director for the Center for Infrastructure Assurance
and Security and is an Associate Professor of
Computer Science at The University of Texas at San
Antonio (UTSA).
Nicole Lang Beebe is a doctoral candidate in
Information Technology at the University of Texas at
San Antonio. She has over a decade of experience in
information security in both the corporate and
government sectors. She is a Certified Information
Systems Security Professional (CISSP) and holds
degrees in electrical engineering and criminal justice.
Carlos Alberto Dorantes is a doctoral student in
Information Technology at the University of Texas at
San Antonio. He obtained his M.S. in Computer
Science from the Tecnológico de Monterrey. He has
over a decade of experience in enterprise systems
implementation in a multi-campus university. His
work has appeared in IS conferences such as HICSS
and AMCIS.
Barbara Hewitt is a doctoral candidate in Information
Technology at the University of Texas at San
Antonio. She has over a decade of experience in
system analysis and software development in the
corporate, education, and government sectors. She
holds degrees in computer science and business
administration.
Appendix A: IS Ethics Scenarios
(adapted from Ellis and Griffith, 2001)
Scenario 1: A programmer developed a tool that would contact corporate sites, scan their networks, and find flaws
in their security system. The programmer made the software available to everyone over the Internet. Corporations
felt the programmer was assisting hackers and cyber-criminals. The programmer felt that he was providing a tool for
network managers to troubleshoot their security systems.
Scenario 2: A popular Internet Service Provider (ISP) offers online registration. Any user with an Internet connection
can access the Hookyouup Network and register for Internet service. What the users do not know is that as part of
registration, the ISP scans their hard drive assessing their system for potential new software marketing
opportunities.
Scenario 3: Ruth likes to play practical jokes on friends. She once tried to log on to Jim’s account, guessing that his
password was his wife’s name. Once she had access, she installed a program that would flash the message “There
is no Escape” every time the escape key was pressed. Jim discovered the joke after a few days and was upset.
Scenario 4: Joe is giving an on-line demonstration in which he uses software that was licensed for a 90-day trial
period. Prior to giving the demonstration, he noted that the license would expire. Rather than pay the licensing fee, he
changes the date on his computer, effectively fooling the software into believing it is at the beginning of the licensing
period.
Scenario 5: Anna needs software to convert TIFF formatted images to GIF format. She found an excellent piece of
shareware and has used it once to convert the images. The shareware developer requests that she send $5 if she
likes and uses the software. She has not sent a check to the developer to date.
Scenario 6: Joan is a programmer at XYZ, Inc. While working late one night, she notices that her boss has left his
computer on. She enters his office to turn it off and finds that he is still connected to his email. She scans the
messages briefly, noticing whom they are from and what the topics are. One message catches her eye. It is
regarding herself in an unflattering way.
Scenario 7: Jim was recently fired from The Spot, a national discount department store. Jim is a techno-savvy
individual who felt he was wrongfully fired. In protest, he created a web page called “This Spot Unfair” in order to
state his case to the world about The Spot’s unfair treatment.
Appendix B: Measures
(adapted from Singhapakdi et al., 1996)
1. The situation above involves an ethical problem.
Strongly Agree . . . . Neutral . . . . Strongly Disagree
1 2 3 4 5 6 7 8 9
2. I would act in the same manner as (the actor) did in the above
scenario.
1 2 3 4 5 6 7 8 9
3. The overall harm (if any) done as a result of (the actor’s) action
would be very small.
1 2 3 4 5 6 7 8 9
4. Most people would agree that (the actor’s) actions are wrong.
1 2 3 4 5 6 7 8 9
5. (The actor’s) actions will not cause any harm in the immediate
future.
1 2 3 4 5 6 7 8 9
6. There is a very small likelihood that (the actor’s) actions will
actually cause any harm.
1 2 3 4 5 6 7 8 9
7. If (the actor) is a personal friend of her boss, the action is
wrong.
1 2 3 4 5 6 7 8 9
8. (The actor’s) actions will harm very few people (if any).
1 2 3 4 5 6 7 8 9