
Author: Conger, Sue; Loch, Karen

Year: 1995
Title: Ethics and Computer Use
Journal: Communications of the ACM
Volume: 38
Issue: 12 (December)
Pages: 30-32
Keywords: Ethics | Computer security | Social psychology | Freedom of
speech | Information technology | (5200) Communications & information
management | (5140) Security management
Abstract: Everyone who develops applications, designs equipment, performs
any kind of testing, uses methodologies, analyzes jobs, designs human
interfaces, writes documentation, or prescribes the use of computers will have
ethical dilemmas on every project; they just might not recognize them. Some
situations with ethical implications are obvious: availability and distribution of
pornography on the Internet, corporate monitoring of e-mail, copying of
protected software, data and algorithm inaccuracy, inadequate access
controls, and misuse of computer databases, to name a few. These issues give
rise to a groundswell of public opinion that determines an acceptable way of
acting and then coerces compliance to the newly developed social norm.
Notes: Over the past 50 years, computers have undergone transformation
from monolithic number crunchers, to centralized repositories of management
information systems, to distributed, networked, cyberspace support systems.
During the same period, uses of computers have moved from computational
problems to life support, from machine language to GUIs, from abstractions of
work to virtual reality on the World-Wide Web. These transformations have
brought with them situations that have ethical implications.

Before thinking, "Ethics has nothing to do with computers" or "There are no


ethical issues in my job"; reflect again. Everyone who develops applications,
designs equipment, performs any kind of testing, uses methodologies;
analyzes jobs, designs human interfaces, writes documentation, or prescribes
the use of computers, will have ethical dilemmas on every project; they just
might not recognize them. This is not a proselytizing message for some
particular way of thinking or for a certain set of values. Rather, it is a call to
remedy a serious shortcoming of our profession--to incorporate a social
conscience along with the technology in our jobs and to help end users
effectively and ethically use computers in their jobs.

Some situations with ethical implications are obvious: availability and
distribution of pornography on the Internet, corporate monitoring of email,
copying of protected software, data and algorithm inaccuracy, inadequate
access controls, and misuse of computer databases, to name a few. These
issues give rise, as in the case of the Equifax/Lotus Marketplace product, to a
groundswell of public opinion that determines an acceptable way of acting and
then coerces compliance to the newly developed social norm.
The typical course of social norm development in our society begins with
actions that occur and might irritate or harm some of the population but not
enough to gain society's attention. An example is the use of purchased
databases by banks to weed out high-risk individuals as loan applicants.
Legitimate applicants might be discriminated against because the information
derived from the database is incorrect; but, because it came from a computer,
it is viewed as inviolate. Another example might be the incorporation of
redlining policies on loan applications from minorities into an expert system
that evaluates loan requests. The hidden reasoning might never be known to
the users.

Eventually, some threshold is reached as more people are harmed and the
events become public. Then, we begin to attend to the issue and to discuss it.
As the number of people interested in the issue increases, the discussion
group raises and resolves the issue, developing a social norm. The norms
might be applied through peer pressure, through professional associations, or
through legal means, depending on the severity of sanction the norm
prescribes.

This typical course of social norm development runs into snags in computer
use because most of the population consists of relative novices in computing,
even though the number of social transgressions is rapidly increasing. Another
problem in the development of social norms in the electronic realm is that
many situations are not well known and not well understood. And, for every
novel technology, new situations arise.

The lack of problem recognition and understanding comes from several
sources. First, many people are unable to draw analogies from computing
situations to real-world circumstances. Therefore, actions that an individual
would never typically contemplate, such as reading a neighbor's postal mail,
become novel situations when electronic (i.e., reading a neighbor's email). This
whim might be indulged to satisfy curiosity or it might be thought of as
harmless because the information will not be used.

Second, people who have been trained in engineering, computer science, and
management information systems frequently have little training in ethics,
philosophy, and moral reasoning. Without a vocabulary with which to think and
talk about what constitutes an ethical computing issue, it is difficult to have
the necessary discussions to develop social norms.

The purpose of this special section is to confront these problems by drawing
analogies, defining the terrain, identifying the state of legal support in email
privacy, and reasoning through ethical issues. The articles in this section were
developed, first, to help define representative ethical situations relating to
computing; it is not possible to be exhaustive because new transgressions
occur daily. Second, we hope to shed some light on how to think about and
analyze situations when they arise. Finally, we hope to spark a continuing
debate on what the ethical positions relating to our profession should be. If we,
the editors and authors, have done our jobs properly, you will be angered,
discomfited, or prodded to action. Please participate in the debate.

Sue Conger is a professor in the Management Information Sciences
Department at Southern Methodist University. Karen D. Loch is an associate
professor in the Department of Decision Sciences at Georgia State University.
