

Information Privacy: Measuring Individuals' Concerns About Organizational Practices1

By: H. Jeff Smith
Georgetown School of Business
Georgetown University
Old North Hall
Washington, D.C. 20057 U.S.A.
jsmith@guvax.georgetown.edu

Sandra J. Milberg
Georgetown School of Business
Georgetown University
Old North Hall
Washington, D.C. 20057 U.S.A.
milbergs@gunet.georgetown.edu

Sandra J. Burke
Georgetown School of Business
Georgetown University
Old North Hall
Washington, D.C. 20057 U.S.A.
burkes@gunet.georgetown.edu

Abstract

Information privacy has been called one of the most important ethical issues of the information age. Public opinion polls show rising levels of concern about privacy among Americans. Against this backdrop, research into issues associated with information privacy is increasing. Based on a number of preliminary studies, it has become apparent that organizational practices, individuals' perceptions of these practices, and societal responses are inextricably linked in many ways. Theories regarding these relationships are slowly emerging. Unfortunately, researchers attempting to examine such relationships through confirmatory empirical approaches may be impeded by the lack of validated instruments for measuring individuals' concerns about organizational information privacy practices. To enable future studies in the information privacy research stream, we developed and validated an instrument that identifies and measures the primary dimensions of individuals' concerns about organizational information privacy practices. The development process included examinations of privacy literature; experience surveys and focus groups; and the use of expert judges. The result was a parsimonious 15-item instrument with four subscales tapping into dimensions of individuals' concerns about organizational information privacy practices. The instrument was rigorously tested and validated across several heterogeneous populations, providing a high degree of confidence in the scales' validity, reliability, and generalizability.

Keywords: Privacy, LISREL, ethical issues, measures, reliability, validity

ISRL Categories: A10401, A10402, A10403, A10611, BD0104.01, BD0105

It is inevitable that personal privacy will be one of the most significant pressure points...for most of the 1990s. Advancing technology, depersonalization of the workplace and other social environments, a growing population...all can be expected to create a greater need for a sense of personal space and dignity (Erwin Chemerinsky, quoted in Smith, 1994).

1 Allen Lee was the accepting senior editor for this paper.


Information privacy, "the ability of the individual to personally control information about one's self" (Stone, et al., 1983), has been called one of the most important "ethical issues of the information age" (Mason, 1986; Smith, 1994). Public opinion polls show increasing levels of concern about privacy among Americans (Equifax, 1990; 1991; 1992; 1993; Katz and Tassone, 1990). For example, one survey indicates that 79 percent of Americans are concerned about threats to personal privacy, and 55 percent feel that "protection of information about consumers will get worse by the year 2000" (Equifax, 1992). In addition, a number of corporations have faced legal problems and received negative media attention because of privacy issues (Cespedes and Smith, 1993; Culnan, 1993; Smith, 1994). In the United States, pressure for additional laws to guard against information privacy exposures is coming from both domestic and international quarters (Cespedes and Smith, 1993; Culnan, 1993). As organizations find their data management activities receiving more scrutiny from a privacy perspective, information systems managers should be aware of exposures and be accountable to their organizations (Straub and Collins, 1990). Against this backdrop, research into issues associated with information privacy is increasing. Some privacy research dates back to the 1960s and 1970s (e.g., HEW, 1973; PPSC, 1977; Westin, 1967; Westin and Baker, 1972). Most of the scholarly work, however, has emerged during the 1980s and early 1990s. Consequently, it could be argued that the research stream is still in its infancy and that much work lies ahead as researchers examine the complex web of relationships in the information privacy domain. Based on a number of preliminary studies (Culnan, 1993; Milberg, et al., 1995; Smith, 1994; and Stone, et al., 1983), it has become apparent that organizational practices, individuals' perceptions of these practices, and societal responses are inextricably linked in many ways. Theories regarding these relationships are slowly emerging: for example, Stone and Stone (1990) developed a model for information flows and physical/social structures in work environments based on expectancy theory,

Culnan (1993) tested an exploratory model that explains consumer attitudes toward some direct marketing practices, and Smith (1994) developed a model that explains corporate approaches to information privacy policymaking. Overall, however, the theoretical base is still quite fragmented, and few empirical tests of these relationships have been conducted. Unfortunately, researchers attempting to examine such relationships through confirmatory empirical approaches may be impeded by the lack of validated instruments for measuring individuals' concerns about organizational information privacy practices. As in many other areas of information systems (IS) research (Jarvenpaa, et al., 1985; Straub, 1989), instrumentation issues have generally been ignored in the information privacy domain. This can lead to two potential problems over time: First, it will be difficult to assess the significance of any particular study because the "lack of validated measures in confirmatory research raises the specter that no single finding...can be trusted" (Straub, 1989, p. 148). Second, development of a research stream will become particularly problematic because it will be difficult to "compare and accumulate findings and thereby develop syntheses for what is known" (Churchill, 1979, p. 67). Thus, the frontiers of knowledge in privacy research can be extended by efforts to enhance the tools at researchers' disposal and to ensure that scientific rigor can be maintained in future studies.2 Specifically, a rigorously validated instrument that measures individuals' concerns about organizational information privacy practices is needed because without such an instrument, researchers cannot credibly test explanatory theories regarding causal links between practices, individuals' perceptions, and societal responses. In addition, such an instrument will
2 References to scientific rigor should not be interpreted as suggesting the superiority of confirmatory empirical research approaches over other approaches. The point being stressed is that for researchers utilizing confirmatory empirical techniques, the use of validated instruments is a critical factor. In addition, to the extent that the use of such instruments strengthens positivist research on a topic, this can also call for advances in interpretive research to be made on the same topic (Lee, 1991). See additional remarks in the "Discussion" section.


promote cooperative research efforts by "allowing other researchers in the stream to use this tested instrument across heterogenous settings and times" as well as by bringing "greater clarity to the formulation and interpretation of research questions" (Straub, 1989, p. 148). To that end, this paper reports the results of a study that developed and validated a measurement instrument that can be used in future information privacy research. Table 1 shows the final product of our work: a 15-item instrument with four subscales (Collection, Errors, Unauthorized Secondary Use, and Improper Access; see definitions below) that measures individuals' concerns about organizational information privacy practices. The following sections first consider the previous literature regarding individuals' concerns about organizational information privacy practices. They then detail the development and validation of the instrument and conclude with a discussion of implications for both researchers and managers.

concerns.4 Furthermore, while other studies have examined various dimensions underlying information privacy concerns, the specific dimensions differ from study to study,5 and a common unifying framework relating these dimensions has not yet emerged. Thus, we attempted to ascertain the underlying dimensions, both central and tangential, that have been identified in either scholarly literature, in federal law, or in privacy advocates' writings.6 Of course, this was a somewhat iterative process in that the understanding of the construct was refined in later steps (see "Methods" section below). This exercise revealed several central dimensions of individuals' concerns about organizational information privacy practices: collection of personal information; internal unauthorized secondary use of personal information; external unauthorized secondary use of personal information; errors in personal information; and improper access to personal information. Two additional, tangential dimensions, which were mentioned with less frequency in the scholarly literature, were also noted: concerns regarding
4 For example, a commonly used public opinion survey question asks, "How concerned are you about threats to your personal privacy in America today?" (Equifax, 1990; 1991; 1992; 1993). While such a question provides valid data regarding levels of public concern, it provides no insight regarding the nature of the concerns.

5 For example, Stone, et al. (1983) refer to collection, storage, usage, and release. Culnan (1993) utilized a 3x3 matrix, with acquire, use, and transfer along one axis and internal customer, external customer, and prospect along the other axis.

6 An extensive literature review of information privacy research from prominent multidisciplinary publications and books provided an initial framework for the identification of the underlying dimensions of information privacy. To further our understanding of the dimensionality of information privacy concerns, a modified content analysis technique was used to identify concerns most often raised in privacy advocates' writings and in federal law. Following Kerlinger's (1986) outline, a "universe" consisting of approximately 960 articles in the publication Privacy Journal (a leading monthly publication for privacy advocates) from 1983-1990 was examined. A particular article was considered the unit of analysis. A beginning set of categories, inspired by the "Code of Fair Information Practices" as described in the U.S. Department of Health, Education, and Welfare study (1973) and the Privacy Protection Study Commission (PPSC, 1977), was used.

Literature Review
A prerequisite step in the creation of a validated measurement instrument is a consideration of the dimensionality of the relevant construct (in this case, individuals' concerns about organizational information privacy practices). As one of several steps in this process, a thorough review of the existing literature was conducted3 (Bearden, et al., 1993; Churchill, 1979). It is common for information privacy to be approached as though it were a unidimensional construct. For example, while available opinion surveys (Equifax, 1990; 1991; 1992; 1993) report an increasing level of concern about information privacy, these surveys do not fully explore the nature (dimensionality) of those
3 This section details the results of the literature review. In a later section ("Methods"), details of additional steps in this process are provided.



Table 1. Final Instrument

Here are some statements about personal information. From the standpoint of personal privacy, please indicate the extent to which you, as an individual, agree or disagree with each statement by circling the appropriate number.*

A. It usually bothers me when companies ask me for personal information.
B. All the personal information in computer databases should be double-checked for accuracy, no matter how much this costs.
C. Companies should not use personal information for any purpose unless it has been authorized by the individuals who provided the information.
D. Companies should devote more time and effort to preventing unauthorized access to personal information.
E. When companies ask me for personal information, I sometimes think twice before providing it.
F. Companies should take more steps to make sure that the personal information in their files is accurate.
G. When people give personal information to a company for some reason, the company should never use the information for any other reason.
H. Companies should have better procedures to correct errors in personal information.
I. Computer databases that contain personal information should be protected from unauthorized access, no matter how much it costs.
J. It bothers me to give personal information to so many companies.
K. Companies should never sell the personal information in their computer databases to other companies.
L. Companies should devote more time and effort to verifying the accuracy of the personal information in their databases.
M. Companies should never share personal information with other companies unless it has been authorized by the individuals who provided the information.
N. Companies should take more steps to make sure that unauthorized people cannot access personal information in their computers.
O. I'm concerned that companies are collecting too much personal information about me.

Items A, E, J, and O comprise the "Collection" subscale; items B, F, H, and L comprise the "Errors" subscale; items C, G, K, and M comprise the "Unauthorized Secondary Use" subscale; and items D, I, and N comprise the "Improper Access" subscale. Subscale scores are calculated by averaging the responses to the items for each subscale; an overall score is then calculated by averaging the subscale scores.

* Each of the items is followed by a seven-point Likert scale anchored by "Strongly disagree" (1) and "Strongly agree" (7).
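The scoring rule in the note above (average the items within each subscale, then average the subscale scores) can be expressed directly in code. The sketch below is ours, not part of the published instrument; it assumes responses have already been recorded on the 1-7 scale, and the dictionary keys and function name are illustrative only.

```python
# Minimal scoring sketch for the 15-item instrument (assumes 1-7 Likert responses).
# Item letters follow Table 1; subscale membership follows the note above.

SUBSCALES = {
    "Collection": ["A", "E", "J", "O"],
    "Errors": ["B", "F", "H", "L"],
    "Unauthorized Secondary Use": ["C", "G", "K", "M"],
    "Improper Access": ["D", "I", "N"],
}

def score_response(items: dict) -> dict:
    """Return the four subscale means plus the overall mean of the subscale means."""
    scores = {
        name: sum(items[i] for i in members) / len(members)
        for name, members in SUBSCALES.items()
    }
    scores["Overall"] = sum(scores.values()) / len(SUBSCALES)
    return scores

# Example: one hypothetical respondent's answers keyed by item letter.
example = {"A": 6, "E": 5, "J": 7, "O": 6, "B": 4, "F": 5, "H": 5, "L": 4,
           "C": 7, "G": 6, "K": 7, "M": 6, "D": 5, "I": 6, "N": 5}
print(score_response(example))
```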


reduced judgment in decision making and combining data from several sources. Each is discussed briefly below (see synopsis in Table 2).

Collection
The Association for Computing Machinery (ACM), a major association of information processing professionals, has a phrase in its Code of Professional Conduct that dictates members shall "always consider the principle of the individual's privacy and seek . . . to minimize the data collected . . ." (ACM, 1980). This area of concern reflects the perception, "There's too much damn data collection going on in this society" (Miller, 1982). Individuals often perceive that great quantities of data regarding their personalities, background, and actions are being accumulated, and they often resent this. The growing collection of personal information has been a theme in privacy literature since the 1970s. Although Westin and Baker (1972) did not find that information collection had increased appreciably, they argued that it was an inevitable byproduct of the growing computer revolution and raised concerns in that regard. Shortly thereafter, two major reports from that era (HEW, 1973; PPSC, 1977) confirmed the trend and the concern; this position was echoed by Linowes (1989). Laudon (1986) also inferred a concern regarding excessive collection of data when he coined the moniker "dossier society" to describe our increasing reliance on personal data. In the behavioral literature, Stone, et al. (1983) utilized "information collection" as one component in their study; Stone and Stone (1990) also discuss a number of concerns associated with information collection.

Unauthorized secondary use (internal)

Sometimes information is collected from individuals for one purpose but is used for another, secondary purpose without authorization from the individuals. Even if contained internally within a single organization, unauthorized use of personal information will very often elicit a negative response. This concern was raised pointedly in the Code of Fair Information Practices, which was included in a seminal study sponsored by the U.S. Department of Health, Education, and Welfare (1973). Concerns about such secondary use were raised in several industry settings in a later study (PPSC, 1977) and echoed in Linowes (1989). Stone, et al. (1983) refer to this issue under the label of "information usage"; it is reiterated by Stone and Stone (1990) in a discussion of various information uses. Specific examples of such secondary, internal uses are also covered in the literature: for example, "sugging," a practice in which data are collected ostensibly for research only to be used later for marketing purposes, falls into this area of concern (Cespedes and Smith, 1993). It is common for organizations to find new uses for data: for example, some banks' marketing groups have attempted to use income data collected on loan applications to sort customers into narrow categories for targeted sales offerings, a use of income data likely unanticipated by the customers at the time they filled in the loan application.

Unauthorized secondary use (external)

Some studies (for example, Tolchinsky, et al., 1981) have found that concerns about secondary use are exacerbated when personal information is disclosed to an external party (i.e., another organization). This issue was mentioned in studies conducted in the 1970s (HEW, 1973; PPSC, 1977; Westin and Baker, 1972), but it was not stressed in great depth at that time, perhaps because the technology of the day constrained such external exchanges of data. By the 1980s, though, the concern had become a major one. Linowes (1989) discusses several examples of unauthorized external use of information in various industries. Culnan (1993) examines attitudes toward external secondary uses in direct marketing applications. Stone, et al. (1983) consider the issue under the rubric of "information release," and Stone and Stone (1990) discuss the "subsequent disclosures" of personal information.


Table 2. Dimensions

Collection
Description of Concern: Concern that extensive amounts of personally identifiable data are being collected and stored in databases.
Major Literature References*: HEW, 1973; Laudon, 1986; Linowes, 1989; Miller, 1982; PPSC, 1977; Stone, et al., 1983; Stone and Stone, 1990; Westin and Baker, 1972.

Unauthorized Secondary Use (internal)
Description of Concern: Concern that information is collected from individuals for one purpose but is used for another, secondary purpose (internally within a single organization) without authorization from the individuals.
Major Literature References*: HEW, 1973; Linowes, 1989; PPSC, 1977; Stone, et al., 1983.

Unauthorized Secondary Use (external)
Description of Concern: Concern that information is collected for one purpose but is used for another, secondary purpose after disclosure to an external party (not the collecting organization).
Major Literature References*: Culnan, 1993; Linowes, 1989; Stone, et al., 1983; Tolchinsky, et al., 1981; Westin and Baker, 1972.

Improper Access
Description of Concern: Concern that data about individuals are readily available to people not properly authorized to view or work with this data.
Major Literature References*: Date, 1986; Linowes, 1989; PPSC, 1977 (minor references).

Errors
Description of Concern: Concern that protections against deliberate and accidental errors in personal data are inadequate.
Major Literature References*: Date, 1986; HEW, 1973; Laudon, 1986; Miller, 1982; PPSC, 1977 (minor references); Westin and Baker, 1972.

Reduced Judgment (tangential)**
Description of Concern: Concern that automation of decision-making processes may be excessive and that mechanisms for decoupling from automated decision processes may be inadequate.
Major Literature References*: Kling, 1978; Ladd, 1989; Laudon, 1986; Mowshowitz, 1976.

Combining Data (tangential)***
Description of Concern: Concern that personal data in disparate databases may be combined into larger databases, thus creating a "mosaic effect."
Major Literature References*: HEW, 1973; Laudon, 1986; PPSC, 1977.

* Not an exhaustive list. The references shown should be viewed as representative.

** While some authors have addressed Reduced Judgment in their scholarly writings, they have seldom identified it as a privacy concern, per se.

*** The Combining Data concern is almost always addressed in concert with either the Collection or the Unauthorized Secondary Use (External) concern.



The most commonly cited examples of this concern are the sale or rental of current or prospective customers' names, addresses, phone numbers, purchase histories, categorizations, etc., on mailing "lists," which are often transferred between organizational entities as digital files. Trade publications in the direct marketing industry, such as DM News ("DM" stands for "direct marketing"), contain pages of advertisements for such lists (e.g., "individuals who responded to an advertisement for a weight-loss product").

Improper access

Who within an organization is allowed to access personal information in the files? This is a question not only of technological constraints (e.g., access control software) but also of organizational policy. It is often held that individuals should have a "need to know" before access to personal information is granted. However, the interpretation of which individuals have, and do not have, a "need to know" is often a cause of much controversy. PPSC (1977) and Linowes (1989) provide some attention to the topic, considering, for example, the inappropriate access to employees' healthcare records that are not controlled properly, and it is sometimes considered under the rubric of "security" in the database literature (see, for example, Date, 1986). Of course, technological options now exist for controlling such access at the file, record, or field level. But how those options are utilized and how policies associated with those uses are formed represent value-laden managerial judgments.

Errors

Many individuals believe that organizations are not taking enough steps to minimize problems from errors in personal data. Although some errors might be deliberate (e.g., a disgruntled employee maliciously falsifying data), most privacy-related concerns involve instead accidental errors in personal data. Early privacy studies detail some procedures for minimizing such errors (HEW, 1973; Westin and Baker, 1972; also see minor references in PPSC, 1977). Later works (Laudon, 1986; Linowes, 1989) document continuing problems in this domain. Provisions for inspection and correction are often considered as antidotes for problems of erroneous data (HEW, 1973; PPSC, 1977; Smith, 1994). But many errors are stubborn ones, and they seem to snowball in spite of such provisions (Smith, 1994). In addition, a reluctance to delete old data, which can clearly become "erroneous" because of their static nature in a dynamic world, can exacerbate this problem (Miller, 1982). Also at issue are questions of responsibility in spotting errors: does a system rely on individuals to monitor their own files, or is there an overarching infrastructure in place (Bennett, 1992)? Although errors are sometimes assumed to be unavoidable problems in data handling, whether controls are or are not included in a system does represent a value choice on the part of the system's designers (Kling, 1978; Mowshowitz, 1976).

Reduced judgment

As organizations grow in size and in their data processing capabilities, they tend to rely more often on formulas and rules in their decision making (Cyert and March, 1963). Their use of automated decision-making processes may lead individuals to feel that they are being treated more as "a bunch of numbers" than as individuals. As systems are increasingly designed so that these decisions are automated, mechanisms for "decoupling" the decision making from the systems as appropriate, that is, reverting to human controls when the computer's limits as a decision maker are reached (Ladd, 1989), should be included. When such mechanisms are not provided, concerns about this dimension of decision making increase (Kling, 1978; Ladd, 1989; Laudon, 1986; Mowshowitz, 1976). Some examples of this phenomenon border on the ridiculous:


American Express once issued a Gold Card to a man who had been dead 14 years, just because his widow filled in all the blanks on an application form. Much of the data would have appeared immediately suspicious to a human being (e.g., she entered all zeroes for his social security number and "God" as his employer), but the system apparently had been programmed to accept these entries, and the appropriate mechanisms for decoupling the decision making from the computerized information system had apparently not been included (Smith, 1994). While advocates often couple "reduced judgment" with privacy in their arguments and writings, it should be noted that "reduced judgment" can easily be considered a tangential construct or even a separate construct of its own (under the rubric of "decision making"). Indeed, it has usually been referenced in scholarly work as a related concern rather than as a dimension of privacy, per se.

Combining data

Concerns are sometimes raised with respect to combined databases that pull personal data from numerous other sources, creating what has been termed a "mosaic effect." These combinations of data were mentioned in some of the 1970s privacy studies (HEW, 1973; PPSC, 1977) and in the 1980s (see especially Laudon, 1986). Even if data items in disparate databases are seen as innocuous by themselves, their combination into larger databases appears to some to be suggestive of a "Big Brother" environment. The Combining Data concern is usually encountered in the literature in parallel with Collection and/or Unauthorized Secondary Use (External) concerns and, in fact, may not be a separate dimension (see "Methods" below).

Methods

The ultimate objective of this research was the development and validation of an instrument to measure individuals' concerns about organizational information privacy practices. While it is expected that this instrument will be used primarily in positivist studies, the process of instrument development and validation used in this study included steps that addressed not only what Lee (1991) would call a researcher's positivist understanding (e.g., a theory taking the form of independent and dependent variables), but also the subjective understanding (i.e., what the research subjects themselves understand their situation to be) as well as a researcher's interpretive understanding (i.e., what the researcher observes and interprets to be the subjective understanding).7

As can be seen in Stage 1 of Figure 1,8 an iterative process is used to (1) specify the domain and dimensionality of a construct, (2) generate a sample of items, and (3) assess content validity of these items (i.e., the extent to which scale items appear to be consistent with the theoretical domain/dimensionality of the construct (Churchill, 1979; Cronbach, 1971)). Techniques9 used to accomplish these tasks include literature reviews, experience surveys,10 focus groups, expert judges, and pilot tests with relevant samples. During this stage, based on input from individuals, expert judges, and literature, scale items are trimmed and refined, and dimensions may be added, deleted, or modified as understanding of the construct improves.

In Stage 2, the items and the conceptualization of the construct are subjected to "field

7 See Lee (1991) for fuller explanations of these terms. Also, see remarks in our "Discussion" section.

8 Figure 1 represents a synthesis and adaptation of several models of the development and validation process. It is not claimed to be a definitive model; however, it does include steps that are widely accepted and have been used in past studies.

9 For further discussion of these techniques, see Bearden, et al., 1993, and Churchill, 1979.

10 An experience survey is "not a probability sample but a judgment sample of persons who can offer some ideas and insights into the phenomenon" (Churchill, 1979, p. 67).


Figure 1. Instrument Development and Validation Process

Stage 1
Tasks: Specify domain and dimensionality of the construct; generate sample of items; assess content validity (add, delete, modify items).
Techniques/Procedures: Literature review; experience surveys; focus groups; expert judges.

Stage 2
Tasks: Administer instrument to sample; assess items and instrument (purify items and/or understanding of construct); compare alternative models of the construct.
Techniques/Procedures: Exploratory factor analysis; interitem reliabilities; confirmatory factor analysis (LISREL).

Stage 3
Tasks: Assess internal validity (construct, concurrent, nomological); assess reliability (internal consistency, test-retest); assess generalizability.
Techniques/Procedures: Confirmatory factor analysis (LISREL); Pearson correlations; OLS regressions; confirmatory factor analysis (LISREL) on multiple samples.

Source: Adapted from Bagozzi (1980), Bearden, et al. (1993), Churchill (1979), and Straub (1989).


tests." A preliminary version of the instrument is administered to various samples, and exploratoryfactor analysis is utilizedto assess the items. The instrument is "purified" (Churchill,1979) as researchers find that certain items should again be added, deleted, or modified. Indeed, the researchers may even find that they must modifytheir understanding of the construct's dimensionality.Stage 2 continues until the loading of the items in the exploratoryfactor analysis is consistent with the understanding of the construct's dimensions, and the inter-item reliabilities (Cronbach's alpha) are at satisfactory levels for all dimensions. In addition, Stage 2 includes a comparisonof alternativemodels of the construct through confirmatory factor analysis (CFA). (See the Appendix for a discussion of the meritsof CFA.) In Stage 3, the internalvalidityof the instrument is assessed. This includes evaluations of the instrument'sconstruct, concurrent,and nomological validity.Stage 3 also includes an assessment of the instrument's reliability, specifically internal consistency and testretest reliability.Finally, Stage 3 includes an assessment of the instrument'sgeneralizability (i.e., its usefulness for different populations). In all tests, the instrument must perform at statistically adequate levels. These levels have been documented, based on generally accepted criteria, by numerous writers (see Bagozzi and Yi, 1988; 1991; Bearden, et al., 1993; Bentler and Bonett, 1980; Fornell and Larcker,1981). The following sections provide further detail regarding our instrument development process.
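Because inter-item reliability (Cronbach's alpha) is the gating statistic in Stage 2, a brief illustration of its computation may help. The sketch below applies the standard formula, alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the summed scale); the response matrix is hypothetical and is not data from this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix (one subscale at a time)."""
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses (5 respondents x 4 items) for illustration only.
collection = np.array([
    [6, 5, 7, 6],
    [4, 4, 5, 5],
    [7, 6, 7, 7],
    [3, 4, 4, 3],
    [5, 5, 6, 6],
])
print(round(cronbach_alpha(collection), 2))
```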

Instrument validation results - stage 1

To specify the domain/dimensionality of the "individuals' concerns about organizational information privacy practices" construct and to establish content validity of the generated scale items, the extensive literature review detailed in a previous section was used as a starting point in determining the central and tangential dimensions of the construct. Concomitant with this effort, semistructured interviews were conducted with 83 executives, managers, and employees of banks, insurance companies, and a credit card issuer.11 Further, 18 consumers were interviewed either individually or in focus groups. (See Table 3a for a description of the samples; see Table 3b for a summary of the procedures and results in this study.)

By discussing the construct both with people who were employed in organizations handling much personal data and with people who were not in any way associated with such organizations, we were better able to interpret the subjective understanding (i.e., what the individuals themselves understood "concerns about information privacy" to mean). To achieve our own interpretive understanding, rather than rely exclusively on how other researchers and some advocates have already interpreted what individuals themselves understand information privacy to mean (see previous discussion under "Literature Review"), we interviewed 12 consumer/privacy advocates either in person or over the phone.

Commonly cited themes regarding information privacy concerns from the research articles, from privacy advocates' writings and U.S. laws, and from the interviews/focus groups were sorted in an iterative and interpretive process by the researchers, and the categories were combined and modified in an intuitive manner to suggest underlying dimensions. Based on the dimensions identified, we generated a preliminary set of 72 survey items for measuring individuals' concerns about organizational information privacy practices.

To establish content validity of the scale items (i.e., whether the items truly sampled the universe of situations we were attempting to measure

11 In all interviews, organizations' employees were asked for their opinions both as employees and as individuals. While better informed about organizational practices than average consumers, employees were usually able to distinguish between their organizational perspectives and their personal opinions.


Table 3a. Samples

Sample 1 (n = 83*): Executives, managers, and employees of banks, insurance organizations, and a credit card issuer. Collected Fall 1989/Spring 1990.
Sample 2 (n = 18): Consumers (individual and focus group). Collected Spring 1990.
Sample 3 (n = 12): Consumer/privacy advocates. Collected Spring 1990.
Sample 4 (n = 3): Judges (familiar with privacy area). Collected Spring 1990.
Sample 5 (n = 15): Doctoral students and faculty members (large Eastern university). Collected Spring 1990.
Sample 6 (n = 15): Organizational employees (follow-up interviews). Collected Spring 1990.
Sample 7 (n = 25): Employees of a bank and insurance organizations (focus groups). Collected Spring 1990.
Sample 8 (n = 704): Employees of a bank, insurance organizations, and a credit card issuer. Collected Spring/Summer 1990.
Sample 9 (n = 239): Information systems managers; graduate business students (two Eastern universities). Collected Summer/Fall 1991.
Sample 10 (n = 270): Graduate business students (four geographically dispersed U.S. universities). Collected Spring 1992.
Sample 11 (n = 147): Graduate business students (Eastern university). Collected Fall 1992.
Sample 12 (n = 354): U.S.-based ISACA members. Collected Spring 1993.
Sample 13 (n = 186**): Undergraduate business students (Eastern university). Collected Spring 1993.
Sample 14 (n = 170***): Undergraduate business students (Eastern university). Collected Fall 1993.
Sample 15 (n = 77): Graduate business students (two geographically dispersed U.S. universities). Collected Fall 1993.

* n's reported in this table are for the total samples. These n's may differ slightly from those in subsequent tables, which report effective sample sizes (in which responses with some missing data have been removed).
** All 186 of these students completed a "test" of the instrument, and these 186 responses were used for the "generalizability" exercise. One hundred twenty-three of the 186 also completed the "retest" of the instrument (eight weeks later) and were used in the "test-retest" exercise. Half of the "retest" group was randomly selected to complete a "cynical distrust" scale; 59 of these, who had also completed the "test," were correlated with the "cynical distrust" scale.
*** These 170 students were randomly assigned to complete, in addition to the information privacy instrument, either a "paranoia" or a "social criticism" scale.


Table 3b. Procedures and Results

Stage* 1. Task: Interpret organizational participants' understanding of matters pertaining to construct domain and dimensionality. Technique/Procedure: Semistructured interviews. Samples (see Table 3a): 1, 2. Results: Preliminary research understanding of construct domain and dimensionality.

Stage 1. Task: Improve interpretation of the participants' understanding of matters pertaining to construct domain and dimensionality. Technique/Procedure: Semistructured interviews. Sample: 3. Results: Refined research understanding of construct domain and dimensionality; used in generating set of 72 survey items.

Stage 1. Task: Assess content validity of initial item set. Technique/Procedure: Judging exercise. Sample: 4. Results: 33 items discarded, resulting in a 39-item set.

Stage 1. Task: Test content validity of reduced item set. Technique/Procedure: Judging exercise. Sample: 5. Results: Items deleted and reworded, resulting in a new 32-item set.

Stage 1. Task: Test refinement of construct domain and dimensionality. Technique/Procedure: Follow-up interviews. Sample: 6. Results: Confidence in refined understanding of construct domain and dimensionality.

Stage 1. Task: Pilot test preliminary survey. Technique/Procedure: Focus group: fill in preliminary survey and discuss. Sample: 7. Results: Items deleted and reworded, resulting in a new 20-item set.

Stage 2. Task: Assess items and instrument. Technique/Procedure: Exploratory factor analysis and interitem reliabilities. Sample: 8. Results: 20 items included on instrument. Deleted 3, added 2, and modified 8 for clarity, resulting in a 19-item instrument.

Stage 2. Task: Assess items and instrument. Technique/Procedure: Exploratory factor analysis and interitem reliabilities. Sample: 9. Results: 19 items included on instrument. Deleted 2, added 8, and modified 6 for clarity, resulting in a 25-item instrument.

Stage 2. Task: Assess items and instrument. Technique/Procedure: Exploratory factor analysis and interitem reliabilities. Sample: 10. Results: Secondary internal and external use converged into a single dimension. Also deleted items that did not clearly load onto single factors and items that were redundant, resulting in a 15-item instrument (see Table 1). Of these 15 items, all loaded unambiguously onto four factors; Cronbach's alphas at adequate levels.


Table 3b. continued

Stage 2. Task: Compare alternative models of the construct. Technique/Procedure: Confirmatory factor analysis (LISREL). Sample: 10. Results: Hypothesized 4-factor model provided best fit to data compared to alternative models (see Table 4).

Stage 3. Task: Assess construct validity (overall model fit). Technique/Procedure: Confirmatory factor analysis (LISREL). Sample: 11. Results: Non-centralized normed fit index adequate.

Stage 3. Task: Assess construct validity (convergent/discriminant). Technique/Procedure: Confirmatory factor analysis (LISREL). Sample: 11. Results: Standardized factor loadings, average variance extracted, and factor intercorrelations adequate (see Tables 6 and 7).

Stage 3. Task: Assess concurrent validity. Technique/Procedure: Pearson correlations. Sample: 12. Results: Significant correlations with public opinion questions.

Stage 3. Task: Assess nomological validity. Technique/Procedure: OLS regressions. Sample: 15. Results: Expected relationships with past experiences and prior knowledge supported.

Stage 3. Task: Assess nomological validity. Technique/Procedure: Pearson correlations. Samples: 13, 14. Results: Expected relationships with personality factors supported.

Stage 3. Task: Assess nomological validity. Technique/Procedure: Pearson correlations. Sample: 15. Results: Expected relationships with future behavioral intentions supported.

Stage 3. Task: Assess reliability (internal consistency). Technique/Procedure: Confirmatory factor analysis (LISREL). Sample: 11. Results: Composite reliability and average variance extracted adequate (see Table 6).

Stage 3. Task: Assess reliability (test-retest). Technique/Procedure: Pearson correlations. Sample: 13. Results: Significant correlations over an 8-week period.

Stage 3. Task: Assess generalizability. Technique/Procedure: Confirmatory factor analysis (LISREL). Samples: 12, 13. Results: Validity and reliability shown for disparate populations (see Table 8).

* As shown in Figure 1.


sure (Cronbach, 1971; Straub, 1989)), we asked three judges, familiar with the privacy area, to screen the 72 items for those that did not appear consistent with the construct and the identified dimensions (i.e., did not appear to actually measure individuals' concerns about organizational information privacy practices).12 This resulted in a reduced set of 39 items. Then, 15 doctoral students and faculty members at a large Eastern university judged this reduced set of items. Specifically, the dimensions were explained to eight of the doctoral students/faculty members, who were asked to evaluate the items for their applicability to the respective dimensions. The other seven doctoral students/faculty members were presented with the items but were not given an explanation of the dimensions. They indicated, for each item, what they perceived that item would measure. Items that were either inconsistently classified or were misclassified were eliminated or reworded. After this analysis, 32 items remained. Following this step, follow-up interviews were conducted with 15 corporate employees. In addition, at this point, the items were presented to 25 people employed by banks and insurance organizations. Using a focus group format, participants filled in the survey and then discussed their reaction to, and thoughts about, each item. Based on this test, further modifications and reductions resulted in a 20-item scale.

During this iterative process, as defined in Stage 1, the domain and the dimensionality of the "individuals' concerns about organizational information privacy practices" construct were repetitively assessed and modified. It became apparent that the Combining Data dimension was actually subsumed by two other dimensions: Collection and Unauthorized Secondary Use (External). Further probing with both advocates and consumers revealed that Combining Data concerns are actually viewed as an outcome of Collection and Unauthorized Secondary Use (External). When we tested some items that purported to measure concerns about Combining Data, our judges consistently held that it was almost impossible to separate the items' content from the Collection and Unauthorized Secondary Use (External) dimensions. Also during this process, Reduced Judgment was found by our judges and by some consumers to be tangential to the major construct of concern about information privacy and was dropped from further consideration. Few scholarly works hold Reduced Judgment to be a dimension of privacy. Furthermore, privacy advocates, when pressed, were inclined to concede that Reduced Judgment was actually outside the sphere of privacy concerns, per se.

12 For example, when presented with some preliminary items that were attempting to measure concerns about Reduced Judgment, the judges responded that they had difficulty understanding why it was a privacy concern, per se.

Instrument validation results - stage 2

Stage 2 includes a preliminary assessment and refinement of the instrument through exploratory methods. In particular, in our study, the 20-item scale was administered to a sample of 704 bank, insurance organization, and credit card issuer employees.13 Exploratory factor analysis and interitem reliability (Cronbach's alphas) provided tentative support for the various dimensions. Unfortunately, however, several items did not load as expected. An additional iterative process was initiated in which three versions of revised instruments were administered to varied populations, such as information systems managers and graduate business students. Based on additional exploratory factor analysis, items were deleted, added, or revised. This process eventually resulted in a 15-item instrument.

13 The survey was distributed to the employees, who were randomly chosen from the organizations' personnel rosters, under a cover letter from a senior executive. The completed, anonymous surveys were mailed directly to the researcher. A total of 1,103 surveys were distributed, and 704 were returned, for a response rate of 63.8 percent. The survey instructions said "We are interested in your own, personal opinions about the issue of privacy. Your responses will be held in complete anonymity."
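One common, if rough, way to judge how many factors an exploratory solution supports is the Kaiser criterion: retain factors whose eigenvalues of the item correlation matrix exceed 1. The fragment below illustrates that check only; it is not a reproduction of the authors' analysis, and the response matrix is randomly generated. With real scale data, one would expect a small number of dominant eigenvalues rather than the near-uniform values random data produce.

```python
import numpy as np

def eigenvalues_of_correlation(responses: np.ndarray) -> np.ndarray:
    """Eigenvalues of the item correlation matrix, largest first (items are columns)."""
    corr = np.corrcoef(responses, rowvar=False)
    return np.sort(np.linalg.eigvalsh(corr))[::-1]

# Hypothetical data: 200 respondents x 15 items on a 1-7 scale.
rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(200, 15)).astype(float)

eig = eigenvalues_of_correlation(responses)
n_factors = int((eig > 1.0).sum())   # Kaiser criterion
print(eig.round(2), n_factors)
```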



Four factors emerged representing the subscales: Collection (4 items), Errors (4 items), Unauthorized Secondary Use (4 items), and Improper Access (3 items). Examination of the exploratory factor analysis revealed that the unauthorized internal secondary use and the external secondary use dimensions had converged onto a single factor, the salient feature being that the secondary use of the information had not been authorized by the individuals involved. Chosen items all had factor loadings greater than .60 on the same factor in all factor analyses performed. (The final instrument is shown in Table 1.)

Stage 2 included one final step: to determine whether the hypothesized model of a four-dimension construct provided the best fit to the data as compared to alternative plausible models. To accomplish this, the overall fits of four theoretically plausible alternative models (a unidimensional model, a three-dimensional model, a model with two main factors and three sub-factors, and the hypothesized four-factor model) were compared using the CFA program LISREL (Joreskog and Sorbom, 1984). Four statistics provided in the LISREL program that are commonly used to compare model fits are the non-adjusted and adjusted goodness-of-fit indices (GFI and AGFI, respectively), root mean square residuals (RMR), and chi-square statistics (Bagozzi and Yi, 1988). Comparison of these statistics (see Table 4) suggests that the hypothesized four-factor model performs better on the overall model fit

measures than competing models. The chi-square statistics for all models estimated were significant, but, given the large samples used in the study, the significant chi-squares were likely artifacts of sample size (Bentler and Bonett, 1980). Thus, a comparison of the GFI, AGFI, and RMR measures, which are independent of sample size (Bagozzi and Yi, 1988), was performed to assess the models' fit. As shown in Table 4, both the GFI and the AGFI of the four-factor model are higher than those for the three competing models. Further, RMRs are also considerably higher in the competing models (more residual variance remains) than in the hypothesized model. Finally, the coefficient of determination, a criterion for evaluating the global fit of a model by assessing explained variance (i.e., how well the items serve as joint measures of the latent variable), was examined. As shown in Table 4, the coefficient of determination is as high or higher in the four-factor model as in the other three models. Thus, of the four models compared, the hypothesized four-factor model provided the best fit to the data and was ultimately accepted.

Instrument validation results - stage 3

Stage 3 included assessments of the instrument's internal validity, its reliability, and its generalizability.

Table 4. Comparison of Four Models

                          One-Factor   Three-Factor   Second-Order   Four-Factor (Hypothesized)
Chi-square (d.f.)*        792 (90)     371 (101)      371 (101)      240 (84)
GFI                       .66          .75            .85            .90
AGFI                      -.36         .42            .42            .67
RMR                       .10          .07            .066           .047
Coeff. of Determination   .812         .991           .970           .996

* p < .001 for all models tested.


Internal Validity

Construct Validity

Construct validity is defined as the extent to which the operationalization of a construct measures what it is supposed to measure (Cook and Campbell, 1979). To establish construct validity, we considered (1) the adequacy of the model's fit and (2) convergent and discriminant validity for the model. To this end, the instrument was administered to a new sample of graduate business students (n = 147).14

Adequacy of Model's Fit: In Stage 2, it was shown that the four-factor model provided the best fit to the data, suggesting its superiority over the other models in defining the "individuals' concerns about organizational information privacy practices" construct. However, the accepted model will achieve an adequate or satisfactory fit to the data only when a significant degree of correspondence exists between concepts and their respective measures and when measurement error is random (Bagozzi and Phillips, 1982). If significant measurement or method errors (or other external confounding factors) are present, overall model fit will be poor, suggesting model misspecification. In other words, subject responses must fit a fairly well-defined pattern for the hypothesized model to be sustained and construct validity supported. On that basis, the first step in assessing construct validity using the CFA technique is to assess overall model fit. While the chi-square statistic provided in LISREL is often used to assess the adequacy of the model's fit, it may be a misleading artifact of sample size (Bentler and Bonett, 1980). Given the large sample size in this study, the overall fit of the model was examined using the non-centralized normed fit index (NCNFI). The NCNFI, which is independent of sample size (Bentler, 1990; McDonald and Marsh, 1990), assesses the proportion of additional variance explained in the hypothesized model as compared to a null model that hypothesizes that all variables are mutually independent.15 This analysis is commonly performed to assess model fit when the CFA does not fulfill the acceptance criterion of a non-significant chi-square statistic, even though other measures of model fit fall within acceptable ranges (Bagozzi, 1993; Bentler, 1990; McDonald and Marsh, 1990), as was found in this case. The NCNFI for this model is .91, which is greater than the .90 rule of thumb recommended as a minimum satisfactory level (Bentler and Bonett, 1980), suggesting adequate model fit from a practical standpoint. In other words, the remaining incremental fit that could be achieved by additional model modifications is small. Further, the coefficient of determination (which shows how well the hypothesized relations account for the factors) is very high at .996. Finally, the RMR is also very low, at .065 (Bagozzi and Yi, 1988). These findings support the overall adequacy of the model fit and provide support for the theoretical structure of the construct (Bagozzi and Yi, 1988).

Convergent and Discriminant Validity: Convergent validity refers to the extent to which multiple measures of a construct agree with one another (Campbell and Fiske, 1959). Discriminant validity refers to the extent to which measures of different constructs are distinct (Campbell and Fiske, 1959). A traditional method for assessing construct validity has been the multitrait-multimethod (MTMM) matrix. However, CFA affords certain advantages in validity assessment over MTMM matrix analysis (Bagozzi and Phillips, 1982). CFA explicitly represents random measurement error and allows for estimation of method variation. In addition, it provides explicit tests of the entire model, estimates of parameters, and a variety of fit measures not available with the MTMM procedure (Bagozzi, 1993). There are several approaches to the assessment of convergent validity through CFA.

14 Loadings from an exploratory factor analysis and the interitem reliabilities for this sample are shown in Table 5.

15 The method for calculating the NCNFI is specified in Bentler (1990, pp. 239-241). The chi-square for the null model was 1094 (d.f. = 105).
1996 182 MIS Quarterly/June

Information Instrument Privacy

Table 5. Final Instrument - Factor Analysis

Collection factor loadings: J = .861, E = .856, A = .855, O = .762.
Errors factor loadings: F = .864, H = .816, L = .811, B = .679.
Unauthorized Secondary Use factor loadings: K = .778, M = .768, G = .719, C = .717.
Improper Access factor loadings: N = .773, D = .771, I = .719.

These results are from a sample of 147 business graduate students from Fall 1992. All loadings above .40 are listed above. Interitem reliabilities (Cronbach's alpha): Collection, .88; Errors, .84; Secondary Use, .80; Improper Access, .75.

First, convergence implies that all within-construct relations are both high and of approximately the same magnitude (Fornell and Larcker, 1981). To assess this aspect of convergent validity, the fit of the internal structure of the model (as discussed above), factor loading size, and significance were assessed. Bagozzi and Yi (1991) suggest weak evidence of convergent validity results when the factor loading on an item of interest is significant. Strong evidence is achieved when the squared factor loading is greater than .5 (more than half of the total variation in the measures is due to the trait). As shown in Table 6, standardized factor loadings (SFL) for all measures are greater than .6; all are statistically significant at p<.05. In addition, 12 of the 15 items have squared factor loadings greater than .5.

Second, convergent validity can be assessed in terms of the degree to which the four subscales (which might be considered four different measures of concern) are correlated (Bagozzi, 1980; Barki and Hartwick, 1994). Thus, to further assess convergent validity, the correlations between the subscales were examined. As shown in Table 7, the correlations between the dimensions are all significantly different from zero (p<.05). This suggests that the four dimensions are all measuring some aspect of the same construct (are not orthogonal).

To assess discriminant validity, the subscales must be examined to ensure they are not perfectly correlated (correlations equal to 1). As shown in Table 7, all subscale correlations are significantly different from one (p<.05). This suggests that while the subscales are measuring aspects of the same construct, they are measuring unique dimensions of that construct. Further, more rigorous evidence of discriminant validity is also observed by looking at the average variance extracted (AVE) by each factor relative to that factor's shared variance with other factors in the model (see Fornell and Larcker, 1981). (AVE measures the amount of variance that is captured by the construct in relation to the


amount of variance due to measurement error.) In every case, the AVE associated with a factor (see Table 6) is greater than the shared variance (squared correlation) between that and every other factor (see Table 7).
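The Fornell and Larcker (1981) check just described can be computed directly from standardized loadings and factor correlations. The sketch below is illustrative only: it uses the Collection and Errors loadings from Table 6 and the .216 Collection-Errors correlation from Table 7, and it assumes each item's error variance equals one minus its squared loading, so the resulting AVE values can differ slightly from the LISREL-based figures reported in Table 6.

```python
import numpy as np

def ave(standardized_loadings) -> float:
    """Average variance extracted: sum of squared loadings over
    (sum of squared loadings + sum of error variances)."""
    lam2 = np.square(standardized_loadings)
    errors = 1.0 - lam2              # assumes error variance = 1 - loading^2
    return lam2.sum() / (lam2.sum() + errors.sum())

# Standardized loadings for the Collection and Errors factors (Table 6).
collection = np.array([0.816, 0.743, 0.870, 0.765])
errors = np.array([0.700, 0.709, 0.720, 0.900])

# Fornell-Larcker criterion: each factor's AVE should exceed the squared
# correlation between the two factors (.216 for Collection-Errors, Table 7).
shared = 0.216 ** 2
print(ave(collection) > shared, ave(errors) > shared)
```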

Concurrent Validity

Concurrent validity is considered when one test is proposed as a substitute for another or

if the test is shown to correlate with some useful criterion (e.g., another test) administered at the same time (Cronbach and Meehl, 1955; Bagozzi, 1981). To assess concurrent validity of this instrument, the correlation between responses to the current instrument and responses to previously utilized public opinion survey questions was assessed. These public opinion surveys (see Cambridge Reports, 1989; Equifax, 1990; 1991; 1992; 1993; Katz and Tassone, 1990) typically used questions

Table 6. Summary of Parameter Estimates of Four-Factor Model, Graduate Business Students (n = 147)

Collection: items A (SFL .816), E (.743), J (.870), O (.765); CR = .88; AVE = .69.
Errors: items B (SFL .700), F (.709), H (.720), L (.900); CR = .85; AVE = .58.
Secondary Use: items C (SFL .726), G (.669), K (.646), M (.781); CR = .80; AVE = .50.
Improper Access: items D (SFL .652), I (.733), N (.721); CR = .75; AVE = .50.

Overall model fit: Chi-Square = 174 (84 d.f.); NCNFI = .91; RMR = .065; Coeff. of Determination = .996.

Legend: SFL = Standardized Factor Loading; CR = Composite Reliability; AVE = Average Variance Extracted; NCNFI = Non-centralized Normed Fit Index; RMR = Root Mean-squared Residual.


Table 7. Factor Intercorrelations of Four-Factor Model

Correlations are shown with the squared correlation in brackets and the standard deviation of the correlation in parentheses.

Collection with Errors: .216 [.047] (.09)
Collection with Unauthorized Secondary Use: .425 [.181] (.08)
Collection with Improper Access: .264 [.070] (.10)
Errors with Unauthorized Secondary Use: .448 [.201] (.08)
Errors with Improper Access: .611 [.373] (.07)
Unauthorized Secondary Use with Improper Access: .641 [.411] (.08)

All factor intercorrelations are significantly different from zero (p<.05) and from one (p<.05). These results are from a sample of 147 business graduate students from Fall 1992.

that have been subjected to limited validation procedures, and the questions address general, unidimensional privacy concerns. Nevertheless, it is expected that a strong relationship should exist between a subject's responses to those questions and the scale developed in this research. To check this, three questions that had been used on previous public opinion surveys were included with our instrument, and this combined survey was administered to a sample of 354 U.S.-based members of the Information Systems Audit and Control Association (ISACA).16 The three questions were:

1. "Compared with other subjects on your mind, how important is personal privacy?" (Cambridge Reports, 1989).

2. "How concerned are you about threats to your personal privacy today?" (Equifax, 1990; 1991; 1992; 1993).

3. "As computer usage increases in business and the general society, more and more
16 ISACA members were asked to respond "as an individual" and to give their "personal opinions." In a later section ("Generalizability"), results of a CFA assessment of the ISACA members' responses to our instrument are described.

Correlations between each subject's responses to these questions and their overall score on our instrument were .35, .36, and .46 for items (1), (2), and (3), respectively (p < .001 for all three). The expected relationship between public opinion survey questions and the current instrument was observed.
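The concurrent validity computation itself is a simple bivariate correlation between each respondent's overall instrument score (the mean of the 15 items) and a public opinion item. A minimal sketch follows; the array names and the randomly generated responses are hypothetical stand-ins for the actual survey data.

```python
import numpy as np

rng = np.random.default_rng(0)
items = rng.integers(1, 8, size=(354, 15)).astype(float)   # 15 instrument items, 7-point scales
criterion = rng.integers(1, 8, size=354).astype(float)      # one public opinion question

overall = items.mean(axis=1)                 # overall concern score per respondent
r = np.corrcoef(overall, criterion)[0, 1]    # Pearson correlation with the criterion item
print(f"correlation with criterion item: {r:.2f}")
```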

Nomological Validity

Nomological validity refers to the extent to which predictions based on the construct being measured are confirmed within a wider theoretical context or network of constructs (Bagozzi, 1981; Cronbach, 1971; Cronbach and Meehl, 1955). Not often tested in IS research (see, however, Straub, et al., 1995), nomological validity examines the robustness of the constructs as they interrelate with one another. To assess nomological validity, we considered (1) some possible antecedents that might affect levels of information privacy concern, (2) individual factors that might have


theoretical relationships to levels of concern, and (3) some future behavioral intentions that should be associated with levels of concern.

Antecedents: Two theoretically plausible "causal" variables were assessed. It has been suggested that (a) previous personal experiences may impact one's concerns about information privacy (Culnan, 1993; Stone and Stone, 1990), and (b) media coverage may increase the level of concern about information privacy (Westin, 1990). These assertions lead to a reasonable set of propositions. It was proposed that individuals who had been exposed to, or been the victim of, personal information misuses should have stronger concerns regarding information privacy. To that end, 77 business graduate students from two geographically dispersed U.S. universities were asked to complete our instrument and were also asked to answer the following questions (on seven-point Likert scales): (1) "How often have you personally been the victim of what you felt was an improper invasion of privacy?" and (2) "How much have you heard or read during the last year about the use and potential misuse of computerized information about consumers?"17 The first question examines, to some degree, the respondent's perception of his or her own experiences with respect to information handling. The latter question examines the respondent's level of knowledge regarding collection and use of personal information. Results of regression analyses, with overall concern as the dependent variable and experience and knowledge as independent variables, strongly support these research propositions, with beta coefficients of .16 and .22, respectively (p < .01 for both).

17 Both of these questions were patterned after those in Equifax (1990).
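The regression just described can be sketched as follows. This is not the authors' original analysis script; the variable names and simulated responses are hypothetical, and the code only illustrates how standardized beta coefficients of the kind reported (.16 and .22) would be obtained.

```python
import numpy as np

def standardized_betas(y, X):
    """OLS on z-scored variables, so the slopes are standardized betas."""
    zy = (y - y.mean()) / y.std(ddof=1)
    zX = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    design = np.column_stack([np.ones(len(zy)), zX])   # intercept column (~0 after z-scoring)
    betas, *_ = np.linalg.lstsq(design, zy, rcond=None)
    return betas[1:]

rng = np.random.default_rng(1)                             # hypothetical n = 77 responses
experience = rng.integers(1, 8, size=77).astype(float)     # "victim of invasion of privacy" item
knowledge = rng.integers(1, 8, size=77).astype(float)      # "heard or read about misuse" item
overall_concern = rng.integers(1, 8, size=77).astype(float)

b_exp, b_know = standardized_betas(overall_concern,
                                   np.column_stack([experience, knowledge]))
print(f"beta(experience) = {b_exp:.2f}, beta(knowledge) = {b_know:.2f}")
```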

Individual Personality Factors: Prior research has suggested that information privacy concerns may also be associated with various personality factors (e.g., Berscheid, 1977; Cozby, 1973; Kelvin, 1973; Laufer and Wolfe, 1977; Levin and Askin, 1977; Stone, 1986; Warren and Laslett, 1977). Some factors that one might expect to be correlated with information privacy concerns include: (1) trust/distrust, (2) paranoia, and (3) social criticism. Therefore, the instrument was administered in conjunction with scales measuring the suggested related personality factors to separate samples of undergraduate business students, as follows.

It is argued that cynical distrust may be positively correlated with concern for information privacy, in that individuals with high levels of distrust may also be more concerned about the use and dissemination of their personal information. Examination of the responses to the "cynical distrust" and the "overall level of concern" scales supports this contention (n = 59,18 correlation = .30,19 p < .05).

Paranoia is a second personality trait argued to be positively correlated with concern for information privacy. It is plausible that individuals who are paranoid are also likely to be more concerned about the privacy of their personal information. When both the paranoia scale (Fenigstein and Vanable, 1992) and the information privacy concern scale were administered to undergraduates, a significant correlation was observed (n = 87, correlation = .37, p < .001).

The social criticism scale measures "the degree of acceptance or rejection of the values, norms, and practices of...society" (Jessor and Jessor, 1977). It is proposed that consumers who reject society's values, norms, and practices would also be highly concerned about information privacy. Correlational analysis of responses to the information privacy concern instrument and the social criticism scale showed support for this proposition (n = 83, correlation = .37,20 p < .001).
18 Students in this sample were also used in the test-retest reliability exercise (see section below). They responded to the "cynical distrust" scale following the "retest" of the privacy concern instrument. The reported correlation utilizes the "test" score for the privacy concern instrument (completed eight weeks earlier).

19 The correlations in this section refer to our OVERALL scale.

20 This correlation is reported as an absolute value; its direction is consistent with the proposition.



Future Behavioral Intentions: Previous research (e.g., Stone, et al., 1983) suggests that individuals with higher levels of concern about information privacy practices may be more likely in the future to refuse to participate in activities that require the provision of personal information. They may also be more likely to contact official agencies or companies regarding information practices. To provide a preliminary test of such assertions, the 77 business graduate students from two geographically dispersed U.S. universities (see "Antecedents" section above) were given, in addition to the information privacy instrument, a set of six items that investigated such future behavioral intentions with respect to information privacy.21 The six items exhibited a high level of interitem reliability (Cronbach's alpha = .87). A mean score for the six items was correlated with the overall scale score for our instrument. The research proposition suggesting that higher levels of information privacy concern will be associated with stronger intentions to take privacy-related actions was strongly supported (correlation = .33, p < .01).
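For readers who wish to replicate the interitem reliability figure, Cronbach's alpha for a set of items can be computed as below. The function is a generic sketch; the simulated response matrix is a hypothetical stand-in for the six behavioral-intention items.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents x k_items) matrix of Likert responses."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of individual item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed scale
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(2)
intentions = rng.integers(1, 8, size=(77, 6)).astype(float)   # hypothetical 77 x 6 responses
print(f"Cronbach's alpha = {cronbach_alpha(intentions):.2f}")
```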

Reliability

The reliability of the instrument was assessed by evaluating (1) internal consistency and (2) test-retest reliability.

Internal Consistency

Internal consistency examines the degree to which the "items used to assess a construct reflect a true, common score for the construct" (Bagozzi, 1980; Barki and Hartwick, 1994). To assess internal consistency in this research, two measures were calculated in addition to factor loadings: (1) composite reliability (CR) of the dimension measures and (2) AVE from the dimension measures. CR considers the ratio of non-random variation associated with all measures of a subscale to total variation in all these measures. As shown in Table 6, CRs for the dimension measures are all quite high and well above a .6 rule of thumb of acceptability (Bagozzi and Yi, 1988). AVE, as described earlier, measures the amount of variance captured by the construct in relation to the amount of variance attributed to measurement error. If AVE is less than .5, the variance associated with measurement error is larger than the variance captured by the construct, and the construct reliability is questionable. As shown in Table 6, AVEs are all at or above .5, which is a rule of thumb for adequacy of this measure (Bagozzi and Yi, 1988). Thus, the measures of internal reliability and structure fit all surpass the minimum standards of adequacy.
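For reference, one common formulation of these two indices, which we believe corresponds to the quantities reported in Table 6 (following Fornell and Larcker, 1981, and Bagozzi and Yi, 1988), is the following, where the lambda_i are the standardized factor loadings of a subscale's k items and theta_i = 1 - lambda_i^2 are the corresponding error variances:

```latex
\mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}
                   {\left(\sum_{i=1}^{k}\lambda_i\right)^{2} + \sum_{i=1}^{k}\theta_i},
\qquad
\mathrm{AVE} = \frac{\sum_{i=1}^{k}\lambda_i^{2}}
                    {\sum_{i=1}^{k}\lambda_i^{2} + \sum_{i=1}^{k}\theta_i}
```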

21 Respondents were asked how likely they were, within the next three years, to: (1) "decide not to apply for something, like a job, credit, or insurance, because you do not want to provide certain kinds of information about yourself," (2) "refuse to give information to a business or company because you think it is too personal," (3) "take action to have your name removed from direct mail lists for catalogs, products, or services," (4) "write or call a company to complain about the way it uses personal information," (5) "write or call an elected official or consumer organization to complain about the way companies use personal information," and (6) "refuse to purchase a product because you disagree with the way a company uses personal information." Each was followed by a seven-point Likert scale anchored by "very likely" and "very unlikely." Some of these questions were patterned after those in Equifax (1990).

Test-Retest Reliability

Test-retest reliability examines an instrument's ability to achieve stable responses from a single sample over time (Churchill, 1979). To assess the test-retest reliability of our instrument, it was administered on two separate occasions to a single sample of undergraduate business students. Specifically, 123 students (of 186 total) responded to both the "test" and "retest" of the instrument, which were separated by a period of eight weeks. Correlations for these repetitions for the four subscales ranged from .63 to .74, and the correlation for the overall scale was .78 (p < .001 for all), which is in line with acceptable levels reported in prior, similar scale development research (Bearden, et al., 1993).


Test-retest correlations for individual items ranged from .39 to .66 (p < .001 for all).

Generalizability

To achieve its full usefulness, an instrument should be applicable to "other subjects, other groups, and other conditions" (Kerlinger, 1986, p. 299). Such a concern is included under the rubric of "external validity," which is defined as the "persons, settings, and times to which findings can be generalized" (Straub, 1989). While our instrument was initially based upon input from numerous sources (as described in Stage 1), it must also be validated with different populations. To achieve generalizability of the instrument, it was administered to, and validated with, two diverse sample populations in addition to the sample of graduate business students: undergraduate business students (n = 186) and U.S.-based members of the ISACA (n = 354).22 As can be seen in Table 8, the results of CFA analyses on data from these samples support the validity and reliability of the instrument across these populations as well. Specifically, the validation of the instrument across two groups as dissimilar as undergraduate students (who have, arguably, a low level of understanding regarding actual industry practices) and IS auditors (who, arguably, should represent a population with high on-the-job knowledge) stands as strong evidence of the instrument's generalizability (Gordon, et al., 1986).
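In procedural terms, the generalizability assessment amounts to estimating the same four-factor measurement model separately in each sample and inspecting the resulting fit statistics, as in Table 8. The authors used LISREL VI; the sketch below uses the open-source semopy package purely as an assumed stand-in, and the data frames (along with the item column labels A through O taken from Tables 6 and 8) are hypothetical placeholders.

```python
import pandas as pd
import semopy  # assumed open-source substitute for the LISREL VI software used by the authors

# Four-factor measurement model, items labeled as in Tables 6 and 8.
MODEL_DESC = """
Collection     =~ A + E + J + O
Errors         =~ B + F + H + L
SecondaryUse   =~ C + G + K + M
ImproperAccess =~ D + I + N
"""

def fit_sample(responses: pd.DataFrame, label: str) -> None:
    """Fit the CFA model to one sample and print its fit statistics."""
    model = semopy.Model(MODEL_DESC)
    model.fit(responses)                # estimation on the raw item responses
    print(label)
    print(semopy.calc_stats(model).T)   # chi-square, df, CFI, RMSEA, etc.

# Hypothetical usage, one call per validation sample:
# fit_sample(undergrad_df, "Undergraduate business students (n = 186)")
# fit_sample(isaca_df, "IS auditors (n = 354)")
```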

Additional Findings

The relationships between the subscales and individuals' response patterns seem to provide additional insights into the underlying nature of the information privacy concern construct. As can be seen in Table 9, there may be a hierarchy of concern regarding the various dimensions. It was observed that the highest levels of concern were associated with Improper Access and Unauthorized Secondary Use. Lower levels of concern were associated with Collection and Errors. Within these categories, however, there seem to be some distinctions between samples (see Table 9). For example, ISACA members ranked Unauthorized Secondary Use as their top concern, while the other respondents indicated more concern about Improper Access.

22 The ISACA sample was also utilized for some tests of nomological validity; this student sample was also utilized for the "test-retest" evaluation and for one test of nomological validity (correlation with "cynical distrust") (see Tables 3a and 3b).

Discussion
This study provides two major contributions to the privacy literature: (1) a framework describing the primary dimensions of individuals' concerns about organizational information privacy practices and (2) a validated instrument for measuring those concerns. The development process included examinations of privacy literature and U.S. laws; experience surveys and focus groups; and the use of expert judges. The result was a parsimonious 15-item instrument with four subscales tapping into dimensions of individuals' concerns about organizational information privacy practices. The instrument was rigorously tested and validated across several heterogeneous populations, providing a high degree of confidence in the scales' validity, reliability, and generalizability.

Before considering implications for researchers and managers, two limitations of this study should be noted. First, all scale development processes require a number of "judgment calls" by researchers based on their analysis of the literature; on input from experience surveys, focus groups, and expert judges; and on levels of acceptability for various statistical measures. In particular, based on input from various sources, we concluded that one of the dimensions, Combining Data, was actually represented by two of the other dimensions, Unauthorized Secondary Use (External) and Collection. We also concluded that Reduced Judgment was not a part of the major "individuals' concerns about organizational information privacy practices" construct.




Table 8. LISREL Results for Generalizability

                                      Undergraduate Business          IS Auditors
                                      Students (n = 186)              (n = 354)
Factor                        Item    SFL      CR      AVE            SFL      CR      AVE
Collection                    A       .675     .81     .53            .759     .86     .60
                              E       .559                            .772
                              J       .941                            .878
                              O       .689                            .667
Errors                        B       .637     .87     .64            .647     .85     .59
                              F       .864                            .841
                              H       .775                            .698
                              L       .890                            .857
Unauthorized Secondary Use    C       .691     .82     .54            .733     .84     .56
                              G       .636                            .726
                              K       .671                            .693
                              M       .898                            .838
Improper Access               D       .691     .82     .65            .785     .80     .58
                              I       .754                            .598
                              N       .877                            .880

Fit statistics (Undergraduates / IS Auditors): Chi-Square 330 (84 df) / 139 (84 df); NCNFI .91 / .96; RMR .074 / .063; Coeff. Determ. .998 / .998

Legend: SFL = Standardized Factor Loading; CR = Composite Reliability; AVE = Average Variance Extracted; NCNFI = Non-centralized Normed Fit Index; RMR = Root Mean-squared Residual

Table 9. Subscales

Subscale                      Mean (S.D.) for MBAs    Mean (S.D.) for Undergraduates    Mean (S.D.) for ISACA Members
                              (n = 146)               (n = 183)                         (n = 337)
Collection                    5.28 (1.19)             5.11 (1.04)                       5.45 (1.16)
Errors                        5.36 (1.06)             5.57 (.99)                        5.46 (1.11)
Unauthorized Secondary Use    5.74 (1.14)             5.77 (1.22)                       6.15 (1.07)
Improper Access               5.83 (1.01)             6.10 (.89)                        5.90 (1.01)
OVERALL                       5.56 (.83)              5.63 (.78)                        5.74 (.86)

Larger means are associated with higher levels of concern (see Table 1).



Both of these assessments appear to be consistent with the majority views of scholars, consumers, and advocates, and the four-dimension model of the construct seems to reflect current thinking. We acknowledge, however, that this dimensionality is neither absolute nor static, since perceptions of advocates, consumers, and scholars could shift over time. Thus, the instrument should be viewed as measuring the most central dimensions of the construct at this time. Future research endeavors might consider any changes in these dimensions that may occur.

Second, while we made a concerted effort to validate the instrument in a nomological model of antecedents, individual personality variables, privacy concerns, and future behavioral intentions (Stage 3), it should be noted that this nomological model is not purported to be an exhaustive one, nor did we test it in an experimental, causal context. Indeed, theories regarding the interrelationships between privacy concerns and other constructs are not fully developed in the literature at present, and the creation of a full model is a task appropriate for a subsequent study. Furthermore, because of the constraints of time and length associated with administration of written surveys, we were unable to test all the antecedents, personality variables, and behavioral intentions with a single sample. Despite these limitations, this work has significant implications for both researchers and managers. We examine each in the following sections.

Implications for researchers

Like many other areas within the IS domain, little attention has been paid to instrumentation issues in privacy research. Now, with a validated instrument for measuring individuals' concerns about organizational information privacy practices, researchers can undertake studies to carefully examine the links between relevant privacy-related variables, privacy concerns, and outcomes of those concerns.

Theoretical models (see Stone and Stone, 1990) often posit theoretical relationships that include privacy concern as one of the model's constructs. With respect to factors that may impact levels of concern, it has been suggested that concerns may be context-sensitive based on either the type of information being managed (Culnan, 1993; Stone and Stone, 1990) or the type of organization collecting and storing the data (Stone, et al., 1983). Concerns may also be associated with numerous personality factors and demographic data (see review in Stone and Stone, 1990). Further, some public opinion survey findings (Equifax, 1990) suggest that levels of concern on some subscales may be lower for professionals with day-to-day exposure to information processing activities.

There may also be factors that are impacted by levels of concern. It has been asserted that individuals may take a variety of different actions based on their levels of concern, such as "opting out" of various activities (Culnan, 1993; Stone and Stone, 1990). Furthermore, perceptions of organizational privacy policies and practices may be related to levels of employee concern (Smith, et al., 1995), and levels of concern may also be associated with different cultural values and regulatory structures in various countries (Milberg, et al., 1995). It is clear that a significant research stream could emerge from empirical tests of the relationships between the antecedents, associated factors, levels of concern, and outcomes.

As suggested by the discussion in the previous paragraph, most of this instrument's usefulness will come from its application in positivist research, in particular the development and testing of theories that take the form of independent and dependent variables (Lee, 1991). But the instrument may also assist a researcher in conducting interpretive research on what the meaning of information privacy is for the individuals themselves in an organization, apart from or prior to whatever a positivist theory would define it to be (Lee, 1991). Full understanding of a phenomenon is best achieved not through any singularity in approach, but rather, through iterative cycles




of positivist and interpretive research (Lee, 1991).23 To the extent that researchers confirm some of the theoretical linkages in positivist approaches (e.g., by showing that individuals exhibit higher levels of concern when stimulus materials prompt them to think about medical data rather than financial data), these findings may then feed back to interpretive studies (e.g., field studies that examine different approaches to managing medical data and managers' perceptions of differing responsibility levels). As privacy increases in importance, it behooves the IS research community to carefully consider the complexity of individuals' concerns, the factors that may cause increased levels of concern, and the outcomes of those concerns. The instrument developed in this study should enable important future work in this area.

23 Of course, the choice of research approach(es) is highly contextual and depends on the type of research question being asked (Yin, 1988), the findings from previous studies (Bonoma, 1985), and the levels of understanding regarding the phenomenon of interest (Lee, 1991). See Bonoma (1985), Lee (1991), Orlikowski and Baroudi (1991), and Yin (1988) for a broad discussion of the relationships between research approaches.

Implications for managers

This study, which identified the most central dimensions of individuals' concerns about organizational information privacy practices, can serve as the first step on a path of proactive management. By carefully considering their own organizations' approaches to the four major dimensions of concern (Collection, Errors, Unauthorized Secondary Use, and Improper Access), managers can identify underlying problems and take corrective actions as appropriate. Table 10 contains a set of possible recommendations that might be embraced for each of the dimensions. As an example, IS professionals can address secondary use issues by identifying the secondary uses of data within their organizations and ensuring that the appropriate technological approaches are being adopted for tracking purposes.



It is acknowledged that IS managers and executives will seldom be in a position to unilaterally correct all the organizational problems in these domains, since they are likely to involve some degree of existing organizational policy. Changing existing policy will demand attention from general managers at a senior level. However, IS professionals can be aggressive in challenging organizational policies for sharing personal data with outside organizations, and they may insist on tighter interpretations of the "need to know" when organizational policies regarding access are constructed. By taking a proactive stance in managing these dimensions of concern, IS managers and executives may reduce the probability that onerous regulatory options will be pursued (see Milberg, et al., 1995; Smith, 1994). Research has shown that increased concerns about information privacy are associated with increased levels of governmental involvement in organizational privacy management (Milberg, et al., 1995), but so far, managers have been primarily reactive in addressing information privacy concerns (Smith, 1994). Managers should be alert to the value-laden choices that are made by systems designers and implementers (Kling, 1978; Mowshowitz, 1976), because these choices can ultimately impact the privacy domain and reactions thereto. This study, along with future research addressing the antecedents and consequences of various concerns, may allow managers to evaluate specific situational contexts and manage responses to information management practices, thus avoiding costly consumer and/or regulatory backlashes.

Acknowledgements
The three authors contributed equally on this research. We gratefully acknowledge several individuals for their assistance in administering versions of the survey instrument: Tom Cooke, Elizabeth Cooper-Martin, Mary Culnan, Bill DeLone, Mark Keil, Mike McCarthy, Keri Pearlson, Craig Smith, Bob Thomas, Suzie



Table 10. Recommendations to IS Community

Area of Concern: Improper Access
  Recommended Actions Within IS Domain:
    * Implement technological controls on access to systems
    * Ensure that applications are designed so that access can be restricted to narrowest domains possible
  Recommended Actions in Broader Organizational Domain:
    * Lobby for organizational policies with a tight definition of "need to know"
    * Challenge liberal interpretations of "need to know"

Area of Concern: Unauthorized Secondary Use
  Recommended Actions Within IS Domain:
    * Ensure that all internal uses of personal data can be tracked
    * Challenge internal uses of personal data that are outside "intended use" boundaries
  Recommended Actions in Broader Organizational Domain:
    * Lobby for clear organizational policies on "intended use" for personal data
    * Refuse to release personal data to outside entities without explicit senior management approval
    * Lobby for organizational policies restricting outside sharing of personal data

Area of Concern: Errors
  Recommended Actions Within IS Domain:
    * Ensure that applications are designed with appropriate edit techniques
  Recommended Actions in Broader Organizational Domain:
    * Identify tradeoffs regarding error controls to senior management; ensure informed decision making

Area of Concern: Collection
  Recommended Actions Within IS Domain:
    * Practice parsimonious database design
    * Challenge excessive collection of personal data within organization
  Recommended Actions in Broader Organizational Domain:
    * Lobby for organizational policy that limits data collection to minimal levels required for business




Weisband, and Berry Wilson. We also acknowledge with gratitude the organizations that supported this survey research, including an anonymous bank, two anonymous insurance organizations, an anonymous credit card issuer, the Information Systems Audit and Control Association (ISACA), and the Georgetown University Center for Business-Government Relations. Ernest Kallman is especially acknowledged for his assistance in much of the data collection. We also appreciate the data entry help provided by Emmy Curtis, Debra Miller, and Shirmel Richards. Mary Culnan provided helpful comments on an earlier version of this paper. We also gratefully acknowledge the senior editor, associate editor, and five anonymous referees for their insightful comments on an earlier draft.

References
ACM (Association for Computing Machinery). "Code of Ethics," Communications of the ACM (23:7), July 1980, p. 425.
Bagozzi, R. P. Causal Modeling in Marketing, John Wiley and Sons, New York, 1980.
Bagozzi, R. P. "An Examination of the Validity of Two Models of Attitude," Multivariate Behavioral Research, July 1981, pp. 323-359.
Bagozzi, R. P. "A Holistic Methodology for Modeling Consumer Response to Innovation," Operations Research, January-February 1983, pp. 128-176.
Bagozzi, R. P. "Assessing Construct Validity in Personality Research: Applications to Measures of Self-Esteem," Journal of Research in Personality (27:1), March 1993, pp. 49-87.
Bagozzi, R. P. and Phillips, L. W. "Representing and Testing Organizational Theories: A Holistic Construal," Administrative Science Quarterly (27:3), 1982, pp. 459-489.
Bagozzi, R. P. and Yi, Y. "On the Evaluation of Structural Equation Models," Journal of the Academy of Marketing Science (16), Spring 1988.


Bagozzi, R. P. and Yi, Y. "Multitrait-Multimethod Matrices in Consumer Research," Journal of Consumer Research (17:4), March 1991, pp. 426-439.
Barki, H. and Hartwick, J. "Measuring User Participation, User Involvement, and User Attitude," MIS Quarterly (18:1), March 1994, pp. 59-82.
Bearden, W. O., Netemeyer, R. G. and Mobley, M. F. Handbook of Marketing Scales, Sage Publications, Newbury Park, CA, 1993, pp. 3-8.
Bennett, C. J. Regulating Privacy: Data Protection and Public Policy in Europe and the United States, Cornell University Press, Ithaca, NY, 1992.
Bentler, P. M. "Comparative Fit Indexes in Structural Models," Psychological Bulletin (107:2), 1990, pp. 238-246.
Bentler, P. M. and Bonett, D. "Significance Tests and Goodness of Fit in the Analysis of Covariance Structures," Psychological Bulletin (88:3), November 1980, pp. 588-606.
Berscheid, E. "Privacy: A Hidden Variable in Experimental Social Psychology," Journal of Social Issues (33:3), 1977, pp. 85-101.
Bonoma, T. V. "Case Research in Marketing: Opportunities, Problems, and a Process," Journal of Marketing Research (XXII), May 1985, pp. 199-208.
Cambridge Reports. "Technology and Consumers: Jobs, Education, Privacy," Bulletin on Consumer Opinion no. 157, Cambridge, MA, 1989.
Campbell, D. T. and Fiske, D. W. "Convergent and Discriminant Validation by the Multitrait-Multimethod Matrix," Psychological Bulletin (56:2), March 1959, pp. 81-105.
Cespedes, F. V. and Smith, H. J. "Database Marketing: New Rules for Policy and Practice," Sloan Management Review (34), Summer 1993, pp. 7-22.
Churchill, G. "A Paradigm for Developing Better Measures of Marketing Constructs," Journal of Marketing Research (XVI), February 1979, pp. 64-73.
Cook, T. D. and Campbell, D. T. Quasi-experimentation: Design and Analysis Issues for Field Settings, Rand McNally, Chicago, 1979.

Cote, J. A. and Buckley, M. R. "Measurement Error and Theory Testing in Consumer Research: An Illustration of the Importance of Construct Validation," Journal of Consumer Research (14), March 1987, pp. 579-582.
Cozby, P. C. "Self-disclosure: A Literature Review," Psychological Bulletin (79:2), February 1973, pp. 73-91.
Cronbach, L. "Test Validation," in Educational Measurement (2nd edition), R. L. Thorndike (ed.), American Council on Education, Washington, D.C., 1971, pp. 443-507.
Cronbach, L. J. and Meehl, P. E. "Construct Validity in Psychological Tests," Psychological Bulletin (52:4), July 1955, pp. 281-302.
Culnan, M. J. "'How Did They Get My Name?': An Exploratory Investigation of Consumer Attitudes Toward Secondary Information Use," MIS Quarterly (17:3), September 1993, pp. 341-363.
Cyert, R. M. and March, J. G. A Behavioral Theory of the Firm, Prentice Hall, New York, 1963.
Date, C. J. An Introduction to Database Systems (4th ed.), Addison-Wesley Publishing Company, Reading, MA, 1986.
Equifax Inc. The Equifax Report on Consumers in the Information Age, 1990. Also Harris-Equifax Consumer Privacy Survey 1991, Harris-Equifax Consumer Privacy Survey 1992, and Harris-Equifax Health Information Privacy Survey 1993. Equifax Inc., Atlanta, GA.
Fenigstein, A. and Vanable, P. A. "Paranoia and Self-Consciousness," Journal of Personality and Social Psychology (62:1), January 1992, pp. 129-138.
Fornell, C. and Larcker, D. F. "Evaluating Structural Equation Models with Unobservable Variables and Measurement Error," Journal of Marketing Research (18), February 1981, pp. 39-50.
Gordon, M. E., Slade, L. A., and Schmitt, N. "The 'Science of the Sophomore' Revisited: From Conjecture to Empiricism," Academy of Management Review (11:1), 1986, pp. 191-207.
HEW (U.S. Department of Health, Education, and Welfare). Records, Computers, and the Rights of Citizens: Report of the Secretary's Advisory Committee on Automated Personal Data Systems, U.S. Government Printing Office, Washington, D.C., 1973.


Jarvenpaa, S. L., Dickson, G. W., and DeSanctis, G. "Methodological Issues in Experimental IS Research: Experiences and Recommendations," MIS Quarterly (9:2), June 1985, pp. 141-156.
Jessor, R. and Jessor, S. Problem Behavior and Psychosocial Development, Academic Press, New York, 1977, pp. 234-235, as reproduced in Measures of Personality and Social Psychological Attitudes, J. P. Robinson, P. R. Shaver, and L. S. Wrightsman (eds.), Academic Press, San Diego, 1991, pp. 355-358.
Joreskog, K. and Sorbom, D. LISREL VI: Analysis of Linear Structural Relationships by the Maximum Likelihood and Least Squares Methods, Scientific Software, Mooresville, IN, 1984.
Katz, J. E. and Tassone, A. R. "Public Opinion Trends: Privacy and Information Technology," Public Opinion Quarterly (54), Spring 1990, pp. 125-143.
Kelvin, P. "A Socio-psychological Examination of Privacy," British Journal of Social Clinical Psychology (12:3), September 1973, pp. 248-261.
Kerlinger, F. N. Foundations of Behavioral Research (3rd ed.), Holt, Rinehart, and Winston, New York, 1986, pp. 477-483.
Kling, R. "Value Conflicts and Social Choices in Electronic Funds Transfer Systems Developments," Communications of the ACM (21:8), August 1978, pp. 642-657.
Ladd, J. "Computers and Moral Responsibility: A Framework for Ethical Analysis," in The Information Web: Ethical and Social Implications of Computer Networking, C. Gould (ed.), Westview Press, Boulder, CO, 1989.
Laudon, K. C. Dossier Society: Value Choices in the Design of National Information Systems, Columbia University Press, New York, 1986.
Laufer, R. S. and Wolfe, M. "Privacy as a Concept and a Social Issue: A Multidimensional Developmental Theory," Journal of Social Issues (33:3), 1977, pp. 22-41.

Lee, A. S. "Integrating Positivist and Interpretive Approaches to Organizational Research," Organizational Science (2:4), 1991, pp. 342-365.
Levin, H. A. and Askin, F. "Privacy in the Courts: Law and Social Reality," Journal of Social Issues (33:3), 1977, pp. 138-153.
Linowes, D. F. Privacy in America: Is Your Private Life in the Public Eye? University of Illinois Press, Urbana, IL, 1989.
Mason, R. O. "Four Ethical Issues of the Information Age," MIS Quarterly (10:1), March 1986, pp. 4-12.
McDonald, R. P. and Marsh, H. W. "Choosing a Multivariate Model: Noncentrality and Goodness of Fit," Psychological Bulletin (107:2), March 1990, pp. 247-255.
Milberg, S. J., Burke, S. J., Smith, H. J., and Kallman, E. A. "Values, Personal Information Privacy Concerns, and Regulatory Approaches," Communications of the ACM (38:12), December 1995, pp. 65-74.
Miller, A. "Computers and Privacy," in Ethics and the Management of Computer Technology, W. M. Hoffman and J. M. Moore (eds.), Oelgeschlager, Gunn, and Hain Publishers, Inc., Cambridge, MA, 1982.
Mowshowitz, A. The Conquest of Will, Addison-Wesley, Reading, MA, 1976.
Orlikowski, W. J. and Baroudi, J. J. "Studying Information Technology in Organizations: Research Approaches and Assumptions," Information Systems Research (2:1), 1991, pp. 1-28.
PPSC (Privacy Protection Study Commission). Personal Privacy in an Information Society: Report of the Privacy Protection Study Commission, U.S. Government Printing Office, Washington, D.C., 1977.
Smith, H. J. Managing Privacy: Information Technology and Organizational America, University of North Carolina Press, Chapel Hill, NC, 1994.
Smith, H. J., Milberg, S. J., and Kallman, E. A. "Privacy Practices Around the World: An Empirical Study," working paper, Georgetown University, Washington, D.C., 1995.
Stone, D. L. "Relationship Between Introversion/Extraversion, Values Regarding Control Over Information, and Perceptions of Invasion of Privacy," Perceptual and Motor Skills (62:2), April 1986, pp. 371-376.


Stone, E. F. and Stone, D. L. "Privacy in Organizations: Theoretical Issues, Research Findings, and Protection Mechanisms," in Research in Personnel and Human Resources Management (8), K. M. Rowland and G. R. Ferris (eds.), JAI Press, Greenwich, CT, 1990, pp. 349-411.
Stone, E. F., Gardner, D. G., Gueutal, H. G., and McClure, S. "A Field Experiment Comparing Information-Privacy Values, Beliefs, and Attitudes Across Several Types of Organizations," Journal of Applied Psychology (68:3), August 1983, pp. 459-468.
Straub, D. W. "Validating Instruments in MIS Research," MIS Quarterly (13:2), June 1989, pp. 146-169.
Straub, D. W., Jr. and Collins, R. W. "Key Information Liability Issues Facing Managers: Software Piracy, Proprietary Databases, and Individual Rights to Privacy," MIS Quarterly (14:2), June 1990, pp. 142-156.
Straub, D. W., Limayem, M., and Karahanna, E. "Measuring System Usage: Implications for IS Theory Testing," Management Science (41:8), August 1995, pp. 1328-1342.
Tolchinsky, P. D., McCuddy, M. K., Adams, J., Ganster, D. C., Woodman, R. W., and Fromkin, H. L. "Employee Perceptions of Invasion of Privacy: A Field Simulation Experiment," Journal of Applied Psychology (66:3), June 1981, pp. 308-313.
Warren, C. and Laslett, B. "Privacy and Secrecy: A Conceptual Comparison," Journal of Social Issues (33:3), 1977, pp. 43-51.
Westin, A. F. Privacy and Freedom, Atheneum Publishers, New York, 1967.
Westin, A. F. "Consumer Privacy Issues in the Nineties," in The Equifax Report on Consumers in the Information Age, Equifax Inc., Atlanta, GA, 1990, pp. XVIII-XXVIII.
Westin, A. F. and Baker, M. A. Databanks in a Free Society, Quadrangle Books, New York, 1972.
Yin, R. K. Case Study Research: Design and Methods, Sage Publications, Beverly Hills, CA, 1988.

About the Authors


H. Jeff Smith is associate professor, School of Business, Georgetown University, Washington, D.C. He holds B.S. degrees in computer science and mathematics from North Carolina State University; an M.B.A. degree from the University of North Carolina at Chapel Hill; and a D.B.A. degree from Harvard University. His research focuses on the social issues created by the use of emerging technologies. His research has been published in Communications of the ACM and Sloan Management Review. He is the author of Managing Privacy: Information Technology and Corporate America, published by the University of North Carolina Press.

Sandra J. Milberg is assistant professor, School of Business, Georgetown University, Washington, D.C. She holds a B.A. degree in sociology from Washington University, St. Louis; an M.S. degree in marketing from Carnegie Mellon University, Pittsburgh; and a Ph.D. degree in business administration from the University of Pittsburgh. Her research focuses on consumer privacy issues, brand equity, and the roles of affect and cognition in attitude formation and choice behavior. Her research has been published in Communications of the ACM, Journal of Consumer Research, Journal of Personality and Social Psychology, and Journal of Experimental Social Psychology.

Sandra J. Burke is assistant professor, School of Business, Georgetown University, Washington, D.C. She holds a B.A. degree in economics from Michigan State University, and M.B.A. and Ph.D. degrees in marketing from The University of Michigan. Her research focuses on ethical/privacy issues in marketing, consumer information processing and decision making, and consumer inference use and formation. Her research has been published in Communications of the ACM, Advances in Consumer Research, and Journal of Behavioral Decision Making.


Appendix
Confirmatory Factor Analysis
Because an a priori hypothesis is tested, CFA has several advantages over traditional methods of scale validation (Bagozzi, 1983). CFA (1) provides explicit measures to assess construct validity and to correct for the unreliability of measures that can contaminate theoretical relations, (2) explicitly represents the extent of measurement error, and (3) overcomes the fundamental indeterminacy (problem of non-unique solutions) of exploratory factor analysis. To be more specific, in models where sequences of relationships occur, it is important to explicitly represent and control for systematic and random errors in measurement. Failure to do so can lead to biased and inconsistent estimates of parameters. Furthermore, most procedures that employ the measurements obtained from scale administrations (e.g., correlations, regression, ANOVA) implicitly assume the absence of random and systematic errors in observations. Yet, when Cote and Buckley (1987) applied CFA techniques to 70 published data sets, they found that measurement error, on average, accounted for 32 percent of total variance. CFA goes beyond traditional validation methods, in that theoretical concepts, non-observational hypotheses, and errors are explicitly assessed.

Furthermore, while oblique or orthogonal exploratory factor analyses are traditionally used in scale validation, neither procedure yields a unique solution in a statistical sense. Once a set of factors is found, an infinite number of other equally acceptable factors can be formed as non-singular linear transformations of the first set (Bagozzi, 1983). Again, if the researcher attempts to interpret the factors, use the loadings for further analysis, or compute scores to test hypotheses, this implicit non-uniqueness can cause problems. CFA yields a unique solution on an a priori basis. A researcher hypothesizes a model and then tests the goodness-of-fit of the model on a particular set of data. In addition, CFA is used to assess the overall fit of this model versus the fit with other models reflecting alternative underlying structures of this construct to assess validity.
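As a compact illustration of the point about measurement error, the congeneric measurement model underlying the CFA can be written, in standardized form, as below; here the squared loading is the proportion of an item's variance attributable to the latent trait and theta_i the proportion attributable to error, which is the decomposition used in the convergent validity and AVE discussions above. This is a generic statement of the model, not a reproduction of the authors' LISREL specification.

```latex
x_i = \lambda_i \xi + \delta_i,
\qquad
\operatorname{Var}(x_i) = \lambda_i^{2} + \theta_i = 1,
\qquad
\theta_i = \operatorname{Var}(\delta_i)
```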

