Vitalari, Nicholas P. "Knowledge as a Basis for Expertise in Systems Analysis: An Empirical Study," MIS Quarterly, Volume 9, Number 3, September 1985, pp. 221-241. Published by: Management Information Systems Research Center, University of Minnesota. Stable URL: http://www.jstor.org/stable/248950

Knowledge as a Basis for Expertise in Systems Analysis: An Empirical Study


By: Nicholas P. Vitalari, Graduate School of Management, University of California at Irvine, Irvine, California

Abstract

This study explores the content of the systems analyst's knowledge base. The study utilizes a protocol analysis research methodology in a quasi-experimental setting to define the types of knowledge used by 18 experienced systems analysts in solving an accounts receivable problem. The study provides preliminary findings regarding the core knowledge utilized by systems analysts, and differences in the knowledge utilized by high- and low-rated analysts.

Keywords: Systems analysis, information requirements determination, systems analyst's knowledge base, problem solving behavior, systems development.
ACM Categories: D.2.1, D.2.2, K.6.1, K.7.0.

Introduction

An epistemology for systems analysis has evolved slowly. The progression of systems analysis practice has been punctuated by techniques and methods that ebb and flow with rapid frequency. Few of the concepts have been empirically tested, and real world practice varies widely. Even among the experts a wide divergence of views is expressed. As computer systems become widely used, this lack of a well formed base of knowledge for systems analysis poses practical problems for the aspiring systems analyst, selection and training problems for the information systems manager, and policy issues for the systems analysis profession.

Systems analysis can be viewed as a special form of human inquiry. In its most general case, the process of systems analysis seeks to understand the world for the purpose of rearranging that world into structures and processes. In the specific case of systems analysis for computerized information systems, the nature of inquiry focuses on the role of information in the operating world. To perform such an inquiry, the experienced systems analyst relies upon a background of knowledge. Thus, the organization and content of the systems analyst's knowledge base plays a central role in the level of analyst expertise in the analysis domain.

This article investigates the characteristics of the practicing systems analyst's knowledge base. To accomplish this, the article focuses on knowledge used by the analyst in the information requirements determination phase. Previous studies have attempted to identify systems analysts' skills and knowledge requirements through expert opinion [5], introspective accounts [3], and retrospective reconstructions [8, 26]. These studies focused on general skills but not on their relative importance in different phases of systems development or on the underlying knowledge required for the skills. This investigation departs from these previous studies in two ways. First, the study focuses on the knowledge used by practicing systems analysts in the requirements determination phase of systems development. Second, the view of the systems analyst's knowledge base is assembled from an evoked set of knowledge


categories taken from a set of problem solving transcripts of experienced systems analysts. The transcripts, technically called a think aloud protocol, were coded and analyzed to produce a picture of the knowledge base. The protocol analysis approach avoids some of the problems with introspective and retrospective (expert opinion) accounts of expertise. The evoked set of skills is less dependent on recall and other reconstructions based on what experts or supervisors believe to be the proper set of skills. Protocol analysis has been used for over thirty years in a wide variety of studies about expertise and knowledge utilization. The key assumption of this article is that, by observing the types of information systems analysts consider important in solving a problem, we can make inferences about the types of knowledge that are relevant to the systems analysis domain. The incidence of each category of knowledge found in the protocols is used to determine: (1) the components of the systems analyst's knowledge base, (2) the relative importance of each category of knowledge, and (3) the differences in the incidence of knowledge categories between high- and low-rated systems analysts.


Previous Research
Systems analysis as a focused area of research is quite young. Much of the research literature in systems analysis has focused on techniques, tutorials, curriculum proposals, cognitive styles, and analyst-user interaction. While much good conceptual work exists regarding systems analysis, few empirical studies exist about the knowledge required by systems analysts to effectively do their jobs. Our understanding of the knowledge required for systems analysis is proto-scientific, based upon analogies with techniques developed in other fields (i.e., engineering, architecture, and computer programming) and the introspective accounts of practitioners of systems analysis, rather than a codified, validated set of concepts, principles and causal linkages that characterize more advanced occupations. Moreover, at a practical level, this lack of knowledge about systems analysts' knowledge limits the development of systems analysis training programs, aptitude assessment instruments, and selection procedures.

Most germane to this study are the systems analyst skill studies. The first empirical study of systems analyst skills is seen in the work of Shrout [45]. Shrout analyzed 468 responses from managers and systems analysts who were members of the Association for Systems Management. The findings were based upon ratings of 98 systems analyst competencies, grouped into five categories (ranked highest to lowest):

1. Administrative and organizational competencies
2. Accounting, financial, economic and computational competencies
3. Computer and equipment competencies
4. Employee and personnel competencies
5. Public relations, product, marketing, and legal competencies

Her study indicated that administrative competencies were perceived as more important than computer related skills. The competency ratings did not vary according to level of education, years of experience, organization size, or level of systems responsibility. Her study did not, however, investigate the component behaviors or knowledge underlying each competency.

A second series of empirical skill studies was based upon skill areas developed by the ACM committee's report on information systems curricula [5]. Henry, et al. [26] surveyed 981 managers, users and systems analysts from fourteen firms regarding the usefulness of a set of 111 skills in seven skill categories. A major objective of the study was to validate the assumptions and recommendations underlying the ACM curriculum proposal. The following utility rankings were found for the seven skill clusters (highest to lowest):

1. Performance skills cluster
2. People skills cluster
3. Systems skills cluster
4. Organizational skills cluster
5. Computer skills cluster
6. Society skills cluster
7. Models skills cluster

End users ranked each area as above, except that they ranked the models skill cluster as more


useful than the society skill cluster. The authors concluded that people, organizational, and systems skills were more important than technical skills in systems analysis. A second study five years later, by Benbasat, et al. [8], built upon the Henry study and investigated the effects of IS organizational maturity on skill usefulness rankings. Benbasat revised the 111 skills into a skill list of 99 by eliminating the performance skill cluster, which failed to discriminate in ranks due to its universal acceptance by both managers and systems analysts. The study found the following ranks for the six skill clusters, by managers and systems analysts respectively (from highest to lowest):

Information Systems Manager Ranks
1. People
2. Organizational, Society
3. Systems
4. Computers, Models

Systems Analyst Ranks
1. People
2. Organizational, Systems
3. Society
4. Computers
5. Models

The rankings reaffirm the findings of the Shrout and Henry studies by illustrating that both managers and systems analysts rate the usefulness of behavioral skills higher than the technical skills. It also appears that a concern for societal skills increased. It is unclear whether this change in ratings is caused by a maturation of the discipline from 1974 to 1980, or by a difference in the samples.

Other studies also investigated the skill requirements of systems analysts. Perhaps one of the strongest studies was completed by Arvey and Hoyle [3]. Arvey and Hoyle utilized the critical incident method [22] to define the skills of the systems analyst. The purpose of the study was to build a behaviorally anchored rating instrument. The instrument was later found to have good discriminant and convergent validity. Twelve skill groups emerged from the study: 1) technical knowledge, 2) planning, organizing and scheduling, 3) maintaining customer relations, 4) providing supervision and leadership, 5) training others, 6) documentation, 7) maintaining communications, 8) assessing customer

needs and providing recommendations, 9) job commitment and effort, 10) debugging, 11) program and system modification and development, and 12) conducting presentations. The authors did not provide information about the relative importance of the skills. However, they did find that skill categories 3, 7, 8, and 12 were more important for systems analysts as compared to programmers. This finding again emphasized the importance of behavioral and managerial skills in systems analysis.

A variety of other skill studies have examined the skills of systems analysts and their variation by level of systems analyst position [15], by priorities placed on skills by managers and systems analysts [2], and in the development of aptitude tests [51]. In general, the studies indicated that behavioral skills tend to be more important to successful performance in systems analysis as compared to more technical skills.

The skill studies have provided us with a broad definition of the categories of skills considered useful for systems analysts. The skill studies are based upon a series of skill lists generated by introspection and expert opinion. The skill lists have not been based upon observed behavior. As a result, we are unable to ascertain whether the skill lists are based upon the beliefs of the managers and systems analysts or the actual performance requirements of the job. In addition, with the exception of the Henry and Benbasat studies, each researcher has developed a skill list of their own, making comparisons difficult. Despite these difficulties, the skill studies consistently indicate that, in general, behavioral skills are more important to systems analysts' performance as compared to technical skills.

Other areas of research in information systems have indirectly contributed to the base of systems analysis knowledge. The literature is quite extensive. Table 1 summarizes other studies that have contributed to systems analysis knowledge. Four conclusions can be drawn from the literature. First, much of the knowledge is based upon proto-scientific conceptual formulations. These formulations form the basis of the textbook knowledge, the curriculum recommendations, and systems analysis methods. Second, most of the directly relevant research is based


Table 1. Previous Literature and Research Relevant to the Systems Analysis Domain

Textbook literature
  Focus: System analysis activities; Analysis techniques; Systems design; Hardware and software selection
  References: Gore and Stubbe [24]; Wetherbe [50]; Semprevivo [42]; Bingham and Davies [12]; Senn [43]

Software engineering and systems development
  Focus: Life cycle methodologies; Structured design; Structured analysis
  References: Yourdon and Constantine [52]; Weinberg [49]; De Marco [19]; Naumann and Jenkins [36]

Curriculum development and tutorials
  Focus: Information systems; Computer science; Skill development; Conceptual understanding
  References: Ashenhurst [5]; Couger [17]; Adams and Athey [1]; Taggart and Tharp [46, 47]

Systems analysis and theory
  Focus: Systems analysis technique (Munro and Davis [34]; Berrisford and Wetherbe [10]; Rockart [39]; Boland [13]; Hedberg and Mumford [25]; Nunamaker, et al. [38]); Systems analyst skill studies (Shrout [45]; Henry, et al. [26]; Benbasat, et al. [8]; Arvey and Hoyle [3]; Bryant and Ferguson [15]); User analyst interaction (Gingras and McLean [23]; Boland [13]); Cognitive models of systems analysis (Atwood [6]; Hedberg and Mumford [25]; Dagwell and Weber [18]; Vitalari and Dickson [48])

General research in information systems
  Focus: Implementation, politics, social impacts (Keen [27]; Markus [32]; Bostrom and Heinen [14]; Kling [30]; Kling and Scacchi [31]); Cognitive style and interface design (Benbasat and Taylor [9]; Dickson, et al. [20]; Keen and Bronsema [28]); Management of computing (Nolan [37]; King and Kraemer [29]; Benbasat, et al. [7])


on a series of skill studies and several evaluations of systems analysis methods. Third, when we consider the general area of MIS research together with the skill studies, we can conclude that many of the issues of systems analysis actually concern behavioral issues such as politics, social change, organizations, people, implementation, cognitive style, and decision processes. Fourth, previous research provides us with some general understanding of the knowledge underlying systems analysts' expertise, but more detail is required. Questions about the relative importance of certain areas of knowledge, and the roles they play in practice, are largely unanswered.


Research Method
Research objectives
This study investigates four research questions:

1. What knowledge does the practicing systems analyst actually use?
2. What is the relative importance of each topic of knowledge in the process of systems analysis problem solving?
3. Do the focus, importance, and frequency of use of the various types of knowledge differ between high-rated systems analysts and low-rated systems analysts?
4. Does the use of certain types of knowledge affect systems analyst performance?

The sample
Eighteen practicing systems analysts, each with a minimum of three years of experience in systems analysis, participated in the study. The 18 analysts ranged in age from 25 to 55, and included 16 men and 2 women. Years of systems analysis experience ranged from 3 to 22. The analysts were classified into two groups, high-rated and low-rated, based on multiple ratings by their supervisors. The supervisory rating process was facilitated by using a validated, behaviorally anchored rating scale developed by Arvey and Hoyle [3]. The supervisors were also asked to rate each analyst on a single ten-point scale as a converging measure. Although supervisory ratings can be problematic due to bias and perceptual error, it is argued that supervisory ratings are appropriate for an exploratory study. The bias and inter-rater problems associated with ratings in general have been reduced by utilizing the same behaviorally anchored rating scale across supervisors. Furthermore, due to the evolving nature of the systems analysis field, few measures, if any, exist that exhibit high criterion-related validity. Both rating measures agreed in all cases. (For a more detailed description of the rating procedure and the methodology, see Vitalari and Dickson [48].)

Research design
The experimental design consisted of a single experimental stimulus, namely a case problem in systems analysis given to a paired sample of eighteen subjects. The single case stimulus was used to control the characteristics of the problem encountered by each subject, regardless of background, rating or experience. Because these are not strict experimental controls, the design is best labeled a quasi-experimental design.

The experimental task involved the determination of information requirements and functional specifications of an accounts receivable system for a low-margin, high-volume retailer. The experimental task (Appendix 1) was based on interviews with IS managers and systems analysts and was pretested for realism. The experimental task was purposely designed to be brief because most, if not all, requirements determination tasks are highly ill-defined at the outset, and the brevity of the experimental task required the subjects to interpret the task in terms of previous knowledge. The task was viewed as a starting point for the problem-solving process. In addition, given the time limits and constraints of the experimental session, the analyst had to rely upon his or her current repertoire of skills and knowledge. This controlled for the problem of the analysts generating new or "acceptable" behaviors during the experimental session.

Each subject was instructed to think aloud while solving the experimental task. The verbalizations were recorded on audio tape by the researcher and later transcribed. The researcher functioned as a non-probing, nondirective information provider during the problem-solving session to avoid any interference with the subject's problem-solving process. The researcher had no prior knowledge of the skill level or ratings of the systems analysts.



Protocol analysis
Data was collected utilizing protocol analysis. Protocol analysis is a data collection technique that focuses upon the cognitive process and content of a task. The cognitive content of a problem-solving process is observed by requiring the subject to think aloud while performing the task. The subjects' verbalizations were tape recorded and later transcribed for analysis, providing the researcher with a detailed set of information about the manner in which a subject consciously reasoned about a task. Protocol analysis may be used in a laboratory experiment or a quasi-experiment.

Protocol analysis depends upon a well-developed coding scheme to extract information from the protocols. A protocol coding scheme consists of a series of categories about the behavior to be studied. In this study the coding scheme consists of detailed categories of knowledge with definitions. The coding scheme is used by one or more coders to extract relevant information from the protocols. The results of the coding process are then statistically compared for inter-coder agreement. The coded information becomes the data for the analysis.

Several issues face researchers who use protocol analysis as a methodology. First, protocol analysis requires a high fidelity task that represents a real world problem with sufficient realism. Second, during the actual problem-solving task, the researcher must avoid probing the subject for information, since probing can alter the subject's thought process. The more an observer interferes with a subject's decision process, the less reliable the information in the protocol. Ericsson and Simon [21] recommend that directed probes be avoided in order to maintain the validity of the protocols. Third, think aloud protocols should not be confused with introspective or retrospective verbal

accounts of behavior. The latter are considered problematic due to their dependence upon recall, reconstruction, and interpretation on the part of the subject. Verbalizations of a subject in a think aloud protocol are seen as that part of the problem-solving process that can be encoded rapidly into verbalized language. These verbalizations are the result of the subject's interaction with the experimental task and his or her own knowledge. Hence, the verbalizations are evoked and based upon actual cognitive behaviors in response to the task demands. Thus, the verbalizations are likely to be only part of the complete thinking process used in the accomplishment of a task.

Fourth, as a data collection method protocol analysis is labor intensive. The time and effort required to collect and code the verbal protocols for analysis is extensive. Most protocol studies have small sample sizes. Think aloud protocols are appropriate in exploratory studies where little is known about the cognitive behavior in the domain and in cases where detailed understanding is required. Later, the researcher can further the research by relying on other forced choice methods that are amenable to large sample sizes, or on simulation techniques where dynamic models of the cognitive behavior can be developed and validated.

Construction of the coding scheme

In this exploratory study the coding scheme may be seen as a framework for research. The categories included in the coding scheme were based on previous research and prestudy interviews with practitioners. The coding categories were developed prior to data collection and hence represent a set of expectations about systems analysis knowledge. In this context, classical hypothesis testing is premature and must await later studies.

The following procedure was used to construct the coding scheme. First, eight major topical areas of systems analysis knowledge were developed based upon the literature in systems analysis and MIS. Since this was an exploratory study, it was felt that the scheme should be as exhaustive as possible. This classification includes traditional concepts of systems analysis, categories identified in the skill studies, as well


as concepts that have been identified as important from other lines of research in information systems. The eight major areas are:

1. Organizational Structure - Knowledge about the formal and informal structure of the organization, roles, policies, interrelationships and work procedures.
2. Organizational Behavior - Knowledge about types of human behavior occurring in organizations, absenteeism, turnover, conflict, training, attitudes, and productivity.
3. Organizational Politics - Knowledge about the role of politics in an organization and in the systems development and use processes. Consideration of ownership, territoriality, power, competition, and coalition development and operation.
4. Functional Requirements - Knowledge about the various types of functional requirements of information systems, including data items, databases, reports, report layouts, response time, information, and control procedures.
5. Systems Development - Knowledge about the various phases and issues related to the development of information systems, including development methods, implementation issues, user training, design teams, and systems evaluation.
6. Information Technology - Knowledge of the computer and machine technology employed to implement information systems.
7. Individual Behavior - Knowledge of the behavior of individuals, including personality types, expectations, attitudes, beliefs, and cognitive style.
8. Corporate Characteristics - Knowledge about the corporate environment in which analysis takes place, including growth, revenues, industry, corporate culture, management style, and competitors.

The next step was to identify detailed categories of knowledge for each topical area. The detailed list of knowledge categories was developed via a prestudy series of in-depth interviews with six practicing systems analysts who were designated by their supervisors as experts. The knowledge

categories generated from these interviews were then combined with knowledge categories observed in previous research. The resultant list included 250 categories of knowledge spread across the eight topical areas. In addition, 57 categories relating to specific features of the experimental problem were catalogued. The case specific categories were used to determine the focus of the analyst on attributes specific to the experimental stimulus. This initial coding scheme was then subjected to a review with an experienced systems analyst. After this review two additional categories were added, employee turnover and group culture. The final coding scheme as used in this study includes 252 categories. The complete coding scheme is presented in Tables 2 and 3. Once the categories were identified, definitions for each of the 252 categories were developed so the coders would have a common definitional basis. The purpose of the relatively large number of detailed categories was to make the coding scheme as complete as possible for an exploratory study. The large number of categories also permitted the analysis of null categories (categories of knowledge not employed by the analyst), and an analysis of the relative incidence of a wide variety of categories in the analysts' verbal transcripts.
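Because the scheme is essentially a two-level catalogue (eight topical areas, each containing numbered categories), it lends itself to a compact representation. The following sketch is purely illustrative and not part of the study's apparatus; it shows one way the scheme and the null-category analysis described above could be represented, using a handful of codes taken from Table 2.

```python
# Illustrative (hypothetical) representation of the two-level coding
# scheme: an area prefix plus a two-digit code identifies one category.
# Only a few of the 252 categories from Table 2 are shown.
CODING_SCHEME = {
    "OS": {"00": "GENERAL", "07": "JOB ACTIVITY DESCRIPTION",
           "14": "ORGANIZATION CHART"},
    "FR": {"06": "DATA ELEMENT DEFINITION", "24": "TYPES OF REPORTS"},
    "IT": {"16": "SECONDARY MEMORY", "19": "STORAGE MEDIUM"},
}

def label(code: str) -> str:
    """Resolve a full code such as 'FR24' to its category label."""
    return CODING_SCHEME[code[:2]][code[2:]]

# One coded assertion (protocol line) per entry; null categories are
# those that never appear in any subject's coded protocol.
coded_lines = ["FR24", "FR06", "OS14"]
all_codes = {area + num for area, cats in CODING_SCHEME.items() for num in cats}
null_categories = all_codes - set(coded_lines)
print(sorted(label(c) for c in null_categories))
```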

Coding procedure employed


The coding process consisted of reading the verbal protocol transcripts and looking for the presence of categories of knowledge defined in the coding scheme. For each line in the protocol the coder documented the presence of one of the 252 knowledge categories and one of the 57 case specific categories. A line in the protocol represents a complete assertion on the part of the systems analyst. The coder was required to make a judgment regarding which category was the best descriptor of the knowledge in each assertion (line). Two additional categories were found to be missing from the coding scheme during the coding process: knowledge related to control in the accounts receivable system and questions about the organization's budget. Although these categories were found in the protocols, it was decided to omit them from the present analysis and update the coding scheme in future studies.


Table 2. Coding Scheme for Knowledge Categories in Systems Analysis

OS: ORGANIZATION STRUCTURE
00 GENERAL, 01 AUTHORITY, 02 CENTRALIZED VERSUS DECENTRALIZED, 03 COMPENSATION STRUCTURE, 04 FUNCTIONAL ORGANIZATION STRUCTURE, 05 INTERDEPARTMENTAL RELATIONSHIP, 06 INTERORGANIZATIONAL RELATIONSHIP, 07 JOB ACTIVITY DESCRIPTION, 08 JOB POSITION HIERARCHY, 09 JOB TYPE OR NAME, 10 LINE-STAFF AUTHORITY RELATIONSHIP, 11 MATRIX ORGANIZATION, 12 MIS MASTER PLAN, 13 NUMBER OF PERSONNEL, 14 ORGANIZATION CHART, 15 PROCEDURES, 16 PRODUCT ORGANIZATION STRUCTURE, 17 REPORTING RELATIONSHIPS, 18 RESPONSIBILITY, 19 SKILL INVENTORY, 20 POLICY

OB: ORGANIZATION BEHAVIOR
00 GENERAL, 01 ABSENTEEISM RATE, 02 AGGRESSION, 03 AVOIDANCE, 04 GROUP COHESION, 05 GROUP CULTURE, 06 GROUP LEADERSHIP, 07 GROUP NORMS, 08 GROUP STRUCTURE, 09 JOB TENURE, 10 OPINION LEADERS, 11 PERSONNEL MORALE, 12 PERSONNEL NAMES, 13 POSITION TENURE, 14 PROJECTION, 15 RESISTANCE TO CHANGE, 16 STABILITY OF TURNOVER RATE, 17 SUBORDINATE-SUPERIOR RAPPORT, 18 TARDINESS RATE, 19 TURNOVER RATE, 20 TYPE OF DECISIONS, 21 USER MIS SOPHISTICATION, 22 GROUP PERCEPTION, 23 DECISION CRITERIA

OP: ORGANIZATION POLITICS
00 GENERAL, 01 CAREER, 02 COERCION, 03 CONTROL OF DATA, 04 PUNISHMENT, 05 POWER, 06 PRESTIGE, 07 PRIME MOVER, 08 RESISTANCE, 09 RESOURCE CONFLICT, 10 TERRITORIAL CONFLICT, 11 VESTED INTEREST, 12 COOPERATION

FR: FUNCTIONAL REQUIREMENTS
00 GENERAL, 01 ALTERATION OF DATABASE, 02 ALTERATION OF INPUT, 03 ALTERATION OF OUTPUTS, 04 ALTERATION OF PROGRAMS, 05 AUDIT TRAIL, 06 DATA ELEMENT DEFINITION, 07 DATA FLOW, 08 DATABASE, 09 DATABASE COMPLETENESS, 10 DATABASE PRIVACY, 11 DATABASE SECURITY, 12 ERROR CONTROL, 13 INFORMATION, 14 OBJECTIVES, 15 OUTPUT CYCLE, 16 OUTPUTS, 17 PURPOSE, 18 PURPOSE OF REPORTS, 19 REPORT CYCLE, 20 REPORT FORMAT, 21 RESPONSE TIME, 22 SYSTEM FUNCTIONS, 23 TRANSACTION VOLUME, 24 TYPES OF REPORTS, 25 UPDATE FREQUENCY, 26 OUTPUT DISTRIBUTION, 27 DATA STRUCTURE, 28 INPUTS, 29 INQUIRY FUNCTION, 30 SCOPE

SD: SYSTEMS DEVELOPMENT
00 GENERAL, 01 BUDGET, 02 DESIGN TEAM, 03 PROGRAMMING TEAM, 04 REVIEW POINTS, 05 SCHEDULE, 06 ANALYSIS, 07 PROGRAMMING, 08 DESIGN, 09 IMPLEMENTATION, 10 MAINTENANCE, 11 SYSTEM EVALUATION, 12 USER INVOLVEMENT, 13 USER TEAMS, 14 USER REPRESENTATIVE, 15 TECHNIQUES, 16 LIFE CYCLE PROCEDURE, 17 DOCUMENTATION, 18 PRESENTATION, 19 INTERVIEWING

IT: INFORMATION TECHNOLOGY
00 GENERAL, 01 AGE OF INFORMATION SYSTEM, 02 COMMERCIAL SOFTWARE, 03 COMPUTER CAPACITY UTILIZATION, 04 ELECTRONIC DATA COLLECTION AND TRANSMISSION, 05 DECISION SUPPORT, 06 DISTRIBUTED NETWORK, 07 GRAPHICS, 08 HARDWARE CONFIGURATION, 09 MANUAL DATA COLLECTION AND TRANSMISSION, 10 MAINFRAME COMPUTER, 11 MINI OR MICRO COMPUTER, 12 OUTPUT MEDIUM, 13 PERIPHERALS, 14 PRIMARY MEMORY, 15 PROGRAMMING LANGUAGE, 16 SECONDARY MEMORY, 17 ACCESS PATH OR MECHANISM, 18 INPUT MEDIUM, 19 STORAGE MEDIUM, 20 DATABASE MANAGEMENT SYSTEM, 21 HIGH LEVEL QUERY LANGUAGE

IB: INDIVIDUAL BEHAVIOR
00 GENERAL, 01 ATTITUDES, 02 BELIEFS, 03 COMMITMENT MONEY, 04 COMMITMENT TIME, 05 DECISION MAKING STYLE, 06 DEGREE OF INVOLVEMENT, 07 EGO INVOLVEMENT, 08 JOB SATISFACTION, 09 MOTIVATION, 10 PERSONALITY, 11 VALUES

CC: CORPORATE CHARACTERISTICS
00 GENERAL, 01 DP SOPHISTICATION, 02 GROWTH, 03 NATURE OF THE BUSINESS, 04 SIZE IN PERSONNEL, 05 SIZE IN REVENUE, 06 TRENDS


Table 3. Coding Scheme for Reference to Specific Categories

00 GENERAL, 01 CHIEF EXECUTIVE OFFICER (CEO), 02 PRESIDENT, 03 VICE PRESIDENT OF MARKETING, 04 VICE PRESIDENT OF PERSONNEL, 05 VICE PRESIDENT OF DEVELOPMENT, 06 VICE PRESIDENT OF FINANCE, 07 ASSISTANT TO THE VICE PRESIDENT, 08 CONTROLLER, 09 ASSISTANT CONTROLLER, 10 SENIOR ACCOUNT COLLECTION CLERK, 11 CLERKS, 12 DP DIRECTOR, 13 MANAGER: OPERATIONS, 14 MANAGER: DEVELOPMENT, 15 MANAGER: MAINTENANCE, 16 AUDITORS, 17 STORE MANAGER, 18 CUSTOMERS, 19 STEERING COMMITTEE, 20 ACCOUNTING DEPARTMENT, 21 ACCOUNTS PAYABLE DEPARTMENT, 22 CASH RECEIPTS DEPARTMENT, 23 DP DEPARTMENT, 24 RETAIL SALES DEPARTMENT, 25 RETAIL OUTLET, 26 COLLECTION PROCESS, 27 BUDGET, 28 ACCOUNTS RECEIVABLE SYSTEM, 29 ACCOUNTS RECEIVABLE BILLS, 30 CREDIT LIMIT, 31 CREDIT POLICY, 32 A/R MASTER FILE, 33 A/R INVOICE FILE, 34 CASH RECEIPTS FILE, 35 A/R REGISTER PROGRAM, 36 A/R STATEMENT PROGRAM, 37 ERROR REPORT, 38 A/R SUMMARY LISTING, 39 CUSTOMER FILE, 40 ACCOUNTS PAYABLE, 41 RECEIPTS, 42 GENERAL LEDGER, 43 ORDER ENTRY, 44 PAYROLL, 45 REGIONS, 46 INVOICES, 47 PAST DUE ACCOUNTS, 48 INVENTORY, 49 COMPANY A, 50 USERS, 51 ENHANCEMENTS, 52 CUSTOMER DEMAND, 53 CREDIT SALES, 54 AGE OF RECEIVABLES, 55 SYSTEMS FLOWCHART, 56 MANAGEMENT

Each protocol took approximately 15 hours to code. When the coders finished coding the eighteen transcripts, their results were tested for inter-rater reliability using the Wilcoxon Two Sample Test. The test indicated that both samples came from the same distribution with no significant differences (p-value = .21). Thus the results of the coders' independent work were seen as being in agreement, and the coded transcripts of the more experienced coder were used for analysis.
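As a rough illustration of the inter-rater check described above, the sketch below compares two coders' per-category frequency tallies with a Wilcoxon rank-sum test via scipy. All counts and names are invented; this is a minimal sketch of the kind of test reported, not the study's actual procedure or data.

```python
# Hypothetical illustration of the inter-rater reliability check: compare
# the per-category frequency tallies of two independent coders with a
# Wilcoxon rank-sum test. The counts below are invented.
from scipy.stats import ranksums

coder_a = [14, 3, 22, 7, 0, 5, 9, 1]   # coder A's frequencies per category
coder_b = [12, 4, 25, 6, 1, 5, 8, 2]   # coder B's frequencies, same categories

stat, p_value = ranksums(coder_a, coder_b)
# A large p-value (the study reports p = .21) gives no evidence that the
# two coders' tallies come from different distributions.
print(f"rank-sum statistic = {stat:.3f}, p-value = {p_value:.3f}")
```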

Data analysis
The data was analyzed based upon an importance measure constructed from the percent frequency of occurrence of each knowledge category represented in the coding scheme. The percent frequency measure of occurrence (rather than raw frequency) was chosen in order to minimize differences in the number of coded statements between analysts. Differences in the numbers of coded statements in verbal protocols are common due to the normal variation in the duration of each subject's problem-solving episode. The use of percent frequencies scales these differences to a common denominator. The percent frequencies can be represented by the following equation:

    PF_i = (CF_i / TF) x 100

where:
    PF_i = the percent frequency of knowledge category i,
    CF_i = the raw frequency of knowledge category i in the subject's protocol,
    TF = the total number of coded categories in the subject's protocol.

The percent frequency of a particular knowledge category in a particular subject's protocol can be used as a measure of the relative importance of that category in the subject's entire problem-solving process. The percent frequency can then be used to make relative comparisons among all subjects, between subjects, and within subjects. Analysis of the knowledge base of all eighteen analysts, as a group, was conducted using median percent frequencies as the measure of central tendency. The median is superior to the mean in cases where the underlying distribution is unknown, where the sample size is small, and


where a large variance is expected. The median percent frequencies in each of the knowledge categories were computed and ranked. The ranks provide a measure of the relative emphasis placed on each knowledge category in the problem-solving process. A second analysis used the Two Sample Median Test to investigate differences between the high- and low-rated analyst groups [44]. This test measures the differences between two samples where the underlying population distribution is unknown, the sample size is small, and the measures are nominal or ordinal in scale.
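To make the measure concrete, here is a minimal sketch of how PF_i and the group comparison could be computed, assuming each coded protocol is available as a list of category codes. The data and helper names are invented for illustration; scipy's median_test implements a two-sample median test.

```python
# Hypothetical sketch of the importance measure and group comparison.
# Each coded protocol is a list of category codes; all data is invented.
from collections import Counter
from statistics import median
from scipy.stats import median_test

def percent_frequencies(protocol):
    """PF_i = (CF_i / TF) x 100 for every category i in one protocol."""
    tf = len(protocol)                       # TF: total coded statements
    return {cat: 100.0 * cf / tf for cat, cf in Counter(protocol).items()}

# Toy protocols for three subjects per group (the study had nine each).
high_rated = [["FR24", "FR24", "OS07", "FR06"],
              ["FR24", "OS14", "FR06", "FR24"],
              ["FR24", "FR06", "OS07"]]
low_rated = [["FR06", "IT10", "FR06", "FR24"],
             ["FR06", "FR24", "IT10", "IT10"],
             ["FR06", "IT10"]]

def group_pf(group, category):
    """Per-subject percent frequency of one category within a group."""
    return [percent_frequencies(p).get(category, 0.0) for p in group]

hi = group_pf(high_rated, "FR24")  # e.g., FR24 = types of reports
lo = group_pf(low_rated, "FR24")
print("median PF high-rated:", median(hi), " low-rated:", median(lo))

# Two Sample Median Test (as in the paper) on the per-subject values.
stat, p, grand_median, table = median_test(hi, lo)
print(f"median test p-value = {p:.3f}")
```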


Methodological limitations

The design employed in this study has several limitations. First, due to the labor intensive characteristics of protocol analysis, the sample size is small. A small sample size constrains our ability to generalize the results to the larger population of systems analysts. The general applicability of the results is also reduced because the study utilizes only one type of computer application in the experimental task. It is conceivable that the content of the verbal protocols would vary according to the type of application (i.e., decision support system, office automation, CAD/CAM), the domain of application (i.e., financial, production, marketing, etc.), or the presence of fourth generation automated systems analysis tools. Second, the methodology is best viewed as quasi-experimental since a variety of variables were uncontrolled, such as the types of organizations included in the study, the subjects' cognitive styles, and previous training and educational backgrounds.

Results
A major objective of the study was to determine the major categories of knowledge utilized by the practicing systems analyst. Specifically, what types of knowledge do experienced systems analysts use, and what is their relative importance?

Major categories of knowledge

Table 4 summarizes the results of the item analysis. Twenty knowledge categories account for approximately 60 percent of all knowledge categories appearing in the subjects' protocols. A second grouping of 30 knowledge categories accounts for another 15 percent of the observed behavior. The remaining 25 percent is sparsely scattered among another 100 knowledge categories. Thus, 50 knowledge categories account for approximately 75 percent of the observed references in the subjects' protocols. Due to the large number of categories, the following analysis will focus upon the top twenty categories.

The top twenty categories illustrate the analysts' concern for the functional requirements of the system, the organization's structure, system error control, systems documentation, and user involvement. At first glance these categories corroborate the findings of earlier studies of systems analysis skills and appear consistent with the types of information included in systems development techniques and structured analysis methods. However, under closer inspection there are some important differences.

In the skill studies reviewed earlier, behavioral skills are ranked as the most useful. However, a somewhat different view is indicated in this study. The highest number of references in the subjects' protocols related to details concerning the functional requirements of the system, the systems procedures (i.e., preplanned rules to guide manual or automated tasks), types of reports, requirements for information, and requirements for particular systems functions. In fact, the top nine categories all relate to aspects of the computer system. Behavioral and organizational issues have a lower incidence. Organizational issues are ranked 10, 12, and 15, respectively. Concerns over the purpose of the system, degree of user involvement, and corporate characteristics are ranked 13, 20, and 22, respectively. The top twenty categories imply that the analyst is more concerned about systems reports, report formats, report cycles, data element definitions, the user's requirements for information, and sources of data for the system than about organizational or people issues.


Table 4. Comparative Rankings of the Top Twenty Knowledge Categories Between High- and Low-Rated Systems Analysts

Knowledge Category | Overall Group Median Percent (Rank), n = 18 | High-Rated Group Median Percent (Rank), n = 9 | Low-Rated Group Median Percent (Rank), n = 9
1. Functional requirements | 10.7 (1) | 11.6 (1) | 10.1 (1)
2. Procedures | 6.5 (2) | 6.4 (2) | 7.1 (2)
3. Types of reports | 6.0 (3) | 6.4 (3) | 3.7 (5)
4. Requirements for information | 4.6 (4) | 3.7 (5) | 5.4 (3)
5. Requirements for systems function | 3.4 (5) | 4.7 (4) | 2.8 (6)
6. Process of analysis | 3.3 (6) | 3.2 (6) | 2.8 (6)
7. Development issues | 2.5 (7) | 1.7 (11) | 4.4 (4)
8. Data element definitions | 2.4 (8) | 2.4 (9) | 2.4 (8)
9. Design of system | 2.2 (9) | 1.6 (13) | 2.4 (8)
10. Organizational responsibility | 1.9 (10) | 2.6 (7) | 0.7 (19)
11. Database issues | 1.8 (11) | 2.6 (7) | 0.7 (19)
12. Descriptions of organizational activities | 1.7 (12) | 2.2 (10) | 1.6 (11)
13. Purpose of system | 1.7 (12) | 2.2 (10) | 1.9 (10)
14. Information technology issues | 1.6 (14) | 2.5 (8) | 1.6 (11)
15. Organizational structure issues | 1.5 (15) | 1.4 (15) | 1.6 (11)
16. Error control within system | 1.5 (15) | 1.6 (13) | 1.0 (16)
17. Requirements for report formats | 1.4 (17) | 1.4 (15) | 1.4 (14)
18. Documentation issues | 1.4 (17) | 0.7 (22) | 1.4 (14)
19. Source data | 1.3 (19) | 0.7 (22) | 1.2 (15)
20. User involvement | 1.3 (19) | 1.3 (18) | 0.5 (21)
21. Interviews with users | 0.7 (28) | 1.1 (20) | 0.6 (20)
22. Corporate characteristics | 0.9 (23) | 0.8 (21) | 0.9 (17)
23. Requirements for report cycle | 0.8 (26) | 1.2 (19) | 0.8 (18)

Note: Relative Percent Frequency = (the category frequency/sum of the statements in all of a subgroup's protocols) x 100.


Several explanations are possible. First, the findings of the earlier skill studies may be based upon what the managers and systems analysts want to believe are the most important skills. The belief system of the respondents, in turn, may be influenced by the subjective importance attached to people skills or, alternatively, by the potential cost associated with the mismanagement of the people side of systems analysis. It may also be easier for managers and systems analysts to recall concrete situations in which people problems did occur. Moreover, it is less likely that mistakes in specifications of functional requirements will be perceived as important, since they are implicitly "corrected" later in the development process or much later during maintenance.

On the other hand, since the skill studies were not specific to a particular phase of systems development and only examined general skills, it is also possible that people, organizational, and societal skills are more important later in the development process, such as in the implementation and conversion stages. It is also likely that the characteristics of the task environment play a role in which skills are considered useful. In this study the task instructions directed the analyst to determine the functional requirements of the system. In the case of the skill studies, the demands of the task varied according to the memories and perceptions of the respondent, without any anchors to a common behavioral event. Thus, it seems reasonable to conclude that the utility of skills and the importance of categories of knowledge are dependent upon a variety of factors, especially the task environment and the nature of the task goals. The exploratory results of this study suggest that in a requirements determination task for accounts receivable, knowledge of systems specific issues seems most important in solving the experimental task. Further research must examine the knowledge utilization of systems analysts in other stages of systems development.

Table 4 also provides a comparative ranking of the top twenty knowledge categories for the high- and low-rated groups. Few differences in the rankings appeared between the two groups. The findings imply that a high degree of stability in the importance of the top twenty categories

existed among the analysts in the study. A reasonable interpretation suggests that the top twenty categories represent a baseline or core of systems analysis knowledge. These categories are the minimum requirements for practicing systems analysts and probably distinguish systems analysts from other domains of organizational analysis such as organizational development, EDP auditing, office procedures analysis, or general management consulting.

Categories of knowledge not observed

It is also useful to examine the categories of knowledge that failed to appear in the protocols of the subjects. Out of the 252 categories in the coding scheme, 51 categories had zero (null) frequencies. Of the 51 categories with zero frequencies, 19 categories were absent from both the high-rated and low-rated subjects, 12 categories were absent from the high-rated subjects, and 20 were absent from the low-rated subjects.

Table 5 provides a list of the 19 categories that did not appear in any of the analysts' protocols. The list indicates that none of the analysts found a variety of behavioral, technical, political and systems development issues to be pertinent to the case problem. The presence of zero values may indicate a fault in the coding scheme, a genuine disinterest on the part of the analysts, or a list of categories that were beyond the scope of the experimental problem.

Table 5 also lists the zero value categories for the high-rated and low-rated analysts. The high-rated group tended to ignore technical issues such as DBMS, development techniques, decision support, programming languages, input mediums, and the size of the computer. Also curiously absent were references to several of the user oriented categories: user teams, user representatives, and user sophistication.

The null categories in the low-rated group are more numerous. Seven of the null categories focus on political issues. The other thirteen categories are spread evenly over behavioral and technical topics. The lack of focus on politics seems problematic due to the well-accepted importance of politics in systems development.


Table 5. Unobserved Knowledge Categories Included in Coding Scheme* (Zero Frequencies)

Categories Absent from High-Rated Subjects:
User teams; User representatives; User MIS sophistication; Database management systems; Development techniques; Decision support; Inter-organizational structure; Political vested interest; Programming languages; Input medium; Subordinate-superior management rapport; Mainframe computer

Categories Absent from Low-Rated Subjects:
General politics; Prime mover; Political control of data; Territorial conflict; Prestige; Political punishment; Political resistance; Purpose of reports; Design team; Secondary memory; DP sophistication; Minicomputers or microcomputers; Line-staff relationships; Product-based organization structure; Alteration of inputs; Group cohesion; Group leadership; Personnel morale; Programming team; Individual personalities

Categories Absent from All Subjects:
Employee compensation; Matrix organization structure; Absenteeism rate; Aggression behavior in organization; Avoidance behavior in organization; Group culture; Group norms; Opinion leaders; Projection behavior in organization; Stability of turnover rate; Tardiness rate; Graphics technology; Computer's primary memory

*The above categories are included in the coding scheme but were not observed in any of the subjects' protocols.


Interestingly, low-rated analysts did not consider several functional, organizational and people issues such as purpose of reports, group cohesion and leadership, personnel morale, and individual personalities. However, all of these categories were considered by one or more of the high-rated analysts. This difference could suggest that the failure of low-rated analysts to consider these issues leads to lower performance. It could also provide another explanation for the earlier differences between this study and the skill studies, corroborating the subjective importance attached to people and organizational skills. However, these findings must be viewed as preliminary and speculative, because the data does not permit causal conclusions.


Differences in the knowledge bases


The second analysis looked for differences in the importance given to the types of information and knowledge used by the high-rated and low-rated groups of systems analysts. A number of categories were found to be associated with the high-rated analyst group. Table 6 presents the results. Each of the categories was found to be significantly different on the basis of a single-tailed Two Sample Median Test.

Organizational Focus

The high-rated analysts focused upon the relationships among different jobs, the characteristics of jobs, and the type and name of the job, as evidenced by the job activity description, job position hierarchy, job type or name, and skill inventory categories. The focus on the job-related categories suggests a less technical orientation to analyzing the information requirements. For example, a more technical approach would simply look at the user environment in terms of tasks, processes, information flows and information storage, with little consideration of less functional components of the user environment such as the nature of the job, interjob relationships, and the context in which the job is performed. Evidence for a greater interest in organizational and behavioral issues is seen in the median percent frequencies in the organization chart, reporting relationships, personnel names, skill inventory, and resistance to change categories. The results suggest that the high-rated systems analyst is more cognizant of the interplay of the development process and the characteristics of the organization.

Report Focus

High-rated analysts spent more time discussing the characteristics of reports than the low-rated group, as exemplified by the purpose of reports and types of reports categories in Table 6. The results suggest that the high-rated group was more concerned about the outputs of the system as compared to inputs and processes. Several post-experimental debriefing sessions with high-rated subjects provided further insight into the rationale behind the output focus. One analyst mentioned that by focusing on the outputs of the system first, she was assured of defining the information the user wanted before she was unduly biased by the state of the current system. Another analyst stated that a focus on the reports forced him to remain at an abstract level, and not delve too early into the technical features of the problem.

Political Focus

Perhaps one of the most interesting distinctions to appear between the high- and low-rated analysts was the difference in the political orientations of the two groups. While neither group demonstrated a sophisticated understanding of organizational politics (e.g., zero median percent frequencies in 11 of 13 political knowledge categories), the high-rated group did exhibit more interest, whereas the low-rated group exhibited virtually no interest. This data provides some evidence that the high-rated group understands the importance of organizational politics. Differences are seen in the politics-general category and the prime mover category. The politics-general category was coded any time the analyst discussed an unreferenced, general political concern. The prime mover category was coded when the analyst voiced concern about the "real" motivation or person behind the development of the system and what position that individual had in the organization.


Table 6. A Comparison of Differences in the Median Percent Frequencies

Knowledge Category | Median Percent Frequency(1) | Number of High-Rated Subjects Above/Below Median(2) | Number of Low-Rated Subjects Above/Below Median(3) | Significance(4)
Job Activity Description | 1.7 | 6/3 | 3/6 | .086
Job Position Hierarchy | 0.3 | 6/3 | 3/6 | .086
Job Type | 0.3 | 7/2 | 2/7 | .014
Organization Chart | 0.0* | 4/5 | 1/8 | .073
Reporting Relationships | 0.6 | 6/3 | 3/6 | .086
Skill Inventory | 0.0 | 3/6 | 0/9 | .051
Personnel Names | 0.0 | 4/5 | 1/8 | .073
Resistance to Change | 0.0 | 4/5 | 1/8 | .073
Purpose of Reports | 0.0 | 4/5 | 0/9 | .020
Types of Reports | 6.0 | 6/3 | 3/6 | .086
Politics-General Issues | 0.0 | 5/4 | 2/7 | .083
Concern about Prime Mover | 0.0 | 3/6 | 0/9 | .051
User Involvement | 1.3 | 6/3 | 1/8 | .086
Degree of Involvement | 0.0 | 6/3 | 1/8 | .012
Secondary Memory | 0.0 | 3/6 | 0/9 | .051
Storage Medium | 0.0 | 5/4 | 2/7 | .083
Design Team | 0.0 | 4/5 | 0/9 | .020
System Evaluation | 0.0 | 5/4 | 1/8 | .033

(1) Median for subjects of the category frequency/total frequencies of all categories x 100.
(2) n1 = 9 high-rated subjects.
(3) n2 = 9 low-rated subjects.
(4) Computed according to single-tailed Two Sample Median Test.
* A zero value indicates that at least half of the subjects in the group exhibited little or no responses (i.e., zero frequency) in that category of knowledge.

User Involvement Focus

The greater incidence of the user involvement and degree of involvement categories suggests that the high-rated group is interested in getting the user involved in the system development process. This result is consistent with previous research in MIS that indicates a relationship between user involvement and MIS success.

Although the degree of involvement expected from the user is not widely discussed in the literature, this concern was evident in the high-rated protocols. The degree of involvement category suggests that the high-rated analysts were also concerned about defining the amount of involvement they could reasonably expect from the user during the development project. One possible explanation for this finding is that by defining the degree of involvement at the outset, the analyst knows what commitments to


Table 7. A Comparison of Differences in the Median Percent Frequencies in Case Specific Management Focus

Knowledge Category | Median Percent Frequency(1) | Number of High-Rated Subjects Above/Below Median | Number of Low-Rated Subjects Above/Below Median | Significance(2)
Chief Executive Officer | 0 | 3/6 | 0/9 | .051
Vice President of Marketing | 0 | 4/5 | 1/8 | .073
Senior Account Collection Clerk | .8 | 7/2 | 2/7 | .014

(1) Median for subjects of the category frequency/total frequencies of all categories x 100.
(2) Computed according to single-tailed Two Sample Median Test.

expect from the user and the user knows what to expect from the analyst.

Technical Focus as a Constraint Set

Perhaps most interesting is the greater incidence in the secondary storage and storage medium categories, representing technical features that are constraints upon any functional design. In the determination of information requirements the analyst must attend to certain technical constraints that impinge upon the functional requirements. For example, as in the experimental problem, the absence or presence of disk storage capability determined the technical and economic feasibility of an online customer account system. The results suggest that the effective analyst assesses the impact of the constraints and modifies the boundaries of the solution space accordingly.

Focus on Later Stages

The design team and system evaluation categories suggest that the high-rated analyst focuses upon the later stages of systems development from the beginning. Both of the above categories are particularly important in the information requirements determination stage because the composition of the design team and the effectiveness of the ultimate system are largely defined by the requirements set in this first stage.

Management Focus

Several differences were found in the case-specific categories between the high- and low-

rated groups (Table 7). The high-rated group had a greater frequency in categories related to management. For example, both the high- and low-rated groups were concerned about the needs of the controller, the assistant controller, and the data processing director, who were explicitly involved in the development process (specified by name in the experimental task). However, the high-rated group more often considered the information requirements of other managers such as the senior account collection supervisor, the vice president of marketing, and the chief executive officer.

Conclusion
The results of this study are exploratory and must be regarded as preliminary. However, despite these limitations, the study raised some important issues. The categories of knowledge observed in the protocols of the practicing systems analysts provide a detailed picture of the relative importance of the types of knowledge and information relevant to the analyst. The findings suggest a list of knowledge categories that practicing analysts actually use when solving a systems analysis problem. One important contribution of the study was the definition of the dominant or core categories of knowledge observed in the protocols of practicing analysts. The top twenty categories of knowledge are very similar for both the high-rated and low-rated analysts and can be viewed


as the necessary component knowledge required to solve analysis problems. These findings by no means provide an exhaustive or sufficient list of core knowledge, but they do provide a point from which to elaborate in future studies.

The article also looked for differences in the importance attached to different types of knowledge. Certain categories of knowledge were observed a greater number of times in the high-rated subjects' protocols. These initial results must be viewed as preliminary because this is a first study and a single case problem was employed. However, the findings suggest that high-rated analysts are more concerned about user involvement, managerial prerogatives, organizational politics, and user reports. The results also suggest that high-rated analysts consider hardware features as constraints on the solution. These results find some support in previous research studies in the information systems literature. The results also indicated the types of knowledge that were not observed in the analysts' protocols. The absence of certain categories of knowledge suggested one of three possibilities: (1) these types of knowledge were

not relevant to the problem used in the study, (2) these categories have little relevance to systems analysis or are only relevant in exceptional cases, or (3) the time limits of the experimental setting precluded consideration of such issues.

The findings of the study also suggested that the utility of systems analysis knowledge and skills is dependent upon the task and the phase of systems development. In the study, analysts spent most of their time focusing on the requirements of the system. Less time was spent upon people and organizational issues, although the high-rated analysts spent more time on the latter issues than low-rated analysts. Additional studies with variations in the problem stimulus, analyst background experience, and time constraints are required to determine which of the above explanations is correct. Additional studies may also provide additional categories of knowledge not included in the present coding scheme.

Taken together, the findings and discussion of the analysts' knowledge base suggest an enhanced view of systems analysts' knowledge. Six major concepts about the knowledge base have emerged (Figure 1):

Figure 1. The Emerging Structure of Systems Analyst's Knowledge Base



1. Core Systems Analysis Domain Knowledge - Necessary components of the systems analyst's knowledge needed to achieve satisfactory levels of performance.
2. High-Rated Domain Knowledge - Knowledge that distinguishes high-rated analysts from low-rated analysts.
3. Application Domain Knowledge - Knowledge related to particular information system applications, such as decision support systems, office automation, transaction processing systems, and end user computing.
4. Functional Domain Knowledge - Knowledge related to specific management disciplines, such as finance, accounting, production, marketing, public sector computing, information systems management, and strategic planning.
5. Organization Specific Knowledge - Knowledge specific to the organization in which the analyst works.
6. Knowledge of Methods and Techniques - Knowledge related to specific analysis techniques, methodologies, and approaches.

In future studies the coding scheme used in this study must be updated to encompass each of these areas.

Future research into the knowledge base of the systems analyst can take several directions. First, it is necessary to provide the subjects with multiple experimental tasks to strengthen intersubject comparisons. Second, future studies may wish to explore the structure of the analyst's knowledge base by utilizing a variety of semantic association methodologies to discover the relationships among the knowledge items. Third, it is possible to take the information in this study and develop instruments to test for selection criteria and the assessment of job performance using traditional scale development procedures.

Finally, this study illustrates the value of in-depth exploratory investigations of the human problem-solving process. The results provide encouragement for the use of this research approach in other information system domains.

Acknowledgements:
The author would like to thank Gary Dickson for comments on an earlier version of this paper and John Lapland for able assistance in the coding of the data.

About the Author

Nicholas P. Vitalari is an Assistant Professor of Management and Co-Director of Project NOAH (National Outlook for Automation in the Home) at the University of California, Irvine. He is currently examining the role of cognitive models in systems analysis and their relationship to expertise, training requirements and selection methods for systems analysts, and the sociology of computing in the home and the workplace. He has published articles in such journals as Communications of the ACM and Computers, Environment and Urban Systems, and the book Structured BASIC for Business. He holds a Ph.D. and an MBA from the University of Minnesota.


Appendix 1. Case Stimulus for Information Requirements Determination Problem

Accounts Receivable Subsystem

Company A, a large retail merchandising company, has 53 low-margin stores throughout the Midwest. Annual revenues at the close of the third quarter in 1979 were $205 million. Company revenues are growing at an annual rate of 25 percent. Company A's success stems from its ability to closely monitor customer demand for consumer items. The items are sold at high volumes at discount prices. Moving the merchandise is key to their success. Credit sales constitute a large portion of sales, permitting the large volume merchandising strategy. The company employs over 2000 employees. Corporate offices are located in Chicago, Illinois.

Company A recently decided to develop a new accounts receivable system. The new system will replace an existing accounts receivable system completed in 1963. The current system has been patched, repatched, and expanded through equipment generations. Besides these technical difficulties, a new system is required because both the internal and external auditors and management have proposed enhancements. The enhancements are incompatible with the current system's processing logic and program structure.

The new system will support the account collection clerks, the assistant controller, the controller, the vice president of finance, and the assistant to the vice president of finance. The account collection clerks monitor the status of particular accounts. The assistant controller and controller are responsible for collection procedures and monitoring the age of the receivables. The vice president of finance is responsible for establishing a sound credit policy. Tentative approval for the system has been secured from the systems planning committee, the director of information systems, and the vice president of finance.
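For readers unfamiliar with the controller's task of monitoring the age of the receivables, the following minimal sketch shows one conventional way such an aging report is computed. The bucket boundaries, invoice dates, and balances are hypothetical and are not part of the case stimulus.

```python
# Hypothetical sketch of a receivables-aging report of the kind the
# controller in the case monitors. All data below are invented.
from datetime import date

# Conventional aging buckets: (minimum age, maximum age, label).
AGING_BUCKETS = [(0, 30, "current"), (31, 60, "31-60 days"),
                 (61, 90, "61-90 days"), (91, 10**6, "over 90 days")]

def age_receivables(invoices, as_of):
    """Group open invoice balances into aging buckets.

    invoices: iterable of (invoice_date, open_balance) pairs.
    as_of:    the reporting date.
    """
    totals = {label: 0.0 for _, _, label in AGING_BUCKETS}
    for invoice_date, balance in invoices:
        days_old = (as_of - invoice_date).days
        for low, high, label in AGING_BUCKETS:
            if low <= days_old <= high:
                totals[label] += balance
                break
    return totals

# Example: three hypothetical open invoices as of a 1979 reporting date.
open_invoices = [(date(1979, 9, 15), 1200.00),
                 (date(1979, 8, 1), 450.00),
                 (date(1979, 6, 10), 300.00)]
print(age_receivables(open_invoices, as_of=date(1979, 9, 30)))
```

A rising share of balances in the older buckets is the signal that would trigger the collection procedures described in the case.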

