
Accounting, Organizations and Society 34 (2009) 638–653


The usefulness of inaccurate models: Towards an understanding of the emergence of financial risk management
Yuval Millo a,*, Donald MacKenzie b

a Department of Accounting, London School of Economics and Political Science, Houghton Street, London WC2A 2AE, United Kingdom
b School of Social and Political Studies, University of Edinburgh, Adam Ferguson Building, George Square, Edinburgh EH8 9LL, United Kingdom

Abstract
Is the growth of modern financial risk management a result of the accuracy and reliability of risk models? This paper argues that the remarkable success of today's financial risk management methods should be attributed primarily to their communicative and organizational usefulness and less to the accuracy of the results they produced. This paper traces the intertwined historical paths of financial risk management and financial derivatives markets. Spanning from the late 1960s to the early 1990s, the paper analyses the social, political and organizational factors that underpinned the exponential success of one of today's leading risk management methodologies, the applications based on the Black–Scholes–Merton options pricing model. Using primary documents and interviews, the paper shows how financial risk management became part of central market practices and gained reputation among the different organisational market participants (trading firms, the options clearinghouse and the securities regulator). Ultimately, the events in the aftermath of the market crash of October 1987 showed that the practical usefulness of financial risk management methods overshadowed the fact that, when financial risk management was critically needed, the risk model was inaccurate.
© 2008 Elsevier Ltd. All rights reserved.

Introduction

Financial risk management is one of the fastest growing service industries in the business world. According to the Global Association of Risk Professionals (GARP), one of the leading trade associations in the field, there are currently more than 74,000 financial risk managers in financial institutions.1 Dozens of academic and professional institutions award degrees and diplomas in financial risk management and these qualifications are gaining recognition by regulators and international certification bodies. At the heart of this body of knowledge and practices (virtually non-existent less than thirty years ago) is a set of financial economic theories, employing a variety of statistical models

* Corresponding author. E-mail address: y.millo@lse.ac.uk (Y. Millo).
1 http://www.garp.com/about/archive.asp (accessed on 30th March, 2008).
0361-3682/$ - see front matter © 2008 Elsevier Ltd. All rights reserved. doi:10.1016/j.aos.2008.10.002

to assess and calculate the risks associated with a plethora of financial assets and contracts. Many of these financial risk management systems are proprietary and no reliable statistics are published about their use, but it is estimated that the daily transaction volume of financial products traded in organized exchanges that are managed using such methods exceeds 50 million transactions,2 with many more transactions executed in over-the-counter markets. Is this remarkable success evidence of the predictive powers of modern financial economics? Judging from a brief review of several leading textbooks in financial economics (Hull, 2005; McDonald, 2006; Stulz, 2002), the answer to this question is a resounding yes: the accuracy and validity of risk models and the applications that use them are said to be tested and re-validated literally millions of times a day in the markets.

2 http://www.sungard.com/sungard/ (accessed on 30th March, 2008).

In this paper we aim to show that such a representation is partial at best and that the explosive growth of financial risk management cannot be explained by the accuracy of the models and methods used. This paper traces the growth of financial risk management applications that made use of the options pricing model developed by Black and Scholes (1972, 1973) and Merton (1973), known as the Black–Scholes–Merton model. Arguably, this model is the crowning achievement of modern financial economics and was included in many of the pioneering financial risk management systems.3 The history of the Black–Scholes–Merton model is tightly connected to the first organized exchange for the trading of stock options, the American Chicago Board Options Exchange (CBOE). This historical case was studied previously (MacKenzie & Millo, 2003; MacKenzie, 2006). However, while previous studies focused on the effect that the Black–Scholes–Merton model had on prices in options markets, this paper examines the development of financial risk management. The initial link between practice and model-based prediction is historical-temporal: CBOE began trading options less than two weeks before the mathematical model aiming to predict their prices, the Black–Scholes–Merton model, was published. These two coincident events mark the beginning of an exponential growth curve that traces both the markets for financial derivatives and financial risk management. This growth curve, we argue, was not fuelled simply by actors persuaded by the accuracy of the model. Instead, the usefulness of model-based risk management, which enabled an efficient tackling of a variety of operational, organisational and political challenges, was a crucial factor behind the success of modern financial risk management. Model usefulness has three important organisational manifestations. First, model-based methods allowed clearer communication within trading organisations.
They reduced the complexity of financial data and enabled more efficient decision-making (Sections 'Risk assessment practices in early CBOE' and 'The rise of comprehensive risk management systems'). Second, applications based on the model solved the operational challenge of the clearinghouse when calculating the level of risk-based deposits required of traders (Sections 'Financial risk management off the trading floor: options clearing' and 'Financial risk management as useful operational tools'). Third, the growing consensus among market participants around the usefulness of the model became useful in its own right when used by the SEC to legitimise its regulatory decisions (Section 'Financial risk management and the 1987 market crash'). The accumulating effect of these different manifestations of usefulness contributed to a situation whereby the accuracy of the predictions the model produced, even during critical times, was much less salient than one might expect. This approach, it has to be noted, does not aim to provide a definitive analysis of the emergence of financial risk management. Instead, it focuses on the interplay between technological, social and organisational factors and how it brought about the formation of rich interconnectedness among market participants.
3 Merton and Scholes received the Nobel Prize in Economics in 1997 for their work on the model. Black died in 1995, but was mentioned as a contributor by the prize committee.

Theoretical approach

Knowledge and practice are fused together within financial risk management through the notion of management. The etymology of the word 'management' is traced back to the Italian maneggiare, which means to handle, and especially to control a horse (Barnhart, 1999). Controlling a horse demands both knowledge and the ability to perform that knowledge in real life. Hence, management is dependent on a successful transformation of knowledge from one realm to another: from knowledge that contains descriptions of actions to knowledge that dictates and controls these actions. Power (2007, pp. 24–28), who traced the growth of risk management as an organizational phenomenon, claims that the growth of risk management in the last two decades is related to a gradual convergence between risk calculation and risk management. In spite of the fact that Power's analysis is devoted mainly to corporate risk analysis and does not focus primarily on financial risk management, his general insight is also relevant in this case. As Power demonstrates, the historical process of convergence led eventually to a subsuming of calculation into management. That is, nowadays risk is regarded as a manageable factor rather than merely a measurable, quantifiable and calculable entity. In other words, risk has been internalized. Organisational market participants re-positioned themselves vis-à-vis risk: they moved from being spectators at an external phenomenon to managers of an increasingly internal institutional resource. If, as Power claims, a major transformation has turned descriptive knowledge (risk calculation) into (practice-oriented) risk management, then an empirical examination would reveal organizational actors that gradually shift resources towards communicating and coordinating action using risk management and pay relatively less attention to merely calculating risk levels.
This organisational-communicative aspect of risk management also has reflexive and constitutive implications. Risk management allows market participants to produce a map of risks and opportunities from which a plan of action is then derived. Naturally, any map, be it a geographical map or a risk map, is charted while incorporating a particular perspective. An actor's point of view is the initial coordinate according to which risks are defined and risk assessments are made. Therefore, the way an organizational actor depicts its risks is contingent upon how that actor perceives itself, its goals and its relationships with other actors. Consequently, since risk management is not only a description of a given reality but includes a prediction and is operated upon as a blueprint for action, it includes a constitutive (or performative) element: the way organizations depict their risks has a significant effect on the way they will, eventually, react to events and to other actors. Over time, an influential risk management system will bring about institutionalised patterns of risk embodiment. To theorize how financial risk management became a dominant factor in financial markets, one has to conceptualise markets not as meeting places where abstract forces of supply and demand are represented by traders, but instead as arenas where different strands of knowledge are continually converted into practices which are then tested,


challenged, and frequently revised. Such a conceptual approach corresponds directly with developments that have taken place in economic sociology over the last two decades and, most specifically, with the emergence of the social studies of finance (SSF) research agenda.4 That said, this growing strand of research about the social and institutional dimensions of financial markets and their structure and operation has paid little attention to the vital role that financial risk management played in shaping markets. Thus, while corresponding with many theoretical approaches within SSF, this paper also uses concepts both from conventional economic sociology (drawing mostly on the role of social networks in markets) and concepts from actor-network theory from the sociology of science and technology (Latour, 2005). As a starting point, we posit that the fusion of knowledge and practice through which financial risk management developed took place through the social fabric that makes up markets. In this regard we follow the famed American sociologist Granovetter (1985, 1992), who, by referring to the economic historian Polanyi (Polanyi & MacIver, 1957), made the theoretical claim that markets should be regarded as social constructions that evolve on the basis of pre-existing social and cultural frameworks, in which markets are embedded but which are not necessarily economic. Hence, the development of economic institutions takes place through continuous interactions among actors who hold a variety of motivations and perspectives. Granovetter's central concept, the embedding of markets in pre-existing socio-cultural frameworks, focused on the resulting economic structures. Other economic sociologists such as Abolafia (1996), Baker (1984), Uzzi (1996), Uzzi and Gillespie (2002), and Uzzi and Lancaster (2003), who built upon Granovetter's theoretical perspective, studied the interaction of a variety of individual actors in financial markets.
Abolafia's research about the New York Stock Exchange showed that the common view of the financial trading world as an atomistic, egoistic, profit-driven social environment was far from accurate. Market participants frequently created social networks and very seldom operated alone. These networks imposed unwritten (but carefully guarded) rules of reciprocity. For example, traders who failed to 'give back a trade' (i.e. partake in a non-profitable exchange with a fellow trader who previously had done it for them) risked being excluded from the trade circles. Baker, who studied an options exchange, found that traders tended to operate within a more or less set network of trading associates. This stream of empirical works demonstrated persuasively that fundamental elements underpinning market behaviour are regulated through dense personal networks of crisscrossing favours and animosities, which then feed into equally elaborate sets of closely guarded norms.

4 Collections of articles devoted to SSF are Knorr Cetina and Preda (2005) and Kalthoff et al. (2000). Other noteworthy contributions include Abolafia (1996, 1998), Beunza and Stark (2003, 2004, 2005), Clark (2000), Fenton-O'Creevy, Nicholson, Soane, and Willman (2005), Holzer and Millo (2005), Izquierdo (2001), Knorr Cetina and Bruegger (2000, 2002a, 2002b), Lépinay (2007), Millo (2007), Muniesa and Callon (2007), Preda (2001a, 2001b, 2001c, 2004, 2006), Thrift (1994), Zaloom (2003, 2004).

In particular, the work on the embedding of social networks and their role in markets indicates that it may be necessary to look for a theoretical perspective that combines action and structure when we conceptualise the diffusion and growth of financial risk management, rather than a theoretical framework that traces primarily the formation of structures. At this point, it is useful to refer to the theoretical discussion that Fourcade (2007) develops.5 Fourcade attributes the primary differences between approaches in the sociology of markets to disagreement as to what exactly is constructed socially in markets. Among other things, she compares the differences between the conceptualisations of concrete social ties among actors and the different positions actors take within a Bourdieusian field. It is not the intention of this paper to reconcile field theory and social network approaches. However, the historical dynamics of the options markets case do challenge the purely structural version of the embeddedness concept. When embeddedness is represented in terms of historical processes, two preliminary premises emerge. First, financial risk management did not develop in a linear fashion, but rather in a network-like one. That is, the various market participants did not form an assembly line where one actor operated after the other, each contributing to the final shape of the system. Instead, the growth process unfolded through continuous interactions without the presence of a meaningful command and control centre. Second, inherent in the interactive and networked nature of the historical process is the heterogeneity of the actors involved. As they interact, the various market participants promote different, even conflicting, operative agendas, each rooted in a different worldview. Hence, the overall environment where financial risk management emerged should be regarded as a heterogeneous network.
Important as inter-personal networks were in the context of the developing Chicago derivatives markets, this paper does not aim to provide an embeddedness-like analysis of the growth of markets. Instead, the analysis here brings to the fore two important issues that the Granovetterian tradition tends to neglect: the long processes through which markets become organisationally institutionalised and the pivotal role that non-human entities play in shaping market networks. First, the research that applied the embeddedness approach empirically focused on collecting contemporaneous data (participant observation and quantitative analysis). As useful as this methodological approach is in revealing how inter-personal connections frame and configure economic activities, the longer-term processes of institutionalization, such as the ones described in this paper, by which patterns of network-based coordinated action become part of the market's infrastructure, are not likely to be captured by such methodological tools. In contrast, the historical sociology perspective applied in this research, using both oral history methods (interviews) and primary documents, is geared towards the analysis of longer historical durations and assessing their impact on organizational structures. In fact, the empirical evidence and the historical analysis in the paper

5 We thank an AOS reviewer for pointing out this reference to us.


imply that a snapshot view of any one point in time during the twenty-year period analysed in the paper could have led to mistaken conclusions about the way financial risk management developed. Second, the embeddedness approach can be enriched by taking into account the role of non-human actors in the shaping of financial risk management. Financial markets are commonly described as an environment saturated in sophisticated technological artefacts. Printouts of calculations, display screens and trading floor computer workstations, to name but a few, are indistinguishable parts of today's financial markets. As ubiquitous as these technological artefacts are, the realisation of the part that technology plays in shaping the structure of markets is far from common. In a beautifully written paper, Kalthoff (2005) shows how practices that emerged around the use of computer software ('epistemic practices') crystallized institutional risk management routines. Kalthoff's findings reveal that practices did not emerge primarily from simple inter-personal interaction, but that coordinated communication was mediated by technical representations of risks, and through that mediated representation risk management grew and became established. A recent paper by Miller and O'Leary (2007) draws similar conclusions regarding the role that technological materiality played in the growing efficacy of capital budgeting. Miller and O'Leary argue persuasively that the efficacy of heterogeneous networks as agents of constitutive change is dependent on their intermediaries, the material content (e.g. written documents, technical artefacts, money) that circulates in the network and embodies, in effect, the connections among the actors.
The hybrid human–machine networks through which financial risk management developed are tightly related to a core issue, that is, the ability of such systems to produce results that would be perceived as valid and accurate descriptions of a reality: the facticity of risk management (Latour, 1988). MacKenzie (2009) argues that the production of prices in financial markets is inherently embedded in the production of validity for those prices. That is, markets have to maintain the legitimacy of their prices as factual descriptions of the tradable values of the assets exchanged in the market. That ability, as the analysis below aims to show, is dependent critically on the creation and maintenance of a technology-driven, practice-oriented set of descriptors. For example, a portfolio manager would find it difficult to trust the figures her risk management software produces if she believed that those figures were valid only for the immediate set of circumstances in which they were produced. On the other hand, if the same portfolio manager trusted that her risk management software 'speaks in a universal language' and thus her level of calculated risk is comparable with those of others who perform the same type of practice and use the same risk methodology, then it would be easier for her to assign a high degree of validity to the results. It is true that facticity can be assigned to informational items, hypothetically, without the presence of machines. Nonetheless, in the context of contemporary financial markets, done manually, such a process would have practically halted activity in the markets. That is, technological actors

do not merely help human market participants to perform, but by providing a stream of methodologically valid information (although not always realistically valid, as the findings show) they perform an irreplaceable and irreducible part in the constitution of markets. Indeed, inhuman speed and efficiency were the factors that kept the 'facts machine' of financial risk management running smoothly. Considering this, we believe that it would be suitable to adopt the actor–network terminology here and to refer to the network of connections that took part in constituting financial risk management as a techno-social network rather than only a social one (Law, 1991; Latour, 2005). The empirical material in this paper is based on interviews and primary documents. More than thirty interviews were conducted with leading figures from the CBOE, the Options Clearing Corporation (OCC) and the Securities and Exchange Commission (SEC) who played central roles in the unfolding of the events analysed here. All interviews were recorded and transcribed in full. Since several interviewees requested anonymity, we decided not to single out by name the ones who agreed to be recognised, and thus all interviewees are identified by a single letter. In addition, extensive archival research was done. The archives used were those of the SEC and the Federal Reserve Board in Washington, DC and private collections of documents in Baltimore, New York and Chicago. The rest of the paper continues as follows. The next two sections focus on the part that trading practices played in the development of financial risk management during the early days of CBOE and the growth of options trading to other markets, respectively. Section 'Risk assessment practices in early CBOE' discusses the first model-based applications, sheets with calculated prices, while Section 'The rise of comprehensive risk management systems' analyses the introduction of spreading and scenario-simulations.
Both applications show the communicative usefulness of the model. Sections 'Financial risk management off the trading floor: options clearing' and 'Financial risk management as useful operational tools' analyse the role that the options clearinghouse and the securities markets regulator, respectively, played in shaping financial risk management. For these market participants, risk management was an operational and a regulatory issue, respectively, and accordingly, the notion of usefulness here is manifested in increased operational efficiency and as a political-discursive device. Section 'Financial risk management and the 1987 market crash' examines the effects that the October 1987 market crash had on the development of financial risk management and focuses particularly on the dramatic clash between usefulness and accuracy that the crash brought to the fore. Section 'Conclusion' discusses the findings and concludes the paper. Table 1, below, places the development of financial risk management in the different arenas in chronological order (according to the different time periods analysed in the sections) and summarises the important practices, technologies and market structures.

Risk assessment practices in early CBOE

Brief explanations about financial options and the mathematical model used in options markets (the


Table 1
Periods in the history of financial risk management in options exchanges.

Period | Trading/clearing practices | Risk management technologies | Institutional structure of the market
1973-1975 (Section 'Risk assessment practices in early CBOE') | Single traders, scalping | Sheets with calculated prices | CBOE is the only options exchange
1976-1984 (Section 'The rise of comprehensive risk management systems') | Spreading, daily trading strategies using pricing models | — | Options traded in several exchanges; OCC
1985-1987 (Sections 'Financial risk management off the trading floor: options clearing' and 'Financial risk management as useful operational tools') | Intermarket distributed portfolios; margins calculated using strategy-based method | — | —
October 1987 (Section 'Financial risk management and the 1987 market crash') | Index tracking; dynamic portfolio insurance | Under extreme volatility, Black-Scholes-based applications are not accurate | —
1988-1994 (Section 'Financial risk management and the 1987 market crash') | Margins calculated using pricing model | Testing of model-based application for net capital requirements (TIMS), approved by SEC in 1994 | —

Black–Scholes–Merton model) are necessary before presenting the historical findings. Options give their buyers the right, but not the obligation, to buy (call option) or to sell (put option) a certain asset at a set price on (or before) a given future date. Sellers of options take upon themselves the obligation to sell the assets at the set price or to buy them at the set price if the buyers decide to exercise the options. By buying a put option, a trader can protect herself from the possibility of prices dropping. Similarly, by selling options, different traders can be paid for taking upon themselves other traders' risks. Without delving too deeply into the theoretical background of the Black–Scholes–Merton model, it is important to explain briefly the way in which the model ties together the pricing of options with risk, as this played a crucial role in the development of financial risk management. The Black–Scholes–Merton model is a statistical model that can be used to predict options contracts' prices. The model is based on the 'no arbitrage' hypothesis, which assumes that prices in markets react instantly to new information that reaches them and therefore risk-free profit-making opportunities are virtually non-existent (Black & Scholes, 1972; Black & Scholes, 1973).6 When the no arbitrage assumption is placed in a complete market setting, it dictates that a combination of options and stocks that bears no risk to its holder (risk-free) would have to generate the same cash flow as an interest-bearing account (which is another risk-free instrument). Hence, the market prices of the option and stock composing such a risk-free portfolio could be discovered by comparing them with the expected yield of cash invested in a risk-free interest-bearing account. Using this initial result, the model can then be used to predict the prices of options.
Similarly, because the model's calculation is based on the degree of risk related to the market positions of options, the same set of equations can be used to evaluate how much risk is embedded in holding particular market positions. The bi-directionality embedded in the model, that is, the
6 Merton's (1973) theoretical reasoning for the model is different from the one offered by Black and Scholes, but both approaches support the same mathematical model.

fact that it offered two equivalent procedures through which quantitative estimates of risk and prices could be calculated, was pivotal to the emergence of financial risk management.

The use of Black's sheets

The Chicago Board Options Exchange (CBOE) started trading options based on 16 blue-chip stocks in April 1973, a few weeks before the Black–Scholes–Merton model was first published. There is evidence of sporadic use of the model by traders in the early months of trading (MacKenzie & Millo, 2003). However, the first model-based price prediction service is that offered by Fischer Black, one of the developers of the model, who started in 1975 a monthly subscription service selling sheets containing options' calculated prices for the weeks in each month. This first application of the options pricing model used a case-by-case approach. By calculating a bare-bones Black–Scholes–Merton formula, one option contract's price or hedging ratio was predicted (see Table 2). CBOE was established by a group originating from the Chicago Board of Trade (CBOT), the city's well-established agricultural commodities exchange. In its early days, most of CBOE's traders came from CBOT (R, 2000). Unsurprisingly, the practice that developed around Black's sheets was based on one of the common trading techniques in the Chicago commodities markets: scalping. Scalping was a very basic 'buy low, sell high' tactic, executed many times during a typical trading day and utilising minute fluctuations in prices. With scalping as the main trading technique, the mode of operation of this first risk assessment and trading application was as follows: traders calculated the theoretical value of one option contract, compared it to the contract's market price, and then decided whether it was profitable to buy or sell. Hence, trading with Black's sheets can be described as model-aided scalping. The sheets supplied pinpointed information to the trader: the model-calculated price of a specific option contract on a specific date.
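The sheet-aided routine described above, computing a theoretical value for a single contract and comparing it with the quoted price, can be sketched as follows. The formula is the standard Black–Scholes–Merton European call price; all parameter values (stock price, strike, rate, volatility, and the quoted price) are illustrative and not taken from the paper.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Black-Scholes-Merton price and hedge ratio (delta) of a European call.

    S: stock price, K: strike price, T: time to expiry in years,
    r: risk-free interest rate, sigma: annualised volatility.
    """
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    price = S * N(d1) - K * exp(-r * T) * N(d2)
    return price, N(d1)  # N(d1) is the hedging ratio of the position

# Model-aided scalping: compare the theoretical value with the market quote.
theoretical, delta = bs_call(S=40.0, K=35.0, T=0.5, r=0.05, sigma=0.3)
market_quote = 5.80  # hypothetical quoted price
signal = "buy" if market_quote < theoretical else "sell"
```

With these made-up numbers the model value exceeds the quote, so a sheet-guided scalper would buy; the second returned value is the hedging ratio that the sheets could equally tabulate, illustrating the bi-directionality (price and risk from one set of equations) noted above.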
The information provided by the sheets was particularly useful for single traders operating on their own for a

Table 2
Sample of data from Black's sheets.a

Strike price: 50.40
Date of expiration             Predicted price
Last Friday of July 1976       $9.26
Last Friday of October 1976    $9.53
Last Friday of January 1977    $10.00

a The sample shows the prices predicted for the week of the 4th of June 1976 for three series of call options written on stocks of US Steel Corporation. Prices were calculated on the 25th of May 1976 for a stock price of $40 per share. The strike price is the market price of the stock at (or above) which the option can be exercised profitably.

number of reasons. First, most of them had relatively small portfolios,7 so they could easily estimate what the implication of selling or buying particular contracts would be for the entire portfolio. Second, the part of the portfolio that was traded on CBOE (i.e. the options8) was concentrated in the pits in which the particular trader specialised. This focused approach made the use of the sheets easy from yet another aspect: the typical trader had to purchase sheets only for a small number of options series, and had fewer pieces of paper to carry and manage in the crowded trading pits (J, 2000).9 Third, because most trading firms had only very small numbers of traders, of which typically only one was the senior partner, it was relatively easy to execute, single-handedly, portfolio-wide changes. For example, a small firm could make changes to its positions during the trading day in order to utilize price discrepancies between market prices and prices calculated in the sheet (Securities and Exchange Commission, 1978, pp. 130–136). In many cases, it only took a decision and a quick word between the partner traders to change positions in order to take advantage of a price discrepancy. This potentially practical use of the model was not accompanied by corresponding accuracy. Galai (1977), who tested CBOE prices using the Black–Scholes–Merton model during this early period, found considerable deviations between actual CBOE market prices and the predictions produced by the model. That is not to say that the results produced by the model were so inaccurate as to mislead traders. However, even at this early stage it was evident that the Black–Scholes–Merton model, in spite of being inaccurate, was already packaged (and indeed, disseminated) in the form of trading technologies and practices.
The rise of comprehensive risk management systems

Spreading

Between 1973 and 1977, volumes on the options exchanges grew by more than 500%, the sophistication of trading strategies increased, as Fig. 1 shows, and the number of trading firms doubled (Securities and Exchange Commission, 1978). As the markets for options flourished, so did the trading firms, which no longer employed one or two floor traders, but up to a dozen, along with a similar number
7 Portfolios typically each contained fewer than one hundred options and stock positions (J, 2000).
8 Stocks were traded on a stock exchange. During the period discussed in this paper, the stocks underlying options were typically traded on the New York Stock Exchange.
9 J was a trader on CBOE from the early 70s until the early 90s.

of clerks, runners and back-office employees (E, 2000). In the larger trading firms, portfolio-wide changes could no longer be performed by a single trader. Although scalping aided by sheets was still a possible trading strategy, coordination among traders trading on the same portfolio became increasingly important so that the different trading orders would not undermine each other. Gradually, bare-boned Black–Scholes-based applications, like the pre-calculated sheets, ceased to serve as devices in their own right and were incorporated into larger portfolio management systems. One of the first steps in this direction was a Black–Scholes–Merton-based trading practice known as spreading (Securities and Exchange Commission, 1978). Spreading was a basket term for a variety of planning techniques that were all based on the same principle: finding probable discrepancies between options market prices and their model-generated prices (this was done by computer-programmed calculations of many separate positions) and then using those results to devise a daily trading strategy. The growth in the average number of options per transaction indicates the growing complexity of options trading strategies.10 The decrease in trading in the last three years (1988–1990) followed the market crash of October 1987, which is discussed in the section 'Financial risk management as useful operational tools'. The number-of-options data are adapted from the Options Clearing Corporation's historical data archive (http://www.optionsclearing.com/market/vol_data/main/volume_archive.jsp). The average number of options contracts per transaction is taken from CBOE's 2006 Market Statistics report (http://www.cboe.com/data/Marketstats-2006.pdf). The main difference between the sheets and spreading lay in the organizational setting of the different methods.
Usually, traders who used Black's sheets were able to take advantage of the market–model price discrepancies only if they noticed them during the trading day (J, 2000). Spreading, on the other hand, was planned before the beginning of the trading day at the trading firm's back office, not necessarily by the traders themselves. Thus, at the beginning of the day, a trader would enter the trading floor having seen the day's risk map for the portfolio she was trading and knowing which options were overpriced and which were

10 The average number of options is also related to the average size of transactions, and hence to the number of options traded. However, since trading grew by over 300 times while the average number of options per transaction grew by less than seven times, it is clear that the number of transactions grew immensely, while the size of transactions grew only so as to accommodate the increasingly complex options trading strategies.

Fig. 1. Number of options contracts traded in all options exchanges and average number of contracts per transaction in CBOE, 1973–1990. (Series shown: number of options traded annually, all options exchanges; average number of options per transaction, CBOE.)

underpriced, according to the model. The daily trading strategy was tailored with respect to these predictions. Another difference lay in the nature of the information that the traders received from the methodologies. Unlike the immediate and highly specific information that was provided by the sheets, the typical results of a spreading procedure were broader guidelines that stated recommended ranges for buying and selling. Spreading, apart from automating the actual position-by-position calculations, also introduced a new practice to options trading: discussing and planning in advance the following day's trading game plan on the basis of the model-generated estimates. Such discussions were an inherent part of the spreading procedure because the Black–Scholes–Merton calculations, on their own, did not produce definite sets of instructions for the following trading day. Instead, the results were discussed alongside other bits of information; risks and opportunities were evaluated and an overall picture of the trading day was generated, which led to the design of a recommended daily trading strategy. The crucial communicative element of the model that the spreading practice brought into being was that it enabled the development of improved organisational cognition. Spreading methodologies allowed trading firms to express risks in accessible terms and to construct a clear picture of potential market situations and possible reactions. Therefore, spreading marked an important step in the unfolding of the techno-social process by which Black–Scholes–Merton-based applications gained appreciation for their communicative and managerial usefulness and by which risk assessment transformed into risk management. Model-based applications did not only introduce new practices; they also changed knowledge-related power relations within the trading firms and embodied the extent of the techno-social nature of options markets. In

the early days of option trading on CBOE, much as in the commodities trading world from which it sprang, trading was an expertise that was learned through apprenticeship. A trader would typically start her/his career as a floor runner,11 a job that they would do for 1–2 years before possibly becoming a junior partner and performing trades independently. Yet, even as a junior trader, the partner would generally seek advice from the more senior partners. However, in firms where daily trading plans were designed with the aid of a spreading application, a new factor was added to the decision-making process: the input provided by the model-based application. For example, an enthusiastic young trader could recruit a model-generated result as a source of support for a proposed daring trading strategy, frequently against the advice of a senior partner (E, 2000). Similarly, referring to a prediction offered by the application could bridge differences of opinion about the route that should be taken in a certain situation. The model-based computer application and the information it provided became a non-human market participant: a source of information and, indeed, opinions about the planned trading strategy. Though not the focus of this paper, it should be noted that the changes described above were triggered by the introduction of spreading. The use of spreading was the first time, as far as we know, that two methodologically and conceptually different risk management discourses (model-based and experience-based) competed within trading organisations for prominence and legitimacy in the decision-making process.12
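The discrepancy-hunting core of the spreading procedure described above can be sketched in a few lines of Python. This is a toy illustration, not a reconstruction of any firm's system: the series names, volatility inputs, interest rate and the 5% tolerance band are all assumptions.

```python
import math

def norm_cdf(x):
    # Standard normal cumulative distribution via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes-Merton value of a European call option.
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def daily_guidelines(quotes, spot, r=0.05, band=0.05):
    """Flag each option series as overpriced ('sell'), underpriced ('buy')
    or fairly priced ('hold') relative to its model-generated value.
    quotes: list of (series, market_price, strike, years_to_expiry, volatility)."""
    plan = {}
    for series, market, strike, expiry, vol in quotes:
        model = bs_call(spot, strike, expiry, r, vol)
        if market > model * (1 + band):
            plan[series] = "sell"
        elif market < model * (1 - band):
            plan[series] = "buy"
        else:
            plan[series] = "hold"
    return plan

# An at-the-money call quoted well above its model value is flagged for selling.
print(daily_guidelines([("XYZ Jun 40", 4.0, 40.0, 0.25, 0.3)], spot=40.0))
# -> {'XYZ Jun 40': 'sell'}
```

Real spreading runs would have covered many series per underlying and fed the flags into the kind of pre-market discussion described above, rather than acting on them mechanically.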

11 A floor runner's main task was to deliver order notes to the trader from the back office and execution notes from the trader to the bookkeeper.
12 This structural trend has grown to be an inherent point of contention in modern financial risk management; a point that flares into an open conflict in times of crises, as we now witness.


Multi-exchange option trading

As options became a more popular financial contract, option trading spread from CBOE to other exchanges. By 1977, four other exchanges were also trading options: the American Stock Exchange in New York (AMEX), the Pacific Stock Exchange in San Francisco (PSE), the Philadelphia–Baltimore–Washington Stock Exchange (PBW) and the Philadelphia Stock Exchange (PHLX) (Securities and Exchange Commission, 1978). Although options trading had gained significant popularity, until the mid 80s the SEC did not allow the same option contract (i.e. one based on the same underlying asset) to be traded on more than one exchange, and the rights to trade options were distributed among the exchanges in a ballot. Thus, in order to build and maintain a diversified portfolio, traders had to execute trades on many exchanges across the country. This fact gave a clear advantage to the nation-wide investment firms, which could have traders in all exchanges, over the local firms, which typically traded on only one of the exchanges. Consequently, the late 70s witnessed a change in the ecology of the options trader population. The Chicago-based firms that originated from the commodities trading markets were joined by big, nation-wide firms that entered options markets as an extension of their securities trading (Securities & Exchange Commission, 1980). The entrance of large investment firms did not only change the composition of traders on the trading floor; an equally significant change took place in portfolio management practices. The large trading firms typically held huge portfolios, containing thousands of positions, distributed among four or five different exchanges, and their trading activity was conducted by a few dozen traders. When managing a portfolio of such a size, there was little sense in asking the question: what are the specific risks (and opportunities) involved in my current positions?
There were simply too many possible answers to this question for it to serve as a basis for planning a strategy.13 Hence, the communicative and managerial challenge facing market participants in such an environment was twofold. First, to aid decision-making, it was vital that the highly complex information contained in the large portfolios was simplified. Second, an agreed-upon communicative medium describing portfolio risks was called for, so that the various people involved in executing trading orders and operating in different cities could coordinate their actions. Facing these organizational challenges, trading firms started to consider a new approach to portfolio management, an approach that, for the first time, managed risk directly. This is where the bi-directionality of the Black–Scholes–Merton model was realized as the basis for the model's usefulness, in all its forms. Instead of calculating theoretical prices for each of the positions and then

summing up these results, the new approach took a hypothetical result as its starting point. In other words, the operational question of this new risk management method was: if the market drops/rises by X percent tomorrow, how would that affect my portfolio? To answer such a question, the methodology assumed (in fact, simulated) a market movement of a certain size, say of 10%, then calculated the impact that the market movement would have on each of the positions, and finally summed up the results so as to arrive at the overall implications for the portfolio. In essence, the systems simulated possible future market scenarios by using results coming from the Black–Scholes–Merton model. Although beyond the scope of this paper, it is worth noting that this general principle was later incorporated into Value at Risk (VaR), one of today's leading financial risk management methodologies. Scenario-simulating systems added a new dimension to the communicative function of developing financial risk management. The applications not only created a reference point for the market participants, but also represented the complex market picture in a clear and coherent way. In fact, the communicative usefulness of this new risk management methodology was such that even the information that still originated directly from the markets was mediated by model-generated results. For example, in order to simplify the positions, these were presented as percentages of the previous day's gain/loss predictions and not as absolute numbers (Securities and Exchange Commission, 1986). Results from the scenario-simulating systems became an indispensable mediating step between the market and its participants. When using scenario-simulating systems to design their trading strategy, market participants were no longer confined to concrete results from the market but were able to resort to predicted future situations.
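The what-if question at the heart of these systems can be illustrated with a short sketch. Assuming, for simplicity, a portfolio made up only of European call options (the holdings, volatilities and interest rate below are invented), the simulation shocks the underlying price, revalues every position with the Black–Scholes–Merton formula and sums the differences:

```python
import math

def norm_cdf(x):
    # Standard normal cumulative distribution via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes-Merton value of a European call option.
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def scenario_pnl(positions, spot, shock, r=0.05):
    """Simulate a market move of `shock` (e.g. -0.10 for a 10% drop):
    revalue each position at the shocked spot price and sum the changes.
    positions: list of (quantity, strike, years_to_expiry, volatility)."""
    shocked_spot = spot * (1.0 + shock)
    pnl = 0.0
    for quantity, strike, expiry, vol in positions:
        pnl += quantity * (bs_call(shocked_spot, strike, expiry, r, vol)
                           - bs_call(spot, strike, expiry, r, vol))
    return pnl

# Ten long at-the-money calls partly hedged with five short out-of-the-money calls.
portfolio = [(10, 40.0, 0.25, 0.3), (-5, 45.0, 0.25, 0.3)]
for shock in (-0.10, 0.0, 0.10):
    print(f"market move {shock:+.0%}: P&L {scenario_pnl(portfolio, 40.0, shock):+.2f}")
```

Running a grid of such shocks for every underlying in a portfolio yields exactly the kind of simplified, portfolio-level risk picture the text describes; VaR later added a probability distribution over the shocks.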
The introduction of scenario-simulating systems marked a significant step away from risk assessment and towards risk management. Whereas the use of spreading merely enhanced the ability of traders to communicate their ideas about trading strategy, this new type of application became the tool with which such ideas were generated in the first place. Using spreading, a trader could only illustrate the benefits of the trading strategy she/he had already planned. In contrast, with scenario-simulating risk management systems it became possible, even likely, to receive the initial idea about a possible trading opportunity by examining the application's output. For example, after the proliferation of scenario-simulating applications, traders started to talk about 'buying volatility' or 'selling volatility' when increasing the relative share of options in their portfolios. That is, model-based applications indicated that risky assets of various degrees should be bought or sold in order to balance the portfolio. Scenario-simulating systems did not merely supply reference points for discussions; by presenting a new discourse for the management of portfolios, they made the very existence of such discussions possible.

Financial risk management off the trading floor: options clearing

Prices and risks related to options positions were a matter of concern not only for trading firms, but also for the

13 Large investment companies managed multi-exchange portfolios before exchange-traded options appeared. Yet, the level of coordination necessary in options trading was much higher than in stock trading as the vast majority of options positions were composite: composed of stock positions and one or more option positions that were bought and sold simultaneously, frequently at different exchanges. The growing complexity of options positions over the years is expressed in Fig. 1.


options clearinghouse (the Options Clearing Corporation, OCC14) and for the regulator of securities markets, the American Securities and Exchange Commission (SEC). In fact, this part of the historical analysis reveals the heterogeneous nature of the techno-social network from which financial risk management sprang and thus expands the notion of market participants. The following two sections show that the interests of the clearinghouse and of the SEC, two vital market participants, were very different from the narrow utility-maximization interests usually attributed to market participants. In addition, this part of the analysis shows that financial risk management helped in solving operational and organisational problems not only among trading firms but also among other market participants. Fundamentally, an options clearinghouse ensures that the future obligations of buyers and sellers of options, which derive from the options contracts they buy or sell, are met. To prevent the risk of one of the parties not performing its side of the contract, and to ensure that the market remains liquid and trustworthy, the clearinghouse was assigned as the immediate buyer of options from sellers and the immediate seller to buyers.15 As the other side of the contracts (until expiry or offsetting), the options clearinghouse was exposed to considerable risks. In order to protect itself against those risks, the clearinghouse collected a portion of the contracts' value as collateral, known as margin. Participants were required to deposit margins when they first took a position involving an options contract. Thereafter, the margins might either decrease or increase according to daily price fluctuations. Apart from its own margins, OCC was also responsible for the calculation and collection of another set of risk-related fees: the SEC's net capital requirements.
According to the SEC's net capital rule,16 traders who regularly executed transactions for others, collectively known as broker-dealers (or brokers),17 were required to make daily deposits of specified amounts of money, known as net capital. Unlike margins, the net capital rule's purpose was not to protect the clearinghouse, but to protect broker-dealers' customers in case their funds were inadvertently involved in risky positions held by their brokers. If such losses did occur, the pre-deposited capital would be put towards compensating the customers. In the first three years of its operation, two different methods were used in the options clearinghouse for determining the amounts of margins and net capital requirements. For the clearinghouse's own margins, a premium-based method was used. That is, a fixed premium
14 Since OCC was the only options clearinghouse for organized exchanges in the period discussed, we refer to it as 'the clearinghouse'.
15 The concept of the modern options clearinghouse was developed by the CBOT team who set up the first options exchange. Indeed, the concept of a clearinghouse as an entity separate from the trading community played an important role in the approval of the options exchange itself (R, 2000).
16 The rule to which the paper refers is the revised net capital rule from 1975. Prior to the 1975 amendments (the net capital rule was first written in 1942), brokers had to deposit a set amount of capital at the beginning of a trading day, regardless of the risk level associated with their positions (Seligman, 1982).
17 The largest group of traders handling accounts of others (although there were others) were broker-dealers, who were bound by the SEC's net capital rule.

was paid regardless of the position's components (Seligman, 1982). The net capital requirements, on the other hand, were calculated using a strategy-based method. The strategy-based method of risk-evaluation was based on a set of categories that assigned various levels of risk to the different financial assets and contracts. For example, options were considered more risky than bonds, so the required deposit for options was larger than the one for bonds. The fact that two separate methods were used for the evaluation of the same factor, market risk, caused uneasiness among the trading firms. H, who was a senior executive at the clearinghouse from the late 70s to the mid 90s, described the early years of option clearing: At about 1977–8, OCC had premium-based margin requirements [calculation methodology] and we were barraged with requests to convert the margining system to something like the way the net capital rule worked at the time, which was strategy based. The requests for the changes came from the trading community, principally, and they came in with graphs and numbers and said something like: My risk is limited to this; you should never charge me more than this in margins. (H 2000) Brokers and other traders who had to pay both the SEC's capital requirements and the clearinghouse's margins demanded that the clearinghouse stop charging margins according to the premium-based method and switch to the strategy-based method. From the traders' point of view, the premium-based method was unjust because it did not reflect the growing complexity embedded in options positions and trading methods. Because options were often used to minimize risk levels, charging a flat rate for all options positions, regardless of the implied risk embedded in them, defeated the purpose of using options altogether. Traders were not the only ones who demanded changes in the calculation methods.
Organized option trading was an emerging and highly competitive financial practice in the mid 70s, and each of the exchanges that traded options wanted to attract customers. Since OCC was the only options clearinghouse at the time, it faced demands from all the exchanges to charge less for its services. Facing those pressures, in 1977 the clearinghouse replaced its premium-based method for margin calculation with a strategy-based one (Securities and Exchange Commission, 1986). The new calculation method was seen as a positive move by both the brokers and the exchanges. However, from the clearinghouse's side, the move entailed some significant problems: [The] strategy-based approach, intuitively for OCC, would have complicated the nightly margin calculation process to such an extent that, because everybody was increasing volume on the CBOE, we were worried that we would not be able to get the exercising assignment notices and the reports out in time,18 if we had to calculate margins for the entire market place. What they wanted you to do was to take large accounts with all sorts of positions and break them down into components, strategies, and minimize their margin requirements. Mathematically, it was an optimization problem that would have required iterative calculations. (H 2000)

18 Exercising assignment notices informed trading firms about the amount of daily margin they were required to pay.

Unlike the premium-based method, in which every transaction was charged a pre-determined rate and which hence was a relatively straightforward operation, the strategy-based method required a more arduous procedure. Each portfolio (typically including between 100 and 200 different options and stocks) had to be broken down into the basic positions defined in the rule; for each of those positions a risk level19 (in the case of net capital requirements) or a margin payment was determined, and the calculated amounts were then summed up, producing the daily margin payment or the net capital requirement. Furthermore, because there were several possibilities for breaking down complex positions into simple ones, there also existed several alternative levels of margin payments. As a result, the clearinghouse had to perform an optimization process for each of the portfolios to determine the specific splitting of positions that would result in the minimal payment satisfying the rule. This optimization process had to be done nightly so that payments, into or out of the traders' accounts, could be made the following morning before the beginning of trading. Given the amount of computing power needed for completing the nightly operational task on time, and considering that computers in the mid 70s operated at a fraction of the speed of today's computers, the pressure that margin calculation placed on the clearinghouse can be understood. Apart from the growing trading volumes, the increasing sophistication of option trading also made the performance of the optimization process increasingly difficult.
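The optimization problem H alludes to can be made concrete with a toy search. The charges below are invented stand-ins for the actual strategy-based schedules: an uncovered short call carries a flat charge, while a short call paired with a long call into a spread is charged only the spread's maximum possible loss. Even in this tiny setting, finding the cheapest decomposition means trying every pairing:

```python
# Hypothetical charges; the real strategy-based schedules were far more detailed.
NAKED_SHORT_MARGIN = 1000.0  # flat charge for an uncovered short call

def spread_margin(long_strike, short_strike):
    # A call spread is charged its maximum possible loss: zero when the long
    # strike is below the short strike, otherwise the strike difference x 100.
    return max(0.0, long_strike - short_strike) * 100.0

def minimal_margin(longs, shorts):
    """Search every way of covering the short calls with the available long
    calls (or leaving them naked) and return the lowest total margin."""
    if not shorts:
        return 0.0
    short, rest = shorts[0], shorts[1:]
    best = NAKED_SHORT_MARGIN + minimal_margin(longs, rest)  # leave it naked
    for i, long_strike in enumerate(longs):
        cover = (spread_margin(long_strike, short)
                 + minimal_margin(longs[:i] + longs[i + 1:], rest))
        best = min(best, cover)
    return best

# A 40/45 spread is free of margin; covering the 50 short with the 60 long
# costs exactly as much as leaving it naked.
print(minimal_margin(longs=[40.0, 60.0], shorts=[45.0, 50.0]))  # -> 1000.0
```

The search is exponential in the number of positions, which hints at why, for accounts holding 100–200 positions and run nightly for every account, the clearinghouse found the exercise so burdensome.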
For example, some strategies consisted of simultaneously buying and selling options from the same series, but with different strike prices. By creating such positions, traders covered a range of possible future stock prices at, or near, the expiration date. In order to calculate margins and net capital requirements, portfolios had to be broken down into basic positions. The more separate strategies in use, the more conditions and rules had to be built into the computer programs that performed the actual splitting into basic positions. One of the consequences of the increased demands on the clearinghouse was that it occasionally attempted to lessen the enormous workload by simplifying the strategies. Instead of following all the possible routes in order to find the minimal net capital requirements in a complex portfolio, simpler positions were chosen and portfolios were charged accordingly. The SEC's division of market regulation was in charge of overseeing trading and clearing practices. As such, the division was in charge of applying the changes made in the net capital rule and also of designing, along with the self-regulatory organisations (the exchanges), new risk-evaluation methods. M, who was a senior attorney at the SEC's division of market regulation from the early 70s to the mid 90s, recalled: . . .and then you have First Options [a large trading firm] who would have 800 large portfolios to clear and they [OCC] have to do it account by account. So it involves a lot of computing power. They would just say: We're not going to do that one. We'll just ignore that strategy because it involves six more permutations. . . .And the market maker [trader] will get angry or would question them and say: Look. If I'm doing it then my real risk is that and you're charging me for this. (M 2001) Such disputes were often not resolved between the clearinghouse and the traders. CBOE (being a self-regulatory organization20) was often called in to intervene and mediate. As options strategies became more complex, such disputes broke out more often and this, in turn, added yet another burden on the SEC's division of market regulation. The clearinghouse's problems were mirrored, in part, at the SEC. Each time a new strategy was presented by the trading firms, the SEC had to examine it against the net capital requirements to determine whether the proposed strategy complied with the rule's definitions. During the mid to late 70s, many of the major American brokerage firms expanded into options markets, causing portfolio designs to become steadily more complex, and thus making the rule's maintenance work more cumbersome and time-consuming. The rapid growth in options trading volumes created a situation whereby some personnel at the division of market regulation of the SEC spent much of their time adjusting the net capital rule to the flux of new portfolio strategies. This situation was a cause for much concern in the division of market regulation, since one of the main purposes of the net capital rule's 1975 amendments had been to make the determination of net capital requirements more straightforward and efficient than before.
M describes it as: Our role had gotten so complicated when strategies have constantly been replaced with other strategies. It has become very hard to function in that environment. No matter what you did, there would be another one [trading strategy]. (M 2001) As a result of the trends described above, concern was growing about the discrepancy between the sophistication of portfolio-construction methods displayed by trading firms and the relatively crude risk-evaluation practices that were imposed by the regulator: I would hear [complaints about clearing], but what were we going to do? I mean, that was the rule. They [trading firms] were the ones who wanted the complicated strategies. I wasn't the one saying: I want you to do these complicated strategies. They wanted to do

19 Risk levels were expressed in the form of 'haircuts': discounts applied to the original value of the positions. The riskier the position, the larger the haircut.

20 According to American securities law, national securities exchanges and their clearinghouses are considered self-regulatory organisations. That is, some of the monitoring, supervising and rule-making functions otherwise performed by the SEC are delegated, in the case of self-regulatory organisations, to departments within the organisation.


them. They would, obviously, then have to do the work. (M 2001) That discrepancy was rooted in the different viewpoints that the various market participants (i.e. trading firms, the clearinghouse and the exchanges) held regarding the purposes of financial risk management and, hence, the practice-related nature of accuracy. From the regulatory point of view, risk management was intended to protect customers by collecting back-up funds for the case of a loss, and such money was indeed collected by the clearinghouse. Since the funds were not expected to cover the losses fully in any case, the exact amount was of little significance as long as it was above the set minimum.21 Therefore, for the SEC, an accurate measurement of risk was less important than the fact that net capital was collected and that the rule was followed. In contrast, from the traders' point of view, sophisticated portfolio strategies were critical in reducing costs and achieving an advantage over competitors. Thus, for traders, a relatively inaccurate net capital rule would have undermined that purpose: there would not be much use in employing sophisticated strategies if those were treated as simple ones and incurred high net capital requirements. The aforementioned combination of factors (high volumes of trades, sophisticated strategies and a lagging regulator) led the clearinghouse in the late 70s to look for alternatives to the existing margin calculation mechanism.

Financial risk management as useful operational tools

In the early 80s, two of CBOE's prominent trading firms (Chicago Research and Trading (CRT) and O'Connor & Associates) were using scenario-simulating risk management systems. When H and his team at the OCC started to examine alternatives to the strategy-based margin calculation system, they quickly encountered the new technology: I was going to grad school and one of my grad school teachers was also a CBOE market maker [trader] and he taught me options price theory and I started to talk to him.
The idea was worth a try and we convinced the board [of the clearinghouse] that they should fund some study. [An external company] began to calculate potential theoretical values for us on a daily basis for all the options series for a one-year period and internally we built this program that would calculate a margin requirement equal to the worst possible loss on a line-by-line basis. We ran that for a year, then we wrote another report to our margin committee. (H 2000) The system developed by the OCC applied to the calculation of required margins a scenario-simulating principle similar to the one traders used to design trading strategies. However, there was an important difference. While trading firms wanted to estimate the maximal daily loss in order to minimize it, the clearinghouse used the calculated figure as the required daily margin deposit. These

21 The minimum value of net capital for registered broker-dealers (after their first year as broker-dealers) was set at $250,000 or at 6 2/3% of total debts (SEC, 1975).

two different sets of uses mark the practice-related notion of accuracy. The basic technology underlying both the OCC's and the trading firms' systems was similar, but the organisational settings and ultimate purposes were different. For the trading firms, financial risk management served as a communicative and managerial tool. It provided traders with a language with which they could talk meaningfully about risk and plan in the increasingly complex environment of the market. In contrast, one of OCC's purposes was to reduce the number of complaints regarding margins. OCC used financial risk management to establish and maintain an industrious silence with regard to the calculation of margins, intended both internally (improving the efficiency of the calculation process) and externally (satisfying the demands of the trading firms). Again, as in the case of the trading techniques, accuracy was practice-related. As the quote above shows, OCC tested the model-based system against the existing calculation method. The results of these tests were, and still are, confidential, but interviewees' comments revealed that, from an accuracy perspective, the performance of the model-based system was similar to that of the rule-based one. Nevertheless, the model-based system, which, as explained earlier, was much more efficient in calculating margins than the existing one, was chosen as a replacement. Hence, it was operational usefulness rather than superior accuracy that continued to pave the way for model-based financial risk management. The different organisational structures of the OCC and the trading firms also affected the way in which the historical narrative unfolded. In the late 70s and early 80s, the transition towards model-based practices took place mainly within the broker firms; the clearinghouse joined in only later. In contrast to the trading firms, OCC was slow to implement model-based systems, for a number of reasons.
First, as options markets became more competitive, innovative practices that had the potential to improve the effectiveness of trading were eagerly adopted. Second, when the SEC approved options trading in 1973, it did so only as part of a pilot programme (which lasted until 1979), under which both CBOE and the clearinghouse were subject to close SEC supervision. The pilot programme status meant that prior approval from the SEC was needed for every change to the trading or clearing practices. Trading firms, in contrast, were not subject to such close SEC scrutiny and could implement trading systems much more quickly. Third, CBOE's organizational structure was also related to the elaborate implementation process of the proposed risk management systems. H mentioned that he needed to receive approval from CBOE's margin committee to run a feasibility study and to report to this committee. After proposals were submitted to the committees, making a decision involved political manoeuvring and lobbying to ensure a majority in the committees' votes. By 1986, when OCC fully implemented its internal financial risk management system (model-based margin calculation), model-based applications had already developed into a de facto standard for the communication and management of risk among trading firms and between them and the clearinghouse (Securities and Exchange Commission, 1986a, 1986b). Nevertheless, while the growing popularity of the financial
risk management based on the Black-Scholes-Merton model brought about a standardisation of risk communication, no consensus emerged regarding the accurate measures of risk. In effect, the ubiquity of model-based applications facilitated the debate. For example, when traders negotiated their margin levels with the clearinghouse, which they did continually, the two parties had different opinions about the levels of risk embedded in the various options positions. That is, although both sides knew that the same mathematical model underpinned both the trading firms' and OCC's applications, the main issue they tended to debate was not the universally accurate measure of risk, but the practice-related risk measure. As the interviewees from both the trading companies and the clearinghouse repeatedly stated: each side was concerned about their own risk and how much they were charging or paying for it. Consequently, both the clearinghouse and traders had stakes in the promotion of the model-based financial risk management system. Traders based their coordinated trading activity on it and the clearinghouse found in the model-based margining system an answer to volume and complexity challenges. In contrast, the SEC's point of view on the applications was different. While it was true that the increasingly popular options markets had brought about a significant growth in the complexity of trading strategies, and the fact that each of these strategies had to be approved by the SEC increased its work burden, the net capital rule system functioned properly: money was collected from brokers and investors were, as far as the SEC could judge, protected. While the clearinghouse and the traders were relying on model-based applications and were eager to extend their use, the common opinion at the SEC about the applications remained sceptical.
This is how things stood in the summer of 1987, when staff at OCC prepared for another round of discussions with the SEC, hoping to receive regulatory approval for their proposed model-based calculation of the net capital rule. However, before these planned discussions could take place, an event occurred that dramatically questioned the accuracy and validity of financial risk management systems based on the Black-Scholes-Merton model: the October 1987 market crash.

Financial risk management and the 1987 market crash

By October 1987, risk management systems based on Black-Scholes-Merton were present in virtually all of the major trading firms' offices as well as in the options clearinghouse.22 On Monday 19 October 1987, American financial markets experienced the worst one-day drop in asset prices since October 1929. Since stock prices dropped sharply, options (which were designed to lessen the effect of such situations) were in extremely high demand (Brady, 1988). Furthermore, because many investors were selling stocks to try to cut their losses, price volatility reached record levels. Several interviewees told us that between the 19th and the 22nd of October 1987 Black-Scholes-based applications did not calculate prices and risks correctly. In fact, in a few cases it was reported that the computer systems displayed call option prices that were higher than the market price of the stock for which the option was written (M 2001), which, of course, makes no economic sense. This last point refers directly to the practice-oriented nature of the techno-social network that performed risk management: the theoretical Black-Scholes-Merton model could not have produced such an effect; it was the interaction between model-based computerized applications and live traders that brought it about. Although this paper does not discuss the possible theoretical reasons why the models were not reliable in October 1987, it is clear that some of the basic premises on which the model was established were questioned, if not shaken, by the events. Among the questionable assumptions was the validity of the hypothesis that prices followed a lognormal distribution. Built into the lognormal distribution is the assumption that extreme events are very rare: colloquially, the lognormal distribution is said to have a thin tail.23 It is under this assumption (among others) that the Black-Scholes-Merton model is used to estimate the prices of options. On the 19th of October 1987, it appeared that the assumption about the lognormal distribution of prices did not hold. For instance, events that had very low probabilities and, thus, were expected to occur very rarely (i.e. once in a few decades) happened a few times a day (Rubinstein, 1994). For many market participants it became apparent that under such extreme conditions (for example, the NYSE dropped 21% on 19 October, its biggest one-day drop since the 1920s) model-based financial risk management was not predicting risk accurately and thus could not help to manage risk appropriately.

22 Another pricing model, a variant of the Black-Scholes-Merton model, was developed by Cox, Ross, and Rubinstein (1979) (see also Ross, 1977; Rubinstein, 1994), and it too gained significant popularity during the time period described in this paper.
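The "no economic sense" point can be made precise: in the model itself the call price is S*N(d1) - K*exp(-rT)*N(d2), which can never exceed the stock price S, since N(d1) is at most 1. The following minimal sketch (the parameter values are our illustration, not data from the paper) verifies this bound:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    # standard normal cumulative distribution function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black-Scholes-Merton price of a European call option."""
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

# C = S*N(d1) - K*exp(-rT)*N(d2) <= S*N(d1) <= S: the model can never
# price a call above the underlying stock, at any volatility level.
for sigma in (0.2, 1.0, 5.0):
    assert bs_call(100.0, 90.0, 0.25, 0.05, sigma) < 100.0
```

Screen prices above the stock price therefore could not have come from the formula alone; they were an artefact of the interaction between the computerized applications and live traders, as argued above.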
However, despite the fact that model-based financial risk management had not proved accurate during the crash, the discussions between OCC and the SEC continued, leading eventually (in 1994) to the SEC approving a risk management system based on the Black-Scholes-Merton model for the calculation of the SEC's net capital requirements (Securities and Exchange Commission, 1994, 1997). The system was dubbed TIMS (Theoretical Intermarket Margining System).24 Considering the fact that for the better part of the 80s the market regulation division of the SEC did not approve such systems, one might ask what motivated the SEC to approve TIMS when it did, in the wake of the October 1987 market crash. To understand this step, it is necessary to examine the October 1987 crash and its effect on financial risk management while taking into account the techno-social network of market participants that reacted to the events of October 1987. M was an assistant director in the SEC's division of market regulation in the late 80s and early 90s; while in this position M headed the team that examined the OCC's system.

23 The lognormal distribution, being one-sided, has one tail.
24 The SEC issued a no-action letter about the use of TIMS in 1994 (Securities and Exchange Commission, 1994), meaning that no action would be taken against bodies that used TIMS. The final, unrestricted approval of the system was granted in 1997.

The examination of the system took several months between 1990 and 1991, in which time the SEC and the OCC conducted comparative performance tests between TIMS and the existing strategy-based calculation method. At the completion of the tests, it was concluded that TIMS provided more reliable and accurate results than those produced by the strategy-based system. That is, TIMS predicted daily gain/loss amounts that were closer to the actual market results than the ones determined by the strategy-based system. That said, does this mean that this financial risk management was accurate? The period of the test was a time of relative calm in the markets and so the accuracy of TIMS was not tested during periods of extreme volatility such as those that existed in October 1987. The results meant that under ordinary market conditions TIMS would provide appropriate amounts of net capital, but what would happen in times of extreme market conditions? The SEC's answer to this question was simple:

[TIMS] is good for business purposes. Obviously, a businessman should know what his risk is from day to day. He should also have an idea of what the worst thing that could happen to him, more or less. [I]n the ordinary circumstance, not much capital is needed from day to day. You only need it in stress times. Stress times don't occur that frequently. So the model is always wrong! Because it will not give that stress capital. (M 2001)

M, like many other market participants, was aware of the fact that under extreme conditions Black-Scholes-Merton-based applications did not provide accurate results.
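The logic of such a comparative test can be caricatured in a few lines. Everything below (the toy P&L series, the flat strategy-based charge, the error metric) is our hypothetical illustration, not OCC's or the SEC's actual data or procedure; the question is simply which system's predicted one-day gain/loss tracks realized outcomes more closely:

```python
def mean_abs_error(predicted, realized):
    # average absolute gap between predicted and realized daily P&L
    assert len(predicted) == len(realized)
    return sum(abs(p - r) for p, r in zip(predicted, realized)) / len(realized)

realized = [-1.2, 0.4, -0.8, 2.1, -0.3]          # realized daily P&L (toy data)
strategy_based = [-3.0, -3.0, -3.0, -3.0, -3.0]  # flat rule-of-thumb worst-case charge
model_based = [-1.0, 0.2, -1.1, 1.8, -0.5]       # market-sensitive model estimate

# under calm conditions the model-based estimates sit closer to outcomes
assert mean_abs_error(model_based, realized) < mean_abs_error(strategy_based, realized)
```

The catch, as the text notes, is that a test run over a calm period says nothing about the days, like October 1987, on which the capital is actually needed.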
Nevertheless, as the quote shows, from a regulatory point of view it was more important to approve a financial risk management system that was acceptable to virtually all market participants, albeit unreliable under infrequent extreme conditions, than to retain a system (strategy-based rules) about whose usefulness most market participants had complaints.25 This argument is rooted deeply in the practical intention behind the net capital rule. The rule was designed to protect customers from the possible adverse consequences of positions they did not explicitly intend to hold. That is, if a broker constructed risky positions using customers' money without the direct intention of the customers and the positions resulted in a loss, the customers were entitled to compensation. However, in times of extreme volatility, when prices in the markets as a whole fluctuate wildly, even conservative positions could be risky. In other words, the net capital rule was not designed to protect market participants from events of the type that occurred in October 1987. Therefore, from this perspective it was of little significance that the model used in the rule was inaccurate when such events happened. When the SEC tested TIMS in the early 90s, Black-Scholes-based applications had already served as the agreed-upon communicative and organizational basis for options trading and for the calculation of margins by the clearinghouse. The regulatory approval of TIMS indicated
25 This duality brings to mind Bloor's analysis (Bloor, 1978) of reactions to social anomaly (comparing Imre Lakatos' and Mary Douglas' approaches).

not only that the preferences of the SEC regarding options markets had changed, but also that a more fundamental change had taken place. The dominance of model-related practices in the options market environment had a significant impact on the SEC's perspective of the markets. In particular, the concept of the common businessman was influenced by the awareness that the model had become the common language in the market. When M, the SEC's senior employee, mentioned that a businessman should know what his risk is from day to day, he was not merely voicing a normative conviction based on the rules and regulations of the SEC, but one that drew its power from a more general set of values. That is, market participants should use financial risk management on a daily basis not because the system produces accurate results (at critical times it does not!) but because the different systems based on the model proved to be very useful. Financial risk management systems facilitated the growth of the market through their use by the trading firms (where they facilitated efficient organisational communication) and through their application by OCC (where they solved the bottleneck problem of calculating margins). The techno-social network of financial risk management replaced accuracy with usefulness. The claim that TIMS, like other Black-Scholes-based applications, was more important for the organizational and social role it played in the markets than for its accuracy is supported by yet another finding. Dissatisfied with the unreliable results that the Black-Scholes-Merton model produced under the extreme volatility of 1987, in the early 90s OCC developed a version of TIMS that did not depend on the Black-Scholes-Merton model's lognormal distribution (Hinkes, personal communication).
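The practical difference between a thin and a fat tail is stark. As a numerical illustration (ours, not the paper's), we use the Cauchy distribution, the alpha = 1 member of the Levy stable family, which has infinite variance and a closed-form tail; the paper does not specify which stable distribution OCC actually used. Compare the probability of a move beyond ten "typical" units:

```python
from math import atan, erfc, pi, sqrt

def normal_tail(x):
    # P(X > x) for a standard normal: thin-tailed, like the log-returns
    # implied by the lognormal price assumption
    return 0.5 * erfc(x / sqrt(2.0))

def cauchy_tail(x):
    # P(X > x) for a standard Cauchy, the alpha = 1 member of the
    # Levy stable family: fat-tailed, infinite variance
    return 0.5 - atan(x) / pi

thin = normal_tail(10.0)  # about 8e-24: such a day essentially never happens
fat = cauchy_tail(10.0)   # about 0.03: such a day is expected every few months
```

Under the fat-tailed law an extreme day is, in effect, expected, so margins computed from it need not jump alarmingly when one occurs; this is precisely the usefulness property described below.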
In this newer system, OCC made use of another set of statistical distributions, the fat-tailed Lévy distributions with infinite variance, which assigned extreme events like those of October 1987 a higher probability than the system based on the lognormal distribution did. The margining system based on the stable Lévy distributions has the virtue that sudden increases in price volatility, because they are expected by the distribution, do not lead to sudden and large increases in margin demands. The Lévy distributions, however, as was widely known when OCC decided to use them in the margin calculation system, are not empirically accurate (Officer, 1972; Hsu, Miller, & Wichern, 1974). Yet, a system using that distribution was much more likely to produce useful margin calculations under extreme conditions than the alarmingly high ones produced by the Black-Scholes-Merton lognormal distribution. Again, we see how financial risk management prefers usefulness over accuracy.

Conclusion

As implied in the title of the paper, in many respects the story of the establishment of the Black-Scholes-Merton model simply marks the emergence of contemporary financial risk management. The current dominance of risk management methods means that financial risk management is no longer limited to financial markets: a variety of managerial decisions, from the amounts paid as equity-based bonuses to corporate executives to
the degree of risk associated with exploring new oil fields, are taken in collaboration with systems that include variants of the Black-Scholes-Merton model. Can we draw a lesson from the history of financial risk management in early derivatives markets for other areas? Can a more general insight be derived from the specific historical case analysed in this paper? One possible direction is to examine the mechanisms through which knowledge dispersed from one realm of practice to another. The case shows that the small trading firms of early CBOE used a model-based application (Fischer Black's sheets) as a trading aid for individual traders. Then, when larger firms entered the options exchanges, model-based applications served as tools for organisational communication. The clearinghouse, in turn, developed a solution to a technical-operational problem using the model. Finally, the SEC approved an application that performed a regulatory function. This process of gradual dissemination of knowledge includes two elements. First, the transfer of knowledge regarding the Black-Scholes-Merton model was not a simple diffusion of information, but an interpretive process. The actors analysed the practices in which the model took part, distilled from them the features that could be useful in their own realm of practice and employed those features in a new set of applications. These useful features of the model, the bidirectionality of prices and quantified risks, were placed in a new techno-social set of conditions. However, the dissemination of knowledge related to the model included a second crucial element: the gradual accumulation of model-based technologies and practices, an accumulation that was not undermined by the realisation that the model was empirically inaccurate. Realising what the active ingredients of the model were was not, on its own, a sufficient condition for cross-realm adoption of the model-based applications.
It is true that market participants, such as the clearinghouse, tested the applications rigorously before they were put into service. However, it is unlikely that the clearinghouse would have started the organisationally and politically costly process without strong indications from the trading firms that the model was useful for them. Similarly, the SEC would not have approved the model-based calculation of net capital without the clear realisation that both trading firms and the clearinghouse used it extensively. The combination of these two elements, the bare-bones technical usefulness of the model and the emerging social usefulness that derived from the fact that others also used it, is responsible for the widespread development and adoption of model-based applications. While a discussion of the efficacy of the first element, the inner structure of the model, is beyond the scope of this paper, the strength of the social usefulness is highly related to the fact that financial risk management emerged through a network of connections. The accumulation of implicit trust in the usefulness of the model, in spite of the different practices involved, is reminiscent of the phenomenon the American sociologist Burt (2005) defines as closure. Closure between two actors exists to the degree that the two have strong connections with other mutual actors, and it tends to be positively correlated with the density of the network. When closure exists, the interdependency
created through the structure of connections tends to bring about trust, or at least trust-like effects. Accordingly, we can hypothesise that for a financial risk methodology to become successful and to spread across practice domains, the connections between the different actors need to be strong and the overall structure of connections has to be dense. This hypothesis implies the existence of a social organisational structure of a very different nature from the one commonly assumed to exist in financial markets. While financial risk management is frequently associated with procedure-based, utilitarian, arm's-length connections, the development of the financial risk management systems analysed here, when seen through the prism of Burt's closure, reveals a different type of dynamics: not only did the different organisational actors know each other well, but they also trusted each other's judgement about the usefulness of the systems. The development of practice-oriented trust across the different realms evokes another interesting theoretical concept that, although not referring directly to the notions of accuracy and practicality, may still enrich our understanding of the historical process analysed in the paper. Bowker and Leigh Star (1999) analyze classificatory systems and trace the ways through which these were embedded into organizational infrastructures and eventually became part of the taken-for-granted organizational reality. Using detailed case studies, Bowker and Leigh Star describe how the networks of connections, both within organizations and among them, created and legitimized rules and practices. They refer to the technical and procedural entities that bridged between the different organisational actors as boundary objects (although it may be better to refer to them as boundary-spanning objects).
Boundary objects, according to Bowker and Leigh Star, are objects that can facilitate communication among several communities of practice and satisfy the informational requirements of each of them. Boundary objects are "plastic enough to adapt to local needs and constraints, yet robust enough to maintain a common identity across sites" (p. 297). Thus, while not referring to the accuracy issue, the concept of boundary object may help us to conceptualise the ways in which the reputation of the model-based applications grew. The different market participants held widely varying perceptions with regard to risk, from which their different needs and constraints were derived. As the paper shows, financial risk management developed into just such a plastic medium, one that was able to accommodate different practices while allowing awareness of the common elements of the practices to evolve and strengthen the connections among the actors. Another area where the conclusions of this paper can be used as a basis for further conceptualisation is accounting research. The tension between accuracy and practice, inherent to the organisational application of expert knowledge, is highly relevant for accounting research. Whilst one of the fundamental principles of financial accounting, for example, is the production of accurate and objective reports, the very same reports are used to plan and bring about changes at the organisations described in the reports, and thus affect the potential validity and accuracy of future reports. This constitutive power inherent to accounting is also related to the fact that accuracy is nested within the
Y. Millo, D. MacKenzie / Accounting, Organizations and Society 34 (2009) 638653 Abolaa, M. Y. (1998). Markets as cultures: An ethnographic approach. In M. Callon (Ed.), The laws of the markets (pp. 6985). Oxford: Blackwell. Baker, W. E. (1984a). The social structure of a national securities market. American Journal of Sociology, 89, 775811. Baker, W. E. (1984b). Floor trading and crowd dynamics. In P. A. Adler & P. Adler (Eds.), The social dynamics of nancial markets (pp. 107128). Greenwich, Conn.: JAI Press. Barnhart, R. K. (1999). Chambers dictionary of etymology. London: Chambers. Beunza, D., & Stark, D. (2003). The organization of responsiveness: Innovation and recovery in the trading rooms of lower Manhattan. Socio-Economic Review, 1, 135164. Beunza, D., & Stark, D. (2004). Tools of the trade: The socio-technology of arbitrage in a wall street trading room. Industrial and Corporate Change, 13, 369400. Beunza, D., & Stark, D. (2005). Resolving identities: Successive crises in a trading room after 9/11. In N. Foner (Ed.), Wounded city: the social impact of 9/11 (pp. 293320). New York: Russell Sage Foundation Press. Black, F., & Scholes, M. (1972). The valuation of option contracts and a test of market efciency. Journal of Finance, 27, 399417. Black, F., & Scholes, M. (1973). The pricing of options and corporate liabilities. Journal of political economy, 637654. Bloor, D. (1978). Polyhedra and the abominations of leviticus. The British Journal for the History of Science, 11, 245272. Bowker, G., & Susan, L. S. (1999). Sorting things out: Classication and its consequences. London: The MIT Press. Brady, N. (1988). Report of the presidential task force on market mechanisms. Washington, DC. Brown, J., & Duguid, P. (2001). Knowledge and organization: A socialpractice perspective. Organization Science, 12, 198213. Burchell, S. C. C., & Hopwood, A. G. (1985). Accounting in its social context: Towards a history of value added in the United Kingdom. 
Accounting, Organizations and Society, 10, 381413. Burt, R. (2005). Brokerage and closure: An introduction to social capital. Oxford: Oxford Univ. Press. Clark, G. (2000). Pension fund capitalism. Oxford: Oxford University Press. Cox, J., Ross, S., & Rubinstein, M. (1979). Option pricing: A simplied approach. Journal of Financial Economics, 7, 229236. Fenton-OCreevy, M., Nicholson, N., Soane, E., & Willman, P. (2005). Traders: Risks, decisions, and management in nancial markets. Oxford: Oxford University Press. Fourcade, M. (2007). Theories of markets and theories of society. American Behavioral Scientist, 50, 10151034. Galai, D. (1977). Tests of market efciency of the Chicago board options exchange. Journal of Business, 50, 167197. Granovetter, M. (1985). Economic action and social structure: The problem of embeddedness. American Journal of Sociology, 91, 481 510. Granovetter, M. (1992). Economic institution as social constructions: A framework for analysis. Acta Sociologica, 35, 311. Holzer, B., & Millo, Y. (2005). From risks to second-order dangers in nancial markets: Unintended consequences of risk management systems. New Political Economy, 10, 223245. Hsu, D.-A., Miller, R. B., & Wichern, D. W. (1974). On the stable paretian behavior of stock-market prices. Journal of the American Statistical Association, 69, 108113. Hull, J. C. (2005). Options futures and other derivatives. London: Prentice Hall. Izquierdo, J. (2001). Reliability at risk: The supervision of nancial models as a case study for reexive economic sociology. European Societies, 3, 6990. Kalthoff, H. (2005). Practices of calculation: Economic representations and risk management. Theory, Culture and Society, 22(2), 6997. Kalthoff, H., Rottenburg, R., & Wagener, H.-J. (2000). konomie und Gesellschaft, Jahrbuch 16. Facts and gures: Economic representations and practices. Marburg: Metropolis. Knorr Cetina, K., & Bruegger, U. (2000). 
The market as an object of attachment: Exploring postsocial relations in nancial markets. Canadian Journal of Sociology, 25, 141168. Knorr Cetina, K., & Bruegger, U. (2002a). Global microstructures: The virtual societies of nancial markets. American Journal of Sociology, 107, 905951. Knorr Cetina, K., & Bruegger, U. (2002b). Inhabiting technology: The global lifeform of nancial markets. Current Sociology, 50, 389 405. Knorr Cetina, K., & Preda, A. (Eds.). (2005). The sociology of nancial markets. Oxford: Oxford University Press. Latour, B. (1988). Mixing humans and nonhumans together: The sociology of a door-closer. Social Problems, 35(3), 298310.

larger framework of relevancy and usefulness. Financial reports are aimed at various communities of practice (shareholders, prospective investors, creditors, etc.) and to be counted as accurate the data contained in the reports has to be relevant and useful to the particular group. Outside the groups particular frame of reference, the question of accuracy has little meaning, as the sets of denitional algorithms with which the group measures the reports accuracy are embedded within the groups practice-based epistemology (Brown & Duguid, 2001). This interplay between practices and accuracy begs the following question: if accuracy includes a fundamental practice-dependent dimension, then can the usefulness of a practice become a substitute for its lack of accuracy? For example, can a useful forecasting technique overcome the fact that the forecasts it produces tend to be inaccurate?26 These questions touch directly on the general question of nature of expert knowledge in and around organisations. As such, addressing these questions and analysing the processes that underpin them are central to the continuation of the project whereby accounting is conceptualised and understood as a social and organisational practice and in which the arenas where various aspects of this practice are mapped (Burchell & Hopwood, 1985). The conclusions of this paper, in turn, which analysed the emergence, ascendance and establishment of nancial risk management techniques, build on this research agenda and offer to extend it, through further research, into areas that hitherto were virtually dominated by nancial economics. Acknowledgement The authors are grateful to the interviewees who committed their time, expertise and experience to us. Special thanks go to Mike Power and Bridget Hutter for their support and encouragement. 
The following people commented on earlier versions of the paper and offered insightful comments (all remaining mistakes are, of course, ours): Daniel Beunza, Asaf Dar, Stephen Dunne, Manu Haven, Peter Levin, Geoff Lightfoot, Simon Lilley, Roi Livne, Andrea Mennicken, Anette Mikes, Peter Miller, Fabrizio Panozzo, Martin Parker, Martha Poon, Alex Preda, Nick Prior, Uri Schweid, David Stark, Balzs Vedres, Dani Vos, Josh Whitford, Yuval Yonay and Ezra Zuckerman. In addition, previous versions of the paper were presented in seminars at the London School of Economics, University of Edinburgh, Universit Ca Foscari, Columbia University, University of Haifa and University of Leicester where more comments were received. Finally, we would also like to thank the anonymous AOS reviewers for their focused and constructive comments, which improved this paper greatly. All remaining errors are ours.

References
Abolaa, M. Y. (1996). Making markets: Opportunism and restraint in wall street. Cambridge, MA: Harvard Univ. Press.

26 A similar analytical conceptual of the connection between practice and validity is offered by Macintosh, Shearer, Thornton, and Welker (2000, pp. 4045), who use Baudrillards simulacra as their main theoretical concept (we thank the AOS reviewer for the reference to this paper).

Y. Millo, D. MacKenzie / Accounting, Organizations and Society 34 (2009) 638653 Latour, B. (2005). Reassembling the Social: An introduction to actor network-theory. Oxford: Oxford University Press. Law, J. (1991). Introduction: Monsters, machines and sociotechnical relations. In J. Law (Ed.), A sociology of monsters: Essays on power, technology, and domination (pp. 125). London: Routledge. Lpinay, V.-A. (2007). Decoding nance: Articulation and liquidity around a trading room. In D. MacKenzie, F. Muniesa, & L. Siu (Eds.), Do economists make markets? On the performativity of economics (pp. 87127). Princeton, NJ: Princeton University Press. Macintosh, N. B., Shearer, T., Thornton, D. B., & Welker, M. (2000). Accounting as simulacrum and hyperreality: Perspectives on income and capital. Accounting, Organizations and Society, 25, 1350. MacKenzie, D. (2006). An engine, not a camera: How nancial models shape markets. London: The MIT Press. MacKenzie, D. (2009). Material markets: Facts, technologies, politics. Oxford: Oxford University Press. MacKenzie, D., & Millo, Y. (2003). Negotiating a market, performing theory: The historical sociology of a nancial derivatives exchange. American Journal of Sociology, 109(1), 107145. McDonald, R. L. (2006). Derivatives markets. Addison-Wesley. Merton, R. C. (1973). Theory of rational option pricing. Bell Journal of Economics and Management Science, 4, 141183. Miller, P., & OLeary, T. (2007). Mediating instruments and making markets: Capital budgeting, science and the economy. Accounting, Organizations and Society, 32(78), 701734. Millo, Y. (2007). Making things deliverable: The origins of index-based derivatives. In M. Callon, Y. Millo, & F. Muniesa (Eds.), Market devices. Oxford: Blackwell. Muniesa, F., & Callon, M. (2007). Economic experiments and the construction of markets. In D. MacKenzie, F. Muniesa, & L. Siu (Eds.), Do economists make markets? On the performativity of economics. Princeton, NJ: Princeton University Press. 
Ocer, R. R. (1972). The distribution of stock returns. Journal of the American Statistical Association, 67, 807812. Polanyi, K., & MacIver, R. M. (1957). The great transformation. Boston: GowerBeacon Press. Power, M. (2007). Organized uncertainty: Designing a world of risk management. Oxford: Oxford University Press. Preda, A. (2001a). The rise of the popular investor: Financial knowledge and investing in england and france, 18401880. Sociological Quarterly, 42, 205232. Preda, A. (2001b). In the enchanted grove: Financial conversations and the marketplace in England and France in the 18th century. Journal of Historical Sociology, 14, 276307. Preda, A. (2001c). Sense and sensibility: Or, how should social studies of nance behave, a Manifesto. Economic Sociology: European Electronic Newsletter, 2/2, 1518. Preda, A. (2004). Informative prices, rational investors: The emergence of the random walk hypothesis and the nineteenth-century Science of Financial Investments. History of Political Economy, 36, 351386.


Preda, A. (2006). Socio-technical agency in financial markets: The case of the stock ticker. Social Studies of Science, 36, 753–782.
Ross, S. (1977). The capital asset pricing model (CAPM), short-sale restrictions and related issues. Journal of Finance, 32, 177–184.
Rubinstein, M. (1994). Implied binomial trees. Journal of Finance, 49, 771–818.
Securities and Exchange Commission. (1975). Securities Exchange Act of 1934, Rule 15c3-1. Code of Federal Regulations, 17, part 240.
Securities and Exchange Commission. (1978a). The SEC speaks.
Securities and Exchange Commission. (1978b). Report of the special study of the options markets to the Securities and Exchange Commission. Washington, DC: Securities and Exchange Commission.
Securities and Exchange Commission. (1980). The SEC speaks.
Securities and Exchange Commission. (1986a). Self-regulatory organizations; Options Clearing Corp.; Order approving proposed rule change. Release No. 34-23167; File No. SR-OCC-85-21. 51 FR 16127.
Securities and Exchange Commission. (1986b). Self-regulatory organizations; Options Clearing Corp.; Proposed rule change. Release No. 34-22844; File No. SR-OCC-85-21. 51 FR 4257.
Securities and Exchange Commission. (1994). Release No. 33761 (proposing release). Washington, DC.
Securities and Exchange Commission. (1997). Release No. 34-38248; File No. S7-7-94. Washington, DC.
Seligman, J. (1982). The transformation of Wall Street: A history of the Securities and Exchange Commission and modern finance. Boston: Houghton Mifflin.
Stulz, R. (2002). Derivatives and risk management. Thomson Learning.
Thrift, N. (1994). On the social and cultural determinants of financial centres: The case of the City of London. In S. Corbridge & R. Martin (Eds.), Money, power, and space (pp. 327–355). Oxford: Blackwell.
Uzzi, B. (1996). Embeddedness and economic performance: The network effect. American Sociological Review, 61, 674–698.
Uzzi, B., & Gillespie, J. (2002). Knowledge spillover in corporate financing networks: Embeddedness, network transitivity and trade credit performance. Strategic Management Journal, 23, 595–618.
Uzzi, B., & Lancaster, R. (2003). Relational embeddedness and learning: The case of bank loan managers and their clients. Management Science, 49, 383–399.
Zaloom, C. (2003). Ambiguous numbers: Trading technologies and interpretation in financial markets. American Ethnologist, 30, 258–272.
Zaloom, C. (2004). The productive life of risk. Cultural Anthropology, 19, 365–391.
