
Metadata, controlled vocabulary and directories: electronic document management and standards for records management

Alistair Tough and Michael Moss

Introduction
Conventional conceptions of records management have been based on the ``life cycle'', which in recent years has been taken as implying that records managers only become involved ``downstream'' and have no necessary involvement in system design and creation, which has typically been left in the hands of others[1]. This has been challenged by the ``records continuum'' concept, and the latter was explicitly the basis on which AS 4390, the Australian Standard for records management, was predicated[2]. AS 4390 (1996) states that the term records continuum:
Refers to a consistent and coherent regime of management processes from the time of the creation of the records (and before creation, in the design of recordkeeping systems), through to the preservation and use of records as archives.

The authors: Alistair Tough is a Senior Research Fellow in the Humanities Advanced Technology and Information Institute, and Michael Moss is Professor of Archival Studies, both at the University of Glasgow, UK.

Keywords: Document management, Record keeping, Data storage

Abstract: The authors argue that the development and use of elaborate embedded directory structures or file plans, derived from functional analysis, should be a key component in the future development of the discipline of records management. Directory structures thus conceptualised are explicitly intellectual constructs and their construction will require considerable effort, particularly if they are to be portable. Their greatest advantage is that they provide a coherent schema from which to derive folder/file names that can be embedded in metadata. One of the major challenges is to design systems that derive metadata from the directory structure or file plan and attach them automatically to documents at the point of creation, thereby minimising the need for human intervention and opportunities for human error.

In a digital era, any debate about which of these approaches best suits hard copy systems has been rendered irrelevant. If records created electronically are to be managed effectively and retained as is operationally necessary, then records managers do need to adopt the ``records continuum'' approach as a theoretical underpinning for their work, or IT professionals will do it for them, if they are not already doing so. Since an increasing range of records is either open to public inspection or can be ``discovered'' by the courts, this concept must inevitably impinge on management theory and practice. This is well recognised among information professionals but, as yet, rarely acknowledged by top management. It is depressing to read Information as an Asset: The Board Agenda (KPMG IMPACT, 1995), the report of the Hawley Committee, and to find no input from the information professions apart from those with a background in IT, and no explicit recognition that authenticated information is frequently a by-product of business transactions.

EDM, FTR and the WWW


Many EDM (electronic document management) vendors with managers of IT infrastructure in their sights are currently promoting ``off the peg'' systems that rely primarily on the use of automated indexing for the purpose of retrieval. The basic proposition is that structuring of records systems is no longer necessary. Instead of being placed in context by means of a filing plan, all documents can be treated as discrete units. Then appropriate index terms can be attached to and/or derived from the documents for purposes of retrieval and use, which cannot of itself guarantee structured retrievals. This is, in essence, the application of the classic ``discrete unit'' methodology of librarianship to records, where items relating to one subject are simply referenced and presented to the user as a random collection of related entities, often only bound together by a single co-ordinate[3]. They may be arranged sequentially but rarely subdivided by the nature of the content[4]. In computational terms it is classic graph theory, where a multiplicity of entities can be made to relate to each other in an unstructured fashion. This, writ large, is the philosophy which underpins the World Wide Web (WWW) and demands very powerful algorithms to facilitate searching. The Achilles' heel of this approach lies in the welter of ``hits'' that it is likely to produce. For instance, consider a paper-based filing system consisting of 500 files containing 200 documents on average, and then imagine what that translates into as an EDM system of the type under consideration. We get the following equation (allowing for ten index terms per document): 10 terms × 200 documents × 500 files = 1,000,000 terms. These 1 million terms will not all be different, of course, and therein lies the problem. A search for a frequently occurring term is liable to produce thousands of ``hits'' and this is more likely to swamp the end user than be of any help. The addition of free text searching to the end user's armoury is unlikely to make this situation any better and such utilities notoriously fail to represent items nested within a hierarchy.

Records Management Journal, Volume 13, Number 1, 2003, pp. 24-31. MCB UP Limited. ISSN 0956-5698. DOI 10.1108/09565690310465713
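The arithmetic above can be run as a short calculation. The file and term counts come from the text; the two per cent hit rate for a common term is our own hypothetical figure, introduced only to illustrate the swamping effect:

```python
# Illustrative sketch: why unstructured index-term retrieval swamps the user.
# Assumes 500 files of 200 documents, ten index terms per document (figures
# from the text above).
files = 500
docs_per_file = 200
terms_per_doc = 10

total_terms = files * docs_per_file * terms_per_doc
print(total_terms)  # 1000000 term occurrences in all

# Suppose a common term is attached to just 2 per cent of documents
# (a hypothetical rate): a single search still returns thousands of hits.
documents = files * docs_per_file
hit_rate = 0.02
hits = int(documents * hit_rate)
print(hits)  # 2000 documents returned for one search term
```

Even at this modest scale, a single frequently occurring term buries the end user under two thousand undifferentiated hits.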
Full text retrieval (FTR), while unquestionably a valuable tool, is limited by its reliance on natural language, because different record creators inevitably use different terms to carry the same or closely related meanings. In addition, the growing use of Web-based search and retrieval techniques by knowledge management (KM) systems is also a less than wholly satisfactory approach when viewed from a records management perspective. To understand why this is the case, it may be helpful to think of KM systems as searching internal and external knowledge bases (a terminology employed by their creators). The use of hypertext links and mark-up language by KM systems to enable external searches of the World Wide Web self-evidently can and does work well in the short term. Sufficient numbers of appropriate hits are made with sufficient speed to satisfy the information needs of end users. However, in the longer term such short cuts will fail unless practices and procedures are in place (and they rarely if ever are) to ensure that addresses are constantly updated and, more importantly, removed when the content of the target site ceases to contain relevant references. Remarkably, in the commercial world KM systems look suspiciously like the surrogate libraries they often replaced, but are managed by those from an IT background with little understanding of the skills of the librarian. It is worth remembering that Internet users are not concerned to achieve comprehensive coverage. They merely want sufficient information on a given subject to meet their needs. They do not want or require all of the available information on the subject and in most instances could not cope with it if it were made available. This is in clear contrast with records management systems, where accurate and complete discovery of all records relating to a particular transaction, decision or case is an essential feature. As we argued above, where records are open to public inspection or to discovery through the courts this has significant implications. Neither Web-based KM systems nor full text retrieval can meet these requirements, although each may be a useful element in a comprehensive information and records management system. Hendley (2001) has described this situation in the following terms:
On the minus side, the Web itself offers very few facilities for managing collections of documents . . . There is no inherent provision for creating hierarchical structures . . .

Or, as Carroll (2001) has expressed it:


The WWW protects the user from the need to know where information is physically stored and allows authors and users to set up, manage and navigate information bases that span different locations, resources and organisations.

Glasgow University's effective records management (ERM) prototype and Iron Mountain's Digital Archive package, currently under development, adopt a more sophisticated approach. Both rely on metadata to tag individual documents in such a way as to maintain essential linkages between documents and the functional/transactional context of their creation (see Bennison, 2001; Currall et al., 2001). In the Glasgow system, which is designed specifically to handle committee transactions, the linkages between committee minutes, agendas, and papers submitted for consideration by the committee are captured by attaching the appropriate metadata at the point of creation (Currall et al., 2001). The creators acknowledge, however, that the pressures on committee clerks to meet deadlines and distribute documents on schedule may militate against the comprehensive capture of metadata, including contextual information, on which their approach depends. Moreover, it is inevitably expensive and the costs must be traded off against the benefit of improved access. Beyond systemic pressures that may vitiate absolute reliability in the use of metadata, there is the additional factor of human error. Even a 1 per cent error rate could have potentially important consequences. This is especially the case for any system that has no internal structure to limit the consequent difficulties of attempting retrieval. This problem is very evident from the enormous problems encountered in normalising roll call data even in small organisations. Without sufficient co-ordinates it is impossible to determine whether two instances of the same name relate to the same individual in sets of transactional records. Human users can often tell from the context in which the record is held, but a machine cannot unless the relevant metadata are present. It would be foolhardy for any organisation to hold such data (often the bulk of its records) unless hardwired together in a database with sufficient co-ordinates and unique identifiers to be certain that individuals could be unambiguously distinguished.

Directories as file plans


In our opinion, in this context, there is a strong argument for transporting physical directories of files and filing systems into the electronic medium. As the term directory structure is sometimes associated with the information architecture of a local area network, we should emphasise that we envisage a broader, more catholic concept in which functions, activities, processes, projects and other record series can be hierarchically represented. Since in the digital world the names of files and directories cannot be made secure (witness the way documents attached to e-mails are arbitrarily named), this directory structure must be embedded within the metadata. In computational terms this is a ``tree'', and trees can of course be combined with graphs to take advantage of the best of both worlds (Kingston, 1998). Potentially the adoption of this approach alongside (not instead of) other approaches has considerable benefits. Firstly, where an EDM system based primarily on the use of automatically generated index terms runs the risk of creating a digital lake, embedded directory structures can create a series of navigable pools. The end user can choose to limit his search to particular parts of the system, thereby reducing the number of hits to manageable proportions. Similarly, in a system based on the use of metadata, the difficulties of recovering from the loss of documents and of contextual data consequent on human error and systemic pressure can be substantially reduced. Embedded directory structures have the potential to serve as an effective damage control mechanism, as a tool for making searches more precise and more productive, and to provide a navigational facility that will support comprehensive retrieval. We would argue that there are strong historical precedents for advocating electronic records management systems that utilise directory structures embedded in metadata in conjunction with automatic indexing. The centralised registry systems that were used by the great departments of state in both the UK and Germany until the First World War provide a model to emulate.
Every record was allocated a unique reference and the registers contained subject indexing terms and metadata relating to both provenance and related documentation. The series of which the individual registers formed part provided a precursor of directory structures. Contemporary users held this system in high regard and it was only abandoned because the abrupt increase in transactions and in the paperwork generated by transactions occasioned by the war overwhelmed it (see, for example, Orbell, 1991).
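The idea of embedded directory structures creating ``navigable pools'' can be sketched in a few lines. All document identifiers, file-plan paths and index terms below are invented for illustration only:

```python
# A minimal sketch of an embedded directory structure: each document carries
# its full file-plan path in its metadata, so a search can be confined to one
# "pool" of the system rather than trawling the whole "lake".
documents = [
    {"id": 1, "path": "Finance/Audit/2002", "terms": {"contract", "review"}},
    {"id": 2, "path": "Finance/Payroll/2002", "terms": {"contract"}},
    {"id": 3, "path": "Estates/Maintenance", "terms": {"contract", "roofing"}},
]

def scoped_search(docs, term, scope):
    """Return documents bearing `term`, limited to one branch of the file plan."""
    return [d for d in docs if term in d["terms"] and d["path"].startswith(scope)]

# An unscoped search for "contract" hits all three documents; scoping it to
# the Finance branch cuts the result set to two.
print([d["id"] for d in scoped_search(documents, "contract", "Finance")])  # [1, 2]
```

Because the path lives in the metadata rather than in the insecure file name, the same mechanism survives arbitrary renaming of the document itself.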


It survived, however, in accounting systems, which at one level are an elaborate form of registry permitting the certain location of the underlying documentation (vouchers, invoices and so on) that supports individual transactions, and positioning it within relevant hierarchies. When such systems were automated the ``registers'' (the ledgers, journals, cash books and so on) disappeared and were replaced by robust embedded indices, supported by codes, themselves hardwired to directories and to each transaction in the database. These codes are intended to guarantee that ledger and journal entries can be recreated, along with reconciliations, trial balances, profit and loss accounts and all the other necessary elements of accounting systems[5]. Archivists and records managers, with their aversion to keeping so much detail, have largely failed to learn from this experience. This is nowhere more evident than in the model requirements specification for electronic records (MoReq)[6], which calls for the use of files and file volumes without apparently appreciating that in the paper world as much as in the electronic they are intellectual constructs in the same way as ledgers and journals are. As a result the MoReq specification approaches the problem from the top down rather than the bottom up, overlooking the fact that many transactions will inevitably have a one-to-many relationship with files. Although this is admitted in paragraph 3.4.13, the inference of the use of pointers to avoid multiple entries suggests that the authors are still locked in a ``file mentalité'' rather than one driven by content analysis. For example, letters more often than not make reference to more than one subject or impinge on a number of areas of activity and are therefore filed in multiple copies, and each copy (original and surrogates) in a physical filing system has equal value.
This has never been permissible in accounting systems, as one authentic copy of the transaction, which can be verified, has to be preserved for audit purposes. As a result elaborate cross-referencing had to be employed within the cash books, journals and ledgers. If MoReq had started from this assumption then it would have recommended functionality equivalent to accounting systems, which would guarantee the creation of structured files from the records of individual transactions. Such an approach would meet the MoReq requirements and not militate against selective destruction, something accountants have to live with, as their professional bodies recommend differential retention periods to comply with different legislative regimes.
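The accounting analogy can be made concrete. What follows is a minimal sketch, with invented account codes and amounts, of how postings stored once and tagged with hardwired codes allow ledger views and a trial balance to be recreated on demand rather than kept as physical registers:

```python
from collections import defaultdict

# Each posting is stored once, tagged with a transaction identifier and an
# account code; ledger balances are recreated from the codes, much as the
# text describes automated systems replacing the physical registers.
postings = [
    {"txn": "T001", "account": "4100-sales", "amount": 250.0},
    {"txn": "T001", "account": "1200-debtors", "amount": -250.0},
    {"txn": "T002", "account": "4100-sales", "amount": 100.0},
    {"txn": "T002", "account": "1200-debtors", "amount": -100.0},
]

def ledger_view(rows):
    """Recreate per-account ledger balances from the coded transactions."""
    balances = defaultdict(float)
    for row in rows:
        balances[row["account"]] += row["amount"]
    return dict(balances)

print(ledger_view(postings))  # {'4100-sales': 350.0, '1200-debtors': -350.0}

# A trial balance check: double-entry postings must net to zero.
assert sum(r["amount"] for r in postings) == 0
```

The point of the sketch is structural: the one authentic copy of each transaction, plus its codes, is sufficient to regenerate any ``file'' or register the auditor requires.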

Standards
Both the International Standard (BS ISO 15489, 2001) and the Australian Standard (AS 4390, 1996) give cause for concern when considered in these terms. The Australian Standard, written in the mid-1990s and presumably derived from experience with hard copy records systems, did not define the term ``file'', and the concept of filing systems appeared to be largely absent from it. Instead AS 4390 was concerned mainly with ``documents''. Documents were defined as:
. . . structured units of recorded information, published or unpublished, in hard copy or electronic form, and managed as discrete units in information systems (AS 4390.1, 1996, section 4.12).

Documents, thus defined, were treated as being virtually synonymous with ``records''. The apparent conceptualisation of documents as ``discrete units'' implicitly negates the long-established principle that records derive much of their significance from the manner in which they are related to other records by provenance and series. This apparent weakness was compounded by the bald definition of record keeping systems as:
. . . information systems which capture, maintain and provide access to records (AS 4390.1, 1996, section 4.20).

This seeming abandonment of archival methodology in favour of library methodology is particularly worrying in view of the developments that are taking place in EDM and KM systems. The real world does, however, intrude into the Australian Standard. Although the concept of the file plan was not explained, it is conceded that:
Where the primary record-keeping system of an organisation . . . is based on paper files, all records of substantive business should be attached to relevant files in that system . . . (AS 4390.3, 1996, section 8.3.1).

How this statement relates to the description of record keeping systems that precedes it is unclear[7]. Moreover it does not go far enough in explaining that archivists and records managers play a crucial fiduciary role in selecting and retaining such records. They are responsible (as they always have been) for guaranteeing and verifying authenticity and may be required to do so in a court of law, making them more akin to auditors than librarians. The deficiencies of the Australian Standard were, however, more apparent than real. Like the Liberalism of Mr Gladstone, the Standard becomes more readily comprehensible when its foundations are fully understood[8]. Behind AS 4390 lies the design and implementation of record keeping systems (DIRKS) methodology, and alongside that lies the Keyword AAA approach to functional analysis and classification. AS 4390 incorporated an abbreviated version of DIRKS, setting out the various stages of analysis, design, implementation and evaluation. A fuller version, with additional explanatory material, is available from the National Archives of Australia Web site[9]. A full implementation of DIRKS calls for a comprehensive functional analysis of the organisation for which a record keeping system is being devised. One of the essential products of this functional analysis is a classification scheme that serves as the basis for titling files and directories. In other words, although the concept of the file plan appeared on the surface to be absent from AS 4390, in reality it implicitly informs the whole methodology. One reason why this is obscure, at least when seen from a distance, is that the linchpin of the classification schemes being developed in Australia, Keyword AAA, is a proprietary product for which users must buy licences. Following the adoption of AS 4390 and of the DIRKS methodology, the State Records Authority of New South Wales acted as lead body for the production of a classification scheme for those functions that are common to most public sector organisations. This is Keyword AAA[10].
To be fully effective in a particular agency it has to be supplemented by a classification scheme for those functions that are domain-specific to that agency. This is produced by following the DIRKS methodology. Keyword AAA consists of 17 functional keywords and a much larger number of activity and subject descriptors, akin to the coding schemes so long familiar to accountants. These are supplied to licensed users, with explanatory material to assist in implementation. Further information can be found on the Web site of the State Records Authority of New South Wales (State Records Authority of New South Wales, 1998). It is clear that in many respects Australia is leading the world in devising and implementing records management systems that can cope with the challenges of the twenty-first century. A DIRKS user group has been established[11]. Although it is based in Australia, the participants come from around the world. In part, at least, the formation of the users' group reflects the difficulties that are being encountered in implementing AS 4390/ISO 15489-compliant systems. On the evidence of messages posted to the listserv of the Records Management Association of Australia, resistance on the part of end users to a hierarchical system is one significant problem[12]. There is also some evidence that a shortage of fully trained records management staff is hampering progress. The International Standard ISO 15489 began life as a derivative of the Australian Standard, so it is hardly surprising that it too is, or appears to be, deficient in respect of the role of file plans and directory structures. Given that ISO 15489 is addressed to the wider world, unlike AS 4390, the failure to identify explicitly its intellectual underpinnings and to consider their portability and applicability beyond Australia is unfortunate. Like its precursor, ISO 15489 contains no definition of the term file. The potential value of embedded directory structures in handling electronic records is scarcely mentioned, nor is the challenge of replicating the hierarchical structure (with nested elements) of such systems. As the Glasgow ERM report makes clear, this is technically challenging and cannot be solved by simply creating virtual files ``on the fly'' using so-called powerful search engines.
The response of the IT community to such complaints is ``give us more money'', but with the collapse of the telecommunications, media and technology (TMT) sector this is hardly likely unless there can be some certainty of a solution. There are strong theoretical grounds for questioning any such claims, as nested items are often only differentiated by implicit rather than explicit reference, which can only be distinguished by the use of powerful thesauri. Although some prototypes exist, such as the Cooperative Online Resource Catalog (CORC)[13], there is no agreement among information professionals about the standard to be adopted, let alone how such tools could be employed to reduce this difficulty.
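As described above, Keyword AAA pairs a small set of functional keywords with permitted activity descriptors. The following sketch uses an invented vocabulary, not the licensed product, to show how a controlled scheme of this kind can validate the titling of files:

```python
# Hypothetical controlled vocabulary in the Keyword AAA style: functional
# keywords, each with its permitted activity descriptors. The real scheme is
# a licensed product; these entries are invented for illustration.
VOCABULARY = {
    "PERSONNEL": {"Recruitment", "Leave", "Discipline"},
    "FINANCIAL MANAGEMENT": {"Budgeting", "Audit"},
}

def make_file_title(function, activity, subject):
    """Build a classified file title, rejecting terms outside the scheme."""
    if function not in VOCABULARY:
        raise ValueError(f"unknown function keyword: {function}")
    if activity not in VOCABULARY[function]:
        raise ValueError(f"{activity!r} is not an activity of {function}")
    return f"{function} - {activity} - {subject}"

print(make_file_title("PERSONNEL", "Recruitment", "Clerical staff 2003"))
# PERSONNEL - Recruitment - Clerical staff 2003
```

The value of such a scheme is that every title is drawn from the controlled list, so the hierarchy is enforced at the point of titling rather than reconstructed afterwards.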

The way forward


We envisage the use of elaborate embedded directory structures or file plans, derived from functional analysis of the kind now being carried forward in Australia, as being a key component in the future development of the discipline of records management. Directory structures thus conceptualised are explicitly intellectual constructs. Their construction requires considerable effort, particularly if they are to be in any sense portable. Their greatest advantage is that they provide a coherent schema from which to derive folder/file names which can be embedded in the metadata. One of the major challenges is to design systems that derive metadata from the directory structure or file plan and attach them automatically to documents at the point of creation, thereby minimising the need for human intervention and opportunities for human error. EDM systems vendors have already made some progress in aligning information systems to functions, particularly in respect of routine processes and project management systems (Hendley, 2001). There is, however, considerable further scope for developing this approach, for example by using templates that themselves incorporate appropriate metadata forms. By working ``with the grain'' of established practice, such templates can directly serve end users and ensure the implementation of hierarchical directory structures. In other words, by opting to download such templates, provided to assist them in their daily work, end users will unconsciously be opting to assign appropriate metadata to the documents they produce. The Glasgow CDocs protocols have already made some progress in this direction and there are other examples in the commercial world. Success in promoting record keeping systems that will genuinely deliver information as a corporate resource and underpin knowledge management is not just a matter of information theory. Building and maintaining coalitions of support from end users and top managers will be crucial.
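The template mechanism described above might be sketched as follows. The template names and metadata fields are our own invented illustrations, not the Glasgow CDocs protocols themselves:

```python
import datetime

# A sketch of the template idea: when an end user picks a template tied to a
# node of the file plan, the system stamps the new document with that node's
# metadata automatically, so no manual keying (and no keying error) occurs.
TEMPLATES = {
    "committee-minutes": {
        "directory": "Governance/Committees/Senate",
        "function": "GOVERNANCE",
        "activity": "Committee administration",
    },
}

def create_document(template_name, title, author):
    """Create a document whose metadata is derived from its template's file-plan node."""
    node = TEMPLATES[template_name]
    return {
        "title": title,
        "author": author,
        "created": datetime.date.today().isoformat(),
        **node,  # metadata inherited from the file plan, not keyed by hand
    }

doc = create_document("committee-minutes", "Minutes, 14 January", "Clerk")
print(doc["directory"])  # Governance/Committees/Senate
```

The end user supplies only the title; the contextual metadata travels with the template, which is the sense in which they ``unconsciously'' classify their own output.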
Over recent decades many people in both the public and private sectors have become accustomed to operating decentralised records systems, sometimes even personal systems. Any shift to integrated, institution-wide systems will meet with resistance unless supported by senior management. In part this is because some people have learnt to hoard information and to derive power from restricting access to it. Equally importantly, all significant change is resisted simply because most people derive comfort from working with systems they already know. We do not wish to consider change management or leadership theories in general here; there are plenty of good books on the subject (see, for example, Kotter, 1999; Reed, 2001). But we do need to acknowledge that there are issues that will have to be faced. One particular issue is the initial investment in analysis and systems design that a good and comprehensive records management system necessarily requires. In an era of severe budgetary constraints and short-term decision making, advocacy of long-term solutions may be difficult. At the very least we need to be prepared to demonstrate some quick-win outcomes from our preferred solutions. In this respect it is interesting to observe that sales staff for the TRIM software package, which has been built on the DIRKS methodology, do not dwell on the initial investment of staff time and effort that a thorough implementation of their product necessitates. We also need to be aware that many end users have an intuitive preference for subject-based approaches over functional and activity-based approaches. There is much evidence that individualised record keeping systems can no longer be justified, for the simple reason that there has been an increase in regulation. Supervisory authorities, such as the Financial Services Authority in the UK or the much older Securities and Exchange Commission in the USA, have wide investigatory powers.
Experience suggests that audits conducted by such bodies focus on compliant record keeping, which in itself demands uniformity in practice and procedures within an organisation. Auditors typically now consider as part of their brief the oversight of the mechanisms for retaining and destroying records which might be open to outside inspection or which might give rise to costly litigation (Strathern, 2000). The unfolding Enron/Andersen scandal suggests that auditors who fail to insist on proper record keeping are imperilling their own business prospects. With the growth of the so-called knowledge economy (this is not the place to discuss that topic in depth) many organisations actively encourage the sharing or ``unlocking'' of information to secure competitive advantage (Douglas, 1995). Sharing, of course, implies effective and easy-to-use navigational and retrieval instruments, which in our view demand structured directory systems that must (given the rapidity with which people change jobs) be portable. Moreover, investors are increasingly keen to see such ``shared'' information valued in the balance sheet, which confirms the need for effective management (KPMG IMPACT, 1995). On the other hand there are countervailing pressures to make the core strategic records of an organisation opaquely inaccessible. Among large companies it is common practice to use code in the discussion of acquisitions and disposals for fear of leaks or investigation by external regulators. Likewise strategic decisions are now often taken on the basis of runs of key performance indicators (KPIs). These are usually left to speak for themselves with no accompanying discourse. Consequently the volume of records created at the core of an organisation tends to be diminishing while transactional records are ballooning. Moreover, custody of these ``critical'' records is often not entrusted to the records manager or archivist but to the chief administrative officer, who has in any event always had ultimate fiduciary responsibility in both the public and private sector. What is missing from much of the records management rhetoric is how information professionals are to regain access (if they ever had it in the first place) to the decision-making process in the creation of information systems which transcend the merely transactional. This issue is neatly expressed in section 3.7.3 of the Hawley Report[14] (KPMG IMPACT, 1995):
It is now common, if not universal, practice for Boards to receive annual or six-monthly reviews of the information systems strategy and particularly to review the benefits that are accruing from the investments made. There is also focus upon the comparative performance of competitors or other similar organisations with information systems and on the capability of the organisation to identify the opportunities to exploit information technology and deliver the strategy proposed.

It is hard to see how the records management community can break into such a cycle except by raising the standard of both argument and practice. In so doing, records managers and archivists in the anglophone world are ill served by their utilitarian approach[15]. We need to develop robust theoretical models which emphasise that aggregations of records are intellectual constructs and not entities in themselves, and to consider how these constructs can have a functional place in digital systems. Tyacke (2002), the Keeper of the Public Record Office in the UK, makes a powerful attack on the utilitarian in her essay ``Archives in the wider world: the culture and politics of archives'' and, with the imperative of the digital firmly in view, calls for archivists to:
. . . re-define our definitions of the record and its ``record-ness'', what custody, authenticity and provenance mean, and build in both the legal and procedural frameworks necessary at the point when the digital systems and their consequent records are created.

Rising to such a challenge will take the records management and archive profession into unfamiliar and difficult terrain and inevitably a war with, at least, some of the IT professionals. This struggle is already being successfully waged by the cognitive psychologists, who have readily drawn on their theoretical armoury to challenge many of the assumptions of the new technologies (see, for example, Gackenbach, 1998). Let us join the fray.

Records Management Journal, Vol. 13 No. 1, 2003, pp. 24-31

Notes
1 The ``life cycle'' concept did not, of course, emerge in a vacuum. Mander (1989) has elucidated the ways in which the structural alignment of archival responsibilities with cultural and educational functions, rather than with the central administration, has tended to militate against pro-active records management. Slater (1990) has demonstrated that, even with the records management function located in the central administration, the existence of a large backlog of unprocessed records can dictate a reactive approach. However, Penn (1993) reminds us that the ``life cycle'' concept originally did embrace system design, not least via forms control. The records continuum model too emerged from concrete experience, not least dissatisfaction with the evident drawbacks of the ``life cycle'' model. One of the pioneers in conceptualising a new approach was Derek Charman (see Charman, 1980).
2 See Flynn (2001); AS 4390 (1996). However, the persistence of life cycle thinking is demonstrated by Parker (2000), a fuller version of which can be found at http://www.kcl.ac.uk/projects/srch/reports/reports.htm
3 The drawbacks of this approach can be seen in the results of the various attempts to unify library catalogues using Z39.50 protocols, which often result in multiple copies of the same imprint appearing as separate discrete items because of the idiosyncrasies of individual catalogues, albeit operating within the same rules.
4 There is, of course, an important distinction between adopting the principles and adapting the tools and products of librarianship. See, for example, Reed (1985) and Tough (1985).
5 Any accounting textbook will explain how such systems work, but see, for example, Lee and Parker (1979).
6 The MoReq specification can be found at www.cornwell.co.uk/moreq (accessed December 2001) or www.cornwell.co.uk/moreq%20Specification%20v5-2.1.doc (accessed June 2002).
7 ``Record-keeping systems include (a) both records practitioners and records users; (b) a set of authorized policies, assigned responsibilities . . . procedures and practices; (c) policy statements, procedures manuals, user guidelines . . . ; (d) the records themselves; (e) specialized information and records systems used to control the records; and (f) software, hardware and other equipment, and stationery'', AS 4390.3 (1996), section 6.2.1.
8 William Ewart Gladstone (1809-1898), British Prime Minister, was a devout Christian and his interpretation of Liberalism depended on the underpinnings of Christian ethics. See Jenkins (1995) and Morley (1903).
9 The DIRKS methodology, available at: www.naa.gov.au/recordkeeping/dirks/summary.html (accessed June 2002).
10 Keyword AAA, available at: www.records.nsw.gov.au/publicsector/rk/aaa/keywordaaa.htm (accessed June 2002).
11 See dirks@naa.gov.au
12 rmaa-list@echidna.stu.cowan.edu.au
13 http://corc.oclc.org/ and, in Europe, http://corc.uk.oclc.org (accessed June 2002).
14 The Hawley Committee had a membership of 30 major organisations, mainly businesses.
15 This is all too evident from the tediously familiar agenda of the InterPARES project: ``The research project is divided into four domains. The first domain aims to identify the requirements for preserving authentic electronic records. The second domain aims to establish whether, in order to satisfy the requirements for authenticity identified in domain one . . .'' and so on (Duranti, 2000). Worthy as such an approach may be, it fails to acknowledge that much of the territory has already been conquered by the IT community.

References
AS 4390 (1996), Records Management Sections 1-6, Standards Australia.

Bennison, B. (2001), ``Digital strategies for the records management industry'', paper presented at the Annual Conference of the Records Management Society of Great Britain, Nottingham.
Carroll, T. (2001), ``Document management and the Internet: the e-business opportunity'', in Hendley, T. (Ed.), Document Management Guide and Directory, CIMTECH.
Charman, D. (1980), ``The expanding role of the archivist'', ARMA Quarterly, Vol. 14 No. 1.
Currall, J., Johnson, C., Johnston, P., Moss, M.S. and Richmond, L.M. (2001), No Going Back. Final Report of the Effective Records Management Project, University of Glasgow, Glasgow.
Douglas, M. (1995), ``Forgotten knowledge'', in Strathern, M. (Ed.), Shifting Contexts, Routledge, London.
Duranti, L. (2000), ``The InterPARES Project'', in Sarno, L. (Ed.), Authentic Records in the Electronic Age, Istituto Italiano di Cultura.
Flynn, S.J.A. (2001), ``The records continuum model in context and its implications for archival practice'', Journal of the Society of Archivists, Vol. 22 No. 1, pp. 79-93.
Gackenbach, J. (Ed.) (1998), Psychology and the Internet: Intrapersonal, Interpersonal and Transpersonal Implications, Academic Press, New York, NY.
Hendley, T. (Ed.) (2001), Document Management Guide and Directory, CIMTECH.
Jenkins, R. (1995), Gladstone, Macmillan, London.
Kingston, J.H. (1998), Algorithms and Data Structures, Addison Wesley Longman, Boston, MA.
Kotter, J. (1999), Leading Change, Harvard Business School Press, Boston, MA.
KPMG IMPACT (1995), Information as an Asset: The Board Agenda, The Hawley Committee.
Lee, T.A. and Parker, R.H. (1979), The Evolution of Corporate Financial Reporting, Nelson.
Mander, D. (1989), ``Records management in London local authorities'', Journal of the Society of Archivists, Vol. 10 No. 1.
Morley, J. (1903), Life of Gladstone, Macmillan, London.
Orbell, J. (1991), ``The development of office technology'', in Turton, A. (Ed.), Managing Business Archives, Butterworth-Heinemann/Business Archives Council, pp. 60-83.
Parker, E. (2000), ``A study of the records life cycle'', New Review of Academic Librarianship, Vol. 6.
Penn, I. (1993), ``Records management: still hazy after all these years'', Records Management Quarterly, Vol. 27 No. 1.
Reed, D. (1985), ``The RLIN AMC format: an experiment in library-compatible archival data automation'', Journal of the Society of Archivists, Vol. 7 No. 7.
Reed, P. (2001), Extraordinary Leadership, Kogan Page, London.
Slater, G. (1990), ``The Public Record Office of Northern Ireland and records management in the Northern Ireland Civil Service'', Journal of the Society of Archivists, Vol. 11 Nos 1 and 2.
State Records Authority of New South Wales (1998), Keyword AAA: Thesaurus of General Terms.
Strathern, M. (2000), Audit Cultures, Routledge, London.
Tough, A.G. (1985), ``International exchange and some comments on automation in archives'', Janus, pp. 11-25.
Tyacke, S. (2002), ``Archives in a wider world: the culture and politics of archives'', Archivaria.