Since TPS systems can be such a powerful business tool, access must be restricted to only those employees who require it. Restricting access to the system ensures that employees who lack the skills and ability to control it cannot influence the transaction process.

Transaction Processing Qualifiers
To qualify as a TPS, transactions made by the system must pass the ACID test. The ACID test refers to the following four prerequisites:

Atomicity
Atomicity means that a transaction is either completed in full or not at all. For example, if funds are transferred from one account to another, this only counts as a bona fide transaction if both the withdrawal and the deposit take place. If one account is debited and the other is not credited, it does not qualify as a transaction. TPS systems ensure that transactions take place in their entirety.

Consistency
TPS systems exist within a set of operating rules (or integrity constraints). If an integrity constraint states that all transactions in a database must have a positive value, any transaction with a negative value would be refused.

Isolation
Transactions must appear to take place in isolation. For example, when a fund transfer is made between two accounts, the debiting of one and the crediting of the other must appear to take place simultaneously. The funds cannot be credited to an account before they are debited from another.

Durability
Once transactions are completed they cannot be undone. To ensure that this is the case even if the TPS suffers a failure, a log will be created to document all completed transactions.

These four conditions ensure that TPS systems carry out their transactions in a methodical, standardized and reliable manner; a short sketch of the first two rules appears below.

Types of Transactions
While the transaction process must be standardized to maximize efficiency, every enterprise requires a tailored transaction process that aligns with its business strategies and processes. For this reason, there are two broad types of transaction:

Batch Processing
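To make the atomicity and consistency rules above concrete, here is a minimal sketch using Python's built-in sqlite3 module. The accounts table, the account names, and the no-negative-balance rule are illustrative assumptions, not part of any particular TPS.

```python
import sqlite3

# A toy all-or-nothing transfer: either both the withdrawal and the deposit
# happen, or neither does. The table and account names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('checking', 100), ('savings', 0)")
conn.commit()

def transfer(src, dst, amount):
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            # Consistency check (an integrity constraint): no negative balances.
            (balance,) = conn.execute("SELECT balance FROM accounts WHERE name = ?",
                                      (src,)).fetchone()
            if balance < 0:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
    except ValueError:
        pass  # the debit above was rolled back; neither account changed

transfer("checking", "savings", 30)   # succeeds: both sides applied
transfer("checking", "savings", 500)  # refused: rolled back in its entirety
print(conn.execute("SELECT * FROM accounts ORDER BY name").fetchall())
# [('checking', 70), ('savings', 30)]
```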
Advantages of EIS
- Easy for upper-level executives to use; extensive computer experience is not required to operate it
- Provides timely delivery of company summary information
- Information that is provided is better understood
- Filters data for management
- Improves tracking of information
- Offers efficiency to decision makers

Disadvantages of EIS
- Functions are limited; cannot perform complex calculations
- Hard to quantify benefits and to justify the implementation of an EIS
- Executives may encounter information overload
- System may become slow, large, and hard to manage
- Difficult to keep current data
- May lead to less reliable and insecure data
- Small companies may encounter excessive costs for implementation
- Too detail oriented
Future Trends
The future of executive information systems will not be bound by mainframe computer systems. This trend frees executives from having to learn different computer operating systems and substantially decreases implementation costs for companies. Because this trend builds on existing software applications, executives will also no longer need to learn a new or special language for the EIS package. Future executive information systems will not only provide a system that supports senior executives, but will also address the information needs of middle managers. Future executive information systems will become more diverse as potential new applications and technologies are integrated into them, such as artificial intelligence (AI), multimedia capabilities, and ISDN technology.
A. A large category of information systems comprises those designed to support the management of an organization. These systems rely on data obtained by transaction processing systems, as well as on data acquired outside the organization (such as business intelligence gleaned from the Internet) and data provided by business partners, suppliers, and customers. Information systems support all levels of management, from those in charge of short-term schedules and budgets for small work groups to those concerned with long-term plans and budgets for the entire organization. Management reporting systems provide routine, detailed, and voluminous information reports specific to each manager's areas of responsibility. Generally, these reports focus on past and present performance rather than projecting future performance. To prevent information overload, reports are automatically sent only under exceptional circumstances or at the specific request of a manager, as sketched below.
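The exception-reporting rule in the last sentence can be illustrated in a few lines; the metric names and acceptable ranges below are hypothetical.

```python
# A report reaches the manager only when a monitored figure falls outside
# its acceptable range. All metric names and thresholds here are invented.
weekly_figures = {"defect_rate": 0.012, "on_time_delivery": 0.83, "inventory_turns": 6.1}
acceptable = {"defect_rate": (0.0, 0.02), "on_time_delivery": (0.95, 1.0),
              "inventory_turns": (4.0, 12.0)}

exceptions = {metric: value
              for metric, value in weekly_figures.items()
              if not (acceptable[metric][0] <= value <= acceptable[metric][1])}

if exceptions:  # otherwise no report is sent at all
    print("Exception report:", exceptions)
# Exception report: {'on_time_delivery': 0.83}
```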
ORGANIZATION: The planned and intentional structure of roles or positions in an identified unit that seeks to achieve established purposes and objectives. The purpose of organizing is to make human effort productive and effective. Organization may be formal or informal.

Span of control: The number of subordinates that a manager supervises. A span of control may be narrow or wide.

FACTORS
- Task complexity
- Experience and maturity of the personnel
- Task relationships

STEPS IN ORGANIZING
- Identifying specific tasks or activities
- Grouping the tasks or activities
- Assigning resources and responsibilities
- Co-ordinating activities and relationships
STAFFING
Do not simply hire a good person; hire a person who can meet the demands of the position.

JOB DESCRIPTION:
a) Goals
b) Job content (tasks/activities)
c) Job formulation or job analysis
d) Job qualifications
e) Job specifications

JOB ANNOUNCEMENT
- Position title
- Job specifications
CONTROLLING
- Establishing measurable standards
- Measuring performance and accomplishment against the standards
- Revising or correcting variations from the standards when they occur

PRINCIPLES OF EFFECTIVE CONTROL
- Tailor controls to specific circumstances
- Use both subjective and objective means of evaluation
- Be flexible
- Be economical
- Aim to improve performance
A. A Transaction Processing System, or Transaction Processing Monitor, is software that processes data transactions in a database system and monitors transaction programs (a special kind of program). The essence of a transaction program is that it manages data that must be left in a consistent state. For example, if an electronic payment is made, the amount must either be both withdrawn from one account and added to the other, or not moved at all. If a failure prevents a transaction from completing, the partially executed transaction must be 'rolled back' by the TPS. While this type of integrity must also be provided for batch transaction processing, it is particularly important for online processing: if, for example, an airline seat reservation system is accessed by multiple operators, then after an empty-seat inquiry the seat reservation data must be locked until the reservation is made; otherwise another user may get the impression that a seat is still free while it is actually being booked. Without proper transaction monitoring, double bookings may occur. Other transaction monitor functions include deadlock detection and resolution (deadlocks may be inevitable in certain cases of cross-dependence on data) and transaction logging (in 'journals') for 'forward recovery' in case of massive failures. Transaction processing is not limited to application programs: the 'journaled file system' provided with IBM's AIX Unix operating system employs similar techniques to maintain file system integrity, including a journal. The locking idea is sketched below.
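A minimal sketch of the seat-locking idea described above, assuming Python's built-in sqlite3 module. The seats table, seat IDs, and passenger names are invented, and BEGIN IMMEDIATE stands in for whatever locking discipline a real reservation system uses.

```python
import sqlite3

DB = "flights.db"  # invented file name for the example

setup = sqlite3.connect(DB)
setup.execute("CREATE TABLE IF NOT EXISTS seats (seat TEXT PRIMARY KEY, booked_by TEXT)")
setup.execute("INSERT OR IGNORE INTO seats VALUES ('12A', NULL)")
setup.commit()
setup.close()

def reserve(seat, passenger):
    conn = sqlite3.connect(DB, isolation_level=None)  # autocommit; manage transactions by hand
    try:
        conn.execute("BEGIN IMMEDIATE")  # take the write lock before the empty-seat inquiry
        row = conn.execute("SELECT booked_by FROM seats WHERE seat = ?", (seat,)).fetchone()
        if row is None or row[0] is not None:
            conn.execute("ROLLBACK")
            return False  # unknown seat, or another operator booked it first
        conn.execute("UPDATE seats SET booked_by = ? WHERE seat = ?", (passenger, seat))
        conn.execute("COMMIT")  # releases the lock; the booking is now durable
        return True
    finally:
        conn.close()

print(reserve("12A", "Alice"))  # True: the seat was free
print(reserve("12A", "Bob"))    # False: no double booking
```

Because the lock is held from the inquiry through the update, a second operator's check cannot slip in between the two steps and produce a double booking.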
A. Artificial intelligence (AI) is the intelligence of machines and the branch of computer science which aims to create it. Major AI textbooks define the field as "the study and design of intelligent agents," where an intelligent agent is a system that perceives its environment and takes actions which maximize its chances of success. John McCarthy, who coined the term in 1956, defines it as "the science and engineering of making intelligent machines." The field was founded on the claim that a central property of human beings, intelligence (the sapience of Homo sapiens), can be so precisely described that it can be simulated by a machine. This raises philosophical issues about the nature of the mind and the limits of scientific hubris, issues which have been addressed by myth, fiction and philosophy since antiquity. Artificial intelligence has been the subject of breathtaking optimism, has suffered stunning setbacks and, today, has become an essential part of the technology industry, providing the heavy lifting for many of the most difficult problems in computer science. AI research is highly technical and specialized, so much so that some critics decry the "fragmentation" of the field. Subfields of AI are organized around particular problems, the application of particular tools and around longstanding theoretical differences of opinion. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception and the ability to move and manipulate objects. General intelligence (or "strong AI") is still a long-term goal of (some) research.
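The agent definition quoted above can be made concrete with a toy sketch; the one-dimensional world, the percepts, and the reward values are all invented for illustration.

```python
# A toy "intelligent agent": it perceives its environment and takes the
# action expected to maximize its payoff. World and rewards are invented.
def perceive(position, world):
    """What the agent senses: the reward one step left or right of it."""
    return {move: world.get(position + move, 0) for move in (-1, +1)}

def act(percept):
    """Choose the action with the highest expected payoff."""
    return max(percept, key=percept.get)

world = {0: 0, 1: 5, 2: 1, 3: 10}  # positions mapped to rewards
position = 1
for _ in range(2):
    position += act(perceive(position, world))
print(position)  # 3: the agent has walked toward the largest reward
```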
Also: Network topology is the study of the arrangement or mapping of the elements (links, nodes, etc.) of a network, especially the physical (real) and logical (virtual) interconnections between nodes. A local area network (LAN) is one example of a network that exhibits both a physical topology and a logical topology. Any given node in the LAN will have one or more links to one or more other nodes in the network, and the mapping of these links and nodes onto a graph results in a geometrical shape that determines the physical topology of the network. Likewise, the mapping of the flow of data between the nodes in the network determines the logical topology of the network. The physical and logical topologies might be identical in any particular network, but they may also differ. Any particular network topology is determined only by the graphical mapping of the configuration of physical and/or logical connections between nodes. LAN network topology is, therefore, technically a part of graph theory; a minimal sketch of this mapping appears at the end of this answer. Distances between nodes, physical interconnections, transmission rates, and/or signal types may differ in two networks and yet their topologies may be identical.

Basic types of topology
There are six basic types of topology in networks:
1. Bus topology
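Returning to the graph-mapping idea above: a topology is just a set of nodes and links, which can be written down as an adjacency list. A minimal sketch, with arbitrary node names:

```python
# Two four-node topologies expressed as adjacency lists. Node names are
# arbitrary; 'hub' marks the central device of the star.
star = {"hub": ["a", "b", "c", "d"],
        "a": ["hub"], "b": ["hub"], "c": ["hub"], "d": ["hub"]}
ring = {"a": ["b", "d"], "b": ["a", "c"],
        "c": ["b", "d"], "d": ["c", "a"]}

def degrees(topology):
    """Links per node; the degree pattern distinguishes the shapes."""
    return {node: len(links) for node, links in topology.items()}

print(degrees(star))  # {'hub': 4, 'a': 1, 'b': 1, 'c': 1, 'd': 1}
print(degrees(ring))  # every node has degree 2
```

Distances, transmission rates, and signal types play no part in this mapping, which is why two physically different networks can share an identical topology.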
A. Systems Development Life Cycle (SDLC), or Software Development Life Cycle, in systems engineering and software engineering refers to the process of creating or altering systems, and the models and methodologies that people use to develop these systems. The concept generally refers to computer or information systems. In software engineering the SDLC concept underpins many kinds of software development methodologies. These methodologies form the framework for planning and controlling the creation of an information system: the software development process.

Overview
The Systems Development Life Cycle (SDLC) is any logical process used by a systems analyst to develop an information system, including requirements, validation, training, and user ownership. An SDLC should result in a high-quality system that meets or exceeds customer expectations, reaches completion within time and cost estimates, works effectively and efficiently in the current and planned Information Technology infrastructure, and is inexpensive to maintain and cost-effective to enhance. Computer systems have become more complex and often (especially with the advent of Service-Oriented Architecture) link multiple traditional systems potentially supplied by different software vendors. To manage this level of complexity, a number of SDLC models have been created: "waterfall," "fountain," "spiral," "build and fix," "rapid prototyping," "incremental," and "synchronize and stabilize." Although the term SDLC can refer to various models, it typically denotes a waterfall methodology. In project management a project has both a life cycle and a "systems development life cycle," during which a number of typical activities occur. The project life cycle (PLC) encompasses all the activities of the project, while the systems development life cycle focuses on realizing the product requirements.
Q. Communication Protocols:
A. In the field of telecommunications, a communications protocol is the set of standard rules for data representation, signaling, authentication and error detection required to send information over a communications channel. An example of a simple communications protocol adapted to voice communication is a radio dispatcher talking to mobile stations. Communication protocols for digital computer network communication have features intended to ensure reliable interchange of data over an imperfect communication channel; in essence, a protocol is a set of rules that both parties follow so that the system works properly. Most recent protocols are assigned by the IETF for Internet communications, and by the IEEE or the ISO for other types. The ITU-T handles telecommunications protocols and formats for the public switched telephone network (PSTN), and the ITU-R handles protocols and formats for radio communications. As the PSTN, radio systems, and the Internet converge, the different sets of standards are being driven toward technological convergence. For marine electronics the NMEA standards are used. A toy sketch of the error-detection idea follows.
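A toy sketch of the error-detection feature mentioned above: the sender appends a checksum to each frame and the receiver recomputes it. The frame layout and the sum-mod-256 checksum are invented for illustration; real protocols use stronger codes such as CRCs.

```python
# Sender side: append a one-byte checksum to the payload.
def frame(payload: bytes) -> bytes:
    return payload + bytes([sum(payload) % 256])

# Receiver side: recompute the checksum and reject corrupted frames.
def unframe(data: bytes) -> bytes:
    payload, checksum = data[:-1], data[-1]
    if sum(payload) % 256 != checksum:
        raise ValueError("corrupted frame")  # would trigger a retransmission
    return payload

sent = frame(b"HELLO")
print(unframe(sent))  # b'HELLO': the frame arrived intact

damaged = bytes([sent[0] ^ 1]) + sent[1:]  # one bit flipped in transit
try:
    unframe(damaged)
except ValueError as err:
    print(err)  # corrupted frame
```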