
Types of computer systems

1. Mainframe Computer

A mainframe computer is a computer system with:

• very powerful processors


• lots of backing storage
• large internal memory.

Mainframes are designed to process large volumes of data at high speed. They are used by large businesses such as banks and mail-order companies, as well as by large organisations such as universities.
Mainframe computers can also multi-task by running more than one program at the same time. This is known as multi-programming and, with more memory, has become possible on desktop and laptop computers.

2. Desktop Computer

A desktop computer is the most common kind of PC. It is a collection of different hardware devices. This type of computer sits permanently on a desk because its design means it cannot be easily moved. The common components of a desktop PC are:

 the system unit containing the processor and main memory


 monitor
 keyboard
 mouse
 hard disk drive
 floppy disk drive
 CD/DVD drive
 speakers.

3. Laptop or Notebook Computer


A laptop computer is a small, light computer that you can easily carry about with you. It can be powered by battery or mains power. A laptop computer has a keyboard and comes with specialised input devices, for example trackballs, touch pads or track points. These are needed because laptop computers are often operated in places where it is impractical to use a mouse.

For output the laptop has an LCD or TFT screen and a set
of small speakers.

‘Laptops’ are often as powerful as desktop computers and run the same range and type of software.
People use laptops for working when they are on the move,
going to meetings or attending courses.

Many businesses are replacing desktop PCs with special plug-in workstations designed around laptop computers because of the flexibility they offer.

4. Palmtop Computer or Personal Digital Assistant (PDA)

This type of computer is increasing in popularity, and is often called a Personal Digital Assistant (PDA).

A palmtop computer is small enough to fit in your pocket.

It combines a lot of capabilities, including organiser features (such as storing contact numbers, names and addresses, etc.), e-mail and wireless internet access.

Palmtops have small keyboards and most let you open menus and select icons by using a special pen or stylus. Most let you enter data by writing with the stylus. They are powered by batteries and store their data on removable memory units called flash cards.

You can run a wide range of software on palmtops, for example simple word processing, database and spreadsheet software, as well as useful applications such as electronic diaries. Many modern palmtops:

 are converging with mobile phones to let you access the internet
 have wireless communications to let you access your local area network.
Types of processing systems

In this article, I’m going to explain five different types of data processing. The first two, scientific and commercial data processing, are application-specific types of data processing; the remaining three are method-specific types of data processing.

First, a quick summary of data processing: data processing is defined as the process of converting raw data into meaningful information.

Data processing can be defined by the following steps (sketched in code after the list):

 Data capture, or data collection.


 Data storage.
 Data conversion (changing to a usable or uniform format).
 Data cleaning and error removal.
 Data validation (checking the conversion and cleaning).
 Data separation and sorting (drawing patterns, relationships, and creating subsets).
 Data summarization and aggregation (combining subsets in different groupings for more
information).
 Data presentation and reporting.
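
To make these steps concrete, here is a minimal sketch in Python of a toy pipeline covering capture, conversion, cleaning, validation, sorting, and summarization. The records, field names, and rules are hypothetical, invented purely for illustration.

```python
# A toy data processing pipeline: capture -> clean -> validate -> summarize.
# The records, field names, and rules below are hypothetical examples.

raw_records = [  # data capture (hard-coded here for illustration)
    {"region": "north", "sales": "1200"},
    {"region": "North ", "sales": "950"},
    {"region": "south", "sales": "n/a"},  # bad value to be cleaned out
]

def clean(record):
    """Data conversion and cleaning: normalize text, coerce numbers."""
    return {
        "region": record["region"].strip().lower(),
        "sales": int(record["sales"]) if record["sales"].isdigit() else None,
    }

def is_valid(record):
    """Data validation: reject records that failed conversion."""
    return record["sales"] is not None

cleaned = [clean(r) for r in raw_records]
valid = [r for r in cleaned if is_valid(r)]

# Data separation/sorting, then summarization: total sales per region.
totals = {}
for r in sorted(valid, key=lambda r: r["region"]):
    totals[r["region"]] = totals.get(r["region"], 0) + r["sales"]

print(totals)  # data presentation, e.g. {'north': 2150}
```

In a real system each step would be far more involved, but the order of operations follows the list above.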

There are different types of data processing techniques, depending on what the data is needed for. Types of data processing at a bench level may include statistical, algebraic, mapping and plotting, forest and tree methods, machine learning, linear models, non-linear models, and relational and non-relational processing. These are methodologies and techniques which can be applied within the key types of data processing.

What we’re going to discuss in this article are the five main hierarchical types of data processing, that is, the overarching types of systems in data analytics.

Data Processing by Application Type


The first two key types of data processing I’m going to talk about are scientific data processing and
commercial data processing.

1. Scientific Data Processing

When used in scientific study or research and development work, data sets can require quite different methods from those of commercial data processing.

Scientific data processing is a special type of data processing used in academic and research fields. It’s vitally important for scientific data that there are no significant errors that contribute to wrongful conclusions. Because of this, the cleaning and validating steps can take a considerably larger amount of time than for commercial data processing.

Scientific data processing needs to draw conclusions, so the steps of sorting and summarization often
need to be performed very carefully, using a wide variety of processing tools to ensure no selection
biases or wrong relationships are produced.

Scientific data processing often needs a subject-matter expert in addition to a data expert to work with the quantities involved.

2. Commercial Data Processing


Commercial data processing has multiple uses, and may not necessarily require complex sorting. It was first used widely in the field of marketing, for customer relationship management applications, and in banking, billing, and payroll functions. Most of the data captured in these applications is standardized and somewhat error-proofed; that is, capture fields eliminate errors, so in some cases raw data can be processed directly, or with minimal and largely automated error checking.

Commercial data processing usually applies standard relational databases and uses batch processing; however, some applications, in particular technology applications, may use non-relational databases.

There are still many applications within commercial data processing that lean towards a scientific
approach, such as predictive market research. These may be considered a hybrid of the two methods.

Data Processing Types by Processing Method

Within the main areas of scientific and commercial processing, different methods are used for
applying the processing steps to data. The three main types of data processing we’re going to discuss
are automatic/manual, batch, and real-time data processing.

3. Automatic versus Manual Data Processing

It may not seem possible, but even today people still use manual data processing. Bookkeeping data
processing functions can be performed from a ledger, customer surveys may be manually collected
and processed, and even spreadsheet-based data processing is now considered somewhat manual. In
some of the more difficult parts of data processing, a manual component may be needed for intuitive
reasoning.

The first technology that led to the development of automated systems in data processing was the punch card, used in census counting. Punch cards were also used in the early days of payroll data processing.

Computers started being used by corporations in the 1970s, when electronic data processing began to develop. Some of the first applications for automated data processing, in the form of specialized databases, were developed for customer relationship management (CRM) to drive better sales. Electronic data management became widespread with the introduction of the personal computer in the 1980s. Spreadsheets provided simple electronic assistance for even everyday data management functions such as personal budgeting and expense allocations.

Database management provided more automation of data processing functions, which is why I refer to spreadsheets as a now rather manual tool in data management. The user is required to manipulate all the data in a spreadsheet, almost like a manual system; only the calculations are aided. In a database, by contrast, users can extract data relationships and reports relatively easily, provided the setup and entries are correctly managed.

Autonomous databases now look to be a data processing method of the future, especially in
commercial data processing. Oracle and Peloton are poised to offer users more automation with what
is termed a “self-driving” database. This development in the field of automatic data processing,
combined with machine learning tools for optimizing and improving service, aims to make accessing
and managing data easier for end users, without the need for highly specialized data professionals in-
house.

4. Batch Processing

To save computational time, stand-alone computer systems applied batch processing techniques before the widespread use of distributed systems architecture, and still do. This is particularly useful in financial applications, or where data must be kept secure, such as medical records.

Batch processing completes a range of data processes as a batch, by simplifying single commands to
provide actions to multiple data sets. This is a little like the comparison of a computer spreadsheet to
a calculator in some ways. A calculation can be applied with one function, that is one step, to a whole
column or series of columns, giving multiple results from one action. The same concept is achieved
in batch processing for data. A series of actions or results can be achieved by applying a function to a
whole series of data. In this way, the computer processing time is far less.
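
As a rough illustration of the spreadsheet comparison above, the short Python sketch below applies one function to a whole series of data in a single action, rather than record by record. The transaction amounts and the 5% fee are made up for the example.

```python
# Batch processing sketch: one command applied to a whole series of data.
# The transaction amounts and the 5% fee are hypothetical.

transactions = [100.0, 250.0, 80.0]  # a batch collected during the day

def apply_fee(amounts, rate=0.05):
    """Apply one function to the entire batch in a single action,
    like a spreadsheet formula filled down a column."""
    return [round(a * (1 + rate), 2) for a in amounts]

# The whole queue can then be processed without human intervention,
# e.g. scheduled to run after the close of business.
print(apply_fee(transactions))  # [105.0, 262.5, 84.0]
```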

Batch processing can complete a queue of tasks without human intervention, and data systems may assign priorities to certain functions or set times when batch processing can be completed.
Banks typically use this process to execute transactions after the close of business, where computers
are no longer involved in data capture and can be dedicated to processing functions.

5. Real Time Data Processing

For commercial uses, many large data processing applications require real-time processing; that is, they need to get results from the data exactly as it happens. One application of this that most of us can identify with is tracking stock market and currency trends. The data needs to be updated immediately, since investors buy in real time and prices update by the minute. Data on airline schedules and ticketing, and GPS tracking applications in transport services, have similar needs for real-time updates.
The most common technology used in real time processing is stream processing. The data analytics
are drawn directly from the stream, that is, at the source. Where data is used to draw conclusions
without uploading and transforming, the process is much quicker.
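
As a minimal sketch of that idea, the Python snippet below uses a generator to stand in for a live feed: each event is folded into the running analytics the moment it arrives, with no separate upload-and-transform step. The ticker symbols and prices are invented.

```python
# Stream processing sketch: analytics drawn directly from the stream.
# The price feed is simulated; a real system would read from a market
# data socket or message queue.

def price_feed():
    """Stands in for a live source, yielding (symbol, price) events."""
    yield from [("ABC", 10.0), ("ABC", 10.5), ("XYZ", 99.0), ("ABC", 9.8)]

latest = {}  # most recent price per symbol, updated as the data happens
for count, (symbol, price) in enumerate(price_feed(), start=1):
    latest[symbol] = price  # updated immediately, at the source
    print(f"after event {count}: {latest}")
```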

Data virtualization techniques are another important development in real-time data processing: the data remains in its source form, and only the information needed is pulled for data processing. The beauty of data virtualization is that, where transformation is not necessary, error is reduced.

Data virtualization and stream processing mean that data analytics can be drawn in real time much
quicker, benefiting many technical and financial applications, reducing processing times and errors.

Beyond these popular data processing techniques, there are three more processing techniques, described below.

6. ONLINE PROCESSING

This data processing technique is derived from automatic data processing. It is also known as immediate or random-access processing. Under this technique, transactions are processed by the system at the time they occur, as seen in the continuous processing of data sets. This processing method emphasizes the fast input of transaction data and connects directly with the databases.

7. MULTI PROCESSING

This is the most commonly used data processing technique, applied all over the globe wherever there are computer-based setups for data capture and processing. As the name suggests, multiprocessing is not bound to one single CPU; it uses a collection of several CPUs. Because several processing devices work on the job at once, throughput is high: jobs are broken into frames and sent to the multiple processors, results are obtained in less time, and output is increased. An additional benefit is that every processing unit is independent, so the failure of one will not impact the working of the other processing units.
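
A minimal sketch of this idea, using Python's standard multiprocessing module: one job is broken into chunks (the "frames" above) and farmed out to a pool of worker processes. Summing squares is a hypothetical stand-in for real work.

```python
# Multiprocessing sketch: one job split across several CPUs.
from multiprocessing import Pool

def process_chunk(chunk):
    """Work done independently by each processing unit."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Break the job into frames (chunks) for the workers.
    chunks = [data[i:i + 250_000] for i in range(0, len(data), 250_000)]
    with Pool(processes=4) as pool:
        partial_sums = pool.map(process_chunk, chunks)  # run in parallel
    print(sum(partial_sums))
```

Because each worker is a separate process with its own memory, trouble in one worker does not corrupt the others, which mirrors the independence benefit described above.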
8. TIME SHARING

This kind of data processing is entirely based on time: one processing unit is used by several users. Each user is allocated set timings during which they work on the same CPU/processing unit. Time is divided into segments and assigned to users so that there is no clash of timings, which makes this a multi-access system. This processing technique is also widely used, and is especially popular in startups.
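
To illustrate the time-slice idea, here is a toy round-robin scheduler in Python: each user's job receives a fixed quantum on the one shared processor before the next job runs. The user names, workloads, and quantum are invented.

```python
# Time-sharing sketch: a toy round-robin scheduler.
# Each user job gets a fixed time slice (quantum) on one shared CPU.
from collections import deque

QUANTUM = 2  # time units per slice; hypothetical

jobs = deque([("alice", 5), ("bob", 3), ("carol", 4)])  # (user, work left)

clock = 0
while jobs:
    user, remaining = jobs.popleft()
    ran = min(QUANTUM, remaining)
    clock += ran
    print(f"t={clock}: ran {user} for {ran} unit(s)")
    if remaining > ran:
        jobs.append((user, remaining - ran))  # rejoin queue for another slice
```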

QUICK TIPS TO ANALYSE BEST PROCESSING TECHNIQUES

1. Understand your requirement before choosing the best processing technique for your project.
2. Filter your data precisely so that processing techniques can be applied.
3. Recheck the filtered data to make sure it still represents the original requirement and that no important fields are missing.
4. Think about the output you would like to have, so you can work towards it.
5. Now that you have the filtered data and the output you wish to have, choose the best and most reliable processing technique.
6. Once you choose your technique according to your requirement, it will be easy to follow through to the end result.
7. Check the chosen technique as you go, so there are no loopholes, in order to avoid mistakes.
8. Always apply ETL functions to recheck your datasets.
9. Don't forget to apply a timeline to your requirement; without a specific timeline it is useless to apply energy.
10. Test your output again against the initial requirement for a better delivery.

Summary

This has been a little bit of an introduction to some of the different types of data processing. If you
like what you’ve read here and want to learn more, take a look around on our blog for more about
data processing systems.

Effects of IT on Internal Controls


1. Principles of a Reliable System and Examples of Overall Risk

 a reliable system is one that is capable of operating without material error, fault, or failure during a specified time in a specified environment

 AICPA Trust Services: provides assurance on information systems, using a framework with five principles for evaluating a system; when a principle is not met, a risk exists:

Principles and examples of risk:

Security: protected against unauthorized access (physical and logical)
 Physical access risks: weather, acts of war, disgruntled employees
 Logical access risks: malicious (or accidental) alteration of or damage to files and/or the system; computer-based fraud; unauthorized access to confidential data

Availability: available for operation and use as agreed and in conformity with policies
 Risks: interruption of business operations; loss of data

Processing Integrity: complete, accurate, timely, and authorized
 Risks: invalid, incomplete, or inaccurate input data, data processing, updating of master files, and creation of output

Online Privacy: personal information obtained as a result of e-commerce is collected, used, disclosed, and retained as agreed
 Risks: disclosure of customer information, such as SSNs, credit card numbers, or credit ratings

Confidentiality: protected as agreed
 Risks: disclosing confidential data, such as transaction details, business plans, and legal documents

2. Control Environment

a. Segregation of Controls

 at minimum, segregate programming, operations, and the library function within the information systems department

 a more complete segregation of key functions within the IS department would be to separate the following:

o System analysis- analyzes the present user environment and requirements and may (1) recommend changes, (2) recommend the purchase of a new system, or (3) design a new information system

o Systems programming- responsible for implementing, modifying, and debugging the software necessary for making the hardware work

o Applications programming- responsible for writing, testing, and debugging the application programs from specifications provided by the systems analyst

o Database administration- responsible for maintaining the database and restricting access to it to authorized personnel

o Data preparation- data may be prepared by user departments and input by key to magnetic tape or disk

o Operations- responsible for daily computer operations of both hardware and software; mounts tapes on the tape drives, supervises operations on the operator’s console, accepts any required input, and distributes generated output; also responsible for help desks

o Data library- responsible for custody of the removable media and for the maintenance of program and system documentation
o Data control- acts as liaison between users and the processing center; records input data in a control log, follows the progress of processing, distributes output, and ensures compliance with control totals

 can also include a number of Web-related positions, such as:

o Web administrator- responsible for overseeing the development, planning, and implementation of the Web site; usually managerial

o Web master- responsible for providing expertise and leadership in the development of the Web site, including design, analysis, security, maintenance, content development, and updates

o Web designer- responsible for creating the visual content of the Web site

o Web coordinator- responsible for daily operations of the Web site

o Internet developer- responsible for writing programs for commercial use

o Intranet/Extranet developer- responsible for writing programs based on the needs of the company

3. Risk Assessment

 changes in computerized information systems and in operations may increase the risk of improper processing

4. Information and Communication

 affected by whether the company uses small computers and/or a complex mainframe system

o Small computer system: purchased commercial (“off-the-shelf”) software may be used

o Complex mainframe system: a significant portion of the software is ordinarily developed within the company (greater control testing)

5. Monitoring

 requires adequate computer skills to evaluate the propriety of processing of computerized applications

 involves reviewing system-access logs for inappropriate access

 IT can evaluate data and transactions based on established criteria and highlight items that appear unusual

6. Control Activities- Overall


 can be divided into general control activities, computer application control activities (programmed control activities and manual follow-up of computer exception reports), and user control activities

7. Computer General Control Activities

 Developing new programs and systems

o Segregation Controls

 User departments participate in systems design

 Both users and information systems personnel test new systems.

 Management, users, and information systems personnel approve new systems before they are placed into operation.

 All master and transaction file conversions should be controlled to prevent unauthorized changes and to verify the accuracy of the results.

 Programs and systems should be properly documented

o Computer Hardware

 Parity check- a small bit added to each character that can detect whether the hardware loses a bit during the internal movement of a character (see the sketch after this list)

 Echo check- during the sending and receiving of characters, the receiving hardware repeats back to the sending hardware what it received, and the sending hardware automatically resends any characters that were received incorrectly

 Diagnostic routines- hardware or software supplied by the manufacturer to check the internal operations and devices within the computer system

 Boundary protection- ensures that simultaneous jobs running on the CPU cannot destroy or change the allocated memory of another job

 Periodic maintenance
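
As a rough illustration of the parity idea (a simplified model, not any particular machine's hardware), the Python sketch below appends an even-parity bit to each character and detects a character whose bits were corrupted in movement.

```python
# Parity check sketch: an extra bit is added to each character so the
# hardware can detect the loss or flip of a single bit in movement.

def add_parity(bits):
    """Append a parity bit that makes the total count of 1s even."""
    return bits + [sum(bits) % 2]

def check_parity(bits_with_parity):
    """Valid only if the 1s, including the parity bit, sum to an even number."""
    return sum(bits_with_parity) % 2 == 0

char = [1, 0, 1, 1, 0, 0, 1]   # 7 data bits of one character
sent = add_parity(char)        # becomes [1, 0, 1, 1, 0, 0, 1, 0]
print(check_parity(sent))      # True: arrived intact

corrupted = sent.copy()
corrupted[2] ^= 1              # one bit flipped during internal movement
print(check_parity(corrupted)) # False: error detected
```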

 Changing existing programs and systems

o proper change control procedures should be in place:

 IS manager should review all changes.


 The modified program should be tested.

 Details of all changes should be documented.

 A code comparison program may be used to compare source and/or object codes of a controlled copy of the program with the program currently used

 Controlling access to programs and data

o Segregation Controls

 Access to program documentation should be limited to those persons who require it in the performance of their duties

 Access to data files and programs should be limited to those individuals authorized to process data

 Access to computer hardware should be limited to authorized individuals such as computer operators and their supervisors

o Physical Access to Computer Facility

 Limited physical access- guards, automated key card, manual locks

 Visitor entry logs

o Hardware and Software Access Controls

 Access control software (user identification)- the most common is a combination of a unique identification code and a confidential password (letters, numbers, and symbols; changed periodically; deactivated promptly when an employee leaves the company)

 Call back- a specialized form of user identification in which the user dials the system, identifies himself or herself, and is disconnected from the system; then either an individual manually finds the authorized telephone number or the system automatically finds the authorized telephone number, and calls back

 Encryption- protects data, since to use the data unauthorized users must not only obtain access but also translate the coded data; encryption performed by physically secure hardware is ordinarily more costly than that performed by software (see the sketch below)
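
As a small illustration of the encryption control, here is a sketch using the third-party Python package cryptography (an assumption on my part; the text above does not name a tool). Without the key, an unauthorized user who obtains the coded data cannot translate it.

```python
# Encryption sketch using the third-party "cryptography" package
# (pip install cryptography). The message content is invented.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # must be kept secret and managed separately
cipher = Fernet(key)

token = cipher.encrypt(b"payroll batch: net pay 1,250.00")
print(token)                  # coded data: useless without the key

print(cipher.decrypt(token))  # only a key holder can translate it back
```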

 Controlling computer operations

o Segregation Controls
 operators should have an operations manual that contains instructions for processing programs and solving routine operational problems, but not detailed program documentation

 a control group should monitor the operator’s activities, and jobs should be scheduled

o Other Controls

 Backup and recovery

 Contingency processing

 File protection ring- a processing control to ensure that an operator does not use a magnetic tape as a tape to write on when it actually has critical information on it; a tape can be written on only when the ring is on it

 Internal and external labels

8. Computer Application Control Activities- Programmed Control Activities

 Input Controls

o Overall: inputs should be authorized and approved; the system should verify all significant data fields used to record information; conversion of data into machine-readable form should be controlled and verified for accuracy

o Preprinted form

o Check digit- an extra digit added to an ID number to detect data transmission errors (see the sketch after this list)

o Control, batch, or proof total- a total of one numerical field for all the records of a batch that would normally be added (e.g., total sales dollars)

o Hash total- a control total where the total is meaningless for financial purposes (e.g., the sum of all invoice numbers)

o Record count- a control total of the total records processed

o Limit (reasonableness) test- a test of the reasonableness of a field of data, given a predetermined upper and/or lower limit

o Menu driven input- menu prompts the proper response

o Field check- control that limits the types of characters accepted into a specific data field

o Validity check- control that allows only “valid” transactions or data to be entered into the system

o Missing data check- control that searches for blanks inappropriately existing in input data
o Field size check- control of an exact number of characters to be input

o Logic check- ensures that illogical combinations of input are not accepted

o Redundant data check- uses two identifiers in each transaction record to confirm that the correct master file record is being updated

o Closed-loop verification- control that allows data entry personnel to check the accuracy of input data (for example, by displaying the record retrieved for update); may be used instead of a redundant data check

 Processing controls: should include limit tests, record counts, and control totals
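
As a concrete illustration of several of these programmed input controls, the Python sketch below computes a simple mod-10 check digit and the batch totals named above (control total, hash total, and record count). The digit-sum scheme and the invoice records are hypothetical; real systems use schemes defined by their own standards, such as the Luhn algorithm, which also catches transposed digits.

```python
# Input control sketch: a simple mod-10 check digit plus batch totals.
# Records and the digit-sum scheme are hypothetical examples.

def check_digit(id_number: str) -> int:
    """Extra digit appended to an ID so transmission errors are detectable."""
    return sum(int(d) for d in id_number) % 10

def validate_id(id_with_digit: str) -> bool:
    return check_digit(id_with_digit[:-1]) == int(id_with_digit[-1])

batch = [
    {"invoice_no": 1001, "amount": 250.00},
    {"invoice_no": 1002, "amount": 99.50},
    {"invoice_no": 1003, "amount": 410.00},
]

record_count = len(batch)                          # record count
control_total = sum(r["amount"] for r in batch)    # financially meaningful
hash_total = sum(r["invoice_no"] for r in batch)   # meaningless sum of IDs

# After processing, the same totals are recomputed and compared; any
# mismatch signals lost, duplicated, or altered records.
print(record_count, control_total, hash_total)     # 3 759.5 3006

acct = "12345"
print(validate_id(acct + str(check_digit(acct))))  # True
```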

9. Application Control- Manual Follow-Up of Computer Exception Reports

 involve employee follow-up of items listed on computer exception reports

 effectiveness depends on the effectiveness of both the programmed control activities that produce the exception reports and the manual follow-up activities

10. User Control Activities to Test Completeness and Accuracy of Computer-Processed Controls

 manual controls include:

o Checks of computer output against source documents, control totals, or other input to provide assurance that programmed aspects of the financial reporting system and control activities have operated effectively

o Reviewing computer processing logs to determine that all of the correct computer jobs executed properly

o Maintaining proper procedures and communications specifying authorized recipients of output

 often performed by both control group and users

11. Disaster Recovery and Business Continuity

 should allow firm to:

o Minimize the extent of disruption, damage, and loss

o Establish an alternate (temporary) method for processing information

o Resume normal operations as quickly as possible

o Train and familiarize personnel to perform emergency operations


 should include the following:

o Priorities

o Insurance to defray costs

o Backup approaches

 Batch systems: Grandfather-Father-Son method: store the grandfather and son files in separate locations (see the sketch at the end of this section)

 Online databases and master file systems

 Checkpoint: copy database and store separately

 Rollback: to undo changes made to a database back to a point at which it was functioning properly

 Backup facilities: reciprocal agreements (with another company to aid in disaster recovery), hot sites, cold sites, and internal sites

o Specific assignments for individuals, including:

 Arranging for new facilities

 Computer operations

 Installing software

 Establishing data communications facilities

 Recovering vital records

o Periodic testing and updating of plan

o Documentation of plan
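
To make the Grandfather-Father-Son rotation concrete, here is a toy Python sketch of the three-generation backup cycle mentioned under backup approaches; the file names and rotation are invented for illustration.

```python
# Grandfather-Father-Son sketch: three generations of a master file are
# retained, and the generations rotate after each batch update; the
# grandfather and son copies are kept at separate locations.

generations = {"grandfather": "master_v1", "father": "master_v2", "son": "master_v3"}

def run_batch_update(new_master: str):
    """Rotate the generations after producing a new master file."""
    generations["grandfather"] = generations["father"]  # father ages a step
    generations["father"] = generations["son"]          # son becomes father
    generations["son"] = new_master                     # newest master is the son

run_batch_update("master_v4")
print(generations)
# {'grandfather': 'master_v2', 'father': 'master_v3', 'son': 'master_v4'}
```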
