
Computer Generations

The following technology definitions will help you to better understand the five generations of computing:
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often
enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal
of electricity, the first computers generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood
by computers, to perform operations, and they could only solve one problem at a time; it could take days
or weeks to set up a new problem. Input was based on punched cards and paper tape, and output was
displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer; it was delivered to its first client, the U.S. Census Bureau, in 1951.

A UNIVAC computer at the Census Bureau.

Image Source: United States Census Bureau
The ENIAC was developed by Army Ordnance to compute World War II ballistic firing tables. It weighed 30 tons and used 200 kilowatts of electric power.
Second Generation (1956-1963) Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was
invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far
superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient
and more reliable than their first-generation predecessors.
Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast
improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and
printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly,
languages, which allowed programmers to specify instructions in words. High-level programming
languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These
were also the first computers that stored their instructions in their memory, which moved from a magnetic
drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.
Third Generation (1964-1971) Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors
were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed
and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers
through keyboards and monitors and interfaced with an operating system, which allowed the device to run
many different applications at one time, with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation (1971-Present) Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were
built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of
the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh.
Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and
more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which
eventually led to the development of the Internet. Fourth generation computers also saw the development
of GUIs, the mouse and handheld devices.
Fifth Generation (Present and Beyond) Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there
are some applications, such as voice recognition, that are being used today. The use of parallel
processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and
molecular and nanotechnology will radically change the face of computers in years to come. The goal of
fifth-generation computing is to develop devices that respond to natural language input and are capable of
learning and self-organization.
Did You Know...? An integrated circuit (IC) is a small electronic device made out of a semiconductor material. The first integrated circuits were developed independently in the late 1950s by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor.

Certain characteristics of computer interaction can make computers well suited for distance learning. The features listed below make the prospect of computer use look more promising:
Access to experts and respected peers.
One-to-one and one-to-many communication.
Active learner participation.
Linking of new learning to concrete on-the-job problems.
Follow-up, feedback and implementation support from peers or experts.
Self-directed control over the start and stop, time, pace and place of learning or communication activity.

Computers are used in all areas of human life and have revolutionized all phases of human activity. The most important applications are as follows:
Routine job handling
The routine clerical and stereotyped jobs: calculating bills and salaries, updating stocks, tax returns, reservation records and information.
Traffic control
Computers control traffic and traffic lights; television cameras are used to maintain the traffic-light routine.
Electronic money
Automatic teller machines (ATMs) are very common in banks. You can deposit and withdraw money with an ATM.
Electronic office
All types of information are stored, manipulated and utilized in electronic form. A document can be sent to different places by fax, the Internet and e-mail.
Industrial Application
Computers play an important role in production control and are bringing efficiency to trade and industry.
With the help of computerized telephone exchanges and satellites, STD and ISD services have been introduced; the computer maintains the record of calls and does the billing for you.
Computers are used successfully in every type of trade: in banks and stock exchanges they control stocks and accounts.
Scientific research
In every science, research work becomes economical from the time, energy and money points of view, and large amounts of data are analyzed very quickly.
Computers are widely used in medical science, e.g. ECG, CAT scan and ultrasound. Proper and accurate diagnosis is done with the help of the computer, and medical apparatus is computer-controlled.
Space Science
Satellites in space are controlled with the help of computers, and information from space satellites is collected by computer.
Publication
The composing work is done speedily and economically with the help of computers. Designing work is also done by computer, and quality is maintained in publications by computer.
The computer is used for sending messages, for example by printer, fax, e-mail and the Internet. The import and export work is also done on computers.
Film industry
Computers have influenced the film industry through animation, titling, etc. The multimedia approach is used in film production with the help of computers, and cartoon films are developed on computers.
The computer is widely used in the field of education, and independent study in the field of computer science has developed, which is popular these days. At every stage the computer is compulsory. Distance education uses the computer for instructional purposes as a multimedia approach. The computer makes the teaching-learning process effective by involving the audio and visual senses of learners.

A language is defined as a medium for the expression of thoughts. All the human beings in this world communicate with each other by a language. Similarly, a computer also needs some medium of expression to communicate with others.
A computer follows the instructions given by the programmer to perform a specific job. To perform a particular task, the programmer prepares a sequence of instructions, known as a program. A program written for a computer is known as software. The program is stored in RAM. The CPU takes one instruction of the program at a time from RAM and executes it. The instructions are executed one by one in sequence and finally produce the desired result.
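This fetch-execute cycle can be sketched in a few lines of Python. The instruction names (LOAD, ADD, HALT) and the single-accumulator design below are invented for illustration; they do not correspond to any real CPU:

```python
# A toy sketch of the fetch-execute cycle: a "program" sits in a list
# standing in for RAM, and the CPU loop takes one instruction at a time
# and executes it in sequence until it is told to stop.
def run(program):
    acc = 0          # accumulator register holding the working value
    pc = 0           # program counter: index of the next instruction
    while True:
        op, *args = program[pc]   # fetch one instruction from "RAM"
        pc += 1
        if op == "LOAD":          # put a value into the accumulator
            acc = args[0]
        elif op == "ADD":         # add a value to the accumulator
            acc += args[0]
        elif op == "HALT":        # stop and return the result
            return acc

program = [("LOAD", 2), ("ADD", 3), ("HALT",)]
print(run(program))  # 5
```

Each pass through the loop is one fetch-execute step: the CPU fetches the instruction the program counter points at, advances the counter, and carries the instruction out.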
The journey of computer software, from machine language to high-level languages to modern 4GL/5GL languages, is an interesting one. Let us talk about this in detail.
When human beings started programming the computer, the instructions were given to it in a language that it could easily understand. And that language was machine language. The binary language, a language of 1s and 0s, is known as machine language. Any instruction in this language is given in the form of a string of 1s and 0s, where the symbol 1 stands for the presence of an electrical pulse and 0 stands for the absence of an electrical pulse. A set of 1s and 0s such as 11101101 has a specific meaning to a computer, even though it appears as a binary number to us.
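As a sketch of how such a bit pattern acquires meaning, the Python fragment below splits the byte 11101101 into two fields under a made-up instruction format (high four bits as the operation code, low four bits as the operand). Real processors each define their own layouts:

```python
# The byte 11101101 means nothing by itself; a processor gives it
# meaning by carving it into fields. Assumed format (hypothetical):
# top 4 bits = opcode, bottom 4 bits = operand.
word = 0b11101101

opcode  = (word >> 4) & 0b1111   # high nibble: 1110
operand = word & 0b1111          # low nibble:  1101

print(bin(opcode), bin(operand))  # 0b1110 0b1101
```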
Writing programs in machine language is very cumbersome and complicated, and it was accomplished by experts only. All the instructions and input data are fed to the computer in numeric form, specifically a binary form.
Many efforts were made during the last 50 years to obviate the difficulties faced in using machine language. The first language similar to English was developed in 1950, which was known as Assembly Language or Symbolic Programming Language. After 1960, the high-level languages were developed, which brought the common man very close to the computer. And this was the main reason for the tremendous growth of the computer industry. The high-level languages are also known as Procedure-Oriented Languages.
The assembly language was easier to use than machine language, as it relieved the programmer of the burden of remembering the operation codes and addresses of memory locations. Even though the assembly languages proved to be a great help to the programmer, the search continued for still better languages, nearer to the conventional English language. The languages developed nearer to the English language, for use in writing programs, around 1960 were known as high-level languages.
The different high-level languages which can be used by the common user are FORTRAN, COBOL, BASIC, PASCAL, PL/1 and many others. Each high-level language was developed to fulfill some basic requirements for a particular type of problem. But further developments have been made in each language to widen its utility for different purposes.
The 3GLs are procedural in nature, i.e., the HOW of the problem gets coded: the procedures require the knowledge of how the problem will be solved. Contrary to them, 4GLs are non-procedural: only the WHAT of the problem is coded, i.e., only what is required is specified, and the rest gets done on its own.
Thus a big program in a 3GL may be replaced by a single statement in a 4GL. The main aim of 4GLs is to cut down on development and maintenance time and make things easier for users.
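The contrast can be illustrated with a small, hypothetical example. The procedural version spells out every step of the computation, while the SQL statement (SQL being a typical 4GL-style query language, run here through Python's built-in sqlite3 module) states only what result is wanted; the table name and data are invented:

```python
# The same question asked two ways: a 3GL spells out HOW (loop, test,
# accumulate), while a 4GL-style query states only WHAT is wanted.
import sqlite3

rows = [("pen", 5), ("book", 120), ("lamp", 80)]

# 3GL style: procedural. We describe every step ourselves.
total = 0
for name, price in rows:
    if price > 50:
        total += price

# 4GL style: declarative. SQL names the result; the engine finds it.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE items (name TEXT, price INTEGER)")
db.executemany("INSERT INTO items VALUES (?, ?)", rows)
(total_sql,) = db.execute(
    "SELECT SUM(price) FROM items WHERE price > 50").fetchone()

print(total, total_sql)  # 200 200
```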
With the invention and popularity of GUI-based interfaces, GUI-based languages evolved. GUI-based languages include:
1. TCL/Tk
2. Visual Basic
3. Visual C++
4. C# (pronounced as C sharp)
5. Visual Basic .NET
6. Visual Basic 2005

When was the first computer invented?

There is no easy answer to this question due to the many different
classifications of computers. The first mechanical computer, created by Charles
Babbage in 1822, doesn't really resemble what most would consider a
computer today. Therefore, this document has been created with a listing of
each of the computer firsts, starting with the Difference Engine and leading up
to the computers we use today.
Note: Early inventions which helped lead up to the computer, such as
the abacus, calculator, and tablet machines, are not accounted for in this document.
The word "computer" was first used
The word "computer" was first recorded as being used in 1613 and originally was
used to describe a human who performed calculations or computations. The definition
of a computer remained the same until the end of the 19th century, when the
industrial revolution gave rise to machines whose primary purpose was calculating.
First mechanical computer or automatic computing engine concept
In 1822, Charles Babbage conceptualized and began developing the Difference Engine, considered to be the first automatic computing machine. The Difference Engine was capable of computing several sets of numbers and making hard copies of the results. Babbage received some help with development of the Difference Engine from Ada Lovelace, considered by many to be the first computer programmer for her work and notes on Babbage's engines. Unfortunately, because of funding, Babbage was never able to complete a full-scale functional version of this machine. In June of 1991, the London Science Museum completed the Difference Engine No. 2 for the bicentennial year of Babbage's birth, and later completed the printing mechanism in 2000.

In 1837, Charles Babbage proposed the first general mechanical computer, the Analytical Engine. The Analytical Engine contained
an Arithmetic Logic Unit (ALU), basic flow control, punch cards (inspired by
the Jacquard Loom), and integrated memory. It is the first general-purpose computer
concept. Unfortunately, because of funding issues, this computer was also never built
while Charles Babbage was alive. In 1910, Henry Babbage, Charles Babbage's
youngest son, was able to complete a portion of this machine and was able to
perform basic calculations.
First programmable computer
The Z1 was created by German Konrad Zuse in his parents' living room between 1936 and 1938. It is considered to be the first electro-mechanical binary programmable computer, and the first really functional modern computer.

First concepts of what we consider a modern computer

The Turing machine was first proposed by Alan Turing in 1936 and became the
foundation for theories about computing and computers. The machine was a device
that printed symbols on paper tape in a manner that emulated a person following a
series of logical instructions. Without these fundamentals, we wouldn't have the
computers we use today.
The first electric programmable computer

The Colossus was the first electric programmable computer, developed by Tommy Flowers, and first demonstrated in
December 1943. The Colossus was created to help the British code breakers read
encrypted German messages.
The first digital computer
Short for Atanasoff-Berry Computer, the ABC began development by
Professor John Vincent Atanasoff and graduate student Cliff Berry in 1937. Its
development continued until 1942 at the Iowa State College (now Iowa State University).
The ABC was an electrical computer that used more than 300 vacuum tubes for digital
computation, including binary math and Boolean logic and had no CPU (was not
programmable). On October 19, 1973, the US Federal Judge Earl R. Larson signed his
decision that the ENIAC patent by J. Presper Eckert and John Mauchly was invalid and
named Atanasoff the inventor of the electronic digital computer.
The ENIAC was invented by J. Presper Eckert and John Mauchly at the University of Pennsylvania; construction began in 1943 and was not completed until 1946. It
occupied about 1,800 square feet and used about 18,000 vacuum tubes, weighing
almost 50 tons. Although the Judge ruled that the ABC computer was the first digital
computer, many still consider the ENIAC to be the first digital computer because it
was fully functional.
The first stored program computer
The early British computer known as the EDSAC is considered to be the first stored
program electronic computer. The computer performed its first calculation on May
6, 1949, and was the computer that ran the first graphical computer game, the noughts-and-crosses program OXO.

Around the same time, the Manchester Mark 1 was another computer that could
run stored programs. Built at the Victoria University of Manchester, the first version of
the Mark 1 computer became operational in April 1949. Mark 1 was used to run a
program to search for Mersenne primes for nine hours without error on June 16 and
17 that same year.
The first computer company
The first computer company was the Electronic Controls Company and was
founded in 1949 by J. Presper Eckert and John Mauchly, the same individuals who
helped create the ENIAC computer. The company was later renamed to EMCC or
Eckert-Mauchly Computer Corporation and released a series of mainframe computers
under the UNIVAC name.
First stored program computer

First delivered to the United States government in 1950, the UNIVAC 1101 or ERA 1101 is considered to be the first computer that was capable of storing and running a program from memory.
First commercial computer
In 1942, Konrad Zuse began working on the Z4, which later became the first commercial
computer. The computer was sold to Eduard Stiefel, a mathematician of the Swiss
Federal Institute of Technology Zurich on July 12, 1950.
IBM's first computer
On April 7, 1953, IBM publicly introduced the 701, its first commercial scientific computer.
The first computer with RAM
MIT introduced the Whirlwind machine on March 8, 1955, a revolutionary computer
that was the first digital computer with magnetic core RAM and real-time graphics.

The first transistor computer

The TX-0 (Transistorized Experimental computer) is the first transistorized computer, demonstrated at the Massachusetts Institute of Technology in 1956.
The first minicomputer
In 1960, Digital Equipment Corporation released its first of many PDP computers,
the PDP-1.
The first desktop and mass-market computer
In 1964, the first desktop computer, the Programma 101, was unveiled to the
public at the New York World's Fair. It was invented by Pier Giorgio Perotto and
manufactured by Olivetti. About 44,000 Programma 101 computers were sold, each
with a price tag of $3,200.
In 1968, Hewlett Packard began marketing the HP 9100A, considered to be the first
mass-marketed desktop computer.
The first workstation
Although it was never sold, the first workstation is considered to be the Xerox Alto,
introduced in 1974. The computer was revolutionary for its time and included a fully
functional computer, display, and mouse. The computer operated like many
computers today utilizing windows, menus and icons as an interface to its operating
system. Many of the computer's capabilities were first demonstrated in The Mother of
All Demos by Douglas Engelbart on December 9, 1968.
The first microprocessor
Intel introduced the first microprocessor, the Intel 4004, on November 15, 1971.
The first micro-computer
The Vietnamese-French engineer André Truong Trong Thi, along with François Gernelle, developed the Micral computer in 1973. Considered to be the first "microcomputer", it used the Intel 8008 processor and was the first commercial non-assembly computer. It originally sold for $1,750.
The first personal computer
In 1975, Ed Roberts coined the term "personal computer" when he introduced the Altair 8800, although the first personal computer is considered by many to be the KENBAK-1, which was first introduced for $750 in 1971. That computer relied on a series of switches for inputting data and output data by turning on and off a series of lights.

The first laptop or portable computer

The IBM 5100 was the first portable computer, released in September 1975. The computer weighed 55 pounds and had a five-inch CRT display, tape drive, 1.9 MHz PALM processor, and 64 KB of RAM. In the picture is an ad for the IBM 5100 taken from a November 1975 issue of Scientific American.
The first truly portable computer or laptop is considered to be the Osborne I, which was released in April 1981 and developed by Adam Osborne. The Osborne I weighed
24.5 pounds, had a 5-inch display, 64 KB of memory, two 5 1/4" floppy drives, ran
the CP/M 2.2 operating system, included a modem, and cost US$1,795.
The IBM PC Division (PCD) later released the IBM portable in 1984, its first portable
computer, which weighed in at 30 pounds. Later, in 1986, IBM PCD announced its first laptop computer, the PC Convertible, weighing 12 pounds. Finally, in 1994, IBM
introduced the IBM ThinkPad 775CD, the first notebook with an integrated CD-ROM.
The first Apple computer
The Apple I (Apple 1) was the first Apple computer; it originally sold for $666.66. The computer kit was developed by Steve Wozniak in 1976 and contained a 6502 8-bit processor and 4 KB of memory, which was expandable to 8 or 48 KB using expansion cards. Although the Apple I had a fully assembled circuit board, the kit still required a power supply, display, keyboard, and case to be operational. Below is a
picture of an Apple I from an advertisement by Apple.

The first IBM personal computer

IBM introduced its first personal computer, called the IBM PC, in 1981. The computer was code-named, and is still sometimes referred to as, the Acorn. It had an 8088 processor and 16 KB of memory, which was expandable to 256 KB, and utilized MS-DOS.
The first PC clone
The Compaq Portable is considered to be the first PC clone and was released in March 1983 by Compaq. The Compaq Portable was 100% compatible with IBM computers and was capable of running any software developed for IBM computers. See the other computer company firsts below for other IBM-compatible computers.
The first multimedia computer
In 1992, Tandy Radio Shack became one of the first companies to release a computer based on the MPC standard, with its introduction of the M2500 XL/2 and M4020 SX.
Other computer company firsts
Below is a listing of some major computer companies' first computers.
Commodore - In 1977, Commodore introduced its first computer, the "Commodore PET."
Compaq - In March 1983, Compaq released its first computer and the first 100% IBM-compatible computer, the "Compaq Portable."
Dell - In 1985, Dell introduced its first computer, the "Turbo PC."
Hewlett Packard - In 1966, Hewlett Packard released its first general computer.
NEC - In 1958, NEC built its first computer, the "NEAC 1101."
Toshiba - In 1954, Toshiba introduced its first computer, the "TAC" digital computer.
A language is a system of communication. Humans communicate with one another in some
language, like English, German or in many other languages. We, humans, can also communicate
through gestures, facial expressions, even through our emotions we can express ourselves and our
feelings. In order to make computers work for us, some sort of instructions must be stored in some
kind of language. And that language is called a Programming Language. A programming language
consists of all the symbols, characters, and usage rules that permit people to communicate with
computers. There are at least several hundred, and possibly several thousand different programming
languages. Some of these are created to serve a special purpose (controlling a robot), while others
are more flexible general-purpose tools that are suitable for many types of applications.
What is a Programming Language?
A programming language is a set of written symbols that instructs the computer hardware to perform specific tasks. Typically, a programming language consists of a vocabulary and a set of rules (called syntax) that the programmer must learn.
1st generation of programming languages
Machine language is the only programming language that the computer can understand directly
without translation. It is a language made up of entirely 1s and 0s. There is not, however, one
universal machine language because the language must be written in accordance with the special
characteristics of a given processor. Each type or family of processor requires its own machine
language. For this reason, machine language is said to be machine-dependent (also called hardware-dependent).
In the computer's first generation, programmers had to use machine language because no other
option was available. Machine language programs have the advantage of very fast execution speeds
and efficient use of primary memory. However, use of machine language is a very tedious, difficult and time-consuming method of programming. Machine language is a low-level language. Since the programmer must specify every detail of an operation, a low-level language requires that the programmer have detailed knowledge of how the computer works. Programmers had to know a great deal about the computer's design and how it functioned. As a result, programs were few in number and lacked complexity. To make programming simpler, other easier-to-use programming
languages have been developed. These languages, however must ultimately be translated into
machine language before the computer can understand and use them.
2nd Generation of programming languages
The first step in making software development easier and more efficient was the creation
of Assembly languages. They are also classified as low-level languages because detailed
knowledge of hardware is still required. They were developed in the 1950s. Assembly languages use
mnemonic operation codes and symbolic addresses in place of 1s and 0s to represent the operation
codes. A mnemonic is an alphabetical abbreviation used as memory aid. This means a programmer
can use an abbreviation instead of having to remember lengthy binary instruction codes. For example, it is much easier to remember L for Load, A for Add, B for Branch, and C for Compare than the binary equivalents, i.e., different combinations of 0s and 1s.
Assembly language uses symbolic addressing capabilities that simplify the programming process
because the programmer does not need to know or remember the exact storage locations of
instructions or data. Symbolic addressing is the ability to express an address in terms of symbols
chosen by the programmer rather than in terms of the absolute numerical location. Therefore, it is
not necessary to assign and remember a number that identifies the address of a piece of data.
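Both ideas, mnemonics in place of numeric opcodes and symbolic addresses in place of numeric storage locations, can be sketched with a toy assembler. The mnemonics here echo the L/A examples above, but the opcode numbers, symbol names, and memory layout are all invented for illustration:

```python
# A minimal assembler sketch: mnemonics stand in for numeric opcodes,
# and a symbol table lets the programmer write TOTAL instead of a
# numeric address. All numbers here are made up.
OPCODES = {"L": 0b0001, "A": 0b0010, "ST": 0b0011}   # Load, Add, STore
SYMBOLS = {"PRICE": 12, "TAX": 13, "TOTAL": 14}      # name -> address

def assemble(lines):
    machine_code = []
    for line in lines:
        mnemonic, symbol = line.split()
        # pack one byte: opcode in the high nibble, address in the low
        machine_code.append((OPCODES[mnemonic] << 4) | SYMBOLS[symbol])
    return machine_code

source = ["L PRICE", "A TAX", "ST TOTAL"]
for byte in assemble(source):
    print(f"{byte:08b}")
# 00011100
# 00101101
# 00111110
```

The programmer writes the symbolic form on the left; the assembler's lookup tables produce the strings of 1s and 0s that the machine actually executes.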
Although assembly languages represented an improvement, they had obvious limitations. Only
computer specialists familiar with the architecture of the computer being used can use them. And
because they are also machine dependent, assembly languages are not easily converted to run on
other types of computers.
Before they can be used by the computer, assembly languages must be translated into machine
language. A language translator program called an assembler does this
conversion. Assembly languages provide an easier and more efficient way to program than machine
languages while still maintaining control over the internal functions of a computer at the most basic
level. The advantages of programming with assembly languages are that they produce programs that are efficient, use less storage, and execute much faster than programs designed using high-level languages.
3rd Generation of programming languages
Third generation languages, also known as high-level languages, are very much like everyday text
and mathematical formulas in appearance. They are designed to run on a number of different
computers with few or no changes.
Objectives of high-level languages
To relieve the programmer of the detailed and tedious task of writing programs in machine language and assembly language.
To provide programs that can be used on more than one type of machine with very few changes.
To allow the programmer more time to focus on understanding the user's needs and designing the software required to meet those needs.
Most high level languages are considered to be procedure-oriented, or Procedural languages,
because the program instructions comprise lists of steps, procedures, that tell the computer not only
what to do but how to do it. High-level language statements generate, when translated, a
comparatively greater number of assembly language instructions and even more machine language
instructions. The programmer spends less time developing software with a high level language than
with assembly or machine language because fewer instructions have to be created.
A language translator is required to convert a high-level language program into machine language.
Two types of language translators are used with high level languages: compilers and interpreters.
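Python itself can illustrate the translation step: its standard implementation first compiles a high-level statement into lower-level bytecode, which an interpreter loop then executes. The statement compiled here is an arbitrary example:

```python
# CPython first *compiles* a high-level statement into lower-level
# bytecode; an *interpreter* then executes that bytecode one
# instruction at a time. The dis module shows the compiler's output.
import dis

code = compile("total = price + tax", "<demo>", "exec")

# One source statement expands into several bytecode instructions
# (load price, load tax, add them, store the result in total, ...).
dis.dis(code)
```

This mirrors the point above: each high-level statement generates a comparatively greater number of lower-level instructions, which is exactly the work the translator saves the programmer.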
4th Generation of programming languages
Fourth generation languages are also known as very high level languages. They are non-procedural
languages, so named because they allow programmers and users to specify what the computer is
supposed to do without having to specify how the computer is supposed to do it. Consequently,
fourth generation languages need approximately one tenth the number of statements that a high-level language needs to achieve the same results. Because they are so much easier to use than third
generation languages, fourth generation languages allow users, or non-computer professionals, to
develop software.
Objectives of fourth generation languages
Increasing the speed of developing programs.
Minimizing user effort to obtain information from a computer.
Decreasing the skill level required of users so that they can concentrate on the application rather than the intricacies of coding, and thus solve their own problems without the aid of a professional programmer.
Minimizing maintenance by reducing errors and making programs that are easy to change.
Depending on the language, the sophistication of fourth generation languages varies widely. These
languages are usually used in conjunction with a database and its data dictionary.
Five basic types of language tools fall into the fourth generation language category.
1. Query languages
2. Report generators.
3. Applications generators.
4. Decision support systems and financial planning languages.
5. Some microcomputer application software.
Query languages
Query languages allow the user to ask questions about, or retrieve information from, database files by forming requests in normal human-language statements (such as English). The difference between the definitions of query language and database management system software is so slight that most people consider the definitions to be the same. Query languages do have a specific grammar, vocabulary, and syntax that must be mastered, but this is usually a simple task for both users and programmers.
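A minimal sketch of a query in action, using SQL (today's most widely used query language) through Python's built-in sqlite3 module; the table and its rows are invented for illustration:

```python
# A query states WHICH rows are wanted, not HOW to search for them;
# the database engine works out the retrieval on its own.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER)")
db.executemany("INSERT INTO employees VALUES (?, ?, ?)", [
    ("Asha", "Sales", 30000),
    ("Ravi", "Accounts", 45000),
    ("Mona", "Sales", 52000),
])

# "List the names of Sales employees earning over 40000."
query = "SELECT name FROM employees WHERE dept = ? AND salary > ?"
for (name,) in db.execute(query, ("Sales", 40000)):
    print(name)  # Mona
```

The request reads almost like the English sentence in the comment, which is exactly the property that puts query languages in the fourth generation.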
Report generators
Report generators are similar to query languages in that they allow users to ask questions from a
database and retrieve information from it for a report (the output); however, in case of a report
generator, the user is unable to alter the contents of the database file. And with a report generator,
the user has much greater control over what the output will look like. The user of a report generator
can specify that the software automatically determine how the output should look or can create his
or her own customized output reports using special report generator command instructions.
Application generators
Application generators allow the user to reduce the time it takes to design an entire software
application that accepts input, ensures data has been input accurately, performs complex
calculations and processing logic, and outputs information in the form of reports. The user must key into computer-usable form the specifications for what the program is supposed to do. The resulting file is input to the applications generator, which determines how to perform the tasks and then produces the necessary instructions for the software program.
Decision support systems and financial planning languages
Decision support systems and financial planning languages combine special interactive computer
programs and some special hardware to allow high level managers to bring data and information
together from different sources and manipulate it in new ways.
Some microcomputer application software
Some microcomputer applications software can also be used to create specialized applications, in other words, to create new software. Microcomputer software packages that fall into this category
include many spreadsheet programs (such as Lotus 1-2-3), database managers (Such as dBase IV),
and integrated packages (such as Symphony).
5th Generation of programming languages
Natural languages represent the next step in the development of programming languages, i.e., fifth generation languages. The text of a natural language statement very closely resembles human speech. In fact, one could word a statement in several ways, perhaps even misspelling some words or changing the order of the words, and get the same result. These languages are also designed to make
the computer smarter. Natural languages already available for microcomputers include Clout,
Q&A, and Savvy Retriever (for use with databases) and HAL (Human Access Language).
The use of natural language touches on expert systems (computerized collections of the knowledge of many human experts in a given field) and artificial intelligence (independently smart computer systems).