
Generations of Computer

By Najmi
The history of computer development is often described in terms of the different generations of computing devices. A generation refers to a state of improvement in the development of a product, and the term is also applied to the successive advances in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the generation before it. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being made that affect the way we live, work and play.
Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, more efficient and more reliable devices. Read about each generation and the developments that led to the devices we use today.

First Generation - 1940-1956: Vacuum Tubes


The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. A magnetic drum, also referred to as a drum, is a metal cylinder coated with magnetic iron-oxide material on which data and programs can be stored. Magnetic drums were once used as a primary storage device but have since been implemented as auxiliary storage devices.
The tracks on a magnetic drum are assigned to channels located around the circumference of the drum, forming adjacent circular bands that wind around the drum. A single drum can have up to 200 tracks. As the drum rotates at a speed of up to 3,000 rpm, the device's read/write heads deposit magnetized spots on the drum during a write operation and sense these spots during a read operation. This action is similar to that of a magnetic tape or disk drive.
They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Machine languages are the only languages understood by computers. While easily understood by computers, machine languages are almost impossible for humans to use because they consist entirely of numbers. Programmers, therefore, use either a high-level programming language or an assembly language. An assembly language contains the same instructions as a machine language, but the instructions and variables have names instead of being just numbers.
Programs written in high-level languages are translated into assembly language or machine language by a compiler. Assembly language programs are translated into machine language by a program called an assembler.
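As a rough, hedged illustration of this layering (Python bytecode standing in for machine-level instructions; it is not real assembly language), the standard library's dis module can show the lower-level instructions that a single high-level statement is translated into:

    import dis

    def add(a, b):
        # One high-level statement...
        return a + b

    # ...is translated into several low-level, named instructions
    # (for example LOAD_FAST and RETURN_VALUE on current Python versions).
    dis.dis(add)

Each of the printed opcodes is loosely analogous to one named assembly instruction, which an assembler would in turn map to the pure numbers of machine language.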
Every CPU has its own unique machine language. Programs must therefore be rewritten or recompiled to run on different types of computers. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; its first unit was delivered to the U.S. Census Bureau in 1951.
ENIAC is an acronym for Electronic Numerical Integrator And Computer, the world's first operational electronic digital computer, developed by Army Ordnance to compute World War II ballistic firing tables. The ENIAC, weighing 30 tons, using 200 kilowatts of electric power and consisting of 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors, was completed in 1945. In addition to ballistics, the ENIAC's fields of application included weather prediction, atomic-energy calculations, cosmic-ray studies, thermal ignition, random-number studies, wind-tunnel design, and other scientific uses. The ENIAC soon became obsolete as the need arose for faster computing speeds.

Second Generation - 1956-1963: Transistors


Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers. Today's microprocessors contain tens of millions of microscopic transistors.
Prior to the invention of transistors, digital circuits were composed of vacuum tubes, which had many disadvantages. They were much larger, required more energy, dissipated more heat, and were more prone to failure. It's safe to say that without the invention of the transistor, computing as we know it today would not be possible.
The transistor was invented in 1947 but did not see widespread use in computers until the late 50s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.

Third Generation - 1964-1971: Integrated Circuits


The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Silicon is a nonmetallic chemical element in the carbon family of elements. Silicon - atomic symbol "Si" - is the second most abundant element in the earth's crust, surpassed only by oxygen. Silicon does not occur uncombined in nature. Sand and almost all rocks contain silicon combined with oxygen, forming silica. When silicon combines with other elements, such as iron, aluminum or potassium, a silicate is formed. Compounds of silicon also occur in the atmosphere, natural waters, many plants and in the bodies of some animals.
Silicon is the basic material used to make computer chips, transistors, silicon diodes and other electronic circuits and switching devices, because its atomic structure makes the element an ideal semiconductor. Silicon is commonly doped, or mixed, with other elements, such as boron, phosphorus and arsenic, to alter its conductive properties.
A chip is a small piece of semiconducting material (usually silicon) on which an integrated circuit is embedded. A typical chip is a fraction of a square inch and can contain millions of electronic components (transistors). Computers consist of many chips placed on electronic boards called printed circuit boards. There are different types of chips. For example, CPU chips (also called microprocessors) contain an entire processing unit, whereas memory chips contain blank memory.
A semiconductor is a material that is neither a good conductor of electricity (like copper) nor a good insulator (like rubber). The most common semiconductor materials are silicon and germanium. These materials are then doped to create an excess or lack of electrons.
Computer chips, both for CPU and memory, are composed of semiconductor materials. Semiconductors make it possible to miniaturize electronic components, such as transistors. Not only does miniaturization mean that the components take up less space, it also means that they are faster and require less energy.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation - 1971-Present: Microprocessors


The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. A microprocessor is a silicon chip that contains a CPU. In the world of personal computers, the terms microprocessor and CPU are used interchangeably. At the heart of all personal computers and most workstations sits a microprocessor. Microprocessors also control the logic of almost all digital devices, from clock radios to fuel-injection systems for automobiles.
Three basic characteristics differentiate microprocessors:

Instruction Set: The set of instructions that the microprocessor can execute.

Bandwidth: The number of bits processed in a single instruction.


Clock Speed: Given in megahertz (MHz), the clock speed determines how many instructions per second the processor can execute.

For bandwidth and clock speed, the higher the value, the more powerful the CPU. For example, a 32-bit microprocessor that runs at 50 MHz is more powerful than a 16-bit microprocessor that runs at 25 MHz.
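A minimal sketch of that comparison, under the deliberately simplified assumption that one instruction completes per clock cycle (real processors vary widely, so this is only a rough model):

    # Simplified model: bits processed per second = word size x clock rate.
    # Assumes one instruction per clock cycle, which real CPUs do not guarantee.
    def rough_throughput(bits_per_instruction, clock_mhz):
        instructions_per_second = clock_mhz * 1_000_000
        return bits_per_instruction * instructions_per_second

    print(rough_throughput(32, 50))  # 1,600,000,000 bits per second
    print(rough_throughput(16, 25))  #   400,000,000 bits per second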
What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.
CPU is an abbreviation of central processing unit, pronounced as separate letters. The CPU is the brains of the computer. Sometimes referred to simply as the processor or central processor, the CPU is where most calculations take place. In terms of computing power, the CPU is the most important element of a computer system.
On large machines, CPUs require one or more printed circuit boards. On personal computers and small workstations, the CPU is housed in a single chip called a microprocessor.
Two typical components of a CPU are:

The arithmetic logic unit (ALU), which performs arithmetic and logical operations.

The control unit, which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary (a simplified sketch of this cycle follows below).
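The sketch below is only a toy model of that division of labor: a control loop fetches and decodes made-up instructions and hands the arithmetic to a small ALU function. The instruction names are invented for illustration and do not belong to any real instruction set.

    # Toy fetch-decode-execute cycle with a tiny ALU and one accumulator register.
    def alu(op, a, b):
        # The ALU performs the arithmetic the control unit asks for.
        return a + b if op == "ADD" else a - b

    def run(program):
        acc = 0                       # accumulator register
        for op, value in program:     # fetch the next instruction
            if op in ("ADD", "SUB"):  # decode, then execute via the ALU
                acc = alu(op, acc, value)
            elif op == "PRINT":
                print(acc)
        return acc

    run([("ADD", 5), ("ADD", 7), ("SUB", 2), ("PRINT", None)])  # prints 10

A real control unit works the same way in spirit: fetch an instruction, decode it, and call on the ALU for arithmetic and logic.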

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation - Present and Beyond: Artificial Intelligence


Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.
Artificial intelligence is the branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy at the Dartmouth Conference. Artificial intelligence includes:

Games Playing: programming computers to play games such as chess and checkers

Expert Systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms)

Natural Language: programming computers to understand natural human languages

Neural Networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains

Robotics: programming computers to see and hear and react to other sensory stimuli

Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of games playing. The best computer chess programs are now capable of beating humans. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.
In the area of robotics, computers are now widely used in assembly plants, but they are capable only of very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.
Natural-language processing offers the greatest potential rewards because it would allow people to interact with computers without needing any specialized knowledge. You could simply walk up to a computer and talk to it. Unfortunately, programming computers to understand natural languages has proved to be more difficult than originally thought. Some rudimentary translation systems that translate from one human language to another are in existence, but they are not nearly as good as human translators.
There are also voice recognition systems that can convert spoken sounds into written words, but they do not understand what they are writing; they simply take dictation. Even these systems are quite limited -- you must speak slowly and distinctly.
In the early 1980s, expert systems were believed to represent the future of artificial intelligence and of computers in general. To date, however, they have not lived up to expectations. Many expert systems help human experts in such fields as medicine and engineering, but they are very expensive to produce and are helpful only in special situations.
Today, the hottest area of artificial intelligence is neural networks, which are proving successful in a number of disciplines such as voice recognition and natural-language processing.
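At their core, such networks are built from simple artificial neurons. The following is a minimal sketch (the weights and threshold are chosen arbitrarily for illustration) of a single neuron that weighs its inputs and fires when the sum crosses a threshold, loosely imitating the physical connections described above:

    # A single artificial neuron: weighted sum of inputs through a step activation.
    def neuron(inputs, weights, threshold):
        total = sum(i * w for i, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    # With these weights it behaves like a logical AND of two inputs.
    print(neuron([1, 1], [0.6, 0.6], 1.0))  # 1
    print(neuron([1, 0], [0.6, 0.6], 1.0))  # 0

Real networks connect many such units in layers and learn their weights from data rather than having them set by hand.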
There are several programming languages that are known as AI languages because they are used almost exclusively for AI applications. The two most common are LISP and Prolog.

Voice Recognition
Voice recognition is the field of computer science that deals with designing computer systems that can recognize spoken words. Note that voice recognition implies only that the computer can take dictation, not that it understands what is being said. Comprehending human languages falls under a different field of computer science called natural language processing. A number of voice recognition systems are available on the market. The most powerful can recognize thousands of words. However, they generally require an extended training session during which the computer system becomes accustomed to a particular voice and accent. Such systems are said to be speaker dependent.
Many systems also require that the speaker speak slowly and distinctly and separate each word with a short pause. These systems are called discrete speech systems. Recently, great strides have been made in continuous speech systems -- voice recognition systems that allow you to speak naturally. There are now several continuous-speech systems available for personal computers.
Because of their limitations and high cost, voice recognition systems have traditionally been used only in a few specialized situations. For example, such systems are useful in instances when the user is unable to use a keyboard to enter data because his or her hands are occupied or disabled. Instead of typing commands, the user can simply speak into a headset. Increasingly, however, as the cost decreases and performance improves, speech recognition systems are entering the mainstream and are being used as an alternative to keyboards.
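As one concrete, hedged example of such dictation (the article names no particular software; this sketch assumes the third-party SpeechRecognition package and a working microphone), a few lines are enough to capture speech and transcribe it without any understanding of the words:

    import speech_recognition as sr  # third-party package: SpeechRecognition

    recognizer = sr.Recognizer()
    with sr.Microphone() as source:              # hands-free input, e.g. via a headset
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)

    try:
        # The system only takes dictation; it does not understand the text.
        print(recognizer.recognize_google(audio))
    except sr.UnknownValueError:
        print("Speech was not recognized.")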
The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
Parallel processing is the simultaneous use of more than one CPU to execute a program. Ideally, parallel processing makes a program run faster because there are more engines (CPUs) running it. In practice, it is often difficult to divide a program in such a way that separate CPUs can execute different portions without interfering with each other.
Most computers have just one CPU, but some models have several. There are even computers with thousands of CPUs. With single-CPU computers, it is possible to perform parallel processing by connecting the computers in a network. However, this type of parallel processing requires very sophisticated software called distributed processing software.
Note that parallel processing differs from multitasking, in which a single CPU executes several programs at once.
Parallel processing is also called parallel computing.
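A minimal sketch of the idea in Python (assuming a machine with several cores; the four-way split is arbitrary), where one job is divided so that separate CPUs work on different portions at the same time:

    from multiprocessing import Pool

    def square(n):
        # The work handed to each CPU; the portions are independent of one another.
        return n * n

    if __name__ == "__main__":
        with Pool(processes=4) as pool:            # up to four CPUs run simultaneously
            results = pool.map(square, range(10))  # the list is split among them
        print(results)

Because squaring each number is independent of the others, this job divides cleanly; programs whose portions depend on each other are much harder to parallelize, as noted above.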


Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. First proposed in the 1970s, quantum computing relies on quantum physics by taking advantage of certain quantum physics properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, to be the computer's processor and memory. By interacting with each other while being isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers.
Qubits do not rely on the traditional binary nature of computing. Traditional computers encode information into bits using binary numbers, either a 0 or a 1, and can only do calculations on one set of numbers at once. Quantum computers instead encode information as a series of quantum-mechanical states, such as spin directions of electrons or polarization orientations of a photon. Such a state might represent a 1 or a 0, might represent a combination of the two, or might represent a number expressing that the state of the qubit is somewhere between 1 and 0, or a superposition of many different numbers at once. A quantum computer can do an arbitrary reversible classical computation on all the numbers simultaneously, which a binary system cannot do, and it also has some ability to produce interference between various different numbers. By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. Using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.
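In the standard notation of quantum computing (not used in the article itself), this superposition of the classical values 0 and 1 is written as

    \[
      |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle ,
      \qquad |\alpha|^2 + |\beta|^2 = 1 ,
    \]

where the amplitudes alpha and beta give the weights of the two classical values; a register of n such qubits can hold a superposition of all 2^n classical bit patterns at once.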
Quantum computing is not well suited for tasks such as word processing and email, but it is ideal for tasks such as cryptography and modeling and indexing very large databases.
Nanotechnology is a field of science whose goal is to control individual atoms and molecules to create computer chips and other devices that are thousands of times smaller than current technologies permit. Current manufacturing processes use lithography to imprint circuits on semiconductor materials. While lithography has improved dramatically over the last two decades -- to the point where some manufacturing plants can produce circuits smaller than one micron (1,000 nanometers) -- it still deals with aggregates of millions of atoms. It is widely believed that lithography is quickly approaching its physical limits. To continue reducing the size of semiconductors, new technologies that juggle individual atoms will be necessary. This is the realm of nanotechnology.
Although research in this field dates back to Richard P. Feynman's classic talk in 1959, the term nanotechnology was popularized by K. Eric Drexler in 1986 in the book Engines of Creation.
In the popular press, the term nanotechnology is sometimes used to refer to any sub-micron process, including lithography. Because of this, many scientists are beginning to use the term molecular nanotechnology when talking about true nanotechnology at the molecular level.
The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
Here natural language means a human language. For example, English, French, and Chinese are natural languages. Computer languages, such as FORTRAN and C, are not.
Probably the single most challenging problem in computer science is to develop computers that can understand natural languages. So far, the complete solution to this problem has proved elusive, although a great deal of progress has been made. Fourth-generation languages are the programming languages closest to natural languages.
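As a small, hedged illustration of that closeness (SQL is a commonly cited fourth-generation language; the table and data below are invented for the example), compare a near-English declarative query with the step-by-step loop a lower-level style would need:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
    conn.executemany("INSERT INTO people VALUES (?, ?)",
                     [("Ada", 36), ("Alan", 41), ("Grace", 28)])

    # Fourth-generation style: state WHAT is wanted, close to natural language.
    rows = conn.execute("SELECT name FROM people WHERE age > 30").fetchall()

    # Third-generation style: spell out HOW to find it, step by step.
    older = []
    for name, age in conn.execute("SELECT name, age FROM people"):
        if age > 30:
            older.append(name)

    print(rows)   # [('Ada',), ('Alan',)]
    print(older)  # ['Ada', 'Alan']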

DID YOU KNOW...
An integrated circuit (IC) is a small
electronic device made out of a
semiconductor material. The first
integrated circuit was developed in the
1950s by Jack Kilby of Texas
Instruments and Robert Noyce of
Fairchild Semiconductor.

First Generation: 1944-1959

Characteristics:
(not all first generation computers had all these characteristics)
-vacuum tube based
-punched tape input or output
-about 1,000 circuits per cubic foot
Examples:
-Harvard Mark I (electromechanical)
-Whirlwind
-ENIAC
-EDSAC
-UNIVAC I, UNIVAC II, UNIVAC 1101
-RCA BIZMAC
-NCR CRC 102A, NCR CRC 102D
-Honeywell Datamatic 1000
-Burroughs E101, Burroughs 220
-IBM models 604, 650 (drum memory), 701, 702, 704, 705, 709

Second Generation: 1960-1964


Characteristics:
-used transistors
-about 100,000 circuits per cubic foot
Examples:

-UNIVAC 1107, UNIVAC III


-RCA 501
-Philco Transact S-2000
-NCR 300 series
-IBM 7030 Stretch
-IBM 7070, 7080, 7090, 1400 series, 1600 series
-Honeywell 800, 400 series
-General Electric GE 635, 645, GE 200
-Control Data Corp. CDC 1604, 3600, 160A
-LARC
-Burroughs B5000, 200 series

Third Generation: 1964-1975


Characteristics:
-large scale integrated circuits
-10 million circuits per cubic foot
Examples:
-Burroughs 6700
-Control Data 3300, 6600, 7600
-Honeywell 200
-IBM System/360, System 3, System 7
-NCR Century Series

-RCA Spectra 70 series


-UNIVAC 9000 series
-General Electric GE 600 series, GE 235

Fourth Generation: 1975-Current


Characteristics:
-very large scale integration
-continued miniaturization
-billions of circuits per cubic foot
Examples:
-IBM System 3090, IBM RISC 6000, IBM RT
-ILLIAC IV
-Cray-2, Cray X-MP
-HP 9000

Fifth Generation: Current and Future


Characteristics:
Combinations of some or all of the following technologies:
-extremely large scale integration
-parallel processing

-high speed logic and memory chips


-high performance, micro-miniaturization
-voice/data integration; knowledge-based platforms
-artificial intelligence, expert systems
-virtual reality generation
-satellite links

THE COMPUTER GENERATIONS

A generation refers to a step or advancement in technology that drives the growth of the computer industry. Five computer generations are known to date.
1. The First Generation (1942-1955): Vacuum Tubes
The early computers ENIAC, EDVAC, EDSAC, UNIVAC and IBM 701 used vacuum tubes; these machines contained thousands of them. A vacuum tube was a fragile glass device that used filaments as a source of electrons and could control electronic signals. It was the only high-speed electronic switching device available in those days. These vacuum-tube machines could perform computations in milliseconds and are referred to as first generation computers.
2. The Second Generation (1955-1964): Transistors
Vacuum tubes consumed a large amount of electricity and produced a lot of heat, so they were replaced by transistors. The transistor was a solid-state device made from silicon, and the use of transistors marks the second generation of computer equipment.
Second generation computers were more powerful, more reliable, less expensive and smaller than first generation computers. Their memory was composed of magnetic cores, and magnetic disks and magnetic tapes were the main secondary storage media. Programs were now being written in high-level languages instead of machine language; high-level languages such as FORTRAN, COBOL, ALGOL and SNOBOL were developed during the second generation period. Second generation computers were easier to program and use than first generation computers, and their transistors produced less heat.
3. The Third Generation (1964-1975): Integrated Circuits or ICs
Second generation computers were specialized; that is, a given machine could be used either for scientific or for non-scientific applications, but not both. Third generation computers were general purpose computers.
IC technology was also known as microelectronics technology because it made it possible to integrate a large number of circuit components onto a very small (less than 5 mm square) surface of silicon, known as a chip.
ICs were smaller, less expensive to produce, more reliable, faster in operation, consumed less power and produced less heat than the components of second generation computers.
4. The Fourth Generation (1975-1989)
In the third generation, integrated circuits were used to build computers. In the fourth generation, the number of circuits per chip was increased. An increased number of circuits allows more data to be stored on a memory chip. Large scale integration (LSI) and very large scale integration (VLSI) allowed memory chips to have hundreds and thousands of locations. This progress led to the creation of the microprocessor.
Def: A microprocessor is a processor that has all its components on a single integrated circuit chip.
The introduction of the microprocessor started a new social revolution: the personal computer, or PC, revolution. The microprocessor contained all the circuits required to perform arithmetic, logic and control functions. Computers became inexpensive, and it became possible for almost anyone to own one. During this period there were also several software developments alongside the hardware advances.
In the fourth generation, magnetic core memories were replaced by semiconductor memories, resulting in large random access memories (RAM) with very fast access times. Hard disks also became cheaper, smaller and larger in capacity.
Magnetic tapes and floppy disks became very popular portable media for moving data from one computer system to another.
5. The Fifth Generation (1989 - Present)
Three factors are said to characterize the fifth generation of computers: mega-chip memories, advanced processing and artificial intelligence.
Due to advances in technology, more compact and more powerful computers are being introduced every year. There are portable computers, such as notebook computers, that can be used while traveling. There have also been advances in storage technology: the optical disk, a popular storage medium also known as the CD-ROM (Compact Disk Read Only Memory), was introduced during the fifth generation.
