

Unit 1
A Brief History of Computers

The development of the modern-day computer was the result of advances in
technology and man’s need to quantify. Papyrus helped early man record language and
numbers. The abacus, a wooden rack holding two horizontal wires with beads strung on
them, was one of the first counting machines.
Webster’s Dictionary defines “computer” as any programmable electronic device
that can store, retrieve and process data. The basic idea of computing develops around
1200, when a Muslim cleric proposes solving problems with a series of written
procedures. As early as 1640, mechanical calculators are manufactured for sale. Records
exist of earlier machines, but Blaise Pascal invents the first commercial calculator, a hand-
powered adding machine. It adds numbers entered with dials and is made to help his
father, a tax collector.
In 1801 a Frenchman, Joseph-Marie Jacquard, builds a loom that weaves by
reading punched holes stored on small sheets of hardwood. These plates are then inserted
into the loom, which reads the pattern and creates the weave. Powered by water, this
“machine” comes 140 years before the development of the modern computer.
Shortly after the first mass-produced calculator (1820), Charles Babbage begins
his lifelong quest for a programmable machine based on the principles of punched cards,
data stored in a memory, and a sequence of instructions clearly set out in a programme.
This machine is released in 1830. By 1842, Ada Lovelace, his assistant, uses
Babbage’s analytical engine to mechanically translate a short written work. She is
generally regarded as the first programmer.
In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds
a machine he calls the differential analyzer, which is in fact the first analog computer.
Using a set of gears and shafts, much like Babbage’s design, the machine can handle simple
calculus problems, but its accuracy is limited.
In 1945, Von Neumann carried out an abstract study of computation which showed
that a computer should have a very simple, fixed physical structure and yet be able to
execute any kind of computation by means of properly programmed control, without the
need for any change in the unit itself.
J. Eckert and J. Mauchly heralded the computer era in 1946 by building the first
digital computer using parts called vacuum tubes. They named their invention ENIAC, but
machines of this type were too bulky and unreliable to be used by any but the largest firms.
In 1947 and 1958 two further technological breakthroughs marked the trend
towards miniaturization: the transistor, invented at Bell Laboratories, and the
integrated circuit, invented by Jack Kilby. The secret of the new technology was to etch
transistors and other components onto a thin silicon wafer, called a chip, in order to create
an integrated circuit.
In 1956 FORTRAN is introduced. Two additional languages, LISP and COBOL,
follow in 1957 and 1958. Other early languages include ALGOL and BASIC.
Although never widely used, ALGOL is the basis for many of today’s languages.
Over the last decades, computers have undergone further transformations: MOS
technology has been supplanted by C-MOS and RISC, and laptops and powerful
