
History of computer science

From Wikipedia, the free encyclopedia


The history of computer science began long before the modern discipline of computer science, which emerged in the 20th century and was hinted at in the centuries prior. The progression from mechanical inventions and mathematical theories towards modern concepts and machines formed a major academic field and the basis of a massive worldwide industry.[1]
Contents
1 Early history
1.1 Binary logic
1.2 Birth of computer
2 Emergence of a discipline
2.1 Charles Babbage and Ada Lovelace
2.2 Alan Turing and the Turing Machine
2.3 Shannon and information theory
2.4 Wiener and cybernetics
2.5 John von Neumann and the von Neumann architecture
3 See also
4 Notes
5 Sources
6 Further reading
7 External links
Early history
The earliest known tool for use in computation was the abacus, developed in the period 2700–2300 BCE in Sumer. The Sumerians' abacus consisted of a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system.[2] Its original style of usage was by lines drawn in sand with pebbles. Abaci of a more modern design are still used as calculation tools today.[3]
The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[4] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BCE. Technological artifacts of similar complexity did not reappear until the 14th century, when mechanical astronomical clocks appeared in Europe.[5]
Mechanical analog computing devices appeared a thousand years later in the medieval Islamic world. Examples of devices from this period include the equatorium by Arzachel,[6] the mechanical geared astrolabe by Abū Rayḥān al-Bīrūnī,[7] and the torquetum by Jabir ibn Aflah.[8] Muslim engineers built a number of automata, including some musical automata that could be 'programmed' to play different musical patterns. These devices were developed by the Banū Mūsā brothers[9] and Al-Jazari.[10]
Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[11]
When John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. In 1623 Wilhelm Schickard designed a calculating machine, but abandoned the project when the prototype he had started building was destroyed by a fire in 1624. Around 1640, Blaise Pascal, a leading French mathematician, constructed a mechanical adding device based on a design described by Greek mathematician Hero of Alexandria.[12] Then in 1672 Gottfried Wilhelm Leibniz invented the Stepped Reckoner, which he completed in 1694.[13]
In 1837 Charles Babbage first described his Analytical Engine, which is accepted as the first design for a modern computer. The Analytical Engine had expandable memory, an arithmetic unit, and logic processing capabilities able to interpret a programming language with loops and conditional branching. Although never built, the design has been studied extensively and is understood to be Turing equivalent. The Analytical Engine would have had a memory capacity of less than 1 kilobyte and a clock speed of less than 10 hertz.
Considerable advancement in mathematics and electronics theory was required before the first modern computers could be designed.
Binary logic
In 1703, Gottfried Wilhelm Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854, with a complete system that allowed computational processes to be mathematically modeled.
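To see how a complete system of two-valued logic can model a computational process, consider a minimal modern sketch (Python, purely illustrative and not part of the historical material): a half adder built from nothing but the Boolean operations AND and XOR adds two binary digits.

```python
# A minimal sketch: Boole's two-valued algebra modelling a computation.
# A half adder built only from Boolean AND and XOR adds two binary digits.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum_bit, carry_bit) for two bits a and b, each 0 or 1."""
    sum_bit = a ^ b      # XOR: 1 when exactly one input is 1
    carry_bit = a & b    # AND: 1 only when both inputs are 1
    return sum_bit, carry_bit

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))
```

The same idea, scaled up to chains of adders and other logic networks, is what Shannon later showed could be realized directly in relay circuits.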
By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, and this included weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.
Birth of computer
Before the 1920s, computers (sometimes spelled "computors") were human clerks that performed computations. They were usually led by a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Most of these computers were women, and many of them held degrees in calculus. Some performed astronomical calculations for calendars, others ballistic tables for the military.
After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially those in accordance with effective methods of the Church–Turing thesis. The thesis states that a mathematical method is effective if it could be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.
Machines that computed with continuous values became known as the analog kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or difference in electrical potential.
Digital machinery, in contrast to analog, was able to render a state of a numeric value and store each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.
The phrase computing machine gradually gave way, after the late 1940s, to just computer as electronic digital machinery became common. These computers were able to perform the calculations that had previously been performed by human clerks.
Since the values stored by digital machines were not bound to physical properties like analog devices, a logical computer, based on digital equipment, was able to do anything that could be described as "purely mechanical." The theoretical Turing machine, created by Alan Turing, is a hypothetical device theorized in order to study the properties of such hardware.
Emergence of a discipline
Charles Babbage and Ada Lovelace
Charles Babbage is often regarded as one of the first pioneers of computing. Beginning in the 1810s, Babbage had a vision of mechanically computing numbers and tables. Putting this into reality, Babbage designed a calculator to compute numbers up to 8 decimal places long. Continuing with the success of this idea, Babbage worked to develop a machine that could compute numbers with up to 20 decimal places. By the 1830s, Babbage had devised a plan to develop a machine that could use punched cards to perform arithmetical operations. The machine would store numbers in memory units, and there would be a form of sequential control, meaning that one operation would be carried out before another in such a way that the machine would produce an answer and not fail. This machine was to be known as the Analytical Engine, the first true representation of what is the modern computer.[14]
Ada Lovelace (Augusta Ada Byron) is credited as the pioneer of computer programming and is regarded as a mathematical genius, a result of the mathematically heavy tutoring regimen her mother assigned to her as a young girl. Lovelace began working with Charles Babbage as an assistant while Babbage was working on his Analytical Engine, the first mechanical computer. During her work with Babbage, Ada Lovelace became the designer of the first computer algorithm, which could compute the Bernoulli numbers. Moreover, Lovelace's work with Babbage led her to predict that future computers would not only perform mathematical calculations, but also manipulate symbols, mathematical or not. While she was never able to see the results of her work, as the Analytical Engine was not built in her lifetime, her work did not go unnoticed in later years, beginning in the 1940s.[15]
Alan Turing and the Turing Machine
The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.
1936 was a key year for computer science. Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing.
These topics are covered by what is now called the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.
In 1937, Alan Turing introduced his idea of what are now referred to as Turing machines and, anticipating the modern stored-program computer, described what became known as the universal Turing machine. These Turing machines were designed to formally determine, mathematically, what can be computed, taking into account limitations on computing ability. If a Turing machine can complete the task, it is considered Turing computable.[16]
Turing machines are not physical objects, but mathematical ones. They show if and how any given algorithm can be computed. Turing machines are state machines, where a state represents a position in a graph. State machines use various states, or graph positions, to determine the outcome of the algorithm. To accomplish this, a theoretical one-dimensional tape is said to be divided into an infinite number of cells. Each cell contains a binary digit, 1 or 0. As the read/write head of the machine scans the value in the current cell, it uses this value to determine what state to transition to next. To accomplish this, the machine follows a set of rules, usually in the form of tables, that contain logic such as: if the machine is in state A and a 0 is read in, the machine will go to the next state, say, state B. The rules that the machine must follow are considered the program. These Turing machines helped define the logic behind modern computer science. Memory in modern computers is represented by the infinite tape, and the bus of the machine is represented by the read/write head.[16]
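The table-driven behaviour described above can be simulated in a few lines. The following sketch (modern Python, an illustration rather than anything from the period) uses a binary tape, a read/write head, and a rule table mapping (state, symbol) pairs to (symbol to write, head move, next state); the example rules, which flip every bit until a blank is reached, are hypothetical and chosen only to show the mechanics.

```python
# A minimal Turing machine simulator, sketched for illustration only.
# rules maps (state, symbol) -> (symbol to write, head move, next state).

def run_turing_machine(tape, rules, state="A", halt_state="HALT", max_steps=1000):
    tape = dict(enumerate(tape))   # sparse tape: cell index -> symbol
    head = 0
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = tape.get(head, "_")              # "_" stands for a blank cell
        write, move, state = rules[(state, symbol)]
        tape[head] = write                        # write, then move the head
        head += 1 if move == "R" else -1
    return [tape[i] for i in sorted(tape)]

# Hypothetical example program: flip every bit, halt on the first blank.
flip_bits = {
    ("A", "0"): ("1", "R", "A"),
    ("A", "1"): ("0", "R", "A"),
    ("A", "_"): ("_", "R", "HALT"),
}

print(run_turing_machine(list("10110"), flip_bits))  # ['0', '1', '0', '0', '1', '_']
```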
Turing focused heavily on designing a machine that could determine what can be computed. Turing concluded that as long as a Turing machine exists that can compute a precise approximation of a number, that value is computable. This includes constants such as pi. Furthermore, functions can be computable when determining TRUE or FALSE for any given parameters. One example of this would be a function IsEven: if this function were passed a number, the computation would produce TRUE if the number were even and FALSE if the number were odd. Using these specifications, Turing machines can determine if a function is computable and terminate if said function is computable. Furthermore, Turing machines can interpret logic operators, such as "AND, OR, XOR, NOT, and IF-THEN-ELSE",[16] to determine if a function is computable.
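For concreteness, the IsEven function mentioned above can be written as a one-line decision procedure (the Python below is a modern illustration, not anything Turing wrote): it halts on every input and answers TRUE or FALSE, which is exactly what makes the property "is even" computable.

```python
def is_even(n: int) -> bool:
    """Decide the property 'n is even': halts on every integer input."""
    return n % 2 == 0

print(is_even(4))   # True
print(is_even(7))   # False
```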
Turing is so important to computer science that his name is also featured on the Turing Award and the Turing test. He contributed greatly to British code-breaking successes in the Second World War, and continued to design computers and software through the 1940s until his untimely death in 1954.
At a symposium on large-scale digital machinery in Cambridge, Turing said, "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus."
In 1941, Konrad Zuse developed the world's first functional program-controlled computer, the Z3; in 1998, it was shown to be Turing-complete in principle.[17][18] Zuse was also noted for the S2 computing machine, considered the first process-controlled computer. He founded one of the earliest computer businesses in 1941, producing the Z4, which became the world's first commercial computer. In 1946, he designed the first high-level programming language, Plankalkül.[19] In 1969, Zuse suggested the concept of a computation-based universe in his book Rechnender Raum (Calculating Space).
In 1948, the first practical computer that could run stored programs, based on the Turing machine model, was built: the Manchester Baby.
In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy.[20]
Shannon and information theory
Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon's publication of his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits. While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
Shannon went on to found the field of information theory with his 1948 paper titled A Mathematical Theory of Communication, which applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
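As a small modern illustration (Python, not drawn from the paper itself), the central quantity of that 1948 work, the entropy H = −Σ pᵢ log₂ pᵢ, gives a lower bound in bits per symbol on how compactly a source with known symbol probabilities can be encoded; the two example distributions below are hypothetical.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits per symbol for a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A skewed four-symbol source needs fewer bits per symbol, on average,
# than a uniform one -- the basis of data compression.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
```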
Wiener and cybernetics
From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman." He published Cybernetics in 1948, which influenced artificial intelligence. Wiener also drew comparisons between computation, computing machinery, memory devices, and cognitive processes in his analysis of brain waves.
The first actual computer bug was a moth, found stuck between the relays of the Harvard Mark II (http://www.history.navy.mil/photos/images/h96000/h96566kc.htm). While the invention of the term 'bug' is often, but erroneously, attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945, most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed this 'incident' along with the insect and the notation "First actual case of bug being found" (see software bug for details).
John von Neumann and the von Neumann architecture
In 1946, a model for computer architecture was introduced that became known as the von Neumann architecture. Since 1950, the von Neumann model has provided uniformity in subsequent computer designs. The von Neumann architecture was considered innovative as it introduced the idea of allowing machine instructions and data to share memory space. The von Neumann model is composed of three major parts: the arithmetic logic unit (ALU), the memory, and the instruction processing unit (IPU). In von Neumann machine design, the IPU passes addresses to memory, and memory, in turn, is routed either back to the IPU if an instruction is being fetched or to the ALU if data is being fetched.[21]
Von Neumann's machine design uses a RISC (reduced instruction set computing) architecture, which means the instruction set uses a total of 21 instructions to perform all tasks. (This is in contrast to CISC, complex instruction set computing, instruction sets, which have more instructions from which to choose.) With von Neumann architecture, main memory along with the accumulator (the register that holds the result of logical operations)[22] are the two memories that are addressed. Operations can be carried out as simple arithmetic (performed by the ALU, including addition, subtraction, multiplication and division), conditional branches (more commonly seen now as "if" statements or "while" loops; the branches serve as "go to" statements), and logical moves between the different components of the machine, i.e., a move from the accumulator to memory or vice versa. Von Neumann architecture accepts fractions and instructions as data types. Finally, as the von Neumann architecture is a simple one, its register management is also simple. The architecture uses a set of seven registers to manipulate and interpret fetched data and instructions. These registers include the "IR" (instruction register), "IBR" (instruction buffer register), "MQ" (multiplier quotient register), "MAR" (memory address register), and "MDR" (memory data register).[21] The architecture also uses a program counter ("PC") to keep track of where in the program the machine is.[21]
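As a rough modern sketch (Python, purely illustrative; the program counter and accumulator follow the description above, but the tiny instruction set is hypothetical and far smaller than the 21 instructions mentioned), the fetch-decode-execute cycle of a von Neumann machine can be modelled as a loop in which the program counter addresses a single memory holding both instructions and data.

```python
# A toy von Neumann-style machine: one memory holds both instructions and data.
# The instruction set (LOAD, ADD, STORE, HALT) is hypothetical.

def run(memory):
    pc = 0     # program counter: where in the program the machine is
    acc = 0    # accumulator: holds the result of the current operation
    while True:
        op, operand = memory[pc]      # fetch the instruction addressed by PC
        pc += 1
        if op == "LOAD":              # accumulator <- memory[operand]
            acc = memory[operand]
        elif op == "ADD":             # accumulator <- accumulator + memory[operand]
            acc += memory[operand]
        elif op == "STORE":           # memory[operand] <- accumulator
            memory[operand] = acc
        elif op == "HALT":
            return memory

# Instructions occupy cells 0-3; data occupies cells 4-6.
program = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", 0),
    4: 2, 5: 3, 6: 0,
}
print(run(program)[6])   # 5
```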
See also
Computer Museum
History of computing
History of computing hardware
History of software
List of computer term etymologies, the origins of computer science words
List of prominent pioneers in computer science
Timeline of algorithms
History of personal computers
Bugbook Historical Computer Museum
Notes
1. ^ History of Computer Science (http://www.cs.uwaterloo.ca/~shallit/Courses/134/history.html)
2. ^ Ifrah 2001:11
3. ^ Bellos, Alex (2012-10-25). "Abacus adds up to number joy in Japan" (http://www.guardian.co.uk/science/alexs-adventures-
in-numberland/2012/oct/25/abacus-number-joy-japan). The Guardian (London). Retrieved 2013-06-25.
4. ^ The Antikythera Mechanism Research Project (http://www.antikythera-mechanism.gr/project/general/the-project.html), The
Antikythera Mechanism Research Project. Retrieved 2007-07-01
5. ^ In search of lost time, Jo Marchant, Nature 444, #7119 (November 30, 2006), pp. 534–538, doi:10.1038/444534a (http://dx.doi.org/10.1038%2F444534a) PMID 17136067.
6. ^ Hassan, Ahmad Y.. "Transfer Of Islamic Technology To The West, Part II: Transmission Of Islamic Engineering"
(http://www.history-science-technology.com/Articles/articles%2071.htm). Retrieved 2008-01-22.
7. ^ "Islam, Knowledge, and Science" (http://www.usc.edu/dept/MSA/introduction/woi_knowledge.html). University of Southern
California. Retrieved 2008-01-22.
8. ^ Lorch, R. P. (1976). "The Astronomical Instruments of Jabir ibn Aflah and the Torquetum". Centaurus 20 (1): 11–34. Bibcode:1976Cent...20...11L (http://adsabs.harvard.edu/abs/1976Cent...20...11L). doi:10.1111/j.1600-0498.1976.tb00214.x (http://dx.doi.org/10.1111%2Fj.1600-0498.1976.tb00214.x).
9. ^ Koetsier, Teun (2001). "On the prehistory of programmable machines: musical automata, looms, calculators". Mechanism and Machine Theory (Elsevier) 36 (5): 589–603. doi:10.1016/S0094-114X(01)00005-2 (http://dx.doi.org/10.1016%2FS0094-114X%2801%2900005-2).
10. ^ A 13th Century Programmable Robot (http://www.shef.ac.uk/marcoms/eview/articles58/robot.html), University of Sheffield
11. ^ Simon Singh, The Code Book, pp. 14–20
12. ^ History of Computing Science: The First Mechanical Calculator (http://lecture.eingang.org/pascaline.html)
13. ^ Kidwell, Peggy Aldritch; Williams, Michael R. (1992). The Calculating Machines: Their history and development (http://www.rechenmaschinen-illustrated.com/Martins_book/Ernst%20Martin%20-%20Rechen%20Machinen%20OCR%204.pdf). USA: Massachusetts Institute of Technology and Tomash Publishers, pp. 38–42, translated and edited from Martin, Ernst (1925). Die Rechenmaschinen und ihre Entwicklungsgeschichte. Germany: Pappenheim.
14. ^ "Charles Babbage" (http://www.britannica.com/EBchecked/topic/47371/Charles-Babbage/299985/Additional-Reading).
Encyclopedia Britannica Online Academic Edition. Encyclopedia Britannica In. Retrieved 2013-02-20.
15. ^ Isaacson, Betsy (2012-12-10). "Ada Lovelace, World's First Computer Programmer, Celebrated With Google Doodle" (http://www.huffingtonpost.com/2012/12/10/google-doodle-ada-lovelace_n_2270668.html). The Huffington Post. Retrieved 2013-02-20.
16. ^ a b c Barker-Plummer, David. "Turing Machines" (http://plato.stanford.edu/archives/win2012/entries/turing-machine/). The Stanford Encyclopedia of Philosophy. Retrieved 2013-02-20.
17. ^ Rojas, R. (1998). "How to make Zuse's Z3 a universal computer". IEEE Annals of the History of Computing 20 (3): 5154.
doi:10.1109/85.707574 (http://dx.doi.org/10.1109%2F85.707574).
18. ^ Rojas, Ral. "How to Make Zuse's Z3 a Universal Computer"
(http://www.zib.de/zuse/Inhalt/Kommentare/Html/0684/universal2.html).
19. ^ Talk given by Horst Zuse to the Computer Conservation Society at the Science Museum (London) on 18 November 2010
20. ^ "BBC News - How Alan Turing's Pilot ACE changed computing" (http://news.bbc.co.uk/2/hi/technology/8683369.stm). BBC
News. May 15, 2010.
21. ^ a b c Cragon, Harvey G. (2000). Computer Architecture and Implementation. Cambridge: Cambridge University Press. pp. 1–13. ISBN 0521651689.
22. ^ "Accumlator" Def. 3 (http://oxforddictionaries.com/definition/english/accumulator?q=accumulator). Oxford Dictionaries.
Sources
Ifrah, Georges (2001), The Universal History of Computing: From the Abacus to the Quantum Computer, New
York: John Wiley & Sons, ISBN 0-471-39671-0
Further reading
Alan Turing
A Very Brief History of Computer Science (http://www.cs.uwaterloo.ca/~shallit/Courses/134/history.html)
Computer History Museum (http://www.computerhistory.org/)
Computers: From the Past to the Present (http://www.eingang.org/Lecture/)
The First "Computer Bug" (http://www.history.navy.mil/photos/images/h96000/h96566kc.htm) at the Online Library of
the Naval Historical Center, retrieved February 28, 2006
Bitsavers (http://www.bitsavers.org/), an effort to capture, salvage, and archive historical computer software and manuals
from minicomputers and mainframes of the 1950s, 1960s, 1970s, and 1980s
Matti Tedre (2006). The Development of Computer Science: A Sociocultural Perspective
(ftp://cs.joensuu.fi/pub/Dissertations/tedre.pdf). Doctoral thesis for University of Joensuu.
External links
Oral history interview with Albert H. Bowker (http://purl.umn.edu/107140) at Charles Babbage Institute, University of
Minnesota. Bowker discusses his role in the formation of the Stanford University computer science department, and his
vision, as early as 1956, of computer science as an academic discipline.
Oral history interview with Joseph F. Traub (http://purl.umn.edu/107684) at Charles Babbage Institute, University of
Minnesota. Traub discusses why computer science has developed as a discipline at institutions including Stanford,
Berkeley, University of Pennsylvania, MIT, and Carnegie-Mellon.
Oral history interview with Gene H. Golub (http://purl.umn.edu/107334) at Charles Babbage Institute, University of
Minnesota. Golub discusses his career in computer science at Stanford University.
Oral history interview with John Herriot (http://purl.umn.edu/107356) at Charles Babbage Institute, University of
Minnesota. Herriot describes the early years of computing at Stanford University, including formation of the computer
science department, centering on the role of George Forsythe.
Oral history interview with William F. Miller (http://purl.umn.edu/107502) at Charles Babbage Institute, University of
Minnesota. Miller contrasts the emergence of computer science at Stanford with developments at Harvard and the
University of Pennsylvania.
Oral history interview with Alexandra Forsythe (http://purl.umn.edu/107291) at Charles Babbage Institute, University of
Minnesota. Forsythe discusses the career of her husband, George Forsythe, who established Stanford University's
program in computer science.
Oral history interview with Allen Newell (http://purl.umn.edu/107544) at Charles Babbage Institute, University of
Minnesota. Newell discusses his entry into computer science, funding for computer science departments and research,
the development of the Computer Science Department at Carnegie Mellon University, including the work of Alan J.
Perlis and Raj Reddy, and the growth of the computer science and artificial intelligence research communities. Compares
computer science programs at Stanford, MIT, and Carnegie Mellon.
Oral history interview with Louis Fein (http://purl.umn.edu/107284) at Charles Babbage Institute, University of
Minnesota. Fein discusses establishing computer science as an academic discipline at Stanford Research Institute (SRI)
as well as contacts with the University of CaliforniaBerkeley, the University of North Carolina, Purdue, International
Federation for Information Processing and other institutions.
Oral history interview with W. Richards Adrion (http://purl.umn.edu/104300) at Charles Babbage Institute, University of
Minnesota. Adrion gives a brief history of theoretical computer science in the United States and NSF's role in funding
that area during the 1970s and 1980s.
Oral history interview with Bernard A. Galler (http://purl.umn.edu/107301) at Charles Babbage Institute, University of
Minnesota. Galler describes the development of computer science at the University of Michigan from the 1950s through
the 1980s and discusses his own work in computer science.
Michael S. Mahoney Papers (http://purl.umn.edu/92154) at Charles Babbage Institute, University of Minnesota
Mahoney was the preeminent historian of computer science as a distinct academic discipline. Papers contain 38 boxes of
books, serials, notes, and manuscripts related to the history of computing, mathematics, and related fields.
The Modern History of Computing (http://plato.stanford.edu/entries/computing-history) entry by B. Jack Copeland in the
Stanford Encyclopedia of Philosophy