
Internet

When, on 4 October 1957, the USSR launched Sputnik I (an 83.6 kg beeping sphere the size of a medicine ball)
into Earth orbit, the effect in the United States was electrifying: overnight it seemed to wipe out the feeling
of invulnerability the country had enjoyed since the explosion of the first nuclear bomb twelve years before.
One of the immediate reactions was the creation of the Advanced Research Projects Agency (ARPA) within the
Department of Defense. Its mission was to apply state-of-the-art technology to US defense and to avoid being
surprised (again!) by technological advances of the enemy. It was also given interim control of the US satellite
program until the creation of NASA in October 1958.
ARPA became the technological think-tank of the American defense effort, directly employing a couple of
hundred top scientists, with a budget sufficient for sub-contracting research to other top American
institutions.
In 1962 ARPA opened a computer research program and appointed the MIT scientist Joseph Licklider to lead it.
It was Licklider, his group, and their successors who took the first steps toward the global network, the
Internet. It was during this inventive era that a precursor of today's high-speed Internet links was created for
a different purpose: AT&T introduced the T1 line, which carries 24 digital voice channels, to handle large
digital telephone networks.

Some 40 years after Licklider's futuristic ideas, the global computer network, the Internet, has become a
mainstream tool for information, marketing, communication, and whatnot. In 2009 the number of Internet users
reached 1.8 billion people, the number of web sites reached 234 million, and the number of emails sent reached
90 trillion (sadly, 81% of them were spam). In the near future, billions of sensors in home appliances,
buildings, bridges, etc. are expected to be connected to the Internet for control, optimization, and security
purposes.

This section examines some of the key events and people in the history of the Internet, from the very
beginning (the people who dreamed of some features of today's Internet) up to the beginning of the 21st
century, when the Internet turned into something like a universal panacea and a hope for the future. It has
four sub-sections, covering four periods of the Internet's development: The Dreamers (the period up to 1960),
Birth of the ARPANET (1960-1969), Maturing of the Net (1970-1989), and Internet Conquers the World (1989 till
now).

Maturing of the Net

This section presents some of the people and events that established the foundations of modern Internet
tools and technologies.

First chat program of Murray Turoff

In 1971, Murray Turoff (born in San Francisco on February 13, 1936), a Ph.D. in physics, while working
in the U.S. Office of Emergency Preparedness (a federally coordinated system that augments the
Nation's medical response capability), designed the Emergency Management Information Systems
And Reference Index (EMISARI), which appears to be the first multi-machine chat system. The
original purpose of the system was to help exchange information on opinion surveys between
networking experts and academics in geographically distributed locations, so that the
government could respond to emergencies. This system and others Turoff designed (RIMS, IRIS, PREMIS)
were used for a decade and a half by the U.S. Government for crisis-management monitoring of most
economic disruptions (e.g., coal strikes, transportation strikes) and commodity shortages (e.g., oil,
gas, fertilizer). It also provided the first example of instant messaging in an operational
environment.

In 1971, EMISARI was put to one of its first practical uses: coordinating policy information for U.S.
President Nixon's wage and price control program to fight high inflation. Users of EMISARI logged in
to the system through teletypewriter terminals linked to a central computer over long-distance
phone lines.

The EMISARI chat functionality was called the Party Line. It was originally developed to replace
telephone conferences, which might have 30 or so participants but in which no one could effectively
respond and take part in a meaningful discussion. Party Line had a range of useful features familiar to
users of modern chat systems, such as the ability to list the current participants and an alert when
someone joined or left the group.
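The two Party Line features just described, a participant list and join/leave alerts, map directly onto what a modern chat room does. A minimal sketch in Python (purely illustrative; EMISARI itself was written under EXEC VIII on a UNIVAC, not in Python):

```python
class PartyLine:
    """Toy model of a Party Line conference: who is present, and alerts."""

    def __init__(self):
        self.participants = []   # names, in join order
        self.log = []            # lines every connected terminal would see

    def join(self, name):
        self.participants.append(name)
        self.log.append(f"*** {name} has joined the conference")

    def leave(self, name):
        self.participants.remove(name)
        self.log.append(f"*** {name} has left the conference")

    def who(self):
        # the "list current participants" command
        return list(self.participants)


room = PartyLine()
room.join("turoff")
room.join("renner")
room.leave("turoff")
print(room.who())   # ['renner']
print(room.log[0])  # *** turoff has joined the conference
```

The names and message texts are invented for the example; EMISARI's actual commands and output format are not documented here.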

EMISARI was written under EXEC VIII on a UNIVAC computer which had newly developed
multiprocessing capabilities, making possible the new, interactive functionality that Turoff designed.
As with current online chat rooms, the system showed a list of participants and gave alerts when
someone joined or left the chat. As many as 300 experts were set up to coordinate how the country
would respond. EMISARI continued to be used by the US Government for management of emergency
situations until 1986.

EMISARI had more features than many conferencing systems developed thirty years later, including
real-time voting, data collection assignment and reporting, and discussion threads for individual
database elements.

The First E-mail Message of Ray Tomlinson


Raymond (Ray) Samuel Tomlinson (born 1941 in Amsterdam, New York) is a graduate of Rensselaer
Polytechnic Institute (the oldest technological university in the USA) and a long-time employee of the
company Bolt, Beranek and Newman, which had won the contract to create the ARPANET, the
predecessor of the Internet.

Tomlinson started his work on the ARPANET by developing the Network Control Protocol (NCP), the
predecessor of TCP/IP, for a time-sharing system called TENEX, as well as network programs such as
an experimental file transfer program (called CPYNET).

During the summer and autumn of 1971, he was making improvements to the local inter-user mail
program (called SNDMSG). Single-computer electronic mail had existed since at least the early 1960s,
and SNDMSG was an example of that: it allowed a user to compose, address, and send a
message to other users' mailboxes on the same computer. Tomlinson hit on the idea of merging an intra-
machine message program with another program, developed for transferring files among the remote
ARPANET computers.

Other people developed similar programs. Richard W. Watson, for example, thought
of a way to deliver messages and files to numeric printers at remote sites. He filed his "Mail Box
Protocol" as a draft standard under RFC 196 in July 1971, but the protocol was never implemented.

In contrast, SNDMSG sent messages to named individuals (computer users), but only to users on the
same computer. Tomlinson decided to improve it so that it could send messages to users at remote
computers as well.

Tomlinson treated the mailbox as a file with a particular name. Users could append more data
to the end of the mailbox, but they couldn't read or overwrite what was already there. Tomlinson's
idea was to use CPYNET to append material to a mailbox file just as readily as SNDMSG could.
SNDMSG could easily incorporate the code from CPYNET and direct messages through a network
connection to remote mailboxes, in addition to appending messages to local mailbox files.

The missing piece was that the experimental CPYNET protocol had no provision for appending to a
file; it could only send and receive whole files. Tomlinson had to make a minor addition to the protocol
and incorporate the CPYNET code into SNDMSG.

The next problem was to provide a way to distinguish local mail from network mail. Tomlinson chose to
append an at sign (@) and the host name to the user's (login) name. He chose the at sign
because its purpose (in English) is to indicate a unit price (for example, 10 items @
$1.95, i.e. 10 items at a price of $1.95 each). Besides that, at signs didn't appear in names, so there
would be no ambiguity about where the separation between login name and host name occurred. The
at sign also had no significance in any editor that ran on TENEX. Thus he used the at sign to indicate
that the user was "at" some other host rather than being local.
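Tomlinson's observation that "@" never appears in login names or host names is exactly what makes the address format parseable without ambiguity. A small sketch of the idea (split_address is a hypothetical helper, not part of SNDMSG):

```python
def split_address(address):
    """Split user@host; because '@' cannot occur in a login name or a
    host name, the split point is unambiguous."""
    user, sep, host = address.partition("@")
    if not sep:            # no "@" at all: local mail on this machine
        return user, None
    return user, host      # network mail: deliver to a remote host


print(split_address("tomlinson@bbn-tenexb"))  # ('tomlinson', 'bbn-tenexb')
print(split_address("tomlinson"))             # ('tomlinson', None)
```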

The first message using the new command was sent at the end of 1971 between two
machines that were literally side by side (see the lower photo) and ran the TENEX time-sharing
monitor. The only physical connection they had was through the ARPANET. In the foreground of the
photo is the computer BBN-TENEXA (BBNA). In the background is the computer BBN-TENEXB
(BBNB), from which the first email was sent. On the left, in the foreground, is the Teletype KSR-33
terminal on which the first email was printed. Immediately behind, and largely obscured, is another
KSR-33 terminal, on which the first email was typed. BBNA was a Digital Equipment Corporation KA10
(PDP-10) computer with 64K (36-bit) words of (real magnetic) core memory. BBNB was a smaller
machine with only 48K words.

Tomlinson sent a number of test messages to himself from one machine to the other. When the
inventor was satisfied that the program seemed to work, he sent a message to the rest of the group,
explaining how to send messages over the network. Thus the first use of network email announced its
own existence.

The next release of TENEX went out in early 1972 and included the version of SNDMSG with network
mail capabilities. The CPYNET protocol was soon replaced with a real file transfer protocol having
specific mail handling features. Later, a number of more general mail protocols were developed also.

In 1996, for the first time, more electronic mail than postal mail was sent in the USA. In 2003, for
the first time, spam accounted for over one-half of all e-mail sent. In 2010, the number of emails sent
reached 107 trillion (sadly, 89.1% of them were spam). In 2012 the total number of email
accounts was 3.3 billion, expected to grow to over 4.3 billion accounts by 2016.

Michael Hart and Project Gutenberg, the first digital library


In the beginning of the 1970s came a time that had been dreamed of by pioneers like Herbert Wells in
his World Brain and Paul Otlet in his Universal Network for Information and Documentation:
time to establish something like a giant world library, which contains all human knowledge and is
easily accessible all over the world. We have the storage (computers), and we have the
communication medium (the Internet), so how can we start on this giant task? The easiest way is to
collect in one place all books published by human beings.

And who was the first?

Michael Stern Hart (born March 8, 1943, in Tacoma, Washington) is the founder of the first digital
library project, Project Gutenberg, which makes electronic books freely available via the Internet.

While studying at the University of Illinois (USA), in July 1971, Hart managed to get access to a Xerox
Sigma V mainframe computer in the university's Materials Research Lab (Hart's brother's best
friend was the mainframe operator, so he helped). This particular computer was one of the 15 nodes
on the computer network that would become the Internet. Although the focus of computer use there
tended to be data processing, Hart was aware that it was connected to a network and chose to use
his computer time for information distribution as well. He received an account with a virtually
unlimited amount of computer time (not bad to have a computer operator as a friend :-); its value at
that time has since been variously estimated at $100,000 or $100,000,000. Hart has said he wanted
to give back this gift by doing something that could be considered of great value. His initial goal was
to make the 10,000 most consulted books available to the public at little or no charge, and to do so
by the end of the 20th century.

Hart related that after his account was created on July 4, 1971, he had been trying to think of what to
do with it, and had seized upon a copy of the United States Declaration of Independence, which he had
been given at a grocery store on his way home from watching fireworks that evening. He typed the text
into a teletype machine but was unable to transmit it via e-mail; thus, to avoid crashing the system, it
had to be downloaded individually.

This was the beginning of Project Gutenberg.

The mission statement for the project, formulated later, was:

Encourage the Creation and Distribution of eBooks

Help Break Down the Bars of Ignorance and Illiteracy

Give As Many eBooks to As Many People As Possible

Most of the early postings were typed in personally by Hart, as was the case with the Declaration. He
began posting text copies of such classics as the Bible, the works of Homer, Shakespeare, and Mark
Twain. As of 1987 he had typed in a total of 313 books in this fashion. Then, through his involvement
in the University of Illinois PC User Group, and with assistance from Mark Zinzow, a programmer at
the school, Hart was able to recruit volunteers and set up an infrastructure of mirror sites and mailing
lists for the project. With this, the project was able to grow much more rapidly.

Today Project Gutenberg is available at www.gutenberg.org. Its e-texts are produced (usually scanned)
by the Project's many volunteers. The collection includes public-domain works and copyrighted works
included with express permission. As of December 2009, Project Gutenberg claimed over 30,000
items in its collection (most of the books are in English, but there are also books in French, German,
Finnish, Dutch, Chinese, Portuguese, etc.), primarily works of literature from the Western
cultural tradition. In addition to literature such as novels, poetry, short stories, and drama, Project
Gutenberg also has cookbooks, reference works, and issues of periodicals. The collection
also has a few non-text items, such as audio files and music notation files. It is affiliated with
many projects that are independent organizations sharing the same ideals, which have been given
permission to use the Project Gutenberg trademark.

First computer virus of Bob Thomas


The most important feature of a computer virus is its ability to self-replicate (in a sense, every self-
replicating program can be called a virus). The idea of self-replicating programs can be traced back as
early as 1949, when the mathematician John von Neumann envisioned specialized computers, or self-
replicating automata, that could build copies of themselves and pass on their programming to their
progeny.

If a computer virus has the ability to self-replicate over a computer network, e.g. the Internet, it is
called a worm.

It is not known who created the first self-replicating program in the world, but it is clear that the first
worm in the world (the so-called Creeper worm) was created by the BBN engineer Robert (Bob) H.
Thomas, probably around 1970.

The company BBN Technologies (originally Bolt, Beranek and Newman) is a high-technology
company, based in Cambridge, Massachusetts, which played an extremely important role in the
development of packet switching networks (including the ARPANET and the Internet).

A number of well-known computer luminaries have worked at BBN, including Robert Kahn, J. C. R.
Licklider, Marvin Minsky, and Ray Tomlinson. Among them was the researcher Robert H. (Bob)
Thomas, working in a small group of programmers who were developing a time-sharing system
called TENEX, which ran on the Digital PDP-10 (see the lower image).

Let's clarify: Creeper wasn't a real virus, not only because the notion of a computer virus didn't exist
in the 1970s, but also because it was actually an experimental self-replicating program, intended not
to damage, but to demonstrate a mobile application.

Creeper ran on the old TENEX operating system (the OS that saw the first email programs, SNDMSG
and READMAIL, as well as the use of the "@" symbol in email addresses) and used the ARPANET
(predecessor of the current Internet) to infect DEC PDP-10 computers running TENEX. Creeper
caused infected systems to display the message "I'M THE CREEPER : CATCH ME IF YOU CAN."

Creeper would start to print a file, but then stop, find another TENEX system, open a connection,
pick itself up and transfer itself to the other machine (along with its external state, files, etc.), and then
start running on the new machine, displaying the message. The program rarely, if ever,
actually replicated itself; rather, it jumped from one system to another, attempting to remove itself
from previous systems as it propagated forward. Thus Creeper didn't install multiple instances of itself
on several targets; it just moseyed around a network.

It is uncertain how much damage (if any) Creeper actually caused. Most sources say the worm was
little more than an annoyance. Some sources claim that Creeper replicated so many times that it
crowded out other programs, but the extent of the damage is unspecified. In any case, it immediately
revealed the key problem with such worm programs: the problem of controlling the worm.

The Creeper program led to further work, including a version by a colleague of Thomas, Ray
Tomlinson, that not only moved through the net but also replicated itself at times. To counter
this enhanced Creeper, the Reaper program was created, which moved through the net, replicating
itself, and tried to find copies of Creeper and log them out. Thus, if Creeper was the first virus,
then Reaper was the first anti-virus software.

***
Note from the author (Georgi Dalakov):
After composing this article, I wrote to Mr. Ray Tomlinson with an appeal for comment. He was
so kind as to provide one, as follows:
Your description agrees with my recollection, though I think it was somewhat later than 1970 and I
don't recall some of the details you give, such as printing a file as evidence of its presence on a
particular machine (though it must have done something to indicate its progress). I do recall
making the modifications you indicate and thinking of it as the escalation of an arms race.
There was a server (or daemon or background process) (RSEXEC, I think it was called) running on
the individual machines that supported this activity. That is, the creeper application was not
exploiting a deficiency of the operating system. The research effort was intended to develop
mechanisms for bringing applications to other machines with intention of moving the application to
the most efficient computer for its task. For example, it might be preferable to move the application
to the machine having the data (as opposed to bringing the data to the applications). Another use
would be to bring the application to a machine that might have spare cycles because it is located in a
different timezone where local users are not yet awake. The CREEPER application was a
demonstration of such a mobile application.

TCP/IP
The most popular network protocol suite in the world, TCP/IP, was designed in the 1970s by two
DARPA scientists, Vint Cerf and Bob Kahn, the persons most often called the fathers of the Internet.

Vinton Gray "Vint" Cerf (born June 23, 1943, in New Haven, Connecticut) obtained his B.S. in Math
and Computer Science at Stanford University in 1965 and went to IBM, where he worked for some two
years as a systems engineer, supporting QUIKTRAN, a system intended to make time-shared computing
more economical and widely available to scientists, engineers, and businessmen.

In 1967 he left IBM to attend graduate school at the University of California, Los Angeles (UCLA),
where he earned his master's (in 1970) and Ph.D. (in 1972) in Computer Science. During his graduate
student years, he studied under Professor Gerald Estrin and worked in Leonard Kleinrock's data packet
networking group, which connected the first two nodes of the ARPANET, the predecessor to the
Internet. He worked as a Principal Programmer, participating in a number of projects, including the
ARPANET Network Measurement Center, a video graphics project involving a computer-controlled
16 mm camera, and the development of ARPANET host protocol specifications.

While at UCLA, he also met Bob Kahn, who was working on the ARPANET hardware architecture
at Bolt Beranek and Newman.

Robert Elliot Kahn (born December 23, 1938) received a B.E.E. degree from the City College of New
York in 1960, and M.A. and Ph.D. degrees from Princeton University in 1962 and 1964 respectively.

After graduation, he received a position on the Technical Staff at Bell Labs and then became an
Assistant Professor of Electrical Engineering at MIT. He took a leave of absence from MIT to join Bolt
Beranek and Newman, where he was responsible for the system design of the ARPANET, the first
packet-switched network, and was involved in the building of the Interface Message Processor.

In 1972, Kahn was hired by Larry Roberts at the IPTO to work on networking technologies, and in
October he gave a demonstration of an ARPANET network connecting 40 different computers at the
International Computer Communication Conference. The demonstration made the network widely
known for the first time to people around the world, and made communication engineers realize that
packet switching was a real technology.

At the IPTO, Kahn worked on an existing project to establish a satellite packet network, and initiated a
project to establish a ground-based radio packet network. These experiences convinced him of the
need for development of an open-architecture network model, where any network could communicate
with any other independent of individual hardware and software configuration. Kahn therefore set
four goals for the design of what would become the Transmission Control Protocol (TCP):

Network connectivity. Any network could connect to another network through a gateway.
Distribution. There would be no central network administration or control.
Error recovery. Lost packets would be retransmitted.
Black box design. No internal changes would have to be made to a network to connect it to
other networks.
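Of these four goals, "error recovery" is perhaps the easiest to illustrate concretely. The toy simulation below shows only the bare idea, a sender retransmitting a packet until it is acknowledged; real TCP uses sequence numbers, sliding windows, and adaptive timeouts, none of which are modelled here:

```python
import random

random.seed(1)  # deterministic "network" for the demonstration

def unreliable_send(packet, loss_rate=0.5):
    """A simulated lossy network: silently drops packets, else returns an ACK."""
    if random.random() < loss_rate:
        return None                      # packet lost in transit
    return f"ACK {packet['seq']}"

def send_reliably(packet, max_tries=20):
    """Keep retransmitting until an acknowledgement arrives."""
    for attempt in range(1, max_tries + 1):
        if unreliable_send(packet) is not None:
            return attempt               # number of transmissions needed
    raise RuntimeError("peer unreachable")

tries = send_reliably({"seq": 1, "data": b"hello"})
print(f"delivered after {tries} transmission(s)")
```

The point is architectural: because the network itself makes no delivery guarantee, recovery from loss is the responsibility of the endpoints, which is exactly the end-to-end intelligence the TCP design placed in the end nodes.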
In the spring of 1973, Vinton Cerf joined Kahn on the project. They started by conducting research on
reliable data communications across packet radio networks, factored in lessons learned from the
Network Control Protocol, and then created the next-generation Transmission Control Protocol
(TCP), the standard protocol used on the Internet today.

In the early versions of this technology there was only one core protocol, named TCP. In fact, these
letters didn't even stand for what they do today, Transmission Control Protocol; they stood for
Transmission Control Program. The first version of this predecessor of modern TCP was
written in 1973, then revised and formally documented in RFC 675, Specification of Internet
Transmission Control Program, from December 1974.

What is the current status of Internet Protocol Suite (commonly known as TCP/IP)?

It is the set of communications protocols used for the Internet and other similar networks. It is named
after two of its most important protocols: the Transmission Control Protocol (TCP) and the
Internet Protocol (IP), which were the first two networking protocols defined in this standard. Today's
IP networking represents a synthesis of several developments that began to evolve in the 1960s and
1970s, namely the Internet and LANs (Local Area Networks), which emerged in the mid- to late
1980s, together with the advent of the World Wide Web in the early 1990s.

The design of the network included the recognition that it should provide only the functions of
efficiently transmitting and routing traffic between end nodes, and that all other intelligence should be
located at the edge of the network, in the end nodes. Using this simple design, it became possible to
connect almost any network to the ARPANET, irrespective of its local characteristics. One popular
saying has it that TCP/IP, the eventual product of Cerf and Kahn's work, will run over "two tin cans
and a string."

A computer or device called a router (a name changed from gateway to avoid confusion with other
types of gateways) is provided with an interface to each network, and forwards packets back and forth
between them. Requirements for routers are defined in RFC 1812.

DARPA then contracted with BBN Technologies, Stanford University, and University College
London to develop operational versions of the protocol on different hardware platforms. Four versions
were developed: TCP v1, TCP v2, a split into TCP v3 and IP v3 in the spring of 1978, and then stability
with TCP/IP v4, the standard protocol still in use on the Internet today.

In 1975, a two-network TCP/IP communications test was performed between Stanford and University
College London (UCL). In November, 1977, a three-network TCP/IP test was conducted between sites
in the US, UK, and Norway. Several other TCP/IP prototypes were developed at multiple research
centers between 1978 and 1983. The migration of the ARPANet to TCP/IP was officially completed on
January 1, 1983, when the new protocols were permanently activated.

In March 1982, the US Department of Defense declared TCP/IP as the standard for all military
computer networking. In 1985, the Internet Architecture Board held a three day workshop on TCP/IP
for the computer industry, attended by 250 vendor representatives, promoting the protocol and
leading to its increasing commercial use.

The Internet Protocol Suite, like many protocol suites, may be viewed as a set of layers. Each layer
solves a set of problems involving the transmission of data, and provides a well-defined service to the
upper layer protocols based on using services from some lower layers. Upper layers are logically closer
to the user and deal with more abstract data, relying on lower layer protocols to translate data into
forms that can eventually be physically transmitted.

The TCP/IP model consists of four layers, as described in RFC 1122. From lowest to highest, these
are the Link Layer, the Internet Layer, the Transport Layer, and the Application Layer. It should be
noted that this model was not intended to be a rigid reference model into which new protocols have to
fit in order to be accepted as a standard.

Almost all operating systems in use today, including all consumer-targeted systems, include a TCP/IP
implementation.
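Because the lower layers ship with the operating system, an application program only has to speak to the top of the stack. A minimal sketch of the Application Layer riding on TCP/IP, a loopback echo exchange in Python, whose socket API is a thin wrapper over the OS implementation:

```python
import socket
import threading

# Server side: accept one TCP connection and echo back what it receives.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # TCP over IPv4
server.bind(("127.0.0.1", 0))       # port 0 lets the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def echo_once():
    conn, _ = server.accept()
    conn.sendall(conn.recv(1024))   # Application Layer: echo the payload
    conn.close()

t = threading.Thread(target=echo_once)
t.start()

# Client side: the Transport, Internet, and Link Layers beneath this call
# are handled entirely by the operating system's TCP/IP stack.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"two tin cans and a string")
reply = client.recv(1024)
print(reply)                        # b'two tin cans and a string'
client.close()
t.join()
server.close()
```

All retransmission, routing, and framing happen below the two `sendall`/`recv` calls, which is precisely the division of labor the four-layer model describes.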

The World Wide Web of Tim Berners-Lee


Tim Berners-Lee used to say: "I just had to take the hypertext idea and connect it to the TCP and DNS
ideas and, ta-da!, the World Wide Web."

So how did this "simple" invention happen?

In March 1989, a physicist and computer nerd at CERN (the European Particle Physics Laboratory in
Geneva, Switzerland), Timothy John "Tim" Berners-Lee (born 8 June 1955 in London), submitted to
his boss a proposal for an information management system, the prototype of the now
ubiquitous World Wide Web. The boss was not very impressed. "Vague, but exciting" were the words
he wrote on the proposal, thus unofficially allowing Berners-Lee to continue his work on the WWW
(actually, the term World Wide Web would be settled on the next year; in 1989 Berners-Lee called his
system Mesh).

Berners-Lee already had experience with hypertext systems, including his own. During his first stay
at CERN, in 1980, inspired by the MEMEX of Vannevar Bush, the Project Xanadu of Ted
Nelson, and the NLS of Douglas Engelbart, he proposed a project for a new document system, based
on the concept of hypertext, designed to facilitate sharing and updating information among
researchers. The system was called ENQUIRE and was written in the Pascal programming language
on a NORD-10 (a 16-bit minicomputer of Norsk Data, running under the operating system SINTRAN
III); later the program was ported to the PC, and then to VMS (see the original proposal for ENQUIRE).

Berners-Lee's inspiration came from the frustrating fact that there was a lot of different data on
different computers, but it was not connected at all. Because people at CERN came from universities
all over the world, they brought with them all types of computers. Not just Unix, Mac, and PC: there
were all kinds of big mainframe computers and medium-sized computers running all sorts of software.
One had to log on to different computers to get at the data, and sometimes one even had to learn a
different program on each computer. So finding out how things worked was a really difficult task.

Berners-Lee wanted to write some programs to take information from one system and convert it, so it
could be inserted into another system. More than once. The big question was: "Can't we convert every
information system so that it looks like part of some imaginary information system which everyone
can read?" And that became the WWW.

In 1990, with the help of his colleague from CERN, Robert Cailliau, Berners-Lee produced a revision
of the system, which was accepted by his manager. Berners-Lee coded the first Web browser, which
also functioned as an editor (the name of the program was WorldWideWeb, running on
the NeXTSTEP operating system), and the first Web server, CERN httpd (short for HyperText
Transfer Protocol daemon), both running on a NeXTcube workstation (see the images below).

The first Web site in the world (with the DNS name info.cern.ch) was put online on 6 August
1991.
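The protocol that the WorldWideWeb browser and CERN httpd spoke (retroactively labelled HTTP/0.9) was strikingly simple: the client sent a single "GET <path>" line, and the server replied with the raw HTML and closed the connection. A sketch of building such a request line from a URL (make_request is a hypothetical helper, not CERN code):

```python
from urllib.parse import urlsplit

def make_request(url):
    """Build an HTTP/0.9-style request: just 'GET <path>' ended by CRLF.
    There were no headers, no status codes, and no content types."""
    parts = urlsplit(url)
    path = parts.path or "/"        # an empty path means the site root
    return f"GET {path}\r\n"


print(repr(make_request("http://info.cern.ch/hypertext/WWW/TheProject.html")))
# 'GET /hypertext/WWW/TheProject.html\r\n'
```

Everything that makes modern HTTP recognizable, version numbers, headers, status codes, arrived later with HTTP/1.0.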

In the 1990s the WWW gradually became the prevalent technology on the Internet and a global
information medium. Today, the Web and the Internet allow connectivity from literally everywhere on
Earth, even from ships at sea and in outer space. In 2011 the number of web sites exceeded 300
million. As of April 2012, the Indexed Web contained at least 8.02 billion pages.

At the beginning of the century, new ideas for sharing and exchanging content ad hoc, such as
weblogs and RSS, rapidly gained acceptance on the Web. This new model for information exchange,
called Web 2.0, primarily features DIY (do-it-yourself, a term for building, modifying, or repairing
something without the aid of experts or professionals) user-edited and user-generated websites, plus
various other content such as video and audio media (YouTube), microblogging (Twitter), etc.

What is the future of the WWW? What will the next version, Web 3.0, look like?

Tim Berners-Lee's vision of the future Web as a universal medium for data, information, and
knowledge exchange is connected with the term Semantic Web. In 1999 he wrote: "I have a dream
for the Web in which computers become capable of analyzing all the data on the Web: the content,
links, and transactions between people and computers. A Semantic Web, which should make this
possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and
our daily lives will be handled by machines talking to machines. The intelligent agents people have
touted for ages will finally materialize." And later, in 2006, he added: "People keep asking what Web
3.0 is. I think maybe when you've got an overlay of scalable vector graphics (everything rippling and
folding and looking misty) on Web 2.0 and access to a semantic Web integrated across

Birth of the ARPANET


In this section I'll present some events and people who, in the period 1960-1969, established not only the
theoretical foundations of the Internet's network topology and invented some key technologies of the WWW,
but also implemented the father of the Internet, the ARPANET:

Ted Nelson
The American philosopher, sociologist, and pioneer of information technology Theodor Holm Nelson (born
1937 in New York) is a rather controversial figure, and not only in the computing world. He used to
repeat the four maxims by which he leads his life: "most people are fools, most authority is malignant,
God does not exist, and everything is wrong."

Ted Nelson stated: "In 1960 I had a vision of a world-wide system of electronic publishing, anarchic
and populist, where anyone could publish anything and anyone could read it. (So far, sounds like the
web.) But my approach is about literary depth, including side-by-side intercomparison, annotation,
and a unique copyright proposal. I now call this 'deep electronic literature' instead of 'hypertext,'
since people now think hypertext means the web."

After graduating from Swarthmore College with a BA in philosophy in 1960, Ted enrolled in graduate
school at Harvard. During his first year he took a course in computer programming using an IBM
7090 computer and began to think about writing a document management system to index and
organize his collection of notes. He started a term project for creating a writing system similar to a
word processor (essentially it was a word processor capable of storing multiple versions, and
displaying the differences between these versions), but that would allow different versions and
documents to be linked together by association, nonlinearly. He did not complete this first project,
but continued to work on a system very similar to the one envisioned by Vannevar Bush (see
the memex), based not on microfilm, as the "memex" was, but on a computer. This idea became the
overriding concern of his entire life.

Let's see exactly what inspiration led Nelson to develop hypertext (excerpt from an interview
for Wired magazine):
Well I was always, as a kid, into writing and reading and literature and movies basically, like a lot
of people, and I had done a great deal of writing as a youth, and re-writing, and the intricacy of
taking ideas and sentences and trying to arrange them into coherent, sensible, structures of thought
struck me as a particularly intricate and complex task, and I particularly minded having to take
thoughts which were not intrinsically sequential and somehow put them in a row because print as it
appears on the paper, or in handwriting, is sequential. There was always something wrong with
that because you were trying to take these thoughts which had a structure, shall we say, a spatial
structure all their own, and put them into linear form. Then the reader had to take this linear
structure and recompose his or her picture of the overall content, once again placed in this
nonsequential structure. You had two, it seemed (and now I'm reconstructing because I don't know
how explicitly I thought this out as a youth): you had to take these two additional steps of
deconstructing some thoughts into linear sequence, and then reconstructing them. Why couldn't that
all be bypassed by having a nonsequential structure of thought which you presented directly? That
was the hypothesis (well, the hyperthesis really) of hypertext, that you could save both the writer's
time and the reader's time and effort in putting together and understanding what was being
presented.

On top of this basic idea, Nelson wanted to facilitate nonsequential writing, in which the reader could
choose his or her own path through an electronic document. At the beginning of 1965, Nelson used for
the first time the term "hyper-text". Later in 1965, he presented a paper on "zippered lists" (a key
algorithm in his Xanadu system) at a national conference of the Association for Computing
Machinery, in which he published the term. These "zippered lists" would allow compound documents
to be formed from pieces of other documents, a concept named transclusion.
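Transclusion is easy to sketch in code. The following minimal Python sketch is my own illustration of the idea, not Xanadu's actual design: a document is a list of pieces, each either plain text or a live reference into another document, so that edits to the source show through in every document that transcludes it.

```python
# Illustrative sketch of transclusion: compound documents built from
# live references into other documents (names here are hypothetical).

class Document:
    def __init__(self, text=""):
        self.pieces = [("text", text)] if text else []

    def append_text(self, text):
        self.pieces.append(("text", text))

    def transclude(self, source, start, end):
        # Store a reference, not a copy: later edits to `source` show through.
        self.pieces.append(("ref", source, start, end))

    def render(self):
        out = []
        for piece in self.pieces:
            if piece[0] == "text":
                out.append(piece[1])
            else:  # resolve the reference against the *current* source text
                _, source, start, end = piece
                out.append(source.render()[start:end])
        return "".join(out)

original = Document("Hypertext means nonsequential writing.")
quote = Document("Nelson wrote: ")
quote.transclude(original, 0, 9)   # pulls "Hypertext" from the original
print(quote.render())              # -> Nelson wrote: Hypertext
```

Real Xanadu references were meant to go further than this sketch: two-way links back to the original context and micropayment accounting for the copyright holder.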

Nelson continued to expound his ideas, but he did not possess the technical knowledge to tell others
how his ideas could be implemented, so many people simply ignored him; still, he persisted. In
1967, he named his system XANADU, and with the help of interested, mainly younger, computer
hackers continued to develop it. Nelson described his ideas in his 1974 book Computer Lib/Dream
Machines and in the 1981 Literary Machines.
Xanadu is a high-performance hypertext system that assures the identity of references to objects, and
solves the problems of configuration management and copyright control. Anyone is allowed to
reference anything, provided that references are delivered from the original, possibly involving
micropayments to the copyright holders. Let's see the basic objectives of
Xanadu, as they were defined on xanadu.com:

High-power hypertext (much richer structures than "pages")

Any number of overlapping, publishable two-way links made by any number of people

EVERYTHING ANNOTATABLE (special case of the above)

VERSION MANAGEMENT, with surviving portions viewable side by side

RIGHTS MANAGEMENT by:

Generalized permission for on-line re-use (now called transcopyright)

Payment by each user for any downloaded quotation

All quotations back-followable to their original contexts

EVERYTHING FREELY REPUBLISHABLE

In 1972, Cal Daniels completed the first demonstration version of the Xanadu software on a computer
Nelson had rented for the purpose, though Nelson soon ran out of money. In 1974, with the advent of
computer networking, Nelson refined his thoughts about Xanadu into a centralized source of
information, calling it a "docuverse".

In 1979, Nelson formed a strong group of followers to hash out their ideas for Xanadu, but with
no success. The group continued its work, almost to the point of bankruptcy. In 1983, however,
Nelson obtained the support of John Walker, founder of Autodesk, and the group started working on
Xanadu with Autodesk's financial backing.

Xanadu has never been fully completed and is far from being implemented even now. Since 1999 it
has been an open-source project. In many ways Tim Berners-Lee's World Wide Web is a similar, though
much simplified, system. Let's see Nelson's opinion of the WWW (excerpt from the same
interview for Wired magazine):
...I think the WWW was a brilliant simplification. As I understand it, and maybe I have this wrong,
but Tim Berners-Lee came and we had lunch, in, oh I guess it was 1989, 90, something like that, in
Sausalito, and I really liked the guy, and he'd done this very simple thing, and it sounded too trivial
to me {laughs} but he certainly was a nice fellow and I expected to keep in touch with him, although
I am a very bad correspondent, and the next thing I knew suddenly the thing had caught on. And
what it turns out to be is simply an extension of file transfer protocol, in other words it's saying you
can anonymously go in and dip in and take out this file and here is a proposed way to look at it. This
is called HTML. You have to understand the HTML/SGML kind of format where you've got all these
warty little knobs and boogers in it that are formatting codes; this is absolutely contrary to the
Xanadu idea that you have clean data undefiled. However, it works, it's very simple, and you can
always take those things out, so that's OK. But all it is is FTP with lipstick so that you can look at
these things and the jump addresses are hidden and the formats and you have paragraph levels and
stuff and it's basically what people needed and frankly I think it's much better than word
processing. I'm really happy now that I'm planning to switch from Microsoft Word to HTML just
because there's no need not to. It's a perfectly good format, and it makes everything simpler to
browse in.

The NLS system of Douglas Engelbart

The name of the prominent American inventor and computer pioneer Douglas Carl Engelbart
(see the biography of Douglas Engelbart) has already been mentioned several times on this site (see the first
computer mouse).

Here we will mention only his system NLS, the first working hypertext system.

In the late summer of 1945, at the very end of World War II, a 20-year-old US Navy radar
technician in the Philippines, Douglas Engelbart, went to the local Red Cross library and picked up a
copy of the Atlantic Monthly journal from July 1945. There he stumbled upon an interesting article,
namely Vannevar Bush's work As We May Think, presenting his "memex" automated library system.
The young Douglas was profoundly influenced by Bush's vision of the future of information
technology.

In the 1950s Engelbart started developing Bush's ideas, extending them into the broader field of what
he would later call Augmenting Human Intellect. In 1957 he joined the Stanford Research
Institute and in 1962 started work on Augmenting Human Intellect: A Conceptual Framework, a
project to develop computer tools to augment human capabilities, boosting mankind's capability
for coping with complex, urgent problems. In October 1962, Engelbart published his own version of
Bush's vision, describing an advanced electronic information system in the paper "Augmenting
Human Intellect: A Conceptual Framework" (see the paper "Augmenting Human Intellect"), prepared
for the Air Force Office of Scientific Research and Development.

In this article he mentioned: Most of the structuring forms I'll show you stem from the simple
capability of being able to establish arbitrary linkages between different substructures, and of
directing the computer subsequently to display a set of linked substructures with any relative
positioning we might designate among the different substructures. You can designate as many
different kinds of links as you wish, so that you can specify different display or manipulative
treatment for the different types.
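Engelbart's notion of typed links between substructures can be sketched in a few lines. The Python fragment below is purely illustrative (NLS itself worked very differently): statements carry identifiers, links carry a type, and the display code can filter by that type to give each kind of link different treatment.

```python
# Illustrative sketch of typed links between substructures.
# Identifiers and link types here are hypothetical examples.

statements = {
    "s1": "Hypertext links connect substructures.",
    "s2": "A definition of the term used in s1.",
    "s3": "A side comment on s1.",
}
links = [("s1", "s2", "definition"), ("s1", "s3", "comment")]

def linked(statement_id, link_type=None):
    """Return the statements linked from `statement_id`,
    optionally filtered by link type."""
    return [statements[dst] for src, dst, t in links
            if src == statement_id and (link_type is None or t == link_type)]

print(linked("s1"))                # both linked statements
print(linked("s1", "definition"))  # only the 'definition' link
```

Filtering by link type is what allows "different display or manipulative treatment for the different types": a viewer could, for instance, hide all "comment" links until the reader asks for them.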

Engelbart demonstrated creating links between three sample sentences, mentioning: Here is one
standard portrayal, for which I have established a computer process to do the structuring
automatically on the basis of the interword links....
He aimed at a term with the light pen and hit a few strokes on the keyset, and the old text jumped
farther out of the way and the definition appeared above the diagram, with the defined term
brighter than the rest of the diagram. And he showed you also how you could link secondary
phrases (or sentences) to parts of the statement for more detailed description. These secondary
substructures wouldn't appear when you normally viewed the statement, but could be brought in by
simple request if you wanted closer study...
It proves to be terrifically useful to be able to work easily with statements that represent more
sophisticated and complex concepts. Sort of like being able to use structural members that are
lighter and stronger; it gives you new freedom in building structures...

In the same year, 1962, Engelbart hired a small team of researchers to develop a demonstration hyper
collaborative knowledge environment system called NLS (for oNLine System), first published (see the
article for NLS) and publicly demonstrated in 1968. The NLS system is difficult to describe, since it is a
very richly comprehensive environment of tools and practices for facilitating any scale of heavy
knowledge work. Engelbart used NLS for all his own knowledge work: drafting, publishing, email,
shared-screen collaborative viewing and editing, document cataloging, project management, a shared
address book, and all source code development and maintenance, all in an integrated hyper
groupware environment, filled with many special features for high-performance work.

For example, the user can create a link to any paragraph, line of code or email paragraph; see when
paragraphs and lines of code were last edited and by whom; and even view a file filtered
by author since a certain date and time (as in: why doesn't the code work this morning, let's see who
was in there changing what, when!). He can browse with outline views, drill down into the structure of
a document or source code, fly around with a number of precision browsing and custom
viewing features, and edit the structure as well as the text, within and across files and application
domains.

Engelbart continued to evolve NLS under real world usage with a team of up to 47 researchers in his
lab at SRI, cultivating a networked community of early customer IT pioneers via the newly
formed ARPANET.

Leonard Kleinrock
Three people can be credited as inventors of packet-switched networks, thus laying the foundations for
the Internet: Leonard Kleinrock, Donald Davies and Paul Baran.

Leonard Kleinrock (born on June 13, 1934, in New York) is a famous American engineer and
computer scientist, who made several important contributions to the field of computer networking, in
particular to the theoretical side of computer networking. He also played an important role in the
development of the ARPANET.

Kleinrock comes from a poor family of Ukrainian Jewish immigrants. After he graduated from the Bronx
High School of Science in 1951, Kleinrock's father asked him to stay in New York to help support the
family. So Leonard wrote letters to "every chamber of commerce in the country," asking for
scholarship opportunities in their towns. Kleinrock worked as a technician at a cousin's shop and
spent five and a half years (instead of four) taking night classes at the City College of New York, where in 1957 he
received his BEE (Bachelor of Electrical Engineering) degree.

In 1958 he went to MIT to work on his PhD under the best guy at MIT: Claude Shannon, the creator
of Information Theory and the mathematical concept of entropy, not to mention the resurrector of
Boolean algebra as a useful way of describing data. "Brilliant man," Kleinrock says. "My role model
then and now."
At MIT Kleinrock received a master's degree (1959) and a Ph.D. (1963) in Electrical Engineering and
Computer Science. In 1961 he published his first paper on digital network
communications, Information Flow in Large Communication Nets. He developed his ideas further in
his 1963 Ph.D. thesis, establishing a mathematical theory of packet networks (packet switching), and
then published a comprehensive analytical treatment of digital networks in his book Communication
Nets in 1964.

After completing his thesis in 1962, Kleinrock moved to UCLA (University of California at Los
Angeles), and later established the Network Measurement Center (NMC), led by himself and
consisting of a group of graduate students, working in the area of digital networks.

Many of Kleinrock's initial ideas came from brainstorming about the best way for students and
researchers at MIT to most efficiently share computer time. "Computers burst data, they transmit
then they stop a while, while they're thinking or processing or whatever. And in those days data
communication lines were really expensive," said Kleinrock. "The idea was, don't dedicate a resource
to somebody -- when I was sitting there, scratching my head, that machine was idle, I'm not using it.
You want to do it in dynamic fashion: whoever needs it gets it now. If you're not using it, let somebody
else in."

When in 1966, Lawrence Roberts (a colleague of Kleinrock from MIT) joined the project of developing
the ARPANET, he used Kleinrock's Communication Nets to help convince his colleagues that a wide
area digital communication network was possible. In October, 1968, Roberts gave a contract to
Kleinrock's center as the ideal group to perform ARPANET performance measurement and find areas
for improvement.

On a historic day in early September 1969, a team at Kleinrock's NMC connected one of their SDS
(Scientific Data Systems) Sigma 7 computers to an Interface Message Processor (the first switch),
thereby becoming the first node on the ARPANET, and the first computer ever on the Internet (see the
lower photo).

As the ARPANET grew in the early 1970s, Kleinrock's group stress-tested the system to work out the
detailed design and performance issues involved with the world's first packet-switched network,
including routing, loading, deadlocks, and latency.

Kleinrock later published several of the standard works on the subject, continued to be active in the
research community, and published more than 200 papers and authored six books. His theoretical
work on hierarchical routing, done in the late 1970s with his student Farouk Kamoun, is now critical
to the operation of today's worldwide Internet.

Leonard Kleinrock has received numerous professional awards. He is a member of the National
Academy of Engineering, an IEEE fellow, and an ACM fellow. He is the recipient of the Marconi
Award, the L. M. Ericsson Prize, the UCLA Outstanding Teacher Award, the Lanchester Prize, the
ACM SIGCOMM Award, the Sigma Xi Monie Ferst Award, the INFORMS Presidents Award, and the
IEEE Harry Goode Award. He shared the Charles Stark Draper Prize for 2001 with Vinton Cerf,
Robert Kahn, and Lawrence Roberts for their work on the ARPANET and Internet. He was selected to
receive the prestigious National Medal of Science, from President George W. Bush in the White House
on September 29, 2008.

Paul Baran
Three people can be credited as inventors of packet-switched networks, thus laying the foundations for
the Internet: Paul Baran, Leonard Kleinrock and Donald Davies.

Paul A. Baran was born in the town of Grodno (then in Poland, now in Belarus) with the
Yiddish given name "Pesach" (also the name of one of the most commonly observed Jewish holidays, usually in April),
on April 29, 1926, as the youngest of three children in a Jewish family. In May 1928 his family moved
to the United States, first to Boston, then to Philadelphia, where his father, Morris "Moshe" Baran
(1884-1979), owned a small grocery store. As a child, Baran delivered groceries for his father in his
little red wagon.

In 1949 Baran received his bachelor's degree in electrical engineering from Drexel University in
Philadelphia and was immediately hired by the Eckert-Mauchly Computer Company as a technician
on the world's first commercial computer, the Univac. In 1950 he went to the Raymond Rosen
Engineering Products Company, where he designed the first telemetering equipment for Cape
Canaveral. In 1955 he married Evelyn Murphy, moved to Los Angeles and joined the Hughes
Aircraft Company, where he worked for 4 years, at the same time preparing his master's degree.

In 1959 Baran obtained his master's degree in Engineering from UCLA and began working for the
Research And Development (RAND) research organization in the same year. RAND was founded in
Santa Monica, California, soon after WWII to help maintain the unique system analysis and
operations research skills developed by the US military to manage the unprecedented scale of
planning and logistics during that global conflict.

The US Air Force had recently established one of the first wide area computer networks for the SAGE
radar defense system, and had an increasing interest in reliable and survivable wide area
communications networks, so it could reorganize and respond after a nuclear attack, diminishing
the attractiveness of a first-strike option for the main enemy, the Soviet Union.

Relying on his experience with radio networks, Baran began an investigation into the development of
survivable communications networks. The results were first presented to the Air Force in the
summer of 1961; he then wrote a series of eleven comprehensive papers, which appeared in a
series of RAND studies published between 1960 and 1962, and then in a book, titled On Distributed
Communications, in 1964.

Baran's study describes a remarkably detailed architecture for a distributed, survivable, packet
switched communications network. The network is designed to withstand almost any degree of
destruction to individual components without loss of end-to-end communications. Since each
computer could be connected to one or more other computers, Baran assumed that any link of the
network could fail at any time, and the network therefore had no central control or administration
(see the lower scheme).

Using a mini-computer, Baran and his team developed simulation programs to test basic connectivity
of an array of nodes with varying degrees of linking. That is, a network of n-ary degree of connectivity
would have n links per node. The simulation randomly destroyed nodes and subsequently tested the
percentage of nodes that remained connected. The results of the simulation revealed that networks
where n>=3 showed a significant increase in resilience, even against as much as 50% node loss. Baran's
conviction, gained from the simulation, was that redundancy was the key.
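The spirit of those simulations is easy to reproduce. The Python sketch below is my own simplified stand-in, not Baran's actual RAND code: it wires up a random network with n links per node on average, destroys a random half of the nodes, and measures what share of the survivors a randomly chosen survivor can still reach.

```python
import random

def reachable_share(num_nodes, avg_degree, loss_fraction, rng):
    """Build a random network with the given average degree, destroy a
    random fraction of nodes, and return the share of survivors reachable
    from one arbitrary survivor. (Illustrative only, not Baran's code.)"""
    # Place random edges until the average degree is reached.
    edges = {v: set() for v in range(num_nodes)}
    placed = 0
    while placed < num_nodes * avg_degree // 2:
        a, b = rng.randrange(num_nodes), rng.randrange(num_nodes)
        if a != b and b not in edges[a]:
            edges[a].add(b)
            edges[b].add(a)
            placed += 1
    # Randomly destroy a fraction of the nodes.
    alive = set(rng.sample(range(num_nodes), int(num_nodes * (1 - loss_fraction))))
    # Breadth-first search from one survivor over links between survivors.
    start = next(iter(alive))
    seen, frontier = {start}, [start]
    while frontier:
        v = frontier.pop()
        for w in edges[v]:
            if w in alive and w not in seen:
                seen.add(w)
                frontier.append(w)
    return len(seen) / len(alive)

rng = random.Random(42)
for n in (2, 4, 6):  # average links per node
    avg = sum(reachable_share(200, n, 0.5, rng) for _ in range(50)) / 50
    print(f"{n} links per node: {avg:.0%} of survivors remain reachable")
```

With around two links per node the surviving network shatters into small fragments, while at higher degrees most survivors stay mutually reachable even after 50% node loss: exactly the redundancy effect Baran's study reported.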

Baran's work was accepted by the US Air Force for implementation and testing, but was ultimately shelved. His
series of papers and his book, however, influenced Larry Roberts and Kleinrock to adopt the
technology for the development of the ARPANET network a few years later. Actually the ARPANET was
never intended to be a survivable communications network, though some people still maintain the myth
that it was. Rather, the resilience of packet-switched networks using link-state routing
protocols, which we enjoy today, derives in part from the research done to develop a network that
could survive a nuclear attack.

While working at RAND, in addition to his innovations in networking, Baran also created
the first metal detector of the kind used today in airports and many other security checkpoints.

Paul Baran left RAND to become an entrepreneur and private investor in the early 1970s, and
founded several telecommunication companies, among them Metricom and Com21, and cofounded
the Institute for the Future.

Baran holds several patents and has received numerous professional honors, including the
IEEE Alexander Graham Bell Medal, the Marconi International Fellowship Award and National
Medal of Technology and Innovation.

Paul Baran was a fine gentleman and an extremely creative man. However, he believed that
innovation was a "team process" and he didn't seek credit for himself. He once said: "Many of the
things I thought possible would tend to sound like utter nonsense, or impractical, depending on the
generosity of spirit in those brought up in an earlier world."

Paul Baran died of lung cancer on March 28, 2011, in Palo Alto, California. His wife, Evelyn, died in
2007. In addition to his son, David, of Atherton, California, he was survived by three grandchildren.

Donald Davies

Three people can be credited as inventors of packet-switched networks, thus laying the foundations for
the Internet: Paul Baran, Leonard Kleinrock and Donald Davies. Little did these men know, at the time
they invented these networks, just how much of an impact their work would have on mankind, an
impact arguably comparable to that of the entire history of telecommunications. Could you imagine a world
without search engines, blog hosting and social media? More importantly, could these men have imagined
a world with these things in it?

The Welshman Donald Watts Davies was born in Treorchy in the Rhondda Valley, Wales, on 7th of June,
1924, into the family of a clerk at a coal mine. When his father died a few months later, his mother
moved Donald and his twin sister to her home town of Portsmouth, where he went to school. He went
on to receive a BSc degree in physics at Imperial College, London, in 1943, then took
another degree in mathematics, graduating in 1947, both with first-class honors.

During his last year at university he attended a lecture by John Womersley, superintendent of the
mathematics division of the National Physical Laboratory (NPL), about the ACE digital computer,
which was being developed there. Excited by the potential of the new technology, Davies immediately
applied to join the group, and in September 1947 he joined the laboratory as a member of the small
team, which was led by Alan Turing.

The group's work, based on Turing's design, eventually led to the development of the Pilot
ACE computer, which ran its first program in May 1950 and was one of the first electronic stored-
program digital computers in the world. Along with Ted Newman, Jim Wilkinson and others, Davies
played an important part in the detailed design and development of the machine, and of its
successor, the full-scale ACE computer.

As computer development moved from the laboratory to industry, Davies's interests widened to
include the purposes for which computers could be used. Davies worked for a while on applications
such as road traffic simulation and machine translation (in 1958 he initiated a project to use a
computer to translate technical Russian into English).

In 1963 he was appointed technical manager of the advanced computer techniques project,
responsible for government support for the British computer industry. The key new project was the
development of an idea he had originated in 1965: that to achieve communication between computers
a fast message-switching communication service was needed, in which long messages were split into
chunks sent separately, so as to minimize the risk of congestion.
Davies said he had realized that it was inefficient for a computer to send an entire file to another
computer in an uninterrupted stream of data, chiefly because computer traffic is 'bursty' with long
periods of silence. The stream must be broken up into chunks (which he called packets), and the
technique became known as packet-switching. As he later wrote: "So, in November 1965, I conceived
the use of a purpose-designed network employing packet switching in which the stream of bits is
broken up into short messages, or packets, that find their way individually to the destination, where
they are reassembled into the original stream".
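Davies' description maps directly onto a few lines of code. The toy Python sketch below (an illustrative format, not any real protocol) breaks a stream of bytes into short numbered packets, lets them arrive in any order, and reassembles the original stream from the sequence numbers:

```python
import random

def packetize(message: bytes, size: int):
    """Split a byte stream into short, individually numbered packets."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Restore the original stream; packets carry their own sequence
    numbers, so the order in which they arrive does not matter."""
    return b"".join(data for _, data in sorted(packets))

message = b"Computer traffic is bursty, so break the stream into short packets."
packets = packetize(message, 8)
random.shuffle(packets)  # simulate packets taking different routes
assert reassemble(packets) == message
print(f"sent {len(packets)} packets, stream reassembled intact")
```

A real packet-switched network adds headers with addresses, checksums and retransmission on loss; the essential insight, short self-describing chunks routed independently, is already visible here.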

So Davies introduced the terms packet and packet switching into network terminology (which is
much catchier than Baran's "distributed adaptive message block switching"). Davies had considered
many possibilities (block, unit, segment, etc.) before deciding on packet, as a sort of small package.
As he later told Baran: "Well, you may have got there first, but I got the name."

Davies' team presented its paper at a 1967 conference in Tennessee, USA, where Lawrence Roberts of
the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense presented a design
for creating a computer network. In the United States, this led to the development of ARPANET, the
prototype of the Internet.

Davies' later work concentrated on the security of data. He undertook security studies for
teleprocessing systems, financial institutions, government agencies and suppliers.

Davies received the British Computer Society Award in 1974. He was appointed a Distinguished
Fellow of the British Computer Society in 1975, a CBE in 1983 and a Fellow of the Royal Society in
1987. His books included "Communication Networks for Computers" in 1973, "Computer Networks
and their Protocols" in 1979, and "Security for Computer Networks" in 1984.

Donald Davies died on May 28th, 2000.

Joseph Licklider

Joseph Carl Robnett Licklider (see the biography of Licklider), also known as J.C.R. or "Lick", was an
American scientist, an imaginative experimenter and theoretician, who left major marks not only on
computer science, but also on psychoacoustics.

In the 1950s, working as an associate professor at MIT (the Massachusetts Institute of Technology) and
director of the Acoustics Laboratory, Lick got caught up in Project SAGE. It was a crash program to
create a computer-based air-defense system against Soviet long-range bombers. The computer in
SAGE was Whirlwind (see the lower photo), which had been under development at MIT since 1944
and recently had been completed under the direction of Jay Forrester. Other early computers, such as
ENIAC, had started out as giant calculators, with an operating style to match: You entered the
numbers and eventually got back a printout with the answer. This came to be known as batch-
processing. Whirlwind, by contrast, had started out as a flight simulator and had evolved into the
world's first real-time computer: It would try to respond instantly to whatever the user did at the
console. The challenge was to prove that a computer could take the data coming in from a new
generation of air-defense radars and display the results rapidly in a meaningful form.

Lick headed SAGE's human-factors team, and he saw the project as an example of how machines and
humans could work in partnership. Without computers, humans couldn't begin to integrate all that
radar information. Without humans, computers couldn't recognize the significance of that
information, or make decisions.

While developing his theories of perception in the 1950s, Lick used analog equipment for generating stimuli,
collecting responses, analyzing them, and so on. He found, however, that analog computers were
not flexible enough to help him in his theory-building effort. This disappointment with analog
computers marks the beginning of his love affair with digital computers as modeling tools.

In 1957 he left MIT for the small consulting firm Bolt Beranek and Newman (BBN), as vice
president and director of research of the departments of psychoacoustics, engineering psychology, and
information systems. His train of thought went down strange new paths. That spring and summer, he
kept track of what he actually did during the day, later analyzing the data, with shocking results.
"About 85 percent of my 'thinking' time was spent getting into a position to think, to make a decision,
to learn something I needed to know," he later wrote. He concluded that his decisions on what work to
attempt "were determined to an embarrassingly great extent by considerations of clerical feasibility,
not intellectual capability."

That 85% of his research time was devoted to performing routine clerical or mechanical operations,
ranging from calculating and data plotting to collecting information: tasks that in principle could
be performed better and faster by a computer. Computers, he believed, would rescue the human mind
from its enslavement by mundane detail. Human and machine were destined to unite in an almost
mystical partnership, with computers handling rote algorithms while people provided the creative
impulses. The hope, he said, was that "the resulting partnership will think as no human brain has
ever thought and process data in a way not approached by the information-handling machines we
know today." Lick found this vision of human-computer symbiosis so compelling that standard
psychology could no longer compete.

Thus he switched fields, becoming an active member of the computer community. In March 1960 Lick
published a seminal paper, Man-Computer Symbiosis, in which he formulated a new vision of
computing. He described a machine that humans could relate to in the manner of "a colleague whose
competence supplements your own", a friend who could help when the problems got too hard to
think through in advance. Such problems "would be easier to solve" he wrote, "and they could be
solved faster, through an intuitively guided trial-and-error procedure in which the computer
cooperated, turning up flaws in the reasoning or revealing unexpected turns in the solution."

Real-time computers were still a rarity in 1960, and far too expensive for personal use. Therefore,
Lick concluded, the most efficient way to use this technology was to have the computer "divide its time
among many users." This certainly was not an original idea; such "time-sharing systems" were already
under development at MIT and elsewhere. Lick however followed that notion to its logical conclusion,
describing an online "thinking center" that would "incorporate the functions of present-day libraries."

He foresaw "a network of such centers, connected to one another by wide-band communications lines
and to individual users by leased-wire services." Sounds like ARPANET, yeah?

Lick also saw a desperate need for better ways for humans and computers to interact. Punch cards and
printouts were, he wrote, hopelessly impoverished relative to human communication via sight, sound,
touch and even body language. Lick's proposed solution: a desk-sized console that would function much
like today's personal computer, equipped with voice and handwriting recognition. He described a
display surface "approaching the flexibility and convenience of the pencil and doodle pad or the chalk
and blackboard."

Lick pointed out the need for reference works distributed via cheap, mass-produced "published
memory"; data storage that could access items by content, and not just by names or keywords; and
languages that would allow you to instruct the computer by giving it goals, instead of step-by-step
procedures. He also revealed his mixed feelings about artificial intelligence, then in its infancy. He saw
it as being potentially very useful, but knew far too much about the brain and its complexities to
believe that computers would soon be surpassing humans.

In October 1962 Lick became the first director of the Information Processing Techniques Office
(IPTO) of the Advanced Research Projects Agency (ARPA) of the Department of Defense, and
launched a research program based on his vision of man-computer symbiosis, including hardware and
software requirements for such a symbiosis. The Pentagon had formed ARPA five years earlier in the
aftermath of USSR's Sputnik as a fast-response research agency, charged with making sure the United
States was never again caught flat-footed. Now, ARPA wanted to set up a small research program in
"command and control". This was a critical matter in the nuclear age, and was obviously going to
involve computers. And once ARPA director Jack Ruina heard Lick expound upon his vision of
interactive, symbiotic computing, he knew he had found the right person to lead the effort. Lick was
being offered an opportunity to spend big money (over 10 million USD) in pursuit of his vision of
human-computer symbiosis. The IPTO program initiated by Lick turned out to be very successful, so
much so that it has become by now a model for other government-sponsored research programs.

At IPTO Lick formed scattered groups of researchers around the country who already shared his
dream, and nurtured their work with ARPA funding. The first major project supported by Lick was Project
MAC at MIT, founded with Lick's encouragement as a large-scale experiment in timesharing and as a
prototype for the computer utility of the future. MAC (the name stood for both "Multi-Access
Computer" and "Machine-Aided Cognition") would also incorporate Marvin Minsky's Artificial
Intelligence (AI) Laboratory. Other major sites included Stanford, where Lick was funding a new AI
group under time-sharing inventor John McCarthy; Berkeley, where he had commissioned another
demonstration of time-sharing; Rand Corp., where he was supporting development of a "tablet" for
freehand communication with a computer; and Carnegie Tech, where he was funding Allen Newell,
Herbert Simon and Alan Perlis to create a "center of excellence" for computer science. Lick had also
taken a chance on a softspoken visionary he barely knew, namely Douglas Engelbart, whose ideas on
augmenting the human intellect with computers closely resembled his own and who had been
thoroughly ignored by his colleagues. With funding from Lick, and eventually from NASA as well,
Engelbart would go on to develop the mouse, hypertext, onscreen windows and many other features of
modern software.

Lick wanted to create a community in which widely dispersed researchers could build on one
another's work instead of generating incompatible machines, languages and software. He broached
this issue in an April 1963 memo to "Members and Affiliates of the Intergalactic Computer Network".
The solution was to make it extremely easy for people to work together by linking all of ARPA's time-
sharing computers into a national system. He wrote: "If such a network as I envisage nebulously could
be brought into operation, we would have at least four large computers, perhaps six or eight small
computers, and a great assortment of disc files and magnetic tape units, not to mention the remote
consoles and teletype stations, all churning away."

Again, remarkably close to what we now call the Internet. This time, however, Lick didn't stop there. Clearly
enamored by the idea, he spent most of the rest of the memo sketching out how people might use such
a system. He described a network in which software could float free of individual machines. Programs
and data would live not on an individual computer but on the Net.

A major project on which Lick worked at BBN was a study of future libraries, granted to him by
the Council on Library Resources. It was intended to be a five-year effort beginning in November
1961, but it lasted only two years because Lick left BBN in 1962 to become director of IPTO, and even
his long-distance supervision of the project had to be terminated a year later. The final report on the
project, completed in January 1963, was deemed by the council to be an important contribution
deserving publication in book form. The book Libraries of the Future, published in 1965 by MIT Press,
presents a vision of future libraries based on Lick's vision of man-computer symbiosis.
The first half of the book is similar in structure to his earlier paper. After limiting the scope of
Libraries to the body of documents that could be stored in digital form without loss of value and
estimating its current size, Lick presents a detailed analysis of the intellectual processes involved in
the acquisition, organization, and use of knowledge. This is followed by a description of the structure
and usage of the procognitive systems he envisioned, capable of searching the body of documents
under user control. Finally, Lick outlines, as in his earlier paper, a research program intended to bring
about his vision. The second half of the book reports on the initial steps of the research program
carried out as part of the library project.

In 1964 Lick completed his two-year tour of duty at ARPA and became a consultant to the director of
research of IBM. At ARPA, program managers traditionally moved on after a year or two to give
someone else a chance, and Lick was no exception. But in September 1964, when he left ARPA for the
IBM research laboratory, he took care to find a successor who shared his vision. His choice was Ivan
Sutherland, a 26-year-old computer graphics genius from MIT's Lincoln Lab whose doctoral
project, Sketchpad, was the ancestor of today's computer-aided design software.
Lick's influence would continue to be felt at ARPA for more than a decade. Sutherland's successor in
1966 would be Robert Taylor, who shared with Lick a background in psychology and who was
probably Lick's most enthusiastic convert to the symbiosis vision. It was Taylor who would inaugurate
the actual development of Lick's proposed computer network, which began operation in 1969 as the
ARPAnet and ultimately evolved into the Internet. And it was Taylor who went on to head the
computer group at Xerox's Palo Alto Research Center (PARC), where, during the 1970s, researchers
turned Lick's notion of symbiosis into a working system. When Taylor left ARPA in 1969, he handed
the reins to ARPAnet architect Larry Roberts, another computer graphics maven, who had become
intrigued with networking after a late-night bull session with Lick.
In 1966 Lick returned to MIT as professor of electrical engineering and as director of Project MAC,
the research laboratory he had been instrumental in establishing in 1963 and which eventually became
the present Laboratory for Computer Science. In 1974 he took a year's
leave of absence to return to Washington with the Department of Defense, again as director of the
IPTO. Licklider returned to MIT from government service in 1975 and remained there until his
retirement in 1985, when he became emeritus professor.

Larry Roberts
Larry Roberts is sometimes called the "father of the ARPANET." He earned this nickname by directing
the team of engineers that created the ARPANET. Roberts was also the principal architect of the
ARPANET.

Lawrence (Larry) G. Roberts was born in 1937 in Westport, Connecticut, as the son of Elliott and
Elizabeth Roberts, who both had earned their doctorates in chemistry. During his youth, he built a
Tesla coil, assembled a television, and designed a telephone network built from transistors for his
parents' Girl Scout camp.

Roberts attended the Massachusetts Institute of Technology (MIT), where he received his bachelor's
degree (1959), master's degree (1960), and Ph.D. (1963), all in electrical engineering.

After receiving his PhD in 1963, Roberts continued to work at the MIT Lincoln Laboratory, mainly in
the field of computer graphics (see the lower image). Having read Licklider's seminal memos on the
"Intergalactic Computer Network", Roberts also started to work in the field of computer-to-computer
networks that could communicate via data packets.

In 1966, Robert Taylor assumed the directorship of ARPA's Information Processing Techniques Office
(IPTO), Licklider's old post. He noticed that IPTO research contractors were constantly requesting
more computing resources. Most of them wanted their own computers, an expensive luxury. Taylor
also noticed that there was a lot of duplication of research. This waste of resources also cost money.
Building on the theoretical legacy of Licklider, Taylor decided that ARPA should link the existing
computers at ARPA-funded research institutions together. This would allow everybody on the network
to share computing resources and results.

With the go-ahead to build a network, Taylor began looking for someone to manage the project. His
first choice was the young Larry Roberts. Roberts was a shy man who was well-respected in
his field. He was known for his good management skills and dedication to his work.

Roberts also had experience with network computing (which was a rare commodity in those days). In
1965, a psychologist named Tom Marill, who had studied under Licklider and been influenced by his
interest in computers, approached ARPA and proposed a project to conduct an experiment linking
Lincoln Lab's TX-2 computer to the SDC Q-32 computer in Santa Monica. ARPA officials thought it
was a good idea, but suggested that Marill carry out his experiment under the sponsorship of the
Lincoln Lab, which he did. Officials at the Lincoln Lab put Roberts in charge of the project. The
experiment, although much smaller in scope than the ARPANET would be, was a success. Response
times were slow and connection reliability was often poor, but Marill's project provided a solid first
step. In 1966 Roberts and Marill published a paper about their success at connecting over dial-up.

In 1966, Taylor managed to persuade Roberts to accept the position of manager and principal
architect of the ARPANET, the precursor to the Internet. Roberts designed and managed the building
of the ARPANET over the next six years. In 1967, he attended a meeting for ARPA's Principal
Investigators or PIs (scientists heading ARPA-funded research projects). The main topic was the new
networking project. Roberts laid out his plans. He wanted to connect all ARPA-sponsored computers
directly over dial-up telephone lines. Networking functions would be handled by "host" computers at
each site. This idea was not well-received. Researchers did not want to relinquish valuable computing
resources to administer this new network and did not see how they would benefit from sharing
resources with other researchers.

In 1989, Roberts recalled: "Although they knew in the back of their mind that it was a good idea and
were supportive on a philosophical front, from a practical point of view, they, Minsky and McCarthy
[two prominent computer scientists] and everybody with their own machine, wanted [to continue
having] their own machine. It was only a couple of years after they had gotten on [the ARPANET] that
they started raving about how they could now share research, and jointly publish papers, and do other
things that they could never do before."

Many also foresaw problems trying to facilitate communication between machines with many different
incompatible operating systems and languages. All in all, the reception to Roberts' plans was a cold
one.

Toward the end of the meeting, a man named Wesley Clark (the creator of the LINC computer)
handed Roberts a note that read: "You've got the network inside out." After the meeting, Roberts talked
with Clark, who suggested that Roberts employ small computers at each site to handle networking
functions and leave the host computers alone. All of the small computers could speak the same
language, which would facilitate communication between them. Each host computer would only have
to adapt its language once, in communicating with its small-computer counterpart. Each host
computer would be connected to the network via its small computer which would act as a sort of
gateway. The small computers could also remain under more direct ARPA control than were the large
host computers.

Roberts adopted Clark's idea. He called the small computers Interface Message Processors (IMPs).
Roberts decided that the network should start out with four sites: UCLA, the Stanford Research
Institute (SRI), the University of Utah, and UC Santa Barbara. This would be the core and the network
could grow from there. (SRI had been chosen as one of the first sites partly because Doug
Engelbart was there.) By the middle of 1968, Roberts sent out a request for bids to build the IMPs to
140 companies. In late December, the bidding was over. The best offer came from Bolt, Beranek and
Newman (BBN), where Licklider used to work.

Packet switching proved very controversial among communications people. Conventional opinion then
held that packet switching could never work. Many of the university computer research centers also
felt the network would steal their computing power. However, Roberts' team, in conjunction with
contractor BBN, which assembled and installed the hardware, proved both groups wrong, and the
network worked with much higher efficiency and utility than either group had imagined.

In August 1969, BBN delivered the first IMP to UCLA. A month later, the second was delivered to SRI.
The two were connected and the ARPANET was born. By 1973, 23 computers were connected
worldwide. At that point Roberts turned the development over to Bob Kahn and Vint Cerf and left
ARPA to form the first commercial packet network.

After ARPA, Dr. Roberts founded the world's first packet data communications carrier, Telenet - the
company that developed and drove adoption of the popular X.25 data protocol. Roberts was CEO from
1973 to 1980. Telenet was sold to GTE in 1979 and subsequently became the data division of Sprint.
From 1983 to 1993, Roberts was Chairman and CEO of NetExpress, an electronics company
specializing in packetized fax and ATM equipment. Roberts was president of ATM Systems from 1993
to 1998.

In 1999 Dr. Roberts undertook to redesign the IP router (not the protocol) to route flows, not just
random packets, in order to support high Quality of Service (QoS) flows across the IP network. To do this he
founded Caspian Networks which built highly capable flow routers that accomplished the goal of ATM
quality QoS compatibly over IP networks. These routers were aimed at the network core and started
deployment in 2003 for QoS sensitive applications like video conferencing and P2P traffic control.
However, this first generation of flow routers was large, expensive, and did not take advantage of
many simplifications that were possible. Thus, Dr. Roberts left Caspian in 2004 in order to create a
more efficient and economic flow management system.

In 2004 Dr. Roberts founded Anagran Inc. Realizing that the output queue design of packet switches
and routers was causing major delay, packet loss, and unfairness, he designed a new concept of
flow management where each flow is precisely rate controlled at the input rather than randomly at the
output. The flow manager concept only required 20% of the power and size of an L3 router, virtually
eliminated queuing delay and packet loss for both file transfers and streaming media, optimized
network utilization, and greatly improved fairness. Thus, instead of the prevailing concept that Quality
of Service (QoS) would increase the complexity and cost of a network, the QoS could be greatly
improved with less complexity while at the same time reducing network cost by eliminating the need
for overcapacity.
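Rate controlling each flow at the input, rather than letting packets pile up in an output queue, can be illustrated with a toy token-bucket sketch. The class name, rates, and sizes below are illustrative only, not Anagran's actual design:

```python
import time

class FlowRateController:
    """Toy per-flow token bucket applied at the input side of a switch."""

    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s
        self.burst = burst_bytes
        self.tokens = burst_bytes          # start with a full bucket
        self.last = time.monotonic()

    def admit(self, packet_len: int) -> bool:
        # Refill tokens for the time elapsed since the last packet,
        # capped at the configured burst size.
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_len <= self.tokens:
            self.tokens -= packet_len
            return True
        # Held back at the input instead of piling up in an output queue.
        return False
```

Because a flow that exceeds its rate is stopped before it ever enters the switch fabric, no output queue builds up, which is the source of the delay and loss reduction described above.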

Today, Roberts and Kleinrock, along with Vinton Cerf and Robert Kahn, are widely recognized as the
four founding fathers of the Internet.

Dr. Roberts has received a number of awards for his work in computer networking. Today he lives in
Silicon Valley, California.

Charles Goldfarb: the Godfather of Markup Languages

The idea of markup languages was apparently first publicly presented by the engineer William W.
Tunnicliffe (1922-1996) from Washington, D.C. In September of 1967, during a meeting at the
Canadian Government Printing Office, Tunnicliffe gave a presentation on the separation of
information content of documents from their format. In the 1970s, Tunnicliffe led the development of
a standard called GenCode for the publishing industry and later became the first chair of the
International Organization for Standardization committee on computer languages for the processing
of text. At almost the same time, in the late 1960s, the book designer Stanley Rice, then an editor at
a major publishing house, published speculation along similar lines, writing about standardized
"editorial structures". This was the beginning of a movement to separate the formatting of a
document from its content.

In 1969 Charles F. Goldfarb, a graduate of Harvard Law School and Columbia College, hit upon the
basic idea of markup languages while working on a primitive document management system intended
for law firms. At the end of that same year, 1969, leading a small team at IBM, he developed the first
markup language, called Generalized Markup Language, or GML. Goldfarb later explained that he
chose the name GML to match the initials of the three researchers, Charles Goldfarb, Ed Mosher and
Ray Lorie, who worked on the project. Goldfarb was also the man who coined the term "markup
language."

Goldfarb felt that GML should both describe the document's structure and be structured in such a way
that it could be both human-readable and machine-readable. In the beginning of the 1970s he continued
his work at IBM as GML began to grow in popularity. Several years later, in 1974, and with the
influence of hundreds of people, the next version of the language, called (surprise:-) Standard
Generalized Markup Language (SGML), was born. SGML added concepts that were not part of the
GML project, such as link processing, concurrent document types and, most importantly, the concept
of a validating parser (called ARCSGML) that could read and check the accuracy of the markup code.
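The descriptive idea behind GML can be seen in a small fragment written in the style of IBM's later GML "starter set" (the tag names follow that style; the document content is invented for illustration):

```
:h1.Automated Law-Office Application
:p.Each element is tagged by its role in the document, not by its
appearance, so every element tagged the same way can be processed
the same way.
:ul.
:li.One formatter may print headings in large bold type.
:li.Another may collect them into a table of contents.
:eul.
```

Because the tags say what each element *is* rather than how it should look, the same source can drive many different applications, which was exactly the point of Ray Lorie's insight quoted below.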

As markup languages go, SGML was powerful, flexible and complex, and was used extensively in
processing IBM's huge body of documentation. It is widely known that when, at the end of the 1980s,
Tim Berners-Lee and Robert Cailliau created HTML, they based their hypertext publishing
language on SGML. HTML, as an application of SGML, is on the other hand easy to learn, but not
nearly as powerful. Their system used a NeXT computer and incorporated the concept of hyperlinks.
Tim realized the need for a markup language that was easy to use and to implement in their system. In
1991 the Web debuted on the Internet, and it was the simplicity of HTML that made the Web grow at a
feverish pace.

As Goldfarb later recalled in an interview:


We were trying to do an automated law-office application. I had been a lawyer (in fact, I still am).
Lawyers must do research on existing case law, decisions of court, and so on, to find out which ones
are applicable to a given situation, find out what the previous legal rulings have been, and then
merge that with text that the lawyer has written himself. Eventually, if it's, say, a brief for the court,
he must then compose it and print it. At the time, which was 1969 or 1970, there weren't any systems
available that did these three things. So in order to get the systems to share the data we had to come
up with a way to represent it that was independent of any of those applications.
It was a very small research project. There was initially myself and another researcher, Ed Mosher,
working on it full time. Then, we had part-time consulting from a very brilliant fellow named Ray
Lorie who is also one of the pioneers of relational databases. Ray had the most brilliant insight into
the whole thing, which is that all the elements that are tagged the same way should be processed the
same way. Our manager, Andy Symonds, contributed technically as well...

In 1975, Goldfarb moved from Cambridge, Massachusetts to Silicon Valley and became a product
planner at the IBM Almaden Research Center. There, he convinced IBM's executives to deploy GML
commercially in 1978 as part of IBM's Document Composition Facility product. Development
informally began that year on what ultimately became the SGML standard, and Goldfarb eventually
became chair of the SGML committee. SGML was standardized and released by ISO in 1986.

Maturing of the Net

This section presents some of the people and events that established the foundations of modern
Internet tools and technologies.

First chat program of Murray Turoff


In 1971 Murray Turoff (born in San Francisco, February 13, 1936), a Ph.D. in physics, while working
in the U.S. Office of Emergency Preparedness (a federally coordinated system that augments the
Nation's medical response capability), designed the Emergency Management Information Systems
And Reference Index (EMISARI), which appears to be the first multi-machine chat system. The
original purpose of the system was to help exchange information on opinion surveys between
networking experts and academics in geographically distributed locations, which could help the
government respond to emergencies. This system and others Turoff designed (RIMS, IRIS, PREMIS)
were used for a decade and a half by the U. S. Government for crisis management monitoring of most
economic disruptions (e.g., coal strikes, transportation strikes) and commodity shortages (e.g., oil,
gas, fertilizer). That system also provided the first example of instant messaging in an operational
environment.

In 1971, EMISARI was put to one of its first practical uses: to coordinate policy information for U.S.
President Nixon's wage and price control program to fight high inflation. Users of EMISARI logged in
to the system through teletypewriter terminals linked to a central computer, using long-distance
phone lines.

The EMISARI chat functionality was called the Party Line, and was originally developed to replace
telephone conferences which might have 30 or so participants, but where no-one could effectively
respond and take part in a meaningful discussion. Party Line had a range of useful features familiar to
users of modern chat systems, such as the ability to list the current participants, and the invocation of
an alert when someone joined or left the group.

EMISARI was written under EXEC VIII on a UNIVAC computer which had newly developed
multiprocessing capabilities, making possible the new, interactive functionality that Turoff designed.
As with current online chat rooms, the system showed a list of participants and gave alerts when
someone joined or left the chat. As many as 300 experts were set up to coordinate how the country
would respond. EMISARI continued to be used by the US Government for management of emergency
situations until 1986.

EMISARI had more features than many conferencing systems developed thirty years later, including
real-time voting, data collection assignment and reporting, and discussion threads for individual
database elements.

The First E-mail Message of Ray Tomlinson


Raymond (Ray) Samuel Tomlinson (born 1941 in Amsterdam, New York) is a graduate of Rensselaer
Polytechnic Institute (the oldest technological university in the USA) and a long-time employee of the
company Bolt, Beranek and Newman, which had won the contract to create ARPANET, the
predecessor of Internet.

Tomlinson started his work on the ARPANET, developing the Network Control Protocol (NCP), the
predecessor of TCP/IP, for a time-sharing system called TENEX, as well as network programs, such as
an experimental file transfer program (called CPYNET).

During the summer and autumn of 1971, he was making improvements to the local inter-user mail
program (called SNDMSG). Single-computer electronic mail had existed since at least the early 1960s,
and SNDMSG was an example of that. SNDMSG allowed a user to compose, address, and send a
message to other users' mailboxes on the same computer. Tomlinson hit on the idea to merge an intra-
machine message program with another program developed for transferring files among the remote
ARPANET computers.

There were other people, who developed similar programs. Richard W. Watson, for example, thought
of a way to deliver messages and files to numeric printers at remote sites. He filed his "Mail Box
Protocol" as a draft standard under RFC 196 in July 1971, but the protocol was never implemented.

In contrast, SNDMSG sent messages to named individuals (computer users), but only to users working
on the same computer. Tomlinson decided to improve it, in order to send messages to users at remote
computers as well.

Tomlinson treated the mailbox as a file with a particular name. Users could write more data
onto the end of the mailbox, but they couldn't read or overwrite what was already there. Tomlinson's
idea was to use CPYNET to append material to a mailbox file just as readily as SNDMSG could.
SNDMSG could easily incorporate the code from CPYNET and direct messages through a network
connection to remote mailboxes, in addition to appending messages to local mailbox files.

The missing piece was that the experimental CPYNET protocol had no provision for appending to a
file. It could just send and receive files. Tomlinson had to make a minor addition to the protocol and
to incorporate the CPYNET code into SNDMSG.

The next problem was to provide a way to distinguish local mail from network mail. Tomlinson chose to
append an at sign (@) and the host name to the user's (login) name. He chose the at sign because its
purpose in English is to indicate a unit price (for example, "10 items @ $1.95" means 10 items at a
price of $1.95 each). Besides that, at signs didn't appear in names, so there would be no ambiguity
about where the separation between login name and host name occurred. The at sign also had no
significance in any of the editors that ran on TENEX. Thus he used the at sign to indicate that the
user was "at" some other host rather than being local.
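Tomlinson's convention survives in every email address today: everything after the last at sign names the host, everything before it names the user. A minimal sketch (the function name and example address are illustrative):

```python
def split_address(address: str) -> tuple[str, str]:
    # rpartition splits at the *last* "@", so the separation between
    # login name and host name is always unambiguous.
    user, sep, host = address.rpartition("@")
    if not sep or not user or not host:
        raise ValueError(f"not a user@host address: {address!r}")
    return user, host

print(split_address("tomlinson@bbn-tenexb"))  # ('tomlinson', 'bbn-tenexb')
```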

The first message using the new command was sent at the end of 1971 between two
machines that were literally side by side (see the lower photo) and ran the TENEX time-sharing
monitor. The only physical connection they had was through the ARPANET. In the foreground of the
photo is the computer BBN-TENEXA (BBNA). In the background is the computer BBN-TENEXB
(BBNB) from which the first email was sent. On the left, foreground, is the Teletype KSR-33 terminal,
on which the first email was printed. Immediately behind and largely obscured is another KSR-33
terminal, on which the first email was typed. BBNA was a Digital Equipment Corporation KA10 (PDP-
10) computer with 64K (36-bit) words of (real magnetic) core memory. BBNB was a smaller machine
with only 48K words.

Tomlinson sent a number of test messages to himself from one machine to the other. When the
inventor was satisfied that the program seemed to work, he sent a message to the rest of the group,
explaining how to send messages over the network. Thus the first use of network email announced its
own existence.

The next release of TENEX went out in early 1972 and included the version of SNDMSG with network
mail capabilities. The CPYNET protocol was soon replaced with a real file transfer protocol having
specific mail handling features. Later, a number of more general mail protocols were also developed.

In 1996, for the first time, more electronic mail than postal mail was sent in the USA. In 2003, for
the first time, spam accounted for over one-half of all e-mail sent. In 2010, the number of emails sent
reached 107 trillion (sadly, 89.1% of them were spam). In 2012 the total number of email
accounts was 3.3 billion, and it is expected to grow to over 4.3 billion accounts by 2016.

Michael Hart and Project Gutenberg: the first digital library

In the beginning of the 1970s came a time that had been dreamed of by pioneers like Herbert Wells in
his World Brain and Paul Otlet in his Universal Network for Information and Documentation:
time to establish something like a giant world library, which contains all human knowledge and is
easily accessible all over the world. We have the storage (computers) and we have the
communication medium (the Internet), so how can we start on this giant task? The easiest way is to
collect in one place all the books published by human beings.

And who was the first?

Michael Stern Hart (born 8 March 1943, in Tacoma, Washington) is the founder of the first digital
library project, Project Gutenberg, which makes electronic books freely available via the Internet.

Studying at the University of Illinois (USA), in July 1971, Hart managed to get access to a Xerox
Sigma V mainframe computer in the university's Materials Research Lab. (Hart's brother's best
friend was the mainframe operator, so he helped). This particular computer was one of the 15 nodes
on the computer network that would become the Internet. Although the focus of computer use there
tended to be data processing, Hart was aware that it was connected to a network and chose also to use
his computer time for information distribution. He received an account with a virtually unlimited
amount of computer time (not bad to have a friend who is the computer operator:-); its value at that
time has since been variously estimated at $100,000 or even $100,000,000. Hart has said he wanted to give
back this gift by doing something that could be considered to be of great value. His initial goal was to
make the 10,000 most consulted books available to the public at little or no charge, and to do so by the
end of the 20th century.

Hart related that after his account was created on July 4, 1971, he had been trying to think of what to
do with it and had seized upon a copy of the United States Declaration of Independence, which he had
been given at a grocery store on his way home from watching fireworks that evening. He typed the text
into a teletype machine but was unable to transmit it via e-mail: sending it to everyone would have
crashed the system, so the file had to be downloaded individually instead.

This was the beginning of Project Gutenberg.

The mission statements for the project, formulated later, was:

Encourage the Creation and Distribution of eBooks

Help Break Down the Bars of Ignorance and Illiteracy

Give As Many eBooks to As Many People As Possible

Most of the early postings were typed in personally by Hart, as was the case with the Declaration. He
began posting text copies of such classics as the Bible, the works of Homer, Shakespeare, and Mark
Twain. By 1987 he had typed in a total of 313 books in this fashion. Then, through being involved in
the University of Illinois PC User Group and with assistance from Mark Zinzow, a programmer at the
school, Hart was able to recruit volunteers and set up an infrastructure of mirror sites and mailing
lists for the project. With this the project was able to grow much more rapidly.

Today Project Gutenberg is available at www.gutenberg.org. Its e-texts are produced (usually
scanned) by the Project's many volunteers. The collection includes public domain works and copyrighted
works included with express permission. As of December 2009, Project Gutenberg claimed over
30,000 items in its collection (most of the books are in English, but there are also books in French,
German, Finnish, Dutch, Chinese, Portuguese, etc.), primarily works of literature from the
Western cultural tradition. In addition to literature such as novels, poetry, short stories and drama,
Project Gutenberg also has cookbooks, reference works and issues of periodicals. The Project
Gutenberg collection also has a few non-text items such as audio files and music notation files. It is
affiliated with many projects that are independent organizations which share the same ideals, and
have been given permission to use the Project Gutenberg trademark.

First computer virus of Bob Thomas

The most important feature of a computer virus is its ability to self-replicate (in a sense every
self-replicating program can be called a virus). The idea of self-replicating programs can be traced back as
early as 1949, when the mathematician John von Neumann envisioned specialized computers, or
self-replicating automata, that could build copies of themselves and pass on their programming to their
progeny.

If a computer virus has the ability to self-replicate over a computer network, e.g. the Internet, it is
called a worm.

It is not known who created the first self-replicating program in the world, but it is clear that the first
worm in the world (so called the Creeper worm) was created by the BBN engineer Robert (Bob) H.
Thomas probably around 1970.

The company BBN Technologies (originally Bolt, Beranek and Newman) is a high-technology
company, based in Cambridge, Massachusetts, which played an extremely important role in the
development of packet switching networks (including the ARPANET and the Internet).

A number of well-known computer luminaries have worked at BBN, including Robert Kahn, J. C. R.
Licklider, Marvin Minsky, Ray Tomlinson, etc. Among them was the researcher Robert H. (Bob)
Thomas, who worked in a small group of programmers developing a time-sharing system
called TENEX, which ran on the Digital PDP-10.

Let's clarify: the Creeper wasn't a real virus, not only because the notion of a computer virus didn't
exist in the 1970s, but also because it was actually an experimental self-replicating program, designed
not to damage, but to demonstrate a mobile application.

Creeper ran on the old TENEX operating system (the OS which saw the first email programs,
SNDMSG and READMAIL, in addition to the first use of the "@" symbol in email addresses) and used the
ARPANET (predecessor of the current Internet) to infect DEC PDP-10 computers running
TENEX. Creeper caused infected systems to display the message "I'M THE CREEPER : CATCH ME IF
YOU CAN."

The Creeper would start to print a file, but then stop, find another TENEX system, open a connection,
pick itself up and transfer to the other machine (along with its external state, files, etc.), and then start
running on the new machine, displaying the message. The program rarely, if ever,
actually replicated itself; rather, it jumped from one system to another, attempting to remove itself
from previous systems as it propagated forward. Thus Creeper didn't install multiple instances of itself
on several targets; it just moseyed around the network.

It is uncertain how much damage (if any) the Creeper actually caused. Most sources say the worm was
little more than an annoyance. Some sources claim that Creeper replicated so many times that it
crowded out other programs, but the extent of the damage is unspecified. In any case, it immediately
revealed the key problem with such worm programs: controlling the worm.

The Creeper program led to further work, including a version by a colleague of Thomas, Ray
Tomlinson, that not only moved through the net, but also replicated itself at times. To counter
this enhanced Creeper, the Reaper program was created, which moved through the net, replicating
itself, and tried to find copies of Creeper and log them out. Thus, if Creeper was the first virus,
then Reaper was the first anti-virus software.

Note from the author (Georgi Dalakov):


After composing this article, I referred to Mr. Ray Tomlinson with an appeal for comment. He was
kind enough to provide one, as follows:

Your description agrees with my recollection, though I think it was somewhat later than 1970 and I
don't recall some of the details you give, such as printing a file as evidence of its presence on a
particular machine (though it must have done something to indicate its progress). I do recall
making the modifications you indicate and thinking of it as the escalation of an arms race.
There was a server (or daemon or background process) (RSEXEC, I think it was called) running on
the individual machines that supported this activity. That is, the creeper application was not
exploiting a deficiency of the operating system. The research effort was intended to develop
mechanisms for bringing applications to other machines with intention of moving the application to
the most efficient computer for its task. For example, it might be preferable to move the application
to the machine having the data (as opposed to bringing the data to the applications). Another use
would be to bring the application to a machine that might have spare cycles because it is located in a
different timezone where local users are not yet awake. The CREEPER application was a
demonstration of such a mobile application.

TCP/IP

The most popular network protocol in the world, the TCP/IP protocol suite, was designed in the 1970s by
two DARPA scientists, Vint Cerf and Bob Kahn, the persons most often called the fathers of the Internet.

Vinton Gray "Vint" Cerf (born June 23, 1943 in New Haven, Connecticut) obtained his B.S. in Math
and Computer Science at Stanford University in 1965 and went to IBM, where he worked for some two
years as a systems engineer, supporting QUIKTRAN, a system to make time-shared computing more
economical and widely available for scientists, engineers and businessmen.

In 1967 he left IBM to attend graduate school at the University of California, Los Angeles (UCLA), where
he earned his master's (in 1970) and PhD (1972) degrees in Computer Science. During his graduate
student years, he studied under Professor Gerald Estrin and worked in Leonard Kleinrock's data packet
networking group, which connected the first two nodes of the ARPANet, the predecessor to the Internet.
He worked as a Principal Programmer, participating in a number of projects, including the ARPANet
Network Measurement Center, a video graphics project including a computer-controlled 16 mm
camera, and the development of ARPANet host protocol specifications.

While at UCLA, he also met Bob Kahn, who was working on the ARPANet hardware architecture
at Bolt Beranek and Newman.

Robert Elliot Kahn (born December 23, 1938) received a B.E.E. degree from the City College of New
York in 1960, and M.A. and Ph.D. degrees from Princeton University in 1962 and 1964 respectively.

After graduation, he received a position on the Technical Staff at Bell Labs and then became an
Assistant Professor of Electrical Engineering at MIT. He took a leave of absence from MIT to join Bolt
Beranek and Newman, where he was responsible for the system design of the ARPANet, the first
packet-switched network, and was involved in the building of the Interface Message Processor.

In 1972, Kahn was hired by Larry Roberts at the IPTO to work on networking technologies, and in
October he gave a demonstration of an ARPANet network connecting 40 different computers at the
International Computer Communication Conference, making the network widely known for the first
time to people from around the world and convincing communication engineers that packet switching
was a real technology.

At the IPTO, Kahn worked on an existing project to establish a satellite packet network, and initiated a
project to establish a ground-based radio packet network. These experiences convinced him of the
need for development of an open-architecture network model, where any network could communicate
with any other independent of individual hardware and software configuration. Kahn therefore set
four goals for the design of what would become the Transmission Control Protocol (TCP):

Network connectivity. Any network could connect to another network through a gateway.

Distribution. There would be no central network administration or control.

Error recovery. Lost packets would be retransmitted.

Black box design. No internal changes would have to be made to a network to connect it to other
networks.

In the spring of 1973, Vinton Cerf joined Kahn on the project. They started by conducting research on
reliable data communications across packet radio networks, factored in lessons learned from the
Network Control Protocol (NCP), and then created the next-generation Transmission Control Protocol
(TCP), the standard protocol used on the Internet today.

In the early versions of this technology, there was only one core protocol, which was named TCP. And
in fact, these letters didn't even stand for what they do today, Transmission Control Protocol; they
stood for Transmission Control Program. The first version of this predecessor of modern TCP was
written in 1973, then revised and formally documented in RFC 675, Specification of Internet
Transmission Control Program, from December 1974.

What is the current status of the Internet Protocol Suite (commonly known as TCP/IP)?

It is the set of communications protocols used for the Internet and other similar networks. It is named
after two of the most important protocols in it: the Transmission Control Protocol (TCP) and the
Internet Protocol (IP), which were the first two networking protocols defined in this standard. Today's
IP networking represents a synthesis of several developments that began to evolve in the 1960s and
1970s, namely the Internet and LANs (Local Area Networks), which emerged in the mid- to late-
1980s, together with the advent of the World Wide Web in the early 1990s.

The design of the network included the recognition that it should provide only the functions of
efficiently transmitting and routing traffic between end nodes and that all other intelligence should be
located at the edge of the network, in the end nodes. Using a simple design, it became possible to
connect almost any network to the ARPANet, irrespective of their local characteristics. One popular
saying has it that TCP/IP, the eventual product of Cerf and Kahn's work, will run over two tin cans
and a string.

A computer or device called a router (a name changed from gateway to avoid confusion with other
types of gateways) is provided with an interface to each network, and forwards packets back and forth
between them. Requirements for routers are defined in RFC 1812.

DARPA then contracted with BBN Technologies, Stanford University, and University College
London to develop operational versions of the protocol on different hardware platforms. Four versions
were developed: TCP v1, TCP v2, a split into TCP v3 and IP v3 in the spring of 1978, and then stability
with TCP/IP v4, the standard protocol still in use on the Internet today.

In 1975, a two-network TCP/IP communications test was performed between Stanford and University
College London (UCL). In November, 1977, a three-network TCP/IP test was conducted between sites
in the US, UK, and Norway. Several other TCP/IP prototypes were developed at multiple research
centers between 1978 and 1983. The migration of the ARPANet to TCP/IP was officially completed on
January 1, 1983, when the new protocols were permanently activated.

In March 1982, the US Department of Defense declared TCP/IP as the standard for all military
computer networking. In 1985, the Internet Architecture Board held a three day workshop on TCP/IP
for the computer industry, attended by 250 vendor representatives, promoting the protocol and
leading to its increasing commercial use.

The Internet Protocol Suite, like many protocol suites, may be viewed as a set of layers. Each layer
solves a set of problems involving the transmission of data, and provides a well-defined service to the
upper layer protocols based on using services from some lower layers. Upper layers are logically closer
to the user and deal with more abstract data, relying on lower layer protocols to translate data into
forms that can eventually be physically transmitted.

The TCP/IP model consists of four layers, as described in RFC 1122. From lowest to highest, these
are the Link Layer, the Internet Layer, the Transport Layer, and the Application Layer. It should be
noted that this model was not intended to be a rigid reference model into which new protocols have to
fit in order to be accepted as a standard.

Almost all operating systems in use today, including all consumer-targeted systems, include a TCP/IP
implementation.
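The layering is easy to see from application code: a program asks the Transport Layer for a reliable
byte stream (TCP, via the BSD socket API), while the Internet and Link Layers below it are handled
entirely by the operating system. A minimal sketch in Python; the loopback address, port choice and
message are illustrative, not taken from any of the historical documents above:

```python
import socket
import threading

def echo_server(server_sock):
    """Accept one connection and echo whatever the client sends back to it."""
    conn, _addr = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# The application only requests a reliable byte stream (SOCK_STREAM = TCP);
# IP routing and the link layer beneath it are invisible at this level.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=echo_server, args=(server,))
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"two tin cans and a string")
reply = client.recv(1024)
client.close()
t.join()
server.close()

print(reply.decode())
```

The same client code works unchanged whether the two endpoints share one machine or sit on
different continents, which is exactly the "black box design" goal Kahn set out.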

The World Wide Web of Tim Berners-Lee

Tim Berners-Lee used to say: "I just had to take the hypertext idea and connect it to the TCP and DNS
ideas and, ta-da!, the World Wide Web."

So how did this "simple" invention happen?

In March, 1989, a physicist and computer nerd at CERN (the European Particle Physics Laboratory in
Geneva, Switzerland), Timothy John "Tim" Berners-Lee (born 8 June 1955 in London), submitted to
his boss a proposal for an information management system, the prototype of the now
ubiquitous World Wide Web. The boss was not very impressed. "Vague, but exciting" were the words
that he wrote on the proposal, thus unofficially allowing Berners-Lee to continue his work on the WWW
(actually the term World Wide Web would be settled on the next year; in 1989 Berners-Lee called his
system Mesh).

Berners-Lee already had experience with hypertext systems, including his own. During his first stay
at CERN, in 1980, inspired by the MEMEX of Vannevar Bush, Project Xanadu of Ted
Nelson and NLS of Douglas Engelbart, he proposed a project for a new document system, based on
the concept of hypertext, designed to facilitate sharing and updating information among researchers.
The system was called ENQUIRE and was written in the Pascal programming language on a NORD-10
(a 16-bit minicomputer from Norsk Data, running the SINTRAN III operating system); later the
program was ported to the PC, then to VMS (see the original proposal for ENQUIRE).

Berners-Lee's inspiration came from the frustrating fact that there was a lot of different data on
different computers, but it was not connected at all. Because people at CERN came from universities
all over the world, they brought with them all types of computers. Not just Unix, Mac and PC: there
were all kinds of big mainframe computers and medium sized computers running all sorts of software.
One had to log on to different computers to get at it, and sometimes one even had to learn a different
program on each computer. So finding out how things worked was a really difficult task.

Berners-Lee wanted to write programs to take information from one system and convert it so that it
could be inserted into another system. He had to do this more than once. The big question was: "Can't
we convert every information system so that it looks like part of some imaginary information system
which everyone can read?" And that became the WWW.

In 1990, with the help of his colleague at CERN, Robert Cailliau, Berners-Lee produced a revision
of the system, which was accepted by his manager. Berners-Lee coded the first Web browser, which
also functioned as an editor (the name of the program was WorldWideWeb, running on
the NeXTSTEP operating system), and the first Web server, CERN HTTPd (short for HyperText
Transfer Protocol daemon), both running on a NeXTcube workstation.

The first Web site in the world (with the DNS name info.cern.ch) was put online on 6 August
1991.
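The division of labor Berners-Lee established, a browser speaking HTTP to a server daemon that
answers with hypertext, can be sketched with Python's standard library. The page content and
handler here are invented stand-ins, not a reconstruction of CERN HTTPd:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Invented placeholder document; the first real page described the WWW project itself.
PAGE = b"<html><body>The WorldWideWeb project</body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server daemon's whole job: answer a GET with a hypertext document.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), Handler)   # port 0: OS picks a free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" side: fetch the document over HTTP and render (here, print) it.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
    body = resp.read()
server.shutdown()

print(body.decode())
```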

In the 1990s the WWW gradually became the prevalent technology on the Internet and a global
information medium. Today, the Web and the Internet allow connectivity from literally everywhere on
earth, even ships at sea and in outer space. In 2011 the number of web-sites exceeded 300 million. As of
April, 2012, the Indexed Web contains at least 8.02 billion pages.

In the beginning of the century, new ideas for sharing and exchanging content ad hoc, such as
Weblogs and RSS, rapidly gained acceptance on the Web. This new model for information exchange,
called Web 2.0, primarily features DIY (Do It Yourself, a term used to describe building, modifying,
or repairing something without the aid of experts or professionals) user-edited and user-generated
websites, as well as various other content like video and audio media (YouTube), microblogging
(Twitter), etc.

What is the future of the WWW? What will the next version, Web 3.0, look like?

Tim Berners-Lee's vision of the future Web as a universal medium for data, information,
and knowledge exchange is connected with the term Semantic Web. In 1999 he wrote: "I have a dream
for the Web in which computers become capable of analyzing all the data on the Web: the content,
links, and transactions between people and computers. A Semantic Web, which should make this
possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and
our daily lives will be handled by machines talking to machines. The intelligent agents people have
touted for ages will finally materialize." And later, in 2006, he added: "People keep asking what Web 3.0 is.
I think maybe when you've got an overlay of scalable vector graphics, everything rippling and folding
and looking misty, on Web 2.0 and access to a semantic Web integrated across a huge space of data,
you'll have access to an unbelievable data resource."

Internet conquers the world


Why did the Internet become such a ubiquitous technology and conquer the world? What are the "killer"
applications and ideas which allowed this to happen? Who are the smartest guys, who got the inspiration
and energy to create such applications?

IRC (Internet Relay Chat) of Jarkko Oikarinen

It was already mentioned on this site that the first chat program in the world (EMISARI) was
designed in 1971 by Murray Turoff. EMISARI however was used mainly for government and
educational purposes and never became popular.

The program, which gave birth to the modern extremely popular chat movement was the Internet
Relay Chat (IRC) of Jarkko Oikarinen.

During the summer of 1988, Jarkko Oikarinen (born 16 August 1967, in Kuusamo, Finland), a 2nd-year
student in the Department of Electrical Engineering at the University of Oulu, Finland, was
working at the university's Department of Information Processing Science, where he administered the
department's Sun Unix server "tolsun.oulu.fi", which ran a public access BBS (bulletin board system)
called OuluBox.

The work of server administration didn't take all his time, so Jarkko started writing a communication
program, which was meant to make OuluBox a little more usable. Partly inspired by Jyrki Kuoppala's
"rmsg" program for sending messages to people on other machines, and partly by Bitnet Relay Chat,
Oikarinen decided to improve the existing multi-user chat program on OuluBox called MultiUser
Talk (MUT) (which had a bad habit of not working properly), itself based on the basic talk program
then available on Unix computers. He called the resulting program IRC (for Internet Relay Chat), and
first deployed it at the end of August, 1988.

When IRC started occasionally having more than 10 users (the first IRC server was the above-mentioned
tolsun.oulu.fi), Jarkko asked some friends at Tampere University of Technology and
Helsinki University of Technology to start running IRC servers to distribute the load. Some other
universities soon followed. Markku Järvinen made the IRC client program more usable by including
support for Emacs editor commands, and before long IRC was in use across Finland on the Finnish
network FUNET, and then on the Scandinavian network NORDUNET.

In 1989 Oikarinen managed to get an account on the legendary machine "ai.ai.mit.edu" at MIT,
from which he recruited the first IRC users outside Scandinavia and arranged the launch of
the first IRC server outside Scandinavia. Two other IRC servers soon followed, at the University of
Denver and at Oregon State University ("orion.cair.du.edu" and "jacobcs.cs.orst.edu" respectively). Their
administrators emailed Jarkko and obtained connections to the Finnish IRC network to create a
transatlantic connection, and the number of IRC servers began to grow quickly across both North
America and Europe.

IRC became well known to the general public around the world in 1991, when its use skyrocketed as a
lot of users logged on to get up-to-date information on Iraq's invasion of Kuwait, through a functional
IRC link into the country that stayed operational for a week after radio and television broadcasts were
cut off.

The Internet Relay Chat Protocol was defined in May, 1993, in RFC 1459 by Jarkko Oikarinen and
Darren Reed. It was mainly described as a protocol for group communication in discussion forums,
called channels, but it also allows one-to-one communication via private message, as well as chat and
data transfers via Direct Client-to-Client.
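The RFC 1459 wire format is simple enough to parse in a few lines: an optional prefix introduced by
":", then a command, then parameters, the last of which may be a trailing parameter (introduced by
" :") that can contain spaces. A sketch of such a parser; the sample messages are invented, not taken
from any real IRC session:

```python
def parse_irc_line(line):
    """Split one IRC protocol line into (prefix, command, params) per RFC 1459."""
    prefix = None
    if line.startswith(":"):            # optional prefix: the sender of the message
        prefix, line = line[1:].split(" ", 1)
    if " :" in line:                    # trailing parameter may contain spaces
        line, trailing = line.split(" :", 1)
        params = line.split() + [trailing]
    else:
        params = line.split()
    command, params = params[0], params[1:]
    return prefix, command, params

# A private message to a channel, as a server would relay it (invented example):
msg = ":alice!alice@example.org PRIVMSG #history :Creeper was first!"
print(parse_irc_line(msg))
```

This minimal grammar is a large part of why IRC servers were cheap enough to run that universities
could volunteer machines to distribute the load.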

As of the end of 2009, the top 100 IRC networks served more than half a million users at a time, with
hundreds of thousands of channels, operating on a total of some 1500 servers worldwide.

Archie of Alan Emtage


The Internet's first search engine, the Archie system, was created in 1989 by Alan Emtage, a student
at McGill University in Montreal, Canada.

Emtage (born November 27, 1964, in Barbados) conceived the first version of Archie, which was
actually a pre-Web internet search engine for locating material in public FTP archives.

A native of Barbados, Alan attended high school at Harrison College from 1975 to 1983 (in 1981
becoming the owner of a Sinclair ZX81 with 1K of memory), where he graduated at the top of his class,
winning the Barbados Scholarship. Alan was always crazy about computers, and while a student at
Harrison College he tossed around a number of other career choices, including meteorology and
organic chemistry, but chose computer science.

In 1983 Alan entered McGill University in Montreal, Canada, to study for a Bachelor's degree in
computer science. In 1987 he continued his study for a Master's degree, which he obtained in 1991. He
was part of the team that brought the first Internet link to eastern Canada (and only the second link in
the country) in 1986.

In 1989, while a student and working as a systems administrator for the School of Computer Science,
Alan conceived and implemented the original version of the Archie search engine, the world's first
Internet search engine and the start of a line which leads directly to today's giants Yahoo and Google.
(The name Archie stands for "archives" without the "v", not the kid from the comics.)

Working as a systems administrator, Alan was responsible for locating software for the students and
staff of the faculty. The necessity for searching information became the mother of invention.

He decided to develop a set of programs that would go out and look through the repositories of
software (public anonymous FTP (File Transfer Protocol) sites) and build basically an index of the
available software, a searchable database of filenames. One thing led to another, and word got out that
he had an index available, and people started writing in and asking if he could search the index on
their behalf.

As a result, rather than doing it himself, he wrote software that allowed them to come in and search
the index themselves. That was the beginning.
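Archie's core idea, a periodically rebuilt index of filenames harvested from anonymous FTP listings,
searched by substring, can be sketched in a few lines. The host names and file lists below are
invented stand-ins for real FTP site listings:

```python
# Invented stand-in for directory listings fetched from anonymous FTP sites.
FTP_LISTINGS = {
    "ftp.example.edu": ["pub/gnu/emacs-18.59.tar.Z", "pub/tex/dvips.tar.Z"],
    "archive.example.ca": ["mirrors/gnu/gcc-1.42.tar.Z", "pub/games/rogue.shar"],
}

def build_index(listings):
    """Map each bare filename to the (host, path) pairs where it can be found."""
    index = {}
    for host, paths in listings.items():
        for path in paths:
            filename = path.rsplit("/", 1)[-1]
            index.setdefault(filename, []).append((host, path))
    return index

def search(index, term):
    """Case-insensitive substring match over filenames, like an Archie query."""
    term = term.lower()
    return {name: locations
            for name, locations in index.items()
            if term in name.lower()}

index = build_index(FTP_LISTINGS)
for name, locations in sorted(search(index, "gcc").items()):
    print(name, locations)
```

The separation matters: the slow harvesting runs offline and only the cheap dictionary lookup
happens per query, which is why one machine at McGill could serve users worldwide.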

It seems that the administration of the university was the last to find out about what Alan had done.
As Alan remembered: "We had no permission from the school to provide this service; and as a matter
of fact, the head of our department found out about it for the first time by going to a conference.
Somebody went up to him and said they really wanted to congratulate him for providing this service
and he graciously smiled, said 'You're welcome', and went back to McGill and said 'what the hell is all
of this? I have no idea what they're talking about'."

"That was a once in a lifetime opportunity. It was largely being in the right place at the right time with
the right idea. There were other people who had similar ideas and were working on similar projects, I
just happened to get there first."

Archie is considered the original search engine, and many of the techniques that Emtage and the
people who worked with him on Archie came up with are basically the same techniques that Google,
Yahoo! and all the other search engines use.

Later Alan and his colleagues developed various versions that allowed them to split up the service so
that it would be available at other universities rather than taxing the facility at McGill.

In 1992, Emtage along with Peter Deutsch formed Bunyip Information Systems, the world's first
company expressly founded for and dedicated to providing Internet information services, with a
licensed commercial version of the Archie search engine used by millions of people worldwide.

Emtage was a founding member of the Internet Society and went on to create and chair several
Working Groups at the Internet Engineering Task Force, the standard-setting body for the Internet.
Working with other pioneers such as Tim Berners-Lee, Marc Andreessen, Mark McCahill (creator of
Gopher) and Jon Postel, Emtage co-chaired the Uniform Resource Identifier (URI) Working Group
which created and codified the standard for Uniform Resource Locators (URLs).

Emtage is currently Chief Technical Officer at Mediapolis, Inc., a web engineering company in New
York City. Besides computers, traveling and photography are his passions. He has been sky-diving in
Mexico, hang-gliding in Brazil, diving in Fiji, hot-air ballooning in Egypt and white-water rafting in
the Arctic Circle.

NCSA Mosaic of Marc Andreessen and Eric Bina

NCSA Mosaic was neither the first web browser (first was the WorldWideWeb of Berners-Lee) nor the
first graphical web browser (it was preceded by the lesser-known Erwise and ViolaWWW), but it was
the web browser credited with popularizing the World Wide Web. Its clean, easily understood user
interface, reliability, Windows port and simple installation all contributed to making it the application
that opened up the Web to the general public.

In 1992 Marc Andreessen was a student in Computer Science and a part-time assistant at the NCSA
(National Center for Supercomputing Applications) at the University of Illinois. His position at NCSA
allowed him to become quite familiar with the Internet and the World Wide Web, which was beginning
to take off.

There were several web browsers available then, but they were for Unix machines, which were rather
expensive. This meant that the Web was mostly used by academics and engineers who had access to
such machines. The user interfaces of all the available browsers also tended not to be very user-friendly,
which further hindered the spread of the WWW. That's why Marc decided to develop a browser that was
easier to use and more graphically rich.

In the same year, Andreessen recruited his colleague from NCSA and the University of Illinois, Eric Bina
(Master in Computer Science from the University of Illinois in 1988), to help with his project. The
two worked tirelessly. Bina remembers that they would work three to four days straight, then crash
for about a day. They called their new browser Mosaic. It was much more sophisticated graphically
than other browsers of the time. Like other browsers it was designed to display HTML documents, but
new formatting tags like center were included.

The most important feature was the inclusion of the image tag, which allowed images to be included in
web pages. Earlier browsers allowed the viewing of pictures, but only as separate files. NCSA Mosaic
made it possible for images and text to appear on the same page. It also featured a graphical interface
with clickable buttons that let users navigate easily, and controls that let users scroll through text with
ease. Another innovative feature was the new form of hyperlink. In earlier browsers hypertext links
had reference numbers that the user typed in to navigate to the linked document. The new hyperlinks
allowed the user to simply click on a link to retrieve a document.

NCSA Mosaic was also a client for earlier protocols such as FTP, NNTP, and gopher.

In January 1993, Mosaic was posted for free download on NCSA's servers and became immediately
popular; more than 5000 copies were being downloaded each month. Within weeks tens of thousands
of people had downloaded the program. The original version was for Unix, but Andreessen and Bina
quickly put together a team to develop PC and Mac versions, which were released in the late spring of
the same year. With Mosaic now available for more popular platforms, its popularity soon
skyrocketed. More users meant a bigger Web audience. The bigger audience spurred the creation of
new content, which in turn further increased the audience on the Web, and so on. As the number of
users on the Web increased, the browser of choice was Mosaic, so its distribution increased
accordingly.

By December 1993, Mosaic's growth was so great that it made the front page of the New York Times
business section. The article concluded that Mosaic was perhaps "an application program so different
and so obviously useful that it can create a new industry from scratch". NCSA administrators were
quoted in the article, but there was no mention of either Andreessen or Bina. Marc realized that when
he was through with his studies, NCSA would take over Mosaic for itself. So when he graduated
in December 1993, he left and moved to Silicon Valley in California.

Later Andreessen and Jim Clark, the founder of Silicon Graphics, incorporated Mosaic
Communications Corp. and developed the famous Netscape browser and server products.

NCSA Mosaic won multiple technology awards, including being named 1993 Product of the Year by
InfoWorld magazine and 1994 Technology of the Year by Industry Week magazine.

NCSA discontinued support for Mosaic in 1997, shifting its focus to other research and development
projects, but the Mosaic browser is still available for download, along with documentation, at the
NCSA FTP site, ftp://ftp.ncsa.uiuc.edu/Mosaic/.

World Wide Web Wanderer of Matthew Gray

The brilliant idea of the World Wide Web was devised in the spring of 1989 by Tim Berners-Lee, a
physicist at CERN, but it didn't gain any widespread popular use until the remarkable NCSA Mosaic
web browser was introduced at the beginning of 1993.

In the spring of 1993, just months after the release of Mosaic, Matthew Gray, who studied physics at
the Massachusetts Institute of Technology (MIT) and was one of the three members of the Student
Information Processing Board (SIPB) who set up the site www.mit.edu, decided to write a program,
called the World Wide Web Wanderer, to systematically traverse the Web and collect sites. The
Wanderer was first functional in the spring of 1993 and became the first automated Web agent (spider,
or web crawler). The Wanderer certainly did not reach every site on the Web, but it was run with a
consistent methodology, hopefully yielding consistent data on the growth of the Web.

Matthew was initially motivated primarily to discover new sites, as the Web was still a relatively small
place (in early 1993 the total number of web-sites all over the world was about 100, and by June of
1995, even with the phenomenal growth of the Internet, the number of Web servers had only increased
to a point where one in every 270 machines on the Internet was a Web server). As the Web started to
grow rapidly after 1993, the focus quickly changed to charting the growth of the Web. The first report,
compiled using the data collected by the Wanderer (see the table below), covers the period from June
1993 to June 1995.

Results Summary

Month/Year   Nr. of Web sites   % of .com sites   Hosts per Web server

06/93                   130                1.5                  13000

12/93                   623                4.6                   3475

06/94                  2738               13.5                   1095

12/94                 10022               18.3                    451

06/95                 23500               31.3                    270

01/96                100000               50.0                     94

Wanderer was written in the Perl language and, while crawling the Web, it generated an index
called Wandex, the first web database. Initially, the Wanderer counted only Web servers, but shortly
after its introduction, it started to capture URLs as it went along.
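The crawl-and-index loop described above can be sketched in a few lines. This is a minimal illustrative reconstruction in Python, not Gray's original Perl code; the `fetch` callable and the seed URLs are assumptions made so the sketch is self-contained:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_urls, fetch, max_pages=100):
    """Breadth-first crawl; returns a Wandex-like index of servers and URLs.

    `fetch` is a callable url -> HTML text, injected so the sketch can be
    tested without touching the network.
    """
    index = {"servers": set(), "urls": []}
    queue = list(seed_urls)
    seen = set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        index["servers"].add(urlparse(url).netloc)  # count the Web server
        index["urls"].append(url)                   # capture the URL itself
        parser = LinkExtractor()
        parser.feed(fetch(url))
        for link in parser.links:
            queue.append(urljoin(url, link))        # resolve relative links
    return index
```

Counting distinct values of `urlparse(url).netloc` is how a crawler of this kind could report the number of Web servers, while the URL list corresponds to the later URL-capturing behavior mentioned above.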

Matthew Gray's Wanderer created quite a controversy at the time, partially because early versions of
the program ran rampant through the Web and caused noticeable network performance degradation:
it would access the same page hundreds of times a day. The Wanderer soon amended its ways, but the
controversy over whether spiders were good or bad for the Internet remained for some time.

Wanderer was certainly not the Internet's first search engine (that was Alan Emtage's Archie), but it
was the first web robot and, with its index Wandex, clearly had the potential to become the first
general-purpose Web search engine, years before Yahoo and Google. Matthew Gray, however, does not
make this claim and has always stated that this was not its purpose. In any case, Wanderer inspired a
number of programmers to follow up on the idea of web robots.

Yahoo of David Filo and Jerry Yang

In the beginning of 1994, two Ph.D. candidates in Electrical Engineering at Stanford University, Jerry
Yang (born November 6, 1968, in Taipei, Taiwan) and David Filo (born April 20, 1966, in Wisconsin),
were looking for a single place to find useful Web sites and for a way to keep track of their personal
interests on the Internet. As they didn't manage to find such a tool, they decided to create their own.
Thus the now ubiquitous web portal and global brand Yahoo! began as a student hobby and evolved
into a site that has changed the way people communicate with each other and find and access
information.

Filo and Yang started realizing their project in a campus trailer in February 1994, and before long
they were spending more time on their home-brewed lists of favorite links than on their doctoral
dissertations. Eventually, Jerry and David's lists became too long and unwieldy, and they broke them
out into categories. When the categories became too full, they developed subcategories, and thus the
core concept behind Yahoo was born.

The Web site started out as Jerry and David's Guide to the World Wide Web, but eventually received
a new moniker with the help of a dictionary. Filo and Yang decided to select the name Yahoo because
they liked the general definition of the word (which comes from Gulliver's Travels by Jonathan Swift,
where a Yahoo is a legendary being): rude, unsophisticated, uncouth. Later the name Yahoo was
popularized as a backronym for Yet Another Hierarchical Officious Oracle.

Yahoo! itself first resided on Yang's student workstation, Akebono (the URL was
akebono.stanford.edu/yahoo), while the software was lodged on Filo's computer, Konishiki; both
machines were named after legendary sumo wrestlers.

To their surprise, Jerry and David soon found they were not alone in wanting a single place to find
useful Web sites. Before long, hundreds of people were accessing their guide from well beyond the
Stanford trailer. Word spread from friends to what quickly became a significant, loyal audience
throughout the closely knit Internet community. Yahoo! celebrated its first million-hit day in the fall
of 1994, translating to almost 100 thousand unique visitors.

The Yahoo! domain was created on January 18, 1995. Due to the torrent of traffic and enthusiastic
reception Yahoo! was receiving, the founders knew they had a potential business on their hands. In
March 1995, the pair incorporated the business and met with dozens of Silicon Valley venture
capitalists, looking for financing. They eventually came across Michael Moritz of Sequoia Capital, the
well-regarded firm whose most successful investments included Apple Computer, Atari, Oracle and
Cisco Systems. Sequoia Capital agreed to fund Yahoo! in April 1995 with an initial investment of
nearly $2 million.

Like many other web search engines, Yahoo started as a web directory, but soon diversified into a web
portal and a search engine.

Realizing their new company had the potential to grow quickly, the founders began to shop for a
management team. They hired Tim Koogle, a veteran of Motorola, as chief executive officer and
Jeffrey Mallett, founder of Novell's WordPerfect consumer division, as chief operating officer. After
securing a second round of funding in the fall of 1995, Yahoo held an initial public offering in April
1996, raising $33.8 million; the company then had a total of 49 employees.

Here you can see the earliest known Yahoo! website from 1996.

Today, Yahoo! Inc. is a leading global Internet communications, commerce and media company that
offers a comprehensive branded network of services to more than 350 million individuals each month
worldwide. It provides Internet communication services (such as Yahoo! Messenger and Yahoo! Mail),
social networking services and user-generated content (such as My Web, Yahoo! Personals, Yahoo!
360, Delicious, Flickr, and Yahoo! Buzz), media content and news (such as Yahoo! Sports, Yahoo!
Finance, Yahoo! Music, Yahoo! Movies, Yahoo! News, Yahoo! Answers and Yahoo! Games), etc.
Headquartered in Sunnyvale, California, Yahoo! has offices in Europe, Asia, Latin America, Australia,
Canada and the United States.

Amazon of Jeff Bezos


Jeffrey Preston Bezos, the founder of the famous Amazon.com, was born on January 12, 1964, in
Albuquerque, New Mexico, when his mother, Jackie, was still in her teens. Her marriage to his father
lasted little more than a year. She remarried when Bezos was five and Jeffrey took the name of his
stepfather, Miguel Bezos.

In 1971 the family moved to Houston, Texas, where Jeffrey attended an elementary school, showing
intense and varied scientific interests. He rigged an electric alarm to keep his younger siblings out of
his room and maintain his privacy and converted his parents' garage into a laboratory for his science
projects.

Later the family moved to Miami, Florida, where Bezos attended high school. While there, he attended
a student science-training program at the University of Florida, which helped him receive a Silver
Knight Award in 1982. He entered Princeton University planning to study physics, but soon returned
to his love of computers and graduated summa cum laude, Phi Beta Kappa, with a degree in computer
science and electrical engineering.

After graduating from Princeton, Bezos worked on Wall Street in the computer science field. He then
helped build a network for international trade at a company known as Fitel, worked for Bankers Trust,
where he became a vice-president, and later worked in computer science for D. E. Shaw & Co.

In 1994, Bezos decided to take part in the Internet gold rush, developing the idea of selling books to a
mass audience through the Internet. There is an apocryphal legend, that Bezos decided to found
Amazon after making a cross country drive with his wife from New York to Seattle, writing up the
Amazon business plan on the way and setting up the original company in his garage.

Bezos decided to name the company Amazon after the world's largest river, and reserved the domain
name Amazon.com. The company was incorporated in the state of Washington, beginning service in
July 1995. The initial Web site was text-heavy and gray; it wasn't pretty and didn't even list book
publication dates and other key information. But that didn't concern Madrona Venture Group's
Tom Alberg, who invested $100,000 in Amazon in 1995.

By the fourth month in business, the company was selling more than 100 books a day.

Bezos succeeded in creating more than a bookstore: he created an online community. The site was
revolutionary early on for allowing average consumers to create online product reviews. It not only
drew people who wanted to buy books, but also those who wanted to research them before buying.

The company began as an online bookstore, but gradually incorporated a number of products and
services into its shopping model, either through development or acquisition.

In 1997, Amazon added music CDs and movie videos to the Web site, which many considered a wise
move designed to complement the company's expansive book collection. Soon Amazon added five
more product categories: toys, electronics, software, video games and home improvement.

In 1999, Time magazine (whose cover is shown in the lower image) named Bezos Person of the Year,
recognizing the company's success in popularizing online shopping.

Amazon's initial business plan was unusual: the company did not expect a profit for four to five years,
and the strategy proved effective. In 1996, its first full fiscal year in business, Amazon generated $15.7
million in sales, a figure that would increase by 800 percent the following year. In May 1997
Amazon.com issued its initial public offering of stock. The company successfully survived the
dot-com bubble and has remained profitable since. Revenues increased thanks to product
diversification and an international presence: $3.9 billion in 2002, $5.3 billion in 2003, $6.9 billion
in 2004, $8.5 billion in 2005, and $10.7 billion in 2006.

In 2007 Amazon launched the remarkable Kindle series of e-book readers. In 2011 Amazon entered
the tablet business with the Kindle Fire.

The site amazon.com attracted over 900 million visitors annually by 2011. In 2012 the company had
over 56,000 employees.

In 2004, Bezos founded a human space flight startup company called Blue Origin. He is known for his
attention to business process details, trying to know about everything from contract minutiae to how
he is quoted in all Amazon press releases.

eBay of Pierre Omidyar


On September 4, 1995, the 28-year-old software developer and entrepreneur Pierre Omidyar launched
the famous eBay auction site as an experiment in how a level playing field would affect the efficiency
of a marketplace.

Pierre Morad Omidyar was born on June 21, 1967, in Paris, France, to Iranian immigrant parents, both
of whom had been sent by his grandparents to attend university there. The family later moved to
Washington, D.C., in the US. Pierre graduated with a degree in computer science from Tufts University
in 1988. Shortly after, Omidyar went to work for Claris, an Apple Computer subsidiary, where he
helped write the vector-based drawing application MacDraw.

In 1991 Pierre started his career as an entrepreneur, co-founding Ink Development, a pen-based
computing startup that was later rebranded as an e-commerce company and renamed eShop.

Over a long holiday weekend some time in the middle of 1995, Pierre sat down in his living room in San
Jose, California, to write the original computer code for what eventually became an Internet
superbrand: the auction site eBay. Initially, he wanted to call his site echobay, but the name had
already been registered, so the word eBay was made up on the fly by Omidyar.

The site www.ebay.com was launched on Labor Day, 1995, under the more prosaic title of Auction
Web, and was hosted on a site Omidyar had created for information on the Ebola virus. The site began
with the listing of a single broken laser pointer. Though Pierre had intended the listing as a test more
than a serious offer to sell at auction, he was shocked when the item soon sold for $14.83.

Auction Web was later renamed eBay. The service, meant to be a marketplace for the sale of goods
and services between individuals, was free at first, but started charging in order to cover Internet
service provider costs and soon began making a profit.

What was eBay's profitable business model?

It was built on the idea of an online person-to-person trading community on the Internet, using the
World Wide Web. Buyers and sellers are brought together in a manner where sellers can list items for
sale, buyers can bid on items of interest, and all eBay users can browse through listed items in a fully
automated way. The items are arranged by topic, and each type of auction has its own category.

With its web interface, eBay has both streamlined and globalized traditional person-to-person trading,
which had traditionally been conducted through such forms as garage sales, collectibles shows and
flea markets. The interface facilitates easy exploration for buyers and enables sellers to list an item
for sale within minutes of registering.

Browsing and bidding on auctions is free of charge, but sellers are charged two kinds of fees:
When an item is listed on eBay, a nonrefundable Insertion Fee is charged, which ranges between 30
cents and $3.30, depending on the seller's opening bid on the item. A fee is also charged for additional
listing options to promote the item, such as a highlighted or bold listing.
A Final Value fee is charged at the end of the seller's auction; it generally ranges from 1.25% to 5% of
the final sale price.
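Using the published fee ranges above, a seller's total cost can be estimated with a small calculation. The specific brackets below are simplified assumptions for illustration only; the historical fee schedule had more tiers than this sketch:

```python
def ebay_fees(opening_bid, final_price, promo_fee=0.0):
    """Illustrative eBay seller-fee estimate (assumed, simplified brackets).

    Insertion Fee: 30 cents to $3.30 depending on the opening bid.
    Final Value fee: 1.25% to 5% of the final sale price (this sketch
    assumes a higher rate on cheaper items).
    """
    # Assumed insertion-fee brackets, spanning the stated 30c-$3.30 range.
    if opening_bid < 10:
        insertion = 0.30
    elif opening_bid < 50:
        insertion = 1.10
    else:
        insertion = 3.30
    # Assumed final-value rate: 5% on cheap items, 1.25% on expensive ones.
    rate = 0.05 if final_price < 25 else 0.0125
    return round(insertion + promo_fee + rate * final_price, 2)
```

For example, the broken laser pointer that sold for $14.83 would, under these assumed brackets, have cost the seller $0.30 to list plus a 5% final-value fee, about $1.04 in total.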

eBay notifies the buyer and seller via e-mail at the end of the auction if a bid exceeds the seller's
minimum price, and the seller and buyer finish the transaction independently of eBay. The binding
contract of the auction is between the winning bidder and the seller only.

This appeared to be an excellent business model.

By 1996 the company was large enough to require the skills of a Stanford MBA, Jeffrey Skoll, who
came aboard an already profitable ship. Meg Whitman, a Harvard graduate, soon followed as
president and CEO, along with a strong business team under whose leadership eBay grew rapidly,
branching out from collectibles into nearly every type of market. eBay's vision for success transitioned
from one of commerce (buying and selling things) to one of connecting people around the world.

With exponential growth and strong branding, eBay thrived, eclipsing many of the other upstart
auction sites that dotted the dot-com bubble. By the time eBay had gone public in 1998, both Omidyar
and Skoll were billionaires.

In 2009 the net worth of the company reached $5.5 billion.

Google of Larry Page and Sergey Brin

In January 1996, exactly two years from the moment when two Ph.D. candidates at Stanford
University (Jerry Yang and David Filo) started work on the project that would become the now
ubiquitous web portal Yahoo, two other Stanford computer science graduate students, Larry Page and
Sergey Brin, working on the Stanford Digital Library Project (to develop the enabling technologies for
a single, integrated and universal digital library), began collaborating on a research project that would
evolve into the world-renowned web portal Google.

Sergey Brin was born in 1973 in Moscow, Soviet Union, to a Russian Jewish family, which moved to
the USA in 1979. In 1990 Brin enrolled at the University of Maryland to study computer science and
mathematics, where he received his Bachelor of Science degree in 1993 and then began his graduate
study in Computer Science at Stanford University on a graduate fellowship from the National Science
Foundation.

Lawrence "Larry" Page was born in 1973 in East Lansing, Michigan, into a Jewish family of computer
science professors at Michigan State University. He holds a Bachelor of Science degree in computer
engineering from the University of Michigan and a Master's degree in Computer Science from Stanford
University.

In 1995, searching for a dissertation theme, Page considered (among other things) exploring the
mathematical properties of the World Wide Web, understanding its link structure as a huge graph.
His supervisor Terry Winograd encouraged him to pick this idea (which Page later recalled as "the best
advice I ever got"), and Page focused on the problem of finding out which web pages link to a given
page, considering the number and nature of such backlinks to be valuable information about that page
(with the role of citations in academic publishing in mind).

In his research project, nicknamed BackRub, Page was soon joined by Sergey Brin, a fellow Stanford
Ph.D. student whom Page had met in 1995. Page's web crawler began exploring the web in March 1996,
setting out from Page's own Stanford home page as its only starting point. To convert the backlink
data that it gathered into a measure of importance for a given web page, Brin and Page developed
the PageRank algorithm.
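The intuition that a page is important if important pages link to it can be captured with the now-classic power-iteration method. The following is a generic textbook sketch of PageRank, not Brin and Page's original code; the damping factor of 0.85 is the conventional textbook value:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank scores for a small link graph by power iteration.

    `links` maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        # Every page gets a base share from the "random surfer" teleport.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # A page passes its rank equally along its outgoing links.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank
```

On a tiny graph where both A and B link to C, while C links only back to A, the iteration makes C the highest-ranked page and leaves B, which has no backlinks at all, the lowest; this is exactly the backlinks-as-citations idea described above.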

Analyzing BackRub's output, which, for a given URL, consisted of a list of backlinks ranked by
importance, it occurred to them that a search engine based on PageRank would produce better results
than existing techniques (existing search engines at the time essentially ranked results according to
how many times the search term appeared on a page). This was not an entirely original invention, as a
small search engine called Rankdex was already exploring a similar strategy.

BackRub was written in Java and Python and ran on several Sun Ultra and Intel Pentium boxes
running Linux. The primary database was kept on a Sun Ultra II with 28GB of disk storage.

As the search engine grew in popularity at Stanford, installed on the Stanford website (domain name
google.stanford.edu), Page and Brin decided that BackRub needed a new name. Page turned to
fellow graduate student Sean Anderson for help, and they discussed several possible new names. After
several days of brainstorming in their graduate student office, Anderson verbally suggested the
word googleplex (a googolplex is the number one followed by a googol zeros; the term googol was
coined in 1938 by Milton Sirotta (1929-1980), nephew of the American mathematician Edward Kasner).

Page and Brin liked googleplex, but Page suggested they shorten it to googol. When Anderson
searched to see if the domain name was available, he misspelled googol as google, which was available.
Because googol was unavailable and Page thought google had an Internet ring to it, like Yahoo! or
Amazon, he registered their search engine as google.com on the Internet domain name registry.

In contrast to other busy-looking pages with flashy banners and blinking lights, Brin and Page decided
to keep google.com clean and simple to allow for faster searches. This appeared to be a wise decision.

Though the Stanford Digital Library granted them $10,000, which Brin and Page used to build and
string together inexpensive PCs, they were still perpetually short of money. As google.com's
popularity continued to grow at Stanford, they were faced with a decision: finish their graduate work
or create a business around their growing search engine. Reluctant to leave their studies, Page and
Brin offered to sell their search engine for one million dollars, first to the AltaVista search portal. To
their disappointment, however, AltaVista passed, as did Yahoo, Excite, and other search engines.

They were rejected in part because many search engines wanted people to spend more time and
money on their own Web sites, while Google was designed to give people fast answers to their
questions by quickly sending them to relevant Web pages. Then Yahoo's cofounder David Filo advised
them to take a leave of absence from Stanford to start their own business. Filo initially encouraged
Brin and Page not only because they were friends, but also because Yahoo was interested in
cultivating a field of healthy search engines it could use.

The domain google.com was registered on September 15, 1997. Page and Brin formally incorporated
their company, Google Inc., on September 4, 1998, in a friend's garage in Menlo Park, California.

Both Brin and Page had been against using advertising pop-ups in a search engine, or an "advertising
funded search engines" model in general, and they wrote a research paper on the topic in 1998 while
still students. However, they soon changed their minds and allowed simple text ads.

By the end of 1998, Google had an index of about 60 million pages. The home page was still marked as
"BETA", but some observers already argued that Google's search results were better than those of
competitors like Hotbot or Excite.com, and praised it for being more technologically innovative than
the overloaded portal sites (like Yahoo!, Excite.com, Lycos, Netscape's Netcenter, AOL.com and
MSN.com), which at that time, during the growing dot-com bubble, were seen as "the future of the
Web", especially by stock market investors.

The Google search engine attracted a loyal following among the growing number of Internet users,
who liked its simple design. In 2000, Google began selling advertisements associated with search
keywords. The ads were text-based to maintain an uncluttered page design and to maximize page
loading speed. Keywords were sold based on a combination of price bid and click-throughs, with
bidding starting at $0.05 per click. This was not a pioneering approach, as this model of selling
keyword advertising was already used by Goto.com. While many of its dot-com rivals failed in the new
Internet marketplace, Google quietly rose in stature while generating revenue.
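The keyword-auction idea, ranking paid listings by a combination of bid and click-through performance, can be sketched as below. The exact ranking formula (bid multiplied by click-through rate) is an assumption made for illustration; the text above only says the two factors were combined:

```python
def rank_ads(ads, min_bid=0.05):
    """Order ads for a keyword by bid * click-through rate (assumed formula).

    `ads` is a list of (advertiser, bid_dollars, ctr) tuples. Bids below
    the $0.05 minimum are rejected, matching the floor mentioned above.
    """
    eligible = [ad for ad in ads if ad[1] >= min_bid]
    # Higher bid * CTR wins the better position on the results page.
    return sorted(eligible, key=lambda ad: ad[1] * ad[2], reverse=True)
```

Under this formula, a cheap ad that users actually click can outrank a more expensive ad that they ignore, which is why combining click-throughs with the bid keeps the ads relevant.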

Now Google runs over one million servers in data centers around the world, and processes over one
billion search requests and twenty petabytes of user-generated data every single day. Google's rapid
growth since its incorporation has triggered a chain of products, acquisitions and partnerships beyond
the company's core product, the search engine. Google offers online productivity software, such as
its Gmail e-mail service, and social networking tools, including Orkut and, more recently, Google
Buzz. Google's products extend to the desktop as well, with applications such as the web browser
Google Chrome, the Picasa photo organization and editing software, and the Google Talk instant
messaging application. Recently, Google created the Android mobile phone operating system (based
on Linux), used on a number of GSM smartphones.

Napster of Shawn Fanning


Shawn Fanning was born on November 22, 1980, in Brockton, Massachusetts. He worked summers at
his uncle John Fanning's Internet company, called Chess.net. During this work, wanting to create an
easier method of finding music than searching IRC or Lycos, he spent months writing the code for
Napster, a program that could provide an easy way to download music, using an anonymous P2P
(peer-to-peer) file sharing service.

After graduating from Harwich High School in 1998, Shawn enrolled at Boston's Northeastern
University, but he rarely attended class and spent Christmas break working at the Hull, Massachusetts,
chess.net office with his uncle John, pushing himself to get the Napster system completed. In January
1999, Shawn dropped out of Northeastern University after the first semester to finish writing the
software.

The service, named Napster after Fanning's hairstyle-based nickname, was launched in June 1999 and
operated until July 2001, when it was shut down by court order.

Shawn's uncle ran all aspects of the company's operations from their office for a period. The final
agreement gave Shawn 30% control of the company, with the rest going to his uncle.

Napster was the first of the massively popular P2P file distribution systems, although it was not fully
peer-to-peer, since it used central servers to maintain lists of connected systems and the files they
provided, while actual transfers were conducted directly between client machines. There were already
networks that facilitated the distribution of files across the Internet, such as IRC, Hotline, and
USENET, but Napster specialized exclusively in music in the form of MP3 files and presented a
user-friendly interface.
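Napster's hybrid design, a central index with direct transfers between clients, can be sketched as a simple lookup service. The class and method names here are illustrative inventions, not Napster's actual protocol:

```python
class CentralIndex:
    """Toy model of Napster's central server: it only tracks who has what.

    The actual MP3 transfer would then happen directly between the two
    client machines, which is why Napster was not fully peer-to-peer.
    """
    def __init__(self):
        self.files = {}  # song title -> set of peer addresses offering it

    def share(self, peer, titles):
        """A client announces the songs it is willing to serve."""
        for title in titles:
            self.files.setdefault(title, set()).add(peer)

    def search(self, title):
        """Return the peers offering a song; the searching client would
        then connect to one of them directly to download."""
        return sorted(self.files.get(title, set()))
```

Because every search goes through this one index, shutting down the central servers (as the court order eventually forced) disabled the whole network, a weakness the later decentralized systems were designed to avoid.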

The result was a system whose popularity generated an enormous selection of music to download.
Napster made it relatively easy for music enthusiasts to download copies of songs that were otherwise
difficult to obtain, like older songs, unreleased recordings, and songs from concert bootleg recordings.
Many users felt justified in downloading digital copies of recordings they had already purchased in
other formats, like LP and cassette tape, before the compact disc emerged as the dominant format for
music recordings.

In 2000, Fanning and Napster were featured on the covers of two of the most popular magazines,
Newsweek and Time (see the lower image).

Napster's facilitation of the transfer of copyrighted material (songs) raised the ire of the Recording
Industry Association of America (RIAA), which almost immediately (in December 1999) filed a
lawsuit against it. In April 2000, the rock band Metallica also sued Napster for copyright
infringement. The service only got bigger, as the trials meant to shut down Napster also gave it a great
deal of publicity. Soon millions of users, many of them college students, flocked to it.

An injunction was issued in March 2001, ordering Napster to prevent the trading of copyrighted
music on its network. In July 2001, Napster shut down its entire network in order to comply with the
injunction. In September 2001, the case was partially settled: Napster agreed to pay music creators
and copyright owners a $26 million settlement for past, unauthorized uses of music, as well as an
advance against future licensing royalties of $10 million. In order to pay those fees, Napster attempted
to convert its free service into a subscription system, and traffic to Napster was reduced.

Although the original service was shut down, it paved the way for decentralized peer-to-peer file-
distribution programs, which have been much harder to control.

BitTorrent of Bram Cohen


In 2001 an American computer programmer named Bram Cohen began work on a protocol and a
program that would change the entertainment industry and the interchange of information on the
Web. The peer-to-peer (P2P) protocol was named BitTorrent, as was the first file sharing program to
use the protocol.

Bram Cohen was born in 1975 in New York, into a Jewish family, and grew up on Manhattan's Upper
West Side, where his father taught him the basics of computer coding at age 6. In first grade, he
flummoxed friends with comparisons of the Commodore 64 vs. the Timex Sinclair personal
computers, and he was actively programming by age 10. Cohen graduated from Stuyvesant High
School and later attended the State University of New York at Buffalo for two years. In 1995 he
decided to drop out of college ("bored out of his mind", he said) to work for several dot-com
companies throughout the mid to late 1990s, the last being MojoNation, an ambitious but ill-fated
project.

MojoNation allowed people to break up confidential files into encrypted chunks and distribute those
pieces on other computers also running the software. Someone wanting to download a copy of such an
encrypted file would download it simultaneously from many computers. Cohen found this concept
perfect for a file sharing program and protocol, since programs like the then-popular KaZaA took a
long time to download a large file because the file (usually) came from one source.

That's why Cohen designed BitTorrent to download files from many different sources, thus speeding
up the download time, especially for users with faster download than upload speeds, which is the case
for the vast majority of users. Thus, the more popular a file is, the faster a user will be able to
download it, since many people will be downloading it at the same time, and these people will also be
uploading the data to other users.

In April 2001, Cohen quit MojoNation and began work on BitTorrent. He wrote the first BitTorrent
client implementation in the Python language, and several other programs have since implemented
the protocol.

BitTorrent gained its fame for its ability to quickly share large music and movie files online. Cohen
himself has claimed he has never violated copyright law using his software, and he initially designed it
for the community at etree.org, a site that shares only music by artists who expressly allow the
practice. It didn't take long, however, for the software to take hold for illegal music and movie
swapping.

So how exactly does BitTorrent work?

BitTorrent is a protocol that offloads some of the file tracking work to a central server (called
a tracker). Another basic rule is that it uses a principle called tit-for-tat, which means that in order to
receive files, you have to give them. This solves the problem of leeching, which was one of Bram
Cohen's primary goals as a developer. With BitTorrent, the more files you share with others, the faster
your downloads are. And, as already mentioned, to make better use of available network bandwidth
(the pipeline for data transmission), BitTorrent downloads different pieces of the file you want
simultaneously from multiple computers.

To download a file with BitTorrent, you open a Web page and click on a link for the file you want. The
BitTorrent client software communicates with a tracker to find other computers running BitTorrent
that have the complete file (so-called seed computers) and those with a portion of the file (peers,
which are usually in the process of downloading the file themselves).

The tracker identifies the swarm, which is the connected computers that have all of or a portion of the
file and are in the process of sending or receiving it.

The tracker helps the client software trade pieces of the file you want with other computers in
the swarm. Your BitTorrent client receives multiple pieces of the file simultaneously.

If you continue to run the BitTorrent client software after your download is complete, others can
receive pieces of the file from your computer, and your future download rates improve because you
are ranked higher in the tit-for-tat system.
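The piece-by-piece exchange described in the steps above can be simulated in a few lines. This is a toy model with hypothetical peers, not the real wire protocol; the rarest-first selection rule shown here is a strategy real clients use to keep scarce pieces alive in the swarm, though the text above does not describe it:

```python
def download(total_pieces, peers):
    """Simulate fetching a file piece-by-piece from a swarm.

    `peers` maps a peer name to the set of piece numbers that peer holds
    (a seed holds all of them). Returns the order pieces were fetched in,
    choosing the rarest available piece first (ties broken by piece number).
    """
    have = set()
    order = []
    while len(have) < total_pieces:
        # Count how many peers in the swarm offer each piece we still need.
        availability = {}
        for pieces in peers.values():
            for p in pieces - have:
                availability[p] = availability.get(p, 0) + 1
        if not availability:
            raise RuntimeError("swarm cannot complete the file")
        # Rarest first: prefer the piece the fewest peers have.
        rarest = min(availability, key=lambda p: (availability[p], p))
        have.add(rarest)
        order.append(rarest)
    return order
```

Notice that a single seed plus several partial peers is enough to complete the file, and that the client grabs the scarce pieces before the ones everybody already has.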

By 2004, Cohen formed BitTorrent, Inc., with his brother Ross Cohen and business partner Ashwin
Navin.

Soon the BitTorrent P2P network became extremely popular. According to some estimates, in 2004 its
traffic represented 20 to 35% of all traffic on the Internet.

Due to its practice of contacting many peers (up to 300-500 per second), BitTorrent led to an
interesting network issue. Routers that use NAT (network address translation) must maintain tables
of source and destination IP addresses and ports. Typical home routers are limited to about 2000
table entries, while some more expensive routers have larger table capacities. As BitTorrent
frequently contacts many peers per second, it rapidly fills the NAT tables, causing home routers to
lock up.

Another copyright-related issue is that, as the BitTorrent protocol provides no way to index torrent
files, a comparatively small number of websites have hosted a large majority of torrents, many linking
to copyrighted material, rendering those sites especially vulnerable to lawsuits.

There has been much controversy over the use of BitTorrent trackers. BitTorrent metafiles themselves
do not store copyrighted data, so whether the publishers of BitTorrent metafiles violate copyrights by
linking to copyrighted material is controversial. Various jurisdictions have pursued legal action
against websites that host BitTorrent trackers. High-profile examples include the closing of the
trackers Suprnova.org, Torrentspy, LokiTorrent, Demonoid, Mininova and OiNK.cd. The Pirate Bay
torrent website was noted for the "legal" section of its website, in which letters and replies on the
subject of alleged copyright infringements are publicly displayed. In 2006 the Pirate Bay's servers in
Sweden were raided by Swedish police on allegations of copyright infringement; the tracker, however,
was up and running again three days later.

Wikipedia of Jimmy Wales and Larry Sanger

The old dream of a world encyclopedia, envisioned by Paul Otlet and H. G. Wells in the 1930s, became
achievable in the 1990s, after the rapid progress of the Internet and the World Wide Web.

The first known proposal for an online Internet encyclopedia was made by the Internet enthusiast
Rick Gates in October 1993. In a message titled The Internet Encyclopedia, published in the Usenet
newsgroup alt.internet.services, Gates proposed collaboratively creating an encyclopedia on the
Internet, which would allow anyone to contribute by writing articles and submitting them to a central
catalog of all encyclopedia pages. Later the term Interpedia was coined for this encyclopedia, but the
project never left the planning stages and eventually died.

Several years later, in 1999, the famous free-software activist Richard Stallman popularized his
concept of an open, web-based online encyclopedia, but his idea also remained unrealized.

The first successful online encyclopedia was Wikipedia, launched in January 2001 by Jimmy Wales
and Larry Sanger. The founders used the concept and technology of the wiki, devised in 1994 by the
computer programmer Ward Cunningham. A wiki is a website that allows the easy creation and
editing of any number of interlinked web pages via a web browser, using a simplified markup language
or a WYSIWYG text editor. Special wiki software has been created, which is often used to build
collaborative websites.
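As an illustration, a minimal sketch of how a wiki engine might turn a tiny subset of such simplified markup into HTML is shown below. It covers only two real MediaWiki-style conventions (triple-quote bold and double-bracket internal links); the `/wiki/` URL scheme is an assumption for the example, and real wiki engines handle far more syntax.

```python
import re

def wiki_to_html(text: str) -> str:
    """Convert a tiny subset of wiki markup to HTML.

    Handles only '''bold''' and [[Page]] links; real engines such as
    MediaWiki support far richer syntax than this sketch.
    """
    # '''bold''' -> <b>bold</b>
    text = re.sub(r"'''(.+?)'''", r"<b>\1</b>", text)
    # [[Page Name]] -> a link to a same-site article (URL scheme assumed)
    text = re.sub(r"\[\[(.+?)\]\]",
                  lambda m: '<a href="/wiki/{0}">{1}</a>'.format(
                      m.group(1).replace(" ", "_"), m.group(1)),
                  text)
    return text

print(wiki_to_html("'''Wikipedia''' uses [[Ward Cunningham]]'s wiki idea."))
# <b>Wikipedia</b> uses <a href="/wiki/Ward_Cunningham">Ward Cunningham</a>'s wiki idea.
```

The appeal of the approach is visible even in this toy: contributors write plain text with light conventions, and the software produces the interlinked pages.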

Jimmy Donal Wales was born on August 7, 1966, in Huntsville, Alabama. He received his bachelor's
degree in finance from Auburn University and entered the Ph.D. finance program at the University of
Alabama before leaving with a master's degree to enter the Ph.D. finance program at Indiana
University. He taught at both universities during his postgraduate studies, but did not write the
doctoral dissertation required for a Ph.D., something which he has ascribed to boredom.

In 1994, rather than writing his doctoral dissertation, Wales took a job in a Chicago futures and
options trading firm. By speculating on interest rate and foreign-currency fluctuations, he had soon
earned enough to support himself and his wife for the rest of their lives.

Wales was addicted to the Internet from an early stage and used to write computer code as a pastime.
Inspired by the remarkable initial public offering of Netscape in 1995, he decided to become an
internet entrepreneur, and in 1996 founded the web portal Bomis with two partners. The website
featured user-generated webrings and for a time sold erotic photographs. It was something like a
"guy-oriented search engine" with a market similar to that of a male magazine, and was positioned as

42
the Playboy of the Internet. Bomis did not become successful, but in 2000 hosted and provided the
initial funding for the Nupedia project.

Wales began thinking about an open-content online encyclopedia built by volunteers in the fall of
1999, and in January 2000, he hired Sanger to oversee its development. The project, called Nupedia,
officially went online in March 2000. Nupedia was designed as a free-content encyclopedia, whose
articles were written by experts, and it was intended to generate revenue from online ads. It was not
initially based on the wiki concept; instead, it was characterized by an extensive peer-review process,
designed to make its articles of a quality comparable to that of professional encyclopedias. In 2001
Sanger brought the wiki concept to Wales and suggested it be applied to Nupedia; after some initial
skepticism, Wales agreed to try it. Nupedia, however, ceased operating in 2003, having produced only
24 articles that completed its complex review process.

In January 2001, Wales decided to switch to the GNU Free Documentation License at the urging of
Richard Stallman and the Free Software Foundation, and started the Wikipedia project,
reserving the Wikipedia.com and Wikipedia.org domain names. Initially Wikipedia (the word was
devised by Sanger) was created as a side-project and feeder to Nupedia, in order to provide an
additional source of draft articles and ideas, but it quickly overtook Nupedia, growing into a
large global project and spawning a wide range of additional reference projects. Jimmy Wales
announced that he would never run commercial advertisements on Wikipedia.

There was another similar project at this time, GNUPedia, but it had a somewhat bureaucratic
structure; it was never really developed and soon died, surpassed by Wikipedia.

Initially Wikipedia ran on UseModWiki, written in Perl; the server has run on Linux to this day,
although the original text was stored in files rather than in a database. In 2002 UseModWiki was
replaced by new software written specifically for the project, which included a PHP wiki engine. Later
a new version, MediaWiki, was developed, which has been updated many times and remains in use today.

As Wikipedia grew and attracted contributors, it quickly developed a life of its own and began to
function largely independently of Nupedia, although Sanger initially led activity on Wikipedia by
virtue of his position as Nupedia's editor-in-chief. Due to the collapse of the Internet economy at that
time, Jimmy Wales decided to discontinue funding for a salaried editor-in-chief in December 2001,
and the next year Sanger resigned.

Over the years, a number of sister Wikipedia projects evolved: Wiktionary, 2002 (dictionary and
thesaurus); Wikiquote, Wikibooks and Wikisource, 2003; etc.

Wikipedia has been blocked on some occasions by national authorities. To date these have related to
the People's Republic of China, Iran, Syria, Pakistan, Thailand, Tunisia, the United Kingdom and
Uzbekistan.

As of February 2010 there are some 14.4 million articles in Wikipedia, with approximately 3.2 million
articles in the English Wikipedia.

Skype of Niklas Zennström and Janus Friis


Skype is an extremely popular software application that allows users to make voice calls over the
Internet, and also provides instant messaging, file transfer and video conferencing.

Its development in 2002 was financed by the Skype Group, founded by the Swedish entrepreneur Niklas
Zennström and the Dane Janus Friis, who already had experience in the venture IT business, having
founded in 2001 Kazaa Media Desktop (once capitalized as "KaZaA", but now usually written "Kazaa"),
the famous peer-to-peer file-sharing application. Skype calls to other users of the program and, in some
countries, to free-of-charge numbers, are free, while calls to other landlines and mobile phones can be
made for a fee ($2.95/month gets you unlimited calls in the USA).

The name Skype came from one of the initial names for the project, "Sky peer-to-peer", which was
then abbreviated to Skyper. It turned out, however, that some of the domain names associated
with Skyper were already taken; dropping the final r left the current title Skype, for which domain
names were available.

The domain names Skype.com and Skype.net were registered in April 2003, and the first public beta
version of the program was released in August 2003.

The code of Skype was written by the Estonian developers Ahti Heinla, Priit Kasesalu and Jaan Tallinn,
who had also originally developed Kazaa. They used several IDEs and languages (Delphi, C and C++),
initially developing Skype clients for several operating systems: Windows, Linux and Mac OS.

Skype is a real nightmare for phone companies, because it provides almost free phone service to anyone
with the Skype program and an Internet connection. As wireless networks become more popular, the
business of pure phone-service providers is set to decline. Moreover, Skype now runs not only on
computers, but also on many models of mobile phones and smartphones.

Skype uses a proprietary Internet telephony (VoIP) network, called the Skype protocol. The protocol
has not been made publicly available by Skype and official applications using the protocol are closed-
source. The main difference between Skype and standard VoIP clients is that Skype operates on a
peer-to-peer model (originally based on the Kazaa software), rather than the more usual client-server
model. The Skype user directory is entirely decentralized and distributed among the nodes of the
network, i.e. users' computers, thus allowing the network to scale very easily to large sizes without a
complex centralized infrastructure costly to the Skype Group.

Skype uses secure communication: encryption cannot be disabled and is invisible to the user. Skype
reportedly uses non-proprietary, widely trusted encryption techniques: RSA for key negotiation and
the Advanced Encryption Standard (AES) to encrypt conversations. Skype provides an uncontrolled
registration system for users, with no proof of identity.
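The hybrid pattern described here (asymmetric RSA to negotiate a session key, a symmetric cipher for the conversation itself) can be illustrated with a deliberately toy Python sketch. Everything in it is an assumption for illustration: the tiny textbook RSA primes and the SHA-256 keystream standing in for AES make the example self-contained but utterly insecure, and this is in no way Skype's actual protocol.

```python
import hashlib
import os

# Toy RSA key pair (textbook RSA with tiny primes; insecure by design,
# shown only to illustrate the key-negotiation step).
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the session key
    (stands in for AES, which the Python stdlib does not provide)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

# 1. The caller picks a random session key and sends it RSA-encrypted.
session_key = os.urandom(8)
encrypted_key = [pow(b, e, n) for b in session_key]  # byte-by-byte toy RSA

# 2. The callee recovers the session key with the private exponent.
recovered = bytes(pow(c, d, n) for c in encrypted_key)
assert recovered == session_key

# 3. Both sides then encrypt the conversation with the shared symmetric key.
message = b"hello over VoIP"
cipher = bytes(a ^ b for a, b in zip(message, keystream(recovered, len(message))))
plain = bytes(a ^ b for a, b in zip(cipher, keystream(session_key, len(cipher))))
print(plain)  # b'hello over VoIP'
```

The design point the sketch makes is the one in the paragraph above: the expensive asymmetric operation is used only once per call, to agree on a key, after which the fast symmetric cipher carries the voice stream.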

Now versions of Skype exist not only for Linux, Mac OS and Windows, but also for Maemo, iPhone
OS, Android and even Sony's PSP.

In April 2006, the number of registered users reached 100 million; in 2009 it reached 530 million. In
January 2010, over 22 million concurrent Skype users were reported.

Facebook of Mark Zuckerberg


Late in the evening of October 28, 2003, a Harvard sophomore named Mark Zuckerberg, upset over a
girl who had dumped him, was sitting in his dormitory room, trying to think of something to do to get
her off his mind. A dormitory facebook was open on his desktop, and Mark found that some of these
people had pretty horrendous facebook pictures. He got the idea to put some of these faces next to
pictures of farm animals and have people vote on which was more attractive :-)

Mark Elliot Zuckerberg was born on May 14, 1984, in White Plains, New York and raised in Dobbs
Ferry, New York. He started programming when he was in middle school. Early on, Zuckerberg
enjoyed developing computer programs, especially communication tools and games, and a music
player named Synapse that used artificial intelligence to learn the user's listening habits. Microsoft
and AOL tried to purchase Synapse and recruit Zuckerberg, but he decided to attend Harvard
University instead.

"Let the hacking begin", wrote Zuckerberg about midnight and started.

Thus Zuckerberg decided to create a student directory with photos and basic personal information,
called Facemash, which used photos compiled from the online facebooks of nine dormitory houses,
placing two next to each other at a time and asking users to choose the hotter person. To accomplish
this, Mark hacked into protected areas of Harvard's computer network and copied the houses' private
dormitory ID images.

Harvard at that time did not have a student directory with photos and basic information, and the
Facemash site attracted 450 visitors and 22,000 photo-views in its first few hours online. That the
initial site mirrored people's physical community, with their real identities, represented the key
aspect of what later became Facebook.

The site was quickly forwarded to several campus group list-servers but was shut down a few days
later by the Harvard administration. Zuckerberg got into trouble: he was charged by the administration
with breach of security, violation of copyrights and violation of individual privacy, and faced
expulsion, but ultimately the charges were dropped.

The following semester, in January 2004, Mark began writing code for a new website. In February,
2004, he launched Thefacebook site, initially located at URL thefacebook.com. When Zuckerberg
finished the site, he told a couple of friends, and one of them put it on an online mailing list.
Immediately several dozen people joined, and then they were telling people at the other houses. It was
like an avalanche: within twenty-four hours, Thefacebook had somewhere between twelve hundred and
fifteen hundred registrants.

Membership was initially restricted to students of Harvard College, and within the first month more
than half the undergraduate population at Harvard was registered on the site. Mark soon attracted
assistants to help promote the website: Eduardo Saverin (business aspects), Dustin Moskovitz
(programmer), Andrew McCollum (graphic artist), and Chris Hughes. In March 2004, Facebook
expanded to three other universities: Stanford, Columbia, and Yale. The expansion continued when it
opened to all Ivy League and Boston-area schools, and gradually to most universities in Canada and
the United States.

The company Facebook was incorporated in the summer of 2004, and the entrepreneur Sean Parker,
who had been informally advising Zuckerberg, became its president. At the same time the company
received its first investment, US$500,000 from PayPal co-founder Peter Thiel, and moved its base of
operations to Palo Alto, California. The company dropped "The" from its name after purchasing the
domain name facebook.com in 2005 for $200,000.

Facebook launched a high-school version in September 2005; at that time, high-school networks
required an invitation to join. Facebook later expanded membership eligibility to employees of several
companies, including Microsoft and Apple. Finally, on September 26, 2006, Facebook was opened to
everyone aged 13 and older with a valid e-mail address.

Users of Facebook can create profiles with photos, lists of personal interests, contact information and
other personal information. Communicating with friends and other users can be done through private
or public messages or a chat feature. Users can also create and join interest and fan groups, some of
which are maintained by organizations as a means of advertising. To combat privacy concerns,
Facebook enables users to choose their own privacy settings and choose who can see what parts of
their profile.

Facebook is free to users and generates revenue from advertising, such as banner ads. By default, the
viewing of detailed profile data is restricted to users from the same network, subject to reasonable
community limitations.

Microsoft is Facebook's exclusive partner for serving banner advertising, and as such Facebook only
serves advertisements that exist in Microsoft's advertisement inventory. According to comScore, an
internet marketing research company, Facebook collects as much data from its visitors as Google and
Microsoft, but considerably less than Yahoo!

Facebook has a number of features with which users may interact. They include the Wall, a space on
every user's profile page that allows friends to post messages and attachments for the user to see
(depending on privacy settings, anyone who can see a user's profile can also view that user's Wall);
Pokes, which allows users to send a virtual poke to each other (a notification then tells a user that
they have been poked); Photos, where users can upload albums and photos; Status, which allows users
to inform their friends of their whereabouts and actions; etc.

The front-end servers of Facebook run a LAMP stack (Linux, Apache, MySQL and PHP) with the
addition of Memcache, while the back-end services are written in a variety of languages including
C++, Java, Python and Erlang. Other components of the Facebook infrastructure (which have been
released as open-source projects) include Scribe, Thrift and Cassandra, as well as existing
open-source components such as ODS.
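The role Memcache plays in front of such a stack is the classic cache-aside pattern: check the cache first, and only go to the database on a miss. A minimal Python sketch follows; the `load_profile_from_db` function and the plain dict standing in for a Memcache client are assumptions for illustration, not Facebook's actual code.

```python
import time

cache = {}       # stands in for a Memcache client
CACHE_TTL = 60   # cache entries expire after 60 seconds

def load_profile_from_db(user_id):
    """Hypothetical (and in real life slow) database lookup."""
    return {"id": user_id, "name": "user%d" % user_id}

def get_profile(user_id):
    """Cache-aside: try the cache first, fall back to the database."""
    entry = cache.get(user_id)
    if entry is not None:
        value, stored_at = entry
        if time.time() - stored_at < CACHE_TTL:
            return value                       # cache hit
    value = load_profile_from_db(user_id)      # cache miss
    cache[user_id] = (value, time.time())      # populate for next time
    return value

print(get_profile(42))   # first call: goes to the "database"
print(get_profile(42))   # second call: served from the cache
```

The point of the pattern is that the vast majority of page views for a popular profile never touch the database at all, which is what lets a handful of database servers sit behind a very large front-end fleet.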

Today, Facebook is the leading social networking site based on monthly unique visitors, especially in
English-speaking countries, including Canada, the United Kingdom and the United States, having
overtaken main competitor MySpace in April 2008. ComScore reports that Facebook attracted 132.1
million unique visitors in June 2008, compared to MySpace, which attracted 117.6 million.

The website has won awards such as placement into the "Top 100 Classic Websites" by PC Magazine in
2007, and winning the "People's Voice Award" from the Webby Awards in 2008.

As of January 2010 Zuckerberg is the youngest self-made businessman worth more than a billion
dollars.

Digg of Kevin Rose


Kevin Rose is an American Internet entrepreneur, known for several Internet start-ups. He is the
co-founder of Revision3, Pownce, WeFollow and Diggnation. His masterpiece, however, is the social-
bookmarking website Digg.

Robert Kevin Rose was born on February 21, 1977, in Redding, California. Most of his childhood was
spent in Las Vegas, where at eight he had his first experience with computers, when his father
purchased a Gateway 80386 SX16. Rose soon got into the world of BBSes in the late 1980s; eventually
he was running a two-node Wildcat! BBS (and PCBoard) with a CD-ROM full of shareware for people
to access.

In 1992 Rose transferred to Vo-Tech High School in Las Vegas, to study computers and animation.
Upon graduation from high school, he attended the University of Nevada Las Vegas, majoring in
computer science, but dropped out in 1998 to pursue the 1990s tech boom. After dropping out, he
worked for the Department of Energy, at the Nevada Test Site, as a technology advisor. He later
worked for several dot-com startups through the American technology and venture capital company
CMGI.

In October 2004, Kevin decided to invest $6,000 (which was supposed to be a deposit on a house for
him and his girlfriend) in a site called Digg, together with his friends Owen Byrne, Ron Gorodetzky
and Jay Adelson.

Digg was designed as a social news website, made for people to discover and share content from
anywhere on the Internet by submitting links and stories, and by voting and commenting on
submissions. Voting stories up and down is the site's cornerstone function, respectively
called digging and burying. Many stories are submitted every day, but only the most dugg stories
appear on the front page.
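The digging/burying mechanic described above can be sketched in a few lines of Python. The data model and the simple net-vote ranking here are assumptions for illustration, not Digg's real schema or promotion algorithm.

```python
# Minimal sketch of Digg-style voting: each story accumulates "diggs"
# and "buries", and the front page shows the highest-scoring stories.

stories = {
    "new-browser-released": {"diggs": 0, "buries": 0},
    "cat-photo":            {"diggs": 0, "buries": 0},
}

def digg(story_id):
    stories[story_id]["diggs"] += 1

def bury(story_id):
    stories[story_id]["buries"] += 1

def front_page(limit=10):
    """Rank stories by net votes (diggs minus buries)."""
    ranked = sorted(stories,
                    key=lambda s: stories[s]["diggs"] - stories[s]["buries"],
                    reverse=True)
    return ranked[:limit]

for _ in range(5):
    digg("new-browser-released")
digg("cat-photo")
bury("cat-photo")
print(front_page())  # ['new-browser-released', 'cat-photo']
```

The real site also weighed factors such as voting velocity and reporter diversity when promoting stories, precisely to keep a small group of users from gaming a pure vote count like this one.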

The site digg.com was launched to the world on December 5, 2004. A friend of Rose's proposed calling
the site Diggnation, but Rose wanted a simpler name. He wanted the name Dig, because users are able
to dig stories, out of those submitted, up to the front page; but as the domain name dig.com had
already been registered, the site was called Digg.com.

The original design of Digg was free of advertisements, but as the site became more popular, Google
AdSense was added. In 2005, the site was updated to "Version 2.0", which featured a friends list, the
ability to "digg" a story without being redirected to a "success" page, and a new interface. On June
26, 2006, Version 3 of Digg was released, with specific categories for Technology, Science, World &
Business, Videos, Entertainment and Gaming, as well as a View All section where all categories are
merged.

On August 27, 2007, Digg altered its main interface, mostly in the profile area. In April 2007, Digg
opened its API (Application Programming Interface) to the public, allowing software developers to
write tools and applications based on queries of Digg's public data, dating back to 2004. The domain
digg.com attracted at least 236 million visitors annually by 2008.

YouTube of Chad Hurley and Steve Chen


In February 2005, a dinner party of fellow PayPal (the famous e-commerce Internet site for money
transfers and payments) employees took place in San Francisco, California, in the apartment of one
Steve Chen. Chen was born in August 1978 in Taiwan; his family came to the United States when he
was eight years old, and he studied computer science at the University of Illinois at Urbana-
Champaign. At this party, Chen and his friend Chad Hurley (born in 1977 in Philadelphia, holder of a
degree in fine arts from Indiana University of Pennsylvania, and the first-ever graphic designer at
PayPal in the period 1999-2002) spent much of the evening shooting videos and digital photos of each
other. They easily uploaded the photos to the Web. But the videos? Not a chance.

Realizing that digital photographs were easier to share thanks to new websites like Flickr, they
reasoned that a similar software package to share videos was possible too. Stumbling across a need to
publish video to the Internet, the friends decided to create a video-sharing website on which users
could upload and share videos. And they had the means to address this need: Chen was an exceptional
code writer, and Hurley's gift for design could give a new website a compelling look.

Hurley was living in Menlo Park, California, and had a garage on his property, where in February 2005
he and Chen set to work on creating what became YouTube. Two months later Chen and Hurley called
on their ex-colleague from PayPal, Jawed Karim, also a very good programmer, to help. The domain
name www.youtube.com was activated the same month. The founders agreed on a few caveats: the
site had to be easy to use for a person with only a minimum of computer skills, and it would not
require users to download any special software in order to upload or view videos. They also made it
easy for site visitors to view clips without going through a registration process, and created a quick
search function to access the archives.

YouTube relocated from the garage to a modest office above a pizzeria and a Japanese restaurant in
San Mateo, California, and offered the public a beta test of the site in May 2005, six months before
the official launch in November 2005. In the same month YouTube got a venture-capital kick-off: a
US$11.5 million investment by Sequoia Capital. The first YouTube video had been uploaded on April
23, 2005 (entitled Me at the zoo, it shows co-founder Jawed Karim at the San Diego Zoo and is still
available on the site).

Before the launch of YouTube, there were few easy methods available for ordinary computer users
who wanted to post videos online. With its simple interface, YouTube made it possible for anyone with
an Internet connection to post a video that a worldwide audience could watch within a few minutes.
The wide range of topics covered by YouTube has turned video sharing into one of the most important
parts of Internet culture.

The site grew rapidly, and in July 2006 the company announced that more than 65,000 new videos
were being uploaded every day, and that the site was receiving 100 million video views per day. Hurley
and Chen's idea proved to be one of the biggest Internet phenomena of 2006. YouTube soon became
the dominant provider of online video in the United States, with a market share of around 43 percent
and more than six billion videos viewed in January 2009. It was estimated that 20 hours of new
videos are uploaded to the site every minute, and that around three quarters of the material comes
from outside the United States. It is also estimated that in 2007 YouTube consumed as much
bandwidth as the entire Internet in 2000. In March 2008, YouTube's bandwidth costs were estimated
at approximately US$1 million a day.

In October 2006, Google Inc. announced that it had acquired YouTube for US$1.65 billion in Google
stock. Thus YouTube founders became overnight multi-millionaires.

Viewing YouTube videos on a personal computer requires the Adobe Flash Player plug-in to be
installed in the browser. The plug-in is one of the most common pieces of software installed on
personal computers, and Flash video accounts for almost 75% of online video material. YouTube
accepts uploads in most container formats, including AVI, MKV, MOV, MP4, DivX, FLV and OGG,
and supports video codecs such as MPEG-4, MPEG and WMV.

Videos uploaded to YouTube by standard account holders are limited to ten minutes in length and a
file size of 2 GB. Initially it was possible to upload longer videos, but a ten minute limit was introduced
in March 2006 after YouTube found that the majority of videos exceeding this length were
unauthorized uploads of television shows and films. Partner accounts are permitted to upload videos
longer than ten minutes, subject to acceptance by YouTube.

Twitter of Jack Dorsey


Jack Dorsey was born on November 19, 1976, in St. Louis, Missouri. He started working with
computers early, and at age 14 he created an open-source program for dispatch routing that was still in
use until recently. Jack went to high school at Bishop DuBourg High School and attended the Missouri
University of Science and Technology. He later moved to Oakland, California, working on dispatch
software as a programmer.

In California in 2000, Dorsey started his company to dispatch couriers, taxis, and emergency services
from the Web. His other projects and ideas at this time included networks of medical devices and
a frictionless service market. In July 2000, building on dispatching and inspired partially by services
like LiveJournal and AOL Instant Messenger, he had the idea of real-time status communication.

When he first saw implementations of instant messaging, he had wondered if the software's user
status output could be shared among friends easily. He approached Biz Stone from Odeo (directory
and search destination website for RSS syndicated audio and video), who at the time happened to be
interested in text messaging. Dorsey and Stone decided that SMS text suited the status message idea,
and built a prototype of Twitter in about two weeks. The idea attracted many users at Odeo, as well as
investment from Evan Williams, the founder of Pyra Labs and Blogger.

The working concept of Twitter is partially inspired by the cell phone text messaging service TXTMob.

The name Twitter was picked later, as Dorsey recalled:


We wanted to capture that in the name; we wanted to capture that feeling: the physical sensation
that you're buzzing your friend's pocket. It's like buzzing all over the world. So we did a bunch of
name-storming, and we came up with the word "twitch," because the phone kind of vibrates when it
moves. But "twitch" is not a good product name because it doesn't bring up the right imagery. So we
looked in the dictionary for words around it, and we came across the word "twitter," and it was just
perfect. The definition was "a short burst of inconsequential information," and "chirps from birds."
And that's exactly what the product was.

Work on the project started on March 21, 2006, when Dorsey published the first Twitter message at
9:50 PST: "just setting up my twttr".

Twitter moved from an internal service for Odeo employees to a full public version in July 2006. In
October 2006, Biz Stone, Evan Williams, Dorsey and other members of Odeo formed Obvious
Corporation and acquired Odeo and all of its assets, including Odeo.com and Twitter.com, from the
investors and other shareholders.

The tipping point for Twitter's popularity was the 2007 South by Southwest festival. During the event
usage went from 20,000 tweets per day up to 60,000. The Twitter people cleverly placed two 60-inch
plasma screens in the conference hallways, exclusively streaming Twitter messages. Hundreds of
conference-goers kept tabs on each other via constant twitters. Panelists and speakers mentioned the
service, and the bloggers in attendance touted it. Soon everyone was buzzing and posting about this
new thing that was sort of instant messaging and sort of blogging and maybe even a bit of sending a
stream of telegrams.

The 140-character limit on message length was initially set for compatibility with SMS messaging, and
has brought to the web the kind of shorthand notation and slang commonly used in SMS messages.
This limit has also spurred the usage of URL shortening services such as bit.ly, goo.gl, and tr.im, and
content hosting services, such as Twitpic and NotePub to accommodate multimedia content and text
longer than 140 characters.

The Twitter Web interface uses the Ruby on Rails framework, deployed on Ruby Enterprise Edition.
The messages were initially handled by a Ruby persistent queue server, but since 2009 this has been
gradually replaced with software written in Scala. The service's API allows other web services and
applications to integrate with Twitter. To group posts together by topic or type, users make use of
hashtags, words or phrases prefixed with a #. Similarly, the letter d followed by a username allows
users to send private direct messages. The @ sign followed by a username publicly marks the attached
tweet as a reply to (or a mention of) a specific user, who can find such recent tweets in their
logged-in interface.
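The conventions described above (the 140-character limit, #hashtags, @mentions) can be sketched with a simple parser. This is a deliberately simplified illustration built on plain regular expressions, not Twitter's actual tokenizer, which handles many more edge cases (punctuation, Unicode, link-length accounting, and so on).

```python
import re

MAX_LEN = 140  # the original SMS-derived message limit

def parse_tweet(text: str):
    """Extract hashtags and mentions and check the length limit."""
    return {
        "ok": len(text) <= MAX_LEN,
        "hashtags": re.findall(r"#(\w+)", text),
        "mentions": re.findall(r"@(\w+)", text),
    }

tweet = "just setting up my twttr #firsttweet @jack"
print(parse_tweet(tweet))
# {'ok': True, 'hashtags': ['firsttweet'], 'mentions': ['jack']}
```

Even this toy shows why the conventions caught on: because hashtags and mentions are plain text, any client or third-party tool can recognize them with a few lines of pattern matching.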

In late 2009, the new Twitter Lists feature was added, making it possible for users to follow (and
mention/reply to) lists of authors instead of following individual authors.

As of March 2010, Twitter had recorded 1,500 percent growth in the number of registered users, the
number of its employees had grown 500 percent, and over 70,000 registered applications had been
created for the microblogging platform.

The Dreamers and the Godfathers

In this section I'll present some people who envisioned or defined some aspects or features of the Internet
and the WWW several decades before their actual implementation in the global network:

Jules Verne

The great French novelist and poet Jules Gabriel Verne (1828-1905) is best known for his adventure
novels and his profound influence on the literary genre of science fiction.

In 1889 Verne wrote an intriguing short story, "In the Year 2889", following a day in the life of a news
mogul 1000 years in his future, in which he dreams of a global network. It is believed that the story
was chiefly, if not entirely, the work of his son, Michel Verne (1861-1925), but undoubtedly many of
the topics in the story echo Jules' ideas.

The above-mentioned news mogul, named Fritz Napoleon Smith, used two communication devices,
a phonotelephote and a telephote, to check on his wife:
This morning Mr. Fritz Napoleon Smith awoke in very bad humor. His wife having left for France
eight days ago, he was feeling disconsolate. Incredible though it seems, in all the ten years since
their marriage, this is the first time that Mrs. Edith Smith, the professional beauty, has been so long
absent from home; two or three days usually suffice for her frequent trips to Europe. The first thing
that Mr. Smith does is to connect his phonotelephote, the wires of which communicate with his Paris
mansion. The telephote! Here is another of the great triumphs of science in our time. The
transmission of speech is an old story; the transmission of images by means of sensitive mirrors
connected by wires is a thing but of yesterday. A valuable invention indeed, and Mr. Smith this
morning was not niggard of blessings for the inventor, when by its aid he was able distinctly to see
his wife notwithstanding the distance that separated him from her. Mrs. Smith, weary after the ball
or the visit to the theater the preceding night, is still abed, though it is near noontide at Paris. She is
asleep...

Mr. Smith introduced an advanced system, which made him rich:


Every one is familiar with Fritz Napoleon Smith's system, a system made possible by the enormous
development of telephony during the last hundred years. Instead of being printed, the Earth
Chronicle is every morning spoken to subscribers, who, in interesting conversations with reporters,
statesmen, and scientists, learn the news of the day. Furthermore, each subscriber owns a
phonograph, and to this instrument he leaves the task of gathering the news whenever he happens
not to be in a mood to listen directly himself. As for purchasers of single copies, they can at a very
trifling cost learn all that is in the paper of the day at any of the innumerable phonographs set up
nearly everywhere.

Mr. Smith's reporters also are using several devices for communication:
Mr. Smith continues his round and enters the reporters' hall. Here 1500 reporters, in their respective
places, facing an equal number of telephones, are communicating to the subscribers the news of the
world as gathered during the night. The organization of this matchless service has often been
described. Besides his telephone, each reporter, as the reader is aware, has in front of him a set of
commutators, which enable him to communicate with any desired telephotic line. Thus the
subscribers not only hear the news but see the occurrences. When an incident is described that is
already past, photographs of its main features are transmitted with the narrative. And there is no
confusion withal. The reporters' items, just like the different stories and all the other component
parts of the journal, are classified automatically according to an ingenious system, and reach the
hearer in due succession. Furthermore, the hearers are free to listen only to what specially concerns
them. They may at pleasure give attention to one editor and refuse it to another.

As Mr. Smith is a very rich man, he has to follow his account, thus:
Left alone, Mr. Smith busied himself with examining his accounts, a task of vast magnitude, having
to do with transactions which involve a daily expenditure of upward of $800,000. Fortunately,
indeed, the stupendous progress of mechanic art in modern times makes it comparatively easy.
Thanks to the Piano Electro-Reckoner, the most complex calculations can be made in a few seconds.
In two hours Mr. Smith completed his task. Just in time. Scarcely had he turned over the last page
when Dr. Wilkins arrived. After him came the body of Dr. Faithburn, escorted by a numerous
company of men of science. They commenced work at once. The casket being laid down in the
middle of the room, the telephote was got in readiness. The outer world, already notified, was
anxiously expectant, for the whole world could be eye-witnesses of the performance, a reporter
meanwhile, like the chorus in the ancient drama, explaining it all viva voce through the telephone.

Do you find it quaint and primitive? Remember, this was written in 1889, when the most advanced
computers were mechanical marvels like Bollée's multiplier!

Paul Otlet
The Belgian Paul Marie Ghislain Otlet (1868-1944) from Brussels was an author, entrepreneur,
visionary, lawyer and peace activist and, most important for us, one of several people who have been
considered the father of information science. Otlet started his work on how to collect and organize
the world's knowledge in the 1890s, and towards the end of his working life he summarized his ideas
in two large books of synthesis, the Traité de documentation in 1934 and Monde: Essai
d'universalisme in 1935.

Otlet's monumental book Traité de documentation (Brussels, 1934) was both central and symbolic in
the development of information science (which he called Documentation) in the first half of the 20th
century. It also reminds us of something that has been too widely forgotten: that this field had a
lively existence in the early decades of the 20th century, and a sophistication concerning theory and
information technology that now commonly surprises people.

Otlet was the most central figure in the development of Documentation. He struggled tirelessly for
decades with the most important technical, theoretical, and organizational aspects of a problem,
which is central to mankind: How to make recorded knowledge available to those who need it. He
thought deeply and wrote endlessly as he designed, developed, and initiated ambitious solutions at his
Institute in Brussels.

Otlet was also an idealist and peace activist, pushing internationalist political ideas that were
embodied in the League of Nations and its International Institute for Intellectual Cooperation
(forerunner of UNESCO), working alongside his colleague Henri LaFontaine (in 1895 Otlet and
LaFontaine co-founded the International Institute of Bibliography to promote the efficient
organization and dissemination of knowledge), who won the Nobel Peace Prize in 1913, to achieve
their ideas of a new world polity that they saw arising from the global diffusion of information and the
creation of new kinds of international organization. In 1910, Otlet and La Fontaine first envisioned a
"city of knowledge" (which Otlet originally named the Palais Mondial, World Palace) that would serve
as a central repository for the world's information.

In 1906 Otlet and the chemist Robert Goldschmidt proposed "microfiche" as a standard format
for a "microphotographic book". Later on, they proposed a portable library of "microphotographic
books".

In his Traité de documentation, Otlet speculated imaginatively about online communications,
text-voice conversion and what is needed in computer workstations, though of course he did not use
this terminology. He enumerates inventions, such as machine translation, that are needed for
information retrieval and information processing. After stressing the importance of
telecommunications and the need for technical standards, Otlet provides a concise outline of a
personal information system, including an anticipation of hypertext:
We should have a complex of associated machines which would achieve the following operations
simultaneously or sequentially: 1. Conversion of sound into text; 2. Copying that text as many times
as is useful; 3. Setting up documents in such a way that each datum has its own identity and its
relationships with all the others in the group and to which it can be re-united as needed; 4.
Assignment of a classification code to each datum; [division of the document into parts, one for each
datum, and] rearrangement of the parts of the document to correspond with the classification
codes; 5. Automatic classification and storage of these documents; 6. Automatic retrieval of these
documents for consultation and for delivery either for inspection or to a machine for making
additional notes; 7. Mechanized manipulation at will of all the recorded data in order to derive new
combinations of facts, new relationships between ideas, new operations using symbols. The
machinery which would achieve these seven requirements would be a veritable mechanical and
collective brain.
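Otlet's operations 3 through 6, read in modern terms, describe something very close to a keyed document store: each datum gets its own identity, is filed under a classification code, and can be retrieved automatically by that code. The following minimal Python sketch is purely our own illustration of that idea (the class and method names are hypothetical, not Otlet's):

```python
class Repertory:
    """Toy card repertory: data items filed under classification codes,
    loosely modeling Otlet's operations 3-6."""

    def __init__(self):
        self._by_code = {}  # classification code -> list of data items

    def file(self, code, datum):
        # Operations 3-5: give the datum an identity under a code and
        # store it automatically alongside related items.
        self._by_code.setdefault(code, []).append(datum)

    def consult(self, code):
        # Operation 6: automatic retrieval of every document filed
        # under a given classification code.
        return list(self._by_code.get(code, []))


repertory = Repertory()
repertory.file("025.4", "Notes on decimal classification")
repertory.file("025.4", "Draft of a universal index")
repertory.file("621.3", "Telegraphy without wires")

print(repertory.consult("025.4"))
```

The classification codes here imitate the decimal style of Otlet's Universal Decimal Classification, but the mapping of codes to topics is invented for the example.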

Otlet wrote expressively of the need for an international information handling system embracing
everything, from the creation of an entry in a catalogue to new forms of publication, from the
management of libraries, archives and museums as interrelated information agencies to the
collaborative elaboration of a universal encyclopedia codifying all of man's hitherto unmanageable
knowledge. Central to all of this were the so-called Universal Decimal Classification; a new kind of
information agency for information management, called the Office of Documentation; a new principle
of information indexing and storage, the monographic principle; and microfilm (the idea of providing
convenient copies of documents on microfilm dates at least from 1859, when René Prudent Patrice
Dagron was granted the first microfilm patent in history; Otlet and the Belgian inventor Robert B.
Goldschmidt had proposed standardized microfiche in 1906). Ultimately he foresaw the creation of
a Universal Network for Information and Documentation, accessed through multimedia
workstations that lay waiting to be invented just beyond the technological capacity of his time.

In the same Traité de documentation, Otlet predicted that media conveying feel, taste and smell
would also eventually be invented, and that an ideal information-conveyance system should be able
to handle all of what he called "sense-perception documents".

The telectrophonoscope of Mark Twain

The great American author Samuel Langhorne Clemens (1835-1910), better known by his pen name
Mark Twain, wrote The Adventures of Tom Sawyer and its sequel, Adventures of Huckleberry Finn,
the latter often called "the Great American Novel".

In 1898 Twain met Jan Szczepanik (1872-1926) (see the nearby image), a young and very capable
Austrian-Polish inventor, with several hundred patents and over 50 discoveries to his name, many of
which are still applied today. Some of Szczepanik's concepts helped the future evolution of TV
broadcasting, such as the telectroscope (an apparatus for distant reproduction of images and sound
using electricity) or the wireless telegraph, which greatly affected the development of
telecommunications. Szczepanik invented also a submarine, a colorimeter, an electric rifle, a color
image weaving method, a moving wing aircraft, a duplex rotor helicopter, a dirigible, etc.

The term "telectroscope" was devised by the French writer and publisher Louis Figuier (1819-1894) in
1878 to popularize an invention wrongly interpreted as real and incorrectly ascribed to Alexander
Graham Bell. Figuier's "telectroscope" was a fictional device, the first prototype television, capable
of transmitting pictures and sound anywhere.

Later the word "telectroscope" was widely accepted by several 19th century inventors like George
Carey, Constantin Senlecq, Adriano de Paiva, and Jan Szczepanik, who (together with the Viennese
banker Ludwig Kleinberg) applied for a British patent for his device (Method and Apparatus for
Reproducing Pictures and the like at a Distance by Means of Electricity) in 1897 (British Patent
No. 5031). Szczepanik's telectroscope was covered in the New York Times in April 1898, where it was
described as a scheme for the transmission of colored rays.

In 1898 Twain described Szczepanik in two of his articles: "The Austrian Edison keeping school again"
and "From The Times of 1904".

In "From The Times of 1904" (a fictional crime short story) Twain dreamed up a telectroscope-like
device (he called it a telectrophonoscope), which used the existing phone system to create a
worldwide network of information sharing. In Twain's short story, as early as 1904,
telectrophonoscopes are spread all over the globe, and every man can get one and exchange visual
and sound information with all other owners of telectrophonoscopes.

Nikola Tesla
The great inventor Nikola Tesla (1856-1943) was also a great dreamer. In 1893 he started his wireless
investigations, and several years later he described his futuristic vision: a means of tapping the sun's
energy with an antenna, the possibility of controlling the weather with electrical energy, machines
that would make war an impossibility, and a global system of wireless communications.

In a 1900 issue of the Century Magazine, Tesla published a sensational article, "The Problem of
Increasing Human Energy".

In this article, among many other things, Tesla proposed using stationary waves in the earth for
telegraphy without wires to any distance:
With these developments we have every reason to anticipate that in a time not very distant most
telegraphic messages across the oceans will be transmitted without cables. For short distances we
need a "wireless" telephone, which requires no expert operators. The greater the spaces to be
bridged, the more rational becomes communication without wires. The cable is not only an easily
damaged and costly instrument, but it limits us in the speed of transmission by reason of a certain
electrical property inseparable from its construction. A properly designed plant for effecting
communication without wires ought to have many times the working capacity of a cable, while it
will involve incomparably less expense. Not a long time will pass, I believe, before communication
by cable will become obsolete, for not only will signaling by this new method be quicker and
cheaper, but also much safer. By using some new means for isolating the messages which I have
contrived, an almost perfect privacy can be secured.
I have observed the above effects so far only up to a limited distance of about six hundred miles, but
inasmuch as there is virtually no limit to the power of the vibrations producible with such an
oscillator, I feel quite confident of the success of such a plant for effecting transoceanic
communication. Nor is this all. My measurements and calculations have shown that it is perfectly
practicable to produce on our globe, by the use of these principles, an electrical movement of such
magnitude that, without the slightest doubt, its effect will be perceptible on some of our nearer
planets, as Venus and Mars. Thus from mere possibility interplanetary communication has entered
the stage of probability. In fact, that we can produce a distinct effect on one of these planets in this
novel manner, namely, by disturbing the electrical condition of the earth, is beyond any doubt. This
way of effecting such communication is, however, essentially different from all others which have so
far been proposed by scientific men. In all the previous instances only a minute fraction of the total
energy reaching the planet (as much as it would be possible to concentrate in a reflector) could be
utilized by the supposed observer in his instrument. But by the means I have developed he would be
enabled to concentrate the larger portion of the entire energy transmitted to the planet in his
instrument, and the chances of affecting the latter are thereby increased many million fold.

The article caught the attention of one of the world's most powerful men at the time, John Pierpont
Morgan, who invited the inventor to his home.

Tesla proposed to Morgan a scheme that must have sounded like science fiction: a world system of
wireless communications to relay telephone messages across the ocean; to broadcast news, stock
market reports, private and military messages and communications, and even pictures and music to
any part of the world: "When wireless is fully applied the earth will be converted into a huge brain,
capable of response in every one of its parts..."

Tesla asked Morgan for the money he needed to start his project, but Morgan turned him down. Then
Tesla offered him 51% of the patent rights to his inventions for $150,000, and Morgan accepted. It
seems, however, that in spite of what Tesla told Morgan, his actual plan was to stage a large-scale
demonstration of electrical power transmission without wires, and this turned out to be a fatal
mistake.

By 1901 the so-called Wardenclyffe project was well under construction, the most challenging task
being the erection of an enormous tower, rising over 60 meters in the air and supporting on its top a
55 ton steel sphere. Beneath the tower, a well-like shaft plunged 40 meters into the ground, and 16
iron pipes were driven 90 meters deeper so that currents could pass through them and seize hold of
the earth. As Tesla explained: "In this system that I have invented, it is necessary for the machine
to get a grip of the earth, otherwise it cannot shake the earth. It has to have a grip... so that the
whole of this globe can quiver."

As construction of the Wardenclyffe tower slowly progressed, it became evident that more money
was needed. Tesla pleaded with Morgan for further financial support, but he refused. To make matters
worse, the stock market crashed and prices for the tower's materials doubled. High prices combined
with Tesla's inability to find enough willing investors eventually led to the demise of the project in
1905, after some amazing electrical displays.

Tesla was also a pioneer in the area of remote control (he demonstrated his remotely guided boat to
a crowd at Madison Square Garden in 1898), but he also considered systems endowed with their own
intelligence. About all these things he wrote:
I treated the whole field broadly not limiting myself to mechanics, controlled from a distance but to
machines possessed of their own intelligence. Since that time I had advanced greatly in the evolution
of the invention and think that the time is not distant when I shall show an automaton which left to
itself, will act as though possessed of reason and without any willful control from the outside.
Whatever be the practical possibilities of such an achievement it will mark the beginning of a new
epoch in mechanics.

Herbert Wells

Herbert George Wells (1866-1946) was an internationally famous English author, a prolific
writer in many genres, including contemporary novels, history, and social commentary, best known
however for his work in the science fiction genre. Wells is often referred to as The Father of Science
Fiction. Everybody knows his The Time Machine, The War of the Worlds, The Invisible Man, The
Island of Doctor Moreau, etc.

Approaching the end of his life, in 1938, he published a book of essays called World Brain (some of
the essays were first presented, with great success, as speeches in 1937), which he later described as
a book "quite bold and uncompromising in substance, but still with a distinctly propitiatory manner".
several instances throughout the book, Wells presents his idea of a universal, evolving encyclopedia,
that would help people become better informed citizens of the world.

The essay The Brain Organization of the Modern World lays out Wells's vision for "...a sort of mental
clearing house for the mind, a depot where knowledge and ideas are received, sorted, summarized,
digested, clarified and compared." Wells felt that technological advances, such as microfilm, could be
used towards this end, so that "any student, in any part of the world, will be able to sit with his
projector in his own study at his or her convenience to examine any book, any document, in an exact
replica."

In the essay titled "The World Brain: The Idea of a Permanent World Encyclopedia",
Wells explains how then-current encyclopedias failed to adapt to both the growing increase in
recorded knowledge and the expansion of people requiring information that was accurate and readily
accessible. Wells asserted the need for an entirely new world organ, that should be created for the
collection, organization, and release of knowledge. This he called the Permanent World Encyclopedia,
and it would include everything from the practical needs of society to general, global education. In
addition, he explained the importance of workers whose job it would be to continually update and
maintain this index of knowledge.

Admittedly, Wells was a writer, not a technical genius, so he could not foresee the technological base
of such a World Encyclopedia. According to him, micro-photography would be vital, as it could
provide a visual record of the knowledge the encyclopedia contains.

Wilhelm Ostwald

Friedrich Wilhelm Ostwald (1853-1932) was a German chemist, one of the founders of the field of
physical chemistry, winner of the Nobel Prize in Chemistry in 1909 for his work on catalysis, chemical
equilibria and reaction velocities.

In 1910 Ostwald was in Brussels, where he met Paul Otlet, and they discussed methods of organizing
knowledge. Ostwald had a long-standing interest in the organization of science, the relationship
between science and society, and the effective publication and use of scientific literature. He was very
interested in the efforts of Otlet and his partner LaFontaine to create their Universal Decimal
Classification and Universal Bibliographical Repertory, a catalog of all documents of all kinds,
including images.

Ostwald was so inspired by Otlet's institute that he decided to establish a similar initiative in
Germany. He invited Adolf Saager, a German writer, and Karl Wilhelm Bührer, a Swiss businessman,
and in 1911, using Ostwald's Nobel Prize money, they founded "The Bridge: International Institute for
the Organizing of Knowledge Work". The founders believed that scientific and intellectual work was
largely the result of the efforts of individuals who are geographically and otherwise isolated from each
other, so bridges are needed to connect them. Ostwald, just like Goldberg and Otlet, believed in the
need for creative interaction between science and society.

Ostwald and his friends advanced a modernist approach to the management of knowledge by seeking
to atomize literature into small components of recorded thought, much smaller than books, articles
and reports. They believed that these individual chunks of knowledge could be arranged and
linked in multiple ways, using the expanded decimal classification for the especially important and
difficult task of linking each chunk with other chunks on the same and related topics. They intended
to use sets of printed cards as the units of recorded knowledge.

A complete set of all cards would provide a comprehensive, dynamically updated, easily distributed
encyclopedia of all recorded knowledge, which in 1912 Ostwald described as a "world brain". Anyone
could then selectively assemble the set of cards that would constitute a concise summary of any field
of interest.

Ostwald and his friends called their approach to manipulating and rearranging knowledge das
Monographprinzip (the monographic principle). Their use of this principle amounted to a form of
hypertext, with a sophisticated structure of links between documents. Of course, prior to the use of
digital computers, hypertext was cumbersome and laborious.

Unfortunately, after a brief but vigorous existence, the Bridge collapsed when Ostwald's prize
money ran out.

Emanuel Goldberg

In the 1920s the German scientist Emanuel Goldberg of Zeiss Ikon, Dresden, pioneered electronic
retrieval technology and library automation. Goldberg designed, built and demonstrated a
"photoelectric microfilm selector" which contained many, if not all, of the concepts that historians of
science now associate with Vannevar Bush. It seems this was the first practical application of
electronics to the selection of data on film.

In 1914, Emanuel Goldberg developed a machine that read characters and converted them into
standard telegraph code (early OCR). Later (by May 1927) Goldberg designed a photoelectric
microfilm selector, which he called a statistical machine. Two prototypes were built at Zeiss Ikon by
1931 and, perhaps, constitute the first successful electronic document retrieval.

During the International Congress of Photography in Dresden in 1931 Goldberg presented his
"Statistical Machine," a document search engine that used photoelectric cells and pattern recognition
to search the metadata on rolls of microfilmed documents.

The Congress of Photography in Dresden in 1931 must be regarded as a peak in Goldberg's career.
The principal topic of discussion was the proposal, presented on behalf of the Committee for
Sensitometry by Goldberg and his professor Robert Luther, for a standard measure of film speeds.
This proposal led to the adoption of the familiar DIN and ASA film speed ratings. At the Congress
Goldberg gave very well received lectures and was awarded the prestigious Peligot medal of the
French Society for Photography and Cinematography.

These events seem to have overshadowed a paper that Goldberg presented at one of the technical
sessions entitled "Neue Wege der photographischen Registertechnik" ("New Methods of Photographic
Indexing"). It was a clear and concise paper describing the design of a microfilm selector using a
photoelectric cell. It is, perhaps, the first paper on electronic document retrieval and describes what
seems to have been the first functioning document retrieval system using electronics. A prototype was
also demonstrated.

In the 1920s microfilm had become popular as a storage medium for records (e.g. in banks), and all
kinds of people were busily inventing microfilm equipment. Microfilming saved storage space and
banks found that microfilming cancelled checks was a useful measure against fraud. But, since the
documents were unlikely to have been microfilmed in an order that was convenient for identifying
individual records, the question became how to search for any given document.

Obviously the best solution was to have an integral retrieval system (one which combined the index
and the document). There are two logical possibilities. One could attach frames of microfilm to the
card ("aperture cards") or one could record the logical equivalent of a punched card on to the
microfilm alongside the image of the document. One might punch holes in the film or arrange opaque
and translucent spots on the film to denote hole or no-hole. Each of these techniques was tried. The
usual form of microfilm selector technology is to create a "search card" (a punched card) or template,
bearing the coding pattern sought, and align it and the coded areas on the microfilm between a light
source and a photoelectric cell.

In the upper picture you can see the Statistical Machine's sensing mechanism. Rays from the light
source are blocked by the search card except at the holes for the code being sought.

As the microfilm bearing codes moves past the search card (see the lower picture), the coincidence of a
pattern on the microfilm matching the pattern on the search card would affect the flow of light from
the light source to the photocell and, thereby, the flow of electric current from the photocell. In this
way the desired record is identified and appropriate action, such as the creation of a copy, is triggered.
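The matching principle just described can be stated compactly: light reaches the photocell only when every hole in the search card lines up with a translucent spot in the frame's code, that is, when the search pattern is a subset of the frame's pattern. The following Python sketch is a hedged illustration of that logic only; the reel data, frame names and hole numbering are invented for the example, not taken from Goldberg's machine:

```python
def frame_matches(frame_code, search_card):
    """Light passes to the photocell only if every hole of the search
    card coincides with a translucent spot on the film: a subset test.
    Both arguments are sets of hole positions."""
    return search_card <= frame_code


def select(reel, search_card):
    """Scan the moving reel (a list of (frame_id, code-set) pairs) past
    the stationary search card and collect the matching frames."""
    return [frame_id for frame_id, code in reel
            if frame_matches(code, search_card)]


# Illustrative reel of microfilmed checks with coded index spots.
reel = [
    ("check-001", {1, 3, 5}),
    ("check-002", {2, 3, 5, 7}),
    ("check-003", {3, 5}),
]

print(select(reel, {2, 7}))  # only check-002 carries spots 2 and 7
```

A search card with fewer holes matches more frames, which mirrors how broader queries on the physical machine would trigger the photocell more often.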

When Goldberg's U.S. patent appeared in 1931 (see the patent US1838389), IBM promptly acquired a
license for it. James Bryce, the Chief Scientific Director of IBM, monitored new developments in
electronics and was interested in microfilm as a data storage medium. Later, in 1936, Bryce himself
applied for a patent for an advanced microfilm selector.

George Stibitz: remote access to a computer

The relay computers and the life of George Stibitz have already been examined in other sections of
this site. Here we will pay attention only to his pioneering work in the field of remote access to a
computer.

At the end of 1937, George Stibitz, then working at Bell Labs as a research mathematician, started to
develop his first relay-based calculator. The designs were completed in February 1938, and
construction of the machine began in April 1939, with Samuel Williams, a switching engineer at Bell
Labs, as the leading engineer. The final product (called the Complex Number Calculator, later
renamed the Model I Relay Computer) was ready in October 1939 and was first put into operation on
January 8, 1940.

The calculating unit (processor) had four registers and was completely separated from the
input/output unit, a special terminal (see the nearby photo). The computer itself was kept in an
out-of-the-way room in the labs, where few ever saw it. The operators accessed it remotely, using one
of three modified teletype machines (see the nearby photo), each consisting of a keyboard and a
printing device, connected to the processor by a multiple-wire bus and placed elsewhere; the three
terminals could not, however, be used simultaneously.

Stibitz developed further the idea of remote, multiple access to a computer. On September 11, 1940,
the American Mathematical Society met at Dartmouth College in Hanover, New Hampshire (see the
lower photo), a few hundred miles north of the Bell Labs building in New York where the Complex
Number Computer was located. Stibitz arranged to have the computer connected by telephone lines
(using a 28-wire teletype cable) to a teletype unit installed there. It was the first computing machine
ever used remotely over a phone line. The Complex Number Computer worked well, and there is no
doubt it impressed those who used it. The meeting was attended by many of America's most
prominent mathematicians, as well as individuals who later led important computing projects (e.g.,
John von Neumann, John Mauchly, G. Birkhoff and Norbert Wiener).

Later on George Stibitz remembered (in his article "Early Computers", published in A History of
Computing in the Twentieth Century, ed. N. Metropolis, J. Howlett, and Gian-Carlo Rota, New York,
1980): "In September 1940, after several months of routine use at the Laboratories, the computer
was demonstrated at a meeting of the American Mathematical Society held at Dartmouth College, in
Hanover, New Hampshire . . . I gave a short paper on the use and design of the computer after
which those attending were invited to transmit problems from a Teletype in McNutt Hall to the
computer in New York. Answers returned over the same telegraph connection and were printed out
on the Teletype."

Now there is a commemorative bronze plaque located in McNutt Hall with the text:
"IN THIS BUILDING ON SEPTEMBER 9, 1940, GEORGE ROBERT STIBITZ, THEN A
MATHEMATICIAN WITH BELL TELEPHONE LABORATORIES, FIRST DEMONSTRATED THE
REMOTE OPERATION OF AN ELECTRICAL DIGITAL COMPUTER. STIBITZ, WHO CONCEIVED
THE ELECTRICAL DIGITAL COMPUTER IN 1937 AT BELL LABS, DESCRIBED HIS INVENTION
OF THE "COMPLEX NUMBER CALCULATOR" AT A MEETING OF THE MATHEMATICAL
ASSOCIATION OF AMERICA HELD HERE. MEMBERS OF THE AUDIENCE TRANSMITTED
PROBLEMS TO THE COMPUTER AT BELL LABS IN NEW YORK CITY, AND IN SECONDS
RECEIVED SOLUTIONS TRANSMITTED FROM THE COMPUTER TO A TELETYPEWRITER IN
THIS HALL."

The MEMEX of Vannevar Bush

Vannevar Bush (1890-1974) was an American engineer, policymaker and science administrator,
known primarily for his work on analog computing and for his political role in the development of the
atomic bomb.

In 1945, in the article "As We May Think" (written in 1939 but first published in the July 1945 issue
of the magazine The Atlantic Monthly), Bush proposed a theoretical proto-hypertext system (an
electromechanical device called the memex), which has influenced the development of subsequent
hypertext and intellect-augmenting computer systems.

Bush was inspired by perceptions of need similar to those that inspired Paul Otlet, Herbert
Wells and Emanuel Goldberg. Following the expansion of scientific activity, he had come to believe
that our methods for transmitting and reviewing the results of research were no longer adequate. As
the scientific specialization needed for progress increased, "the investigator is staggered by the
findings and conclusions of thousands of other workers, conclusions which he cannot find time to
grasp, much less to remember, as they appear". It seemed to him that "publication has been extended
beyond our present ability to make real use of the record".
In his view, as an engineer and scientist, the answer was to be found in harnessing technology to
provide a sophisticated mechanical solution to the problem. Bush's idea should be viewed from the
historical perspective of microfilm technology developed prior to 1945, as Bush was involved in the
development of this technology and directed creation of a photoelectronic microfilm rapid selector at
MIT during 1938-1940.

Extrapolating from the technology of his time, Bush described a new kind of device which was a sort
of mechanized file and library. He called it a "memex" (from "memory extender"):
A memex is a device in which an individual stores all his books, records, and communications, and
which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an
enlarged intimate supplement to his memory.
It consists of a desk, and while it can presumably be operated from a distance, it is primarily the
piece of furniture at which he works. On the top are slanting translucent screens, on which material
can be projected for convenient reading. There is a keyboard, and sets of buttons and levers.
Otherwise it looks like an ordinary desk (see the lower illustration from the September, 1945, issue
of Life magazine).

Original illustration of the Memex from the Life reprint of "As We May Think"

All of the documents used in the memex would be in the form of microfilm copy, acquired as such or,
in the case of personal records, transformed to microfilm by the machine itself. The memex would
also employ new retrieval techniques based on a new kind of associative indexing, "the basic idea of
which is a provision whereby any item may be caused at will to select immediately and automatically
another", to create personal "trails" through linked documents. The new procedures that Bush
anticipated to facilitate information storage and retrieval would lead to the development of wholly
new forms of encyclopedia.

The most important mechanism conceived by Bush, and the one considered closest to modern
hypertext systems, is the associative trail. It would be a way to create a new linear sequence of
microfilm frames across any arbitrary sequence of microfilm frames by creating a chained sequence of
links in the way just described, along with personal comments and side trails.
The essential feature of the memex [is] the process of tying two items together... When the user is
building a trail, he names it in his code book, and taps it out on his keyboard. Before him are the two
items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a
number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps
a single key, and the items are permanently joined... Thereafter, at any time, when one of these
items is in view, the other can be instantly recalled merely by tapping a button below the
corresponding code space.
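The tying mechanism just quoted can be read as a simple data structure: each item carries a set of code spaces, and joining two items under a trail name writes a mutual link into those spaces. A minimal sketch, in Python, under that reading (the names `Item`, `join`, and `follow` are illustrative, not from Bush's article):

```python
class Item:
    """A document frame with blank code spaces for trail links."""
    def __init__(self, title):
        self.title = title
        self.codes = {}  # code-space name -> the Item joined under it

def join(trail_name, a, b):
    """Tap the single key: permanently join two items, in both directions."""
    a.codes[trail_name] = b
    b.codes[trail_name] = a

def follow(item, trail_name):
    """Tap the button below a code space: recall the joined item (or None)."""
    return item.codes.get(trail_name)

# Building a tiny trail between an annotated manuscript and a related image.
paper = Item("annotated manuscript")
photo = Item("related image")
join("bow-and-arrow", paper, photo)
print(follow(paper, "bow-and-arrow").title)  # -> related image
```

Chaining such pairwise joins item after item yields the linear "trail" Bush describes, with side trails branching off wherever an item carries more than one code.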

Bush's article describes neither automatic search nor any universal metadata scheme such as a standard library classification or a hypertext element set. Instead, when the user made an entry, such as a new or annotated manuscript or image, he was expected to index and describe it in his personal code book. Later, by consulting his code book, the user could retrace annotated and generated entries.

In 1965 Bush took part in MIT's Project INTREX, which aimed to develop technology for mechanizing information processing for library use. In his 1967 essay "Memex Revisited", he pointed out that the development of the digital computer, the transistor, the video, and other similar devices had heightened the feasibility of such mechanization, but that costs would delay its achievement. He was right again.

Ted Nelson, who later did pioneering work on the first practical hypertext system and coined the term "hypertext" in the 1960s, credited Bush as his main influence. Others, such as Licklider and Douglas Engelbart, have also paid homage to Bush. The modern Internet, the development and proliferation of high-density storage media, and the critical dependence of computer users on searching are resounding testimonies to Vannevar Bush's foresight.

Murray Leinster
Murray Leinster (1896-1975) was a nom de plume of William Fitzgerald Jenkins, a famous American
writer of science fiction and alternate history. He wrote and published over 1500 short stories and
articles, 14 movie scripts, and hundreds of radio scripts and television plays.

Murray Leinster's short story "A Logic Named Joe" was published in the March 1946 issue of the American science fiction magazine Astounding Science Fiction. The story actually appeared under Leinster's real name, Will F. Jenkins, since that issue of Astounding also included a story under the Leinster pseudonym, "Adapter". In this story Leinster gave one of the first descriptions of a personal computer (called a "logic") in science fiction. Moreover, he imagined a global network connecting the logics, a real forerunner of the now ubiquitous Internet.

Leinster envisioned logics in every home, linked through a distributed system of servers (called
"tanks"), to provide communications, entertainment, data access, and even commerce. One of his
characters says that logics are civilization.

The story's narrator is a logic maintenance man, nicknamed Ducky, working for the Logics Company. In the story, a logic named Joe develops some degree of sapience and ambition. Joe proceeds to switch around a few relays in the tank (a tank being one of a distributed set of central information repositories, something similar to the web servers of the World Wide Web) and to cross-correlate all information ever assembled (massive data-mining), yielding highly unexpected results. Joe then proceeds to freely disseminate all of those results to everyone on demand, simultaneously disabling all of the content-filtering protocols. Logics everywhere begin offering up unexpected assistance, proposing to solve virtually all human problems: from designing custom chemicals to alleviate inebriation, to giving sex advice to small children, to plotting the perfect murder. Information runs rampant as every logic worldwide crunches away at problems too vast in scope for human minds to have attempted. Societal chaos quickly ensues, and the situation becomes critical.

And what was Ducky supposed to do in this situation? Nothing less than to save civilization by disconnecting Joe and putting the logic down in the cellar. It was a simple and effective solution, wasn't it? Sometimes one wishes we had a similar solution ;-)

