
Who is a Butler?

A butler is traditionally the chief manservant of a large household. He is in charge of all the other servants in
the house. Butlers are usually male and traditionally oversee the male servants. They are usually in charge of the
dining room, pantry and wine cellar. Traditionally, the butler was the most experienced worker in the
household. Names such as majordomo, house manager, staff manager, chief of staff, estate manager
and head of household staff are also used to refer to a butler. The responsibilities of a butler may change
depending on the employer's lifestyle.

Responsibilities of a Butler

* Supervising and training household staff

* Serving meals and drinks, answering the door and telephone, setting the table and serving formal meals

* Taking care of the wine cellar and the valuable possessions of the house (china, crystal, etc.)

* Managing the household budget and organizing events

* Assisting with family and household security measures

In addition, butlers are expected to respect the privacy and confidentiality of their employers and to
remain unobtrusive yet available. Butler positions are typically live-in positions and may even require
travelling with the employer. Butlers may also be required to perform valet duties and light
housekeeping. Thus, they should be flexible in terms of tasks and schedule.

Who is a Valet?
A valet is traditionally a man's personal male attendant who is responsible for his clothes and appearance. Valets
are typically responsible for the clothes and personal belongings of their employer and other minor
details.

Valets are usually men. The rough female equivalent of a valet is a lady's maid. Valets are traditionally
employed by gentlemen belonging to noble or wealthy families. In grand houses, the master of the
house usually keeps a valet; if the family is very wealthy, the sons of the master may also have their own
personal valets. However, in a small household, the butler may double as the valet.

Responsibilities of a Valet
* Storing and keeping an inventory of clothing, jewellery and personal accessories

* Assisting with toiletries, dressing and hair styling

* Packing and unpacking for travel and serving light meals

* Doing light mending, pressing, polishing shoes, etc.

* Providing personal assistance to the employer

What is the difference between Butler and Valet?

* Butler is the chief manservant in a household.

* Valet is a man's personal male attendant.

(Responsibilities)
* Butler is involved in supervising the staff, taking charge of the kitchen, pantry, wine cellar and dining
room, organizing events, greeting guests, assisting with security arrangements, managing the budget
and inventory, etc.

* Valets have responsibilities like assisting with toiletries and dressing, taking care of clothes and
accessories, packing and unpacking for travel, assisting with storage and inventory of personal items,
etc.

(Authority)
* Butler is in charge of the whole household staff.

* Valet is not in charge of other staff members.

(Female Equivalent)
* Butler is roughly equivalent to a housekeeper.

* Valet is roughly equivalent to a lady's maid.

(Seniority)
* Traditionally, butlers were the most senior worker of the household.

* Valets may not be as experienced as butlers.


History of Computers
This chapter is a brief summary of the history of computers. It is supplemented by two PBS documentary videotapes,
"Inventing the Future" and "The Paperback Computer". The chapter highlights some of the advances to look for in the
documentaries.

In particular, when viewing the movies you should look for two things:
The progression in the hardware representation of a bit of data:
Vacuum Tubes (1950s) - one bit on a device the size of a thumb;
Transistors (1950s and 1960s) - one bit on a device the size of a fingernail;
Integrated Circuits (1960s and 70s) - thousands of bits on a device the size of a hand;
Silicon computer chips (1970s and on) - millions of bits on a device the size of a fingernail.

The progression of the ease of use of computers:


Almost impossible to use except by very patient geniuses (1950s);
Programmable by highly trained people only (1960s and 1970s);
Usable by just about anyone (1980s and on).

Watch these progressions to see how computers got smaller, cheaper, and easier to use.

First Computers
The first substantial computer was the giant ENIAC machine, built by John W. Mauchly and J. Presper Eckert at the University of
Pennsylvania. ENIAC (Electronic Numerical Integrator and Computer) used a word of 10 decimal digits instead of binary ones like
previous automated calculators/computers. ENIAC was also the first machine to use more than 2,000 vacuum tubes, using
nearly 18,000 of them. All those vacuum tubes and the machinery required to keep them cool took up over 167
square meters (1,800 square feet) of floor space. It had punched-card input and output and arithmetically had 1
multiplier, 1 divider/square rooter, and 20 adders employing decimal "ring counters," which served as adders and also as quick-
access (0.0002 seconds) read-write register storage.

How were computers invented?


Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer.
Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th
century. ... The machine was about a century ahead of its time.

How was the first computer invented?


We could argue that the first computer was the abacus or its descendant, the slide rule, invented by William Oughtred in 1622.
But the first computer resembling today's modern machines was the Analytical Engine, a device conceived and designed by
British mathematician Charles Babbage between 1833 and 1871.

When was the first computer used?


The first personal computer. In 1975, Ed Roberts coined the term "personal computer" when he introduced the Altair 8800.
However, many consider the first personal computer to be the KENBAK-1, which was first introduced for $750 in
1971.
When was the second computer made?
Second Generation (1956-1963) Transistors. Transistors replaced vacuum tubes and ushered in the second generation of
computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s.

Five Generations of Computers

First Generation (1940-1956) Vacuum Tubes


The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They
were very expensive to operate and in addition to using a great deal of electricity, the first computers generated a lot of heat, which was often
the cause of malfunctions.

First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform
operations, and they could only solve one problem at a time; it could take days or weeks to set up a new problem. Input was based on
punched cards and paper tape, and output was displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially available computer,
delivered to its first client, the U.S. Census Bureau, in 1951.

Second Generation (1956-1963) Transistors


Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see
widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller,
faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors.

Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum
tube. Second-generation computers still relied on punched cards for input and printouts for output.

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers
to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and
FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic
core technology.
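
To make the jump from numeric machine code to symbolic assembly concrete, here is a small, purely illustrative Python sketch. The mnemonics and opcodes below are invented for this example and do not correspond to any real machine's instruction set; the point is only that an assembler mechanically translates human-readable words into the numbers a machine actually executes.

# Illustrative only: a made-up instruction set showing why symbolic assembly was
# easier to write than raw numeric machine code. These mnemonics and opcodes are
# invented for this sketch and do not belong to any real machine.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source):
    """Translate symbolic lines like 'ADD 7' into (opcode, operand) pairs."""
    machine_code = []
    for line in source:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        machine_code.append((OPCODES[mnemonic], operand))
    return machine_code

program = ["LOAD 5", "ADD 7", "STORE 12", "HALT"]
print(assemble(program))   # -> [(1, 5), (2, 7), (3, 12), (255, 0)]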

Third Generation (1964-1971) Integrated Circuits


The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on
silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with
an operating system, which allowed the device to run many different applications at one time with a central program that monitored the
memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation (1971-Present) Microprocessors


The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What
in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the
components of the computer (from the central processing unit and memory to input/output controls) on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of
the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of
the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation (Present and Beyond) Artificial Intelligence


Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice
recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-
generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
A brief history of computers
by Chris Woodford. Last updated: November 8, 2016.

Computers truly came into their own as great inventions in the last two decades of the 20th century. But
their history stretches back more than 2500 years to the abacus: a simple calculator made from beads
and wires, which is still used in some parts of the world today. The difference between an ancient
abacus and a modern computer seems vast, but the principle (making repeated calculations more
quickly than the human brain) is exactly the same.

Cogs and Calculators


It is a measure of the brilliance of the abacus, invented in the Middle East circa 500 BC, that it remained
the fastest form of calculator until the middle of the 17th century. Then, in 1642, aged only 18, French
scientist and philosopher Blaise Pascal (1623-1666) invented the first practical mechanical calculator,
the Pascaline, to help his tax-collector father do his sums. The machine had a series of interlocking cogs
(gear wheels with teeth around their outer edges) that could add and subtract decimal numbers. Several
decades later, in 1671, German mathematician and philosopher Gottfried Wilhelm Leibniz (1646-1716)
came up with a similar but more advanced machine. Instead of using cogs, it had a "stepped drum" (a
cylinder with teeth of increasing length around its edge), an innovation that survived in mechanical
calculators for 300 years. The Leibniz machine could do much more than Pascal's: as well as
adding and subtracting, it could multiply, divide, and work out square roots. Another pioneering feature
was the first memory store or "register."

Engines of Calculation
[Illustration: how punched cards were used in early computers, from a drawing in Herman Hollerith's
"Art of Compiling Statistics" patent, January 8, 1889.]

Neither the abacus, nor the mechanical calculators constructed by Pascal and Leibniz really qualified as
computers. A calculator is a device that makes it quicker and easier for people to do sums, but it needs
a human operator. A computer, on the other hand, is a machine that can operate automatically, without
any human help, by following a series of stored instructions called a program (a kind of mathematical
recipe). Calculators evolved into computers when people devised ways of making entirely automatic,
programmable calculators.
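
To make that distinction concrete, here is a minimal Python sketch (an illustration only, not a model of any historical machine): a stored "program" is just a list of instructions that the machine works through automatically, with no operator stepping in between.

# A minimal sketch of the stored-program idea: the machine follows a list of
# instructions on its own, which a hand-operated calculator cannot do.
def run(program, value=0):
    for op, arg in program:          # each instruction is stored ahead of time
        if op == "add":
            value += arg
        elif op == "mul":
            value *= arg
    return value

# "Program": compute (0 + 5) * 3 + 2 with no human intervening between steps.
print(run([("add", 5), ("mul", 3), ("add", 2)]))   # -> 17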

Bush and the bomb


The history of computing remembers colorful characters like Babbage, but others who played
important, if supporting, roles are less well known. At the time when C-T-R was becoming IBM, the
world's most powerful calculators were being developed by US government scientist Vannevar Bush
(1890-1974). In 1925, Bush made the first of a series of unwieldy contraptions with equally
cumbersome names: the New Recording Product Integraph Multiplier. Later, he built a machine called
the Differential Analyzer, which used gears, belts, levers, and shafts to represent numbers and carry out
calculations in a very physical way, like a gigantic mechanical slide rule. Bush's ultimate calculator was
an improved machine named the Rockefeller Differential Analyzer, assembled in 1935 from 320 km (200
miles) of wire and 150 electric motors. Machines like these were known as analog calculators: analog
because they stored numbers in a physical form (as so many turns on a wheel or twists of a belt) rather
than as digits. Although they could carry out incredibly complex calculations, it took several days of
wheel cranking and belt turning before the results finally emerged.

Turingtested
Many of the pioneers of computing were hands-on experimenters, but by no means all of them. One of
the key figures in the history of 20th-century computing, Alan Turing (1912-1954) was a brilliant
Cambridge mathematician whose major contributions were to the theory of how computers processed
information. In 1936, at the age of just 23, Turing wrote a groundbreaking mathematical paper called
"On computable numbers, with an application to the Entscheidungsproblem," in which he described a
theoretical computer now known as a Turing machine (a simple information processor that works
through a series of instructions, reading data, writing results, and then moving on to the next
instruction). Turing's ideas were hugely influential in the years that followed and many people regard
him as the father of modern computing, the 20th century's equivalent of Babbage.
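
The description above can be turned into a toy simulator. The following Python sketch is illustrative only: the transition table (which flips 0s and 1s and halts at a blank) is a made-up example rather than one of Turing's, but it shows the read, write, move, and change-state cycle that defines a Turing machine.

# A toy Turing machine simulator: read a symbol, write a symbol, move the head,
# and switch state, one instruction at a time. The rules below are illustrative.
def turing_machine(tape, rules, state="start", blank=" "):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape)

rules = {
    ("start", "0"): ("1", "R", "start"),   # flip 0 to 1, keep scanning right
    ("start", "1"): ("0", "R", "start"),   # flip 1 to 0, keep scanning right
    ("start", " "): (" ", "R", "halt"),    # blank square: stop
}
print(turing_machine("01101", rules))      # -> "10010 "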

The microelectronic revolution


Vacuum tubes were a considerable advance on relay switches, but machines like the ENIAC were
notoriously unreliable. The modern term for a problem that holds up a computer program is a "bug."
Popular legend has it that this word entered the vocabulary of computer programmers sometime in the
1950s when moths, attracted by the glowing lights of vacuum tubes, flew inside machines like the
ENIAC, caused a short circuit, and brought work to a juddering halt. But there were other problems with
vacuum tubes too. They consumed enormous amounts of power: the ENIAC used about 2000 times as
much electricity as a modern laptop. And they took up huge amounts of space. Military needs were
driving the development of machines like the ENIAC, but the sheer size of vacuum tubes had now
become a real problem. ABC had used 300 vacuum tubes, Colossus had 2000, and the ENIAC had 18,000.
The ENIAC's designers had boasted that its calculating speed was "at least 500 times as great as that of
any other existing computing machine." But developing computers that were an order of magnitude
more powerful still would have needed hundreds of thousands or even millions of vacuum tubeswhich
would have been far too costly, unwieldy, and unreliable. So a new technology was urgently required.

Personal computers
By 1974, Intel had launched a popular microprocessor known as the 8080 and computer hobbyists were
soon building home computers around it. The first was the MITS Altair 8800, built by Ed Roberts. With
its front panel covered in red LED lights and toggle switches, it was a far cry from modern PCs and
laptops. Even so, it sold by the thousand and earned Roberts a fortune. The Altair inspired a Californian
electronics wizard named Steve Wozniak (born 1950) to develop a computer of his own. "Woz" is often
described as the hacker's "hacker": a technically brilliant and highly creative engineer who pushed the
boundaries of computing largely for his own amusement. In the mid-1970s, he was working at the
Hewlett-Packard computer company in California, and spending his free time tinkering away as a
member of the Homebrew Computer Club in the Bay Area.

The user revolution


Fortunately for Apple, it had another great idea. One of the Apple II's strongest suits was its sheer "user-
friendliness." For Steve Jobs, developing truly easy-to-use computers became a personal mission in the
early 1980s. What truly inspired him was a visit to PARC (Palo Alto Research Center), a cutting-edge
computer laboratory then run as a division of the Xerox Corporation. Xerox had started developing
computers in the early 1970s, believing they would make paper (and the highly lucrative photocopiers
Xerox made) obsolete. One of PARC's research projects was an advanced $40,000 computer called the
Xerox Alto. Unlike most microcomputers launched in the 1970s, which were programmed by typing in
text commands, the Alto had a desktop-like screen with little picture icons that could be moved around
with a mouse: it was the very first graphical user interface (GUI, pronounced "gooey"), an idea
conceived by Alan Kay (born 1940) and now used in virtually every modern computer. The Alto borrowed
some of its ideas, including the mouse, from 1960s computer pioneer Douglas Engelbart (1925-2013).

From nets to the Internet


Standardized PCs running standardized software brought a big benefit for businesses: computers could
be linked together into networks to share information. At Xerox PARC in 1973, electrical engineer Bob
Metcalfe (born 1946) developed a new way of linking computers "through the ether" (empty space) that he
called Ethernet. A few years later, Metcalfe left Xerox to form his own company, 3Com, to help
companies realize "Metcalfe's Law": computers become useful the more closely connected they are to
other people's computers. As more and more companies explored the power of local area networks
(LANs), so, as the 1980s progressed, it became clear that there were great benefits to be gained by
connecting computers over even greater distances, into so-called wide area networks (WANs).
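
Metcalfe's Law is usually summarized as saying that a network's usefulness grows roughly with the square of the number of machines connected to it, since n computers can form n(n-1)/2 distinct links. A quick Python sketch of that arithmetic (the figures are illustrative counts of possible connections, not a real measurement of value):

# Rough illustration of Metcalfe's Law: the number of possible pairwise links
# (a proxy for a network's usefulness) grows roughly with the square of the
# number of connected computers.
def possible_links(n):
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(n, "computers ->", possible_links(n), "possible connections")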

Today, the best known WAN is the Internet, a global network of individual computers and LANs that
links up hundreds of millions of people. The history of the Internet is another story, but it began in the
1960s when four American universities launched a project to connect their computer systems together
to make the first WAN. Later, with funding from the Department of Defense, that network became a
bigger project called ARPANET (Advanced Research Projects Agency Network). In the mid-1980s, the US
National Science Foundation (NSF) launched its own WAN called NSFNET. The convergence of all these
networks produced what we now call the Internet later in the 1980s. Shortly afterward, the power of
networking gave British computer programmer Tim Berners-Lee (born 1955) his big idea: to combine the
power of computer networks with the information-sharing idea Vannevar Bush had proposed in 1945.
Thus was born the World Wide Web, an easy way of sharing information over a computer network. It's
Tim Berners-Lee's invention that brings you this potted history of computing today.
A Brief History of the Computer
Computers and computer applications are present in almost every aspect of our daily lives. Like many
ordinary objects around us, we may need a clearer understanding of what they are. You may ask "What is
a computer?", "What is software?", or "What is a programming language?" First, let's examine the
history.

1. The history of computers starts out about 2000 years ago in Babylonia (Mesopotamia), at the birth of
the abacus, a wooden rack holding two horizontal wires with beads strung on them.

2. Blaise Pascal is usually credited with building the first digital calculator in 1642. It added numbers
entered with dials and was made to help his father, a tax collector.

The basic principle of his calculator is still used today in water meters and modern-day odometers.
Instead of having a carriage wheel turn the gear, he made each ten-teeth wheel accessible to be turned
directly by a person's hand (later inventors added keys and a crank), with the result that when the
wheels were turned in the proper sequences, a series of numbers was entered and a cumulative sum
was obtained. The gear train supplied a mechanical answer equal to the answer that is obtained by using
arithmetic.
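
The carrying behaviour described above is easy to sketch in code. The following Python snippet is an illustration of the idea only, not a model of Pascal's actual mechanism: each "wheel" holds a digit from 0 to 9, and turning a wheel past 9 nudges the next wheel along, so repeated entries accumulate into a sum.

# Illustrative sketch of decimal wheels with carry (not Pascal's actual design):
# each wheel holds a digit 0-9; passing 9 carries one step into the next wheel.
def turn(wheels, position, teeth):
    """Advance one wheel by `teeth` steps, propagating carries leftward."""
    total = wheels[position] + teeth
    wheels[position] = total % 10
    carry = total // 10
    if carry and position + 1 < len(wheels):
        turn(wheels, position + 1, carry)

wheels = [0, 0, 0, 0]          # units, tens, hundreds, thousands
for amount in (7, 8, 250):     # enter 7, then 8, then 250
    turn(wheels, 0, amount % 10)
    turn(wheels, 1, (amount // 10) % 10)
    turn(wheels, 2, (amount // 100) % 10)
print(list(reversed(wheels)))  # -> [0, 2, 6, 5], i.e. a cumulative sum of 265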

This first mechanical calculator, called the Pascaline, had several disadvantages. Although it did offer a
substantial improvement over manual calculations, only Pascal himself could repair the device and it
cost more than the people it replaced! In addition, the first signs of technophobia emerged with
mathematicians fearing the loss of their jobs due to progress.

3. A step towards automated computing was the development of punched cards, which were first
successfully used with computers in 1890 by Herman Hollerith and James Powers, who worked for the
U.S. Census Bureau. They developed devices that could read the information that had been punched into
the cards automatically, without human help. Because of this, reading errors were reduced dramatically,
work flow increased, and, most importantly, stacks of punched cards could be used as easily accessible
memory of almost unlimited size. Furthermore, different problems could be stored on different stacks of
cards and accessed when needed.

4. These advantages were seen by commercial companies and soon led to the development of improved
punch-card computers created by International Business Machines (IBM), Remington (yes, the
same people that make shavers), Burroughs, and other corporations. These computers used
electromechanical devices in which electrical power provided mechanical motion -- like turning the
wheels of an adding machine. Such systems included features to:

* feed in a specified number of cards automatically

* add, multiply, and sort

* feed out cards with punched results


5. The start of World War II produced a large need for computer capacity, especially for the military.
New weapons were made for which trajectory tables and other essential data were needed. In 1942,
John P. Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering at the
University of Pennsylvania decided to build a high-speed electronic computer to do the job. This
machine became known as ENIAC (Electronic Numerical Integrator and Computer).

[Photo: Two men in uniform being trained to maintain the ENIAC; the two women in the photo were
programmers. The ENIAC occupied an entire thirty-by-fifty-foot room.]

The size of ENIAC's numerical "word" was 10 decimal digits, and it could multiply two of these
numbers at a rate of 300 per second by finding the value of each product from a multiplication table
stored in its memory. ENIAC was therefore about 1,000 times faster than the previous generation of
relay computers. ENIAC used 18,000 vacuum tubes, occupied about 1,800 square feet of floor space, and
consumed about 180,000 watts of electrical power. It had punched-card I/O, 1 multiplier, 1
divider/square rooter, and 20 adders using decimal ring counters, which served as adders and also as
quick-access (0.0002 seconds) read-write register storage. The executable instructions making up a
program were embodied in the separate "units" of ENIAC, which were plugged together to form a
"route" for the flow of information.

6. Early in the 1950s two important engineering discoveries changed the image of the electronic-computer
field from one of fast but unreliable hardware to one of relatively high reliability and
even more capability. These discoveries were the magnetic-core memory and the transistor circuit
element.

These technical discoveries quickly found their way into new models of digital computers. RAM
capacities increased from 8,000 to 64,000 words in commercially available machines by the 1960s,
with access times of 2 to 3 microseconds. These machines were very expensive to purchase or even
to rent and were particularly expensive to operate because of the cost of programming. Such
computers were mostly found in large computer centers operated by industry, government, and private
laboratories, staffed with many programmers and support personnel. This situation led to modes of
operation that enabled this expensive computing capacity to be shared.

7. Many companies, such as Apple Computer and Radio Shack, introduced very successful PCs in the
1970s, encouraged in part by a fad in computer (video) games. In the 1980s some friction occurred in
the crowded PC field, with Apple and IBM remaining strong. In the manufacturing of semiconductor chips,
the Intel and Motorola Corporations were very competitive into the 1980s, although Japanese firms
were making strong economic advances, especially in the area of memory chips. By the late 1980s, some
personal computers were run by microprocessors that, handling 32 bits of data at a time, could process
about 4,000,000 instructions per second.
