
Materials selected by: Assoc. Prof. Dr. Eng. Florin Leon

Introduction

Ask any student who has had some programming experience the following question: You are
given a problem for which you have to build a software system that most students feel will be
approximately 10,000 lines of (say C# or Java) code. If you are working full time on it, how
long will it take you to build this system?
The answer of students is generally 1 to 3 months. And, given the programming expertise
of the students, there is a good chance that they will be able to build a system and demo it to
the Professor within 2 months. With 2 months as the completion time, the productivity of the
student will be 5,000 lines of code (LOC) per person-month.
Now let us take an alternative scenario: we act as clients and pose the same problem to a
company that is in the business of developing software for clients. Though there is no "standard"
productivity figure and it varies a lot, it is fair to say that a productivity figure of 1,000 LOC per
person-month is quite respectable (though it can be as low as 100 LOC per person-month for
embedded systems). With this productivity, a team of professionals in a software organization
will take 10 person-months to build this software system.
Why this difference in productivity in the two scenarios? Why is it that the same students
who can produce software at a productivity of a few thousand LOC per month while in college
end up producing only about a thousand LOC per month when working in a company? Why
is it that students seem to be more productive in their student days than when they become
professionals?
The answer, of course, is that two different things are being built in the two scenarios. In
the first, a student system is being built whose main purpose is to demo that it works. In the
second scenario, a team of professionals in an organization is building the system for a client
who is paying for it, and whose business may depend on the proper working of the system. As
should be evident, building the latter type of software is a different problem altogether. It is
this problem in which software engineering is interested.
In software engineering we are not dealing with programs that people build to illustrate
something or as a hobby (which we are referring to as student systems). Instead, the problem
domain is software that solves some problem of some users, where larger systems or businesses
may depend on the software, and where problems in the software can lead to significant direct
or indirect loss. We refer to this software as industrial strength software. Let us first discuss
the key difference between student software and industrial strength software.
When computer software succeeds - when it meets the needs of the people who use it,
when it performs flawlessly over a long period of time, when it is easy to modify and even
easier to use - it can and does change things for the better. But when software fails - when its
users are dissatisfied, when it is error prone, when it is difficult to change and even harder to
use - bad things can and do happen. We all want to build software that makes things better,
avoiding the bad things that lurk in the shadow of failed efforts. To succeed, we need discipline
when software is designed and built. We need an engineering approach.
Although managers and practitioners alike recognize the need for a more disciplined approach
to software, they continue to debate the manner in which discipline is to be applied. Many
individuals and companies still develop software haphazardly, even as they build systems to
service the most advanced technologies of the day. Many professionals and students are unaware
of modern methods. As a result, the quality of the software that we produce suffers and bad
things happen. In addition, debate and controversy about the true nature of the software
engineering approach continue. The status of software engineering is a study in contrasts.
Attitudes have changed, progress has been made, but much remains to be done before the
discipline reaches full maturity.

Florin Leon, Ingineria programarii, http://florinleon.byethost24.com/curs_ip.htm

Florin Leon, Ingineria programarii - Suport de curs, http://florinleon.byethost24.com/curs_ip.htm

IP01. Introduction to Software Engineering
Lecture Notes

1.1 The Definition of Software Engineering

The Standard Glossary of Software Engineering Terminology (1990) of the IEEE (Institute of Electrical and Electronics Engineers) contains the following definition:
Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software.
It encompasses techniques and procedures, often regulated by a software development
process, with the purpose of improving the reliability and maintainability of software systems. The effort is necessitated by the potential complexity of those systems, which may contain
millions of lines of code. The discipline of software engineering includes knowledge, tools, and
methods for software requirements, software design, software construction, software testing,
and software maintenance tasks. Software engineering is related to the disciplines of computer
science, computer engineering, management, mathematics, quality management, and systems
engineering.

1.2 Industrial-Strength Software

A student system is primarily meant for demonstration purposes; it is generally not used
for solving any real problem of any organization. Consequently, nothing of significance or
importance depends on proper functioning of the software. Because nothing of significance
depends on the software, the presence of "bugs" (or defects or faults) is not a major concern.
Hence the software is generally not designed with quality issues like portability, robustness,
reliability, and usability in mind. Also, the student software system is generally used by the
developer him- or herself, therefore the need for documentation is nonexistent, and again bugs
are not critical issues as the user can fix them as and when they are found.
An industrial strength software system, on the other hand, is built to solve some problem of
a client and is used by client organizations for operating some part of their business (we use the
term "business" in a very broad sense - it may be managing inventories or finances, monitoring
patients, air traffic control, etc.). In other words, important activities depend on the correct
functioning of the system. And a malfunction of such a system can have a huge impact in terms
of financial or business loss, inconvenience to users, or loss of property and life. Consequently,
the software system needs to be of high quality with respect to properties such as dependability,
reliability, user-friendliness, etc.
This requirement of high quality has many ramifications. First, it requires that the software
be thoroughly tested before being used. The need for rigorous testing increases the cost
considerably. In an industrial strength software project, 30% to 50% of the total effort may be
spent in testing (while in a student software project even 5% may be too high).
Second, building high quality software requires that the development be broken into phases
such that the output of each phase is evaluated and reviewed so bugs can be removed. This
desire to partition the overall problem into phases and identify defects early requires more
documentation, standards, processes, etc. All these increase the effort required to build the
software; hence the productivity of producing industrial strength software is generally much
lower than that of producing student software.
Industrial strength software also has other properties which do not exist in student software
systems. Typically, for the same problem, the detailed requirements of what the software should
do increase considerably. Besides quality requirements, there are requirements of backup and
recovery, fault tolerance, adherence to standards, portability, etc. These generally have the
effect of making the software system more complex and larger. The size of the industrial
strength software system may be two times or more that of the student system for the same
problem.
Overall, if we assume 1/5 the productivity, and an increase in size by a factor of 2 for the same
problem, an industrial strength software system will take about 10 times as much effort to build
as a student software system for the same problem. This is what Fred Brooks' rule of thumb
says: industrial strength software may cost about 10 times as much as the student software. The
software industry is largely interested in developing industrial strength software, and the area
of software engineering focuses on how to build such systems.


1.3 The Definition of Software

IEEE also has a definition for software:
Software is the collection of computer programs, procedures, rules, and associated documentation and data.
This definition clearly states that software is not just programs, but includes all the associated
documentation and data. This implies that the discipline dealing with the development of
software should not deal only with developing programs, but with developing all the things
that constitute software.

1.4 The Challenges of Software Engineering

According to our definition, software engineering is the systematic approach to the development,
operation, maintenance, and retirement of software. The use of the term systematic approach
for the development of software implies that methodologies are used for developing software
which are repeatable. That is, if the methodologies are applied by different groups of people,
similar software will be produced. In essence, the goal of software engineering is to take
software development closer to science and engineering and away from ad-hoc approaches whose
outcomes are not predictable but which have been used heavily in the past and still continue
to be used for developing software.
As mentioned, industrial strength software is meant to solve some problem of the client (we
use the term client in a very general sense, meaning the people whose needs are to be satisfied
by the software). The problem therefore is to develop software to satisfy the needs of some
users or clients. This fundamental problem that software engineering deals with is shown in
Figure 1.2.
Though the basic problem is to systematically develop software to satisfy the client, there
are some factors which affect the approaches selected to solve the problem. These factors are
the primary forces that drive the progress and development in the field of software engineering.
We consider these as the primary challenges for software engineering and discuss some of the
key ones here.


1.4.1 The Cost of Software

Industrial strength software is very expensive primarily due to the fact that software development
is extremely labor-intensive. To get an idea of the costs involved, let us consider the
current state of practice in the industry. Lines of code (LOC) or thousands of lines of code
(KLOC) delivered is the most commonly used measure of software size in the industry. As the
main cost of producing software is the manpower employed, the cost of developing software is
generally measured in terms of person-months of effort spent in development. And productivity
is frequently measured in the industry in terms of LOC (or KLOC) per person-month.
The productivity in the software industry for writing fresh code generally ranges from 300
to 1,000 LOC per person-month. That is, for developing software, the average productivity per
person, per month, over the entire development cycle is about 300 to 1,000 LOC. And software
companies charge the client for whom they are developing the software upwards of $100,000
per person-year, or more than $8,000 per person-month (which comes to about $50 per hour).
With the current productivity figures of the industry, this translates into a cost per line of code
of approximately $8 to $25. In other words, each line of delivered code costs between $8 and
$25 at current costs and productivity levels! And even small projects can easily end up with
software of 50,000 LOC. With this productivity, such a software project will cost between $0.5
million and $1.25 million!
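To make the arithmetic concrete, here is a minimal sketch of the calculation. The project size and billing rate are the figures quoted above; the productivity of 500 LOC per person-month is an assumed mid-range value, not a number from the text.

// Back-of-the-envelope effort and cost estimate for a software project.
// The size and rate are the illustrative figures used in the text; the
// productivity is an assumed mid-range value.
public class CostEstimate {
    public static void main(String[] args) {
        int sizeLoc = 50_000;              // delivered size in lines of code
        int productivityLocPm = 500;       // LOC per person-month (300-1,000 is typical)
        double ratePerPersonMonth = 8_000; // amount charged per person-month

        double effortPersonMonths = (double) sizeLoc / productivityLocPm;
        double totalCost = effortPersonMonths * ratePerPersonMonth;
        double costPerLoc = totalCost / sizeLoc;

        System.out.printf("Effort: %.0f person-months%n", effortPersonMonths);
        System.out.printf("Total cost: $%,.0f%n", totalCost);
        System.out.printf("Cost per LOC: $%.0f%n", costPerLoc);
    }
}

Running it with productivities of 300 and 1,000 LOC per person-month approximately reproduces the $8 to $25 per line range quoted above.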
Given the current computing power of machines, such software can easily be hosted on a
workstation or a small server. This implies that software that can cost more than a million
dollars can run on hardware that costs at most tens of thousands of dollars, clearly showing
that the cost of hardware on which such an application can run is a fraction of the cost of the
application software! This example clearly shows that not only is software very expensive, it
indeed forms the major component of the total automated system, with the hardware forming
a very small component. This is shown in the classic hardware-software cost reversal chart in
Figure 1.1.
As Figure 1.1 shows, in the early days the cost of hardware used to dominate the system
cost. As the cost of hardware has lessened over the years and continues to decline, and as
the power of hardware doubles every 2 years or so (Moore's law), enabling larger software
systems to be run on it, the cost of software has now become the dominant factor in systems.


1.4.2 Delays and Unreliability

Despite considerable progress in techniques for developing software, software development remains
a weak area. In a survey of over 600 firms, more than 35% reported having some
computer-related development project that they categorized as a runaway. A runaway is not
a project that is somewhat late or somewhat over budget - it is one where the budget and
schedule are out of control. The problem has become so severe that it has spawned an industry
of its own; there are consultancy companies that advise how to rein in such projects, and one such
company had more than $30 million in revenues from more than 20 clients.
The Standish Group published a statistic of the project finalization status for IT projects
in the US, displayed in the following table and chart.

Project finalization status (table)

Project finalization status (chart)



Similarly, a large number of instances have been quoted regarding the unreliability of software;
the software does not do what it is supposed to do or does something it is not supposed
to do. In one defense survey, it was reported that more than 70% of all the equipment failures
were due to software! And this is in systems that are loaded with electrical, hydraulic, and
mechanical systems. This just indicates that all other engineering disciplines have advanced far
more than software engineering, and a system comprising the products of various engineering
disciplines finds that software is the weakest component. Many banks have lost millions of
dollars due to inaccuracies and other problems in their software.
A note about the cause of unreliability in software: software failures are different from
failures of, say, mechanical or electrical systems. Products of these other engineering disciplines
fail because of the change in physical or electrical properties of the system caused by aging. A
software product, on the other hand, never wears out due to age. In software, failures occur due
to bugs or errors that get introduced during the design and development process. Hence, even
though a software system may fail after operating correctly for some time, the bug that causes
that failure was there from the start! It only got executed at the time of the failure. This is
quite different from other systems, where if a system fails, it generally means that sometime
before the failure the system developed some problem (due to aging) that did not exist earlier.
A series of "famous" software defects is presented below (small Java sketches of two of the underlying failure modes follow the list):
28 July 1962 - Mariner I space probe. A bug in the flight software for the Mariner 1, a
Venus flyby mission, caused the rocket to divert from its intended path on launch. Mission
control destroyed the rocket over the Atlantic Ocean. The investigation into the accident
discovered that a formula written on paper in pencil was improperly transcribed into
computer code, causing the computer to miscalculate the rocket's trajectory. The program
received from Earth for correcting the orbit contained the line DO 3 I = 1.3; the correct
Fortran statement should have contained a comma instead of the period (DO 3 I = 1,3) -
with the period, the line is read as an assignment to a variable named DO3I rather than
as a loop header;
1982 - Soviet gas pipeline. Operatives working for the Central Intelligence Agency allegedly
planted a bug in a Canadian computer system purchased to control the Trans-Siberian gas
pipeline. The Soviets had obtained the system as part of a wide-ranging effort to covertly
purchase or steal sensitive U.S. technology. The CIA reportedly found out about the program
and decided to make it backfire with equipment that would pass Soviet inspection and then
fail once in operation. The resulting event was reportedly the largest non-nuclear explosion
in the planet's history;


1983 - Software bugs in a Soviet early-warning monitoring system nearly brought on
nuclear war. The software was supposed to filter out false missile detections caused by
Soviet satellites picking up sunlight reflections off cloud-tops, but failed to do so. Disaster
was averted when a Soviet commander, based on what he said was a "...funny feeling in
my gut", decided the apparent missile attack was a false alarm;
1985-1987 - Therac-25 medical accelerator. A radiation therapy device malfunctioned
and delivered lethal radiation doses at several medical facilities. Based upon a
previous design, the Therac-25 was an "improved" therapy system that could deliver two
different kinds of radiation: either a low-power electron beam (beta particles) or X-rays.
Because of a subtle bug called a "race condition" (a flaw where the output or result of
the process is unexpectedly and critically dependent on the sequence or timing of other
events - the term originates with the idea of two signals racing each other to influence
the output first), a quick-fingered typist could accidentally configure the Therac-25 so the
electron beam would fire in high-power mode but with the X-ray target out of position.
At least 5 patients died, and others were seriously injured;
1988-1996 - Kerberos random number generator. The authors of the Kerberos security
system (a computer network authentication protocol) neglected to properly "seed"
the program's random number generator with a truly random seed. As a result, for 8
years it was possible to trivially break into any computer that relied on Kerberos for
authentication. It is unknown if this bug was ever actually exploited;
15 January 1990 - AT&T network outage. A bug in a new release of the software that
controlled AT&T's long distance switches caused these mammoth computers to crash when
they received a specific message from one of their neighboring machines - a message that
the neighbors sent out when they recovered from a crash. One day a switch in New York
crashed and rebooted, causing its neighboring switches to crash, then their neighbors'
neighbors, and so on. Soon, 114 switches were crashing and rebooting every 6 seconds,
leaving an estimated 60,000 people without long distance service for 9 hours. The fix:
engineers loaded the previous software release;
25 February 1991 - Patriot missile bug. During the Gulf War, Operation Desert Storm
used sophisticated technology to end the war in a quick and timely manner. Part of this
technology was the Patriot missile air defence system. On the night of 25 February 1991,
a Patriot missile system operating in Dhahran, Saudi Arabia, failed to track and intercept
an incoming Scud. The Iraqi missile impacted into an army barracks, killing 28 U.S.
soldiers and injuring another 98. The cause of the missile system failing to defend against
the incoming Scud was traced back to a bug in the Patriot's radar and tracking software:
a rounding error;
15 October 1989 - Dallas-Fort Worth Airport air traffic control systems. Fans were headed
for the Oklahoma-Texas football game; this is one of the busiest days of the year, every year.
At 9:45 AM, the antiquated computers at DFW approach control shut down because an
operator had entered what should have been a routine command, but from an unauthorized
terminal. The alarm rang and 3 seconds later the screens in front of the controllers went
blank. The 1989 programs had only 16,000 lines of code; today's air traffic control software
has 2 million. (Empirical studies of computer programs have shown that the number of
bugs in software varies with the logarithm of the number of lines of code. At that rate, the
number of bugs in the new computers should be 5 times greater than in the 1989 code);


4 June 1996 - Ariane 5 Flight 501. Working code for the European Ariane 4 rocket was
reused in the Ariane 5, but the Ariane 5's faster engines triggered a bug in an arithmetic
routine inside the rocket's flight computer. The error was in the code that converted a
64-bit floating-point number to a 16-bit signed integer. The faster engines caused the
64-bit numbers to be larger in the Ariane 5 than in the Ariane 4, triggering an overflow
condition - an unhandled exception that resulted in the flight computer crashing. As a
result, the rocket's primary processor overpowered the rocket's engines and caused the
rocket to disintegrate 40 seconds after launch;
November 2000 - National Cancer Institute, Panama City. In a series of accidents, therapy
planning software created by Multidata Systems International, a U.S. firm, miscalculated
the proper dosage of radiation for patients undergoing radiation therapy. Multidata's
software allowed a radiation therapist to draw on a computer screen the placement
of metal shields called "blocks" designed to protect healthy tissue from the radiation.
But the software would only allow technicians to use four shielding blocks, and the
Panamanian doctors wished to use five. The doctors discovered that they could trick the
software by drawing all five blocks as a single large block with a hole in the middle. What
the doctors didn't realize is that the Multidata software gave different answers in this
configuration depending on how the hole was drawn: draw it in one direction and the
correct dose is calculated, draw it in another direction and the software recommends twice
the necessary exposure. At least 8 patients died, while another 20 received overdoses likely
to cause significant health problems. The physicians, who were legally required to
double-check the computer's calculations by hand, were indicted for murder.
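The race condition mentioned in the Therac-25 item can be illustrated with a deliberately unsynchronized counter. This is a generic sketch of the failure mode, not the Therac-25 code:

// Two threads increment a shared counter without synchronization.
// The read-modify-write is not atomic, so updates can be lost; the
// final value depends on thread timing, i.e., a race condition.
public class RaceConditionDemo {
    static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++; // not atomic: load, add, store
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Usually prints less than 200000; the result varies from run to run.
        System.out.println("Counter = " + counter);
    }
}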
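The numeric and seeding failures mentioned in the Ariane 5, Patriot, and Kerberos items can also be reproduced in miniature. The sketch below is generic Java with made-up variable names and values; it is not the original flight, tracking, or protocol code:

import java.util.Random;

// Miniature reproductions of three failure modes from the list above.
public class NumericBugsDemo {
    public static void main(String[] args) {
        // Ariane 5 style: narrowing a large 64-bit floating-point value
        // into a 16-bit integer produces a meaningless value (Java wraps
        // silently; the Ada code raised an unhandled exception instead).
        double horizontalBias = 40_000.0;       // illustrative value
        short converted = (short) horizontalBias;
        System.out.println("64-bit " + horizontalBias + " -> 16-bit " + converted);

        // Patriot style: 0.1 has no exact binary representation, so adding
        // it once per clock tick slowly drifts away from the true time.
        double clock = 0.0;
        for (int tick = 0; tick < 1_000_000; tick++) {
            clock += 0.1;
        }
        System.out.println("Accumulated: " + clock + " (expected 100000.0)");

        // Kerberos style: a generator seeded with a constant produces the
        // same "random" sequence on every run, so an attacker can predict it.
        Random predictable = new Random(42);
        System.out.println("First 'random' value: " + predictable.nextInt());
    }
}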
1.4.3 Maintenance and Rework
Once the software is delivered and deployed, it enters the maintenance phase. Why is maintenance
needed for software, when software does not age? Software needs to be maintained not
because some of its components wear out and need to be replaced, but because there are often
some residual errors remaining in the system that must be removed as they are discovered. It
is commonly believed that the state of the art today is such that almost all software that is
developed has residual errors, or bugs, in it. Many of these surface only after the system has been
in operation, sometimes for a long time. These errors, once discovered, need to be removed,
leading to the software being changed. This is sometimes called corrective maintenance.
Even without bugs, software frequently undergoes change. The main reason is that software
often must be upgraded and enhanced to include more features and provide more services. This
also requires modification of the software. It has been argued that once a software system is
deployed, the environment in which it operates changes. Hence, the needs that initiated the
software development also change to reflect the needs of the new environment. Hence, the
software must adapt to the needs of the changed environment. The changed software then
changes the environment, which in turn requires further change. This phenomenon is sometimes
called the "law of software evolution". Maintenance due to this phenomenon is sometimes called
adaptive maintenance.
Though maintenance is not considered a part of software development, it is an extremely
important activity in the life of a software product. If we consider the total life of software, the
cost of maintenance generally exceeds the cost of developing the software! The maintenance-to-development-cost
ratio has been variously suggested as 80:20, 70:30, or 60:40. Figure 1.1 also
shows how the maintenance costs are increasing.





Maintenance work is based on existing software, as compared to development work that
creates new software. Consequently, maintenance revolves around understanding existing software,
and maintainers spend most of their time trying to understand the software they have
to modify. Understanding the software involves understanding not only the code but also the
related documents. During the modification of the software, the effects of the change have
to be clearly understood by the maintainer, because introducing undesired side effects in the
system during modification is easy. To check that those aspects of the system that are not
supposed to be modified are operating as they were before modification, regression testing is
done. Regression testing involves executing old test cases to verify that no new errors have been
introduced.
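As a minimal illustration of regression testing, the sketch below re-runs previously recorded test cases against a hypothetical function under maintenance and reports any output that no longer matches the recorded expected value; both the function and the test data are invented for the example:

import java.util.Map;

// A toy regression suite: re-run previously recorded test cases against the
// modified code and report any output that no longer matches expectations.
public class RegressionSuite {
    // The function being maintained (hypothetical example).
    static int discountPercent(int orderValue) {
        if (orderValue >= 1000) return 10;
        if (orderValue >= 500) return 5;
        return 0;
    }

    public static void main(String[] args) {
        // Old test cases: input -> expected output recorded before the change.
        Map<Integer, Integer> oldTestCases = Map.of(
                100, 0,
                500, 5,
                999, 5,
                1000, 10);

        int failures = 0;
        for (Map.Entry<Integer, Integer> testCase : oldTestCases.entrySet()) {
            int actual = discountPercent(testCase.getKey());
            if (actual != testCase.getValue()) {
                failures++;
                System.out.println("Regression: input " + testCase.getKey()
                        + " expected " + testCase.getValue() + " but got " + actual);
            }
        }
        System.out.println(failures == 0 ? "No regressions detected."
                                         : failures + " regression(s) detected.");
    }
}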
Thus, maintenance involves understanding the existing software (code and related documents),
understanding the effects of change, making the changes to both the code and the
documents, testing the new parts, and retesting the old parts that were not changed. Because
the needs of the maintainers are often not kept in mind during development, few support
documents are produced during development to help the maintainer. The complexity of the
maintenance task, coupled with the neglect of maintenance concerns during development, makes
maintenance the most costly activity in the life of a software product.
Maintenance is one form of change that typically is done after the software development is
completed and the software has been deployed. However, there are other forms of changes that
lead to rework during the software development itself.
One of the biggest problems in software development, particularly for large and complex
systems, is that what is desired from the software (i.e., the set of requirements) is not well
understood. To completely specify the requirements, all the functionality, interfaces, and
constraints have to be specified before software development has commenced! In other words,
for specifying the requirements, the clients and the developers have to visualize what the
software behavior should be once it is developed. This is very hard to do, particularly for large
and complex systems. So, what generally happens is that development proceeds when it is
believed that the requirements are generally in good shape. However, as time goes by and the
understanding of the system improves, the clients frequently discover additional requirements
they had not specified earlier. This leads to the requirements getting changed, which in turn
leads to rework: the requirements, the design, and the code all have to be changed to
accommodate the new or changed requirements.
Just uncovering requirements that were not understood earlier is not the only reason for
this change and rework. Software development of large and complex systems can take a few
years. And with the passage of time, the needs of the clients change. After all, the current
needs, which initiate the software product, are a reflection of current times. As times change,
so do the needs. And, obviously, the clients want the system deployed to satisfy their most
current needs. This change of needs while the development is going on also leads to rework.
In fact, changing requirements and the associated rework are a major problem of the software
industry. It is estimated that rework costs are 30 to 40% of the development cost. In other
words, of the total development effort, rework due to various changes consumes about 30 to
40% of the effort! No wonder change and rework are a major contributor to the software crisis.
However, unlike the issues discussed earlier, the problem of rework and change is not just a
reflection of the state of software development, as changes are frequently initiated by clients as
their needs change.

1.4.4 Scale

A fundamental factor that software engineering must deal with is the issue of scale; development
of a very large system requires a very different set of methods compared to developing a small
system. In other words, the methods that are used for developing small systems generally do
not scale up to large systems. An example will illustrate this point. Consider the problem of
counting people in a room versus taking a census of a country. Both are essentially counting
problems. But the methods used for counting people in a room (probably just going row-wise
or column-wise) will just not work when taking a census. A different set of methods will have
to be used for conducting a census, and the census problem will require considerably more
management, organization, and validation, in addition to counting.

Similarly, methods that one can use to develop programs of a few hundred lines cannot be
expected to work when software of a few hundred thousand lines needs to be developed. A
different set of methods must be used for developing large software. Any large project involves
the use of engineering and project management. For software projects, by engineering we mean
the methods, procedures, and tools that are used. In small projects, informal methods for
development and management can be used. However, for large projects, both have to be much
more formal, as shown in Figure 1.3.
As shown in the figure, when dealing with a small software project, the engineering capability
required is low (all you need to know is how to program and a bit of testing) and the project
management requirement is also low. However, when the scale changes to large, to solve such
problems properly it is essential that we move in both directions - the engineering methods
used for development need to be more formal, and the project management for the development
project also needs to be more formal. For example, if we leave 50 bright programmers together
(who know how to develop small programs well) without formal management and development
procedures and ask them to develop an on-line inventory control system for an automotive
manufacturer, it is highly unlikely that they will produce anything of use. To successfully
execute the project, a proper method for engineering the system has to be used and the project
has to be tightly managed to make sure that methods are indeed being followed and that cost,
schedule, and quality are under control.
There is no universally accepted definition of what is a "small" project and what is a
"large" project, and the scales are clearly changing with time. However, informally, we can use
orders of magnitude and say that a project is small if its size is less than 10 KLOC, medium
if the size is less than 100 KLOC (and more than 10), large if the size is less than one million
LOC, and very large if the size is many million LOC.
In 1946, Goldstine and von Neumann estimated that 1,000 instructions represented a
reasonable upper limit on the complexity of problems that could be conceived as solvable with
the help of a computer. In 1981, Bill Gates predicted that no program for personal computers
would ever need more than 640 KB of RAM. To get an idea of the sizes of some real software
products, the approximate sizes of some well known products are given in Table 1.1.


The following examples complete the picture of the degree of complexity of programs:
1. building the IBM OS/360 operating system required 5,000 person-years (1966);
2. the programs written for the NASA Space Shuttle have about 40 million lines of code (1977);
3. the System V Release 4.0 operating system (Unix) was obtained by compiling 3.7 million lines of code (1983);
4. the ticket reservation system of the airline KLM contained, in 1992, 2 million lines of assembly language code.
1.4.5 Quality and Productivity

Like all engineering disciplines, software engineering is driven by three major factors: cost,
time, and scope, with quality as the central theme.

The project management triangle (figure)



The cost of developing a system is the cost of the resources used for the system, which, in the
case of software, is dominated by the manpower cost, as development is largely labor-intensive.
Hence, the cost of a software project is often measured in terms of person-months, i.e., the cost
is considered to be the total number of person-months spent in the project. (Person-months
can be converted into a dollar amount by multiplying them by the average dollar cost of one
person-month, including the cost of overheads like hardware and tools.)
Time or schedule is an important factor in many projects. Business trends are dictating
that the time to market of a product should be reduced; that is, the cycle time from concept
to delivery should be small. For software this means that it needs to be developed faster.
Productivity in terms of output (KLOC) per person-month can adequately capture both
cost and schedule concerns. If productivity is higher, it should be clear that the cost in terms
of person-months will be lower (the same work can now be done with fewer person-months).
Similarly, if productivity is higher, the potential for developing the software in a shorter time
improves - a team with higher productivity will finish a job in less time than a same-size team
with lower productivity. (The actual time the project will take, of course, also depends on the
number of people allocated to the project.) In other words, productivity is a key driving factor
in all businesses, and the desire for high productivity dictates, to a large extent, how things are
done.
A major factor driving any production discipline is quality. Today, quality is a main mantra,
and business strategies are designed around quality. Clearly, developing high-quality software
is another fundamental goal of software engineering. However, while cost is generally well
understood, the concept of quality in the context of software needs further discussion. We
use the international standard on software product quality as the basis of our discussion here.
According to the quality model adopted by this standard, software quality comprises 6 main
attributes (called characteristics), as shown in Figure 1.4.
These 6 attributes have detailed characteristics which are considered the basic ones and
which can and should be measured using suitable metrics. At the top level, for a software
product, these attributes can be defined as follows:


1. Functionality. The capability to provide functions which meet stated and implied needs
when the software is used;
2. Reliability. The capability to maintain a specified level of performance;
3. Usability. The capability to be understood, learned, and used;
4. Efficiency. The capability to provide appropriate performance relative to the amount of
resources used;
5. Maintainability. The capability to be modified for purposes of making corrections,
improvements, or adaptation;
6. Portability. The capability to be adapted for different specified environments without
applying actions or means other than those provided for this purpose in the product.
The characteristics of the different attributes provide further details. Usability, for example,
has the characteristics of understandability, learnability, and operability; maintainability has
changeability, testability, stability, etc.; while portability has adaptability, installability, etc.
Functionality includes suitability (whether an appropriate set of functions is provided), accuracy
(the results are accurate), and security. Note that in this classification, security is considered a
characteristic of functionality, and is defined as "the capability to protect information and data
so that unauthorized persons or systems cannot read or modify them, and authorized persons
or systems are not denied access to them".
There are two important consequences of having multiple dimensions to quality. First,
software quality cannot be reduced to a single number (or a single parameter). And second,
the concept of quality is project-specific. For an ultra-sensitive project, reliability may be of
utmost importance but not usability, while in a commercial package for playing games on a PC,
usability may be of utmost importance and not reliability. Hence, for each software development
project, a quality objective must be specified before the development starts, and the goal of
the development process should be to satisfy that quality objective.
Despite the fact that there are many quality factors, reliability is generally accepted to be
the main quality criterion. As unreliability of software comes from the presence of defects in
the software, one measure of quality is the number of defects in the delivered software per
unit size (generally taken to be thousands of lines of code, or KLOC). With this as the major
quality criterion, the quality objective is to reduce the number of defects per KLOC as much
as possible. Current best practices in software engineering have been able to reduce the defect
density to less than 1 defect per KLOC.
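As a small illustration of this metric, defect density is simply the number of known defects divided by the delivered size in KLOC; the figures below are invented for the example:

// Defect density = defects found / size in KLOC.
// The figures below are made up for illustration only.
public class DefectDensity {
    public static void main(String[] args) {
        int defectsFound = 35;   // defects recorded for the delivered release
        int sizeLoc = 50_000;    // delivered size in lines of code

        double density = defectsFound / (sizeLoc / 1000.0);
        System.out.printf("Defect density: %.2f defects/KLOC%n", density);
        // Prints 0.70 defects/KLOC here, i.e., within the "less than 1 per KLOC"
        // range quoted for current best practices.
    }
}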
It should be pointed out that to use this definition of quality, what a defect is must be
clearly defined. A defect could be some problem in the software that causes the software to
crash, or a problem that causes an output to be improperly aligned, or one that misspells some
word, etc. The exact definition of what is considered a defect will clearly depend on the project
or the standards used by the organization developing the project (typically it is the latter).
1.4.6 Consistency and Repeatability

There have been many instances of high quality software being developed with very high
productivity. But there have been many more instances of software being developed with poor
quality or productivity. A key challenge that software engineering faces is how to ensure that
successful results can be repeated and that there is some degree of consistency in quality and
productivity.
We can say that an organization that develops one system with high quality and reasonable
productivity, but is not able to maintain those quality and productivity levels for other projects,
does not know good software engineering. A goal of software engineering methods is that system
after system can be produced with high quality and productivity. That is, the methods that
are being used are repeatable across projects, leading to consistency in the quality of the
software produced.
An organization involved in software development not only wants high quality and productivity,
but it wants these consistently. In other words, a software development organization
would like to produce consistent quality software with consistent productivity. Consistency of
performance is an important factor for any organization; it allows an organization to predict
the outcome of a project with reasonable accuracy, and to improve its processes to produce
higher-quality products and to improve its productivity. Without consistency, even estimating
the cost of a project becomes difficult.
Achieving consistency is an important problem that software engineering has to tackle. As
can be imagined, this requirement of consistency will force some standardized procedures to be
followed for developing software. There are no globally accepted methodologies, and different
organizations use different ones. However, within an organization, consistency is achieved by
using its chosen methodologies in a consistent manner.


1.4.7 Change

We have discussed above how maintenance and rework are very expensive and how they are
an integral part of the problem domain that software engineering deals with. In today's world
change in business is very rapid. As businesses change, they require the supporting software
to change. Overall, as the world changes faster, software has to change faster. Rapid change
has a special impact on software. As software is easy to change, due to its lack of the physical
properties that make changing other systems harder, much more change is expected of software.
Therefore, one challenge for software engineering is to accommodate and embrace change. As
we will see, different approaches are used to handle change. But change is a major driver today
for software engineering. Approaches that can produce high quality software at high productivity
but cannot accept and accommodate change are of little use today - they can solve only the
very few problems that are change resistant.

2 The Phases of the Development Process

A development process consists of various phases, each phase ending with a defined output. The
phases are performed in an order specified by the process model being followed. The main reason
for having a phased process is that it breaks the problem of developing software into successfully
performing a set of phases, each handling a different concern of software development. This
ensures that the cost of development is lower than what it would have been if the whole problem
were tackled together. Furthermore, a phased process allows proper checking of quality and
progress at some defined points during the development (the end of phases). Without this, one
would have to wait until the end to see what software has been produced. Clearly, this will not
work for large systems. Hence, for managing the complexity, project tracking, and quality, all
development processes consist of a set of phases. A phased development process is central
to the software engineering approach.
Various process models have been proposed for developing software. In fact, most organizations
that follow a process have their own version. In general, we can say that any problem
solving in software must consist of requirements specification for understanding and clearly
stating the problem, design for deciding a plan for a solution, coding for implementing the
planned solution, and testing for verifying the programs.
For small problems, these activities may not be done explicitly, the start and end boundaries
of these activities may not be clearly defined, and no written record of the activities may be
kept. However, systematic approaches require that each of these 4 problem solving activities
be done formally. In fact, for large systems, each activity can itself be extremely complex, and
methodologies and procedures are needed to perform it efficiently and correctly. Though
different process models will perform these phases in different manners, the phases exist in all
processes.



2.1 Requirements Analysis

Requirements analysis is done in order to understand the problem the software system is to
solve. The emphasis in requirements analysis is on identifying what is needed from the system,
not how the system will achieve its goals. For complex systems, even determining what is needed
is a difficult task. The goal of the requirements activity is to document the requirements in a
software requirements specification document.
There are 2 major activities in this phase: problem understanding (or analysis) and requirements
specification. In problem analysis, the aim is to understand the problem, its context,
and the requirements of the new system that is to be developed. Understanding the requirements
of a system that does not exist is difficult and requires creative thinking. The problem
becomes more complex because an automated system offers possibilities that do not exist
otherwise. Consequently, even the users may not really know the needs of the system.
Once the problem is analyzed and the essentials understood, the requirements must be
specified in the requirements specification document. The requirements document must specify
all functional and performance requirements and all design constraints that exist due to political,
economic, environmental, and security reasons. In other words, besides the functionality
required from the system, all the factors that may affect the design and proper functioning of
the system should be specified in the requirements document. A preliminary user manual that
describes all the major user interfaces frequently forms a part of the requirements document.
2.1.1 Customer Myths

Many causes of the software affliction can be traced to a mythology that arose during the early
history of software development. Unlike ancient myths that often provide human lessons well
worth heeding, software myths propagated misinformation and confusion. Software myths had
a number of attributes that made them insidious; for instance, they appeared to be reasonable
statements of fact (sometimes containing elements of truth), they had an intuitive feel, and
they were often promulgated by experienced practitioners who "knew the score". Today, most
knowledgeable professionals recognize myths for what they are - misleading attitudes that have
caused serious problems for managers and technical people alike. However, old attitudes and
habits are difficult to modify, and remnants of software myths are still believed.
A customer who requests computer software may be a person at the next desk, a technical
group down the hall, the marketing/sales department, or an outside company that has requested
software under contract. In many cases, the customer believes myths about software because
software managers and practitioners do little to correct misinformation. Myths lead to false
expectations by the customer and, ultimately, dissatisfaction with the developer.
Myth: A general statement of objectives is sufficient to begin writing programs - we
can fill in the details later. Reality: A poor up-front definition is the major cause of
failed software efforts. A formal and detailed description of the information domain,
function, behavior, performance, interfaces, design constraints, and validation criteria is
essential. These characteristics can be determined only after thorough communication
between customer and developer;


Myth: Project requirements continually change, but change can be easily accommodated
because software is flexible. Reality: It is true that software requirements change, but
the impact of change varies with the time at which it is introduced. If serious attention
is given to up-front definition, early requests for change can be accommodated easily.
The customer can review requirements and recommend modifications with relatively
little impact on cost. When changes are requested during software design, the cost impact
grows rapidly. Resources have been committed and a design framework has been established.
Change can cause upheaval that requires additional resources and major design
modification, that is, additional cost. Changes in function, performance, interfaces, or
other characteristics during implementation (code and test) have a severe impact on cost.
Change, when requested after software is in production, can be over an order of magnitude
more expensive than the same change requested earlier.

2.2 Design

The purpose of the design phase is to plan a solution to the problem specified by the requirements
document. This phase is the first step in moving from the problem domain to the solution
domain. In other words, starting with what is needed, design takes us toward how to satisfy
the needs. The design of a system is perhaps the most critical factor affecting the quality of
the software; it has a major impact on the later phases, particularly testing and maintenance.
The design activity often results in 3 separate outputs: architecture design, high level design,
and detailed design. Architecture focuses on looking at a system as a combination of many
different components, and at how they interact with each other to produce the desired results.
The high level design identifies the modules that should be built for developing the system and
the specifications of these modules. At the end of system design all the major data structures,
file formats, output formats, etc., are also fixed. In detailed design, the internal logic of each
of the modules is specified.
In architecture the focus is on identifying components or subsystems and how they connect;
in high level design the focus is on identifying the modules; and during detailed design the focus
is on designing the logic for each of the modules. In other words, in architecture the focus is on
what major components are needed, in high level design the attention is on what modules are
needed, while in detailed design the issue is how the modules can be implemented in software.
A design methodology is a systematic approach to creating a design by applying a set of
techniques and guidelines. Most methodologies focus on high level design.

2.3 Implementation

Once the design is complete, most of the major decisions about the system have been made.
However, many of the details about coding the design, which often depend on the programming
language chosen, are not specified during design. The goal of the coding phase is to translate
the design of the system into code in a given programming language. For a given design, the
aim in this phase is to implement the design in the best possible manner.
The coding phase affects both testing and maintenance profoundly. Well-written code can
reduce the testing and maintenance effort. Because the testing and maintenance costs of software
are much higher than the coding cost, the goal of coding should be to reduce the testing
and maintenance effort. Hence, during coding the focus should be on developing programs that
are easy to read and understand, and not simply on developing programs that are easy to write.
Simplicity and clarity should be strived for during this phase.


2.3.1 Developer Myths

Myths that are still believed by software practitioners have been fostered by 50 years of
programming culture. During the early days of software, programming was viewed as an art form.
Old ways and attitudes die hard.
Myth: Once we write the program and get it to work, our job is done. Reality: Someone
once said that "the sooner you begin writing code, the longer it'll take you to get done".
Industry data indicate that between 60 and 80 percent of all effort expended on software
will be expended after it is delivered to the customer for the first time;
Myth: Until I get the program "running" I have no way of assessing its quality. Reality:
One of the most effective software quality assurance mechanisms can be applied from the
inception of a project - the formal technical review. A software review is a process or
meeting during which a software product is examined by project personnel, managers,
users, customers, user representatives, or other interested parties for comment or approval.
It is a "quality filter" that has been found to be more effective than testing for finding
certain classes of software defects;
Myth: The only deliverable work product for a successful project is the working program.
Reality: A working program is only one part of a software configuration that includes
many elements. Documentation provides a foundation for successful engineering and,
more important, guidance for software support;
Myth: Software engineering will make us create voluminous and unnecessary documentation
and will invariably slow us down. Reality: Software engineering is not about
creating documents. It is about creating quality. Better quality leads to reduced rework.
And reduced rework results in faster delivery times.
Many software professionals recognize the fallacy of the myths just described. Regrettably,
habitual attitudes and methods foster poor management and technical practices, even when
reality dictates a better approach. Recognition of software realities is the first step toward the
formulation of practical solutions for software engineering.

2.4 Testing

Testing is the major quality control measure used during software development. Its basic
function is to detect defects in the software. During requirements analysis and design, the
output is a document that is usually textual and non-executable. After coding, computer
programs are available that can be executed for testing purposes. This implies that testing
not only has to uncover errors introduced during coding, but also errors introduced during the
previous phases. Thus, the goal of testing is to uncover requirement, design, and coding errors
in the programs.
The starting point of testing is unit testing, where the different modules or components are
tested individually. As modules are integrated into the system, integration testing is performed,
which focuses on testing the interconnections between modules. After the system is put together,
system testing is performed. Here the system is tested against the system requirements to see
if all the requirements are met and if the system performs as specified by the requirements.
Finally, acceptance testing is performed to demonstrate to the client, on the client's real-life
data, the operation of the system.
Testing is an extremely critical and time-consuming activity. It requires proper planning of
the overall testing process. Frequently the testing process starts with a test plan that identifies
all the testing-related activities that must be performed, specifies the schedule, allocates the
resources, and specifies guidelines for testing. The test plan specifies the conditions that should
be tested, the different units to be tested, and the manner in which the modules will be integrated.
Then, for the different test units, a test case specification document is produced, which lists all
the different test cases, together with the expected outputs. During the testing of the unit, the
specified test cases are executed and the actual results are compared with the expected outputs.
The final output of the testing phase is the test report and the error report, or a set of such
reports. Each test report contains the set of test cases and the result of executing the code with
these test cases. The error report describes the errors encountered and the action taken to
remove the errors.

3 The History of Programming

3.1 The 1940s
3.1.1 Zuse Z3 (Germany, 1941)

Konrad Zuse (1910-1995) was a German engineer and computer pioneer. He graduated in
civil engineering from the Technische Hochschule Berlin-Charlottenburg in 1935. In his
engineering studies, Zuse had to perform many routine calculations by hand, which he found
mind-numbingly boring. This led him to dream about performing calculations by a machine.
He started as a design engineer at the Henschel aircraft factory in Berlin-Schönefeld but
resigned a year later to build a programmable machine. Working in his parents' apartment in
1936, his first attempt, called the Z1, was a binary, electrically driven mechanical calculator with
limited programmability, reading instructions from a punched tape. The Z1 was the first functional
general-purpose computer, using binary counting with mechanical telephone relays.
In 1939, Zuse was called up for military service but was able to convince the army to let him
return to his computers. In 1940, he gained support from the Aerodynamische Versuchsanstalt
(Aerodynamic Research Institute), which used his work for the production of glide bombs
(aerial bombs modified with aerodynamic surfaces to change flight paths from purely ballistic
to flatter, gliding ones). Zuse built the Z2, a revised version of the Z1. The same year,
he started a company, Zuse Apparatebau (Zuse Apparatus Engineering), to manufacture his
machines. His greatest achievement was the world's first functional program-controlled,
Turing-complete computer, the Z3, in 1941, improving on the basic Z2 machine. The program was
stored on a punched tape. Zuse wanted to switch to vacuum tubes, but Hitler ended the project
because he thought it would take too long to complete: the war was expected to end in less
than 2 years, and there was no use for long-term projects.
Zuse also designed the first high-level programming language, Plankalkül, first published in
1948, although this was a theoretical contribution, since the language was not implemented in
his lifetime and did not directly influence early languages.

if all the requirements are met and if the system performs as specified by the requirements.
Finally, acceptance testing is performed to demonstrate to the client, on the real-life data of
the client, the operation of the system.
Testing is an extremely critical and time-consuming activity. It requires proper planning of
the overall testing process. Frequently the testing process starts with a test plan that identifies
all the testing-related activities that must be performed and specifies the schedule, allocates
the resources, and specifies guidelines for testing. The test plan specifies the conditions that should
be tested, the different units to be tested, and the manner in which the modules will be integrated.
Then, for the different test units, a test case specification document is produced, which lists all the
different test cases, together with the expected outputs. During the testing of the unit, the
specified test cases are executed and the actual results are compared with the expected outputs. The
final outputs of the testing phase are the test report and the error report, or a set of such reports.
Each test report contains the set of test cases and the result of executing the code with these
test cases. The error report describes the errors encountered and the action taken to remove
the errors.
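As a rough illustration of the test case idea (an addition to this text, not part of the original), the Python sketch below pairs inputs with expected outputs for a hypothetical average function; the summary printed by the unittest runner plays the role of the test report.

import unittest

# Hypothetical unit under test: a function that computes the average of a list.
def average(values):
    if not values:
        raise ValueError("empty input")
    return sum(values) / len(values)

class AverageTestCase(unittest.TestCase):
    """Each test mirrors one entry of a test case specification:
    an input together with its expected output."""

    def test_typical_values(self):
        self.assertEqual(average([2, 4, 6]), 4)

    def test_single_value(self):
        self.assertEqual(average([5]), 5)

    def test_empty_input_is_rejected(self):
        with self.assertRaises(ValueError):
            average([])

if __name__ == "__main__":
    unittest.main()  # the runner's output lists which cases passed or failed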

3.1.2  Colossus (United Kingdom, 1944)

The Colossus machines were electronic computing devices used by British codebreakers to read
encrypted German messages during World War II. These were the world's first programmable,
digital, electronic computing devices. They used vacuum tubes (thermionic valves) to perform
the calculations. Colossus was mainly designed by engineer Tommy Flowers.
The Colossus computers were used to help decipher teleprinter messages which had been
encrypted using the Lorenz SZ40/42 machine. British codebreakers referred to encrypted
German teleprinter traffic as "Fish" and called the SZ40/42 machine and its traffic "Tunny".
While the well-known Enigma machine was generally used by field units, the Lorenz machine
was used for high-level communications which could support the heavy machine, teletypewriter
and attendant fixed circuits. Colossus compared two data streams, counting each match based
on a programmable Boolean function. The encrypted message was read at high speed from a
paper tape. The other stream was generated internally, and was an electronic simulation of the
Lorenz machine at various trial settings. If the match count for a setting was above a certain
threshold, it would be output on an electric typewriter.
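The match-counting principle can be sketched in a few lines of Python. This is only a toy illustration of the idea described above (a programmable predicate, a match count, a threshold), not a model of the actual Colossus hardware; all names and values are invented.

# Compare two bit streams with a programmable Boolean function and count
# the positions where the function is satisfied.
def count_matches(cipher_stream, trial_stream, predicate):
    return sum(1 for c, t in zip(cipher_stream, trial_stream) if predicate(c, t))

cipher = [1, 0, 1, 1, 0, 1, 0, 0]   # bits read from the "paper tape"
trial  = [1, 0, 0, 1, 0, 1, 1, 0]   # bits from one simulated wheel setting
score = count_matches(cipher, trial, lambda c, t: c == t)

THRESHOLD = 5
if score > THRESHOLD:
    print("setting worth reporting, score =", score)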
The use to which the Colossi were put was of the highest secrecy, and the Colossus itself
was top secret, and remained so until 1975, when Frederick William Winterbotham
broke the secrecy by publishing his book The Ultra Secret.

[Figure: Zuse's Z3]

3.1.3  ENIAC (USA, 1945), EDVAC

ENIAC (Electronic Numerical Integrator And Computer) was the first general-purpose electronic
computer. It was designed and built to calculate artillery firing tables for the U.S.
Army's Ballistic Research Laboratory. The contract was signed in June 1943 and Project PX
was constructed by the University of Pennsylvania's Moore School of Electrical Engineering.
ENIAC was conceived and designed by John Mauchly and J. Presper Eckert of the University
of Pennsylvania.
ENIAC contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors,
10,000 capacitors and around 5 million hand-soldered joints. It weighed 27 tons, was roughly
2.6 m by 0.9 m by 26 m, took up 63 m2, and consumed 150 kW of power. Input was possible
from an IBM card reader, while an IBM card punch was used for output. These cards could
be used to produce printed output off-line using an IBM accounting machine. Six women did
most of the programming of ENIAC by manipulating its switches and cables.

[Figure: Colossus]

ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's (Electronic
Discrete Variable Automatic Computer) construction in August 1944, and design work for the
EDVAC commenced before the ENIAC was fully operational. The design would implement
a number of important architectural and logical improvements conceived during the ENIAC's
construction. Unlike its predecessor, it was binary rather than decimal.
John von Neumann, while consulting for the Moore School of Electrical Engineering on
the EDVAC project, wrote an incomplete set of notes titled the First Draft of a Report on
the EDVAC. The paper, which was widely distributed, described a computer architecture in
which data and program memory are mapped into the same address space. This architecture
became the de facto standard and can be contrasted with the so-called "Harvard architecture",
which has separate program and data memories on a separate bus. Although the single-memory
architecture became commonly known by the name "von Neumann architecture" as a result
of von Neumann's paper, the architecture's conception involved the contributions of others,
including Eckert and Mauchly. With very few exceptions, all present-day home computers,
microcomputers, minicomputers and mainframe computers use this single-memory computer
architecture.
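A minimal sketch (added here for illustration, assuming nothing about any real instruction set) of what "data and program in the same address space" means: in the toy Python machine below, instruction words and data words sit in one shared memory list, and the processor simply fetches from it.

# A toy stored-program machine: instructions and data live in ONE memory list,
# as in the von Neumann architecture (a Harvard machine would use two memories).
memory = [
    ("LOAD", 6),     # 0: accumulator <- memory[6]
    ("ADD", 7),      # 1: accumulator += memory[7]
    ("STORE", 8),    # 2: memory[8] <- accumulator
    ("HALT", None),  # 3: stop
    None, None,      # 4-5: unused
    40, 2, 0,        # 6-8: data words in the same address space
]

pc, acc = 0, 0
while True:
    op, arg = memory[pc]          # fetch an instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break

print(memory[8])  # 42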

[Figure: ENIAC]

Eckert and Mauchly didn't finish in time for the machine to be useful for the war, but soon after, they
started the first commercial computer company. They took the experience they gained and
founded the Eckert-Mauchly Computer Corporation, producing their first computer, BINAC,
in 1949, before being acquired by Remington Rand in 1950 and renamed as their UNIVAC
division.
3.1.4  EDSAC (United Kingdom, 1949)

EDSAC (Electronic Delay Storage Automatic Calculator), having been inspired by John von
Neumann's seminal First Draft of a Report on the EDVAC, was constructed by Maurice Wilkes
and his team at the University of Cambridge Mathematical Laboratory in England. EDSAC
was one of the first practical stored-program electronic computers.
The project was supported by J. Lyons & Co. Ltd., a British firm, who were rewarded
with the first commercially applied computer based on the EDSAC design. EDSAC ran its first
programs on 6 May 1949, calculating a table of squares and a list of prime numbers.
The instructions available were: add, subtract, multiply, collate, shift left, shift right, load
multiplier register, store (and optionally clear) accumulator, conditional skip, read input tape,
print character, round accumulator, no-op and stop. There was no division instruction (though
a number of division subroutines were available) and no way to directly load a number into
the accumulator (a "store and zero accumulator" instruction followed by an "add" instruction
were necessary for this).
An unusual feature of EDSAC was the availability of a substantial subroutine library. By
1951, 87 subroutines in the following categories were available for general use: floating-point
arithmetic; arithmetic operations on complex numbers; checking; division; exponentiation;
routines relating to functions; differential equations; special functions; power series; logarithms;
miscellaneous; print and layout; quadrature; read (input); nth root; trigonometric functions;
counting operations (simulating "repeat", "while" and "for" loops); vectors and matrices.

[Figure: EDVAC; John von Neumann and J. Robert Oppenheimer]

[Figure: EDSAC]
3.1.5  Mark I (United Kingdom, 1949)

Freddie Williams (Sir Frederic Calland Williams) was an English engineer. Working at the
Telecommunications Research Establishment, he was a substantial contributor during World
War II to the development of radar. With Tom Kilburn he pioneered the first stored-program
digital computer at the University of Manchester. The Manchester Mark I was one of the
earliest electronic computers and one of the first electronic stored-program computers. It is
also the earliest known implementation of index/base registers.
The first realistic program to be run on the Mark I was a test of Mersenne primes, run in
early April 1949. The computer ran error-free for 9 hours on the night of 16-17 June 1949. A
Mersenne number is a number that is one less than a power of two, Mn = 2^n - 1. A Mersenne
prime is a Mersenne number that is a prime number. As of August 2007, only 44 Mersenne
primes were known; the largest known prime number (2^32,582,657 - 1) is a Mersenne prime, and in
modern times the largest known prime has nearly always been a Mersenne prime.
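For readers who want to see the definition in executable form, a short Python sketch (added here for illustration) lists the first few Mersenne primes by brute force:

# Mersenne numbers Mn = 2**n - 1; a Mersenne prime is such a number that is prime.
def is_prime(k):
    if k < 2:
        return False
    d = 2
    while d * d <= k:
        if k % d == 0:
            return False
        d += 1
    return True

mersenne_primes = [2**n - 1 for n in range(2, 20) if is_prime(2**n - 1)]
print(mersenne_primes)  # [3, 7, 31, 127, 8191, 131071, 524287]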
Working on a prototype of the Mark II in the summer of 1945, Grace Murray Hopper found
the first computer bug: a moth that had caused a relay failure.

In 1951, Miller and Wheeler used the machine to discover a 79-digit prime, the largest
known at the time. In 1952, A. S. Douglas developed "OXO", a version of tic-tac-toe for the
EDSAC, with graphical output to a cathode ray tube. This may well have been the world's
first video game.

3.2  The 1950s

3.2.1  LEO (United Kingdom, 1951)

Joseph Lyons and Co., one of the UK's leading catering and food manufacturing companies in
the first half of the 20th century, sent two of its senior managers to the USA in 1947 to look at
new business methods developed during the Second World War. During their visit they came
across digital computers, then used exclusively for engineering and mathematical computations.
They saw the potential of computers to help solve the problem of administering a major business
enterprise. They also learned that Cambridge University, back in the UK, was actually building
such a machine, the pioneering EDSAC computer. On their return to company headquarters in
London they made a recommendation to the Lyons Board that Lyons should acquire or build
a computer to meet their business needs. This was accepted, and it was agreed that Cambridge
University should receive some financial support if the University of Cambridge Mathematical
Laboratory gave some help to the Lyons initiative. Cambridge provided training and support
for the Lyons engineers. By 1949 they had the basics of a computer specifically designed for
business data processing running, and on 17 November 1951 rolled out the first commercial
business application. The computer was called the LEO (Lyons Electronic Office).
Lyons used LEO I initially for valuation jobs, but its role was extended to include payroll,
inventory and so on. One of its early tasks was the elaboration of daily orders, which were phoned
in every afternoon by the shops and used to calculate the overnight production requirements,
assembly instructions, delivery schedules, invoices, costings and management reports.

[Figure: Mark I]

3.2.2  UNIVAC (USA, 1951)

The company UNIVAC (UNIVersal Automatic Computer) began as the business computer division
of Remington Rand, formed by the 1950 purchase of the Eckert-Mauchly Computer Corporation,
founded four years earlier by ENIAC inventors J. Presper Eckert and John Mauchly.
As well as being the first American commercial computer, the UNIVAC I was the first American
computer designed at the outset for business and administrative use (i.e. for the fast execution
of large numbers of relatively simple arithmetic and data transport operations, as opposed to
the complex numerical calculations required by scientific computers). As such, the UNIVAC
competed directly against punch-card machines, mainly made by IBM, but oddly enough the
UNIVAC originally had no means of either reading or punching cards (which initially hindered
sales to some companies with large quantities of data on cards, due to potential manual conversion
costs). This was corrected by adding off-line card processing equipment, the UNIVAC
Card to Tape converter and the UNIVAC Tape to Card converter, to transfer data between
cards and UNIVAC magnetic tapes.
UNIVAC was first intended for the Bureau of the Census, which paid for much of the
development, and then was put in production. The first sale was marked with a formal ceremony
in March 1951 in Philadelphia. The machine was not actually shipped until the following
December, because, as the sole fully set-up model, it was needed for demonstration purposes,
and the company was apprehensive about the difficulties of dismantling, transporting, and
reassembling the delicate machine.
The most famous UNIVAC product was the UNIVAC I, which became known for predicting
the outcome of the U.S. presidential election in 1952, an overwhelming victory for Eisenhower,
[Figure: LEO's world record]

[Figure: UNIVAC at CBS]
UNIVAC I used 5,200 vacuum tubes, weighed 13 tons, consumed 125 kW, and could perform
about 1,905 operations per second running on a 2.25 MHz clock. The Central Complex alone
(i.e. the processor and memory unit) was 4.3 m by 2.4 m by 2.6 m high. The complete system
occupied more than 35.5 m2 of floor space. Originally priced at $159,000, the UNIVAC I rose in
price until systems sold for between $1,250,000 and $1,500,000. A total of 46 systems were
eventually built and delivered. A few UNIVAC I systems stayed in service long after they were
obsoleted by advancing technology. The Census Bureau used its two systems until 1963, amounting to
12 and 9 years of service.
3.2.3  IBM 650 (USA, 1953)

The IBM 650 was one of IBM's early computers, and the world's first mass-produced computer.
It was announced in 1953, and over 2,000 systems were produced between the first shipment
in 1954 and its final manufacture in 1962. One of its most significant features was that, if
processing was interrupted by a "random processing error" (hardware glitch), the machine could
automatically resume from the last checkpoint instead of requiring the operators to restart the
job manually from the beginning.
Sixteen types of IBM calculating devices were created before 1960. By the end of the '50s, IBM
held 75% of the market, selling 10,000 computers in the USA.

even before the polls closed in California. Statistical sampling techniques related to results
from previous elections were used. The CBS network, which expected a narrow-margin victory,
held back the predictions for quite a few hours because they didn't believe them, but the actual
results came out very close to the initial predictions. Obviously, this was not just because
of the computing power of the UNIVAC, but because some very smart political analysts had
constructed insightful models which had been accurately programmed and tested.

3.2.4  The Term "Software" (1958)

In 1958, John Tukey, a world-renowned statistician, coined the term "software" in an article for
the American Mathematical Monthly.
3.2.5  Programming Languages

The first generation of programming languages used to program a computer was called machine
language or machine code, which is the only language a computer really understands: a
sequence of 0s and 1s that the computer's controls interpret as instructions, electrically. The
second generation of programming languages was called assembly language, which turns the
sequences of 0s and 1s into human words like "add". Assembly language is always translated
back into machine code by programs called assemblers. The third generation of programming
languages was called high-level language (HLL), which has human-sounding words and syntax,
similar to words in a sentence. In order for the computer to understand any HLL, a compiler
translates the high-level language into either assembly language or machine code. All software
programming languages need to be eventually translated into machine code for a computer to
use the instructions they contain.
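A loose modern analogy of this translation chain (added here for illustration): Python's standard dis module shows the lower-level instructions that a single high-level statement is compiled into. These are interpreter bytecodes rather than true machine code, but the principle (high-level source mapped to primitive instructions) is the same.

import dis

def add(a, b):
    return a + b          # one high-level statement

# Prints the low-level instructions the statement compiles to,
# e.g. LOAD_FAST a, LOAD_FAST b, a binary-add opcode, RETURN_VALUE
# (exact opcode names vary between Python versions).
dis.dis(add)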
Fortran (1954) Fortran (The IBM Mathematical Formula Translating System) is a general-purpose,
procedural, imperative programming language that is especially suited to numeric
computation and scientific computing. Originally developed by IBM in the 1950s for scientific
and engineering applications, Fortran came to dominate this area of programming early on and
has been in continual use for over half a century in computationally intensive areas such as
numerical weather prediction, finite element analysis, computational fluid dynamics, computational
physics, and computational chemistry. It is one of the most popular languages in the
area of high-performance computing. The programs to benchmark and rank the world's fastest
supercomputers are written in Fortran.
Successive versions have added support for processing of character-based data (1977), array

[Figure: IBM 650]

[Figure: Fortran code]
Lisp (1958) Lisp (List Processing) is a language with a long history and a distinctive, fully
parenthesized syntax. Originally specified in 1958, Lisp is the second-oldest high-level programming
language in widespread use today; only Fortran is older. Like Fortran, Lisp has changed a great
deal since its early days, and a number of dialects have existed over its history. Today, the most
widely known general-purpose Lisp dialects are Common Lisp and Scheme. Lisp was invented
by John McCarthy in 1958 while he was at the Massachusetts Institute of Technology (MIT), and
it was originally created as a practical mathematical notation for computer programs, based
on Alonzo Church's lambda calculus. In lambda calculus, every expression is a unary function,
meaning a function with only one input, known as its argument. When an expression is applied
to another expression (called with the other expression as its argument), it returns a single
value, known as its result. Lisp quickly became the favored programming language for artificial
intelligence research. As one of the earliest programming languages, it pioneered many ideas
in computer science, including tree data structures, automatic storage management, dynamic
typing, object-oriented programming, and the self-hosting compiler.
Linked lists are one of the Lisp language's major data structures, and Lisp source code is itself
made up of lists. As a result, Lisp programs can manipulate source code as a data structure,
giving rise to the macro systems that allow programmers to create new syntax or even new
domain-specific programming languages embedded in Lisp. The interchangeability of code
and data also gives Lisp its instantly recognizable syntax. All program code is written as
s-expressions, or parenthesized lists. A function call or syntactic form is written as a list with
the function or operator's name first, and the arguments following; for instance, a function f
that takes three arguments might be called using (f x y z).
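As a loose analogy in Python (not Lisp, and added here only for illustration), the "code as data" idea can be mimicked by representing an s-expression such as (+ 1 (* 2 3)) as a nested list and evaluating it:

import operator

OPS = {"+": operator.add, "*": operator.mul}

def evaluate(expr):
    if isinstance(expr, list):                     # a list is a "function call"
        op, *args = expr
        return OPS[op](*[evaluate(a) for a in args])
    return expr                                    # a bare number evaluates to itself

program = ["+", 1, ["*", 2, 3]]                    # the program is itself a data structure
print(evaluate(program))                           # 7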

programming, module-based programming and object-based programming (1990), and object-oriented
and generic programming (2003).

Algol (1958) Algol (short for ALGOrithmic Language) is a family of imperative computer
programming languages which greatly influenced many other languages and became the de
facto way algorithms were described in textbooks and academic works for almost the next 30
years. It was designed to avoid some of the perceived problems with Fortran and eventually gave
rise to many other programming languages, including Pascal. Algol uses bracketed statement
blocks and was the first language to use begin-end pairs for delimiting them.

[Figure: Algol code]
Cobol (1959) Cobol is one of the oldest programming languages still in active use. Its name is
an acronym for COmmon Business-Oriented Language, defining its primary domain in business,
finance, and administrative systems for companies and governments. The first specifications
of Cobol were proposed by a committee formed to recommend a short-range approach to a
common business language. The Cobol 2002 standard includes support for object-oriented
programming and other modern language features.

[Figure: Lisp code]

3.2.6  The Software Industry

The software industry began in the late 1950s when the use of computers for business applications
expanded rapidly, creating a huge demand for people with programming experience. A
number of people who had learned their programming skills working for computer manufacturers
or for the large companies and government agencies that were the first computer users saw
this as an opportunity to start their own companies and sell their services under contract.
The first such company, Computer Usage Corporation (CUC), was founded in 1955 by
Elmer Kubie and John W. Sheldon, two former IBM employees. The company was founded
with $40,000 in start-up capital, which supported a staff of five in addition to the two founders.
Its first project was a program written for California Research Corporation to simulate the flow
of oil. CUC became a public company in 1960 and by 1967 had a staff of over 700 people in
12 offices around the U.S. and revenues over $13 million. Unfortunately, it suffered financial
losses in the late 1970s and eventually went bankrupt in 1986.
In 1959, seven UNIVAC programmers founded Applied Data Research (ADR) to market
their programming skills to computer manufacturers such as Sperry Rand and Honeywell to
develop systems software. ADR went public in 1965 and, in the late 1960s, became one of the
first companies to successfully sell software products. It continued to be one of the largest U.S.
software product companies until it was acquired by Ameritech for $215 million in 1986.
Fletcher Jones and Roy Nutt, who had gained their computer experience in the aerospace
industry, founded Computer Sciences Corporation (CSC) in 1959 with $100 and a contract from
Honeywell to develop a Cobol compiler called FACT (Fully Automated Compiling Technique).
By 1963, CSC was the largest software company, with revenues close to $4 million. CSC
continues to thrive today as one of the world's largest information technology services firms, with
more than $10.2 billion in revenues.

3.3  The 1960s

3.3.1  The Transistor and Integrated Circuits

The first patent for the field-effect transistor principle was filed in Canada by Austrian physicist
Julius Edgar Lilienfeld on 22 October 1925, but Lilienfeld published no research articles about
his devices, and they were ignored by industry. In 1934 German physicist Oskar Heil patented
another field-effect transistor. There is no direct evidence that these devices were built, but later
work in the 1990s showed that one of Lilienfeld's designs worked as described and gave substantial
gain.
On 17 November 1947, John Bardeen and Walter Brattain observed that when electrical
contacts were applied to a crystal of germanium, the output power was larger than the input.
Brattain and H. R. Moore made a demonstration to several of their colleagues and managers
at Bell Labs on the afternoon of 23 December 1947, often given as the birth date of the
[Figure: Cobol code]

[Figure: Transistors]
A transistor could be 1/50 the size and 1/100 the weight of a vacuum tube. However,
through the '60s, computer engineers were faced with the problem of being unable to increase
the performance of their designs due to the huge number of components involved. In theory,
every component needed to be wired to every other one, and wires were typically strung and soldered
by hand. In order to improve performance, more components would be needed, and it seemed
that future designs would consist almost entirely of wiring. This problem became known as the
tyranny of numbers.
The problem was eventually solved with the widespread introduction of the integrated circuit
(IC). ICs are essentially a number of related components making up one particular function,
what had previously been known in computer design as a "module". Unlike the individual
components in existing module designs, however, the IC was "wired up" via the process of
photolithography, allowing ICs to be mass-produced and dramatically reducing the overall
complexity of the machines. Photolithography is a process used to selectively remove parts of
a thin film. It uses light to transfer a geometric pattern from a photomask to a light-sensitive
chemical on the substrate. A series of chemical treatments then engraves the exposure pattern
into the material underneath the photoresist.
In 1959 Jack Kilby and Texas Instruments on the one hand, and Robert Noyce and the
Fairchild Semiconductor Corporation on the other hand, applied for patents for integrated circuits.
The two companies wisely decided to cross-license their technologies after several years
of legal battles, creating a global market now worth about $1 trillion a year. In 1961 the first
commercially available integrated circuits came from the Fairchild Semiconductor Corporation.
Sputnik 1 (Satellite 1) was the first artificial satellite to be put into outer space, launched
into geocentric orbit by the Soviet Union on 4 October 1957. On 12 April 1961, Yuri Gagarin
became the first human in space and the first to orbit the Earth. These events started the Space
transistor. William Shockley saw the potential in this and worked over the next few months
greatly expanding the knowledge of semiconductors, and is considered by many to be the "father"
of the transistor. Shockley, Bardeen, and Brattain were Nobel Laureates in Physics in 1956
"for their researches on semiconductors and their discovery of the transistor effect".
Legal papers from the Bell Labs patent show that Shockley and Pearson had built operational
versions from Lilienfeld's patents, yet they never referenced this work in any of their
later research papers or historical articles.
The transistor is considered by many to be the greatest invention of the 20th century. It is
the key active component in practically all modern electronics. Its importance in today's society
rests on its ability to be mass-produced using a highly automated process of fabrication that
achieves astonishingly low per-transistor costs. However, in the '60s, the electronics industry
was not interested at first in this technology, because at that time it was too expensive and it
was a radical change from the vacuum tubes that had been used until then.

[Figure: Integrated circuit]
3.3.2  Moore's Law (1965)

Moore's law describes an important trend in the history of computer hardware. Since the
invention of the integrated circuit, the number of transistors that can be placed inexpensively
on an integrated circuit has increased exponentially. The trend was first observed by Intel
co-founder Gordon E. Moore in a 1965 paper for the 35th anniversary edition of Electronics
magazine. Originally suggesting that processor complexity doubled every year, the law was revised
in 1975 to suggest a doubling in complexity every two years. It has continued for almost half
a century and is not expected to stop for another decade at least, and perhaps much longer.
Almost every measure of the capabilities of digital electronic devices is linked to Moore's law:
processing speed, memory capacity, even the number and size of pixels in digital cameras. All
of these are improving at (roughly) exponential rates as well.
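In its revised (1975) form the law is a simple exponential, which a two-line sketch (added here for illustration) makes concrete; the starting figure of 2,300 transistors, roughly the count of Intel's first microprocessor, is used only as an illustrative input.

def projected_transistors(initial_count, years):
    # doubling every two years => factor of 2 ** (years / 2)
    return initial_count * 2 ** (years / 2)

print(projected_transistors(2300, 10))   # 73,600 after a decade of doubling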
3.3.3  ARPANET (1969)

From the launch of Sputnik and the U.S.S.R. testing its first intercontinental ballistic missile,
the Advanced Research Projects Agency (ARPA) was born. ARPA was the U.S. government's
research agency for all space and strategic missile research. In 1958, NASA was formed, and
the activities of ARPA moved away from aeronautics and focused mainly on computer science
Race, a competition of space exploration between the Soviet Union and the United States. It
involved efforts to explore outer space with artificial satellites, to send humans into space,
and to land people on the Moon. J. F. Kennedy set the goal of landing a man on the Moon
when speaking to a Joint Session of Congress on 25 May 1961, saying: "I believe that this nation
should commit itself to achieving the goal, before this decade is out, of landing a man on the
Moon and returning him safely to the Earth".
An integrated circuit used to cost $1,000, and early ICs had failure rates of over 90%. But
the Pentagon and NASA were willing to pay for the IC technology in order to use it on their
space vehicles. The computer that would guide the Apollo 11 lunar module into orbit was
the most powerful computer in the world at that time (1969).
As the technology developed, ICs began to be mass-produced. Ten years later, an IC would
cost 1 cent and would be 1,000 times more powerful. The original IC had only one transistor,
three resistors and one capacitor, and was the size of an adult's pinkie finger. Today an IC
smaller than a penny can hold 125 million transistors.
Computers built between 1964 and 1972 are often regarded as "third generation" computers;
they are based on the first integrated circuits, creating even smaller machines. Typical of
such machines was the IBM System/360 series mainframe, while smaller minicomputers began
to open up computing to smaller businesses.

3.3.4  Programming Languages

Simula (1962) Simula is a name for two programming languages, Simula I and Simula 67,
developed in the 1960s at the Norwegian Computing Center in Oslo by Ole-Johan Dahl and
Kristen Nygaard. Syntactically, it is a fairly faithful superset of Algol 60. Simula 67 introduced
objects, classes, subclasses, virtual methods, coroutines, and discrete event simulation, and featured
garbage collection. Simula is considered the first object-oriented programming language. As its
name implies, Simula was designed for doing simulations, and the needs of that domain provided
the framework for many of the features of object-oriented languages today. Since Simula-type
objects are reimplemented in C++, Java and C#, the influence of Simula is often understated.
The creator of C++, Bjarne Stroustrup, has acknowledged that Simula 67 was the greatest
influence on him to develop C++, to bring the kind of productivity enhancements offered by
Simula to the raw computational speed offered by lower-level languages.

[Figure: Simula code]

and information processing. One of ARPA's goals was to connect mainframe computers at
different universities around the country so that they would be able to communicate using a
common language and a common protocol. Thus the ARPANET, the world's first multiple-site
computer network, was created in 1969. ARPANET is the original basis for what now
forms the Internet. It was opened to non-military users later in the 1970s, and many universities
and large businesses went on-line.

3.3.5  Programming Paradigms

The structured program theorem provides the theoretical basis of structured programming. It
states that three ways of combining programs (sequence, selection, and iteration) are sufficient
to express any computable function. This observation did not originate with the structured
programming movement; these structures are sufficient to describe the instruction cycle of a
central processing unit, as well as the operation of a Turing machine. Therefore a processor is
always executing a "structured program" in this sense, even if the instructions it reads from
memory are not part of a structured program.
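A small illustration (added here, not from the original text): the Python function below is built only from the three constructs named by the theorem (sequence, selection and iteration) and needs no other control flow.

# Only the three structured constructs are used: sequence, selection (if),
# and iteration (while); no goto-style jumps are needed.
def count_even(numbers):
    count = 0                    # sequence
    i = 0
    while i < len(numbers):      # iteration
        if numbers[i] % 2 == 0:  # selection
            count = count + 1
        i = i + 1
    return count

print(count_even([1, 2, 3, 4, 6]))  # 3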
In the 1960s, language design was often based on textbook examples of programs, which
were generally small (due to the size of a textbook); however, when programs became very large,
the focus changed. In small programs, the most common statement is generally the assignment
statement; however, in large programs (over 10,000 lines), the most common statement is
typically the procedure-call to a subprogram. Ensuring parameters are correctly passed to the
correct subprogram becomes a major issue.
Object-oriented programming can also trace its roots to the '60s. As hardware and software
became increasingly complex, quality was often compromised. Researchers studied ways in
which software quality could be maintained. Object-oriented programming was deployed in part
as an attempt to address this problem by strongly emphasizing discrete units of programming
logic and reusability in software. The term "object-oriented" was coined by Alan Kay in 1967.
OOP may be seen as a collection of cooperating objects, as opposed to a traditional view in
which a program may be seen as a group of tasks to compute ("subroutines"). Each object is
capable of receiving messages, processing data, and sending messages to other objects. OOP
was not commonly used in mainstream software application development until the early 1990s.
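A minimal sketch of that view of OOP, with invented class and method names: each object below holds its own data and responds to "messages" (method calls), including messages sent to it by another object.

class Account:
    def __init__(self, owner, balance=0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount):        # receiving the "deposit" message
        self.balance += amount

    def transfer_to(self, other, amount):
        self.balance -= amount
        other.deposit(amount)         # sending a message to another object

a, b = Account("Ada"), Account("Grace", 100)
b.transfer_to(a, 30)
print(a.balance, b.balance)           # 30 70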
3.3.6  The Software Industry

The number of computers in use and their size and speed expanded rapidly in the 1960s, escalating
the demand for software to support the numerous tasks for which computers were now
being used. This provided enormous opportunities for entrepreneurs to create new companies
to serve this expanding market. Some of the companies founded in the early 1960s were Informatics,
Electronic Data Systems (EDS) and California Analysis Center, Inc. (CACI) in 1962,
Management Science America (MSA) in 1963, and Keane, Inc. in 1965.
By 1965, there were an estimated 45 major software contractors in the U.S., some employing
more than 100 programmers and with annual revenues as much as $100 million. In addition,
there were hundreds of small firms, typically with just a few programmers. In 1967, it was

Basic (1964) Basic (Beginner's All-purpose Symbolic Instruction Code) is a family of high-level
programming languages. The original Basic was designed by John George Kemeny and
Thomas Eugene Kurtz at Dartmouth College (US), in order to provide access for non-science
students to computers. At the time, nearly all use of computers required writing custom
software, which was something only scientists and mathematicians tended to do. The language
(in one variant or another) became widespread on microcomputers in the late 1970s and home
computers in the 1980s. Basic remains popular to this day in a handful of highly modified
dialects and new languages based on it, such as Microsoft Visual Basic.
Basic was the first product sold by Microsoft and the first major case of software piracy: it
was copied and widely distributed even before it was released, because Bill Gates had lost
a copy during a public demonstration.
The Basic language was standardized as ANSI Minimal Basic (1978) and ANSI Full Basic (1987).

3.3.7  The Software Crisis

At that time, the programming task was very difficult. Computers stood idle while programmers
struggled with the machine language. These programs took long to write and even longer to
debug and correct. Due to these difficulties, there were very few programmers.
The software crisis was a term used to describe the impact of rapid increases in computer
power (according to Moore's law) and the complexity of the problems which could be tackled.
In essence, the crisis refers to the difficulty of writing correct, understandable, and verifiable
computer programs. The roots of the software crisis are complexity, expectations, and change.
Conflicting requirements have always hindered the software development process. For example,
while users demanded a large number of features, customers generally wanted to minimise the
amount they had to pay for the software and the time required for its development. Many
software projects ran over budget and schedule. Some projects caused property damage. A few
projects caused loss of life. The software crisis was originally defined in terms of productivity,
but evolved to emphasize quality.
A report that analyzed a series of projects and their degree of completion found that:
2% of the contracted software systems worked on delivery;
3% of the software systems could be made to work after a few modifications;

estimated that there were 2,800 software services firms in the U.S. By the end of the 1960s, a
number of these firms, such as Informatics, ADR, and CSC, were publicly held.
By the early 1960s, a customer of any of the major hardware manufacturers could expect to
have access to a library of software which was included (bundled) in the cost of the computer.
This software included the computer's operating system, of course, but also utility programs
(such as sort programs), compilers for languages such as Cobol and Fortran, and a growing
library of programs written to handle specific applications. IBM, for example, maintained
a library of application programs written by its programmers to meet the needs of specific
customers but which were then made available at no cost to other IBM customers.
Computer users had the choice of getting the software they needed from their hardware
vendors or having it custom-built for their needs by their own programmers or by a contract
programming firm. Many of the executives in the software industry didn't believe that there
would ever be a viable market for software products: it was too difficult to compete against free
software from the hardware manufacturers.
But early in the 1960s, some contract programming firms began to see opportunities, when
there was no comparable product available from the hardware vendor, to sell programs they had
written to more than one customer. For example, CACI began selling SIMSCRIPT, a simulation
language, in 1962, and ADPAC Corporation made several sales of its ADPAC compiler in 1964
to customers who had seen it used by ADPAC programmers and wanted it available to their
own programmers.
In 1965, ADR released AUTOFLOW, a program which automatically produced program
flowcharts by reading the program source code, and which ultimately was sold to thousands of
customers. And in November 1967, Informatics released MARK IV, a generalized file management
and report generation program, which surpassed $1 million in revenues within 12 months
after its formal announcement.
On 30 June 1969, IBM announced that, effective 1 January 1970, it would begin to unbundle
(charge separately for) some of its software, effectively ending the expectation of its customers
that they would always be able to get all the software they needed from IBM for free.

29% of the systems were delivered but never worked;
19% were used but then abandoned;

3.3.8  The NATO Software Engineering Conferences (1968, 1969)

The term software engineering was coined by Brian Randell and popularized by F. L. Bauer
during the NATO Software Engineering Conference (7-11 October 1968) in Garmisch, Germany.
Its proceedings were published in Brussels in 1969 with Naur and Randell as editors. Here are
some excerpts:
The Study Group [of the NATO Committee] concentrated on possible actions which
would merit an international, rather than a national, effort. In particular it focussed
its attentions on the problems of software. In late 1967 the Study Group recommended
the holding of a working conference on Software Engineering. The phrase
"software engineering" was deliberately chosen as being provocative, in implying the
need for software manufacture to be based on the types of theoretical foundations and
practical disciplines that are traditional in the established branches of engineering.
It was suggested that about 50 experts from all areas concerned with software problems
(computer manufacturers, universities, software houses, computer users,
etc.) be invited to attend the conference. It was further suggested that every effort
should be made to make the conference truly a working conference, whose discussions
should be organised under the three main headings: Design of Software, Production
of Software, and Service of Software.
Randell: I am worried about the term "software engineering". I would prefer a
name indicating a wider scope, for instance "data systems engineering".
Dijkstra: We, in the Netherlands, have the title Mathematical Engineer. Software
engineering seems to be the activity for the Mathematical Engineer par excellence.
This seems to fit perfectly. On the one hand, we have all the aspects of an engineering
activity, in that you are making some thing and want to see that it really works. On
the other hand, our basic tools are mathematical in nature.
Bauer's definition: Software engineering is the establishment and use of sound engineering
principles in order to obtain economical software that is reliable and works
efficiently on real machines.
A second NATO conference, on Software Engineering Techniques, was held in Rome in 1969,
and its proceedings were published in 1970 in Brussels.

3.4  The 1970s

3.4.1  XEROX PARC (1970)

Xerox Palo Alto Research Center (PARC) is the flagship research division of the Xerox Corporation,
founded in 1970. PARC has been the incubator of many elements of modern computing.
Most were included in the Alto, an early personal computer (1973), which introduced and unified
most aspects of the now-standard personal computer usage model: the mouse, computer-generated
color graphics, a graphical user interface featuring windows and icons, the WYSIWYG (What
47% were paid for but never delivered.

You See Is What You Get) text editor, InterPress (a resolution-independent graphical page description language and the precursor to PostScript), Ethernet, and fully formed object-oriented
programming in the Smalltalk programming language and integrated development environment.
The laser printer was developed at the same time, as an integral part of the overall environment.
3.4.2  Unix (1970)

Unix is a computer operating system developed by a group of AT&T employees at Bell Labs,
including Ken Thompson, Dennis Ritchie and Douglas McIlroy. In the 1970s the project was
named Unics, and eventually could support two simultaneous users. In 1973, Unix was rewritten
in C, contrary to the general notion at the time that something as complex as an operating
system, which must deal with time-critical events, had to be written exclusively in assembly
language. The migration from assembly language to the higher-level language C resulted in
much more portable software, requiring only a relatively small amount of machine-dependent
code to be replaced when porting Unix to other computing platforms.
Unix was designed to be portable, multi-tasking and multi-user in a time-sharing configuration.
Unix systems are characterized by various concepts: the use of plain text for storing data;
a hierarchical file system; treating devices and certain types of inter-process communication
(IPC) as files; and the use of a large number of software tools, small programs that can be
strung together through a command line interpreter using pipes, as opposed to using a single
monolithic program that includes all of the same functionality.
Unix operating systems are widely used in both servers and workstations. The Unix environment
and the client-server program model were essential elements in the development of
the Internet and the reshaping of computing as centered in networks rather than in individual
computers.
3.4.3  Software Overtakes Hardware (1973)

In an important report to DARPA, Barry W. Boehm predicted that software costs would
overwhelm hardware costs. DARPA had expected him to predict that hardware would remain
the biggest problem, encouraging them to invest in even larger computers. The report inspired
a change of direction in computing.
3.4.4  Intel 8080 (1974)

The Intel 8080 was an early microprocessor designed and manufactured by Intel. The 8-bit
CPU was released in April 1974 running at 2 MHz (at up to 500,000 instructions per second),
and is generally considered to be the first truly usable microprocessor CPU design. These events
marked the advent of the fourth generation of computers, the modern-day computers (following
the relay/vacuum tube, transistor, and integrated circuit technologies). The size started to
go down with the improvement in integrated circuits. Very Large Scale Integration (VLSI) ensured
that millions of components could be fit into a small chip. It reduced the size and price of
computers while at the same time increasing their power, efficiency and reliability.

[Figure: Intel 8080]
3.4.5  Altair 8800 (1975)

The MITS (Micro Instrumentation and Telemetry Systems) Altair 8800 was a microcomputer
design from 1975, based on the Intel 8080 CPU and sold as a mail-order kit through advertisements
in Popular Electronics, Radio-Electronics and other hobbyist magazines. The designers
intended to sell only a few hundred to hobbyists, and were surprised when they sold thousands
in the first month. Today the Altair is widely recognized as the spark that led to the personal
computer revolution of the next few years. The computer bus designed for the Altair was to become
a de facto standard, and the first programming language for the machine was Microsoft's
founding product, Altair Basic. The basic configuration cost around $500.

[Figure: Altair 8800]
3.4.6  IBM 5100 (1975)

The IBM 5100 Portable Computer was a desktop computer introduced in September 1975 which
had integrated keyboard, display, and mass storage on tape. It resembled the IBM Personal
Computer 6 years later, although it did not use a microprocessor. It was the size of a small
suitcase, weighed about 25 kg, and could be transported in an optional carrying case, hence the
"portable" name. While the IBM 5100 seems large today, in 1975 it was an amazing technical
accomplishment to package a complete computer with a large amount of ROM and RAM, CRT
display, and a tape drive into a machine that "small".

3.4.7  The Mythical Man-Month (Brooks, 1975)

The Mythical Man-Month: Essays on Software Engineering is a book on software project management
by Fred Brooks, whose central theme is that "Adding manpower to a late software
project makes it later". This idea is known as Brooks's law.
Brooks's observations are based on his experiences at IBM while managing the development
of OS/360. He had mistakenly added more workers to a project falling behind schedule. He
also made the mistake of asserting that one project (writing an Algol compiler) would require
six months, regardless of the number of workers involved. It required longer. The tendency
for managers to repeat such errors in project development led Brooks to joke that his book
is called "The Bible of Software Engineering" because "everybody reads it but nobody does
anything about it".
3.4.8  Apple (1976)

Apple was established on 1 April 1976 by Steve Jobs, Steve Wozniak, and Ronald Wayne to
sell the Apple I personal computer kit. The kits were hand-built by Wozniak and first shown to
the public at the Homebrew Computer Club. The Apple I was sold as a motherboard (with
CPU, RAM, and basic textual-video chips), not what is today considered a complete personal
computer. The Apple I went on sale in July 1976 and was market-priced at $666.66. The Apple
II was introduced on 16 April 1977 at the first West Coast Computer Faire. It came with color
graphics and an open architecture. While early models used ordinary cassette tapes as storage
devices, these were superseded by the introduction of a 5.25-inch floppy disk drive and interface,
the Disk II. The Apple II was chosen to be the desktop platform for the first "killer app" of
the business world, the VisiCalc spreadsheet program. VisiCalc created a business market for
the Apple II, and gave home users an additional reason to buy an Apple II: compatibility with
the office.

[Figure: IBM 5100]

3.4.9  Programming Languages

Pascal (1970) Pascal is an influential imperative and procedural programming language,
developed in 1970 by Niklaus Wirth as a small and efficient language intended to encourage
good programming practices using structured programming. It is based on the Algol programming
language and named in honor of the French mathematician and philosopher Blaise Pascal.
Initially, Pascal was largely, but not exclusively, intended to teach students structured programming.
A derivative known as Object Pascal was designed for object-oriented programming.
An important step forward compared to the other languages of the time was its support for
recursion.
In 1983, the language was standardized in the international standard ISO/IEC 7185.
Today, Pascal underlies several environments for developing commercial applications:
Borland Delphi (on Windows), and CodeWarrior Pascal and THINK Pascal (on Macintosh) are
the most widely used.
C (1972) C was developed by Dennis Ritchie in 1972 at AT&T's Bell Laboratories and was
used to write the Unix operating system. Because of antitrust laws, Bell Laboratories was barred
from claiming copyright on C and Unix. For this reason, C compilers are in the public domain
and were adopted by most universities.
Although C was designed for implementing system software, it is also widely used for developing
application software. In 1983, the American National Standards Institute formed a
committee to establish a standard specification of C. After a long and arduous process, the
standard was completed in 1989 and ratified as ANSI X3.159-1989 "Programming Language
C". This version of the language is often referred to as "ANSI C".
Smalltalk (1972) Smalltalk is an object-oriented, dynamically typed, reflective programming
language. In computer science, reflection is the process by which a computer program
can observe and modify its own structure and behavior; the programming paradigm driven by
reflection is called reflective programming. Smalltalk was created as the language to underpin
the "new world" of computing exemplified by "human-computer symbiosis". It was designed
and created in part for educational use, more so for constructionist learning, at Xerox PARC.
Constructionist learning is inspired by constructivist theories of learning that assume that learning
is an active process wherein learners are actively constructing mental models and theories

[Figure: Apple I]

[Figure: Smalltalk code]
Prolog (1972) Prolog is a logic programming language, first conceived by a group around
Alain Colmerauer in Marseille, France. It is a general-purpose language often associated with
artificial intelligence and computational linguistics. It has a purely logical subset, called "pure
Prolog", as well as a number of extralogical features.
Having its roots in formal logic, and unlike many other programming languages, Prolog is
declarative. The program logic is expressed in terms of relations, and execution is triggered
by running queries over these relations. Relations and queries are constructed using Prolog's
single data type, the term. Relations are defined by clauses. Given a query, the Prolog engine
attempts to find a resolution refutation of the negated query. If the negated query can be
refuted, i.e., an instantiation for all free variables is found that makes the union of the clauses and
the singleton set consisting of the negated query false, it follows that the original query, with the
found instantiation applied, is a logical consequence of the program. This makes Prolog (and
other logic programming languages) particularly useful for database, symbolic mathematics,
and language parsing applications.
While initially aimed at natural language processing, the language has since then stretched
far into other areas like theorem proving, expert systems, games, automated answering systems,
ontologies and sophisticated control systems, and modern Prolog environments support the
creation of graphical user interfaces, as well as administrative and networked applications.
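The relations-and-queries style can be loosely imitated in Python (this is only an analogy added for illustration, and it performs none of Prolog's unification or resolution): facts are tuples in a relation, and a query is a search over them.

# Facts of a "parent" relation, and a query that roughly mirrors the rule
# grandparent(G, C) :- parent(G, P), parent(P, C).
parent = {("tom", "bob"), ("bob", "ann"), ("bob", "pat")}

def grandparents_of(child):
    return {g for (g, p) in parent
              for (q, c) in parent
              if p == q and c == child}

print(grandparents_of("ann"))   # {'tom'}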

of the world around them. Constructionism holds that learning can happen most effectively
when people are actively making things in the real world.

SQL (1974) SQL (Structured Query Language) is a database computer language designed for
the retrieval and management of data in relational database management systems (RDBMS),
database schema creation and modification, and database object access control management.
The first version of SQL was developed at IBM by Donald D. Chamberlin and Raymond F.
Boyce. This version, initially called SEQUEL, was designed to manipulate and retrieve data
stored in IBM's original relational database product, System R. The core of SQL is formed by
a command language that allows the retrieval, insertion, updating, and deletion of data, and
the performing of management and administrative functions.
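A minimal, self-contained sketch of those four operations using Python's built-in sqlite3 module (the table and column names are invented for the example):

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, salary REAL)")
db.execute("INSERT INTO employee (name, salary) VALUES (?, ?)", ("Ada", 5000))        # insertion
db.execute("UPDATE employee SET salary = salary * 1.1 WHERE name = ?", ("Ada",))      # updating
print(db.execute("SELECT name, salary FROM employee").fetchall())                     # retrieval
db.execute("DELETE FROM employee WHERE name = ?", ("Ada",))                           # deletion
db.close()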
3.4.10  The Software Industry

In 1970, less than 1% of the public could have intelligently described what "computer software"
meant. Today, most professionals and many members of the public at large feel that they
understand software (but do they?). The 1970s saw the contract programming industry continue
to grow at a rapid pace. These companies came to be known as "professional services" firms, reflecting the fact that they often provided a broad range of consulting, analysis and design services in addition to programming.
The software products industry became firmly established as a viable source of software for computer users. If, at the beginning of the 1970s, customers were skeptical that software purchased from a vendor could meet their needs as well as software written in-house, by the end of the decade almost all computer users were buying some portion of their software from software products companies. As a result of unbundling, the hardware vendors were also major players in the field of software products.
Despite the growth and demonstrated success of the software industry throughout the 1970s, it was not generally recognized as a major investment opportunity. Funding dried up even for those professional services firms that had gone public in the 1960s and the newly established software products firms had an extremely difficult time raising capital to fund their growth. Almost all of their growth was internally financed and many of the company founders extensively leveraged their own personal assets to keep the companies going. In 1978, a full ten years after its founding, Cullinane Corporation went public, the first software product company to do so. It was founded with the intent to repackage and market software developed by users.
However, the perception of the software business as an investment opportunity was still a
number of years away.
Atari (1972) In 1972 Atari Inc., a video game and computer company, was founded by Nolan Bushnell and Ted Dabney. It was primarily responsible for the formation of the video arcade and modern video game industries. In Japanese, "atari" is the nominalized form of the verb "ataru", meaning "to hit the target" or "to receive something fortuitously". They named the company after a term from the game of Go, where "atari" is roughly equivalent to "check" in chess. The first game was Pong (a ping pong game). In 1973 they had sold 8,000-10,000 video consoles, and would eventually sell upwards of 35,000. In one year, Atari made a little over $3.2 million. However, there was a dark side to this fortune. Atari never patented the design for Pong. Since the game was designed using discrete logic, there was very little they could do to protect their intellectual property. Anyone who owned a machine could open it up, examine the circuit board, and copy it chip for chip. By the end of 1973, there were so many competitors selling Pong-style games that Atari was no longer the leading manufacturer of its own game. Some of the copies were made so well, they looked exactly like the original Atari versions. In 1975 the production of a flexible video game console began that was capable of playing all of Atari's games: Pong, Pong Double, Space Race, Gotcha! (a maze game). The result was the Atari 2600, one of the most successful consoles in history, sometimes called VCS for Video Computer System.


Prolog code

Atari Pong
Microsoft (1975) Following the launch of the Altair 8800, William Henry Gates III (Bill
Gates) called the creators of the new microcomputer, Micro Instrumentation and Telemetry
Systems (MITS), offering to demonstrate an implementation of the Basic programming language
for the system. After the demonstration, MITS agreed to distribute Altair Basic. Gates left
Harvard University, moved to Albuquerque, New Mexico where MITS was located, and founded
Microsoft there. On 1 January 1979, the company moved from Albuquerque to a new home in
Bellevue, Washington.
Oracle (1977) On 16 June 1977 Oracle Corporation was founded in Redwood Shores, California, as Software Development Laboratories (SDL) by Larry Ellison, Bob Miner and Ed Oates.
SDL was renamed to "Relational Software Inc." (RSI), and relocated to Sand Hill Road, Menlo
Park, California. The company decided to name the first version of its flagship product "version
2" rather than "version 1" because they believed customers might hesitate to buy the initial
release of their product.



3.5 Anii '80

3.5.1 IBM PC (1981)

The IBM Personal Computer, commonly known as the IBM PC, is the original version and
progenitor of the IBM PC compatible hardware platform. It is IBM model number 5150,
and was introduced in 1981. It was created by a team of engineers and designers under the
direction of Don Estridge of the IBM Entry Systems Division in Boca Raton, Florida. Alongside
"microcomputer" and "home computer", the term "personal computer" was already in use
before 1981. It was used as early as 1972 to characterize Xerox PARC's Alto. However, because of the success of the IBM Personal Computer, the term came to mean more specifically a microcomputer compatible with IBM's PC products.

IBM PC
3.5.2 MS-DOS (1981)

In 1981, Microsoft released the first version of MS-DOS (Microsoft Disk Operating System). It was the most commonly used member of the DOS family of operating systems and was the main operating system for computers during the 1980s. It was designed for computers based on the Intel 8086 family of microprocessors, particularly the IBM PC and compatibles. Eight major versions were released before Microsoft stopped development in 2000. It was the key product in Microsoft's growth from a programming languages company to a diverse software development firm, providing
the company with essential revenue and marketing resources. It was also the underlying basic
operating system on which early versions of Windows ran as a GUI.
3.5.3 Sinclair Spectrum și Commodore (1982)

In 1982, the Sinclair ZX Spectrum was released in the United Kingdom by Sinclair Research
Ltd. It was an 8-bit personal home computer, based on a Zilog Z80A CPU running at 3.5 MHz.
It was originally released in 1982 with 16 KB of RAM for £125 or with 48 KB for £175. It was among the first mainstream audience home computers in the UK. The introduction of the ZX
Spectrum led to a boom of companies producing software and hardware for it.



A major rival to the Spectrum was the Commodore 64, also an 8-bit home computer, released by Commodore International in 1982 in the USA at a price of $595. It featured 64 KB (65536 bytes) of RAM, with sound and graphics performance superior to those of IBM-compatible computers of that time. During the Commodore 64's lifetime, sales totalled 30 million units, making it the best-selling single personal computer model of all time. For a substantial period of time (1983-1985), the Commodore 64 dominated the market with approximately 40% share, outselling IBM PCs and Apple computers. Approximately 10,000 commercial software titles were made for the Commodore 64, including development tools, office applications, and games.

Commodore 64 computer
3.5.4 Apple Macintosh (1984)

Macintosh, commonly nicknamed "Mac", is a brand name which covers several lines of personal
computers designed, developed, and marketed by Apple Inc. The Macintosh 128K was released
in 1984 and it was the first commercially successful personal computer to feature a mouse and
a graphical user interface (GUI) rather than a command line interface. Through the second
half of the 1980s, the company established market share only to see it dissipate in the 1990s as
the personal computer market shifted towards IBM PC Compatible machines running MS-DOS
and Microsoft Windows. Production of the Mac is based on a vertical integration model in
that Apple facilitates all aspects of its hardware and creates its own operating system that is
pre-installed on all Macs. This is in contrast to most IBM compatible PCs, where multiple
vendors create hardware intended to run another company's software.


Sinclair ZX Spectrum+ 8-bit computer

3.5.5 Microsoft Windows (1985)

Microsoft first introduced the operating environment named Windows in November 1985 as an add-on to MS-DOS in response to the growing interest in graphical user interfaces (GUIs). Microsoft Windows came to dominate the world's personal computer market, overtaking Mac
OS, which had been introduced previously. By 2004, it was estimated that Windows had
approximately 90% of the client operating system market.

A screenshot of Windows 1.01


Microsoft Office is a set of interrelated desktop applications, servers and services, collectively referred to as an office suite, for the Microsoft Windows and Mac OS X operating systems. Prior to packaging its various office-type Macintosh software applications into Office, Microsoft released Mac versions of Word 1.0 in 1984, the first year of the Macintosh computer; Excel 1.0 in 1985; and PowerPoint 1.0 in 1987. Office was introduced by Microsoft in 1989 on Mac OS, with a version for Windows in 1990. Initially a marketing term for a bundled set of applications, the first version of Office contained Microsoft Word, Microsoft Excel, and Microsoft PowerPoint. Additionally, a "Pro" version of Office included Microsoft Access and Schedule Plus (a time-management software product whose functionality was later incorporated into Outlook). Over the years, Office applications have grown substantially closer, with shared features such as a common spell checker, OLE data integration and Microsoft Visual Basic for Applications scripting language.


The Macintosh 128K, the first Macintosh, was the first commercially successful personal computer to use images, rather than text, to communicate

Microsoft Word 5.1a for Mac OS


3.5.6 Nu există gloanțe de argint (Brooks, 1986)

No Silver Bullet - Essence and Accidents of Software Engineering is a well-known paper on software engineering written by Fred Brooks in 1986. Brooks argues that there is no single technology or practice that will serve as a "silver bullet" and, by itself, produce even an order-of-magnitude improvement in programmer productivity within a decade.
At the heart of the argument is the distinction between accidental complexity and essential complexity. Accidental complexity relates to problems that we create on our own and can be fixed: for example, the details of writing and optimizing assembly code or the delays caused by batch processing. Essential complexity is caused by the problem to be solved, and nothing can remove it: if users want a program to do 30 different things, then those 30 things are essential and the program must do those 30 different things.
Brooks claims that we have cleaned up much of the accidental complexity, and today's programmers spend the majority of their time addressing essential complexity. One technology that made a significant improvement in the area of accidental complexity was the invention of high-level languages, such as Fortran. Today's languages, such as C++, Java and C#, are considered to be improvements, but not of the same order of magnitude.
3.5.7 World Wide Web (1989)

The World Wide Web (commonly shortened to the "Web") is a system of interlinked hypertext
documents accessed via the Internet. With a Web browser, a user views Web pages that may
contain text, images, videos, and other multimedia and navigates between them using hyperlinks. The World Wide Web was created in 1989 by British scientist Sir Tim Berners-Lee,
working at the European Organization for Nuclear Research (CERN, Conseil Européen pour la Recherche Nucléaire) in Geneva, Switzerland, and released in 1992. (CERN is the organization that built the Large Hadron Collider, the world's largest and highest-energy particle accelerator complex, which lies underneath the Franco-Swiss border near Geneva, Switzerland, and circulated its first particle beam on 10 September 2008.)



3.5.8 Limbaje de programare

C++ (1983) C++ is a general-purpose programming language, regarded as a middle-level language, as it comprises a combination of both high-level and low-level language features.
Developed by Bjarne Stroustrup in 1979 at Bell Labs as an enhancement to the C programming language and originally named "C with Classes", it was renamed to "C++" in 1983.
Enhancements started with the addition of classes, followed by, among other features, virtual
functions, operator overloading, multiple inheritance, templates, and exception handling. The
C++ programming language standard was ratified in 1998 as ISO/IEC 14882:1998; the current version is the 2011 revision, ISO/IEC 14882:2011.
Perl (1987) Perl is a high-level, general-purpose, interpreted, dynamic programming language. Perl was originally developed by Larry Wall, a linguist working as a systems administrator for NASA, in 1987, as a general purpose Unix scripting language to make report processing
easier. Since then, it has undergone many changes and revisions and became very popular
among programmers.

Perl script

3.6 Anii '90

3.6.1 Windows

Windows 3.1x was a major release of Microsoft Windows. Several editions were released between 1992 and 1994, succeeding Windows 3.0. A special version named Windows 3.1 for Central and Eastern Europe was released that had fonts with diacritical marks characteristic of Central and Eastern European languages, including Romanian, and allowed the use of Cyrillic.


Windows 3.1 desktop showing a customized color theme


Windows 95 was released in August 1995, and was a significant progression from the company's previous Windows products. During development it was referred to as Windows 4.0 or by the internal codename "Chicago". Windows 95 was intended to integrate Microsoft's formerly separate MS-DOS and Windows products and included an enhanced version of DOS, often referred to as MS-DOS 7.0. It featured significant improvements over its predecessor,
Windows 3.1, most visibly the graphical user interface whose basic format and structure is still
used in later versions until Windows 7.
Windows 98 (codenamed "Memphis") was released in June 1998. Like Windows 95, it was
a hybrid 16-bit/32-bit monolithic product based on MS-DOS. Windows 98 was followed by
Windows Millennium in September 2000.
Windows NT 4.0 was released to manufacturing in July 1996. It is a 32-bit Windows system
available in both workstation and server editions with a graphical environment similar to that of
Windows 95. The "NT" designation in the products title initially stood for "New Technology"
according to Microsofts then-CEO Bill Gates.
3.6.2 Linux (1991)

Linux is a Unix-like computer operating system family which uses the Linux kernel. Linux is one of the most prominent examples of free software and open source development; typically all the underlying source code can be freely modified, used, and redistributed by anyone. Predominantly known for its use in servers, it is installed on a wide variety of computer hardware, ranging from embedded devices and mobile phones to supercomputers. The name "Linux" comes from the Linux kernel, originally written in 1991 by Linus Torvalds. The system's utilities and libraries usually come from the GNU operating system (GNU's design is Unix-like, but differs from Unix by being free software and containing no Unix code; development of GNU was announced by Richard Stallman in 1983, started in 1984 and was the original focus of the Free Software Foundation, FSF). The primary difference between Linux and many other popular contemporary operating systems is that the Linux kernel and other components are free and open source software.

In 1991, in Helsinki, Linus Torvalds began a project that was initially a terminal emulator, which Torvalds used to access the large Unix servers of the university. He wrote the program specifically for the hardware he was using and independent of an operating system because he wanted to use the functions of his new PC with an 80386 processor. Development was done on Minix using the GNU C compiler, which is still the main choice for compiling Linux today. Minix is a Unix-like computer operating system based on a microkernel architecture. Andrew S. Tanenbaum wrote this operating system to be used for educational purposes. Its name derives from the words "minimal" and "Unix".

Torvalds eventually realized that he had written an operating system kernel. On 25 August 1991, he announced this system in a Usenet posting to the newsgroup "comp.os.minix":


Hello everybody out there using minix - I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I'd like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file-system (due to practical reasons) among other things).
I've currently ported bash(1.08) and gcc(1.40), and things seem to work. This implies that I'll get something practical within a few months, and I'd like to know what features most people would want. Any suggestions are welcome, but I won't promise I'll implement them :-)
Linus (torvalds@kruuna.helsinki.fi)
PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that's all I have :-(.
Linus Torvalds
3.6.3 Browser-ul Mosaic (1993)

Scholars generally agree that the turning point for the World Wide Web began with the introduction of the Mosaic Web browser in 1993, a graphical browser developed by a team at
the National Center for Supercomputing Applications at the University of Illinois, led by Marc
Andreessen. Funding for Mosaic came from the High-Performance Computing and Communications Initiative, a funding program initiated by the High Performance Computing and
Communication Act of 1991, one of several computing developments initiated by Senator Al
Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in Web pages, and the Web's popularity was lower than that of older protocols in use over the Internet, such as Gopher and WAIS (Wide Area Information Servers). Mosaic's graphical user interface allowed the Web to become, by far, the most popular Internet protocol.



3.6.4 Netscape Navigator (1994)

One of the central figures in the Netscape story is Marc Andreessen, cofounder of Netscape Communications Corporation and co-author of Mosaic at the National Center for Supercomputing Applications. After his graduation from Illinois in 1993, Andreessen moved to California to work at Enterprise Integration Technologies. Andreessen then met with Jim Clark, the recently-departed founder of Silicon Graphics. Clark believed that the Mosaic browser had great commercial possibilities and provided the seed money. Soon Mosaic Communications Corporation was in business in Mountain View, California, with Andreessen appointed as a vice-president. The University of Illinois was unhappy with the company's use of the Mosaic name, so "Mosaic Communications Corporation" changed its name to Netscape Communications and its flagship web browser was the Netscape Navigator.
Netscape announced in its first press release that it would make Navigator freely available to
all non-commercial users, and the Beta version of version 1.0 was freely downloadable. However,
within 2 months of that press release, Netscape apparently reversed its policy on who could
freely obtain and use the full version 1.0 release in December 1994, by only mentioning that
educational and non-profit institutions could use it at no charge.
During development, the Netscape browser was known by the code name "Mozilla", which
became the name of a Godzilla-like cartoon dragon mascot used prominently on the company's
web site.

Mozilla mascot
By 2002 its users had almost disappeared, partly because of Microsoft's bundling (inclusion) of its Internet Explorer web browser software with the Windows operating system software (both Microsoft products), and partly because Netscape corporation did not sustain Netscape Navigator's technical innovation after the late 1990s. The business demise of Netscape was a central premise of Microsoft's antitrust trial, wherein the Court ruled that Microsoft corporation's bundling of Internet Explorer with the Windows operating system was monopolistic, an illegal business practice.
The war between Microsoft and Netscape was denominated the "Browser Wars". Versions 1.0 and 2.0 of Internet Explorer (IE) were mostly inferior to contemporary versions of Netscape Navigator; IE 3.0 (1996) caught up competitively; IE 4.0 (1997) seemed to, but did not, overtake Netscape; IE 5.0 (1999), with improved stability, took market share from Netscape Navigator for the first time.
In March 1998, Netscape released most of the code base for Netscape Communicator under an open source license. The product, Netscape 5, used open-source community contributions, and was known as "Mozilla", Netscape Navigator's original code name. Netscape programmers gave Mozilla a different GUI, releasing it as Netscape 6 and Netscape 7. After a long public beta test, Mozilla 1.0 was released on 5 June 2002. The same code base, notably the Gecko layout engine, became the basis of independent applications, including Firefox and Thunderbird. Gecko is the second most popular layout engine on the World Wide Web, after Trident (used by Internet Explorer for Windows since version 4), followed by WebKit (used by Safari and Google Chrome) and Presto (used by Opera).

Mosaic Web Browser Interface

3.6.5 Microsoft Internet Explorer (1995)

The Internet Explorer project was started in the summer of 1994 by Thomas Reardon and subsequently led by Benjamin Slivka, leveraging source code from Spyglass Mosaic, a commercial
web browser with formal ties to the pioneering NCSA Mosaic browser. Internet Explorer 1.0 debuted in August 1995 and came with Microsoft Plus! for Windows 95 and the OEM release of Windows 95.
Presently known as Windows Internet Explorer, it has been the most widely used web
browser since 1999, attaining a peak of about 95% usage share during 2002 and 2003 with IE5
and IE6 but steadily declining since, despite the introduction of IE7. Microsoft spent over 100
million dollars a year in the late 1990s, with over 1,000 people working on IE by 1999.
Windows Internet Explorer 8 (IE8) was released in March 2009. In September 2010, Microsoft launched the IE9 Public Beta.
In September 2010, IE had a market share of 59.65% and Firefox had 22.96%.
In September 2013, IE had 28.56% and Firefox had 18.36%. Google Chrome had 40.8%.
3.6.6 Hotmail (1996) și Yahoo! Mail (1997)

The Hotmail web-based email service was founded by Jack Smith and Sabeer Bhatia and launched in 1996. Hotmail was one of the first free webmail services. Hotmail was acquired in 1997 by Microsoft for an estimated $400M, and rebranded as "MSN Hotmail". The current version, "Windows Live Hotmail", was officially announced in 2005 and released worldwide in 2007.
Yahoo! Mail was inaugurated in 1997. In early 2008, Yahoo! started offering unlimited mail storage even to its non-paying users, in response to heated competition in the free-webmail market segment, especially with Google's Gmail.
In 2009, with over 280 million users, Yahoo! was the largest web-based email service. Windows Live Hotmail was the second most popular web-based free email service with over 270 million users, and Gmail had 146 million users.
In 2012, Gmail had 425 million users, Outlook.com had 325 million users, and Yahoo! Mail had 298 million users.



3.6.7 Google (1998)

Google began in January 1996, as a research project by Larry Page, who was soon joined by
Sergey Brin, two Ph.D. students at Stanford University in California. They hypothesized that
a search engine that analyzed the relationships between websites would produce better ranking
of results than existing techniques, which ranked results according to the number of times
the search term appeared on a page. Their search engine was originally nicknamed "BackRub"
because the system checked backlinks to estimate the importance of a site. The name "Google"
originated from a common misspelling of the word "googol", which refers to 10^100.
Convinced that the pages with the most links to them from other highly relevant web pages
must be the most relevant pages associated with the search, Page and Brin tested their thesis
as part of their studies, and laid the foundation for their search engine. Originally, the search
engine used the Stanford University website with the domain google.stanford.edu. The domain
google.com was registered on 15 September 1997, and the company was incorporated as Google
Inc. in September 1998 at a friends garage in Menlo Park, California. In March 1999, the
company moved into o ces in Palo Alto, home to several other noted Silicon Valley technology
startups. Then it moved to a complex of buildings in Mountain View, known as the Googleplex,
bought in 2006 for $319 million.
In May 2013, the top search engines were: Google (66.7% of the total number of visits),
Bing (17.4%), and Yahoo! Search (11.9%).
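
The backlink idea mentioned above can be illustrated with a toy sketch in Python; the miniature link graph is invented, and simply counting incoming links is only a crude stand-in for the much more elaborate PageRank computation that Google actually used.

    # Toy illustration of the "BackRub" idea: pages that receive many links
    # from other pages are considered more important. The graph is invented.
    links = {
        "page_a": ["page_b", "page_c"],
        "page_b": ["page_c"],
        "page_c": ["page_a"],
        "page_d": ["page_c"],
    }

    backlinks = {}
    for source, targets in links.items():
        for target in targets:
            backlinks.setdefault(target, set()).add(source)

    # Rank pages by the number of pages linking to them (page_c comes first here).
    ranking = sorted(backlinks, key=lambda page: len(backlinks[page]), reverse=True)
    for page in ranking:
        print(page, len(backlinks[page]))
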
3.6.8 Limbaje de programare

Python (1991) Python was first released by Guido van Rossum in 1991 (labeled version 0.9.0) and reached version 1.0 in January 1994. It is a general-purpose, high-level programming language. Its design philosophy emphasizes programmer productivity and code readability. Python's core syntax and semantics are minimalist, while the standard library is large and comprehensive. It is unusual among popular programming languages in using whitespace as block delimiters. Python supports multiple programming paradigms (primarily object oriented, imperative, and functional) and features a fully dynamic type system and automatic memory management, similar to Perl. The language has an open, community-based development model managed by the non-profit Python Software Foundation. While various parts of the language have formal specifications and standards, the language as a whole is not formally specified. The
de facto standard for the language is the CPython implementation, written in C.



PHP (1995) PHP (a recursive acronym for "PHP: Hypertext Preprocessor") is a computer
scripting language. Originally designed for producing dynamic web pages, it has evolved to
include a command line interface capability and can be used in standalone graphical applications. While PHP was originally created by Rasmus Lerdorf in 1995, the main implementation
of PHP is now produced by The PHP Group and serves as the de facto standard for PHP as
there is no formal specification. PHP is released under the PHP License, and the Free Software Foundation considers it to be free software. PHP is a widely used general-purpose scripting language that is
especially suited for web development and can be embedded into HTML. It generally runs on a
web server, taking PHP code as its input and creating web pages as output. It can be deployed
on most web servers and on almost every operating system and platform free of charge. PHP
is installed on more than 20 million websites and 1 million web servers.
Java (1995) James Gosling initiated the Java language project in June 1991 for use in one
of his many set-top box projects. A set-top box (STB) is a device that connects to a television
and an external source of signal, turning the signal into content which is then displayed on
the television screen. The language, initially called "Oak" after an oak tree that stood outside
Gosling's office, also went by the name "Green", and was later renamed "Java", from
a list of random words.
The Java language was the result of Sun Microsystems' Stealth Project, whose goal was to research the applicability of computers to the consumer electronics market, with a view to creating intelligent electronic products that could be controlled and programmed centrally, through a device similar to a remote control. The same requirements of stability and independence in heterogeneous systems also applied to the Internet. Therefore, although initially designed for other purposes, Java turned out to be a perfect fit for World Wide Web applications.
Sun formally presented Java in 1995. Soon afterwards, Netscape announced that it would incorporate Java support into its browser. Later, Microsoft did the same, strengthening Java's role in the Internet arena.
The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities. Java applications are typically compiled to bytecode that can run on any Java virtual machine (JVM) regardless of computer architecture.
As of May 2007, in compliance with the specifications of the Java Community Process, Sun made available most of their Java technologies as free software under the GNU General Public License.
Java SE (Standard Edition) 7 was released in 2011.

This Python script will rename all files in a directory
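
The script referenced by this caption is not reproduced here; a minimal sketch of what such a renaming script could look like is given below (the directory path and the numbering scheme are invented for the example).

    import os

    # Rename every file in a directory by prefixing it with a sequence number.
    # The path and the naming scheme are invented for this illustration.
    directory = "/tmp/example"
    for index, name in enumerate(sorted(os.listdir(directory)), start=1):
        old_path = os.path.join(directory, name)
        if os.path.isfile(old_path):
            new_path = os.path.join(directory, f"{index:03d}_{name}")
            os.rename(old_path, new_path)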

JavaScript (1996) JavaScript is a scripting language most often used for client-side web development. It is a dynamic, weakly typed, prototype-based language with first-class functions. JavaScript was influenced by many languages and was designed to look like Java, but be easier for non-programmers to work with. Although best known for its use in websites as client-side JavaScript, JavaScript is also used to enable scripting access to objects embedded in other applications (Mozilla Firefox, Adobe Acrobat, Photoshop etc.). JavaScript, despite the name, is essentially unrelated to the Java programming language, although both have the common C syntax, and JavaScript copies many Java names and naming conventions. The language was originally named "LiveScript" but was renamed in a co-marketing deal between Netscape and Sun, in exchange for Netscape bundling Sun's Java runtime with their then-dominant browser.
"JavaScript" is a trademark of Sun Microsystems.

3.7 Mileniul 3

3.7.1 Windows

The latest Windows products are: Windows 2000 (released in February 2000), Windows XP
(October 2001), Windows Server 2003 (April 2003), Windows Vista (January 2007), Windows
Server 2008 (February 2008), Windows 7 (October 2009), Windows 8 (October 2012).
Regula 80-20. Vilfredo Pareto (1848-1923) was an Italian sociologist, economist, and philosopher. He made several important contributions, particularly in the study of income distribution and in the analysis of individuals' choices. In 1906, he made the famous observation that 20% of the population owned 80% of the property in Italy, later generalised into the so-called Pareto principle (also termed the 80-20 rule).
One common adage in the IT industry is that 80% of all end users generally use only 20% of a software application's features.
In 2002, after analyzing the results of the automated error-reporting tool (mini-dump), Microsoft learned that 80% of the errors and crashes in Windows and Office are caused by 20% of the entire pool of bugs detected, and that more than 50% of the headaches derive from a mere 1% of all flawed code.
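
As a small numerical illustration of this 80-20 pattern (with invented figures, not Microsoft's actual data), one can sort the defects by how many crashes each causes and measure what fraction of all crashes the worst 20% of defects account for:

    # Invented crash counts per bug, sorted from most to least frequent.
    crashes_per_bug = [400, 250, 150, 60, 40, 30, 25, 20, 15, 10]

    total = sum(crashes_per_bug)                               # 1000 crashes overall
    worst_bugs = crashes_per_bug[: len(crashes_per_bug) // 5]  # the worst 20% (2 bugs)
    share = sum(worst_bugs) / total

    print(f"The worst 20% of bugs cause {share:.0%} of all crashes")  # 65% with these numbers
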
3.7.2 Mozilla Firefox (2003)

Mozilla Firefox is a free and open source web browser descended from the Mozilla Application Suite, managed by the Mozilla Corporation. Dave Hyatt and Blake Ross began working on the Firefox project as an experimental branch of the Mozilla project. They believed the commercial requirements of Netscape's sponsorship and developer-driven feature creep compromised the utility of the Mozilla browser. They created a stand-alone browser, with which they intended to replace the Mozilla Suite. In April 2003 the Mozilla Organization announced that they planned to change their focus from the Mozilla Suite to Firefox and Thunderbird (an e-mail and news client).
Mozilla Firefox 3 was released in June 2008 and had over 8 million unique downloads the day it was released, setting a Guinness World Record. Firefox had 19.73% of the recorded usage share of web browsers as of August 2008, making it the second most popular browser in current use worldwide, after Internet Explorer.



3.7.3 Gmail / Google Mail (2004)

Gmail is a free web-based e-mail (webmail) service provided by Google. On 1 April 2004
the product began as an invitation-only beta release. In February 2007 the beta version was
opened to the general public. With an initial storage capacity of 1 GB, it drastically increased
the standard for free storage. In October 2013, the limit was 15 GB. It has a search-oriented
interface and a "conversation view" similar to an internet forum. Gmail is well-known for its
use of the Ajax programming technique in its design, and has tens of millions of users.
3.7.4 Google Chrome (2008)

Google Chrome is a web browser built with open source code and developed by Google.
The name is derived from the graphical user interface frame, or "chrome", of web browsers.
"Chromium" is the name of the open source project behind Google Chrome, released under
the BSD permissive free software license. (A permissive free software licence is a free software
licence for a copyrighted work that offers many of the same freedoms as releasing a work to the
public domain; in contrast, copyleft licences like the GNU General Public License require copies
and derivatives of the source code to be made available on terms no more restrictive than those
of the original licence.) A beta version for Microsoft Windows was released in September 2008
in 43 languages, and the public stable release was in December 2008. In September 2013, as
stated before, Chrome was the most widely used browser, with 40.8% of the web browser market share.
3.7.5 Limbaje de programare

C# (2002) C# (pronounced C Sharp) is an object-oriented programming language developed by Microsoft as part of the .NET initiative and later approved as a standard by ECMA (ECMA-334) and ISO (ISO/IEC 23270). Anders Hejlsberg, the designer of Delphi, leads development of the C# language, which has an object-oriented syntax based on C++ and includes influences from aspects of several other programming languages (most notably Delphi and Java) with a particular emphasis on simplification.
In January 1999 Anders Hejlsberg formed a team to build a new language at the time called
"Cool". By the time the .NET project was publicly announced at the July 2000 Professional
Developers Conference (PDC), the language had been renamed C#. Hejlsberg stated that
flaws in most major programming languages (e.g. C++, Java, Delphi, and Smalltalk) drove
the fundamentals of the Common Language Runtime (CLR), which, in turn, drove the design
of the C# programming language itself.
C# is a modern language that combines the best features of the most widely used programming languages. Together with C#, Microsoft also launched the .NET platform, which allows programs written in different languages to be compiled and to interoperate. In terms of syntax, C# is very similar to Java, but it stays closer to C++. Both Java and C# first compile to an intermediate language: Java bytecode and, respectively, Microsoft Intermediate Language (MSIL), also called Common Intermediate Language (CIL). In C#, however, the compilation of the intermediate code into native code is more efficient. Like Java, C# has a garbage collector, which has been shown mathematically to be very close to the achievable optimum.
In August 2012, C# 5.0 and .NET Framework version 4.5 were launched alongside the release of Visual Studio 2012.


3.7.6 Rețele sociale

A social network service focuses on building online communities of people who share interests
and/or activities, or who are interested in exploring the interests and activities of others. Most
social network services are web based and provide a variety of ways for users to interact,
such as e-mail and instant messaging services. Social networking has encouraged new ways to
communicate and share information. Social networking websites are being used regularly by
millions of people.
Among the many social networking websites (over 100), we can mention:
Facebook (2004): initially intended for college students, now allows general membership.
It had 1.15 billion unique visitors in August 2013;
Twitter (2006): a free social networking service that allows users to send "updates" (text-based posts that are up to 140 characters long) via SMS, instant messaging, email, the Twitter website, or an application such as Twitterrific. In March 2013, it had 500
million unique visitors;
Google+ (2011): 343 million unique visitors in January 2013;
LinkedIn (2003): a tool for business networking, with 238 million unique visitors in August
2013.

Concluzii

Software cost now forms the major component of a computer system's cost. Software is currently
extremely expensive to develop and is often unreliable. In this course, we have discussed a few
themes regarding software and software engineering:
1. The problem domain for software engineering is industrial strength software;
2. Software is not just a set of computer programs but comprises programs and associated
data and documentation. Industrial strength software is expensive and difficult to build,
expensive to maintain due to changes and rework, and has high quality requirements;
3. Software engineering is the discipline that aims to provide methods and procedures for
systematically developing industrial strength software. The main driving forces for software engineering are the problem of scale, quality and productivity (Q&P), consistency,
and change. Achieving high Q&P consistently for problems whose scale may be large and
where changes may happen continuously is the main challenge of software engineering;
4. The fundamental approach of software engineering to achieve the objectives is to separate
the development process from the products. Software engineering focuses on process since
the quality of products developed and the productivity achieved are heavily influenced
by the process used. To meet the software engineering challenges, software development
is a phased process.
The milestones of software history were also presented, from the 1940s until the present.

