

Software Crisis 2.0

Brian Fitzgerald
Lero—the Irish Software Engineering Research Centre

Individual efforts to improve software development capability are disjointed and not likely to deliver the capacity needed to keep pace with advances in hardware technology and the opportunities afforded by big data.

Although barely 50 years old, the software domain has already endured one well-documented crisis. Software Crisis 1.0 first arose in the 1960s, with software taking longer and costing more to develop than estimated, and not working very well when eventually delivered. Nevertheless, software is one of the computing revolution's big success stories, spurring a huge shift in how we go about our daily lives.

The past 50 years have also seen enormous advances in hardware capability, with dramatic reductions in hardware costs allied to equally impressive increases in processing power and device proliferation. An almost infinite amount of data is now available through ubiquitous sensors and applications such as Google.

Complementing these "push" factors is a significant "pull" factor arising with the emergence of digital natives—users who have never known life without technology, but who have an enormous appetite for new technological applications. The advances in hardware technology along with the vast amounts of potentially available data afford truly enormous opportunities for individuals, business, and society.

Unfortunately, we haven't seen similar advances in software development capability, giving rise to what I call Software Crisis 2.0. Individual efforts seek to address this crisis—data analytics, parallel processing, new development methods, cloud services—but they're disjointed and not likely to deliver the software development capacity needed.

SOFTWARE CRISIS 1.0
The term software was first coined in 1958 (mathtrek_7_31_00.html), but within 10 years, problems in software's development and delivery led to the phrase software crisis (P. Naur and B. Randell, eds., "Software Engineering: A Report on a Conference Sponsored by the NATO Science Committee," NATO, 1968).

Over the years, several studies have confirmed Software Crisis 1.0. Per Flaatten and colleagues estimated the average project's development time at 18 months (Foundations of Business Systems, Dryden Press, 1989)—a conservative figure, given that other estimates put the figure at three years ("The Software Trap: Automate—or Else," Business Week, 9 May 1988, pp. 142-154) and even up to five years (T. Taylor and T. Standish, "Initial Thoughts on Rapid Prototyping Techniques," ACM SIGSOFT Software Eng. Notes, vol. 7, no. 5, 1982, pp. 160-166).

Perhaps this isn't surprising, given that an IBM study estimated 68 percent of all software projects overran their schedules (P. Bowen, "Rapid Application Development: Concepts and Principles," IBM document no.
0018-9162/12/$31.00 © 2012 IEEE Published by the IEEE Computer Society APRIL 2012 89

94283UKT0829, 1994). In relation to cost, the IBM study also suggested that development projects were as much as 65 percent over budget. Indeed, the term shelfware was coined to refer to software systems that are delivered but never used.

Although the Standish Group continues to paint a rather bleak picture of the high rates of software project failure (newsroom/chaos_manifesto_2011.php), its findings and methodology have been challenged (J. Eveleens and C. Verhoef, "The Rise and Fall of the CHAOS Reports," IEEE Software, 2010, pp. 30-36). In fact, I believe that Software Crisis 1.0 has passed, and that the myriad, incremental advances pushing software development forward have changed our lives for the better. Changes in practice have ultimately led the field to the point where software is now routinely developed largely on time, within budget, and meeting user expectations.

Unfortunately, Software Crisis 2.0 is now looming. As Figure 1 shows, this crisis stems from the inability to produce software that can leverage the staggering increase in data generated in the past 50 years and the demands of the devices and users that can manipulate it.

[Figure 1. Software Crisis 2.0. The diagram shows push factors (technology advances such as PFLOPS and multiprocessors, and data availability from the proliferation of sensors and other technology applications) and a pull factor (digital natives and their desire to consume new technology) converging on Software Crisis 2.0. The demand for data from digital natives, coupled with the huge volume of data now generated through ubiquitous mobile devices, sensors, and applications, has led to a new software crisis.]

BIG DATA AND THE RISE OF THE DIGITAL NATIVE
Many eye-catching figures and statistics illustrate the enormous advances in hardware capacity over the past half-century. Moore's law, for example, predicted the doubling of hardware capacity roughly every 18 months. To put that in perspective, if you had invested just a single dollar when Moore first made his prediction, and if the return on investment had kept pace accordingly, your net worth would be more than $1 quadrillion, or $1 million billion.

Similar "laws" parallel Moore's prediction in relation to storage (Kryder's law) and network capacity (Butters' law). On each occasion when hardware advances appear to have halted due to an insurmountable challenge in the fundamental laws of physics—the impurity of atoms, light-wavelength limits, heat generation, radiation-induced forgetfulness, for example—new advances have emerged to overcome them.

Although it's extremely difficult to quantify the increased volume of electronic data that potentially exists, it undoubtedly follows a similar pattern. In 2005, Eric Schmidt, Google's CEO, suggested that the amount of data available electronically comprised 5 million Tbytes (5 trillion Mbytes), of which Google indexed only 0.004 percent (http://does-one-measure-a-3285174.html). At that time, Schmidt estimated the amount of data to be doubling every five years.

In 2010, Dave Evans, Chief Futurist at Cisco Systems, estimated that 35 billion devices were connected to the Internet, which comes out to more than five times the planet's population. This figure is estimated to increase to 100 billion devices by 2020, giving rise to the concept of the Internet of Things (IoT)—the virtual representation of uniquely identifiable things in an Internet-like structure. An exemplar project designed for the IoT is Hewlett-Packard's plan to place one trillion smart dust sensors all over the world as part of a planet-wide sensing network infrastructure. These sensors will detect a wide variety of factors, including motion, vibration, light, temperature, barometric pressure, airflow, and humidity, and will have obvious applications in transportation, health, energy management, and building automation.

Seeking to extend the IoT concept, Usman Haque proposes an "ecosystem of environments" through his Pachube project (pachube.php), a service that allows consumers to tag and share real-time sensor data from objects and environments globally. The potential success of user-led innovation, co-creation of value, and high-profile crowdsourcing in solving complex R&D problems for NASA, Eli Lilly, and Du Pont highlights the crucial role of the digital native.

By age 20, digital natives will have spent 20,000 hours online and can cope with, and even welcome, an

abundance of information (S. Vodanovich, D. Sundaram, and M. Myers, "Digital Natives and Ubiquitous Information Systems," Information Systems Research, vol. 21, no. 4, 2010, pp. 711-723).

Rather than resisting technology, digital natives have an insatiable appetite for its new applications. By early 2012, mobile handset cellular subscriptions reached almost 6 billion (marketing-tools/latest-mobile-stats). Although obviously not evenly distributed, this equates to almost 90 percent of the world's population.

RESEARCH INITIATIVES
Various initiatives have sought to address Software Crisis 2.0. Early efforts in computer-aided software engineering (CASE) sought to automate software development, but they haven't solved the problem. Similarly, initiatives in software architectures, patterns, reuse, and software product lines sought to provide improvements by building on existing foundations. Software capability maturity models and software development initiatives such as method engineering have likewise been the subject of research.

Several initiatives support hardware advances in the areas of multicore processing and parallel computing. More recently, autonomous computing research has sought to deliver self-maintaining systems that would evolve automatically, an attempt that builds on previous artificial intelligence initiatives such as genetic algorithms. Efforts in relation to the Semantic Web and ontologies have also sought to address "big data."

Given the scarcely comprehensible increases in hardware power and data capacity to date, it's perhaps surprising that we have yet to find a silver bullet that delivers even a modest one-order-of-magnitude improvement in software productivity. Can we even imagine what life would be like if software development had evolved at the same pace as hardware and data? Wirth's law effectively summarizes the comparative evolution in the software domain—namely, that software is getting slower more rapidly than hardware becomes faster (N. Wirth, "A Plea for Lean Software," Computer, Feb. 1995, pp. 64-68).

We've entered an era in which the limits of our imagination should be the only limiting factor in taking advantage of the past 50 years' worth of advances to help solve intractable problems in areas such as healthcare, energy efficiency, and climate control. But the first step in solving any problem is to acknowledge its actual existence. This article might seem controversial, but it seeks to accomplish the easy part—naming the problem. The much more complex subsequent steps require the identification of an agenda to resolve it.

The author acknowledges feedback from Michael Myers and financial support from Science Foundation Ireland grant 10/CE/I1855 to Lero—the Irish Software Engineering Research Centre.

Brian Fitzgerald holds the Frederick Krehbiel Chair in Innovation in Global Business and Technology at the University of Limerick. Contact him at

Editor: Mike Hinchey, Lero—The Irish Software Engineering Research Centre;

Selected CS articles and columns are available for free at
