
With the compliments of VMware

Applications

VMware Special Edition
Learn to:
Overcome app delivery and
management challenges
Build a winning
application strategy

Charles Barrett
Mark Ewert
Ben Goodman

About VMware
VMware is a leader in cloud infrastructure and
business mobility. Built on VMware's industry-leading
virtualization technology, our solutions deliver a
brave new model of IT that is fluid, instant and more
secure. Customers can innovate faster by rapidly
developing, automatically delivering and more safely
consuming any application. With 2014 revenues of
$6 billion, VMware has more than 500,000 customers
and 75,000 partners. The company is headquartered
in Silicon Valley with offices throughout the world
and can be found online at www.vmware.com.

Applications

By Charles Barrett, Mark Ewert
and Ben Goodman

Applications For Dummies


Published by
John Wiley & Sons, Ltd
The Atrium
Southern Gate
Chichester
West Sussex
PO19 8SQ
England
For details on how to create a custom For Dummies book for your business or organisation,
contact CorporateDevelopment@wiley.com. For information about licensing the
For Dummies brand for products or services, contact BrandedRights&Licenses@Wiley.com.
Visit our Home Page on www.customdummies.com
Copyright © 2015 by John Wiley & Sons Ltd, Chichester, West Sussex, England
All Rights Reserved. No part of this publication may be reproduced, stored in a retrieval system
or transmitted in any form or by any means, electronic, mechanical, photocopying, recording,
scanning or otherwise, except under the terms of the Copyright, Designs and Patents Act 1988 or
under the terms of a licence issued by the Copyright Licensing Agency Ltd, 90 Tottenham Court
Road, London, W1T 4LP, UK, without the permission in writing of the Publisher. Requests to the
Publisher for permission should be addressed to the Permissions Department, John Wiley &
Sons, Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, England, or emailed to
permreq@wiley.com, or faxed to (44) 1243 770620.
Trademarks: Wiley, the Wiley Publishing logo, For Dummies, the Dummies Man logo, A Reference
for the Rest of Us!, The Dummies Way, Dummies Daily, The Fun and Easy Way, Dummies.com and
related trade dress are trademarks or registered trademarks of John Wiley & Sons, Inc. and/or its
affiliates in the United States and other countries, and may not be used without written permission. All other trademarks are the property of their respective owners. Wiley Publishing, Inc., is
not associated with any product or vendor mentioned in this book.
Limit of Liability/Disclaimer of Warranty: The publisher, the author, and
anyone else involved in preparing this work make no representations or
warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including
without limitation warranties of fitness for a particular purpose. No
warranty may be created or extended by sales or promotional materials.
The advice and strategies contained herein may not be suitable for every
situation. This work is sold with the understanding that the publisher is
not engaged in rendering legal, accounting, or other professional services. If professional assistance is required, the services of a competent
professional person should be sought. Neither the publisher nor the
author shall be liable for damages arising herefrom. The fact that an
organization or Website is referred to in this work as a citation and/or
a potential source of further information does not mean that the author
or the publisher endorses the information the organization or Website
may provide or recommendations it may make. Further, readers should be
aware that Internet Websites listed in this work may have changed or
disappeared between when this work was written and when it is read.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in
print may not be available in electronic books.
ISBN: 978-1-119-09005-2
Printed and bound in the United Kingdom by Page Bros Ltd., Norwich
10 9 8 7 6 5 4 3 2 1

Table of Contents
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
About This Book ...................................................................................1
Foolish Assumptions ............................................................................2
How This Book is Organized ..............................................................2
Icons Used in This Book ......................................................................3
Where to Go from Here ........................................................................3

Chapter 1: It's All About the Apps . . . . . . . . . . . . . . . . . . . . . . . . . 5


Understanding Apps ............................................................................5
Giving the People What They Want ..................................................6
Vive La (Continuous) Revolution ......................................................6
Working Out What You Need ..............................................................8

Chapter 2: A Brief History of Digital Computing
and Computer Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
The First Steps .......................................................................................9
Moving On: The First Software .........................................................10
Computers Mean Business ...............................................................10
The IBM 650 ..............................................................................11
Technology triumphant: transistors
and integrated circuits .......................................................12
Mind your language ................................................................13
Innovation, innovation, innovation .....................................13
The '70s: A DIY Decade .......................................................................14
Finding the key .........................................................................14
Developing new languages and systems ............................15
The '80s: Up Close and Personal ......................................................16
Devices and desires ................................................................16
The rise of the gamer ..............................................................17
Scaling down the mainframe .................................................17
Programmers proliferate .......................................................17
Share and share alike: The compatibility problem ..........18
It's PC gone mad . . . ................................................................19
The '90s: PC Paradise ..........................................................................19
Communication and connectivity .......................................19
The Wonder of the Web .....................................................................20
Greeting the New Millennium ...........................................................22
Getting mobile ..........................................................................22
Embracing the cloud ...............................................................23


Chapter 3: Managing the Application Lifecycle:
Selection to Packaging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
Determining the App You Need .......................................................25
Determining Whether to Build or Buy ............................................26
Buying off-the-shelf: Packaged software .............................27
Refining the job: Customized software ...............................28
You can go your own way: The fully-customized route....29
It Takes Apps to Build Apps: Supporting Developers .................31

Chapter 4: The Application Lifecycle:
Deployment to Decommissioning . . . . . . . . . . . . . . . . . . . . . 35
Deploying Applications ......................................................................35
Installing apps manually ........................................................35
Using application deployment systems .............................36
Deploying through the OS .....................................................36
Using terminal servers ...........................................................36
SaaS at your service ................................................................37
Managing Applications ......................................................................37
Monitoring Applications ....................................................................38
Securing Applications ........................................................................39
Making identity secure ...........................................................39
Controlling access ...................................................................39
Introducing role-based access control ...............................40
Ensuring effective encryption ...............................................40
Decommissioning Applications .......................................................40

Chapter 5: Applications of the Future. . . . . . . . . . . . . . . . . . . . . 43


Innovation, Innovation, Innovation .................................................43
HTML5 .......................................................................................45
Native mobile applications ....................................................46
Hybrid mobile applications ...................................................46
Cloud/SaaS ................................................................................46
Enterprise mobility management (EMM) ..........................47
DevOps ......................................................................................47
Modern and Legacy Living Side by Side ........................................48
Application virtualization ......................................................48
The Internet of Things (IoT) ..................................................49
2020 Vision (Star Gazing) ..................................................................49
Interactions and Considerations .....................................................51
Haptic Technology ..................................................................51
Gesture Recognition ...............................................................51
Augmented Reality (AR) ........................................................51
Context and Location Awareness ....................................................52
Geofencing ................................................................................53
Host Posture Checking ...........................................................53
Device Switching ......................................................................53


Chapter 6: Ten Take-away Points. . . . . . . . . . . . . . . . . . . . . . . . 55


Buying or Building ..............................................................................55
SaaS and The Cloud ............................................................................55
Managing with Confidence ................................................................56
Dealing with Development ................................................................56
The Evolution of the Desktop ...........................................................56
Managing the Mobile Environment .................................................56
Getting to Grips with EMM ................................................................57
Blending Past and Present ................................................................57
The Rise of the Machines ..................................................................57
Dusting Off Your Crystal Ball ............................................................57

Appendix: Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59


Introduction

Welcome to Applications For Dummies, your short and
sweet guide to understanding the ins and outs of
application delivery and management without having to sit in
a darkened room for a week.
Remember the days when employees all had PC terminals,
with a suite of software which was manually loaded by a guy
from IT? Well, they're pretty much over.
Now, employees and contractors alike are using all manner
of new devices, requiring a seemingly endless conveyor belt
of new applications. And these bring with them a similarly
endless stream of challenges: security, development
standards, obsolescence management, you name it.
Applications are here to stay, but we're here to help. The tips
and advice in this book aren't only about helping you to get
off on the right foot when it comes to supporting applications:
They'll also help you improve end-user satisfaction and
productivity. And, more importantly, they'll enable you to
simplify the day-to-day management of end users, enhance
security, and contain costs in the process.

About This Book


This book aims to provide you with information to tackle and
support application initiatives. It's full of useful information
to help you understand the current application environment,
and how we all got here. It also brims with tips to help
you plan and implement an application strategy based on
your requirements. This guide can help you consider your
technology options and ensure that you cover all of your
bases before you get started.


Foolish Assumptions
In writing this book, we made some assumptions about you:

You work in IT or within an IT organization.

You want to understand how to tackle application
management and delivery.

You're looking for information, tips and tricks about how
to get started.

How This Book is Organized


Applications For Dummies is divided into six small but
perfectly formed chapters:

Chapter 1: It's All About the Apps. This chapter sets the
scene for the whole book, giving a brief run-down of apps
and application delivery, and the best approaches to
getting this right.

Chapter 2: A Brief History of Digital Computing and
Computer Applications. The story of how we got here:
A run-through of the history of computing and apps.

Chapter 3: Applications: To Buy or to Build. This chapter
gets down to business and gives you the skinny on the
crucial choices you need to make between developing your
own software and buying it. It also gives you tips on
supporting in-house development.

Chapter 4: The Application Lifecycle: Deployment
through to Decommissioning. This chapter leads you
through all of the things you need to think about when
it comes to managing apps, from rolling them out to
giving them a dignified send-off.

Chapter 5: Applications of the Future. This chapter takes
you on a whirlwind tour of where we think apps will be in
the next few years. It's a crazy place, and you need to be
ready for it.

Chapter 6: Ten Take-away Points. Here we give you ten
key points to remember, even if you forget everything
else.


Icons Used in This Book


We highlight crucial text for you with the following icons:

The knotted string highlights important information to bear
in mind.

Home in on the target for tips to enable you to support the
new world of apps.

Watch out for these pitfalls.

Where to Go from Here


As with all For Dummies books, you can either read this
guide from cover to cover or flick straight to the section
that interests you. Whether you read it in small doses
using the section headings or all in one session, you'll find
plenty of information to get you on your way to supporting
applications.


Chapter 1

It's All About the Apps


In This Chapter
Grasping the importance of apps
Understanding what end users want in an app
Coming to terms with the permanent revolution in software
Laying out what you need to know to deliver great apps

There isn't an area of IT that is changing as fast as the
end-user landscape, whether in terms of the devices, the
way we access applications, the locations from which we
access applications or the volume of applications entering the
business.

Understanding Apps
Apps (or applications, to be precise) have been a nuisance
for IT since the dawn of computing, and are split between
client and server applications. This book focuses on the
end-user type of applications (applications installed or launched
on a user's desktop/mobile device): It covers how they have
evolved, where they are heading (we do a bit of star gazing)
and what you should be focusing on to ensure that your business
can move forward, both handling modern applications and
supporting the legacy applications that your organization
has grown up on.
Applications are some of the most expensive parts of any IT
organization, and they have a profound impact on the performance
of the business, either negatively or positively, depending
on the way in which they have been implemented. When I look
at my children's tablet computers, which are a rainbow of
applications, I only wish they would heed my advice to
subscribe only to the applications they need. They carry a



myriad of applications, and I have no idea how many of them
got on there. And it is this simplified way of installing applications that typifies what end users have come to expect in
recent years . . .

Giving the People What They Want
Enterprise consumers of applications want applications that
are simple to access, simple to install and intuitive to use. They
want the Facebook concept: Applications which users don't
need a manual to install but simply download, log onto and
enjoy. Sadly the same can't be said for applications that reside
within organizations. 'Why?', I hear you ask.
Well, I answer, one of the primary reasons for this is that IT
is showing its age. Many enterprises are working in an age of
social, mobile and cloud computing with applications that
were built in a previous generation, by a previous generation
of developers, using the most appropriate technology at that
time. Core business applications have frequently been built
from the ground up, leveraging an internal team of developers.
They are both costly to implement and manage, and importantly, are typically shackled to the technology available at the
time. This is opposed to the notion of commercial off-the-shelf
(COTS) software.
Approximately 60 per cent of the IT budget is spent on maintaining
existing applications, leaving little with which to innovate and
move forward.

Vive La (Continuous) Revolution
The IT industry has undergone significant transitions roughly
each decade, and you can boil those transitions down as
follows:

Mainframe computers: The 1960s and 1970s.

PCs and the first apps: The 1980s.

Client Server: The mid-1980s.



The Internet: From the mid-1990s.

Mobile computing: From the early 2000s.

Software as a Service (SaaS): From 2010 onwards.

Through all this, two factors have remained constant, however:

The volume of applications increases.

End users demand better experiences.


If you're interested in understanding more about the application
landscape (where they have come from) and the challenges you
may face in any application delivery/modernization project you
are embarking upon, then Chapter 2 is the place for you!
The complexity of applications, their architectures and their
criticality across the business, is what makes them so
challenging.
In addition to some of the challenges applications inherently
pose, several industry trends have had significant effects on
applications and their delivery across the enterprise. These
are:

The proliferation of end-user devices, which is estimated
to grow to 9 individual devices owned per end user by 2020.

The advent of HTML5 and modern web programming languages.

The rise of mobile computing.

The support of BYOD/Consumerization of IT.

The emergence of Cloud computing.

The drive for data collaboration.


At the heart of this landscape lies a set of people, processes
and technologies that need to pull together to deliver applications in an aggregated manner (that is, from multiple sources)
into a simple interface accessible across any device, but
governed by the business. We've heard this described as
'controlled freedom'.


Working Out What You Need


Application delivery and management is a critical feature of
any enterprise landscape, and as organizations start to consume
mobile, SaaS, and web apps along with traditional client-server
applications, identity management becomes as critical
as the delivery mechanism.
In thinking about your application strategy you need to consider at a basic level the high-level segments in the following
list, and guess what? This book covers them all:

The history and future of applications: Where we've
come from, and where we're going.

The presentation of those applications: How you're going
to deliver all this stuff.

Application protocols: How you're going to access what
you need.

Security management: How you're going to make the
applications safe and secure to use, and keep them in
line.

Your end users: Who gets applications, how they get
them and why.

Access: How you're going to access your applications
from various locations and adhere to governance.

Virtualization: Whether you're going to virtualize your
applications, how you're going to do it, and which apps
you're going to do it to.

Licensing: Each delivery method has an impact on licensing
options, and those options lead to cost. Work out the
right blend of delivery methods to support your cost
model.
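To make that licensing point concrete, here's a back-of-the-envelope sketch in Python. The delivery methods, fee figures and the `annual_cost` helper are all invented for illustration (not real vendor pricing), but they show why the cheapest blend depends on how many users you have.

```python
# Back-of-the-envelope licensing comparison. All fees below are
# invented for illustration only, not real vendor pricing.

def annual_cost(users, per_user_fee, fixed_fee=0.0):
    """Total yearly cost of one delivery method for a given user count."""
    return fixed_fee + users * per_user_fee

# Hypothetical delivery methods with different cost shapes:
methods = {
    "SaaS subscription": {"per_user_fee": 120.0},
    "Perpetual licence + terminal server": {"per_user_fee": 10.0,
                                            "fixed_fee": 25_000.0},
}

for users in (50, 500):
    # Pick whichever method is cheapest at this user count.
    cheapest = min(methods, key=lambda name: annual_cost(users, **methods[name]))
    print(f"{users} users -> {cheapest}")
```

Run it with different user counts and the cheapest method flips: the per-seat subscription wins for a small team, while the big fixed cost wins once it's spread across enough seats.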
That's a lot to consider, and I think you will agree (if you are
like me) that you didn't get into IT to patch servers or desktops.
You got into IT to make a digital difference and to simplify the
end-user experience. If so, I also hope you will agree that
there hasn't been a more exciting or complicated time to be in
the world of end-user technology.
We hope you enjoy your read.

Chapter 2

A Brief History of Digital
Computing and Computer
Applications
In This Chapter
Getting an overview of the history of apps
Understanding the developing requirements of business users
Getting to grips with the sheer scale of change in applications

The history of digital computing and computer applications
is a testament to human ingenuity and innovation. The
first digital computing devices were purpose-built. Limited in
power and capabilities, these computers were built to simplify
a critical task.
This chapter walks you through some of the key developments
in the history of computing.

The First Steps


In late 1943, British engineers at Bletchley Park built the
Colossus. Engineered to aid British military cryptographers in
deciphering encrypted messages during World War Two, the
Colossus is known as the worlds first programmable digital
computer, and was one of the key technological advances that
led to the end of World War Two. But since Colossus was
purpose-built for wartime cryptography, it had no use after
World War Two and was retired less than three years after it
was created.



Meanwhile, also in 1943, the United States began to construct
the Electronic Numerical Integrator And Computer (ENIAC).
Completed in 1945, ENIAC was physically huge: weighing over
30 tons, it was able to perform 5,000 addition or subtraction
calculations per second, a thousand times faster than any
other machine at the time. ENIAC was designed to calculate
artillery trajectories for the United States Army, and like the
Colossus, it far out-performed the calculating capacity of
human mathematicians.
These early computers were all hardware. Modifying one of
these systems to do something different required that hardware be re-engineered, so most of these early computers had
only brief lifespans, despite performing incredibly important
tasks during their time.

Moving On: The First Software


The next major advance in computing came in 1948, when
a team at the University of Manchester in England built the
Manchester Small-Scale Experimental Machine. Nicknamed
'Baby', this computer and its successor, the Manchester Mark
I, were the first practical application of stored-program
computing. Prior to stored-program computers, programming a computer meant changing the position of hundreds of
switches (if you were lucky). In most cases it meant physically
re-wiring or even re-designing the system. Stored-program
computers changed everything. By being capable of receiving a set of instructions known as a program, stored-program
computers could be modified to perform new tasks without
rebuilding the hardware. These early programs were typically
stored on paper tape or punch cards which could be fed into
the computer whenever the program needed to be run. This
enabled computers to more rapidly handle more useful and
diverse tasks, simply by feeding them different programs.
Because they enabled computers to perform new tasks without changing the hardware, computer programs became
known as software.
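The stored-program idea is easier to grasp with a toy example. The sketch below (a hypothetical three-instruction machine, written in Python purely for illustration) shows how the same unchanging 'hardware', here the `run` function, performs completely different tasks depending only on the program fed to it, much as paper tape or punch cards re-tasked a Mark I without any rewiring.

```python
# Toy stored-program machine: the "hardware" (the run function) never
# changes; only the program fed into it does. The three-instruction
# machine below is entirely hypothetical.

def run(program, value):
    """Execute a list of (opcode, operand) instructions on a starting value."""
    for opcode, operand in program:
        if opcode == "ADD":
            value += operand
        elif opcode == "SUB":
            value -= operand
        elif opcode == "MUL":
            value *= operand
        else:
            raise ValueError(f"unknown opcode: {opcode}")
    return value

# Two different "paper tapes" for the very same machine:
double_then_add = [("MUL", 2), ("ADD", 1)]
discount_then_triple = [("SUB", 5), ("MUL", 3)]

print(run(double_then_add, 10))       # 21
print(run(discount_then_triple, 10))  # 15
```

Swapping the 'tape' changes what the machine does; nothing about `run` itself had to be rebuilt. That, in miniature, is the breakthrough Baby demonstrated.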

Computers Mean Business


Although software enabled the Mark I and similar systems
of this era, like the EDVAC and EDSAC, to more easily perform


different tasks, these computers were still very limited in
their capabilities. Being equally enormous in size and cost to
operate, most were only accessible to governments and universities. But by the early 1950s, computers had found their
way outside government and academia. One early corporation to use them was Dutch Shell Labs in Amsterdam. Shell
used a version of the Manchester Mark I manufactured by the
UK firm Ferranti to support oil refining research. In 1952 CBS
Broadcasting pioneered another application of computing
technology when it used a UNIVAC I built by the Remington
Rand Corporation to successfully predict the outcome of the
1952 Presidential election. Using just a 1 per cent sample of the
voting public, the UNIVAC I correctly predicted a landslide win
for Dwight Eisenhower while political experts had predicted
a win for Adlai Stevenson. The UNIVAC I is known as the first
mass-produced computer; Remington Rand sold 46 of them despite a
price tag of more than a million US dollars at the time.
Another computer of the early 1950s that advanced the application of computing technology was the LEO, which was
designed for business. The first use of the LEO in 1951 was
to support the operation of UK bakeries. Each bakery would
phone in its orders for the day which would be entered into
LEO using paper tape or punch cards. LEO then used this data
to calculate required ingredients, develop baking and delivery schedules and even invoice customers. By the end of the
1950s, several organizations were using LEOs, including the
Ford Corporation UK for their payroll systems, and the UK
Meteorological Office which used a LEO for weather analysis.

The IBM 650


In 1954 the IBM Corporation brought the computer within
reach of more organizations with the release of the IBM 650.
Still weighing thousands of pounds but much smaller in size
than its predecessors, the 650 could be purchased for
$500,000, or leased from IBM for $3,500 per month. This
made it practical for more organizations to start using computers,
and the relative ease of programming software for the IBM
650 led to it being used to support a wide variety of applications.
When the last 650 was built in 1962, IBM had sold nearly
2,000 of them.
While computers like the LEO, UNIVAC I and IBM 650 made
computing technology more accessible, they were still out of



reach of all but the most deep-pocketed organizations. And
their complexity, large size, and exotic power and cooling
requirements limited their practical usage even within organizations fortunate enough to own one. Applications for these
computers were custom developed and tied to the system on
which they were programmed. Innovations we take for granted
today, like off-the-shelf software and software portability,
were still decades away. These computers were also far from
user friendly. Programs were input into these computers using
media such as punch cards or paper tape. Any problem, such
as a small tear in a tape or punch cards out of order, meant the
application could not be loaded.
But these historic computers and their lesser-known cousins
helped advance computing technology. As computers demonstrated they could perform critical tasks, more and more
uses were envisioned for them. Government and university
researchers along with pioneering corporations like IBM and
Remington Rand rapidly advanced computing technology from
the mid-fifties through the sixties.

Technology triumphant:
transistors and integrated circuits
The big innovations during this period began with the transistor. In 1955 the transistor started to replace the vacuum tubes
that had provided the computing power for earlier machines.
A fraction of the cost and size of vacuum tubes, requiring
much less power and giving off far less heat, transistors enabled building smaller and smaller computers at lower costs
without reducing performance. Transistors actually enabled
computer processing power to steadily increase through the
sixties as engineers could harness thousands of them to build
systems that would have been simply impractical to build with
enormous and expensive vacuum tubes.
Innovations in transistors led to the development of the
integrated circuit in 1959. Combining advances in the miniaturization of transistors with the advent of semi-conductors,
integrated circuits took thousands of tiny transistors and
connected them together without wires using a conductive
material. Because wires were no longer needed, smaller and
smaller transistors could be created. Early integrated circuits
used germanium, but silicon was soon found to be a superior


conductive material. And with silicon being an inexpensive
and abundantly available resource, integrated circuits continued
the rapid reduction in size and expense of computing technology,
while enabling design of ever more powerful machines.

Mind your language


Computer programming languages also evolved during this
period. Programming languages make it much easier to create
software applications that enable computers to perform useful
tasks. Before the advent of programming languages, computers
had to be programmed using obscure machine code that was
not easily understandable by anyone except those with intimate knowledge of the inner workings of each machine. In the
mid-1950s IBM created FORTRAN. One of the first high-level
programming languages, FORTRAN enabled computer programming using words. This was much easier to comprehend
than the obscure number and letter combinations of machine
code.
Another important programming language created during this
time was COBOL. Designed specifically to support the development of business applications, COBOL is an acronym for
COmmon Business Oriented Language. Remarkably, COBOL,
and applications developed in COBOL, are still in use today,
over fifty years after the programming language was created.
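To see what made high-level languages such a leap, here's a small illustrative sketch in Python. The mnemonics and numeric encodings below are invented, not any real instruction set: word-based instructions a person can read are translated into the raw numbers a machine actually executes, which is roughly the job that FORTRAN and COBOL compilers took over from human programmers.

```python
# A made-up "assembler": readable words in, raw machine numbers out.
# The mnemonics and opcode values are invented for illustration only.

OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03}

def assemble(source):
    """Translate word-based instructions into numeric (opcode, operand) pairs."""
    machine_code = []
    for line in source.strip().splitlines():
        mnemonic, operand = line.split()
        machine_code.append((OPCODES[mnemonic], int(operand)))
    return machine_code

program = """
LOAD 7
ADD 35
STORE 0
"""

print(assemble(program))  # [(1, 7), (2, 35), (3, 0)]
```

The left-hand words are what the programmer writes; the right-hand numbers are all the machine ever sees. Before high-level languages, people wrote the numbers directly.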

Innovation, innovation, innovation


By the end of the sixties, innovations like transistors, integrated circuits and high-level programming languages had led
to computers that were more powerful than ever, but at a fraction of the size and cost of those from a decade before. These
advances not only brought computing within the reach of
more organizations, they became practical for more and more
applications. While they had once taken up several rooms,
computers by the early 1970s had shrunk from massive to mini
to micro.
This was enabled by the development of the first microprocessor by the Intel Corporation in 1971. Known as the Intel 4004, the microprocessor shrank the computer's core processing components into a single general-purpose chip. It was the dawn of a new era of human productivity, and the seeds of today's vast computing landscape had been planted.

14

Applications For Dummies

The 70s: A DIY Decade


The 1970s were an incredibly innovative period in the history
of computing. Just a few years after development of the first
microprocessor, computing entered the realm of hobbyists.
The July 1974 issue of Radio-Electronics magazine announced the Mark-8 microcomputer, a build-it-yourself kit based on the 8008, the next generation of Intel's microprocessor. Readers
could send away for the schematics and required parts list and
for the first time build a basic computer themselves. Soon the
Mark-8 was followed by another historic kit: The Altair 8800.
Announced in the January 1975 issue of Popular Electronics
magazine, the Altair 8800 was based on the more powerful
Intel 8080 processor. Much less expensive than other kits, the
Altair 8800 fueled a growing community of computer hardware
hackers. With no monitor, mouse or keyboard, users toggled
switches on the front of the case to input programs and data.
Although far from the user-friendly computers we have today,
the Altair 8800 enabled anyone interested to get started in
computing. Famously, the Microsoft Corporation was born after the success of the version of the BASIC programming language written for the Altair 8800 by Microsoft founders Bill Gates and Paul Allen.

Finding the key


In April 1975, only a few months after the debut of the Altair,
the Olivetti Corporation previewed its P6060 computer. In just
three months, toggle switches for inputting programs had been
replaced by a keyboard. The P6060 also featured a thermal
printer, floppy disk drive and orange plasma display. It could
also be programmed using the BASIC programming language.
Computing was becoming personal.
IBM soon announced a similar computer, as did many other
pioneering computer companies like Commodore, Apple and
Atari. The success of the Apple I, released first in 1976 as a
build-it-yourself kit, launched Apple Computer. It led to
the development of one of the most successful early personal
computers: the Apple II. The Apple II featured a full-sized keyboard, green screen monitor and floppy disk drive.
Similar systems at the time included the Commodore PET
and the Atari 400 and 800. Radio Shack, headquarters for electronics hobbyists, also entered the market with the TRS-80. All of these machines were marketed not just for business,
but also for education and home use. By the start of the 1980s,
computers had escaped the large windowless rooms of governments and universities and were steadily being incorporated
into everyday life.
Although advances in microprocessors enabled the rise of personal computing, this phenomenon would not have been successful without equivalent innovations in software. The early
hobbyists built computers just for the sake of experimenting
with computing. Success was simply getting the system to run
at all. But soon, as more reliable and functional computing
hardware became available, software was required to get the
computer to do useful things.

Developing new languages and systems

Two early types of programs were essential for the advancement of software itself, as both served to make computing
easier and more accessible. The first was the software programming language. It may be counterintuitive to think of programming languages as applications themselves, but the first
computers only understood machine code, which bore no resemblance to a language understandable by humans. Computer programming languages, like FORTRAN and COBOL from the 1960s followed by BASIC, C and Pascal in the 1970s, were software applications that made it much easier to program computers using language more understandable by humans.
Another essential early software application was the operating system. Simply put, an operating system is software that
makes it easier to operate a computer. Operating systems were
developed to enable the use of early mainframe computers,
and by the 1970s mainframes like the IBM/370 were shipping
with four different operating system options, including a version of DOS (Disk Operating System). DOS eventually evolved
to become the standard operating system for PCs by the mid-1980s. Another important operating system that emerged in
the 1970s was UNIX. First created to make it easier to develop
applications for early mainframes, more and more features
were added to UNIX until it became an application used to
operate the mainframe itself. Since the 1970s UNIX has evolved to power millions of computing devices, from the data center to the wristwatch. It is also the ancestor of many of the operating systems we use today, including Linux, OS X and Android.

The 80s: Up Close and Personal


The innovations of operating systems, programming languages
and microcomputers created a renaissance for computing and
computer software in the 1980s. Mainframes had evolved into
more powerful systems and spawned a middle tier of computing power known as minicomputers, or midrange. Historic
midrange systems include Digital Equipment Corporation's PDP line, the Hewlett-Packard HP3000 and IBM's System/32. Together mainframes and midrange systems ran the majority of applications used by governments, universities and businesses. Terminals began to appear on more and more workers' desks. These very basic devices enabled users to access applications running on mainframe and midrange systems.
Soon word processing software accessed through terminals
began to replace typewriters. Accounting applications running
on mainframe and midrange systems helped more organizations manage their finances, and terminals began appearing in
airports for airline ticketing applications. But mainframe and
midrange computers had their drawbacks. They were expensive and only practical for providing multiple users access to
common applications and data.
Although the mainframe and midrange systems of the late
1970s and early 1980s had exponentially more power than
their predecessors, they still had limited resources. This
meant that organizations had to be selective about which
applications they chose to run and users were limited to running only these programs. Users were also out of luck if the
systems were down. Terminals could only provide access to
mainframe and midrange systems: they could not run applications themselves.

Devices and desires


Meanwhile, the early 1980s saw an explosion in the development of new personal computing devices. The success of
the Apple II and Commodore PET led to the Apple II Plus and Commodore 64. Texas Instruments entered the market with the TI 99/4A, as did Coleco with the Adam and Timex Sinclair
with the Z80 series. Even IBM, the mainframe market leader,
got into the game with the IBM Personal Computer in 1981. All
of these early systems had similar capabilities. They could use
televisions or monitors for displays. They supported printing,
floppy disks for loading programs and storing data, and even
early modems for communicating with other computers over
telephone lines. By 1984 there were more than 600,000 home
computers in the United States. Soon what began to separate
these machines was not so much the hardware, as the availability of software.

The rise of the gamer


These early personal computing devices provided a ravenous
market for new applications. Video games helped sell many
systems, following the success of early home video game consoles like the Atari 2600. Educational computing took off with
both Apple and Radio Shack marketing systems and software
specifically for schools. New applications soon enabled people
to use home computers to write letters, print greeting cards,
catalog their baseball card collection, and track their personal
finances.

Scaling down the mainframe


People also started to use these computers for business.
Frustrated with the limited capabilities and inflexibility of
mainframes and terminals, some office workers started bringing home computers into the office. Soon these home computers often became standard office equipment. These systems
also brought computing to many small businesses and organizations for the first time, as access to mainframe and midrange
systems had simply been unaffordable.

Programmers proliferate
All of these use cases accelerated the development of new
software applications. The availability of affordable computers
also enabled more and more people to learn how to program,
providing the developers required to meet the demand for new software. The 1980s saw the founding of Adobe, Electronic Arts, Aldus, Brøderbund and dozens of other new software companies. Ashton-Tate found early success with dBase, its database management system. Likewise, Lotus Development empowered accountants and financial planners with its Lotus 1-2-3 spreadsheet software, and the WordPerfect word processor freed humanity from the misery of typewriters.

Share and share alike: The compatibility problem

By the end of the 1980s it was becoming possible to do nearly
anything with a computer, but not all of the early systems survived the decade. As organizations started being able to do
more and more with computers they began to recognize the
need for compatibility. It was not possible, for example, to take a disk out of a Commodore 64 and read it on an IBM PC. So
sharing data between different systems was very difficult. The
burden of having different brands of computers running different software soon began to diminish their benefits.
When it rushed its first Personal Computer to market, IBM
unwittingly made two important decisions that solved this
problem, leading to the rise of the PC Compatible computer.
The first was the decision to use mostly off-the-shelf components that IBM did not invent. The second was to license DOS
from Microsoft for the PC operating system instead of developing its own. Not designing all of the system components was a
big departure for the mainframe giant. However, had it waited to do things the old IBM way, it is doubtful that IBM could have
entered the PC market in time. But while both of these decisions enabled IBM to get its PC to market quickly, they also
allowed other companies to manufacture compatible systems
known as clones. Because IBM did not own most of the components or the operating system, it could not prevent pioneering clone manufacturers like Compaq from building machines
themselves, using the same parts and also licensing DOS from
Microsoft. Since the parts and operating system were the
same, software written for the IBM PC also ran on these new
clone PCs and they were soon referred to as PC Compatible.
This meant that people could now buy either a PC made by
IBM, or a PC Compatible from a clone vendor and not have to
worry about problems sharing data or applications.


It's PC gone mad . . .


As more and more people started to prefer machines that
were PC Compatible, much of the innovation and development of new hardware and software moved to the PC. This
helped accelerate the demise of most competing systems like
those from Commodore, Atari and Radio Shack. Apple was
one of the few exceptions, having followed its Apple II series
with the wildly successful Macintosh line of computers. The
Macintosh was far ahead of its time with a graphical user interface, mouse, sound processor and other pioneering features.
However, its lack of PC compatibility confined its success to
desktop publishing, education, and the arts.

The 90s: PC Paradise


The early 1990s continued to see PC compatibles with
Microsoft DOS dominate the personal computer market. But by the mid-90s, IBM itself was no longer the leading manufacturer of IBM-compatible PCs. Dozens of other companies had
started selling PC compatibles including Hewlett-Packard,
Acer, Dell and Toshiba. Microsoft DOS had also been followed
by Microsoft Windows, a graphical user interface for the
PC. Taking a cue from Apple's success with the Macintosh,
Microsoft Windows enabled users to control PCs with a
mouse, clicking on pictures known as icons to run applications
rather than having to type arcane DOS commands. Software
developers were able to harness the power and ease-of-use of
the graphical user interface, creating richer PC applications.
And as long as users had a PC compatible running Microsoft
Windows they could all run the same applications, which
made it easier to collaborate and share data. Within many corporations the need for PC compatibility shifted to a need for
connectivity.

Communication and connectivity


Although PCs had invaded the office, they did not displace mainframe or midrange systems, which still ran most mission-critical applications. This caused many employees to
have both PCs and terminals crowding their desks. Data also
needed to be transferred between the mainframe and PCs, and between individual PCs themselves. These factors helped
advance the development of computer networking. Computer
networks enable multiple interconnected computers to communicate using a common language, known as a protocol. The
early 1990s saw corporations running wire in their buildings to
connect computers using Ethernet and Token Ring networks.
PCs were now able to communicate with each other, speaking
protocols like TCP/IP, which was to become the standard for
network communication. Users were now able to share data
between PCs and access each others devices such as printers.
This also enabled PCs to communicate with mainframe and
midrange systems, providing the connectivity required to support a new type of application known as Client-Server.
Before Client-Server, applications ran either on mainframe
and midrange systems, or on PCs. With PCs fast replacing
dumb terminals, there was now computing power in the client
devices used to connect to the mainframe, not just in the
mainframe and midrange systems themselves. This enabled
software developers to write applications that took advantage of the resources available on both the client and the mainframe/midrange systems, which became known as servers.
Applications running on PCs were now able to access data
stored in server-based databases. Simple terminal-based interfaces were replaced with rich graphical applications harnessing the power of the PC and offloading work from expensive
servers.
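The client-server split described above can be sketched in a few lines of Python using the standard socket library. Everything here is invented for illustration: the made-up "GET balance" request, the reply format, and the loopback address. The server side owns the data and answers requests; the client sends a request over the network and renders the reply locally.

```python
import socket
import threading

ready = threading.Event()
port_holder = []

def server():
    # The server side: owns the data and answers requests.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))           # port 0 = let the OS pick a free port
        srv.listen(1)
        port_holder.append(srv.getsockname()[1])
        ready.set()                          # tell the client we are listening
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024)        # e.g. b"GET balance"
            conn.sendall(b"balance=100")     # reply drawn from server-side data

threading.Thread(target=server, daemon=True).start()
ready.wait()

# The client side: sends a request, then renders the reply locally.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", port_holder[0]))
    cli.sendall(b"GET balance")
    reply = cli.recv(1024).decode()

print(reply)
```

The same division of labor — lightweight requests from the client, shared data held on the server — is what let 1990s developers offload display and interface work to the PC.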

The Wonder of the Web


Advances in computer networking, client-server computing
and operating systems with graphical user interfaces also
provided the parts required for one of the most revolutionary
technological advances in human history: The World Wide
Web. Since the 1960s researchers at governments and universities had been connecting their mainframe and midrange computers together over phone networks to enable communication and data sharing.
Throughout the 1970s and 80s, more and more universities and corporations connected to this growing global network, which by the late 80s had become commonly known as the Internet.
One of the first applications to harness this connectivity was
electronic mail. Other applications that followed were FTP, which enabled transferring files between computers; telnet, which was used to access other computers over the network;
and gopher, an application for searching and accessing data
and documents stored on other computers. But these early
Internet applications were text-based and cumbersome to use
due to their origin on mainframe and midrange systems.
In the early 1990s, a British scientist named Tim Berners-Lee
harnessed the power of the graphical user interface and a new
network communications protocol he dubbed the HyperText
Transfer Protocol (HTTP) to enable easy connectivity to
information and documents stored on other computers using
graphical links. This made it much easier to aggregate and
organize data stored on lots of different computers. These
links between information and computers became known
as webs and these webs were organized using pages of links
known as webpages, grouped together into websites. To
access these websites, users simply needed a new application
that spoke HTTP called a web browser running on an operating
system with a graphical user interface. As more and more computers all over the world began to host webpages using new
applications known as web servers, the entire system came to
be known as the World Wide Web. Any user with a computer
connected to the Internet using a client web browser could
now easily access information stored on web servers all over
the world.
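The whole arrangement is easy to demonstrate with modern tools. The Python sketch below starts a tiny web server on the local machine and then plays the browser's role, fetching a page from it over HTTP. The page content and addresses are invented for the example.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Page(BaseHTTPRequestHandler):
    # The "web server" half: answers HTTP GET requests with a webpage.
    def do_GET(self):
        body = b"<html><body><a href='/next'>a hypertext link</a></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Page)   # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" half: speak HTTP to the server and retrieve the page.
url = f"http://127.0.0.1:{server.server_address[1]}/"
with urllib.request.urlopen(url) as response:
    html = response.read().decode()

server.shutdown()
print(html)
```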
One of the first killer applications for the World Wide Web was
the search engine. Pioneered by companies like Yahoo, Lycos
and AltaVista in the mid-90s and followed by Google in 1998,
search engines actively indexed the Web, making it easy for
users to search and find web-based information. During the
late 1990s and early 2000s the number of active users on the
Internet exploded from just 16 million in 1995 to over 500 million by 2001. Software developers began using new programming languages designed for the web to create more advanced
applications. These new technologies, like Java, JavaScript and Flash, enabled developers to advance the web from basic information organization and retrieval to supporting sophisticated
online applications.
This new breed of client-server application leveraged the
global connectivity and ease-of-use of the web. Soon people
were doing their banking, ordering goods and services and
communicating with friends and family using these web applications. Traditional PC applications started to integrate with the Internet and World Wide Web too, expanding the capability of client applications by leveraging the power of the global network.

Greeting the New Millennium


Throughout the 2000s, computers became ever more powerful and communication networks got faster and faster. New
technological innovations like virtualization made it possible
to more efficiently harness the power of new computing hardware by enabling single servers to safely run multiple workloads that would have previously required several computers. Sophisticated monitoring and management systems also
helped IT departments keep up with the ever growing number
of computers and applications needed by their workforces.
While Microsoft Windows and IBM PC Compatibles have continued to be the dominant personal computing platform, a
resurgent Apple revived the Macintosh computer. But Apple's creation of the iPhone smartphone and the iPad tablet had a
much greater impact. Although many other companies had
sought market success with handheld devices, Apples combination of elegant hardware, a uniquely intuitive touch-driven
interface known as iOS and the innovation of Internet-based
music and application stores helped create a ravenous new
market for mobile computing and applications.
Google soon followed with smartphones and tablets running its Android OS. While Apple tightly controls development of the iPhone and iPad, Google made the Android OS open source. This has enabled dozens of companies, including Samsung, LG and Motorola, to manufacture Android-compatible smartphones and tablets, making Android the number one selling operating system.
In 2013 and 2014 Android outsold not only all other mobile
operating systems combined, but also all PC operating systems including Windows.

Getting mobile
Since their debut in the mid-2000s, billions of smartphones
and tablets have been sold globally, supported by millions of mobile applications. Although they have largely complemented rather than displaced the personal computer, mobile devices have transformed corporate computing.
Their touch screen interface, size and mobility have brought
computing and applications to places simply impractical for
personal computers. Revolutionizing inventory control, order
entry, logistics and educational computing, mobile devices
have also been incorporated into cars and mass transportation, integrated with GPS to support navigation and surveying, and even mounted on the walls of hotel rooms to provide
control over lighting and entertainment. Mobile devices and
applications have enabled humans to easily leverage computing while on the go, with access to the Internet and the world
at their fingertips. Computing and communications technology
are being integrated into seemingly everything, heralding the
Internet of Things. Soon just about anything, including the kitchen sink, will be able to communicate, calculate, educate
and entertain.

Embracing the cloud


Web applications have also evolved. Advances in web development, programming languages and communications protocols
have enabled increasingly complex applications to move
online. This has led to a new class of software hosted entirely
online and sold via a subscription model instead of being sold
for purchase outright.
Known as Software as a Service (SaaS), these web-based applications negate the need for customers to install and manage
the systems required to run these applications themselves.
Instead this function is performed as a service by the provider
of the SaaS application. This has enabled organizations to rapidly provide new applications to their users without needing
the IT infrastructure and staff traditionally required.
The ever increasing power of computing hardware, alongside
innovations like virtualization, has also enabled Infrastructure
as a Service (IaaS). With IaaS, some or all of an organization's
server, storage and network infrastructures are hosted online
as a service. This reduces, or in some cases eliminates, the
need for organizations to own and manage these complex
infrastructures themselves. It is also now possible to provide your staff with virtualized desktops hosted and managed for
you online. This is known as Desktops as a Service (DaaS).
Collectively the innovations of SaaS, IaaS and DaaS are known
as cloud computing.



Now anyone with just a credit card can get access to online
computing power and applications millions of times more
powerful than the early computers, which only governments
and the largest corporations on Earth had been able to afford.
Digital computing and computer applications have advanced
at a phenomenal rate in 60 years, completely transforming
human civilization. Just imagine what is yet to come!

Chapter 3

Managing the Application Lifecycle: Selection to Packaging

In This Chapter
Deciding on the applications your business needs
Getting to grips with the build or buy question
Supporting your local developer

At their core, applications focus on two things: doing something more efficiently, or entertaining us. In order
for them to exist, they need to do things better than could be
done without them. This chapter leads you through the main
criteria organizations use to select the right application for a
job, and covers the basics of packaging software.

Determining the App You Need


Applications are often written to increase efficiency. Whether the job you need is communicating, executing a task, or operating a machine, applications help you do it better.
Examples of applications that help you communicate better
include:

Word processors

Presentation applications

Email


Instant messaging

Social media
Applications which improve our working efficiency include:

Spreadsheets

Calculators

Manufacturing control software


Applications are also a big part of how we entertain ourselves,
whether they're video games, video streaming, audio services
or even the code running inside your new smart TV.
Enterprises and software companies have different motivations for building apps. A software company tries to build
something that solves a problem, provides efficiencies or
entertains well enough that someone will pay money for it.
If someone is already doing these things, the software company must do it better, faster or cheaper than the existing
options.
For an enterprise which isn't in the software business, the question is more about whether the problem solved or efficiency gained by using a piece of software is greater than the cost of that software. This is generally referred to as the return on investment. Most organizations will do an evaluation to ensure that the return (efficiencies gained or problems solved) is greater than the investment (the money spent acquiring or building and implementing the software).
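As a sketch, with entirely made-up figures, such an evaluation can be as simple as comparing the value of the hours an application saves against what it costs to acquire and implement:

```python
# Return-on-investment check with illustrative (made-up) figures.
investment = 40_000            # licences, implementation, training
hours_saved_per_year = 1_500
cost_per_hour = 30
years = 3

annual_return = hours_saved_per_year * cost_per_hour   # value of time saved
roi = (annual_return * years - investment) / investment

print(f"ROI over {years} years: {roi:.0%}")
```

If the ratio comes out negative, the efficiencies gained do not cover the money spent, and the application fails the test described above.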

Determining Whether to Build or Buy
If the first step is figuring out whether or not you need an
application, the next step is deciding whether to build the
app (that is, develop it yourself) or buy it.
Whether to build or buy is one of the most contentious decisions in information technology. The first step is truly understanding what you need. You must then marry those needs with what is available in the market and decide whether what's already out there fits with what you want.

The next sections discuss the benefits and downsides of each method of getting an application.

Buying off-the-shelf: Packaged software

When you buy software, you are obviously paying someone
else to develop it. As most of the companies you will acquire software from are for-profit companies, you are not only paying for their development costs, but for their sales, marketing and profit. However, because a piece of commercial
software is typically developed for many customers, the costs
and the profit can be spread amongst all those customers.
Additionally, when buying software, you know exactly how
much it will cost you to acquire it. Also, most enterprise software has a fixed maintenance cost, so you'll know the cost of
ongoing software upgrades and technical support in advance.
However, according to Gartner, acquisition costs account for
only about 30 per cent of the total five-year cost of ownership.
This data shows us that deployment and ongoing IT support costs can far exceed the cost of just buying the software. These components can be a significant
factor when trying to calculate the total cost of software
implementation.
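A quick back-of-the-envelope calculation shows why this matters. Assuming an illustrative purchase price and taking the roughly 30 per cent figure at face value:

```python
# If acquisition is ~30% of five-year total cost of ownership (TCO),
# the purchase price implies a much larger lifetime cost.
# The purchase price is illustrative.
purchase_price = 100_000
acquisition_share = 0.30                     # the ~30% figure

five_year_tco = purchase_price / acquisition_share
deploy_and_support = five_year_tco - purchase_price

print(f"Five-year TCO: {five_year_tco:,.0f}")
print(f"Deployment and support: {deploy_and_support:,.0f}")
```

In other words, for every pound spent on licences, expect roughly two more in deployment and ongoing support over the software's life.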
If we get into greater detail around deployment, we can see other concerns that we need to be aware of regarding off-the-shelf software. Commercial software is often developed
according to the design philosophy of the developing company, not that of the customer. Because of this, the software
may not fit an organizations operating environment. Perhaps a
company has standardized on Linux and its associated development and management stack, but the software that suits
their needs the best is a Windows application written with the
.NET stack. This difference in design philosophy could mean
that in order to accommodate this application, the company
needs to add additional infrastructure, tools, and even properly trained staff. This extra overhead naturally drives up the
cost and complexity of deploying that application.


Refining the job: Customized software

Another thing to consider when acquiring commercial software is that it is typically developed for a mass audience. That
means that it may not fit perfectly with what you are trying to
accomplish. Your organization may have to change its workflow or business practices to align with the application, or use
additional software to supplement the one that doesn't perfectly fit your needs.

Getting the right gear


An alternative is customization of the software. Many applications provide Software Development Kits (SDKs) or application programming interfaces (APIs) that allow organizations to
customize or build upon a particular piece of software. These
tools and interfaces provide a significant level of flexibility for
an organization, allowing them to modify or extend a piece of
commercial software. Rather like a tailor modifying an off-the-rack suit, this can allow a customer to get a more tailored fit without having to have the suit made from scratch.
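The pattern typically looks something like the sketch below: the vendor's application exposes a documented extension point, and the customer registers code of their own against it without touching the vendor's source. Every name here (the InvoiceApp class, its register_hook method, the tax rate) is hypothetical, not any real product's API.

```python
class InvoiceApp:
    """Stand-in for a commercial application with a plugin-style API."""
    def __init__(self):
        self._hooks = []

    def register_hook(self, fn):
        # The documented extension point offered to customers.
        self._hooks.append(fn)

    def post_invoice(self, invoice):
        # Customer-registered code runs inside the vendor's workflow.
        for fn in self._hooks:
            invoice = fn(invoice)
        return invoice

# Customer-written customization: add a regional tax field.
def add_regional_tax(invoice):
    invoice["tax"] = round(invoice["amount"] * 0.07, 2)
    return invoice

app = InvoiceApp()
app.register_hook(add_regional_tax)
result = app.post_invoice({"amount": 100.0})
print(result)
```

The appeal of this approach is that the vendor's core code is untouched, so upgrades and support are less likely to break — though, as the next section notes, that guarantee weakens as customizations pile up.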

Watching the costs


However, there can be a significant cost of customization. First
off, you need to hire or assign developers who can do these
customizations using the SDKs and APIs that the application
provider has chosen or built. These skill sets may differ from
the ones you have in-house. These additional requirements
may force you to hire developers with those skills, retrain your
existing staff or perhaps hire an external consultant.
These requirements mean that a customization project can be
expensive, with all the risks of any other custom development
project.

Keeping the scope clear


If your customizations become too extensive, a variety of
additional concerns arise:

Have you modified the software to a point where it no


longer fits the use for which it was originally designed?


Does your customization make it hard for the vendor


to provide you with effective technical support and
patching?

What happens when the core software is upgraded?

Will your customizations continue to work?

Will you have a significant testing load every time the


software updates?

Will you have to bring back your consultants or developers to update your customizations every time you refresh
the core software?
The answers to these questions may vary based on the customization or the organization, but it is critical to take these
things into consideration as you modify or customize software
to fit your specific needs.

You can go your own way: The fully-customized route

Now that we understand the benefits and downsides of packaged software, perhaps fully customized software is the way to go? Well, let's look at the advantages and disadvantages of custom software as well.

Getting to the bottom line: cost pros and cons


Let's start with cost. If you are using existing development talent available in-house, you may be able to save money. Of course, those people don't work for free, but if they are already on staff and their salaries are accounted for, perhaps that makes the development of software less expensive in some respects. Also, the internal development team doesn't
need to assign funds to marketing and sales, and they rarely
need to make a profit from their development efforts.
Let's talk about the cost downsides. Although an internal
development team has little sales or marketing overhead and
rarely needs to make a profit, it is seldom as efficient as a software company which develops and ships software for a living.
As we mentioned earlier, software development firms also
spread the cost of their development across many customers,
as opposed to custom development in which the organization
bears all the costs. There are still costs to implement custom software, even if the software has been created to fit the organization's environment.

Delivering what you need


In addition to the cost aspects, software developed in-house
from scratch should provide a perfect fit for the needs of
the organization. As a result, there should be little need for
additional customization. The application would typically be
developed based on the organizational design philosophy and
designed from the ground up to run on the organization's preferred infrastructure. To use our earlier analogy, there should
be no need for tailoring of this suit because everything from
the cloth to the style was designed from scratch for you.

Avoiding scope creep


There is also an ever-present danger of scope creep, in which a project's scope continues to grow, along with its costs and time to value. Because commercial software is typically developed for many customers, its scope is based on an aggregate of many customer requirements. Custom software is for a single user, and therefore it is easy for that user to broaden the scope until they lose focus. Also, unlike the acquisition of off-the-shelf software, the cost of developing custom software is rarely fixed or predictable. Organizations often run into unexpected costs during development. When this happens to a software company, it can spread those costs across many customers, perhaps taking a hit in its overall profit. However, an organization developing software for itself must absorb the risk and the cost all on its own. These unexpected costs can be significant, and organizations rarely budget for them.
Another thing to consider is the time associated with building custom software from scratch compared with buying software and deploying it. Even if the software requires customization, there can be a significant difference in time to value between developing software from scratch and buying something that has already been built. The development of custom software usually requires far more time than making a purchase.

Keeping it working: ongoing costs


Finally, we come to the issue of ongoing costs. As we stated
earlier, with commercial software we have a good picture of
our costs over time. Although there may be deployment costs

Chapter 3: Managing the Application Lifecycle: Selection to Packaging

31

associated with new versions of software, fixed maintenance means the costs of new versions and software support are typically very predictable. This is not the case with custom software, where the cost of support and adding functionality falls entirely on the organization. This unpredictability can be just as pronounced as in the initial application development phase, leading to ongoing risk and cost.
So let's net out the build-versus-buy discussion. Buying offers lower risk and often better predictability of costs, and because the risk and cost are spread amongst many customers, commercial software tends to be less expensive than custom software.
The trade-off is software that may not be a perfect fit, or may require expensive customization to work. Also, if your needs are too specialized, no software company may develop the functionality you want because it won't be able to monetize it.
Custom software offers a perfect fit every time. It should fit
the needs, infrastructure and philosophy of the organization
that develops it. However, that custom fit often comes with
increased costs, risks and a longer time to value.
These considerations are why it is imperative to understand your needs and your organizational requirements, so you can see how well commercial software can fill those needs and meet those requirements without the need for expensive customizations. Only you can make the right determination between build and buy, and only if you get all the facts.

It Takes Apps to Build Apps: Supporting Developers
It may seem strange to some, but in order to build, deploy,
and run applications, we actually use large suites of . . .
applications.
Chapter 2 provides a brief history of computing in which
we see that as we moved from computing era to computing
era, the application requirements changed and so too did
the demands placed on the applications used to build those
applications.



The concept of computer programming predates the digital
computer itself, and it has evolved rapidly, as the computer
has. Methods of programming computers have evolved
from moving gears, to manually turning on and off electrical switches, to using digital codes represented by 0s and
1s. Subsequently, we've seen the emergence of computer
languages that could be written so that humans could read
them before being translated or compiled into a machine- or
computer-readable format.
As the processing power of computers grew, and the requirements for applications became more complex and sophisticated, programming languages evolved. FORTRAN, designed for scientific computing, was joined by COBOL, better suited to business needs. From there we've seen a parade of programming languages, leading to today's popular languages such as C, C++ and Java.
Beyond the languages themselves, there are ever-evolving tools that help developers write their code. The first tools were punched cards, or Hollerith cards: stiff pieces of paper with holes punched in them to represent machine code and, later, digital code. The first digital tools for writing code were simple text editors and compilers. We quickly moved to graphical development interfaces that gave developers simpler tools for building their applications, often offering real-time feedback on code syntax. These gave way to the Integrated Development Environment (IDE), which brought together the code editor, the debugging and testing technology, the tools for graphical interface building, and the compiler in a single powerful tool.
For decades, applications were often written for and therefore
tied to particular operating systems. This lack of flexibility
was fine when a single operating system dominated the computer or the data center. Over time, as heterogeneity became
more the rule than the exception, customers became apprehensive about tying their code to a single platform. Enter Sun Microsystems and its Java programming language. Java was designed to "write once, run anywhere" with the concept of a Java Virtual Machine (JVM) that ran on top of an operating system. This capability theoretically allowed Java code to run inside a JVM independent of the underlying operating system.
This technology rapidly evolved into a set of enterprise-class
interfaces and tools that allowed powerful applications to be


run across many platforms. These applications ran inside application servers and were written for those servers rather than being tied to a particular operating system or platform. Other languages and services were inevitably encapsulated in application servers, and portable applications became a reality.
As application code became further and further abstracted from the operating system, savvy cloud architects realized that running the application server itself in the corporate data center was no longer a requirement. This realization led to the emergence of Platform as a Service (PaaS). These multi-tenant application servers in the cloud allowed people to write their code in their IDE and then push it to the cloud for execution. This technology allowed applications to run on cloud infrastructure without the burden of managing servers, operating systems or even application servers.


Chapter 4

The Application Lifecycle: Deployment to Decommissioning
In This Chapter
Learn the history of application deployment
Get the lowdown on managing and monitoring apps
Understand the crucial importance of securing your apps
Decommission old apps with confidence

Okay, so you've got your application, whether you've bought it off-the-shelf or built it from scratch. This chapter continues the discussion of the steps of the Application Lifecycle from deployment to decommissioning.

Deploying Applications
Before you can use an application, you have to deploy it.
Deployment is simply another term for installation. This section gives you what you need to know on deployment.

Installing apps manually


The oldest method for deploying applications to computing devices was to install them manually. Performed either by a technician or by the end users themselves, manual installation may be suitable for small organizations without many devices. But as the number of users and devices



increases, the inefficiency of this manual approach becomes obvious, as each additional device significantly increases the number of technicians and the amount of time required. This inefficiency is embodied in the tongue-in-cheek IT term SneakerNET, in which the IT technician's sneakers replace the network as the method of transferring applications and data between multiple devices.

Using application deployment systems
To address this inefficiency, PC application deployment systems first began to appear in the 1990s. These systems use
the network to automatically deploy applications on multiple
personal computers. This makes it possible for a relatively
small number of technicians to deploy software to a very large
number of devices. These systems became particularly useful
as IT infrastructures grew to encompass multiple geographic
locations. From a central point a single technician could now
provide software to users located all over the world. However,
for all of their benefits, these application deployment systems
are very complex, requiring significant IT resources to support
and maintain.

Deploying through the OS


Another method of deploying applications is to include them
with the operating system image used to build and update
devices. This negates the need to separately install the applications as they are deployed with the base operating system.
However, updating the previously installed applications
requires either the use of an alternate deployment method or
the re-deployment of an entire, updated system image when
only a small application change may be required. Maintaining
stable system images is also an intricate task, made more
difficult as the number of applications included in the image
increases.

Using terminal servers


Access to applications can also be provided via special server
computers capable of supporting multiple users accessing
the same application simultaneously. These systems are commonly referred to as terminal servers. The applications they


provide are known as Hosted Applications, as they run, or are hosted, on behalf of users on these special Terminal Servers. Other terms for this technology include Remote Applications and Published Applications. Although this approach negates the need to install applications on every device, they must still be deployed to the Terminal Servers. Because these servers support multiple users, a failed installation has the potential to negatively impact a much larger percentage of the user community than a failed deployment to a single-user device. Hosted Applications also require users to be connected to the network, and as such are not a viable solution for users who need to use their applications offline. Like application deployment systems, Terminal Servers are also complex to support.

SaaS at your service


SaaS applications may be a good alternative for organizations
that wish to avoid such complexity and reduce their reliance
on IT staff. SaaS, or Software as a Service, consists of applications hosted by a third party, negating the need for organizations to deploy and maintain the applications and supporting
systems themselves. Typically, SaaS applications are web-based and, like hosted applications, require users to be connected to the corporate network or the Internet. Unlike traditional applications, organizations commonly subscribe to SaaS applications instead of purchasing them, with any required deployment performed by the vendor on the vendor's infrastructure. Often, access to SaaS applications can be provided to users
within minutes of subscribing, making them an ideal option for
organizations that need to rapidly deploy new computing functionality to their users.

Managing Applications
Like all technology, applications need to be managed.
Application management covers all aspects of the application lifecycle from application selection and handling
requests for new applications, to deployment, application
inventory, license tracking, end-user training, support, patching and planning for application upgrades, and ultimately
decommissioning.



While much of application management entails more business process than technology, applications are available to help organizations manage their software investment. Typically, software deployment systems and PC Lifecycle Management (PCLM) tools provide application management capabilities, while some organizations develop their own systems to support their unique processes.
Providing a stable, functional and cost-effective computing
environment to support the business and user community
requires comprehensive application management.

Monitoring Applications
You need to monitor applications that are essential to the business. You can monitor applications for availability, health, performance, security and faults, notifying the IT department of
application issues before end users notice anything is wrong.
Many enterprise applications have monitoring capabilities
built in. These applications are able to log errors and automatically send alerts and performance data to support technicians.
If applications do not have comprehensive monitoring capabilities, you can still monitor them using special monitoring
systems. These systems are typically capable of monitoring
multiple applications, and some also keep an eye on the health
of the underlying computer and network infrastructures supporting the application.
However, these monitoring systems are often expensive
and complex, requiring a significant investment of time and
resources by IT. For organizations that do not have the capability to support such systems, Software as a Service (SaaS)
applications are a good solution. One of the value-added benefits of SaaS applications is that both the application and its
supporting infrastructure are monitored for the customer by
the SaaS vendor.
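The classification at the heart of such monitoring can be sketched in a few lines of Python. This is a minimal illustration only: the health-endpoint URL, the state names and the simple success/failure thresholds are assumptions for the example, not the API of any particular monitoring product.

```python
from typing import Optional
import urllib.error
import urllib.request


def classify(status: Optional[int]) -> str:
    """Map an HTTP status code (or None for no response) to a monitoring state."""
    if status is None:
        return "down"        # no response at all: host, app or network fault
    if 200 <= status < 300:
        return "healthy"
    return "degraded"        # the app responded, but not with success


def check_health(url: str, timeout: float = 5.0) -> str:
    """Probe an application's (hypothetical) health endpoint and classify the outcome."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return classify(response.status)
    except urllib.error.URLError:
        return classify(None)
```

A scheduler could call `check_health("https://example.com/health")` every minute and notify IT only when the state changes away from "healthy", so issues are flagged before end users notice anything is wrong.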


Securing Applications
Securing the computing infrastructure against misuse and
data theft is absolutely critical, and this includes applications.
Many applications provide access to sensitive and valuable
data. With identity theft and hacker attacks on the rise, you must properly secure all applications.

Making identity secure


Securing applications and data requires a multi-pronged
approach. Multiple defenses are required to provide comprehensive and effective security. One of the first lines of defense
is user authentication. Tightly controlling which users have
access to an application starts with being able to accurately
identify each user and to protect their identity from misuse.
User accounts and passwords are the most basic forms of user identity and access control. Since user passwords are frequently the weakest link in the security chain, more advanced mechanisms like two-factor authentication provide a higher level of security and user identification. These systems typically use a password combined with a unique hardware- or software-generated token which only the individual user can access. The combination of a password plus the unique token provides stronger validation of the user's identity.
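As one concrete illustration of how such tokens work, here is a brief sketch of the one-time-password schemes behind many two-factor devices and apps: HOTP (RFC 4226) and its time-based variant TOTP (RFC 6238), using only the Python standard library. The sample secret and the six-digit default are illustrative; this is not a deployable implementation.

```python
import hashlib
import hmac
import struct
import time


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over an 8-byte counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # low nibble picks the slice
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP keyed to the current time window (30 seconds by default)."""
    return hotp(secret, int(time.time()) // step, digits)
```

The server and the user's token generator share the secret; both compute `totp()` and the server accepts the login only if the codes match (in practice with a small allowance for clock drift).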

Controlling access
The next line of defense is access control. Some applications
have their own mechanism for limiting access to only authorized user accounts. Many applications rely on the devices
operating system and a centralized directory of user accounts
to govern access. This approach helps reduce the burden on
both IT departments and end users as there are fewer user
accounts for IT to track and passwords for users to remember.
Many modern application control systems feature a single
sign-on capability. Single sign-on systems aggregate access to
multiple applications with separate authentication systems
using a single user account. This simplifies access for end
users while providing IT a central point to control and monitor
application usage.
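The core idea of single sign-on, authenticate once and let every application verify proof of that authentication, can be sketched with a signed token, in the spirit of SAML assertions or JWTs but deliberately simplified. The shared key, payload shape and function names here are illustrative assumptions, not any product's API.

```python
import base64
import hashlib
import hmac
import json

# Key shared between the sign-on service and the applications (sample value only).
SHARED_KEY = b"demo-shared-key"


def issue_token(user: str) -> str:
    """Called once by the sign-on service after the user authenticates."""
    payload = base64.urlsafe_b64encode(json.dumps({"user": user}).encode()).decode()
    signature = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + signature


def verify_token(token: str):
    """Called by each application; returns the user name, or None if invalid."""
    payload, _, signature = token.rpartition(".")
    expected = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if payload and hmac.compare_digest(signature, expected):
        return json.loads(base64.urlsafe_b64decode(payload))["user"]
    return None
```

Each application trusts the token rather than running its own authentication, which is what gives IT a central point to control and monitor access.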


Introducing role-based access control
Once the user has been authenticated and permitted access,
some applications further limit what the user can do with the
program. Applications with role-based access control (RBAC)
are able to define different classes of users, each with different
levels of authority to use parts of the application. A banking
program for example, might have one role for tellers enabling
them to service customer accounts but preventing them from
creating new ones. A supervisor role could provide all the
capabilities of a teller, plus the ability to monitor the activities
of all tellers. A security officer role would enable monitoring
use of the system to help detect and prevent fraud, but not
allow security staff to manipulate customer accounts. And an
administrator role would provide access to the entire system,
including the ability to assign user accounts to specific roles.
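The banking example above can be sketched as a simple permission map. The role and permission names are taken from the example; the data structure itself is just one hypothetical way an application might implement RBAC.

```python
# Each role maps to the set of permissions it grants (illustrative names).
ROLES = {
    "teller": {"service_accounts"},
    "supervisor": {"service_accounts", "monitor_tellers"},
    "security_officer": {"monitor_usage"},
    "administrator": {"service_accounts", "monitor_tellers",
                      "monitor_usage", "create_accounts", "assign_roles"},
}


def is_allowed(role: str, permission: str) -> bool:
    """Authorize an action only if the user's role grants the permission."""
    return permission in ROLES.get(role, set())
```

The application then guards each operation with a check such as `is_allowed(current_role, "create_accounts")` before executing it.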

Ensuring effective encryption


In addition to authenticating valid users and controlling what
they can do with applications, it is important to secure the
application data. Encryption is one of the best ways to secure data, both at rest (stored within the application or on disk) and in transit across the network. Encryption protects data by scrambling it so that it is unreadable until decrypted for access by an authorized user.

Decommissioning Applications
When you no longer need an application, you should decommission it. Decommissioning an application means removing
it from service. Sometimes you need to do this because a new
application provides a better solution or includes the functionality provided by the one being decommissioned. The need
may also be because of changing business requirements that
make the old application obsolete.
Whatever the reason, when an application is decommissioned
you must remove it from any systems it has been installed
upon. Even if an application is no longer used, it can still cause
conflicts with other applications. Removing decommissioned


applications may also be required in order to comply with software license agreements that require an application to be fully
uninstalled when the terms of the agreement end.
Applications that are installed also need to be updated with
security patches, even if they are no longer needed. Leaving
them installed could make it possible for a hacker to use them
to steal data or disrupt the business.
Removing an application from a device is typically the reverse
of the process used to install it. Applications can be manually
uninstalled by technicians or the end user by using the applications uninstallation routine. Most application deployment
systems for computers, tablets and smartphones are also able
to uninstall applications, reversing the process they used to
install them. If the application was installed as part of a system
image, like those used to deploy virtual desktops and hosted
application servers, it can be removed by deploying a new
image.
When you have fully decommissioned an application, be sure
to update application inventories and system documentation
to help maintain a supportable infrastructure.
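As a small illustrative sketch (the inventory layout and function name are assumptions, not any product's schema), the bookkeeping described above, removing the app from the inventory while identifying where the uninstall still has to be pushed, might look like this:

```python
def decommission(inventory: dict, app: str) -> set:
    """Drop a decommissioned app from the inventory.

    Returns the set of devices that still need the uninstall pushed to
    them, for example by an application deployment system.
    """
    return set(inventory.pop(app, set()))
```

Keeping the inventory and the actual device state in step is what makes the infrastructure supportable after the application is gone.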


Chapter 5

Applications of the Future


In This Chapter
Understanding the scope of the permanent IT revolution
Merging innovation and legacy
Checking out the crystal ball: apps in 2020
Getting to grips with context and location awareness

Okay, so if you have been reading the book (and enjoying it) you will realize that there is more to applications than meets the eye. That's no bad thing, as it makes you think hard about how you can do things differently to make improvements to the business and make end users happy (happier, anyway). In this chapter we are going to perform a bit of star gazing, to put into context where we believe the future of applications is headed, what happens to devices and, most importantly, why we should care. We also talk about the elephant in the room, the Internet of Things (IoT), and how this phenomenon will impact both the consumer world and the business world.

Innovation, Innovation, Innovation
To look forward we need to briefly look back, and not too far: it's only in Chapter 2. IT has evolved at an alarming rate since its beginnings in the 1940s, but one thing has remained constant: the velocity at which complexity and choice have continued to impact organizations. The pace of innovation has been phenomenal in the last 10 years, to the point where it is easy to see why end users are disappointed with the IT they use at work, because in reality they are IT



administrators in their personal lives. And with the advent of the IoT they are looking to become mini network administrators too. It's clear that any move to a modern application delivery model needs to coexist with the legacy applications that the business still depends upon. However, that doesn't stop you planning for the future, or what many organizations are calling their 2020 Vision.
Applications in the future will become even further removed from the desktop itself. As we move forward we expect (as our customers are telling us) the enterprise desktop to decline significantly in importance, with the rise of Bring Your Own Device (BYOD) and Choose Your Own Device (CYOD) programs driving a different way of operating across the business. Expect all devices to be treated as untrusted, with management moving towards the newer model of mobile device management.
So what will the future hold for enterprise applications? Well, we don't think it will come as any surprise that you can expect to find organizations adopting the mobile cloud architecture. This concept will see applications built for cloud economies of scale and consumed by users on mobile devices. As it stands today, the consumer world of tablet applications leverages a mobile client/cloud architecture, and it is evident that this model's success, supported by agile software development methods, is attractive to enterprise businesses that are desperate to innovate and maintain their relevance in their marketplace.
As organizations continue to battle the flood of new consumer technology entering the business, whether that's in the realm of traditional devices (laptop, phone or tablet) or the more recent additions of wearable technology (smart watches), the challenge will be to achieve one of two ends:

Develop applications that are consumable across all operating systems.

Develop applications that are native to a preferred operating system.


The latter option is useful if you want to take advantage of
native device characteristics, but poses both technical and
commercial challenges when supporting BYO environments. In
this new world expect to hear about technologies/frameworks


such as HTML5, SaaS, EMM and DevOps. "Great," I hear you say, "more acronyms!" Let's explore each of these areas a little bit.
It is worth thinking about your options for application development at this juncture as you proceed down a route of device
independence with a cloud/mobile first mindset. These options
are broadly categorized as follows:

Native apps are specific to a mobile platform, using the development tools and language that the platform supports. Native apps look and perform the best, taking advantage of the physical capabilities of the device.

HTML5 apps use standard web technologies (typically HTML5, JavaScript and CSS). This is a write-once-run-anywhere approach that creates cross-platform mobile applications that work on multiple devices. Some limitations remain, specifically session management, secure offline storage, and access to native device functionality (camera, calendar, accelerometer, geolocation, and so on).

Hybrid apps make it possible to embed HTML5 apps inside a thin native container, combining the best (and worst) elements of native and HTML5 apps.

HTML5
HTML5 is seen by many organizations as a silver bullet to solve the challenge of cross-platform development, but in reality you can't beat developing applications that are native to the device and can leverage built-in device capabilities. As we have already discussed, though, that comes with technical and cost challenges depending on your hardware policy.
An HTML5 mobile app is basically a series of web pages that are designed to work on a small form factor. HTML5 apps are device-agnostic and can be opened with any modern mobile browser. HTML5 has become a very popular way of building mobile applications, and multiple UI frameworks are available for solving very complex problems, stopping you having to reinvent the wheel.


Native mobile applications


Native apps provide the best usability, functionality and overall mobile experience. Native apps are usually developed using
an integrated development environment (IDE). IDEs provide
tools for building, debugging, project management, and version control.
Native apps give you direct access to:

Touch features

Fast graphics

Hardware components

Fluid animation

Ease of adoption
Sounds great, right? But remember, you might have a plethora
of devices and operating systems to support across your
business.

Hybrid mobile applications


Hybrid development, as you might assume, combines the best
of both development environments. A hybrid app is ultimately
defined as a web application that is mainly built from HTML5
and JavaScript, and is then wrapped inside a native container,
providing access to native features on the device.

Cloud/SaaS
Love it or hate it, cloud is no longer a buzzword with limited meaning: It has become a de facto standard which organizations are adopting to deliver scalable end-user services. In the context of end-user applications we are specifically talking about SaaS applications.
Organizations are adopting SaaS versions of what have historically been in-house services (collaboration and messaging, for example) to reduce cost and support the organization's entrance into the cloud era. One of the challenges with SaaS arises when you need to standardize on an authentication mechanism and wish to provide a single password, to


drive down costs on the service desk and improve end-user


adoption.
To support this across tablet and traditional devices you need to think about a workspace service that aggregates SaaS, web, mobile and Windows applications into a context-aware (device-aware) service. This workspace becomes the main point of entry for the end user into the corporate world of applications and data, and handles secondary sign-on to other applications.

Enterprise mobility management (EMM)
Clearly, the enterprise world has become disenchanted with desktop services and the application challenges that come with them. A common approach is to treat traditional computing devices in the same way as untrusted mobile devices: not subjected to traditional management processes, such as application lifecycle management and group policy processing, which are notoriously painful to execute and manage.
One school of thought is to lift the technology and capabilities created to support mobile devices, and apply that simplicity and cost model to desktop computing. Welcome to the world of EMM. EMM is the collection of people, processes and technology focused on managing the increasing volume of end-user assets in a business context. EMM is your mobile estate's best friend. If you're adopting mobility in your business you need EMM to protect your end users and your business, and to simplify the management of that estate.

DevOps
As your organization continues to move to a more agile way of developing business applications that support the mobile cloud era, a fundamental shift in your development cycle needs to happen: hence the industry-wide interest in DevOps. DevOps is a software development methodology that highlights the importance of communication, collaboration, integration, automation and measurement. DevOps



acknowledges the interdependence of software development
and IT operations. It aims to help an organization rapidly produce software products and services and to improve operations performance.
DevOps encourages the development of communication skills, an understanding of the business landscape that the application is being developed for and, importantly, a focus on ensuring the application succeeds in its purpose.

Modern and Legacy Living Side by Side
Okay, so now you know about some of the technologies and development frameworks that the industry is adopting or considering. It's time to think about how the new and old will coexist. Twenty-five or more years of applications will not be wiped out anytime soon, but your desktop and devices will be changing rapidly.

Application virtualization
An industry-accepted way to support legacy applications alongside modern ones is Application Virtualization. Many organizations have a desire to move towards an Application as a Service model where they are no longer in the business of managing desktops. Their future vision is SaaS, mobile and browser, but they still have the annoyance of Windows client/server applications. This is where application virtualization becomes strategic in your migration plans.
Application virtualization is software technology that
abstracts and encapsulates the application from the underlying operating system on which it is executed. A fully virtualized application is not installed, although it is still executed as
if it were. The application behaves at runtime like it is directly
interfacing with the original operating system and all the
resources managed by it, but can be containerized to varying
degrees.
When you combine this model with your SaaS and mobile applications you need to aggregate them for the end user to simplify access. The role of the workspace (amongst others)


supports the coexistence of legacy and modern application architectures side by side, typically presented through a web browser. Your workspace should be context aware (see later) and will become the heart of your end users' experience.

The Internet of Things (IoT)


The concept of the Internet of Things isn't new. It has been described for 20 years as a world where things (devices and/or sensors) are connected and able to share data. As with many pervasive technologies, the IoT hasn't been adopted at random. Like many technologies that just make sense, it has been brought to life by a coming together of supporting technologies creating the perfect storm: internet connectivity, falling hardware costs, smarter software and, ultimately, machine-to-machine communications. These factors combined are driving the IoT marketplace.
As these things connect and begin sharing data, they bring
huge improvements in logistics, employee efficiency, energy
consumption and personal productivity. This is the promise of
the Internet of Things (IoT).
In 2013 there were 13 billion online devices. It is estimated that by 2020 there will be 50 billion devices, some of which haven't even been invented yet! Which brings us nicely to 2020: Well, what have you got planned?

2020 Vision (Star Gazing)


So in 2020 will your desktop be on the next version of Windows? Hey, will you even still have a desktop? What will your application strategy be and, importantly, what will your daily work-life pattern be? It's worth pausing and thinking about the interactions in 2020 that may or will change based upon technology.
Think about your daily routine and try to imagine how technology will impact your working life for the better. You get up in the morning with an alarm clock that is Internet-connected. It knows your work schedule and also the commute challenges that lie ahead for you, so it changes your alarm time



dynamically. Once up, you're into the bathroom, where the
shower door has the ability to become an interactive screen
that displays all the things you need to consider during the day
and any business issues that have occurred overnight, to bring
you up to speed.
As you move to the kitchen, you find a set of white goods that
are connected to each other and to a smart wall telling you
vital stats about your house (temperature, security, and energy
usage, for example). The same smart wall also has the ability to
carry on the interactive session you started in the shower.
In 2020 expect a greater emphasis on work-life balance, so be
prepared to deliver applications and data not only to any
location but also into holographic teleconferences. Technology that
has been used with great success in concerts, such as Musion's
holographic projection, will continue to find its way into consumers'
lives at a cheaper price. As it does, we gain the ability to collaborate
at a time that works for us rather than one dictated by traditional
9-5 working practices.
The car is the next place where you'll notice change. Expect
to see vehicles that are driverless and/or highly connected
through machine-to-machine communications. Your car's vital
stats will be monitored by garages, changing the model associated
with servicing. Your car will also have GPS for things such
as emergency services or breakdown services. Satnav will be
standard, but in more of a head-up display (HUD) model whereby
it sits in the driver's line of sight without impairing his or her
ability to drive. In driverless cars, expect the windscreen to
become yet another interactive display, with an element of
augmented reality blended into it to provide a more enhanced
driving experience. This screen will become a workspace in which
you are able to launch and view applications controlled by gestures.
All of this is before you actually get into the office, which will
be largely controlled by touch- and gesture-based computing
models. It is likely that collaboration across different time
zones will no longer be a challenge, with true interactive
whiteboard sessions using holographic representations of
attendees.
These are just a few examples of what the workspace of 2020
might hold, but as you can see it is highly connected and highly
collaborative, with application interactions being a lot more
intuitive than they are today.


Interactions and Considerations


The previous sections show how some of your physical
interactions may change in 2020, and we have also acknowledged
that the world of applications is moving to a more mobile,
cloud-driven model. What does that mean for possible future
interactions with your application estate, and what are the key
considerations you need to think about?
It is likely that your application interactions will change to
support some of the following more natural ways of operating
with technology: haptic technology, gesture recognition, and
augmented reality.

Haptic Technology
Haptic technology is tactile technology that recreates the
sense of touch by applying force, vibration or motion to the
user via the device they are using. The simulation can support
the creation and control of virtual objects, and further supports
the remote control of machines and devices. Haptic technology
will be integrated into touch devices such as tablets and
smartphones, opening up a world of innovative applications
that can capture how much force the user applies.

Gesture Recognition
Gesture computing has already entered the home through
the gaming world, and offers the ability for humans to
communicate with machines without any mechanical devices. The
concept, for example, allows an individual to point a finger at
a computer and control the interaction, making traditional
input devices such as the mouse, keyboard and touchscreen
redundant.

Augmented Reality (AR)


AR is a live view of the physical world as seen through your
eyes, augmented with context-aware, computer-generated
information such as sound, video and graphics.



With the help of advanced AR technology, information about
the user's surrounding real world becomes digitally manipulable.
Social interactions are a great example of situations in which
AR would be useful in the business world, supporting better
customer engagement.

Context and Location Awareness
Delivery of content and applications to drive business outcomes
is the sole purpose of the IT organization. IT needs to
ensure that the right data is provided with the right level of
security, and that the right application construct drives the
right experience for the user while, most importantly, protecting
both the user and the business. This type of scenario is where
we start to see both context- and location-aware technology
supporting the delivery of the application and content.
The delivery of a full desktop operating system to a small
form factor, touch-enabled device is never going to be a great
experience. Equally, presenting users with applications that
are not suitable to run natively on the end user's device is a
bad experience (context awareness). The application delivery
technology needs to be aware of the device that is connecting
to the environment in order for it to intelligently deliver the
right experience.
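To make the idea concrete, here is a minimal Python sketch of a context-aware delivery decision. The device categories, operating systems and delivery methods are invented for illustration only and don't reflect any particular product's policy engine:

```python
# Illustrative sketch only: a context-aware delivery decision.
# The device categories and delivery methods are invented for the
# example and do not reflect any real product's policy engine.

def choose_delivery(form_factor, os_name):
    """Pick an application delivery method suited to the device."""
    if form_factor == "desktop":
        # A full desktop OS experience is appropriate here.
        return "full virtual desktop (VDI)"
    if form_factor in ("tablet", "smartphone"):
        if os_name in ("iOS", "Android"):
            # Small touch devices get native or HTML5 apps via the
            # workspace; a remoted full desktop would be a poor fit.
            return "native/HTML5 app via workspace portal"
        return "published application (remoted, touch-optimized)"
    # Unknown device types fall back to the most conservative option.
    return "browser-based workspace only"

print(choose_delivery("desktop", "Windows"))    # full virtual desktop (VDI)
print(choose_delivery("smartphone", "iOS"))     # native/HTML5 app via workspace portal
```

In a real platform the same decision would be driven by device attributes reported at connection time (user agent, enrolled device profile), but the principle is the same: detect the context, then deliver the experience that suits it.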
For those organizations that operate under strict regulatory
requirements, users who work from remote or mobile locations
need to be treated with a different level of security
(location awareness). A device's location is usually determined
by one of three methods:

- GPS
- Mobile tower triangulation
- The device's MAC address


So how do you address these needs, and what should you be
looking for in an application delivery platform? The following
are some of the key issues you need to be thinking about.


Geofencing
Geofencing is a feature that uses GPS or RFID to define
geographical boundaries. A geofence is a virtual barrier. You
as a business can define what happens to the user's asset once
they enter or leave the area surrounded by that virtual barrier.
You may, for argument's sake, wish for all applications to
be run through a central data center, leveraging technologies
such as VDI, so you can control usage. Or you might force
a user to connect back to the office before consuming SaaS
applications. The advantage of this is that you can then apply
additional security mechanisms to the user's session.
Other examples of geofencing might be where a network
administrator sets up alerts so that when a corporate-owned
tablet leaves the business premises, the administrator
can disable the device. Or a marketer can geofence a shop and
send an e-voucher to a customer who has downloaded a particular
mobile app when the customer and device enter the store.
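At its simplest, a geofence check is just a distance calculation. The sketch below, in Python, tests whether a device's GPS position falls inside a circular fence; the coordinates, radius and policy actions are made up for the example:

```python
# Illustrative sketch: a circular geofence defined by a centre point
# and a radius. The coordinates and policy actions are invented here.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def inside_geofence(device_pos, fence_centre, radius_m):
    """True if the device is within radius_m of the fence centre."""
    return haversine_m(*device_pos, *fence_centre) <= radius_m

# Hypothetical office geofence: 500 m around a central London point.
office = (51.5074, -0.1278)
on_site = inside_geofence((51.5080, -0.1280), office, 500)
action = ("allow native SaaS access" if on_site
          else "force session back through the data center (VDI)")
print(on_site, "->", action)
```

A production EMM platform would evaluate rules like this server-side against device check-ins, but the geometry underneath is no more complicated than this.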

Host Posture Checking


So you know where your assets are, because you have them
tracked (EMM), and you have set up policies for devices that
enter and leave your virtual barriers. You now need to be
certain that the device doesn't have any security vulnerabilities.
This is known as posture assessment and refers to the act of
applying a set of rules to the device posture and establishing
the correct level of network access. Once you have determined
the posture of the asset, you can then decide how to handle
the connection request.
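The "set of rules" idea can be sketched in a few lines of Python. The rule names, thresholds and access levels below are invented for the example; real posture engines have far richer checks:

```python
# Illustrative sketch: apply posture rules to a device's reported
# state and map the result to a network access level. The rule
# names, thresholds and access levels are invented for the example.

POSTURE_RULES = {
    "disk_encrypted": lambda d: d.get("disk_encrypted", False),
    "os_patched": lambda d: d.get("days_since_patch", 999) <= 30,
    # If the device doesn't report, assume the worst (conservative).
    "not_jailbroken": lambda d: not d.get("jailbroken", True),
    "av_running": lambda d: d.get("antivirus_running", False),
}

def assess_posture(device):
    """Return (access level, list of failed rule names)."""
    failures = [name for name, rule in POSTURE_RULES.items()
                if not rule(device)]
    if not failures:
        return "full access", failures
    if failures == ["os_patched"]:
        # Only stale patches: allow a remediation network to update.
        return "quarantine: remediation network only", failures
    return "deny access", failures

device = {"disk_encrypted": True, "days_since_patch": 45,
          "jailbroken": False, "antivirus_running": True}
print(assess_posture(device))
```

Note the deliberate default in `not_jailbroken`: a device that doesn't report its state fails the check, which matches the principle of establishing posture before granting access.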

Device Switching
It is highly unlikely that users will consolidate the volume of
devices they have access to over the next five years. In fact,
it is estimated that consumers will access on average nine
devices a day. This poses real questions about how we switch
between each device and maintain productivity. Well, if you
have read the entire book so far (we hope you have), you will
have started to piece together both some scenarios and some
technologies that can help.



The key to switching between devices is standardization in
the presentation of the environment across all those devices.
Provided the look and feel is the same, switching becomes
easier. Couple that with a context- and location-aware
environment, and all of a sudden you have an enterprise workspace
that can port across your multiple devices, sense what can
run natively, and know what security levels you need and are
appropriate given your current location. Obviously, you need
applications to present to the end user, and we have discussed
so far some of the different types of applications you are likely
to encounter.
One of the biggest challenges has historically been creating
data and making sure that it is accessible across multiple
devices when you need it to be. For example, what if you create
data on your smartphone and need to review it later on your
watch, laptop, tablet, desktop, or whatever it may be? This is
where enterprise file sync and share (EFSS) applications come
into play, allowing you to create and sync instantly on any
device, and then pick up and review on another device. This is
similar to Dropbox, but runs securely based upon your
authentication protocols in your data center.
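The core sync idea can be boiled down to a toy sketch: devices push revisions to a central store, and any device pulls the latest. The in-memory store and last-writer-wins policy below are invented for illustration; real EFSS products add authentication, encryption and proper conflict handling:

```python
# Illustrative sketch of the EFSS idea: devices push file revisions
# to a central store, and any device pulls the latest revision.
# The in-memory store and last-writer-wins policy are invented here.
import time

class SyncStore:
    def __init__(self):
        self._files = {}  # filename -> (timestamp, content)

    def push(self, name, content, ts=None):
        """Accept a revision; keep it only if it is the newest."""
        ts = ts if ts is not None else time.time()
        current = self._files.get(name)
        if current is None or ts > current[0]:  # last writer wins
            self._files[name] = (ts, content)

    def pull(self, name):
        """Return the latest stored content for a file."""
        _, content = self._files[name]
        return content

store = SyncStore()
store.push("notes.txt", "draft written on smartphone", ts=1)
store.push("notes.txt", "edits made on tablet", ts=2)
print(store.pull("notes.txt"))  # edits made on tablet
```

Last-writer-wins is the simplest possible policy; it silently discards concurrent edits, which is exactly why real EFSS products layer versioning and conflict copies on top.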
Applications have always been a challenge and no doubt they
will continue to be in the future. If we can abstract away from
the operating system and move towards a mobile cloud world
underpinned by a strong DevOps culture, many of the challenges
we face will be removed and the end-user experience
will be improved. And isn't that what we strive to do as IT
organizations?

Chapter 6

Ten Take-away Points


In This Chapter
Making the critical choice between buy and build
Managing the whole application life cycle
Understanding the speed of change in IT delivery

This book has taken you on quite a journey. You got to go
from the history of applications, to why and how applications
get built, to how they get deployed, right through to
where they are going in the future. Here are a few key points
you might want to bear in mind.

Buying or Building
Applications are fundamentally built to help us do things
more efficiently. If the efficiency gained using an application
is greater than the cost of buying or building it, then you have
the beginning of a business case for that application.
The decision to buy or build an application is a difficult one
and will be different for every use case and every application.
There are diverse toolsets available for building, deploying
and running applications, and you should investigate which
ones make the most sense for you.

SaaS and The Cloud


SaaS applications may be a good alternative for organizations
that wish to avoid complexity and reduce their reliance on IT
staff.


Managing with Confidence


Application management covers all aspects of the application
lifecycle. This includes application selection and handling
requests for new applications, deployment, application inventory,
license tracking, end-user training, support, patching,
planning for upgrades, and ultimately decommissioning.
Applications that are essential to the business should be
monitored and secured. Multiple defenses are required to provide
comprehensive and effective security.

Dealing with Development


Application development standards need to match the speed
of consumer application development. Think about the native
vs. hybrid vs. HTML5 challenges and how they align to your
bring-your-own or choose-your-own (BYO/CYO) projects.
DevOps will become even more important to ensure effective
collaboration between development, operations and end
users.

The Evolution of the Desktop


Organizations will be moving away from dependence on desktop
management to a future centered on applications, data and
user profiles across all devices. They will be increasingly
focused on application and data access, performance and
user experience.

Managing the Mobile Environment
An increase in mobile devices means you need to ensure that
your delivery and management technologies are both context-
and location-aware to ensure the best experience, along with
protecting company intelligence.


Getting to Grips with EMM


Enterprise Mobility Management will continue to evolve and
become the de facto approach for all device management,
removing the reliance on Group Policy Objects and traditional
PC lifecycle management technologies. This will increase
management efficiency while improving the end user's experience.

Blending Past and Present


Organizations need to plan for the co-existence of legacy and
modern applications, and have a single pane of glass and set of
policies to present this to enterprise consumers.

The Rise of the Machines


The Internet of Things will become a core part of your end-user computing (EUC) journey, not just for end users but
also for machine-to-machine communications. As millions of
devices, sensors, and applications begin to talk to each other,
these communications will need to be managed and relevant.
The toolsets to do that are still evolving.

Dusting Off Your Crystal Ball


Start thinking about your 2020 vision now and try to see the
world through a day in the life of the end user. In the future,
there will be even more focus on work/life balance as the divisions between the two distinct worlds continue to blur.


Appendix

Resources

This is a collection of online resources you can use to
enhance your understanding of application management
and delivery.

Don't Let A Mountain Of Technical Debt Derail Mobile And Customer-Facing App Delivery (Forrester Research, October 28, 2014). By Phil Murphy with Christopher Mines, Kurt Bittner, Eric Wheeler.
<www.forrester.com/Dont+Let+A+Mountain+Of+Technical+Debt+Derail+Mobile+And+CustomerFacing+App+Delivery/fulltext/-/E-res103942>

Competitive Pressures Drive The Business Case For Modern Application Delivery (Forrester Research, October 15, 2014). By Kurt Bittner, Diego Lo Giudice with Christopher Mines, Phil Murphy, Amy DeMartine, Dominique Whittaker.
<www.forrester.com/Competitive+Pressures+Drive+The+Business+Case+For+Modern+Application+Delivery/fulltext/-/E-res115535>

Equipped To Thrive: Help Employees Turn Mobile Moments Into Customer Value (Forrester Research, February 4, 2015). By David K. Johnson, Martha Bennett, Dan Bieler, Eveline Oehrlich with Christopher Voce, Mark Lindwall, TJ Keitt, Andrew Hewitt.
<www.forrester.com/Equipped+To+Thrive+Help+Employees+Turn+Mobile+Moments+Into+Customer+Value/fulltext/-/E-RES120182>

Match Digital Workspace Delivery Systems To Your Organization's Workforce: Personas Help You Decide On Virtual Desktops, Virtual Apps, Or Native Apps (Forrester Research, August 21, 2014). By David K. Johnson with Christopher Voce, Michelle Mai, Michael Caputo.
<www.forrester.com/Match+Digital+Workspace+Delivery+Systems+To+Your+Organizations+Workforce/fulltext/-/E-RES117452>

Application-Delivery Options in VMware Horizon 6 (Technical White Paper).
<www.vmware.com/files/pdf/techpaper/vmware-horizon-view-workspace-application-delivery-options.pdf>

Measuring the Business Value of VMware Horizon View (IDC Research, December 2013). By Randy Perry, Brett Waldman. An IDC analysis of organizations adopting a centralized virtual desktop (CVD) computing environment (also known as virtual desktop infrastructure [VDI]) with the use of VMware Horizon.
<www.vmware.com/files/pdf/view/IDC-Quantifying-Business-Value-VMware-View-WP.pdf>

The Future of Enterprise Applications Is Mobility (Gartner Research, 9 July 2014). By Michael Maoz and Robert P. Desisto.
<www.gartner.com/doc/2793917>

SOA and Application Architecture Key Initiative Overview (Gartner Research, 16 July 2014). By Ross Altman and Kirk Knoernschild.
<www.gartner.com/doc/2799817>

Application Rationalization Key Initiative Overview (Gartner Research, 25 July 2013). By Bill Swanton.
<www.gartner.com/doc/2551315>

U.S. School System Utilizes VMware Horizon View to Drive Value (IDC Research; Buyer Case Study). By Ian Song.
<www.idc.com/getdoc.jsp?containerId=AP246224>

It's all about Apps!

Applications are designed to make us more
productive. And over the years digital computing
and applications have evolved at a phenomenal rate.
But when it comes to supporting and delivering
these apps, when does it make sense to build vs.
buy? How do you decide which apps make the most
sense for your business? And more importantly, how
do you ensure you don't get overwhelmed with
everything that goes into delivering, managing,
securing and monitoring apps on an ongoing basis?

Well, this book will help you with all of this. It spells
out everything you need to know to put a successful
application strategy in place and maintain your
sanity in the process!
Map out your plan: find out what to consider when devising your application strategy

Discover the options: understand the pros and cons of common tools used to deliver, manage and monitor apps

Get going: learn common tips and best practices to help you deploy apps today and tomorrow

Open the book and find:

The history of apps: how they have evolved and why you should care

An overview of different approaches to building, delivering, securing and managing apps

Tips and trade-offs for getting started

How to plan for a world where legacy and modern apps coexist

Charles Barrett has been working in IT for over 18 years as a Consultant, Architect,
Board Director and as a Business Solution Architect at VMware. Charles was also
co-author of the VMware BYOD For Dummies guide released in 2014.

Mark Ewert has been working with IT for over 25 years and has designed hundreds
of successful IT solutions. Currently Mark is the Lead Technologist on VMware's EUC
Competitive Marketing Team.

Ben Goodman is the Lead Evangelist for VMware's End-User Computing products.
He is responsible for helping craft and articulate VMware's vision and strategy for
end-user computing. Prior to his time at VMware he was the Principal Identity Strategist at
Novell, where he helped build and grow their Identity Management portfolio.

ISBN: 978-1-119-09005-2
Not for resale
