Date: Nov 20, 2000
IT Management Level: Establishing Performance Benchmarks
Baselining an organization's performance level has become a standard industry practice, particularly in companies whose IT organizations are required to track and improve their delivery of products and services relative to improved time to market, cost reduction, and customer satisfaction. Creation of an IT performance baseline (often referred to as benchmarking) gives an organization the information it needs to properly direct its improvement initiatives and mark progress.
Performance levels are commonly discussed in terms of delivery: for example, productivity, quality, cost, and effort. In each of these categories, function points are used as the denominator in the metrics equation. By using function points as the base measure, the organization benefits in two ways. First, because function points are applied in a consistent and logical (not physical) fashion, they are considered a normalizing metric, thus allowing for comparisons across technologies, across business divisions, and across organizations, all on a level playing field. Second, there is an extraordinary amount of industry baseline data, which can be used to compare performance levels among various technologies and industries and to compare internal baseline levels of performance to best-practice performance levels.
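To make the arithmetic concrete, here is a minimal sketch in Python of the two metric forms used throughout this article, rate of delivery and productivity; the effort and size figures are invented for illustration.

```python
# Hypothetical illustration: function points as the normalizing denominator.
# Neither the project figures nor the comparison come from the article.

def rate_of_delivery(effort_hours: float, function_points: float) -> float:
    """Delivery rate in hours per function point (lower is better)."""
    return effort_hours / function_points

def productivity(function_points: float, person_months: float) -> float:
    """Productivity in function points per person-month (higher is better)."""
    return function_points / person_months

# Two projects on different technologies can be compared on a level playing
# field because function points measure logical, not physical, size.
mainframe_rate = rate_of_delivery(effort_hours=4500, function_points=400)      # 11.25 h/FP
client_server_rate = rate_of_delivery(effort_hours=2800, function_points=400)  # 7.0 h/FP
print(f"Mainframe: {mainframe_rate:.2f} h/FP, Client-server: {client_server_rate:.2f} h/FP")
```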
Table 1
ISBSG Industry Data

Function Point Size | Mainframe | Client-Server | Packaged Software | Object-Oriented

Note: All values are expressed in hours per function point as a rate of delivery.
Table 2
Rates of Delivery by Business Area

Business Area        Rate of Delivery*
Accounting                11.3
Manufacturing              6.3
Banking                   12.0
Telecommunications         2.0
Insurance                 27.4
Engineering                8.1

*Expressed in hours per function point.
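As an illustration of how such industry data can be used, the following sketch positions a hypothetical internal rate against the Table 2 figures; only the industry rates come from the table, and the internal rate is invented.

```python
# Industry rates of delivery from Table 2 (hours per function point).
INDUSTRY_RATE_HPFP = {
    "Accounting": 11.3,
    "Manufacturing": 6.3,
    "Banking": 12.0,
    "Telecommunications": 2.0,
    "Insurance": 27.4,
    "Engineering": 8.1,
}

def compare_to_industry(business_area: str, internal_rate: float) -> str:
    """Report whether an internal rate (h/FP) is faster or slower than industry."""
    industry = INDUSTRY_RATE_HPFP[business_area]
    delta_pct = (internal_rate - industry) / industry * 100
    direction = "slower" if delta_pct > 0 else "faster"  # more hours per FP = slower
    return (f"{business_area}: internal {internal_rate:.1f} h/FP vs "
            f"industry {industry:.1f} h/FP ({abs(delta_pct):.0f}% {direction})")

print(compare_to_industry("Banking", internal_rate=14.5))  # hypothetical internal rate
```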
The projects selected for a baseline study should satisfy criteria such as the following:
- The project was completed or was undergoing development during the previous 18 months.
- The labor effort to complete the project amounted to more than six staff-months.
- The project represents similar types of projects planned for future development.
- The primary technical platforms are represented.
- The project selection includes a mix of technologies and languages.
Figure 1
Rate of delivery during baselining
Figures 2 and 3 sort the baseline projects by type of development: Figure 2 shows all enhancement projects; Figure 3 shows all new development projects. Note the difference among the various views for a baseline project of 400 function points. The advantage of looking at this baseline data from different viewpoints is that it gives a better understanding of the impact of different development types on performance levels. It would not be reasonable to expect future enhancement projects to perform at the same rate of delivery as new development projects.
Figure 2
Rate of delivery for enhancement projects
Figure 3
Rate of delivery for new development projects
For example, we can observe from figures 1 through 3 that our capacity to deliver is influenced by size and type of development. If we hold these two variables constant while analyzing our baseline data, we can observe that variations in performance remain. Figure 4 shows data from several projects that are closely related in size. Four data points fall in the range of 400 to 550 function points, yet their corresponding rates of delivery range from 8 to 18 function points per person-month. That is a significant difference in performance. The challenge now becomes one of determining the contributing factors that caused these projects to perform at different levels.
Figure 4
Rate of delivery by functional size
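Continuing the Figure 4 example, a short sketch (with invented data points consistent with the ranges quoted above) shows the kind of size-band filtering this analysis involves.

```python
# Hypothetical baseline records echoing the Figure 4 discussion: projects of
# similar size can still vary widely in delivery rate (FP per person-month).
baseline = [
    {"project": "A", "size_fp": 410, "rate_fp_pm": 8.0},
    {"project": "B", "size_fp": 455, "rate_fp_pm": 12.5},
    {"project": "C", "size_fp": 500, "rate_fp_pm": 15.0},
    {"project": "D", "size_fp": 540, "rate_fp_pm": 18.0},
]

# Hold size roughly constant by selecting one size band, then look at the spread.
band = [p for p in baseline if 400 <= p["size_fp"] <= 550]
rates = [p["rate_fp_pm"] for p in band]
print(f"{len(band)} projects in the 400-550 FP band; "
      f"rates span {min(rates):.0f}-{max(rates):.0f} FP/person-month")
```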
We have completed many of these process performance assessments using our own proprietary assessment method. Collection consists of selected interviews and team survey sessions with each of the project teams. Data is collected on key influence factors, such as project management capabilities, requirements and design effectiveness, build and test practices, skill levels, and other contributing environmental factors. Individual projects are analyzed, and project profiles are created. This data is also analyzed in aggregate and may be used as the basis for determining overall organization process improvements.
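Since the assessment method itself is proprietary, the sketch below shows just one plausible way to structure such a project profile; the factor names mirror those listed above, but the scoring scale is an assumption.

```python
# Purely illustrative: one way to record an assessment "project profile".
# The 1 (weak) to 5 (strong) scoring scale is an assumption, not part of
# the authors' proprietary method.
from dataclasses import dataclass, field

@dataclass
class ProjectProfile:
    name: str
    size_fp: float
    rate_fp_pm: float                       # delivered FP per person-month
    factors: dict = field(default_factory=dict)  # influence factor scores

profile = ProjectProfile(
    name="Order Entry Rewrite",             # hypothetical project
    size_fp=480,
    rate_fp_pm=11.0,
    factors={
        "project_management": 4,
        "requirements_and_design": 2,
        "build_and_test_practices": 3,
        "skill_levels": 4,
    },
)

# Aggregating many profiles lets weak factors surface as improvement targets.
weakest = min(profile.factors, key=profile.factors.get)
print(f"Weakest factor for {profile.name}: {weakest}")
```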
The results are somewhat predictable. Typical influence factors include skill levels, effective use of front-end life cycle quality practices, tool utilization, and project management. These findings parallel, in part, those of the ISBSG database analysis, which revealed that language complexity, development platform, methodology, and application type were significant factors influencing productivity.
Organization Level: Establishing Service-Level Measures
Service-level measures are most commonly associated with outsourcing arrangements. They are established as a means to measure the performance of an external provider of software services. In addition, as organizations become increasingly sensitive to the needs of their customers, and as IT goals and objectives become more aligned with business performance and customer satisfaction levels, internal service-level measures are becoming more popular.
Organizations typically pursue outsourcing for reasons such as the following:
- To allow companies to focus on "core competencies"
- To convert relatively fixed allocated costs to direct variable costs
- To take the IT function to the next level of capability
- To improve time to market
- To change the culture and re-skill the IT function
- To convert legacy system resources to new development resources
An effective set of measures is mandatory to monitor performance trends and improvements. These measures should link to and provide information on performance as it relates to stated organization goals and objectives.
Project and Application Outsourcing
Outsourcing of a project or application involves contracting a third-party vendor to perform the work, either on- or off-site. The contract is complete when the project or application software is delivered. Three questions shape the measurement of such an arrangement:
1. What is the outsource provider's responsibility?
2. What standards or development practices are required?
3. What defines the "goodness" of the deliverable?
Answers to these questions will guide us in outlining and defining which service levels are most appropriate.
First, defining the areas of responsibility provides an opportunity to define hand-off or touch points where deliverables are passed between the provider and the customer. The most obvious hand-off point is the final deliverable, but other hand-off points may include specifications, design, test plans, test cases, code, and so on. Each hand-off point is an opportunity for measuring the level of service: what will be delivered to the provider, and what will the provider in turn deliver?
Second, knowing what standards or development practices are required typically leads to the establishment of one or more compliance-related service-level measures. The opportunity to use function points is limited in this aspect of the outsourcing arrangement. However, there may be a desire to monitor productivity, measured in function points per unit of time or cost. The proper use of development tools and techniques during the development process will influence productivity, of course, and the function point-related metric can be used to measure the effectiveness of the selected tools and techniques.
Finally, the "goodness" of the deliverable is the most basic of the service-level measures. Here function points would usually be the denominator in a variety of metrics equations measuring such things as rate of delivery, duration, cost, and quality.
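The sketch below illustrates two such metrics equations, with function points as the denominator; all input values are invented for the example.

```python
# Hypothetical deliverable-quality metrics with function points as the
# denominator; the input values are illustrative, not from the article.

def defect_density(defects: int, function_points: float) -> float:
    """Defects per function point; a common 'goodness' measure."""
    return defects / function_points

def cost_per_fp(total_cost: float, function_points: float) -> float:
    """Unit cost of delivered functionality."""
    return total_cost / function_points

fp = 350.0  # functional size of a hypothetical deliverable
print(f"Defect density: {defect_density(14, fp):.3f} defects/FP")
print(f"Unit cost: ${cost_per_fp(420_000, fp):,.0f} per FP")
```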
Maintenance Outsourcing
The outsourcing of selected applications to be maintained by a third-party vendor requires a much different set of measures from those used in the project or application outsourcing arrangement. The key elements involved when one is considering the measurement of a maintenance outsourcing arrangement include the following (a brief sketch in code follows the list):
- Monitoring customer expectations
- Maintaining an acceptable response time
- Limiting bad fixes
- Limiting the volume of fixes
- Establishing effective hand-offs
- Ensuring application expertise
- Monitoring smooth customer interfacing
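The following sketch illustrates how two of these elements, response time and bad fixes, might be tracked; the field names and target values are assumptions made for the example.

```python
# Illustrative-only calculations for two of the maintenance measures listed
# above; data, field names, and targets are invented.
fixes = [
    {"id": 101, "response_hours": 6,  "bad_fix": False},
    {"id": 102, "response_hours": 30, "bad_fix": True},
    {"id": 103, "response_hours": 12, "bad_fix": False},
    {"id": 104, "response_hours": 8,  "bad_fix": False},
]

bad_fix_ratio = sum(f["bad_fix"] for f in fixes) / len(fixes)
avg_response = sum(f["response_hours"] for f in fixes) / len(fixes)
print(f"Bad-fix ratio: {bad_fix_ratio:.0%} (target, e.g., < 10%)")
print(f"Average response time: {avg_response:.1f} h (target, e.g., < 24 h)")
```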
AD/M Outsourcing
The final outsourcing scenario that we will review is the outsourcing of an entire application development and maintenance department. Such outsourcing usually extends over multiple years and may be linked to a much larger outsourcing initiative, which could include the outsourcing of the entire IT organization. Once again, we observe a much different measurement dynamic from what we encountered with the first two scenarios.
In this scenario, the customer continues to monitor the management and performance of individual projects and the maintenance of the application portfolio. However, usually a much higher or broader performance view is taken when the service-level measures for an entire IT department are being considered.
The issues driving AD/M outsourcing engagements include the following:
- Assessing the impact of new technology
- Increasing profitability
- Reducing costs
- Improving customer service
- Improving time to market
- Increasing financial control
Service-level measures that are formulated to support an AD/M outsourcing arrangement are usually multilayered. The contract may require a primary set of service-level measures that gauge overall performance on an organization-wide basis. In addition, the contract may require a set of service-level measures to monitor specific project deliverables.
This scenario is similar to an agreement that you might have with a contractor building your house, in which there are specific measures relative to the details of the house, such as room dimensions, electrical capacities, and the like. The overall performance, however, is more likely to be monitored as time to market, final cost, and overall quality of workmanship.
A well-defined process should be followed to establish the proper service-level measures for an AD/M arrangement. This process requires an understanding of the goals and objectives of the outsourcing arrangement, identification of the measures that will monitor the performance or the achievement of those goals, and determination of the proper level of service to expect initially and on a continuing basis.
Through analysis of the business drivers and the performance considerations of the development environment, a set of applicable metrics is derived (see Figure 5). For each service-level metric identified, a metric profile is created. Each profile includes a definition of the metric, its stated purpose, the data elements required, and the formula used to calculate it.
Figure 5
Metrics derived by analysis of business drivers and performance considerations
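One plausible rendering of such a metric profile in code, with an assumed time-to-market metric as the example, might look like this:

```python
# A sketch of the metric profile described above; the concrete metric shown
# is an assumed example, not one prescribed by the article.
from dataclasses import dataclass
from typing import Callable

@dataclass
class MetricProfile:
    definition: str                   # what the metric means
    purpose: str                      # why it is being tracked
    data_elements: list[str]          # the raw data required
    formula: Callable[..., float]     # how to calculate it

time_to_market = MetricProfile(
    definition="Elapsed months from approved requirements to delivery",
    purpose="Monitor responsiveness of the AD/M provider",
    data_elements=["start_month", "end_month"],
    formula=lambda start, end: end - start,
)
print(time_to_market.formula(0.0, 7.5), "months")
```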
The next step in the process is to establish the actual values or levels of performance to be assigned to each service-level metric. We must determine reasonable values for the established service levels. If we have established time to market as a service level, we need to determine the proper interval of time to be expected. If cost is going to be measured, what is a reasonable cost? When quality is integrated into the set of service measures, what are the acceptable defect density levels? These values are best determined either by establishing organizational performance benchmarks or by relying on industry data.
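A minimal sketch of that final comparison step, assuming a few invented targets and measurements, might look like this:

```python
# Hypothetical service-level check: compare measured values against targets
# drawn from an internal baseline or industry data. All numbers are invented.
service_levels = {
    "time_to_market_months": {"target": 9.0,  "measured": 7.5},
    "cost_per_fp_dollars":   {"target": 1200, "measured": 1350},
    "defects_per_fp":        {"target": 0.05, "measured": 0.04},
}

# For all three example metrics, lower measured values are better.
for name, sl in service_levels.items():
    met = sl["measured"] <= sl["target"]
    print(f"{name}: measured {sl['measured']} vs target {sl['target']} -> "
          f"{'MET' if met else 'MISSED'}")
```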
Summary
The use of function points as part of a comprehensive baseline performance measurement initiative demonstrates the versatility and many uses of function points. Specifically, function points permit an organization to compare internal performance levels in a more consistent fashion. In addition, the use of function points allows for a wide range of comparison opportunities outside the organization. An ever-increasing amount of industry performance data uses function points as the base measure. An organization that can express its productivity in terms of function points can compare those performance values with industry-related benchmarks and determine the relative position of its performance. In addition, some industry data goes a step further in that it can identify and quantify best-practice performance levels.
Source
This article has been excerpted from Function Point Analysis: Measurement Practices for Successful Software Projects (Addison-Wesley, 2000), by David Garmus and David Herron.