Abstract— ITIL is the most popular "best practices" framework for managing Information Technology (IT) services. However, implementing ITIL is not only very difficult, but there are also no recommendations or guidelines for doing it. As a result, ITIL implementations are usually long, expensive, and risky. In a previous paper we proposed a maturity model to assess an ITIL implementation and provide a roadmap for improvement based on priorities, dependencies, and guidelines. In this paper, we demonstrate a practical application of the proposed model with a questionnaire to assess the Incident Management and Configuration Management processes, as well as the Service Desk function. We evaluated the questionnaires in 13 assessments in five Portuguese organizations and then implemented a prototype to support the assessments. We finally draw conclusions that could be very useful for organizations that are considering an ITIL implementation.

Keywords- ITIL, implementation, maturity model

I. INTRODUCTION

IT Service Management (ITSM) is the discipline that strives to improve the alignment of IT efforts with business needs and to manage the efficient provision of IT services with guaranteed quality [1]. Although there are many other frameworks, the Information Technology Infrastructure Library (ITIL) has become the most popular for implementing ITSM [1,2] and, as a result, the framework of choice in the majority of organizations [3]. With ITIL, organizations aspire to deliver services more efficiently and effectively, with more quality and less cost [2,3,4]. Furthermore, preliminary results have shown that ITIL works in practice [4].

ITIL was launched by the Central Computer and Telecommunications Agency (now OGC) in the UK with the aim of providing technology-related services in a cost-efficient and reliable manner, by offering a systematic approach to the delivery of quality IT services [5]. ITIL presents a comprehensive set of guidelines for defining, designing, implementing, and maintaining management processes for IT services from an organizational (people) as well as from a technical (systems) perspective [6].

Many organizations that decide to implement ITIL fail completely. Many others keep implementing ITIL long after the planned deadline. Empirical evidence shows that several organizations underestimate the time, effort, and risks – not to mention the cost – of implementing ITIL. The problem is that implementing ITIL is not easy [7].

Maturity models in IT management have been proposed since at least 1973 [8]. More than one hundred different maturity models have been proposed [9], but most are too general and, as a result, not well defined and documented [10]. The Process Maturity Framework (PMF) [12] is the only maturity model specifically designed for ITIL but, in a few pages, cannot provide enough information to help an ITIL implementation.

The maturity model we propose is more descriptive, detailed, and useful because it was designed specifically for ITIL and contains comprehensive questionnaires for each ITIL process. This model can be used to help an ITIL implementation step by step, by assessing the maturity of the existing processes and suggesting what to improve or implement next.

In this paper we describe our most recent results, built on top of the maturity model described in the previous paper [12]. Since that paper, we have used the maturity model to build more questionnaires in order to assess more processes in more organizations, getting valuable feedback to improve the model. We also implemented a prototype to support the assessments, which has already been discussed in two organizations.

II. PROBLEM

ITIL is a methodology to improve service delivery efficiently and effectively, with high quality, based on service best practices. Every year more organizations want to implement ITIL. However, a considerable percentage of them fail, and some organizations collapse trying [7, 11]. Some of the most common mistakes made by organizations when implementing ITIL are [11]:

• Lack of management commitment
• Spending too much time on complicated process diagrams
• Not creating work instructions
• Not assigning process owners
• Concentrating too much on performance
• Being too ambitious
• Failing to maintain momentum
• Allowing departmental demarcation
• Ignoring the need to constantly review ITIL processes
• Trying to memorize the ITIL books
TABLE I. COMPARISON BETWEEN MATURITY MODELS

Criteria                      | Bootstrap        | Trillium   | PMF        | CMM            | ITSCMM   | CMMI-SVC
Success                       | Medium           | Medium     | Very Low   | Extremely High | Medium   | High
Staged (SM) / Continuous (CM) | Both             | Continuous | Continuous | Staged         | Staged   | Both
Number of maturity levels     | SM: 1-5, CM: 1-5 | 0-5        | 1-5        | 1-5            | 1-5      | SM: 1-5, CM: 0-5
Scope                         | Software         | Software   | Services   | Software       | Services | Services
Details                       | Medium           | Medium     | Low        | Extremely high | High     | High
Base for models               | Any model        | Any model  | Any model  | Many models    | CMMI-SVC | --
Figure 2. Map of dependencies
TABLE II. STAGED MODEL AND CONTINUOUS MODEL RELATIONSHIP
(each ITIL process is assessed on the Continuous Model at maturity levels 2-5)

Staged Model Maturity Level 2:
Service Catalogue Management; Service Level Management; Supplier Management; Service Asset & Configuration Management; Event Management; Incident Management; Request Fulfillment; Monitoring & Control; Service Desk; Technical Management; Service Generation

Staged Model Maturity Level 3:
Demand Management; IT Financial Management; Service Portfolio Management; Capacity Management; Availability Management; IT Service Continuity Management; Transition Plan & Support; Change Management; Release & Deployment Management; Service Validation & Testing; Problem Management

Staged Model Maturity Level 4:
Access Management; Application Management; Information Security Management; Evaluation; Knowledge Management; Service Report; Service Measurement

Staged Model Maturity Level 5:
Service Improvement
We can easily see the complex web of dependencies between the ITIL processes; without a guide, it is really hard to implement ITIL without failures. Even if an organization wishes to implement only one or two processes, it should consider the several connections and the information that they require.

C. Staged Model and Continuous Model

A Staged Model enables the organization to have a global vision of the main processes to implement, while a Continuous Model enables the assessment of each ITIL process. However, they are both connected, as we can see in Table II.

All staged models proposed in previous maturity models share the particularity that the next level can be achieved only if all the processes of the level below are fully implemented. This implies that, in order to achieve a certain level, the organization must fully implement a set of processes, and this is a huge effort in every way.

D. Prototype

In order to support the proposed questionnaires, we designed and implemented a software prototype that helps organizations assess their ITIL processes more professionally, easily, and efficiently, free of charge. The prototype has two facets: single organization and multi-client (i.e., an organization with two or more Service Desks or Incident Management teams). This enables comparison between teams of the same organization.

Some of the prototype functionalities are: process assessment; delegation of questions to the people that the person responsible believes are more appropriate to answer them; assessment reports; evolution of assessments of the same process; comparison of multi-client assessments; and a roadmap with the most appropriate next steps to implement.

Fig. 3 contains a screenshot of the home page (dashboard), where three kinds of information are presented:
I. The percentage of implementation of each process. In this case we have two teams (clients) and we can see the percentage of both.
II. The number of questions each user still has to answer. This is useful to check whether users are collaborating with the assessment.
III. The number of questions answered per day, so the manager can quickly understand whether the assessment is progressing well and, if not, do something about it.
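The staged rule described in subsection C (a staged level is reached only when every process of that level and of the levels below is fully implemented) can be sketched in a few lines of Python. This is only an illustration under our assumptions: the process subset is abbreviated from Table II, and the function name is ours, not part of the model or the prototype.

```python
# Sketch only: abbreviated subset of the Table II process-to-level assignment.
STAGED_LEVELS = {
    2: ["Service Desk", "Incident Management",
        "Service Asset & Configuration Management"],
    3: ["Change Management", "Problem Management"],
    4: ["Knowledge Management"],
    5: ["Service Improvement"],
}

def staged_level(fully_implemented):
    """Highest staged level whose processes (and those of all lower levels)
    are all fully implemented; level 1 if no staged level is reached."""
    level = 1
    for lvl in sorted(STAGED_LEVELS):
        if all(p in fully_implemented for p in STAGED_LEVELS[lvl]):
            level = lvl
        else:
            break  # a gap at this level blocks all higher levels
    return level
```

Under this rule, implementing a level-3 process such as Change Management without finishing all level-2 processes does not raise the organization above level 1, which is exactly the "huge effort" the text refers to.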
Figure 3. Prototype Dashboard
TABLE III. PART OF INCIDENT MANAGEMENT QUESTIONNAIRE (Level 3)

Key      | Was a policy for the planning and execution of the process established?
Key      | Is the policy for the planning and implementation of the process documented?
Key      | Was a plan for the implementation of the process defined?
Key      |   • Was the plan reviewed by the stakeholders and did it have their consent?
Key      |   • Is the plan revised when necessary?
Key      | Is the plan for the execution of the process documented?
Key      | Is the description of incident management documented?
Key      | Is it described how responsibility for handling incidents is assigned and transferred?
Key      | Is there a description of the process that states the needs and objectives for the implementation of the process?
Key      |   • Is it maintained?
Key      |   • Is it updated?
Key      |   • Is it revised when necessary?
Key      | Is there a description of how to notify customers or end users that could be affected by a reported incident? (typically documented in the service agreement)
         | Does it describe the following parameters:
Non Key  |   • Definitions of impact
Non Key  |   • Response time
Non Key  |   • Resolution time
Non Key  |   • Rules for ordering
Non Key  |   • Expectations in providing feedback to users
Key      | Is the repository audited in accordance with a documented procedure?
Dependent Key (Config. M.) | Did it exchange information with "Configuration Management" in order to maintain the registry settings?
• Key questions: All these questions are essential for the correct implementation of each specific level of the process, and all must be implemented to reach the level. Basically, they are related to roles, responsibilities, activities, documents, audits, reviews, etc.
• Non-key questions: Not all of these questions need to be implemented, just 75 percent of them. Basically, they are related to sub-practices; they are not the main focus.
• Dependent key questions: These questions are related to dependencies between processes. If the process being assessed depends in some way on other ITIL processes and the organization has those processes implemented too, then the question must be implemented; otherwise, it need not be.
The organizations can answer each question of the questionnaires with:
• Yes: They have the question implemented
• No: They do not have the question implemented
• Don't know: They do not understand the question or do not know the answer
• In implementation: They are already in an implementation phase of that question
Table IV shows a summary of the results of all assessments, as well as the answers to the open questions. In fact, we asked some questions before and after the questionnaires in order to understand the quality and completeness of our proposed questionnaire and the vision that each organization has of its own ITIL implementation.
The columns in Table IV mean the following:
• Simple: Whether the person responsible for the process understood all the questions
• Complete: Whether the questionnaire covered all the activities and practices implemented by the organization
• Copy: Whether the organization wants a copy of the questionnaire with the results
• State: The level of process implementation at which the responsible person believes the organization is
TABLE IV. ASSESSMENTS RESULTS SUMMARY

Process                  | Organization            | Simple | Complete | Copy | State | Model state | Start | Finish | People | Budget (€) | Level
Service Desk             | Organization 1          | No     | Yes      | Yes  | 15%   | 39%         | 2010  | No     | 215    | 10M        | 1
Service Desk             | Organization 2 (Team 1) | Yes    | Yes      | Yes  | 70%   | 98%         | 2008  | No     | 60     | ---        | 5
Service Desk             | Organization 2 (Team 2) | No     | Yes      | Yes  | 70%   | 77%         | 2008  | No     | 56     | ---        | 1
Service Desk             | Organization 3          | Yes    | Yes      | Yes  | 100%  | 87%         | 2007  | Yes    | 250    | 13M        | 2
Configuration Management | Organization 1          | No     | Yes      | Yes  | 10%   | 31%         | 2010  | No     | 215    | 10M        | 1
Configuration Management | Organization 2          | Yes    | Yes      | Yes  | 60%   | 43%         | 2008  | No     | 60     | ---        | 1
Configuration Management | Organization 3          | Yes    | Yes      | Yes  | 90%   | 71%         | 2007  | Yes    | 250    | 13M        | 1
Incident Management      | Organization 1          | No     | Yes      | Yes  | 15%   | 11%         | 2010  | No     | 215    | 10M        | 1
Incident Management      | Organization 2 (Team 1) | Yes    | Yes      | Yes  | 90%   | 90%         | 2008  | Yes    | 60     | ---        | 3
Incident Management      | Organization 2 (Team 2) | Yes    | Yes      | Yes  | 90%   | 93%         | 2008  | No     | 56     | ---        | 2
Incident Management      | Organization 3          | Yes    | Yes      | Yes  | 100%  | 83%         | 2007  | Yes    | 250    | 13M        | 1
Incident Management      | Organization 4          | Yes    | Yes      | Yes  | 60%   | 61%         | 2008  | Yes    | 30     | 6M         | 1
Incident Management      | Organization 5          | Yes    | Yes      | Yes  | 90%   | 62%         | 2009  | Yes    | 20     | 500K       | 1
Average                  |                         |        |          |      | 66%   | 65%         |       |        | 133    |            | 1.6
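The Average row can be reproduced directly from the table's columns. A quick sketch, with the values transcribed from Table IV in row order (the 13 assessments):

```python
# Values transcribed from Table IV, in row order (13 assessments).
state  = [15, 70, 70, 100, 10, 60, 90, 15, 90, 90, 100, 60, 90]    # State (%)
model  = [39, 98, 77, 87, 31, 43, 71, 11, 90, 93, 83, 61, 62]      # Model state (%)
people = [215, 60, 56, 250, 215, 60, 250, 215, 60, 56, 250, 30, 20]
level  = [1, 5, 1, 2, 1, 1, 1, 1, 3, 2, 1, 1, 1]

print(round(sum(state) / len(state)))      # 66 (%)
print(round(sum(model) / len(model)))      # 65 (%)
print(sum(people) // len(people))          # 133 people
print(round(sum(level) / len(level), 1))   # 1.6 average maturity level
```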
• Model State: The state of the implementation based on the results of the questionnaire
• Start: The date when the process implementation began in the organization
• Finish: The date when the process implementation ended, if it is finished
• People: The number of people who work in the IT department
• Budget: The budget of the IT department
• Level: The maturity level of the ITIL implementation based on the results of the assessment

It should be noted that we assessed two Service Desks and two Incident Management processes in Organization 2, and therefore we can compare different teams in the same organization. Not all organizations were comfortable disclosing their budget; therefore, we do not have information about all the budgets.

We can see that Organization 2 has the highest levels regarding its implementation of ITIL. Apart from Organization 2, only Organization 3 has an assessment above level 1, while all the remaining assessments are at level 1.

Most of the organizations thought the questionnaire was simple, as we only received four "No" answers. Interestingly enough, the percentage of implementation for those that answered "No" was relatively low; thus we may conclude that the questionnaire was not simple only for those with a lower percentage of ITIL implementation. This result suggests we should design an even simpler questionnaire for organizations with a lower level of ITIL implementation, or perhaps a questionnaire that adapts itself to the maturity of the organization.

All confirmed the completeness of the questionnaires: everything they implemented, or plan to implement, was in the questionnaires.

Organization 1 got better results than they expected in two questionnaires. We believe the reason has something to do with the strong investment in CMMI they made before starting to implement ITIL. Nevertheless, they are still at level 1.

Organization 2 achieved the best results and was clearly aware of its maturity level. The Service Desk of Team 1 was a little pessimistic about its level, but Incident Management of Team 1 had a lower percentage and a higher maturity level compared with Incident Management of Team 2. Nevertheless, with 90% they should be at a higher maturity level, and the reason they are not should be investigated.

Organization 3 was very optimistic: they said the implementation had already finished, but in fact they are quite far from that. They obtained a high percentage, but their maturity level was low because they did not answer important questions.

Organization 4 is clearly aware of its maturity level. However, it remains at a low maturity level, yet reported the ITIL implementation as completed. It is quite strange to have 61% of implementation and level 1 since 2008 with the implementation already declared finished; it seems like a futile effort since the beginning of the ITIL implementation. We found two possible reasons, both identified in the problem section of this paper: either they were completely lost in the implementation and did not know the next steps, or they simply reached the limit of their budget.

Organization 5 was the most optimistic, but they got a low percentage and the lowest maturity level (level 1).

These results confirm our initial assumption: organizations fail to implement ITIL because they neglect critical practices or activities. Most organizations fail to implement ITIL properly and remain at a low maturity level, not because they cannot implement a high percentage of what ITIL proposes, but because they fail to implement specific but crucial details. As a result, many organizations end up with the worst of both worlds: they invest in implementing most of ITIL and still cannot recover the return on that investment because they have, in fact, a low maturity level. Our research can then
help organizations benefit more from ITIL without making further investments, just by pinpointing what is still missing.

The maturity model has undergone some improvements since the beginning of the assessments. More questions were added to the mini-questionnaires (before and after each assessment) in order to gather important information and be able to draw more accurate conclusions. Other adjustments were made, such as turning some Key questions into Non-Key questions, or superficial improvements such as removing repeated questions and redundant information.

VI. EVALUATION

Although quite useful, as can be seen in the previous section, the proposed model is not perfect, and in this section we discuss its pros and cons.

With the 13 assessments made, we may conclude the following:
• Most organizations are at maturity level 1, even some with a high percentage of ITIL implementation. This means they are skipping important details when implementing ITIL.
• Organizations with a low percentage of implementation cannot understand all the questions. Perhaps in the future the model should be adapted to the maturity level of each organization.
• Some organizations finish their ITIL implementation without even reaching maturity level 2. We should investigate why this happens; it may be caused by extreme difficulty, too high a price, and/or a lack of benefits. Again, the proposed model may help solve this problem.
• An organization that already has CMMI can achieve better results than expected, but still never reach more than level 1. This makes sense because the proposed model was based on CMMI-SVC and ITSCMM.
• The average maturity level shows that, in general, the maturity level of the organizations is low (1.6). This probably happens because organizations are not implementing ITIL properly.

According to the evaluation performed so far, the identified pros of the proposed model are:
• Extremely useful for helping organizations implement ITIL
• Very detailed and complete
• Can be used to assess and guide an ITIL implementation
• The Staged Model follows an incremental path that reduces the initial effort
• Enables an organization to know "where it is" and "what it should do" regarding ITIL
• Organizations that follow the proposed model avoid the most common mistakes
• The questions are easily understood by most organizations
• The model is useful and interesting; so far, every organization wished for a copy of the questionnaire results

However, we also identified some cons of the model:
• Two processes and one function, out of the 24 processes found in the ITIL v3 books, so far cover only a small part of ITIL
• The Staged Model cannot be assessed because we still lack the questionnaires for most of the level 2 processes
• The sequence of implementation proposed by the Staged Model may not be the most appropriate for all organizations

Regarding the percentage of non-key questions that need to be implemented (75 percent), we do not have any empirical validation. However, the idea is not new and the value is not arbitrary. Trillium also defines a minimum percentage to reach the next level (90 percent), but Trillium does not have distinct kinds of questions (key, non-key, dependent key). Therefore, 90 percent seemed a high value given that all key questions must be implemented, and we believe three-quarters of non-key questions is acceptable. This number needs to be confirmed in the future with more assessments.

Overall, the advantages of the proposal are evident. The present reality of ITIL implementation in most organizations makes this proposal extremely useful. The proof is that all organizations were very interested in our questionnaires and asked for a copy of the results.

VII. CONCLUSION

Implementing ITIL is not easy, as seen by the fact that several organizations fail in their efforts to do so; they clearly need something to facilitate the implementation as well as to recover more benefits from their investments. We made a conceptual map that demonstrates the complexity of the dependencies between processes, and the results show that some organizations got lost even in the implementation of a single process.

In a previous paper we proposed a maturity model to help organizations assess the maturity level of their ITIL implementations and present a roadmap for improvement. In this paper we presented an evaluation of the model using 13 assessments in five organizations, discussed the pros and cons of the proposed model in practice, and summarized a prototype that we implemented to support these assessments.

There are several points of view from which we can draw different conclusions:
• Size of the organizations
• Budget of the organizations
• Public vs. private organizations
• Teams of the same organization

We can see in Table IV that the larger the organization, the better the results. It is an interesting conclusion: the largest organizations are more careful with the implementation and spend their time, effort, and money more appropriately.

The budget subject is quite sensitive; not all organizations were comfortable providing us their budget, so we cannot draw an accurate conclusion. However, after visiting the workplace of organization 6, as well as knowing the
size of the same, it is completely reasonable to assume that this organization is the one with the biggest budget among all the organizations assessed. Based on this assumption, we can easily conclude that a larger budget is synonymous with better results.

Distinguishing between public and private organizations is not entirely fair, since they are not equal in number and have different sizes and budgets; however, it is an interesting viewpoint to evaluate. Organizations 1, 5, and 6 are private and the rest are public. Private organizations clearly achieve better results, both in percentage of implementation and in the maturity level achieved.

We can see in Table IV that both teams claimed to be at the same level of implementation. However, the results show that apparently they have neither the same percentage of implementation nor the same maturity level. After reviewing the specific assessments, we believe the difference lies in the questions that the person responsible for Team 2 answered with "Don't Know"; perhaps this person was not as aware of the implementation as Team 1's. Without those answers, the assessment results of the two teams would have been very similar. Because of this, we can conclude that, inside the same organization, the results tend to be very similar.

We also found that, in most cases, the ITIL implementation is not at the level that organizations believe. Besides that, most organizations implement ITIL as they wish, i.e., without following the ITIL best practices. Our proposed model is very useful because organizations find out what they are not doing properly. We finally conclude that the assessments demonstrate the usefulness and importance of our research work.

The results show that almost all organizations skip important steps and that the average maturity level is only 1.6. The problem we are trying to solve is thus worth our research effort, because most organizations are, in fact, implementing ITIL incorrectly and not getting the benefits from their ITIL implementation.

As part of our future research work, we will make many more assessments in more organizations. This will be achieved partly using the prototype we implemented, thus also serving to improve the prototype. We will also create questionnaires for other popular ITIL processes, such as Change Management. Further down the road, we expect to offer the prototype free of charge as a service on the Internet, so that any organization in the world can self-assess its ITIL implementation.

REFERENCES
[1] Brenner, M., "Classifying ITIL processes: A taxonomy under tool support aspects," in: The First IEEE/IFIP International Workshop, pp. 19–28. IEEE Press (2006)
[2] Hochstein, A., Zarnekow, R., Brenner, W., "ITIL as common practice reference model for IT service management: Formal assessment and implications for practice," in: e-Technology, e-Commerce and e-Service, 2005. EEE '05. Proceedings of the 2005 IEEE International Conference, pp. 704–710. IEEE Press (2005)
[3] Ayat, M., Sharifi, M., Sahibudin, S., Ibrahim, S., "Adoption factors and implementation steps of ITSM in the target," in: Modelling & Simulation, 2009. AMS '09. Third Asia International Conference, pp. 369–374. IEEE Press (2009)
[4] Kashanchi, R., Toland, J., "Can ITIL contribute to IT/business alignment? An initial investigation," WIRTSCHAFTSINFORMATIK, 340–348 (2006)
[5] ITIL Lifecycle Publication Suite. Office of Government Commerce (OGC) (2008)
[6] Graupner, S., Basu, S., Singhal, S., "Collaboration environment for ITIL," in: Integrated Network Management-Workshops, 2009. IM '09, pp. 44–47. IEEE Press (2009)
[7] Nicewicz-Modrzewska, D., Stolarski, P., "ITIL implementation roadmap based on process governance," in: EUNIS 2008. Adam Mickiewicz University Computer Centre (2008)
[8] Rocha, A., Vasconcelos, J., "Maturity models for information systems management (Os modelos de maturidade na gestão de sistemas de informação, in Portuguese)," in: Journal of the Faculty of Sciences and Technology at University Fernando Pessoa, No. 1, pp. 93–107 (2004)
[9] Bruin, T., Rosemann, M., Freeze, R., Kulkarni, U., "Understanding the main phases of developing a maturity assessment model," in: 16th Australasian Conference on Information Systems (ACIS), Sydney (2005)
[10] Becker, J., Knackstedt, R., Pöppelbuß, J., "Developing maturity models for IT management – A procedure model and its application," in: Business & Information Systems Engineering (BISE), Volume 1, Number 3 (2009)
[11] Paulk, M.C., Weber, C.V., Curtis, B., Chrissis, M.B.: The Capability Maturity Model: Guidelines for Improving the Software Process. Addison-Wesley (1994)
[12] Pereira, R., Mira da Silva, M.: A Maturity Model for Implementing ITIL v3. In: IEEE 2010 International Workshop on Net-Centric Service Enterprises: Theory and Application (NCSE2010)
[13] Sharifi, M., Ayat, M., Rahman, A.A., Sahibudin, S., "Lessons learned in ITIL implementation failure," in: Information Technology, 2008. ITSim 2008, pp. 1–4. IEEE (2008)
[14] Trillium: Model for Telecom Product Development and Support Process Capability, Release 3.2. Technical Report, Bell, Montreal (1996)
[15] Krishnan, M.S., Mukhopadhyay, T., Zubrow, D.: Software Process Models and Project Performance. Information Systems Frontiers 1:3, 267–277 (1999)
[16] Kuvaja, P.: BOOTSTRAP: A Software Process Assessment and Improvement Methodology. In: Lecture Notes in Computer Science, LNCS Volume 926/1995, pp. 31–48. Springer, Berlin (1995)
[17] Robben, D., Nederland, B.V.S.: TPI, BOOTSTRAP and Testing (2006)
[18] Lloyd, V.: Planning to Implement Service Management (ITIL). OGC (2002)
[19] Niessink, F., Van Vliet, H.: The Vrije Universiteit IT Service Capability Maturity Model. Faculty of Sciences, Division of Mathematics and Computer Science, Vrije Universiteit Amsterdam (2005)
[20] Forrester, E.C., Buteau, B.L., Shrum, S.: CMMI for Services, Version 1.2. Technical Report, Software Engineering Institute (2009)
[21] Pinheiro dos Santos, R., Marçal de Oliveira, K., Pereira da Silva, W.: Evaluating the Service Quality of Software Providers Appraised in CMM/CMMI. Software Quality Journal, 283–301, Springer Netherlands (2008)