Thesis number: INF/SCR-07-07
Capgemini supervisors: Chris ten Zweege, Erwin Dunnink
A framework for the comparison of maturity models for project-based management. T.J.Man | September 2007
Preface
Writing a thesis is unmistakably one of the hardest tasks that a graduating student has
to deal with. Despite the enthusiasm at the beginning of the venture, you inevitably
end up counting the days before the final deadline (that is, if you have decided upon
one). As comical as it might sound, the whole process of doing the thesis assignment
can be summarized with three consecutive thoughts (at least in my case).
First, you will think that other students are exaggerating, since there is no way you would take that much time to finish your thesis. After all, you are positive about the subject you have chosen. After several months, you will begin to wonder whether you are on the right track, because something just doesn’t feel right. And then, finally, when you are nearing your final deadline, you will most probably wish you had taken the assignment more seriously from the beginning.
Fortunately, support is provided to alleviate the recurring dips in motivation and self-discipline during this long and cumbersome journey towards graduation. In my case, I wish to thank my supervisors at Utrecht University, Lidwien van de Wijngaert and Sjaak Brinkkemper, for their guidance.
I also want to thank the people at Capgemini Netherlands for providing me with a
resourceful environment to conduct my thesis assignment, especially my supervisors
Chris ten Zweege and Erwin Dunnink.
Likewise, to all members of the PMI project group: please accept my thanks for your cooperation and helpful feedback during and after the project meetings.
And last but certainly not least, I want to thank my parents for supporting me from the very beginning. Without them, I would never have come this far.
Tjie-Jau Man
Abstract
To conduct project-based management (PM) as successfully as possible, it is fundamental for organizations to invest time and effort in constructing the necessary infrastructure, such as organizational structures, policies and the competencies of people. Over time, more advanced organizations may wonder where exactly they stand in the whole process and what they should do to make further advancements.
Maturity models for PM are developed to assist organizations that have these
thoughts. By comparing their own practices against best practices described by these
models, organizations can find out how mature or professionalized they are in
performing project-based management and what they could do to realize desirable
improvements in it. However, with more than 20 maturity models available in the
field of PM, organizations have to consider carefully which one they can adopt. In
order to do this, organizations need to know what aspects of these models are
important to consider and how they should evaluate them.
In this thesis, research is conducted into the dimensions relevant to the evaluation of maturity models for PM. This sets the stage for the selection of the measures needed to evaluate similarities and differences between maturity models for PM.
The research showed that maturity models for PM can be evaluated along three dimensions: structure, applicability and usage. Three measures were selected to operationalize these dimensions, in the same respective order: Process-Data Diagrams, evaluation criteria and user interviews.
These measures formed a framework that was applied to several maturity models for PM to determine the framework’s quality. The framework and its constituent measures proved useful in shedding light on the relevant similarities and differences between the models. It was able to reveal the strengths and weaknesses of each evaluated maturity model, which should be considered by organizations planning to adopt one of them.
Table of contents
1. INTRODUCTION ...............................................................................................................2
APPENDIX ...............................................................................................................................80
1. Introduction
In general, there are two reasons why it is beneficial for organizations to adopt a
maturity model for project-based management, which includes the management of
projects, programs and portfolios. Ever since organizations began to adopt the project-
based way of conducting business, they have strived to deliver projects successfully.
To do this, organizations require the necessary infrastructure, which includes
processes (methods and techniques), governance structures, competences of people
and tools [1]. Developing such an infrastructure may take several years, and because of this, more advanced organizations may start to wonder after a while where exactly they stand in the whole process and whether they are going in the right direction. This is
when the adoption of a maturity model proves useful. A maturity model is able to
assist organizations in verifying what they have achieved by describing activities and
best practices and categorizing these descriptions into progressive levels of maturity.
The second benefit of adopting a maturity model becomes apparent when an organization has finished assessing its current practices and aims to advance to a desired level of maturity [2]. By comparing the results of a maturity assessment with the descriptions in a maturity model, an organization gains insight into its strengths and weaknesses and is able to prioritize its actions to make improvements.
Closely aligned with this assignment, research was done on maturity models for project-based management. A framework was developed to support the evaluation and comparison of such models. This framework was applied to compare three maturity models for project-based management with each other.
The following sub-research questions will set the stage for answering the main research question.
3. What are the relevant dimensions for the evaluation of maturity models for project-based management?
These dimensions provide guidance for the selection of the measures that facilitate the comparison between the maturity models. They will ultimately form the evaluation framework mentioned earlier.
4. What are the main similarities and differences between maturity models for project-based management?
This sub-question focuses on the main similarities and differences found after evaluating several maturity models for project-based management with the developed framework.
Supporting the above assumption is contingency theory, a theory that takes many forms in the world of research. The earliest and most extensively researched form of contingency theory was introduced by Fiedler in the 1960s; it explains group performance as a result of the interaction of two factors: leadership style and situational favorableness [4][5]. Since then, the contingency approach has been applied and adapted many times [6][7][8][9]. A list of scientific articles that use contingency theory is given in [10].
Similar to what other researchers have done, this thesis applies contingency theory to maturity models for project-based management. As mentioned before, the focus here is on finding measures to elicit similarities and differences among these maturity models. The results of this research could set the stage for further research on the possible contingency or ‘fit’ between the models and organizational situations, relating this to performance variables. The differences found with the measures may also prove useful in future research on the categorization of maturity models for project-based management. And if the framework proves useful for comparisons between them, it may serve as a foundation for evaluating and comparing other maturity models in the same discipline.
In order to determine the quality of the evaluation framework, we have adopted five requirements from the field of situational method engineering [11]. These requirements are used to assess the quality of a method assembled from method fragments to suit a situation specific to a project. As our framework describes a method to evaluate and compare maturity models for project-based management, these requirements can be used for its evaluation. The five requirements are defined as follows for this thesis:
- Completeness: the framework describes all relevant dimensions for the
evaluation of maturity models for project-based management;
- Consistency: all activities and concepts are consistently defined and described
throughout the framework;
- Applicability: the researchers are able to execute the method described by the
framework;
- Reliability: the method described by the framework is semantically correct and
meaningful;
- Efficiency: the method described by the framework can be performed at
minimal cost and effort.
This thesis concludes with the elaboration of discussion topics and conclusions based
on the analysis results and the description of the framework’s quality based on the
above requirements (Chapter 5).
Programs differ from projects in that they are carried out to achieve specific strategic business objectives or goals. Or, as formulated in [12], “a program’s focus is on producing, in accordance with a vision of an ‘end state’ consistent with organizational strategic objectives”. An example of an ‘end state’ is ‘the realization of a 5% cost reduction throughout the entire organization’. To achieve this, a program will consist of a number of projects or functional activities, including for example ‘the implementation of a new logistics system’ and ‘the development of a new IT system’ [2]. In addition, while the aforementioned project is successful when the logistics system is implemented in conformance with its specifications, the completion of the above program depends on the realization of the 5% cost reduction, not on the delivery of a new IT or logistics system.
According to Wikipedia [15], the management of programs is:
One well-known maturity model is the Capability Maturity Model (CMM), introduced by the Software Engineering Institute (SEI). This model was later replaced by its successor, the Capability Maturity Model Integration (CMMI) [28]. The development of the Capability Maturity Models inspired the emergence of other maturity models in the same field of software development. Examples of these are the Test Process Improvement (TPI) model developed by Sogeti [29] and the Usability Maturity Model [30].
During this research, an attempt was made to construct a long-list containing existing
maturity models for PM. This list of maturity models is depicted in Table 1, along
with their names and owners.
Maturity models differ from one another in the concepts they embody and in the suggestions they make as to what the path to maturity looks like [22]. Different maturity models for PM may define maturity differently and measure different things to determine maturity. Because of this, organizations should give careful consideration to the selection of a maturity model.
A candidate maturity model should at least be publicly available (for free or for a moderate fee) through publication in a book (electronic or in print). This is to ensure that the needed information about a maturity model is accessible when it is evaluated using the framework. The table below shows the average scores given to the maturity models.
Table 2 (continued)
The maturity models were rated with scores ranging from 0 to 10, where 0 indicates “Hardly” and 10 indicates “Completely”. Grayed-out maturity models are those excluded from the long-list based on their accessibility. The maturity models ‘MINCE2’ (nr. 4) and ‘PM Solutions PMMM’ (nr. 8) had not been examined yet due to time constraints.
The short-list that resulted from the selection process consisted of the following
maturity models:
The first three models on the list, OPM3, CMMI and PMMM, were selected for this thesis. For each model, materials provided by the following institutions are used: the Project Management Institute (PMI) for OPM3, the International Institute for Learning (IIL) [34] for PMMM and the Software Engineering Institute (SEI) for CMMI. A brief background of each model is provided in the following sections, including the reasons why it was or was not selected to test the framework.
2.4.1. OPM3
OPM3 is an acronym for Organizational Project Management Maturity Model. It is a standard developed under the stewardship of the Project Management Institute (PMI) and introduced by it in December 2003. The development of this standard was inspired by the increasing interest in a maturity model that shows a step-by-step method of improving and maintaining an organization’s ability to translate organizational strategy into the successful and consistent delivery of projects. In other words, OPM3 is meant to enable organizations to bridge the gap between organizational strategy and successful projects [35].
The purpose of OPM3 is not to prescribe what kind of improvements users should
make or how they should make them. Rather, by providing a broad-based set of
organizational project management (OPM) best practices, this standard allows an
organization to use it as a basis for study and self-examination, and consequently to
make its own informed decision regarding potential initiatives for changes [12].
OPM3 was selected for the evaluation because of its popularity in the field of PM.
Information about this model was readily accessible since the chairman of the project
group is a certified OPM3 assessor.
This model is closely aligned with the PMBOK [12], a well-accepted standard approach to project management. In addition, supplementary PMBOK guides have recently been developed to describe approaches to program and portfolio management. These additions are also embedded in OPM3.
2.4.2. CMMI
CMMI stands for Capability Maturity Model Integration. Its first version (1.1) was
introduced by the Software Engineering Institute (SEI) in 2002 as the successor of the
Capability Maturity Model (CMM), which was developed from 1987 until 1997. The
SEI (2007) defines CMMI as a process improvement approach that helps
organizations integrate separate functions, set process improvement goals and
priorities, provide guidance for quality processes, and provide a point of reference for
appraising current processes.
The latest version of CMMI (1.2), released in 2006, comprises a framework that allows the generation of multiple models, training courses and appraisal methods supporting specific areas of interest. CMMI for Development is one of those models; it provides guidance for managing, measuring, and monitoring software development processes.
The development of CMM and CMMI was based on the premise that “the quality of a system is highly influenced by the quality of the process used to acquire, develop and maintain it” [28]. These models comprise the essential elements of effective processes for one or more disciplines (e.g. software development) and describe an evolutionary improvement path from ad hoc, immature processes to disciplined, mature processes with improved quality and effectiveness. The fundamental idea behind this is that even the finest people within an organization cannot perform at their best if the process is not understood or not operating at its best [28].
During the evaluation, documents on SEI’s CMMI and the Standard CMMI Appraisal Method for Process Improvement (SCAMPI) [37] were consulted. It should be noted that SEI is not the only institution that provides CMMI maturity assessments. While SEI’s assessment method is the only one evaluated for CMMI here, it is not the only one that exists for CMMI. Thus, the findings here do not necessarily apply to the procedures employed by other institutions that also provide CMMI assessments.
2.4.3. PMMM
The Project Management Maturity Model (PMMM) was introduced by H. Kerzner in
1998. The first edition of his book describing this model was published in 2001. In
2005, he published the second edition.
PMMM is a practical, PMBOK-aligned standard [25]. The model sets out various levels or stages of development towards project management maturity, along with assessment instruments to validate how far along the maturity curve an organization has progressed. The original intent of the PMMM is to provide organizations with a framework that allows them to create an organization-specific maturity model. Each organization can take a different approach to maturity, and that is why organizations are allowed to adapt the questions and answers of the PMMM questionnaire [25].
The physical part of the model consists of a book and an online assessment tool. Both
components serve to provide individual assessment participants and their
organizations with:
- a breakdown of how they are doing in different categories at each maturity level;
- a comparison of their overall results against those of other companies and individuals who have taken the assessment; and
- a high-level prescriptive action plan to follow for individual and organizational
improvement.
This model was chosen because of its simplicity and availability. It is interesting to
examine the differences between PMMM and OPM3 since both of them are aligned to
the PMBOK.
2.4.4. P3M3
P3M3 stands for Portfolio, Program and Project Management Maturity Model. It was
formally published in February 2006 by the Office of Government Commerce (OGC)
after some refinements were made [27]. The model describes the portfolio, program
Although P3M3 was eligible to be included in this thesis, access to information about this model was only granted to people at accredited institutions. Several attempts were made to contact these accredited institutions, but most of them could not fulfill the request for information due to license agreements. Some of the contacted persons agreed to answer questions relevant to conducting the evaluation, but because this took place when most of the research was already done, it was no longer possible to include P3M3 in the thesis.
2.4.5. MINCE2
The last candidate maturity model for PM was MINCE2, an acronym for Maturity INcrements IN Controlled Environments 2. The MINCE2 Foundation (established in May 2007) developed this model in order to:
- determine the project maturity level an organization is in;
- report in a standardized way regarding the findings; and
- indicate what to do in order to increase the maturity [33].
MINCE2 was not included in the thesis because, at the moment of the maturity model selection, the owners of the model were still working on its publication. All details about MINCE2 were to be published in August 2007. So although one of the project group members had access to information about the model, it could not be used for the thesis research due to publishing rights.
In this chapter, the concept of maturity models for PM was explained. The next
chapter elaborates on the development of the framework used to evaluate the selected
maturity models.
After several brainstorming sessions and informal meetings, the project group members agreed upon a staged approach for the evaluation framework. It was decided that the framework should evaluate a maturity model’s structure, applicability and usage. The following sections describe these three dimensions. After explaining what a dimension entails and how it can be operationalized, a measure is described to elicit the characteristics of a maturity model on that dimension. The description of each dimension concludes with an elaboration of how the results for the models, found with the selected measure, are compared with each other.
The structure of the maturity reference model comprises a collection of concepts and the relationships between these concepts. Each concept says something about the notion of ‘maturity’ as defined by a maturity model, and it is the relationships between these concepts that illustrate their importance and role in that definition. Shedding light on the structure of the model concepts makes it easier for organizations to understand the purpose and essence of a maturity model.
The assessment method of a maturity model can be broken down into multiple process phases and activities. By knowing the structure of these process phases and activities, organizations will know what to anticipate when engaging in an assessment. The products resulting from the assessment activities are also important, especially their relationships with the concepts underlying the maturity reference model. After all, these products contain the data that assessors use to assess maturity, and they do so by comparing these data with the measures defined by the maturity reference model. So the relationships between the products resulting from the assessment activities and the concepts of the reference model must not be ignored.
To depict the structure of the maturity reference model and the assessment method as accurately as possible, a meta-modeling technique was selected as the measure for the structure dimension.
In [38], the authors were able to compare different object-oriented analysis and design techniques using a meta-modeling technique. According to the authors, the construction of meta-models is a uniform and formal way to compare methodologies as objectively as possible, provided that the same constructs are used to model them.
In this thesis, the framework employs a type of meta-model called the Process-Data Diagram (PDD), described in [39]. A PDD is made up of two different meta-models, namely a ‘meta-process model’ and a ‘meta-data model’. As mentioned earlier, a maturity model for PM incorporates a maturity reference model and an assessment method, and the two types of meta-models are suitable for modeling each of these two parts. More specifically, the meta-process model is used to depict the process phases and activities of an assessment method, while the meta-data model models the concepts underlying the maturity reference model. After creation, these two meta-models are combined into a PDD in which the relationships between the activities of the assessment process and the concepts of the model are shown.
PDDs are able to answer the following three questions for each maturity model for
PM:
- What process phases and activities is the assessment method made up of?
(using the meta-process model)
- What products do the activities of the assessment method deliver? (meta-data
model)
- What concepts underlie a maturity model for PM, and how are they related to
each other and the products of the assessment method? (process-data diagram)
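The structure a PDD captures can be illustrated with a small data sketch: the meta-process side lists phases and their activities, the meta-data side lists concepts, and links record which concepts each activity delivers. All phase, activity, and concept names below are invented for illustration; they are not taken from any of the evaluated models.

```python
# Illustrative representation of a Process-Data Diagram (PDD).
pdd = {
    "phases": {  # meta-process model: phases and their activities
        "Prepare": ["Select assessment scope"],
        "Assess": ["Conduct interviews", "Rate maturity"],
    },
    "concepts": ["ASSESSMENT SCOPE", "INTERVIEW RECORD", "MATURITY RATING"],
    "links": {  # activity -> products/concepts it delivers
        "Select assessment scope": ["ASSESSMENT SCOPE"],
        "Conduct interviews": ["INTERVIEW RECORD"],
        "Rate maturity": ["MATURITY RATING"],
    },
}

def products_of(activity):
    """Answer question two: which products does an activity deliver?"""
    return pdd["links"].get(activity, [])

# Question one: what phases and activities make up the assessment method?
activities = [a for acts in pdd["phases"].values() for a in acts]
print(activities)
print(products_of("Rate maturity"))
```

The third question (how concepts relate to each other and to the assessment products) corresponds to following the `links` entries back to the `concepts` list.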
After all activities and concepts are defined, they are used to create two additional
tables: an activity comparison table and a concept comparison table. The activity
comparison table consists of a consolidated reference list of the activities of all
maturity models that are selected to test the framework. This means that overlapping
activities of the maturity models are combined and non-overlapping ones are added to
the table. The activity comparison table also contains consolidated process phases of
the assessment methods. Similarly, the concept comparison table contains a
consolidated reference list of concepts of the selected maturity models. These tables
are used for the actual comparison of the activities and concepts of the selected
maturity models. This is done by filling in the fields in the ‘Maturity model’ columns
using the symbols explained below.
Finally, an empty field means that an activity or concept in the reference list is not
present in the PDD of the respective maturity model for PM. Examples of comparison
tables resulting from the analysis are depicted below:
A comparison table allows a quick overview of what activities or concepts are present
in the PDD of a maturity model for PM and what the main differences are with other
models. It should be noted that only the concepts related to the assessment activities
are compared using a concept comparison table. The core concepts underlying a
maturity model are depicted as gray boxes in the respective PDDs and are compared
using narrative text instead. This was decided because a concept comparison table
Eventually, the decision was made to use evaluation criteria to measure the relevant properties of maturity models for PM. Criteria are appropriate measures for this dimension because of their flexibility: each criterion captures one property independently of the others, which is useful especially when the list of criteria is not definitive. It allows the addition or removal of criteria without affecting the other criteria on the list.
Because the purpose of the evaluation framework is descriptive rather than normative, no scores or ranks result from the evaluation based on the criteria. For this reason, the findings per maturity model are presented in the format shown in Table 7. The value ‘Yes’ indicates that a maturity model meets a criterion and ‘No’ indicates otherwise. The ‘Reference’ column contains references to the information sources used and the ‘Explanation’ column provides a brief explanation of the findings.
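This descriptive, non-scoring format can be sketched as a simple record per criterion. The criterion name follows the thesis's naming, but the value, reference, and explanation shown are invented placeholders, not actual findings.

```python
# One finding in the format of Table 7: a Yes/No value, a reference to
# the information source, and a brief explanation.
finding = {
    "criterion": "MM1 - Openness",
    "value": "Yes",                      # 'Yes' = the model meets the criterion
    "reference": "[12], ch. 3",          # placeholder source reference
    "explanation": "Model text is publicly available in book form.",
}

def render(f):
    """Render a finding as one row of the descriptive table."""
    return f"{f['criterion']}: {f['value']} ({f['reference']}) - {f['explanation']}"

print(render(finding))
```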
Besides books and publications, assessors, experts and accredited associations were consulted to gather and verify information. Experts and assessors are people who have previously gained experience with a maturity model for PM during maturity assessments. Accredited associations are organizations qualified to train and dispatch consultants to conduct maturity assessments for other organizations. These associations may also provide training sessions to those who wish to become certified assessors or who want to know more about a particular maturity model for PM.
Each criterion is defined and explained below to clarify the property it measures, the
reasons why this property is relevant and how it is used to measure this property.
MM1 - Openness
Openness is the degree to which a maturity model for PM is available to the public and whether its usage is limited to certain individuals or organizations.
This criterion considers four possible situations that have significant implications for the openness of a maturity model. It measures whether a maturity model for PM and/or its assessment materials:
- can be accessed without payment (free access);
- can be accessed against payment (paid access);
- are meant to be used by certified assessors (certified usage);
- can only be used by specific organizations (proprietary access & usage).
These situations are not mutually exclusive. A maturity model can, for instance, be made openly available while its assessment can only be performed by certified assessors. The resulting schema indicates whether the four situations hold for a maturity model and its assessment. In the ‘Explanation’ column, additional information is provided for the values found.
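Because the four situations can hold in combination, they can be captured as four independent flags per model. The example values below are invented, not findings for any particular model.

```python
# Openness profile of a hypothetical maturity model: the four situations
# are recorded independently because they are not mutually exclusive.
openness = {
    "free_access": True,       # accessible without payment
    "paid_access": False,      # accessible against payment
    "certified_usage": True,   # assessment restricted to certified assessors
    "proprietary": False,      # usable only by specific organizations
}

# e.g. a model can be openly available while its assessment still
# requires a certified assessor:
combination_possible = openness["free_access"] and openness["certified_usage"]
print(combination_possible)
```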
MM3 - Scope
As explained before, the discipline of PM includes the domains of project management, program management and portfolio management. Because of the differences between projects, programs and portfolios, the corresponding best practices and processes differ as well. The scope of a maturity model for PM is the extent to which it covers PM. Depending on the domains covered, a maturity model may be structured differently and describe different processes to improve in. One maturity model for PM may describe all three domains of PM while another may focus solely on the management of projects. A model that only describes project management limits its usage to organizations that wish to improve their project management processes. In other words, limits in scope affect the applicability of a maturity model. This is why scope was selected as a relevant property.
- Process area: A process area is a group of related practices that, when implemented collectively, contribute to making improvements in that area. At the highest abstraction level, assessors determine the maturity level of organizations by assessing the presence or absence of particular process areas. Examples of process areas are risk management, cost management and change management.
- Activity: The description of an activity provides assessors with guidance in evaluating whether an organization carries out certain activities to achieve purposes defined in the maturity reference model. Examples of activities are developing a project plan and involving stakeholders.
- Role: Roles describe the functions of the individuals who should be responsible for executing certain activities. This element can be employed by maturity reference models to check whether activities are executed by people with the appropriate authority.
- Competency: Besides the roles of the people executing activities, a maturity model may describe the minimum or required knowledge levels and capabilities of these people. The competencies of the individuals carrying out a certain activity may affect the outcome of that activity.
- Deliverable: The idea behind this element is that if an organization claims to carry out certain activities, it should be able to deliver the products resulting from these activities. For instance, if a member of an organization claims to develop a project plan, this member should be able to produce a project plan.
- Result: Maturity models are meant to help an organization improve. A maturity model may describe activities or practices that should be in place within organizations, but it can also describe outcomes or improvements that an organization may experience after achieving a particular maturity level. In this case, an assessor can determine an organization’s maturity by assessing whether the organization is able to achieve the improvements or outcomes described by a maturity reference model.
The level of detail of a maturity reference model is related to the number of elements
used to describe the maturity level conditions. The more elements are used, the more
accurately assessors can determine the maturity of organizations. As a result, maturity
can be rated consistently across repeated assessments, leaving little room for
inconsistencies caused by the maturity reference model itself (reliability).
Maturity models for PM of the first group can describe different dimensions in which
organizations can mature or professionalize their practices, but these dimensions are
measured at all maturity levels. Take for instance a dimension such as 'competences
of people'. A maturity model adopting this dimension describes the type of
competences that members of an organization should possess at each maturity level.
The model then rates an organization's maturity on this dimension and, where
applicable, on other dimensions as well. The next criterion sheds more light on
models of the second family.
By shedding light on the maturity dimensions employed by a maturity model,
organizations gain a better understanding that helps them select a model. The
initial intention was to use a pre-defined list of dimensions from the literature.
However, evaluating the maturity models for PM with this list proved to be very
difficult due to the varying interpretations of the dimensions and their underlying
concepts. For this reason, it was decided that the framework will not employ a pre-
defined list of maturity dimensions. Instead, the framework simply lists and
describes the dimensions used by a maturity model.
The number of elements used affects the level of detail of a method description. With
a tight protocol for conducting an assessment, the outcomes are less likely to be
affected by variances in the choices and actions of the assessors. A thorough
description of an assessment method can thus ensure its repeatability and the
reliability of the assessment outcomes.
for PM may vary, this criterion only considers the part of the questionnaire focusing
on the ‘project management’ domain.
This criterion results in a number indicating the number of questions in the
questionnaire provided by each maturity model for PM.
AM7 - Benchmarking
Depending on the maturity model, the results of a maturity assessment may be used
for benchmarking purposes. This means that anonymous maturity results of assessed
organizations can be gathered by assessment institutions and used to compare
aggregated scores between, for instance, industry sectors or regions. If an
institution allows it, an assessed organization may compare its maturity scores with
the aggregated maturity scores of organizations operating in the same industry or
region.
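The benchmarking idea described above can be sketched in a few lines of Python; the sector names and score values below are purely hypothetical illustrations, not data from any assessment institution:

```python
from statistics import mean

# Anonymous maturity results, as an institution might collect them.
# Each record keeps only the industry sector and the maturity score.
anonymous_results = [
    {"sector": "finance", "score": 3.2},
    {"sector": "finance", "score": 2.8},
    {"sector": "telecom", "score": 3.9},
    {"sector": "finance", "score": 3.5},
]

def sector_benchmark(results, sector):
    """Aggregate the anonymous scores of one industry sector."""
    scores = [r["score"] for r in results if r["sector"] == sector]
    return mean(scores) if scores else None

# An assessed organization compares its own score with the sector average.
own_score = 3.4
benchmark = sector_benchmark(anonymous_results, "finance")
print(f"own: {own_score}, sector average: {benchmark:.2f}")
```

Because only aggregated scores are shared, individual organizations stay anonymous while comparisons per sector or region remain possible.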
Before that is possible, however, a standard method for the assessment needs to be in
place. As with the standardization of a questionnaire, the standardization of the
assessment method decreases the amount of variance in the results attributable to
other factors (e.g. the choice of participants or assessors) and increases the
attribution to the real variable in question: maturity. Two questions are considered
when evaluating each assessment method description for this criterion:
- whether a standard method is described for a maturity assessment, and
- whether maturity profiles are preserved for benchmarking purposes and made
available to assessed organizations.
(Table template: a pairwise comparison table whose cells list entries such as
'difference #1' and 'similarity #1'.)
After these tables, another table is depicted to briefly summarize the findings for
each model per criterion. By placing all values in one schema, the differences
between maturity models are easy to see at a glance. As some criteria cannot be
answered with a simple yes/no answer, some fields of the schema will also contain
narrative text as values. An example of the resulting schema is shown in Table 11.
Criterion        Model 1   Model 2   Model 3
Criterion A      No        Yes       No
Criterion B
- aspect ba      Yes       No        Yes
- aspect bb      No        No        Yes
Criterion C      No        Yes       No
Criterion D      Yes       Yes       No
There are two ways to measure this dimension: via a survey or via interviews.
Because there was little time to gather enough respondents to make a survey
statistically relevant, the project group decided to opt for interviews.
3.3.1. Interviews
This part of the framework was originally meant to elicit user experiences with a
maturity model for PM; therefore, the interviews had to involve people with prior
experience with the selected maturity models.
In the end, only a small number of interviews were conducted: two for CMMI and one
for OPM3. This is because few people were available with experience with the
selected maturity models, especially OPM3 and PMMM. These two models are less well
known in Europe than CMMI. On top of that, not many organizations in the
Netherlands have adopted these two models.
Ideally, the usage dimension should be measured through interviews with two
different user groups per maturity model, namely the assessors and the members of
user organizations. Taking only one of the two perspectives would create a biased
view of these models. Unlike (certified) assessors, members of user organizations
do not have much experience with maturity models, and thus their knowledge
backgrounds differ from those of assessors.
Even if members of a user organization attend training sessions to gain knowledge
about a maturity model (i.e. not with the intention to become certified assessors),
there will still be a difference in how they look at a maturity model, simply
because they have different objectives for using one. Assessors use maturity models
to assess user organizations; they interview members of an organization and verify
the findings to determine the degree of maturity of that organization. Conversely,
user organizations are those who request the assistance of certified assessors to
assess their maturity. They are the ones being interviewed and providing information
about their ways of working. Depending on the maturity model, either the assessors
or the user organizations will use the model to determine improvement trajectories.
Ultimately, it is the members of the user organizations who initiate and realize the
improvement initiatives.
For these reasons, eliciting practical experiences from both assessors and members
of user organizations helps generate a complete picture of the usage dimension of a
maturity model.
Two different questionnaires were developed for this dimension: one meant for a user
organization and the other for an assessor. These questionnaires are included in the
Appendix section in Dutch (see Appendix D-1).
Due to the small number of interviews, the project group decided that the retrieved
interview data would be used to support the findings of the previous two dimensions.
However, information retrieved from interviews and informal consultations is
described separately as results of the usage dimension when it provides insight into
the models that the previous two dimensions overlooked.
OPM3
The OPM3 standard comprises several important components, namely the 'foundation',
a 'self-assessment tool' and three sets of 'directories'. To register the maturity
scores of client organizations for benchmarking purposes, the PMI also maintains a
'maturity database' that stores maturity profiles. While the foundation and self-
assessment tool are made available to assist client organizations in understanding
the OPM3 standard, the three directories form the core of the assessment method.
These are the 'best practices directory', the 'capabilities directory' and the
'improvement planning directory'. The best practices directory contains two types
of 'best practices', namely 'organizational enablers' and 'process best practices'.
The former are supportive practices relating to the organizational structures and
processes required to facilitate the efficient and effective realization of best
practices for projects. The latter are practices that are currently recognized and
applied by industries to achieve a stated goal or objective. These best practices
are grouped by 'process improvement stages'
and ‘PM domains’. ‘Process improvement stages’ are the four stages of process
maturity: from process Standardization to Measurement to Control and ultimately to
Continuous Improvement. In other words, the OPM3 model describes best practices
that an organization can learn from to Standardize, Measure, Control or Continuously
Improve its PM processes. Organizations can also choose to assess their process
maturity within a specific PM domain such as project management, program
management or portfolio management.
OPM3 describes 'capabilities', stored in the capabilities directory, which are
specific competencies that must exist in an organization in order for it to execute
a best practice. To prove the existence of a capability, the existence of one or
more corresponding 'outcomes' is examined. Outcomes are the tangible or intangible
results of applying a capability. The degree to which an outcome exists is
determined by criteria known as 'key performance indicators' (KPIs).
Furthermore, OPM3 acknowledges 'dependencies' among its underlying concepts,
particularly best practices and capabilities. The first type of dependency lies
within the series of capabilities leading to a best practice: each capability builds
upon preceding capabilities to achieve a single best practice. A capability under
one best practice can also depend on the existence of a capability under a different
best practice, in which case one best practice is said to depend on the other. This
is the second type of dependency described within OPM3. These dependencies are
stored in the 'improvement planning directory'.
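As a rough illustration of how these OPM3 concepts fit together, the structure can be modelled as nested records. The class layout and every example name below are assumptions made for this sketch; they are not taken from the standard's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    description: str
    kpis: list  # criteria measuring the degree to which the outcome exists

@dataclass
class Capability:
    name: str
    outcomes: list  # evidence that the capability exists
    depends_on: list = field(default_factory=list)  # first dependency type: preceding capabilities

@dataclass
class BestPractice:
    name: str
    domain: str  # project, program or portfolio management
    stage: str   # Standardize, Measure, Control or Continuously Improve
    capabilities: list

def best_practice_exists(bp, observed_outcomes):
    """A best practice is demonstrated when every capability shows an outcome."""
    return all(
        any(o.description in observed_outcomes for o in cap.outcomes)
        for cap in bp.capabilities
    )

# Hypothetical example: one best practice with a single capability.
bp = BestPractice(
    name="Standardize project risk management",
    domain="project management",
    stage="Standardize",
    capabilities=[
        Capability("identify risks",
                   [Outcome("risk register exists", ["% of projects with a register"])]),
    ],
)
print(best_practice_exists(bp, {"risk register exists"}))  # True
```

The sketch mirrors the chain described above: best practices are demonstrated through capabilities, capabilities are evidenced by outcomes, and outcomes are measured by KPIs.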
CMMI
CMMI embodies five 'maturity levels', each a layer in the foundation for ongoing
process improvement. Because each maturity level forms a necessary foundation for
the next, an organization cannot achieve, for instance, maturity level 3 if it has
not yet achieved level 2. CMMI describes 'dependencies' between pairs of adjacent
maturity levels.
In CMMI, each maturity level is represented by several 'process areas'; for an
organization to achieve a particular maturity level, the corresponding process areas
must have achieved that maturity level.
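The staged level-determination rule described above can be illustrated as follows. The mapping of process areas to levels is abbreviated and partly hypothetical, so the sketch only mirrors the principle, not the official CMMI appraisal rules:

```python
# Illustrative, abbreviated mapping (not the full CMMI model):
# each maturity level above 1 requires a set of process areas.
PROCESS_AREAS_PER_LEVEL = {
    2: {"Project Planning", "Requirements Management"},
    3: {"Risk Management", "Integrated Project Management"},
    4: {"Quantitative Project Management"},
    5: {"Organizational Innovation"},
}

def maturity_level(satisfied):
    """Highest level whose process areas, and all lower levels', are satisfied."""
    level = 1  # level 1 ('initial') has no required process areas
    for lvl in sorted(PROCESS_AREAS_PER_LEVEL):
        if PROCESS_AREAS_PER_LEVEL[lvl] <= satisfied:
            level = lvl
        else:
            break  # levels cannot be skipped: each builds on the previous one
    return level

print(maturity_level({"Project Planning", "Requirements Management"}))  # 2
```

The `break` encodes the dependency between adjacent levels: satisfying level-3 process areas counts for nothing while a level-2 area is still missing.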
The maturity model describes 'generic goals' and 'specific goals' that guide the
process of bringing a process area to a higher maturity level. Generic goals can be
PMMM
Just like CMMI, PMMM also embodies five 'maturity levels'. In the book by Kerzner
[15] where this model is described, each maturity level is equipped with
explanations about 'roadblocks', 'risks', 'advancement criteria' and an 'assessment
instrument'. The first three concepts represent things an organization needs to know
before it can achieve that level and advance to the next. Each level is accompanied
by an assessment instrument in the form of a questionnaire that organizations can
use to assess the degree to which they have achieved that level. Organizations can
conduct the assessment manually using Kerzner's book, but PMMM also offers an online
version of the assessment instruments. Those who have used the online assessment
tool can also compare their assessment scores with the scores of other organizations
stored in IIL's 'benchmarking database'.
In the first process phase, there are several differences between the three maturity
models for PM. First of all, during an OPM3 assessment, organizations usually
familiarize themselves with the model and conduct a self-assessment to determine the
necessity for a rigorous OPM3 assessment involving certified assessors. In contrast,
the assessment methods of CMMI and PMMM do not include these two activities; the
SEI, for instance, does not provide any tools for CMMI self-assessments. The
remaining activities in the first phase are present in the assessment methods of
both OPM3 and CMMI. Both models require assessors to consult documents and conduct
interviews to gather information in the second phase, and activities 1.3 to 1.9 are
carried out to make the necessary preparations. PMMM, on the other hand, contains
only activities regarding the requirements analysis, the selection of participants
and the development of an assessment plan. Furthermore, no self-assessment is
included in its assessment method, because the assessment method of PMMM only
requires organizations to interact with an online tool, which takes place during the
second phase (2.5). The IIL provides an online as well as an offline version of the
same questionnaire for the assessment. Organizations may use the offline version for
a self-assessment, but unlike OPM3's self-assessment tool, no explanation is
provided for the results found.
Furthermore, while OPM3 and CMMI assessors are responsible for deciding on the
adequate number and appropriate roles of assessment participants, IIL has no control
over who ultimately interacts with the PMMM online assessment tool. In this latter
case, it is the responsibility of the client organization to select the right number
of participants in particular roles to include in the scoped sample.
During a CMMI assessment, the gathered data is documented and then used to manually
determine the maturity level of the organization. During an OPM3 assessment, this
data is entered into a tool by lead assessors so that the assessment results are
generated automatically. In both cases, a final report is then delivered to the
assessed organization containing the aggregated results and corresponding
explanations. The PMMM assessment differs from this by allowing assessment
participants to retrieve their individual scores and compare them to
Finally, while CMMI's assessment method ends after the assessors deliver the final
report to the assessed organization, the OPM3 assessment continues until an
improvement plan is developed containing prioritized improvement initiatives. This
is because OPM3 assessments are meant to help organizations realize improvements.
CMMI assessments do not include these activities because a CMMI assessment is not
necessarily conducted with the goal of realizing improvements. Organizations may
undergo CMMI assessments just to rank themselves on a standardized model, without
the intention to improve. This is not possible with OPM3 because it does not define
strict maturity levels.
In the conduct assessment phase, CMMI's method describes more thoroughly the
categories and types of data gathered during the assessment (2.2-2.5), while OPM3
summarizes them into one concept (2.1). PMMM does not employ these concepts at all
because, unlike OPM3 and CMMI, its assessment does not involve assessors who have to
gather the right data from the right sources.
In the activity comparison table, it was evident that PMMM allows participants to
access their individual scores. This explains the corresponding concepts in the
second phase of the concept comparison table (2.8-2.10). The differences found in
concepts 2.12-2.16 between OPM3 and CMMI can be explained by the differences in the
concepts used to describe maturity (see previous section).
Unlike with PMMM, the OPM3 and CMMI benchmarking scores are not provided during the
maturity assessment, which explains why the benchmark-related concepts (2.17-2.19)
only have corresponding symbols in the PMMM column.
Finally, the OPM3 model is the only one describing concepts related to the
development of an improvement plan in the fourth phase. The reasons for this were
explained above.
assess whoever it wants to assess. Considering that participants can look into their
individual scores, the model allows for individual assessments and development for
all members of a project-based organization. The downside of this, however, is that
unless a large sample is involved, the aggregate scores will be affected by the
choice of participants. Furthermore, with an online website and questionnaire as the
medium during the assessment, no account is taken of the environment of an
organization or other factors when generating the results. Organizations therefore
have to contemplate whether the results and improvement suggestions are applicable
to their specific situation.
During OPM3 and CMMI assessments, the assessors are responsible for selecting,
preparing and interviewing the appropriate members based on their experience,
besides consulting documents to gather information. Both models determine the
maturity level of an organization by the practices that it has implemented. The
difference between the two models becomes evident when the gathered data is
processed to generate the assessment results. OPM3 assessors use a tool to process
the gathered data and automatically generate a report containing the assessment
results, while CMMI assessors (following the SCAMPI method) have to do this
manually, through meetings between lead assessors and assessment team members. All
things being equal, this might imply that a CMMI assessment takes more time than an
OPM3 assessment.
Finally, the structure of OPM3 shows that it is a maturity model revolving around
making improvements. What is less evident is that OPM3 does not have standardized
maturity levels, while CMMI can also be used to determine an organization's maturity
level on a standardized scale. PMMM differs from OPM3 in that it defines distinctive
levels of maturity, but it does not require organizations to follow the path to
maturity from 1 to 5. PMMM allows organizations to assess their progress at all five
maturity levels in a relatively simple way.
After all seven maturity reference model criteria are discussed separately, another
table is depicted with the summarized results along with a brief explanation.
The most important similarity here is that the application of the three models is
not limited to the owners of the models. Materials for all three models are
available to the public, although access may require payment in the case of PMMM and
OPM3. The definition documents of CMMI, on the other hand, are freely downloadable
from the Web. Unlike OPM3 and CMMI, PMMM does not require the intervention of
certified assessors to conduct the assessment. Explanations about this and its
possible implications were already given in section 4.1.2.
Regarding the scope of the maturity models, none of them places industry-related
restrictions on the application of the model (see Table 15). As for the size of the
client organizations, OPM3 is the only one that explicitly states that the model can
be applied to organizations of all sizes. An implication of this is that
organizations have to consider for themselves whether a maturity model is applicable
to the size of the scope they have in mind.
OPM3 vs CMMI (similarities):
- The models describe best practices related to project management

OPM3 vs PMMM (similarities):
- The models describe best practices related to project management

CMMI vs OPM3 (differences):
- PM practices within CMMI are described to achieve goals regarding software
development. Conversely, OPM3 describes practices to achieve goals regarding PM

CMMI vs PMMM (similarities):
- The models describe best practices related to project management
- The models do not cover practices related to program and portfolio management
All three models appear at first glance to cover the project management domain, but
it should be noted that CMMI describes project management practices differently than
the other two models. This is because, as mentioned before, CMMI was developed for
software development purposes and not for PM. This model focuses on the management
OPM3 vs CMMI (similarities):
- Process areas and activities are described
- The models do not describe competences that members of an organization should
harbor, or results that could be expected when a maturity level is achieved
- Both models verify the existence of process areas and activities by the products
resulting from them

OPM3 vs PMMM (similarities):
- Process areas and activities are described
- The models do not describe the results that organizations should be able to
achieve at a certain maturity level

CMMI vs OPM3 (differences):
- OPM3 only describes activities (i.e. practices) that should be in place in an
organization. CMMI, on the other hand, also describes process areas, and roles that
should be responsible for executing certain activities
- OPM3 does not describe

CMMI vs PMMM (similarities):
- Process areas and activities are described by both models
- Roles are described as well
- The models do not describe the results that client organizations should be able
to achieve at a certain maturity level
- CMMI does not describe
The fourth criterion is the degree to which a PM maturity model describes the
conditions for achieving a maturity level (see Table 17). None of the three models
appears to describe the project results that should be achieved when a maturity
level has been reached. This is not too surprising considering that many factors can
influence the outcome of a project. A project can still fail even if PM is successfully
carried out within an organization [17], and vice versa. So even if a certain
maturity level is achieved, it does not guarantee the desired outcomes of projects
(or PM). An implication of this is that organizations should judge for themselves
whether outcomes have improved after the adoption of a maturity model for PM (the
models do not measure the value delivered to business stakeholders).
An important difference that is not immediately obvious in the table is the way CMMI
and PMMM describe and employ the notion of process areas. CMMI employs process areas
to indicate which processes within an organization should receive attention at a
particular maturity level in order to achieve certain (software development) goals.
How this differs from PMMM's approach becomes evident when examining the dimensions
of maturity employed by the model (see next criterion). It will also become clear
why PMMM is the only model that describes the competencies of people within an
organization.
OPM3 and CMMI both require an organization to show evidence (i.e. deliverables) of
the existence of practices, while with PMMM the scores given are based entirely on
the answers provided by participants. One important implication of this is the lack
of control over the correctness of the answers given during a PMMM assessment. This
could be compensated for, however, by selecting a large assessment sample.
PMMM vs OPM3 (differences):
- PMMM assesses different dimensions on each consecutive level. OPM3 does not
describe maturity dimensions along which an organization can achieve maturity

PMMM vs CMMI (differences):
- PMMM assesses different dimensions of PM maturity on each consecutive level. CMMI
does not describe maturity dimensions along which an organization can achieve
maturity
CMMI describes maturity by the capability levels of processes. OPM3 focuses on the
improvement of processes in terms of best practices. And PMMM assesses several
dimensions, each at a different maturity level. At level 1, the model looks at the
knowledge of people within an organization regarding basic project management
terminology. At level 2, it measures the degree to which processes are made common
throughout an organization. Level 3 is about combining all corporate methodologies
into a singular methodology for conducting project management. At level 4, PMMM
measures whether an organization is aware of the importance of benchmarking and the
degree to which benchmarking is carried out. And finally, level 5 measures the
extent to which an organization improves the singular methodology and business
processes using the information obtained through benchmarking. By examining these
five maturity levels, it becomes apparent that, unlike OPM3 and CMMI, PMMM is a
model developed to help organizations understand the basic requirements for doing
project management.
PMMM vs OPM3 (differences):
- OPM3 describes dependencies between best practices and capabilities, while PMMM
describes dependencies between maturity levels

PMMM vs CMMI (differences):
- PMMM only elaborates dependencies between maturity levels, not process areas.
CMMI describes dependencies between process areas and maturity levels
The process area dependencies criterion was developed to examine whether the PM
maturity models employ process areas to categorize all PM processes at each maturity
level, and whether the realization of one process area depends on the realization of
another. Although PMMM describes dependencies between maturity levels like CMMI
does, these dependencies do not involve process areas. CMMI categorizes processes
into process areas at all maturity levels and describes relationships between them
to indicate their dependencies.
Because OPM3 describes dependencies at a lower level, between components such as
best practices and capabilities, organizations are provided with more guidance when
selecting improvement trajectories. But there is also a downside to this, since more
prescriptive descriptions may restrict an organization's choice of improvement
actions.
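One way to see how such lower-level dependencies guide an improvement trajectory: read them as a directed graph and order the missing components so that prerequisites come first. The capability names below are invented for this sketch and do not come from the OPM3 directories:

```python
from graphlib import TopologicalSorter

# Hypothetical capability dependencies: each key depends on the
# capabilities in its value set (predecessors must be realized first).
dependencies = {
    "develop project charter": set(),
    "develop project plan": {"develop project charter"},
    "track plan vs. actuals": {"develop project plan"},
}

# A feasible improvement trajectory is any topological ordering of the
# dependency graph: prerequisites before the capabilities built on them.
plan = list(TopologicalSorter(dependencies).static_order())
print(plan)
```

The extra guidance and the restriction discussed above are two sides of the same property: the denser the dependency graph, the fewer valid orderings remain for an organization to choose from.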
The above results are summarized in the following table (see Table 21). It is
accompanied by a short explanation of the main similarities and differences between
the maturity models for PM.
Information about all three models is relatively easy to access, with or without
payment, and none of them limits its usage to its owners or to organizations
operating in specific industries. OPM3 is the only maturity model describing
practices of program and portfolio management. CMMI describes project management
practices in a software development context, and PMMM describes project management
practices applicable in organizations that want to learn the basics of conducting
project management.
OPM3 only defines and determines maturity in terms of activities (best practices)
and deliverables, while CMMI and PMMM also look at the roles of people and at
process areas. PMMM is the only model that looks at the competencies of the people
in an organization, and it does not concern itself with deliverables. Furthermore,
none of the maturity models describes the expected results of having achieved a
particular maturity level.
Regarding the dimensions of maturity, only PMMM assesses different dimensions of an
organization at each maturity level. CMMI only looks at the process dimension in
organizations, and OPM3 does not explicitly define stages of maturity or dimensions
in which maturity can be achieved.
Unlike CMMI, OPM3 does not define process areas or dependencies between them.
Instead, it describes dependencies between best practices and the capabilities that
constitute them.
Looking at Table 23, OPM3 and CMMI both provide criteria for selecting (lead)
assessors, while PMMM does not provide guidelines at all. These guidelines
contribute to the reliability of an assessment, since they help rule out factors
that might affect the outcome of the assessment, in this case the competences of the
assessors or participants. By not providing guidelines, an organization risks
selecting participants that do not represent the scope of the assessment.
Both OPM3 and CMMI describe detailed standard procedures that assessors should
follow when conducting assessments. PMMM also provides these descriptions, but at a
lower level of detail, since they are meant to be used by the organizations that
want to undergo an assessment.
Having a detailed description of the assessment method adds to the repeatability of
an assessment, which contributes to the reliability of the outcomes. This is
relevant when maturity scores are compared with each other: differences in maturity
scores become less attributable to factors other than the differences between the
assessed organizations.
Although group discussion can strengthen assessment findings and rule out personal
biases, none of the three assessment methods prescribes it as a method for gathering
assessment data. Both CMMI and OPM3 prescribe interviews and the consultation of
documents to retrieve and verify data (Table 25). The questionnaire provided by
OPM3 is only meant as a self-assessment tool; the data resulting from this
self-assessment can then serve as one of the inputs (i.e. a trigger) for the more
rigorous assessments with external assessors. PMMM provides a single questionnaire
for the assessment and does not describe other methods for retrieving additional
information or for verification.
Differences
- OPM3's questionnaire comprises more questions than CMMI's questionnaire
The questionnaire of OPM3 contains more questions than those of CMMI and PMMM
because each maturity level is made up of multiple components and subcomponents
(best practices and capabilities). Each question in the questionnaire signifies either a
best practice or a capability, and since OPM3 embodies about 600 best practices, it is
not surprising that the project management domain alone results in more than 800
questions. Questionnaires used by CMMI assessors contain less detail, which also
means that assessors have to rely on their own judgment to create appropriate lists of
questions to ask participants during interviews. The same holds for OPM3 assessors.
The difference in the number of questions is also explained by the scope of the
questionnaires used for the evaluation. The 800 questions of OPM3 cover the entire
project management domain across four process maturity levels. The project-
management-related questions in CMMI, however, do not represent one particular
maturity level: the questions of CMMI are grouped according to the 23 process areas
employed by this model, and the process areas relating to project management are
only a subset of the total. The 183 questions of PMMM constitute the standard
questionnaire for project management across all five maturity levels.
The more questions a maturity model provides, the more aspects are considered,
provided that all questions are relevant to the assessment. However, too many
questions may lengthen the assessment process considerably, since it takes time for
assessors to convert hundreds of probe questions into a relevant list of interview
questions.
Since both CMMI and OPM3 employ assessors to conduct the assessment, it is not
surprising that training and certification are provided for these two models.
Organizations using PMMM have to consult the IIL or the available literature [25]
instead of attending training in order to understand PMMM.
No electronic tools are provided for CMMI, while assessors of OPM3 and participants
of PMMM can use electronic tools to calculate the maturity score of an organization.
In addition, OPM3 (Foundation) provides means to conduct self-assessments:
organizations can perform a quick scan of themselves before deciding whether a
full-blown assessment with external assessors is necessary.
During and after a PMMM assessment, organizations can access benchmarking scores
by default. Organizations conducting an OPM3 or CMMI assessment can only request
them after the assessment has ended. In the latter case, organizations receive
anonymous maturity scores aggregated per industry, while PMMM also provides
explanations of the benchmark findings in the elaborated results report. Benchmarking
is important because it allows organizations to compare their maturity scores with
those of similar organizations. There is, however, a downside, since multiple factors
can influence a maturity score: not much information is gained by looking at
industry-related differences alone, so organizations also have to consider factors such
as size or region.
The table below summarizes the above criteria results regarding the assessment
methods.
Length of questionnaire (project management domain): OPM3 800+, CMMI 155-180, PMMM 183
Supportive assessment tools:
(Self-)assessment toolset: OPM3 Yes, CMMI No, PMMM Yes
Training: OPM3 Yes, CMMI Yes, PMMM No
Certification: OPM3 Yes, CMMI Yes, PMMM No
Benchmarking: OPM3 Yes, CMMI Yes, PMMM Yes
None of the three assessment methods poses requirements on the members of
organizations participating in the assessments. However, OPM3 and CMMI do
provide guidelines for the assessors, since they are the ones responsible for selecting
appropriate participants. As PMMM only employs an online questionnaire to gather
information, no requirements are described for assessors; the organizations
themselves are responsible for selecting participants during PMMM assessments.
Because of this, PMMM is the only model that does not include roles in its
assessment method description.
Apart from the self-assessment questionnaire provided by OPM3, both OPM3 and
CMMI use interviews and document consultation to gather data during an
assessment. Both OPM3 and CMMI also provide training sessions and certification
possibilities to help organizations understand the models better.
Finally, all three assessment methods gather maturity results for benchmarking
purposes.
The first half of the questionnaires was developed to gain information about the
interviewee and his or her experiences with the maturity model. The second part,
where interviewees were asked to score the maturity models, was meant to find out
how users rate the selected maturity models for PM in terms of usability and
satisfaction. These scores are only representative, however, if multiple assessors or
members of assessed organizations are interviewed for the same maturity model.
Since this was not the case, the retrieved scores held no meaning and were excluded
from the analysis and from this thesis.
The relevant findings of the interviews are summarized in Appendix D-2, excluding
the information about the interviewees (and their organizations) and the scores given
to the maturity models. Although the retrieved data of this dimension were used to
support the analysis of the previous findings, some additional insight was gained as
well. In particular, the informal meetings with experts added important information to
what was already found, as summarized in the next paragraphs.
According to CMMI, an organization can only achieve level 2 if all process areas of
level 1 have fulfilled the necessary requirements. So even if the organization has
fulfilled all requirements to achieve maturity level 2, it will still be rated as level 1 as
long as one or more requirements for level 1 have not been met. The model applies
the same logic as building with blocks: one cannot place a block on top of a lower
block that is incomplete. Because of this strict distinction between maturity levels,
organizations can use the model to rank themselves on a standard continuum of maturity.
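This building-block logic can be sketched in a few lines. The sketch below is a minimal illustration of the staged idea, not code from the CMMI specification; it assumes level 1 is the default rating and that each higher level has its own set of requirements, with the set of satisfied levels as hypothetical input:

```python
def staged_rating(satisfied_levels: set[int], top: int = 5) -> int:
    """Staged 'building block' logic: the rating is the highest level n
    such that the requirements of every level from 2 up to n are met.
    Level 1 has no requirements here and is the default rating."""
    rating = 1
    for lvl in range(2, top + 1):
        if lvl in satisfied_levels:
            rating = lvl
        else:
            break  # an incomplete lower block stops everything above it
    return rating

# Level-3 and level-4 requirements are met, but the level-2 gap
# caps the rating at level 1:
print(staged_rating({3, 4}))  # → 1
print(staged_rating({2, 3}))  # → 3
```

The `break` is what makes the distinction between levels strict: fulfilled requirements above a gap contribute nothing to the rating.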
OPM3 works with a completely different logic. Since it works with best practices,
an organization scores points for each best practice found to be present. When
determining the organizational maturity at the end of the assessment, no distinction is
made between scores found in the three PM domains (project, program and portfolio
management) or in the four improvement stages (standardize, measure, control and
continuously improve). In other words, all scores are added together. This is
analogous to filling a bucket with water, where the resulting water level indicates the
organizational maturity. An implication of this is that the organizational maturity
score does not provide definite information about what an organization is doing well
or poorly. OPM3 does not explicitly define maturity levels; it provides best practices
that organizations can learn from or adopt to improve their processes.
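The bucket analogy amounts to a straightforward summation, which can be sketched as follows. This is an illustrative toy example; the best-practice entries are hypothetical and not taken from the OPM3 standard:

```python
# Each observed best practice contributes a point, tagged with the PM
# domain and improvement stage it belongs to (hypothetical entries).
observed_best_practices = [
    ("project",   "standardize", 1),
    ("project",   "measure",     1),
    ("program",   "standardize", 1),
    ("portfolio", "control",     1),
]

# The overall maturity score ignores the tags entirely: all points are
# poured into one bucket, so the resulting 'water level' hides where an
# organization is strong or weak.
maturity_score = sum(points for _domain, _stage, points in observed_best_practices)
print(maturity_score)  # → 4
```

Because only the aggregate survives, two organizations with very different strengths can end up with the same score.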
There are two reasons why these two components should be distinguished when
evaluating and comparing maturity models for PM. First, the two components contain
different characteristics that should be measured differently. At the most basic level,
the assessment method describes a process while the reference model description has
a more static nature. Second, a maturity model for PM can have more than one
assessment method to apply its reference model. If a distinction is made between the
two components when developing measures for the framework, the components can
be evaluated separately. This may save the effort of evaluating an entire maturity
model when only the assessment method differs.
3. What are relevant dimensions to the evaluation of maturity models for project-
based management?
Three dimensions were selected to include in the evaluation framework: structure,
applicability and usage. These three dimensions were used to evaluate three existing
maturity models for PM.
The structure dimension was used to reveal the concepts that underlie each maturity
model and the activities that comprise the corresponding assessment method. The
evaluation showed that each maturity reference model employed a different structure
and different concepts. Together, these concepts reflect the approach that a maturity
model takes to define and determine maturity. And by evaluating the structure of the
assessment method, it became clear what kind of procedure is needed to use a
maturity model. Focusing on the structure alone simplified the process of
understanding why a maturity model is built up in a certain way and what effects this
has on its application. In addition, this dimension was able to show where the
differences lie within the maturity models and what implications these differences
have.
could not be retrieved for all three models. This dimension is essential because it
focuses on how users experience the models in practice rather than on what is
described about them in theory (see the second dimension). The retrieved information
was also useful for verifying and explaining the findings of the previous two
dimensions.
4. What are the main similarities and differences between maturity models for
project-based management?
One of the main similarities between maturity models for PM lies in the structure of
their assessment methods. Maturity assessments described by these models always
start with a preparation phase, continue with a conduct-assessment phase, and end
with a finalization or improvement phase, in which a report is delivered or an
improvement plan is developed, respectively. Most activities in the preparation and
finalization phases are similar. The differences between the models lie mainly in the
second phase, where the activities regarding data retrieval and results generation are
described. A model such as PMMM gathers information about an organization using
only one questionnaire, while OPM3 and CMMI require the intervention of external
assessors to conduct interviews. Moreover, the results are generated automatically by
a tool during an OPM3 or PMMM assessment, while they are generated manually
during a CMMI assessment.
Another similarity is that maturity models for PM are all developed for the same goal:
to help organizations that are adopting the project-based way of working. It is in the
way of achieving this goal that these maturity models can differ from each other. For
example, a maturity model may be developed to enable an organization to verify its
current position on a standard continuum of maturity (e.g. CMMI), while another
maturity model may help organizations by suggesting improvement initiatives (e.g.
OPM3). A maturity model may also exist to do both (e.g. PMMM).
Maturity models for PM also differ from each other in the way they define maturity
itself. This has implications for a maturity model's structure (maturity reference
model), because the definition of maturity determines what aspects of an organization
a model finds important when assessing its maturity. For instance, OPM3 defines
maturity in terms of best practices, and CMMI defines maturity in terms of processes
and whether these processes are effective in helping an organization attain
With the above conclusions, the main research question can be answered.
What measures are needed to evaluate the similarities and differences between
maturity models for project-based management?
To measure the maturity models along the structure, applicability and usage
dimensions, the techniques of Process-Data Diagram (PDD) modeling, evaluation
criteria and interviews were used, respectively.
The construction of PDDs was useful in several ways: it enabled the evaluation of the
assessment method and the reference model separately, while still showing the
connection between the two components. This measure made it possible to depict the
structure of a maturity model in a simple way, and the comparison tables developed
from the PDDs made it easy to reveal the position and nature of the similarities and
differences between the maturity models. For instance, it showed that most of the
differences between the assessment methods of OPM3, CMMI and PMMM lay in
their descriptions of how the gathered assessment data should be processed to
generate the results, and that the most basic similarity is that all three assessment
methods have a ‘preparation phase’, ‘conduct assessment phase’ and ‘finalize
assessment phase’.
The evaluation criteria were configured to evaluate specific properties affecting the
applicability of the maturity models. Of the 14 properties, openness appears to be the
most important property of a maturity model, because it determines whether
information can be retrieved to evaluate the model on the remaining properties of the
list. Properties such as industry & size and scope were useful in that they may
describe limitations to the context in which a maturity model can be applied. Criteria
such as maturity level description, assessment method description, data gathering
method and supportive assessment tools revealed aspects related to the ease of use of
the maturity models. Criteria such as maturity dimension and process areas
are only useful if the evaluated maturity models employ maturity dimensions and
process areas, and if there is enough time to examine the different definitions each
maturity model uses to describe them. Nevertheless, these criteria shed light on the
aspects in which organizations can achieve maturity according to a maturity model,
so they should be included in the framework when possible. Finally, criteria such as
the length of the questionnaire and assessment commitment contributed little to
distinguishing one maturity model from another; these criteria can be excluded from
further usage.
As for the third measure, few interviews were conducted due to the lack of contacts
and the unavailability of people from both user groups for each of the three models.
However, the retrieved data still provided additional insight into the maturity models
that was not found with the previous two measures (section 4.3). Thus, the arguments
for the relevancy of this dimension and measure remain valid. The real value of a
maturity model lies in the eyes of its users, not in the literature or publications. And
interviews are more useful than a survey because they enable the collection of more
in-depth information, such as user experiences.
In the end, the purpose was to find out whether the above measures were suitable for
describing the three dimensions and showing similarities and differences between
maturity models for PM. After applying the framework to the three maturity models,
modeling PDDs and using evaluation criteria proved to be useful measures for the
corresponding dimensions. And regardless of the small number of formal interviews,
the useful data gathered by speaking with assessors and a user demonstrates the
adequacy of the third measure.
Completeness
The framework is comprehensive for the analysis and comparison of maturity models
for PM; however, it is not yet complete because few user experiences were gathered
for the third dimension. The framework can be considered complete once more
interviews are conducted with members of both user groups for each maturity model
being evaluated. This may happen when the framework is used by the PMI-NL
Project Group to compare maturity models for PM.
Consistency
The framework is consistent since all activities and concepts are consistently defined
and described throughout the framework.
Applicability
This thesis has described the initial application of the framework. The applicability of
the framework will be further determined by the PMI-NL Project Group when using
the framework to compare other maturity models for PM.
Reliability
The reliability of the framework is determined by the way the framework is
constructed and by the sources consulted for information. The framework would not
be reliable if the findings of the three measures contradicted each other. The
correctness of the PDDs, criteria results and interview data has been verified by
experts, and references are provided for every piece of retrieved information. Above
all, the findings of the three measures complement each other well, so the framework
is reliable.
Efficiency
Evaluating maturity models for PM using the framework is efficient because it
focuses on the most important aspects of these models to evaluate and compare.
Costs are only incurred when acquiring information about the models: those wanting
to compare maturity models do not have to search for and purchase literature
aimlessly, since the framework already describes what information is needed. License
costs can be avoided, since the framework does not enforce the evaluation of models
that are not open to the public. Modeling PDDs can be done using Microsoft Visio,
which is not expensive to purchase.
Using the framework might still require some effort and time, since evaluating on the
applicability dimension requires thorough desktop research with the gathered
information about the models.
Ultimately, the application of the framework is not limited to maturity models for the
PM discipline. The only exceptions are several criteria for the applicability dimension
(e.g. scope, process areas); the remaining criteria may be used to evaluate maturity
models of other disciplines. PDDs can be used to model maturity reference models
and assessment methods regardless of the discipline, and the same holds for
interviews with user groups for the usage dimension.
…can decide for themselves whether the results are applicable to their own specific
situation and have a high tolerance for the reliability of the results.
Because PMMM uses only an online questionnaire to gather information, there is no
guarantee that the assessment results are aligned with the situation of the
organization. The reliability of the results is also questionable, since the organizations
themselves are responsible for selecting the respondents who fill out the
questionnaire. This is the trade-off for the simplicity and cost savings this model
provides.
Due to the unavailability of the needed resources, models such as P3M3 and MINCE2
could not be included in the evaluation. While OPM3 is one of the most prominent
maturity models in America and is based on the PMBOK, P3M3 is a well-known
model in Europe based on PRINCE2. Had P3M3 been included in the research, it
would have been interesting to see the differences between these two models. And
because MINCE2 is one of the newest and most accessible maturity models
developed, it would have been interesting to evaluate as well. It is now the task of the
PMI-NL Project Group and future researchers to examine these two maturity models.
Above all, further research should focus on the possible ‘fit’ between different
maturity models for PM and different types of organizations, and on the effects this
‘fit’ might have on the performance of an organization.
References
[1] Ten Zweege, H.C., De Koning, M.C. & Bons, F. (2006). PPI Project Performance
Improvement. Succesvolle projecten zijn geen toeval. The Hague, The Netherlands: Sdu
Uitgevers.
[2] Kwak, Y.H. & Ibbs, C.W. (2002). Project management process maturity (PM)2 Model
[electronic version]. Journal of Management Engineering, 18(3), 150-155.
[3] Pennypacker, J.S. & Grant, K.P. (2003). Project management maturity: an industry benchmark
[electronic version]. Project Management Journal, 34(1), 4-11.
[4] Fiedler, F.E. (1972). The effects of leadership training and experience: a contingency model
interpretation [electronic version]. Administrative Science Quarterly, 17(4), 453-470.
[5] Wikipedia.org (2007). Fiedler contingency model. Retrieved August 27, 2007 from
Wikipedia.org:
http://en.wikipedia.org/wiki/Fiedler_contingency_model
[6] Lin, W.T. & Shao, B.B.M. (2000). The relationship between user participation and system
success: a simultaneous contingency approach [electronic version]. Information & Management,
37, 283-295.
[7] Beach, L.R. & Mitchell, T.R. (1978). A contingency model for the selection of decision
strategies [electronic version]. The Academy of Management Review, 3(3), 439-449.
[8] Shenhar, A.J. (2001). One size does not fit all projects: exploring classical contingency domains
[electronic version]. Management Science, 47(3), 394-414.
[9] Ginzberg, M.J. (1980). An organizational contingencies view of accounting and information
systems implementation [electronic version]. Accounting, Organization and Society, 5(4), 369-
382.
[10] Association for Information Systems. (2007). Theories used in IS research: contingency theory.
Retrieved August 16, 2007, from the World Wide Web:
http://www.istheory.yorku.ca/contingencytheory.htm
[11] Brinkkemper, S., Saeki, M. & Harmsen, F. (1999). Meta-modelling based assembly techniques
for situational method engineering [electronic version]. Information Systems, 24(3), 209-228.
[12] Project Management Institute [PMI]. (2003). Organizational Project Management Maturity,
knowledge foundation. Newton Square, Pennsylvania USA: PMI.
[13] Project Management Institute [PMI]. (2004). A guide to the project management body of
knowledge (3rd Ed.). Newton Square, Pennsylvania USA: PMI.
[14] Wikipedia, 2007. “Process management”. Retrieved May 20, 2007, from Wikipedia.org:
http://en.wikipedia.org/wiki/Process_management
[15] Wikipedia, 2007. “Program management”. Retrieved August 15, 2007, from Wikipedia.org:
http://en.wikipedia.org/wiki/Project_management
[16] Grant, K.P. & Pennypacker, J.S. (2006). Project management maturity: an assessment of project
management capabilities among and between selected industries [electronic version]. IEEE
Transactions on Engineering Management, 53(1), 59-68.
[17] De Wit, A. (1988). Measurement of project success [electronic version]. Project Management
Journal, 6(3), 164-170.
[18] Dvir, D., Raz, T. & Shenhar, A.J. (2003). An empirical analysis of the relationships between
project planning and project success [electronic version]. International Journal of Project
Management, 21, 89-95.
[19] Storm, P.M. & Janssen, R.E. (2004). High performance projects – a speculative model for
measuring and predicting project success. Conference paper submitted to IRNOP VI Project
Research Conference. Retrieved November 11, 2006 from www.ou.nl:
http://www.ou.nl/Docs/Faculteiten/MW/MW%20Working%20Papers/gr%2004-
04%20Storm%20en%20Janssen.pdf
[20] Khang, D.B. & Moe, T.L. (AIT - SOM working paper, August 2006). Success criteria and
factors for international development projects: a lifecycle-based framework [electronic version].
[22] Johnson, J., Boucher, K.D., Connors, K. & Robinson, J. (2001). Project management: the criteria
for success [electronic version]. Software Magazine, Feb 2001.
http://findarticles.com/p/articles/mi_m0SMG/is_1_21/ai_71562148
[23] Atkinson, R. (1999). Project management: cost, time and quality, two best guesses and a
phenomenon, its time to accept other success criteria [electronic version]. International Journal
of Project Management, 17(6), 337-342.
[24] Clarke, A. (1999). A practical use of key success factors to improve the effectiveness of project
management [electronic version]. International Journal of Project Management, 17(3), 139-
145.
[25] Kerzner, H. (2005). Using the project management maturity model: strategic planning for
project management (2nd Ed.). New Jersey, USA: John Wiley & Sons.
[26] Crawford, J.K. (2002). Project management maturity model: providing a proven path to project
management excellence. Basel, Switzerland: Marcel Dekker, Inc.
[27] Murray, A. & Ward, M. (2006). Capability maturity models – using P3M3 to improve
performance. Outperform, UK. Retrieved 28 November 2006 from outperform.co.uk:
http://www.outperform.co.uk/Portals/0/P3M3%20Performance%20Improvement%201v2-
APP.pdf
[28] Software Engineering Institute [SEI] (August, 2006). CMMI® for development, version 1.2.
Improving processes for better products. Retrieved March 3, 2007 from sei.cmu.edu (searched
with “CMMI for development”):
http://www.sei.cmu.edu/publications/
[29] Koomen, T. & Pol, M. (1998). Improvement of the test process using TPI. Retrieved May 21,
2007 from sogeti.nl:
http://www.sogeti.nl/images/summary%20TPI%20UK%20v1%2E2_tcm6-32137.pdf
[30] Earthy, J. (1999). Usability maturity model: processes. Public document. Retrieved May 21,
2007 from the World Wide Web:
http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/Usability-Maturity-
Model%5B2%5D.PDF
[31] Cooke-Davies, T.J. (2004). Measurement of organizational maturity: what are the relevant
questions about maturity and metrics for a project-based organization to ask, and what do these
imply for project management research? In D.P. Slevin, D.I. Cleland & J.K. Pinto (Eds.),
Innovations: project management research 2004 (pp. 211-228). Newton Square, Pennsylvania
USA: PMI.
[32] Office of Government Commerce [OGC]. (2006). Portfolio, programme & project management
maturity model (P3M3). Crown copyright. Retrieved 28 November 2006 from the World Wide
Web:
http://www.ogc.gov.uk/documents/p3m3.pdf
[34] International Institute for Learning [IIL]. (2007, June 14). Kerzner project management maturity
model online assessment. Retrieved June 14, 2007, from the World Wide Web:
http://www.iil.com/pm/kpmmm/
[35] Schlichter, J. (2000). Organizational project management maturity model program plan.
Retrieved February 26, 2007, from the World Wide Web:
https://committees.standards.org.au/COMMITTEES/IT-030/Z0005/IT-030-Z0005.PDF
[36] DNV (2007). OPM3 standard knowledge course. Retrieved June 6, 2007 from dnv.nl:
http://www.dnv.nl/dnvtraining/categorie/project_management/OPM3_standard_knowledge_cou
rse.asp
[37] Software Engineering Institute [SEI] (August, 2006). Standard CMMI® Assessment Method for
Process Improvement (SCAMPIsm) A, Version 1.2: Method definition document. Retrieved
March 3, 2007 from sei.cmu.edu (searched with “SCAMPI”):
http://www.sei.cmu.edu/publications/
[38] Hong, S., Van Den Goor, G. & Brinkkemper, S. (1993). A formal approach to the comparison of
object-oriented analysis and design methodologies [electronic version]. Proceeding of the
Twenty-Sixth Hawaii International Conference, 4, 689-698.
[39] Weerd, I. van de & Brinkkemper, S. (2007). Meta-modeling for situational analysis and design
methods [electronic version]. To appear in the Handbook of Research on Modern Systems
Analysis and Design Technologies and Applications. Hershey, PA, USA: Idea Group
Publishing.
[40] White, D. & Fortune, J. (2002). Current practice in project management – an empirical study
[electronic version]. International Journal of Project Management, 20, 1-11.
[41] Wikipedia.org (2007). “ITIL”. Retrieved September 10, 2007 from Wikipedia.org:
http://nl.wikipedia.org/wiki/Information_Technology_Infrastructure_Library
[42] Dalkey, N. & Helmer, O. (1963). An experimental application of the Delphi method to the use
of experts [electronic version]. Management Science, 9(3), 458-467.
[43] Project Management Institute [PMI] (2006). OPM3 ProductSuite Assessor Training. PMI Inc.
[44] Project Management Institute [PMI] (2007). OPM3 Consultant Training Program. PMI Inc.
[45] International Institute for Learning [IIL] (2007). Kerzner project management maturity
assessment: level 1-5 assessment report for ABC company [sample report]. Provided June 13,
2007 by C. Damiba, IIL NY.
[46] Project Management Institute [PMI] (2007). Organizational project management maturity model
(OPM3). Retrieved April 15, 2007, from the World Wide Web:
http://opm3online.pmi.org/Default.aspx
[47] Project Management Institute [PMI] (2004). An executive’s guide to OPM3. Retrieved June 5,
2007 from pmi.org:
http://opm3online.pmi.org/demo/downloads/PP_OPM3ExecGuide.pdf
[48] Software Engineering Institute [SEI] (2007). SEI’s official CMMI website: publications.
Retrieved April 14, 2007 from the World Wide Web:
http://www.sei.cmu.edu/publications/
[49] Software Engineering Institute [SEI] (August, 2006). Appraisal requirements for CMMI®,
Version 1.2. Retrieved March 3, 2007 from sei.cmu.edu (searched with “appraisal
requirements”):
http://www.sei.cmu.edu/publications/
APPENDIX
OPM3
- Chris ten Zweege (OPM3 assessor, Capgemini)
- Lex van der Helm (OPM3 assessor, Capgemini)
- Raimond Wets (OPM3 assessor, Capgemini)
CMMI
- Ben Kooistra (CMMI assessor/IT architect, Capgemini)
- Gerhard Lutjenhuis (CMMI lead assessor/PM auditor, Capgemini)
- Ahmet Ersahin (senior consultant, Capgemini)
- Bert den Ouden (manager change management Schade & Inkomen, ING)
PMMM
- Christian Damiba (director Methodology and Assessment Solutions, IIL)
Activities
Activities can be subdivided into standard activities and composite activities. The latter can, in turn, be subdivided into 'open' and 'closed' activities. An 'open' activity is a composite activity whose (sub-)activities are elaborated, while a 'closed' activity is one whose (sub-)activities are not shown. To increase the readability and understandability of the PDDs, the framework only uses the notation of standard activities and open activities when modeling the PM maturity models. Also, although they are officially called activities, the framework will from here on use the term 'processes' for all open activities and 'activities' to indicate (standard) sub-activities (see Figure 1 below).
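As an informal illustration of this terminology, the process/activity hierarchy can be sketched as a small object model. The class and example names below are mine, not part of the PDD notation itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """A standard (sub-)activity within a process."""
    name: str

@dataclass
class Process:
    """An 'open' composite activity: its sub-activities are elaborated."""
    name: str
    activities: List[Activity] = field(default_factory=list)

# Hypothetical example mirroring an assessment process:
prepare = Process("Prepare assessment", [
    Activity("Familiarize with maturity model"),
    Activity("Perform self-assessment"),
])
print([a.name for a in prepare.activities])
```

A 'closed' activity would simply be a Process whose activities list is not shown when the diagram is rendered; the framework avoids that notation altogether.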
Transitions
There are officially four types of transitions: unordered, sequential, simultaneous and conditional. 'Unordered' means that the activities of a process can be carried out in an arbitrarily chosen order, which is illustrated by the absence of transition arrows between the activities. Sequential activities, on the other hand, must follow a predefined order; the arrows that connect these activities indicate the order in which they should be carried out (see Figure 2).
The concurrent property of simultaneous activities is depicted using two thick black lines, respectively called the 'fork' and the 'join'. The 'fork' initiates the simultaneous flow, in which separate activities are carried out at the same time, and the 'join' closes this flow by uniting all separate transitions into one single transition. It is important to note that the following activity can only be initiated once all simultaneous activities between the 'fork' and 'join' have been completed.
Lastly, the conditional flow is shown as a diamond (a square rotated 45 degrees) that splits an incoming transition into two or more conditional transitions. A conditional transition is accompanied by a condition written between brackets (see Figure 2). This condition has to be formulated in such a way that it can be evaluated as 'true' or 'false'.
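The four transition types describe control flow, so they can be mimicked in ordinary code. The sketch below is illustrative only; the activity names are hypothetical and the threading is merely one way to realize a fork/join.

```python
from concurrent.futures import ThreadPoolExecutor

def familiarize():
    return "familiarized with model"

def perform_self_assessment():
    return "self-assessment report"

# Sequential: the activities must follow a predefined order.
results = [familiarize(), perform_self_assessment()]

# Simultaneous ('fork'/'join'): both activities run at the same time; the
# join guarantees the next activity starts only after all branches finish.
with ThreadPoolExecutor() as pool:
    branches = [pool.submit(familiarize), pool.submit(perform_self_assessment)]
    joined = [b.result() for b in branches]  # the 'join'

# Conditional: the condition must evaluate to true or false.
pursue_rigorous = "report" in joined[1]
next_activity = "Perform assessment" if pursue_rigorous else "Stop"
```

Unordered transitions have no code analogue beyond "any permutation of the calls is acceptable".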
Concepts
Concepts are depicted as rectangles and written in capitals, using singular nouns, both within the meta-model and outside it.
Relationships
Generalization is a type of relationship between a general concept and a more specific concept. It is illustrated with a directed line with an open arrowhead pointing at the general concept. A generalization is also called an 'is-a' relationship. The generalization relationship shown in Figure 4 below can be read as "a survey/interview is a data source".
An association is a structural relationship that describes how concepts are connected to each other. It is depicted as a simple undirected line between two or more concepts. The connection represented by an association is expressed as an active verb together with a black triangle that indicates the direction in which the connection should be read. Thus, the association depicted in Figure 4 can be read as "a maturity score determines a maturity".
Lastly, an aggregation connotes a relationship between a concept 'as a whole' and other concepts 'as a part of it'. This relationship is also known as a 'has-a' relationship. It is shown as a directed line with an open diamond pointing at the concept 'as a whole', which is also an 'open' concept. In the example below, the aggregation relationship can be read as "a plan has an action and (has a) schedule".
Multiplicity
Besides a name and a direction, an association has another property worth mentioning: multiplicity. This property indicates how many instances of a certain concept are connected with an instance of another concept. Multiplicity is expressed numerically at each end of an association; the different forms of multiplicity are shown in the table below. In Figure 5, the association example of Figure 1 is repeated, but this time with expressions of multiplicity. According to this example, one or more instances of maturity score determine precisely one instance of maturity, assuming the assessment of one single organization.
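A 1..* multiplicity can be enforced in code as a constructor invariant. The sketch below mirrors the SCORE-determines-MATURITY example; the class names and the averaging rule are illustrative, not taken from any of the maturity models.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MaturityScore:
    value: float

@dataclass
class Maturity:
    # The 1..* end of the association: at least one SCORE instance is
    # connected to exactly one MATURITY instance (a single organization).
    scores: List[MaturityScore]

    def __post_init__(self):
        if not self.scores:
            raise ValueError("multiplicity 1..* requires at least one score")

    def overall(self) -> float:
        # Illustrative aggregation: the mean of the connected scores.
        return sum(s.value for s in self.scores) / len(self.scores)

maturity = Maturity([MaturityScore(0.5), MaturityScore(0.75)])
print(maturity.overall())  # 0.625
```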
[Figure: PDD example fragment. Activities: Assess maturity, Assess process, Determine maturity score. Concepts: MATURITY and SCORE, connected by the association 'determines►'.]
Questionnaire user
About the user
1. How long have you had experience with the maturity model?
Questionnaire assessor
About the assessor
1. How long have you had experience with the maturity model?
2. Since when have you been a certified assessor for the model?
3. How often have you been involved in an assessment since certification?
4. In which industries have you applied the assessment?
5. In which types and sizes of organizations have you applied the assessment?
6. Have you performed repeated assessments at the same organizations?
CMMI
What was the reason for applying the model and what were the expectations/objectives?
- to improve the controllability of processes
What does management think of the model? Do they strongly support it? Do they actively support the initiative?
Management is convinced of the benefits and supports the initiative by explaining to everyone in the organization during the kick-off what the purpose is and what it is good for. You measure management commitment by looking at who shows up at the kick-off, the assessment and the coaching sessions. Consistency is also very important: if you demand that a process must be present and it is missing, then you must not accept that. And once you get the results back from the assessment, you also have to act on them.
…because they are often unable to look at their own practices and processes from a distance.
A weak point of CMMI is that the model is somewhat too theoretical in its description of the activities. They are sometimes described very generically, which occasionally leaves one wondering what exactly the model means. The practices are sometimes described insufficiently concretely. CMMI does say what you have to do, but it leaves open how you have to do it. A disadvantage of this is that an organization has to spend a lot of time and effort on establishing the 'how' and acting accordingly. On the other hand, there are also disadvantages as soon as you impose a model on the employees of an organization: this causes resistance. The fact that CMMI leaves the 'how' open makes the model more difficult to apply in a large organization than in a small one. In small organizations, all employees can contribute to defining processes in accordance with CMMI; this is feasible. But with a group of a few hundred people it is more difficult. In that case, a model that describes the 'what', 'who', 'how' and 'why' would be useful.
OPM3
Is there a standard procedure for an assessment?
During the OPM3 training for assessors, a standard method is described that must be applied during an assessment.
A weak point of OPM3 is that it is strongly based on the PMBOK, as a result of which an organization that works according to another project management method, such as PRINCE2, has more difficulty translating the recommendations in the report so that it can make use of them.
Other remarks
OPM3 goes further than merely delivering a report after an assessment. Employees of an organization who want to be trained as OPM3 assessors also have the option of taking a course to become an OPM3 consultant. Most OPM3 assessors are also trained as OPM3 consultants, so that after delivering the assessment results they can support an organization in drawing up a plan of approach for improvement trajectories.
OPM3
[Figure: process-deliverable diagram (PDD) of the OPM3 assessment process. Processes and activities: 1. Prepare assessment (Familiarize with OPM3; Perform self-assessment); 2. Perform assessment (Determine depth and breadth of rigorous assessment; Develop assessment plan; Prepare for data collection; Verify assessment findings; Enter findings in rigorous assessment tool; Generate & analyze data; Prepare final report); 3. Plan for improvement (Select & prioritize improvement initiatives; Develop improvement plan). Main concepts: FOUNDATION, ASSESSMENT, TRIGGER, SELF-ASSESSMENT TOOL, SELF-ASSESSMENT REPORT, ORGANIZATION DESCRIPTION, COMMUNICATION PLAN, ENGAGEMENT DESCRIPTION, ASSESSMENT PLAN, DATA COLLECTION PLAN, PROTOCOL, PM DOMAIN, PROCESS IMPROVEMENT STAGE, BEST PRACTICES DIRECTORY, BEST PRACTICE, ORGANIZATIONAL ENABLER, PROCESS BEST PRACTICE, CAPABILITY, OUTCOME, KEY PERFORMANCE INDICATOR, OUTCOME SCORE, CAPABILITY SCORE, BEST PRACTICE SCORE, RESULTS ANALYSIS, ASSESSMENT RESULTS (OPM maturity score), FINAL REPORT, MATURITY PROFILE, MATURITY DATABASE, IMPROVEMENT TRIGGER, DEPENDENCY, IMPROVEMENT PLANNING DIRECTORY, FACTOR, PRIORITY, INITIATIVE, ATTAINABILITY, STRATEGIC PRIORITY, SCHEDULE.]
CMMI
[Figure: process-deliverable diagram (PDD) of the CMMI/SCAMPI appraisal process (fragment). Process and activities: 2. Conduct appraisal (Prepare participants; Examine objective evidence; Document objective evidence; Verify objective evidence; Package and archive appraisal assets). Main concepts: PRACTICE IMPLEMENTATION INDICATOR, ARTIFACT, AFFIRMATION, OBJECTIVE EVIDENCE, INTERVIEW, DOCUMENT, PRACTICE RATING, PRACTICE (SPECIFIC PRACTICE, GENERIC PRACTICE), IMPROVEMENT SUGGESTION, MATURITY PROFILE DATABASE, APPRAISAL RECORD, MATURITY PROFILE.]
PMMM
[Figure: process-deliverable diagram (PDD) of the PMMM assessment process. Processes and activities: 1. Initiation (Determine assessment scope); 2. Assessment (Fill in online questionnaire; Generate benchmark scores; Generate assessment results summary); 3. Assessment report development & delivery (Generate & deliver elaborate assessment report). Main concepts: PMMM, ASSESSMENT PLAN, SCOPE, ROADBLOCK, ADVANCEMENT CRITERIA, ASSESSMENT INSTRUMENT, INDIVIDUAL SCORE, BENCHMARKING DATABASE, ORGANIZATIONAL SCORE, BENCHMARK SCORE, INDUSTRY SCORE, SIZE SCORE, ASSESSMENT RESULTS SUMMARY, COMPANY BACKGROUND, RESULTS ANALYSIS, ELABORATE ASSESSMENT REPORT, SUGGESTED ACTION, BENCHMARK COMPARISON.]
OPM3
Note: Two of the sources for the definitions of the PDD activities and concepts are official OPM3 training materials. These sources are subdivided into modules (chapters), each with its own separate page numbering. When referring to particular pages, a capital M indicates the module number, followed by the actual page numbers.
MATURITY PROFILE: A profile containing information regarding the 'current' maturity of an organization. A basic maturity profile includes:
- achieved BEST PRACTICES
- achieved CAPABILITIES
- tool-generated SCORES
([44], M.3, p.8)
IMPROVEMENT TRIGGER: The needs of the organization and the reason for choosing to pursue organizational improvements leading to increased maturity. ([12], p.9)
IMPROVEMENT PLAN: An improvement plan provides all of the key information relevant to the purpose and contents of the improvement project. This plan typically includes descriptions of: the organization, the purpose and objectives of the improvement, the process used to achieve the improvement plan, the implementation strategy, the list of best practices to be improved, the schedule, risks and constraints, and (as a reference) the final report of the assessment. ([43], M.4, p.17)
PRIORITY: A right to precedence given to an improvement initiative. ([12], pp.39-40)
INITIATIVE: A leading action to implement a capability that leads to the realization of improvements and increased organizational project management maturity. ([12], pp.39-40)
SCHEDULE: A scheme stating the time allocated to improvement initiatives, the order in which they should be carried out (priority) and the resources needed. ([12], pp.39-40)
FACTOR: An element that may influence the prioritization of planned improvement initiatives for optimum use of resources. ([12], p.39)
ATTAINABILITY: A factor expressing the degree to which an improvement initiative is achievable. This consideration can help the organization demonstrate early success and gain valuable momentum to sustain the improvement initiative. ([12], p.40)
STRATEGIC PRIORITY: A factor describing how essential an improvement initiative is to an organization's strategy. ([12], p.40)
BENEFIT: A factor expressing the advantage of an improvement initiative. ([12], p.40)
COST: A factor indicating the expenditures connected to an improvement initiative. ([12], p.40)
OPM3: A standard developed under the stewardship of the Project Management Institute. The purpose of this standard is to provide a way for organizations to understand organizational project management and to measure their maturity against a comprehensive and broad-based set of organizational project management best practices. ([12], p.xiii)
FOUNDATION: Narrative text presenting the OPM3 foundational concepts, with various appendices and a glossary. ([12], p.xiv)
SELF-ASSESSMENT TOOL: A tool in support of the self-assessment step outlined in the OPM3 standard. ([12], p.9)
BEST PRACTICES DIRECTORY: One of the three directories necessary to assess an organization against OPM3 and evaluate the scope and sequence of possible improvements. The best practices directory provides the names and brief descriptions of nearly 600 best practices. ([12], pp.31-32)
CAPABILITIES DIRECTORY: One of the three directories necessary to assess an organization against OPM3 and evaluate the scope and sequence of possible improvements. The capabilities directory provides detailed data on all of the capabilities in OPM3, organized according to the best practices with which they are associated. ([12], p.32)
IMPROVEMENT PLANNING DIRECTORY: One of the three directories necessary to assess an organization against OPM3 and evaluate the scope and sequence of possible improvements. This directory is provided to show the dependencies between capabilities, which are essential to the assessment and improvement steps of the OPM3 cycle. Once the organization has identified best practices requiring assessment, this directory will indicate the capabilities leading to each of these best practices, along with any additional capabilities on which they may depend. ([12], p.32)
BEST PRACTICE: A best practice is an optimal way currently recognized by industry to achieve a stated goal or objective. For organizational project management, this includes the ability to deliver projects successfully, consistently, and predictably to implement organization strategies. ([12], p.171)
PM DOMAIN: A domain refers to each of the three domains of PM: project management, program management, and portfolio management. ([12], p.172)
PROCESS GROUP: Project management processes can be organized into five process groups of one or more processes each: initiating processes, planning processes, executing processes, controlling processes and closing processes. Process groups are linked together by the results they produce: the results or outcome of one becomes an input to another. ([13], pp.27-29)
PROCESS IMPROVEMENT STAGE: One of the dimensions along which OPM3 defines organizational project management maturity in terms of best practices. The stages to process improvement employed by OPM3 are: standardize, measure, control, and continuously improve. The sequence implies a prerequisite relationship between the stages, in that the most advanced stage (continuously improve) is dependent on a state of control, which is, in turn, dependent on measurement, which is dependent on standardization. ([12], p.6 & p.19)
ORGANIZATIONAL ENABLER: Organizational enablers are a subset of the OPM3 best practices that relate to the organizational structures and processes necessary to support efficient and effective implementation and operation of the best practices for the project, program and portfolio domains. ([43], M.2)
PROCESS BEST PRACTICE: See the "best practice" definition.
CAPABILITY: A capability is a specific competency that must exist in an organization in order for it to execute project management processes and deliver project management services and products. Capabilities are incremental steps leading to one or more best practices. ([12], p.171)
OUTCOME: An outcome is the tangible or intangible result of applying a capability. The degree to which an outcome is achieved is measured by a key performance indicator. ([12], p.171)
KEY PERFORMANCE INDICATOR: A criterion by which an organization can determine, quantitatively or qualitatively, whether an outcome associated with a capability exists or the degree to which it exists. ([12], p.172)
DEPENDENCY: A dependency is a relationship in which a desired state is contingent upon the achievement of one or more prerequisites. In OPM3, one type of dependency is represented by the series of capabilities that aggregate to a best practice. Another type occurs when the existence of one best practice depends, in part, on the existence of some other best practice. In this case, at least one of the capabilities within the first best practice depends on the existence of one of the capabilities within the other best practice. ([12], p.172)
MATURITY DATABASE: A repository for maturity profiles of organizations that have conducted rigorous OPM3 assessments. This database is managed by the PMI. (confirmed by certified OPM3 assessor, C. ten Zweege)
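The dependency between capabilities and best practices described above can be sketched in code. The names below are hypothetical, and the all-capabilities rule is a simplification: the actual OPM3 scoring is performed by the ProductSuite tool and is not published.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Capability:
    name: str
    achieved: bool  # in OPM3, decided by KPI-measured OUTCOMEs

@dataclass
class BestPractice:
    name: str
    capabilities: List[Capability]  # the capabilities aggregating to it

    @property
    def achieved(self) -> bool:
        # Simplification: a best practice exists once all of its
        # aggregating capabilities exist.
        return all(c.achieved for c in self.capabilities)

bp = BestPractice("Standardize project planning", [  # hypothetical names
    Capability("Process documented", True),
    Capability("Process communicated", False),
])
print(bp.achieved)  # False
```

The second dependency type (best practice on best practice) would add edges between capabilities of different BestPractice instances, which the Improvement Planning Directory records.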
Assessment process activities and their sub-activities:

Prepare assessment
- Familiarize with OPM3: Before an organization decides to conduct a self- or rigorous OPM3 assessment, it has to understand the contents of OPM3 as thoroughly as possible and become familiar with organizational project management and with the operation of OPM3.
- Perform self-assessment: In preparation for a rigorous assessment, the BEST PRACTICES that are and are not currently demonstrated in the organizational unit are identified. This can be done with OPM3's SELF-ASSESSMENT TOOL, but also with a tool developed by organizations themselves or by consulting companies. This activity results in a SELF-ASSESSMENT REPORT, the analysis of which will eventually lead to the decision whether or not there is a need for the organization to conduct a rigorous assessment with the assistance of certified OPM3 assessors.

Perform assessment
- Determine depth and breadth of rigorous assessment: The scope and criteria of the assessment are determined; these are among the elements described in the ENGAGEMENT DESCRIPTION.
- Acquire & prepare team: The ROLES and responsibilities of the assessment TEAM are specified and communicated to all TEAM members.
- Develop assessment plan: The ASSESSMENT PLAN is developed. It is made up of the following parts: ORGANIZATION DESCRIPTION, COMMUNICATION PLAN, ENGAGEMENT DESCRIPTION and TEAM & ROLES.
- Prepare for data collection: The PROTOCOL for conducting the data collection is generated from the OPM3 ProductSuite Assessment Tool and a DATA COLLECTION PLAN is developed.
- Conduct interviews: Interviews are conducted with members of the organization. This sub-activity results in PRELIMINARY ASSESSMENT FINDINGS comprising KEY PERFORMANCE INDICATORS.
- Study records & documents: Records and documents are examined to collect data. This sub-activity results in PRELIMINARY ASSESSMENT FINDINGS comprising KEY PERFORMANCE INDICATORS.
- Verify assessment findings: The PRELIMINARY ASSESSMENT FINDINGS are verified.
- Enter findings in rigorous assessment tool: The verified PRELIMINARY ASSESSMENT FINDINGS are entered into the OPM3 ProductSuite Assessment Tool.
- Generate & analyze results: The OPM3 ProductSuite Assessment Tool processes the entered PRELIMINARY ASSESSMENT FINDINGS and generates ASSESSMENT RESULTS, which are then analyzed.
- Prepare final report: Using the ASSESSMENT RESULTS, a FINAL REPORT is prepared. Based on the FINAL REPORT, the organization decides whether the findings result in an IMPROVEMENT TRIGGER.

Plan for improvement
- Select & prioritize improvement initiatives: Based on the FINAL REPORT, improvement INITIATIVES are given PRIORITIES in the IMPROVEMENT PLAN.
- Develop improvement plan: An IMPROVEMENT PLAN is developed based on the analysis of the improvement INITIATIVES that can be taken to realize improvements in the organization: in which order and when they should be taken, and how.
CMMI
Note: The term 'appraisal' is defined by CMMI as "an examination of one or more processes by a trained team of professionals using an appraisal reference model as the basis for determining, at a minimum, strengths and weaknesses". In CMMI models this term is employed instead of 'assessment', which is defined as "an appraisal done by an organization internally for the purposes of process improvement". In order to stay true to the definitions of SEI regarding CMMI and SCAMPI, the term 'appraisal' is used here instead of 'assessment', while both mean the same throughout this report.
Assessment process concepts:

REQUIREMENT: A piece of information required to plan an appraisal. Examples of requirements include the objective of the appraisal, constraints, scope and output. ([37], p.I_10)
APPRAISAL PLAN: A guide containing all technical and non-technical details of an appraisal. An appraisal plan includes, among others, descriptions of the method to be applied during an appraisal, the resources needed, and the costs and risks to be taken into account. ([37], pp.II_18-II_20)
TEAM: A group of individuals qualified, experienced, trained, available and prepared to execute an appraisal. ([37], pp.II_32-II_34)
INITIAL OBJECTIVE EVIDENCE REVIEW: A report constructed in preparation for the actual appraisal, containing information regarding data (un)availability, additional information needed, operations and processes within the organizational unit, and an initial set of objective evidence (see definition below). ([37], pp.II_48-II_49)
DATA COLLECTION PLAN: A guide containing details about the procedures for collecting data during an appraisal. A data collection plan mainly comprises information about the participants to be consulted, the documents to be reviewed, the responsibilities for data collection activities, and the presentations/demonstrations to be provided to the participants. ([37], pp.II_59-II_62)
OBJECTIVE EVIDENCE: Documents or interview results used as indicators of the implementation or institutionalization of model practices. Sources of objective evidence can include instruments, presentations, documents, and interviews. ([37], p.III_53)
INTERVIEW: A meeting of appraisal team members with appraisal participants for the purpose of gathering information relative to work processes in place. In SCAMPI, this includes face-to-face interaction with those implementing or using the processes within the organizational unit. Interviews are typically held with various groups or individuals, such as project leaders, managers, and practitioners. A combination of formal and informal interviews may be held, and interview scripts or exploratory questions may be developed to elicit the information needed. ([37], p.III_52)
DOCUMENT: A collection of data, regardless of the medium on which it is recorded, that generally has permanence and can be read by humans or machines. Documents can be work products reflecting the implementation of one or more model practices. These typically include work products such as organizational policies, procedures, and implementation-level work products. Documents may be available in hardcopy, softcopy, or accessible via hyperlinks in a Web-based environment. ([37], p.III_50)
ARTIFACT: A tangible form of objective evidence, indicative of work being performed, that is a direct or indirect result of implementing a CMMI model practice. ([37], p.III_49)
AFFIRMATION: An oral or written statement confirming or supporting the implementation (or lack of implementation) of a CMMI model specific practice or generic practice. Affirmations are usually provided by the implementers of the practice and/or internal or external customers, but may also include other stakeholders (e.g., managers, suppliers). ([37], p.III_47)
APPRAISAL RECORD: An orderly, documented collection of information that is pertinent to the appraisal and adds to the understanding and verification of the appraisal findings and ratings generated ([37], p.III_48). An appraisal record contains, among others, the appraisal requirements, the appraisal plan, the objective evidence, all appraisal ratings and the final findings. ([37], pp.II_129-II_131)
FINAL FINDINGS REPORT: The final findings report contains the validated strengths, weaknesses, and ratings (as defined by the appraisal plan), reflecting the organizational maturity level for process areas within the appraisal scope. ([37], pp.II_116-II_121)
APPRAISAL RESULTS: The results of an appraisal comprise the goal satisfaction ratings, the satisfaction ratings of process areas within the appraisal scope, and the derived maturity level rating. ([37], pp.II_106-II_108)
RESULTS ANALYSIS: A report describing the analysis steps taken to determine the APPRAISAL RESULTS. ([37], pp.II_106-II_108)
IMPROVEMENT SUGGESTION: A section in the FINAL FINDINGS REPORT containing suggestions for improvements based on the APPRAISAL RESULTS found. (CMMI appraisal sample report, 2007)
PRACTICE RATING: A value assigned by an appraisal team, indicating the extent to which a practice is implemented throughout the organizational unit. ([37], pp.II_96-II_100)
GOAL RATING: The value assigned by an appraisal team to a CMMI goal. The rating is determined by enacting the defined rating process for the appraisal method being employed. The goal satisfaction rating is based on the extent of practice implementation throughout the organizational unit. ([37], p.I_34 & p.III_48)
PROCESS AREA RATING: The value assigned by an appraisal team to a process area. The process area satisfaction rating is derived from the set of goal satisfaction judgments. ([37], p.II_112 & p.III_48)
MATURITY LEVEL RATING: The value assigned by an appraisal team to the maturity level of an organizational unit. The rating is determined by enacting the defined rating process for the appraisal method being employed. The rating of maturity levels is driven algorithmically by the goal satisfaction ratings. ([37], p.I_34 & p.III_48)
MATURITY PROFILE: A subset of the contents of the appraisal record, as well as other data, used by SEI to aggregate and analyze appraisal performance data for reporting to the community and monitoring the quality of performed appraisals. ([37], p.II_132 & p.III_49)
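The derivation direction described in the rating definitions (goal ratings to process area ratings to maturity level) can be illustrated as a rollup. The process area names, goal labels and the all-satisfied rule below are a deliberate simplification of SCAMPI's actual rating rules, not a reimplementation of them.

```python
# Illustrative rollup: goal ratings -> process area ratings -> maturity level.
goal_ratings = {
    "REQM": {"SG 1": "satisfied", "GG 2": "satisfied"},
    "PP":   {"SG 1": "satisfied", "SG 2": "unsatisfied"},
}

def process_area_rating(goals: dict) -> str:
    # A process area is satisfied only if all of its goals are satisfied.
    return "satisfied" if all(r == "satisfied" for r in goals.values()) else "unsatisfied"

pa_ratings = {pa: process_area_rating(goals) for pa, goals in goal_ratings.items()}

# Simplified rule: maturity level 2 requires every process area in its
# scope to be satisfied; otherwise the unit stays at level 1.
maturity_level = 2 if all(r == "satisfied" for r in pa_ratings.values()) else 1
print(pa_ratings["PP"], maturity_level)  # unsatisfied 1
```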
Assessment
Sub-activity Description
process activity
Plan and Analyze requirements Understand the business needs of the organizational unit for which
prepare for the appraisal is being requested. Information is collected to
appraisal determine appraisal REQUIREMENTS.
Develop appraisal plan An appraisal method is tailored and the needed resources are
identified. The costs and schedule associated with the appraisal are
determined. Logistics are planned and managed and risks are
documented and managed. All this information is documented in the
APPRAISAL PLAN.
Select and prepare team The leader and members of the appraisal TEAM are selected and
prepared.
Obtain & inventory initial Obtain information that facilitates site-specific preparation. Obtain
objective evidence data on model practices used. Potential issue areas, gaps, or risks
are identified to aid in refining the APPRAISAL PLAN. This results
in an INITIAL OBJECTIVE EVIDENCE REVIEW.
Prepare for appraisal Specific data-collection STRATEGIES including sources of data,
conduct tools and technologies to be used, and contingencies to manage
risk of insufficient data are planned and documented in the form of a
DATA COLLECTION PLAN.
Conduct Prepare participants Inform appraisal participants of the purpose of the appraisal and
appraisal prepare them for participation.
Examine objective Activities in accordance with the DATA COLLECTION PLAN are
evidence performed. INTERVIEWS and DOCUMENTS are examined as
OBJECTIVE EVIDENCE to collect information about the practices
implemented in the organizational unit.
Document objective Lasting records of the OBJECTIVE EVIDENCE gathered are
evidence created by identifying and then consolidating notes, transforming
the data into records that document practice implementation, as
well as strengths and weaknesses.
Verify objective evidence The implementation of the organizational unit’s practices for each
instantiation is verified. Each implementation of each practice is
verified so it may be compared to appraisal reference model (i.e.
CMMI) practices, and the TEAM characterized the extent to which
the practices in the model at implemented by means of a
PRACTICE RATING.
Validate preliminary Preliminary findings are validated. Gaps in the implementation of
findings model practices are weaknesses and exemplary implementation of
model practices may be highlighted as strengths in the appraisal
OUTPUTS. Both strengths and weaknesses are validated with
members of the organizational unit.
Generate appraisal APPRAISAL RESULTS are generated based on the validation of
results preliminary appraisal findings. APPRAISAL RESULTS contain
PRACTICE RATINGS, GOAL RATINGS and the MATURITY
LEVEL RATING.
Report results Deliver appraisal results APPRAISAL RESULTS are provided in the form of a FINAL
FINDINGS REPORT to the organizational unit to guide following
actions.
Package and archive Important data and records from the appraisal are preserved in an
appraisal assets APPRAISAL RECORD. Part of this APPRAISAL RECORD is
provided to the CMMI Steward (i.e. SEI) in the form of a MATURITY
PROFILE. Sensitive materials are disposed of in an appropriate
manner.
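The roll-up from PRACTICE RATINGS to GOAL RATINGS to a MATURITY LEVEL RATING described in the activities above can be sketched in code. The rating values and aggregation rules below are simplified assumptions for illustration, not SCAMPI A's exact method, and the function names are invented.

```python
# Hypothetical sketch of the SCAMPI rating roll-up: PRACTICE RATINGS
# aggregate to GOAL RATINGS, which aggregate to a MATURITY LEVEL RATING.
# Rating values and rules are simplified assumptions, not the official method.

def rate_goal(practice_ratings):
    """A goal is Satisfied only if every practice is Fully or Largely Implemented."""
    ok = {"FI", "LI"}  # Fully / Largely Implemented
    return "Satisfied" if all(r in ok for r in practice_ratings) else "Unsatisfied"

def rate_maturity(goals_by_level):
    """Maturity level = highest level whose goals, and all lower levels' goals, are Satisfied."""
    achieved = 1  # level 1 requires no goals in the staged representation
    for level in sorted(goals_by_level):
        if all(rate_goal(p) == "Satisfied" for p in goals_by_level[level].values()):
            achieved = level
        else:
            break  # a failed level blocks all higher levels
    return achieved

# Example: level 2 goals satisfied, one level 3 goal not (PI = Partially Implemented).
appraisal = {
    2: {"SG1": ["FI", "FI"], "SG2": ["LI", "FI"]},
    3: {"SG1": ["FI", "PI"]},
}
print(rate_maturity(appraisal))  # prints 2
```

The key property the sketch captures is that each maturity level builds on the lower ones: an unsatisfied goal at any level caps the rating there, regardless of achievements at higher levels.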
PMMM
Assessment process
Description
concept
ASSESSMENT PLAN A plan containing information about the reasons for conducting the maturity assessment
(DRIVER), the scope of the assessment (SCOPE) and those who will participate in the
assessment (ASSESSMENT SAMPLE). [34]
DRIVER Purpose of the maturity assessment. [34]
SCOPE Information regarding what part(s) of the organization will be subjected to the maturity
assessment. [34]
ASSESSMENT SAMPLE Group of employees of the organization selected to fill out the questionnaire in the
ONLINE ASSESSMENT TOOL. [34]
INDIVIDUAL Assessment findings based on the INDIVIDUAL SCORES of a participant. [34]
ASSESSMENT
RESULTS
INDIVIDUAL SCORE Scores achieved by each participant after filling out the part of the questionnaire
belonging to each MATURITY LEVEL. This score indicates the extent to which a
participant has achieved a certain MATURITY LEVEL. [34]
INDIVIDUAL SCORE Brief report containing analysis results and improvement possibilities based on the
ANALYSIS INDIVIDUAL SCORES. [34]
BENCHMARK SCORE BENCHMARK SCORES allow comparisons between the INDIVIDUAL SCORE and the
scores of other participants of the same organization, organizations of the same industry
sector and organizations of the same size. [34]
ORGANIZATIONAL Aggregate score of the organization within the scope of the
SCORE assessment, calculated from the scores collected so far. [34] Note that this
ORGANIZATIONAL SCORE is only complete when presented in the
ASSESSMENT RESULTS SUMMARY.
INDUSTRY SCORE Aggregate score of organizations in the BENCHMARKING DATABASE operating in the
same industry sector as the organization in scope. [34]
SIZE SCORE Aggregate score of organizations in the BENCHMARKING DATABASE that are of
similar size as the organization in scope. [34]
ASSESSMENT Assessment findings based on the collective scores of all participants in scope after the
RESULTS SUMMARY questionnaire period. [34]
ELABORATE Report in which assessment findings as well as improvement suggestions are
ASSESSMENT REPORT elaborated. The findings are more thoroughly explained in this report compared to the
ASSESSMENT RESULTS SUMMARY. [34]
COMPANY Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL
BACKGROUND when requested. It contains information regarding the organization in scope of the
assessment [45].
RESULTS ANALYSIS Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL
when requested. It contains a detailed analysis of the assessment results [45].
SUGGESTED ACTION Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL
when requested. It contains descriptions of the possible actions that could be taken to
realize improvements within the organization [45].
BENCHMARK Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL
COMPARISON when requested. It contains comparisons between the scores achieved by the
organization in scope and organizations of the same size or operating in the same
industry [45].
ONLINE ASSESSMENT The online format of the maturity assessment questionnaire. It is fast, automatic and
TOOL easy to use, and an executive interface allows results to be monitored. (confirmed by Mr. C. Damiba)
BENCHMARKING Repository that allows comparisons between the maturity levels of organizations of the
DATABASE same industry or size. (confirmed by Mr. C. Damiba)
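The relationship between the score concepts above can be illustrated with a small sketch: INDIVIDUAL SCORES per maturity level are aggregated into an ORGANIZATIONAL SCORE, which can then be set against INDUSTRY and SIZE benchmark scores. The averaging rule and field names are assumptions; the exact formula used by the IIL's online assessment tool is not published in the sources consulted.

```python
# Hypothetical sketch of the PMMM score aggregation. The averaging rule is an
# assumption for illustration; the IIL tool's actual formula may differ.
from statistics import mean

def organizational_score(individual_scores):
    """Average the participants' INDIVIDUAL SCORES for each maturity level."""
    levels = individual_scores[0].keys()
    return {lvl: mean(p[lvl] for p in individual_scores) for lvl in levels}

def benchmark_delta(org_score, benchmark):
    """Positive delta: the organization scores above the benchmark at that level."""
    return {lvl: org_score[lvl] - benchmark[lvl] for lvl in org_score}

# Invented scores per maturity level for two participants.
participants = [
    {1: 80, 2: 60, 3: 40},
    {1: 90, 2: 50, 3: 30},
]
org = organizational_score(participants)  # {1: 85.0, 2: 55.0, 3: 35.0}
industry = {1: 75, 2: 60, 3: 45}          # hypothetical INDUSTRY SCORE
print(benchmark_delta(org, industry))     # {1: 10.0, 2: -5.0, 3: -10.0}
```

The same delta computation applies to the SIZE SCORE and to peer comparisons within one organization, since all three benchmarks share the per-level score structure.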
Assessment
process Sub-activity Description
activity
Initiation Identify assessment need The DRIVER for carrying out an assessment is identified.
Appendix G – Criteria results per model
OPM3
Criteria Aspect Value Reference Explanation
Maturity reference model criteria
Openness Free access No [12] 1.1 OPM3 purpose and scope (p.3) Using the purchasable OPM3 Foundation (OPM3-F) provided by OPM3,
Paid access Yes [46] Pricing & Signup any organization can conduct a quick scan on itself. A comprehensive
[47] What does OPM3 look like? assessment with an acknowledged maturity degree has to be conducted
Certified usage Yes
by certified assessors. In this latter case, materials of the OPM3
Proprietary access & No ProductSuite (OPM3-PS) are used.
usage
Industry & Size Size of organization Yes [12] 1.1 OPM3 purpose and scope (p.3) OPM3 does not place applicability constraints regarding organization size
Industry sector Yes or industry.
Scope Project management Yes [12] 1.4 Organizational maturity (pp. 5-6) In OPM3, organizational project management maturity is reflected by the
Program management Yes combinations of best practices achieved within the project, program and
Portfolio management Yes portfolio domains.
Maturity level Process area No [12] Ch3. Best practices (pp.13-20) Within OPM3, activities are described as practices and deliverables as
description Activity Yes [12] Ch4. The organizational project management outcomes. The model contains 600 best practices and all of them are
processes (pp.21-28) provided openly to users. During an assessment, the assessors determine
Role No [12] Ch5. The OPM3 directories (pp.31-34) an organization’s maturity by the implementations of practices. And the
Competency No [12] Appendix F: Best practices directory outcomes (deliverables) are examined to prove that a practice is really
Deliverable Yes [12] Appendix G: Capabilities directory (p.123) implemented.
[12] Appendix H: Improvement planning directory
Result No (p.125)
Dimension of [12] 1.4 Organizational maturity (pp.5-6) OPM3 does not employ specific dimensions along which organizations can
maturity achieve maturity.
Process areas Not employed [12] Appendix I: Program and portfolio In OPM3, best practices are not grouped into process areas. However, the
management process models (pp.127-129) model does use process areas defined in the PMBOK to determine
improvement activities.
Process area Described differently [12] Appendix H: Improvement planning directory OPM3 acknowledges dependencies between capabilities instead of
dependencies process areas. And it is the improvement planning directory that contains
the dependencies between capabilities that aggregate to different best
practices.
Assessment method criteria
Assessment Not described. [12] Executive summary (p.xvi) OPM3 does not explicitly state the necessity of commitment by higher
commitment management. However, it does indicate the importance of communicating
frequently with the assessment sponsor and senior management of the
organization.
Competence level Assessor Yes [43] Module 6. Planning: acquire an assessment Specific personal attributes as well as competences that an OPM3
team (pp.11-12) ProductSuite Assessor needs to have are described in the training manual.
Participant No [43] Module 8. Execution: preparing for the Explicit competences or traits of assessment participants are not provided,
assessment (pp.5-6) but principles for sampling interviewees are described.
Assessment Process phase Yes [12] 6.3 Steps of the OPM3 cycle (pp.36-46) The detailed description of the assessment method is only available during
method Activity Yes [43] Module 1. Introduction: assessment process
description Deliverable Yes overview (p.4) assessment cycle is described in the purchasable knowledge foundation
Role Yes book (i.e. [2]).
Dependency Yes
Data gathering Questionnaires Yes [43] Module 8. Execution: perform the OPM3
method ProductSuite assessment (pp.14-21) organizations can use to execute a quick scan prior to the rigorous
Interviews Yes assessment. This questionnaire is provided by the OPM3-F. Interviews
and document consultations are conducted during rigorous assessment by
certified assessors. During rigorous assessments, no questionnaires are
Group discussions No
provided to the members of the organization. The quick scan is considered
one of the data gathering methods during an OPM3 assessment, because
Document Yes the results of the questionnaire (quick scan) trigger the rigorous
consultations assessment.
Length of Project management 800+ Confirmed by Mr. Ten Zweege, certified OPM3 This number includes all questions of the domain of project management
questionnaire domain assessor. on the entire dimension from Standardize to Continuously Improve as well
as questions related to the organizational structures and processes
necessary to support efficient and effective implementation and operation
of the best practices for the project domain.
Supportive (Self-)assessment Yes [12] Executive summary (p.xiv) OPM3-F provides a self-assessment tool. OPM3-PS provides a
assessment tools toolset [43] Module1. Introduction (p.3) professional and extensive tool to facilitate the execution of an assessment
Training Yes [36] OPM3 standard knowledge course. and improvement planning activities. OPM3 provides training programs
that train assessors how to assess and consultants how to plan, develop
Certification Yes and produce improvement project plans. Courses are also available to
those wanting to learn more about OPM3.
Benchmarking Benchmarking is optional [43] Module 2. Orientation to OPM3 ProductSuite
(p.5) users to gain insight into peer organizations’ maturity continuum scores
[47] OPM3 benchmarking (p.5) and best practices, achieved with average, mean and median reports.
OPM3 benchmarking data will be available to those organizations that
participate in the collection and sharing of the data.
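The mean and median peer reports mentioned under Benchmarking amount to simple descriptive statistics over participating organizations' maturity continuum scores. The sketch below illustrates this with invented data; the report structure and score values are assumptions, not OPM3's actual output format.

```python
# Hypothetical sketch of an OPM3-style peer benchmarking report: mean and
# median maturity continuum scores across participating peer organizations.
# Scores and the report layout are invented for illustration.
from statistics import mean, median

def peer_report(scores):
    """Summarize peer organizations' maturity continuum scores."""
    return {"mean": mean(scores), "median": median(scores), "n": len(scores)}

peer_scores = [42, 55, 61, 48, 70]  # invented continuum scores of peer organizations
print(peer_report(peer_scores))     # {'mean': 55.2, 'median': 55, 'n': 5}
```

Reporting both mean and median is useful here because maturity scores across organizations tend to be skewed, and the median is robust to a few outlier organizations.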
CMMI
Additional remarks:
- This evaluation is completely based on SEI’s definition of and service offerings regarding the CMMI for Development. It should be kept in mind that there are other institutions besides the SEI that
are eligible to conduct (official) CMMI assessments.
- It should be kept in mind that CMMI for Development is a maturity model constructed for software development. CMMI encompasses PM processes for the sake of software development. In
CMMI, PM is an aspect of software development and system engineering.
- CMMI employs two distinct approaches, also known as ‘representations’, for its assessments, namely the ‘continuous representation’ and the ‘staged representation’. The continuous
representation is a maturity model structure wherein capability levels, instead of maturity levels, provide a recommended order for approaching process improvement within each specified
process area. With this approach, no maturity ratings are provided afterwards. With the staged representation approach, on the other hand, a maturity level is established by attaining the
goals of a predefined set of process areas. In this representation, each maturity level builds a foundation for subsequent maturity levels. The ‘continuous’ approach is not included in the
scope of this research for the sake of time.
Competence level (lead) Assessor Yes [37] Executive summary: Time frame and CMMI describes which criteria are required to qualify the assessment team
personnel requirements (p.I13). members and leader, but does not elaborate on the contents. There are,
[37] Part II Process definitions: 1.3 Select and however, minimum requirements for each role. It also elaborates the
Participant No prepare team (pp.II32-II39) process of selecting the assessment team leader and members.
[49] Ch.4 Requirements for CMMI appraisal
methods (pp.7-10).
Assessment Process phase Yes [37] Executive Summary: What is SCAMPI A? The standard assessment method for CMMI is officially documented by
method Activity Yes (pp.I9-I11) the SEI.
description Deliverable Yes [37] SCAMPI A method overview (pp.I15-I38)
Role Yes [37] Part II Process Definitions (pp.II1-II134).
Dependency Yes
Data gathering Questionnaires No [37] SCAMPI A method overview: Types of
method Interviews Yes objective evidence (p.I22)
Group discussions No [37] SCAMPI A method overview: Instruments
Documents consultation Yes and tools (pp. I28-I29)
Length of Project management 155-180 [28] Ch.4 Relationships among Process Areas: SEI’s CMMI for Development definition document shows that the
questionnaire domain Project Management. (pp.55-58) category ‘project management’ comprises 6 process areas. Each
process area is accompanied by a standard questionnaire, so the
number of questions per process area was added up to obtain a
total. The number of questions amounts to approximately 30 per
process area.
Supportive (Self-)assessment toolset No [28] Ch.5 Using CMMI Models: Using CMMI SEI provides information regarding the available CMMI training facilities.
assessment tools Appraisals (pp.68-69). The SEI does not provide self-assessment tools for CMMI v1.2. There
Training Yes [28] Ch.5 Using CMMI Models: CMMI-Related are however other institutions that do provide them.
Training (pp.70-71).
Certification Yes [49] Ch.3 Requirements for CMMI appraisal
method class structure (pp.5-6).
Benchmarking Benchmarking is optional [49] Ch.3 Requirements for CMMI appraisal There are possibilities to compare an organization’s own maturity score
method class structure (pp.5-6). with the average maturity score of its industry sector, but this is only
because the SEI registers (by default) CMM and CMMI levels of officially
assessed organizations. This information is not publicly available; it is
accessible only to those who have taken the official assessment and are
registered. Even so, a registered organization cannot retrieve information
regarding the identity of other assessed organizations.
PMMM
Criteria Aspect Value Reference Explanation
Maturity reference model criteria
Openness Free access No [34] Official IIL PMMM webpage. PMMM consists of a purchasable book and an online assessment tool,
Paid access Yes supported by the International Institute for Learning (IIL), Inc. Access to the
Certified usage No online assessment tool has to be purchased before organizations can use
Proprietary access & No it to generate reports and recommendations.
usage
Industry & Size Size of organization No [34] Who should take this assessment? The online assessment tool is applicable in all industries and can be
used by a wide range of users. The applicability to particular sizes of
Industry sector Yes
organizations is not specified.
Scope Project management Yes [25] Ch1. An introduction to the PMMM (pp.41-44) Kerzner’s PMMM specifically focuses on the 5 levels of project
Program management No [25] Ch5. Level5: Continuous improvement (pp. management maturity. Program management and portfolio management
111-138) are described as areas of development when an organization has reached
Portfolio management No the level of continuous improvement regarding project management.
Maturity level Process area Yes [25] Ch. 4-9. An introduction to the Project Project management maturity is described in different terms on each
description Management Maturity Model (PMMM) & the 5 maturity level by the PMMM. At level 1, the model looks at the knowledge
Activity Yes
maturity levels. (pp.41-143) of people within an organization regarding basic project management
Role Yes terminology. At level 2, it describes the importance of common processes
Competency Yes throughout an organization. Level 3 is about combining all corporate
methodologies into a singular methodology. Level 4 acknowledges the
Deliverable No importance of conducting benchmarking. And finally, level 5 is about
Result No continuously improving the singular methodology and business processes
using the information obtained through benchmarking.
Dimensions of Varies per maturity level [25] Ch. 4-9. An introduction to the Project At each maturity level, the PMMM discusses: organizational
maturity Management Maturity Model (PMMM) & the 5 characteristics, roadblocks that prevent an organization from attaining the
maturity levels. (pp.41-143) next level, what must be done to reach the next level, and potential risks.
However, the descriptions of these aspects are not used to determine the
maturity level of an organization. As mentioned earlier, PMMM describes
different dimensions of PM maturity at each maturity level.
Process areas Scope management [25] Ch. 4-9. An introduction to the Project On maturity level 1, where PM knowledge is assessed, the processes are
Time management Management Maturity Model (PMMM) & the 5 categorized into the process areas as defined in the PMBOK. This
Cost management maturity levels. (pp.41-143) categorization, however, is not used in the remaining 4 maturity levels as
Human resources the model assesses different aspects on each level.
Procurement management
Quality management
Risk management
Communications management
Process area Not described [25] Ch. 4-9. An introduction to the Project PMMM does not describe dependencies between process areas, but does
dependencies Management Maturity Model (PMMM) & the 5 describe dependencies and overlaps between aspects of PM maturity
maturity levels. (pp.41-143) elaborated at each particular maturity level.
Assessment process criteria
Assessment Described [25] Ch4. An introduction to the PMMM (p.41) The PMMM assists organizations in achieving strategic planning for project
commitment management, and to make this happen, executive-level involvement is
necessary to make sure any development and implementation process is
driven from the top down.
Competence level Assessor No Confirmed by Mr. C. Damiba (see Appendix A-2). The PMMM does not place restrictions on the suitability of those
participating in an assessment. Also, the model does not require
assessors as the assessments are carried out by the online assessment
Participant No tool. However, should an organization desire a comprehensive
assessment report, it will be constructed by eligible consultants within the
IIL.
Assessment Process phase Yes [34] Demo version of PMMM online assessment There is no elaborate procedure for the assessment conduct. Participants
method tool. of an organization only need to complete the questionnaire provided by the
Activity Yes
description online assessment tool. The procedure to be followed by the
Deliverable Yes client organization is explained using a demo version of the online
assessment tool on IIL’s website. As for dependency, organizations can
Role No only retrieve simple or comprehensive assessment reports after working
Dependency Yes with the online assessment tool.
Data gathering Questionnaires Yes [34] Here’s how it works.
method Interviews No
Group discussions No
Document No
consultations
Length of Project management 183 [34] Here’s how it works. PMMM employs only one questionnaire for the assessment.
questionnaire domain
Supportive (Self-)assessment Yes [25] Introduction (p.xviii) To conduct an assessment, PMMM only requires the usage of the online
assessment tools toolset [34] Here’s how it works. assessment tool accompanying the model.
Training No
Certification No
Benchmarking Benchmarking is optional at the end [34] Ready to give it a test run? In the online assessment tool, respondents can compare their individual
of the assessment of each maturity scores at each level with others who have also taken the assessment;
level. within the same organization as well as in other industries. Ultimately,
higher management can retrieve aggregate scores and compare the total
organizational score with peers in the same industry, of the same size or
with those in other industries.