
Utrecht University

A framework for the comparison of


Maturity Models for
Project-based Management

Tjie-Jau Man, BSc.


Student number - 0217972

Thesis number:
INF/SCR-07-07

Utrecht University supervisors:


Lidwien v/d Wijngaert
Sjaak Brinkkemper

Capgemini supervisors:
Chris ten Zweege
Erwin Dunnink

September 21, 2007

Capgemini
A framework for the comparison of maturity models for project-based management. T.J.Man | September 2007

Preface
Writing a thesis is unmistakably one of the hardest tasks that a graduating student has
to deal with. Despite the enthusiasm at the beginning of the venture, you inevitably
end up counting the days before the final deadline (that is, if you have decided upon
one). As comical as it might sound, the whole process of doing the thesis assignment
can be summarized with three consecutive thoughts (at least in my case).

First, you will think that other students are exaggerating since there is no way you
would take that much time to finish your thesis. After all, you are positive about the
subject you have chosen. After several months, you will begin to wonder if you are on
the right track because something just doesn’t feel right. And then, finally, when you
are nearing your final deadline, you would most probably wish you had taken the
assignment more seriously from the beginning.

Fortunately, support is provided to alleviate the recurring dips in motivation and self-discipline during this long and cumbersome journey towards graduation. In my case, I wish to thank my supervisors at Utrecht University, Lidwien van de Wijngaert and Sjaak Brinkkemper, for their guidance.
I also want to thank the people at Capgemini Netherlands for providing me with a
resourceful environment to conduct my thesis assignment, especially my supervisors
Chris ten Zweege and Erwin Dunnink.
Likewise, to all members of the PMI project group: please accept my thanks for your cooperation and helpful feedback during and after the project meetings.
And last but certainly not least, I want to thank my parents for supporting me from the very beginning. Without them, I would never have come this far.

Zevenaar, September 2007

Tjie-Jau Man


Abstract
To conduct project-based management (PM) as successfully as possible, it is
fundamental for organizations to invest time and effort in constructing the necessary
infrastructure, such as organizational structure, policies and competencies of people.
Over time, more advanced organizations may wonder where exactly they stand in the
whole process and what they should do to make further advancements.
Maturity models for PM are developed to assist organizations that have these
thoughts. By comparing their own practices against best practices described by these
models, organizations can find out how mature or professionalized they are in
performing project-based management and what they could do to realize desirable
improvements in it. However, with more than 20 maturity models available in the
field of PM, organizations have to consider carefully which one they can adopt. In
order to do this, organizations need to know what aspects of these models are
important to consider and how they should evaluate them.
In this thesis, research is done on relevant dimensions to the evaluation of maturity
models for PM. This set the stage for the selection of measures that are needed to
evaluate similarities and differences between maturity models for PM.
The research showed that maturity models for PM can be evaluated along three
dimensions: structure, applicability and usage. Three measures were selected to
operationalize these dimensions, in the same respective order: Process-Data Diagrams,
evaluation criteria and user interviews.
These measures formed a framework that was applied to several maturity models for
PM to determine its quality. The framework and its constituting measures proved
useful in shedding light on the relevant similarities and differences between the
models. It was able to show the strengths and weaknesses of each evaluated maturity
model, which should be considered by organizations planning to adopt them.


Table of contents
1. INTRODUCTION ...............................................................................................................2

1.1. RESEARCH QUESTION ..................................................................................................4


1.2. SCIENTIFIC & SOCIETAL RELEVANCE .............................................................................5
1.3. RESEARCH APPROACH .................................................................................................6
2. MATURITY MODELS FOR PROJECT-BASED MANAGEMENT ....................................8

2.1. PROJECT-BASED MANAGEMENT ....................................................................................8


2.2. MATURITY MODELS ....................................................................................................10
2.3. MATURITY MODELS FOR PM ......................................................................................10
2.4. MATURITY MODEL SELECTION .....................................................................................12
3. THE EVALUATION FRAMEWORK ................................................................................19

3.1. STRUCTURE DIMENSION ............................................................................................19


3.1.1. Process-Data Diagram (PDD) Modeling ........................................................20
3.1.2. PDD comparison method ...............................................................................22
3.2. APPLICABILITY DIMENSION .........................................................................................25
3.2.1. Evaluation criteria ...........................................................................................25
3.2.2. Criteria comparison method ...........................................................................34
3.3. USAGE DIMENSION ....................................................................................................35
3.3.1. Interviews........................................................................................................35
4. ANALYSIS & RESULTS .................................................................................................37

4.1. STRUCTURE ANALYSIS ...............................................................................................37


4.1.1. Maturity reference model structure.................................................................37
4.1.2. Assessment method structure ........................................................................40
4.1.3. Structure comparison summary......................................................................44
4.2. APPLICABILITY ANALYSIS ............................................................................................46
4.2.1. Maturity reference model................................................................................46
4.2.2. Assessment method .......................................................................................55
4.3. USAGE ANALYSIS .......................................................................................................63
5. DISCUSSION & CONCLUSIONS ...................................................................................66

5.1. FRAMEWORK REQUIREMENTS .....................................................................................71


5.2. SUGGESTIONS FOR SITUATIONAL SELECTION ...............................................................73
5.3. RECOMMENDATIONS FOR FURTHER RESEARCH ...........................................................75
REFERENCES .........................................................................................................................76

APPENDIX ...............................................................................................................................80

APPENDIX A-1 – PMI PROJECT GROUP MEMBERS AND ROLES ..................................................80


APPENDIX A-2 – CONSULTED EXPERTS....................................................................................81
APPENDIX B – PROCESS-DATA DIAGRAM MODELING ................................................................82
APPENDIX D-1 – ASSESSOR & USER SAMPLE QUESTIONNAIRES ................................................87
APPENDIX D-2 – SUMMARY INTERVIEW FINDINGS .....................................................................89
APPENDIX E – PROCESS-DATA DIAGRAMS ...............................................................................92
APPENDIX F – CONCEPT AND ACTIVITY TABLES ........................................................................95
APPENDIX G – CRITERIA RESULTS PER MODEL .......................................................................104


1. Introduction
In general, there are two reasons why it is beneficial for organizations to adopt a
maturity model for project-based management, which includes the management of
projects, programs and portfolios. Ever since organizations began to adopt the project-
based way of conducting business, they have strived to deliver projects successfully.
To do this, organizations require the necessary infrastructure, which includes
processes (methods and techniques), governance structures, competences of people
and tools [1]. Developing such an infrastructure may take several years and, because
of this, more advanced organizations may start to wonder after a while where exactly
they stand in the whole process and whether they are going the right way. This is
when the adoption of a maturity model proves useful. A maturity model is able to
assist organizations in verifying what they have achieved by describing activities and
best practices and categorizing these descriptions into progressive levels of maturity.
The second benefit for adopting a maturity model becomes apparent when an
organization has finished assessing its current practices and aims for advancements to
a desired level of maturity [2]. By comparing the results of a maturity assessment with
the descriptions in a maturity model, an organization gains insight into its strengths
and weaknesses and is able to prioritize its actions to make improvements.

In addition to the above arguments, the execution of a maturity assessment in itself


raises the awareness about what can be improved within an organization. In other
words, members of an organization will focus more on the inefficiencies of their ways
of working simply because they know they are being assessed.

Besides deciding whether a maturity model should be adopted at all, an equally
important decision concerns the choice among the available maturity models for
project-based management. Many maturity models have emerged since the mid-1990s
[3], and one question that arises is how organizations should evaluate them in order to
select an appropriate maturity model. As an attempt to answer this question, a project
group was assembled by Project Management Institute Nederland (PMI-NL) and given
the assignment to publish a book in which different maturity models for project-based
management are compared with each other.


Closely aligned to this assignment, research was done on maturity models for project-
based management. A framework was developed to support the evaluation and
comparison between such models. This framework was applied to compare three
maturity models for project-based management with each other.


1.1. Research question


The purpose of this thesis is to develop a framework to evaluate and compare maturity
models for project-based management. The central research question of this
exploratory thesis research is formulated as follows:

“What measures are needed to evaluate the similarities and differences


between maturity models for project-based management?”

The following sub research questions will set the stage for answering the main
research question.

1. What is a maturity model for project-based management?


Before analyzing a maturity model for project-based management, it is important to
understand the reasons behind its existence and what this concept means.

2. What constitutes a maturity model for project-based management?


Examining the components that constitute a maturity model sets the stage for
determining important aspects or dimensions along which they can be evaluated.

3. What are relevant dimensions to the evaluation of maturity models for project-
based management?
These dimensions provide guidance to the selection of the measures that facilitate the
comparison between the maturity models. They will ultimately form the evaluation
framework mentioned earlier.

4. What are the main similarities and differences between maturity models for
project-based management?
This sub question focuses on the main similarities and differences found after
evaluating several maturity models for project-based management with the developed
framework.


1.2. Scientific & societal relevance


This thesis research explores the aspects of maturity models for project-based
management that are relevant to distinguishing them from one another. It is meant to
shed light on the strengths and weaknesses of these models and how these affect their
applicability in certain situations. However, it is not meant to indicate superiority or
inferiority among them. The main reason behind this thesis lies in the assumption that
organizations should carefully consider which maturity model to adopt.
Maturity models in general are measurement tools used to assess and/or improve an
organization’s processes. Depending on the match between a maturity model and an
organization’s situation, the organization may end up assessing different capabilities
than initially planned. This could affect the outcomes of the maturity assessment and
may cause an organization to overlook some important weaknesses in its current
processes.

Supporting the above assumption is contingency theory, a theory that takes many
forms in the world of research. The earliest and most extensively researched form of
contingency theory was introduced by Fiedler in the 1960s; it explains that group
performance results from the interaction of two factors: leadership style and situational
favorableness [4][5]. Since then, the contingency approach has been applied and
adopted many times [6][7][8][9]. A list of scientific articles that use the contingency
theory is shown in [10].

Similar to what other researchers have done, this thesis applies contingency theory
to maturity models for project-based management. As mentioned before, the focus
here is on finding measures to elicit similarities and differences among these maturity
models. The results of this research could set the stage for further research on the
possible contingency or ‘fit’ between the models and organizational situations, and
relate this to performance variables. Also, the differences found with the measures
may prove useful in future research efforts on the categorization of maturity models
for project-based management. And if the framework proves useful to the
comparisons between them, it may provide a foundation or be used to evaluate and
compare other maturity models of the same discipline.


1.3. Research approach


The thesis research starts with a review of literature and scientific papers to
gather information about maturity models for project-based management. The next
chapter of this report provides the necessary background information regarding this
concept (Chapter 2). Chapter 2 also provides a long-list of existing maturity models
for project-based management. From this long-list, several maturity models are
selected for the purpose of testing the framework to be developed. The selection
process and the selected maturity models are described in the final sections of
Chapter 2.

The examination of literature continues in combination with expert consultations to
find the dimensions in which maturity models may differ from each other. The experts
here include all members of the project group and individuals with prior experience
with maturity models for project-based management. Both the
selected dimensions and the evaluation framework that will comprise them are
explained in Chapter 3. After the description of the framework, it is applied to several
selected maturity models for project-based management to test the framework and the
dimensions. The elaboration of the analysis and results of this application can be
found in Chapter 4.

In order to determine the quality of the evaluation framework, we have adopted the
five requirements from the field of situational method engineering [11]. These
requirements are used to assess the quality of a method assembled from method
fragments to suit a situation specific for a project. As our framework describes a
method to evaluate and compare maturity models for project-based management,
these requirements can be used for its evaluation. The five requirements are defined as
follows for this thesis:
- Completeness: the framework describes all relevant dimensions for the
evaluation of maturity models for project-based management;
- Consistency: all activities and concepts are consistently defined and described
throughout the framework;


- Applicability: the researchers are able to execute the method described by the
framework;
- Reliability: the method described by the framework is semantically correct and
meaningful;
- Efficiency: the method described by the framework can be performed at
minimal cost and effort.

This thesis concludes with the elaboration of discussion topics and conclusions based
on the analysis results and the description of the framework’s quality based on the
above requirements (Chapter 5).


2. Maturity models for project-based management


In order to understand the concept of maturity models for project-based management,
it is necessary to explain the two concepts that constitute it: 'project-based
management' and 'maturity model'. These two concepts will be further elaborated in
the following subsections.

2.1. Project-based management


As mentioned in the introduction, the term 'project-based management' refers to all
three domains of this discipline: 'project management', 'program management' and
'portfolio management' [12]. From here on, the abbreviation PM will be used when
referring to all three domains. In cases where the abbreviation is less appropriate, the
full term 'project-based management' will be used. To understand the concept of PM,
it is necessary to explain the three domains constituting it. This is done in the
following paragraphs.

A project is a temporary endeavor with a definite beginning and end. To complete a


project successfully, it needs to meet the requirements predefined by all stakeholders
and deliver a product or service different in some unique way from all similar
products or services [13].
The Project Management Body of Knowledge (PMBOK) of the Project Management
Institute (PMI) uses the following broad definition to define the management of
projects:

“…the application of knowledge, skills, tools and techniques to activities


within a project in order to meet or exceed stakeholders’ needs and
expectations [13].”

Although this definition is similar to that of process management [14], project
management differs in that it is concerned with managing (collections of) temporary
undertakings rather than ongoing activities.


Programs differ from projects in that they are carried out to achieve specific strategic
business objectives or goals. Or as formulated in [12], “a program’s focus is on
producing, in accordance with a vision of an ‘end state’ consistent with organizational
strategic objectives”. An example of an ‘end state’ is ‘the realization of 5% cost
reduction throughout the entire organization’. To achieve this, a program will consist
of a number of projects or functional activities including for example ‘the
implementation of a new logistics system’ and ‘the development of a new IT system’
[2]. In addition, while the aforementioned project is successful when the logistics
system is implemented according to its specifications, the completion of the above
program depends on the realization of the 5% cost reduction, not merely on the
delivery of a new IT or logistics system.
According to Wikipedia [15], the management of programs is:

“…the process of managing multiple ongoing inter-dependent projects. (It)


focuses on selecting the best group of programs, defining them in terms of
their constituent projects and providing an infrastructure where projects can be
run successfully but leaving projects management to the project management
community.”

However, due to physical or financial constraints, organizations cannot undertake all
projects or programs on their to-do list at the same time. That is when the creation of
a portfolio becomes relevant. A portfolio, in terms of PM, is a collection of projects
and/or programs grouped together to meet strategic objectives [2]. Portfolio
management is the centralized management of one or more portfolios. This domain is
all about the prioritization, facilitation and management of projects/programs based on
their alignment to the business strategy of an organization.

According to [16], there has been an increase in the number of organizations
employing PM. The growing popularity is also reflected by the large number of
publications written about this concept. While some of them focus on defining the
success of a project [17][18][19][20], others try to shed light on the various factors
that might influence the effectiveness of PM [21][22][23]. The emergence of this
literature points to the need to professionalize PM so that organizations can

undertake successful projects continuously. This brings us to the discussion of
maturity models, which are developed to facilitate the improvement of how PM is
undertaken.

2.2. Maturity models


The literature has paid a considerable amount of attention to the concept of maturity
models [2][3][24][25][26][27]. This is because a maturity model allows an
organization to assess and compare its own practices against best practices or those
employed by competitors, with the intention to map out a structured path to
improvement [3]. Basically, a maturity model is a framework describing the ideal
progression toward desired improvement using several successive stages or levels.
Note that an organization in the context of maturity models for PM does not
necessarily refer to an entire company. A maturity model can also be applied to a
business unit, functional group or department.
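The staged structure described above can be made concrete with a small sketch. The level names and practices below are purely hypothetical, loosely echoing CMM-style staging, and are not taken from any specific model discussed in this thesis:

```python
from dataclasses import dataclass

@dataclass
class MaturityLevel:
    number: int      # position in the ideal progression (1 = lowest)
    name: str        # short label for the stage
    practices: list  # practices an organization must demonstrate

# Hypothetical five-level model; names and practices are invented.
LEVELS = [
    MaturityLevel(1, "Initial",    ["ad-hoc project execution"]),
    MaturityLevel(2, "Repeatable", ["basic project planning", "progress tracking"]),
    MaturityLevel(3, "Defined",    ["standardized PM processes"]),
    MaturityLevel(4, "Managed",    ["quantitative process measurement"]),
    MaturityLevel(5, "Optimizing", ["continuous process improvement"]),
]

def assessed_level(demonstrated: set) -> int:
    """Return the highest level whose practices, and those of all
    lower levels, are all demonstrated by the assessed unit."""
    reached = 0
    for level in LEVELS:
        if all(p in demonstrated for p in level.practices):
            reached = level.number
        else:
            break
    return reached

print(assessed_level({"ad-hoc project execution",
                      "basic project planning",
                      "progress tracking"}))  # -> 2
```

The cumulative check mirrors the staged idea: a level counts as reached only when the practices of all lower levels are satisfied as well.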

One well-known maturity model is the Capability Maturity Model (CMM), introduced
by the Software Engineering Institute (SEI). This model was later replaced by its
successor, the Capability Maturity Model Integration (CMMI) [28]. The development
of capability maturity models inspired the emergence of other maturity models in the
same field of software development. Examples are the Test Process Improvement
(TPI) model developed by Sogeti [29] and the Usability Maturity Model [30].

2.3. Maturity Models for PM


The existence of CMMI has also led to the development of maturity models for PM.
Because of the role that PM plays in the software development process, many of the
concepts of maturity incorporated in capability maturity models, such as the CMMI,
were adopted by maturity models that emerged in the field of PM [31].
Building on what was explained about maturity models earlier, maturity models for
PM are used to measure the degree to which an organization is executing PM by
comparing its PM practices against practices in general or ‘best practices’. These
models describe how ‘mature’ or professionalized organizations are in conducting PM
and what they could do to improve their way of working.


According to [31], there is no generally agreed definition of what a mature project-


based organization looks like. In spite of this, the current number of maturity models
for PM is estimated at 30 [3].

During this research, an attempt was made to construct a long-list of existing
maturity models for PM. This list is depicted in Table 1, along with the models'
names and owners.

Table 1 Long-list of existing maturity models for PM

Nr | Acronym | Name | Owner
1 | OPM3 | Organizational Project Management Maturity Model | Project Management Institute (PMI)
2 | P3M3 | Portfolio, Programme, Project Management Maturity Model | Office of Government Commerce (OGC)
3 | P2M | Project & Program Management for Enterprise Innovation (P2M) | Project Management Association of Japan (PMAJ)
4 | PMMM | Project Management Maturity Model | PM Solutions
5 | PPMMM | Project Portfolio Management Maturity Model | PM Solutions
6 | PMMM | Programme Management Maturity Model | Programme Management Group
7 | PMMM | Project Management Maturity Model | KLR Consulting
8 | (PM)2 | The Berkeley Project Management Process Maturity Model | Department of Civil Engineering, University of California at Berkeley
9 | ProMMM | Project Management Maturity Model | Project Management Professional Solutions Limited
10 | MINCE2 | Maturity Increments IN Controlled Environments | MINCE2 Foundation
11 | PPMM | Project and Portfolio Management Maturity | PriceWaterhouseCoopers (PWC) Belgium
12 | CMMI | Capability Maturity Model Integration | Software Engineering Institute (SEI)
13 | SPICE | Software Process Improvement and Capability dEtermination | Software Quality Institute, Griffith University, Australia
14 | FAA-iCMM | Federal Aviation Administration - Integrated Capability Maturity Model | US Federal Aviation Administration
15 | Trillium | Trillium | Bell Canada
16 | EFQM | EFQM Excellence Model | European Foundation for Quality Management (EFQM)
17 | COBIT | Control Objectives for Information and related Technology | Information Systems Audit and Control Association (ISACA)
18 | INK | INK Managementmodel | Instituut Nederlandse Kwaliteit (INK)
19 | ProjectProof | VA Volwassenheidsmodel | Van Aetsveld
20 | PAM | Project Activity Model | Artemis
21 | Project Excellence Model | The Project Excellence Model | Berenschot
22 | PMMM | Project Management Maturity Model | International Institute for Learning (IIL) / H. Kerzner


Maturity models differ from one another in the concepts they embody and in their
suggestions as to what the path to maturity looks like [22]. Different maturity models
for PM may define maturity differently and measure different things to determine it.
Because of this, organizations should give careful consideration to the selection of a
maturity model.

2.4. Maturity model selection


In order to select maturity models for testing the framework to be developed, each
member of the project group was asked to rate each maturity model on the long-list
based on several criteria. Information regarding the members of the project group is
included in the Appendix section of this thesis (see Appendix A-1).

The maturity models were rated using the following criteria:

- Method independency: the degree to which a maturity model is independent of a
  specific PM methodology.
- Public domain: the degree to which a maturity model and its maturity assessment
  can be applied by anyone besides its owners.
- Publication: the degree to which a maturity model is described in publications.
- Industry independency: the degree to which the application of a maturity model
  is not limited to particular industry sectors.
- Transparency: the traceability of the calculation of the maturity scores.
- Toolset independency: the degree to which the usage of a maturity model is not
  bound to a toolset.
- Years of existence: how many years a maturity model has existed.
- Ease of use: the degree to which a maturity model is easy to use in practice.

A candidate maturity model should at least be publicly available (or available for a
moderate fee) through publication in a book (electronic or in print). This ensures
that the information needed about a maturity model is accessible when it is evaluated
using the framework. The table below shows the average scores given to the maturity
models.


Table 2 Scoring table for the maturity models for PM

Maturity model nr.



1 2 3 4 5 6 7 8 9 10 11
Selection Criteria
Method independency 6 4 - - 7
Public domain 6 7 - - 10
Publication 10 10 - - 10
Industry independency 10 10 - - 10
Transparency 6 3 - - 10
Toolset independency 5 3 - - 5
Years of existence 7 3 - - 0
Ease of use 6 6 - - 10

Table 2 (continued)

Maturity model nr.



12 13 14 15 16 17 18 19 20 21 22
Selection Criteria
Method independency 10 10
Public domain 10 10
Publication 10 10
Industry independency 3 10
Transparency 7 10
Toolset independency 8 10
Years of existence 10 3
Ease of use 3 10

The maturity models were rated on a scale from 0 to 10, where 0 indicates
"Hardly" and 10 indicates "Completely". Grayed-out maturity models are those
excluded from the long-list based on their accessibility. The maturity models
MINCE2 (nr. 10) and the PM Solutions PMMM (nr. 4) had not yet been examined due
to the limitations of time.
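The scoring step described above amounts to averaging the raters' 0-10 scores per model and criterion. The sketch below illustrates this computation; the individual rater values are invented, with only the model and criterion names taken from the tables above:

```python
# Ratings per model: {criterion: [one 0-10 score per project-group member]}.
# The individual scores below are invented for illustration.
ratings = {
    "OPM3": {"Publication": [10, 10, 10], "Transparency": [5, 6, 7]},
    "P3M3": {"Publication": [10, 10, 10], "Transparency": [3, 3, 3]},
}

def average_scores(ratings):
    """Average the raters' scores per model and criterion, rounded to
    the nearest integer as in the scoring table."""
    return {
        model: {criterion: round(sum(scores) / len(scores))
                for criterion, scores in criteria.items()}
        for model, criteria in ratings.items()
    }

for model, averages in average_scores(ratings).items():
    print(model, averages)
```

The shortlist then follows by filtering out the models whose accessibility score falls below the publication threshold and ranking the remainder on their averages.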

The short-list that resulted from the selection process consisted of the following
maturity models:

- Organizational Project Management Maturity Model (OPM3) [12]


- Capability Maturity Model Integration (CMMI-DEV) [28]
- Kerzner Project Management Maturity Model (PMMM) [25]
- Project, Program, Portfolio Management Maturity Model (P3M3) [32]
- Maturity Increments IN Controlled Environments (MINCE2) [33]


The first three models of the list, OPM3, CMMI and PMMM, were selected for this
thesis. The materials used for the respective maturity models were provided by the
following institutions: the Project Management Institute (PMI) for OPM3, the
International Institute for Learning (IIL) [34] for PMMM and the Software Engineering
Institute (SEI) for CMMI. A brief background of each model is provided in the
following sections, including the reasons why it was or was not selected to test the
framework.

2.4.1. OPM3
OPM3 is an acronym for Organizational Project Management Maturity Model. It is a
standard developed under the stewardship of the Project Management Institute (PMI)
and introduced in December 2003. The development of this standard was inspired
by the increasing interest in a maturity model that shows a step-by-step method of
improving and maintaining an organization’s ability to translate organizational
strategy into the successful and consistent delivery of projects. In other words, OPM3
is meant to enable organizations to bridge the gap between organizational strategy and
successful projects [35].

The purpose of OPM3 is not to prescribe what kind of improvements users should
make or how they should make them. Rather, by providing a broad-based set of
organizational project management (OPM) best practices, this standard allows an
organization to use it as a basis for study and self-examination, and consequently to
make its own informed decision regarding potential initiatives for changes [12].

The standard comprises three interrelated elements:

- Knowledge. In this element, the user becomes proficient with OPM3:
comfortable with the body of best-practice knowledge it contains, with the
ideas of OPM and OPM maturity, and with the concepts and methodology of
OPM3.
- Assessment. In this element, the organization is compared to OPM3 to
determine its current location on a continuum of OPM maturity.

- Improvement. Here, organizations can decide to move ahead with change
initiatives leading to increased maturity, using the results of the
assessment as a basis for planning.

OPM3 comprises two complementary parts: the Foundation and the ProductSuite. The
Foundation was developed by PMI, and the ProductSuite was developed by PMI in
collaboration with Det Norske Veritas (DNV) [36], an international consulting
firm. The Foundation is a description of the OPM3 model itself, readily
available to all organizations interested in knowing about the model. The
ProductSuite, on the other hand, describes how the model should be applied and
what steps are taken during a maturity assessment. Access to the latter is
limited to those who enroll in training sessions to become certified assessors,
while the Foundation can be purchased in the form of a book. Both sources are
consulted during the evaluation of OPM3. Where a distinction must be made
between the two, the abbreviations OPM3-PS and OPM3-F indicate information
derived from the ProductSuite and the Foundation respectively.

OPM3 was selected for the evaluation because of its popularity in the field of
PM. Information about this model was readily accessible since the chairman of
the project group is a certified OPM3 assessor.
The model is closely aligned with the PMBOK [12], a well-accepted standard for
project management. In addition, supplementary PMBOK guides have recently been
developed to describe approaches to program and portfolio management; these
additions are also embedded in OPM3.

2.4.2. CMMI
CMMI stands for Capability Maturity Model Integration. Its first version (1.1) was
introduced by the Software Engineering Institute (SEI) in 2002 as the successor of the
Capability Maturity Model (CMM), which was developed from 1987 until 1997. The
SEI (2007) defines CMMI as a process improvement approach that helps
organizations integrate separate functions, set process improvement goals and
priorities, provide guidance for quality processes, and provide a point of reference for
appraising current processes.

The latest version of CMMI (1.2), released in 2006, comprises a framework that
allows the generation of multiple models, training courses and appraisal methods
supporting specific areas of interest. CMMI for Development is one of those
models and provides guidance for managing, measuring and monitoring software
development processes.

The development of CMM and CMMI was based on the premise that "the quality of a
system is highly influenced by the quality of the process used to acquire,
develop and maintain it" [28]. These models comprise the essential elements of
effective processes for one or more disciplines (e.g. software development) and
describe an evolutionary improvement path from ad hoc, immature processes to
disciplined, mature processes with improved quality and effectiveness. The
fundamental idea behind this is that even the finest people within an
organization cannot perform at their best if the process is not understood or
operating at its best [28].

The purpose of the CMMI for Development model is to assist organizations in
enhancing their software development processes for both products and services by
describing the characteristics of best practices. CMMI was included in this
thesis because of its rich history and worldwide acceptance. The model's long
history sped up the search for information and for experts willing to share
their knowledge about it. In particular, CMMI for Development was selected
because of its availability: the model documentation was readily downloadable
from the World Wide Web. For the sake of brevity, the acronym CMMI will be used
to refer to CMMI for Development for the rest of this thesis.

During the evaluation, documents describing SEI's CMMI and the Standard CMMI
Appraisal Method for Process Improvement (SCAMPI) [37] are consulted. It should
be noted that SEI is not the only institution that provides CMMI maturity
assessments. While SEI's assessment method is the only one evaluated for CMMI
here, it is not the only one that exists. Thus, the findings do not necessarily
account for the procedures employed by other institutions that also provide
CMMI assessments.

2.4.3. PMMM
The Project Management Maturity Model (PMMM) was introduced by H. Kerzner in
1998. The first edition of his book describing this model was published in 2001. In
2005, he published the second edition.

PMMM is a practical, PMBOK-aligned standard [25]. The model sets out various
levels or stages of development towards project management maturity, along with
assessment instruments to validate how far along the maturity curve an
organization has progressed. The original intent of the PMMM is to provide
organizations with a framework for creating an organization-specific maturity
model. Each organization can have a different approach to maturity, which is
why organizations are allowed to adapt the questions and answers of the PMMM
questionnaire [25].

The physical part of the model consists of a book and an online assessment tool.
Both components serve to provide individual assessment participants and their
organizations with:
- a breakdown of how they are doing in each category at each maturity level;
- a comparison of overall results against those of other companies and
individuals who have taken the assessment; and
- a high-level prescriptive action plan to follow for individual and
organizational improvement.

This model was chosen because of its simplicity and availability. It is also
interesting to examine the differences between PMMM and OPM3, since both are
aligned with the PMBOK.

2.4.4. P3M3
P3M3 stands for Portfolio, Program and Project Management Maturity Model. It was
formally published in February 2006 by the Office of Government Commerce (OGC)
after some refinements were made [27]. The model describes the portfolio,
program and project-related activities within process areas that contribute to
achieving desirable project outcomes.

Although P3M3 was eligible for inclusion in this thesis, access to information
about this model was granted only to people at accredited institutions. Several
attempts were made to contact these institutions, but most of them could not
fulfill the request for information due to license agreements. Some of the
contacted persons agreed to answer questions relevant to the evaluation, but
because this took place when most of the research was already done, it was no
longer possible to include P3M3 in the thesis.

2.4.5. MINCE2
The last candidate maturity model for PM was MINCE2, acronym for Maturity
INcrements IN Controlled Environments 2. The MINCE2 Foundation (established in
May 2007) developed this model in order to:
- determine the project maturity level an organization is in;
- report in a standardized way regarding the findings; and
- indicate what to do in order to increase the maturity [33].

MINCE2 was not included in the thesis because, at the time of the maturity model
selection, the owners were still working on the publication of the model. The
details of MINCE2 were to be published in August 2007. So although one of the
project group members had access to information about the model, it could not be
used for the thesis research due to publishing rights.

In this chapter, the concept of maturity models for PM was explained. The next
chapter elaborates on the development of the framework used to evaluate the selected
maturity models.

3. The evaluation framework


The chairman of the PMI-NL Project Group organized monthly meetings to guide the
construction of the framework. Whenever prompt feedback was needed, all members
could contact each other by email or phone. The description of the project group
members and their roles is included in Appendix A-1.

After several brainstorming sessions and informal meetings, the project group
members agreed upon a staged approach for the evaluation framework. It was
decided that the framework should evaluate a maturity model's structure,
applicability and usage. The following sections describe these three dimensions.
After explaining what a dimension entails and how it can be operationalized, a
measure is described to elicit the characteristics of a maturity model on that
dimension. The description of each dimension concludes with an elaboration of
how the results of the models, found with the selected measure, are compared
with each other.

3.1. Structure Dimension


The first dimension along which the framework evaluates a maturity model for PM
is 'structure'. It is important to mention here that a maturity model for PM is
made up of two parts: a maturity reference model and an assessment method. From
an assessor's point of view, the maturity reference model is a measuring stick;
it elaborates on 'what' an assessor should assess in order to determine the
maturity of an organization. The description of the assessment method, on the
other hand, describes 'how' assessors should carry out the assessment to
determine maturity. The characteristics of an assessment method are just as
important as those of the maturity reference model, because they affect the
repeatability of an assessment and therefore also the reliability of the
results. Besides the fact that these two parts have different measurable
characteristics, another reason to make the distinction is that a maturity model
for PM does not necessarily have only one assessment method to apply its
reference model. As mentioned earlier, the SEI owns CMMI and has described
SCAMPI as its standard assessment method.

The structure of the maturity reference model comprises a collection of concepts
and the relationships between them. Each concept says something about 'maturity'
as defined by the maturity model, and it is the relationships between these
concepts that illustrate their importance and role in that definition. Shedding
light on the structure of the model's concepts makes it easier for organizations
to understand the purpose and essence of a maturity model.

The assessment method of a maturity model can be broken down into multiple
process phases and activities. By knowing the structure of these process phases
and activities, organizations know what to anticipate when engaging in an
assessment. The products resulting from the assessment activities are also
important, especially their relationships with the concepts underlying the
maturity reference model. After all, these products contain the data that
assessors use to assess maturity, by comparing that data with the measures
defined by the maturity reference model. The relationships between the products
of the assessment activities and the concepts of the reference model must
therefore not be ignored.

To depict the structure of the maturity reference model and assessment method as
accurately as possible, a meta-modeling technique was selected as the measure
for the structure dimension.

3.1.1. Process-Data Diagram (PDD) Modeling


Objective comparison of the maturity models' structures requires describing them
uniformly at a higher abstraction level. A meta-model is a model constructed at
a higher abstraction level, used to describe the features of an underlying
model. The use of meta-models proves useful because maturity models for PM are
described in different languages using various terminologies; they may use
different descriptions even though they imply the same underlying concepts.
Since the purpose of the framework is not to determine superiority or
inferiority between maturity models, it would be inappropriate to select one
model (and its terminology) as a starting point and compare the other models
with it. Another reason why the framework uses a meta-modeling technique as a
measure for structure is that it enables a simplified illustration of the
maturity models: one quick glance at a meta-model allows an organization to
understand the hierarchy of concepts or activities of a maturity model for PM.

In [38], the authors compared different object-oriented analysis and design
techniques using a meta-modeling technique. According to the authors,
constructing meta-models is a uniform and formal way to compare methodologies
as objectively as possible, provided that the same constructs are used to model
them.

In this thesis, the framework employs a type of meta-model called the
Process-Data Diagram (PDD), described in [39]. A PDD is made up of two different
meta-models: a 'meta-process model' and a 'meta-data model'. As mentioned
earlier, a maturity model for PM incorporates a maturity reference model and an
assessment method, and the two types of meta-models are suitable for modeling
each of these parts. More specifically, the meta-process model will be used to
depict the process phases and activities of an assessment method, while the
meta-data model models the concepts underlying the maturity reference model.
After creation, these two meta-models are combined into a PDD in which the
relationships between the activities of the assessment process and the concepts
of the model are shown.

PDDs are able to answer the following three questions for each maturity model for
PM:
- What process phases and activities is the assessment method made up of?
(using the meta-process model)
- What products do the activities of the assessment method deliver? (meta-data
model)
- What concepts underlie a maturity model for PM, and how are they related to
each other and the products of the assessment method? (process-data diagram)

A condensed description of the PDD modeling technique and the notations employed
by the framework is provided in the Appendix section (see Appendix B). A more
thorough explanation of this method can be found in [39].

3.1.2. PDD comparison method


The comparison between the PDDs follows the approach described in [38]. This
approach begins by defining all concepts and describing all activities depicted
in the PDD of each maturity model. Each maturity model for PM can use a
different terminology, so a thorough understanding of the definitions of the
activities and concepts is needed before comparisons can be made with other
models. The definitions are presented in 'activity tables' and 'concept
tables', derived from publications and literature about the maturity models.
People who have experience with a particular maturity model are also consulted
to gain additional insight and to verify the definitions.

After all activities and concepts are defined, they are used to create two additional
tables: an activity comparison table and a concept comparison table. The activity
comparison table consists of a consolidated reference list of the activities of all
maturity models that are selected to test the framework. This means that overlapping
activities of the maturity models are combined and non-overlapping ones are added to
the table. The activity comparison table also contains consolidated process phases of
the assessment methods. Similarly, the concept comparison table contains a
consolidated reference list of concepts of the selected maturity models. These tables
are used for the actual comparison of the activities and concepts of the selected
maturity models. This is done by filling in the fields in the ‘Maturity model’ columns
using the symbols explained below.

For the activity comparison table:
- '=' means that an activity of the reference list is present in, and
equivalent to, the one in the PDD of a maturity model;
- '>' or '<' means that an activity of the reference list is present but
comprises, respectively, less or more than the one in a PDD;
- a '(2)' after one of the previous symbols means that the activity is
described in the second process phase of a PDD.

For the concept comparison table:
- '=' means that a concept of the reference list is present in, and equivalent
to, the one in the PDD of a maturity model;
- the name of a concept means that the concept in the reference list is
present in a PDD but under a different name;
- 'REPORT' means that a concept of the reference list is present but not
depicted in the PDD for the sake of space and structure; if the concept were
depicted, it would be described within the REPORT concept.

Finally, an empty field means that an activity or concept in the reference list is not
present in the PDD of the respective maturity model for PM. Examples of comparison
tables resulting from the analysis are depicted below:

Table 3: Example Activity comparison table

1. Process phase 1      Maturity model 1   Maturity model 2
1.1 Activity 1.1        =                  <
1.2 Activity 1.2        < (2)              >
1.3 Activity 1.3        =
2. Process phase 2      Maturity model 1   Maturity model 2
2.1 Activity 2.1        <                  < (1)
2.2 Activity 2.2        >
2.3 Activity 2.3        >                  =

Table 4: Example Concept comparison table

1. Process phase 1      Maturity model 1   Maturity model 2
1.1 Concept 1.1         CONCEPT1           =
1.2 Concept 1.2         =                  CONCEPT1
1.3 Concept 1.3         =                  =
2. Process phase 2      Maturity model 1   Maturity model 2
2.1 Concept 2.1         CONCEPT3
2.2 Concept 2.2         =
2.3 Concept 2.3         CONCEPT5

A comparison table allows a quick overview of which activities or concepts are
present in the PDD of a maturity model for PM and of the main differences with
other models. It should be noted that only the concepts related to the
assessment activities are compared using a concept comparison table. The core
concepts underlying a maturity model are depicted as gray boxes in the
respective PDDs and are compared using narrative text instead. This was decided
because a concept comparison table only compares the presence/absence of
concepts in PDDs; whether one maturity model embodies more or fewer concepts
than another is not relevant. More important is to look at how different
maturity models are actually built up, which is easier to describe using
narrative text than a comparison table.
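The cell notation of the comparison tables can also be captured in a small data structure. The following is an illustrative Python sketch (the class and field names are assumptions for illustration, not part of the framework) showing how a cell's relation symbol and optional phase annotation render into the table entries:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cell:
    """One cell of an activity comparison table."""
    relation: str = ""           # "=", "<", ">", or "" when absent
    phase: Optional[int] = None  # set when the activity sits in another phase

    def render(self) -> str:
        if not self.relation:
            return ""            # empty field: activity absent from the PDD
        if self.phase is None:
            return self.relation
        return f"{self.relation} ({self.phase})"

# Illustrative rows mirroring the first process phase of Table 3:
table = {
    "1.1 Activity 1.1": {"Model 1": Cell("="), "Model 2": Cell("<")},
    "1.2 Activity 1.2": {"Model 1": Cell("<", phase=2), "Model 2": Cell(">")},
    "1.3 Activity 1.3": {"Model 1": Cell("="), "Model 2": Cell()},
}

for activity, cells in table.items():
    row = "  ".join(f"{model}: {cell.render() or '-'}" for model, cell in cells.items())
    print(f"{activity:<20} {row}")
```

Keeping the relation and the phase annotation in separate fields preserves the judgment made by the analyst while still rendering the compact '=', '< (2)' notation used in the tables.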

3.2. Applicability Dimension


The second stage of the framework focuses on the applicability dimension of
maturity models for PM. This dimension was selected because it sheds light on
the properties of a maturity model for PM that affect the context in which the
model can be applied.
It is complementary to the previous dimension, since it elicits information
about a maturity model that cannot be depicted by a PDD. Besides the structure
of a maturity model, it is important to know about the properties that determine
whether a maturity model for PM is suitable for an organization to adopt. For
instance, a maturity model for PM may have properties that limit its application
in certain industries; a PDD is not capable of showing this piece of
information.

Eventually, the decision was made to use evaluation criteria to measure the
relevant properties of maturity models for PM. Criteria are appropriate measures
for this dimension because of their flexibility: each criterion captures one
property independently of the others, which is useful especially when the list
of criteria is not definitive. It allows the addition or removal of criteria
without affecting the other criteria on the list.

3.2.1. Evaluation criteria


To decide which criteria were relevant for measuring the applicability
dimension, several sessions were held with the project group members. During
these sessions, the list of criteria was changed and refined numerous times due
to differing levels of knowledge about maturity models for PM. Because of the
limited time, the project group settled on the list of criteria shown in the
tables below (see Table 5 and Table 6). The criteria are categorized into two
main groups: criteria regarding the maturity reference model and those that
focus on the assessment method.

Table 5: Maturity reference model criteria

Nr.    Maturity reference Model (MM) criterion
MM1    Openness
MM2    Industry & size
MM3    Scope
MM4    Maturity level description
MM5    Dimensions of maturity
MM6    Process areas
MM7    Process area dependencies

Table 6: Assessment method criteria


Nr. Assessment Method (AM) criterion
AM1 Assessment commitment
AM2 Competence level
AM3 Assessment method description
AM4 Data gathering method
AM5 Length of questionnaire
AM6 Supportive assessment tools
AM7 Benchmarking

Because the purpose of the evaluation framework is descriptive rather than
normative, no scores or ranks result from the evaluation based on these
criteria. For this reason, the findings per maturity model are presented in the
format shown in Table 7. The value 'Yes' indicates that a maturity model meets a
criterion and 'No' indicates otherwise. The 'Reference' column contains
references to the information sources used, and the 'Explanation' column
provides a brief explanation of the findings.

Table 7: Example evaluation results format per maturity model

Criteria      Aspect     Value  Reference     Explanation
Maturity model criteria
Criterion A              No     Reference #1  <description of findings about
                                               criterion A in reference #1>
Criterion B   Aspect ba  Yes    Reference #1  <description of findings about
              Aspect bb  No     Reference #2   aspect ba in Reference #1 and #2>

Besides books and publications, assessors, experts and accredited associations
are consulted to gather and verify information. Experts and assessors are people
who have previously gained experience with a maturity model for PM during
maturity assessments. Accredited associations are organizations qualified to
train and dispatch consultants to conduct maturity assessments for other
organizations. These associations may also provide training sessions for those
who wish to become certified assessors or who want to know more about a
particular maturity model for PM.

Each criterion is defined and explained below to clarify the property it measures, the
reasons why this property is relevant and how it is used to measure this property.

Maturity reference Model (MM) criteria

MM1 - Openness
Openness is the degree to which a maturity model for PM is available to the
public and whether its usage is limited to certain individuals or organizations.
This criterion distinguishes four possible situations that have significant
implications for the openness of a maturity model. It measures whether a
maturity model for PM and/or its assessment materials:
- can be accessed without payment (free access);
- can be accessed against payment (paid access);
- are meant to be used by certified assessors (certified usage);
- can only be used by specific organizations (proprietary access & usage).
These situations are not mutually exclusive. A maturity model can, for instance,
be made openly available while its assessment can only be performed by certified
assessors. The resulting schema indicates whether the four situations hold for a
maturity model and its assessment. In the 'Explanation' column, additional
information is provided for the values found.
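Since the four situations are not mutually exclusive, they can be pictured as independent flags on a single record. The sketch below is a hypothetical Python illustration; the field names are invented for clarity and are not taken from the framework:

```python
from dataclasses import dataclass

@dataclass
class Openness:
    """Hypothetical record for the MM1 criterion: each access/usage
    situation is an independent flag, so combinations are possible."""
    free_access: bool = False       # model/materials accessible without payment
    paid_access: bool = False       # accessible against payment
    certified_usage: bool = False   # meant to be used by certified assessors
    proprietary: bool = False       # usable only by specific organizations

# Example: a reference model sold as a book (paid access) whose
# assessment may only be performed by certified assessors.
example = Openness(paid_access=True, certified_usage=True)
print(example)
```

Modeling the situations as independent flags, rather than as a single category, is what allows combinations such as OPM3's openly purchasable Foundation alongside its assessor-only ProductSuite to be recorded.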

MM2 - Industry & size


Maturity models for PM can be specifically designed for a certain organizational
context (e.g. type, structure, industry sector). Therefore, it is important for a model
description to contain information about its applicable areas. This criterion measures
whether a maturity model specifies limitations in its application to organizations
operating in particular industry sectors or to organizations of specific sizes.

MM3 - Scope
As explained before, the discipline of PM includes the domains of project
management, program management and portfolio management. Because of the
differences between projects, programs and portfolios, the corresponding best
practices and processes will differ as well. The scope of a maturity model for PM is
the extent to which a maturity model embodies PM. Depending on the domains
employed, a maturity model may be structured differently and describe different
processes in which to improve. One maturity model for PM may describe all three
domains of PM, while another may focus solely on the management of projects. A
model that only describes project management limits its usage to organizations
that wish to improve their project management processes. In other words, limits
in scope affect the applicability of a maturity model, which is why scope was
selected as a relevant property.

MM4 - Maturity level description


In order to determine maturity, an assessor needs to know what exactly he or she
should assess within an organization and how the findings relate to the maturity
descriptions provided in the maturity reference model. Maturity level description is
the degree to which a maturity reference model describes what an assessor should
look at within an organization to determine its maturity. In the same way,
organizations gain insight into what a maturity reference model focuses on when they
are being evaluated.
The consultation of literature about the selected maturity models and meetings with
different assessors resulted in the following list of elements that maturity level
descriptions can contain (see Table 8):

Table 8: Maturity description elements


Element name Explanation

Process area A process area is a group of related practices that, when implemented collectively,
contribute to making improvements in that area. At the highest abstraction level,
assessors determine the maturity level of organizations by assessing the
presence/absence of particular process areas. Examples of process areas are: risk
management, cost management and change management.
Activity The description of an activity provides assessors with guidance in evaluating whether an
organization carries out certain activities to achieve purposes defined in the maturity
reference model. Examples of activities are: developing a project plan and involving
stakeholders.
Role Roles describe functions of individuals who should be responsible for executing certain
activities. This element can be employed by maturity reference models to see whether
activities are executed by people with the appropriate authority.
Competency Besides the roles of the people executing activities, a maturity model may
describe the minimum or required knowledge levels and capabilities of these people.
The competencies of the individuals carrying out a certain activity may affect the
outcome of that activity.
Deliverable The idea behind this element is that if an organization claims to carry out
certain activities, it should be able to deliver the products resulting from these
activities. For instance, if a member of an organization claims to develop a project
plan, this member should be able to produce that project plan.
Result Maturity models are meant to help an organization improve. A maturity model may
describe activities or practices that should be in place within organizations, but it can also
describe outcomes or improvements that an organization may experience after achieving
a particular maturity level. In this case, an assessor can determine an organization’s
maturity by assessing whether the organization is able to achieve
improvements/outcomes described by a maturity reference model.

The level of detail of a maturity reference model is related to the number of
elements used to describe the maturity level conditions. The more elements are
used, the more accurately assessors can determine the maturity of organizations.
As a result, maturity can be rated repeatedly, leaving little room for
inconsistencies caused by the maturity reference model itself (reliability).

MM5 - Dimensions of maturity


In [31], two families of maturity models for PM are described:
- models in which the same things are measured at all levels of maturity,
where it is simply the results that improve with maturity; and
- models stating that more mature organizations measure different things than
immature ones, where the increase in maturity is indicated by measures
showing improving results.

Maturity models for PM of the first family can describe different dimensions in
which organizations can mature or professionalize their practices, but these
dimensions are measured at all maturity levels. Take, for instance, a dimension
such as 'competences of people'. A maturity model adopting this dimension
describes the type of competences that members of an organization should possess
at each maturity level. The model then rates an organization's maturity on this
dimension and, if applicable, on other dimensions as well. The next criterion
sheds more light on models of the second family.
By clarifying the maturity dimensions employed by a maturity model,
organizations gain a better understanding of them when selecting a model. The
initial intention was to use a pre-defined list of dimensions from the
literature. However, evaluating the maturity models for PM with this list proved
very difficult due to the varying interpretations of the dimensions and
underlying concepts. For this reason, it was decided that the framework would
not employ a pre-defined list of maturity dimensions. Instead, the framework
simply enlists and describes the dimensions used by each maturity model.

MM6 - Process areas


In a maturity model for PM, a PM domain can be broken down into different but
related process areas. This criterion therefore evaluates the extent to which a
maturity model describes a PM domain. Models belonging to the second family
mentioned earlier, such as CMMI, evaluate the process areas or processes that an
organization should have in place instead of dimensions. These models measure
different things at different maturity levels, which in CMMI’s case are process areas.
Analyzing which process areas are described by a maturity model helps organizations
understand the processes the model deems relevant to achieving maturity. As with the
previous criterion, there are standard process areas defined in the literature, such as in
the PMBOK guide [13]. However, since different maturity models define different
process areas, it is difficult to develop a general list of process areas for reference. So,
here too, the framework will list and describe the process areas categorized by each
maturity model for PM.

MM7 - Process area dependencies


The reason why maturity models of the second family measure different things at
different maturity levels is that these models acknowledge strict dependencies
between maturity levels [31].
This criterion examines whether maturity models for PM acknowledge such
dependencies. Describing these dependencies explicitly makes it easier for
organizations to understand the path towards maturity that the model describes.
Besides their presence or absence, a brief description of how maturity models for PM
describe these dependencies is also provided during the evaluation.

Assessment Method (AM) criteria

AM1 - Assessment commitment


One of the most important enablers of the adoption of a maturity model is
commitment from higher management. Just like a project, if higher management does
not fully support its execution, it is unlikely to produce the desired
outcome. In an empirical study, support from higher management was the second
most frequently selected factor believed to be critical to a project’s outcome [40]. A
maturity model description should therefore contain suggestions or encouragement
about gaining commitment from higher management.

AM2 - Competence level


Not every member of a to-be-assessed organization is qualified or capable of being
involved in a maturity assessment. A maturity model may specify requirements for
those carrying out the assessment (assessors) as well as those participating in it
(participants). This is what is meant by competence level. This criterion investigates
whether specifications for both roles are provided by a maturity model. By describing
such requirements, a maturity model decreases the chance that the outcomes of an
assessment are affected by the choice of assessors or participants, which is why this
property was included.

AM3 - Assessment method description


While the elaborateness of a maturity model description helps assessors define
‘what’ they should measure to verify the maturity of an organization, the details of the
assessment method contain information on ‘how’ they should carry out the
assessment process.
After studying different maturity assessment methods, the project group decided upon
the following description elements that could constitute an assessment method.

Table 9 Assessment method description elements


Element name Examples
Process phase prepare assessment, conduct assessment
Activity determine scope of assessment, deliver results
Deliverable assessment plan, team, report
Role lead assessor, project manager, process owner
Dependency an assessment plan is a prerequisite for the actual assessment

The number of elements used affects the level of detail of a method description. With
a tight protocol to conduct an assessment, the outcomes are less likely affected by
variances in the choices and actions of the assessors. So a thorough description of an
assessment method can ensure its repeatability and the reliability of the assessment
outcomes.

AM4 - Data gathering method


A maturity model can prescribe different methods to retrieve information from an
organization during a maturity assessment. These methods can involve:
- questionnaires (meant for participants)
- interviews
- group discussions
- document consultations
Each of these methods provides information from a different angle. If multiple
methods are used to verify the same piece of information, the reliability of this
information would increase. This property also shows the degree of involvement
required from the members of an organization. An assessment method that requires
assessors to conduct interviews to gather information needs close cooperation from
the members of an organization. On the other hand, if a sponsor of an assessment
wants to involve as few members as possible, it can opt for a maturity model with an
assessment method that only uses document consultations.

AM5 - Length of questionnaire


There are two types of questionnaires in the context of maturity assessments. The first
type involves questionnaires only available to assessors. Assessors use this type of
questionnaire as a guide to develop interview questions or as score forms to rate an
organization’s maturity level. The second type concerns questionnaires that
participants of a maturity assessment have to fill out. In this research, no distinction is
made between these two types of questionnaires. This is because no matter who is
using the questionnaire, the number of questions still contributes to the detail level of
the information that is required by a maturity model. The more questions are asked,
the greater the reliability of the retrieved information will be. The type of
questionnaire used does not change this. Because the scope of each maturity model
for PM may vary, this criterion only considers the part of the questionnaire focusing
on the ‘project management’ domain.
This criterion results in a number, indicating the number of questions in the
questionnaire provided by each maturity model for PM.

AM6 - Supportive assessment tools


To ease the application of an assessment, various kinds of support can be provided
together with a maturity model for PM. This criterion measures three types of
supportive tools:
- (self-)assessment toolsets (e.g. an online maturity assessment mechanism)
- training sessions to increase the understanding of a maturity reference model
and the assessment method.
- certification possibilities (to become a certified assessor)
The presence or absence of these supportive tools affects the usability of a maturity
model for PM. Besides maturity assessments with external assessors, an organization
can also opt for unofficial assessments, i.e. assessments conducted by the organization
itself. If tools are provided to facilitate the understanding and application of the
maturity model without the intervention of certified assessors, the application of the
model is not limited to organizations desiring official assessments.

AM7 - Benchmarking
Depending on the maturity model, the results of a maturity assessment may be used
for benchmarking purposes. This means that anonymous maturity results of assessed
organizations can be gathered by assessment institutions and used to compare
aggregated scores between, for instance, industry sectors or regions. If an institution
allows it, assessed organizations may compare their maturity scores with the
aggregated maturity scores of organizations operating in the same industry or
region.
Before that is possible, however, a standard method for the assessment needs to be in
place. As with the standardization of a questionnaire, the standardization of the
assessment method decreases the amount of variance in the results attributable to
various factors (e.g. the choice of participants or assessors) and increases the attribution
to the real variable in question: maturity. Two questions are considered when
evaluating each assessment method description for this criterion:
- whether a standard method is described for a maturity assessment, and
- whether maturity profiles are preserved for benchmarking purposes and made
available to an assessed organization.
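
The benchmark comparison described here boils down to aggregating the anonymous maturity scores per sector or region and setting one organization’s score against the aggregate. The following sketch is purely illustrative: the scores and sector names are invented, and no assessment institution publishes its data in this form.

```python
from collections import defaultdict
from statistics import mean

# Invented sample of anonymous maturity profiles: (industry sector, maturity score)
profiles = [
    ("finance", 3.2), ("finance", 2.8), ("finance", 3.5),
    ("government", 2.1), ("government", 2.4),
]

def industry_benchmark(profiles, sector):
    """Aggregate the anonymous scores of all organizations in one sector."""
    by_sector = defaultdict(list)
    for s, score in profiles:
        by_sector[s].append(score)
    return mean(by_sector[sector])

# An assessed organization compares its own score against the sector aggregate.
own_score = 3.0
gap = own_score - industry_benchmark(profiles, "finance")  # negative: below the sector average
```

The same aggregation could of course be broken down by region or organization size instead of sector; the principle remains identical.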

3.2.2. Criteria comparison method


After evaluating the maturity models for PM using the evaluation criteria, the findings
per model are compared with each other in a schematic way. Elaborate comparisons
between the criteria results are provided in the form of tables like Table 10. Each table
corresponds to one evaluation criterion; the cells in the upper right contain the
similarities and the cells in the lower left contain the differences found between each
pair of maturity models for PM.

Table 10: Example schematic comparison per criteria


Criterion A    Model 1            Model 2            Model 3
Model 1                           similarity #1      similarity #1
                                  similarity #2
Model 2        difference #1                         similarity #1
               difference #2                         similarity #2
               difference #3                         similarity #3
Model 3        difference #1      difference #1
               difference #2
(upper-right cells: similarities; lower-left cells: differences)
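
The layout of Table 10 — similarities above the diagonal, differences below it — can be built mechanically from pairwise findings. The sketch below is illustrative only; the model names and findings are placeholders, not actual evaluation results.

```python
def comparison_matrix(models, similarities, differences):
    """Pairwise comparison matrix: upper-right cells hold similarities,
    lower-left cells hold differences, and the diagonal stays empty.
    Both findings dicts are keyed by unordered model pairs (frozensets)."""
    n = len(models)
    cells = [["" for _ in range(n)] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            pair = frozenset((models[i], models[j]))
            if i < j:                                  # upper right
                cells[i][j] = "; ".join(similarities.get(pair, []))
            elif i > j:                                # lower left
                cells[i][j] = "; ".join(differences.get(pair, []))
    return cells

models = ["Model 1", "Model 2", "Model 3"]
sims = {frozenset(("Model 1", "Model 2")): ["similarity #1", "similarity #2"]}
diffs = {frozenset(("Model 1", "Model 2")): ["difference #1"]}
matrix = comparison_matrix(models, sims, diffs)
# matrix[0][1] holds the similarities, matrix[1][0] the differences
```

Keying the findings by unordered pairs guarantees that the upper-right and lower-left cells of the same model pair always refer to the same comparison.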

After these tables, another table is depicted to briefly summarize the findings of each
model per criterion. By placing all values in one schema, the differences between
maturity models are easy to see at a glance. As some criteria cannot be answered with
a simple yes/no answer, some of the fields of the schema will also contain narrative
text as values. An example of the resulting schema is shown in Table 11.

Table 11: Example schematic comparison of criteria results


Model 1 Model 2 Model 3

Criterion A No Yes No
Criterion B
- aspect ba Yes No Yes
- aspect bb No No Yes
Criterion C No Yes No
Criterion D Yes Yes No

3.3. Usage Dimension


The last part of the evaluation framework examines the usage dimension of a maturity
model for PM. This dimension is included because the true value of a maturity
model lies in the eyes of the ones using it. Theory may describe a maturity model as
applicable or usable (as elicited by the evaluation criteria), but this has no value if the
people using it think otherwise.

There are two ways to measure this dimension: via a survey or via interviews.
Because there was little time to gather enough respondents to make a survey
statistically relevant, the project group decided to opt for interviews.

3.3.1. Interviews
This part of the framework was originally meant to elicit user experiences with a
maturity model for PM; therefore, these interviews had to involve people with prior
experience with the selected maturity models.

In the end, only a small number of interviews were conducted: two for CMMI and one
for OPM3. This is because few people were available who had experience with
the selected maturity models, especially OPM3 and PMMM. These two models are
relatively less well-known in Europe than CMMI. On top of that, not many
organizations in the Netherlands have adopted these two models.

Ideally, the usage dimension should be measured by interviews with two different
user groups per maturity model, namely the assessors and the members of user
organizations. Taking only one of the two perspectives would create a biased view of
these models. Unlike (certified) assessors, members of user organizations do not have
much experience with maturity models, so their knowledge backgrounds differ from
those of assessors.
And even if members of a user organization attend training sessions to gain
knowledge about a maturity model (i.e. not with the intention to become certified
assessors), there will still be a difference in how they look at a maturity model, simply
because they have different objectives in using one. Assessors use maturity models to
assess user organizations; they interview members of an organization and verify
the findings to determine the degree of maturity of that organization. Conversely, user
organizations are those who request assistance of certified assessors to assess their
maturity. They are the ones being interviewed and providing information about their
ways of working. Depending on the maturity model, either assessors or user
organizations will use the model to determine improvement trajectories. Ultimately, it
will be the members of the user organizations who initiate and realize the
improvement initiatives.

For these reasons, eliciting practical experiences from both the assessors and the
members of user organizations helps generate a complete picture of the usage
dimension of a maturity model.

Two different questionnaires were developed for this dimension: one meant for a user
organization and the other for an assessor. These questionnaires are included in the
Appendix section in Dutch (see Appendix D-1).

Due to the small number of interviews, the project group decided that the retrieved
interview data would be used to support the findings of the previous two dimensions.
However, information retrieved from interviews and informal consultations is
described separately as the results of the usage dimension where it provides insight
into the models that was overlooked by the previous two dimensions.

4. Analysis & Results

4.1. Structure analysis


The analysis of the results of the structure dimension begins with the maturity
reference model concepts depicted in the PDDs as gray boxes. This facilitates the
understanding of the comparisons that follow between the remaining activities and
concepts in the diagram. The PDDs of the three PM maturity models are included in
the Appendix section along with the concept tables and activity tables containing the
concept definitions and activity descriptions (see Appendix E & F).

4.1.1. Maturity reference model structure


As argued before, it is not important to compare the individual concepts underlying
the maturity models for PM. Much more interesting is the hierarchy that holds these
concepts together. The following paragraphs describe this structure for each maturity
reference model. Maturity reference model concepts are written between quotation
(‘’) marks in the text. Appendix F contains more detailed definitions of these
concepts.

OPM3
The OPM3 standard comprises several important components, namely the
‘foundation’, a ‘self-assessment tool’ and three sets of ‘directories’. To register
maturity scores of client organizations for benchmarking purposes, the PMI also
maintains a ‘maturity database’ to store maturity profiles. While the foundation and
self-assessment tool are made available to assist client organizations in understanding the
OPM3 standard, the three ‘directories’ form the core of the assessment method. These
are the ‘best practices directory’, ‘capabilities directory’ and ‘improvement planning
directory’. The best practices directory contains two types of ‘best practices’, namely
‘organizational enablers’ and ‘process best practices’. The former are supportive
practices that relate to the organizational structures and processes required to facilitate
efficient and effective realization of best practices for projects. The latter are basically
practices that are currently recognized and applied by industries to achieve a stated
goal or objective. These best practices are grouped by ‘process improvement stages’
and ‘PM domains’. ‘Process improvement stages’ are the four stages of process
maturity: from process Standardization to Measurement to Control and ultimately to
Continuous Improvement. In other words, the OPM3 model describes best practices
that an organization can learn from to Standardize, Measure, Control or Continuously
Improve its PM processes. Organizations can also choose to assess their process
maturity within a specific PM domain such as project management, program
management or portfolio management.
OPM3 describes ‘capabilities’, stored in the ‘capabilities directory’, which are
specific competencies that must exist in an organization in order for it to execute a
best practice. To prove the existence of a capability, the existence of one or more
corresponding ‘outcomes’ is examined. Outcomes are the tangible or intangible result
of applying a capability. The degree to which an outcome exists is determined by
criteria also known as ‘key performance indicators (KPI)’.
Furthermore, OPM3 acknowledges ‘dependencies’ among its underlying concepts,
particularly best practices and capabilities. The first type of dependency lies between
the capabilities in the series leading to a best practice: each capability builds upon
preceding capabilities to achieve one single best practice. But a capability under one
best practice can also depend on the existence of a capability under a different best
practice, in which case the one best practice is said to be dependent on the other.
This is the second type of dependency described within OPM3. These dependencies
are stored in the ‘improvement planning directory’.
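
The hierarchy sketched above — best practices broken down into capabilities, demonstrated through outcomes that are rated with KPIs, plus dependencies between capabilities — can be expressed as a small data structure. This is an illustrative sketch only: the class and field names follow the concept names in the text, while the classes themselves and the sample content are invented for this example.

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    description: str
    kpis: list = field(default_factory=list)        # criteria used to rate the outcome

@dataclass
class Capability:
    name: str
    outcomes: list = field(default_factory=list)
    depends_on: list = field(default_factory=list)  # preceding capabilities (first dependency type)

@dataclass
class BestPractice:
    name: str
    domain: str       # project, program or portfolio management
    stage: str        # Standardize, Measure, Control, Continuously Improve
    capabilities: list = field(default_factory=list)

def best_practice_achieved(bp, demonstrated):
    """A best practice exists once all of its capabilities have been
    demonstrated through their outcomes."""
    return all(cap.name in demonstrated for cap in bp.capabilities)

# Invented example content
plan = Capability("maintain project plans",
                  [Outcome("plan documents", ["update frequency"])])
bp = BestPractice("standardized planning", "project management",
                  "Standardize", [plan])
```

The second dependency type (between best practices) follows from this structure: when a capability in one best practice appears in another best practice’s `depends_on` chain, the two best practices are linked.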

CMMI
CMMI embodies five ‘maturity levels’, each a layer in the foundation for ongoing
process improvement. Because each maturity level forms a necessary foundation for
the next level, an organization cannot achieve, for instance, maturity level 3 if it
hasn’t achieved level 2 yet. CMMI describes ‘dependencies’ between pairs of
adjacent maturity levels.
In CMMI, each maturity level is represented by several ‘process areas’, and for an
organization to achieve a particular maturity level, the corresponding process areas
have to be satisfied at that level.
The maturity model describes ‘generic goals’ and ‘specific goals’ that guide the
process of bringing a process area to a higher maturity level. Generic goals can be
understood as objectives to bring a group of process areas to a certain level of
maturity, and the model describes ‘generic practices’ that an organization can apply to
achieve these objectives. Specific goals are objectives unique to a specific process
area that have to be achieved before the process area can be considered as satisfied,
i.e. implemented. And to implement a process area, the model describes ‘specific
practices’ that can be applied.
The philosophy behind this is that an organization can only achieve a level of maturity
if it is capable of achieving certain goals. To achieve these goals, there are (groups of)
processes that need to be in place. The presence or absence of these processes is
assessed by looking at an organization’s practices. Maturity scores of assessed
organizations that are willing to register themselves with the SEI are stored in SEI’s
‘maturity profile database’.
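
The staged logic described above — a level counts as achieved only when its process areas are satisfied and every lower level has been achieved first — can be sketched roughly as follows. The mapping below lists only a few sample process areas per level for illustration; it is not the complete CMMI list, and the function is not part of any official appraisal method.

```python
# Sample subset of process areas per maturity level (level 1 has none).
LEVEL_PROCESS_AREAS = {
    2: {"Project Planning", "Requirements Management"},
    3: {"Risk Management", "Organizational Process Definition"},
    4: {"Quantitative Project Management"},
    5: {"Causal Analysis and Resolution"},
}

def maturity_level(satisfied):
    """Return the highest maturity level whose process areas (and those of
    every lower level) are all satisfied; levels cannot be skipped."""
    level = 1
    for lvl in sorted(LEVEL_PROCESS_AREAS):
        if LEVEL_PROCESS_AREAS[lvl] <= satisfied:   # subset test: all areas satisfied
            level = lvl
        else:
            break   # strict dependency between adjacent maturity levels
    return level
```

Note how the `break` encodes the strict dependency: satisfying level-3 process areas contributes nothing as long as a level-2 process area is still missing.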

PMMM
Just like CMMI, PMMM also embodies five ‘maturity levels’. In the book by
Kerzner [15] in which this model is described, each maturity level is equipped with
explanations about ‘roadblocks’, ‘risks’ and ‘advancement criteria’, and with an
‘assessment instrument’. The first three concepts represent things an organization
needs to know before it can achieve that level and advance to the next one. Each level
is accompanied by an assessment instrument in the form of a questionnaire that
organizations can use to assess the degree to which they have achieved that level.
Organizations can do this manually using the book by Kerzner, but PMMM also has an
online version of the assessment instruments. Those who have used the online
assessment tool can also compare their assessment scores with the scores of other
organizations, which are stored in IIL’s ‘benchmarking database’.

4.1.2. Assessment method structure


After decomposing the PDDs of all three assessment methods into activities and
concepts, members of the project group were consulted to develop the consolidated
reference lists. During the development, similar process phases, activities and
concepts were combined and included in the reference list under a more general name
and dissimilar ones were added under their original names. This resulted in the two
comparison tables below (see Table 12 and Table 13). Each of these tables is followed
by the descriptions of the similarities and differences found between the models.

Table 12: PDD activity comparison table


1. Prepare assessment OPM3 CMMI PMMM
1.1 Familiarize with maturity model =
1.2 Perform self-assessment =
1.3 Determine assessment requirements = (2) = >
1.4 Select participants < (2) < =
1.5 Develop assessment plan = (2) = >
1.6 Select & prepare assessment team = (2) =
1.7 Gather pre-assessment data < (2) =
1.8 Prepare for assessment conduct < (2) =
1.9 Prepare participants < (2) < (2)
2. Conduct assessment OPM3 CMMI PMMM
2.1 Conduct interviews = <
2.2 Study records & documents = <
2.3 Document gathered information =
2.4 Verify gathered information = =
2.5 Enter values in assessment tool = =
2.6 Generate individual assessment results =
2.7 Generate summarized assessment results < = <
2.8 Validate assessment results < =
2.9 Generate benchmark scores * * =
2.10 Create final report < (2) <
3. Finalize assessment OPM3 CMMI PMMM
3.1 Deliver final report < (2) = <
3.2 Document assessment < (2) =
4. Plan for improvement OPM3 CMMI PMMM
4.1 Select & prioritize improvement initiatives = <
4.2 Develop improvement plan = >
*: optional after assessment

In the first process phase, there are several differences between the three maturity
models for PM. First of all, during an OPM3 assessment, organizations usually
familiarize themselves with the model and conduct a self-assessment to
determine the necessity of a rigorous OPM3 assessment involving certified
assessors. In contrast, the assessment methods of CMMI and PMMM do not
include these two activities. For CMMI, this is because the SEI does not provide any
self-assessment tools. The remaining activities in the first phase are present in
the assessment methods of both OPM3 and CMMI. Both models require assessors to
consult documents and conduct interviews to gather information in the second phase,
and activities 1.3 to 1.9 are carried out to make the necessary preparations.

PMMM, on the other hand, contains only activities regarding the requirements analysis,
the selection of participants and the development of an assessment plan. Furthermore,
no self-assessment is included in the assessment method. This is because the
assessment method of PMMM only requires organizations to interact with an online
tool, which happens during the second phase (2.5). The IIL provides an online as
well as an offline version of the same questionnaire for the assessment. Organizations
may use the offline version to do a self-assessment, but unlike OPM3’s self-
assessment tool, no explanation is provided for the results found.

Furthermore, while OPM3 and CMMI assessors have the responsibility to decide on
the adequate number and the appropriate roles of assessment participants, IIL does not
have control over who ultimately interacts with the PMMM online assessment tool. In
this latter case, it is the responsibility of the client organization to select the right
number of participants in particular roles to include in the scoped sample.

During a CMMI assessment, the necessary data is documented after gathering and
then used to manually determine the maturity level of the organization. During
an OPM3 assessment, this data is entered into a tool by lead assessors, which
automatically generates the assessment results. In both cases, a final report
containing the aggregated results and corresponding explanations is then
delivered to the assessed organization. The PMMM assessment differs from this by
allowing assessment participants to retrieve their individual scores and compare them to
temporary aggregate scores during an assessment. After all pre-selected participants
have filled out the online questionnaire, the tool generates a summary of the
assessment results with brief explanations. If desired, organizations can request a
report containing more elaborate explanations of the results (3.1). Among other
things, this report contains more specific suggestions for improvements based on the
results. While benchmarking possibilities become available after OPM3 and CMMI
assessments, PMMM’s online tool allows organizations to generate benchmarking
scores during and after the assessment.

Finally, while CMMI’s assessment method ends after the assessors deliver the final
report to the assessed organization, the OPM3 assessment continues until an
improvement plan containing prioritized improvement initiatives is developed. This is
because OPM3 assessments are meant to help organizations realize improvements.
CMMI assessments do not include these activities because a CMMI assessment is not
necessarily conducted with the goal of realizing improvements. Organizations may
undergo CMMI assessments just to rank themselves on a standardized model
without the intention to improve. This is not possible for OPM3 because it does not
define strict maturity levels.

Table 13: PDD concept comparison table


1. Prepare assessment                     OPM3            CMMI                               PMMM
1.1  SELF-ASSESSMENT REPORT               =
1.2  ASSESSMENT TRIGGER                   =               REQUIREMENT DRIVER
1.3  ASSESSMENT PLAN                      =               APPRAISAL PLAN                     =
1.4  COMMUNICATION PLAN                   =               APPRAISAL PLAN
1.5  TEAM & ROLES DESCRIPTION             =               TEAM
1.6  ORGANIZATION DESCRIPTION             =               COMPANY BACKGROUND
1.7  REQUIREMENT DESCRIPTION              =               ENGAGEMENT DATA
1.8  PRE-ASSESSMENT DATA COLLECTION PLAN  =               INITIAL OBJECTIVE EVIDENCE REVIEW
1.9  DATA COLLECTION PLAN                 =               =
1.10 QUESTIONNAIRE PROTOCOL               =
1.11 ASSESSMENT PARTICIPANTS                                                                 ASSESSMENT SAMPLE
2. Conduct assessment                     OPM3            CMMI                               PMMM
2.1  ASSESSMENT DATA                      =               OBJECTIVE EVIDENCE,
                                                          PRELIMINARY ASSESSMENT FINDINGS
2.2  ARTIFACT                                             =
2.3  AFFIRMATION                                          =
2.4  DOCUMENT                                             =
2.5  INTERVIEW                                            =
2.6  ASSESSMENT RESULTS                   ASSESSMENT      APPRAISAL RESULTS                  ASSESSMENT RESULTS
                                          RESULTS                                            SUMMARY
2.7  RESULTS ANALYSIS                     =               =                                  =
2.8  INDIVIDUAL ASSESSMENT RESULTS                                                           =
2.9  INDIVIDUAL SCORE                                                                        =
2.10 INDIVIDUAL SCORE ANALYSIS                                                               =
2.11 ORGANIZATIONAL MATURITY SCORE        OPM MATURITY    MATURITY LEVEL RATING              ORGANIZATIONAL SCORE
                                          SCORE
2.12 OUTCOME SCORE                        =
2.13 CAPABILITY SCORE                     =
2.14 BEST PRACTICE SCORE                  =               PRACTICE RATING
2.15 GOAL RATING                                          =
2.16 PROCESS AREA RATING                                  =
2.17 BENCHMARK SCORE                                                                         AGGREGATE SCORE
2.18 INDUSTRY SCORE                                                                          =
2.19 SIZE SCORE                                                                              =
3. Finalize assessment                    OPM3            CMMI                               PMMM
3.1  FINAL REPORT                         =               FINAL FINDINGS REPORT              ELABORATE ASSESSMENT REPORT
3.2  ASSESSMENT RECORD                    =               APPRAISAL RECORD
3.3  MATURITY PROFILE                     =               =                                  ORGANIZATIONAL SCORE
3.4  BENCHMARK COMPARISON                                                                    =
4. Plan for improvement                   OPM3            CMMI                               PMMM
4.1  IMPROVEMENT PLAN                     =
4.2  IMPROVEMENT TRIGGER                  =
4.3  INITIATIVE                           =               IMPROVEMENT SUGGESTION             SUGGESTED ACTION
4.4  SCHEDULE                             =
4.5  PRIORITY                             =
4.6  FACTOR                               =
4.7  ATTAINABILITY                        =
4.8  STRATEGIC PRIORITY                   =
4.9  BENEFIT                              =
4.10 COST                                 =

Since PMMM’s assessment method only gathers information using an online
questionnaire, there is no need to develop assessment team descriptions (1.5) and data
collection plans (1.9) or collect pre-assessment data (1.8) during the prepare
assessment phase.

In the conduct assessment phase, CMMI’s method describes more thoroughly the
categories and types of data gathered during the assessment (2.2-2.5), while OPM3
summarizes them into one concept (2.1). PMMM does not employ these concepts at
all because, unlike OPM3 and CMMI, its assessment does not involve assessors who
have to gather the right data from the right sources.

In the activity comparison table, it was evident that PMMM allows participants to
access their individual scores. This explains the corresponding concepts in the second
phase of the concept comparison table (2.8-2.10). The differences found in the
concepts 2.12-2.16 between OPM3 and CMMI can be explained by the differences in
the concepts used to describe maturity (see previous section).

Unlike with PMMM, the OPM3 and CMMI benchmarking scores are not provided
during the maturity assessment, which explains why the benchmark-related concepts
(2.17-2.19) only have corresponding symbols in the PMMM column.

Finally, the OPM3 model is the only one describing concepts related to the
development of an improvement plan in the fourth phase, for the reasons explained
before.

4.1.3. Structure comparison summary


With fewer activities and concepts than the other two maturity models, the maturity
reference model and assessment method of PMMM appear to possess a relatively
simple structure. It does not involve assessors and only requires members of an
organization to fill out an online questionnaire to calculate the maturity scores. There
are several implications of this simplicity. First of all, organizations themselves are
fully responsible for selecting participants for the assessment, so an organization can
assess whoever it wants to assess. Considering that participants can look into their
individual scores, this means that the model allows for individual assessment and
development of all members of a project-based organization. The downside, however,
is that unless a large sample is involved, the aggregate scores will be affected by the
choice of participants. Furthermore, with an online website and questionnaire as the
medium during the assessment, no account is taken of the environment of an
organization or other factors when generating the results. So organizations have to
contemplate whether the results and improvement suggestions are applicable to their
specific situation.
During the assessments of OPM3 and CMMI, the assessors are responsible for
selecting, preparing and interviewing the appropriate members based on their
experience besides consulting documents to gather information. Both models
determine the maturity level of an organization by the practices that it has
implemented. The difference between the two models is evident when the gathered
data is processed to generate the assessment results. OPM3 assessors make use of a
tool to process the data gathered and automatically generate a report containing
assessment results, while CMMI assessors (following the SCAMPI method) have to
do it manually through meetings between lead assessors and assessment team
members. All things being equal, this might imply that a CMMI assessment takes
more time than an OPM3 assessment.
Finally, the structure of OPM3 shows that it is a maturity model revolving around
making improvements. What is less evident is that OPM3 does not define standardized
maturity levels, whereas CMMI can also be used to determine an organization's maturity
level in a standardized manner. PMMM differs from OPM3 in that it defines distinct
levels of maturity, but it does not require organizations to follow the path to maturity
from level 1 to 5: PMMM allows organizations to assess their progress at all five
maturity levels in a relatively simple way.


4.2. Applicability analysis


Next is the applicability dimension, where each of the selected maturity models is
evaluated using the fourteen criteria defined in Chapter 3. During and after every
evaluation, the findings were discussed with assessors to check their correctness and
completeness. See Appendix G for the detailed evaluation results per maturity model.
The summarized results of the evaluation are described in the next sub-sections,
starting with the maturity reference model criteria. For the sake of space, the
acronyms of the three models are used in the explanations instead of referring to their
maturity reference models or assessment methods.

4.2.1. Maturity reference model


The tables below describe more thoroughly the comparison between each pair of
maturity models for PM for each criterion. The upper right cells specify the similarities
and the lower left cells the differences. Each table is accompanied by a brief
explanation of the most important similarities and differences, and the
corresponding implications. The input for these cells is derived from the detailed
evaluation tables in Appendix G.

After all seven maturity reference model criteria have been discussed separately,
another table is depicted with the summarized results along with a brief explanation.


Table 14: Comparison table – Openness (MM1)

OPM3 vs CMMI
Similarities:
- Organizations have to pay to access assessment materials
- In the case of official assessments, materials of OPM3-PS and CMMI are meant for certified assessors to use; OPM3-F materials can be used unrestrictedly to conduct self-assessments
- Usage of the models is not limited to their owners
Differences:
- Client organizations have to pay for all OPM3 materials while CMMI model information is downloadable

OPM3 vs PMMM
Similarities:
- Model and assessment materials are only available against payment
- Usage of the models is not limited to certain organizations
Differences:
- During maturity assessments, PMMM does not require the intervention of certified assessors while OPM3 does

CMMI vs PMMM
Similarities:
- Organizations have to pay to access assessment materials
- Usage of the models is not limited to certain organizations
Differences:
- Client organizations have to pay for all PMMM materials while some CMMI materials are downloadable
- During maturity assessments, PMMM does not require the intervention of certified assessors while CMMI does

The most important similarity here is that the application of the three models is not
limited to the owners of the models. Materials of all three models are available to the
public, although access can be bound by payment in the case of PMMM and OPM3.
The definition documents of CMMI, on the other hand, are freely downloadable from
the Web. Unlike OPM3 and CMMI, PMMM does not require the intervention of
certified assessors to conduct the assessment. Explanations of this and of the possible
implications have already been given in section 4.1.2.


Table 15: Comparison table – Industry & Size (MM2)

OPM3 vs CMMI
Similarities:
- No industry-related restrictions are posed on the usage
Differences:
- CMMI does not explicitly mention size-related restrictions while OPM3 does

OPM3 vs PMMM
Similarities:
- No industry-related restrictions are posed on the usage
Differences:
- PMMM does not explicitly mention size-related restrictions while OPM3 does

CMMI vs PMMM
Similarities:
- No industry-related restrictions are posed on the usage
- No size-related restrictions are mentioned
Differences:
- (none)

Regarding the scope of the maturity models, none of them places industry-related
restrictions on the application of the model (see Table 15). As for the size of client
organizations, OPM3 is the only one that explicitly states that the model can be
applied to organizations of all sizes. An implication of this is that organizations have
to judge for themselves whether a maturity model is applicable to the size of the
scope they have in mind.

Table 16: Comparison table – Scope (MM3, PM domain)

OPM3 vs CMMI
Similarities:
- The models describe best practices related to project management
Differences:
- PM practices within CMMI are described to achieve goals regarding software development; conversely, OPM3 describes practices to achieve goals regarding PM
- OPM3 describes program and portfolio management practices while CMMI does not

OPM3 vs PMMM
Similarities:
- The models describe best practices related to project management
Differences:
- PMMM only focuses on the maturity of PM in an organization while OPM3 also focuses on program and portfolio management

CMMI vs PMMM
Similarities:
- The models describe best practices related to project management
- The models do not cover practices related to program and portfolio management
Differences:
- PM practices in CMMI are described to achieve goals of software development; practices in PMMM are described to achieve goals related to PM

All three models appear at first glance to cover the project management domain, but it
should be noted that CMMI describes project management practices differently from
the other two models. This is because, as mentioned before, CMMI was developed for
software development purposes and not for PM. The model focuses on the management
of projects in order to achieve software developmental goals. PMMM focuses on the
project management domain only, while OPM3 covers all three PM domains.
An important implication of this is that organizations should consider applying CMMI
when the assessment involves improving project management in the software
development context. For improving program and portfolio management processes,
OPM3 should be selected instead of PMMM.

Table 17: Comparison table – Maturity level description (MM4)

OPM3 vs CMMI
Similarities:
- Process areas and activities are described
- The models do not describe competences that members of an organization should harbor, or results that could be expected when a maturity level is achieved
- Both models verify the existence of process areas and activities by the products resulting from them
Differences:
- OPM3 only describes activities (i.e. practices) that should be in place in an organization; CMMI, on the other hand, also describes process areas, and roles that should be responsible for executing certain activities

OPM3 vs PMMM
Similarities:
- Process areas and activities are described
- The models do not describe the results that organizations should be able to achieve at a certain maturity level
Differences:
- OPM3 does not describe what process areas should be in place or roles that should be responsible for executing certain activities, while PMMM mentions roles in the assessment questionnaire
- PMMM describes competences of people while OPM3 does not
- OPM3 assesses the deliverables that result from activities, but PMMM does not require the assessment of deliverables

CMMI vs PMMM
Similarities:
- Process areas and activities are described by both models
- Roles are described as well
- The models do not describe the results that client organizations should be able to achieve at a certain maturity level
Differences:
- CMMI does not describe competencies that members of an organization should maintain, but PMMM does
- CMMI assesses the deliverables that result from activities, but PMMM does not require the assessment of deliverables

The fourth criterion is the degree to which a PM maturity model describes the
conditions to achieve a maturity level (see Table 17). None of the three models appear
to describe the project results that should be achieved when a maturity level has been
reached. This is not too surprising when taking into account that a lot of factors can
influence the outcome of a project. A project can still fail even if PM is successfully
carried out within an organization [17] and vice versa. So even if a certain maturity
level is achieved, it does not guarantee desired outcomes of projects (or PM). An
implication of this is that organizations should judge by themselves whether outcomes
have improved after the adoption of a maturity model for PM (models do not measure
the value delivered to business stakeholders).

An important difference that is not immediately obvious in the table is the way CMMI
and PMMM describe and employ the notion of process areas. CMMI employs process
areas to indicate which processes within an organization should receive attention at a
particular maturity level in order to achieve certain (software development) goals.
How this differs from PMMM's approach becomes evident when examining the
dimensions of maturity employed by each model (see the next criterion). It will also
become clear why PMMM is the only model that describes the competencies of
people within an organization.

OPM3 and CMMI both require an organization to show evidence (i.e. deliverables) of
the existence of practices, while with PMMM the scores are based entirely on the
answers provided by participants. One important implication of this is the lack of
control over the correctness of the answers given during a PMMM assessment. This
can be compensated, however, by selecting a large assessment sample.


Table 18: Comparison table – Dimensions of maturity (MM5)

OPM3 vs CMMI
Similarities: (none)
Differences:
- OPM3 describes process maturity in terms of best practices while CMMI looks at the capability levels of processes

OPM3 vs PMMM
Similarities: (none)
Differences:
- PMMM assesses different dimensions on each consecutive level; OPM3 does not describe maturity dimensions along which an organization can achieve maturity

CMMI vs PMMM
Similarities: (none)
Differences:
- PMMM assesses different dimensions of PM maturity on each consecutive level; CMMI does not describe maturity dimensions along which an organization can achieve maturity

CMMI describes maturity by the capability levels of processes. OPM3 focuses on the
improvement of processes in terms of best practices. PMMM assesses several
dimensions, but each at a different maturity level. At level 1, the model looks at the
knowledge of people within an organization regarding basic project management
terminology. At level 2, it measures the degree to which processes are made common
throughout an organization. Level 3 is about combining all corporate methodologies
into a singular methodology for conducting project management. At level 4, PMMM
measures whether an organization is aware of the importance of benchmarking and
the degree to which benchmarking is carried out. Finally, level 5 measures the extent
to which an organization improves the singular methodology and its business
processes using the information obtained through benchmarking. Examining these
five maturity levels makes apparent that, unlike OPM3 and CMMI, PMMM is a
model developed to help organizations understand the basic requirements of doing
project management.
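The level-to-dimension mapping described above can be sketched as a simple lookup. The short labels below paraphrase the thesis text; they are not taken verbatim from PMMM's official materials:

```python
# Sketch of the PMMM level-to-dimension mapping described in the text.
# The labels paraphrase the thesis prose and are not official PMMM wording.
PMMM_LEVELS = {
    1: "Common language: knowledge of basic PM terminology",
    2: "Common processes: degree to which processes are shared organization-wide",
    3: "Singular methodology: corporate methodologies combined into one",
    4: "Benchmarking: awareness and degree of benchmarking",
    5: "Continuous improvement: refining the methodology using benchmark data",
}

def assessed_dimension(level: int) -> str:
    """Return the dimension PMMM assesses at the given maturity level."""
    return PMMM_LEVELS[level]
```

The point of the lookup is that, unlike CMMI's single processes dimension, the dimension being measured changes at every PMMM level.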


Table 19: Comparison table – Process areas (MM6)

OPM3 vs CMMI
Similarities: (none)
Differences:
- CMMI defines process areas from a software development perspective; OPM3 does not describe maturity using process areas, but with practices

OPM3 vs PMMM
Similarities: (none)
Differences:
- OPM3 includes process areas of project, program and portfolio management while PMMM focuses on project management alone
- OPM3 uses the process areas defined in the PMBOK to categorize its best practices while PMMM uses them to categorize knowledge areas on the first maturity level

CMMI vs PMMM
Similarities: (none)
Differences:
- CMMI takes a software development perspective while PMMM takes a PM perspective
- CMMI defines process areas for a different purpose than PMMM: process areas in CMMI are groups of processes that can be improved along a maturity dimension, whereas PMMM only uses the PMBOK process areas to indicate different knowledge areas on level one

PMMM uses the project management process areas defined in the PMBOK to
categorize knowledge areas at its first maturity level; the model does not make use of
process areas beyond that. Because OPM3 is aligned with the PMBOK, the process
areas are embedded in its practices, but OPM3 does not use process areas the way
CMMI or PMMM does. The process areas employed by CMMI are different from those
defined in the PMBOK, mainly because CMMI categorizes them from a software
development perspective. Organizations should examine these process areas carefully
before determining which model to use for an assessment.


Table 20: Comparison table – Process area dependencies (MM7)

OPM3 vs CMMI
Similarities:
- Both models acknowledge dependencies
Differences:
- CMMI describes relationships among process areas while OPM3 describes dependencies at a lower level (i.e. among best practices)

OPM3 vs PMMM
Similarities: (none)
Differences:
- OPM3 describes dependencies between best practices and capabilities while PMMM describes dependencies between maturity levels

CMMI vs PMMM
Similarities:
- Both models acknowledge dependencies between maturity levels
Differences:
- PMMM only elaborates dependencies between maturity levels, not process areas; CMMI describes dependencies between process areas and maturity levels

The process area dependencies criterion was developed to examine whether the PM
maturity models employ process areas to categorize all PM processes at each
maturity level, and whether the realization of one process area depends on the
realization of another. Although PMMM, like CMMI, describes dependencies between
maturity levels, these dependencies do not involve process areas. CMMI categorizes
processes into process areas at all maturity levels and describes relationships
between them to indicate their dependencies.
Because OPM3 describes dependencies at a lower level, between components such as
best practices and capabilities, organizations are provided with more guidance when
selecting improvement trajectories. There is, however, also a downside: more
prescriptive descriptions may restrict an organization's choice of improvement
actions.

The above results are summarized in the following table (see Table 21). It is
accompanied by a short explanation of the main similarities and differences between
the maturity models for PM.


Table 21: Summary results of maturity reference model criteria


OPM3 CMMI PMMM
Openness
Free access No Yes No
Paid access Yes Yes Yes
Certified usage Yes Yes No
Proprietary access & usage No No No
Industry & Size
Size of organizations Yes No No
Industry sector Yes Yes Yes
Scope
Project Management Yes Yes Yes
Program Management Yes No No
Portfolio Management Yes No No
Maturity level description
Process area No Yes Yes
Activity Yes Yes Yes
Role No Yes Yes
Competency No No Yes
Deliverable Yes Yes No
Result No No No
Dimensions of maturity
  OPM3: Not employed
  CMMI: Processes
  PMMM: Varies per maturity level
Process areas
  OPM3: Not employed
  CMMI: PM-related process areas: project planning, project monitoring and control, supplier agreement management, integrated project management, risk management, quantitative project management
  PMMM: Scope management, time management, cost management, human resources management, procurement management, quality management, risk management, communications management
Process areas dependencies
  OPM3: Described differently (among best practices and capabilities)
  CMMI: Yes
  PMMM: No

Information about all three models is relatively easy to access, with or without
payment. And none of them limit their usage to their owners or organizations
operating in specific industries. OPM3 is the only maturity model describing practices
of program and portfolio management. CMMI describes project management
practices in a software development context and PMMM describes project
management practices applicable in organizations that want to learn the basics about
conducting project management.
OPM3 only defines and determines maturity in terms of activities (best practices) and
deliverables, while CMMI and PMMM also look at roles of people and process areas.
PMMM is the only model that looks at competencies of the people in an organization
and does not concern itself with the deliverables. Furthermore, none of the maturity
models describe the expected results of having achieved a particular maturity level.
Regarding the dimensions of maturity, only PMMM assesses different dimensions of
an organization at each maturity level. CMMI only looks at the processes dimension
in organizations and OPM3 does not explicitly define stages of maturity or
dimensions in which maturity can be achieved.
Unlike CMMI, OPM3 does not define process areas or dependencies between them.
Instead, it describes dependencies between best practices and the capabilities that
constitute them.

4.2.2. Assessment method


As before, the next seven tables will describe the differences and similarities found
per pair of maturity models for each criterion regarding their assessment methods.
After that, another table is used to summarize the comparison results.

Table 22: Comparison table – Assessment commitment (AP5)

OPM3 vs CMMI
Similarities: (none)
Differences:
- OPM3 does not explicitly state the necessity for higher management commitment

OPM3 vs PMMM
Similarities: (none)
Differences:
- OPM3 does not explicitly state the necessity for higher management commitment

CMMI vs PMMM
Similarities:
- The necessity of higher management commitment is explicitly stated by both models
Differences: (none)

To achieve desired outcomes in any organizational initiative, support from
higher management is one of the most important determinants. Describing the
importance of higher management commitment helps make members of an
organization aware of this; they will be more likely to see the adoption of a maturity
model and process improvement as something important. OPM3 does not describe
this, while PMMM and CMMI do (Table 22). This means that organizations adopting
OPM3 should still keep the importance of higher management commitment in mind,
even though the model does not suggest it explicitly.


Table 23: Comparison table – Competence level (AP1)

OPM3 vs CMMI
Similarities:
- Minimum requirements for selecting a (lead) assessor are described by both models
- Both models do not describe requirements for participants
Differences:
- While CMMI provides minimum requirements, OPM3 lists specific competences and personal attributes that assessors need to possess

OPM3 vs PMMM
Similarities:
- Both models do not describe requirements for participants
Differences:
- Unlike OPM3, PMMM does not describe requirements for assessors or participants

CMMI vs PMMM
Similarities:
- Neither requirements nor restrictions are described for the selection of assessment participants
Differences:
- CMMI describes requirements for selecting assessors while PMMM does not provide descriptions or requirements at all

Looking at Table 23, OPM3 and CMMI both provide criteria for selecting (lead)
assessors, while PMMM does not provide guidelines at all. Such guidelines contribute
to the reliability of an assessment, since they help rule out factors that might affect
its outcome, in this case the competences of the assessors or participants. Without
guidelines, an organization risks selecting participants who do not represent the
scope of the assessment.


Table 24: Comparison table – Assessment method description (AP3)

OPM3 vs CMMI
Similarities:
- The assessment methods of CMMI and OPM3 are both described in terms of process phases, activities, deliverables, roles and dependencies
Differences:
- The assessment method of CMMI is described in an online document; the assessment method of OPM3 is described in the assessor training manuals

OPM3 vs PMMM
Similarities:
- The process phases of the assessment method, deliverables and dependencies are described by both models
Differences:
- The assessment method of PMMM is described for participants, while OPM3's assessment method as described by OPM3-PS is meant for assessors

CMMI vs PMMM
Similarities:
- The process phases of the assessment method, deliverables and dependencies are described by both models
- The information about the assessment method is available online to the public
Differences:
- CMMI's assessment method is meant for assessors to use whereas PMMM's assessment description is written for participants
- Partly because of this, the assessment description of PMMM is less detailed than that of CMMI

Both OPM3 and CMMI describe detailed standard procedures that assessors should
follow when conducting assessments. PMMM also provides these descriptions, but at
a lower level of detail, since they are meant to be used by organizations that want to
undergo an assessment.
Having a detailed description of the assessment method adds to the repeatability of
an assessment, which in turn contributes to the reliability of the outcomes. This is
relevant when maturity scores are compared with each other: differences in maturity
scores are then less attributable to factors other than the differences between the
assessed organizations.


Table 25: Comparison table – Data gathering method (AP4)

OPM3 vs CMMI
Similarities:
- The data gathering methods of both models include conducting interviews and consulting documents
- Both models do not prescribe group discussions as one of the gathering methods
Differences:
- CMMI assessment participants do not have to fill out any questionnaires; OPM3 participants can fill out a questionnaire as part of the assessment preparation

OPM3 vs PMMM
Similarities:
- The assessment methods of both models include the usage of questionnaires that can be filled out by organizations themselves
- Both models do not prescribe group discussions as one of the gathering methods
Differences:
- PMMM's questionnaire is used by organizations to conduct a full assessment while OPM3's questionnaire is used to conduct a self-assessment
- OPM3's assessment involves gathering and verifying assessment data using interviews and documents whereas PMMM's assessment involves gathering data solely by having participants fill out a questionnaire

CMMI vs PMMM
Similarities:
- Both models do not prescribe group discussions as one of the gathering methods
Differences:
- Unlike with PMMM, CMMI assessment participants do not have to fill out any questionnaires
- CMMI's assessment involves gathering and verifying assessment data using interviews and documents whereas PMMM's assessment involves gathering data solely by having participants fill out a questionnaire

Although group discussions can strengthen assessment findings and rule out personal
biases, none of the three assessment methods prescribe this data gathering method.
Both CMMI and OPM3 prescribe interviews and the consultation of documents to
retrieve and verify data (Table 25). The questionnaire provided by OPM3 is only
meant as a self-assessment tool; the data resulting from this self-assessment can then
be used as input (i.e. a trigger) for the more rigorous assessments with external
assessors. PMMM provides a single questionnaire for the assessment and does not
describe other methods to retrieve additional information or to verify it.


Table 26: Comparison table – Length of questionnaire (AP6)

OPM3 vs CMMI
Similarities: (none)
Differences:
- OPM3's questionnaire comprises more questions than CMMI's questionnaire

OPM3 vs PMMM
Similarities: (none)
Differences:
- OPM3's questionnaire comprises more questions than PMMM's questionnaire

CMMI vs PMMM
Similarities: (none)
Differences:
- All questions in PMMM's questionnaire focus on project management across all maturity levels; only a portion of CMMI's questions are focused on project management

The questionnaire of OPM3 contains more questions than those of CMMI and PMMM,
because each maturity level is made up of multiple components and subcomponents
(best practices and capabilities). Each question in the questionnaire signifies either a
best practice or a capability, and since OPM3 embodies about 600 best practices, it is
not surprising that the project management domain alone results in more than 800
questions. Questionnaires used by CMMI assessors contain fewer details, but this also
means that assessors have to rely on their own judgment to create appropriate lists
of questions to ask participants during interviews. The same holds for OPM3
assessors. The difference in the number of questions is also explained by the scope of
the questionnaires used for the evaluation. The 800+ questions of OPM3 cover the
entire project management domain across four process maturity levels. The project
management related questions of CMMI, however, do not represent one particular
maturity level: CMMI's questions are grouped according to the 23 process areas
employed by the model, and the process areas relating to project management are
only a subset of the total. The 183 questions of PMMM constitute the standard
questionnaire for project management over all five maturity levels.
The more questions a maturity model provides, the more aspects are considered,
provided that all questions are relevant to the assessment. However, too many
questions may lengthen an assessment process considerably, since it takes time for
assessors to convert hundreds of probe questions into a relevant list of interview
questions.


Table 27: Comparison table – Supportive assessment tools (AP2)

OPM3 vs CMMI
Similarities:
- Certifications and trainings are provided for both models
Differences:
- Tools for self-assessments and rigorous assessments (to generate results) are provided for OPM3 whereas no such tools are provided for CMMI

OPM3 vs PMMM
Similarities:
- Toolsets to conduct (self-)assessments are provided
Differences:
- No trainings or certification possibilities regarding PMMM are provided while they are provided for OPM3

CMMI vs PMMM
Similarities: (none)
Differences:
- Unlike with PMMM, CMMI provides no tools for self-assessments or for the generation of results

Since both CMMI and OPM3 employ assessors to conduct the assessment, it is not
surprising that trainings and certifications are provided for these two models.
Organizations using PMMM have to consult the IIL or the available literature [25]
instead of trainings in order to understand PMMM.
No electronic tools are provided for CMMI, while assessors of OPM3 and participants
of PMMM can use electronic tools to calculate the maturity score of an organization.
In addition, OPM3 (Foundation) provides means to conduct self-assessments:
organizations can run a quick scan on themselves before deciding whether a
full-blown assessment with external assessors is necessary.


Table 28: Comparison table – Benchmarking (AP7)

OPM3 vs CMMI
Similarities:
- Both models allow registered organizations to request anonymous maturity scores of other assessed organizations for purposes of benchmarking
Differences: (none)

OPM3 vs PMMM
Similarities:
- Both models provide aggregate (anonymous) benchmarking scores at the request of registered organizations
Differences:
- With PMMM, organizations automatically receive benchmarking scores along with the results, while with OPM3 the scores have to be requested after assessments
- OPM3 only provides maturity scores aggregated per industry whereas PMMM also provides scores categorized according to the size of an organization
- Unlike OPM3, PMMM also allows individual participants to compare their scores with aggregated scores per industry and organization size during the assessment

CMMI vs PMMM
Similarities:
- Both models provide aggregate (anonymous) benchmarking scores at the request of registered organizations
Differences:
- CMMI-registered organizations have to request benchmarking scores; with PMMM, organizations automatically receive benchmarking scores along with the results
- CMMI only provides maturity scores aggregated per industry whereas PMMM also provides scores categorized according to the size of an organization
- Unlike CMMI, PMMM also allows individual participants to compare their scores with aggregated scores per industry and organization size during the assessment

During and after a PMMM assessment, organizations can access benchmarking scores
by default. Organizations conducting an OPM3 or CMMI assessment can only request
them after the assessment has ended. In the latter case, organizations receive
anonymous maturity scores aggregated per industry, while PMMM also provides
explanations of the benchmark findings in the elaborated results report. Benchmarking
is important because it allows organizations to compare their maturity scores with
those of similar organizations. There is, however, a downside, since multiple factors
can influence a maturity score: not much information is gained when looking only at
industry-related differences. Organizations have to consider other factors such as size
or region.
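The benchmarking comparison described above can be sketched as a simple lookup against peer aggregates. The benchmark table, function name and figures below are invented for illustration; they are not real benchmark data from any of the models:

```python
# Illustrative benchmarking sketch with invented data (not real
# benchmark figures): an organization compares its own maturity score
# against anonymous aggregates, ideally keyed by more than industry
# alone (here: industry AND organization size).
BENCHMARKS = {
    ("IT services", "large"): 3.1,
    ("IT services", "small"): 2.6,
}

def benchmark_gap(own_score: float, industry: str, size: str) -> float:
    """Gap between the organization's score and its peer-group aggregate."""
    return own_score - BENCHMARKS[(industry, size)]

print(round(benchmark_gap(3.4, "IT services", "large"), 1))  # 0.3
print(round(benchmark_gap(2.0, "IT services", "small"), 1))  # -0.6
```

Keying the table by industry and size illustrates why an industry-only aggregate, as provided by OPM3 and CMMI, conveys less information than PMMM's size-differentiated scores.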
Below is another table to summarize the above criteria results regarding the
assessment methods.


Table 29: Summary results assessment method criteria


OPM3 CMMI PMMM
Assessment commitment No Yes Yes
Competence level
Assessor Yes Yes No
Participant No No No
Assessment method description
Process phase Yes Yes Yes
Activity Yes Yes Yes
Deliverable Yes Yes Yes
Role Yes Yes No
Dependency Yes Yes Yes
Data gathering method
Questionnaires Yes No Yes
Interviews Yes Yes No
Group discussions No No No
Document consultations Yes Yes No

Length of questionnaire (PM domain) 800+ 155-180 183
Supportive assessment tools
(Self-)assessment toolset Yes No Yes
Training Yes Yes No
Certification Yes Yes No
Benchmarking Yes Yes Yes

The assessment methods of all three models do not pose requirements for members of
organizations participating in the assessments. However, OPM3 and CMMI do
provide guidelines for the assessors, since they are the ones responsible for selecting
appropriate participants. As PMMM only employs an online questionnaire to gather
information, there are no requirements described for assessors. The organizations
themselves are responsible for selecting participants during PMMM assessments.
Because of this, PMMM is the only one that does not include roles in its assessment
method description.
Apart from the self-assessment questionnaire that OPM3 additionally provides, both
OPM3 and CMMI use interviews and document consultations to gather data during an
assessment. Both models also offer training sessions and certification possibilities
to help organizations understand the models better.
And finally, maturity results are gathered by all three assessment methods for
benchmarking purposes.

4.3. Usage analysis


Three experts of the list in Appendix A-2 were interviewed using the questionnaires
developed to assess the usage dimension. They were an assessor of CMMI, an
assessor of OPM3 and a member of an organization that had experience with CMMI
before. The remaining experts on the list were consulted by email and informal face-
to-face meetings.

The first half of the questionnaires was developed to gain information about the
interviewee and his/her experiences with the maturity model. The second part, in
which interviewees were asked to score the maturity models, was meant to find out how
the selected maturity models for PM are rated by their users in terms of usability
and satisfaction. These scores are only representative, however, if multiple
assessors or members of assessed organizations are interviewed for the same maturity
model. Since this was not the case, the retrieved scores held little meaning and were
excluded from the analysis and from this thesis.
The relevant findings of the interviews are summarized in Appendix D-2, excluding
the information about the interviewees (and their organizations) and the scores given
to maturity models. Although the retrieved data of this dimension were used to
support the analysis of the previous findings, some additional insight was gained as
well. The informal meetings with experts in particular added important information to
what was already found; this is summarized in the next paragraphs.

About CMMI and CMMI-DEV


Since CMMI for Development was evaluated during this thesis, it is not surprising
that all findings regarding this model are related and limited to the field of Software
Development. However, the idea behind CMMI (and its predecessor CMM) is that the
model can be used to assess all types of processes. This is possible because CMMI
assesses maturity by looking at the capability levels of processes that together
contribute to achieving goals within an organization. Unlike OPM3, which focuses
specifically on PM, CMMI does not concern itself with matters such as knowledge or
abilities.
CMMI allows the usage of other tools (e.g. methodologies or best practices) in order
to assess types of processes other than those related to Software Development. For
instance, if an ICT organization wants to assess its governance processes, it can
consider using CMMI in combination with a tool such as ITIL. ITIL stands for
Information Technology Infrastructure Library and is a collection of best practices to
assist ICT organizations in structuring their governance processes [41]. CMMI for
Development is thus a specialization of CMMI, standardized to specifically assess
processes regarding Software Development.

A vital difference between OPM3 and CMMI


It was mentioned in section 4.1 that an OPM3 assessment is mainly carried out with
the purpose of realizing process improvements within the assessed organization.
CMMI on the other hand also allows organizations to determine their current maturity
and rank themselves onto a standard model without the intention to improve. In the
first part of the framework, this was explained by the structural differences between
the maturity models. The interviews and expert consultations led to a more elaborate
explanation for this, which relates to the method employed by the models to
determine the maturity level of an organization.

According to CMMI, an organization can only achieve a maturity level if all process
areas of that level and of every level below it fulfil the necessary requirements. So
even though an organization has fulfilled all requirements of maturity level 3, it
will still be rated at a lower level if one or more requirements for level 2 have not
been met. The model applies the same logic as working with building blocks: one
cannot place a block on top of a lower one that is incomplete. Because of this strict
distinction between maturity levels, organizations can use CMMI to rank themselves
onto a standard continuum of maturity.
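This building-block logic can be illustrated with a minimal sketch (not part of CMMI itself; the process-area data and the function name are hypothetical): the rated maturity level is the highest level for which all requirements of that level and of every level below it are fulfilled.

```python
# Illustrative sketch of CMMI's staged "building blocks" logic.
# The input is hypothetical: it maps each maturity level (2..5) to the
# 'requirement fulfilled?' flags of that level's process areas.

def staged_maturity_level(satisfied):
    """Return the highest level whose requirements, and those of all
    lower levels, are fully satisfied."""
    level = 1  # every organization starts at level 1 ('initial')
    for candidate in sorted(satisfied):
        if all(satisfied[candidate]):
            level = candidate  # all blocks at this level are in place
        else:
            break              # a lower block is incomplete: stop here
    return level

# An organization meeting every level-3 requirement but missing one
# level-2 requirement is still rated at level 1:
print(staged_maturity_level({2: [True, False, True], 3: [True, True]}))  # -> 1
```

The early `break` is what makes the rating strict: fulfilled requirements at higher levels contribute nothing as long as a lower level remains incomplete.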

OPM3 works with a completely different logic. Since it works with best practices, an
organization scores a point for each best practice that is found to be present.
When determining the organizational maturity at the end of the assessment, no
distinction is made between scores found in the three PM domains (project, program
and portfolio management) or in the four improvement stages (standardize, measure,
control and continuously improve). In other words, all scores are added together. This
is analogous to filling a bucket with water, where the resulting water level indicates
the organizational maturity. An implication of this is that the organizational
maturity score does not provide definite information about what an organization is
doing well or poorly. OPM3 does not explicitly define maturity levels; it provides
best practices that organizations can learn from or adopt to improve their processes.
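The bucket logic can be sketched as follows (an illustrative example, not part of OPM3 itself; the best-practice entries are hypothetical): every best practice found adds to a single aggregate score, regardless of its domain or improvement stage, so the aggregate alone cannot reveal where the weaknesses lie.

```python
# Illustrative sketch of OPM3's additive "bucket" scoring logic.
from collections import Counter

# Hypothetical observations: (PM domain, improvement stage) of each
# best practice found to be present in the assessed organization.
best_practices_found = [
    ("project",   "standardize"),
    ("project",   "measure"),
    ("program",   "standardize"),
    ("portfolio", "control"),
]

# All scores are simply added together, like water filling a bucket:
maturity_score = len(best_practices_found)
print(maturity_score)  # -> 4

# The aggregate score hides which domain or stage is weak; recovering
# that requires breaking the same data down again:
per_domain = Counter(domain for domain, stage in best_practices_found)
print(per_domain["project"])  # -> 2
```

The contrast with the CMMI example is the absence of any gating: no domain or stage acts as a block that must be completed before others may count.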

5. Discussion & conclusions


The purpose of this thesis was to develop a framework to evaluate and compare
different maturity models for PM. It resulted in an evaluation framework that looks at
maturity models for PM from three different perspectives. In order to answer the main
research question of this thesis, four sub-questions had to be answered. These
questions including their answers are provided below.

1. What is a maturity model for project-based management?


According to the literature consulted, a maturity model for PM is a framework
comprising several successive maturity stages/levels that allows an organization to
assess its current ability to conduct PM and determine its steps towards desired
improvements.

2. What constitutes a maturity model for project-based management?


The literature study revealed that a maturity model for PM comprises two
components: a maturity reference model and an assessment method. A maturity
reference model is a description of the requirements that an organization should meet
in order to achieve a desired maturity level embodied by the maturity model. With a
maturity reference model, users of a maturity model will know ‘what’ should be
assessed to determine the maturity of an organization. There are two user groups that
make use of a maturity model: assessors and participants. Assessors are people
equipped with the knowledge to assess an organization to determine its maturity level
and participants are members of organizations being assessed.
And to provide these users with the knowledge about ‘how’ the assessment should be
conducted, a maturity model also describes an assessment method. This description
explains what should be done to assess whether a requirement is fulfilled and whether
an organization has achieved a maturity level.

There are two reasons why these two components should be distinguished when
evaluating and comparing maturity models for PM. First, the two components contain
different characteristics that should be measured differently. At the most basic level,
the assessment method describes a process, while the reference model description has
a more static nature. Second, a maturity model for PM can have more than one
assessment method to apply its reference model. If a distinction is made between the
two components when developing measures for the framework, the components can
be evaluated separately. This may save the effort of having to evaluate an entire
maturity model when it is only the assessment method that is different.

3. What are relevant dimensions to the evaluation of maturity models for project-
based management?
Three dimensions were selected to include in the evaluation framework: structure,
applicability and usage. These three dimensions were used to evaluate three existing
maturity models for PM.

The structure dimension was used to reveal the concepts that underlie each maturity
model and the activities that comprise the corresponding assessment method. The
evaluation showed that each maturity reference model employed a different structure
and different concepts. Together, these concepts reflect the approach that a maturity
model takes to define and determine maturity. And by evaluating the structure of the
assessment method, it became clear what kind of procedure is needed to use a maturity
model. Focusing on the structure alone simplified the process of understanding why a
maturity model is built up in a certain way and what effects this has on its
application. In addition, this dimension was able to show where the differences lay
within the maturity models and what implications these differences have.

Complementary to the structure dimension, the applicability dimension evaluated
specific properties of a maturity model that may affect its capability of being applied
in practice. The broad definition of this dimension allowed a flexible approach to
select relevant properties. The properties that could be evaluated along this dimension
included restrictions posed on their usage, subjects they do or do not describe,
requirements they prescribe and the tools they provide to their users. These are all
useful pieces of information for organizations that are planning to select a maturity
model.
The third and last dimension was meant to evaluate and compare the experiences of
the users of the maturity models, but due to the lack of contacts, user experiences
could not be retrieved for all three models. This dimension is nonetheless essential
because it focuses on how users experience the models in practice rather than on what
is described about the models in theory (see the second dimension). Also, the
retrieved information was useful for the verification and explanation of the findings of
the previous two dimensions.

4. What are the main similarities and differences between maturity models for
project-based management?
One of the main similarities between maturity models for PM lies in the structure of
their assessment methods. Maturity assessments described by these maturity models
always start with a preparation phase, continue with a conduct assessment phase and
end with a finalization or improvement phase, in which respectively a report is
delivered or an improvement plan is developed. Most activities in the preparation and
finalization
phases are similar. The differences between the models are mainly present in the
second phase where activities regarding the data retrieval and results generation are
described. A model such as PMMM gathers information about an organization using
only one questionnaire, while OPM3 and CMMI require the intervention of external
assessors to conduct interviews. Moreover, the results are generated automatically by
a tool during an OPM3 or PMMM assessment while they are generated manually
during a CMMI assessment.
Another similarity is that maturity models for PM are all developed for the same goal:
to help organizations that are adopting the project-based way of working. And it is in
the way of achieving this goal that these maturity models can differ from each other.
For example, a maturity model may be developed to enable an organization to verify
its current position in a standard continuum of maturity (e.g. CMMI) while another
maturity model may help organizations by suggesting improvement initiatives (e.g.
OPM3). On top of that, a maturity model may exist to do both (e.g. PMMM).
Maturity models for PM also differ from each other in the way they define maturity
itself. This has implications for a maturity model’s structure (maturity reference
model) because the definition of maturity determines what aspects of an organization
a model finds important when assessing its maturity. For instance, OPM3 defines
maturity in terms of best practices, and CMMI defines maturity in terms of processes
and whether these processes are effective in helping an organization attain
predetermined business goals. Consequently, assessors will determine the existence of
best practices during an OPM3 assessment or verify the effectiveness of processes
during a CMMI assessment to assess the maturity of an organization.

With the above conclusions, the main research question can be answered next.

What measures are needed to evaluate the similarities and differences between
maturity models for project-based management?
To measure the maturity models along the structure, applicability and usage
dimensions, the techniques of Process-Data Diagram (PDD) modeling, evaluation
criteria and interviews were respectively used.

The construction of PDDs was useful in several ways: it enabled the evaluation of the
assessment method and the reference model separately, while still showing the
connection between the two components. This measure made it possible to
depict the structure of a maturity model in a simple way. The comparison tables that
were developed accordingly made it easy to reveal the position and nature of the
similarities and differences between the maturity models. For instance, it showed that
most of the differences between the assessment methods of OPM3, CMMI and
PMMM lay in their descriptions of how the gathered assessment data should be
processed to generate the results. And the most basic similarity is that all three
assessment methods have a ‘preparation phase’, ‘conduct assessment phase’ and
‘finalize assessment phase’.

The evaluation criteria were configured to evaluate specific properties affecting the
applicability of the maturity models. Of the 14 properties, openness appears to be the
most important property of a maturity model, because it determines whether
information can be retrieved to evaluate the model on the remaining properties of the
list. Properties such as industry & size and scope were useful in that they may
describe limitations to the context in which a maturity model can be applied. Criteria
such as maturity level description, assessment method description, data gathering
method and supportive assessment tools were able to reveal aspects related to the ease
of use of the maturity models. Criteria such as maturity dimension and process areas
are only useful if the evaluated maturity models employ maturity dimensions and
process areas, and if there is enough time to examine the different definitions used by
each maturity model to describe them. Nevertheless, these criteria shed light on the
aspects for which organizations can achieve maturity according to a maturity model,
so they should be included in the framework when possible. Finally, criteria such as
the length of the questionnaire and assessment commitment contributed little to
distinguishing one maturity model from another. These criteria can be excluded from
further usage.

As for the third measure, few interviews were conducted due to the lack of contacts
and the unavailability of people of both user groups for each of the three models.
However, the retrieved data still provided additional insight into the maturity models
that was not found with the previous two measures (section 4.3). Thus, the arguments
for the relevancy of this dimension and measure remain valid. The real value of a
maturity model lies in the eyes of its users, not in the literature or publications.
And interviews are more useful than a survey because they enable the collection of
more in-depth information, such as user experiences.

In the end, the purpose was to find out whether the above measures were suitable for
describing the three dimensions and showing similarities and differences between
maturity models for PM. After applying the framework to the three maturity models,
modeling PDDs and using evaluation criteria proved to be useful measures for the
corresponding dimensions. And despite the small number of formal interviews, the
useful data gathered by speaking with assessors and a user demonstrate the adequacy
of the third measure.

5.1. Framework requirements


At the beginning of this thesis, five requirements were defined to assess the quality
of the evaluation framework as a whole. These requirements are repeated below,
together with the assessment of the framework against them.

Completeness
The framework is comprehensive for the analysis and comparison of maturity models
for PM; however, it is not yet complete because of the few user experiences gathered
for the third dimension. The framework can be considered complete once more interviews
are conducted with members of both user groups for each maturity model being
evaluated. This may happen when this framework is used by the PMI-NL Project
Group to compare maturity models for PM.

Consistency
The framework is consistent since all activities and concepts are consistently defined
and described throughout the framework.

Applicability
This thesis has described the initial application of the framework. The applicability of
the framework will be further determined by the PMI-NL Project Group when using
the framework to compare other maturity models for PM.

Reliability
The reliability of the framework is determined by the way the framework is
constructed and by the sources consulted for information; the framework would not be
reliable if the findings of the three measures contradicted each other. The
correctness of the PDDs, criteria results and interview data has been verified by
experts, and references are provided for every piece of retrieved information. Above
all, the findings of the three measures complement each other well, so the framework
is considered reliable.

Efficiency
Evaluating maturity models for PM using the framework is efficient because it
focuses on the most important aspects of these models to evaluate and compare. Costs
are only incurred when acquiring information about the models. Those wanting to
compare maturity models do not have to search for and purchase literature aimlessly
since the framework already describes what information is needed. License costs can
be avoided since the framework does not enforce the evaluation of models that are not
open to public. Modeling PDDs can be done using Microsoft Visio, which is not
expensive to purchase.
Using the framework might require some effort and time, since evaluating along the
applicability dimension requires thorough desk research on the gathered information
about the models.

Ultimately, the application of the framework is not limited to maturity models for
the PM discipline. The only exceptions are several criteria for the applicability
dimension (e.g. scope, process areas). The remaining criteria may be used to evaluate
maturity models of other disciplines. PDDs can be used to model maturity reference
models and assessment methods regardless of the discipline. The same holds for
interviews with user groups for the usage dimension.

5.2. Suggestions for situational selection


This thesis has shown that there are significant differences between maturity models
for PM. Although the scope of this research does not include examining the possible
'fit' between maturity models for PM and certain types of organizations, some
findings are worth mentioning. In particular, these findings concern the implications
that certain properties of each maturity model have for its usage. The implications
are described per maturity model below as characteristics of organizations that could
opt for it. Each implication is accompanied by the property of the model that
explains it.

Organizations may opt for OPM3 when they:


…want to find out if a rigorous assessment is necessary.
OPM3 provides facilities to conduct self-assessments prior to a rigorous assessment.

…want to improve their project, program or portfolio processes.


This is because OPM3 describes best practices of project, program as well as portfolio
management.

…need assistance in developing a plan for improvement initiatives.


An OPM3 assessment does not necessarily end after delivering the report containing
the assessment results. If needed, OPM3 assessors also provide assistance with the
development of an improvement plan afterwards.

…do not need to rank themselves onto a standard maturity continuum.


OPM3 does not describe strict maturity levels, so organizations cannot state that
they have achieved, for example, maturity level 1 or 2 after an assessment has ended.

Organizations may opt for CMMI (for Development) when they:


…want to assess their project management processes to improve their software
development processes.
CMMI for Development is specifically developed to assess software development
processes. The project management practices in this model are described for purposes
of improving software development processes.

…are certain that a rigorous assessment is what they seek.


SEI does not provide self-assessment tools for CMMI.

…want to rank themselves on a standard maturity continuum.


Because CMMI describes a standard path to maturity and defines strict maturity
levels, organizations can announce that they have achieved a certain CMMI maturity
level after an assessment.

Organizations may opt for PMMM when they:


…seek a quick and simple assessment without intervention of any external assessors.
PMMM only uses an online questionnaire to gather information about an
organization. No intervention of assessors is needed.

…want to adopt the project-based way of working.


PMMM describes an approach to implement project management in five stages (i.e.
maturity levels).

…can decide for themselves whether the results apply to their own specific
situation and are willing to accept results of limited reliability.
Because PMMM only uses an online questionnaire to gather information, there is no
guarantee that the assessment results are aligned with the situation of the
organization. The reliability of the results is also questionable, since the
organizations themselves are responsible for selecting the respondents who fill out
the questionnaire. This is the trade-off for the simplicity and the cost savings this
model provides.

5.3. Recommendations for further research


During the research, an attempt was made to compile a list of relevant criteria. More
research should be done on validating the current list and on examining the usefulness
of other properties that may help distinguish between maturity models. The validity of
the current list can, for example, be determined by asking experts to judge the
importance of the properties elicited by each criterion [42].

Due to the unavailability of the needed resources, models such as the P3M3 and
MINCE2 model could not be included in the evaluation. While the OPM3 is one of
the most prominent maturity models in America based on the PMBOK, the P3M3 is a
well-known model in Europe based on PRINCE2. Had P3M3 been included in the research,
it would have been interesting to see the differences between these two models. And
because MINCE2 is one of the newest and most accessible maturity models developed, it
would have been interesting to evaluate as well. It is now the
task of the PMI-NL Project Group and future researchers to examine these two
maturity models.

Furthermore, additional research could also evaluate different assessment methods of
the same maturity model to find out which method is the most suitable. As with
maturity models themselves, differences between assessment methods may affect their
applicability in an organization.

And above all, further research should focus on the possible ‘fit’ between different
maturity models for PM and different types of organizations, and the effects this ‘fit’
might have on the performance of an organization.

References
[1] Ten Zweege, H.C., De Koning, M.C. & Bons, F. (2006). PPI Project Performance
Improvement. Succesvolle projecten zijn geen toeval. The Hague, The Netherlands: Sdu
Uitgevers.

[2] Kwak, Y.H. & Ibbs, C.W. (2002). Project management process maturity (PM)2 Model
[electronic version]. Journal of Management Engineering, 18(3), 150-155.

[3] Pennypacker, J.S. & Grant, K.P. (2003). Project management maturity: an industry benchmark
[electronic version]. Project Management Journal, 34(1), 4-11.

[4] Fiedler, F.E. (1972). The effects of leadership training and experience: a contingency model
interpretation [electronic version]. Administrative Science Quarterly, 17(4), 453-470.

[5] Wikipedia.org (2007). Fiedler contingency model. Retrieved August 27, 2007 from
Wikipedia.org:
http://en.wikipedia.org/wiki/Fiedler_contingency_model

[6] Lin, W.T. & Shao, B.B.M. (2000). The relationship between user participation and system
success: a simultaneous contingency approach [electronic version]. Information & Management,
37, 283-295.

[7] Beach, L.R. & Mitchell, T.R. (1978). A contingency model for the selection of decision
strategies [electronic version]. The Academy of Management Review, 3(3), 439-449.

[8] Shenhar, A.J. (2001). One size does not fit all projects: exploring classical contingency domains
[electronic version]. Management Science, 47(3), 394-414.

[9] Ginzberg, M.J. (1980). An organizational contingencies view of accounting and information
systems implementation [electronic version]. Accounting, Organization and Society, 5(4), 369-
382.

[10] Association for Information Systems. (2007). Theories used in IS research: contingency theory.
Retrieved August 16, 2007, from the World Wide Web:
http://www.istheory.yorku.ca/contingencytheory.htm

[11] Brinkkemper, S., Saeki, M. & Harmsen, F. (1999). Meta-modelling based assembly techniques
for situational method engineering [electronic version]. Information Systems, 24(3), 209-228.

[12] Project Management Institute [PMI]. (2003). Organizational project management maturity
model (OPM3): knowledge foundation. Newton Square, Pennsylvania USA: PMI.

[13] Project Management Institute [PMI]. (2004). A guide to the project management body of
knowledge (3rd Ed.). Newton Square, Pennsylvania USA: PMI.

[14] Wikipedia.org (2007). “Process management”. Retrieved May 20, 2007, from Wikipedia.org:
http://en.wikipedia.org/wiki/Process_management

[15] Wikipedia.org (2007). “Program management”. Retrieved August 15, 2007, from Wikipedia.org:
http://en.wikipedia.org/wiki/Project_management

[16] Grant, K.P. & Pennypacker, J.S. (2006). Project management maturity: an assessment of project
management capabilities among and between selected industries [electronic version]. IEEE
Transactions on Engineering Management, 53(1), 59-68.

[17] De Wit, A. (1988). Measurement of project success [electronic version]. Project Management
Journal, 6(3), 164-170.

[18] Dvir, D., Raz, T. & Shenhar, A.J. (2003). An empirical analysis of the relationships between
project planning and project success [electronic version]. International Journal of Project
Management, 21, 89-95.

[19] Storm, P.M. & Janssen, R.E. (2004). High performance projects – a speculative model for
measuring and predicting project success. Conference paper submitted to IRNOP VI Project
Research Conference. Retrieved November 11, 2006 from www.ou.nl:
http://www.ou.nl/Docs/Faculteiten/MW/MW%20Working%20Papers/gr%2004-
04%20Storm%20en%20Janssen.pdf

[20] Khang, D.B. & Moe, T.L. (AIT - SOM working paper, August 2006). Success criteria and
factors for international development projects: a lifecycle-based framework [electronic version].

[21] Hyväri, I. (2006). Project management effectiveness in project-oriented business organizations
[electronic version]. International Journal of Project Management, 24, 216-225.

[22] Johnson, J. Boucher, K.D., Connors, K. & Robinson, J. (2001). Project management: the criteria
for success [electronic version]. Software Magazine, Feb 2001
http://findarticles.com/p/articles/mi_m0SMG/is_1_21/ai_71562148

[23] Atkinson, R. (1999). Project management: cost, time and quality, two best guesses and a
phenomenon, its time to accept other success criteria [electronic version]. International Journal
of Project Management, 17(6), 337-342.

[24] Clarke, A. (1999). A practical use of key success factors to improve the effectiveness of project
management [electronic version]. International Journal of Project Management, 17(3), 139-
145.

[25] Kerzner, H. (2005). Using the project management maturity model: strategic planning for
project management (2nd Ed.). New Jersey, USA: John Wiley & Sons.

[26] Crawford, J.K. (2002). Project management maturity model: providing a proven path to project
management excellence. Basel, Switzerland: Marcel Dekker, Inc.

[27] Murray, A. & Ward, M. (2006). Capability maturity models – using P3M3 to improve
performance. Outperform, UK. Retrieved 28 November 2006 from outperform.co.uk:
http://www.outperform.co.uk/Portals/0/P3M3%20Performance%20Improvement%201v2-
APP.pdf

[28] Software Engineering Institute [SEI] (August, 2006). CMMI® for development, version 1.2.
Improving processes for better products. Retrieved March 3, 2007 from sei.cmu.edu (searched
with “CMMI for development”):
http://www.sei.cmu.edu/publications/

[29] Koomen, T. & Pol, M. (1998). Improvement of the test process using TPI. Retrieved May 21,
2007 from sogeti.nl:
http://www.sogeti.nl/images/summary%20TPI%20UK%20v1%2E2_tcm6-32137.pdf

[30] Earthy, J. (1999). Usability maturity model: processes. Public document. Retrieved May 21,
2007 from the World Wide Web:
http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/Usability-Maturity-
Model%5B2%5D.PDF

[31] Cooke-Davies, T.J. (2004). Measurement of organizational maturity: what are the relevant
questions about maturity and metrics for a project-based organization to ask, and what do these
imply for project management research? In D.P. Slevin, D.I. Cleland & J.K. Pinto (Eds.),
Innovations: project management research 2004 (pp. 211-228). Newton Square, Pennsylvania
USA: PMI.

[32] Office of Government Commerce [OGC]. (2006). Portfolio, programme & project management
maturity model (P3M3). Crown copyright. Retrieved 28 November 2006 from the World Wide
Web:
http://www.ogc.gov.uk/documents/p3m3.pdf

[33] MINCE2 Foundation (2007). Maturity INcrements IN Controlled Environments (MINCE2).
Retrieved May 20, 2007 from the World Wide Web:
http://www.mince2.org/

[34] International Institute for Learning [IIL]. (2007, June 14). Kerzner project management maturity
model online assessment. Retrieved June 14, 2007, from the World Wide Web:
http://www.iil.com/pm/kpmmm/

[35] Schlichter, J. (2000). Organizational project management maturity model program plan.
Retrieved February 26, 2007, from the World Wide Web:
https://committees.standards.org.au/COMMITTEES/IT-030/Z0005/IT-030-Z0005.PDF

[36] DNV (2007). OPM3 standard knowledge course. Retrieved June 6, 2007 from dnv.nl:
http://www.dnv.nl/dnvtraining/categorie/project_management/OPM3_standard_knowledge_course.asp

[37] Software Engineering Institute [SEI] (August, 2006). Standard CMMI® Assessment Method for
Process Improvement (SCAMPIsm) A, Version 1.2: Method definition document. Retrieved
March 3, 2007 from sei.cmu.edu (searched with “SCAMPI”):
http://www.sei.cmu.edu/publications/

[38] Hong, S., Van Den Goor, G. & Brinkkemper, S. (1993). A formal approach to the comparison of
object-oriented analysis and design methodologies [electronic version]. Proceedings of the
Twenty-Sixth Hawaii International Conference on System Sciences, 4, 689-698.

[39] Weerd, I. van de & Brinkkemper, S. (2007). Meta-modeling for situational analysis and design
methods [electronic version]. To appear in the Handbook of Research on Modern Systems
Analysis and Design Technologies and Applications. Hershey, PA, USA: Idea Group
Publishing.

[40] White, D. & Fortune, J. (2002). Current practice in project management – an empirical study
[electronic version]. International Journal of Project Management, 20, 1-11.

[41] Wikipedia.org (2007). “ITIL”. Retrieved September 10, 2007 from Wikipedia.org:
http://nl.wikipedia.org/wiki/Information_Technology_Infrastructure_Library

[42] Dalkey, N. & Helmer, O. (1963). An experimental application of the Delphi method to the use
of experts [electronic version]. Management Science, 9(3), 458-467.

[43] Project Management Institute [PMI] (2006). OPM3 ProductSuite Assessor Training. PMI Inc.

[44] Project Management Institute [PMI] (2007). OPM3 Consultant Training Program. PMI Inc.

[45] International Institute for Learning [IIL] (2007). Kerzner project management maturity
assessment: level 1-5 assessment report for ABC company [sample report]. Provided June 13,
2007 by C. Damiba, IIL NY.


[46] Project Management Institute [PMI] (2007). Organizational project management maturity model
(OPM3). Retrieved April 15, 2007, from the World Wide Web:
http://opm3online.pmi.org/Default.aspx

[47] Project Management Institute [PMI] (2004). An executive’s guide to OPM3. Retrieved June 5,
2007 from pmi.org:
http://opm3online.pmi.org/demo/downloads/PP_OPM3ExecGuide.pdf

[48] Software Engineering Institute [SEI] (2007). SEI’s official CMMI website: publications.
Retrieved April 14, 2007 from the World Wide Web:
http://www.sei.cmu.edu/publications/

[49] Software Engineering Institute [SEI] (August, 2006). Appraisal requirements for CMMI®,
Version 1.2. Retrieved March 3, 2007 from sei.cmu.edu (searched with “appraisal
requirements”):
http://www.sei.cmu.edu/publications/


APPENDIX

Appendix A-1 – PMI Project Group members and roles

Name (role within project group) | Organization of employment (role)

John Verstrepen (Sponsor) | PMI – NL (General Manager)
Chris ten Zweege (Chairman) | Capgemini Netherlands (Cluster Manager PPI Consulting)
Winnie Weintré (Member) | PRI Management (Project and Interim Manager)
Remco Meisner (Member) | Andarr (Senior Project, Program and Interim Manager)
Carl Splinter (Member) | Stork (n/a)
Jeroen l’Ecluse (Member) | ABN Amro (Vice President Project Management Office)
Jan van Galen (Member) | Van Aetsveld (Change and Project Manager)
Tjie-Jau Man (Member) | University Utrecht / Capgemini (Student on work placement)


Appendix A-2 – Consulted experts

Maturity model name | Expert name (role in relation to maturity model, organization of employment)

OPM3
Chris ten Zweege (OPM3 assessor, Capgemini)
Lex van der Helm (OPM3 assessor, Capgemini)
Raimond Wets (OPM3 assessor, Capgemini)

CMMI
Ben Kooistra (CMMI assessor/IT architect, Capgemini)
Gerhard Lutjenhuis (CMMI lead assessor/PM auditor, Capgemini)
Ahmet Ersahin (senior consultant, Capgemini)
Bert den Ouden (manager change management Schade & Inkomen, ING)

PMMM
Christian Damiba (director Methodology and Assessment Solutions, IIL)


Appendix B – Process-Data Diagram Modeling


(derived from [39])

The meta-process model


The meta-process model is an adapted version of the UML activity diagram, where the
activities and the transitions between them are depicted graphically. Activities are shown
as rectangles with rounded corners and can contain a collection of (sub-)activities
where applicable. Transitions are depicted as arrows and indicate the progress
from one activity to another.

Activities
Activities can be subdivided into standard activities and composite activities. The
latter can, in turn, be subdivided into ‘open’ and ‘closed’ activities. An ‘open’
activity is a composite activity whose (sub-)activities are elaborated, and a
‘closed’ activity is one whose (sub-)activities are not shown. To increase the
readability and understandability of the PDDs, the framework will only use the
notation of standard activities and open activities when modeling the PM maturity
models. Also, although they are officially called activities, the framework will from
here on use the term ‘processes’ for all open activities and ‘activities’ for (standard)
sub-activities (see Figure 1 below).

Figure 1: Activity types
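The activity types above can also be mirrored in a small object model. The following Python sketch is purely illustrative (the class names are ours, not part of the PDD notation): a composite activity holds a collection of sub-activities, and 'open' and 'closed' are specializations that only differ in whether those sub-activities are shown.

```python
class Activity:
    """Base for all activities in the meta-process model."""

class StandardActivity(Activity):
    """An activity without sub-activities."""

class CompositeActivity(Activity):
    """An activity containing a collection of (sub-)activities."""
    def __init__(self, sub_activities):
        self.sub_activities = list(sub_activities)

class OpenActivity(CompositeActivity):
    """Composite activity whose sub-activities are elaborated (shown)."""

class ClosedActivity(CompositeActivity):
    """Composite activity whose sub-activities are not shown."""

# An open activity ('process') with two standard sub-activities.
assess = OpenActivity([StandardActivity(), StandardActivity()])
```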

Transitions
There are officially four types of transitions: unordered, sequential, simultaneous and
conditional. ‘Unordered’ means that activities of a process can be carried out in an
arbitrarily chosen order. This is illustrated by the absence of transition arrows between
the activities. On the other hand, sequential activities are obliged to follow a
predefined order. The arrows that connect these activities indicate the order in which
they should be carried out (see Figure 2).

Figure 2: Transition types

The concurrent property of simultaneous activities is depicted using two thick
black lines, respectively called the ‘fork’ and the ‘join’. The ‘fork’ initiates the
simultaneous flow, in which separate activities are carried out at the same time, and the
‘join’ closes this flow by uniting all separate transitions into one single transition. It is
important to note that the following activity can only be initiated once all
simultaneous activities between the ‘fork’ and ‘join’ have been completed.
Lastly, the conditional flow is shown as a diamond (a square tilted on its corner) that splits
an incoming transition into two or more conditional transitions. A conditional transition is
accompanied by a condition written between brackets (see Figure 2). This condition
has to be formulated in such a way that it can be answered with ‘true’ or ‘false’.
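A conditional transition is, in effect, a branch on a true/false condition. As a purely illustrative sketch (the condition and activity names are invented for the example, echoing the kind of choice an assessment process might make):

```python
def next_activity(pursue_rigorous_assessment: bool) -> str:
    """Follow a conditional transition: the bracketed condition must be
    answerable with true or false, and each outcome leads to a different
    activity (names here are hypothetical)."""
    if pursue_rigorous_assessment:
        return "Perform assessment"
    return "Stop after self-assessment"
```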

The meta-data model


The meta-data model is comparable to the UML class diagram. An obvious difference
is that a meta-data model uses ‘concepts’ instead of ‘classes’. Just like activities,
concepts can be subdivided into ‘standard’ and ‘composite’ concepts, with the latter
divisible into ‘open’ and ‘closed’ concepts. And, as before, the framework
will only use the notation of standard and open concepts when constructing a
meta-data model.

Concepts are depicted as rectangles and written in capitals using singular nouns
within the meta-model as well as outside the model. In order to distinguish the
deliverables of activities in the meta-process model from the concepts underlying a
PM maturity model, the latter will be depicted using gray boxes (see Figure 3).


Figure 3: Concept types

Concepts are connected by relationships. The evaluation framework employs three
types of relationships: generalization, association and aggregation.

Relationships
Generalization is a type of relationship between a general concept and a more specific
concept. This type of relationship is illustrated with a directed line with an open
arrowhead pointing at the general concept. A generalization is also called an ‘is-a’
relationship. The generalization relationship shown in Figure 4 below can be read as
“a survey/interview is a data source”.
An association is a structural relationship that describes how concepts are connected
to each other. It is depicted as a simple undirected line between two or more concepts.
The connection represented by an association is expressed as an active verb together
with a black triangle that indicates the direction in which the connection should be
read. Thus, the association depicted in Figure 4 can be read as “a maturity score
determines a maturity”.
Lastly, an aggregation connotes a relationship between a concept ‘as a whole’ and
other concepts as ‘parts of it’. This relationship is also known as a ‘has-a’
relationship. It is shown as a directed line with an open diamond pointing at the
concept ‘as a whole’, which is also an ‘open’ concept. In the example below, the
aggregation relationship can be read as “a plan has an action and (has a) schedule”.

Figure 4: Relationship types
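The three relationship types map naturally onto object-oriented constructs. The following Python sketch is illustrative only, reusing the examples from Figure 4: generalization becomes inheritance, aggregation becomes a ‘whole’ object holding its parts, and the association’s active verb becomes a method.

```python
class DataSource:
    """General concept."""

class SurveyInterview(DataSource):
    """Generalization ('is-a'): a survey/interview is a data source."""

class Action:
    """Part concept."""

class Schedule:
    """Part concept."""

class Plan:
    """Aggregation ('has-a'): a plan has an action and (has a) schedule."""
    def __init__(self, action, schedule):
        self.action = action
        self.schedule = schedule

class Maturity:
    """Concept at the target end of the association."""

class MaturityScore:
    """Association: a maturity score determines a maturity."""
    def __init__(self, value):
        self.value = value

    def determines(self):
        # The active verb of the association is rendered as a method name.
        return Maturity()
```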


Multiplicity
Besides a name and direction, there is another property of an association worth
mentioning: multiplicity.

Table 30: Forms of multiplicity


Form Meaning
1 Exactly one
0..1 Zero or one
0..* Zero or more
1..* One or more
5 Exactly five

This property indicates how many instances of a certain concept have a connection
with an instance of another concept. Multiplicity is expressed numerically at each end
of an association. The different forms of multiplicity are shown in Table 30. In
Figure 5, the association example of Figure 4 is repeated, but this time with
expressions of multiplicity. According to this example, one or more instances of
maturity score determine precisely one instance of maturity, provided that one is
assessing a single organization.

Figure 5: Example association with multiplicity
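The 1..* multiplicity of this example can also be made explicit in code. The sketch below is illustrative only (the class names are ours); the check in `__post_init__` enforces that a maturity is determined by at least one maturity score.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class MaturityScore:
    value: float


@dataclass
class Maturity:
    # Multiplicity 1..*: a maturity is determined by one or more scores.
    scores: List[MaturityScore]

    def __post_init__(self):
        if len(self.scores) < 1:
            raise ValueError("multiplicity 1..* requires at least one MaturityScore")
```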

The process-data diagram


The combination of a meta-process and meta-data model creates a Process-Data
Diagram (PDD) in which the relationships are shown between the processes and
activities of an assessment process, and the deliverables resulting from each activity.
These deliverables are then connected to the concepts underlying a PM maturity
model, which are also depicted in the PDD.
The connection between an activity and a deliverable is expressed using a directed,
dashed line with its arrowhead pointing to the deliverable. An example of a PDD is
shown in Figure 6.


[Figure: the open process ‘Assess maturity’ contains the activities ‘Determine maturity
score’ and ‘Determine maturity’; their deliverables, MATURITY SCORE and MATURITY, are
connected by the association ‘determines’.]
Figure 6: Example Process-Data Diagram


Appendix D-1 – Assessor & User sample questionnaires

Questionnaire user
About the user
1. How long have you been working with the maturity model?

About the maturity model & assessment method

2. When did your organization start applying the model?
3. What was the reason for doing so?
4. What were the expectations/objectives for applying the model?
5. Can you describe how the assessment proceeded?
a. Organizational levels involved
b. Number of people involved
c. Points of attention during the assessment
6. What does your organization’s management think of the model? Do they firmly
support it? Do they actively support the initiative?
7. Does your organization use the assessment to enable structural improvements?
8. What did your organization do with the results after the assessment? What could
be done with the results obtained?
9. Has your organization carried out repeated assessments since its first assessment
to track the progress of improvements?
10. Finally, to summarize, I would like you to score the model on a number of
dimensions (grade 1-10).
d. Perceived usability of the model/assessment
i. Understandability & learnability
ii. Time and effort needed to prepare an assessment
iii. Applicability (time and effort needed to carry out an assessment)
e. Perceived usefulness/reliability of the results
i. Representation of reality
ii. Understandability
iii. Applicability
f. Perceived satisfaction
i. Satisfaction with the model


Questionnaire assessor
About the assessor
1. How long have you been working with the maturity model?
2. Since when have you been a certified assessor for the model?
3. How often have you been involved in an assessment since your certification?
4. In which industries have you applied the assessment?
5. At which types and sizes of organizations have you applied the assessment?
6. Have you carried out repeated assessments at the same organizations?

About the maturity model & assessment method

7. Is there a standard procedure for an assessment?
8. Can you describe your experiences with the model?
a. What are the strengths of the model?
b. What are the weaknesses of the model?
c. What are the limitations of the model compared to other
maturity models?
9. Finally, to summarize, I would like you to score the model on a number of
dimensions (grade 1-10).
a. Perceived usability of the model/assessment
i. Understandability & learnability
ii. Time and effort needed to prepare an assessment
iii. Applicability (time and effort needed to carry out an assessment)
b. Perceived usefulness/reliability of the results
i. Representation of reality
ii. Understandability
iii. Applicability
c. Perceived satisfaction
i. Satisfaction with the model


Appendix D-2 – Summary interview findings

CMMI
What was the reason for applying the model and what were the
expectations/objectives?
- to improve the controllability of processes

Can you describe how the assessment proceeded?

Commitment at all levels of the organization being assessed was important, and
everyone had to be motivated. A success factor here was being able to point out pain
points with the help of the model and subsequently demonstrating, within a short
period, that a process ran better once it was carried out differently.
Other advantages were the flat structure of the organization and its size. In a group
of fewer than 100 people it is easier to find out whether people work according to the
same guidelines than in an organization of more than 500 people spread over
different locations within or outside the same building.

What does management think of the model? Do they firmly support it? Do they
actively support the initiative?
Management is convinced of the benefits and supports the initiative by explaining to
everyone in the organization during the kick-off what the intention is and what
purpose it serves. Management commitment is measured by looking at who shows up
at the kick-off, the assessment and the coaching sessions. Consistency is also very
important: if you demand that a process must be present and it is missing, you must
not accept that. And when you get the results of the assessment back, you must also
act on them.

What could be done with the results obtained?

The report contains improvement points, which can be prioritized by the importance
of the recommendations and the input for the recommendations. We looked at whether
we could start on them quickly and whether we could benefit from them quickly. Such
a report clearly shows which processes still need to be improved and which already
meet the requirements of the CMMI model.

Is there a standard procedure for an assessment?

Yes. The SEI has defined a standard method for the appraisal. But that is not the only
method for applying CMMI. There are other accredited organizations that can also
carry out CMMI appraisals, and if their methods meet the requirements that the SEI
sets for a CMMI appraisal, these methods are also recognized by the SEI.
The SEI does not provide questionnaires for self-assessments. In most cases these
questionnaires are drawn up by advisory and consultancy firms. Organizations can
then use these questionnaires to carry out a self-assessment. An advantage of this is
that it contributes to awareness within an organization. It lets you check whether the
employees have the right attitude and sufficient knowledge of the processes they
carry out. A pitfall is that internal assessors cannot carry out a proper assessment of
their own organization, because they are often unable to look at their own practices
and processes from a distance.

What are the strengths/weaknesses of the model?

Completeness. The model offers handles for bringing an organization to a certain
level. It covers all activities related to software development and project management
in the same context. The model teaches you to look at improvement points in
processes in an independent and detached manner.

A weak point of CMMI is that the model is somewhat too theoretical in describing the
activities. The descriptions are sometimes so generic that one wonders what exactly
the model means by them. The practices are sometimes described insufficiently
concretely. CMMI does say what you must do, but how you must do it is left open. A
disadvantage of this is that an organization has to invest a lot of time and effort to
determine the ‘how’ and to act on it. On the other hand, there are also disadvantages
as soon as you impose a model on the employees of an organization: this creates
resistance. The fact that CMMI leaves the ‘how’ open makes the model harder to
apply in a large organization than in a small one. In small organizations all employees
can contribute to defining processes in accordance with CMMI. This is feasible. But
with a group of a few hundred people it is more difficult. In that case a model that
describes the ‘what’, ‘who’, ‘how’ and ‘why’ would be useful.

The added value of an assessment and process improvement is that people are pointed
to the problem areas. People are made aware of the fact that measurements are being
taken, and that motivates them to pay extra attention to bottlenecks. An advantage of
an assessment is that you focus all attention on the most important links within a
process without having to do very much for it.

OPM3
Is there a standard procedure for an assessment?
During the OPM3 training for assessors, a standard method is described that must be
applied during an assessment.

What are the strengths/weaknesses of the model?

The model covers not only project management practices, but also program and
portfolio practices. The model provides a tool for generating results, which makes it
easy to generate and trace scores. All essential points come forward in the report.
Another strong point is that OPM3 looks not only at the activities themselves, but also
at the organization. OPM3 distinguishes a list of best practices that specifically assess
whether an organization creates enough support for introducing improvements and
changes to its project-based way of working.

A weak point of OPM3 is that it is strongly based on the PMBOK, which means that
an organization that works according to another project management method, such as
PRINCE2, has more difficulty translating the recommendations in the report into
something it can use.


Other remarks
OPM3 goes further than merely delivering a report after an assessment. Employees of
an organization who want to be trained as OPM3 assessors can also choose to follow
a course to become an OPM3 consultant. Most OPM3 assessors are also trained as
OPM3 consultants, so that, after delivering the assessment results, they can support
an organization in drawing up a plan of approach for improvement programs.


Appendix E – Process-Data Diagrams

OPM3

[Figure: Process-Data Diagram of the OPM3 assessment method, covering the processes
‘1. Prepare assessment’, ‘2. Perform assessment’ and ‘3. Plan for improvement’, their
deliverables (among others SELF-ASSESSMENT REPORT, ASSESSMENT PLAN, PRELIMINARY
ASSESSMENT FINDINGS, ASSESSMENT RESULTS, FINAL REPORT and IMPROVEMENT PLAN)
and the underlying OPM3 concepts (among others BEST PRACTICES DIRECTORY,
CAPABILITIES DIRECTORY, IMPROVEMENT PLANNING DIRECTORY, BEST PRACTICE,
CAPABILITY, OUTCOME and their scores); the diagram is not reproducible in plain text.]

CMMI

[Figure: Process-Data Diagram of the CMMI appraisal method, covering the processes
‘1. Plan and prepare for appraisal’, ‘2. Conduct appraisal’ and ‘3. Report results’, their
deliverables (among others APPRAISAL PLAN, DATA COLLECTION PLAN, OBJECTIVE
EVIDENCE, APPRAISAL RESULTS, FINAL FINDINGS REPORT and APPRAISAL RECORD)
and the underlying CMMI concepts (among others PRACTICE, GOAL, PROCESS AREA,
their ratings and the MATURITY LEVEL); the diagram is not reproducible in plain text.]

PMMM

[Figure: Process-Data Diagram of the PMMM assessment method, covering the processes
‘1. Initiation’, ‘2. Assessment’ and ‘3. Assessment report development & delivery’, their
deliverables (among others ASSESSMENT PLAN, ASSESSMENT SAMPLE, INDIVIDUAL
ASSESSMENT RESULTS, ASSESSMENT RESULTS SUMMARY and ELABORATE ASSESSMENT
REPORT) and the underlying PMMM concepts (among others MATURITY LEVEL,
ASSESSMENT INSTRUMENT, BENCHMARKING DATABASE and the various scores); the
diagram is not reproducible in plain text.]

Appendix F – Concept and activity tables

OPM3
Note: Two of the sources for the definitions of the PDD activities and concepts are official OPM3 training
materials. These sources are subdivided into modules (chapters), each with its own separate page numbering.
When referring to particular pages, a capital M will be used to indicate the module number, followed by the
actual page numbers.

Assessment process concept | Description
ASSESSMENT The needs of the organization and the reason for the assessment. ([43], M6.p.5)
TRIGGER
SELF-ASSESSMENT A review of which best practices in OPM3 are and are not currently demonstrated by
REPORT the organization, and identifying the organization’s general position on a continuum of
organizational project management maturity. ([12], p.9)
ASSESSMENT PLAN A scheme of action containing information about the assessment to be conducted. The
main objective of this document is to make the assessors aware of what they are
required to assess and what time period is available in which to perform the
assessment. ([43], M6.p.13)
ORGANIZATION Information about the organization being assessed, which comprises: name,
DESCRIPTION description, product/service, type and size of projects, organizational unit involved in
the assessment, point of contact and results of previous improvement projects in
place. ([43], M6.pp.13-16)
COMMUNICATION A scheme of action containing information about, among other things, when to provide
PLAN feedback to the organization on an ongoing basis and when to gather the assessment
team together to review the progress. ([43], M6.pp.13-16)
ENGAGEMENT Description of the assessment to be conducted, which comprises information about:
DESCRIPTION the purpose and objectives, scope and criteria, data collection process, data analysis
process, data validation process, and the schedule of activities. ([43], M6.pp.13-16)
TEAM&ROLES Description of the members of the assessment team and what roles and
DESCRIPTION responsibilities are fulfilled by whom. ([43], M6.pp.13-16)
DATA COLLECTION A data collection plan contains information about how to collect the data needed for
PLAN the assessment. This plan contains an interview strategy, an interview sampling
strategy and interview lists. ([43], M8.pp.5-11)
PROTOCOL An assessor question set generated from the OPM3 ProductSuite Assessment Tool.
This protocol can be the foundation for a question set for the scoped assessment.
([43], M8.p.12)
PRELIMINARY These are the data collected after conducting interviews and reviewing records and
ASSESSMENT documents. This data is primarily made up of scores given to key performance
FINDINGS indicators, which are used to measure the degree of existence of the outcomes
defined in OPM3. ([43], M.8.pp.17-19)
RESULTS ANALYSIS Part of the assessment report generated by the OPM3 ProductSuite Assessment Tool
containing descriptions of the analysis steps that resulted in the achievement scores of
the OUTCOMES, CAPABILITIES and BEST PRACTICES within scope. ([43], M9.pp.7-19)
ASSESSMENT Assessment report generated by the OPM3 ProductSuite Assessment Tool and
RESULTS evaluated by the assessors. The results show achievements of outcomes, capabilities
and best practices, and the degree of organizational project management maturity.
([43], M9.pp.7-19)
OPM MATURITY The degree to which an organization practices systematic management of projects,
SCORE programs and portfolios in alignment with the achievement of strategic goals. This
(attribute) degree is given to an organization based on the findings resulting from an
assessment. ([12], p.xiii & p.173)
OUTCOME SCORE A value given to an outcome indicating the degree to which it is achieved. ([43],
M.8.pp.20-21)
CAPABILITY SCORE A value given to a capability indicating the degree of its implementation within an
organization. ([43], M.8.pp.20-21)
BEST PRACTICE A value given to a best practice demonstrating the degree of its existence within an
SCORE organization. ([43], M.8.pp.20-21)
FINAL REPORT A comprehensive presentation of the assessment results. This report reflects the results
of the detailed analysis undertaken by the assessment team, and provides information to
the organization on: the degree of organizational project management maturity, best
practices, outcomes, capabilities, the associated scores of these latter three
components, and the ProductSuite scores per project management domain and
process improvement stage. ([43], M.9.p.22)


MATURITY PROFILE A profile containing information regarding the ‘current’ maturity of an organization. A
basic maturity profile includes:
- achieved BEST PRACTICES
- achieved CAPABILITIES
- tool generated SCORES
([44], M.3.p.8)
IMPROVEMENT The needs of the organization and the reason for choosing to pursue organizational
TRIGGER improvements leading to increased maturity. ([12], p.9)
IMPROVEMENT PLAN An improvement plan provides all of the key information relevant to the purpose and
contents of the improvement project. This plan typically includes descriptions of: the
organization, the purpose and objectives of the improvement, the process used to
achieve the improvement plan, the implementation strategy, list of best practices to be
improved, schedule, risks and constraints and (as a reference) the final report of the
assessment. ([43], M.4.p.17)
PRIORITY A right to precedence given to an improvement initiative ([12], pp.39-40)
INITIATIVE A leading action to implement a capability that leads to the realization of improvements
and increased organizational project management maturity. ([12], pp.39-40)
SCHEDULE A scheme stating the time allocated to improvement initiatives, the order in which they
should be carried out (priority) and the resources needed. ([12], pp.39-40)
FACTOR An element that may influence the prioritization of planned improvement initiatives for
optimum use of resources. ([12], p.39)
ATTAINABILITY A factor expressing the degree to which an improvement initiative is achievable. This
consideration can help the organization demonstrate early success and gain valuable
momentum to sustain the improvement initiative. ([12], p.40)
STRATEGIC PRIORITY A factor describing how essential an improvement initiative is to an organization’s
strategy. ([12], p.40)
BENEFIT A factor expressing the advantage of an improvement initiative. ([12], p.40)
COST A factor indicating the expenditures connected to an improvement initiative. ([12], p.40)
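For illustration, the four FACTORS above (attainability, strategic priority, benefit and cost) could be combined into a single score for ranking improvement INITIATIVES. The weights, the 0-10 factor scale and the formula below are hypothetical assumptions; OPM3 itself does not prescribe any such calculation.

```python
# Illustrative only: combine the four prioritization FACTORS into one score
# per improvement INITIATIVE. Weights and the 0-10 factor scale are
# hypothetical assumptions; OPM3 does not prescribe such a calculation.

def prioritize(initiatives, weights=None):
    """Rank initiatives by a weighted sum of their factor values (0-10)."""
    weights = weights or {
        "attainability": 0.3,       # achievable initiatives build momentum
        "strategic_priority": 0.3,  # essential to the organization's strategy
        "benefit": 0.25,            # expected advantage
        "cost": 0.15,               # expenditures lower the priority
    }

    def score(init):
        return (weights["attainability"] * init["attainability"]
                + weights["strategic_priority"] * init["strategic_priority"]
                + weights["benefit"] * init["benefit"]
                - weights["cost"] * init["cost"])

    return sorted(initiatives, key=score, reverse=True)

ranked = prioritize([
    {"name": "Standardize project charters", "attainability": 9,
     "strategic_priority": 6, "benefit": 5, "cost": 2},
    {"name": "Introduce portfolio reviews", "attainability": 4,
     "strategic_priority": 9, "benefit": 8, "cost": 7},
])
print([i["name"] for i in ranked])  # highest-priority initiative first
```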

Maturity model concept Description

OPM3 A standard developed under the stewardship of the Project Management Institute. The
purpose of this standard is to provide a way for organizations to understand
organizational project management and to measure their maturity against a
comprehensive and broad-based set of organizational project management best
practices. ([12], p.xiii)
FOUNDATION Narrative text, presenting the OPM3 foundational concepts, with various appendices
and a glossary. ([12], p.xiv)
SELF-ASSESSMENT A tool in support of the self-assessment step outlined in the OPM3 standard. ([12], p.9)
TOOL
BEST PRACTICES One of the three directories necessary to assess an organization against OPM3 and
DIRECTORY evaluate the scope and sequence of possible improvements. In the best practices
directory, the names and brief descriptions are provided of nearly 600 best practices.
([12], pp.31-32)
CAPABILITIES One of the three directories necessary to assess an organization against OPM3 and
DIRECTORY evaluate the scope and sequence of possible improvements. The capabilities directory
provides detailed data on all of the capabilities in OPM3, organized according
to the best practices with which they are associated. ([12], p.32)
IMPROVEMENT One of the three directories necessary to assess an organization against OPM3 and
PLANNING DIRECTORY evaluate the scope and sequence of possible improvements. This directory is provided
to show the dependencies between capabilities, which are essential to the assessment
and improvement steps of the OPM3 cycle. Once the organization has identified best
practices requiring assessment, this directory will indicate the capabilities leading to
each of these best practices, along with any additional capabilities on which they may
depend. ([12], p.32)
BEST PRACTICE A best practice is an optimal way currently recognized by industry to achieve a stated
goal or objective. For organizational project management, this includes the ability to
deliver projects successfully, consistently, and predictably to implement organization
strategies. ([12], p.171)
PM DOMAIN A domain refers to each of the three domains of PM: project management, program
management, and portfolio management. ([12], p.172)
PROCESS GROUP Project management processes can be organized into five process groups of one or
more processes each: initiating processes, planning processes, executing processes,
controlling processes and closing processes. Process groups are linked together by
the results they produce: the results or outcome of one becomes an input to another.
([13], pp.27-29)


PROCESS One of the dimensions along which OPM3 defines organizational project management
IMPROVEMENT STAGE maturity in terms of best practices. The stages of process improvement employed
by OPM3 are: standardize, measure, control, and continuously improve. The
sequence implies a prerequisite relationship between the stages, in that the most
advanced stage (continuously improve) is dependent on a state of control, which is, in
turn, dependent on measurement, which is dependent on standardization. ([12], p.6 &
p.19)
ORGANIZATIONAL Organizational enablers are a subset of the OPM3 best practices that relate to the
ENABLER organizational structures and processes necessary to support efficient and effective
implementation and operation of the best practices for the project, program and
portfolio domains. ([43], M.2)
PROCESS BEST See “best practice” definition.
PRACTICE
CAPABILITY A capability is a specific competency that must exist in an organization in order for it to
execute project management processes and deliver project management services and
products. Capabilities are incremental steps leading to one or more best practices.
([12], p.171)
OUTCOME An outcome is the tangible or intangible result of applying a capability. The degree to
which an outcome is achieved is measured by a key performance indicator. ([12],
p.171)
KEY PERFORMANCE A criterion by which an organization can determine, quantitatively or qualitatively,
INDICATOR whether an outcome associated with a capability exists or the degree to which it exists.
([12], p.172)
DEPENDENCY A dependency is a relationship in which a desired state is contingent upon the
achievement of one or more prerequisites. In OPM3, one type of dependency is
represented by the series of capabilities that aggregate to a best practice. Another
type occurs when the existence of one best practice depends, in part, on the existence
of some other best practice. In this case, at least one of the capabilities within the first
best practice depends on the existence of one of the capabilities within the other best
practice. ([12], p.172)
MATURITY DATABASE A repository for maturity profiles of organizations that have conducted rigorous OPM3
assessments. This database is managed by the PMI. (confirmed by certified OPM3
assessor, C. Ten Zweege)
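The relationships among BEST PRACTICE, CAPABILITY, OUTCOME and DEPENDENCY described above can be sketched as a small data model. The classes, field names and example entries below are illustrative assumptions, not part of OPM3 itself; the actual relations are published in the three OPM3 directories.

```python
# Illustrative data model of the OPM3 concepts above (names are hypothetical):
# a BEST PRACTICE aggregates CAPABILITIES; a CAPABILITY may depend on
# capabilities belonging to other best practices; achievement of an OUTCOME
# is assumed to have been confirmed by a KEY PERFORMANCE INDICATOR.
from dataclasses import dataclass, field

@dataclass
class Capability:
    name: str
    achieved: bool = False                          # KPI confirmed the outcome
    depends_on: list = field(default_factory=list)  # prerequisite Capabilities

    def satisfied(self) -> bool:
        # A capability counts only if all of its prerequisites are satisfied too.
        return self.achieved and all(d.satisfied() for d in self.depends_on)

@dataclass
class BestPractice:
    name: str
    capabilities: list

    def achieved(self) -> bool:
        # A best practice is reached when every capability in its aggregation
        # (including dependencies in other best practices) is satisfied.
        return all(c.satisfied() for c in self.capabilities)

policy = Capability("Establish a PM policy", achieved=True)
training = Capability("Train project managers", achieved=True, depends_on=[policy])
bp = BestPractice("Standardize project planning", [policy, training])
print(bp.achieved())  # True: both capabilities and their dependencies hold
```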

Assessment
Sub-activity Description
process activity
Prepare Familiarize with OPM3 Before an organization decides to conduct a self- or rigorous
assessment OPM3 assessment, it has to understand the contents of OPM3 as
thoroughly as possible and become familiar with organizational
project management and with the operation of OPM3.
Perform self-assessment In preparation for a rigorous assessment, BEST PRACTICES that
are and are not currently demonstrated in the organizational unit
are identified. This can be done with OPM3’s SELF-
ASSESSMENT TOOL, but also with a tool developed by the
organization itself or by a consulting company. This activity
results in a SELF-ASSESSMENT REPORT. The analysis of the
SELF-ASSESSMENT REPORT will eventually lead to the
decision whether or not there is a need for an organization to
conduct a rigorous assessment with the assistance of certified
OPM3 assessors.
Perform Determine depth and The scope and criteria of the assessment are determined; these are
assessment breadth of rigorous among the elements described in the ENGAGEMENT
assessment DESCRIPTION.
Acquire & prepare team The ROLES and responsibilities of the assessment TEAM are
specified and communicated to all TEAM members.
Develop assessment The ASSESSMENT PLAN is developed. It is made up of the
plan following parts: ORGANIZATION DESCRIPTION,
COMMUNICATION PLAN, ENGAGEMENT DESCRIPTION and
TEAM & ROLES.
Prepare for data The PROTOCOL for conducting the data collection is generated
collection from the OPM3 ProductSuite Assessment Tool and a DATA
COLLECTION PLAN is developed.
Conduct interviews Interviews are conducted with members of an organization. This
sub-activity results in PRELIMINARY ASSESSMENT FINDINGS
comprising KEY PERFORMANCE INDICATORS.


Perform Study records & Records and documents are examined to collect data. This sub-
assessment documents activity results in PRELIMINARY ASSESSMENT FINDINGS
(cont.) comprising KEY PERFORMANCE INDICATORS.
Verify assessment The PRELIMINARY ASSESSMENT FINDINGS are verified.
findings
Enter findings in rigorous The verified PRELIMINARY ASSESSMENT FINDINGS are
assessment tool entered into the OPM3 ProductSuite Assessment Tool.
Generate & analyze The OPM3 ProductSuite Assessment Tool processes the
results PRELIMINARY ASSESSMENT FINDINGS entered and generates
ASSESSMENT RESULTS. After generation, the ASSESSMENT
RESULTS are analyzed first.
Prepare final report Using the ASSESSMENT RESULTS, a FINAL REPORT is
prepared. Based on the FINAL REPORT, an organization decides
whether the findings result in an IMPROVEMENT TRIGGER.
Plan for Select & prioritize Based on the FINAL REPORT, improvement INITIATIVES are
improvement improvement initiatives given PRIORITIES in the IMPROVEMENT PLAN.
Develop improvement An IMPROVEMENT PLAN is developed based on the analysis of
plan the improvement INITIATIVES that can be taken to realize
improvements in the organization; in which order and when they
should be taken, and how.


CMMI
Note: The term ‘appraisal’ is defined by CMMI as “an examination of one or more processes by a trained team of
professionals using an appraisal reference model as the basis for determining, at a minimum, strengths and
weaknesses”. In CMMI models this term is employed instead of ‘assessment’, which is defined as “an appraisal done
by an organization internally for the purposes of process improvement”. In order to stay true to the definitions of SEI
regarding CMMI and SCAMPI, the term ‘appraisal’ will be used instead of ‘assessment’ here, although both terms
mean the same throughout this report.

Assessment process
Description
concept
REQUIREMENT A piece of information required to plan an appraisal. Examples of requirements include
the objective of the appraisal, constraints, scope and output. ([37], p.I_10)
APPRAISAL PLAN A guide containing all technical and non-technical details of an appraisal. An appraisal
plan includes, among other things, descriptions of the method to be applied during an
appraisal, the resources needed, and the costs and risks to be taken into account ([37],
pp.II_18-II_20).
TEAM A group of individuals qualified, experienced, trained, available and prepared to execute
an appraisal ([37], pp.II_32-II_34).
INITIAL OBJECTIVE A report constructed in preparation for the actual appraisal, containing information
EVIDENCE REVIEW regarding data (un)availability, additional information needed, operations and
processes within an organizational unit and an initial set of objective evidence (see
definition below) ([37], pp.II_48-II_49).
DATA COLLECTION A guide containing details about the procedures for collecting data during an appraisal. A
PLAN data collection plan comprises mainly information about the participants to be consulted,
documents to be reviewed, responsibilities for data collection activities and
presentations/ demonstrations to be provided to the participants ([37], pp.II_59-II_62).
OBJECTIVE EVIDENCE Documents or interview results used as indicators of the implementation or
institutionalization of model practices. Sources of objective evidence can include
instruments, presentations, documents, and interviews ([37], p.III_53).
INTERVIEW A meeting of appraisal team members with appraisal participants for the purpose of
gathering information relative to work processes in place. In SCAMPI, this includes face-
to-face interaction with those implementing or using the processes within the
organizational unit. Interviews are typically held with various groups or individuals, such
as project leaders, managers, and practitioners. A combination of formal and informal
interviews may be held and interview scripts or exploratory questions developed to elicit
the information needed ([37], p.III_52).
DOCUMENT A collection of data, regardless of the medium on which it is recorded, that generally has
permanence and can be read by humans or machines. Documents can be work
products reflecting the implementation of one or more model practices. These
documents typically include work products such as organizational policies, procedures,
and implementation-level work products. Documents may be available in hardcopy,
softcopy, or accessible via hyperlinks in a Web-based environment ([37], p.III_50).
ARTIFACT A tangible form of objective evidence indicative of work being performed that is a direct
or indirect result of implementing a CMMI model practice ([37], p.III_49).
AFFIRMATION An oral or written statement confirming or supporting implementation (or lack of
implementation) of a CMMI model specific practice or generic practice. Affirmations are
usually provided by the implementers of the practice and/or internal or external
customers, but may also include other stakeholders (e.g., managers, suppliers) ([37],
p.III_47).
APPRAISAL RECORD An orderly, documented collection of information that is pertinent to the appraisal and
adds to the understanding and verification of the appraisal findings and ratings
generated ([37], p.III_48). An appraisal record contains, among other things: the appraisal
requirements, the appraisal plan, objective evidence, all appraisal ratings and the final
findings ([37], pp.II_129-II_131).
FINAL FINDINGS The final findings report contains the validated strengths, weaknesses, and ratings (as
REPORT defined by the appraisal plan), reflecting the organizational maturity level for process
areas within the appraisal scope ([37], pp.II_116-II_121).
APPRAISAL RESULTS The results of an appraisal comprise the goal satisfaction ratings, satisfaction ratings of
process areas within the appraisal scope and the derived maturity level rating ([37],
pp.II_106-II_108).
RESULTS ANALYSIS Report describing the analysis steps taken to determine the APPRAISAL RESULTS
([37], pp.II_106-II_108).
IMPROVEMENT A section in the FINAL FINDINGS REPORT containing suggestions for improvements
SUGGESTION based on the APPRAISAL RESULTS found (CMMI appraisal sample report, 2007).
PRACTICE RATING A value assigned by an appraisal team, which indicates the extent to which a practice is
implemented throughout the organizational unit ([37], pp.II_96-II_100).


GOAL RATING The value assigned by an appraisal team to a CMMI goal. The rating is determined by
enacting the defined rating process for the appraisal method being employed. The goal
satisfaction rating is based on the extent of practice implementation throughout the
organizational unit ([37], p.I_34 & p.III_48).
PROCESS AREA The value assigned by an appraisal team to a process area. The process area
RATING satisfaction rating is derived from the set of goal satisfaction judgments ([37], p.II_112
& p.III_48).
MATURITY LEVEL The value assigned by an appraisal team to the maturity level of an organizational unit.
RATING The rating is determined by enacting the defined rating process for the appraisal method
being employed. The rating of maturity levels is driven algorithmically by the goal
satisfaction ratings ([37], p.I_34 & p.III_48).
MATURITY PROFILE A subset of the contents of the appraisal record, as well as other data used by SEI to
aggregate and analyze appraisal performance data for reporting to the community and
monitoring the quality of performed appraisals ([37], p.II_132 & p.III_49).
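The rating concepts above form an aggregation chain: practice ratings inform goal ratings, goal ratings determine process area satisfaction, and the maturity level rating is derived algorithmically from goal satisfaction. The sketch below illustrates only this aggregation direction; the rule "a level is reached when all process areas of that and every lower level are satisfied" is a simplification of the actual SCAMPI rating process, and the example process areas and goal names are hypothetical.

```python
# Simplified sketch of the rating rollup described above. The real SCAMPI
# rating rules are more involved; this shows only the aggregation direction:
# goal ratings -> process area satisfaction -> maturity level rating.

def process_area_satisfied(goal_ratings):
    """A process area is satisfied when all of its goals are satisfied."""
    return all(goal_ratings.values())

def maturity_level(pa_by_level):
    """Highest level whose process areas, and those of all lower levels,
    are satisfied (a simplifying assumption, not the full SCAMPI algorithm)."""
    level = 1  # maturity level 1 ('initial') requires no process areas
    for lvl in sorted(pa_by_level):
        if all(process_area_satisfied(goals)
               for goals in pa_by_level[lvl].values()):
            level = lvl
        else:
            break
    return level

ratings = {
    2: {"Project Planning": {"SG1": True, "GG2": True},
        "Configuration Management": {"SG1": True, "GG2": True}},
    3: {"Risk Management": {"SG1": True, "GG3": False}},
}
print(maturity_level(ratings))  # 2: level 3 is blocked by an unsatisfied goal
```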

Maturity model concept Description


CMMI CMMI stands for Capability Maturity Model Integration. It is a process improvement
maturity model for the development of products and services. It consists of best
practices that address development and maintenance activities covering the product
lifecycle from conception through delivery and maintenance ([28], p.i).
MATURITY LEVEL Degree of process improvement across a predefined set of process areas in which all
goals in the set are attained. ([28], p.543)
DEPENDENCY A relationship between pairs of MATURITY LEVELS in CMMI, where the lower
MATURITY LEVEL has to be achieved first before the higher one can be achieved. For
example, an organization cannot achieve CMMI level 3 if not all conditions of level 2 are
met. ([28], p.39)
PROCESS AREA A cluster of related practices in an area that, when implemented collectively, satisfy a
set of goals considered important for making improvement in that area. All CMMI
process areas are common to both continuous and staged representations ([28], p.548).
GOAL A required CMMI component that can be either a generic goal or a specific goal ([28],
p.541). A required component describes what an organization must achieve to satisfy a
process area. This achievement must be visibly implemented in an organization’s
processes ([28], p.16).
GENERIC GOAL A generic goal describes the characteristics that must be present to institutionalize the
processes that implement a process area ([28], p.541). An example of a generic goal is
“The process is institutionalized as a defined process.” ([28], pp.19-20). In this sense,
achieving a GENERIC GOAL can be understood as equivalent to achieving a maturity level.
SPECIFIC GOAL A required model component that describes the unique characteristics that must be
present to satisfy the process area ([28], p.555). For example, a specific goal from the
Configuration Management process area is “Integrity of baselines is established and
maintained.” ([28], p.19)
PRACTICE An expected CMMI component that can be either a generic practice or a specific
practice. An expected component describes what an organization may implement to
achieve a required component. Expected components guide those who implement
improvements or perform appraisals ([28], p.16). Practices within CMMI are examples of
how activities are generally carried out in practice by organizations. CMMI
acknowledges that there are different ways to achieve GOALS; therefore it does
not force organizations to follow the PRACTICE descriptions blindly. These
PRACTICES are guidelines that can be used to appraise organizations.
GENERIC PRACTICE An expected model component that is considered important in achieving the associated
generic goal. The generic practices associated with a generic goal describe the
activities that are expected to result in achievement of the generic goal ([28], p.541).
For example, a generic practice for the generic goal “The process is institutionalized as
a managed process” is “Provide adequate resources for performing the process,
developing the work products, and providing the services of the process.” ([28], p.21)
SPECIFIC PRACTICE An expected model component that is considered important in achieving the associated
specific goal. The specific practices describe the activities expected to result in
achievement of the specific goals of a process area ([28], p.555). For example, a
specific practice from the Project Monitoring and Control process area is “Monitor
commitments against those identified in the project plan.”
PRACTICE An objective attribute or characteristic used as a “footprint” to verify the conduct of an
IMPLEMENTATION activity or implementation of a CMMI model specific or generic practice. Types of
INDICATOR (PII) practice implementation indicators include artifacts and affirmations ([37], p.III_54).
MATURITY PROFILE An electronic storage environment containing aggregated appraisal results. The SEI
DATABASE provides approved information within the bounds of confidentiality to the community,
based on results from the appraisal data collected. The SEI establishes the format and
mechanisms for the presentation of this information. ([37], p.II_132).


Assessment
Sub-activity Description
process activity
Plan and Analyze requirements Understand the business needs of the organizational unit for which
prepare for the appraisal is being requested. Information is collected to
appraisal determine appraisal REQUIREMENTS.
Develop appraisal plan An appraisal method is tailored and the needed resources are
identified. The costs and schedule associated with the appraisal are
determined. Logistics are planned and managed and risks are
documented and managed. All this information is documented in the
APPRAISAL PLAN.
Select and prepare team The leader and members of the appraisal TEAM are selected and
prepared.
Obtain & inventory initial Obtain information that facilitates site-specific preparation. Obtain
objective evidence data on model practices used. Potential issue areas, gaps, or risks
are identified to aid in refining the APPRAISAL PLAN. This results
in an INITIAL OBJECTIVE EVIDENCE REVIEW.
Prepare for appraisal Specific data-collection STRATEGIES including sources of data,
conduct tools and technologies to be used, and contingencies to manage
risk of insufficient data are planned and documented in the form of a
DATA COLLECTION PLAN.
Conduct Prepare participants Inform appraisal participants of the purpose of the appraisal and
appraisal prepare them for participation.
Examine objective Activities in accordance with the DATA COLLECTION PLAN are
evidence performed. INTERVIEWS and DOCUMENTS are examined as
OBJECTIVE EVIDENCE to collect information about the practices
implemented in the organizational unit.
Document objective Lasting records of the OBJECTIVE EVIDENCE gathered are
evidence created by identifying and then consolidating notes, transforming
the data into records that document practice implementation, as
well as strengths and weaknesses.
Verify objective evidence The implementation of the organizational unit’s practices for each
instantiation is verified. Each implementation of each practice is
verified so it may be compared to appraisal reference model (i.e.
CMMI) practices, and the TEAM characterizes the extent to which
the practices in the model are implemented by means of a
PRACTICE RATING.
Validate preliminary Preliminary findings are validated. Gaps in the implementation of
findings model practices are weaknesses and exemplary implementation of
model practices may be highlighted as strengths in the appraisal
OUTPUTS. Both strengths and weaknesses are validated with
members of the organizational unit.
Generate appraisal APPRAISAL RESULTS are generated based on the validation of
results preliminary appraisal findings. APPRAISAL RESULTS contain
PRACTICE RATINGS, GOAL RATINGS and the MATURITY
LEVEL RATING.
Report results Deliver appraisal results APPRAISAL RESULTS are provided in the form of a FINAL
FINDINGS REPORT to the organizational unit to guide following
actions.
Package and archive Important data and records from the appraisal are preserved in an
appraisal assets APPRAISAL RECORD. Part of this APPRAISAL RECORD is
provided to the CMMI Steward (i.e. SEI) in the form of a MATURITY
PROFILE. Sensitive materials are disposed of in an appropriate
manner.


PMMM
Assessment process
Description
concept
ASSESSMENT PLAN A plan containing information about the reasons for conducting the maturity assessment
(DRIVER), the scope of the assessment (SCOPE) and those who will participate in the
assessment (ASSESSMENT SAMPLE). [34]
DRIVER Purpose of the maturity assessment. [34]
SCOPE Information regarding what part(s) of the organization will be subjected to the maturity
assessment. [34]
ASSESSMENT SAMPLE Group of employees of the organization selected to fill out the questionnaire in the
ONLINE ASSESSMENT TOOL. [34]
INDIVIDUAL Assessment findings based on the INDIVIDUAL SCORES of a participant. [34]
ASSESSMENT
RESULTS
INDIVIDUAL SCORE Scores achieved by each participant after filling out the part of the questionnaire
belonging to each MATURITY LEVEL. This score indicates the extent to which a
participant has achieved a certain MATURITY LEVEL. [34]
INDIVIDUAL SCORE Brief report containing analysis results and improvement possibilities based on the
ANALYSIS INDIVIDUAL SCORES. [34]
BENCHMARK SCORE BENCHMARK SCORES allow comparisons between the INDIVIDUAL SCORE and the
scores of other participants of the same organization, organizations of the same industry
sector and organizations of the same size. [34]
ORGANIZATIONAL Aggregate score calculated (so far) of the organization within the scope of the
SCORE assessment. [34] Note that this ORGANIZATIONAL SCORE is complete when
presented in the ASSESSMENT RESULTS SUMMARY.
INDUSTRY SCORE Aggregate score of organizations in the BENCHMARKING DATABASE operating in the
same industry sector as the organization in scope. [34]
SIZE SCORE Aggregate score of organizations in the BENCHMARKING DATABASE that are of
similar size as the organization in scope. [34]
ASSESSMENT Assessment findings based on the collective scores of all participants in scope after the
RESULTS SUMMARY questionnaire period. [34]
ELABORATE Report in which assessment findings as well as improvement suggestions are
ASSESSMENT REPORT elaborated. The findings are explained more thoroughly in this report than in the
ASSESSMENT RESULTS SUMMARY. [34]
COMPANY Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL
BACKGROUND when requested. It contains information regarding the organization in scope of the
assessment [45].
RESULTS ANALYSIS Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL
when requested. It contains a detailed analysis of the assessment results [45].
SUGGESTED ACTION Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL
when requested. It contains descriptions of the possible actions that could be taken to
realize improvements within the organization [45].
BENCHMARK Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL
COMPARISON when requested. It contains comparisons between the scores achieved by the
organization in scope and organizations of the same size or operating in the same
industry [45].
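The score concepts above relate as simple aggregates: the ORGANIZATIONAL SCORE aggregates the INDIVIDUAL SCORES of all participants, and BENCHMARK COMPARISONS set it against industry- and size-matched scores from the BENCHMARKING DATABASE. The averaging below is a hypothetical sketch; the actual formulas used by the ONLINE ASSESSMENT TOOL are not published in the sources consulted here.

```python
# Hypothetical sketch of how PMMM-style scores could aggregate; the actual
# formulas of the ONLINE ASSESSMENT TOOL are not published in the sources
# consulted for this comparison.
from statistics import mean

def organizational_score(individual_scores):
    """Aggregate INDIVIDUAL SCORES (per MATURITY LEVEL) over all participants."""
    levels = individual_scores[0].keys()
    return {lvl: round(mean(p[lvl] for p in individual_scores), 1)
            for lvl in levels}

def benchmark_comparison(org, benchmark):
    """Difference against an INDUSTRY SCORE or SIZE SCORE from the
    BENCHMARKING DATABASE (positive = above the benchmark)."""
    return {lvl: round(org[lvl] - benchmark[lvl], 1) for lvl in org}

participants = [
    {1: 80.0, 2: 65.0, 3: 40.0},  # one participant's scores per maturity level
    {1: 90.0, 2: 55.0, 3: 50.0},
]
org = organizational_score(participants)
print(org)  # {1: 85.0, 2: 60.0, 3: 45.0}
print(benchmark_comparison(org, {1: 75.0, 2: 62.0, 3: 48.0}))
```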

Maturity model concept Description


PMMM The Kerzner Project Management Maturity Model enables a diagnosis of the health of
project management in your organization. It assists organizations in identifying strategic
strengths and weaknesses and, if the ONLINE ASSESSMENT TOOL is used, creates a
prescriptive action plan for improving the health of your PM efforts. It allows you to
objectively assess your project management capabilities against key knowledge areas
of the PMBOK® Guide. [34]
MATURITY LEVEL Level of the PMMM model, representing a particular degree of maturity in project
management. ([25], p.42)
ROADBLOCK A barrier (belonging to a certain maturity level) preventing an organization from reaching
the next maturity level. ([25], pp.46-47)
RISK A risk can be assigned to each level of the PMMM (low, medium, high). The level of risk
is most frequently associated with the impact of having to change the corporate culture.
(confirmed by Mr. C. Damiba)
ADVANCEMENT Condition that has to be met in order to reach the next maturity level. ([25], pp.46-47)
CRITERIA
ASSESSMENT Tool to help an organization determine its degree of maturity at each maturity level
INSTRUMENT embodied by PMMM. Basically, it is a part of the complete assessment questionnaire of
PMMM. ([25], pp.46-47)


ONLINE ASSESSMENT The online format of the maturity assessment questionnaire. It is fast, automatic and
TOOL easy to use, plus an executive interface monitors results. (confirmed by Mr. C. Damiba)
BENCHMARKING Repository that allows comparisons between the maturity levels of organizations of the
DATABASE same industry or size. (confirmed by Mr. C. Damiba)

Assessment
process Sub-activity Description
activity
Initiation Identify assessment need The DRIVER for carrying out an assessment is identified.

Determine assessment The SCOPE of the assessment is determined.
scope
Sample assessment During this activity, the ASSESSMENT SAMPLE is selected from the
participants organization to which the assessment will be applied. Together with
the DRIVER and SCOPE, this piece of information will be
documented in an ASSESSMENT PLAN.
Assessment Fill in online The participants in the ASSESSMENT SAMPLE are invited to fill in a
questionnaire questionnaire provided by the ONLINE ASSESSMENT TOOL of the
PMMM.
Generate individual After filling out the assessment instrument assigned to a MATURITY
assessment scores LEVEL, each participant can retrieve his or her INDIVIDUAL SCORE
and INDIVIDUAL SCORE ANALYSIS.
Generate benchmark Besides the INDIVIDUAL SCORE and INDIVIDUAL SCORE
scores ANALYSIS, each participant can also retrieve BENCHMARK
SCORES to compare their INDIVIDUAL SCORES to scores of those
who have also taken the assessment.
Generate assessment After all participants of the ASSESSMENT SAMPLE have provided all
results summary information needed to the ONLINE ASSESSMENT TOOL, the
assessment sponsor will be granted access to an interface where the
ASSESSMENT RESULTS SUMMARY is presented.
Assessment Generate & deliver If there is a need for an ELABORATE ASSESSMENT REPORT, the
report elaborate assessment IIL can develop and provide this to the organization where the
development & report assessment took place.
delivery

Appendix G – Criteria results per model

OPM3
Criteria Aspect Value Reference Explanation
Maturity reference model criteria
Openness Free access No [12] 1.1 OPM3 purpose and scope (p.3) Using the purchasable OPM3 Foundation (OPM3-F) provided by OPM3,
Paid access Yes [46] Pricing & Signup any organization can conduct a quick scan on itself. A comprehensive
[47] What does OPM3 look like? assessment with an acknowledged maturity degree has to be conducted
Certified usage Yes
by certified assessors. In this latter case, materials of the OPM3
Proprietary access & No ProductSuite (OPM3-PS) are used.
usage
Industry & Size Size of organization Yes [12] 1.1 OPM3 purpose and scope (p.3) OPM3 does not place applicability constraints regarding organization size
Industry sector Yes or industry.
Scope Project management Yes [12] 1.4 Organizational maturity (pp. 5-6) In OPM3, organizational project management maturity is reflected by the
Program management Yes combinations of best practices achieved within the project, program and
Portfolio management Yes portfolio domains.
Maturity level Process area No [12] Ch3. Best practices (pp.13-20) Within OPM3, activities are described as practices and deliverables as
description Activity Yes [12] Ch4. The organizational project management outcomes. The model contains 600 best practices and all of them are
processes (pp.21-28) provided openly to users. During an assessment, the assessors determine
Role No [12] Ch5. The OPM3 directories (pp.31-34) an organization’s maturity by the implementations of practices. And the
Competency No [12] Appendix F: Best practices directory outcomes (deliverables) are examined to prove that a practice is really
Deliverable Yes [12] Appendix G: Capabilities directory (p.123) implemented.
[12] Appendix H: Improvement planning directory
Result No (p.125)
Dimension of [12] 1.4 Organizational maturity (pp.5-6) OPM3 does not employ specific dimensions along which organizations can
maturity achieve maturity.
Process areas Not employed [12] Appendix I: Program and portfolio In OPM3, best practices are not grouped into process areas. However, the
management process models (pp.127-129) model does use process areas defined in the PMBOK to determine
improvements activities.
Process area Described differently [12] Appendix H: Improvement planning directory OPM3 acknowledges dependencies between capabilities instead of
dependencies process areas. And it is the improvement planning directory that contains
the dependencies between capabilities that aggregate to different best
practices.
Assessment method criteria
Assessment commitment: Not described
    Reference: [12] Executive summary (p.xvi)
    Explanation: OPM3 does not explicitly state the necessity of commitment by higher management. However, it does indicate the importance of communicating frequently with the assessment sponsor and the senior management of the organization.

Competence level
    Assessor: Yes; Participant: No
    References: [43] Module 6. Planning: acquire an assessment team (pp.11-12); [43] Module 8. Execution: preparing for the assessment (pp.5-6)
    Explanation: The training manual describes the specific personal attributes as well as the competences that an OPM3 ProductSuite assessor needs to have. Explicit competences or traits of assessment participants are not provided, but principles for sampling interviewees are described.

Assessment method description
    Process phase: Yes; Activity: Yes; Deliverable: Yes; Role: Yes; Dependency: Yes
    References: [12] 6.3 Steps of the OPM3 cycle (pp.36-46); [43] Module 1. Introduction: assessment process overview (p.4)
    Explanation: The detailed description of the assessment method is only available during official OPM3 training programs. A less elaborate explanation of the assessment cycle is given in the purchasable knowledge foundation book (i.e. [12]).

Data gathering method
    Questionnaires: Yes; Interviews: Yes; Group discussions: No; Document consultations: Yes
    Reference: [43] Module 8. Execution: perform the OPM3 ProductSuite assessment (pp.14-21)
    Explanation: The only questionnaire meant for participants to fill out themselves is the one that organizations can use to execute a quick scan prior to the rigorous assessment; this questionnaire is provided by the OPM3-F. Interviews and document consultations are conducted by certified assessors during the rigorous assessment, during which no questionnaires are handed to members of the organization. The quick scan is nevertheless considered one of the data gathering methods of an OPM3 assessment, because its results trigger the rigorous assessment.

Length of questionnaire
    Project management domain: 800+
    Reference: Confirmed by Mr. Ten Zweege, certified OPM3 assessor.
    Explanation: This number includes all questions of the project management domain across the entire dimension from Standardize to Continuously Improve, as well as the questions related to the organizational structures and processes necessary to support an efficient and effective implementation and operation of the best practices for the project domain.

Supportive assessment tools
    (Self-)assessment toolset: Yes; Training: Yes; Certification: Yes
    References: [12] Executive summary (p.xiv); [43] Module 1. Introduction (p.3); [36] OPM3 standard knowledge course.
    Explanation: OPM3-F provides a self-assessment tool; OPM3-PS provides a professional and extensive tool that facilitates the execution of an assessment and the improvement planning activities. OPM3 provides training programs that teach assessors how to assess and consultants how to plan, develop and produce improvement project plans. Courses are also available to those who want to learn more about OPM3.

Benchmarking: Optional
    References: [43] Module 2. Orientation to OPM3 ProductSuite (p.5); [47] OPM3 benchmarking (p.5)
    Explanation: OPM3 permits benchmarking of OPM3 self-assessment data. It allows users to gain insight into the maturity continuum scores and achieved best practices of peer organizations through average, mean and median reports. OPM3 benchmarking data is available to those organizations that participate in the collection and sharing of the data.
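The criterion/aspect/value structure used throughout these evaluation tables lends itself to a simple machine-readable encoding, which makes side-by-side comparison of the models mechanical. The sketch below is purely illustrative: the values are taken from the Scope rows of the three tables, but the encoding itself (the `evaluations` dictionary and the `compare` helper) is a hypothetical construction, not part of the framework or of any of the maturity models.

```python
# Hypothetical encoding of the comparison tables: criterion -> aspect -> value.
# Only the Scope criterion is shown; the values come from the evaluation tables.
evaluations = {
    "OPM3": {
        "Scope": {"Project management": "Yes",
                  "Program management": "Yes",
                  "Portfolio management": "Yes"},
    },
    "CMMI": {
        "Scope": {"Project management": "Yes",
                  "Program management": "No",
                  "Portfolio management": "No"},
    },
    "PMMM": {
        "Scope": {"Project management": "Yes",
                  "Program management": "No",
                  "Portfolio management": "No"},
    },
}

def compare(criterion, aspect):
    """Return each model's value for a single criterion/aspect pair."""
    return {model: crits[criterion][aspect]
            for model, crits in evaluations.items()}

print(compare("Scope", "Program management"))
# -> {'OPM3': 'Yes', 'CMMI': 'No', 'PMMM': 'No'}
```

Such an encoding would only be a convenience for tabulating the evaluations; the judgments themselves still come from the references cited in each row.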
CMMI
Additional remarks:
- This evaluation is based entirely on the SEI's definition of, and service offerings regarding, the CMMI for Development. It should be kept in mind that there are institutions besides the SEI that are eligible to conduct (official) CMMI assessments.
- It should also be kept in mind that CMMI for Development is a maturity model constructed for software development. CMMI encompasses PM processes for the sake of software development; in CMMI, PM is an aspect of software development and systems engineering.
- CMMI employs two distinct approaches, also known as 'representations', for its assessments: the 'continuous representation' and the 'staged representation'. The continuous representation is a maturity model structure wherein capability levels, rather than maturity levels, provide a recommended order for approaching process improvement within each specified process area; with this approach, no maturity rating is awarded afterwards. With the staged representation, on the other hand, a maturity level is established by attaining the goals of a predefined set of process areas, and each maturity level builds the foundation for subsequent maturity levels. For reasons of time, the continuous approach is not included in the scope of this research.
Criteria Aspect Value Reference Explanation


Maturity reference model criteria
Openness
    Free access: Yes; Paid access: Yes; Certified usage: Yes; Proprietary access & usage: No
    References: [49]; books (SEI CMMI official website); journal publications (Computerworld, IEEE Software, Journal of Systems and Software, Information and Management)
    Explanation: The description of the SEI's CMMI for Development model and the standard assessment method is downloadable from the SEI's official website. The questionnaires used during assessments can only be accessed via the official SEI website by authorized persons (e.g. certified assessors); assessors use these questionnaires to develop interview questions. An official maturity assessment can only be conducted by certified CMMI assessors. Although some materials are readily available online, organizations still have to acquire a license before they can use the SEI's standard questionnaire and conduct CMMI assessments.

Industry & Size
    Size of organization: No; Industry sector: Yes
    Reference: [28] Ch.1 Introduction: The scope of CMMI for Development (p.8)
    Explanation: Application areas include, among others: aerospace, banking, computer hardware, software, defense, automobile manufacturing, and telecommunications. The SEI does not pose any industry-sector-related restrictions on the application of CMMI. Size-related restrictions are not mentioned.

Scope
    Project management: Yes; Program management: No; Portfolio management: No
    Reference: [28] Ch.1 Introduction: The scope of CMMI for Development (p.8)
    Explanation: Although CMMI does cover aspects of PM, it should be kept in mind that the model is designed for software development purposes, not for PM. PM is only considered an aspect that, if managed well, contributes to achieving software development goals.

Maturity level description
    Process area: Yes; Activity: Yes; Role: Yes; Competency: No; Deliverable: Yes; Result: No
    References: [28] Ch.3 Tying it all together: Understanding maturity levels (pp.35-46); [28] Ch.2 Process area components (pp.16-28); [28] Ch.5 Part Two: Generic Goals and Generic Practices and the Process Areas (pp.73-513)
    Explanation: To assess the maturity of an organization, assessors study the process areas, activities, roles and deliverables. When assessing process areas, assessors determine what kind of processes are in place within an organization, not how these processes are organized. The purpose is to examine whether certain processes are in place and, if so, whether they contribute to achieving the goals described in CMMI.

Dimensions of maturity: Processes
    Explanation: CMMI assesses the maturity of processes, so that is the only dimension employed.

Process areas
    Causal analysis and resolution
    Configuration management
    Decision analysis and resolution
    Integrated project management
    Measurement and analysis
    Organizational innovation and deployment
    Organizational process definition
    Organizational process focus
    Organizational process performance
    Organizational training
    Product integration
    Project monitoring and control
    Project planning
    Process and product quality assurance
    Quantitative project management
    Requirements development
    Requirements management
    Risk management
    Supplier agreement management
    Technical solution
    Validation
    Verification
    References: [28] Ch.3 Tying it all together: Process Areas (pp.41-44); [28] Ch.4 Relationships among Process Areas: Project Management (pp.55-58); [28] Ch.5 Part Two: Generic Goals and Generic Practices and the Process Areas (pp.73-513)
    Explanation: CMMI for Development embodies a total of 22 process areas. These process areas are categorized into 4 related groups relevant to the software development process: 'process management', 'project management', 'engineering' and 'support'. Within CMMI for Development, processes like 'human resource management' or 'stakeholder management' apply to all process areas, which is why they are not described as separate process areas.

Process area dependencies: Described
    References: [28] Ch.4 Relationships among Process Areas (pp.51-64); [28] Ch.5 Part Two: Generic Goals and Generic Practices and the Process Areas (pp.73-513)
    Explanation: CMMI for Development describes dependencies between the different maturity levels, and consequently also dependencies between process areas.
Assessment method criteria
Assessment commitment: Described
    Reference: [28] Ch.5 Using CMMI Models: Adopting CMMI (pp.65-66)
    Explanation: Building strong organizational support through senior management sponsorship is a crucial step toward process improvement.

Competence level
    (Lead) assessor: Yes; Participant: No
    References: [37] Executive summary: Time frame and personnel requirements (p.I13); [37] Part II Process definitions: 1.3 Select and prepare team (pp.II32-II39); [49] Ch.4 Requirements for CMMI appraisal methods (pp.7-10)
    Explanation: CMMI describes which criteria are required to qualify the assessment team members and the team leader, but does not elaborate on their contents; there are, however, minimum requirements for each role. The process of selecting the assessment team leader and members is also elaborated.

Assessment method description
    Process phase: Yes; Activity: Yes; Deliverable: Yes; Role: Yes; Dependency: Yes
    References: [37] Executive summary: What is SCAMPI A? (pp.I9-I11); [37] SCAMPI A method overview (pp.I15-I38); [37] Part II Process Definitions (pp.II1-II134)
    Explanation: The standard assessment method for CMMI (SCAMPI A) is officially documented by the SEI.

Data gathering method
    Questionnaires: No; Interviews: Yes; Group discussions: No; Document consultations: Yes
    References: [37] SCAMPI A method overview: Types of objective evidence (p.I22); [37] SCAMPI A method overview: Instruments and tools (pp.I28-I29)

Length of questionnaire
    Project management domain: 155-180
    Reference: [28] Ch.4 Relationships among Process Areas: Project Management (pp.55-58)
    Explanation: The SEI's CMMI for Development definition document shows that the category 'project management' comprises 6 process areas. Each process area is accompanied by a standard questionnaire, so the numbers of questions per process area were added up to obtain a total. This amounts to approximately 30 questions per process area.

Supportive assessment tools
    (Self-)assessment toolset: No; Training: Yes; Certification: Yes
    References: [28] Ch.5 Using CMMI Models: Using CMMI Appraisals (pp.68-69); [28] Ch.5 Using CMMI Models: CMMI-Related Training (pp.70-71); [49] Ch.3 Requirements for CMMI appraisal method class structure (pp.5-6)
    Explanation: The SEI provides information regarding the available CMMI training facilities. The SEI does not provide self-assessment tools for CMMI v1.2; there are, however, other institutions that do provide them.

Benchmarking: Optional
    Reference: [49] Ch.3 Requirements for CMMI appraisal method class structure (pp.5-6)
    Explanation: It is possible to compare an organization's own maturity score with the average maturity score of its industry sector, but only because the SEI registers (by default) the CMM and CMMI levels of officially assessed organizations. This information is not publicly available; it is only accessible to organizations that have taken the official assessment and are registered. Even so, a registered organization cannot retrieve information regarding the identity of other assessed organizations.
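The 155-180 questionnaire-length figure for CMMI's project management domain is a simple aggregate: six process areas in the 'project management' category, at roughly 30 questions each. The sketch below only reproduces that arithmetic; the per-area figure of 30 is the approximation stated above, not an exact count per process area.

```python
# The six process areas of CMMI for Development's 'project management'
# category (per [28] Ch.4), each with a standard questionnaire of
# approximately 30 questions according to the evaluation above.
pm_process_areas = [
    "Project planning",
    "Project monitoring and control",
    "Supplier agreement management",
    "Integrated project management",
    "Risk management",
    "Quantitative project management",
]

APPROX_QUESTIONS_PER_AREA = 30  # approximation; actual counts vary per area

estimate = len(pm_process_areas) * APPROX_QUESTIONS_PER_AREA
print(estimate)  # 180, the upper end of the reported 155-180 range
```

Because the true counts vary slightly per process area, the exact total falls somewhat below this upper bound, which is why the table reports a range rather than a single number.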
PMMM
Criteria Aspect Value Reference Explanation
Maturity reference model criteria
Openness
    Free access: No; Paid access: Yes; Certified usage: No; Proprietary access & usage: No
    Reference: [34] Official IIL PMMM webpage.
    Explanation: PMMM consists of a purchasable book and an online assessment tool, supported by the International Institute for Learning (IIL), Inc. Access to the online assessment tool has to be purchased before organizations can use it to generate reports and recommendations.

Industry & Size
    Size of organization: No; Industry sector: Yes
    Reference: [34] Who should take this assessment?
    Explanation: The online assessment tool is applicable in all industries and can be used by a wide range of users. Its applicability to particular sizes of organizations is not specified.

Scope
    Project management: Yes; Program management: No; Portfolio management: No
    References: [25] Ch.1 An introduction to the PMMM (pp.41-44); [25] Ch.5 Level 5: Continuous improvement (pp.111-138)
    Explanation: Kerzner's PMMM specifically focuses on the 5 levels of project management maturity. Program management and portfolio management are described as areas of development once an organization has reached the level of continuous improvement in project management.

Maturity level description
    Process area: Yes; Activity: Yes; Role: Yes; Competency: Yes; Deliverable: No; Result: No
    Reference: [25] Ch.4-9. An introduction to the Project Management Maturity Model (PMMM) & the 5 maturity levels (pp.41-143)
    Explanation: The PMMM describes project management maturity in different terms at each maturity level. At level 1, the model looks at the knowledge of people within an organization regarding basic project management terminology. At level 2, it stresses the importance of common processes throughout an organization. Level 3 is about combining all corporate methodologies into a singular methodology. Level 4 acknowledges the importance of conducting benchmarking. Finally, level 5 is about continuously improving the singular methodology and the business processes using the information obtained through benchmarking.

Dimensions of maturity: Varies per maturity level
    Reference: [25] Ch.4-9. An introduction to the Project Management Maturity Model (PMMM) & the 5 maturity levels (pp.41-143)
    Explanation: At each maturity level, the PMMM discusses the organizational characteristics, the roadblocks that prevent an organization from attaining the next level, what must be done to reach the next level, and the potential risks. However, the descriptions of these aspects are not used to determine the maturity level of an organization. As mentioned earlier, the PMMM describes different dimensions of PM maturity at each maturity level.

Process areas
    Scope management
    Time management
    Cost management
    Human resources management
    Procurement management
    Quality management
    Risk management
    Communications management
    Reference: [25] Ch.4-9. An introduction to the Project Management Maturity Model (PMMM) & the 5 maturity levels (pp.41-143)
    Explanation: At maturity level 1, where PM knowledge is assessed, the processes are categorized into the process areas defined in the PMBOK. This categorization, however, is not used in the remaining 4 maturity levels, as the model assesses different aspects at each level.

Process area dependencies: Not described
    Reference: [25] Ch.4-9. An introduction to the Project Management Maturity Model (PMMM) & the 5 maturity levels (pp.41-143)
    Explanation: The PMMM does not describe dependencies between process areas, but does describe dependencies and overlaps between the aspects of PM maturity elaborated at each particular maturity level.
Assessment method criteria
Assessment commitment: Described
    Reference: [25] Ch.4 An introduction to the PMMM (p.41)
    Explanation: The PMMM assists organizations in achieving strategic planning for project management, and to make this happen, executive-level involvement is necessary to ensure that any development and implementation process is driven from the top down.

Competence level
    Assessor: No; Participant: No
    Reference: Confirmed by Mr. C. Damiba (see Appendix A-2).
    Explanation: The PMMM does not place restrictions on the suitability of those participating in an assessment. Nor does the model require assessors, as the assessments are carried out by the online assessment tool. However, should an organization desire a comprehensive assessment report, it is constructed by eligible consultants within the IIL.

Assessment method description
    Process phase: Yes; Activity: Yes; Deliverable: Yes; Role: No; Dependency: Yes
    Reference: [34] Demo version of the PMMM online assessment tool.
    Explanation: There is no standard procedure for conducting the assessment; participants of an organization only need to complete the questionnaire provided by the online assessment tool. The standard procedure to be followed by the client organization is explained using a demo version of the online assessment tool on IIL's website. As for dependency, organizations can only retrieve simple or comprehensive assessment reports after working with the online assessment tool.

Data gathering method
    Questionnaires: Yes; Interviews: No; Group discussions: No; Document consultations: No
    Reference: [34] Here's how it works.

Length of questionnaire
    Project management domain: 183
    Reference: [34] Here's how it works.
    Explanation: The PMMM employs only one questionnaire for the assessment.

Supportive assessment tools
    (Self-)assessment toolset: Yes; Training: No; Certification: No
    References: [25] Introduction (p.xviii); [34] Here's how it works.
    Explanation: To conduct an assessment, the PMMM only requires the use of the online assessment tool accompanying the model.

Benchmarking: Optional at the end of the assessment of each maturity level
    Reference: [34] Ready to give it a test run?
    Explanation: In the online assessment tool, respondents can compare their individual scores at each level with others who have also taken the assessment, both within the same organization and in other industries. Ultimately, higher management can retrieve aggregate scores and compare the total organizational score with peers in the same industry, of the same size, or in other industries.
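Both OPM3 and the PMMM expose benchmarking as peer comparison via aggregate statistics (average, mean and median reports in OPM3; aggregate and peer scores in the PMMM tool). The sketch below illustrates only the kind of aggregation involved; the scores are fabricated sample values, and the report layout is a hypothetical construction, not the output of either tool.

```python
# Minimal sketch of benchmark-style peer comparison. All scores are
# fabricated sample data; real benchmark data comes from the respective
# assessment tools and their registered participants.
from statistics import mean, median

peer_scores = [2.0, 3.0, 3.0, 4.0, 5.0]  # hypothetical peer maturity scores
own_score = 3.5                           # hypothetical own maturity score

report = {
    "peer mean": mean(peer_scores),
    "peer median": median(peer_scores),
    "above peer mean": own_score > mean(peer_scores),
}
print(report)
```

Note that, as the tables state, such comparisons are only as meaningful as the underlying benchmark population: OPM3 shares data only with organizations that contribute their own, and the SEI never discloses the identity of assessed organizations.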