
2007 BPM

and
Workflow
Handbook

Future Strategies Inc.
This book is published in digital format. The content of this book is fully copyrighted and may
not be distributed or data extracted therefrom without written permission by the publisher.
You are licensed to print one copy for your own use.

2007 BPM and
Workflow Handbook
Methods, Concepts, Case Studies and Standards
in Business Process Management and Workflow






Published in association with the
Workflow Management Coalition







Edited by
Layna Fischer

Future Strategies Inc., Book Division
Lighthouse Point, Florida
2007 BPM and Workflow Handbook
Copyright 2007 by Future Strategies Inc.
ISBN-13: 978-0-9777527-1-3
ISBN-10: 0-9777527-1-2

09 08 07 7 8 9
All brand names and product names mentioned in this book are trademarks or service marks of
their respective companies. Any omission or misuse should not be regarded as intent to infringe
on the property of others. The Publisher recognizes and respects all marks used by companies,
manufacturers and developers as a means to distinguish their products. The WfMC logo and
Workflow Management Coalition are service marks of the Workflow Management Coalition,
www.wfmc.org.
Neither the editor, Workflow Management Coalition, nor Future Strategies Inc., accept any
responsibility or liability for loss or damage occasioned to any person or property through using
the material, instructions, methods, or ideas contained herein, or acting or refraining from
acting as a result of such use. The authors and the publisher expressly disclaim all implied
warranties, including merchantability or fitness for any particular purpose. There will be no
duty on the authors or Publisher to correct any errors or defects in the software.

Published by Future Strategies Inc., Book Division
2436 North Federal Highway #374
Lighthouse Point FL 33064 USA
954.782.3376 fax 954.782.6365
www.futstrat.com; books@futstrat.com
Cover design by Pearl & Associates

All rights reserved. Manufactured in the United States of America. No part of this
work covered by the copyright hereon may be reproduced or used in any form or by
any means (graphic, electronic, or mechanical, including photocopying, recording,
taping, or information storage and retrieval systems) without written permission of
the publisher, except in the case of brief quotations embodied in critical articles and
reviews.

Publisher's Cataloging-in-Publication Data
Library of Congress Catalog Card No. 2007901191
2007 BPM and Workflow Handbook:
/Layna Fischer (editor)
p. cm.
Includes bibliographical references, appendices and index.


ISBN 978-0-9777527-1-3
1. Business Process Management. 2. Workflow Management.
3. Technological Innovation. 4. Information Technology. 5. Total Quality
Management. 6. Organizational Change 7. Management Information Systems. 8.
Office Practice Automation. 9. Business Process Technology. 10. Electronic
Commerce. 11. Process Analysis

Fischer, Layna
TABLE OF CONTENTS
FOREWORD 7
Jon Pyke, Chair WfMC, United Kingdom
INTRODUCTION: WORKFLOW AND BPM IN 2007: BUSINESS PROCESS STANDARDS SEE A NEW
GLOBAL IMPERATIVE 9
Nathaniel Palmer, Executive Director, Workflow Management Coalition,
United States
SECTION 1-THE BUSINESS VALUE OF WORKFLOW AND BPM
THE BUSINESS VALUE OF WORKFLOW AND BPM 17
Keith D Swenson, Fujitsu Computer Systems, United States
KNOWLEDGE INTENSIVE BPM 27
Jon Pyke, The Process Factory Ltd., United Kingdom
BPM AND SERVICE-ORIENTED ARCHITECTURE TEAMED TOGETHER: A PATHWAY TO SUCCESS FOR
AN AGILE GOVERNMENT 33
Linus Chow and Charles Medley, BEA Systems; Clay Richardson, Project
Performance Corp., USA
ANALYZING AND IMPROVING CORE TELECOM BUSINESS PROCESSES: CASE STUDY 55
Lee, Kyeong Eon; KTF Co., Ltd., Seoul, Korea
WORKFLOW AND PERFORMANCE MANAGEMENT 67
Arnaud Bezancon, Advantys, France
BPM CENTER OF EXCELLENCE MANIFESTO 73
Dr. Setrag Khoshafian, Pegasystems Inc., USA
EVOLUTION: AN INNOVATION PROCESS 85
Gabriel Franciosi and Federico Silva, Pectra Inc., USA
WHY ENGAGEMENT WILL REDEFINE THE NEXT EVOLUTION IN WORKFLOW AND BPM 97
Steve Rotter, Adobe Systems Incorporated, USA
APPLYING MDA CONCEPTS TO BUSINESS PROCESS MANAGEMENT 103
Alexander Petzmann, Michael Puncochar, BOC Group, Austria; Christian
Kuplich, BOC Group, Germany; David Orensanz, BOC Group, Spain
FROM FUNCTIONAL SILOS TO A PROCESS VISION 117
Salvatore Latronico and Francesco Battista, openwork, Italy
SPOTLIGHT ON BPM IN HEALTHCARE
THE CHESTER COUNTY HOSPITAL: CASE STUDY 133
Ray Hess, The Chester County Hospital, USA
BUSINESS PROCESS MANAGEMENT IN PHARMACEUTICAL R&D 147
Dr. Kai A. Simon, ALTANA Pharma AG (a Nycomed company), Germany
WORKFLOW OPPORTUNITIES AND CHALLENGES IN HEALTHCARE 157
Jonathan Emanuele and Laura Koetter, Siemens Medical Solutions USA,
Inc., USA
AUTHENTICATED DOCUMENT/ DATA EXCHANGE IN HEALTHCARE 167
Dr. Mohammed Shaikh, Image X Inc., USA
SECTION 2-STANDARDS AND TECHNOLOGY
QUALITY METRICS FOR BUSINESS PROCESS MODELS 179
Irene Vanderfeesten, Hajo A. Reijers, Wil van der Aalst, Technische
Universiteit Eindhoven, The Netherlands; Jorge Cardoso, University of
Madeira, Portugal; Jan Mendling, Vienna University of Economics and
Business, Austria.
ENTERPRISE ARCHITECTURE AS A META-PROCESS 191
Heinz Lienhard, ivyTeam-SORECO Group, Switzerland
OVERCOMING NEGATIVE TENDENCIES IN AUTOMATED BUSINESS PROCESSES 203
Juan J. Moreno, Lithium Software / Universidad Católica, Uruguay; Luis
Joyanes, Universidad Pontificia de Salamanca, Spain
DEFINING EASY BUSINESS RULES FOR ACCOMPLISHING THE BASEL II RISK HANDLING
IN BANKS 211
Dr. Juan J. Trilles, AuraPortal BPMS, Spain
MSCWV: CYCLIC WORKFLOW VERIFICATION ALGORITHM FOR WORKFLOW GRAPHS 223
Sinnakkrishnan Perumal and Ambuj Mahanti, Indian Institute of
Management Calcutta, India
BUSINESS PROCESS ARCHITECTURE AND THE WORKFLOW REFERENCE MODEL 243
Chris Lawrence, Old Mutual, South Africa
SECTION 3-DIRECTORIES AND APPENDICES
Membership Structure 283
Officers and Fellows 287
Membership Directory 291
Author Biographies 301
Additional Resources 313
Index 315


Foreword
Jon Pyke, WfMC Chair, United Kingdom
Thank you for continuing to support the work of the Workflow Management
Coalition. It never ceases to amaze me just how much can change and the
progress that can be made in a 12-month period.
2006 was a year of momentous change for the Workflow Management Coalition.
Layna Fischer, WfMC General Manager and Executive Director for many
years, announced her semi-retirement in order to re-focus on the publishing
aspect of her business. Part of that process meant that, after many years of
service and support, Layna would step down from her role as General Man-
ager. On behalf of the entire Coalition, I would like to thank Layna for all
her efforts and hard work, and we wish her well. Layna is not disappearing
completely, though; she will continue to support us in many ways, including
the editing and publishing of the annual Handbook and other publications,
as well as continue to manage the annual Global Excellence Awards.
Layna's decision did mean that we had to find another, equally enthusiastic,
individual to manage our affairs. I am very pleased that another lifelong
supporter of the Coalition, Nathaniel Palmer, stepped up to the plate to take
over. Many of us in the WfMC worked with Nathaniel during his time with
the Delphi Group where he was lead analyst in this sector. I'd like to take
this opportunity to welcome Nathaniel to the fold. He has many exciting
ideas and we look forward to working with him. Be sure to read his introduc-
tory chapter on Workflow and BPM in 2007.
I think it is also appropriate to mention the untimely death this past year of
Dave Shorter. Dave was the driving force behind the Coalition as the found-
ing Chair. His determination and sheer hard work brought the group into
existence and made the Coalition what it is. He will be sorely missed.
But what of the Coalition in 2006?
We have made significant progress during the past year, which was filled
with great events, lots of changes and exciting new developments. XPDL has
been a great success and has really brought the message home on the impor-
tance of this particular standard. During the year we ran technical and
business seminars throughout the world, including Japan, the USA and the UK.
WfMC members spent many hours over 2006 helping to drive awareness,
understanding and adoption of XPDL. As a result, it has been cited as the
most deployed BPM standard by a number of industry analysts, and contin-
ues to receive a growing amount of media attention. More seminars and we-
binars are planned for 2007, so keep an eye on www.wfmc.org for further
announcements.
The relevance of the WfMC in 2007 and beyond:
We have seen a shift in thinking throughout the industry during 2006,
which has resulted in much less confusion as to what Business Process
Management (BPM) really is and the job it's there to do. Many of us involved
in the field of Workflow Automation and BPM have argued long and hard
about where these two technologies overlap, where they are different, which
mathematical models to use, which standards are applicable to which part of
the technology stack and all that associated puff.
Well, these arguments and discussions are over; the demarcation lines have
been drawn; the road ahead is clear.
The fact that Business Process Management has its roots in Workflow tech-
nology is well known; many of today's leading products are, in fact, evolu-
tions of the original forms-processing packages. So there is no longer a need
to debate what is now a moot point.
The notion that BPM technology is only about SOA and web services has
shown itself to be flawed thinking. People are still a significant part of the
overall way businesses execute their processes. This means that much of the
work conducted by the coalition over the years is more applicable now than
it's ever been. If you get the chance, look again at Dave Hollingsworth's seminal paper The Workflow Reference Model 10 Years On [1].
The paper puts forward the notion of a BPM Reference Model and concludes that, "In looking at the various components that make up a BPM reference system, much of the previous work of the original workflow reference model lives on... the original architecture is now expressed in XML and as interfaces to web services." One significant change presented in the paper is in the area of process fragments and the choreography of interactions between such fragments. Hollingsworth goes on, "Although the reference model did introduce the idea of distributed processes (and defined several types of interaction model) it never really tackled the problem of defining a notation for expressing their interaction - the province of the emerging choreography standards."
The correct approach is to recognize what standards are needed where in the
architecture, and for what purpose. Then they can be populated through the
various industry and de jure standards bodies. Product vendors will adopt
them if they add valueand this stems from having a thought-through un-
derlying architecture that clearly identifies the value and purpose of each
standard.
Perhaps this is the core legacy of the Reference Model. At the very least it
has provided a common framework for people to think about Workflow and
BPM architecture and many years of fascinating discussions!
Long may it continue


Jon Pyke, Chair WfMC
and Founder The Process Factory Ltd

[1] Workflow Handbook 2004, ed. Fischer, L. Published by Future Strategies Inc.; also available as a download from www.wfmc.org

Workflow and BPM in 2007:
Business Process Standards See a
New Global Imperative
Nathaniel Palmer, Workflow Management
Coalition, United States
INTRODUCTION
Until fairly recently, most of the discussion surrounding business process standards had been largely limited to a technical context. Specifically, the focus on standards has been seen largely as the province of software developers and engineers. However, with the maturation of interoperability standards around process definition (notably BPMN [1] and XPDL [2]), as well as a growing understanding of the role of open standards in enterprise risk management, the role of BPM standards has evolved from technical nuance to business imperative, and is a topic likely to drive many BPM and workflow discussions in 2007.
THE CASE FOR OPENNESS: OPEN STANDARDS AS A RISK MANAGEMENT STRATEGY
For decades open standards and open source have been natural compadres, yet
the two disciplines are neither equal nor interchangeable. A very visible example
of this is found in the issues surrounding OASIS Open Document Format for Office
Applications or commonly just ODF. After smoldering for some time, ODF ex-
ploded into quite a wildfire last year when the Commonwealth of Massachusetts
switched policy to favor it over proprietary office suite document formats.
The Commonwealth's Secretary of Administration and Finance, Eric Kriss, who at the time oversaw the office of the CIO, made history as the first policy-maker to publicly favor open formats. His heard-around-the-world shot was fired at the January 15, 2005 meeting of the Massachusetts Software Council, stating that it is "an overriding imperative of the American democratic system to prevent public documents from being locked up in a proprietary format, subject to a proprietary license."
Detractors criticized the direction introduced by Kriss as anti-business and arbitrarily favoring open source; however, Kriss and current Massachusetts Governor Mitt Romney are unlikely suspects to be anti-business, having co-founded a successful capital investment firm and together built and led several other successful firms (none of which, as far as I am aware, has ever derived significant revenue from federal or state governments). Rather, these two maverick capitalists realized the inherent risk to democracy and capitalism presented by having access to intellectual property stored in a closed format, owned and controlled by a single private gatekeeper.

[1] BPMN: Business Process Modeling Notation was developed by the Business Process Management Initiative (BPMI), and is now maintained by the Object Management Group (OMG).
[2] XPDL: XML Process Definition Language is a format standardized by the Workflow Management Coalition (WfMC) to interchange Business Process definitions between different workflow products like modeling tools and workflow engines.
ODF is an open format but is not open source. Nor does it need to be. An open format (to quote Wikipedia) is "a published specification for storing digital data, usually maintained by a non-proprietary standards organization, and free of legal restrictions on use." This also means that it must be able to be implemented by both proprietary and open source software, using the standard or appropriate licenses used by each. To be clear, offering a clunky, unidirectional translator (the strategy chosen by some software vendors) does not count as an open format.
Ultimately the reason open formats exist is to guarantee long-term access to data and information without being beholden to specific legal rights or technical specifications. Yet similar to open source licensing, a secondary benefit of open standards and open formats is the creation of a technology meritocracy. It offers a platform for competing products to deliver innovative ways for addressing the market's collective and individual requirements and desires. Indeed, there is a rich market of products and online services that support ODF, far more than support otherwise proprietary formats. Both issues have motivated governments and other large software consumers to increasingly favor open formats.
The Open Document debate illuminates why we value transparency in data for-
mats. Even if every software vendor supporting ODF simultaneously ceased to
exist, the availability of an open format ensures the ability to build a new system
capable of reading the document. Similarly, it provides a platform for new innova-
tions to be marketed around a standard implementation, rather than requiring
each vendor to carry this burden alone.
As we have seen with current and past movements in the role of technology in
business, including open source software (OSS), e-mail, the personal computer
itself, and countless other examples, the innovative use of standard specifications
offers far greater marketability than trying to reinvent the wheel on your own.
What has now been sufficiently demonstrated through the same examples is that
no vertically-integrated, proprietary specification is defensible in the face of a
competitive network built on open standards.
THE VALUE OF PROCESS TRANSPARENCY: LEVERAGING OPEN PROCESS DEFINITIONS
Now apply the same notion of openness and portability to the application itself.
What if, rather than simply addressing documents, in addition, all of the proprie-
tary configurations defining how software applications work (i.e., the business
logic) were similarly based on an open format. Imagine that all of the millions of
dollars and thousands of man-hours spent configuring business applications
were not locked within intractable custom code, but were in a standard and port-
able format. This is, in simple terms, the goal of a number of business process
definition standards, notably XPDL (XML Process Definition Language).
XPDL allows processes to be defined using any number of compliant modeling environments (both closed and open source options) and interpreted by XPDL-savvy applications. To be clear, it is not an executable language (such as BPEL [3], which is discussed in greater detail later in this chapter), but rather a process description language which contains not only the business logic that defines business processes, but also the metadata concerning how to read and recreate processes, down to the vector coordinates defining the graphic process model (expressed in a contemporary standard called BPMN, the Business Process Modeling Notation).

[3] BPEL: Business Process Execution Language is an executable business process language developed by OASIS.
These capabilities distinguish XPDL from BPEL, though the two are often placed in close proximity. Yet, as observed by Bruce Silver, one of the industry's most tenured analysts of business process software, "From day one BPM has sought to make process design directly accessible to business analysts, but today's BPEL process models are, for the most part, undecipherable to non-programmers." Specifically, the content and context of a BPEL model (the equivalent of the words and prose inside an ODF document) can be read only by a BPEL engine, not a business analyst.
Just as ODF provides a capsule for the preservation of intellectual property cre-
ated in documents and spreadsheets, XPDL and BPMN provide a platform for
managing a library of business processes as reusable and accessible assets.
These standards, already available today, also allow the same processes to be executed and updated in the same format; rather than having static models which are translated into code, they offer a single standard format for actionable and transparent business process definitions.
THE BPM STANDARDS VALUE CHAIN
It has been suggested in both serious and sarcastic contexts that the great thing
about standards is there are so many to choose from! Used in jest, it pokes fun
at the fact that competing interests rarely agree on a single standard, despite the
inherent notion of standards and standardization implying a singularity of sorts.
In the case of business process management and workflow standards, however,
there is an upside to that non sequitur.
In addition to the potentially helpful proposed and emerging standards (notably BPDM [4] and BPRI [5]), the spectrum of specifications in use today presents more synergy than redundancy. These include some of the standards previously discussed, BPMN, XPDL, and BPEL, as well as others such as ebXML BPSS [6] and the industry's best-kept secret, Wf-XML [7].
In many ways, this group begins with BPMN (Business Process Modeling Nota-
tion), because the logical starting point for modeling is most frequently drawing a
BPMN diagram or graphical process model. Then what do you do? The next steps
spotlight the rationale for multiple (yet complementary) standards. BPMN provides a graphic model (i.e., it is a modeling notation standard) which can be saved as XPDL during development and either executed by XPDL-compliant solutions or, ultimately, translated in part to BPEL (i.e., the parts of the process focused on data exchange rather than human interaction). This presents what has been observed by some as a value chain connecting BPMN, XPDL and BPEL. BPMN depicts the end-to-end flow of steps and activities
within a business process, modeling the sequence of activities within a process as
well as the data or messages that flow between different process participants
within a related set of activities. For this reason, BPMN is designed not simply to

[4] BPDM: the Business Process Definition Metamodel is a proposal being developed by the Object Management Group (OMG).
[5] BPRI: Business Process Runtime Interfaces, being developed by the Object Management Group (OMG).
[6] ebXML BPSS: ebXML Business Process Specification Schema defines a business process foundation that promotes the automation and predictable exchange of business collaboration definitions using XML. Developed by OASIS.
[7] Wf-XML: Workflow eXtensible Markup Language. Version 2.0 was produced by the Workflow Management Coalition (WfMC), and extends the ASAP model to include BPM and workflow interchange capabilities.
model applications, but the processes in which applications would be used. For
this reason, the output of BPMN needs to be expressed in something other than
a programming language. This was initially expected to be BPML, a since-
abandoned standard developed by BPMI.org.
WHY XPDL IS HERE TO STAY
What BPMN would have provided vis-à-vis BPML is the direct translation of a graphical model intelligible by people (i.e., business analysts, process owners) into a machine-readable format, to enable the interchange of process definitions between different tools and from different vendors. In the absence of BPML, where do you go with a BPMN model? The answer(s) today are most compelling in the process of translating BPMN into XPDL and/or BPEL.
One of the common misconceptions regarding these standards is that BPEL and
XPDL are direct competitors or otherwise mutually exclusive. This is simply not
the case. BPEL and XPDL are entirely different yet complementary standards,
designed for different purposes. BPEL is an execution language designed to provide a definition of web services orchestration: the underlying sequence of interactions, the flow of data from point to point, defined start/stop and entrance/exit points. XPDL, on the other hand, is designed to capture both programmatic information and the people work expressed within a process diagram, to enable one tool to model the diagram, another to read the diagram, and so on.
BPMN can be used to model an executable process by constraining diagram ob-
jects and their properties to that which can be mapped directly to BPEL elements.
This enables, albeit with limitations, the use of a BPMN diagram to provide a business-oriented graphical process model that can also generate executable code through BPEL. This presents a real advantage because BPEL has neither an associated graphical notation nor the concepts required to support the visual diagramming of a process model.
Similarly to BPEL, the original version of XPDL, by design, lacked a specific graphical representation; i.e., it was built to be agnostic to modeling methods and notations. With the release of XPDL 2.0 and subsequent versions, however, it was ex-
panded to include the specific mechanisms allowing round-trip development from
BPMN to XPDL and back to BPMN. Rather than an executable programming lan-
guage, XPDL is a process design format, which literally represents the "drawing"
of the process definition. It has XY coordinates and node size, as well as a concept
of lines, and points along the line that give it a particular path.
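To make the "drawing" aspect concrete, the sketch below parses a deliberately simplified, XPDL-like fragment and pulls out the node coordinates. It is only an illustration: the element and attribute names follow the general shape of XPDL 2.0 but are not a faithful copy of the schema (no namespaces, no transitions).

import xml.etree.ElementTree as ET

# A deliberately simplified, XPDL-like fragment (illustrative only; real XPDL 2.0
# uses namespaced elements, transitions and a richer graphics model).
XPDL_SNIPPET = """
<Package>
  <WorkflowProcess Id="LoanProcess" Name="Loan Application">
    <Activities>
      <Activity Id="review" Name="Review / enter info">
        <NodeGraphicsInfo Height="60" Width="120">
          <Coordinates XCoordinate="100" YCoordinate="40"/>
        </NodeGraphicsInfo>
      </Activity>
      <Activity Id="approve" Name="Approve loan">
        <NodeGraphicsInfo Height="60" Width="120">
          <Coordinates XCoordinate="300" YCoordinate="40"/>
        </NodeGraphicsInfo>
      </Activity>
    </Activities>
  </WorkflowProcess>
</Package>
"""

def node_positions(xpdl_text):
    """Return {activity name: (x, y)} so another tool could redraw the same diagram."""
    root = ET.fromstring(xpdl_text)
    positions = {}
    for activity in root.iter("Activity"):
        coords = activity.find(".//Coordinates")
        if coords is not None:
            positions[activity.get("Name")] = (float(coords.get("XCoordinate")),
                                               float(coords.get("YCoordinate")))
    return positions

print(node_positions(XPDL_SNIPPET))
# {'Review / enter info': (100.0, 40.0), 'Approve loan': (300.0, 40.0)}

Because the layout travels with the process definition, a second tool can reproduce the diagram exactly as the first tool drew it.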
The XPDL file can provide this design interchange because it offers a one-to-one
representation of the original BPMN process diagram. It can be written, and re-read, to recover the original diagram. Mapping to BPEL, on the other hand, is non-trivial and widely recognized as one-directional (i.e., not round-trip). As previously stated, it is possible to take a BPMN diagram and produce BPEL, but it is difficult or impossible to recover the original BPMN diagram from the BPEL. But that is okay: BPEL was not designed for process design interchange,
whereas XPDL was designed precisely for this purpose.
WHAT ABOUT SUBPROCESSES?
Anyone spending time attempting to model the processes of an organization
probably quickly realizes that operations cannot easily fit on a single whiteboard. Rather, there are lots of activities and circumstances that happen somewhere else: in different systems, different companies, different geographies. Applied for a loan recently? If so, you no doubt participated in a subprocess, as the
loan application process likely separated the credit check function from the rest of
the process. Today a credit check is a very standard process where analysis of a
predefined set of information returns a consistent result.
Credit checks are a great example of a BPEL-appropriate subprocess: the work is predictable and well-structured, the dataflows are well-defined and don't change, and it is tightly coupled to the process. It is designed to run cheaply and, most of the time, quickly; the rest of the process cannot and should not proceed without the resulting credit score (what lender wants to waste time on a deadbeat?). There are
many other examples, however, where a subprocess is less time-constrained and
longer-running, operating in parallel with the parent or super-process. For ex-
ample, a more complicated loan application today may involve the coordination of
two separate BPM environments, where both need to talk but not necessarily be
integrated in a tightly-coupled manner. Unlike the credit check example, this is
not a job for BPEL. To leverage BPEL in this situation requires a very fine-grain
understanding of the syntax of the other system.
Luckily, there is another standard designed specifically for this. The OASIS standard ASAP (Asynchronous Service Access Protocol), built on SOAP (Simple Object Access Protocol), together with Wf-XML, allows two BPM systems to use standard messaging, analogous to sending an email to the other system requesting that it start a process, then coordinating activities or checking in from time to time until the desired result is reached. Although it would be possible to achieve the same result using a combination of a number of web services interfaces, leveraging the single standard of Wf-XML allows multiple compliant BPM systems to interoperate without requiring the traditional custom programming or fine-grain integration. Effectively it allows each to examine the process models and activity definitions of the others, enabling BPM engines to collaborate in a way more closely resembling human interaction than application integration.
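The sketch below is only a rough illustration of that "send a request, then check in later" pattern, not the actual Wf-XML/ASAP message schema: the endpoint URL, envelope contents and status test are all hypothetical placeholders, and a real implementation would use the SOAP operations and element names defined by the specifications.

import time
import urllib.request

FACTORY_URL = "http://bpm.example.com/asap/factory"   # hypothetical remote BPM endpoint

# Placeholder request body; real Wf-XML/ASAP messages have their own namespaces
# and element names defined by the specifications.
START_REQUEST = """<?xml version="1.0"?>
<Envelope>
  <Body>
    <StartProcess name="ComplexLoanApplication">
      <Applicant>J. Smith</Applicant>
    </StartProcess>
  </Body>
</Envelope>
"""

def start_remote_process():
    """Ask the other BPM system to start a long-running process instance."""
    request = urllib.request.Request(
        FACTORY_URL,
        data=START_REQUEST.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read()   # in practice: the address of the new instance

def check_in(instance_url, interval_seconds=3600):
    """Check in from time to time until the remote subprocess reports completion."""
    while True:
        with urllib.request.urlopen(instance_url) as response:
            if b"completed" in response.read().lower():   # simplified status test
                return
        time.sleep(interval_seconds)

The point is the loose coupling: the requesting system needs only the message exchange, not a fine-grain understanding of the other engine's internals.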
Who cares about the ability to interchange process models anyway? Process interchange offers a key leverage point for firms investing in process models, and for those who want these investments to be actionable without being locked into a single vendor. In all of our research, as well as that we have seen done by
others, we have found it is common for individuals engaged in the early stage of
process discovery and validation to use general purpose graphics desktop tools to
develop process models, then to hand these results to process architects for fur-
ther validation and the development of actual process definitions. Through various third-party extensions and templates, the most common desktop design tools can be made to support BPMN, which means they also support XPDL. In fact, XPDL has been used specifically for the interchange of models between modeling tools and simulation engines. XPDL and BPMN provide a platform for managing a library of
business processes as reusable and accessible business assets.
The importance of process design interchange continues to increase as the BPM
market matures. The lack of interoperability and design exchange necessitates a
vertically-integrated model where a single vendor must supply all of the tools in-
volved in BPM. This may have been acceptable in the early stage of the market
where early adopters were placing bets on individual vendors, but for the market
to grow and mature into the next stage there needs to be an ecosystem, not an oligarchy.
BPM STANDARDS DRIVE GLOBAL INNOVATION
This ecosystem is visible and growing today, and despite relatively aggressive
mergers and acquisitions within the BPM sector, the leverage of standards (nota-
bly BPMN, XPDL and BPEL) has provided a platform for a host of individual spe-
cialized and niche players, as well as the opportunity for larger BPMS vendors to
offer a standards-based, round-trip framework.
These standards have also flattened the BPM playing field in a global context,
with a number of firms within emerging markets now not only offering compliant
software, but directly contributing to the working groups defining and developing
the specifications. Where dominant U.S. and Western European firms have not
always demonstrated a willingness to play nice on standards development,
emerging-market firms are showing great innovation when it comes to both the
development and application of standards.
An important attribute for enabling innovation within this ecosystem is the exten-
sibility mechanism of XPDL. Specialized tools may present unique requirements
using extended attributes, and while other tools will not understand these exten-
sions, they will carry the extensions along the round-trip. For example, a tool spe-
cialized to clean up the layout might manipulate the graphical aspects of the
model, and return a cleaned-up model, including all the extensions, back to the
original source without losing any information. A number of open source tools
have demonstrated the ability to read XPDL files generated from commercial ven-
dors, allowing process definitions to be modified and returned without any loss of
vendor specific extensions.
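As a rough sketch of why that round-trip is possible, the fragment below (again with simplified, non-namespaced element names modeled loosely on XPDL's extended-attribute mechanism) adjusts the layout of an activity and writes the document back; because the extension element is never interpreted, it is carried along unchanged.

import xml.etree.ElementTree as ET

ACTIVITY = """
<Activity Id="review" Name="Review">
  <NodeGraphicsInfo>
    <Coordinates XCoordinate="97" YCoordinate="41"/>
  </NodeGraphicsInfo>
  <ExtendedAttributes>
    <ExtendedAttribute Name="VendorX.SimulationCost" Value="12.50"/>
  </ExtendedAttributes>
</Activity>
"""

def snap_to_grid(activity_xml, grid=10):
    """'Clean up the layout' while leaving every other element untouched."""
    activity = ET.fromstring(activity_xml)
    for coords in activity.iter("Coordinates"):
        for name in ("XCoordinate", "YCoordinate"):
            value = round(float(coords.get(name)) / grid) * grid
            coords.set(name, str(value))
    # The vendor-specific ExtendedAttribute was never parsed into tool-specific
    # objects, so serializing the tree simply carries it along unchanged.
    return ET.tostring(activity, encoding="unicode")

print(snap_to_grid(ACTIVITY))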
Several BPM engines are able to run XPDL natively, which allows run-time modi-
fication and process migration to be readily supported. Where these processes
focus on broader-scope collaboration among people, they can remain within
XPDL/BPMN. Where pieces are decomposed into system-to-system interactions,
these can be translated to BPEL for transmission to an EAI-oriented BPM engine.
These are three very different and very compatible roles. But that is the nature of
the value chain; BPEL and XPDL are entirely different things for entirely different
purposes.
No longer limited to an audience of software developers and engineers, process
definition and interoperability standards such as BPMN and XPDL have become
mainstream issues for BPM and workflow consideration and competitive differen-
tiation. This attraction will, no doubt, only increase as more individuals in line
management and business process ownership understand the strategic role of
open standards in matters such as enterprise risk management and regulatory
compliance, as well as the opportunity to leverage investments made in business
process documentation and business logic design, rather than locking these away
inside closed systems and proprietary specifications.

Section 1

The Business Value of Workflow and BPM


The Business Value of
Workflow and BPM
Keith D. Swenson, Fujitsu Computer Systems,
United States
ABSTRACT
Human-Oriented Business Process Management, also called Workflow, is a criti-
cal component that allows applications to meet the agility demands of business.
Service-Oriented Architecture (SOA) is an important design goal to meet the agility
demands of Information Technology (IT).
IT and business users are different audiences, with very different demands, and
failure to recognize this can lead to missed opportunities and unsatisfactory solu-
tions. This paper will show how workflow can be brought together with SOA tech-
nology to form a powerful combination to meet both demands. IT can design ser-
vices that are safe for non-technical people to compose into high level applica-
tions, giving them the unprecedented ability to respond to external events. Exam-
ples include a corporation that changed a business process in 2 hours in order to
be in a new line of business the next day.
INTRODUCTION
In 2006, Forrester ran a poll of 146 IT Executives and asked the following ques-
tion: "Considering your existing enterprise applications, how important are the following business problems?" The results of the poll are reflected in this chart:
The question does not specifically mention business process, nor was the audi-
ence selected for interest in business processes. Eighty-one percent of the respon-
dents felt that inadequate support for cross-functional processes was an impor-
tant business problem. Note that several other highly rated responses point to the
need for business process support.
Please contact Forrester Research directly for the full details of the study and the
results. This excerpt is used here to highlight that IT professionals understand
that there is a divide between the business side, with its business requirements,
and the support that is being provided. Those IT professionals are looking for
ways to close this gap.
DEFINITION OF TERMS
Workflow is an excellent way to meet this need. Workflow allows for a better
alignment of IT with business because it allows the enterprise applications to be
expressed in a way that makes sense to business users. We will also see that it
helps businesses be more agile by allowing business people control of the busi-
ness aspects of applications, while IT people retain control of the applications
more technical aspects. Before discussing the details of how this comes about, we
should start first with a definition of a few basic terms to make sure that we are
talking about the same things.
The term Business Process is used a lot today, but often loosely to mean several
different things. The origin of the term is generally attributed to Michael Hammer
and his seminal work in the area of Business Process Reengineering. When Mi-
chael Hammer talks about a Business Process he uses the term to distinguish a
Business Process from a Manufacturing Process or a Chemical Process. The
distinguishing characteristic of a Business Process is that it involves people doing
office work. The point of his work was to get people to stop thinking about office
work as being organized along functional lines, and to start thinking about the
chain of different functions that must be strung together to accomplish a busi-
ness goal. He was very successful in getting people to think along these lines, and
today no serious business analyst would approach an attempt to improve the way
an office works without starting by drawing out the process. Oddly, some people
use the term business process for things that don't involve people or office work. I
prefer the WfMC definition which has been stable for ten years now:
Business Process: A set of one or more linked procedures or activities
which collectively realize a business objective or policy goal, normally within
the context of an organizational structure defining functional roles and rela-
tionships.
Workflow: The automation of a business process, in whole or part, during
which documents, information or tasks are passed from one participant to
another for action, according to a set of procedural rules.
Process Definition: The representation of a business process in a form
which supports automated manipulation, such as modeling, or enactment by
a workflow management system. The process definition consists of a net-
work of activities and their relationships, criteria to indicate the start and ter-
mination of the process, and information about the individual activities, such
as participants, associated IT applications and data, etc.
The WfMC Glossary does not include a definition for Business Process Manage-
ment but recent discussions within the Coalition have centered on the following
proposal which highlights management aspect of the term:
Business Process Management: The practice of developing, running,
performance measuring, and simulating Business Processes to effect the con-
tinued improvement of those processes. Business Process Management is
concerned with the lifecycle of the Process Definition.
As office work has been traditionally supported through the use of paper docu-
ments and folders passed from function to function, many of the early workflow
products focused on routing documents through a group of people. More recent
systems are quite a bit more sophisticated, offering not only documents, but
structured information handling, complex event processing, programmatic ma-
nipulation of information, and the ability to exchange information with web ser-
vices and other external information sources. These newer capabilities allow the
workflow systems to integrate into the modern IT infrastructure. At the same
time, the workflow systems have not forgotten the human aspect, which gives
workflow a unique capability to bridge the gap between the business world and
the IT world.
TWO DIFFERENT AUDIENCES
We talk about the gap between business and IT, but what do we mean? Busi-
nesses run on their information systems, but there are two distinct audiences.
The first audience we call business users. These are the people in the organization
who are doing the work that directly accomplishes the goals of the organization.
In most ways, these people are users of the information systems. The business
side also includes management, who is interested in how well their organization is
running, and might be interested in optimizing the way that people work. The
CEO, CFO, and Line of Business manager are roles that are well known. We now
talk more about the Business Analyst role. People in this role specialize in the or-
ganization of tasks into processes. The Business Analyst is not normally techni-
cal, but instead someone who understands the business and the goals of the
business, as well as how to accomplish those goals with a team of people.
The second audience we call Information Technology (IT) professionals. This side of
the business is responsible for providing the information systems. Sometimes this
means developing custom applications for the enterprise, and in other cases it
includes only installation and management of packaged applications.
The reason for considering these as two distinct groups is because they often look
at the same problem with different goals and desires. The business side is con-
cerned with business goals which are both manual and automatable, while the IT
side is concerned with only those goals that can be translated into tested, reliable,
and secure systems. While the IT side is organized around system structure and
values 7x24 operation and scalability issues, the business side is organized
around social structures with the complexities of working hours, vacation sched-
ules, skills training, and changing positions.
Both business and IT users need agility, the ability to respond to change. But the
rate and scope of change are different between the two groups. Not counting
emergencies such as production server outages, there are usually weeks or
months needed to plan the addition of a server or a new application to the sys-
tem. The business user, on the other hand, needs to be responsive to competitors,
the market, and personnel changes, on a weekly or even daily basis. If a competitor comes out with a challenging new product, you need to respond immediately. The typical average annual turnover across all US businesses is 20 percent. This means that if you are running a 1,000-person organization (around 200 departures and 200 new hires a year), you will have on average one
person leaving and joining every day. Your personnel change internally is much
greater than that, because you have people learning new skills, moving into new
positions, as well as taking and returning from vacations. Running a business is
a matter of accommodating change on a daily basis.
Process support for humans is also very different than process support for IT sys-
tems, and this difference can be categorized as a difference in the handling of
time. Making a process which routes information through a set of servers is a
matter of identifying the servers and transforming the data as required by each,
along with any conditional logic to determine what happens next. The servers are
generally available 7 x 24, so when you have a job for a server to do, it can typi-
cally be given the task immediately. There are details for scalability and robust-
ness being glossed over here, like retries for those rare cases that the server is
down, or queues for the cases where a server is given multiple tasks at the same
instant, but it is fair to generalize by saying that servers spend most of their time
sitting and waiting to be given a task, and when given a task they take those
tasks in the order given. They generally complete the task almost instantly.
People, on the other hand, work in a very different manner. A human process sys-
tem (workflow) will offer tasks to people. Those people are not generally sitting
around with nothing else to do, and do not take up the task immediately upon
being assigned. Generally a person has a worklist with a variety of things in pro-
gress which can be sorted and completed in a more efficient manner. Tasks as-
signed during off hours will wait until opening hours to be considered. The as-
signment of a task to a system is very concrete; if a system is set up to handle a
task by the installation of software, it is immediately able to handle all such tasks.
Assigning tasks to people on the other hand is a much more complicated thing.
People will have varying levels of particular skills, and are often specialized in cer-
tain ways. Two salespeople may have equivalent skill to close deals, but one of
them may be more suitable for a particular job because of having more experience
with, say, defense contractors. It may not be possible to express the criteria, so
such systems need the ability to manually reassign tasks. Human process sys-
tems generally offer an ability to send reminders or escalate the task when it has
not been completed within a certain time, whereas there is no point in sending a reminder to a server that has for any reason failed to complete an activity.
When IT professionals talk about process support (even business process sup-
port) they are often referring to this system-to-system process support, which forms an important part of meeting their need to create robust, scalable systems.
But when business users discuss process support, they usually refer to human
process support, or workflow, which includes these human features, but at the
same time can provide connectivity to the backend information systems. It is this
unique ability for workflow systems to bridge between the human and system
realm that makes them key in providing business value to the organization.
PURPOSE OF WORKFLOW
I have pointed out how workflow offers unique features that allow for the coordi-
nation of human work during the running of a process, but there is another key
aspect of workflow which is critical to bridging the business-IT gap. The business
processes themselves must be able to be designed and modified by business peo-
ple. Here are some comments that reflect this:
"The ultimate goal of workflow is to place in the hands of business professionals the ability to modify their processes, with no involvement from the IT organization." (Michael Melenovsky, Gartner BPM Summit, 2006)
"... process changes are made by business professionals who need only limited knowledge of IT systems. In a growing number of cases, changes such as work item routing, business rule overrides, and parametric changes to approval levels, are made in real time to executing process." (Janelle Hill, Gartner BPM Summit, 2006)
These ideas are very uncomfortable for most IT professionals. That is because
they know that with traditional programming practices, if you let an untrained
person modify the code it is far more likely to break the application than to im-
prove it. I think most people would agree that, for a non-programmer, opening up
Java, C++, or Visual Basic code would be dangerous. To complete an application,
a programmer must apply many rules and practices in a correct manner to result
in a reliable and safe application.
What these industry experts are saying is not that we want business people play-
ing with the guts of the application developed along the lines of traditional pro-
gramming, but rather that applications must be structured in a specific way that
isolates the business process from the programming logic. The more technical
aspects of the application need to be wrapped up into reusable chunks. Those
chunks need to be robust and not sensitive to erroneous input. They need to be
more like plugging a power adapter into your cell phone, and less like soldering a
printed circuit board.
Business side retains control of:
- Assignment of responsibility, because this depends strongly on who is in the organization.
- Groups, Roles, and Skills, because these change on a monthly, weekly, or even daily basis.
- Deadlines, Alerts, Reminders, and Escalations, because they depend on the culture of the organizational unit.
- Order of tasks and addition of new manual tasks, because this is critical for the agility to respond to market and legislative changes.
- User Interface, because this is affected by the level of training or experience of a particular organization.
IT retains control of:
- Computational logic and data representations, because there is little or no dependency upon the culture of the organization.
- Scalability and performance, because this requires significant specialized expertise in the working of information systems.
- Interoperability, because this requires extensive knowledge of the operating infrastructure.
- Master data management, because this is constrained by highly specialized requirements.
The business processes need to be abstracted out of the application, and repre-
sented as a structure separate from the more technical aspects. The business
process is simply used to sequence the chunks into an integrated whole in a way
that is safe for a non-programmer to edit.
"By 2009, 20 percent of business processes in the Global 2000 will be supported on BPMS[*]. These processes will be predominantly those that involve a lot of human work, that differentiate the company from its competitors and that are poorly supported by existing IT systems (0.7 probability)." (Janelle Hill, Gartner BPM Summit 2006)
Gartner defines a BPMS as a suite that handles both human and system proc-
esses, which is equivalent to the definition of workflow given above. This trend is
clear.
AN EXAMPLE: HUMAN-BPM APPLICATION
To illuminate how the application might be structured to allow for the different
responsibilities to be split across different groups of people, consider in detail an
example application for processing bank loans. It is common for the application
development to start with drawing a high level human-oriented process diagram.
A business analyst might start by deciding what important business activities
need to take place to accomplish the goal. Imagine that people come to the bank
and fill out an application which is subsequently scanned and converted to text
data, and that this is the event that starts the process. In this case the business
analyst determines that two people need to be involved. First, a person needs to
review the input data for completeness and as a check of the character recogni-
tion. Once that has been done, a bank manager needs to make a decision on
whether to grant the loan or not.
This example is simplified so it can be discussed in this article, but it is important
to note that the business analyst is dealing only with jobs that must be performed
by humans within the organization. There is an implicit assumption that there
will be a bunch of data processing associated with the process, but that is not a
concern at this level. For example, a bank will clearly want to perform a back-
ground check on the applicant, but that is not a human activity. Since that can
be completely automated, there is no reason to have a person in your office who
performs background checks. Instead, it is assumed that somewhere between the
first and second human activity, a call will be made to retrieve information about
the background of the applicant, and the bank manager has the results of that
available in order to make the decision of whether to loan the money or not. At
this point, the business analyst is concerned only with the activities that will be
done by office workers.
[Figure: BPMN-style process diagram with human activities labeled "Review / enter info", start and end events, and attached user-interface shapes]

The diagram above is a conglomerate of notations. The circles, rounded rectan-
gles, and arrows between them depict the process using a standard called Busi-
ness Process Modeling Notation (BPMN). The rounded rectangles are the standard
way that you represent an activity, while the circles represent the start and end
events. The trapezoid shapes are not part of the BPMN standard, but instead are
used here simply to represent that there will be a user interface (UI) of some sort
associated with the activity. The business analyst may lay out some sort of form
which specifies particular information values that must be made available to the
user. This specification might be abstract in only specifying the quantities that
need to be present, or it might be a concrete layout of precisely where such values
should appear on a screen. The people then use the UI in order to perform their
respective tasks.
This level of the process might be designed on a graphical design tool intended
specifically for business process design. It might be drawn up using a generic
graphic tool, or it might simply be documented in a non-graphical way. Some
workflow systems will allow a drawing at this level to be executed directly without
any further technical work. Others offer powerful design capabilities, but the im-
plementation of the automated system is left as a task for a traditional develop-
ment team. In some cases if suitable web services with public interfaces exist, it is
possible that a business user might be able to incorporate calls to back end sys-
tem without programming. But in most cases, integration to the back end infor-
mation systems must be done by a development team.
The human process design is provided to the programmer who will add integra-
tion to the back end system. In this example, immediately after the first activity of
reviewing the information for correctness (which must be done by a human) the
system then should automatically call a service that can perform a background
check of the applicant. The bank may have rules that it will not accept certain
categories of applicants, and there is no reason to force the bank manager to
check this manually. Business rules can be employed to classify applicants. In
this example, an Enterprise Service Bus (ESB) is used to integrate the call to the
background check service, and the call to the conformance rules into a single web
service which is easy to connect to the workflow. This is not meant to imply that
an ESB must be used; most workflow systems will allow for multiple calls to dif-
ferent services. This is offered here only as an example of how IT professionals
might wish to structure the back end systems to give them flexibility.
After the bank manager reviews the application and approves the loan, an addi-
tional call is used to integrate with the account management application and to
cause the new account to be created.
[Figure: the same process diagram layered over an ESB/BPEL tier; "call 1" after the "Review / enter info" activity invokes the Background check and Rules services, and Enterprise Application A (Account Management) exposes list Accts, new Acct, update Acct and delete Acct services]

The boxes in the lower half of the diagram represent automated services of vari-
ous forms. The smaller square boxes represent web service interfaces to these
capabilities. The intent is not to imply that it is necessary to use web services, but
that is currently a popular approach to allow for flexibility.
Why didn't the diagram change when adding the new integration? An IT profes-
sional might want to draw a more detailed diagram, one that includes activity
boxes for the background check, and the rules. This would be helpful to IT, but
an important principle of support for human processes is to not clutter the dia-
gram with details that are not relevant to the human users. The human process
diagram is used for things like training people within the organization. It is impor-
tant in a training scenario to show the steps that people have to do. The detail of
where calls are made to back end systems is not important when helping people
to understand what it is that they need to do, and how it relates to what other
people do. The business users are best served by a diagram that shows what the
business users are doing. This may seem obvious, but you will find a large num-
ber of IT people who find this concept surprising.
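As a minimal sketch of this separation (plain Python, with every name hypothetical and the "engine" reduced to a loop), the business-owned part of the loan application below is just an editable sequence of steps, while the technical work hides behind small service functions that IT owns. Human steps carry no code at all; they are simply offered to a worklist.

from dataclasses import dataclass
from typing import Callable, List, Optional

# --- IT-owned layer: reusable, wrapped service calls (stubs standing in for the ESB) ---

def background_check(case: dict) -> dict:
    """Stand-in for the web service call that screens the applicant."""
    case["background_ok"] = True          # a real implementation would call the ESB here
    return case

def create_account(case: dict) -> dict:
    """Stand-in for the account management integration."""
    case["account_id"] = "ACCT-0001"
    return case

# --- Business-owned layer: the process is just an ordered list of steps ---

@dataclass
class Step:
    name: str
    human: bool                                        # True = routed to a person's worklist
    action: Optional[Callable[[dict], dict]] = None    # only system steps carry code

LOAN_PROCESS: List[Step] = [
    Step("Review / enter info", human=True),
    Step("Background check",    human=False, action=background_check),
    Step("Approve loan",        human=True),
    Step("Create account",      human=False, action=create_account),
]

def run(process: List[Step], case: dict) -> dict:
    """Toy engine: offer human steps to people, execute system steps directly."""
    for step in process:
        if step.human:
            print(f"Task '{step.name}' placed on a worklist for case {case['id']}")
        else:
            case = step.action(case)
    return case

run(LOAN_PROCESS, {"id": "loan-42", "applicant": "J. Smith"})

The design choice to keep the human diagram free of system calls is mirrored here: the list the business analyst edits names only the steps, while the wiring to back-end systems lives elsewhere.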
AGILITY IN THE FACE OF CHANGE
In the previous section we saw how an application might be constructed, but that
is not the end of the story. Applications must evolve and change over time. The
point of structuring the application in this way is to enable rapid change of some
aspects of the application without breaking it.
Consider what the bank will have to do to respond to this scenario: One day, it is
reported that a small bank in one part of the country is successfully sued and has
to pay a huge fine for having given a loan to a terrorist. This is a purely hypotheti-
cal example, but the point is that legal precedent is set by court cases, which can happen relatively suddenly and without warning. If this were to happen, the precedent would be set, and it might then be possible for many other banks to be sued if they do the same thing. The bank has a huge risk, and cannot afford to
wait for a new terrorist identification solution to be developed by IT in order to
check if the applicant is a terrorist. The bank must begin, the very next day, to
behave under the new rule of not giving a loan to a terrorist.
The first thing to happen is that a manual check must be added to the process. A
team will be identified, and every bank loan must be reviewed by that team, to
ensure that the current loan is not going to a terrorist. The bank will also set in motion a project to automate this, but that will take weeks or months. The bank cannot afford to stop giving out loans for that time. The manual review will be
expensive, but less expensive than being sued if they make a mistake.
The manual step can be immediately incorporated into the human process as a
new step between the review and the approval. The huge advantage in being able
to put this step directly into the process is that, at the end of the day, you are as-
sured that every bank loan has been checked. Workflow systems keep a record of
every activity that is completed, and it is easy to prove that every loan has been
appropriately checked. The bank is able to prove compliance to the new rule (law)
the very next day on every loan made, which greatly reduces the bank's risk.
[Figure: the same diagram with a new manual "legal check" activity inserted between the review and approval steps, alongside the ESB/BPEL tier and the Account Management services]

The manual step is temporary. A couple of months, or possibly weeks, later there
will be an automated service that will be able to reliably categorize an applicant as
a terrorist or not. This can be added as another automated call between the first
and final steps of the process. When this is in place, and when the bank is confi-
dent that it works correctly, the manual step can be removed from the process,
and the bank can return to having two human steps in the loan process.
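Continuing the same hypothetical sketch from above, this later change is equally local: the automated check is added as a non-manual step and the temporary manual step is removed, leaving the rest of the process untouched.

# Replace the temporary manual check with the automated screening service.
loan_process.insert_step_after(
    "review", Step("terrorist screening service", performer_role="system", manual=False))
loan_process.steps = [s for s in loan_process.steps if s.name != "legal check"]

print([s.name for s in loan_process.steps])
# ['enter info', 'review', 'terrorist screening service', 'approval']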
CONCLUSION AND SUMMARY
Agility is about responsiveness to the market. Applications that are designed us-
ing traditional programming principles cannot be modified quickly due to the
technical expertise that is required. But if an application is structured from the
beginning to separate the human process from the technical manipulation of the
data, then it is possible for business users to be able to modify the process part of
the application in a safe way.
"When done right, successful BPM initiatives (herein referring to projects involving both business process analysis and the implementation of business process management software) change the entire notion of applications, by allowing core systems to respond to process context, rather than driving processes around the limits of technology." (Nathaniel Palmer and Laura Mooney, 2006)
The fundamental benefit is business level agility, where applications are no longer
monolithic blocks constructed out of third-generation languages. Instead, the
user interface is separated from the back-end logic. In this case, "user interface"
means not only the visual display to the user, but the time-oriented aspects of of-
fering a task to a user, and reminding that user if the task is not completed in
time. The solution is built from "application slices" sequenced by a workflow process.
The workflow determines the right person for the right task at the right time.
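The time-oriented side of that claim (offering a task to the right person and reminding that person when it is overdue) can be pictured with a small sketch; the names below are illustrative and do not correspond to any particular product's API.

from datetime import datetime, timedelta

users_by_role = {"loan reviewer": ["alice"], "compliance team": ["bob"]}
work_lists = {"alice": [], "bob": []}

def offer_task(step_name, role, loan_id, due_in_hours=24):
    # "Right person": here simply the first user holding the required role.
    assignee = users_by_role[role][0]
    task = {"loan_id": loan_id, "step": step_name, "done": False,
            "due": datetime.now() + timedelta(hours=due_in_hours)}
    work_lists[assignee].append(task)
    return task

def send_reminders(now=None):
    # "Right time": nudge anyone whose task has slipped past its due time.
    now = now or datetime.now()
    for user, tasks in work_lists.items():
        for task in tasks:
            if not task["done"] and task["due"] < now:
                print(f"reminder to {user}: '{task['step']}' for loan {task['loan_id']} is overdue")

offer_task("legal check", "compliance team", 1001, due_in_hours=0)
send_reminders()   # prints an overdue reminder for bob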
The key difference is that the business analyst is in control of the human side of
the application. The business analyst can rearrange slices, and add in manual
steps quickly, without having to do any programming. This yields a form of agility
that is rapidly becoming a competitive differentiator in the industry.
This is the business value of workflow and human-oriented BPM.
Knowledge Intensive BPM
Jon Pyke, The Process Factory Ltd., UK
Many of us involved in the field of Workflow Automation and Business Process
Management (BPM) have argued long and hard about where these two technolo-
gies overlap, where they are different, which mathematical models to use, which
standards are applicable to which part of the technology stack and all that asso-
ciated puff.
Well, these arguments and discussions are over; the demarcation lines have been
drawn; the road ahead is clear.
The fact that Business Process Management has its roots in Workflow technology
is well known; many of today's leading products are, in fact, evolutions of the
original forms processing packages. So there is no longer a need to debate what is
now a moot point.
But what has happened is that BPM has also changed. Rather than being an extension of workflow concepts, BPM is now seen as systems-to-systems technology used exclusively in the deployment of SOA solutions. I'm oversimplifying things, I know, but it does seem that BPM is becoming an IT technology solution as opposed to the business process solution it was meant to be. Somewhere along the way, one of the key elements in a business process, a person, dropped off the agenda. The fact that the majority of business processes (some 85 percent according to the analyst company Forrester) involve carbon-based resources[1] was overlooked. Think BPEL for a moment: doesn't the development of that particular standard tell you something about the general direction of BPM? But be warned: many vendors will tell you that their BPM products support human interaction, but what they are talking about will be simple work item handling and form filling; this is a long way from the collaboration and interaction management we will talk about below.
The problem stems from the fact that most workflow products were flawed and, as a result, the problem in the gene pool has rippled through to the new BPM species. So what was wrong with workflow? It's quite simple when you think about it: most workflow products assumed that work moved from one resource to another. One user entered the loan details, another approved it. But business doesn't work like that.
This flawed thinking is probably the main reason why workflow was never quite
the success most pundits thought it would be; the solutions were just not flexible
enough, since the majority of processes are unsuited to this way of working.
Paradoxically, it is the exact reason why BPM is so suited to the world of SOA and systems-to-systems processes. A rigid approach to systems processes is essential; where people are concerned, the name of the game is flexibility.
WHY DO WE NEED THE FLEXIBILITY?
As mentioned earlier, we have to deal with the unexpected. This is not just about
using a set of tools to deal with every anticipated business outcome or rule; we
are talking about the management of true interaction that takes place between
individuals and groups which cannot be predicted or encapsulated beforehand.
[1] Human beings, people.
This is because business processes exist at two levels: the predictable (the systems) and the unpredictable (the people).
The predictable aspects of the process are easily and well catered for by BPMS solutions, which is why the term Business Process Management is a misnomer, since the perceived technology addresses only the integration aspects. With the close coupling with Service-Oriented Architecture (SOA) (SOA needs BPM; the converse is not true), there is an argument for renaming BPM to Services Process Management (SPM).
The advent of BPEL4People isn't going to fix the problem either; all that will happen is that the shortcomings of workflow will be replicated, and it will be as difficult and expensive to implement as it ever was. Anyone who has tried to put together a business case for buying SOA/BPM will know the entire proposition will be a non-starter.
Understanding that business processes exist at two levels (the Silicon and the Carbon) takes us a long way towards understanding how we solve this problem. The key point is to recognize that the unpredictable actions of the carbon components are not ad-hoc processes, nor are they exception handling (ask anyone with a Six Sigma background about exceptions and you'll understand very quickly what I mean). This is all about the unstructured interactions between people, in particular knowledge workers. These unstructured and unpredictable interactions can, and do, take place all the time, and it's only going to get worse! The advent of Web 2.0, social computing, SaaS, etc. is already having, and will continue to have, a profound effect on the way we manage and do business.
Process-based technology that understands the needs of people and supports the
inherent spontaneity of the human mind is the next logical step, and we might
be tempted to name this potential paradigm shift Knowledge-Intensive Business
Processes.
This is somewhat different from workflow or BPM as we know it because the focus
is on the work, not the process. Of course the underlying objective of the process is still of vital importance; indeed, it provides the underlying bedrock of getting tasks completed. But these processes are much more complex, ad-hoc, enduring and important to the business. They are contracted processes as opposed to the coordinated or controlled processes provided by workflow and BPM solutions.
Let's take a simple analogy so that the concept is more easily understood.
Suppose you were playing golf; using the BPM approach would be like hitting a hole in one every time you tee off. Impressive: 18 shots, and a round finished in 25 minutes.
But as we all know, the reality is somewhat different (well, my golf is different): there's a lot that happens between teeing off and finishing a hole. Normally about four steps (or shots), but you have to deal with the unexpected: sand traps, water hazards, lost balls, free drops, collaboration with fellow players, unexpected consultation with the referee, and so it goes on. Then there are 17 more holes to do; the result is an intricate and complex process with 18 targets but about 72 operations.
KIBPM falls into two main types, which will probably merge over time, and the
vendor that recognizes that potential will steal a march on the others.
The first, already in use today, is Case Management (AKA Case Handling).
Case is a very different proposition and represents a very different opportunity
space for BPM vendors. As mentioned above, today's BPM solutions are still very
production-centric, what analysts used to call "Production Class Workflow." We know that most organizations don't work this way, at least not at the human level.
The key differentiating[2] factor of a case handling environment is the ability to run multiple procedures against a given case of work; the primacy is with the case rather than the process that is used to support a work item. This means that the key concept of this approach is the body of work, the Case, not the typically modeled process of moving work from one resource to another (where a resource is a system, a work tray or a step in a process).
Case Handling might be thought of as any sort of business problem where, without technology support, a manila folder would have been used to store related documents and information about things of interest, with tasks then associated with the contents; an intelligent to-do list, if you will. Case Handling systems leverage the capability to associate virtually any number of objects within the context of a case. Processes tend to unfold rather than rely on a priori design-time decisions (but within the context of an overall framework). Clearly, the activities are related and cases follow typical patterns, but the process becomes the recipe for handling cases of a given type.[3]
The following diagram illustrates what a typical solution might look like:
[Figure: several case-handling procedures running against individual cases within a larger overall process context.]
Case management is effectively a sub-process within a much larger context and
inherently more flexible.
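A small sketch (with hypothetical names) makes the shift in primacy visible: the Case holds an arbitrary collection of related objects, and more than one procedure can act on the same case, rather than one work item moving along a single route.

from dataclasses import dataclass, field

@dataclass
class Case:
    case_id: str
    case_type: str
    contents: dict = field(default_factory=dict)   # documents, notes, data of any kind
    history: list = field(default_factory=list)

    def attach(self, name, item):
        self.contents[name] = item

    def record(self, event):
        self.history.append(event)

def claim_assessment(case):
    case.record("assessor reviewed the claim form")

def fraud_review(case):
    case.record("fraud team ran background queries")

claim = Case("C-2007-001", "insurance claim")
claim.attach("claim form", {"amount": 12500})
claim.attach("adjuster notes", "phoned claimant, requested photos")

# Two procedures unfold against the same body of work, in whatever order
# the case demands, within the overall framework for cases of this type.
for procedure in (claim_assessment, fraud_review):
    procedure(claim)

print(claim.history)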
The next logical step from case management is to focus still further on the human
aspects and introduce the concepts of Human Interaction Management (HIM).
HIM has been developed to provide a mechanism that translates top-level strate-
gies into executable collaborations and to provide an approach to negotiating
public processes.
Worldwide, more and more routine work is gradually being automated and/or
commoditized. So the skilled human work left over is more important than ever
both to individuals, who are competing for a smaller and smaller number of inter-
esting jobs, and to organizations, for whom skilled human work is becoming the
only competitive differentiator left.
[2] Miers, Derek, Enix Consulting; one of his many white papers.
[3] Van der Aalst, Wil, Beyond Workflow Management: Product-Driven Case Handling.
This type of work depends fundamentally on collaboration. Few highly skilled in-
dividuals work in isolation. Yet the new software tools for Internet-based commu-
nication are actually making collaboration less and less efficient, by flooding each
of us with more and more email, chat, text messages, video conferences, telecon-
ferences, documents. Furthermore, organizations have lost the ability to manage their workforce. So we all need to collaborate better. This means adopting a simple, general approach to collaboration, one that meets both individual and organizational needs.
Meeting this need is a framework for human process collaboration, based on the
theory of Human Interaction Management. HIM solves a problem of direct busi-
ness relevance by constructing work processes based on those carried out in their
organization. In this way, the processes are far easier to build and deploy; integra-
tion demands are reduced; there is no need to get it right on first deployment.
A typical human collaborative work process is agile; it changes as it goes along, since much of the work is about jointly deciding what to do next. Hence, a process template intended only as a starting point is enough to deliver significant personal productivity improvements to users, and significant efficiency benefits to organizational management.
HIM is designed to support human work processes, which depend on interaction
and are dynamically shaped by the participants. This is achieved by five main
features of the technology:
1. Connection visibility
2. Structured messaging
3. Support for knowledge work
4. Supportive rather than prescriptive activity management
5. Processes change processes.
It effectively turns strategy into action and provides a mechanism for ensuring that users have control over the strategic objectives and that executives remain in control of the overall deployment, with the ramifications that will have for corporate compliance. Furthermore, it provides strong management control by enabling participation in the process execution, including ongoing redefinition of the process itself, thereby ensuring maximum agility and responsiveness.
[Figure: a Human Interaction Management work process in action.]
The above diagram encapsulates HIM in action.
We've learnt that BPM has evolved from simple routing engines and forms packages that were designed to manage the flow of work and handle controlled processes, through Process Management tools that act upon explicit interactions and effectively coordinate known processes, to the Knowledge-Intensive BPM scenario that handles tacit interactions, case management and contracted processes.
The true process-enabled enterprise will use the technology (BPM) to address all aspects of the value supply chain, the very DNA of the organization. This includes the people (customers, employees and stakeholders), the applications (both internal and external) and the processes up and down the full supply chain.
CONCLUSION
In conclusion, the move towards sophisticated human interaction will have a major impact on the BPM market in the very near future. BPM vendors with the right insight into the market will seize this opportunity and steal a significant march on their competitors, and it is fair to say that some of those vendors will be unable to respond to this growing need.
I doubt there are many BPM products on the market today which will be able to meet this seismic shift in requirements; certainly those that rely on BPEL and SOA won't, and any that have been in the market for longer than five years will need radical surgery to meet the coming challenge.
[Figure: the evolution of the technology shown as nested scopes, with WfM inside BPM inside KIBPM. Source: EDS, "BPM: a Systemic Perspective."]
BPM and Service-Oriented
Architecture Teamed Together:
A Pathway to Success for an
Agile Government
Linus Chow and Charles Medley, BEA Systems;
Clay Richardson, Project Performance Corp., USA
EXECUTIVE SUMMARY
Business Process Management (BPM) and Service-Oriented Architecture (SOA) have evolved almost independently over the last six years to become key technologies leveraged by government agencies in their struggle to become more agile and innovative. Inherently team players, the converging maturity of these disciplines and technologies promises to produce an ROI multiplier[1] effect for organizations worldwide. This paper showcases the real challenges faced and benefits derived by government agencies seeking to use BPM and SOA to meet constituent demands for increased agility and responsiveness. An example-driven analysis of these high-impact solutions and what they mean to government entities such as the US Department of Defense, US Intelligence Agency, and Government of Bermuda is provided in this chapter.
THREE KEY CHALLENGES WILL FACE GOVERNMENT OVER THE NEXT TEN YEARS
Through each budget cycle, government agencies are constantly challenged to ac-
complish more far-reaching goals with less working capital. This trend is expected
to continue as government tightens its belt to accommodate social and military de-
mands. Government executives, CIOs, and technology managers have become ac-
customed to the mantra: "Do More with Less." While technology has been a major driver in addressing the need to "Do More with Less" in the last three to four years, the challenges have increased and are pushing technology to the limits of its capabilities.
The first of the great challenges facing government is the retirement tsunami that will take place over the next five to ten years. As reported in the Washington Post, 60 percent of the government's 1.6 million white-collar employees will be eligible for retirement in the next 10 years.[2] With such a large portion of the baby-boomer generation retiring from government, the impact undoubtedly will be widespread. This issue will not only impact the federal government; it is also anticipated that large numbers of retiring baby-boomers will step away from state and local government jobs in the coming years. "Over 30 percent of California's workforce will retire in the next three years," points out Anthony L. Souza, a former CIO with the State of California.[3] "This represents a looming crisis that the State is currently planning for," states Souza.
The immediate challenge arising from this "retirement tsunami" will be filling the
high number of job vacancies that will be created. However, the greater challenge
lies in addressing the great loss of institutional knowledge that will literally walk out
of the door when boomers retire. In many cases, this knowledge that has been built
up over decades of service to the government will be difficult, if not impossible, to
capture or reconstitute. To address the strains created by projected mass-
retirements, government will lean more heavily on information technology to drive
efficiency, productivity, and self-service using a smaller, leaner workforce. Technol-
ogy will need to capture the institutional knowledge of retiring government workers
in order to ensure continued enforcement of informal and undocumented policies
and procedures.
The second major challenge facing government is the need to support longer tech-
nology refresh rates. Many agencies are now faced with technology refresh rates of
three to four years. Looking out into the future, these refresh rates are likely to re-
turn to those seen during the 1970s and 1980s.
"Given the trend toward lower information technology budgets for government agencies, systems will need to remain in commission longer than what we saw in the late 1990s and early 2000s," points out John Kim, who supports infrastructure planning activities for the United States Patent and Trademark Office.[4] Looking to the future, Kim envisions technology refresh cycles of five to 10 years, where systems will need to "remain in commission and evolve to meet the needs of internal and external constituents." "This means systems will need to be able to incorporate rapid changes that result from new legislation and rule changes," says Kim.
The third great challenge facing government is the continued trend toward out-
sourced information technology services. In most agencies today, the bulk of infor-
mation technology services are outsourced to technology consulting firms. This of-
ten means that agencies are orchestrating development projects across various in-
tegrators that usually compete with one another in the federal marketplace. How-
ever, agencies require these firms to work closely together to deliver mission-critical
systems that cut across various departments.
In the future it will become even more important that federal contractors and consulting firms "play nice" together to support the agency's mission. Technology and interoperability standards will need to facilitate collaboration and communication across various systems, platforms, programming languages, and methodologies. Without standards and the appropriate tools, it becomes impossible for government to connect the various development activities of integrators to the larger picture of the agency's mission.
BPM AND SOA ARE CONVERGING TO ANSWER THESE GREAT CHALLENGES
Over the last six years, Business Process Management (BPM) and Service-Oriented
Architecture (SOA) have evolved as key components in the technology landscape. In
addition, these two technologies represent the greatest promise for answering the
key challenges that will be faced by government in the coming years.
While these two technologies represent the greatest promise, they also represent the
most over-hyped and maligned technologies today. Most government CIOs and in-
formation technology managers view SOA and BPM more as industry buzzwords
than technologies that have substance and can deliver measurable results. Much of
this misunderstanding comes from the constant barrage of mixed messages ex-
plaining BPM and SOA and their promised benefits.
At Wikipedia.com, a search on "Service-Oriented Architecture" turns up over 10 pages of content that define this technology. Even within the Wikipedia.com definition, there is some debate as to exactly what SOA is and isn't. In addition, a quick poll of most information technology managers turns up SOA as one of the most disdained terms in the technology lexicon. However, to understand SOA it is
helpful to take a trip back to the mid-1990s when Object Oriented Programming
was the hottest phrase in the world of technology. Object Oriented Programming
(OOP) was introduced as a paradigm that allowed application code to closely mirror
the objects and functions provided by a particular business activity. For example,
an accounting program might contain a "general ledger" object along with functions
for credit, debit, and reconcile.
In theory, OOP made it easier for programmers to design and develop systems that "looked like" the business. The promise of OOP allowed technologists to remain in the driver's seat from requirements through development to deployment of government systems. Of course, government business units also thought OOP was great, as it promised to give them a little more visibility into the "black box" of technology.
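The accounting example above might be sketched as follows; this is purely an illustration of the OOP idea, not code from any real system.

class GeneralLedger:
    """An object whose methods mirror the business functions named in the text."""

    def __init__(self):
        self.entries = []   # (account, amount) pairs; positive = credit, negative = debit

    def credit(self, account, amount):
        self.entries.append((account, amount))

    def debit(self, account, amount):
        self.entries.append((account, -amount))

    def reconcile(self):
        # The ledger balances when credits and debits cancel out.
        return sum(amount for _, amount in self.entries) == 0

ledger = GeneralLedger()
ledger.credit("cash", 500)
ledger.debit("accounts receivable", 500)
print(ledger.reconcile())   # True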
Throughout the modern history of information technology, government business
units have demanded greater control over what their systems look like and their
ability to shape these systems. Of course, government business units want their
systems to look and operate exactly as their businesses look and operate. SOA and
BPM represent the continuation of this trend toward technology mirroring the
business.
In fact, from a business perspective, the idea of service orientation is not a new
concept. In government, the concept of "service design" has been around for decades. "Service design" in this context deals with the physical services that are provided to internal and external constituents and how these services are designed.
For example, at some point, someone within the Internal Revenue Service had to
define what services the organization would provide and how those services would
be carried out. In very much the same way, Service-Oriented Architecture is con-
cerned with designing and constructing the technical services that are provided to
internal and external constituents. In many cases, these technical services are be-
ing designed to mirror and support the physical services that are provided by the
governmental agency.
"In the U.K., government is focused on designing better services, which often leads to better use of technology," reports Nick Jones, Assistant Director of the Transformation and Delivery Group within the United Kingdom Government.[5] "Our goal is to first design a better service before we try to apply technology; ultimately we believe this allows us to create a technical architecture that supports the actual services delivered by government."
The basic idea behind SOA is to provide a set of services that can be consumed by
anyone at any time. In essence, SOA represents the latest in good systems design
that promotes the idea of delivering a black box that can be used in perpetuity by
other systems and applications. In theory, any changes that need to be made inside
of the black box do not impact any of the systems that are currently using it. This
approach supports the trend toward longer technology refresh rates within govern-
ment. The black box can remain in commission as long as it is needed without
necessarily having to update the underlying technology. Technology updates and
refreshes can be decoupled from relying on a particular hardware, operating sys-
tem, or database platform.
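The "black box" idea can be sketched as a stable contract with interchangeable implementations; the names below are illustrative only, and any resemblance to a real agency service is coincidental.

from abc import ABC, abstractmethod

class CitizenRecordService(ABC):
    """The published contract that consuming systems depend on."""
    @abstractmethod
    def get_record(self, citizen_id):
        ...

class LegacyMainframeImpl(CitizenRecordService):
    def get_record(self, citizen_id):
        return {"id": citizen_id, "source": "mainframe"}

class RefreshedPlatformImpl(CitizenRecordService):
    def get_record(self, citizen_id):
        return {"id": citizen_id, "source": "new platform"}

def benefits_application(service, citizen_id):
    # The consumer neither knows nor cares what sits behind the contract.
    return service.get_record(citizen_id)

print(benefits_application(LegacyMainframeImpl(), "A-123"))
print(benefits_application(RefreshedPlatformImpl(), "A-123"))   # same consumer, new internals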
Service-Oriented Architecture's promise of delivering these "black boxes" also supports government's continued trend toward outsourced technology services. As integrators develop new applications using SOA standards and frameworks, government can connect these systems to other applications being developed by other integrators throughout the agency.
Ultimately, government CIOs and information technology managers are signing up to play less of a leading role and to share more control with business units in shaping the design and functionality of systems. Business Process Management has emerged as both a technology and a methodology that allows information technology managers to better collaborate with business units on system initiatives. Using BPM, technology managers can work directly with business owners to model and automate their key processes, which also allows them to quickly incorporate improvements and enhancements as they are required.
Case Study #1[7]: Transforming a Defense Agency into an Agile Enterprise
Executive Overview:
SRA International is utilizing Business Process Management (BPM) and Service-Oriented
Architecture (SOA) methodologies, tools, and techniques to help transform a large Depart-
ment of Defense agency into an agile enterprise. As shown in the diagram, the current
agency environment is a highly distributed organization that is moving toward a tightly cou-
pled enterprise that will replace localized, redundant processes and stovepiped applications
and databases with integrated enterprise processes, applications, and databases. The trans-
formation process includes mission and business re-alignment; consolidation of locations, or-
ganizations, and human resources; and IT Transformation to better support the mission and
business transformation.
[Figure: the agency transformation roadmap and the BPM/SOA pilot. The current agency environment (a loosely coupled enterprise with localized, redundant processes and stovepiped applications and databases) is carried through the transformation mechanisms (re-alignment, consolidation, and IT transformation) and the IT transformation areas (IT Operations, IT Infrastructure, Information Assurance, and End User Services) toward the transformation objectives (quality, reliability, security, cost reduction, greater efficiency, and customer satisfaction), guided by an enterprise architecture of AS IS components, TO BE components, and sequencing plans. The Business Process Management Program defines AS IS and TO BE processes and prioritizes process improvements. The BPM/SOA pilot for agency contracting runs from stakeholder interviews through a target process model and a requirements specification to incremental process automation and process, data, and system integration built on BEA AquaLogic components, Open Text eDocs DM, and eBusiness interfaces. The target Contracting eBusiness Integration Services connect executive managers, functional managers, CO/COTRs, contracting specialists, and other users through an Enterprise Portal to 34 Federal, DoD, and Agency eBusiness systems, within an iterative cycle of BPM/SOA governance, infrastructure, and business process automation.]
The IT transformation includes multi-year sequencing of improvements to IT Operations, IT
Infrastructure, Information Assurance, and End User Services and achievement of the objec-
tives shown in the diagram. The Troux Metix Enterprise Architecture tool is being used to
support all phases of the agency transformation, including identification of AS IS components,
definition of TO BE components, and the detailed multi-year sequencing required to achieve
the target transformation environment. The enterprise architecture embodies the prioritized
business process improvements identified through the agency Business Process Manage-
ment Program (BMMP). The initial agency process selected from the enterprise architecture
and BMMP for the BPM/SOA Pilot is agency contracting, which is a multi-billion dollar a year
operation that involves interfacing with 34 different Federal, DoD, and Agency e-business
systems.
The agency Enterprise Portal uses the BEA AquaLogic User Interaction (ALUI) COTS soft-
ware and it is the central interface for all agency applications. The BEA AquaLogic Business
Process Management (ALBPM) BPMS COTS software was selected because of its deep in-
tegration with ALUI and its ability to support the transformation of agency processes through
process automation and integration of internal and external data, processes, and systems.
The initial step in the Contracting BPMS was to conduct interviews with over 30 stakeholder
organizations. A major part of the effort was to conduct a series of workshops to define an en-
terprise level contracting process for the agency. This target process is the key enabler for
process automation with the ALBPM modeling and design tools. A Requirements Specifica-
tion was developed as well as an Interface Definition Document for the 34 Federal, DoD, and
Agency e-business systems that will be integrated into the environment. The BPM modeling
tool allows the agency to define all levels of the agency contracting process and identify proc-
ess increments that will be implemented in phases.
The target system concept is shown at the bottom of the diagram. The system will support all
contracting phases (pre-award, award, and post-award) and will support process automation
for all contracting actions, including development of contract requirements, new contract pro-
curement, task order/delivery order modifications, and contract modifications. The users iden-
tified in the diagram will utilize familiar Enterprise Portal interfaces, and integration with external and internal e-business systems will be transparent to the system users. The results of the pilot will provide lessons learned that will allow the agency to develop BPM and SOA Guidance, reusable processes and services, and the BPM and SOA infrastructure needed to expand the BPM/SOA effort to other agency processes identified in the enterprise architecture.
Typically, enterprise BPM software provides modeling and design tools for docu-
menting and automating processes, including business rules, task assignment,
escalations, and notifications. In many cases, BPM applications can be used to cap-
ture and automate the institutionalized knowledge that has been maintained in employees' heads and across myriad systems. Before retiring boomers begin to
walk out of the door, many agencies are beginning to realize the value of capturing
and automating their institutional knowledge using Business Process Management
tools and methodologies.
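Captured as an explicit, executable rule rather than as knowledge in a retiring employee's head, a task assignment with an escalation might look like the following sketch (hypothetical names, not any particular BPM suite):

from datetime import datetime, timedelta

def assign_task(task_name, owner, deadline_hours=48):
    return {"name": task_name, "owner": owner, "escalated_to": None, "done": False,
            "deadline": datetime.now() + timedelta(hours=deadline_hours)}

def check_escalation(task, supervisor, now=None):
    # The undocumented office habit ("chase it after two days") becomes an explicit rule.
    now = now or datetime.now()
    if not task["done"] and now > task["deadline"] and task["escalated_to"] is None:
        task["escalated_to"] = supervisor
        print(f"notify {supervisor}: '{task['name']}' is overdue with {task['owner']}")
    return task

task = assign_task("review procurement request", "analyst1", deadline_hours=0)
check_escalation(task, "supervisor1")   # prints the escalation notification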
At its core, Business Process Management is all about orchestrating services;
whether those services are system-to-system, system-to-human, or human-to-
human is irrelevant. Thus, BPM has evolved to be quite complementary to Service-
Oriented Architecture, as BPM acts as a consumer of the services developed and
published by the enterprise. In addition, BPM is often a provider of services, as
other applications can take advantage of the processes deployed via Business Proc-
ess Management applications. Over the next few years, BPM and SOA will converge
to address the key challenges faced by government. Both of these practices and technologies are designed to give information technology managers the ability to better support the needs of the agency's business units.
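In outline, the dual role reads like the sketch below: a process step consumes a service published by the enterprise, and the process itself is exposed so other applications can call it. The names are illustrative; no particular product or service interface is implied.

# A service published by the enterprise (the process is its consumer).
def employee_directory_lookup(employee_id):
    return {"id": employee_id, "clearance": "secret"}

# A process defined as an ordered set of steps, one of which calls that service.
def onboarding_process(employee_id):
    record = employee_directory_lookup(employee_id)          # BPM as service consumer
    steps_completed = ["verify identity",
                       f"grant {record['clearance']} access",
                       "issue badge"]
    return {"employee": employee_id, "steps": steps_completed}

# The same process exposed as a callable service, so BPM is also a provider.
def onboarding_service_endpoint(request):
    return onboarding_process(request["employee_id"])

print(onboarding_service_endpoint({"employee_id": "E-77"}))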
BPM AND SOA: HISTORICAL PERSPECTIVE
It is important to note that BPM and SOA matured through the natural evolution of
business demands and technology advancement. Understanding the historical per-
spective both demystifies and reduces the risk in undertaking projects and pro-
grams focused on BPM and SOA. The general technology trend has been adding
levels of abstraction to enable technology to run more like the business. Likewise,
business concepts and methodologies have been created (as extensions of tried and
true methods) to deal with new technologies.
[Figure content: a timeline of key business concepts and technologies. 1980s: TQM (Total Quality Management), Six Sigma invented (1986), CASE and IEF tools. 1990s: BPR (Business Process Reengineering) by Hammer and Champy, ERP (Enterprise Resource Planning), CRM (Customer Relationship Management), EAI (Enterprise Application Integration), groupware, and workflow. 2000s: BPM: The Third Wave by Smith and Fingar (2002), SOA (Service-Oriented Architecture), and BPM.]
Figure 1.1: General Historical Timeline showing when key business concepts and
technologies became widely recognized.
From a historical perspective (Figure 1.1), it can be seen how BPM and SOA evolved, and how the maturity of technology solutions has accelerated over time. In the early days there were wide gaps between business practices (e.g. TQM and BPR) and the actual technology running the business (e.g. mainframes and custom code). This changed in the 1990s with technologies focused on standardizing processes based on the best practices of the times (e.g. ERP, CRM, HRMS, etc.) and, of course, integration of these stovepipe solutions (e.g. EAI). While the technologies were built based on (at the time) industry best practices, this still meant that technology ran the business. At first, being standardized meant being more efficient, but eventually organizations saw that it also meant losing uniqueness, which impacted longer-term efficiency. Workflow evolved from this gap to add flexibility to the standard solutions.
With many organizations now having a multitude of existing applications, data
sources, and legacy systems, you can see that the time is ripe for the next evolution
of technology and business methodology. And with lessons learned and building blocks in place, BPM and SOA took a further step in making technology run like the business and in synchronizing business methodology with technology tools.
[Figure content: BPM optimizes business processes, solves the demand for insight, and is driven directly by business/agency goals. SOA organizes IT infrastructure, solves the demand for encapsulation, is driven indirectly by business goals translated into a need for IT agility and governance, and provides a layer of control and governance for IT underneath BPM. SOA greatly simplifies BPM implementations and multiplies BPM's effectiveness.]
Figure 1.2: BPM & SOA
BPM and SOA are both produced by the natural progress of business and IT striv-
ing to work together more efficiently and effectively. It is easy to note that these two
technology solutions are complementary in nature (Figure 1.2).
"SOA makes content (information and services) accessible when and where it's needed, while BPM applies it to real, dynamic missions. Defense, intelligence, and homeland security professionals make better decisions, faster, and that saves lives," says Dave Godman, director of Net-Centric Horizontal Integration programs at BAE Systems.[6]
BPM is a strategy for managing and improving the performance of the business
through continuous optimization of business processes in a closed-loop cycle of
modeling, execution and measurement. In essence BPM is a combination of both a
best practice methodology and an integrated technology solution. BPM was created
from the business-driven evolution and merging of different technology trends. It is
easy to see that BPM solutions have evolved technology to run as the business.
Many features, in whole or part, were combined to satisfy the BPM lifecycle. And
this lifecycle is driven directly by organizational goals. This merging of technologies
into a seamless Integrated Design Environment (IDE), provides the level of abstrac-
tion needed for both technology and business specialist to talk the same lan-
guage. This is no insignificant feat, as this builds trust as well as agility throughout
the organization.
[Figure content: the technologies combined to fit the business-driven BPM lifecycle (Analyze, Automate, Optimize, Innovate): process modeling tools, change management tools, documentation tools, simulation tools, development (code) tools, a workflow engine, EAI integration capabilities, forms tools, Business Activity Monitoring (BAM), business intelligence, and a portal, supporting concerns such as As Is/To Be documentation and simulation, best practices, accuracy, manual activities, integration, control and compliance, alignment, agility, synchronization, opportunity and change management, transparency, monitoring and audit, real-time metrics, simplicity, and efficiency.]
Figure 1.3: Technology tools that were part of the evolution of BPM
Even today many of the technologies that are combined in a full BPM Suite are still
also available as stand-alone products. Additionally, not all technology components
are fully absorbed by BPM. In fact, there is wide variance among BPM vendors as to which components are even included and how integrated they are with the BPM Suite. The most common difference, one that even industry analysts recognize, is between human-centric and integration-centric BPM.
"Using BPM, we are turning .pdf plan documents and Standard Operating Procedures into executable applications that drive incident response processes. While BPM does this, it collects operational metrics which can be used to rapidly adjust business processes in response to changing environments," says Mike Walters, Project Manager, BAE Systems.[6]
[Figure content: the business process improvement opportunity life cycle (Analyze, Automate, Optimize, Innovate), plotted as return on investment against time.]
With the focus on organizational agility,
BPM Suites provide a lifecycle of im-
provement to enable government agility
and the ability to establish long chains
of process improvement. These chains
of process improvement not only pro-
vide return on investment (ROI), but
also the ability to innovate and, in effect, "future-proof" processes. This also makes uniqueness an advantage, as
processes are driven by the organiza-
tion vs. IT and yet are properly man-
aged and governed.
Figure 1.4: How the BPM Lifecycle leads to Business Process Improvement
ROI for the government is not always measured in dollars and cents. While cost
savings and efficiencies are key value propositions for both commercial and gov-
ernment organizations, certain governmental processes measure ROI with addi-
tional factors. Responsiveness, compliance, reduction of reaction times, and zero
process failures are some examples of factors that can measurably impact an agency's ability to meet its organizational goals and objectives.
[Figure content: BPM delivers Efficiency, Control, and Agility. Efficiency means delivering more, better, faster, and cheaper than the current alternative; its metrics include utilization, capacity, throughput, speed, quality, yield, and exceptions, and its results include reduced operational costs, improved productivity, better resource utilization, and better quality of service. Control means consistently knowing the current status and outcome of your processes; its metrics include financial and organizational measures, SLA failure rate, and rate of non-compliance, and its results include managed outcomes, compliance, improved visibility, and the ability to impact change. Agility means the ability to adapt quickly to changing world conditions; its metrics include speed to create and change processes and responsiveness, and its results include changing with market and policy demands, leveraging net-centric strategies, and improved sharing across disparate organizations.]
Figure 1.5: BPM delivers measurable value
The ability to deliver efficiency, control, and agility makes BPM a key tool for gov-
ernment to become more agile and efficient, but in a controlled manner. This enables agencies to respond to government-specific challenges in ensuring public safety and security.
Case Study #2[13]: Incident Crisis Response & Management System
Executive Summary:
The most important issues facing incident crisis response teams during an event re-
main agility, efficiency, process effectiveness, and information in context. The Na-
tional Response Plan (NRP) is considered to be the core operational plan for na-
tional, and in many cases local, incident management. Additionally, the NRP estab-
lishes national-level coordinating structures, processes, and protocols that will be in-
corporated into existing interagency incident planning and process mechanisms for
particular contingency scenarios.
The NRP and National Incident Management System (NIMS) are models and guide-
lines to improve the Nation's incident management capabilities, processes, agility,
and overall efficiency during a crisis situation of any type. These guidelines outline
processes for pre-incident activities, mid-incident activities, and post-incident activi-
ties. Furthermore, many organizations are using these processes and guidelines for
managing their local events and incidents regardless of cause, size, or complexity.
This has increased the importance of leveraging a service-oriented architecture
(SOA) and a business process management system (BPMS) within technical solu-
tions to help manage incident planning, response, and support.
ManTech's Incident Crisis Response Operational Support System (ICROSS) is the conceptualization of an agile, automated, easy-to-use system that enables all structures, processes, and protocols outlined within the NRP and NIMS to be facilitated through the use of SOA- and BPM-enabled technologies.
The functionality of this system will include many functions, all intended to enhance crisis management and response through information sharing. The high-level services that systems like ICROSS provide must include:
• A secure, robust portal framework to enable ease of use and maximize interoperability through a solid service-oriented foundation;
• An online Incident/Suspicious Activity management system built around the NRP/NIMS processes and the Incident Command System (ICS) Forms outlined in the NIMS model;
• Enhanced situational awareness, such as live streaming video, mass notifications, and secure real-time chat from multiple sites, operational cameras, and mobile devices to connect the operations centers with the responder;
• A framework for multiple data feeds from many sources;
• An interoperability center for managing messages and data transformations using the current government standards for information exchange, including the National Information Exchange Model (NIEM), Global Justice XML Data Model (GJXDM), Common Alerting Protocol (CAP), the Emergency Data Exchange Language (EDXL), and more;
• A document management system with workflow and a content repository;
• A Geographic Information System interoperability framework;
• An enterprise-class search engine.
The design of incident crisis response systems must address these requirements
and must be based on an enterprise level, service-oriented approach that provides a
flexible and agile technology stack, based on best of breed commercial off-the-shelf
(COTS) products, that can be modified to meet the particular needs of any region,
agency, organization, and/or user community.
Importance of Business Process Management in an Incident Crisis and Re-
sponse System
The premise behind ManTech's ICROSS solutions is the operational need for critical
situational data to be shared across a State or region in the event of and in response
to a major crisis impacting the entire State or specific regions within the State. Access
to this data and the ability to share it across the State will enable government leader-
ship, emergency managers, and first responders to assess the availability of critical
resources and coordinate responses as necessary. A natural disaster, a terrorist at-
tack, or a pandemic outbreak of a disease are examples of situations that would re-
quire the rapid and coordinated response of resources across the State and the
need to rapidly share information from many systems or services with multiple State,
local and national organizations as the situation so requires.
Technology is an enabler of solutions to promote these data exchanges and ensures
relevant and timely information is available to decision makers and emergency ser-
vices personnel when and where needed. From this perspective, technology is
needed to support the well-known axiom of getting the right information to the right
person in the right place and at the right time. Meeting this need is a significant chal-
lenge within a single organization or location and this challenge is vastly more difficult
when dealing with multiple organizations across the State and Federal governments
and their private sector partners.
During a crisis situation, many organizations are working many scenarios at an extreme
pace, without time to think through a process model or activity. Additionally, the makeup
of responders often consists of trained personnel from different organizations
and many civilian volunteers. The information flow between people during an incident
situation will always consist of structured and collaborative data sets. Structured data
is simply a data set that has an enforced composition, often a database structure or
an XML Schema. Examples of a structured data set include all of the incident forms
included as part of the NIMS Incident Command System. Collaborative data has no
enforced composition and is more conversational. Collaborative data sets are evolv-
ing to include web message boards, weblogs, instant messaging conversations,
VoIP-enabled meetings, and traditional e-mails and documents. Both types of infor-
mation are extremely important to managing the overall rapid response requirements
that a crisis poses.
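To make the distinction concrete, here is a minimal sketch (in Java, using the standard JAXB annotations) of a structured record whose composition is enforced by a schema. The class and field names are invented for illustration and are only loosely modeled on an ICS-style form; they are not the actual NIMS ICS form definitions.

import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

// Hypothetical illustration of "structured data": a record whose
// composition can be enforced by a schema. Names are invented.
@XmlRootElement(name = "incidentBriefing")
public class IncidentBriefing {

    @XmlElement(required = true)
    public String incidentName;

    @XmlElement(required = true)
    public String incidentCommander;

    @XmlElement(required = true)
    public String situationSummary;

    // A schema generated from this class marks these elements as required;
    // free-form chat or e-mail ("collaborative data") carries no such
    // enforced structure.
}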
Many of the systems that are used for both structured and collaborative data today
are being enabled to work within a service-oriented architecture. More and more ven-
dor product lines are enhancing their overall use of web service or message-based
APIs to enable easier reuse, integration, and communication with each other from a
system-level perspective. This loose coupling of strategic system-level services of-
fers an Information Technology (IT) team the capability to use and reuse these provi-
sioned services in ways not often thought of at design time. Additionally, the service
oriented infrastructure and vendor offerings are becoming very mature and powerful.
There are SOA technologies to enable service discovery and governance, reliable
message routing and transformation, security from the infrastructure edge to service
application layer, and service level agreements at a granular level. Many of the cur-
rent SOA benefits can significantly impact an IT team's strategy, overall effective-
ness, and agility; however, the overall benefits of a SOA must impact an overall mis-
sion, incident, or organization. The challenge ahead is leveraging the overall
investment in SOA to correlate human workflow and service orchestration with over-
all agility and benefits realized during a crisis situation.
Using a business process management system as an integral core service enables
the ICROSS line of solutions and its customers to find ways to work faster, adapt
quickly, and reduce errors before, during, and after a crisis situation. Each function
within ICROSS outlined in the executive summary incorporates a series of loosely
coupled services that are orchestrated by modeling the processes of an organiza-
tions incident management plan. These models are then saved and processed as
services within the BPMS, creating a repository of repeatable and reusable assets for
incidents of a similar nature. BPM addresses both human workflow and system-level
process management across organizational boundaries, issues that often plague sys-
tems that have embedded workflow and integration with heterogeneous systems.
The NRP and NIMS are excellent guidelines; however, the definition of new events
and incidents during a crisis situation changes depending on many variables un-
known until the crisis is imminent. An incident response system needs to be agile
enough to change the underlying process models to account for those unknown
variables before, during, and after the incident.
The agility provided by a BPMS layer within a SOA for an incident crisis response
support system can save human lives and a substantial amount of time and embar-
rassment for the organization responding to the crisis. While many portions of an op-
eration, organization, or system are task-oriented, they focus on accomplishing
that task as quickly and efficiently as possible; however, they do not often consider
the overall impact of that task on other organizational processes or workflows. This
can lead to data being misplaced or lost. In many cases the data that may not get
captured is collaborative data which helps give important, often critical, context to the
structured data captured via forms. Getting accurate information into the hands of the
person who needs it most at a given time during a crisis could mean the difference
between saving a life and losing a life. The application of BPM will give organiza-
tions new levels of visibility, predictability, and accountability that can be managed as
the incident mission is unfolding. Being able to manage the process lag across or-
ganizational components can have an incredibly positive impact on the overall mis-
sion critical operation taking place.
UNDERSTANDING SERVICE-ORIENTED ARCHITECTURE (SOA)
SOA is an architectural approach that enables the creation of loosely coupled inter-
operable business services that can be easily shared within and between enter-
prises. Most business analysts can see the inherent value in analyzing the proc-
esses that describe the operation of their organization for the purpose of cataloging
and further optimizing the processes so that the business can improve its overall
performance.
Figure 2.1: Organizations Are Moving Forward Now With SOA.8 The chart compares
2005 and 2006 survey responses to the question "What stage is your organization
currently in with respect to SOA?" across the stages Don't Know, Not Planning to
Deploy, Evaluation, Pilot Projects, Department-wide SOA, and Enterprise-wide SOA,
with department-wide SOA up 300 percent and enterprise-wide SOA up 200 percent.
In the realm of enterprise systems, there is a similar decomposition of concepts
that can help organizations derive real value from existing infrastructure. The first
step is to change the way we perceive our existing IT assets: instead of thinking of
them as providing functionality, we need to consider that a collection of functional-
ity which provides value to our enterprise can be decomposed conceptually into
services. That is, systems are usually composed of services, which provide func-
tionality that provides value to the enterprise.
A Service-Oriented Architecture (SOA) is, by its very nature, built upon those ser-
vices. The true value of an SOA is found in reuse and agility. While these attributes
BPM AND SERVICE-ORIENTED ARCHITECTURE TEAMED TOGETHER
45
have been promised by various methodologies and technologies, what differentiates
a Service-Oriented Architecture from other approaches is that it is uniquely geared
to encourage reuse for generations of applications which will last not just years, but
decades. Systems implemented today may live beyond the lifetimes of their original
implementers in the form of virtualized enterprise applications managed as black
boxes that are defined by their interfaces.
"We're able to complete development projects fast, and do it without wasting
taxpayer money. We can develop functionality for $50 million that would have
cost us $120 million or more in the past. I attribute much of that to reusability.
We don't have to reinvent the wheel with every project. Not only does it save
time and money, it reduces risk. We don't have to gamble on multiyear devel-
opment projects that may or may not be useful upon completion," said Lt. Col.
Steven Zenishek, U.S. Air Force, on the SOA implementation for the Department
of Defense Distributed Common Ground System (DCGS). "SOA allows us to
move away from traditional application development. We don't have to spend
years developing a monolithic system that could be obsolete before it goes live.
We can bring capabilities to troops rapidly. We can reuse work on subsequent
phases of the project. Frankly, we may deliver capabilities in 10-15 years that
aren't even on the drawing board yet. SOA gives us that flexibility. It ensures
that investments we make today won't inhibit our progress in the future."10
The Service-Oriented Infrastructure
Services are typically discovered or constructed. In the first case, a service may
be defined by codifying the interfaces of existing applications and IT assets so that
they can be reused. One way to describe this is to say that those existing assets are
being service-enabled.
Figure 2.2: Infrastructure needs addressed by SOA.
When services are created from scratch, this requires that we plan for the func-
tionality that makes the service easy to interact with (for humans), and easy to inte-
grate with (for other, external systems). To achieve this ease-of-reuse, it is typically
desirable to leverage an application platform that makes it easy to build reusable
component-based services such as J2EE or .NET. At the bare minimum, these
newly constructed applications should leverage standards-based protocols and in-
terfaces such as SOAP, XML-RPC, IIOP, DCOM, and RMI in order to maximize their
potential as reusable services in the context of the enterprise. In essence, to service-
enable a new application is to implement it with an eye toward making it accessible
to the world outside for the purpose of making it easier to reuse.
This is not to say that service-enablement requires developers to write code. On the
contrary, tools like an enterprise service bus (ESB) can serve as mediators between
existing IT assets and potential service consumers by providing protocol translation,
message transformation, and message enrichment, as well as the ability to instru-
ment, monitor and manage the endpoints of services. As a result, an ESB may
make even the most isolated resource accessible in a new context. For example,
consider a mainframe application that only understands file drops. This applica-
tion could be made accessible as a SOAP service via an ESB that provides a web
service façade that turns the incoming request into a file for consumption by the
mainframe application.
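Whether hand-coded or simply configured in an ESB, such a façade amounts to something like the sketch below, written here with the standard JAX-WS annotations. The class name, operation, endpoint URL, and drop directory are all invented for illustration and do not describe any particular product.

import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;
import java.io.FileWriter;
import java.io.IOException;

// Hypothetical SOAP facade in front of a legacy file-drop application:
// incoming web service requests are written as flat files that the
// mainframe batch job already knows how to consume.
@WebService
public class IncidentReportFacade {

    private static final String DROP_DIR = "/mainframe/dropbox/"; // illustrative path

    @WebMethod
    public String submitReport(String incidentId, String reportBody) {
        String fileName = DROP_DIR + "incident-" + incidentId + ".txt";
        try (FileWriter out = new FileWriter(fileName)) {
            out.write(reportBody);            // the "file drop" the mainframe expects
        } catch (IOException e) {
            return "ERROR: " + e.getMessage();
        }
        return "ACCEPTED: " + fileName;
    }

    public static void main(String[] args) {
        // Publish the facade as a SOAP endpoint using the JAX-WS runtime.
        Endpoint.publish("http://localhost:8080/icross/incidentReport",
                new IncidentReportFacade());
    }
}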
[Figure: a service-oriented infrastructure stack spanning process orchestration, user
interaction, security services, data and information services, and message services,
with capabilities such as business process management, business rules, business
activity management, enterprise connectivity, portal and multi-channel interaction,
collaboration, federated identity management, distributed application security
management, business intelligence, composite data management, a unified metadata
repository, unified data modeling, a service manager, message management, and a
service registry.]
A large-scale SOA deployment also benefits from the use of tools to assist in cata-
loging and tracking the evolution of services throughout the enterprise. Reuse re-
quires that services be advertised in some manner, and one of the most common
means for doing so is to catalog services in a service registry. Artifacts from the de-
sign & implementation of services and their consumers should be managed in a
service repository so that future efforts can leverage not only the services them-
selves, but knowledge gained by the people involved in creating those services.
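At its core, a registry is a lookup from a logical service name to a live endpoint. The deliberately simplified Java sketch below illustrates only that idea; real registries (UDDI and its successors) add metadata, versioning, policies, and governance, and the names and URL used here are assumptions.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal illustration of service advertisement and discovery.
public class SimpleServiceRegistry {

    private final Map<String, String> endpoints = new ConcurrentHashMap<>();

    // A provider advertises its service under a logical name.
    public void register(String serviceName, String endpointUrl) {
        endpoints.put(serviceName, endpointUrl);
    }

    // A consumer discovers the current endpoint at run time
    // instead of hard-coding it.
    public String lookup(String serviceName) {
        String url = endpoints.get(serviceName);
        if (url == null) {
            throw new IllegalArgumentException("No such service: " + serviceName);
        }
        return url;
    }

    public static void main(String[] args) {
        SimpleServiceRegistry registry = new SimpleServiceRegistry();
        registry.register("IncidentReportService",
                "http://localhost:8080/icross/incidentReport"); // illustrative URL
        System.out.println(registry.lookup("IncidentReportService"));
    }
}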
The Service-Oriented Ecosystem
A Service-Oriented Architecture requires an enterprise to embrace a methodology
about how applications and other IT assets are built, deployed, managed, and lev-
eraged that is focused on service-enablement. While most vendors would like their
customers to believe there is a magic bullet for implementing an SOA, the reality
is that there is a need for governance in these systems.
"Service-Oriented Architecture governance isn't an option; it's an imperative.
The bigger the SOA, the more governance it needs, and the more complex the
governance roles and mechanisms must be. Governance arrangements take a
long time to design and install, and are difficult to enforce, but without them,
every SOA project out of pilot phase is at risk," says Paolo Malinverno, Gartner
Group, "Service-Oriented Architecture Craves Governance," January 20, 2006.
This means that implementing an SOA requires people and processes, not just
products. Case in point, consider an application that is rolled out in to production
as it becomes entrenched in a service-oriented enterprise, an ecosystem of service
consumers who leverage and reuse that applications services will sprout up around
it. Some of these consumers are services themselves (composite services). Every
time a new IT project or business process can leverage existing services, money is
saved and IT resources can be focused on providing new and unique features that
add value to the enterprise instead of constantly revisiting existing applications to
add new capability.
The problem that inevitably arises is that the consumers of a service are now de-
pendent upon that service's interfaces remaining consistent. Any efforts to alter the
service, by reengineering or refactoring, must take into account the ecosystem of
service consumers that rely upon the service. As a result, the argument for imple-
menting a form of SOA governance to manage the lifecycle of services within an or-
ganization is fairly obvious: altering a service carries with it possible consequences
that could be catastrophic. If services are expected to last decades instead of years,
proper management of those services is foundational to making sure that as IT
evolves, the foundation laid by SOA will continue to provide value to the enterprise.
Return on Investment
Figure 2.3: SOA Return on Investment
Over time, a business may build up an entire catalog of services. Business func-
tionality can then be fluidly accessed and reused across many different systems
and by many internal and external service consumers. In this way, SOA can elimi-
nate duplicate data, inconsistent application of policy, rekeying of information, and
human error at the tactical level, while enabling strategic change at the business
process and business capacity level.
COMBINING SOA & BPM FOR AN ROI MULTIPLIER
The relevance of SOA with regards to BPM is that, at its simplest level, a business
process defines the interactions between people and services provided by enterprise
systems. SOA is a paradigm that allows BPM to flourish and multiply the value of
the investment in creating and reusing services enterprise-wide.
In 2007, BPM will become the driver for SOA implementations. The technology for
the convergence of BPM and SOA may not fully mature until 2010, but organiza-
tions should adopt "process architecture" now if they want to take a leadership role
in this trend.7
This has immediate impact: a service-oriented architecture, when combined with a
BPM solution, guarantees the value of the overall architecture. As an organization's
business processes evolve and are further refined, the services they leverage will be
reused many times over without requiring that they be refactored or reengineered.
Services deliver value as they are reused within the context of developed applica-
tions, but the services built out as part of an SOA are reused repeatedly within
business processes to provide an even richer set of functionality. Ultimately, even
the business processes themselves can be treated as services, to be reused within
even more complex composite applications and business processes.
Figure 3.1a: BPM & SOA, the ROI Multiplier.1 The chart plots return on investment
over time across the business process improvement opportunity life cycle (Analyze,
Automate, Optimize, Innovate), with the SOA ROI multiplied at the point of reusability:
Combined ROI = BPM ROI x SOA ROI.
Figure 3.1b: BPM accelerating the ROI of SOA.
BPM and SOA enhance each other bidirectionally. Once a process is automated by
BPM it becomes a reusable service, so SOA enables both the faster deployment of BPM
and the reusability of the resulting processes. Likewise, BPM provides orchestration,
control, and governance of the SOA infrastructure. This means that BPM makes SOA
real to the business. Operational managers and staff not only gain a better
understanding of the underlying services; they also gain command and control over
how services are used, both as separate entities and when combined in logical
processes.
BPM and SOA are not just technologies; each also brings its own overlaying
methodology. In this manner, BPM's and SOA's combined impact on agility is more
than the sum of their parts, because both the methodologies and the technologies of
BPM and SOA are so compatible.
Figure 3.2: BPM and SOA enabling Agility for both IT and Business. Together they
enable business agility and IT agility: the process lifecycle (model, implement, execute,
measure, manage) enables business agility, while the service lifecycle (architect,
implement, execute) enables IT agility, the two connected by discovering and
consuming services.
It is easy to see that the combination of BPM and SOA provides agility to both IT and
business stakeholders. Even more importantly, both provide further abstraction
from underlying technologies and a framework that allows closer collaboration.
Successful improvement in the three Cs (Communication, Command, and Control)
not only reduces implementation risks, but also enhances organizational agility.
Case Study #3: US Intelligence Agency12
Before BPM implementation:
A weekly status report process existed where employees would send an e-mail to their man-
agers at the end of the week. The first line of Managers would review the e-mails, and ap-
prove or reject them. Once all approved, they would be combined into a single document,
and forwarded on to the next level of management for a second review. A third level of man-
agement review also existed, and sometimes specific accomplishments would be selected
from the status reports to be posted in various publications (monthly newsletter, company
website, etc.) At any point in the review process, an item could be rejected, and could get
sent back all the way to the original submitter. Many administrative assistants were involved
in coordinating and organizing the content for review up the management chain.
After BPM implementation:
The organization invested in ALBPM, the workflow component of the ALUI suite of products.
ESS developed a workflow process to map out the various steps of the process. Web-based
forms were developed to capture the status reports, and others for Management to ACCEPT
or REJECT. ESS also used various third-party software products that exposed their services.
Using the ALBPM designer tool, ESS utilized the third-party services and Portal APIs to then
automate many of the system processes that were previously done by people in the process.
Hence the Process has been streamlined: Every Friday, the process engine automatically
sends an e-mail out to each researcher. Each e-mail contains a link and reminder to fill out
their status reports. The link goes to the SSO-enabled portal, which automatically authenticates
them to their workflow inbox. From there they fill out their status report. When they submit the
form, the data is stored into a database. The status reports are automatically routed to the
workflow inbox of the researcher's respective Focus Group Manager for review. The Focus
Group Manager categorizes each status report item for reporting and metrics, and can accept or re-
ject it back to the researcher. After all the status reports for the Focus Group have been ac-
cepted, an automatic step in the workflow executes a series of code snippets to create the
new folder in the Portal's knowledge directory, and combines the status report data into a sin-
gle HTML document. The HTML document is then added to the folder and made searchable.
The workflow then routes each status report to the inbox of the Office Manager, who can fur-
ther review each item. The Office Manager can ACCEPT or REJECT, and make final deci-
sions on where selected items should be published. The item is then routed to the Legal team
for review, to Accept or Reject. Any time an item is rejected the user has the ability to add a
rejection rationale, which follows the item all the way back to the person who eventually re-
solves the rejection. After the Legal team has approved the item, it is archived, then routed to
the appropriate publication end point. This can be in the form of an email, a webservice call, a
database insert, or some other system API call to initiate the process of final publication, de-
pending on where the item is being sent. For example, the newsletter editors prefer an email.
Reporting and Metrics portlets have been developed that provide summary management in-
formation on the status reports; for example, Quantity per Focus Group for quarterly report-
ing. From the user's perspective the entire process has been streamlined, and by leveraging
the services exposed by the portal, many of the tedious tasks have been automated.
This example demonstrates how a BPM tool can orchestrate both human activities and sys-
tem activities. For business processes that include both, ROI will be more obvious for the
business systems that expose services that can be leveraged by a BPM tool. The obvious
conclusion is that organizations that adopt the SOA approach and invest in BPM
technologies quickly see the direct relationship between the success of the two.
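Stripped of any particular product, the orchestration just described alternates human review steps with automated system steps. The vendor-neutral Java sketch below shows only that shape; the interfaces and method names are invented, and the actual process in this case study was modeled in ALBPM rather than written as code.

import java.util.List;

// Minimal sketch of a process that orchestrates human and system activities.
public class StatusReportProcess {

    enum Decision { ACCEPT, REJECT }

    interface ReviewTask {            // a human step delivered to a workflow inbox
        Decision review(String item);
    }

    interface PublicationService {    // a system step exposed as a service
        void publish(String combinedReport);
    }

    public void run(List<String> reports,
                    ReviewTask focusGroupManager,
                    ReviewTask officeManager,
                    ReviewTask legalTeam,
                    PublicationService publication) {

        StringBuilder combined = new StringBuilder();
        for (String report : reports) {
            // Human step: the manager's accept/reject decision; in a real
            // process a rejection would route back to the submitter.
            if (focusGroupManager.review(report) == Decision.ACCEPT) {
                combined.append(report).append('\n');
            }
        }

        String document = combined.toString();     // automated system step
        if (officeManager.review(document) == Decision.ACCEPT
                && legalTeam.review(document) == Decision.ACCEPT) {
            publication.publish(document);          // e-mail, web service call, etc.
        }
    }
}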
KEYS TO ARCHITECTING SOA AND BPM SOLUTIONS
With Service-Oriented Architecture and Business Process Management being such
new concepts, it is easy to understand why government agencies feel a sense of
panic when it comes to implementing these two technologies. The key to success-
fully implementing BPM and SOA projects is to start small, think big, and move
fast.
In the past, stakeholders were willing to endure 12-18 month development cycles
before realizing any benefits from a new system or technology. However, today,
stakeholders demand immediate results from the technology initiatives that sup-
port the business. In particular government expects SOA and BPM to pay dividends
in much shorter cycles than previously expected. In some cases, government BPM
initiatives are looking to realize ROI within three to six months.
"Our goal is to identify quick wins that automate processes across several de-
partments," reported David Atwood, Director of e-Government for the Government
of Bermuda.15 Atwood continues, "Using this approach, we expect to realize
immediate return on our investment in process automation."
Project selection is the single most impor-
tant aspect of successfully implementing
Business Process Management and Ser-
vice-Oriented Architecture within the gov-
ernment. If the first project selected to
showcase these technologies does not
match well with their capabilities, the
likelihood of carrying out follow-on pro-
jects is very slim. When evaluating which
projects represent the best fit, stake-
holders should create a prioritization ma-
trix that analyzes process complexity ver-
sus process impact.
Start Small: Projects that represent the least complexity and have lower impact
should be implemented first. These projects are usually the low hanging fruit that
can be implemented to validate the benefits of implementing BPM and SOA before
committing to enterprise-wide deployments. For example, human resources or con-
tracts management projects usually provide low complexity and lower impact; how-
ever, they can serve as initial safe projects for validating the benefits of deploying BPM
and SOA. It is important that each organization takes the time to evaluate and as-
sess which projects represent the greatest opportunity for success in delivering
measurable results in short time frames.
Think Big: As agencies implement their first SOA and BPM projects, they should
always consider the big picture of how the resulting services and processes might
be used by the enterprise. In many cases, this big picture is maintained by the
agency's enterprise architecture initiative. In some cases, agencies are establishing
Business Process Management and Service-Oriented Architecture Centers of Excel-
lence to promote governance, best practices, and cross-departmental collaboration
on BPM and SOA activities.
Move Fast: In order to speed adoption and win over skeptics, agencies will need to
deploy their first Service-Oriented Architecture and Business Process Management
project within three to six months from project inception. This time frame should
serve as a guide during the project selection phase, mentioned above. "With SOA
and BPM, implementation speed is more important than implementation size,"
points out Dale Tuttle, Practice Lead with Project Performance Corporation.16 Tuttle
has supported several large federal customers with their initial forays into Service-
Oriented Architecture. "Gaining momentum and support are key elements to secur-
ing buy-in for SOA," reports Tuttle.
SOA AND BPM INITIATIVES MUST BE CONNECTED WITH THE AGENCY'S ENTERPRISE ARCHITECTURE
INITIATIVE
In the federal government, Enterprise Architecture (EA) drives all technology initia-
tives, investments, and governance. Service-Oriented Architecture and Business
Process Management must respect the role of Enterprise Architecture. It is impor-
tant to note that most Business Process Management and Service-Oriented Archi-
tecture projects are likely to fail or become derailed if they are not connected to the
agency's Enterprise Architecture initiative. Until recently, most agency technology
initiatives were allowed to operate and live within their own silos.
This autonomy was revoked when the federal government introduced the Federal
Enterprise Architecture (FEA) Program in 2002. Led by OMB, the FEA provides
guidance and oversight for each agency's progress with defining and maintaining
their enterprise architectures. OMB's assessments are used by the federal govern-
ment to justify agency funding and budget requests.
"At the agency level, enterprise architecture acts to inform strategic decision-
making to help government executives and CIOs make better decisions regard-
ing technology investment, strategy, and long-term planning," according to Pat-
rick Heinig, a former Senior Enterprise Architect with experience from EA pro-
grams at the Department of Energy, Commerce, and the Environmental Protec-
tion Agency. Heinig also served as a Federal CIO and points out that, given the
guidelines provided by the Federal Enterprise Architecture Program and the
emphasis being placed on the Federal Transition Framework's cross-agency
initiatives, "technology investment decisions can no longer be made in a vac-
uum. In this new paradigm, agencies are being forced to align technology in-
vestments with their overall vision and strategy for accomplishing their mis-
sion."17
Heinig believes that agencies must begin to incorporate their Business Process
Management and Service-Oriented Architecture plans into their current enterprise
architecture models in order to ensure proper levels of funding and investment.
While SOA and BPM projects may start out on a small scale, if the initial projects
are successful, these successes will need to be deployed across the enterprise. If
these initial steps toward SOA and BPM are not connected to the organizations en-
terprise architecture, then they surely face the possibility of extinction down the
line.
In fact, the federal government has established some basic guidelines for how busi-
ness processes should be documented within the federal enterprise architecture
framework. In addition, the FEA program is in the process of publishing the Practi-
cal Guide to Federal Service-Oriented Architecture (PFGSOA). Once published, this
framework will provide guidance to federal agencies on best practices for incorpo-
rating SOA into their enterprise architectures.
CONCLUSION
Over the next ten years government agencies will be challenged to accomplish more
with smaller budgets, less human capital, and fewer technical resources. In order
to meet these challenges, government executives and CIOs will lean heavily on
technology that encapsulates and executes the organizational knowledge that has
been developed over decades of operation. In addition, agencies will seek to produce
technical models that mirror the way that business is done. In short, agencies will
begin to leverage solutions that automate their business processes and deliver their
services, both internally and externally.
Service-Oriented Architecture and Business Process Management have evolved as
disciplines and technologies that are committed to supporting the business objec-
tives of federal, state, local, and non-governmental organizations. While it is ex-
pected that agencies will undoubtedly realize tangible returns on investment by im-
plementing either one of these technologies, an ROI multiplier effect can be ex-
pected if both technologies are implemented together.
As agencies decide to undertake BPM and SOA initiatives, it is critical that champi-
ons and stakeholders remember: Start Small, Think Big, Move Fast. This approach
will ensure that initial successes are realized in short time frames in order to gain
momentum and capture buy-in across the organization.
With the growing role of Enterprise Architecture, agencies must take time to align
SOA and BPM initiatives to their overarching EA activities. SOA and BPM projects
that do not seek the shade of the EA umbrella do so at their own peril. From within
the EA framework, these projects enjoy the proper funding and visibility that will be
required for success.
As is true of all technology, it is important to look beyond the hype and gain an un-
derstanding of how the technology can measurably impact the business. The con-
vergence of BPM and SOA promises many substantial benefits to both the business
and technology of government agencies. Looking beyond the hype, government
agencies can expect to become more agile and responsive to constituent demands
and the changing political landscape by merging their BPM and SOA initiatives.
END NOTES:
1. BPM & SOA the ROI Multiplier Linus Chow, Charles Medley, and Clay
Richardson, 26 January 2007.
2. "Next Generation of Hires Must Be Cultivated Differently and Quickly, OPM
Chief Says," The Washington Post, 21 February 2006.
3. Anthony L. Sousa, Principal. Public Sector Partners.
4. John Kim, Deputy Practice Leader. Project Performance Corporation
(www.ppc.com).
5. Nick Jones, Assistant Director, U.K. Government Transformation and Delivery
Group (currently, acting Director, Department of e-Government for Bermuda
Government).
6. BAE Systems (www.baesystems.com)
7. Jim Schneider, Senior Integration Manager at SRA International (www.sra.com).
8. InfoWorld Research Report: Service Oriented Architecture (SOA), conducted by
IDG Research Services, Feb 2006
9. BEA SOA Cost Benefits Survey, Sep 2006
10. Customer Case Study: U.S. Department of Defense, BEA Systems, October
2005
11. "Predicts 2007: Align BPM and SOA Initiative Now to Increase Chances of Be-
coming a Leader in 2010," Jim Sinur and Janelle B. Hill, Gartner, 10 November
2006.
12. Brian Espinola, Senior Software Engineer and Technical Project Manager, Ex-
ceptional Software Strategies, Inc. (www.exceptionalsoftware.com)
13. Chris Holmes, Technology Director, Mantech (www.mantech.com)
14. BEA Systems, U.S. Department of Defense, SOA for Unified, Global Intelligence,
2005
15. David Atwood, Director Designate. Bermuda Department of e-Government.
16. Dale Tuttle, Practice Leader. Project Performance Corporation (www.ppc.com).
17. Patrick Heinig, Senior Technical Consultant. Project Performance Corporation
(www.ppc.com).
Analyzing and Improving
Core Telecom Business Processes:
A Case Study
Lee, Kyeong Eon; KTF Co., Ltd., Seoul, Korea;
Robert Cain, HandySoft, USA
INTRODUCTION
The telecom market is faced with a sharp increase in the number of service sub-
scribers, to such a degree that demand exceeds supply. In the past, the industry's
information systems consisted mostly of calculating a customer's telephone traffic
and sending a bill. But severe competition and the growing availability of products
in the market space has forced the providers to leverage state-of-the-art technolo-
gies to improve customer service and satisfaction. In order to remain competitive,
information systems had to address this sharp change to customer focus.
A leading telecom carrier based in Korea, KT Freetel Co. Ltd. (KTF), concluded
that creating customer satisfaction-oriented processes that were integrated with
back-end company systems would enable them to more flexibly and spontane-
ously attract, serve, and keep customers who are sensitive to new technology and
service.
One of KTF's goals was standardization of the company's core business processes.
KTF knew that automation of its core processes could maximize business effi-
ciency. Another goal was to embrace a Business Process Management (BPM) and
workflow platform, making processes visible as an enterprise asset, and promoting stan-
dard processes throughout the organization. Promoting the BPM & Workflow Plat-
form within all business units provided a common framework for process im-
provement, and allowed KTF to focus more on process standardization. KTF also
established process consolidation as a goal, and used the BPM & Workflow Plat-
form to integrate business processes wherever possible, allowing human re-
sources to be utilized more efficiently.
THE CHALLENGES
KTF identified and categorized their list of challenges as a Business, Information
Systems, or Organization challenge:
Business: Due to a lack of standardized work methods, individual experi-
ence and knowledge was a key dependency in daily operations, which
led to many errors. To improve the process, KTF needed a common lan-
guage that could be shared among team members. Namely, the demand
for continuous improvement in work processes and standardization was
increasing. In order to adapt to rapid changes in their business environ-
ment, KTF needed to create a very visible process and a strategic method
for change management.
Information Systems: Maintaining and increasing customer numbers
was one of KTF's primary goals. They built CRM and CTI technology-based
call centers using independent, commercially available technologies. But
KTF found that their call center system could not deal with their dynamic
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
56
and rapidly changing business environment. They desperately needed an
integrated, process-based application.
Organization: KTF had experienced significant growth in a very short pe-
riod of time. Therefore, they needed to set up and standardize processes
that previously were not even considered in the organization's original vi-
sion. In addition, KTF needed to build cooperation among teams to make
the processes more flexible and rational.
THE IDENTIFIED TASKS
The tasks that KTF identified in order to meet these challenges can be summa-
rized as follows:
Create processes that were a corporate asset by making them visible.
Integrate system and human resources based on the defined processes.
Secure work transparency and productivity using clearly defined busi-
ness rules and a work system that was integrated with existing and new
systems.
THE CORE PROCESS IMPROVEMENT GOALS
KTF needed to set up a systematic process and introduce tools to deal with cus-
tomer demands for various services and to remain competitive. KTF divided their
BPM & Workflow installation goals into four categories:
I. Core Process Standardization and Build-Up
Set up the processes to be shared among team members.
Minimize work errors through process standardization.
Facilitate transition of work.
II. Core Process Automation and Management
Create a faster work process by setting up a Work-Portal with current
work status and To Do List.
Maximize work efficiency by changing the system from a pull method to a
push method.
Prevent work delays in advance by setting up a real-time monitoring sys-
tem to check work progress.
Shorten work hours.
Reduce simple/redundant work.
III. Process-Centered Rearrangement of Resources
Secure agility and flexibility through a safe transition from the existing
systems into the new process-centered system including integration with
and connection to a transactional system.
Strengthen cooperation among team members through process-centered
work.
Set up an efficient management base to discern and align major IT and
human resources.
IV. Process Improvement through Work Management History
Pinpoint and solve process problems through work management history
that includes monitoring and statistical indices.
Secure business transparency through process monitoring.
Test work performance through development of a process management
index.
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
57
CORE PROCESS CLASSIFICATION, ANALYSIS & PRIORITIZATION
KTF first started to introduce BPM & Workflow in 1999 and set out to apply BPM &
Workflow to all of its processes. The IT team (e-Management Team) led the move
gradually to expand BPM & Workflow beyond a process-by-process productivity
tool to a company-wide system to increase work efficiency and convenience by
setting up a user-oriented portal.
The e-Management teams first step in deciding which processes to automate first
was to classify core processes according to function and business areas. As a re-
sult, they divided all processes in the company into mega processes and then fur-
ther into process chains that relate directly to the business areas:
Mega Process, Process Chain & Process Classification
Beyond the Mega Process and Process Chain categories, processes are further
broken down into 77 detailed processes. For example, in the process chain, Pur-
chasing is broken down further into Purchase Invoice, Purchase Request Form,
Warehousing, and Contracts.
The KTF e-Management Team, departmental management, and the BPM & Work-
flow Platform solution provider chose core processes based on these criteria and
multiple discussions. BPM & Workflow were applied to the core processes accord-
ing to the criteria shown in the table below and on the effectiveness and accessi-
bility of the processes involved.
Criteria, detail, and contents used to select core processes:

Process
  Process Significance: Is it a core process that contributes to cost, customer, or
  quality improvement?
  Related Organization Set Up: Are there many process-related teams and workforce
  that will use the process?
  Standardization Need: Is there a need to strictly define the process and related
  controls so that they follow standardized rules?
  Need To Manage Change: Is there a need to oversee the process for continuing
  process improvement?
  Automation Request: Is there a need to automate work, distribution, or workflow
  based on work guidelines?
  Communication: Is there a need to improve communication between teams and the
  person in charge to carry out the work?

Workflow Handling
  Work Precision Upgrade: Does work have to be repeated due to frequent errors?
  Work Speed Upgrade: Is there a need to speed up the work process?

Process Management
  Work Control Tightening: Is there a need to monitor the work process in real time?
  Related Systems: Are there various systems related to the process?

Information Systems
  Soft Copy Information Management: Is there a need to manage soft copy data (files,
  etc.) that accompany or are related to the process?
When choosing processes for BPM & Workflow initiatives, the teams considered
the above criteria and then prepared a list of process details.
After understanding the business sector and process details, the priority of BPM
& Workflow projects was decided by analyzing process effectiveness and accessi-
bility. In addition, KTF knew that introducing BPM & Workflow would be very dif-
ferent from previous work methods and they expected users to be resistant to
change. Therefore, they wanted to also focus on processes that would have the
most positive impact not only on the company overall but on those people actu-
ally doing the work.
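One hedged way to picture this prioritization step is as a simple scoring exercise: rate each candidate process for impact (effectiveness) and ease of adoption (accessibility), then rank the candidates. The Java sketch below is illustrative only; the weighting and the sample scores are assumptions and do not reflect KTF's actual analysis.

import java.util.Comparator;
import java.util.List;

// Illustrative prioritization: rank candidate processes by a weighted
// combination of expected impact and ease of adoption.
public class ProcessPrioritizer {

    record Candidate(String name, int impact, int easeOfAdoption) {
        double score() {
            // Favor processes that are both impactful and easy to adopt;
            // the 60/40 weighting is an assumption, not KTF's method.
            return 0.6 * impact + 0.4 * easeOfAdoption;
        }
    }

    public static List<Candidate> rank(List<Candidate> candidates) {
        return candidates.stream()
                .sorted(Comparator.comparingDouble(Candidate::score).reversed())
                .toList();
    }

    public static void main(String[] args) {
        List<Candidate> ranked = rank(List.of(
                new Candidate("Purchasing / Invoicing", 8, 9),
                new Candidate("Six Sigma", 6, 7),
                new Candidate("Integrated Account Management", 9, 4)));
        ranked.forEach(c -> System.out.printf("%s -> %.1f%n", c.name(), c.score()));
    }
}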
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
59
BPM & Workflow Application Target Process Prioritization
As a result of the team's analysis, the first sectors to which process improvements
were applied were part of the Purchasing, Value Assessment, and Six Sigma sec-
tors. They identified Purchasing and Invoicing as the main areas that could bene-
fit from BPM & Workflow as those users needed a transparent and speedy work
system.
The latest sectors to which the process improvements were applied were Over-
sight and Performance Assurance (OA) and Integrated Account Management.
The BPM & Workflow process application was expanded in nine sectors and plans
are underway to further expand to all company systems.
THE PROCESS PORTAL: THE WAY PEOPLE WORK AFTER BPM & WORKFLOW
By using BPM & Workflow, KTF centered IT resource integration on processes and se-
cured system flexibility through the integration of back-end systems and the presen-
tation of those system capabilities to users through the BPM & Workflow Plat-
form-powered FreeNet portal.
Better Management through Separation of Processes and Applications
KTF approached BPM & Workflow by splitting business processes and application
logic, thus reducing the burden of always having to account for process change
and variation within applications. The separation afforded KTF a more simplified
approach to application development, which increased development productivity
and reduced maintenance. This approach also yielded simpler application
logic that could be componentized for easier maintenance.
Ease of Access to Knowledge and Information
KTF maximized access to internal and external company knowledge by instituting
a portal to facilitate online collaboration. Web-based user interfaces could be cus-
tomized to meet individual user needs. Users were pleasantly surprised to find
that multiple applications required for completing work could be consolidated into
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
60
a single screen, providing a single point of access for information required across
several applications. The BPM & Workflow Platform also provided the capabilities
to attach files and associate comments for community sharing and collaboration.
Centering all Systems on BPM & Workflow
Prior to embracing the BPM & Workflow Platform, KTF utilized ERP (SAP), FreeNet
(Enterprise Knowledge Portal) Intranet, e-HR, CReaM (CRM), and other systems
as stove-piped technologies. Users had to connect to each system individually to
conduct work. All the distinct systems are now centered on BPM & Workflow, al-
lowing KTF to respond to diverse business environments, as well as rapidly
changing management strategies and goals.
Centering All Systems on BPM & Workflow Platform
The BPM & Workflow Platform serves as a technology footprint, allowing users to
work in an integrated work environment. As a result, if resources
change, if goals change, or if applications change, everything is managed through
processes, providing maximum flexibility.
Before KTF embarked on its BPM & Workflow initiative, much of a typical
worker's productivity was influenced by factors related to finding work, logging in
to multiple systems, locating lost work, finding work inherently related to several
business units, and managing changes in existing work. And, with most work
being conducted offline, productivity was reduced even further through efforts to
share work with others for collaboration. Workers spent significant time searching
individual systems for work or information; and once a search was completed,
there was little information about how it related to other parts of the organization.
The BPM & Workflow Platform provided a clear and simple approach to improving
resource productivity. Although the platform enhanced many aspects of workflow,
it also provided a management framework for linking all systems associated with
Sales, Subscriber (Customer) Support, and Financial Management. By integrating
all of these systems, work is delivered automatically to the appropriate resource,
while providing visibility into resource collaboration. By exposing resource input
for all work deliverables, work is processed more efficiently among all groups
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
61
within the organization, while at the same time sharing feedback between all re-
sources. This type of collaboration helps teams work more effectively together.
KTF employees can access all their work through FreeNet, the KTF intranet. This
KTF BPM & Workflow portal enables users to see the number of work items that
must be completed, all future work, work that must start now, work that is cur-
rently being processed, etc. Each user can see the assigned worklist in the Work
Space Full Screen:
The BPM & Workflow Platform-Powered Work Portal
Employees can look up progress of their work assignments by viewing the status
of work, as well as search for specific work within the portal. Even if the work is
from several different systems, FreeNet provides users with a single point of ac-
cess to work categorized by priority of completion. Users no longer have to log in
and log out of each system. All login information is handled seamlessly by the
BPM & Workflow Platform.
A CLOSER LOOK AT A PROCESS IMPROVEMENT EXAMPLE
The VoC (Voice of the Customer) subscriber claim (customer care) process shown in the process flow dia-
gram below was developed from March of 2005 to March of 2006 as a process
expansion and improvement on the VoC process that was originally developed in
August of 2004 to January of 2005.
This process was engineered to optimize an off-line, manual process that could
not handle customer claims effectively. The CReaM CRM system can initiate the
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
62
process automatically, distributing work to each team as required, while provid-
ing access to each system seamlessly.
Telemarketers receive the VoC from CReaM, and the VoC automatically links to
the BPM & Workflow Platform, where it is routed to the Operational Head Office,
Handling Organization, and Middle/Completed activities which display all
work within the FreeNet portal. Once a process is completed, it can be retrieved
for informational purposes to assist with knowledge support.
The Automated VoC (Subscriber Care) Process
VoC Process Automation Benefits
A summary of the benefits of utilizing the BPM & Workflow Platform for the VoC
process is provided below as they apply to Management, the A/S Center, the IT
Team, and customer satisfaction.
Team and effect:

Management
  VoC process standardization eliminated the arbitrary prioritization of employee
  worklists. Standardization of the VoC process also increased the efficiency of work
  processing, and minimized educational training for new workers.
  Provided visibility into the VoC process, which is very critical to achieving customer
  satisfaction goals.
  Increased customer data collection, which prevented potential customers from
  leaving. For example, it is possible to classify data as a complaining customer or
  VIP, and the system can come up with a strategic marketing plan for each customer.
  VoC-related teams such as the A/S center and the IT team now operate within a
  unified communication platform.

A/S Center
  Absolute guarantee that registration of feedback on VoC is transferred to the IT team.
  Customer feedback can be updated using prior VoC knowledge.

IT Team
  Excessive use of VoC is prevented.
  Visibility into VoC in the customer transfer process.
  Systematic analysis of each employee based on VoC performance.

Customers
  Fast feedback and response to customer complaints.
  Improved customer service.
THE OVERALL RESULTS
As KTF introduced BPM & Workflow, the following key business innovations, with both
quantitative and qualitative effects, were noted:
Quantitative Effects of BPM & Workflow Initiatives
Reduced work cycle times.
Prevention of work delays through automatic work notification.
Use of electronic data (invoices, purchase request forms, order forms, etc.)
resulting in minimized paperwork.
Electronic document transfer reduced work feedback time and prevented
redundant input of identical data.
Real-time, step-by-step management allowed better collaboration of work
between related departments.
Reduced claim response times increased workers' productivity.
Process designs can be easily understood and changed quickly and easily
in the work environment.
Qualitative Effects of BPM & Workflow Initiatives
Work Process Standardization
Standardization of work processes on the BPM & Workflow Platform.
Standardization through user authority setup and business rule applica-
tion.
Users are able to easily perform tasks without relying on a manual.
All parties in a process from start to finish can take the work through a
standard process and they can clearly see the objective and contents of the
work.
Exceptions can be automatically flagged and follow-up measures can be
automatically initiated according to pre-defined business rules.
Work Process Status Monitoring Improvements
Work process detail management and tracing of the work according to the
defined process strengthens work transparency.
Progress status monitoring, automatic work notification, and process visi-
bility improves management levels.
Process status such as due date, emergency, delay, etc., can be checked
continuously.
Company cooperation system is set up to respond to customer demands
in real time.
Simplified reporting function enables easy inspection of process efficiency.
Ability to introduce new services in a timely manner through fast-paced
work processes.
Work management, work negotiation, and managerial functions are im-
proved.
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
64
Ability to manage workload, bottleneck incidence, and problem type by
each unit.
Statistical Data
Process and unit work-related data is automatically collected.
Work collaboration improved.
Meeting agendas and work instructions are communicated to workers
instantly through e-mail and notifications.
Upon completion of the previous work, the work is transferred to the worker's
work list in real time.
Measurable Benefits
The following measurable improvements were directly recorded as a result of the
BPM & Workflow Platforms being deployed at KTF:
Savings and outcomes:

Cost Savings
  Shortened work hours: ~10-15% cost reduction.
  Easy to see the work progress, which reduced idle work hours: 20 hours/month.
  Reduced labor costs required to improve the process: 10 million Korean won/month.

Increased Revenues
  Product development time span was reduced, improving the product settlement
  rate: annual average product development cases, 0.15 → 0.18.
  Work adoption period was reduced by over 10%.
  Knowledge proficiency period was reduced by over 10%.

Productivity Improvements
  Quantitative productivity improvement:
  Work process period: 15 days → 10 days.
  Data collection period: 5 days → real time.
  Per-unit work process average hours: 48 hours → 2 hours.
  Average claim settlement period: 15 days → 11 days.
  Cost to settle one claim: 3 million Korean won → 2.2 million Korean won.
  Customer claims on quality reduced: 100 cases/month → 65 cases/month.
  Qualitative productivity improvement:
  Existing off-line and paper-based settlement and product development applications
  were automated.
  Product development process is now transparent, making forecasts of the product
  development schedule possible.
  All requests for product development and related feedback are unified into the BPM
  & Workflow Platform, thus facilitating communication and minimizing any confusion
  over requests.
  Product development process participation results lead to career history and
  liability management.
  All requests for product development processes and applications are now
  standardized.
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
65
BPM & WORKFLOW MOVING FORWARD
KTF will continue to grow its business by applying the BPM & WorkFlow solution.
KTF expects that this strategy will continue to drive KTF's lead in the industry
and strengthen the company's competitive edge in the following ways:
Continuously monitor operations, and incorporate user demands in order
to maximize applications through exhaustive analyses.
Further expand BPM & Workflow to the cut-off process between core sys-
tems.
Continuously monitor processes in order to improve existing processes.
Create and manage an expert group in each core area to activate culture
and cooperation around the BPM & WorkFlow solution.
Promote the BPM & WorkFlow solution through continued highlights of
benefits.
Manage and operate an expert BPM & Workflow group that can carry out
process optimization by studying and collecting process data, while man-
aging and refining those processes.
Establish strategic goals and CSFs (Critical Success Factors) based on
KPIs (Key Performance Indicators) linked with processes and continuously
analyze process performance.
Through the analysis of process performance, create a database on cus-
tomer-related information such as customer service, quality, cycle times,
and costs, and utilize the data to lay a foundation for making fair judg-
ments on each department and employee.
Define the knowledge required for each process, and find out from where
the knowledge comes. Then supply the knowledge just-in-time at point of
need. Accumulate the results of the process to create a useful knowledge
base for the organization.
To deal with prior internal financial and audit controls, link the BPM &
Workflow system with Internal Controls Assessment according to utility
accounting reformation laws.
Set up an Early Risk Warning system for operational risk management.
Secure process agility to set up a base for the Real-Time Enterprise.
CONCLUSION
With over 2,500 employees and annual revenues of $5.6 billion USD, KT Freetel
Co. Ltd. is Korea's second-largest mobile communications company. In order for
a telecom company like KTF to gain a competitive advantage within today's rap-
idly changing world, it must have strong technical skills that can meet interna-
tional standards, and it must be able to manage its internal and external re-
sources efficiently and flexibly. However, introducing new technology can cause
companies to overlook their ultimate goal and the key benefits of each information
system and individual resource. This can result in decreased competitiveness that
opens the company to the risk of lost opportunities.
KTF realized that in order to exceed customer expectations and make customer-
oriented service a priority, an investment in and commitment to BPM & Workflow
was necessary. Through BPM & Workflow, KTF aimed to automate and standard-
ize operational processes, reengineer inefficient workflows, improve work cycle
times, and provide a platform designed to foster continued process improvement.
KTF selected BizFlow by HandySoft (www.handysoft.com) as their BPM and
workflow solution.
ANALYZING & IMPROVING CORE TELECOM BUSINESS PROCESSES
66
BizFlow gave KTF the ability to integrate KTF's existing FreeNet (Enterprise
Knowledge Portal) and ERP (SAP) systems. In so doing, KTF was able to maximize
flexibility in IT resource utilization by applying the same resources against stan-
dardized and consolidated processes. A configurable UI with menus based on
operation type was a plus with the users. User familiarity and efficiency were im-
proved through delivery via a user-friendly portal.
Since implementing the BizFlow solution, internal operating efficiency has been
steadily rising, allowing KTF to better manage its resources, and cut operating
costs where gains could be realized. By establishing better management over its
core operating processes, KTF has been better able to quickly and efficiently cope
with operational changes in a flexible manner and sustain competitive advantage
in the crowded telecom market space.
Workflow and Performance
Management
Arnaud Bezancon, Advantys, France
OVERVIEW
Business process automation combined with Business Intelligence provides a
new management platform for "post-administrative" companies.
Key Performance Indicators (KPIs) are more than just additional benefits provided
by workflow management; they actually drive the workflow project itself: To obtain
KPIs, start implementing workflows!
An efficient workflow, associated with the appropriate KPIs, will transform enterprise agility: performance will be managed through real-time scorecards and dashboards, meaning decisions can be processed and implemented more quickly using adaptive processes.
INTRODUCTION
What is "BPM"?
"Business Process Management" or "Business Performance Management"?
Confusion arises because the same abbreviations are used to refer to two different
technological and organizational areas.
Business Process Management models, structures, optimizes and, where possible, automates the processes within a company. Business Performance Management, on the other hand, allows managers to define and view the relevant indicators in order to oversee the company's processes and business activity.
This chapter will focus on how the most tangible applications interact with each
other:
Workflow for Business Process Management
Performance Indicators for Business Performance Management.
WORKFLOW: GENERATING YOUR PERFORMANCE INDICATORS
Collating the data
Validation checks performed as data is entered into electronic forms mean that high-quality, structured information can be collated easily.
Prior to implementing a workflow solution, this data is often lost in e-mails or underutilized in specific applications. The data contained in the forms is analyzed and then deployed using Business Intelligence software, such as reporting programs or OLAP technology. The data can be used by performance indicators to provide quantitative analysis.
Below are a few examples of workflow performance indicators:
Producing sales proposals for customers: potential sales volume.
Customer claims management: number of product defects.
Credit authorization: credit approval rates.
Product modification management: number of changes made
New employees: average induction time for new employees
Staff contract termination: numbers leaving due to performance-related
problems
The integration functions of workflow engines, including the use of web services, mean that the data from electronic forms can be exported to data warehouses or other databases used to generate performance indicators. This, in turn, means the workflow can obtain more accurate, up-to-date information, thereby generating more relevant indicators.
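As a concrete illustration of this export path, the short Python sketch below pushes structured form data collated by a workflow engine into a reporting database and derives one of the indicators listed above (potential sales volume). It is a minimal sketch under stated assumptions: the field names, proposal states, and the in-memory SQLite store are invented for illustration and do not correspond to any particular workflow product or warehouse schema.

import sqlite3
from datetime import date

# Hypothetical structured data collated from electronic forms by the workflow engine.
proposal_forms = [
    {"proposal_id": "P-101", "state": "technical approval", "amount": 25000, "date": date(2007, 3, 1)},
    {"proposal_id": "P-102", "state": "drafting", "amount": 12000, "date": date(2007, 3, 2)},
    {"proposal_id": "P-103", "state": "final submission", "amount": 40000, "date": date(2007, 3, 3)},
]

# A reporting database standing in for the data warehouse fed by the workflow engine.
warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE proposals (proposal_id TEXT, state TEXT, amount REAL, form_date TEXT)"
)
warehouse.executemany(
    "INSERT INTO proposals VALUES (?, ?, ?, ?)",
    [(f["proposal_id"], f["state"], f["amount"], f["date"].isoformat()) for f in proposal_forms],
)

# A quantitative indicator derived from the exported form data:
# potential sales volume = total value of proposals already past technical approval.
qualified_states = ("technical approval", "final submission")
(potential_sales,) = warehouse.execute(
    "SELECT COALESCE(SUM(amount), 0) FROM proposals WHERE state IN (?, ?)", qualified_states
).fetchone()
print("Potential sales volume KPI:", potential_sales)

In practice the export would typically go through the engine's web-service interface rather than a local insert, but the aggregation step that turns form data into an indicator is the same.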
Real-time analysis using the workflow stages
The majority of reports used in companies provide information relating to events in the past (financial statements, sales reports, satisfaction levels, quality assessments, etc.), meaning steps can be taken to improve future operations based on an analysis of what has already happened.
The combined use of form data, as outlined above, and a workflow solution provides actionable performance indicators while the operations are in progress.
Taking the examples referred to above:
Producing a sales proposal for a customer:
Producing a customer sales proposal involves a certain number of stages. Put
simply, each stage equates with a change in state of the proposal. Let us take the
following states: drafting, technical approval, legal assessment, modification and
final submission to client.
The user may, for example, decide that the proposals approved by sales and tech-
nical staff should be considered as potential sales, even if they have not as yet
been submitted to the clients. The performance indicator will thus be updated in
real time as soon as the proposal has been approved, and without the need to
wait for final submission (administrative stage).
[Figure: performance indicators are updated in real time, while the dashboard reflects past data]
This performance indicator therefore allows managers to act on ongoing operations before the process is complete.
Customer claims management:
An initial alert can be triggered if, for a given product, the number of requests in the "to be qualified" state exceeds a predefined threshold. The alert triggers a compensation procedure without it being necessary to wait until the problem has been fully analyzed.
The performance indicator thus serves as a trigger for corrective action, which may itself be a workflow.
Implementing corrective action as soon as the initial signs of risk appear is a good example of how performance indicators can be used to forewarn of and resolve problems in real time.
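A minimal sketch of this alert, assuming invented claim records, a hypothetical threshold, and a placeholder start_compensation_workflow() function standing in for the corrective workflow, might look as follows in Python:

from collections import Counter

ALERT_THRESHOLD = 5  # maximum tolerated "to be qualified" claims per product (illustrative)

def start_compensation_workflow(product, open_claims):
    # Placeholder for launching the corrective workflow in the BPM engine.
    print(f"Compensation workflow started for {product} ({open_claims} open claims)")

def check_claims_indicator(claims):
    # Count, per product, the claims still waiting to be qualified.
    pending = Counter(c["product"] for c in claims if c["state"] == "to be qualified")
    for product, count in pending.items():
        if count > ALERT_THRESHOLD:
            start_compensation_workflow(product, count)

sample_claims = [{"product": "Model-A", "state": "to be qualified"} for _ in range(7)]
sample_claims.append({"product": "Model-B", "state": "qualified"})
check_claims_indicator(sample_claims)

The point of the sketch is simply that the indicator is evaluated while requests are still open, so corrective action starts before the analysis of the underlying defect is complete.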
Credit authorization:
The performance indicator showing the number of requests for a certain credit amount that are pending additional customer information can trigger improvement or corrective action. Updating the checklist of documents to be provided by the customer according to the credit amount is an example of a procedural improvement a bank may decide upon.
A backlog of pending requests affects the conversion rate and, more generally, the bank's competitiveness, potentially leading to a loss of business due to slow reaction. With the help of performance indicators based on the various workflow stages, the organization works in real time to improve immediate results.
Workflow and operational dashboards
In addition to analyzing the information collated during a workflow, the process
can also be operationally monitored.
The traceability features incorporated into workflow solutions mean that certain
relevant data on how well a request is being processed can be consolidated, merg-
ing Business Performance Management with Business Activity Monitoring (BAM).
Put simply, it can be seen that operational performance indicators offer benefits
both in the definition and in the execution of processes.
Indicators can be displayed in operational dashboard format, incorporating the
process specifications. Statistics regarding past actions do not allow performance
to be managed, but simply provide an overview of any problems. Implementing
performance indicators for each process and making them available to supervi-
sors means that the entire organization can focus on achieving the desired strate-
gic objectives.
[Figure: performance indicators are updated in real time, and the dashboard reflects present data]
Business Performance Management brings a positive dimension to Business
Process Management, leading to more constructive results analysis.
The issue is no longer "How do you explain your department's delays in processing the requests?" but rather "What actions do we need to trigger when your department's requests are at risk of delayed processing?"
Implementing operational dashboards linked to each process means the entire
organization can now access the KPIs.
By combining workflow with performance management, the supervisor of a particular process benefits from additional resources, thereby contributing more actively to the company's overall performance.
Supervisors need no longer be limited to carrying out tasks, but can be actively involved in performance optimization.
For maximum efficiency, a process dashboard has to comply with the parameters set by performance management by displaying, on a single screen, fewer than 10 key performance indicators which, as far as possible, should be an integral part of the company's strategic objectives. For example, in the case of a support process, the indicator assessing how quickly a request is processed can be incorporated into the "Customer satisfaction" indicator category.
IMPLEMENTING WORKFLOWS TO OBTAIN PERFORMANCE INDICATORS
BPerformanceM & BProcessM: two projects, one aim
Implementing workflows means considerable data, both qualitative and quantita-
tive, can be collated for integration into operational dashboards and, more gener-
ally, into KPIs.
Systematic workflow deployment to modernize processes is not as yet common
practice, however: e-mail based solutions, ERPs or specific applications are more
often used to partially cover workflow requirements.
These fail to adequately resolve the growing need for relevant, reliable and up-to-
date performance indicators. Using Business Intelligence solutions, such as data
mining for example, means all existing data in the IS can be fully explored and
exploited. The examples above of performance indicators highlight the quality of
the data generated by adopting workflow solutions to optimize processes.
It can be seen that Business Performance Management projects are at the heart
of the increasing need for workflows to obtain KPIs. Initiating the workflow project
in this way has certain advantages:
First, the project is initiated high up the hierarchical ladder, meaning de-
cisions are taken more rapidly and greater resources are allocated.
Second, the resulting performance indicators mean the process specifica-
tions can be improved, particularly in respect of the various stages within
the workflow and the process data, insofar as the data required for the
performance indicators is integrated into the workflow from the outset.
Example application: project management
Let us take a straightforward example of this type of approach.
As part of its Business Performance Management project, a company has to de-
fine a KPI regarding its ability to innovate. This involves collating data on new
product projects. The company begins by exporting the figures from its Project
Portfolio management solution, but is really looking to assess its ability to inno-
vate in respect of the numerous projects put forward (many of which will be re-
jected), rather than those in progress. In addition, it does not want all the projects
submitted, just those that have made it through the initial qualifying stages.
The project approval process currently relies on using a collaborative work area,
but this method is failing to provide the data required for the KPIs.
The project management solution therefore does not cover this pre-process phase, so a workflow solution is deployed to allow the data required for the KPIs to be collated and then exported to the next stage of the process.
By implementing an operational dashboard to manage the various activities within the process, the turnaround required to approve and/or modify the projects can be reduced, improving the company's responsiveness to new opportunities.
Implementing workflows and, more generally, process-oriented management is a key performance element for any company, and the rate at which jobs are processed through workflow is itself a KPI that can be used to assess internal operating efficiency.
Using performance indicators in the workflow conditions
We now need to exploit the performance indicators in the workflow itself.
Using the same customer claims management scenario, we can now define a con-
dition in respect of a set number of claims on a given product. Requests relating
to said product will be automatically redirected to a special team so as not to
overburden the usual staff, allowing them to focus on the claims on other prod-
ucts and avoiding an overall drop in service quality.
With an IT Change process, the workflow can check the value of a performance indicator, such as the availability rate of the IT systems involved, and add an additional approval stage should major system downtime be detected.
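A hedged sketch of both routing decisions is given below. The team names, thresholds, and step list are assumptions made for illustration; a real BPM suite would express equivalent conditions in its own rule or transition language rather than in Python.

CLAIMS_THRESHOLD = 50      # open claims per product before rerouting kicks in (illustrative)
MIN_AVAILABILITY = 0.995   # availability rate below which an extra approval is required (illustrative)

def route_claim(product, open_claims_for_product):
    # Redirect claims on a problem product to a dedicated team so the usual staff
    # can keep handling the other products.
    if open_claims_for_product > CLAIMS_THRESHOLD:
        return "special-claims-team"
    return "standard-claims-team"

def it_change_steps(availability_rate):
    # Insert an additional approval stage when the affected systems have had
    # significant downtime.
    steps = ["impact analysis", "change approval", "implementation"]
    if availability_rate < MIN_AVAILABILITY:
        steps.insert(1, "additional risk approval")
    return steps

print(route_claim("Model-A", open_claims_for_product=72))   # -> special-claims-team
print(it_change_steps(availability_rate=0.990))             # includes the extra approval stage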
[Figure: the workflow project can lead the Business Performance Management project, or the Business Performance Management project can lead the workflow project]
Using performance indicators in workflows also means that self-adaptive proc-
esses can be used, though this does lead to greater levels of complexity. We not
only need to outline the possible routes of the workflow in a normal context, but
also adapt the workflow to each possible context. For example, we need to define
the customer claims management process in a major product anomaly context.
It is likely that modelling of this kind will eventually be assisted by artificial in-
telligence programs so as to reduce the time needed to set up the system.
CONCLUSION
The workflow project provides relevant and genuinely useful performance indicators, presented in operational dashboards, giving process supervisors just what they need to monitor the performance of their workflows.
The growing need for performance indicators is driving companies to implement workflows in an effort to boost process performance, a sort of virtuous circle involving the two BPMs. Having control over both the workflow and the KPIs equates with a certain level of corporate maturity, where productivity gains make way for permanent improvements in performance.
Using performance indicators as decision variables in operational execution paves
the way for dynamic processes capable of predicting critical situations and
adapting in real time.
[Figure: in a normal situation, when performance indicators change, the auto-adaptive process reflects the change (a new action is added)]
BPM Center of Excellence
Manifesto
Dr. Setrag Khoshafian, Pegasystems Inc., USA
CENTER OF EXCELLENCE
As BPM becomes more and more pervasive, it is imperative for both large and mid-sized enterprises to establish a BPM Center of Excellence (COE) that fo-
cuses on the deployment of successful BPM projects. The COE has many func-
tions. The iterative COE methodology identifies the participants, artifacts, and
phases of BPM projects. The COE governance of BPM projects identifies the poli-
cies for roles, standards, decision making, and deliverables that target BPM appli-
cations. The COE also attempts to provide the guidelines and models for building
reusable corporate assets captured in process and policy models. The BPM COE
promotes best practices for continuous improvement lifecycles through BPM Ma-
turity Models (BPM MM). A BPM MM is a roadmap that helps evolve the COE
guidelines through BPM engineering, adoption, and governance. BPM is a para-
digm shift in building and deploying applications. It is a new way of developing
and managing enterprise solutions. The COE provides the promotion, training,
and certification of BPM development talent within the enterprise.
Perhaps most importantly, COEs support the holistic three-layered Service Ori-
ented Enterprise (SOE) architecture. The middle, and the most important, layer of
SOE architecture is the BPM Suite. The top layer is the Enterprise Performance
Management (EPM) layer. The COE specifications include the best practices in
connecting BPM functionality to EPM, especially Business Intelligence (BI), allowing stakeholders to drill down from their high-level performance goals to executing policies and procedures. The COE also specifies the details of the architecture
framework involving BPMS and the underlying SOA infrastructure, especially the
Enterprise Service Bus.
BPM COE META-MODEL
The focus of this paper is on a comprehensive manifesto for BPMS COEs: What
are the essential elements, practices, and priorities of a BPMS COE? Many large
and medium-sized businesses have achieved successful deployments of BPM pro-
jects. The BPM COE has the responsibility to promote the radiation of the BPM
throughout the enterprise and even between enterprises. As the enterprise be-
comes more BPM centric, the BPM COE determines the ownership of the busi-
ness process applications. In fact, it treats the BPM entities (such as process
flows, process information models, business rules associated with processes, in-
tegration, and other strategic models) as corporate assets. These assets, like all
other corporate assets, need ownership and governance policies.
To bring all of these requirements together, we have created a BPM COE meta-
model. The following diagram illustrates the overall summary meta-model for
BPMS COEs. There are four main categories of BPM COE modules: Roles, Gov-
ernance, Methodology, and Maturity Model. Each has subcategories. There are
also associations and dependencies between the modules. Yet this figure is not
comprehensive. There are many other associations and components of the meta-
model. Some of these will be explained in the subsequent sections. However, this
meta-model does capture the essential entities of the BPMS COE and the overall
SOE context.
BPMS COE Meta-Model Summary
BPM COE IN SERVICE-ORIENTED ENTERPRISES
BPM Suites are becoming an essential component in Service Oriented Enterprises.[1] The following diagram illustrates an SOE. A BPM COE needs active participation of the IT SOA infrastructure experts, as well as the enterprise performance management communities. Building primarily upon the success of the Internet, as well as a much better understanding of how business policies and processes could be automated, today we are witnessing the emergence of robust service-oriented platforms.
BPM Suites Bringing IT and Business Together
[1] For a detailed discussion of SOEs, see Service Oriented Enterprises by the author, Dr. Setrag Khoshafian, Auerbach Publications, ISBN 0849353602.
The BPM COE Governance, Best Practices, and Continuous Im-
provement should be carried out in the context of the three funda-
mental layers of Service-Oriented Enterprises and not in isolation.
At the top layer of this SOE architecture, you have business intelligence and en-
terprise performance management tools. The targets of enterprise performance
management platforms, tools, and solutions are optimized based on key perform-
ance indicators. Organizations can use predictive leading indicators to adjust
quickly and introduce change. For instance, if there are indicators that certain
supply chains will be diminished or certain markets will shrink, the enterprise
can proactively introduce change to its processes or policies to quickly plan for
and adjust to the coming trends as reflected through the indicators. Lagging indi-
cators are obtained through after-the-fact measurement and analysis. Quality
measures or analysis of the performance of processes from historic data in cus-
tomer process management applications are examples of lagging indicators. In Six Sigma,[2] for instance, you often measure defects per million opportunities.
Through this quality improvement methodology, the Six Sigma practitioners can
collect process performance data and analyze it rigorously. The measurement
phase is followed by an improvement phase that introduces changes to the proc-
esses, and then the cycle continues with optimizations and deployments of the
improvements. In either case, whether you have lagging or leading indicators of
performance, you end up with change: changes in process flows, changes in the
business rules driving the processes, and changes in the information, organiza-
tion, or integration aspects of the processes. This is where BPM Suites come into
the picture.
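For readers unfamiliar with the defects-per-million-opportunities measure mentioned above, the usual calculation is shown below; the counts are invented sample figures, not data from any real process.

# Defects per million opportunities (DPMO), a typical Six Sigma lagging indicator.
defects = 35                 # defects observed (illustrative)
units = 1000                 # units processed (illustrative)
opportunities_per_unit = 4   # defect opportunities per unit (illustrative)

dpmo = defects / (units * opportunities_per_unit) * 1_000_000
print("DPMO:", round(dpmo))  # 8750 for these sample figures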
As illustrated, the core layer of an SOE architecture is the BPMS Suite. Although
the COE Manifesto is about BPMSs, it should also take into consideration the
SOE context. The following diagram illustrates all the essential components of a
BPM Suite. In BPMS Suites, you have integration, both within the enterprise as
well as with trading partners. The BPMS relies upon the underlying IT infrastruc-
ture, especially the Enterprise Service Bus, to support the required Quality of Ser-
vice (Security, Reliability, Performance) of the internal and external service invoca-
tion interactions. Similarly, the process monitoring and analysis are aligned with
the strategic methodology (e.g., Balanced Scorecard) and the Key Performance
Indicators or Critical to Quality measures identified in the enterprise performance
measurement layer.
[2] See "Business Process Management for Six Sigma Projects" by the author, Dr. Khoshafian, in Workflow Handbook 2006, edited by Layna Fischer, http://www.wfmc.org/information/handbook06.htm
Components of a BPM Suite
A BPM COE Must Reflect and Support a BPM Suite: Human Work-
flow, Enterprise Integration, B2B Integration, Business Rules,
BAM /Analysis, and Solution Frameworks.
BPMS COE ROLES
Roles with required qualifications are essential in any successful BPM COE. Re-
flecting the SOE architecture discussed in the previous section, BPM COEs in-
volve many roles. Most of these are BPM-specific roles, but you have others corre-
sponding to the enterprise performance and the underlying IT infrastructure lay-
ers. These roles are not isolated. In fact, often the same person plays several roles.
For instance, a Six Sigma black belt responsible for improving a critical-to-quality
measure can also be the process architect automating and deploying the BPM
application. Similarly, a BPM analyst extending current models or providing new
models reflecting business requirements can also be a performance measurement
specialist, relating the modeled processes or business rules to KPI measures ob-
tained from BPM or other information warehouses.
There are four major categories of COE Roles: Business Analysts,
Enterprise Performance Specialists, BPMS Specialists, and IT SOA
Architects. The same person can have multiple skills.
Roles in BPMS COE
BPM Analysts: The BPM Business Analyst understands the business and translates the business requirements into modeled processes, business rules, and other artifacts for business process management solutions. The BPM Business Analyst is the conduit translating business goals and requirements into models, combining roles and responsibilities from traditional business analysis with BPM know-how. This role analyzes business policies as well as use cases and translates the requirements into modeled, and eventually executed, business procedures (processes) and rules.
BPM Performance Measurement and Continuous Improvement Specialists:
These specialists deal with concrete, and in most cases quantitative,
analysis and improvement of business processes. There are several sub-roles here, including the business stakeholder who analyzes both historic and real-time data of executing processes. We also consider Six Sigma specialists, such as black belts or green belts, to be within this area. The main
tools used here are BI platforms, reporting tools, statistical packages,
strategic modeling tools (e.g. Balanced Scorecard), and others.
BPMS Architects and Developers: The BPMS implementers (architects, de-
velopers, testers, etc.) work closely with both BPM Analysts and perform-
ance specialists. The closer the suggested models are to implementation of
automated processes, the better. Ideally, you would like to have the same
BPM suite (hopefully using browser-based, accessible tools) used by both
the BPM analysts and BPMS architects building the application and en-
suring its success. Modeling and execution should have the same meta-
model. The main tools that are used here are the BPM Suite modeling, de-
velopment, deployment, testing, and monitoring tools.
IT Infrastructure Architects: The IT architects are responsible for the overall IT infrastructure. They include application server specialists, system management experts, enterprise service bus specialists for the QoS of service invocations, service definition experts, DB administrators, and network security experts.
There are two additional roles that are important here. One, of course, is the Pro-
ject Manager: The overall management of tasks and schedules as well as project
cost, especially resource management, are handled by project managers. The
other role indicated in the figure is the stakeholder. This stakeholder is typically
the business unit owner who funds the BPM suite project. Stakeholder perform-
ance improvement requirements drive the target performance and ROI justifying
the BPM solution.
GOVERNANCE
One of the most difficult and challenging aspects of BPM COE is governance. In
our context, by governance we mean capturing, communicating, tracking, moni-
toring, controlling, and managing BPMS solutions. But governance also includes
the decisioning mechanisms and authority within an organization and the result-
ing accountability and performance evaluations. Governance is defined through
policies. You also have policies about making policies, whether at the strategic,
performance, BPM, or the underlying SOA layer. You have interdependent policies
at each of the three SOE layers. Governance gets even more complex since it also
has the added dimension of the organizational responsibility and accountability
structure. There are several categories of requirements and policies.
Governance must be holistic: Hierarchical corporate-to-business-unit
governance must be complemented with governance for IT SOA,
BPM practices, EPM practices, and enterprise policy change.
The following diagram provides a more detailed depiction of the Governance meta-
model in BPM COE.
Governance Policies Meta-Model
As illustrated, there are four major categories of governance policies. The corpo-
rate asset policies deal with the reusability governance policies. Process solutions
involving information, flow, business rules, integration, and user interfaces can be
reused within units and across functional units. Process solutions have owners
who have the authority to make changes. Furthermore, there are policies that
specify how to organize the solutions in, for example, inheritance or specializa-
tion hierarchies. For instance, you can have a baseline for handling, say, credit
card disputes around the globe but can then have specializations by country: US
policies, Canadian policies, UK policies and so on. The governance of corporate
assets involves policies for ownership as well as the organization/specialization of
process solutions.
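A minimal sketch of this specialization idea, assuming invented class names and rules (it is not the asset model of any particular BPM product), could look like this:

class CreditCardDisputeProcess:
    # Global baseline process, owned by the corporate asset owner.
    response_days = 30
    required_documents = ["dispute form", "statement copy"]

class USDisputeProcess(CreditCardDisputeProcess):
    # US specialization: only the regulatory response window differs.
    response_days = 10

class UKDisputeProcess(CreditCardDisputeProcess):
    # UK specialization: one extra document; everything else is inherited.
    required_documents = CreditCardDisputeProcess.required_documents + ["signed declaration"]

for process in (CreditCardDisputeProcess, USDisputeProcess, UKDisputeProcess):
    print(process.__name__, process.response_days, process.required_documents)

The governance point is that only the country-specific differences are owned and changed locally, while the baseline remains a shared corporate asset.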
Staffing and training policies govern the required competencies, experience, and
certifications of the team implementing and maintaining the BPM solution. The
Project Management policies are perhaps the best understood, focusing on sched-
ule, resource, and cost governance.
As illustrated, there are three interrelated categories of Service Oriented Enter-
prise policies. The BPM policies include design policies, various categories of stan-
dards, and change policies. There are many types of elements that are used in a
BPM solution. The policies provide guidelines, conventions, and standards in the
design of flows, business rules, integration, information models, and user inter-
faces. These policies can be as detailed as specifying, for instance, the types of flow patterns that can be used in a business process (e.g., specifying the number of task successor nodes). The standards can include listing the acceptable stencil or shape types that could be used in designing processes. They could also include naming conventions, standards and conventions for business rules, and service definition and discovery. The change propagation policies specify how changes are
proposed, approved, and propagated in the deployed solutions.
The figure also illustrates quality of service policies (QoS) that are important both
for BPM and the underlying infrastructure (e.g., the service bus). Here, QoS in-
cludes three categories of policies: security, reliability, and performance. This
could imply, for instance, approved architectural patterns involving distributed
BPMS nodes and underlying service buses that could be used to guarantee scal-
ability and reliability requirements. Similarly, security policies specify the tech-
nologies and standards that are used for authentication and authorization.
Let us take an example. Let us say a business owner would like to get a report on
the performance of the sales people by region. Let us also assume this is obtained
through a request to the performance management specialists, who can create
the report or, if need be, the data warehouse. Now this report needs to be made
available within a prescribed period of time to be effective. In other words, you
have a service level agreement (SLA) associated with the request. These are all
policies. There are others that govern the prioritization of potentially conflicting
requests. This example shows several parties involved in governance: the re-
questor, the specialist, and also the overall governance. Changes in business
rules, business processes, and new applications all include governance with mul-
tiple parties or organizational units. The enterprise policy governance controls the
prioritization and governance in delivering the requests.
The authority to set up the policies, as well as tracking the implementation of
these policies, is also delegated to specific units.
As much as possible, Governance must be Automated and Moni-
tored through a BPM Solution.
Governance deals with the ownership of various phases and deliverables in a
BPMS solution.
BPMS METHODOLOGY WITH CONTINUOUS IMPROVEMENT
A BPM COE needs a BPM Methodology. In BPM continuous improvement, you
have two fundamental principles. One dovetails with the advances in software engi-
neering agility, quickly introducing and deploying changes in the underlying
BPMS application. The other is the changes that are happening culturally as or-
ganizations become more service oriented. These two are related, but different.
When you hear or read about change or agility, it is important to be able to
distinguish between the two. Both are important. In the former, business process
management is becoming the next programming paradigm. We are now seeing
the emergence of model driven development, where process models, information
models, and business models can be readily executed. Contrast this with the
more traditional approach of generating code from models and continuing devel-
opment life cycles with source code modifications. With BPM suites, the process
models execute.
This methodology starts with the identification of a quick win opportunity. Here
are several guidelines that should be followed for the methodology:
Quick Win: As a rule of thumb, a new BPM project should not take more
than 90 days to go to production; 45 to 60 days should be the average. The
goal of the Quick Win approach is to identify use cases which both pro-
vide business value and minimize implementation risks.
Iterative: The methodology should be lean and iterative (vs. waterfall). In
this iterative methodology, you are capturing requirements and going
quickly to execution. The following diagram illustrates a methodology
where you start with identification of quick win use cases and then iterate
on these. In this example, the iteration phases correspond to the phases
of the unified software development process:
BPMS Methodology Iterations
BPMS Platform-Specific: The ideal methodology should be platform-specific and not platform-independent. Furthermore, the BPMS platform should itself reflect the methodology. For instance, the platform can capture best practices as specified in the methodology and conduct a pre-flight analysis to check whether those best practices are reflected in the solution that is about to be deployed (a sketch of such a check appears after this list). The platform could also raise exceptions and warn about potential methodology compliance issues, again reflecting the methodology's recommendations.
Building Corporate Assets: Reusability of BPMS modules or applications is
one of the most essential objectives of BPM COE. The methodology will
specify the best practices in sharing, reusing, and building corporate as-
sets involving all the elements of a BPM application: information models
(properties/attributes), process flows, business rules, service integration,
and UI.
Holistic Methodology with Continuous Improvements: The methodology
should also specify how the continuous improvement at the BPMS layer
interacts with the other layers of the overall SOE architecture.
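As an illustration of the pre-flight analysis mentioned in the platform-specific guideline above, the following sketch checks a solution against two invented best-practice rules (a naming convention and a limit on successor nodes). The process representation, the rules, and the limits are assumptions for illustration only.

MAX_SUCCESSORS = 2   # e.g., a design policy limiting successor nodes per task (illustrative)

def preflight_check(process):
    warnings = []
    # Naming convention: every flow name should carry the business-unit prefix.
    if not process["name"].startswith(process["unit_prefix"]):
        warnings.append(f"Flow '{process['name']}' does not follow the naming convention")
    # Structural policy: no task should fan out to more successors than allowed.
    for task, successors in process["transitions"].items():
        if len(successors) > MAX_SUCCESSORS:
            warnings.append(f"Task '{task}' has {len(successors)} successors (max {MAX_SUCCESSORS})")
    return warnings

example = {
    "name": "ClaimsIntake",
    "unit_prefix": "CS_",
    "transitions": {"register": ["qualify"], "qualify": ["approve", "reject", "escalate"]},
}
for warning in preflight_check(example):
    print("WARNING:", warning)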
The following[3] illustrates the interdependencies and the larger continuous loop of improvement involving the higher-layer enterprise performance management, the business process management suite (BPMS) layer that was discussed in this section, and the lower-level SOA/ESB infrastructure plumbing layers.
Continuous improvements with EPM, BPMS, and SOA/ESB
From a top-down, especially business-oriented, perspective, it starts with what
you are trying to achieve and your key performance indicators. One of the recur-
ring problems in most organizations is that the higher your level in the organiza-
tional hierarchy, the less connected you are to the operational processes and en-
terprise data that are needed to solve critical performance problems. Further-
more, usually it is difficult to control and change real-time executing processes.
Often, decisions are made with the wrong or insufficient information or after the
fact. Furthermore, the QoS, service composition, service discovery/publication, and service interactions are realized at the underlying SOA/ESB infrastructure layer.
The Continuous Improvement of BPM solutions should be holistic:
KPI or CTQ improvement iterations are then realized in BPMS im-
plementation iterations which, in turn, utilize ESB iterations.
[3] Also from Service Oriented Enterprises by Dr. Setrag Khoshafian.
As we have discussed elsewhere, occasionally the continuous improvement could
include rigorous methodologies such as Six Sigma. Six Sigma has become the
leading methodology by which companies manage and improve their business
processes. There are numerous ways that BPMS supports and complements Six
Sigma, but particularly in the context of the popular DMAIC methodology, BPM
and Six Sigma practitioners can be seen as part of the same continuous im-
provement loop. In fact, you can start with a strategic methodology, such as Bal-
anced Scorecard, identify your KPIs, and then map those to Critical to Quality
objectives that are improved via Six Sigma analysis, while using the underly-
ing BPMS to automate the processes that are being improved. The CTQ measures
(the Ys) and the critical to process variables they depend on (the Xs) are
mapped onto executing BPMS processes. The combination yields what we have
called real-time Six Sigma continuous improvement cycles.
Continuous Improvement with Strategic Methodologies and Six Sigma
BPMS MATURITY MODELS
BPM introduces a new approach in capturing, modeling, and automating busi-
ness procedures and policies. Any enterprise that starts adopting BPM must have
a roadmap: a strategy and approach to incrementally automate business processes and continuously improve the enterprise's responsiveness to change requests. So even if an enterprise does not call it a maturity model or does not have the rigor and complexity of the original CMM[4] or the host of BPM MMs that have
recently been proposed to address maturity for BPM projects, it can still achieve
its BPM objectives, provided it adopts an overall approach to become more mature
in BPM practices. In other words, the requirement for maturity is a roadmap that
makes BPM more and more pervasive within the enterprise. BPM MM levels provide a semi-formal framework that helps you realize the roadmap.
[4] http://www.sei.cmu.edu/cmm/
There are three dimensions for BPM MM. The first one is the dimension for build-
ing or engineering successful BPM solutions. BPM is a different and novel para-
digm in building applications. Building BPM applications requires a robust BPM
methodology, which was discussed in the previous section. It is a BPM methodol-
ogy with BPM roles, BPM artifacts, automation of policies and procedures, and
continuous improvement of BPM applications. With BPM engineering, all the lev-
els of the maturity model should be BPM-specific. For instance, at the Defined
Level (Level 3), you have best practice patterns in building BPM solutions that are
captured, documented, and mandated across BPM projects. In a BPM context,
the development procedure starts to incorporate reusability models so that sub-
sequent projects can leverage and reuse the process applications (including flows,
business rules, information models, integration, and UI) that have already been
built. Thus in this dimension, the focus is on BPM engineering practices and
processes.
Adoption is another dimension of maturity that deals with the permeation of BPM
within the enterprise, from the functional or departmental unit to eventually en-
terprise adoption of BPM. Successful deployments within a functional unit or de-
partment radiate to cross functional processes, and then eventually BPM be-
comes a strategic enterprise solution. The goal of this dimension is to eventually
be able to link your enterprise key performance indicators to end-to-end executing
processes and business rules.
In addition, you have the third dimension of Governance. As we saw earlier, gov-
ernance has several categories including staffing, project management, building
reusable assets and, perhaps most importantly, SOE policies. Furthermore, ma-
turity in the Governance dimension implies the enterprise increasingly relies on
BPM solutions to satisfy internal as well as external compliance policies. The Sar-
banes-Oxley Act of 2002 (SOX) is now levying an enormous amount of complexity
on enterprises, requiring them to document in detail their business transactions.
Governance is a requirement. However, without automating the policies and the
business rules encoded in SOX, this manual governance approach simply will
not scale. The more processes are automated, the easier it will be to enforce governance policies for compliance.
With BPM maturing along these three dimensions, ultimately you would like to
have the ability to easily drill down from high level KPIs or CTQs to underlying
executing business processes, introduce improvements, easily make changes, and
improve the performance of your indicators or critical to quality factors. BPM MM
should be lean, practical, iterative, and achievable, focusing on radiating quick
wins rather than radical changes or long phases in BPM adoption. Often BPM MM is confused with reengineering involving radical organizational changes. Maturity Mod-
els are also sometimes confused with archaic waterfall models with long phases
between each phase. An iterative approach means you do not have to perfectly
achieve all the descriptive goals in a phase before making progress in other levels.
Similar to other iterative approaches, you can have a main theme but make pro-
gress in other levels or areas as well. For example, the maturity model might fo-
cus on project management in, say, Level 2 and yet at the same time achieve ra-
diation within a department and realize within this level continuous improve-
ments and tangible/measurable ROI goals.
The BPM Maturity Model Strategy must be lean and iterative. It
should be holistic, achieving maturity along three dimensions, in-
cluding, but not limited to: a) building BPM applications (software
engineering for BPM), governance, and adoption of BPM in the ex-
tended enterprise; b) BPM with the SOA/ESB strategy; and c) en-
terprise performance management with continuous improvements.
BPM Maturity Model Levels
CONCLUSION
This paper presented a BPM COE Manifesto. A BPM COE specifies the roles of the
various parties and participants involved in the overall continuous life cycle of
BPM solutions. The focus of a BPM COE is holistic: It reflects all the modules,
components, methodologies, and maturity models of the three Service-Oriented
Enterprise layers: Enterprise Performance Management, Business Process Man-
agement Suite, and the underlying SOE IT infrastructure, especially the Enter-
prise Service Bus. The BPM COE specifies governance policies. The governance
spans SOE design and development policies, project management, staffing, and
reusability policies.
BPM COE includes a methodology that typically starts with identifying a quick
win opportunity and iteratively designing and automating its use cases. BPM method-
ologies should not be independent of or agnostic to the underlying BPM platform
that is used for the solution. The BPM Suite itself should support the methodol-
ogy guidelines. Finally, BPM COE should adopt an iterative maturity model. This
BPMS MM should mature along a number of dimensions: governance, adoption of
BPM from functional units to the entire enterprise, and maturity in building BPM
solutions.
Evolution: An Innovation Process
Gabriel Franciosi and Federico Silva,
Pectra Inc., USA
INTRODUCTION
A quick look at today's Business Process Management Suites shows how far the market has come, sometimes even farther than the industry imagined.
Nevertheless, it is a never-ending process. And it keeps going.
It started with Human Workflow, passed through BPM and BPMS, and is leading to new concepts like SOA (complementary to BPM) and even beyond, to the future: BPOA, Business Process Oriented Architecture.
We all know the key to success in the networked economy is the ability to per-
fectly integrate value chains. Those value chains are profitable and efficient when
accurately orchestrated and automated through well-designed business processes. This not only provides best practices but also, and most important, innovation as a systematic business process: innovation to solve problems, optimize solutions, and create new value for society.
For everyone, IT and business professionals alike, we are now making it easier to get things done. In recent years, a big challenge has been set by the experts: bridge the gap between Business and IT. All these concepts and tools are working toward this goal. But, as we said before, it is a never-ending process, and it keeps going.
Will we achieve it?
The following pages offer an approach to what we call the evolution of innovation, and how the business process management philosophy contributes to it. We will explore how system development technology and business models have traveled their paths until they finally converged on this new model, which combines the discoveries of the two sectors to fulfill a common objective: business readiness and flexibility.
THE TRAVELED PATH
Whenever we examine the logical pattern underlying the functioning of businesses, there are two clear components worth highlighting: the idea itself and the implementation of that idea.
The idea is an integral part of the business spirit, of the creator, the entrepreneur,
and it sets the direction for the business with a highly visionary focus. Its imple-
mentation is a part of the business management, and it is the key ingredient for
the success of any idea.
We have often overheard phrases like, "Bring me a brilliant idea and I'll give you some cents; bring me a successful way to implement it and I'll give you millions." In fact, many excellent ideas have never attained the stage of implementation, and many others turned out to be successful not because of the concept itself, but because of the successful way in which they were implemented. Today, we reach the conclusion that it is not the idea or the product
itself that really matters; it is the way in which we make that idea or product sat-
isfy our customer.
We will be focusing our analysis on this point, on the way in which organizations
have attempted to implement the various ideas according to different world sce-
narios, and on the way technological innovation has contributed to this imple-
mentation.
Stage of Functional Organizations
The spectrum of our focus starts by tracing back to the first business models: the ones centered on classical theory. Men like Taylor[1] grounded their theory on operative efficiency and directed their aim at the production area, stressing the entrepreneurial values of production, standards for the study and specialization of tasks, centralization, functional supervision, and promotions based on production levels.
The prevalent paradigm in such organizations is the organizational chart, established from the division of work and the further grouping of specialized tasks according to functional areas or departments. The classical concerns are the channels of authority, information, and control; the lines of responsibility, authority, and interpersonal relations are clearly drawn.
Coworkers are used to titles such as Vice-president, Management, Front Office or
Department, yet processes are not established or defined and nobody knows what
is done and how it is done throughout the whole company.
Managers are responsible for reporting to the Senior Management every issue
relevant to an area or department, but no responsibility is assigned when it comes
to the whole work, i.e., the process.
Now, what would be the focus of technological innovation at this stage?
Once faced with these business needs, it is obvious that innovation would focus on computer systems that handle specific tasks within specific areas, aiming at increasing production and reducing resource use. Thus, the main features of
innovation at this stage had to do with researching issues related to systems to
perform high-speed calculations, with high performance and low-resource re-
quirement computers, with systems for production line tasks automation, and so
on.
Later, business models evolved towards the Organization Theories. At this stage, the emphasis is placed on human resources, their relations, and a functional structure that is more clearly outlined and consolidated.
Here, the importance lies in the proper operation of the inner company; the focus is placed upon the construction of a formal organization based on rational conduct aimed at achieving some goal. It is characterized by excessive formalities and paperwork; it is viewed as a closed and rigid system with high reluctance to change and depersonalization of relations.
Work groups emerge as an option for increasing efficiency in problem solving; this stage marks the beginning of the use of Departmental Workflows. These first departmental workflows were limited to directing tasks and documents between employees of one area. Their four distinctive elements were (a minimal data-model sketch follows the list):
Routes: they define the route through which objects (documents, forms, etc.) will flow. They indicate the probable paths the objects will travel.
Rules: they indicate who is to receive the information that flows over the routes. This process is known as object routing.
Roles: they indicate the functions to be performed, regardless of the individuals that perform them. Each role is assigned users, and they are the ones who perform the tasks.
Processes: they comprise a series of steps (activities) based on existing routes and rules.
[1] Frederick Winslow Taylor, The Principles of Scientific Management.
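A minimal data-model sketch of these four elements is shown below. The field names are assumptions chosen for illustration; they are not the schema of any particular workflow product.

from dataclasses import dataclass, field

@dataclass
class Route:
    # The path an object (document, form, etc.) may travel between steps.
    name: str
    steps: list

@dataclass
class Rule:
    # Decides who receives the object flowing over a route (object routing).
    condition: str
    target_role: str

@dataclass
class Role:
    # A function to be performed, independent of the individuals assigned to it.
    name: str
    users: list = field(default_factory=list)

@dataclass
class Process:
    # A series of steps (activities) built from the existing routes and rules.
    name: str
    routes: list = field(default_factory=list)
    rules: list = field(default_factory=list)

approval = Process(
    name="Expense approval",
    routes=[Route("standard", ["submit", "review", "approve"])],
    rules=[Rule("amount > 1000", target_role="manager")],
)
print(approval)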
As evidenced, during these stages and until the eighties, most companies directed
their efforts toward tasks that aimed at the correction and improvement of their
production processes, seeking higher efficiency and productivity levels with rigid
task specialization and functional structures.
This is why we see that innovation was also directed toward building computer systems in isolation, developed to solve the problems of a specific area while completely ignoring its relationship to the rest of the areas.
If we have spent years working toward higher efficiency of resources, the key question would be: which is the scarcest resource within our organization?
And the answer arises clearly: the customer. The issue no longer points to technological resources, nor to facilities; the key issue now points toward our customers, who are, ultimately, the pith and marrow of the company.
Only now does management understand that it has been devoting a lot of effort to a mistaken focus point for the company. For many years we have directed our energy towards measuring, controlling, certifying, and correcting our production processes. Consequently, processes within the company became the main cost factor for organizations. Let us pay attention to the fact that a strong sales strategy has a greater impact on the customer's perception of your company than many of the manufacturing activities. How many potential sales are lost because of a defectively designed sales process?
It is undeniable that the product we offer must be good, but it is also quite clear that a good product is not all our customers need. Our main goal should be the satisfaction of a customer's need, by providing a good product and an associated service. We must not lose sight of the concept of quality, which determines that quality lies in the perception the customer has of our product or service.
Furthermore, this stage is marked by significant changes in technology: new versions of operating systems, developments in word processors and PCs, higher-speed servers, and so on.
So prominent is the dynamic development of these innovations that many companies have been tempted to buy technology as if following fashion; that is, most of them have not effectively analyzed the Return on Investment (ROI) of their IT spending, they simply undertook the venture to achieve the goal of state-of-the-art technology. By the time they started analyzing this factor, they realized that even though they had made strong investments in technology, systems, and applications, they had not yet reached the expected benefits.
It is at this point that both the business and technological fields came to realize that they should move forward toward some new model. The biggest opportunity they had to improve the company's finances relied on the integration and optimization of the company's processes so as to meet the customers' needs.
Stage of Customer and Processes Economy
A company is said to exist and be operative when it contributes some added value
to the community (customers) it serves. With this background, it is evident and
natural that those companies that use more resources than the ones they gener-
ate are doomed to extinction.
The question is: How do we create value for our customers?
This question clarifies that the generation of value has a dynamic sense in which
every single moment is an opportunity to modify our business logic in order to
keep adding value for our customers. It is worth mentioning that this dynamic
context speeds up as time goes by, thus the business lifecycle of our products or
services is shortened daily.
It becomes evident that technological innovation has always had a leading role
functioning as a catalyst by accelerating these changes.
The emergence of new information technologies, the Internet, and the whole computing world has contributed to the globalization process and engulfed companies in a new environment that is permanently changing the rules of the road. To face this, many companies have adopted a mix of two basic strategies:
1. They have isolated their domestic markets behind a wall in order to become a dominant local player.
2. They have opted to penetrate overseas markets through exports from their headquarters, investing in the construction of extensive networks among subsidiaries.
Yet, becoming a really global company entails many more aspects than a broad
portfolio of worldwide business units. In terms of efficiency and suitability to cre-
ate value for customers, the whole must be greater than the sum of its parts.
The key challenges companies now have to face are related to the establishment of the necessary infrastructure with partners, from financial networks to supply chains, that will allow them to build an integrated value chain together, aiming at optimized customer satisfaction. Once this is achieved, a global organization is
sustainable.
Online connectivity, by providing a large, flexible, and organized commercial network, accounts for the introduction of newly created services. This is not only an opportunity to improve customer service and substantially reduce the movement of paper and files, but it also sets a precedent for the emergence of new businesses and for the maximization of a new sales model through business process automation and control between community partners.
Against this background, over the last few years, companies have discovered a
new way to get things done, a new way of managing and molding business ideas
into realities. This new way is what we now call Business Process Management.
If we analyze this theory from a business viewpoint, we can see it is characterized
by a change in the way we perceive a company. Comparing and contrasting prod-
uct-oriented and customer-oriented companies, we could highlight the following
discrepancies:
Product-oriented companies | Customer-oriented companies
Employees are the main source of problems. | The process is the main problem to solve.
Companies have employees. | Companies are made up of people.
I have to do my work every day. | I have to help get things done in the company every day.
I need to know how to perform my tasks. | I need to know the position of my task within the process and how I can improve it.
We need to assess the performance of employees. | We need to assess the performance of the process.
If something is not working as expected, we change the employee. | If something is not working as expected, we modify the process.
There is always a better employee. | There is always a better process.
Motivate people. | Eliminate barriers.
Thoroughly control employees and their production levels. | Be concerned about the development of people and encourage their skills to improve processes.
I can trust no one; everything has to be controlled as many times as necessary. | We are all part of this.
Who made the mistake? | Which process or task allowed the mistake to occur?
Correct mistakes. | Reduce variation.
Tasks are oriented towards the inner operation of the company. | Processes are oriented towards customers.
Recently, companies have learned to document and map their processes, to develop analysis techniques for the identification of bottlenecks and unnecessary steps, applying quality standards and working on the continuous improvement circuit for processes as the basis for continuously improving the whole organization.
Evidently, this entrepreneurial change would clearly demand a new step forward from technology: existing systems would have to be innovated.
This is how the BPM concept originates: as the technology that allows business process automation. It is born as a promise of giving the organization the drive it needs in order to attain continuous customer satisfaction.
The most significant value of the BPM tools' contribution to the organization is the reaction capacity it acquires when it is given real-time visibility at an operational level. This visibility enables a more effective and faster reaction from management when it comes across a problem or a business opportunity. It is based on extracting the logic from the systems and raising it to a higher level at which it can be modified and adapted at a faster pace and with lower costs.
Yet... have these technological innovations been able to fulfill their promises? Have they been able to provide business dynamism and flexibility to organizations?
We may assert that they have achieved many goals: they have standardized business operations, controlling and reducing their variations and thus costs; they have included people as part of the process; they have shifted the organization's orientation towards customers, and so on. Yet they have made little progress in giving the organization the speed and versatility to adjust itself to constant business changes; they have failed to deliver the desired results.
If we analyze the causes of such failure, we can spot many reasons, from resistance to change to the lack of technologies mature enough to be implemented.
Beyond these causes, one of the main reasons is that process-supporting applications are still built on a monolithic basis: they are created to support functional areas (as classical business models once required), but they are not prepared for quick adjustment in line with business processes. This creates a gap: even when business processes are adjusted and implemented quickly, the IT applications that are supposed to support them fall behind and delay the deployment of new businesses.
We see the story is repeated every time there is a new business requirement, a
market opportunity or just a new adjustment to some change of rules or laws, the
steps are:
1. Analysis of the business requirement
2. Analysis, design and automation of necessary business processes
3. Development and/or integration of the applications that support the im-
plementation of these business processes
At this final phase, the systems department has to make a tough decision:
1. Try to reutilize functionalities already implemented in some other systems
in order to implement the new business process activities, or
2. Reimplement the requested functionality, developing the functionality
again in the new technology background.
Upon evaluation of these options, option number one would clearly be the most appropriate, yet in our experience it is rarely the one chosen.
To reutilize already implemented functionalities implies a hard task, as these sys-
tems were not created to be integrated and are developed on a monolithic archi-
tecture with platforms and/or technologies that are incompatible among each
other. Even when the connection is technically feasible, we must face the risk of
tampering with a system that is operating smoothly.
This is why, most of the time, the selected option is number two. It is perceived as the easier and faster option, even though it demands more development effort, but in the long run it is not the best choice. This typical choice made by the technological area brings about negative outcomes:
The functionality is replicated across applications.
Inner systems become difficult to migrate when there are multiple connections among systems.
Points of failure multiply; when there is no integration strategy for the different applications and they are all interconnected, a single failure can very easily interrupt the operation of all systems.
Generally, this kind of model is not very scalable.
The final obstacle is a poor response to change: applications are still conceived of as independent islands.
In short, when BPM technologies are implemented over this application development scheme, they lose their inherent flexibility and versatility. Applications become a restraint on the quick implementation of business processes.
Here we become aware that at some point in a company's evolution, when the growth of business processes, and with it the growth of the computer systems that support them, becomes evident, the need appears for a quantum leap forward towards the Business Architecture concept.
This means taking process concepts and sketching, like an architect, a design in structures and layers to support those processes, from strategic business levels down to physical implementation levels.
The concept of Business Architecture entails several implications, but they all aim at the same goal: to determine an orderly way to provide every level within the company with a clear, defined framework, an outline in which every player of
the company is considered and in which every level participates, focusing on the processes and, finally, serving the business strategies and objectives.
What does this imply for technological areas within the organization?
Actually, it means that they will no longer focus their concerns or aims on isolated applications or systems. On the contrary, they will be oriented towards providing support to the company's business processes, which involves a deep change in IT attitude and culture. The service-oriented concept is born from this.
Stage of Services
When we begin analyzing the way in which a business process management solution is designed and implemented, we can spot the following phases:
We start with an analysis of the business and its background.
We design the business processes and the activities pertaining to them.
We evaluate which company areas or systems could execute each activity, i.e., we assess which organizational and IT components provide the services we need to run our process.
These final stage services have been widely outsourced or subcontracted by com-
panies. The key issue is that they have managed to run their business processes
adding participants, increasing the value for customers and reducing times and
costs (logistics, documents safekeeping, call centers, etc.). So, many phases of
business processes are performed outside the organization that originates them.
When a company evaluates the provision or hiring of a service, it desires the fol-
lowing qualities:
That it can be hired by any company, regardless of its characteristics
That it has no location restrictions (for example, call center services may be provided by a company in India for a company in America)
That it does not require a specific implementation for each customer
That the incorporation or modification of one customer does not impact the others or the service itself. As a customer, in turn, I should have the chance to change my service provider with no further impact on my daily operation
IT architectures assimilate this idea from the business world and model it into the technological world by means of the SOA (Service-Oriented Architecture) concept.
Just like the areas of an organization (or an external organization) provide a ser-
vice to execute the tasks within a process, SOA defines that each application
should also provide services to execute the phases of a business process.
Service-Oriented Architecture is neither a technology nor something you can buy off the shelf. Instead, it is a paradigm, a change, an architectural concept. It is a new way of assembling an IT environment architecture that allows you to model the company as a group of services available on the network that can be reused by any business process with minimal effort.
To achieve its goal, it organizes discrete functions from applications and trans-
forms them into interoperable services (based on standards) that can be quickly
combined and reused to satisfy the business needs.
From a more technical perspective, SOA promotes and facilitates certain charac-
teristics desirable for every system:
Its coupling is loose; its interfaces are clearly demarcated and the coupling level is reduced, so the implementation or modification of a service does not affect the other services or their users (a small code sketch of this idea follows this list).
It improves development levels and facilitates testing; as each business function is developed as an independent service, the developer focuses attention on a concrete point, raising the productivity level.
It defines a security level; the security levels of systems can be established on the services layer.
It facilitates reuse; services are developed to interoperate and publish their interfaces on the network, which encourages their reuse, as they offer users complete business functionalities that are easy to implement in any process.
It improves scalability and high availability; services are transparent regarding location, so a failure of an existing one can easily be solved by reallocation with no negative impact on production systems.
It enhances maintainability and facilitates monitoring and error detection; since services have well-defined communication channels, it is easier to monitor them and detect errors and failures in a timely manner.
It allows interoperability; as they are based on standard protocols, we can create services that interoperate regardless of their platform or programming language.
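As a rough illustration of the loose-coupling idea described in this list, the sketch below (all names are hypothetical, not a specific product API) separates a published service contract from its implementation, so the implementation can be replaced or relocated without affecting consumers.

// A minimal sketch of a loosely coupled service: consumers depend only on the
// published contract, never on the implementation behind it.
public class ServiceContractExample {

    // The published interface is the only thing a business process needs to know.
    public interface CreditCheckService {
        boolean isCreditworthy(String customerId, double requestedAmount);
    }

    // One possible implementation; it could be replaced by a remote Web Service
    // proxy without changing any caller.
    static class LocalCreditCheckService implements CreditCheckService {
        @Override
        public boolean isCreditworthy(String customerId, double requestedAmount) {
            // Placeholder rule standing in for a real back-end call.
            return requestedAmount < 10_000;
        }
    }

    public static void main(String[] args) {
        CreditCheckService service = new LocalCreditCheckService();
        System.out.println("Approved: " + service.isCreditworthy("customer-7", 2_500));
    }
}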
A Corporate Architecture Paradigm must facilitate integration. SOA is set to
change the way architecture is viewed, not only for isolated applications or sys-
tems, but for the company as a whole. Besides, it encourages the building of ser-
vices rather than applications. These services will be in charge of publishing a
well-defined functionality for the application or process that requires it.
Clearly, we could never reach these technological models without traveling the
innovation path on systems communication and integration technologies. Tracing
back, we can highlight the evolution pattern and assert that the melting of the
following technologies served as the foundations for SOA:
Classical solutions used standards for messages and communication im-
plementation; EDI and TCP, FTP, etc. The synchronous implementation of
these solutions is harsh, networks are efficient though little versatile when
it comes to changes.
Attempts like CORBA are far too advanced for current market maturity.
By its simplicity, SOAP (Simple Object Access Protocol) standard acquires
great acceptance. It is mounted on XML success and has a widespread
uptake among IT communities, so much so that messaging standards
shift to an XML base.
WS (Web Service) perfects the idea of SOAP by orienting the whole archi-
tecture towards the exposition of the services on the Web.
Even so, a company is not just based on the execution of isolated services. Rather, services must be coordinated by the organization's management. This is where BPM and SOA generate a synergy that attains paramount results.
ONE OBJECTIVE IN TWO PERSPECTIVES
We can state that the utmost added value from both BPM and SOA is obtained from their combined work, from their synergistic operation, which enables them to achieve the best of two worlds: Business and IT.
BPM furnishes very important benefits by means of analysis, modeling, simula-
tion, and monitoring of business processes. Yet, it is quite restricted when it is
implemented over a rigid IT platform, in which any improvement opportunity is
really costly in terms of time and resources.
SOA, on the other hand, has to justify its ROI by implementing services that hold
some degree of relevance for the business, and not just a mere compilation of de-
tached functions. Given that a real SOA plan requires that services be created independently, it is essential to have a mechanism that coordinates them and makes them worthwhile for the organization.
This is the melting point of both perspectives: BPM and SOA.
Born within a business environment, BPM implements processes by
means of top-down analysis; it starts analyzing the situation, then defines
processes and their automation (orchestrated by SOA services).
With its technological roots, SOA performs a bottom-up analysis, starting
from existing systems up to the generation of business services suitable
for a BPM.
In this manner, both concepts direct their efforts toward a common point in
which the business analyst can define a process and assign services for the per-
formance of its activities.
This clearly bridges the gap between business and IT areas and ensures a quick adjustment to changes. A final BPMS (Business Process Management Suite) application orchestrates the performance of a group of services, adds its specific logic and provides the final user interface.
SOA is the technical answer that enables BPM to fulfill its promise of achieving
flexible processes in the organization.
When we talk about SOA with different profiles within an organization, we learn that the perception of this same concept varies according to each respondent.
So, a Business Manager considers SOA as a group of business services that his
organization exposes (sells) to its customers, partners or other areas within the
same organization.
For a Technology Manager, SOA represents an architectural style applied to tech-
nology that enables the definition of implementation standards, patterns, and sets
the guidelines to develop reusable applications, self-contained and interoperable.
For a Developer, it becomes a programming model that defines clear guidelines for
system programming.
So, which one is correct?
In fact, each of the three perceptions is correct at its own level. What does this
mean? It means that this concept has managed to meet the needs and require-
ments of the two worlds (business and IT), and this is the key to its success as a
model.
It is definitely not an easy, simple, fast or even economic task. It is a long term
undertaking and the companies that have undertaken it are, nowadays, highly
successful in terms of flexibility and performance of their computer systems and
platforms.
Let's review a very simple yet really illustrative example of what these two technologies can produce when teamed. Let's take the implementation of a process for uploading pictures to a children's website.
We should begin by designing the business process. It can be briefly outlined like
this:
Request a new photo upload.
Authorize new photo upload.
Save new photo.
Confirm upload of new photo on the website.
The following step comprises the implementation of each process activity. We can
think of:
Request a new photo upload; design a web page that allows a new photo upload to be requested and the photo to be uploaded.
Authorize new photo upload; if we are to perform this activity automatically, we should develop an image recognition application to avoid, for instance, images with pornographic content. The development of this application will surely demand a considerable amount of time and a great effort.
Save new photo; we could use a local database for image storage.
Confirm upload of new photo on the website; we could develop a mailing application to confirm uploads on the website.
Now, let us leave aside the way in which we would traditionally implement this process and try to implement each activity using an in-house or external service (a code sketch of this orchestration follows the list below).
Request a new photo upload.
Authorize new photo upload; to implement this kind of activity, there are providers that offer a very interesting service based on the performance of manual tasks through technological interfaces. For instance, the best way to implement image recognition features is through human intervention, with shorter recognition times and a highly reduced error rate. These providers publish a Web Service to which we are asked to send parameters describing the images and the actions to perform with them. On the provider's side, each time this Web Service is executed, an individual issues an answer based on the information (image and action) received.
Save new photo; we could use third-party services for information storage. These services publish a Web Service that receives as parameters the information to be saved and returns an ID. Then, whenever we want to retrieve the information, we invoke that Web Service, sending it the ID we received.
Confirm upload of new photo on the website; we can use third-party services for notifications.
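A minimal sketch of how a process layer might orchestrate such services follows; the interfaces and names below are hypothetical stand-ins for the own or third-party Web Services just described, not an actual implementation.

// Sketch of the photo-upload process orchestrated as calls to independent services.
// Each interface stands in for an in-house or third-party Web Service.
public class PhotoUploadProcess {

    interface ModerationService { boolean approve(byte[] photo); }     // human-backed image review
    interface StorageService { String save(byte[] photo); }            // returns an ID for later retrieval
    interface NotificationService { void confirmUpload(String user, String photoId); }

    private final ModerationService moderation;
    private final StorageService storage;
    private final NotificationService notification;

    PhotoUploadProcess(ModerationService m, StorageService s, NotificationService n) {
        this.moderation = m; this.storage = s; this.notification = n;
    }

    // The process logic only coordinates services; it contains no image handling,
    // storage or mailing code of its own.
    public void run(String user, byte[] photo) {
        if (!moderation.approve(photo)) {
            System.out.println("Upload rejected for " + user);
            return;
        }
        String photoId = storage.save(photo);
        notification.confirmUpload(user, photoId);
        System.out.println("Upload " + photoId + " confirmed for " + user);
    }

    public static void main(String[] args) {
        PhotoUploadProcess process = new PhotoUploadProcess(
                photo -> true,                         // pretend the reviewer approved
                photo -> "photo-001",                  // pretend storage returned an ID
                (user, id) -> System.out.println("Notifying " + user + " about " + id));
        process.run("alice", new byte[] {1, 2, 3});
    }
}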
The idea worth mentioning from this simple example is that not only has the way of implementing business processes been modified, but companies have also discovered a whole new market to explore: the provision of innovative services to third parties, enhanced by a level of creativity we would never have envisioned during the classical business theory stage.
NEW SOLUTIONS, NEW DILEMMAS
Now, the scope is not as simple as generating services along with BPM tools to change the development style of IT solutions. It goes far beyond that: as we unveil this change, we discover that this way of working implies both advantages and disadvantages.
On the one hand, the most significant disadvantage is perhaps related to performance, but current hardware technologies can easily help overcome this issue by means of very powerful servers. On the other hand, the main advantages rely
on the ease of use and versatility that such solutions offer, which enables you to focus your work on processes and their dynamics.
As we can witness, every evolution brings about advantages, but it also draws
some light on a bunch of new problems that have to be solved. We will briefly dis-
cuss some of them:
Multiple points of failure; when we develop applications based on online services provided by third parties, with the whole global process running over different systems and platforms and across the Internet, we face many points of failure.
A way to mitigate this is the use of SLA (Service Level Agreement) matrices, which allow us to agree with the service provider upon the specifications of the service itself.
WSLA (Web Service Level Agreement) is not currently a standard but it is
really helpful as it allows us to define the specifications of the provision of
a service.
Considering that each business logic has its particular traits, there will be
as many different WSLAs as different business needs we can imagine.
This is a non-comprehensive guide of the parameters that should be
taken into account:
Service availability (7*24)
Versions and service change
Cost of the service and payment methods
Expected provision
Scalability.
Exception, alert and failure handling
Estimated turnaround times (min, max, average, etc.)
Output bandwidth
Security and confidentiality; as most of the services will be published and
used through Internet/Intranet, we must use technologies that ensure
security regarding:
Authentication, which ensures that the message is sent by the
real sender.
Integrity, which ensures that the message has not been tampered
with.
Confidentiality, which ensures that no other people have been
able to access the message.
A Web Service (WS) may be secured through a secure transport implementation over HTTPS; this is known as TLS (Transport Layer Security). It can also be secured using a standard that applies security concepts to the communication itself, which is known as MLS (Message Layer Security).
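As a rough sketch of the transport-layer option (the endpoint URL and message are made up for illustration), the call below simply uses an https URL so that the client negotiates TLS before the message is sent; message-layer security would instead sign or encrypt the message content itself.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: invoking a service over HTTPS so that TLS protects the exchange.
// The endpoint is hypothetical.
public class SecureServiceCall {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://provider.example.com/services/imageCheck"))
                .header("Content-Type", "text/xml")
                .POST(HttpRequest.BodyPublishers.ofString("<checkImageRequest/>"))
                .build();
        // The https scheme makes the client negotiate TLS before sending the message.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Status: " + response.statusCode());
    }
}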
Certainly, these new dilemmas are part of the focus on which future technological
innovations will develop, aiming at providing higher security and stability levels to
services oriented models.
THE FUTURE?
Everything seems to indicate that, in this evolution, a new journey is beginning, one in which business becomes the essence of technology. And, even more important, one in which innovation is not conceived of as separate from new businesses, from new processes.
Any new technological model will have to be governed by this premise, or else it will face very little chance of success. Systems will need to be able to keep up with the dynamic rhythm of businesses and abandon the position of an anchor that holds them back from evolution.
SOA and BPM emerge as a significant move in this arena, even though there is still a long way to travel. We must understand that not every real situation within a company can be modeled just by the interrelation between BPM and SOA (or, at least, not at their current stage of evolution). BPM and SOA help model simplified situations from reality, but there are still many instances in which human intervention is, and will remain, necessary.
We may forecast that BPM tools will develop modules that work collaboratively
along with multiagent systems or distributed artificial intelligence systems. These
networked artificial intelligence systems will, by means of intelligent entities exist-
ing within certain background or environment, be able to communicate in order
to make decisions regarding real situations. Or else, we may envisage event sys-
tems that detect environmental changes and generate a reaction to them in an
intelligent way.
Be they multiagent systems or not, event systems or any variant of evolution we may imagine, we must bear in mind that history proves that this kind of innovation always permeates deeply into the society that gave birth to it, either radically changing it or facilitating a subsequent change. To put it another way, once a technological innovation is created it ends up, in turn, modeling its creator. This is the case of the printing press, electricity, television, the Internet, cellular telephony, even the latest addition, the iPod, which established the concept of personalized entertainment. Each of these examples illustrates how innovations transformed the way human beings think, modified human thinking categories and changed the human style of contemplation.
Likewise, future computer systems will introduce modifications to Homo Sapiens
working routine, on their way of learning and thinking.
Let us analyze for a moment the power entailed in the connection between BPM tools and the World Wide Web. The underlying importance is that it is an infrastructure suitable for collective interaction between many human beings, as it enables multiple, instant and orderly exchanges between individuals all over the world. When endowed with global, organized and ready connectivity, human beings can multiply their biggest strength: their capacity for working in groups, sharing thoughts and experiences.
Why Engagement Will Redefine
the Next Evolution in
Workflow and BPM
Steve Rotter, Adobe Systems Incorporated, USA
It is an exciting time in business process management (BPM), as traditional no-
tions of BPM expand to include more users and more interactive applications.
Many corporations have already invested heavily in BPM to improve how business
data is processed and moves across enterprise systems. The next stage, enhancing how people engage with information and participate in processes, is already underway, with companies actively looking to capture and manage essential information before, during, and after it touches enterprise systems.
The evolution of BPM from technologies that focus largely on automating backend
processes to solutions that address how people like to work, collaborate, and en-
gage with systems is having profound impacts. The goal today is to implement
solutions that not only address the importance of integrated enterprise applica-
tions but also the need to automate the front-end processes that drive how people
engage with information and with each other. This more interactive process man-
agement is critical to realizing the full potential of BPM, which has been difficult
to achieve. Consider the following:
Several years ago the financial services industry embraced e-banking to
reduce costs and to improve customer service but has found that only 10
percent of customers are willing to complete transactions online.
The manufacturing industry has dramatically reduced the cost of product design and production over the last five years, yet 50 percent of all collaboration is still done outside of enterprise systems. In addition, 60 percent of process flow and security activities are managed manually.
The life sciences industry has actively pursued using electronic submis-
sions and electronic regulatory reviews to reduce costs and speed ap-
proval times, but currently only five percent of submission processes are
fully digital.
TRANSFORMING BPM
When it comes to BPM, technology advances are reshaping opportunities and ex-
pectations. The emergence of Rich Internet Applications (RIAs), Web 2.0 technolo-
gies, mobile devices, and interactive, intelligent document solutions is prompting
more engaging and more secure ways of integrating people, processes, and sys-
tems. The result is a new class of customer engagement applications that seam-
lessly and securely connect end users to enterprise applications, such as ERP,
CRM, SCM, and HR systems.
Before looking at how these technologies extend the opportunities for and benefits
of interactive process management, it is helpful to explore the advances in detail.
The Rise of Rich Internet Applications (RIAs)
Increasingly, consumers and business users expect web applications to do more
than deliver static pages of information to their computer screens. Instead, they
want applications that combine the richness, offline capabilities, and interactivity
typically found in desktop applications with the simplicity and universal access
associated with web interfaces. RIAs deliver on this demand, with resulting applications that open new avenues for engaging with end users and capturing and
processing information that is at the core of business workflows.
By providing real-time, dynamic interfaces, RIAs transform how people interact
with services and information online, offline, or in occasionally connected envi-
ronments. The benefits include increased end user participation in processes, re-
duced abandonment rates, improved data capture, and overall process perform-
ance improvements. Ideally, RIAs operate within a cross-OS, cross-device applica-
tion runtime, ensuring they are available to the widest possible audience.
Examples of RIAs include interactive customer-facing applications for financial
services, citizen-focused applications for government agencies, and supply-chain
solutions for manufacturers and their partners. In any case, RIAs reach extended
teams and customers with highly graphical interfaces that connect to and present
data from multiple backend systems. In addition to more engaging interactions,
RIAs eliminate the tedious "click-wait-reload" processes long associated with web
services and the inconsistencies that can result from viewing the same content in
different browsers.
In many ways, a primary focus of RIAs is on integrating front-end business proc-
esses with backend systems and broadening opportunities for business automa-
tion and collaboration. For organizations looking to improve workflows, RIAs are
part of the solution. Also essential to improving operations are strategies to en-
hance how documents are generated, completed, and processed inside and out-
side an organization.
Intelligent, interactive document processes
Even the most efficient business processes can come to a halt when paper and
manual workflows are introduced. Processing employee Human Resource (HR)
requests, responding to customer orders, or sharing engineering designs across
project teams slow considerably when workflows move from digital to paper. Lost
documents, materials sitting in recipients' inboxes, or the need to manually key
data into backend systems are just a few of the things that increase operating
inefficiencies and can cause participants to disengage from processes.
The challenge, of course, is that documents drive many business processes today,
acting as the primary interface between people and between people and systems.
Because of this, process management strategies have to address the role of
documents and provide opportunities for automating creating, completing, shar-
ing, processing, and archiving business materials.
By reducing reliance on paper and transitioning to more engaging, digital proc-
esses, workflows can be built around interactive documents, helping organiza-
tions to reduce costs, boost employee productivity, and improve customer ser-
vices. Already, the advantages of dynamic digital documents are evident at or-
ganizations worldwide.
RULES OF ENGAGEMENT
With the increasing adoption of RIAs, dynamic digital documents, and high-speed
web services, the rules of engagement are changing for everyone in business to-
day. Workflows previously characterized by high costs and delays can now hap-
pen in a fraction of the time and at a fraction of the costs, two benefits lauded by BPM enthusiasts. The experience of the Kane County Circuit Court's office in Illi-
nois highlights the opportunities and advantages of moving process management
beyond backend systems to engage people more directly in collaborative work-
flows and front-end application processing.
As the fifth largest county in Illinois, Kane County court staff manages over
150,000 cases annually, ranging from simple traffic violations to serious felonies.
At the same time, the Circuit Court Clerk's office is responsible for handling citi-
zen requests for orders of protection, which typically involve domestic violence
cases. "We wanted to improve the quality of services that victims of domestic vio-
lence receive when they come to the county for help," explains Monica Lawrence,
records manager at the Circuit Court Clerk's office. Previously, it could take up to
six hours to process an order of protection, and the office wanted to accelerate
that workflow.
To improve processes and enhance citizen services, the Circuit Court Clerk's office
is leveraging a combination of web services and interactive document solutions to
transform how orders of protection are accessed, completed, submitted, reviewed,
and approved. The more integrated, automated processes translate into enhanced
citizen services that can be delivered faster, more conveniently, and more cost
effectively than ever.
IMPROVED SERVICES, REDUCED COSTS
Orders of protection can be initiated through advocates at shelters, legal aid at-
torneys, court personnel, or directly by individuals seeking protection. Regardless
of where an order originates, the forms have to be routed to the appropriate
judges, clerk staff, and sheriff employees, as well as to the victim and any legal
counsel. "We share a lot of information with agencies throughout the countyand
it has to be readily available when they need it, says Josh Orr, programmer in
the Circuit Court Clerks office.
Filling out an order of protection can increase stress on victims, especially when
the form has as many as 17 pages. For an order of protection to go into effect,
forms have to be reviewed by advocates, signed by the judge, filed and certified by
court clerks, and then transferred to the county sheriff.
To accelerate the process of initiating an order of protection, Kane County created
an online wizard that makes it easy for shelter staff to enter required information,
which is saved into the circuit court clerk's secure database. With the help of in-
teractive, web-based documents, the submitted data is imported into platform-
and application-independent digital forms that are associated with business
workflows. For people initiating order of protection requests at the judicial center,
access to the intelligent forms is through a dedicated application running on the
county's systems.
The intuitive online forms enable users to enter required information electroni-
cally in as little as thirty minutes, in contrast to the hours that it could previously
take to complete forms on paper. Information that appears repeatedly on several
forms is automatically populated into all appropriate digital documents, eliminat-
ing the frustration and errors that can result from citizens or advocates having to
complete lengthy, repetitive forms.
INTELLIGENT, AUTOMATED ROUTING
Victim advocates and people seeking protection can electronically sign the forms
before submitting them to the clerk's office. The petition, along with a suggested
order, is then sent to a judge, who may accept orders as presented, send them
back for more detail, or revise the orders. After judges review orders and apply
their electronic signatures, the approved forms are routed automatically to the
Circuit Court Clerk for filing, where finalized order of protection documents are
electronically file stamped, certified, and emailed to the sheriff's office and legal
counsel.
"The automated process built around interactive, web-based forms is dramatically
faster than our previous manual workflows," says Matt Meyer, programmer at the
Circuit Court Clerk's office. "Within approximately sixty seconds of having a judge
sign the document, an order of protection arrives at the sheriff's office for input
into the national "wanted persons" database. Overall, we've seen as much as a
five-fold improvement in the time it takes to complete, submit, and process orders
of protection."
A MODEL OF EFFICIENCY
Based on the success with automating its processes, the Circuit Court Clerk's
office is extending the application to include a direct interface into the court's re-
cord-keeping system. New orders will be recorded automatically and saved along
with all accompanying documentation. The sheriff's office is also planning to
automate the process of capturing orders of protection after it implements a new
record-keeping system later this year.
"By securely extending back-office processes to front-office applications, we can better engage with our constituents and collaborate more effectively across agencies," says Deborah Seyller, Clerk of the Circuit Court. "Processes that took hours can now be handled in a fraction of the time and at a fraction of the costs. Given our agency's commitment to excellent service and efficiency, the more automated processes support our success."
NEW OPPORTUNITIES FOR ENGAGING EMPLOYEES AND CUSTOMERS
As the experience at Kane County illustrates, advances in web and document
technologies open up opportunities for interactive process management by provid-
ing cost effective and reliable ways to connect systems and people inside and out-
side an organization. It is also important to note that unlike many traditional
large-scale BPM systems, Internet- and document-based solutions typically inte-
grate more easily with existing systems, and offer a faster time to deploy and
lower ongoing maintenance costs. These advantages have not gone unnoticed,
prompting corporations and government organizations worldwide to retool their
workflows.
This was recently the case at a leading U.S. financial services provider, as the
company looked to improve services and better engage with more than 10,000
personal investment advisors nationwide. Previously, to serve clients and initiate
financial transactions, advisors had to print the proper forms, from a library of more than 360 documents, and complete the materials by hand. To add to the
challenge, the company and its national network of financial advisors had to
manage nearly 400 additional forms to support processes such as disclosures for
regulatory purposes.
Due to the large number of paper forms and regulatory documents, the company
incurred substantial warehousing and storage management costs. In addition,
there were continuous issues with financial service advisors using outdated
forms, a problem that not only hampered productivity but also increased the risk
of regulatory non-compliance.
When advisors submitted paper-based application packets to corporate for proc-
essing, staff scanned the forms into an image system and checked them for com-
pleteness and accuracy. A significant portion of those forms were received with
missing or invalid information. For example, forms were missing pages or had
incorrect information. When errors were found, the documents had to be re-
turned to advisors, who often had to follow up with their clients to get more in-
formation. The resulting errors and delays were having a negative impact on advi-
sor and customer satisfaction.
Like Kane County, the company opted to use web services and interactive docu-
ment solutions to automate form access, completion, submission and processing.
To date, more than 300 forms have been automated. Financial advisors can now
go online and simply enter the services their clients want. They are instantly pre-
sented with all the forms required for transactions.
When existing customers meet with advisors to apply for new services, the appli-
cation forms are dynamically populated with advisor and client information from
backend systems, saving time, reducing errors, and helping ensure regulatory
compliance because current versions of forms are always used. Advisors can add
remaining details to the digital forms and electronically submit them to headquar-
ters for processing.
Since going live with the integrated web and document services solution, the
company has reduced by more than 50 percent the number of forms received
with errors. Equally important, it can now process a larger number of transaction
requests using fewer service representatives. Personal investment advisors also
report a higher level of satisfaction, with 77 percent of them stating that it is
much faster and easier to find, complete, and submit client requests. What is
clear is that by extending traditional notions of BPM from backend systems to
front-end processes, organizations are reducing costs, speeding service delivery,
and better engaging with all parties involved in processes.
KEYS TO SUCCESS
Several factors contribute to successful process management implementations at
organizations like Kane County and the financial services provider highlighted
above. In general, effective strategies for improving participant engagement in
processes often share common approaches with regard to planning and imple-
mentation. Two questions typically addressed are:
How can front-end workflows be securely and reliably integrated with
automated processes in backend systems?
How can the widest possible adoption of workflow and process manage-
ment strategies be achieved?
Integrating front-end and backend processes
Even amidst all the automation at companies today, many employees still find
themselves spending hours daily processing paper or keying data on paper forms
into computer systems. In these instances, people and manual processes act as
the interface to digital systems, resulting in increased errors, high administrative
outlays, and processing delays.
By leveraging web applications and automating document services to support
data capture and processing, organizations are replacing costly manual processes
with more streamlined, electronic workflows. Integral to success is building appli-
cations on open, standards-based solutions that support existing demands and
can scale to handle future business requirements.
Achieving broad user acceptance of process management
Front-end processes can involve an almost limitless number of variables. Employ-
ees, partners, and customers frequently have different ways of working, from how
they supply information to the types of hardware and software they use. Because
many of these issues are outside of an organization's control, it is important to
adopt flexible solutions that can conform to users' varied requirements. At the
same time, the solutions still have to provide organizations with as much control
as possible.
This is, in part, why demand for RIAs and Web 2.0 technologies is on the rise. By
2010, Gartner predicts that at least 60 percent of new application development
will include RIA technology. And for good reasons. RIAs are generally platform-
and version-agnostic, enabling more people to use services and eliminating incon-
sistencies in data when it appears in different versions of web browsers.
Also, intelligence in RIAs resides on the client side, allowing users to repeatedly
and quickly manipulate content without reengaging with servers. Connections to
a server happen only as needed, such as when users save data, log on to the
Internet after working offline, or if backend data has changed and needs to be
updated on a user's machine.
As a result, RIAs offer more engaging, compelling experiences that are essential
for ensuring higher rates of user adoption and compliance with new strategies to
make process management more interactive. Equally important is that RIAs are
built around industry standards, providing greater assurance that they will inte-
grate seamlessly with middleware and enterprise systems, as well as support fu-
ture services.
CONCLUSION
The opportunities for leveraging web and document services technologies to im-
prove user participation in business processes are tremendous. Implementations
such as those found at Kane County and innovative financial services providers
are transforming long-held views of process management by enabling more people
to engage cost effectively and reliably with information and workflows inside and
outside an organization.
Fortunately, these approaches do not require a radical departure from existing
ways of working. Instead, it is possible to migrate accepted paper-based work-
flows to more interactive digital processes without adding unnecessary layers of
costs and complexity for users. In fact, processes should ultimately be simpler
and more engaging for everyone. The key is to look to proven technologies that
incorporate web services, interactive documents, and RIAs. The results, reduced costs, streamlined workflows, and enhanced services, are available to anyone
with a stake in process management.
Applying MDA® Concepts to Business Process Management
Alexander Petzmann, Michael Puncochar,
BOC Group, Austria
Christian Kuplich, BOC Group, Germany
David Orensanz, BOC Group, Spain
ABSTRACT
Business Process Management enables companies to gain from efficiency enhancements and to adapt quickly and flexibly to a changing world. Workflow technology can definitely help to put business process management into action. But how do you enable companies to achieve sustainable Business Process Management?
This paper first introduces the Process Management Life Cycle and its steps from
process strategy, documentation, optimization and implementation to daily busi-
ness process execution and process controlling.
Regarding implementation aspects, the paper emphasizes the duality of business-focused and IT-focused process views, which is a basic concept for applying MDA® approaches with respect to business needs and requirements. Then, practicable concepts for model-driven implementation of workflow applications and model-driven business monitoring are introduced.
Finally, different approaches of how to (re-)introduce business process management are discussed. In recent years a significant number of projects aimed at in-
troducing business process management required large amounts of initial in-
vestment before gaining benefit. To achieve sustained business process manage-
ment, significant benefits have to be quickly realized. To overcome the issue of
large initial investment, we discuss an approach to start with business monitoring
even before starting reengineering or implementation. This approach provides
process control and evaluation data to business people from the first minute.
Based on this, a continuous process management life cycle can be easily put into
action by starting with sharply focused process enhancements.
MDA® approaches can strongly support technical implementations, but (re-)introducing business process management is also a matter of organizational and cultural change in companies. To overcome the pitfalls in such introduction projects, the paper introduces a new approach starting with model-driven business monitoring in order to get the process management life cycle into action.
INTRODUCTION
This chapter briefly introduces the main concepts this paper is based on:
Process Management Life Cycle (PMLC)
Model Driven Architecture® (MDA®)
Service Oriented Architecture (SOA)
In the following chapters a concept is evolved of how MDA®/SOA concepts can be enhanced with respect to monitoring issues. This concept can be applied for (re-)
introducing sustainable Business Process Management in the form of a Process
Management Life Cycle.
Figure 1: Evolving steps towards the enhanced concept in this paper (the MDA top-down levels CIM, PIM, PSM and code, extended with business process modelling languages and a modeling method for XPDL, BPEL4WS and services, and, via logged instance data, elementary audit data and business transactions, with an Execution Process Monitoring View and a Business Process Monitoring View relating the Workflow Graph, Execution Graph and Business Graph)
Process Management Life Cycle (PMLC)
Business Process Management (BPM) is aimed at implementing continuous im-
provement in organizations. Therefore, an obviously good procedure to implement
BPM takes the form of a cycle. Based on the BPMS paradigm [Karagiannis 1996],
we define a generic Process Management Life Cycle (PMLC) as follows:
Figure 2: Process Management Life Cycle (PMLC)
Process Strategy: before starting to work with processes, the organizations
strategy should be applied for process management, like defining core
business processes, process goals and objectives.
Process Documentation: existing processes should be documented as-is.
Process Documentation not only defines procedures but also roles and re-
sponsibilities. Last, but not least, it defines tools and resources (like IT)
which have to be used for daily business.
Process Optimization: improvements are prepared here; existing processes are analyzed for possible enhancements. These enhancements are identified in order to better fulfill process goals and objectives derived from the strategy. The results of this step are the should-be processes.
Process Implementation: putting should-be processes into action is not only a technical issue. Even if IT makes up a large proportion of the implementation, organizational changes should be taken into consideration as well.
Process Execution: day-to-day business is done by executing processes.
Today, most of the daily business is done using IT in some way. Therefore,
this is where audit data is logged for immediate or later monitoring and
controlling.
Process Controlling: process audit data can be used in various ways from
IT support or process owners up to top management. Making use of avail-
able data is therefore a matter of suitable aggregation and transformation
according to goals and objectives set in the process strategy.
There are short run or long run cycles with emphasis on different areas of im-
provement. Short run cycles typically focus on evolutionary improvements of ex-
isting processes in order to make critical parts of it faster or cheaper or producing
output in better quality. Long run cycles on the other hand can try to bring revo-
lutionary changes to overcome local optimizations in order to focus on global op-
timizations or changed strategies.
Implementing Business Process Management from scratch can take a lot of time
when starting the PMLC classically with Process Strategy. A recent survey of Business Process Initiatives states: "In the case of the survey, most respondents reported expectations of realizing a positive ROI within 2-3 years, which is generally a reasonable expectation for well-executed projects ..." (see [BPTrends 2007] p. 28).
To bring the revolutionary should-be processes defined in Process Optimization into action, new software is needed. Model-driven process implementation can strongly support this. To reflect the PMLC concept, the model-driven approach should be enhanced to reflect monitoring issues in order to support Process Controlling.
Monitoring data is essential not only for steering and controlling issues but also
for having a basis for analysis and optimization in the next run of the PMLC.
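As a rough sketch of what such monitoring data can look like at the most elementary level (field names are illustrative, not a specific WfMC audit format), each executed activity would emit a record like the one below, which later aggregation turns into process-level figures for Process Controlling.

import java.time.Duration;
import java.time.Instant;
import java.util.List;

// Sketch: elementary audit records logged during process execution and a simple
// aggregation that a controlling view could build on.
public class ProcessAuditExample {

    record AuditEvent(String processInstanceId, String activity,
                      Instant started, Instant finished) {
        Duration duration() { return Duration.between(started, finished); }
    }

    public static void main(String[] args) {
        Instant t0 = Instant.parse("2007-03-01T10:00:00Z");
        List<AuditEvent> log = List.of(
                new AuditEvent("P-1", "Check application", t0, t0.plusSeconds(600)),
                new AuditEvent("P-1", "Approve application", t0.plusSeconds(900), t0.plusSeconds(1200)),
                new AuditEvent("P-2", "Check application", t0, t0.plusSeconds(300)));

        // Aggregate: average duration of one activity across all process instances.
        double avgSeconds = log.stream()
                .filter(e -> e.activity().equals("Check application"))
                .mapToLong(e -> e.duration().getSeconds())
                .average().orElse(0);
        System.out.println("Average 'Check application' duration: " + avgSeconds + " s");
    }
}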
Model Driven Architecture®
In 2001 the Object Management Group® (OMG®) introduced the Model Driven Architecture® (MDA®) as an approach for the specification of software systems based on a model transformation concept. Before MDA®, software engineering was strongly linked to the system platform on every design level, and interoperability was based largely on CORBA® standards and services. One of the main goals of the MDA® approach is to separate software design from architecture and realization technologies, so that design and architecture can evolve independently. Besides, MDA® uses only standardized techniques: the Unified Modeling Language® (UML®), the Meta Object Facility (MOF™), XML Metadata Interchange (XMI®) and the Common Warehouse Meta Model (CWM™). This article does not focus on the description of these techniques; for more details see [OMG 2007].
MDA® in its core is an approach to design IT system architectures taking into consideration heterogeneous systems to be covered in different levels of models. It describes how to transform these models step by step from an independent system level to platform models. So models, and the transformation of these models, are the focus of the MDA® approach. The MDA® describes three different types of models:
Computation Independent Model (CIM): The requirements for a system
are designed/modeled in a Computation Independent Model. The CIM de-
scribes the environment and situation in which the system will be used
from a business point of view. So the CIM should be the bridge between
Business Architects and Software Architects.
Platform Independent Model (PIM): The view of a system from the platform-independent viewpoint is designed/modeled in a Platform Independent Model. The goal is to produce models which can be transformed for an arbitrary system platform. Often, targeting a system model to a technology-neutral virtual machine is used in this step. The virtual machine, consisting of a set of parts and services (communications, scheduling, naming, etc.), is used as a platform free of context.
Platform Specific Model (PSM): The Platform Specific Model is a view of a system on a specific platform. The PSM merges the specification of the Platform Independent Model with the specific details of a particular platform.
Code: Finally, MDA® as a software engineering approach has to produce code (in the broader sense) which can be run on specific platforms.
Figure 3: The Levels of the MDA® approach (CIM, PIM, PSM and code in a top-down approach)
Transformation from one level to another is crucial within MDA®. Any transformation should require as little manual activity as possible. Following MDA®, the transformation from PIM to PSM can be fully automated. On the other hand, for the transformation from CIM to PIM no automated method is available so far. It has to be done manually, more or less well supported by tools.
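The following is a deliberately small sketch of what an automated PIM-to-PSM transformation step can look like in code; the model classes and the BPEL-flavoured output are hypothetical simplifications, not how real MDA® toolchains (which operate on MOF/XMI models) are implemented.

import java.util.List;

// Sketch: a platform-independent activity description is transformed into a
// platform-specific artifact (here, a BPEL-like invoke element) by a rule.
public class PimToPsmTransformation {

    record PimActivity(String name, String service) {}        // platform independent
    record PsmInvoke(String partnerLink, String operation) {}  // platform specific (BPEL-like)

    static PsmInvoke transform(PimActivity activity) {
        // The transformation rule encodes the platform knowledge, not the business model.
        return new PsmInvoke(activity.service() + "PartnerLink", activity.name());
    }

    public static void main(String[] args) {
        List<PimActivity> pim = List.of(
                new PimActivity("checkCredit", "CreditService"),
                new PimActivity("createContract", "ContractService"));
        pim.stream().map(PimToPsmTransformation::transform)
           .forEach(invoke -> System.out.println(
                   "<invoke partnerLink=\"" + invoke.partnerLink()
                   + "\" operation=\"" + invoke.operation() + "\"/>"));
    }
}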
To summarize, MDA® tries to transform the business requirements step by step into a system in a model-driven approach. But the CIM usually focuses strongly on the technical view and does not necessarily integrate all business dependencies which are contained in a Business Process. So the authors advise starting with Business Process Models (BPM), which can be seen as part of the CIM, in order to define the requirements for the systems from the scope of the business needs. For this level the Business Process Modeling Notation (BPMN) can be used. Only by including this level is it guaranteed that the engineered systems support the Business Processes in an optimal way, and that should be the ultimate goal.
Service Oriented Architecture
Primarily the Service Oriented Architecture (SOA) is a management concept that
predicates that IT infrastructure must be aligned with the goals of the business.
This means that business processes are optimally serviced by the functionalities
of the IT systems. But furthermore SOA also focuses on the architecture of IT sys-
tems. So the logic of the business process should not be part of the program logic
of a system. The process should be executed by a sequence or better by a compo-
sition of services offered by the IT systems. Thus these services are independent
from each other, system integration is not implemented on the system level but on the process level, and services are highly reusable.
Although SOA does not focus on a specific technology, Web Services and standard technologies like SOAP, WSDL, UDDI and BPEL are often tightly associated with this approach. However, SOA can also be realized, for example, with CORBA,
DCOM or EJBs. For details see [OMG 2007a].
Since service orientation is primarily a management concept, it is important to start software engineering by defining business requirements in a process-oriented scope and to transform these needs in a model-oriented way using MDA®. This challenge emphasizes the advantage of our approach of starting with a Business Process Model (see chapter MDA®). Already during this step the process analyst should think about the services and business rules which are needed for supporting the business process. After completing this step the IT architect has a package including processes, needed services and rules for transforming these business requirements into a service-oriented IT architecture. As the services used in the processes are offered by different IT systems, the Business Process Models are the (only) bracket of all services and therefore of all IT systems used in a company. This is demonstrated in Figure 4.
Figure 4: The Business Process as a bracket for all IT systems (a shared BPM level spanning the PIM, PSM and code levels of System 1, System 2 and System 3)
Depending on how many different software engineering approaches are used
within the same organization, one could try to harmonize platform independent
models (PIM) in order to achieve a homogeneous architecture throughout different
systems, even if different platforms are used. Figure 4 reflects this idea by sug-
gesting also to have only one set of PIM for all systems and platforms.
DUALITY "BUSINESS GRAPHEXECUTION GRAPH"
For at least 10 years there has been being a consensus that business process
models are not identical to executable workflow definitions or BPEL processes
[Karagiannis 1996]. During the implementation of process based applications fo-
cusing on the process view you can distinguish three essential artifact types.
These are illustrated as Graphs and shown in Figure 5. For easier understanding
they are described in a different order than shown in the figure.
Figure 5: Duality Business Graph - Execution Graph (the Business Graph in a business process modeling language; the Workflow Graph in an extended business process modeling language; the Execution Graph as a workflow model in the modeling language of workflow management systems; alongside UML models, reference models of standard software and legacy systems without modeling)
Business Graph
The Business Graph describes the business process; it is a model that describes how the process should be executed. It is typically modeled using a business process modeling language supported by a business process modeling tool (e.g. BPMS supported by the tool ADONIS, or BPMN supported by various tools). It usually contains no IT-specific information and describes the process purely from the functional point of view. It is modeled completely and contains activities that are executed manually as well as activities that are IT-supported.
Execution Graph
The Execution Graph describes the defined workflow template that can be executed in a workflow management system. Usually it is modeled in the design component of the workflow management system, using either a proprietary modeling language or standards like BPEL or XPDL.
There are several reasons why the Business Graph and the Execution Graph have to be regarded separately:
• The modelers of Business Graphs are business experts, while the modelers of the Execution Graph are IT experts.
• There is no 1:1 relation between Business Graph and Execution Graph.
• The granularity of Business Graph and Execution Graph is different.
• The Execution Graph does not contain manual activities.
• The Execution Graph might contain technically motivated decisions and workflow activities, for example for rollback mechanisms or exception handling.
• The Business Graph does not contain any technical data that is necessary for execution.
Workflow Graph
As described above, there are substantial differences between the Business and the Execution Graph. For deducing the Execution Graph from the Business Graph it is helpful to have an intermediate model which describes the interrelation between both models: the Workflow Graph. The Workflow Graph must not be mistaken for the executable process definition, the Execution Graph, as described above.
The Workflow Graph describes which parts of the business process should be realized by a certain implementation technology. It therefore defines the granularity on which the workflow should be controlled, by identifying which activities of the business process should be supported by workflow activities or services and which activities have to be aggregated or singled out. The Business Graph is enriched by the technical implementation details that are necessary for developing the Execution Graph.
There are two ways of deducing the Workflow Graph from the Business Graph.
The first possibility is to use the Business Graph and amend the missing informa-
tion. Another possibility is to completely remodel the Workflow Graph. This ap-
proach can be helpful if the quality of the Business Graph does not suffice.
The Workflow Graph is an extension of the Business Graph. That's why it's prac-
tical to use the same modeling tool for defining the Workflow Graph as for the
Business Graph provided that the modeling language includes the necessary ele-
ments or can be extended.
MODEL DRIVEN IMPLEMENTATION OF WORKFLOW APPLICATIONS
As described in the introduction chapter, there are different levels of modeling defined by the MDA framework of the OMG. Emphasizing BPM, figure 6 shows the artifact types which can be assigned to the different levels of modeling:
[Figure: MDA top-down approach mapping the Business Graph (business process modelling languages) to the CIM layer, the Workflow Graph (extended business process modeling languages) to the PIM layer, the Execution Graph (modeling method for XPDL, BPEL, ...) to the PSM layer, and XPDL, BPEL4WS and services to the code layer.]
Figure 6: MDA and Process Graphs
Focusing on the process view, the Business Graph can be classified as Computa-
tion Independent Model (CIM layer). The Workflow Graph represents a Platform
Independent Model (PIM layer). The graphical representation of the Execution
Graph can be classified as a platform specific model (PSM layer) while the file rep-
resentation of the Execution Graph is part of the Code (Code layer).
As already described, the transformation of the Business Graph to the Execution Graph is done via the Workflow Graph. Therefore each iteration step requires transfer and transformation mechanisms. The way the transformation is realized depends on the tools used for modeling on each level and, in the last step, on the workflow management system used for process execution. Interfaces between the modeling tools can be realized as manual interfaces if integration is not possible or feasible, as offline interfaces by exporting and importing files with model information, or as online interfaces if the tools are well integrated or both levels are modeled in the same tool.
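As an illustration of the offline variant, the following Python sketch exports a small in-memory Workflow Graph to a simplified, XPDL-like XML file that another modeling tool could import. The element and attribute names are illustrative only and do not form a schema-valid XPDL package.

# A minimal sketch of an offline interface: serializing a Workflow Graph held in
# memory to a simplified, XPDL-like XML file.
import xml.etree.ElementTree as ET

activities = [("a1", "Check order"), ("a2", "Approve order")]   # hypothetical model
transitions = [("t1", "a1", "a2")]

package = ET.Element("Package", Id="demo")
process = ET.SubElement(package, "WorkflowProcess", Id="p1", Name="Order handling")
acts = ET.SubElement(process, "Activities")
for act_id, name in activities:
    ET.SubElement(acts, "Activity", Id=act_id, Name=name)
trans = ET.SubElement(process, "Transitions")
for trans_id, source, target in transitions:
    ET.SubElement(trans, "Transition", Id=trans_id, From=source, To=target)

ET.ElementTree(package).write("workflow_graph.xpdl", xml_declaration=True, encoding="utf-8")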
Apart from this ideal situation, application development projects usually don't follow the waterfall approach. That is why merging mechanisms for transferring/transforming models are needed again in case model information has to be changed on the higher level after it has already been edited on the level below. Furthermore, round-trip mechanisms for adapting the higher-level models to changes made on a lower level can be helpful to keep consistency between all levels of modeling.
Referring back to figure 5, there is no 1:1 mapping between levels. For consistency reasons, but even more for the monitoring issues described later, it is necessary that business IDs (from higher levels) are known on lower levels or are at least mappable.
MODEL DRIVEN BUSINESS MONITORING OF WORKFLOW APPLICATIONS
Extending the concept for monitoring issues is the last step. Figure 7 shows how the way back has to be integrated. When running the daily operation, business transactions produce audit data. This audit data can be transformed for different views, if a consistent mapping between the different levels is available.
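To make the idea of the way back concrete, the following sketch (with invented IDs and field names, assuming simplified audit records) maps elementary audit events from the execution level back to Business Graph activities, so that the same data can be presented in a business process monitoring view.

# Elementary audit data from the workflow engine is translated into the business view
# using a mapping table between Execution Graph and Business Graph activity IDs.
exec_to_business = {
    "wf_check_invoice": "BP-Billing-01",
    "wf_post_invoice":  "BP-Billing-02",
}

audit_events = [
    {"instance": 4711, "exec_activity": "wf_check_invoice", "state": "completed", "ms": 5400},
    {"instance": 4711, "exec_activity": "wf_post_invoice",  "state": "completed", "ms": 1200},
]

business_view = [
    {"instance": event["instance"],
     "business_activity": exec_to_business.get(event["exec_activity"], "unmapped"),
     "state": event["state"],
     "ms": event["ms"]}
    for event in audit_events
]

for row in business_view:
    print(row)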
[Figure: the MDA top-down approach extended with the way back - business transactions at runtime produce logging/audit data per instance; this elementary audit data is transformed into an execution process monitoring view and further into a business process monitoring view, using the Business Graph, Workflow Graph and Execution Graph as templates.]
Figure 7: Using MDA models as templates for viewing monitoring data
Excursus: Levels and Addressees of the Business Monitoring Framework
In [WfMC 2004] the Business Monitoring Framework (BMF) was introduced. It describes three levels of monitoring:
• Strategic Monitoring
• Tactical Monitoring
• Operational Monitoring
Each monitoring level offers relevant information for different addressees or roles. Figure 8 extends the respective figure shown in [WfMC 2004] accordingly:
[Figure: data sources in the runtime environment (logging/audit data) are mapped 1:1 to process instances at the operational level, aggregated and transformed per process type at the tactical level, and further aggregated into a process scorecard at the strategic level. Addressees/roles range from operation, technical/2nd level support and product responsible at the runtime and operational levels, through process workers and process responsibles, up to the process manager at the strategic level.]
Figure 8: Levels and addressees within the Business Monitoring Framework
The different addressees shown above can be easily divided into technically ori-
ented and business oriented people. Their respective monitoring issues can be
assigned to business process monitoring view and execution process monitoring
view, as shown in figure 9:
[Figure: elementary audit data is transformed into an execution process monitoring view (activity instance and process instance at the operational level, process type at the tactical level) for technically oriented addressees (operation, technical/2nd level support, product responsible), and into a business process monitoring view (process instance, process type and process scorecard, aggregated up to the strategic level) for business oriented addressees (process worker, process responsible, process manager).]
Figure 9: Different views are aggregated for different BMF-levels and addressees
Some examples describe the addressees' needs from figure 9 in more detail:
• Operation needs detailed information, typically on the activity instance level, in order to solve specific problems. They need information on the code level and not from a business point of view.
• Technical / 2nd level support typically needs information on the process instance level (where is one specific process stuck at the moment) or aggregated over several process instances of the same type (how many billing processes are running at the moment, are they delayed compared to yesterday). Technical support needs to see a technical view; they are therefore better off with a platform specific view (e.g. mapped to XPDL models) on the execution process level.
• Business people carry out different roles in a process-oriented organization, but all of them will have problems understanding technical models. Therefore, monitoring data has to be transformed into the business process view before it can be aggregated for their different needs.
• A process worker, for example, needs to know where his current process is stuck (such as he can see it in his work list in a workflow system) on the process instance level.
• The owner or responsible of the same process will have more aggregated information needs. He wants to see how many instances have been finished that day, how long it took systems and people to perform the process on average, and the like.
• Finally, the process manager has to report periodically to a panel on a strategic level. He has questions like: Could the organization manage to improve processing time or throughput? On this level, further data such as financial data is combined in order to get statements for performance management on a higher business level. This can be done using process scorecards or by integrating this data into balanced scorecards.
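The step from the operational to the tactical level is essentially an aggregation of instance data per process type. A minimal sketch, assuming each finished instance is available as a simple record, could look as follows; the figures produced (instance count, average cycle time) are the kind of values a process responsible would look at.

# Roll instance-level monitoring data up to the process type level (tactical level).
from collections import defaultdict

instances = [
    {"type": "Billing",  "cycle_time_h": 4.5},
    {"type": "Billing",  "cycle_time_h": 6.0},
    {"type": "Shipping", "cycle_time_h": 12.0},
]

per_type = defaultdict(list)
for instance in instances:
    per_type[instance["type"]].append(instance["cycle_time_h"])

for process_type, times in per_type.items():
    print(f"{process_type}: {len(times)} instances, "
          f"average cycle time {sum(times) / len(times):.1f} h")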
Now the cycle is closed; we have shown how to enhance the MDA approach in order to build business process oriented systems that include monitoring artifacts on all levels of modeling from the beginning. By keeping consistency between the levels and having mappable models, one can go all the way back with audit data, transforming it to the more technical execution view or even back to the business view, depending on the addressee of the monitoring data.
This enhanced MDA approach integrates mechanisms for advanced business monitoring needs, which in turn are needed to support the process management life cycle.
DIFFERENT APPROACHES FOR (RE-) INTRODUCING BUSINESS PROCESS MANAGEMENT
Realizing new applications (from scratch)
The Process Management Life Cycle (PMLC), as described in the introduction chapter, represents a standard approach to introducing BPM. It starts with defining the strategy, documenting the as-is processes and running an intensive optimization phase, taking into account the set of objectives defined in the strategy.
In order to put the should-be processes into action, the processes have to be implemented, which not only means carrying out a software project but also considering all organizational changes. BPM projects are challenged by this organizational part of the change process at least as much as by the typical challenges arising from software projects.
When everything is in place, it has to be evaluated whether the optimization measures are effective (and the new processes are more efficient in terms of the new strategy) or not. This process controlling is done by monitoring daily operations and analyzing the results.

Figure 10: Starting the PMLC for realizing new applications
Continuous Improvement
Once the organization has managed to put the implemented processes into action, they become the as-is processes in daily business. If no changes are needed, those processes will keep running. But Business Process Management promises to help organizations adapt quickly and flexibly to a changing world. This can only be achieved with effective process controlling in place. Based on such evaluation data, the next roundtrip through the PMLC can start:

Figure 11: The PMLC enables for continuous improvement
Whether the process strategy has to be adapted and revolutionary changes are needed (process strategy and documentation have to be adapted), or only evolutionary improvements have to be made in the short term, the monitoring data collected throughout process controlling is needed as a basis for process optimization.
In any case, it is the process controlling and the evaluation against objectives which push process managers and/or process-accountable staff to step into the next cycle of the PMLC. They identify gaps between the as-is situation and their objectives and therefore feel the need for further improvement. This mechanism keeps the PMLC running and leads organizations to sustainable Business Process Management, at least as far as the authors' experiences from various BPM projects tell.
Projects aimed at introducing Business Process Management often require large initial investments before any benefit is gained. Even if such projects are completed at all, without process controlling and without integrating business-process-based objectives into the organization's incentive scheme, the introduction of BPM is likely to fail in the long run.
Once such projects have failed, BPM becomes the negative buzzword of the year within this specific organization. It is difficult to convince people of the advantages of BPM again in such situations. Even worse, it is almost impossible to find supporters for re-introducing BPM. Here the following approach can bring new momentum:
A fast BPM (re-) introduction or Start with Business Monitoring first
So the key factors to achieve sustainable BPM within organizations are quick introduction projects and enabling process controlling. Furthermore, almost every established organization already has existing as-is processes. At least this can be assumed, as the organization is running its daily business. Maybe the as-is processes are not explicitly known or available as business process models, but implicitly they are there.
The major idea behind this approach is to step into the PMLC as quickly as possible. So why not start with process controlling/business monitoring?

Figure 12: Quick introduction phase for business monitoring
The key characteristics can be described as follows:
• Compared to long-running classical introduction projects, process managers see first results and monitoring data for their processes very early.
• People can more easily identify themselves with BPM because they are directly involved in getting monitoring data and discussing how to improve their real running processes.
• Such projects can typically start in well-defined areas without the need for new software applications for the daily business. Therefore, the change process is much simpler in the first phase. This also helps to reduce conservative attitudes.
• Benefits from business process improvement can easily be demonstrated by comparing monitoring data in the short run. Even smaller improvements tend to be a sufficient argument for BPM because the initial investments are much smaller.
• But you need a monitoring toolset which can be quickly introduced and easily adapted for future extensions.
The extended concept for monitoring issues, discussed in the chapter Model Driven Business Monitoring of Workflow Applications, supports this approach well when it is used the other way round. In contrast to using modeling for developing new software, models are used to extract and document as-is processes from existing software and the running business.
• Execution Graph models are extracted from software by analyzing software dialogue sequences and by interviewing technical staff.
• Business Graph models are derived by interviewing business people in order to document how they are working as-is.
• There is assumed to be a gap between how business and IT people see their as-is processes. This gap can be narrowed by trying to find a suitable mapping while modeling the Workflow Graph.
• Existing audit data has to be transformed in order to fit the Execution Graphs. Where no audit data exists, semiautomatic approximations or even manual logging can help. In some cases small adaptations to the existing software can help to overcome such a deficit.
• Finally, process controlling/business monitoring can be set up based on the business process monitoring view (see figure 9).
Starting with this approach, sharply focused process improvements can be implemented in a quick and easy way, based on real process monitoring data and not on estimates. As a side effect, technically focused as well as business focused models are created and can easily be broadened in the future. BPM brings direct benefits to process managers and encourages them to continue their work based on a PMLC. And it prepares the ground for future software developments using MDA approaches, because many of the needed models will already exist by then.
SUMMARY
Introducing BPM in a classical way can require large initial investments and can take a long time until benefits arise. The longer such projects take, the higher the risk of failing, or at least of failing to demonstrate benefits that originated from this project rather than from other effects.
Three key factors for successfully introducing BPM are discussed:
• A short-running introduction project.
• Integrating process-based objectives into the organization's incentive scheme.
• Demonstrable benefits for the persons who have to play a key role in a process-based organization, so that they want to continue working on process improvements.
Then a sustainable Process Management Life Cycle will be in place to ensure con-
tinuous improvement and business excellence.
Based on this PMLC an approach is provided to (re-) introduce BPM by starting
with monitoring in order to keep introduction time short. In contrast to classic
BPM projects it helps to get into BPM step by step with smaller initial effort and
more easily achievable benefits.
Last, but not least, concepts for model-driven business monitoring provided in
this article are designed to support business monitoring/process controlling
which is considered to be crucial for a sustainable and effective Business Process
Management.
REFERENCES
[Karagiannis 1996]
Karagiannis, D.; Junginger, S.; Strobl, R.: Introduction to Business Process Management Systems Concepts. In: Scholz-Reiter, B.; Sickel, E. (Eds.): Business Process Modeling. Springer, Berlin et al., 1996, pp. 81-106.
[WfMC 2004]
Junginger et al.; Workflow-based Business Monitoring. In: Fischer, L.
(Ed.): Workflow Handbook 2004. Future Strategies Inc., 2004, pp. 65-80.
[BPTrends 2007]
Palmer, N.: A Survey of Business Process Initiatives.
Business Process Trends Report, 2007. www.bptrends.com or
www.wfmc.org/researchreports/documents/Survey_BPI.pdf
[OMG 2007]
OMG: Model driven Architecture; refer to www.omg.org/mda
[OMG 2007a]
OMG: Service Oriented Architecture; refer to www.omg.org/soa

From Functional Silos to a
Process Vision
Salvatore Latronico and Francesco Battista,
openwork, Italy
INTRODUCTION
Even before the fascinating promises about orchestrating organizations, BPM directly supported a different point of view. ERP systems and all other traditional/transactional ones have been designed and developed to vertically support departmental needs, offering a wide variety of fundamental and powerful functional islands where users live and act. On the other hand, BPM suites take care of creating bridges to link the islands together and allow their inhabitants to rely on structured but flexible communication channels.

Picture 1: BPM bridges functional islands made of department people and systems
A real-life business case will allow us to illustrate how organizations usually work without BPM methodological and technological support, highlighting the related criticalities. Then we will explore the general benefits and evolution opportunities of introducing a BPM suite, presenting both real-time prototyping and process data monitoring and analysis experiences. A brief examination of a few key BPM aspects and implications will complete the picture of how much BPM suites are, and increasingly will be, needed to orchestrate people and systems within and among organizations.
BUSINESS CASE: ORGANIZATIONS HABITS WITHOUT BPM
Needs, scope and originally adopted procedure
The main elements presented here derive from a BPM project executed at TNT Global Express Italy (referred to as "the company" in the rest of the text). The company's Information Technologies and Finance & Accounting departments rapidly acquired the BPM view and logic, playing a very important and active role in all process reengineering phases, as reported below.
Fiscal purposes require the company to generate, verify and store securely, on a
monthly basis, a set of documents for each legal entity belonging to the company
group.
This procedure involves the following departments or external functions:
• Finance and Accounting (F&A)
• Information Technologies (IT)
• External Audit (EA)
• General Services (GS)
and the following main systems:
• ERP (Finance and Accounting system)
• Document Management System (DMS)
• Signature software
• DVD burning software
• Document-optimizer software
The number of accounting documents for each legal entity is clearly defined for all months except for the end-of-year closure period, when this number isn't known in advance: in this case the F&A Director is responsible for stating when all the needed documents have actually been produced.
During the whole year, the F&A department prints all the necessary accounting documents from the ERP system, producing very large files, each containing thousands of accounting records related to one single legal entity and one single month.
Every accounting document is then handled by IT department operators who optimize and compress it with a dedicated software tool and then index and enter the accounting documents into the Document Management System (DMS).
At defined deadlines an IT operator (in general, different from the one in the previous step) accesses the DMS and verifies whether, for a specific legal entity, all required accounting documents are already available in the system. If this is the case, the IT operator selects the relevant documents in the DMS and copies them into a file system directory, creating a document volume. This has to be done in compliance with several criteria (such as month of issue, legal entity name, etc.), also taking into account the maximum volume size and creating more than one volume when needed. When there are no more accounting documents for the same month/legal entity combination, the IT operator creates a hash codes file (a standard, well-known file fingerprint) for each volume. He then electronically signs them, together with the related timestamp, in order to certify their content and allow the detection of any change to the volume documents after signature. For hash codes creation, signature and timestamp, the IT operator uses dedicated software.
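The hash codes file itself is conceptually simple. A minimal sketch (folder layout, file names and the choice of SHA-256 are assumptions for illustration, not details taken from the project) computes one fingerprint per document in a volume directory and writes them to the file that is subsequently signed and timestamped.

# Compute a fingerprint for every document in a volume and write a hash codes file.
import hashlib
from pathlib import Path

volume_dir = Path("volumes/2007-01_LegalEntityA_vol01")   # hypothetical volume folder

with open(volume_dir.with_suffix(".sha256"), "w") as hash_file:
    for document in sorted(volume_dir.glob("*.pdf")):
        digest = hashlib.sha256(document.read_bytes()).hexdigest()
        hash_file.write(f"{digest}  {document.name}\n")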
The electronic signature used in the business case we refer to is in reality a digital signature. In Italy and most European countries the term digital signature refers to the highly secure method of signing documents based on a personal smartcard issued, verified and guaranteed by Certification Authorities. As the details of digital signature features are not essential for the case we present and do not affect the main idea we want to communicate in this article, we use here the generic expression electronic signature. Nevertheless, we want to remark that some important aspects of strength and security related to the Italian and European digital signature may add value in such a context.
In theory it is the F&A Director who should sign the volume hash codes files, because she/he knows the content of the accounting documents. In the real case the F&A Director never actually signs or verifies the volumes but is simply forced to blindly delegate the IT operator to act on his behalf (see the following Criticalities paragraph for details and comments on this). Once a document volume has been electronically signed, the IT department is requested to burn the signed volume data onto a DVD and deliver it to the F&A department, which is responsible for verifying it and storing it in a fireproof safe. In case F&A rejects the DVD for any reason, IT must fix the reported issues and burn the DVD again.
The process then continues with the F&A department sending the signed volume hash codes file to General Services (mail and registry function), which is responsible for forwarding it to the External Auditor as proof of task completion and for any possible future reference. The General Services department is also responsible for filing in the DMS both the message sent to the External Auditor and the related receipt.

1. Printing Documents | Owner: F&A | Tool: ERP | Trigger: PULL | Ability: Knowledge of ERP printing functions
2. Optimizing Documents | Owner: IT | Tool: Document-optimizer dedicated tool | Trigger: PULL | Ability: Use of the document-optimizer dedicated tool
3. Indexing and Storing Documents | Owner: IT | Tool: DMS | Trigger: PULL | Ability: Knowledge of which document data need to be indexed and where
4. Creating Document Volumes | Owner: IT | Tool: DMS and File System | Trigger: PULL | Ability: Knowledge of deadlines and rules for volume creation
5. Creating and signing volume hash codes file | Owner: IT on behalf of F&A Director | Tool: Signature software | Trigger: Action 4 completed | Ability: Investigate and verify content of each document before signing
6. Burning DVD | Owner: IT | Tool: DVD burning software | Trigger: Action 5 completed | Ability: Use of DVD burning software
7. Delivering DVD to F&A | Owner: IT | Tool: - | Trigger: Action 6 completed | Ability: -
8. Verifying and storing DVD in fireproof safe | Owner: F&A | Tool: File System | Trigger: DVD received | Ability: -
9. Sending signed hash codes file to GS | Owner: F&A | Tool: Mail System | Trigger: Action 8 completed (positive verification) | Ability: Send mail with attached signed hash codes file to General Services
10. Receiving signed hash codes file | Owner: GS | Tool: Mail System | Trigger: PULL | Ability: Check mailbox
11. Sending signed hash codes file to EA | Owner: GS | Tool: Mail System | Trigger: Action 10 completed | Ability: Send mail with attached signed hash codes file to the External Auditor
12. Storing signed hash codes file | Owner: GS | Tool: DMS | Trigger: Action 10 completed | Ability: Knowledge of which hash codes file data need to be indexed and where
13. Receiving signed hash codes file receipt | Owner: GS | Tool: Mail System | Trigger: PULL | Ability: Check mailbox
14. Storing receipt | Owner: GS | Tool: DMS | Trigger: Action 13 completed | Ability: Knowledge of which receipt data need to be indexed and where
Table 1: procedure steps without BPM
At the end of the process we have one or more DVDs containing the accounting documents and the signed hash codes file for each month and legal entity. The accounting documents, the signed hash codes files and the related receipts are also stored in the DMS.
The most important steps and elements of the procedure are listed in Table 1.
Criticalities
The company, in managing all those steps, strictly needed the whole procedure to be completed on time, respecting all deadlines, and with the possibility of tracing and verifying each step for audit purposes.
Starting from those strong requirements, there were several relevant criticalities that had to be examined and solved; let us comment on the most important ones.
The process is extremely time-consuming: many activities could be automated. Moreover, the only activity requiring a particular ability and responsibility (Activity 5) is delegated, for the sake of process simplicity, by the F&A Director to an IT operator with no idea or knowledge of the documents' content.
The F&A Director, owner of the whole procedure due to his corporate and legal responsibilities, simply knows when it begins (Activity 1), but after that he has no control and only a completely blind view until the end of the most important activities (Activity 8). Technical and manual activities cause non-F&A people to fully manage the core of an extremely delicate F&A procedure, and the related documents are loaded into the DMS without any actual content verification.
Despite several relevant deadlines, the most important procedure steps are pull-driven, which means that operators have to manually check for event occurrence to trigger further actions, with no automatic alerts or notification support and, of course, no opportunity to verify activity status and/or measure service levels.
All the actors of the process communicate with each other exclusively in an unstructured way; everybody remains in his own functional silo, using very sophisticated software tools for executing each single activity but then going back to something close to stone-age efficiency to let the process advance to the following step.
BUSINESS CASE: ADOPTING BPM TO OPEN AND LINK FUNCTIONAL SILOS
Guiding people and systems across the organization
Considering the general context of the business case and the highlighted criticalities, the BPM introduction was essentially focused on the following points:
• system vs. human: distinguishing activities that can be done automatically by systems from activities that must be done by people;
• orchestration: putting in place automatic processes managing and guiding both the people and the systems involved;
• delegation: avoiding delegations, starting by eliminating all risky and illogical cases; each actor of the process has to be able to execute activities on his own, directly managing and controlling the related responsibilities;
• content check: allowing document content control during, and not only at the end of, the process, changing the process flow accordingly;
• alerts: introducing time-based alerts and related service level measurements.
In compliance with the first point, Table 1 has to be rearranged, and the result is shown in Table 2.



1. Printing Documents | Owner: F&A | Tool: ERP | Trigger: PULL
2. Optimizing Documents | Owner: Process Automation | Tool: Document-optimizer dedicated tool | Trigger: Process Event
3. Creating Document Volumes | Owner: Process Automation | Tool: File System | Trigger: Process Flow
4. Creating Volume hash codes file | Owner: Process Automation | Tool: Signature software | Trigger: Process Flow
5. Signing Volume hash codes file | Owner: F&A Director | Tool: Signature software | Trigger: Process Flow
6. Verifying Volume hash codes file | Owner: Process Automation | Tool: Signature software | Trigger: Process Flow
7. Indexing and Storing Documents | Owner: Process Automation | Tool: DMS | Trigger: Process Flow
8. Burning DVD | Owner: Process Automation | Tool: DVD burning software | Trigger: Process Flow
9. Delivering DVD to F&A | Owner: IT | Tool: - | Trigger: Process Flow
10. Verifying and storing DVD in fireproof safe | Owner: F&A | Tool: File System | Trigger: DVD received
11. Sending signed hash codes file to EA | Owner: Process Automation | Tool: - | Trigger: Action 10 completed (positive verification)
12. Storing hash codes file | Owner: Process Automation | Tool: DMS | Trigger: Process Flow
13. Receiving hash codes file receipt | Owner: GS | Tool: DMS | Trigger: Mail received
14. Storing receipt | Owner: Process Automation | Tool: - | Trigger: Process Flow
Table 2: procedure steps with BPM
Note that only 5 of the 14 activities are human activities, and all of them may be orchestrated using BPM suite tools that allow us to draw the activities in a flow chart defining WHO does WHAT and WHEN.
Activities 2 and 3 can be managed with an automated, or System-to-System (S2S), process that we called Documents Acquisition (DA). This is a daemon process (single instance) that never ends, collects accounting documents using a naming convention (file type, legal entity, year and month), and creates volumes.
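The naming-convention step of the DA process can be pictured with a short sketch; the pattern shown is an assumption for illustration, not the company's real convention. Incoming files named doctype_entity_year_month.pdf are grouped into volumes per month/legal entity combination.

# Group incoming accounting documents into volumes based on their file names.
import re
from collections import defaultdict

NAME = re.compile(r"(?P<doctype>\w+)_(?P<entity>\w+)_(?P<year>\d{4})_(?P<month>\d{2})\.pdf$")

incoming = ["GL_EntityA_2007_03.pdf", "VAT_EntityA_2007_03.pdf", "GL_EntityB_2007_03.pdf"]

volumes = defaultdict(list)
for filename in incoming:
    match = NAME.match(filename)
    if match:   # files not following the convention are left for manual handling
        key = (match["entity"], match["year"], match["month"])
        volumes[key].append(filename)

for key, files in volumes.items():
    print(key, files)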
The other activities can be organized in a process where systems and humans interact (S2H2S), which we called the Volume Signature (VS) process; it is triggered by the DA process for each document volume (multiple instances).
Activity 7 can be managed with a process that we called Document Index and Storage (DIS), triggered by the VS process for each document in a volume (multiple instances).
Picture 2: Documents Acquisition (DA) process
The process flow charts of the two most important processes, DA and VS, are shown in Picture 2 and Picture 3 respectively. It is very important to note that the presented pictures are self-explanatory, and this is one of the most important BPM features: drawing flow charts that clearly represent the process logic while being, at the same time, the software gears that run the application.
From a semantic point of view, the activities in Picture 2 and Picture 3 are characterized this way: a small gears icon marks system activities (server side), while a small head icon is used for human activities.
Trace data about document volumes, their statuses, related history, etc., are registered in the DMS by both systems and humans.
But how do operators execute their own activities?
This is done via a web application they can access from their PC desktop or via an e-mail reminder. In both cases operators get to a list of activity links, each opening a form containing all the data items related to the volume being processed.
This approach gives the opportunity to move from PULL to PUSH mode, where an activity never completes until the operator in charge has done what he was requested to do (execute actions, enter specific data, etc.) in the flow chart.
The F&A Director, who is requested to execute the signature activity in the VS process, is of course also allowed to open and check the volume documents, irrespective of their physical location, so that he/she can now be aware of what is being signed. This is the reason why hash codes files are always generated server side.
Let us examine in detail what happens for the year-end closure period: this is a typical case of an unstructured process step.
For the year-end closure period, no rules for volume completion have been set, simply because the number of documents is not known in advance and cannot be formally defined. In this specific case the rule is that the F&A Director, after examining the documents, is responsible for declaring when they have all been printed (unstructured process case), while for the January to December periods the process rules are formally defined (structured process cases).
Note that in the year-end closure case the DA process immediately triggers the VS process, while otherwise it triggers the VS process only when a volume is full or when all scheduled documents of a month/legal entity combination have been collected. In the latter case, the DA process triggers the DIS process for all open volumes related to the specific month/legal entity combination.
This means that for the year-end closure period the F&A Director will be allowed to sign the volume hash codes file no matter how many documents are in the volume, while for all other accounting periods he will be allowed to sign the volume hash codes file only when all the requested documents have been collected.
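This triggering rule is easy to express in a few lines; the sketch below uses illustrative names, not taken from the actual implementation, and captures the distinction between the year-end closure period and the ordinary months.

# Decide whether a volume is ready to be offered to the F&A Director for signature.
def ready_for_signature(documents_in_volume, scheduled_documents, volume_full, year_end_closure):
    if year_end_closure:
        return True          # the F&A Director decides when to sign, regardless of count
    return volume_full or documents_in_volume >= scheduled_documents

print(ready_for_signature(documents_in_volume=7, scheduled_documents=10,
                          volume_full=False, year_end_closure=False))   # False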
It is important to underline how the DA process drives the VS process to trigger the hash codes computation immediately each time a new file is added, in order to keep the volume hash codes file ready for signature (see the Restart process node in Picture 3).
All processes include alerts and timers to control the execution of activities and to notify actors of events and deadlines.
As a result, with no additional manpower, all operators are perfectly orchestrated independently of their own department, opening the functional silos and linking them together.
Everybody is informed on time about the process flow and due activities, with alerts pushed when needed.

Picture 3: Volume Signature (VS) process
Now the F&A Director is able to execute operations from his own desk, without delegating tasks of high corporate and legal responsibility to IT operators.
Of course, all these activities benefit from being guided by the processes of a BPM suite, which provide detailed and complete traceability of WHO has done WHAT and WHEN, general reliability, automatic SLA control and alerts, and so on.
In the end the most important points are:
• people know what to do even when they do not know (partially or totally) the process;
• people understand what is happening in the process thanks to e-mails, reminders and alerts helping the process to advance;
• people are now monitored and traced during activity execution.
These are the main features of the BPM solution delivered, fully satisfying the customer requirements on the project GO LIVE date.
Supporting the evolution of ideas and needs, the F&A department, and all the other involved departments, clearly perceived the improvements the project had brought but, most of all, people started to understand the potential of a completely different approach. As a consequence, a few weeks after the project GO LIVE, small changes or project scope extensions had already been defined, to further improve the delivered solution and corporate performance.
The very important point we want to underline here is that, for several reasons, it would have been very difficult (if not impossible) for all the people involved in the project analysis to identify those changes or extensions before actually getting hands-on and directly experiencing the delivered processes.
This phenomenon then triggered a sort of Fibonacci sequence, where evolutionary step n is based on steps n-1 and n-2: every project adjustment delivered allows the organization to better focus on enhancement areas, often letting new requirements and ideas emerge for continuous improvement, which is what we call real-time prototyping.
Picture 4: Volume Signature (VS) process: step n+1
Following this evolution path, below we present a couple of the subsequent analysis/implementation steps in the project after the first GO LIVE.
Step n+1: why should the F&A Director always sign a volume for approval? What would happen if she/he detected an error in one of the volume documents?
Those questions led to a change in the process allowing the F&A Director to approve or invalidate a volume in case of errors, requesting the F&A department to print all the volume accounting documents again. All this was done by applying very simple modifications to the flow chart and the related web form, as shown in Picture 4.
Note that this case was very rare, or even impossible, before the BPM introduction, because the electronic signature was delegated by the F&A Director to an IT operator.
Step n+2: why should the F&A Director always approve or reject a volume in its entirety? Why should even one single error in one single volume document cause the entire volume to be rejected?
Those questions led to a change in the process allowing the F&A Director to partially approve a volume, requesting the F&A department to mend the errors and print again only the invalidated fiscal documents. All this was done by applying very simple modifications to the flow chart and the related web form, as shown in Picture 5.
Picture 5: Volume Signature (VS) process: step n+2
Monitoring and analyzing: awareness tools
One direct advantage of adopting BPM for managing processes is that almost everything done by means of processes is traced in detail, including the timing of each related event. Reasonably, this sounds very interesting in any circumstance, including the presented business case.
Making use of Business Activity Monitoring (BAM) functionalities on the huge set of traced data produces significant benefits in terms of monitoring (real-time control of the processes' key data, from the summary KPIs down to the measures and details of a single activity) and analyzing (trend indicators, bottleneck detection, need for redesign, moving resources from one department to another).
Both the analyzing and the monitoring features are very strong tools for scientifically studying the way processes are executed in organizations, allowing specific criticalities to be detected and providing hints for proposing general alternatives/extensions to existing processes. These tools are extremely important in allowing all department people to give their essential contribution to process reengineering and evolution.
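As one concrete example of the analyzing side, the sketch below (record structure and timestamps are invented) computes the average duration per activity from traced start/end times, which is a simple way to spot bottlenecks worth redesigning or re-staffing.

# Detect slow activities from traced event timestamps.
from collections import defaultdict
from datetime import datetime

trace = [
    {"activity": "Sign volume", "start": "2007-03-01T09:00", "end": "2007-03-01T09:40"},
    {"activity": "Burn DVD",    "start": "2007-03-01T09:40", "end": "2007-03-01T09:55"},
    {"activity": "Sign volume", "start": "2007-03-02T14:00", "end": "2007-03-02T15:20"},
]

durations = defaultdict(list)
for event in trace:
    start = datetime.fromisoformat(event["start"])
    end = datetime.fromisoformat(event["end"])
    durations[event["activity"]].append((end - start).total_seconds() / 60)

for activity, minutes in sorted(durations.items(), key=lambda kv: -max(kv[1])):
    print(f"{activity}: average {sum(minutes) / len(minutes):.0f} min over {len(minutes)} executions")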
IS BPM JUST SOFTWARE? THERE'S A LOT MORE THAN MEETS THE EYE
Traditional software development approach
The business case presented highlights very important implications of BPM suite usage, but there are more general concepts to investigate.
The real innovation introduced by BPM suites is not related to WHAT can be delivered using them (the final result will always be a software application) but to HOW those suites allow us to work when defining, developing and maintaining companies' business solutions.
We will investigate some key methodological aspects that only tools of the BPM suite class can provide, due to their completely different approach to organizational real-life issues.
Irrespective of project scope, technology used, terms and jargon, when an organization requires a software application, in almost every case there will be a standard sequence of project macro-phases, delivering the required application as a final result.

Picture 6: General macro-phases for non-BPM software applications
Macro-phase names may differ depending on the specific category and/or type of project, but they can almost always be brought back to:
• Analysis: a detailed functional, generally non-technical, description of what the customer wants;
• Design: high-level technical and technological directions defining the correct architecture and environment to build what the customer wants;
• Development: technical activities (code writing, system configuration or similar) in compliance with the defined architectural context to functionally deliver what the customer wants.
Each macro-phase involves different actors with different skills and experiences,
using different terms, concepts and, we could even say, different languages.
Passing from one macro-phase to another causes a risky translation activity that
implies a partial loss or distortion of the knowledge that originated the initial need
and idea. This phenomenon is only moderately related to the ability of the actors
involved: at the end of the macro-phases, the final result will be (slightly or
strongly) different from what the customer wanted.
But despite what we just said, let us suppose the delivered solution perfectly matched the customer's request: in any case this would definitely be true only for a limited time interval. In fact, customer needs tend to change often and quickly as a consequence of adjusting to or following real-world evolutions; software solutions, on the contrary, require effort and time to be updated.
Indeed, with the above-mentioned approach, any rigorous change to a software application will cause the restart of the three macro-phases cycle, with the related issues; on top of this, the cycle itself includes some fixed, embedded time and cost consumption, irrespective of the scope and complexity of the activities.
The result is that customers are generally using software applications that do not fit their requirements, and the most common justifications for that are: a new updated version is not ready yet, or an updated version would cost too much.
But in the race between the real world and software applications, the first is always going to win by a very large margin if we continue to use, for all software, exclusively the high-resource-consuming three macro-phases approach mentioned above.
vs. process-is-software BPM approach
In a few words, we are talking about the widespread need for Business Agility (BA), underlining that this feature is always helpful when supported, not essential for every piece of software, but absolutely needed for software in charge of guiding and orchestrating the activities of people and systems.
In other words, the need for moving or changing the (functional) islands is very rare, while the business continuously requires modifying and building new bridges to link those islands differently.
Along with Business Agility comes the important concept of Abstraction Level (AL), measured in terms of how distant the approach, concepts and application design tools are from the business: the higher the abstraction level, the closer to the business.
As business people's contribution is essential for business management, and business management is largely (if not totally) based on software, why shouldn't all possible efforts be made to give business people a shortcut to the activities of defining and updating the software they will use to manage the business?
The key to better understanding how important those two concepts may be is the relation that links them together. Using a high abstraction level approach and tools to build a software solution allows it to be changed in line with real-world evolutions. The higher the Abstraction Level, the higher the Business Agility. This is true for the first definition and delivery of a software solution, but even more important in terms of maintenance, where the ability to change is crucial.
Let us define Business Case Coverage (BCC) as the set of business cases covered and managed by software, and verify what the relation between business case coverage and abstraction level is.
When we consider the class of traditional/transactional software built according to the three macro-phases cycle mentioned above, it is clear that by increasing the business case coverage the abstraction level decreases, approaching zero. This is related to the structure of the adopted paradigm, which essentially ties the organization up with technicalities of limited or no added value for the business.
On the other hand, when we consider the class of software solutions built using BPM tools, increasing the business case coverage will again cause the abstraction level to decrease, but with substantially different absolute values, and the abstraction level always stays above a minimum X.
Picture 7: abstraction level / business case coverage
This minimum level X may be defined as the BPM maturity of a tool, increasing
day after day as a consequence of theoretical and technological BPM evolution.
Higher levels of abstraction and the process-is-software approach translate into
easier ways to involve users and managers and let them directly contribute to the
definition and maintenance of orchestration rules.
Transparent, solid, elastic: BPM as a backbone for organization activities
Vertical/traditional/transactional software is still very important and is by no means supposed to be replaced, but the fundamental role of business orchestration has been, or will soon be, taken over by BPM.
Of course, the BPM suites available today have very different maturity levels, and a few of them only partially allow us to support a process-is-software approach within a complex BPM project. Nevertheless, the direction is clearly defined, and both vendors and customers have to be ready to follow this path, switching from functional silo thinking to a process vision.

Spotlight

BPM and Workflow in Healthcare


The Chester County Hospital:
Case Study
Ray Hess, The Chester County Hospital, USA
EXECUTIVE SUMMARY
The healthcare industry has been slower to adopt Business Process Management
(BPM) than other industries. However, The Chester County Hospital (CCH) has
distinguished itself by not only implementing workflow management technology
in a healthcare setting, but by customizing and supplementing that technology
with its own home-grown applications. The result is a workflow system that inte-
grates clinical, operational and financial processes to support patient-centered
care. In addition to meeting the primary goal of providing safer, more efficient care
to patients, BPM has enabled CCH to improve working conditions for employees,
dramatically increase productivity, achieve higher levels of cost optimization, and
become a competitive force to reckon with in the local healthcare community.
Note: This case study was originally submitted to the 2006 Global Excel-
lence Awards for BPM and Workflow, where it won the prestigious Gold
Award for North America.
OVERVIEW
While other industries have embraced BPM, healthcare has been slow to use BPM
to reengineer its processes, in large part because the very nature of the industry
has not lent itself to such an endeavor. Healthcare is not a finite, stationary in-
dustry with clearly defined, static procedures. The constantly changing variables
involved in patient care, along with the mobility involved in the administration of
that care, make it much more challenging to apply workflow procedures. Ironi-
cally, the need for integrated workflow management is probably more vital in
healthcare than in virtually any other industry, because in few other industries
can such processes mean the difference between life and death as they can in a
healthcare setting.
The Chester County Hospital is acutely aware of the need for workflow technology
in healthcare, and has made its mark in southeastern Pennsylvania by achieving
all-around excellent results in implementing BPM. Located in West Chester, Pa.,
with roots dating back to 1892, CCH is a provider of a full network of healthcare
services including a 234-bed not-for-profit acute care hospital; home care; and
many other ancillary services.
Faced with the challenges of increasing patient safety, efficiency and cost-
effectiveness, and meeting the demands of increasingly sophisticated and knowl-
edgeable healthcare consumers, CCH discovered that BPM was one of the keys to
the survival of a healthcare system in the 21st century.
CCH implemented what would become the foundation for its workflow manage-
ment portfolio: the Soarian health information solution from Siemens Medical
Solutions, which integrates the TIBCO iProcess Engine into its core functionality.
Soarian combines clinical, financial, diagnostic and administrative processes
across a healthcare enterprise. CCH's IT staff took Soarian a step further by creating their own customized applications to add on to Soarian to meet the specific,
unique needs of their facility. The result came to be known at CCH as 9ADMIT, a
workflow system that integrates patient care with Web platforms, legacy systems,
and telephone and paging systems. 9ADMIT can be thought of as a vestibule with
various rooms branching off of it; the system launches at Admission and then
takes the data that comes in through the Soarian system and routes it to different
workflows based on predetermined criteria. Once launched, the system follows
the patient through his or her stay, constantly listening for data, displaying that
data for the people who need it, and ensuring that the next steps in the process
are taken. Like a guardian angel of sorts, the system always knows where the pa-
tient is in the continuum, where he or she should be next, and what steps should
be taken for his or her care.
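The routing step at the heart of this description can be pictured with a short sketch; the criteria and workflow names below are invented for illustration and are not CCH's actual rules.

# Match an admission record against predetermined criteria and pick the workflows to launch.
def route_admission(patient):
    workflows = ["bed_management"]                 # every admission is tracked for beds
    if patient.get("mrsa_history"):                # example infection-control criterion
        workflows.append("infection_control")
    if patient.get("admitting_unit") == "Telemetry":
        workflows.append("cardiac_monitoring")     # hypothetical unit-specific workflow
    return workflows

admission = {"name": "J. Doe", "admitting_unit": "Telemetry", "mrsa_history": True}
print(route_admission(admission))
# ['bed_management', 'infection_control', 'cardiac_monitoring']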
Two workflows in particular have yielded enormously valuable results for CCH:
Bed Management and Infection Control.
KEY BUSINESS MOTIVATIONS
The primary motivation for adopting workflow management at CCH has been as
basic as the very essence of what drives just about every individual who enters
the medical field: the desire to make people well, and to do so in the most safe
and effective way possible. In addition, other motivating factors have been chal-
lenges that CCH, along with the healthcare industry as a whole, has faced in re-
cent years: rapidly escalating costs, aging patient populations and therefore more
chronic diseases, human resource shortages and increasingly complicated work
environments. Also, patient expectations have dramatically risen over the past
several decades, compelling healthcare providers to search for ways of providing
increased customer service. As the Baby Boomer population is aging and conse-
quently consuming more healthcare, they are demanding quality and conven-
ience that their parents and grandparents would not have thought to expect.
Moreover, the Internet has provided individuals with the ability to access health-
care information on their own, and has created savvy healthcare consumers who
are prepared to take their business to the provider who will give them the highest
level of service and the greatest probability of recovery.
CCH has discovered that the key to meeting these challenges is integrated work-
flow management: the process of seamlessly moving patients, information and
resources throughout the healthcare continuum in a way that was never before
possible.
KEY INNOVATIONS
At its most basic level, workflow management takes the vast amount of scattered,
multisource data that is available in a healthcare system and moves that informa-
tion throughout the enterprise so that key clinical and administrative information
can be shared, interpreted and analyzed. The result has been an enormously
positive impact on both clinical and business outcomes. For each user at CCH (physician, nurse, executive, etc.), the workflow-engineered IT system provides the right information for the task at hand. Rather than having to use a string of systems with clumsy interfaces and multilevel trees and menus, the single, uniform system brings together relevant information, orders and documentation in a meaningful way, from the moment a patient is admitted. It gathers the patient's lab and diagnostic results, vital signs, documentation and orders, and organizes them in a way that is most logical for the patient's condition. In a sense, the system watches over the patient throughout his or her stay, constantly capturing data to guide the patient's care from one step to the next.
While CCH has applied a variety of workflows to its processes, and continues to
develop new ones, two workflows in particular have reaped important positive
outcomes: Bed Management and Infection Control.
Bed Management: Impact on business
The availability of beds, or the lack thereof, is one of the greatest sources of bottlenecks in a hospital environment. Emergency departments and recovery rooms are full of patients who are simply waiting for beds. Ultimately, these delays have the potential to increase the length of patients' hospital stays and require staff to work over and beyond their normal hours. CCH has used Bed Management workflows to reduce the turnaround time from the moment a patient leaves a bed to the moment the next patient can be admitted to that bed. The hospital's Bed Management system follows patients throughout their stays and alerts key departments, such as Nursing and Environmental Services, regarding transfers and discharges. By reducing wait times for beds, CCH is getting patients to the proper care setting that their conditions require, and providing more satisfying in-patient experiences for them.
Bed Management: Impact on Process
The manual handoff of tasks from person to person is a process that is suscepti-
ble to human error. Prior to the use of BPM, the Bed Management process was an
extremely complex and inefficient process, encumbered by a host of these manual
handoffs and tasks. A unit secretary would need to enter transfer or discharge
information into the system and then begin a manual process of documentation
and phone calls, notifying several departments and individuals that a patient had
vacated the room and the bed needed to be cleaned. Once the bed was cleaned
and ready for a new patient, another series of phone calls needed to be made. Ad-
ditionally, both the unit secretary and housekeeping staff utilized several docu-
ments to track the response and steps in the process. This cumbersome system
often required nurses to perform many extra steps that took them away from
their primary duty of caring for patients.
CCH's Bed Management system has automated many of the manual processes involved in Bed Management and, as a result, the hospital has experienced a 50 percent reduction in the manual processes involved. The improved workflow process begins with the same step as the unautomated process: the unit secretary enters the discharge or transfer order into the system. But that's where the similarity ends. From that point on, the subsequent steps, notifications, monitoring and progress of the workflow are totally automated, ending with the housekeeper entering a numeric code into a telephone, signifying that the bed is cleaned and ready. This has also freed housekeepers from having to respond to overhead pages and find phones to obtain work assignment details, and has diminished nurse involvement in the bed cleaning workflow, allowing them to focus on the care of their patients.
When the bed is emptied, the workflow process automatically starts multiple activities. The Nursing Supervisor is called via a text-to-speech alert to her IP telephone and asked to give the cleaning a priority level: 1 = 45 minutes, 2 = 30 minutes, 3 = 15 minutes. The system updates a Web-based electronic Bedboard so everyone knows the exact status of the hospital's bed situation. It also accesses reference tables to see which housekeeper to contact based on the floor, day of the week, and shift. The appropriate housekeeper is alerted of the empty bed and its priority via a text pager. The housekeeper indicates that the cleaning has started by dialing a specific number from the bedside phone and pressing the numeral 1, and indicates that the bed cleaning is complete by dialing the same number and pressing 2. The workflow automatically contacts the nursing supervisor and updates the Bedboard when the room is ready. If the bed cleaning is not started within the specified time, the housekeeper is re-paged, along with the Housekeeping supervisor. Both the Nursing and Housekeeping supervisors use wireless tablets to have up-to-the-minute status information to manage their people, and to interact with the workflow engine.
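To make the flow of these steps concrete, the sketch below models the dispatch logic in a simplified form. It is illustrative only: the priority levels come from the description above, but the function names, the assignment table and the callback interfaces are assumptions, not the hospital's Soarian-based implementation.

```python
# Simplified sketch of the bed-cleaning dispatch logic described above.
# Function names, the assignment table and the callbacks are illustrative
# assumptions; the actual logic runs inside the hospital's BPM engine.

PRIORITY_MINUTES = {1: 45, 2: 30, 3: 15}  # priority level -> cleaning deadline in minutes

# Hypothetical reference table: (floor, weekday, shift) -> housekeeper pager id
ASSIGNMENTS = {
    ("3-West", "Mon", "day"): "pager-114",
    ("3-West", "Mon", "evening"): "pager-207",
}

def on_bed_vacated(bed, floor, weekday, shift,
                   ask_priority, update_bedboard, send_page):
    """Triggered when a discharge or transfer order empties a bed."""
    priority = ask_priority(bed)                  # text-to-speech call to the Nursing Supervisor
    deadline_minutes = PRIORITY_MINUTES[priority]
    update_bedboard(bed, status="awaiting cleaning", priority=priority)
    housekeeper = ASSIGNMENTS.get((floor, weekday, shift))
    if housekeeper is not None:
        send_page(housekeeper,
                  f"Bed {bed} empty, priority {priority}: clean within {deadline_minutes} min")
    return deadline_minutes
```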
The Hospital has also built safety checks and an alert mechanism into its Bed
Management workflow in the form of Business Activity Monitoring (BAM). As al-
ready mentioned, an escalation path is created so that if a bed is not cleaned in a
predetermined amount of time, an alert is sent out at certain time intervals to
keep the process moving. Furthermore, the system automatically notifies the IT
staff of an issue if the pages do not go out within 15 minutes. Every day the BAM
system generates an activity report for the previous day and sends it to the direc-
tor of housekeeping, the vice president of support services and shift supervisors,
who can review the reports and determine if there are bottlenecks in the system
and who or what is causing them.
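A minimal sketch of this escalation idea follows. The 15-minute paging check and the recipients are taken from the description above, while the task record layout and the notification callbacks are hypothetical.

```python
# Minimal sketch of the BAM-style escalation described above. The task record
# layout and the callbacks are hypothetical; only the timing rules follow the text.
from datetime import timedelta

def check_cleaning_task(task, now, send_page, notify_it):
    """Re-page on an overdue cleaning and alert IT if paging itself appears broken."""
    if not task["started"] and now > task["deadline"]:
        send_page(task["housekeeper"], f"Reminder: bed {task['bed']} cleaning overdue")
        send_page(task["housekeeping_supervisor"], f"Escalation: bed {task['bed']} overdue")
    if task["page_sent_at"] is not None and not task["page_confirmed"] \
            and now - task["page_sent_at"] > timedelta(minutes=15):
        notify_it(f"Paging for bed {task['bed']} not confirmed within 15 minutes")
```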
CCH brought in an external consultant to conduct a pre- and post-workflow time and motion study of the bed cleaning process. The consultant chose to focus on the key business issue of bed availability on the hospital unit that had the biggest bottleneck, Telemetry, during the peak discharge times between 2 and 8 p.m. His study showed that the workflow automation changes resulted in six beds being available to receive patients an average of two hours earlier than in the pre-workflow process.
The following charts display before and after scenarios of CCH's Bed Management workflow processes. Notice the free-floating boxes on the pre-workflow diagram; they represent manual steps that were not connected to the rest of the process.
The following graphic shows the current main workflow screen for bed management.
Main Workflow Screen for Bed Management
Bed Management: Impact on Technology
The original bed cleaning process required human interaction at every step. It included work lists placed on a clipboard and a nursing census list that was crossed out and edited throughout the day.
made to the housekeeper via an in-house paging system to a numeric pager. This
method required someone to manually call the system and insert data for paging.
The housekeeper would then need to return the call for information. A work order
system created in Microsoft Access was also used. This system would print out a
requisition to a designated printer in the housekeeping office, prompting either
someone in the housekeeping office or the hospital operator to page the house-
keeper again. The nursing supervisor used an IP phone and the housekeeping
supervisor used a walkie-talkie. In all cases, information was given or obtained via
human-initiated contact.
The BPM-automated system utilizes much of the same communication technology, but now the steps are automated. Several technology changes were made to allow this to occur. Numeric pagers were replaced with text pagers, an Intel Dialogic analog telephonic card was purchased, and a Web UI was created to report the status of all beds within the institution. The telephonic card was used to automate pages to the existing paging system, to send text-to-speech messages to individuals via the telephone, and to receive feedback via the phone's keys. The Access database was removed from the process since the metrics were captured directly within the BPM process. The walkie-talkie was discontinued and wireless tablets were given to the nursing and housekeeping supervisors to view the Bedboard Web UI. This UI was made available to all appropriate personnel and is continually updated by the BPM system. In this way, the process discussed above was automated and controlled, reducing variation and the potential for human error. Key to the success of this endeavor was the use of existing technology (phones and pagers) while removing manual tasks from the staff's workload.
Infection Control: Impact on Business
It is a well-known fact that the longer a patient stays in the hospital, the greater his or her risk of hospital-acquired infection. This is particularly significant in facilities like CCH, where 85 percent of the rooms are semi-private. In such an environment, patients are vulnerable to infections potentially carried by their roommates. Hospital rooms are full of objects that provide breeding grounds for bacteria, from telephones to food trays, and from toilets to flower arrangements. Therefore, it is of utmost importance, and in some cases can even mean the difference between life and death, for a hospital to identify as early as possible any patients who are carrying infections that require isolation, and to communicate that fact to all the staff who will be involved in those patients' care. This early identification of patients with isolation-requiring infections not only protects other hospital patients and the hospital staff from contracting infection, but also can reduce the length of stay for the infected patient. Obviously, the sooner the patient is placed in isolation and receives the proper antibiotics, the sooner he or she will recover.
There are two major infections that, once contracted, cause a patient to be considered a carrier for life: Methicillin-resistant Staphylococcus aureus (MRSA) and Vancomycin-resistant Enterococcus (VRE). It is imperative that patients who carry these organisms are properly isolated until their current contagiousness can be determined. If either of these organisms is spread to another person, that person is infected for life. Furthermore, since hospital patients are invariably in a compromised state, any newly acquired infection could cause serious complications or even death. It is tragic when the hospital inadvertently infects a patient. The central reason for creating the isolation workflow was to prevent this from happening.
It is also important to identify patients who, although previously active with a contagious infection, no longer require isolation. Putting a patient in isolation unnecessarily leaves already-scarce isolation rooms unavailable to patients who really need them and causes bottlenecks in bed availability. In addition, due to the extra supplies needed (gowns, masks and gloves) and the additional time involved for staff members to gown up before entering isolation rooms, the hospital incurs extra costs for keeping a patient in isolation: approximately $100 per day.
Infection Control: Impact on Process
Before CCH implemented an Infection Control workflow, the identification of pa-
tients needing isolation was a hit-or-miss, ineffective process. Admission assess-
ments were done after the patient was placed in a bed. At that point, if the as-
sessment revealed that the patient was a carrier of a contagious infection, the pa-
tient had to be moved into isolation, after already having exposed a roommate
and the staff to the infection. This also involved having to reclean the contami-
nated bed, causing more bottlenecks in bed flow. Moreover, the staff that would
care for the patient in isolation was caught unprepared, and in some cases was
not fully stocked with the proper isolation supplies.
On the flip side, the old IT system would flag cases where cultures were positive,
but not those that were negative, since theoretically negative results were not a
reason for action. In the case of Infection Control, however, negative results are
just as important as positive ones, since knowing that a patient does not need
isolation is just as important as knowing that a patient does, for reasons already
outlined above.
To determine its effectiveness at Infection Control, the IT staff at CCH conducted a
study on patients entering the hospital with MRSA. Four percent of adult patients
entering the hospital have a history of MRSA, and they account for 8 percent of
patient days. In a study of 30 patients over a four-week period, approximately 25
percent of MRSA patients with prior positive results who should have gone into
isolation upon admission were missed at initial bed placement. This necessitated
an immediate transfer to isolation status. If they were missed altogether, this
would place other patients and hospital workers at risk for the infection.
Now that CCH has implemented an Infection Control workflow, when a patient comes in with a history of a contagious infection, the system places an automated phone call to alert the nursing staff to take precautions, to perform a culture immediately to determine if the patient requires isolation, and to explain the situation to the patient's family members. The system then takes care of all the operational downstream work, such as notifying the Laundry Department to send extra isolation supplies to the patient's room.
The following charts display before and after scenarios of CCH's Infection Control workflow processes.
The graphic below shows the current main workflow screen for infection control.
Main Workflow Screen for Infection Control
Infection Control: Impact on Technology
The original Infection Control process utilized little technology and was heavily dependent on human action at almost every step. Prior to the implementation of the isolation workflow, a database flag was set in the patient registration system if the patient was identified with MRSA or VRE. However, this flag did not appear in any clinical computer screens and therefore clinical staff were unaware of the patient's status. Also, the Infection Control Department entered the flag manually on a periodic basis (often many weeks after the fact). This master list was kept on the Infection Control Department's computer, and was printed out and distributed to clinical staff. As a result, if a patient was newly diagnosed with the MRSA/VRE infection and subsequently readmitted prior to the publishing of the new list, his or her current status would not have been evident at either admission or during the initial review on the floor. Also, new positive lab values were printed to the floor's printer and were to be accompanied by a phone call from the Microbiology Department. Negative lab values were printed and needed to be interpreted by the nurse for potential relevance. In either case, values that needed to be acted upon necessitated a series of phone calls and work orders.
The BPM system was configured to automate these same tasks with little human intervention. The master list of infectious patients was placed in the SQL Server database, updated automatically by the BPM engine, and made available throughout the institution via a Web UI. Every patient being admitted is checked against the master list and, if a match is found, the system performs the necessary steps. A phone call is placed, using the above-mentioned telephonic interface, to the nursing supervisor. E-mail notification is sent to the Infection Control department via the Microsoft Exchange server. Clinical notifications show up in the Soarian clinical system and on the Bedboard UI noted above. Notifications to ancillary departments, such as Laundry for isolation gowns, occur automatically via printed media. Every new lab value is evaluated for relevance. New labs with critical values are acted upon with proper notifications and the master list is immediately updated. Again, a key technological aspect of this process's success has been the ability to coordinate existing technology with the BPM system and to remove many of the manual steps.
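As a rough illustration of the admission check just described, the sketch below compares each admission against the master list and, on a match, triggers the downstream notifications. The table layout, function names and callbacks are invented for the example; they are not CCH's actual schema or interfaces.

```python
# Illustrative sketch: each admission is compared against the master list of
# known MRSA/VRE carriers, and a match triggers the downstream notifications
# described above. All names are assumptions made for this example.
def on_admission(patient_id, query_master_list, call_nursing_supervisor,
                 email_infection_control, flag_clinical_systems, send_laundry_order):
    record = query_master_list(patient_id)              # lookup in the master-list database
    if record is None:
        return False                                    # no known carrier status, nothing to do
    call_nursing_supervisor(
        f"Isolation precautions required for patient {patient_id} ({record['organism']})")
    email_infection_control(patient_id, record)          # via the mail server
    flag_clinical_systems(patient_id, isolation=True)    # clinical system and Bedboard flags
    send_laundry_order(patient_id)                       # extra gowns, masks and gloves
    return True
```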
The Impact of Workflow on Users
In the case of both Bed Management and Infection Control, workflow has reduced
the number of manual steps within the process of identification, notification and
tracking. This in turn has reduced the time and effort of the end users, making
them more efficient and productive and allowing them more time for the care of
patients rather than spending time on the process.
Bed Management has reduced the number of manual steps by 50 percent. It provides an enterprise-wide, up-to-the-minute status for the care delivery team. It automatically manages nursing alerts, prioritization, and housekeeping staff assignments, and provides the supervisors with automated departmental management and comprehensive reporting of activity for analysis. Infection Control has assured that every patient with a known MRSA or VRE condition is managed and isolated properly. It assures that isolation bed placement is optimized by appropriately placing all patients in the proper status, but only for as long as they need to be there. Proper care is automatically directed, assuring efficient, effective and safer patient care.
The following chart indicates the volume of patients that have been impacted and the automated alerts that have been generated by the Bed Management and Infection Control workflows at CCH:

Number of patients monitored at any given time: 1,200-1,500
Number of beds cleaned (since October 28, 2004): 37,500+
Admissions checked for MRSA (since December 2004): 38,500+
Number of pages sent out: 181,000+
Number of phone calls: 77,500+
System Configuration
Most workflow examples that we have seen are predicated on end-user interaction with a computer terminal. The key to CCH's BPM success was less the choice of technology (which telephonic card, or .NET versus Java) than configuring the BPM process to interact with end users through the technology they already employed. This is most apparent when BPM is used to manage workflow for support departments that are never in front of a terminal but need to be able to receive assignments and provide updates to the system.
The chart below provides a description of CCH's BPM system.
HURDLES OVERCOME
Management hurdles
As CCH began to implement workflow processes, there was an immediate positive
buy-in from managers. There were, however, some challenges, the first of which
was to educate key members of the hospital on the capabilities and power of the
workflow management tools. The next challenge was to determine which proc-
esses to analyze and improve upon with workflow management, and to ensure
that management was committed to the initiative and able to align the appropri-
ate resources to analyze, design, test and monitor all workflows.
Business hurdles
One of the biggest hurdles that the CCH IT Department encountered in setting up an integrated workflow management system was achieving consensus on the proper protocols to follow, since there is huge variation in medicine in how to deal with certain illnesses. Another challenge was determining just how much to automate in the workflow process. Some processes would simply need to be refined, while others would require a complete redesign. It was very important for CCH to get buy-in from doctors for the workflow design, and to get doctors to trust the system enough to perform functions that had previously been performed by people. The IT staff needed to strike a balance between making processes simpler and more efficient for doctors and yet not automating processes so much that it took away or limited doctors' control.
Another challenge was setting up a workflow system in an environment that is not a stationary setting where workers sit at computers and perform most of their job functions at their desks or workstations. A healthcare setting is, by its very nature, highly mobile. Employees, from housekeepers to surgeons, do not work in one place, but rather depend on pagers, phones and laptops to carry out their duties. It was challenging to take traditional workflow systems and modify them so that they could be effective in a clinical setting, where workers are moving targets and the steps involved in performing their job duties vary from hour to hour and are as unique as the patients they serve.
Technology hurdles
In creating its workflow management system, CCH's greatest technological chal-
lenge was to build a whole portfolio of workflow modules to add on to the Soarian
workflow engine and make those modules work seamlessly together, rather than
building standalone pieces that were independent of each other. The next techno-
logical hurdle to overcome involved putting a monitoring system in place as a fail-
safe for the system. Again, in few other industries does the failure of a workflow
system have the potential to mean the difference between life and death, as it
does in a healthcare environment. Hospital staff relies on this system, not just
during normal business hours, but 24 hours a day, seven days a week, 365 days
a year. The IT staff was challenged with building an early warning system for
business interruption occurrences that would provide a level of vigilance far be-
yond anything that was already available out of the box.
As a result, at CCH there is a built-in escalation management system that goes into action the minute it detects a lock-up or failure in any of the many handoffs to legacy systems. Indeed, stability issues have rarely involved the BPM engine itself, but rather these legacy systems. As mentioned under Bed Management, a comprehensive monitoring system using a .NET Web UI, text paging, and e-mail notifications was created to monitor the database and system activity. This monitoring system resides on a separate server and continually pings the various processes and hardware for proper operation.
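A minimal sketch of such a watchdog follows, under the assumption that each dependency can be probed with a simple ping and that an alert callback reaches the on-call staff; the endpoint names, polling interval and callbacks are invented for illustration.

```python
# Hedged sketch of a separate-server watchdog in the spirit described above: it
# periodically probes the workflow database and the other moving parts, and
# raises a page/e-mail alert when something stops responding. All names invented.
import time

ENDPOINTS = ["bpm-engine", "bedboard-web-ui", "paging-gateway", "workflow-database"]

def watchdog(ping, alert, interval_seconds=60, rounds=None):
    """ping(endpoint) -> bool; alert(message) escalates to on-call staff."""
    count = 0
    while rounds is None or count < rounds:
        for endpoint in ENDPOINTS:
            if not ping(endpoint):
                alert(f"{endpoint} is not responding; escalating to on-call IT")
        time.sleep(interval_seconds)
        count += 1
```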
BENEFITS
Cost Optimization/Increased Revenues
In a healthcare setting, particularly a not-for-profit institution such as CCH, the
traditional business model of increasing revenues does not directly apply. Most
payers pay on a fixed-fee basis, either a certain amount per approved day of care
or per case, regardless of the care given. When it comes to financial matters, the
focus must be on optimizing cost so that there is a positive margin on each case.
So, rather than seeking a profit, hospitals need to focus on decreasing expenses
and then seek to increase the volume of patients. If efficiency is not achieved, the
hospital could experience a loss with each patient admission and increasing vol-
ume would actually hurt the institution. The constant goal is to provide excellent
care, to contribute to a population of healthier people, and to do so in a safe and
cost-effective manner. CCH has made great strides in meeting this goal through
its workflow management system.
As discussed earlier in this case study, the workflows that CCH has put into place have helped to ensure that the hospital is making the most of its operating dollars by eliminating waste and spending money where it is most needed.
The Bed Management workflow, for example, has eliminated 50 percent of the
manual steps involved in the bed cleaning process and increased the timeliness of
bed availability. This has decreased the costs associated with overtime hours,
lengthy hospital stays, and bottlenecks caused by a shortage of clean beds, which
often requires diverting patients to other hospitals. The Infection Control workflow
has reduced the extra costs that are incurred from hospital-acquired infections, such as lengthier stays and potential lawsuits. It has also saved money by eliminating the unnecessary use of isolation rooms and isolation supplies, which, as stated earlier, costs the hospital an additional $100 a day per isolated patient.
The bottom line is that workflow technology has allowed CCH to provide better,
safer care to its patients, and thus meet or exceed national regulatory require-
ments for obtaining the maximum reimbursement from insurance providers.
Productivity Improvements
CCH's Bed Management workflow system has automated many of the manual processes involved in Bed Management. All the steps, notifications, monitoring and progress of the workflow are totally automated, with only the housekeeper needing to enter a code into a telephone to indicate that the bed is cleaned and ready. This has freed housekeepers from having to respond to overhead pages and find phones to respond to requests, and has diminished nurse involvement in the bed cleaning workflow, allowing them to focus on the care of their patients. It also has diminished the bottlenecks that occur due to a shortage of clean beds, and has reduced patients' wait times to get into a room and begin receiving the proper care.
The Hospital's Infection Control workflow has greatly improved productivity by providing staff with an early warning system to identify patients who do or do not need isolation rooms. The system manages tables that identify which patients have MRSA and VRE, noting the site of initial and last infection, valuable information from a clinical perspective. The ability to immediately identify patients who require isolation allows staff to save time through advance preparation. For example, the Laundry Department is notified to send extra isolation supplies to the patient's room. The ability to identify patients who no longer require isolation has decreased the incidence of using isolation rooms and supplies unnecessarily, saving time and money.
COMPETITIVE ADVANTAGES
The implementation of BPM at CCH has created competitive advantages for the hospital both as an employer and as a healthcare provider. By becoming a hospital of distinction that has set itself apart from its competitors, CCH has been able to attract the best and the brightest staff, and has become the hospital of choice for the surrounding community.
In light of the current shortage of healthcare workers, the ability to attract quality
employees is critical. In general, those who select healthcare as a profession tend
to be individuals who are driven to help others. When their jobs are weighed down
with non-clinical duties, they tend to become frustrated and dissatisfied. By put-
ting efficient workflows in place, CCH has been able to provide employees with a
work environment that allows them to focus on what they enjoy the most: patient
care. Demonstrating its workflow efficiency at nursing recruitment venues has
given the hospital a definite advantage in attracting prospective nursing employ-
ees. Likewise, young physicians who have been educated in a time when workflow
has become increasingly important are seeking to affiliate themselves with hospi-
tals that have streamlined processes in place that make it easier for them to prac-
tice medicine. Thus CCH has increased its ability to attract physicians to its staff.
Workflow management has also provided CCH with a competitive advantage in attracting patients. As discussed earlier, today's healthcare consumers, particularly Baby Boomers, are much more demanding of quality care, and much more informed about healthcare issues. Publicly reported data on hospitals is available with the click of a computer mouse, and consumers do their homework in deciding where to go for healthcare services. CCH believes that BPM will help to decrease not just infection rates but, at the end of the day, mortality rates as well. Thus, incorporating BPM into its strategy to increase patient safety has definitely improved CCH's ability to attract patients.
BPM has set CCH apart from its competitors, attracting employees and patients alike. In so doing, the hospital has moved the competitive goalposts and created a healthy drive for other hospitals to stay in the game by following suit with efficient workflow management processes.
PLANS TO SUSTAIN COMPETITIVE ADVANTAGE
CCH continues to develop its BPM workflows. Since the healthcare industry continues to change, CCH has purposely designed its workflow management system with evolution in mind. The IT staff plans to add pieces to the system so that ultimately there will be workflow procedures in place not only to manage business processes but also to manage evidence-based medicine and clinical care wherever possible. The hospital has already added dietary, diabetes, Congestive Heart Failure, admission assessment, outpatient, microbiology results, automated nursing notes, automated discharge instructions, and smoking cessation education workflows. It is currently working on drug management, heart attack, pneumonia, sepsis, radiology test preparation, and Emergency Department workflows. CCH strives for the creation of workflow-enabled patient care processes that cover all aspects of care.
Business Process Management in
Pharmaceutical R&D
Dr. Kai A. Simon, ALTANA Pharma AG
a Nycomed company, Germany
Abstract. Although highly profitable, the pharmaceutical industry has been facing increasing development costs, price pressure, and regulatory requirements. In this context, many companies have embarked on BPM initiatives to manage efficiency and compliance. This article provides an introduction to BPM in the pharmaceutical industry on the basis of a short case study in clinical Research and Development (R&D).
INTRODUCTION
In today's economy, corporations are more than ever looking for ways to improve efficiency and effectiveness, and business processes have, once again, made it into the spotlight and become a major target for improvement initiatives. Business Process Management, Business Activity Monitoring and Workflow, enabled by Web Services and Service-Oriented Architectures, are popular buzzwords that have found their way into analyst reports, consulting companies' marketing, software vendors' product brochures, and finally also management presentations.
But, if business processes seem to be en vogue again, many are already asking: "What's new there? Haven't concepts like Business Process Reengineering been around for years?" Indeed, business processes are hardly a new idea and, even though the term was coined as late as 1990, the idea of business processes and business process reengineering (BPR) has a pedigree in various disciplines, such as marketing and administrative science (Simon, 1994; Simon, 2003).
So, if business process improvement as such is no real news, its implementation with Business Process Management is what executives, business analysts, and software engineers should have a close look at. The idea of improved business processes has been around for years, but a proper way to implement such an idea beyond one-time BPR initiatives has been lacking, leading to some of the major setbacks in the history of management theories, with large business process reengineering projects failing to deliver substantial long-term results.
Establishing Business Process Management is thus neither a rebirth of the business process concept, nor Business Process Reengineering rising from the grave, but the art and practice of applying a viable long-term methodology to reaping the benefits of a process-oriented approach to business management.
SOME CONCEPTUAL ASPECTS ON BPM
When trying to define Business Process Management, it is easy to get confused, since the term is used for both a concept and a corresponding technology solution. Conceptually, it is about establishing and implementing goals and methods for process improvement. Processes can be seen as non-tangible organizational assets, much like an individual's knowledge and information, and, well managed, they will pay off in terms of organizational performance and results. Processes, moreover, are somewhat special in that they provide a structure that synchronizes other assets that are used in value creation and support activities. They can thus be considered as the conceptual framework, describing means and ends, for the other components that are required to run the organization. With that said, BPM can be defined in the following way:
Business Process Management is itself a process that ensures continued improvement in an organization's performance. It is thus the meta-process that defines the framework and provides the tools for driving and improving performance in business processes.
Considering BPM itself as a process means that the same guiding principles that are used for other business processes also apply to the process management process. At times, this means taking a radical-change perspective, resulting in the fundamental tenets of the process being reconsidered and overhauled. At other times, processes might undergo a cycle of continuous improvement with minor adjustments. However, at all times, the wider context and fit with other processes should be taken into consideration.
When looking at the core elements of Business Process Management, many people might have a sudden déjà-vu experience. Managing and improving processes and supporting them by means of workflow technology across functional and organizational borders sounds suspiciously close to the concept of workflow-enabled business process reengineering of the mid-1990s. The answer is yes and no. Process management theory has become a mature field and many organizations have conducted process redesign over the past years.
However, it is still safe to claim that many organizations lack a comprehensive process management concept. In the 1990s, the theory was that if the business was reengineered, it would be more effective. In the 2000s, it is about understanding that the process must be continually tended to, that processes need to change as business changes. Both the process and the management of the process need to be more flexible and adaptable than could be achieved through one-time reengineering efforts.
As a technology, Business Process Management refers to the software that supports the concept by automating, managing and measuring the workflow within and across organizations. A BPM technology environment typically comprises an array of interconnected components that serve different purposes: a workflow management engine to handle the process logic, an integration layer to allow data exchange across applications, a portal to enable unified access and a common interface, and a content and document management system to store and retrieve business documents.
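To make the division of labor among these components concrete, here is a schematic sketch of how they might cooperate on a single process step; the class and method names are invented for illustration and do not describe any particular product.

```python
# Schematic sketch only: how the four component types named above might cooperate
# for one process step. Class and method names are invented, not vendor-specific.
class BPMEnvironment:
    def __init__(self, engine, integration, portal, documents):
        self.engine = engine            # workflow management engine: process logic
        self.integration = integration  # integration layer: cross-application data exchange
        self.portal = portal            # portal: unified access and interface
        self.documents = documents      # content/document management: business documents

    def complete_step(self, case_id, user, form_data, attachment=None):
        if attachment is not None:
            self.documents.archive(case_id, attachment)
        self.integration.push("downstream-system", {"case": case_id, **form_data})
        next_task = self.engine.advance(case_id, form_data)
        self.portal.refresh_worklist(user)
        return next_task
```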
It is often suggested that Business Process Management and information tech-
nology are inseparable in the sense that BPM inevitably requires the implementa-
tion of a workflow or process execution system. This is true to the extent that IT
support for process execution contains a considerable potential for improvement
with regard to time, cost and quality. Nevertheless, the bedrock of BPM is not
technology, but the relationship between business and IT. In order to realize the
full potential of BPM, business and IT must become equal partners. Business
owns its processes and is responsible for identifying improvement potentials,
whereas IT provides the necessary platform, tools and technological integration.
More recently, software vendors have proposed the concept of service-oriented architectures (SOA) as a technology-based vehicle for process improvement. While this idea is conceptually appealing to CIOs because of its apparently simple structure, and to CEOs because of the alleged opportunity to reduce IT costs, the actual migration to a service-oriented infrastructure and application landscape is hardly trivial.
SOA can provide architectural guidance for IT implementation but will not help to
solve inconsistencies in the business domain. In other words, there is nothing in
the SOA concept that will prevent a company from implementing perfect IT sup-
port for bad processes.
BPM DRIVERS AND ISSUES IN THE PHARMA-INDUSTRY
Despite a doubling of sales since 1996, many pharmaceutical companies fail to deliver a long-term growth perspective, and the recent wave of large-scale mergers and acquisitions is a clear indicator of that development. A recent CMR report (CMR 2006) indicates some of the factors contributing to this development. Over the past 10 years, R&D expenditures have increased 60 percent and development times went up 20 percent, while new molecular entity output has decreased to 80 percent of its 1996 level. At the same time, the number of clinical trials per New Drug Application has more than doubled and the number of patients in each study almost tripled over the past 30 years. It is also becoming more obvious that regulatory bodies are increasingly reluctant to approve so-called me-too drugs, i.e. products that are variations of already existing chemical entities and mainly aim at cutting a piece of the cake in profitable and often already highly populated disease areas.
At the same time, the Pharma industry is surrounded by a considerable number of regulatory bodies that demand compliance within multiple areas, such as GxP [1]. These regulations not only guide process content, but also require processes and their supporting IT systems to be developed, implemented and changed in a validated and auditable manner. Companies quoted on US stock exchanges also have to comply with Sarbanes-Oxley Act (SOX) regulations, which require the establishment of an extensive control and risk management framework.
These factors alone are strong arguments for applying a more comprehensive approach to managing processes and their performance. In some companies, there is additional urgency brought on by expiring patents and a need to boost efficiency in drug development. As a consequence, many pharmaceutical companies invest considerable time and effort in process management and improvement initiatives, often targeting research and development (R&D) as a core element.
Pharmaceutical R&D has traditionally been viewed as a funnel. A wide range of new chemical entities, and subsequently drug candidates and research choices, are narrowed down over time until a small number of new products make it through the clinical development phase and finally to the market. The funnel is typically depicted in a linear way, and discovery results or in-licensed early-stage projects maintain a smooth and regular flow of compounds through the funnel. R&D management thus meant ensuring this steady inflow and shortening the funnel by means of more efficient R&D processes and the use of technology.
Technology-based improvements include the deployment of high-throughput screening in drug discovery, but also the use of Remote Data Capture [2] for collecting patient data in clinical studies and, more recently, the analysis of study data in clinical data warehouses and the electronic submission of study results to the regulatory authorities.

[1] The term GxP is a generalization of quality guidelines, mainly used in the pharmaceutical industry. Most frequently, it refers to Good Laboratory Practice (GLP), Good Clinical Practice (GCP) and Good Manufacturing Practice (GMP).
[2] There is an array of terms for mainly internet-based data collection: Remote Data Capture (RDC), Electronic Data Capture (EDC) and electronic Case Report Form (eCRF).
However, while there have been successful projects in each of these domains, the integration of business processes in R&D across main process borders is still little more than a vision. Processes are often considered within a functional domain and the corresponding IT systems are functionally deployed best-of-breed solutions.
Assuming that it will become more important to manage the entire R&D funnel as
a single entity, rather than a set of loosely coupled steps, individual processes and
tools that support only a part of the R&D cycle will provide only a partially effec-
tive answer. Managing the flow of information and decision points up- and down-
stream in the R&D cycle requires that the entire process and its information are
integrated.
MINI CASE: PHARMACEUTICAL R&D
Within ALTANA Pharma (now Nycomed), process management became part of the corporate agenda in 2002, when the participants in an IT strategy meeting put it on the mid-term IT strategy roadmap. In 2004, a process-modeling tool (the ARIS toolset) was implemented in central IT for corporate use and the first process modeling exercises took place at project level, mainly performed by external consultants. In 2005, the CIO decided to set up a process management team as part of the IT strategy and architecture department within the Corporate IT office, in order to create the critical mass required to establish BPM at larger scale and carry it into the business units. Staffed with four externally recruited senior experts, the team was assigned the task of developing and establishing a corporate BPM framework and delivering BPM expertise into ongoing and future business and IT projects.
At the time, several large-scale business projects were also underway, involving the replacement of various systems, but also the redesign of the business processes they should support. These initiatives included a recently launched major program in clinical development, consisting of four projects and covering a considerable part of the clinical development chain, from study management and electronic data capture (EDC) to clinical data management and analysis, as well as drug safety. As the initiative was considered highly important for ensuring the future clinical development capabilities of the company, a process manager from the BPM team was assigned full-time to the program, together with an IT project manager and an enterprise IT architect.
A review of the existing process documentation in the organization, performed by the BPM team, had revealed that high-level process maps that could be used as a starting point for process analysis within the program did not exist. At the same time, a plethora of process documentation residing in a variety of systems and on paper was found, including paper-based documentation of guidelines and standard operating procedures (SOPs), SOX-relevant processes in ARIS databases, PowerPoint and Visio files, but also some R&D process models that had been developed prior to the program with a very high level of granularity.
[Figure: Multi-level process map. An enterprise-level map positions Research & Development (Discovery Research, Pre-clinical Research & Development, Clinical Development, Regulatory Submission, Drug Safety) alongside commercial management, supply chain and support processes and their external partners. Clinical Development is expanded at Level 1 into Clinical Study Phases I-IV and related processes (Regulatory Documentation, Toxicological Development, Substance and Formulation Development, Clinical Study Supplies); a clinical study is expanded at Level 2 into Study Authorization, Study Planning, Study Set-up, Study Execution, Study Closing and Study Analysis; and detailed models of study initiation and authorization illustrate the synthesis of low-level processes.]
The BPM team chose to approach the process design work stream by rapidly developing a top-level map and driving the top-down design of R&D processes in a way that would allow the process design teams in the different projects to attach their detailed process maps to these high-level value chains. Simultaneously, the already existing process models were synthesized into the high-level map structure.
A separate work stream was set up at program level to cover interface-related issues at process and system level, which were considered crucial for ensuring consistency across the projects and for actually being able to create a process model across the entire clinical domain. Given the fact that study data management and analysis consisted of more than 100 detailed process models, considerable effort had to be put into this part. However, being able to depict and analyze the information flow along the entire clinical development process had not been possible before and was considered one major benefit of the BPM approach used in the program. Especially in areas where little experience existed within the organization, specifically EDC and the use of data warehouse technology for clinical data analysis, the process design work enabled a much deeper understanding of the potentials and caveats of these solutions.
In general, the process analysis and visual representation of as-is processes also
served as an eye-opener and fueled the discussions on process improvement po-
tentials that were then becoming part of the to-be process maps. In addition, the
process documentation was used to derive business requirements for the new IT
systems to be put in place. Instead of brainstorming sessions, to-be process visu-
alizations were used to identify functional requirements and these visuals were
also provided to potential vendors to enable them to provide more informed re-
sponses during the vendor selection.
Another important role was played by the scenario development and analysis that were performed to depict the changing process and system landscape over time. Since the overall change program had a time scope of three years and was built on a successive replacement of processes and systems, rather than a big-bang approach, it became crucial to understand the implications of replacing individual parts of the landscape while maintaining overall organizational and technical operability. For this purpose, a set of scenarios reflecting the state of the landscape at critical points was developed, which also covered the required process interfaces and data transfer points. Each of the identified process interfaces that required a hand-over of data was then analyzed and, if required, specified in more detail.
LESSONS AND RECOMMENDATIONS
BPM organization
It is frequently discussed whether process management is a task for a specialized organizational unit, or something that must become part of every employee's daily work. And, if there is a special group for BPM, where should it be located: in IT, as a staff function in finance or another business unit, or as a direct report to the executive board? There are good reasons for any of these options and there is no one best way to organize. However, in order to establish process management, a clear mandate is required that allows the BPM team to become an integral part of all process-related activities in the organization.
When establishing a corporate BPM initiative in a pharmaceutical or otherwise regulated environment, the documentation and management of processes that fall into the regulated domain are crucial. The official, regulatory-relevant processes are those documented in the signed-off guidelines and SOPs, and parts of the business might reject any other process documentation. Consequently, a process management initiative in a regulated environment must embrace the Quality Assurance system. This might be achieved by using the process repository also as the basis for the process descriptions to be used in guidelines and SOPs. In this specific case, the process repository was set up as a validated environment and a process release cycle management was implemented for process change control and publication.
Process targeting
The selection of processes that are taken into consideration depends on the chosen overall approach. BPM theory demands a consistent approach, supported by top management and diffused throughout the organization. If BPM is a clearly top-down oriented approach and considered a strategic initiative, processes are selected on the basis of their criticality and value-adding potential. However, in many organizations BPM is driven by single entities, most frequently the IT department, and in these cases a project-driven selection is more appropriate, especially if there is no outspoken mandate to introduce BPM and thus no senior management support to enforce compliance with BPM principles.
This approach can also be valid for demonstrating quick wins in individual projects, but it carries the inherent risk of resulting in a vast number of process models with inconsistent modeling styles and conventions, interface gaps, or incompatible overlaps, especially if no mandatory modeling tool is prescribed and project managers allow their teams and consultants to use office software as modeling tools.
Independent of the above context, the initially targeted processes should possess several characteristics that make their incorporation into a BPM initiative meaningful and allow potential benefits to be reaped.
They should cross departmental boundaries, i.e. span more than one functional unit. Crossing departmental boundaries leads to the breaking down of barriers, optimizes the entire process rather than just activities or tasks, and can begin the transformation to a process culture. They should also involve information flows in which information is created and used by distinct entities. Crossing information boundaries means thinking about processes in terms of information flows that are cross-functional in the same way as process boundaries. In the ALTANA change program, clinical development clearly satisfied this requirement. It is a major process that spans a considerable number of departments, each of them responsible for specific elements of the process, but working in a highly interconnected way.
They should involve several applications that support the processes involved. Crossing application boundaries leads to addressing the white space between applications, where most exceptions and data transformations take place. The process modeling within the program included the analysis of the applications being used within each sub-process and of the process steps that involved more than one system, thus allowing the identification of interfaces. In addition, this information was fed into the architecture work stream that would develop an overall IT architecture for the clinical domain.
Process-map design
In most organizations, some kind of process documentation already exists prior to a BPM initiative. Typically, the processes implemented in ERP systems are (more or less well) described, and publicly traded companies have their financial processes documented as part of their SOX compliance efforts. Due to the regulated environment in which Pharma companies operate, all GxP-relevant processes are documented in guidelines and standard operating procedures and subjected to structured change management.
While this situation is obviously a good starting point, it also constitutes a problem: typically, these processes are documented in different tools and lack consistency with regard to method, conventions, and modeling style. It is thus advisable to develop a strategy to harmonize process documentation throughout the organization and to migrate onto a single design platform. In pharmaceutical companies, this requires some extra effort, since regulations require validated systems to be used wherever GxP-relevant processes are concerned, which might result in a demand for validated process management software.
To develop a comprehensive and consistent process map, process modeling, analysis and redesign need to be approached simultaneously at several levels.
• Top-down analysis starts with a level-0 map and breaks this highly abstract model down into sub-processes.
• Bottom-up synthesis takes existing process models, e.g. processes documented in SOPs, and synthesizes these into aggregate models.
• Definition is the design of additional processes, typically taking place as part of business improvement projects or as part of IT initiatives.
When analyzing and synthesizing simultaneously, it is crucial to ensure that the
models being developed are consistent across design levels. At the same time, the
definition of new processes must take place within the context of the already ex-
isting models in order to avoid gaps and overlaps.
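A minimal sketch of such a cross-level consistency check follows, assuming each model in the repository records its design level and its parent; the data layout and function name are illustrative, not specific to the ARIS repository used in the case.

```python
# Minimal sketch: flag sub-processes whose parent is missing (gaps) or whose
# level does not fit the parent's level. The repository layout is an assumption.
def check_hierarchy(models):
    """models: dict of model_id -> {"level": int, "parent": model_id or None}"""
    problems = []
    for model_id, model in models.items():
        parent = model["parent"]
        if parent is None:
            if model["level"] != 0:
                problems.append(f"{model_id}: level {model['level']} model has no parent (gap)")
        elif parent not in models:
            problems.append(f"{model_id}: parent {parent} is not modeled (gap)")
        elif models[parent]["level"] != model["level"] - 1:
            problems.append(f"{model_id}: level inconsistent with parent {parent}")
    return problems
```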
BPM benefits need to be demonstrated
While the benefits of using a cross-project approach to process management were apparent at program level, the project teams initially did not see an immediate benefit, especially since the focus of the analysis was process visualization, rather than preparing the ground for process automation. This changed, however, when the first critical process interfaces were uncovered as part of the program level
process analysis. It is now accepted that the opportunity to analyze and design processes in their corporate context is one of the major BPM advantages.
[Figure: the three parallel routes to a comprehensive process map: top-down analysis (identification of high-level processes and definition of the process map), bottom-up synthesis (integration of already defined and documented processes) and definition (documentation of additional processes within projects).]
[Figure: excerpt from the process model for the initiation and authorization of clinical studies, covering preparation and approval of the clinical development plans (CDPs); data input from pharmaceutical sciences, regulatory affairs, biometry, clinical development, drug safety, discovery and pre-clinical research; preparation, review and approval of draft study protocol summaries; finalization and approval of clinical study budgets; and clinical study authorizations, with roles including the CDT, RCD/CR, the Head of the RCD/CR-Unit, the VP RCD and the Management Board R&D.]
Using a single platform
In order to develop and maintain a consistent process model it is imperative to
use a single shared process design environment with strict modeling conventions
and to prevent individual projects from working outside the platform. The con-
sultants that were contracted for the individual projects were not trained in using
the corporate process-modeling tool and proposed the use of office applications
for graphical process design. The project teams thus assigned an additional re-
source to transcribe the process models into the corporate repository. This ap-
proach required additional effort, but ensured that the overall process map was
kept consistent and the analysis of processes and interfaces at program level pos-
sible.
CONCLUSIONS
As opposed to the hype-style BPM propositions being brought forward by many
software vendors and consultants, business process management as such is not
going to completely revolutionize the way a pharmaceutical company operates. It will, however, contribute to increasing management and control over business processes. When implemented throughout a company, it can also enable a
faster adaptation to changing strategic guidelines and business requirements
through a controlled change management at process and systems level.
The potential benefits that BPM can deliver typically refer to these issues, but when looking at the current situation in many companies, we realize that
there often is a mismatch between the claims brought forward by BPM technology
vendors and the actual situation in the business. It is a frequently stated argu-
ment that BPM software is the one and only solution for enabling automated and
integrated processes. In reality, process automation today frequently takes place
within ERP-systems and many companies have been using forms of integration
technology, mainly at application integration level, for quite some time. In this
sense, the technology elements of BPM are, to a considerable extent, reusing the
installed base of integration tools. The short case study presented in this
paper also describes a scenario where BPM has delivered a significant contribu-
tion, but the main benefits stem from conceptual process work, rather than ap-
plying BPM as a technical concept.
While the potential of process integration by technical means should not be neglected, the real benefit obviously lies in the ability to obtain greater process control and transparency. From this perspective, BPM goes beyond a purely technological value proposition and toolbox and assumes the role of a framework for helping the business manage change in a process- and information-centric manner.
Within ALTANA Pharma, BPM has contributed significantly to making processes
more transparent, thus laying the foundation for current and future improve-
ment. The use of a process-based business analysis and redesign approach in
clinical development has demonstrated the potential that lies in the concept when properly applied. As a result, using processes as a starting point for business
improvement initiatives has become widely accepted and is applied in a variety of
other business units. These initiatives, in turn, add to the critical mass by con-
tributing to the development of a corporate process map and laying the founda-
tion for a corporate-wide approach to BPM.
Workflow Opportunities and
Challenges in Healthcare
Jonathan Emanuele and Laura Koetter,
Siemens Medical Solutions USA, Inc., USA
ABSTRACT
Workflow technology has expanded substantially into the healthcare industry
over the last year. Hospitals are embracing this technology as a means to improve
operational efficiency, achieve patient safety goals and positively influence the
quality of care. This paper will explore the opportunities BPM and workflow tech-
nology have to make a profound impact on patient care while examining the chal-
lenges that are present in the healthcare arena.
INTRODUCTION
Hospitals today face a constant challenge to find ways to improve the quality of
care, while at the same time reducing costs and increasing revenue. Concepts
such as process optimization, throughput, and efficiency are gaining attention
within the healthcare community as a means of achieving operational goals. Hos-
pitals are increasingly asked to tackle the problem of doing more with less. For
example, as the number of patient visits continues to rise, the number of beds is
not,1 and hospitals need to make better use of their assets. At the same time, the
focus must remain on clinical excellence and quality of patient care.
Most emergency departments are at or over capacity.2 The majority of hospitals
lose money when treating Medicare/Medicaid patients; more than $25 billion in
2005.3 Staffing shortages exist across healthcare job types, causing emergency
department overcrowding, diversion of emergency patients to other facilities, re-
duced number of staffed beds to serve patients, delayed discharge and increased
length of stay for patients, and decreased staff and patient satisfaction.4
These challenges do not change the fact that patients deserve and demand safe
and top quality care. Patients put their trust in hospitals to treat them according
to best practices, to ensure they receive the appropriate tests, medications and
interventions for their condition. Delayed care delivery, unnecessary tests, medi-
cation errors or preventable complications due to an omitted step in a plan of care
increase the likelihood of poor patient outcomes. Workflow technology and busi-
ness process management (BPM) concepts, designed to help hospitals deliver the
1 The Advisory Board Co. (2007, January 17). Leveraging IT to Optimize Hospital Throughput: An Improved Approach to Manag-
ing Capacity. Presented at Maryland HIMSS conference.
2 American Hospital Association. (April 2006). The State of America's Hospitals - Taking the Pulse: Findings from the 2006 AHA
Survey of Hospital Leaders. Retrieved Jan 18, 2007, from http://www.aha.org/aha/research-and-trends/health-and-hospital-
trends/2006.html
3 American Hospital Association. (Oct 2006). Underpayment by Medicare and Medicaid Fact Sheet. Retrieved Jan 18, 2007, from:
http://www.aha.org/aha/content/2006/pdf/underpaymentfs2006.pdf
4 American Hospital Association. (April 2006). The State of America's Hospitals - Taking the Pulse: Findings from the 2006 AHA
Survey of Hospital Leaders. Retrieved Jan 18, 2007, from:
http://www.aha.org/aha/research-and-trends/health-and-hospital-
trends/2006.html
right work, to the right people, at the right time, are ideally positioned to serve the
needs of the patient.
Other industries have used business process management (BPM) concepts to
automate and improve processes with success. Opportunities exist to bring BPM
concepts to the healthcare industry; however, there are significant challenges to
address for successful utilization of these concepts. The healthcare industry has
been slow to adopt BPM practices and workflow tools. This is due, in part, to
technology constraints. Only recently have we seen the increased use of SOA-
based healthcare information systems or electronic medical records by which we
can collect the data necessary to leverage the possibilities of business process
management. But even with growing data availability, there is a perception that
healthcare is a much more complex environment than other industries that em-
ploy BPM, and that BPM is simply not ready for that level of complexity.
For the healthcare industry to fully support the investment in BPM concepts and
workflow technology, the tools and technology must handle the complex condi-
tions and challenges of a healthcare environment, from physical and financial
resource constraints to utilizing technology for clinical decisions to variations in
patient conditions and treatments. BPM in hospitals can apply to administrative,
operational as well as clinical processes. Human lives will be affected by these
tools. There are opportunities for BPM to improve patient care and operational
processes and potentially achieve significant financial savings and ROI for the
healthcare industry. In order for this to happen, BPM must take into account the
unique challenges of managing a care process for health care providers and pa-
tients with workflow technology. Ideally, a fully optimized healthcare information
system would allow patients to receive the best care, in the least amount of time,
for the least cost, with increased profit margin for the hospital.
BPM is gaining traction in the healthcare setting. As it continues to meet the chal-
lenges and concerns of the industry, healthcare BPM as a movement will grow.
Healthcare BPM is the vision for the future of healthcare.
TECHNOLOGY
BPM has many tools at its disposal for handling the life-cycle of a business proc-
ess. Rules engines and workflow engines handle run-time execution, while model-
ing, simulation, and analytical tools exist for offline optimizations. This section
will discuss the run-time applications that can be embedded into healthcare in-
formation systems.
Clinical Decision Support (CDS) is a well-established area in medical informatics.
This field assists the clinician in making medical decisions, by providing access to
the data necessary to make an informed choice. It is especially targeted at en-
hancing patient care and reducing errors. Typically, CDS systems involve a rules
engine, with the majority of these based on the HL7 standard Arden syntax.5
These systems function primarily by receiving triggers from an electronic patient
record when data on a patient is entered, updated or about to be updated. The
rules engine then gathers data from the database and performs logic to respond
with reminders or alert messages to the clinical staff. As an example, during the
physician order entry process, if the doctor is ordering a drug that has potentially
negative interactions with another drug already listed as an active drug the pa-
5 The Arden Syntax for Medical Logic Systems. (updated July 21, 2006). Retrieved January 29, 2007, from:
http://cslxinfmtcs.csmc.edu/hl7/arden/
tient is taking, a rule can alert the doctor to that interaction and suggest alterna-
tive drugs that would avoid the potential problem. These rules are typically im-
mediate decisions or scheduled recurring decisions. They are usually limited to
clinical decision making and do not take into account the notion of state man-
agement and the need to adjust recommendations over time as the process un-
folds.
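To make the contrast concrete, the following sketch shows the general shape of such a rule. It is purely illustrative: the trigger structure, the interaction table and all identifiers are invented for this example and do not correspond to the Arden syntax or to any particular CDS product.

```python
# Illustrative CDS-style rule: fires on a medication order event, checks the new
# drug against the patient's active medication list, and returns alert messages
# if a known interaction is found. All names and data below are hypothetical.

KNOWN_INTERACTIONS = {
    ("aspirin", "warfarin"): "Increased bleeding risk",
}

ACTIVE_MEDS = {"patient-123": ["warfarin"]}     # stand-in for an HIS query

def on_medication_order(patient_id, new_drug):
    """Rule logic evaluated at a single point in time (no state management)."""
    alerts = []
    for active in ACTIVE_MEDS.get(patient_id, []):
        pair = tuple(sorted((active, new_drug)))
        if pair in KNOWN_INTERACTIONS:
            alerts.append(
                f"ALERT: {new_drug} interacts with {active}: "
                f"{KNOWN_INTERACTIONS[pair]}. Consider an alternative."
            )
    return alerts

print(on_medication_order("patient-123", "aspirin"))
```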
A workflow engine is a step beyond a rules engine. Workflow engines specialize in
the execution of business processes, not just decisions made at a discrete point in
time. Workflow engines are beginning to be utilized in the healthcare industry,
which can draw on the strong knowledge base in workflow available from many
other industries. The healthcare processes these engines handle can deal with all
aspects of running a hospital, including clinical, financial, administrative and op-
erational processes. These tools often make use of graphical flow-diagram inter-
faces, which try to make an executable workflow look as close to a process dia-
gram as possible. These tools can greatly assist in clinical decision making by not
only presenting clinicians with alerts and reminders, like a rules engine, but also
by handling the teamwork aspect of clinical decisions, the time management and
task allocation aspects of process delivery, state changes in patient or operational
conditions, and behind-the-scenes automation of process steps.
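A minimal sketch of this difference is shown below. Unlike the point-in-time rule above, a workflow instance keeps state between events and assigns tasks to roles over time; the class, states and events are hypothetical simplifications, not a representation of any specific workflow engine.

```python
# Minimal sketch of a long-running workflow instance (hypothetical names):
# it keeps state between events and assigns tasks to roles.

class DischargeWorkflow:
    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.state = "awaiting_discharge_order"
        self.tasks = []                      # (role, description) pairs

    def on_event(self, event):
        if self.state == "awaiting_discharge_order" and event == "discharge_ordered":
            self.tasks.append(("nurse", "Complete discharge assessment"))
            self.tasks.append(("housekeeping", "Clean bed after departure"))
            self.state = "discharge_in_progress"
        elif self.state == "discharge_in_progress" and event == "patient_departed":
            self.state = "complete"

wf = DischargeWorkflow("patient-123")
wf.on_event("discharge_ordered")
print(wf.state, wf.tasks)
```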
Workflow processes can be designed, built, modified, and maintained by a hospi-
tal. The workflows a hospital chooses to implement are tailored to that hospital's
unique needs and problem areas. Hospitals can let their process experts design
their workflows, such as a team of physicians and nurses working together to de-
sign a clinical workflow. All of this customization can be accomplished at the con-
tent level, without requiring custom code to be developed or delivered by a hospi-
tal's HIS vendor. This allows for a large degree of flexibility, and sites may con-
stantly update and adapt their workflows well after their initial implementations.
Additionally, workflows, or the ideas behind the workflows, can be shared and
exchanged between hospitals if desired.
According to Gartner, Inc., a key component of the fourth generation of a com-
puter-based patient record is workflow. Gartner states that workflow capability
will be an integral part of these [healthcare information] systems.6 Workflow offers potential to aid in clinical decisions, reduce medical errors, and even save
human lives. Workflow technology makes the vision of healthcare BPM possible.
ARCHITECTURE
The following diagram represents an example of the architecture of a workflow
engine integrated with a healthcare information system (HIS). The integration be-
tween the two is usually highly customized for optimal performance, but key to
the integration is a strong service-oriented architecture. The three main compo-
nents of a healthcare workflow are: the input of a process model, the integration
with the HIS system, and integration to any additional systems the hospital may
have.
6 IMIA Yearbook 2003. (Feb 2003). Leveraging IT to Improve Patient Safety. Pg. 8. Retrieved January 23, 2007, from:
http://www.himss.org/content/files/whitepapers/PatientSafetyWhitePaper122602.pdf
[Figure: example architecture of a workflow engine integrated with a healthcare information system (HIS); the numbered steps 1-7 are described below.]
The process definition (Step 1) is provided by the healthcare provider organization
or hospital. Process definitions can initially be done on whiteboards, in tools like
Microsoft Visio, or in specialized modeling software. In order for the model to be
importable, it must at some point be converted to a language that the engine can
understand, such as XPDL. These process definitions define the process that a
hospital wants to follow. They usually have a happy-path to ensure that what is
supposed to happen is being done. These definitions also include what actions to
take when the process deviates from the happy-path, such as notifying supervi-
sors. The processes are not performed as a one-time transaction, like a rule in a
rules engine. Rather, they span time, such as a patient's entire visit or the length of a patient's care at one facility.
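As an illustration, a heavily simplified, XPDL-flavored fragment might look like the following. This is a sketch only: a real XPDL package carries namespaces and many additional required elements, and the process content here is invented.

```python
# Heavily simplified, XPDL-flavoured process fragment (illustration only).
import xml.etree.ElementTree as ET

PROCESS_XML = """
<Package Id="hospital-demo">
  <WorkflowProcesses>
    <WorkflowProcess Id="discharge" Name="Patient Discharge">
      <Activities>
        <Activity Id="a1" Name="Discharge order received"/>
        <Activity Id="a2" Name="Complete pending tasks"/>
        <Activity Id="a3" Name="Patient leaves hospital"/>
      </Activities>
      <Transitions>
        <Transition Id="t1" From="a1" To="a2"/>
        <Transition Id="t2" From="a2" To="a3"/>
      </Transitions>
    </WorkflowProcess>
  </WorkflowProcesses>
</Package>
"""

root = ET.fromstring(PROCESS_XML)
for activity in root.iter("Activity"):
    print(activity.get("Id"), "->", activity.get("Name"))
```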
In order for these processes to be kicked off and for the state within the workflow
to be updated, it is important that the HIS system notify the workflow engine
when major events happen (Step 2). A patient being admitted to the hospital, an
order being placed and an assessment being valued are all examples of events in
the HIS that need to be pushed to the workflow engine. A typical patient encoun-
ter workflow will start on an admission, monitor updates to the patient record
throughout the stay, and terminate when the patient leaves the hospital.
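The sketch below illustrates this event-driven pattern under simple assumptions: the event names are loosely modeled on HL7 ADT and order trigger events, and the in-memory dictionary stands in for the engine's instance store.

```python
# Sketch of Step 2: HIS events start, update or terminate workflow instances.
# Event codes loosely follow HL7 v2 trigger events; everything else is hypothetical.

instances = {}          # patient id -> workflow state (stand-in for an instance store)

def on_his_event(event_type, patient_id):
    if event_type == "ADT^A01":         # admission: start an encounter workflow
        instances[patient_id] = "active"
    elif event_type == "ADT^A03":       # discharge: terminate the workflow
        instances.pop(patient_id, None)
    elif patient_id in instances:       # orders, assessments, etc. update state
        instances[patient_id] = f"updated by {event_type}"

on_his_event("ADT^A01", "patient-123")
on_his_event("ORM^O01", "patient-123")
print(instances)
```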
There are many times the workflow process needs to make a decision among mul-
tiple options in a process definition. The system will usually query the HIS to look
up additional information (Step 3). If there is a need for complex queries or com-
plex decision logic, the workflow engine can call a rules engine (Step 4). For ex-
ample, suppose an order to discharge the patient was received and the engine
needs to decide which tasks need to be completed before the patient is sent home.
In order to do so, the engine will query the HIS and identify pending orders, labs,
discharge summaries and other steps that need to be completed.
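A sketch of such a discharge-readiness check is shown below, with the HIS query stubbed out and all field names and data invented for illustration.

```python
# Sketch of Steps 3-4: before sending a patient home, query the HIS for
# outstanding items and derive the tasks still to be completed.

def query_his(patient_id):
    return {                              # stand-in for real HIS queries
        "pending_orders": ["chest x-ray"],
        "pending_labs": [],
        "discharge_summary_complete": False,
    }

def outstanding_discharge_tasks(patient_id):
    chart = query_his(patient_id)
    tasks = [f"Complete order: {o}" for o in chart["pending_orders"]]
    tasks += [f"Await lab result: {lab}" for lab in chart["pending_labs"]]
    if not chart["discharge_summary_complete"]:
        tasks.append("Write discharge summary")
    return tasks

print(outstanding_discharge_tasks("patient-123"))
```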
When incomplete activities are identified, it is necessary for the workflow engine
to push a task to hospital staff (Step 5). Perhaps it is to remind a nurse to com-
plete an assessment that is overdue, or to notify housekeeping that a bed needs
to be cleaned. There are many modes of communication that come into play here,
including messages inside the HIS and messages through external devices such
as pagers, phones, and email. When orchestrating care, it is critical that the en-
gine has access to hospital staff and clinicians and can assign tasks to them and
alert the next human in the escalation chain if a response is not documented in
the patient record.
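The following sketch illustrates one simple way an escalation chain of this kind could be expressed; the roles, timeout and escalation policy are assumptions made for the example rather than a description of any product's behavior.

```python
# Sketch of Step 5: push a task to the responsible role and escalate along a
# chain if no response is documented within the allowed time (names invented).
from datetime import datetime, timedelta

ESCALATION_CHAIN = ["primary nurse", "charge nurse", "unit supervisor"]

def current_owner(created, responded, now, timeout):
    """Return the role that should currently hold the task."""
    if responded:
        return "done"
    overdue_steps = int((now - created) / timeout)
    level = min(overdue_steps, len(ESCALATION_CHAIN) - 1)
    return ESCALATION_CHAIN[level]

start = datetime(2007, 1, 1, 8, 0)
print(current_owner(start, False, start + timedelta(hours=3), timedelta(hours=1)))
```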
The workflow engine is also capable of performing actions directly in the HIS
without human intervention, helping to get work done (Step 6). These actions may
mirror actions available to end users of the HIS, such as placing an order, saving
an assessment, or adding a patient to a staff census. Automation of services with-
out human intervention is based upon hospital agreement that some services do
not require human input.
The final point to bear in mind when applying a workflow engine-enabled HIS is
that there may be other sources of data, other applications in the hospital, and
new devices that should be integrated in the process. The ability of the workflow
engine to provide a rich set of programming interfaces enables a hospital IT de-
partment to fully leverage the tool (Step 7). For example, a hospital may have a
historical database of infectious disease data that needs to be queried when a pa-
tient arrives. The ability to interface with external data sources allows the work-
flow engine to leverage any electronic data or system the hospital may have.
OPPORTUNITIES FOR HEALTHCARE BPM
At any point in time, hospitals are managing several hundred processes across a
myriad of departments and clinicians. Every patient passing through the emer-
gency department or staying in a bed has a unique and evolving plan of care, re-
quiring treatment from multiple clinicians, departments and off-site facilities.
Processes are in place to support care and keep the hospital running. Patients
must be transported, dietary trays delivered, blood drawn, vital signs monitored,
results entered into information systems, beds cleaned, bills tracked, and so on.
The number of processes and patients in play for any single member of the
healthcare team can be very significant.
The opportunities for BPM in healthcare abound. BPM and workflow technology
can have an impact by automating steps, integrating the team, pushing informa-
tion when and where it's needed, managing communication points and making
decisions. Automating healthcare business processes can positively impact the
time and resources necessary to provide patient care. In an industry plagued by
staffing shortages and high costs, better utilization of human and physical re-
sources enables the team to focus on patient care and results in a better bottom
line for the hospital. Workflows automate aspects of the healthcare delivery proc-
ess to improve compliance with policies and standards of care, eliminate commu-
nication breakdowns and bring the inter-disciplinary team in sync to deliver an
integrated plan of care.
A clinical care environment is hectic, multi-disciplinary and ever-changing.
Treatment plans must be reactive to unique patient conditions and physician de-
cisions for care. Processes must be adaptive. Workflows can respond quickly to
changes in patient status, alerting clinicians of minor issues before they become
more serious and costly conditions. Healthcare workflows allow for variations in
patient care decisions, accommodating the needs of a people-driven industry.
Healthcare BPM Process Opportunities: Clinical, Operational, Financial
and Administrative
Where are the process opportunities in healthcare? As it has in other industries,
BPM can improve operational, financial and administrative processes. But unique
to healthcare, BPM can also be implemented to improve clinical processes. In fact,
as hospitals adopt workflow engines, the processes prioritized at the top of the list
are most often clinical care improvement initiatives.
Clinical processes focus on patient care. Processes may address patient safety
goals, such as reducing patient falls or preventing pressure ulcers. National
directives, such as The Joint Commission's annual National Patient Safety Goals,7
provide guidelines to potential patient safety initiatives. Keeping patients
safe reduces risks of patient suffering and more expensive treatments.
Improved patient care may stem from following evidence-based standards of care.
Standards of care are treatment guidelines for patients with certain diagnoses.
Guidelines are ever-changing and are updated whenever new information on a
given treatment comes to light. Managing standards of care with healthcare BPM
allows for quick changes to processes and reduces the need to reeducate the team
after each change; team members can rely on the workflow engine to manage
guideline recommendations.
Clinical process improvements can be designed to respond to trends in
healthcare. For instance, MRSA (Methicillin-resistant Staphylococcus Aureus), a
type of bacteria resistant to certain antibiotics, is on the rise in healthcare set-
tings. MRSA infections accounted for two percent of the total number of staph
infections in 1974 and 63 percent in 2004.8 MRSA patients are more likely to die, have longer hospital stays and have higher treatment costs.9
Implementing proc-
ess improvements to help prevent infection and to identify and treat patients who
have been infected can help improve patient safety and reduce costs.
Clinical processes are about providing the best care for the patient to decrease
their distress and improve their health. Workflow makes better care possible.
Operational process opportunities work to improve efficiencies, reduce costs and
improve patient throughput resulting in the ability to treat more patients at
reduced cost. Workflows can help to enhance communication channels, eliminate
unnecessary phone calls and allow clinicians to focus their time on direct patient
care. Patient throughput is a major operational process which impacts a
hospital's bottom line. Efficiently managing the process of getting a patient into
and out of a bed results in more patients receiving timely care and fewer patients
diverted to other facilities.
Financial process improvement opportunities are critical. With tight margins and
underpayment for services rendered, hospitals must maintain strict control of
finances to be able to continue providing care to the community. A missed radiology appointment may mean an extra day in the hospital, not covered by Medicare, while the appointment is rescheduled. A dietary tray delivered to a discharged patient is waste. The longer a patient's length of stay, the greater
the cost of care. There are ample opportunities to use BPM to improve returns.
Administrative process opportunities improve documentation compliance.
Administrative processes focus on documentation and assessment completion,
signing of charts, processing of consent forms, etc. Incomplete documentation
has a direct impact on patient care; if the documentation is not done, the
information is not available for clinical decision making. Lack of documentation
7 The Joint Commission. (2007). National Patient Safety Goals. Retrieved January 18, 2007, from:
http://www.jointcommission.org/PatientSafety/NationalPatientSafetyGoals/
8 Centers for Disease Control and Prevention. (updated Oct 6, 2006). MRSA in Healthcare Settings. Retrieved Dec 11, 2006, from:
http://www.cdc.gov/ncidod/dhqp/ar_MRSA_spotlight_2006.html
9 Pennsylvania Health Care Cost Containment Council. (August 2006). MRSA in Pennsylvania Hospitals. PHC4 Research Brief,
10, 1-4.
may also impact the bottom line. Insurers will not reimburse for care which may
have been delivered, but was not documented.
Application of Healthcare BPM: JCAHO/CMS Acute Myocardial Infarction
National Hospital Quality Measure
The Joint Commission and Centers for Medicare & Medicaid Services
(JCAHO/CMS) put out quality measure guidelines for the treatment of Acute
Myocardial Infarction (AMI).10 These guidelines outline evidence-based measures
of care suggested for the treatment of patients with AMI. Following standards of
care contributes to improved health outcomes for patients presenting with this
condition.
While evidence-based guidelines are an accepted industry standard, the actual
adoption of guidelines by clinicians is difficult to achieve. It takes an average of
five years for guidelines to be adopted into routine practice.11 As an example,
JCAHO/CMS quality measures recommend patients diagnosed with AMI receive
a beta blocker within 24 hours of arrival at the hospital. Approximately 50
percent of eligible patients do not receive beta blockers after an acute myocardial
infarction.12
The costs of non-compliance with JCAHO/CMS AMI quality measures include
increased mortality, increased patient health issues, more expensive treatments,
future readmissions and extended lengths of stay.
Healthcare BPM can contribute to the management of the Acute Myocardial
Infarction care process. Utilizing a workflow engine and interacting with the inter-
disciplinary care team, BPM can have an impact on patient health and outcomes.
A workflow engine can gather real-time and historic patient informationresults,
assessments, orders, problems, diagnosesto identify AMI patients as soon as
possible. Once identified, the workflow starts monitoring for the measures of care.
If a measure of care, such as aspirin administration within 24 hours of arrival, is
not fulfilled, the workflow engine notifies the appropriate roles in the care team to
make a decision. A suggestion is made to consider placing an order. For other
measures of care, suggestions might include the request of a procedure or patient
education. In all cases, the clinician has an option to follow the suggestion or
document a reason he/she chose an alternative. The workflow engine continues
to listen to responses, escalating issues not addressed or progressing to other
steps in the process as the patient receives care. Throughout the process, role-
based users interact with the information systems, sending new information to
the workflow engine, affecting process decisions.
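As an illustration, the logic for one such measure might take a form similar to the sketch below. The time window, field names and data are hypothetical and are not intended as clinical guidance.

```python
# Illustrative sketch: monitor one AMI measure of care (beta blocker within
# 24 hours of arrival) and decide whether to prompt the care team.
from datetime import datetime, timedelta

def beta_blocker_check(arrival, meds_given, now):
    given = any(m["class"] == "beta blocker" for m in meds_given)
    if given:
        return None
    if now - arrival > timedelta(hours=20):      # nearing the 24-hour window
        return ("Suggest beta blocker order for AMI patient "
                "(or document reason for alternative treatment).")
    return None

arrival = datetime(2007, 1, 1, 6, 0)
print(beta_blocker_check(arrival, [], datetime(2007, 1, 2, 4, 0)))
```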
In addition to patient identification and monitoring of measures of care, the
workflow engine can be utilized to collect compliance data for reporting to the
JCAHO and CMS organizations. Data is collected throughout the patient visit,
highlighting missing information or incomplete care. This data can be utilized
10 Centers for Medicare & Medicaid Services (CMS) and the Joint Commission on Accreditation of Healthcare Organizations
(JCAHO). (June 2006). The Specifications Manual for National Hospital Quality Measures, Version 2.0. Retrieved July 2006, from:
http://www.jointcommission.org/Performance
Measurement/PerformanceMeasurement/Current+NHQM+Manual.htm
11 Lomas J, Sisk JE, Stocking B. (1993). From evidence to practice in the United States, the United Kingdom, and Canada. Mil-
bank Q, 71, 405410.
12 Bradford WD, Chen J, Krumholz HM. (1999). Under-utilisation of beta-blockers after acute myocardial infarction. Pharma-
coeconomic implications. Pharmacoeconomics,15, 257268.
internally by a hospital for quality and process improvement purposes, offering
the hospital an opportunity to identify ways to improve patient health and rates of
compliance.13
Application of Healthcare BPM: Bed Management
One of the pressing problems in hospitals is achieving efficiency in the bed turn-
over rate, or the ability for the hospital to get one patient out of the hospital bed
and the next person into that bed as quickly as possible. In an average hospital, a
single bed is turned 53 times per year. In a hospital operating in the 75th percentile, a bed is turned 61 times per year. The increased bed turnover equates to a revenue gain of over $7 million.14
The pieces in play include transportation,
housekeeping, facility maintenance and a bed inventory system. The workflow
engine allows a system to track, in real-time, when a patient exits his or her room
upon discharge, to schedule all housekeeping and maintenance activities imme-
diately, to track when those staff members leave the room, having completed their
tasks, and mark the bed as available for a new patient. The state the workflow
engine manages in this case isn't the state of the patient, but the state of the bed.
By tracking beds that require cleaning, location of housekeeping personnel, time
required to clean, and other statistics, you can more accurately manage the bed
turnover in the hospital.
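A sketch of this bed-centric state management is shown below; the states, events and bed identifier are invented for illustration.

```python
# Sketch of bed-state tracking (states and events hypothetical): the workflow
# engine manages the state of the bed rather than the state of a patient.

class Bed:
    def __init__(self, bed_id):
        self.bed_id = bed_id
        self.state = "occupied"

    def on_event(self, event):
        transitions = {
            ("occupied", "patient_discharged"): "needs_cleaning",
            ("needs_cleaning", "housekeeping_arrived"): "being_cleaned",
            ("being_cleaned", "cleaning_complete"): "available",
            ("available", "patient_assigned"): "occupied",
        }
        self.state = transitions.get((self.state, event), self.state)

bed = Bed("4-west-12")
for e in ["patient_discharged", "housekeeping_arrived", "cleaning_complete"]:
    bed.on_event(e)
print(bed.state)   # available
```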
CHALLENGES TO HEALTHCARE BPM
The opportunities for Healthcare BPM abound and the potential benefits to the
healthcare industry are great. As with all BPM projects, however, Healthcare BPM
is not without significant challenges. Some challenges are similar across
industries: access to the required data, degree of automation, sending too many alerts and reminders. But healthcare has a unique responsibility: patient health. If a workflow process dealing with a car accident claim fails, the effects are
undesirable, but do not affect a human life. If a healthcare process fails, it has the
potential to cause patient harm. It is imperative to address the challenges and
needs of healthcare BPM with great care and attention.
Lack of Access to Data Required
Healthcare workflows utilize data from electronic information systems to make
decisions. One of the greatest impediments to building workflow-based healthcare
information systems is lack of access to all relevant data. Without available and
updated information, it is virtually impossible to fully manage the state of a pa-
tient's care.
Several data access problems may exist. Data may be incomplete, as when new
healthcare information systems are implemented and historic data from older
systems is not ported forward. Or, data may continue to exist in inaccessible pa-
per formats. There is wide variation in the level of automation among hospitals.
The industry is in favor of an electronic patient record, with real-time data entry
and interaction, but adoption varies widely. Some facilities still operate on a
manual paper-based system. Many others exist in a middle ground, with some
aspects of the charting process entered electronically and other aspects still
communicated on paper.
13 Portions of the subject matter disclosed herein are the subject of pending patent applications.
14 The Advisory Board Co. (2007, January 17). Leveraging IT to Optimize Hospital Throughput: An Improved Approach to Manag-
ing Capacity. Presented at Maryland HIMSS conference.
The level of automation and data available at a hospital facility drives prioritiza-
tion of which processes are candidates for improvement with workflow technol-
ogy. The iterative nature of BPM allows for continuously changing workflows:
start simple and add to the workflow process as new features are implemented at
the hospital site.
Identifying Patient Populations with a Workflow Engine
In a clinical environment, a workflow engine often must identify the population of
patients to whom the workflow applies. For example, a workflow that monitors
compliance with JCAHO/CMS Acute Myocardial Infarction quality measures
must first identify patients with myocardial infarction. Information available to
identify a patient with a specific condition is not always complete. The workflow
engine can be used to determine that a patient is a likely AMI candidate, even if a diagnosis isn't entered into the system until the patient is discharged. We may instruct the workflow to check for elevated test results, often an indication of AMI. This identifies a patient, but there is still a margin of error: not all patients with
elevated results have AMI and not all patients with AMI have elevated results.
Therefore, our patient population data is not perfect. Data available to make a
decision may be limited, out-of-date or potentially inaccurate.
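The sketch below illustrates this kind of imperfect population identification; the marker, threshold and data are placeholders chosen for illustration only and carry no clinical meaning.

```python
# Sketch of population identification under imperfect data: flag a patient as a
# likely AMI candidate from an elevated marker or a recorded diagnosis, and let
# clinicians exclude false positives later. The threshold is a placeholder.

TROPONIN_THRESHOLD = 0.4      # hypothetical cut-off, for illustration only

def likely_ami(results, diagnoses):
    if "acute myocardial infarction" in (d.lower() for d in diagnoses):
        return True
    return results.get("troponin", 0.0) > TROPONIN_THRESHOLD

print(likely_ami({"troponin": 0.9}, []))    # True -> enrol, pending clinician review
print(likely_ami({"troponin": 0.1}, []))    # False -> not enrolled (may be missed)
```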
Workflows do not replace human interactions in planning and providing care.
Processes must be built to include, not exclude, opportunities for human inter-
vention and feedback. In the case of identifying a patient population, such as
AMI, the system can give clinicians the opportunity to inform the workflow that
the patient is not an AMI patient and should be excluded from the population. In
the case of a suggested order that is not appropriate for the patient, clinicians
must have the opportunity to override the suggestion. Humans and automated
process management working together are the ideal solution.
Technical Feasibility Opens Automation Questions
An integrated workflow engine opens the door to numerous technically-feasible
possibilities for process management. The workflow engine can be called upon to
perform actions behind the scenes, such as automatically updating the patient
record or automatically placing orders.
A balance must be determined between what a hospital is comfortable having the
workflow engine do automatically, without user intervention, and which steps
require human interaction. A workflow may seek clinician input at specific points
in the workflow. Medication orders will always require human confirmation before
activation. A dosage change must flag a pharmacist to review the request. While
many functions are technically feasible, care and concern must be taken to
evaluate how decisions might affect patient safety. At the end of the day, the hos-
pital, not the workflow engine, remains responsible for patient care.
Physicians Choose Different Treatment Options
Physicians make different decisions when treating patients with similar condi-
tions. The healthcare industry gives physicians the discretion to determine the
care their patients need. This makes healthcare unique; it is unrealistic to force a
single standard when developing clinical treatment processes. Even when evi-
dence-based standards of care are encouraged, physicians have the opportunity
to choose alternate treatment plans and every patients condition, or combination
of conditions, drive different clinical decisions.
Healthcare BPM must be flexible. While a workflow may suggest best practices
according to hospital or national guidelines, workflows must offer the ability to
deviate from the suggested action. If a medication is suggested for a patient, along
with the option to place the order, the physician needs to be offered the options to
order an alternate drug, decline placing the order or find more information about
the disease, drug or guideline evidence. Clinical workflows require built-in flexibil-
ity to accommodate human discretion.
Communicating with Team Members
Throughout a workflow process, it may be necessary to give information to or re-
quest decisions and actions from members of the team. In a clinical environment,
it is imperative that the team members receive these messages in a timely man-
ner. How the information is delivered is important. If the only method for com-
municating with team members is on a stationary computer screen in an office or
nurse station, then there is the potential that team members will not log in often
enough to receive new information. Options must exist to support communication
via additional devices, such as PDAs, pagers and phones. These options bring
their own issues, such as security concerns or limited reception in some areas of the
facility. In workflow-building, it is important to identify the appropriate device for
communication for each team member and time of day, sometimes utilizing mul-
tiple communication methods.
When pushing action requests and alerts to team members, care must be taken
to avoid overloading the person with too many messages. If a doctor is being
paged every time a lab result posts, he or she will quickly start to ignore the pages
entirely. It is important to consider the number of interactions that may be re-
quested of a single person across workflows. A balance between optimizing proc-
esses and sending too many alerts to team members should be considered. The
workflow engine can be used to track not only the state of the patient, but also
the state of the messages being sent to the clinical staff. If staff members are re-
ceiving too many alerts, a workflow might consolidate messages or utilize an al-
ternate communication method.
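One simple way such message management could be expressed is sketched below; the per-hour limit and the consolidation policy are assumptions made for the example.

```python
# Sketch of alert management: track messages already sent to each clinician and
# consolidate rather than page repeatedly (limits and channels are invented).
from collections import defaultdict

MAX_PAGES_PER_HOUR = 3
sent_this_hour = defaultdict(list)        # clinician -> messages this hour

def notify(clinician, message):
    if len(sent_this_hour[clinician]) >= MAX_PAGES_PER_HOUR:
        sent_this_hour[clinician].append(message)
        return f"queued for consolidated digest to {clinician}"
    sent_this_hour[clinician].append(message)
    return f"paged {clinician}: {message}"

for i in range(5):
    print(notify("dr-smith", f"Lab result {i} posted"))
```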
CONCLUSION
Healthcare is a complex environment with opportunities for BPM and workflow
technology to improve patient care and operational efficiencies. Healthcare BPM
faces technical, clinical and cultural challenges. Despite these challenges, BPM is
powerful and possible in healthcare. This paper has demonstrated that workflow
can be applied in healthcare environments. Healthcare BPM is a movement and
vision for the future of healthcare.
The architecture presented in this paper describes at a high level the Soarian
product of Siemens Medical Solutions USA, Inc. and the workflow opportunities
and examples discussed are processes currently implemented or in development,
using the Soarian HIS, at hospitals around the globe.
Authenticated Document/
Data Exchange in Healthcare
Dr. Mohammed Shaikh, Image X Inc., USA
INTRODUCTION
Exchange of documents and data in commercial organizations is normally ac-
complished using traditional workflow methodologies. Successful implementation of workflow in these organizations is encouraging agencies that previously did not look at workflow methodologies favorably because the data and documents they exchange are considered confidential, restricted, and for use only by authorized users.
The workflow in these organizations requires that users be authenticated before accessing the document/data, and that their signatures be obtained at each step, due to legal requirements associated with these processes. In addition, retaining the
confidentiality of the document/data based on user authentication is of utmost
concern. Recent advances in digital signature technology and its use in replacing traditional signatures have opened the possibility of creating a successful document/data exchange workflow for authenticated documents and data. Further, this approach could be extended to authenticate each user and their role to meet confidentiality and security requirements. Some of the processes that can be iden-
tified for authenticated document/data exchange are:
Document/data exchange associated with healthcare document requiring
HIPAA compliance.
Judicial transactions like TROs (Temporary Restraining Order) etc.
Financial Disclosure Documents
Documents associated with Federal or State approval i.e. FDA, FAA etc.
Documents associated with sensitive national security matters used by
Local, State, Federal and International government agencies.
In this paper we will provide a brief introduction to digital certificate technology
and its evolution, followed by an outline of why forms-based workflow is critical to automating the workflows involved in most of the situations outlined above. Next we will consider the evolution of electronic filing and the workflow associated with electronic document/data exchange. Finally, we will outline the new frontier that is
taking shape, where identity management using digital certificates can be utilized to authenticate users and their roles to create a paperless workflow while maintaining the privacy and legal requirements that are essential to these processes.
EVOLUTION OF ELECTRONIC SIGNATURE AND DIGITAL AUTHENTICATION:
Some of the key events associated with adoption of Digital Certificate based elec-
tronic signature are listed below:
National Institute of Standards and Technology (NIST) established a fed-
eral digital signature standard (DSS) during the period 1991-94.
Many U.S. States established legal frameworks for digital signatures, most
of them based on Utah's legislation (1995). See Biddle (1996) for a com-
mentary on matters of concern about the Utah model, including privacy
aspects.
On Oct. 1, 2000, the U.S. Electronic Signatures in Global and National
Commerce Act went into effect. The so-called e-signature law allows for
electronic signatures to be as legally binding as handwritten signatures.
In the next paragraphs we will outline the significance of legal precedent associated with signatures and the evolution of digitally authenticated documents.
SIGNATURES AND THE LAW
According to the American Bar Association (ABA), a signature is not part of the substance of a transaction, but rather of its representation or form. A signature serves the following general pur-
poses:
Evidence: Signatures authenticate writing by identifying the signer with
the signed document. A signature is a distinctive mark used by the signer
that makes the writing attributable to the signer.
Approval: In certain contexts defined by law or custom, a signature ex-
presses the signer's approval or authorization of the writing, or the
signer's intention that it has legal effect. A signature on a written docu-
ment can impart a sense of clarity and finality to the transaction and may
lessen the subsequent need to inquire beyond the face of a document.
The formal requirements for legal transactions, including the need for signatures,
vary in different legal systems and with the passage of time. Sometimes it is nec-
essary to use a notary to authenticate the signer's signature on a paper document.
To summarize the basic purposes of signatures outlined above, a signature must
have the following attributes according to ABA:
Signer Authentication: A signature should indicate the signer of the
document, message or record, and should be difficult for another person
to produce without authorization.
Document Authentication: A signature should identify what is signed,
making it impracticable to falsify or alter either the signed matter or the
signature without detection.
Digital signature technology generally surpasses paper technology in all these at-
tributes. To understand why, one must first understand how digital signature
technology works.
HOW DIGITAL SIGNATURE TECHNOLOGY WORKS
Use of digital signatures usually involves two processes, one performed by the
signer and the other by the receiver of the digital signature:
Digital signature creation uses a hash result derived from and unique
to both the signed message and a given private key. For the hash result to
be secure there must be only a negligible possibility that the same digital
signature could be created by the combination of any other message or
private key.
Digital signature verification is the process of checking the digital sig-
nature by reference to the original message and a given public key,
thereby determining whether the digital signature was created for that
same message using the private key that corresponds to the referenced
public key.
To sign a document or any other item of information, the signer first delimits pre-
cisely the borders of what is to be signed. The delimited information to be signed
is termed the message in these Guidelines. Then a hash function in the signer's
software computes a hash result unique (for all practical purposes) to the mes-
sage. The signer's software then transforms the hash result into a digital signa-
ture using the signer's private key. The resulting digital signature is thus unique
to both the message and the private key.
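For illustration, the two processes can be exercised with a few lines of code. The sketch below uses the third-party Python cryptography package and an Ed25519 key pair chosen purely for brevity; the guidelines discussed here do not prescribe any particular algorithm or library.

```python
# Illustrative sketch of digital signature creation and verification using the
# third-party "cryptography" package; algorithm choice is arbitrary.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

message = b"Advance Health Care Directive for patient 123"   # the delimited message

private_key = Ed25519PrivateKey.generate()      # held only by the signer
signature = private_key.sign(message)           # hash-and-sign in one call

public_key = private_key.public_key()           # distributed, e.g. via a certificate
try:
    public_key.verify(signature, message)       # raises if message or signature changed
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```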
PUBLIC KEY CERTIFICATES
To verify a digital signature, the verifier must have access to the signer's public
key and have assurance that it corresponds to the signer's private key. However,
a public and private key pair has no intrinsic association with any person; it is
simply a pair of numbers. Some convincing strategy is necessary to reliably asso-
ciate a particular person or entity to the key pair.
In a transaction involving only two parties, each party can simply communicate
(by a relatively secure "out-of-band" channel such as a courier or a secure voice
telephone) the public key of the key pair each party will use. Such an identifica-
tion strategy is no small task, especially when the parties are geographically dis-
tant from each other, normally conduct communication over a convenient but
insecure channel such as the Internet, are not natural persons but rather corpo-
rations or similar artificial entities, and act through agents whose authority must
be ascertained. As electronic commerce increasingly moves from a bilateral set-
ting to the many-on-many architecture of the World Wide Web on the Internet,
where significant transactions will occur among strangers who have no prior con-
tractual relationship and will never deal with each other again, the problem of
authentication/nonrepudiation becomes not merely one of efficiency, but also of
reliability. An open system of communication such as the Internet needs a system
of identity authentication to handle this scenario.
CHALLENGES AND OPPORTUNITIES
The prospect of fully implementing digital signatures in general commerce pre-
sents both benefits and costs. The costs consist mainly of:
Institutional overhead: The cost of establishing and utilizing certifica-
tion authorities, repositories, and other important services, as well as as-
suring quality in the performance of their functions.
Subscriber and Relying Party Costs: A digital signer will require soft-
ware, and will probably have to pay a certification authority some price to
issue a certificate.
Hardware to secure the subscriber's private key: There may be costs associated with securing the digital certificate on the part of the signer.
Digital certificate verification cost: Persons relying on digital signa-
tures will incur expenses for verification software and perhaps for access
to certificates and certificate revocation lists (CRL) in a repository.
On the plus side, the principal advantage to be gained is more reliable authentica-
tion of messages. Digital signatures, if properly implemented and utilized, offer
promising solutions to the problems of:
Identity theft: The possibility of identity theft is eliminated except in case
of loss of digital certificate;
Imposters, by minimizing the risk of dealing with imposters or persons
who attempt to escape responsibility by claiming to have been imperson-
ated;
Message integrity, by minimizing the risk of undetected message tam-
pering and forgery, and of false claims that a message was altered after it
was sent;
Formal legal requirements, by strengthening the view that legal re-
quirements of form, such as writing, signature, and an original document,
are satisfied, since digital signatures are functionally on a par with, or su-
perior to paper forms; and
Open systems, by retaining a high degree of information security, even
for information sent over open, insecure, but inexpensive and widely used
channels. The most widely-used standard for digital certificates is X.509.
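To illustrate the role of the certificate, the sketch below reads the identity and validity period bound to a public key from an X.509 certificate, again using the Python cryptography package; the file name is hypothetical.

```python
# Sketch: reading the identity bound to a key pair from an X.509 certificate.
from cryptography import x509

with open("signer_certificate.pem", "rb") as f:       # hypothetical certificate file
    cert = x509.load_pem_x509_certificate(f.read())

print(cert.subject.rfc4514_string())     # who the certification authority says holds the key
print(cert.issuer.rfc4514_string())      # the certification authority itself
print(cert.not_valid_after)              # end of the validity period
```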
FORMS RUN ORGANIZATIONS & ELECTRONIC FORMS MAKE IT SIMPLE AND PAPERLESS
From courts to healthcare, from manufacturing to financial institutions, everyone
uses forms. But the sheer mass of paper generated by excess printing and the
lack of error protection inherent in a paper-based form workflow makes it costly
and impractical.
Electronic forms technologies like XForms and InfoPath were created to solve these problems and eliminate the cost and inefficiencies associated with paper forms.
Using paper forms invites disorder, filing mistakes, damage, loss, waste, and
other complications. To solve these problems, organizations could format their
documents into HTML for publication on the web, but this is a costly and time-
consuming process. What's more, the user remains unable to submit documents
directly to the recipient from the computer screen, but instead can only print
them out to mail or fax, resorting again to paper, and all of its attendant costs.
Most organizations use forms to collect data from customers, employees, vendors,
and contractors. Forms contain information that need to be processed, secured,
and acted upon for a variety of purposes. To be effective, forms-based processes
should be flexible to meet an organization's needs. They should be efficient in get-
ting input and approval from everyone involved, and equipped to allow collabora-
tion among several people or departments. Approval and validation of forms by
multiple authorities is an important part of workflow used by number of organiza-
tions. The data exchange needed between the forms and line of business applica-
tions has resulted in development of XML schemas that have become standard
for different industries. In following paragraphs we have outlined few of the stan-
dards that are of interest in form based workflow in legal, healthcare and busi-
ness environments.
THE GLOBAL JUSTICE XML DATA MODEL (GLOBAL JXDM):
The Global Justice XML Data Model (Global JXDM) is intended to be a data refer-
ence model for the exchange of information within the justice and public safety
communities. The Global JXDM is a product of the Global Justice Information
Sharing Initiative's (Global) Infrastructure and Standards Working Group (ISWG).
It was developed by the Global ISWG's XML Structure Task Force (XSTF).
HEALTHCARE XML STANDARD DEVELOPMENT:
The HL7 protocol, developed by the Health Level Seven organization, consists of a standardized grammar and vocabulary so that clinical data can be shared amongst all healthcare systems and easily understood by all. By using
the HL7 messaging protocol as a standard, all systems following the HL7 specifi-
cations are able to communicate easily with one another, without the need for
information conversion.
BUSINESS XML STANDARD DEVELOPMENT:
The Electronic Business (eBusiness) eXtensible Markup Language (XML) [ebXML]
set of specifications enables electronic trading relationships between business partners and integrates new technologies to:
Communicate data in common terms (Core Components Technical Speci-
fication [CCTS] v2.0.1)
Register and provide eBusiness artifacts and services (ebXML Registry
Services [ebRS v3.0] and Registry Information Model [ebRIM v3.0])
Configure technical contract between business partners (Collaboration
Protocol Profile and Agreements [CPP/CPA v2.0])
Provide secure and reliable transport (ebXML Messaging Services [ebMS])
Enable business processes (ebXML Business Process Specification
Schema, [ebBP v2.0.3]).
E-FILING ALLOWS ELECTRONIC DOCUMENT AND DATA EXCHANGE USING XML:
Multiple government agencies are implementing electronic filing and electronic
recordation of documents as a means of document/data exchange between
courts, attorneys and other departments, e.g. the Child Support department, the County Recorder's office, etc. E-filing allows organizations to create a workflow across multiple departments over a WAN. We have outlined case studies where true digital authentication is used to create a document/data exchange between various county departments to accomplish a TRO (Temporary Restraining Order), to create an Advanced Healthcare Directive (AHCD) that the patient and notary digitally sign and the physician can view, and to create an Electronic Batch Record (EBR) automation process in which each participant is digitally authenticated.
E-filing is the complete automation of the workflow needed between various agencies, e.g. Sheriff, D.A., DCSS, probation agencies, juvenile agencies, etc., as well as users, e.g. attorneys, pro se litigants, process servers, etc., in a judicial environment, or between physician, patient, pharmacy, laboratory, insurance companies, etc. This automation uses a multitude of technologies and standards that allow these diverse entities to exchange documents and data electronically. The complexities in this automation arise out of security concerns, data compatibility issues and legal concerns. These case studies outline the base modules needed to accomplish this automation, describe the need for standards, and show what is needed to make acceptance of these standards easy for future implementations of the e-filing process. The process can be divided into the following modules:
Document assembly and workflow automation at the filing entity to gen-
erate the document and data envelope needed by receiving agency.
Document/data transformation to receiving agency in a standard format.
Acceptance/rejection module.
Electronic return receipt generation module.
Transfer module for transferring data to the line-of-business application.
Transfer module for transferring the document to a document repository.
WORKFLOW INVOLVING AUTHENTICATED DOCUMENT/DATA EXCHANGE:
A number of organizations, as outlined below, are forced to rely on paper documents to create processes that will withstand the challenges created by our legal system and conform to rules, such as: the recording process involved in the transfer of real estate; court filings used to obtain judgments via court proceedings; and the recordation of wills and testaments. Generally, authorities responsible for the legal
validation of these processes have regarded electronic documents as unreliable, with the result that paper documents are the only legally acceptable form. Other instances needing paper documents with wet signatures involve legal, healthcare and other types of transactions requiring authentication of the parties involved and providing confidentiality and privacy for information that, by law, cannot be released to unauthorized individuals. These types of transactions must be accomplished by secure transfer of documents between parties and require that unauthorized personnel cannot access the document during the exchange process between parties that are generally located at different locations. Some of the agencies that are involved in these kinds of transactions are:
Judicial agencies such as Courts, Sheriff, District Attorneys etc.
Healthcare agencies like Hospitals, Clinics, Laboratories, Pharmacies, In-
surance agencies.
Parties involved in criminal proceedings involving minors or child support
matters.
Financial transactions, e.g. sensitive financial information needed by the SEC.
Drug and medical appliance certification and approval applications involv-
ing FDA.
Educational organizations.
National security agencies that deal with sensitive data related to national
security and government affairs.
Some of these agencies have rules that have been established over decades and cannot be modified without an exhaustive analysis of the implications of these changes. Some of these processes cannot be modified without changes in
laws. These considerations make acceptance of digitally authenticated documents
by these authorities difficult.
At the same time, reliance on authenticated identities is becoming an increasingly crucial requirement for the introduction of Internet-based solutions, while state-based Medicaid administration and a morass of local regulations and rules render truly standardized products unworkable. Standardized solutions will eliminate
interoperability costs and barriers to rapid customer adoption and
implementation of products that require identity management. The more quickly
these solutions can be implemented, the faster these organizations will realize
cost and efficiency returns. Failure to solve the identity problem globally will leave
only one option: in-house administration of proprietary identities, an approach
with significant inherent problems.
Today these organizations face unnecessary cost and complexity. Defining, administering, and maintaining an identity scheme, even an ID number plus password, is expensive and yields no competitive advantage. Every entity-specific identification process imposes costs and generates customer service issues. In spite of all the difficulties associated with creating an authenticated document workflow, a number of agencies have created pilot or working prototypes to demonstrate the viability of digital authentication and workflow. The case studies outlined below highlight the next frontier that is evolving in creating authenticated document
workflow.
SEALED AND CERTIFIED DOCUMENT WORKFLOW IN COURTS:
Most court documents are considered to be in the public domain and freely available for public viewing, but when a document is sealed it cannot be displayed to the general public. Documents are sealed by a judge based on special considerations and can only be unsealed by a ruling of that judge or another
judge. Anyone who's been through the court system, whether for domestic violence, elder abuse or child-support issues, knows how burdensome it can be. There are arraignments, bail hearings, trial and court dates, and mounds of paperwork. The amount of work that goes into every aspect of anyone's legal travails is overwhelming, and it's the organizations behind the courts that, in some ways, truly feel the weight of the work. Victims are overwhelmed by the number of pages of forms, many involving repetitive questions. Victims' advocates spend two to three hours filling out forms, and victims often have to wait hours for
an available advocate. It takes another four to six hours from the time a judge
signs the Order of Protection until the sheriff receives the service paperwork. Up
to five agencies are involved in each procedure, all of which are in different
locations. Therefore, manual paper delivery uses up valuable time and sometimes
forces the victim to live with abuse rather than approach the court authorities.
DIGITAL AUTHENTICATION BASED PROCESS TO OBTAIN TRO (TEMPORARY RESTRAINING ORDER).
A battered person can get legal protection from the battering party by filing for a restraining order with the court, which can result in immediate protection in the form of a TRO. Evidence of the TRO allows the sheriff or a police officer to provide the necessary protection to the battered party. But the major problems are the paper flow involved in obtaining a TRO and the distances between these agencies, which make it difficult for the battered party to approach these agencies without the knowledge of the battering party.
The paper form-based flow can be altered using digital forms that can exchange
data between the filer, judge and sheriff. In addition, digital authentication of the filing attorney, the judge and the sheriff makes the process compliant with security and privacy requirements. Furthermore, if the victim moves from the existing place of residence to a new place, the sheriff can access these documents without delay by authenticating himself. In the following paragraph we describe the workflow
associated with this process.
When a judge receives a document digitally signed by an attorney, to verify the signature on the document, the judge's software first uses the CA's (certificate authority) public key to check the signature on the attorney's certificate. Successful decryption of the certificate proves that the CA created it. After the certificate is decrypted, the judge's software can check that the attorney is in good standing with the certificate authority and that the certificate information concerning the attorney's identity has not been altered. (Although these steps may sound complicated, they are all handled behind the scenes by the judge's software.) The judge then signs his order digitally and a copy is electronically delivered to the sheriff and court clerk in minutes. The sheriff can digitally authenticate the judge's certificate and can make the order available for viewing to other parties, e.g. sheriffs in another county, if they provide proper credentials. The digitally authenticated
document provides:
Proof of identity.
Prevention from unauthorized use.
Intuitive UI for end users (encryption, decryption, and digital signatures).
In the event that information is intercepted, encryption ensures privacy that pre-
vents third parties from reading and/or using the information.
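As an illustration of the two verification steps described above (checking the CA's signature on the attorney's certificate, then checking the attorney's signature on the filed document), the following minimal sketch uses the Python cryptography library. It assumes RSA keys, PEM-encoded certificates and hypothetical file names, and it omits the validity-date and CRL revocation checks mentioned earlier; it is not a description of any particular court system's software.

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

# Hypothetical PEM-encoded certificates; verify() raises InvalidSignature if a check fails.
with open("ca_cert.pem", "rb") as f:
    ca_cert = x509.load_pem_x509_certificate(f.read())
with open("attorney_cert.pem", "rb") as f:
    attorney_cert = x509.load_pem_x509_certificate(f.read())

# Step 1: use the CA's public key to check the CA's signature on the attorney's certificate.
ca_cert.public_key().verify(
    attorney_cert.signature,
    attorney_cert.tbs_certificate_bytes,
    padding.PKCS1v15(),
    attorney_cert.signature_hash_algorithm,
)

# Step 2: use the attorney's public key to check the signature on the filed document
# (assuming the document was signed with SHA-256 and PKCS#1 v1.5 padding).
with open("tro_order.pdf", "rb") as f:
    document = f.read()
with open("tro_order.sig", "rb") as f:
    signature = f.read()
attorney_cert.public_key().verify(
    signature,
    document,
    padding.PKCS1v15(),
    hashes.SHA256(),
)
# A production system would additionally check validity dates and the CA's revocation list (CRL).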
How it Works
[Figure: Digital authentication process]
[Figure: Benefits of the digital authentication process]
ADVANCED HEALTHCARE DIRECTIVE WORKFLOW:
Technological advances in medicine have made it possible to prolong life in pa-
tients with no hope of recovery. The physician is faced with deciding which measures used to keep patients alive are extraordinary in individual situations. Advance Medical Directives are documents intended to provide guidance to medical professionals if one is incapacitated and cannot make a medical decision. Advance directives affirm the right of incompetent patients to refuse unnecessarily burdensome treatment, but at the same time emphasize the necessity for written evidence documenting their wishes. An AHCD empowers an agent, who has power of attorney, to make end-of-life decisions and give instructions about one's health care wishes, for instance if one is in a chronic vegetative state. Most of us procrastinate in creating an AHCD due to the difficulty of obtaining proper advice, help and documents. Even in those cases where a person has signed an AHCD, it may be difficult for him to have his wishes enacted due to the unavailability of
signed documents when they are needed.
Governor Schwarzenegger signed AB 2805 on September 28, 2006, a measure
authored by Assemblyman Sam Blakeslee. AB 2805 permits AHCDs to be digi-
tally signed and notarized using the California digital signature standards which
were established in law in 1995. The measure protects current requirements for
AHCDs to be signed and either notarized or witnessed by two people. But, it also
allows patients and notaries to use digital signatures and requires the use of a
digital certificate for that signature.
"An Advanced Healthcare Directive could have been instrumental in alleviating confusion around a case such as that of Terri Schiavo," said Blakeslee. "However, making end-of-life or life-sustaining treatment decisions is just the first step. AHCDs only work if people proactively record these decisions with their medical provider."
ADVANCED HEALTH CARE DIRECTIVE (AHCD):
CMA (California Medical Association), Medepass and Image X have teamed to cre-
ate www.healthcarewishes.com to allow a person to digitally sign an Advanced
Healthcare Directive and also provide digital notarization. Further, a physician
with valid authentication can retrieve the AHCD on the web in compliance with
AB 2805 to act in accordance with the patient's wishes. The electronically stored
Advance Healthcare Directives are available to healthcare providers at any time
via secure Internet or facsimile.
How it works
[Figure: Electronic flow of the AHCD]
[Figure: AHCD with date, time stamp and digital signature]
CONCLUSION
One can summarize that this is just the start of the use of digital authentication to replace onerous paper-based processes. As more and more agencies understand the advantages of digital authentication and approve these processes by passing the necessary rules, we hope to see better security and privacy as well as more efficient processes and conformance with the law.
BIBLIOGRAPHY:
Digital Signature Guidelines, published by the American Bar Association Section of Science and Technology, Information Security Committee, product code 5450012.
SearchSecurity.com Definitions (powered by WhatIs.com), July 2006.
Legal XML Proposed Standard: XML Standards Development Project, XML Court Document 1.1 Draft Standard, E-filing report, published by Glasser LegalWorks, Little Falls, N.J.
Global Justice XML Data Model, U.S. Department of Justice, Office of Justice Programs, http://it.ojp.gov/jxdm/3.0.3/index
Section 2
Standards and Technology
Quality Metrics for
Business Process Models
Irene Vanderfeesten¹, Jorge Cardoso², Jan Mendling³, Hajo A. Reijers¹, Wil van der Aalst¹
¹ Technische Universiteit Eindhoven, The Netherlands; ² University of Madeira, Portugal; ³ Vienna University of Economics and Business, Austria.
SUMMARY
In the area of software engineering, quality metrics have shown their importance for
good programming practices and software designs. A design developed by the help
of these metrics (e.g. coupling, cohesion, complexity, modularity and size) as guid-
ing principals is likely to be less error-prone, easy to understand, maintain, and
manage, and is more efficient. Several researchers already identified similarities
between software programs and business process designs and recognized the po-
tential of quality metrics in business process management (Cardoso, Mendling,
Neuman & Reijers, 2006; Gruhn & Laue, 2006; Latva-Koivisto, 2001). This chapter
elaborates on the importance of quality metrics for business process modeling. It
presents a classification and an overview of current business process metrics and it
gives an example of the implementation of these metrics using the ProM tool. ProM
is an analysis tool, freely available, that can be used to study process models im-
plemented in more than eight languages.
INTRODUCTION
Key in many instances of innovation is the transfer of information and understand-
ing developed in one discipline to another (Kostoff, 1999). A prime example is
workflow management itself, a technology based on the importation of process
models from manufacturing operations into administrative work. In this chapter we
embark on further opportunities for knowledge transfer to the field of process mod-
eling and workflow management, in particular from software engineering.
In the mid-1960s software engineers started to use metrics to characterize the
properties of their code. Simple count metrics were designed to be applied to pro-
grams written in languages such as C++, Java, FORTRAN, etc. Several of these
metrics provided a good analysis mechanism to assess the quality of the software
program design. Since there is a strong analogy between software programs and
business processes, as previously argued in (Reijers & Vanderfeesten, 2004; Guce-
glioglu & Demiros, 2005), we believe that software metrics, such as coupling, cohe-
sion, and complexity, can be revised and adapted to analyze and study a business
process' characteristics.
A business process model, regardless of whether it is modeled in e.g. BPEL, EPC,
BPMN or Petri Nets, exhibits many similarities with traditional software programs.
A software program is usually partitioned into modules or functions (i.e. activities),
which take in a group of inputs and provide some output. Similar to this composi-
tional structure, a business process model consists of activities, each of which con-
tains smaller steps (operations) on elementary data elements (see Table 1). More-
over, just like the interactions between modules and functions in a software pro-
gram are precisely specified, the order of activity execution in a process model is
predefined using logic operators such as sequence, splits and joins.
In this chapter we elaborate on the transfer and adaptation of quality metrics from
the software engineering domain to business processes. First, we introduce a gen-
eral outline on software engineering metrics. Next, an overview is given of the cur-
rent status in business process metrics, adopting a widely used classification from
software engineering. Finally, we elaborate on some practical applications of these
business process metrics and present our conclusions and a look into the future of
business process metrics.
Table 1: Similarities between software programs and business processes
Software program Business process
Module/Class Activity
Method/Function Operation
Variable/Constant Data element
METRICS IN THE SOFTWARE ENGINEERING DOMAIN
In the area of software engineering a wide variety of software quality metrics has
been developed. The main purpose of software quality metrics is to obtain program
designs that are better structured. Some of the most important advantages of a
structured design are, as pointed out in (Conte, Dunsmore & Shen, 1986), that (i)
the overall program logic is easier to understand for both the programmers and the
users and (ii) the identification of the modules is easier, since different functions are
performed by different modules, which makes the maintenance of the software pro-
gram easier. According to (Conte, Dunsmore & Shen, 1986; Shepperd 1993; Troy &
Zweben 1981) the quality of a design is related to five design principles:
Coupling: Coupling is measured by the number of interconnections among modules. Coupling is a measure for the strength of association established by the interconnections from one module of a design to another. The degree of coupling depends on how complicated the connections are and on the type of connections. It is hypothesized that programs with a high coupling will contain more errors than programs with lower coupling.
Cohesion: Cohesion is a measure of the relationships of the elements within a module. It is also called module strength. It is hypothesized that programs with low cohesion will contain more errors than programs with higher cohesion.
Complexity: A design should be as simple as possible. Design complexity grows as the number of control constructs grows, and also as the size, in number of modules, grows. The hypothesis is that the higher the design complexity, the more errors the design will contain.
Modularity: The degree of modularization affects the quality of a design. Over-modularization is as undesirable as under-modularization. The hypothesis is that low modularity generally relates to more errors than higher modularity.
Size: A design that exhibits large modules or deep nesting is considered undesirable. It is hypothesized that programs of large size will contain more errors than smaller programs.
In literature coupling and cohesion are generally considered to be the most impor-
tant metrics for software quality, although researchers do not agree on their relative
importance. In (Shepperd 1993; Troy & Zweben 1981), results of analyses are pre-
sented that indicate that coupling is the most influential of the design principles
under consideration. However, in (Myers, 1978) cohesion and coupling are consid-
ered as equally important. Also, complexity and size are seen as a good quality met-
ric for software program designs (Troy & Zweben, 1981).
In addition, various researchers carried out studies to gather empirical evidence
that quality metrics do indeed improve the quality of a software design. Bieman and
Kang, in particular, have shown examples of how cohesion metrics can be used to
restructure a software design (Bieman & Kang, 1998; Kang & Bieman, 1996; Kang
& Bieman 1999). Also, in (Selby & Basili, 1991) evidence is presented that low cou-
pling and high strength (cohesion) are desirable. By calculating coupling/strength
ratios of a number of routines in a software library tool it was found that routines
with low coupling/strength ratio had significantly more errors than routines with
high coupling/strength ratio. In (Card, Church & Agresti, 1986), a number of For-
tran modules from a National Aeronautics and Space Administration project were
examined. Fifty percent of the high strength (high cohesion) modules were fault
free, whereas only 18 percent of low strength modules were fault free. No relation-
ship was observed between fault rate and coupling in this study. Finally, (Shen, Yu,
Thebaut & Paulsen, 1985) have discovered that, based on their analysis of three
software program products and their error histories, simple metrics such as the
amount of data (size) and the structural complexity of programs may be useful in
identifying those modules most likely to contain errors.
QUALITY METRICS IN THE WORKFLOW DOMAIN
Because of the similarities between software programs and workflow processes,
explained in the introduction, the application of similar quality metrics to the work-
flow field is worth investigating. We have conducted a literature review on business
process metrics and found that, despite the vast literature on software engineering
metrics, there is not much research on business process metrics available yet. Al-
though some researchers suggest using software metrics to evaluate business proc-
ess designs (Baresi et al, 1999), the number of publications on concrete metrics and
applications in the business process domain is still small and only of a very recent
date. In this section, the existing "state-of-the-art" in business process metrics is
summarized using the same classification as in software engineering.
Coupling
Coupling measures the number of interconnections among the modules of the
model. As such, it is highly related to degree and density metrics in (social) network
analysis (see e.g. Brandes and Erlebach, 2005). The application of these measure-
ments is straightforward if the process model is available in a graph-based nota-
tion. The average degree, also called coefficient of connectivity in (Latva-Koivisto,
2001), refers to the average number of connections that a node has with other
nodes of the process graph. In contrast to that, the density metric relates the num-
ber of available connections to the number of maximum connections for the given
number of nodes. The density metric was used in a survey as a predictor for errors
in business process models in (Mendling, 2006) with some mixed results. While
there was actually a connection between density and errors, the explanatory power
of this metric was found to be limited. An explanation for that might be that density
is difficult to compare for models of different size since the maximum number of
connections grows quadratically. It appears that the average degree of nodes in a busi-
ness process could be better suited to serve as a quality metric.
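To make these two graph-oriented measures concrete, the following minimal sketch computes the average degree and the density for a process model given simply as node and arc counts. The variable names and the example numbers are illustrative; the exact definitions used by Latva-Koivisto (2001) and Mendling (2006) may differ in detail.

def average_degree(num_nodes: int, num_arcs: int) -> float:
    # Each arc contributes to the degree of two nodes.
    return 2.0 * num_arcs / num_nodes

def density(num_nodes: int, num_arcs: int) -> float:
    # Arcs present relative to the maximum possible arcs in a directed graph.
    return num_arcs / float(num_nodes * (num_nodes - 1))

# Illustrative process graph with 20 nodes and 22 arcs:
print(average_degree(20, 22))  # 2.2
print(density(20, 22))         # approx. 0.058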
Moreover, Reijers and Vanderfeesten (Reijers & Vanderfeesten, 2004) also developed
a similar coupling metric counting the overlap of data elements for each pair of ac-
tivities using a static description of the product represented by a Product Data
Model (PDM) (van der Aalst, 1999; Reijers, 2003; Reijers, Limam Mansar, & van der
Aalst, 2003). Two activities are 'coupled' if they contain one or more common data
elements. To calculate the coupling value for a business process the activities are
selected pairwise and the number of 'coupled' pairs is counted. Finally, the mean is
determined based on the total number of activities. The outcome always lies be-
tween 0 and 1. This data oriented coupling metric is complemented with a cohesion
metric, which is described in the next section.
However, none of these coupling metrics yet deals with how complicated the connections are, as suggested in the definition of coupling. A weighted coupling met-
ric, with different weights for the XOR, OR, and AND connectors, is part of our cur-
rent research.
Cohesion
Cohesion measures the coherence within the parts of the model. So far, there is
only one paper on a cohesion metric for business processes available. Reijers and
Vanderfeesten (Reijers & Vanderfeesten, 2004) developed a cohesion metric for
workflow processes which looks at the coherence within the activities of the process
model. Similar to their coupling metric this cohesion metric also focuses on the in-
formation processing in the process and takes a data oriented view. For each activ-
ity in the process model the total cohesion is calculated by multiplying the informa-
tion cohesion and the relation cohesion of the activity. Finally, a cohesion value for
the whole process is determined by taking the mean of all activity cohesion values
(i.e. adding up all cohesion values and dividing it by the number of activities). The
value for this cohesion metric always lies between 0 and 1. This data oriented cohe-
sion metric is complemented with a coupling metric, which is described in the pre-
vious section. The combination of these two metrics, as proposed by (Selby & Basili,
1991), gives a coupling-cohesion ratio which supports the business process de-
signer to select the best (low coupling, high cohesion) design among several alterna-
tives (Reijers & Vanderfeesten, 2004).
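A minimal sketch of the data-oriented coupling idea is given below: each activity is represented by the set of data elements it uses, and the metric is computed as the fraction of activity pairs that share at least one data element. This is a simplification for illustration with made-up data; the precise coupling and cohesion formulas are given in Reijers & Vanderfeesten (2004).

from itertools import combinations

def coupling(activities: dict) -> float:
    # activities maps an activity name to the set of data elements it uses.
    pairs = list(combinations(activities, 2))
    if not pairs:
        return 0.0
    coupled = sum(1 for a, b in pairs if activities[a] & activities[b])
    return coupled / len(pairs)

# Hypothetical design: three activities defined on a small set of data elements.
design = {
    "A": {"d1", "d2", "d3"},
    "B": {"d3", "d4"},
    "C": {"d5", "d6"},
}
print(coupling(design))  # only the pair (A, B) shares a data element: 1/3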
Complexity
Complexity measures the simplicity and understandability of a design. In this area most of the research on business process metrics has been done (Cardoso, Mendling, Neumann & Reijers, 2006; Gruhn & Laue, 2006; Latva-Koivisto, 2001). For instance, both (Gruhn & Laue, 2006) and (Cardoso, Mendling, Neumann & Reijers, 2006) consider the adaptation of McCabe's cyclomatic number as a complexity metric for business processes. This complexity metric directly measures the number of linearly independent paths through a program's source code. In practice, the industry interpretation of McCabe's cyclomatic complexity thresholds is the following (Frappier, Matwin, & Mili, 1994): from 1 to 10, the program is simple; from 11 to 20, it is slightly complex; from 21 to 50, it is complex; and above 50 it is untestable.
In (Cardoso, 2005a) the Control-Flow Complexity (CFC) metric is defined, which is
also derived from software engineering. The CFC metric evaluates the complexity
introduced in a process by the presence of XOR-split, OR-split, and AND-split con-
structs. For XOR-splits, the control-flow complexity is simply the fan-out of the split, i.e. CFC_XOR-split(a) = fan-out(a). For OR-splits, the control-flow complexity is 2^n - 1, where n is the fan-out of the split, i.e. CFC_OR-split(a) = 2^fan-out(a) - 1. For an AND-split, the complexity is simply 1, i.e. CFC_AND-split(a) = 1. Mathematically, the control-flow complexity metric is additive. Thus, it is very easy to calculate the complexity of a process by simply adding the CFC of all split constructs. The greater the value of the
CFC, the greater the overall architectural complexity of a process. This metric was
evaluated in terms of Weyuker's properties to guarantee that it qualifies as a good
and comprehensive one (Cardoso, 2006). To test the validity of the metric, an ex-
periment has been carried out for empirical validation (Cardoso, 2005b). It was
found that the CFC metric is highly correlated with the control-flow complexity of
processes. This metric can, therefore, be used by business process analysts and
process designers to analyze the complexity of processes and, if possible, develop
simpler processes.
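Because the metric is additive, it is straightforward to compute. The sketch below, with an illustrative list of splits, simply applies the three rules stated above; it is not ProM code.

def control_flow_complexity(splits: list) -> int:
    # splits is a list of (split_type, fan_out) pairs, one entry per split in the model.
    total = 0
    for split_type, fan_out in splits:
        if split_type == "XOR":
            total += fan_out           # one possible state per outgoing branch
        elif split_type == "OR":
            total += 2 ** fan_out - 1  # any non-empty subset of branches may be taken
        elif split_type == "AND":
            total += 1                 # all branches are always taken
        else:
            raise ValueError(f"unknown split type: {split_type}")
    return total

# Example: an XOR-split with 3 branches, an OR-split with 2 branches and an AND-split.
print(control_flow_complexity([("XOR", 3), ("OR", 2), ("AND", 4)]))  # 3 + 3 + 1 = 7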
Other researchers, for instance (Latva-Koivisto, 2001), also propose graph complex-
ity metrics, such as the Coefficient of Network Complexity (CNC) or the Complexity
Index (CI), to evaluate business processes. In general, Cardoso et al (Cardoso,
Mendling, Neumann & Reijers, 2006) have identified three different types of busi-
ness process complexity: (i) computational complexity, (ii) psychological complexity,
and (iii) representational complexity.
Modularity
Modularity measures the degree to which a design is split up into several modules. Our literature review has not provided any business process metric that measures the modularity of a business process design. This is no surprise, given that activities are most often treated as black boxes in business process modeling.
Size
Size simply measures how big a model is. The size of a business process model can
be determined using a measure similar to the number of Lines of Code (LOC) from
software engineering metrics. The LOC metric in software engineering has been
used for years with a significant success rate (Jones 1986). Cardoso et al., Gruhn &
Laue and Latva-Koivisto (Cardoso, Mendling, Neumann & Reijers, 2006; Gruhn &
Laue, 2006; Latva-Koivisto, 2001) all propose to count the number of activities to
establish a measure for size.
While this size metric is very simple, it is an important complement to other forms of process analysis. The control-flow complexity of a process can be very low while its activity complexity is very high: a sequential process with a thousand activities, for instance, has a control-flow complexity of 0, whereas its activity complexity is 1,000.
From the "state-of-the-art" in business process metrics, we conclude that this field
of research is just at its start and that there is a lot of potential for further develop-
ment of business process metrics. This classification, which was adopted from the
software engineering field, is not yet very precise. For instance, Mendling uses a
coupling metric as a means to calculate complexity (Mendling, 2006) and Latva-
Koivisto, Gruhn & Laue, and Cardoso et al. also use size as a measure for complex-
ity (Cardoso, Mendling, Neumann & Reijers, 2006; Gruhn & Laue, 2006; Latva-
Koivisto, 2001). Perhaps, this classification of business process metrics should be
revised in the future when this area is more mature.
Moreover, we observe that the values for each metric do not yet have a clear mean-
ing, e.g. when the value for coupling for a certain business process model is 0.512
we do not yet know just from the number whether this is high or low, or good or
bad. According to (Cardoso, 2005a) it can take several years and a lot of empirical
research before such a number really makes sense and quantifies the design in a
proper way. Despite this, business process metric analysis in the current situation
still gives the designer some insights and guidance on the quality of the design.
Moreover, we believe in the potential of these metrics and their importance for
business process design in the future.
APPLICATION
Besides the theoretical overview of business process metrics which was provided in
the previous sections, we would also like to give some insight into the practical appli-
cation of these metrics so far. Because this area emerged only recently, there are
only a few applications available, while a lot of new research is ongoing at the mo-
ment of writing this chapter.
The practical applications that we present here mainly have two directions. First of
all, we look at the capabilities of a set of metrics for predicting errors (i.e. we investi-
gate whether there is a relationship between the value of the metrics and the pres-
ence of errors in the business process model). Secondly, we present the early im-
plementation of a tool that supports the design of business process models, guided
by these metrics.
Prediction of error probability based on metrics
Among our hypotheses on the use of business process metrics we state that business process models which are designed using these metrics contain fewer errors and are easier to understand and maintain. A first step towards the empirical validation of this hypothesis is made in a quantitative analysis about
the connection between simple metrics and error probability in the SAP reference
model (Mendling et al, 2006a; Mendling et al, 2006b). The SAP reference model is a
collection of EPC business process models that was meant to be used as a blue-
print for rollout projects of SAP's ERP system (Keller & Teufel, 1998). It reflects Ver-
sion 4.6 of SAP R/3 which was marketed in 2000. The extensive database of this
reference model contains almost 10,000 sub-models, about 600 of them are EPC
business process models.
The survey reported in Mendling et al (2006a) includes two parts: the verification of
relaxed soundness (which is a minimal correctness criterion for business process
models) and the prediction of error probability based on statistic methods. The veri-
fication of relaxed soundness revealed that about 6 % of the EPC models (34 of 604)
contained errors such as deadlocks. This result on its own emphasizes the
need for verification tools in business process modeling projects.
In the second part, the authors investigate the question of whether errors appear by
chance in a process model, or if there is some way to use business process metrics
to predict the error probability. The hypothesis behind this research is that large
and complex models are more likely to contain errors, basically because the human
modeler is more likely to lose the overview of all interrelations represented in the
model. The authors use a set of simple metrics related to size of the models as input
to a logistic regression model, i.e. a statistical model to predict occurrence or non-
occurrence of an event. The event in this context is whether the process model has
an error or not. The results show that these simple metrics are indeed suitable to
predict the error probability. In particular, it appears that a higher number of join-
connectors is most strongly connected with an increase in error probability.
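The following sketch illustrates the idea with toy data (not the actual SAP reference model figures): simple count metrics per model are used as predictors in a logistic regression, and the fitted model returns an estimated error probability for a new process model. It assumes scikit-learn is available; the column meanings and numbers are made up for illustration only.

import numpy as np
from sklearn.linear_model import LogisticRegression

# One row per process model: [number of functions, number of events, number of join-connectors].
X = np.array([
    [12, 14, 1],
    [45, 50, 6],
    [30, 33, 3],
    [80, 85, 9],
    [10, 11, 0],
    [60, 66, 7],
])
# 1 = the model contained an error (e.g. a deadlock), 0 = the model was relaxed sound.
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.coef_)                          # which metrics push the error probability up
print(model.predict_proba([[50, 55, 5]]))   # estimated error probability for a new model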
This survey illustrates one promising application of business process metrics. Still,
there is further research needed to identify more elaborate and sophisticated met-
rics. Moreover, there is a need for further empirical investigation in order to estab-
lish an understanding of when a threshold value of a certain metric indicates bad
design in terms of maintainability or likely error proneness.
The ProM tool
In recent years ProM has emerged as a broad and powerful process analysis tool,
supporting all kinds of analysis related to business processes (van Dongen et al,
2005). In contrast to many other analysis tools the starting point was the analysis
of real processes rather than modeled processes, i.e., using process mining tech-
niques ProM attempts to extract non-trivial and useful information from so-called
event logs. Moreover, ProM also allows for the calculation of several quality met-
rics as will be illustrated later in this section.
Traditionally, most analysis tools focusing on processes are restricted to model-
based analysis, i.e., a model is used as the starting point of analysis. For example,
a purchasing process can be modeled using EPCs and verification techniques can
then be used to check the correctness of the protocol while simulation can be used
to estimate performance aspects. Such analysis is only useful if the model reflects
reality. Process mining techniques use event logs as input, i.e., information re-
corded by systems ranging from enterprise information systems to web services.
Hence the starting point is not a model but the observed reality. Therefore, we use
the phrase real process analysis to position process mining with respect to classi-
cal model-based analysis. The widespread use of information systems, e.g., systems
constructed using ERP, WFM, CRM, SCM, and PDM software, resulted in the om-
nipresence of vast amounts of event data. Events may be recorded in the form of
audit trails, transactions logs, or databases and may refer to patient treatments,
order processing, claims handling, trading, travel booking, etc.
Figure 1 is used to explain the different types of process analysis supported by
ProM. First of all, it is relevant to note that when studying business processes one
can look at models (lower left corner) or study the observed behavior (lower right
corner).
Figure 1: Overview of the functionality of ProM: (1) discovery, (2) conformance, and (3)
model analysis
Using process mining it is possible to automatically derive process models using
process mining techniques (van der Aalst, Weijters & Maruster, 2004). ProM offers
many process discovery techniques. The result may be a Petri net, EPC, or YAWL
model. Figure 1 shows some of the modeling notations supported by ProM and also
mentions some of the products that provide event logs in a format usable by ProM.
Although the list of languages suggests a focus on pure process models, discovery does
not need to be limited to control-flow and may also include temporal, resource,
data, and organizational aspects.
If a model is already given, the information stored in logs can be used to check con-
formance, i.e., how well do reality and the model fit together. This can be seen as
another quality dimension. Conformance checking requires, in addition to an event
log, some a-priori model. This model may be handcrafted or obtained through proc-
ess discovery. Whatever its source, ProM provides various ways of checking
whether reality conforms to such a model (Rozinat & van der Aalst, 2006). For ex-
ample, there may be a process model indicating that purchase orders of more than
one million Euro require two checks. Another example is the checking of the so-
called "four-eyes principle''. Conformance checking may be used to detect devia-
tions, to locate and explain these deviations, and to measure the severity of these
deviations.
Last but not least, ProM also provides various ways of model analysis. ProM offers
various plug-ins to analyze the correctness of a model, e.g., soundness and absence
of deadlocks. For example, it is possible to load the SAP reference model expressed
in terms of EPCs into ProM and analyze it using reduction rules or invariants. ProM
also allows for the verification of a variety of modeling languages (e.g., BPEL,
Staffware, etc.) using a mapping onto Petri nets. Besides model verification, ProM
also allows for the calculation of various other quality metrics, e.g., cohesion and
coupling, complexity, size, etc. Given the topic of this chapter, we elaborate on these
metrics.
Complexity and Size in ProM
In order to study the complexity of process models we have developed several plug-
ins for the ProM framework. As stated previously, ProM provides various ways of
model analysis, such as soundness and absence of deadlocks. The newly developed
plug-ins target the analysis of the quality of process designs. Figure 2 shows one of
the plug-ins analyzing the complexity, coupling, and size of an EPC process model.
Figure 2: Screen shot of the ProM tool showing the analysis sheet for EPC's. For the
EPC process model presented, several metrics are calculated. Note that this tool is
still under development and that some concepts on the screen shot (e.g. BA Tree, and
Weighted Coupling) are not explained in this chapter.
As can be seen from the figure, the size of a process model has several components,
such as the number of events, functions, ORs, XORs, and ANDs. Events and func-
tions are elements specific to EPCs. The figure also shows the Control-Flow Com-
plexity (Cardoso, 2005a) of the process displayed which is 6, the density (Mendling,
2006) which is 0.048, and the weighted coupling which is 0.515. While, at this
point in time, these numbers may be rather complicated to interpret for someone
outside this area of research, we expect that when organizations have successfully
implemented quality metrics as part of their process development projects, empiri-
cal results and practical results from real-world implementations will set limits and thresholds for processes. Recall that this scenario happened with the McCabe cyc-
lomatic complexity (Frappier, Matwin, & Mili, 1994).
Data oriented Cohesion and Coupling in ProM
Within the ProM framework, an environment has also been developed to calculate cohesion
and coupling metrics based on the theory in (Reijers & Vanderfeesten, 2004). The
proposed coupling-cohesion ratio can be used to compare alternative designs de-
fined on the same PDM. In this context, a design is a grouping of data elements and
their respective operations into activities, such that every activity contains one or,
preferably, more operations. The best design is the one with the lowest coupling-
cohesion ratio.
In Figure 3 and Figure 4, some screen shots of the cohesion and coupling environment are shown. Both screen shots contain a different design based on the same PDM. In the first design (Figure 3) the process cohesion value is 0.183, the process coupling value is 0.714, and the coupling-cohesion ratio is 3.902. The part of activity A is also indicated in the PDM. For the second design (Figure 4) these values are:
0.123, 0.867, and 7.049. When comparing the values for the coupling-cohesion
ratio for both designs we see that the first design has a lower ratio and thus is the
better alternative of the two process designs.
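Using the numbers reported above, the selection rule can be expressed in a few lines. This is only a worked illustration of the ratio, not ProM functionality; the design names are placeholders.

def coupling_cohesion_ratio(process_coupling: float, process_cohesion: float) -> float:
    # Lower is better: low coupling combined with high cohesion.
    return process_coupling / process_cohesion

designs = {"first design": (0.714, 0.183), "second design": (0.867, 0.123)}
ratios = {name: coupling_cohesion_ratio(c, h) for name, (c, h) in designs.items()}
print(ratios)                       # {'first design': 3.90..., 'second design': 7.04...}
print(min(ratios, key=ratios.get))  # the first design has the lower ratio and is preferred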
Figure 3: Screen shot of the cohesion and coupling environment in ProM for the first
design. This design contains seven activities that are defined on the PDM. Activity A
is indicated.
Figure 4: A screen shot of the cohesion and coupling environment for design B in
ProM. Design B contains six activities that are defined on the PDM and it differs from
design A because activities A and E are merged. Activity AE is indicated in the PDM in
the screen shot.
CONCLUSION
Currently, organizations are modeling and designing business processes without
the aid of metrics to question the quality or properties of their models. As a result, it
may happen that simple processes are modeled in a complex and unsuitable way.
This may lead to a lower understandability, higher maintenance costs, and perhaps
inefficient execution of the processes in question (e.g. when such models are used
to enact). Considering the efforts that modern organizations spend on creating and
maintaining business processes we can truly speak of a great opportunity for the
use of quality metrics here.
Examples of important questions that can be asked about a process model are: can process P1 be designed in a simpler way? What is the complexity of process P2? Is process P3 difficult to adapt? And can process P4 be easily restructured
into sub-processes? In the future, these kinds of questions can perhaps be satis-
factorily answered with the use of process metrics such as coupling, cohesion,
complexity, modularity, and size metrics. Each of these metrics analyses a business
process from a particular perspective.
It is clear that quality metrics for business processes need yet to come to full bloom.
In particular, much empirical work needs to be done to assess the applicability and
validity of the various proposed metrics. However, both for practitioners and researchers there is the highly attractive prospect of sophisticated tools becoming available that are capable of thoroughly analyzing process models at low cost, at considerable speed, and yielding tangible business benefits.
ACKNOWLEDGEMENT
This research is partly supported by the Technology Foundation STW, applied sci-
ence division of NWO and the technology programme of the Dutch Ministry of Eco-
nomic Affairs.
REFERENCES
Aalst, W.M.P. van der (1999). On the automatic generation of workflow processes based on
product structures. Computers in Industry, 39, 2, pp. 97-111.
Aalst, W.M.P. van der; Weijters, A.J.M.M.; and Maruster, L. (2004). Workflow Mining: Discov-
ering Process Models from Event Logs. IEEE Transactions on Knowledge and Data Engi-
neering, 16(9), pp. 1128-1142.
Baresi, L.; Casati, F.; Castano, S.; Fugini, M.; Mirbel, I.; Pernici, B.; and Pozzi, G. (1999).
Workflow Design Methodology. In P. Grefen, B. Pernicii and G. Sanchez, editors, Database
Support for Workflow Management: the WIDE Project, pp. 47-94, Kluwer Academic Pub-
lishers.
Bieman, J.M., and Kang, B.-K. (1998). Measuring Design-level Cohesion. IEEE Transactions
on Software Engineering, 24, 2, pp. 111-124.
Brandes, U., and Erlebach, T., editors (2005). Network Analysis: Methodological Foundations
[outcome of a Dagstuhl seminar, 13-16 April 2004], volume 3418 of Lecture Notes in
Computer Science. Springer-Verlag.
Card, D.N.; Church, V.E.; and Agresti, W.W. (1986). An Empirical Study of Software Design
Practices. IEEE Transactions on Software Engineering, 12, 2, pp. 264-271.
Cardoso, J. (2005a). How to Measure the Control-flow Complexity of Web Processes and
Workflows. In: Fischer, L., ed., Workflow Handbook 2005, pp. 199-212, Lighthouse Point.
Cardoso, J. (2005b). Control-flow Complexity Measurement of Processes and Weyuker's Properties. Proceedings of the 6th International Enformatika Conference (IEC 2005), International Academy of Sciences, Budapest, Hungary, Vol. 8, pp. 213-218.
Cardoso, J. (2006). Process control-flow complexity metric: An empirical validation, IEEE In-
ternational Conference on Services Computing (IEEE SCC 06), Chicago, USA, pp. 167-
173, IEEE Computer Society.
Cardoso, J.; Mendling, J.; Neuman, G. & Reijers, H.A. (2006). A discourse on complexity of
process models. In: Eder, J.; Dustdar, S. et al, editors, BPM 2006 workshops. Lecture
Notes in Computer Science 4103, Springer-Verlag, Berlin, pp. 115-126.
Conte, S.D.; Dunsmore, H.E.; and Shen, V.Y. (1986). Software Engineering Metrics and Mod-
els, Benjamin/Cummings Publishing Company, Inc..
Dongen, B.F. van; Alves de Medeiros, A.K.; Verbeek, H.M.W. ; Weijters, A.J.M.M.; and Aalst,
W.M.P. van der (2005). The ProM framework: A New Era in Process Mining Tool Support.
In G. Ciardo and P. Darondeau, editors, Application and Theory of Petri Nets 2005, vol-
ume 3536 of Lecture Notes in Computer Science, pp. 444454, Springer-Verlag, Berlin.
Frappier, M., Matwin, S. and Mili. A. (1994). Software Metrics for Predicting Maintainability:
Software Metrics Study: Technical Memorandum 2. Canadian Space Agency, January 21.
Gruhn, V., and Laue, R. (2006). Complexity metrics for business process models. In: Witold
Abramowicz and Heinrich C. Mayer, editors, 9th International Conference on Business Information Systems (BIS 2006), vol. 85 of Lecture Notes in Informatics, pp. 1-12.
Guceglioglu, A.S., and Demiros, O.W. (2005). Using Software Quality Characteristics to
Measure Business Process Quality. In W.M.P. van der Aalst, B. Benatallah, F. Casati, and
F. Curbera, editors, Business Process Management (BPM 2005), Lecture Notes in Com-
puter Science, volume 3649, pages 374-379, Springer-Verlag, Berlin.
Jones, T. C. (1986). Programming Productivity. New York, McGraw-Hill.
Kang, B.-K., and Bieman, J.M. (1996). Using Design Cohesion to Visualize, Quantify, and
Restructure Software. 8th International Conference on Software Engineering and Knowledge Engineering, Knowledge Systems Institute, Skokie IL, pp. 222-229.
Kang, B.-K., and Bieman, J.M. (1999). A Quantitative Framework for Software Restructuring.
Journal of Software Maintenance, 11, pp. 245-284.
Keller, G., and Teufel, T. (1998). SAP(R) R/3 Process Oriented Implementation: Iterative Proc-
ess Prototyping. Addison-Wesley.
Kostoff, R.N. (1999). Science and Technology Innovation. Technovation, 19, 10, pp. 593-604.
Latva-Koivisto, A.M. (2001). Finding a complexity measure for business process models. Hel-
sinki University of Technology, Systems Analysis Laboratory.
Mendling, J. (2006). Testing Density as a Complexity Metric for EPCs. Technical Report JM-
2006-11-15. Vienna University of Economics and Business Administration. Retrieved from
http://wi.wu-wien.ac.at/home/mendling/publications/TR06-density.pdf
Mendling, J.; Moser, M.; Neumann, G.; Verbeek, H.M.W.; Dongen, B.F. van; and Aalst,
W.M.P. van der (2006a). A Quantitative Analysis of Faulty EPCs in the SAP Reference
Model. BPM Center report BPM-06-08, Eindhoven University of Technology, Eindhoven.
Mendling, J.; Moser, M.; Neumann, G.; Verbeek, H.M.W.; Dongen, B.F. van; and Aalst,
W.M.P. van der (2006b). Faulty EPCs in the SAP Reference Model. In: J.L. Fiadeiro, S.
Dustdar and A. Sheth, editors, Proceedings of BPM2006, Lecture Notes in Computer Sci-
ence, volume 4102, pp. 451-457, Springer-Verlag, Berlin.
Myers, G.J. (1978). Composite/Structured Design. Van Nostrand Reinhold, New York, NY.
Reijers, H.A. (2003). Design and control of workflow processes: business process management
for the service industry. Lecture Notes in Computer Science 2617, Springer-Verlag, Berlin.
Reijers, H.A. (2003). A cohesion metric for the definition of activities in a workflow process.
Proceedings of the 8th CAiSE/IFIP8.1 International Workshop on Evaluation of Modeling
Methods in Systems Analysis and Design (EMMSAD 2003), pp. 116-125.
Reijers, H.A.; Limam Mansar, S.; and Aalst, W.M.P. van der (2003). Product-Based Workflow
Design. Journal of Management Information Systems, 20, 1, pp. 229-262.
Reijers, H. A., and Vanderfeesten, I.T.P. (2004). Cohesion and Coupling Metrics for Workflow
Process Design. In J. Desel, B. Pernici, and M. Weske, editors, Proceedings of the 2nd International Conference on Business Process Management (BPM 2004), Lecture Notes in Computer Science, volume 3080, pp. 290-305, Springer-Verlag, Berlin.
Rozinat, A., and Aalst, W.M.P. van der (2006). Conformance Testing: Measuring the Fit and
Appropriateness of Event Logs and Process Models. In C. Bussler et al, editor, BPM 2005
Workshops (Workshop on Business Process Intelligence), volume 3812 of Lecture Notes in
Computer Science, pp. 163-176, Springer-Verlag, Berlin.
Selby, R.W., and Basili, V.R. (1991). Analyzing Error-Prone System Structure. IEEE Transac-
tions on Software Engineering, 17, 2, pp. 141-152.
Shen, V.Y.; Yu, T.-J.; Thebaut, S.M.; and Paulsen, L.R. (1985). Identifying Error-Prone Soft-
ware, IEEE Transactions on Software Engineering, 11, 4, pp. 317-324.
Shepperd, M. (1993). Software Engineering Metrics Volume I: Metrics and Validations,
McGraw-Hill.
Troy, D.A., and Zweben, S.H. (1981). Measuring the Quality of Structured Designs, Journal of
Systems and Software, vol. 2, pp. 113-120.
Weyuker, E.J. (1988). Evaluating Software Complexity Measures. IEEE Transactions on Soft-
ware Engineering, 14, 9, pp. 1357-1365.
Enterprise Architecture as a
Meta-Process
Heinz Lienhard, ivyTeam-SORECO Group,
Switzerland
It's the processes: the effective implementation of processes makes all the difference to the
success of an Enterprise Architecture project. Business processes should be designed not
only for optimal business support but also for easy process management in a dynamic
business world.
INTRODUCTION
With Enterprise Architecture (EA) one tries to address the fundamental base of an enterprise's organization, but should it be called this way?
The architecture of a house, for example, usually means something solid and durable; you do not keep on changing it all the time. But experiences in recent years (some very bitter ones) have shown clearly that change may be one decisive constant concerning business and enterprises: "keep re-inventing your business," as the saying goes. Hence it seems that what is fundamental about successful enter-
prises of the future will be their dynamism and not so much their structure or
architecture. As will be shown, this can be done applying appropriate BPM tech-
nology. Through business rules accessed by the processes many changes are
possible without redesign of the process model. Meta rules allow reconfiguring a
process in a dynamic, automatic way. The main components of an EA process are
identified in the meta-process, through which all other processes are managed.
They are modified, often automatically, either to improve their performance or to
accommodate new strategic inputs. This, of course, requires monitoring the per-
formance of the various processeswhich is part of modern BPM systems. In
short: a coherent powerful approach to Enterprise Architecture viewed as a proc-
ess is presented.
In management literature the increasing relevance of a process oriented view of a
firm is widely acknowledged. Citing Gartner Research (2005): "we believe that successful enterprise architecture programs are process-focused."
The benefits are significant:
EA as a process treats an enterprise for what it really is: a dynamic sys-
tem. And the most mileage is gotten out of a technology that is required anyway: BPM. No plethora of methods and tools is required, keeping things simple, which is unusual in the IT area.
By proper initial design of processes they all become manageable in a transparent and efficient way. A lot can be done without changing the topology of the process models, and even this can be achieved quite efficiently thanks to modern BPM. Hence, basically a single tool is required
to implement Enterprise Architecture as a Meta-Process.
Practical examples from business and eGovernment illustrate the value of the
proposed approach.
Therefore, to use current terminology, BPM (Business Process Management) is needed, albeit with some reinterpretation: it's not only the business processes that have to be managed, it's all of them (see above).
Enterprise Architecture should first of all mean the setting up of meta-processes
allowing the enterprise to be efficiently adapted to a changing world. But nice
pictures of these processes won't do: they have to be implemented, i.e. supported by powerful technologies. This has also been proven crucial for business processes. In the days of Business Process Re-engineering (the hype then was BPR) a lot of beautiful graphics with explanatory text was produced, just to be lost and forgotten on some shelf. Processes only gained center stage when modern BPM/workflow technologies allowed going from a graphical process model directly
to the real-time application.
WHY FOCUS ON PROCESSES?
Well, everybody is now talking about processes. Maliciously one might say: if
there is no solution, there must at least be a process, like with all those peace processes going on in the world. No, there are good reasons for focusing on proc-
esses.
It has been shown before [1] that processes are at the heart of IT applications, or
almost any application for that matter. Usual standard applications never provide
all functionality necessary to run a business, despite their growing complexity.
This will of course be accentuated by the growing need to constantly adapt ones
business. Even traditional business softwarewhether one realizes it or not
uses implicit processes, usually hard coded ones. There is a lot to be gained by
making these processes explicit: Applications are obtained that are flexible, easier
to understand, better documented and amenable for customization by the end
user. Now even the worlds second largest software company explicitly addresses
processes
In the management literature the increasing relevance of a process-oriented view of a firm is acknowledged [2]. Is not management a process too? Maybe the future will bring management processes into the limelight instead of the CEO with an outrageous salary? Probably not. But the daily work of management will be supported more and more by IT-implemented processes, allowing the CEO to become a proper entrepreneur again, taking full responsibility for the enterprise and its people.
And what about enterprise architecture? It should first of all provide meta-processes to manage all relevant processes in the enterprise, such as:
• Core Business Processes
• Management Processes
• Business Rule Processes [3]
• Support Processes
• Content/Knowledge Processes
• BPM Methodology Processes
• etc.
Hopefully, this will become evident in what follows.
Technologies for Enterprise Architecture
What is hardly desirable is the thought of more new tools and frameworks that distract people from their essential tasks, namely doing business and caring for customers. However, there is no way around mastering the business processes, hence no way around BPM. Why not fully use what modern BPM has to offer? Put it to work for mastering the meta-process to manage the Enterprise Architecture. This brings significant advantages:
• No need to train people to master yet another set of tools
• Look and feel remain the same
• The meta-process and the related ones (e.g. management processes) can be monitored and audited the same way business processes are
• By starting from graphical models, flexible and transparent solutions are obtained.
[1] Lienhard, H. (2002) Workflow as a Web Application: the Grand Unification. In L. Fischer (Ed.), Workflow Handbook 2002 (pp. 65-80). Future Strategies Inc., Florida, USA.
[2] Spickers, J. (2004) The St. Gallen Management Model. University of St. Gallen, CH.
[3] Lienhard, H., Künzi, U.M. (2003) Workflow and Business Rules: a Common Approach. In L. Fischer (Ed.), Workflow Handbook 2005 (pp. 129-139). Future Strategies Inc., Florida, USA.
The chosen BPM approach should be founded on well-established Internet standards like XML/XPDL for process export and import and WSDL for Web Services, which are instrumental for tying together existing business applications within overall processes (EAI) and for linking to customers' and business partners' processes. Therefore it is important to seamlessly integrate Web services into the processes to orchestrate them [4]. This makes it possible to obtain, in a straightforward way, solutions based on a Service-Oriented Architecture (another current hype). Tight integration of process modeling and workflow definition is a must to be able to go straight from graphical model to executable solution. And BPMN (see BPMI.org) provides a common language for graphical process modeling.
These are the technological choices to be made for ensuring rapid and low-cost adaptations of the IT infrastructure, an important part of a viable Enterprise Architecture.
It has already been shown elsewhere (ref. 1) how the same technological choices also allow an efficient management of business rules and policies and make them directly accessible to processes and meta-processes. In addition, we show here how all processes can be managed by making judicious use of BPM.
BUSINESS PROCESSES AND BUSINESS RULES
As pointed out before (ref. 1), both the BPM and the BR (business rule) gurus are making similar claims. The Business Rule Approach promises [5]: "Increased speed of implementation", and the BPM people tell us: "BPM provides enhanced business agility." There is only one serious answer: you need both of them together. This becomes evident in the context of enterprise architecture.
Clearly a business process has to do one thing: optimally support the business. To be sure that it does so, the process has to be monitored, and adapted where necessary. But even when processes run well, changes will be required due to an increasingly dynamic business environment. Hence, process designers are confronted with a double challenge: to design processes that suit business needs and to anticipate changes as far as feasible. With modern BPM approaches (as described in ref. 1), processes can readily be built and adapted via graphical models that are automatically translated into real-time applications. But this requires some familiarity with the design tool and should hardly be done on an hourly or daily basis. Yet such rapid modifications may be needed, without influencing the basic underlying workflow. The following example of a simple order process (Fig. 1) is a case in point. An executable process design is shown that allows for changes in business policy without actually modifying the process, i.e. its implementation as workflow:
[4] Lienhard, H. (2003) Web Services and Workflow: A Unified Approach. In L. Fischer (Ed.), Workflow Handbook 2003 (pp. 49-60). Future Strategies Inc., Florida, USA.
[5] Chisholm, M. (2004) How to Build a Business Rule Engine. Boston: Elsevier.
Fig. 1: Part of Order Process Model

The process model (Fig. 1) shows a simulation snapshot:
A customer (number 1001) is ordering some items for a total price of $2000. Database information about the specific customer is accessed, like Age, CustStatus ("gold" in this case), etc. After deciding that the customer is not new, meaning a standard rebate rule may apply, a BR (business rule) is invoked by the process and evaluated to find out whether a "young customer" (in the meaning of the rule) has placed the order. Since the answer "yes" is returned, the process proceeds to the step below. The red dot representing the data object flowing through the process is just about to enter a meta-rule call that will yield the current policy on how to treat such customers. Depending on the current policy, an appropriate sub-process from a process catalog will be selected: maybe one for shipping flowers in case the young customer happens to be a female, or a football ticket in the other case. After execution of the chosen sub-process, another business rule will give the current rebate rate for the specific customer (e.g. depending on status and order price).
An example of a BR to determine the rebate rate:
CustRule1: IF((TL(0) = "gold" and TL(1) > 1000), 0.75, 1.0)
TermList TL = [in.CustStatus, in.PriceOrder];
i.e. a customer with status "gold" placing an order for over $1000 will currently benefit from a 25 percent rebate.

In case the customer is new, he/she will be registered; then either a meta-rule will be evaluated to determine which rebate rule has to apply, or a manual selection of the rule is necessary because of certain customer information. If this is the case, a task is assigned to an employee with the appropriate role to perform this selection. A special dialog element presents the authorized person with the possible rules within a standard browser (Fig. 2).

Fig. 2: Dialog Page
The process model has been designed and implemented as workflow with the help of Xpert.ivy in BPMN (Business Process Modeling Notation) [6]. A number of special elements, e.g. to access a database or generate a dialog form, have been used, fully compatible with the standard. The rule evaluation symbol has been specially marked, although it represents the usual sub-process call (an independent sub-process, to be precise).
By using business rules and meta-rules accessible by the process, there is a lot of flexibility built in, without having to change the model. This brings about another advantage: separation of concerns and responsibilities. Thus, the rules can be set up and managed by other people than the ones concerned with the process implementation. These rules may reside in a rule catalog (DB) on a totally different system from the one running the workflow, and may be managed by a special process. All that is needed are appropriate Web Service interfaces between the systems (see ref. 4 above).





Fig. 3: The Web service process for rule evaluation (uses special start and end elements).
[6] Lienhard, H., Bütler, B. (2006) From BPMN Directly To Implementation: The Graphical Way. In L. Fischer (Ed.), to appear in Workflow Handbook 2006. Future Strategies Inc., Florida, USA.
Fig. 3 shows a process-based Web Service that evaluates the value of a rule. It is called from the process with the rule name (or other identification of the rule) and the appropriate terms as parameters. The Web Service process accesses the corresponding rule expression from a database and forwards it to the evaluation step; the result (true, false or any other value) is returned to the calling business process.
The term meta-rule is used in two different ways: for logical rules that are rules about business rules (to be chosen from a rule catalog), and for rules that select sub-processes from a process catalog, actually a meta-rule about the configuration of the business process. The simple example (Fig. 1) already exhibits a lot of flexibility without touching the process design itself. Rapid modifications are simply obtained by adapting:
• business rules
• the assignment of rules to process nodes
• meta-rules selecting business rules
• meta-rules selecting sub-processes.
All rules are naturally managed by another process (see ref. 1), i.e. by a corresponding management workflow that may constitute a sub-process of an overall EA process (see below). In some cases it may even be possible to fully automate needed modifications depending on business results. But all this requires that business (and support) processes be designed not only for optimal support of the business, but also for ready manageability by an EA meta-process.
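For illustration only, a meta-rule of the second kind could be sketched as a lookup that maps the current policy and case data to a sub-process ID in a process catalog (all names and IDs are invented):

# Illustrative meta-rule: select a sub-process from a process catalog according
# to the currently configured policy; changing the policy entry changes the
# behavior without redesigning the process model. Names and IDs are made up.
PROCESS_CATALOG = {"ship_flowers": "SubProcess_17", "send_ticket": "SubProcess_18"}
CURRENT_POLICY = {"young_customer_reward": {"female": "ship_flowers",
                                            "male": "send_ticket"}}

def select_subprocess(policy_key, customer):
    reward = CURRENT_POLICY[policy_key][customer["gender"]]
    return PROCESS_CATALOG[reward]

print(select_subprocess("young_customer_reward", {"gender": "female"}))  # SubProcess_17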
META-PROCESSES
Those processes that support the management of other processes, e.g. business processes, are called meta-processes. Obviously, meta-processes can be designed just as flexibly as was exemplified by the business process shown above (Fig. 1), and these in turn may be managed by meta-processes invoking meta-meta-rules... Well, it has to stop somewhere, but as in mathematics, truly powerful concepts can be iterated or even used recursively.
It is important that business process models be designed to allow for efficient process management, providing as much process flexibility as possible to allow for adaptations without changing the graphical model. Thus, important gains in transparency and flexibility are achieved via properly managed rule and process catalogs. In addition, this is a major step towards separation of concerns among process (and meta-process) participants. The least desirable way to adapt the behavior of a process is redesigning the graphical model, which requires a manual meta-process. Thanks to modern BPM tools even this task can be carried out efficiently.
EA META-PROCESS
This is of course the top meta-process, which supports the management of all the important processes. Two main sub-processes, Rule Management and Process Management (see Fig. 4), are part of the EA process. The first is concerned with defining and modifying business rules and meta-rules, the second with adapting the processes. The Process Management sub-process may also call upon the Rule Management process, e.g. to modify some meta-rule that influences the behavior (see above) of a business process being managed. The EA process and its main sub-processes as depicted in Fig. 4 are shown only at the top hierarchical level.
The EA process will be enacted when:
1. performance of some business processes is not satisfactory
2. business changes require adaptations
3. business strategy is revised.
Most modern BPM systems (like Xpert.ivy) monitor what is happening in the workflow and provide information about the cases that have been run, such as:
• Start and end times
• Started by which process
• Creator of the case
• Start and end times of all tasks
• Roles and users involved in tasks
• Merging times of tasks
• What data has been entered into forms
• etc.
Fig. 4: EA Meta-Process and main Sub-processes
A performance overview is usually obtained through a BAM (Business Activity Monitoring) process. Based on this information, necessary modifications of the business processes (1. above) can be determined. As well, in the first task of the main EA process (Fig. 4), changes may be decided on because of new business requirements or new strategy input (2. or 3. above). In the next step, tasks are assigned to implement the changes, where possible automatically, otherwise by assigning specific tasks to corresponding roles. Then one or both of the sub-processes may be invoked, depending on the nature of the changes. Before ending the EA process, all modifications have to be approved. If refused, the whole process starts again with the task assignments.
The Rule Management sub-process guarantees the editing of the rule catalog(s) in a controlled manner. On the lower, more detailed level, the process may cross several organizational boundaries, involving different persons with different roles to accomplish the task. Certain adaptations may be fully automated; e.g. the rule defining a customer rebate (example of Fig. 1) might be modified, without any manual intervention, if some performance figures drop below a given threshold.
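A minimal sketch of such an automated adaptation, under purely illustrative assumptions (the metric, the threshold and the catalog layout are invented), could be:

# Illustrative automatic rule adaptation driven by a monitored performance figure.
RULE_DB = {"CustRule1": {"rebate_factor": 0.75}}

def adapt_rebate_rule(weekly_conversion_rate, threshold=0.30):
    # If conversions drop below the threshold, deepen the rebate automatically.
    if weekly_conversion_rate < threshold:
        rule = RULE_DB["CustRule1"]
        rule["rebate_factor"] = round(rule["rebate_factor"] - 0.05, 2)
        return "rule updated by the EA meta-process"
    return "no change"

print(adapt_rebate_rule(0.22))        # rule updated by the EA meta-process
print(RULE_DB["CustRule1"])           # {'rebate_factor': 0.7}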
The Process Management sub-process will be started whenever the process itself has to be altered. In the simplest case, modifying a meta-rule may suffice, as discussed in the order process example above, and then the changes will essentially be handled again by Rule Management. But the situation may have escalated to where it becomes necessary to select other sub-processes from a process catalog to be assigned, i.e. called, at specific places in the main process. As an example, the sub-process that defines the steps for rewarding a particular customer status has to be replaced by another one because more and more customers are being lost to the competition. As a last resort, a person has to be involved with the role of model designer, i.e. somebody authorized to modify the process model with the help of a design tool. When approved, the new version of the process will be uploaded to the workflow server and will take over all new cases. Before general approval of all work done in this management sub-process, the assignment of rules to specific nodes in a process is checked and possibly changed. This step can again be done without redesigning the model. If all adaptations have been approved, the sub-process terminates, i.e. control is returned to the EA process; otherwise the Process Management sub-process will be started again.
To summarize: the management of business processes comprises several main steps (in order of increasing sophistication):
• assignment of rules to process nodes
• modifications in the rule catalog
• adaptation of the meta-rule catalog
• selection of sub-processes (from the process catalog)
• (re-)design of processes
A clear distinction between the various steps allows for a clear separation of concerns, i.e. assignment of different tasks to appropriate roles, resulting in a dramatic reduction in complexity. But EA is more than this: all the activities mentioned above become integrated into an overall control process (see Fig. 5).
Fig. 5: EA Overall Control Process
PROCESS ANALYSIS: WHY NOT TEST THE REAL THING?
Analysis of processes has, so far, not been addressed. Well, before embarking on a BPM tool to obtain executable models, ivyTeam first developed and promoted tools for analytical modeling. Years of experience have shown that processes can be simulated and analyzed to the end of the world and back, only to find out that reality is something else again. Often this endeavor is not much more than a waste of time. In his excellent report [7] on BPMS, Bruce Silver eloquently argues that executable models are central to BPMS; if analytical modeling is required, the trend is to bring it inside the BPM suite. And that is the solution. After all, as was argued above, modern BPM systems do provide monitoring of process, i.e. workflow, performance (BAM). Why not use this same feature for analyzing the behavior of the executable solution, albeit in a time-lapse mode? All that is needed are some additional parameters in the executable model (to be turned into a workflow, as demonstrated before) and a small robot process to start processes according to some desired statistics. In this way the real thing is tested, and not some abstract analytical model. Process (or workflow) monitoring is used twice: to analyze the process behavior under some given statistics and later to continuously improve the operational workflow.
[7] Silver, B. (2006) Understanding and Evaluating BPM Suites. BPMInstitute.org
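For orientation, a tiny "robot" of the kind mentioned above could be sketched as follows (the start_process stand-in and the Poisson-arrival assumption are illustrative, not part of a specific BPM product):

# Illustrative load-generating robot: start process instances according to a
# desired arrival statistic, so that the executable model itself is analyzed.
import random

def start_process(case_data):
    # Stand-in for the BPM system's "start new case" operation.
    print("started case", case_data)

def robot(n_cases, mean_interarrival_s=60.0):
    clock = 0.0
    for i in range(n_cases):
        clock += random.expovariate(1.0 / mean_interarrival_s)  # Poisson arrivals
        amount = random.choice([500, 1500, 2500])                # test data mix
        start_process({"case": i, "t": round(clock, 1), "PriceOrder": amount})

robot(5)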
PRACTICAL EXAMPLES
An interesting eGovernment application nicely underlines some of the points made above. A couple of years ago, the interior department of the State (Canton) of Zug, Switzerland, decided to support, and where possible automate, the citizenship administration process. Because it combines modern BPM with extensive EAI (Enterprise Application Integration) functionality, the BPM tool Xpert.ivy was selected to implement the solution. The whole project was realized in 2.5 months. The participants in the process were involved in the process modeling right from the start, leading to exceptionally high user acceptance of the final system. Large savings were immediately obtained, despite ever-increasing numbers of applications for citizenship. Especially important, the very time-consuming periodic statistical reports to the state government were now automatically generated at the click of a button.
About half a dozen minor adaptations were necessary because of new requirements introduced since the beginning of the project. Specifically, in the past year new legislation has been introduced requiring special treatment of certain cases, thus making changes in parts of the process necessary. As an example, we show the sub-process for the generation of documents (Fig. 6, left), which generates different responses when different rules (paragraphs) apply, e.g.
- Paragraph 28: applicant is the spouse of an expatriate Swiss;
- Paragraph 58: re-naturalization of a Swiss, etc.
Such additions or changes to the law are expected to proliferate in the future. Hence, a new version of the sub-process (Fig. 6, right) will be the best alternative. In this version the evaluation of a meta-rule will provide the ID of a process to be chosen from a process catalog (DB) according to the paragraph that applies in the particular case, and a generic sub-process call will execute it. Hence, changes in the law can now be managed without touching the actual process.

Fig. 6: Two versions of a sub-process from Citizenship Administration; left: fixed topology; right: dynamic configuration with a meta-rule.
An example where extensive process monitoring has been implemented is the call center of the Town Utilities of Munich, which has over one million customers and is the biggest independent utility in Germany. Here the automated processes guide the service employees at the center through all required support activities. Since the introduction of the solution, the call center staff no longer have to operate and serve a number of different systems and applications, but are automatically served via the process with the information, documents, etc. needed to provide an efficient and high-quality service to the customer. A number of real-time process data are made available to different organizational levels to manage the call center's operations.
Another large utility enterprise in Germany operates a similar call center supported by BPM. It features a very powerful solution for monitoring the performance of individual staff members as well as entire teams. The quota of successful contract completions on first customer contact (i.e. without intervention of the back office) is an important key statistic, alongside duration times for handling the various activities. Notoriously long service times across all call center employees usually indicate a weakness in the process design, making modifications necessary, i.e. the EA process loop has to be started.
Fig. 7 shows an example of top-level cockpit information automatically assembled from process monitoring (via Xpert.ivy) [8].
[8] Figures 7 and 8 have been kindly provided by Customer Business Solutions GmbH, München, the company that implemented the mentioned project.
Fig. 7: Real life cockpit information of the call center.

Average cost per completed customer contact in relation to given objectives is another example of readily obtained performance information for management (see Fig. 8).
Fig. 8: Average costs of completed customer contacts in relation to target.
WHAT ARE THE BENEFITS
EA as a process treats an enterprise for what it really is: a dynamic system. And the most mileage is gotten out of a technology that is required anyway: BPM. No plethora of methods and tools is required, keeping things simple, which is unusual in the IT area.
By proper initial design of processes they all become manageable in a transparent and efficient way. A lot can be done without changing the topology of the process models, and even this can be done quite efficiently thanks to modern BPM.
To summarize the value of the process approach to EA, the following benefits can be achieved:
• Use of a single tool: same look and feel, similar functionality, lower cost
• Flexible and transparent solutions
• Straightforward data exchange between the business processes, the supporting processes, and the management processes
• Monitoring of all relevant processes and obtaining feedback information for adaptation via the EA meta-process
• Monitoring of the EA meta-process itself allows for transparency at that level
• Synergies from bringing together the rule and process approaches
• A natural fit into modern IT architecture (like SOA), in fact, enabling it: it's the processes, isn't it?
UPCOMING TRENDS
BPM is key to automating and managing processes, and not just business processes. The B in BPM should not be taken too religiously. An enterprise is a highly dynamic system best described by processes, i.e. by executable models. Hence, Enterprise Architecture will be recognized as a meta-process through which all the other relevant processes are managed. And it has been shown that this can be done in a coherent, uniform approach combining BPM with business rules (BR) and meta-rules. Even process analysis can be carried out using executable models.
Future BPM systems will allow rich Web interfaces besides the classical HTML forms [9]. This way they become a very powerful tool for setting up Rich Internet Applications [10] or, to use a trendy term, Web 2.0 applications.
Eventually BPM will become fully user-centered, and the notorious business-IT divide will not even have to be bridged anymore: the IT department as known today may simply disappear.
Enterprise Architecture as a meta-process will be made for business people and implemented by business people, supported by a professional Process Group.
[9] E.g. Version 4.0 of Xpert.ivy.
[10] von Gunten, K. Process-based Application Development: A Flexible and End-user Centered Way of Creating Software, to be presented at the Jazoon Conference 2007.

Overcoming Negative Tendencies
in Automated Business Processes
Juan J. Moreno, Lithium Software / Universidad Católica, Uruguay
Luis Joyanes, Universidad Pontificia de
Salamanca, Spain
ABSTRACT
Companies that have adopted BPM enjoy the advantages of superior efficiency and operational visibility, with consequent competitive advantage. However, there are situations in which this automation hides negative tendencies that could threaten business success. Tendencies derived from external economic effects, fraud or unexpected behaviors, materialized in a sequence of process instances created in the BPM application, can drastically decrease business results.
This paper proposes a solution for the problem, using models based on knowledge extraction from human participants, and its representation and integrated utilization in the BPMS. Detecting relevant events using BAM (Business Activity Monitoring) and generating natural language sentences enables alerting the business analyst in a much more agile way. Ultimately, the final goal could be defined as enabling the system to adapt itself, updating the process definition to overcome the negative tendency.
Finally, if the tendency is detected in time, and the right people alerted, it could even become an important business opportunity.
INTRODUCTION
This chapter is divided into four main sections. In this introductory section we illustrate the business context and the problem definition. Next, we include a short summary of previous and related work. After that the proposed solution is described, and finally conclusions are presented.
BUSINESS CONTEXT
Automating human-driven processes can be a great improvement for the business. Workflow and Business Process Management Systems enable organizations to reduce costs and provide a better and more agile service. For example, in a typical workflow application such as a loan approval process, the requester can obtain a final decision in much less time than in a paper-based environment. He can also monitor his request (his process instance) across its life cycle, and interact with evaluators (provide more information, additional guarantees, and so on).
An interesting effect in automated business processes is tendencies. A tendency in this context is a sequence of similar process instances that require similar activities to be processed, and that has the property of appearing in a relatively short period of time.
Continuing the loan approval example, let's suppose we have a rate of 1,000 loan requests daily, across 50 geographically distributed branches. The amount requested varies from $10,000 to $100,000 with a normal distribution around $50,000. At a given moment the loan requests increase to 1,200, an average of 4 additional requests in each branch. But the 200 new loan requesters have similar profiles, and the amount requested is always around $10,000. This could remain a hidden coincidence in each isolated branch, but it clearly constitutes a tendency relevant for the whole business.
This kind of tendency could be an opportunity for marketing staff, if they could manage it and get the most out of it. But it could also become a threat to business success in case it is not detected in time.
The previously described tendency seems to be absolutely legal. But it could also constitute some kind of fraud, for example one that takes advantage of the geographically distributed and disconnected IT infrastructure. In this work we will focus on the first kind: tendencies that are not frauds, but could still be dangerous for the business and, if detected in time, could become a business opportunity.
PROBLEM DEFINITION
The problem to be addressed in this work is managing tendencies in automated business processes, in order to reduce the risk of decreasing business results and, where possible, to identify business opportunities.
PREVIOUS AND RELATED WORK
When business processes are automated and executed by human participants, the most important decisions and the most relevant process-instance information are stored in databases. Significant business knowledge exists in many workflow- and BPM-related items. In previous work [1], we have proposed a model that allows extraction and modelling of the generated knowledge stored in a Business Process Management System.
With this knowledge available, we demonstrated that it is possible to make recommendations to human participants about good and bad decisions, to improve performance, reduce mistakes and shorten learning times. These recommendations were based on successful and unsuccessful cases in the past.
CASE-BASED REASONING (CBR)
We used the Case-Based Reasoning (CBR) methodology to retrieve knowledge from the repository (the set of process instances), compare it with the current process instance, make the recommendation and retain the relevant information. This mechanism is divided into four main activities, as shown in Illustration 1:
• Retrieve the most relevant cases considering the process instance being evaluated.
• Reuse the solution from the chosen case.
• Revise or adapt the solution if necessary.
• Retain the final solution when it has been validated.
[1] Moreno, Juan J. and Joyanes, Luis. Applying Knowledge Management to exploit the potential of information stored in a business process management system (BPMS). In Workflow Handbook 2006, ed. Layna Fischer. Future Strategies Inc., 2006.

Illustration 1 Using knowledge stored in process instances
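Purely as orientation, the four CBR activities listed above can be sketched as a small loop; the pattern store, distance measure and field names are illustrative assumptions, not the authors' implementation:

# Illustrative CBR cycle over process-instance patterns (all names invented).
PATTERNS = [
    {"id": 1, "amount": 50000, "decision": "approve", "weight": 12},
    {"id": 2, "amount": 10000, "decision": "review",  "weight": 7},
]

def retrieve(instance):                    # 1. retrieve the closest pattern
    return min(PATTERNS, key=lambda p: abs(p["amount"] - instance["amount"]))

def reuse(pattern):                        # 2. reuse its recommended decision
    return pattern["decision"]

def revise(decision, instance):            # 3. revise/adapt if necessary
    return "review" if instance["amount"] > 80000 else decision

def retain(pattern, successful):           # 4. retain the validated feedback
    pattern["weight"] += 1 if successful else -1

case = {"amount": 12000}
pattern = retrieve(case)
recommendation = revise(reuse(pattern), case)
retain(pattern, successful=True)
print(recommendation, pattern["weight"])   # review 8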
PATTERNS
It is impracticable to compare each new instance against all process instances stored in the database; there could be thousands, even millions, of them. In this context we introduce the concept of the process instance pattern, which represents many process instances. In addition to the scalability argument, patterns provide a much more robust approach to dealing with wrong decisions in process instances. Given that a pattern represents many process instances, it is less sensitive to instances that involve bad decisions (made by human beings).
We will say that a process instance is similar to a given pattern when their interest variable values are similar.
Additionally, the information in these patterns is stored balancing the patterns with the success feedback (from the expert, another person or a system). In other words, when a process instance finishes, it is qualified as successful or not. This qualification strengthens or weakens the recommendation of the decisions taken in that instance and represented in the pattern.
KNOWLEDGE MAPS
This solution is not only used to make recommendations about decisions to be taken during the process instance life cycle. It is also used to make recommendations about the most suitable participant for a given process activity, considering his skills and past performance in similar instances.
In conclusion, we have a framework for grouping process instances into patterns, so it is possible to:
• Recommend decisions at every step of the process, based on successful and unsuccessful decisions taken in the past for similar process instances.
• Suggest the most suitable participant for a stage of a process instance, based on the instance characteristics, the available participants and their abilities.
PROPOSED SOLUTION
The proposed solution for overcoming tendencies in automated business processes can be conceptually divided into two main stages:
• Detecting the tendency.
• Taking actions to overcome the tendency.
In the following sections, we will discuss alternatives for both of them.
DETECTING THE TENDENCY
To detect the tendency, we have two main approaches:
• Proactive: anticipating the tendency.
• Reactive: detecting the tendency after it has been established.
Proactive tendency detection
Given the pattern infrastructure previously described, it is feasible to define our own patterns, using the values we need for the interest variables in order to identify certain kinds of behaviour. This means that an expert who knows the business very well could identify some types of process instances that require special consideration.
A pattern groups many process instances. If we predefine a pattern whose interest variable values represent the potentially dangerous process instances, all similar instances will be associated with that pattern, enabling their detection, as shown in Illustration 2.

Illustration 2 Proactive tendency detection using predefined patterns.
When the CBR engine analyzes a new process instance, it uses its interest variable values to associate it with a pattern. Given that we have predefined patterns labelled, for example, as "dangerous for business" or "opportunity for business", when a process instance is associated with one of them, it is detected. Once detected, some actions should be taken to avoid the risk of a business threat, or to take advantage of the business opportunity.
Reactive tendency detection
Reactive tendency detection is also based on patterns. Each time a process instance is associated with a pattern, the pattern is updated (considering the success or not of the process instance). This means the pattern "knows" how many, and which, process instances have been associated with it.
If we have a pattern growing faster than the others in the number of associated process instances in a given period of time, this could mean we are in the presence of a tendency, as shown in Illustration 3.
Illustration 3 Reactive tendency detection using the number of associated process instances.
The CBR component can easily know how many instances are associated with each pattern. It should also be provided with some extra functionality to manage the time dimension, given that the growth could happen rapidly.
Given that this information is stored in the pattern, the growth could also be monitored by a standard, external tool.
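As a rough sketch of such monitoring (the sampling interval, the growth factor and the counts are invented for illustration), one could periodically compare per-pattern counts and flag any pattern growing much faster than the average:

# Illustrative reactive detection: flag patterns whose instance count grows
# much faster than the average between two monitoring samples.
def detect_tendency(previous, current, factor=2.0):
    growth = {p: current[p] - previous[p] for p in current}
    average = sum(growth.values()) / len(growth)
    return [p for p, g in growth.items() if g > factor * average]

counts_earlier = {"A": 50, "B": 50, "C": 150}
counts_now     = {"A": 60, "B": 230, "C": 160}
print(detect_tendency(counts_earlier, counts_now))   # ['B']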
Proactive or reactive detection?
Unfortunately, there is no definitive answer. Both approaches have strengths and weaknesses:
• The proactive approach minimizes risk and maximizes the probability of detecting a tendency that could become a business opportunity. The main disadvantage is that someone has to know what kind of processes these are in order to define the pattern with appropriate values in its interest variables. This could be a complex, even impossible, task.
• The reactive approach is less expert-dependent, but it detects the tendency when it is already happening. If detected too late, the tendency could already have been harmful, or the business opportunity could already have passed. On the other hand, the reactive approach is much more generic, so it can be used in any domain, in any application.
At this stage, it should be clear that just one approach is not enough to detect the tendency. If we want a complete solution for the problem, we have to integrate both of them, joining their strengths and reducing their weaknesses. This is possible given that they are not mutually exclusive.
TAKING ACTIONS TO OVERCOME THE TENDENCY.
After detecting the tendency, some actions should be taken in order to avoid major risks or to exploit business opportunities.
We propose two main sets of actions:
• Alerting the business analyst.
• Automatically updating the process definition to avoid risks.


Data underlying Illustration 3 (number of associated process instances per pattern at each sampling time):
Time       Pattern 1   Pattern 2   Pattern 3   Pattern n
00:00:00       50          50         150         350
01:00:00       60         100         155         362
02:00:00       70         160         165         372
03:00:00       80         230         177         379
Alerting the business analyst.
There are many different ways to alert the business analyst about the tendency (negative or a business opportunity), such as email, visual notifications, phone calls, and so on. However, the most important thing is that, once detected, the tendency is reported to the business analyst as soon as possible and in a very clear manner.
In this context, we suggest using natural language to describe the tendency. In this way, the business analyst is able to rapidly determine whether the tendency constitutes a threat, an opportunity, or is irrelevant.
Natural language, combined with the values of the interest variables of the pattern associated with the tendency, can produce an objective and complete view of the situation.
In previous work [2] we have demonstrated that it is possible to generate, in a generic way, natural language from BPM business rules. Using these techniques, it is feasible to generate natural language to express tendencies produced in a BPM environment.
For example, the reader can easily see that it is possible to generate a natural language output when analyzing the tendency described in Illustration 3. The generated text could be similar to Illustration 4, where the underlined texts are the tokens replaced by the alerting mechanism.

Illustration 4 Generating natural language to alert about a tendency.
Given a fast notification after the tendency is detected, the business analyst can take business actions effectively in order to avoid the risks or take advantage of the opportunity.
Automatically updating process definition to avoid risks.
In some scenarios it is too risky to wait for the business analyst to take action in order to avoid a business threat hidden in a process instance tendency.
In these situations, an automatic "stop and go" for the process instances that are generating the tendency is mandatory. In this context, we propose an automatic update of the process definition.
[2] Delgado, Matías and Moreno, Juan J. Generating natural language from business rules in a BPM environment. Computer Science Department, Universidad Católica del Uruguay, 2006.
Process Tendency: There is fast growth in process instances associated with Pattern 3, from 50 to 230 in three hours. The values of the interest variables in the pattern are:
Interest_Variable_1: value 1
Interest_Variable_2: value 2
...
Interest_Variable_n: value n
In previous work [3] we have proposed that it is possible to modify the behaviour of a process by generating exceptions, which are intercepted by the Business Activity Monitoring (BAM) component. After intercepting the process instance we can modify the flow, forcing it to follow another path.
Illustration 5 Using BAM to intercept and modify the process flow.
Using this technique, after detecting a process instance that is following a potentially harmful tendency, we can redirect it to some special and controlled activity to manage it properly, instead of the normal activities it would flow through without BAM intervention.
This solution is generic enough to be useful in any process, given that BAM has an action scope covering the complete process.
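The redirection idea can be sketched as follows; the flagged-pattern set and the route function are illustrative assumptions rather than a particular BPMS API:

# Illustrative BAM-style interception: instances that belong to a flagged
# pattern are routed to a special, controlled activity instead of the normal path.
FLAGGED_PATTERNS = {"Pattern 2"}

def route(instance):
    if instance["pattern"] in FLAGGED_PATTERNS:     # intercepted via BAM
        return "Special Activity"                   # controlled handling
    return instance["next_activity"]                # normal flow

print(route({"pattern": "Pattern 2", "next_activity": "Activity 2"}))  # Special Activity
print(route({"pattern": "Pattern 1", "next_activity": "Activity 2"}))  # Activity 2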
CONCLUSIONS
Over the past several years, Business Process Management and workflow tools have helped organizations to improve their performance and reach business goals, mainly through automating and accelerating business processes. However, there are still new applications in which BPM could be extended to provide even greater help in achieving business objectives.
In this chapter we have presented the concept of tendencies that can appear in an automated business process. A tendency in this context is a sequence of similar process instances that require similar activities to be processed, and that has the property of appearing in a relatively short period of time. Such tendencies can represent a wide range of business events, from a threat to the business to a market opportunity to be exploited.
The key factor is detecting the tendency in time and taking the proper actions in order to overcome it. We have proposed proactive and reactive mechanisms to detect the tendency. These mechanisms take advantage of the pattern approach proposed in previous work, which enables successful management of the knowledge stored in a BPMS.
[3] Laborde, Alejandro; Moreno, Juan J.; Novales, Rafael and Joyanes, Luis. Business Activity Monitoring and Business Rules for Exceptions Management in the rules of a BPMS. Proceedings of SISOFT 2006, Cartagena de Indias, Colombia.
We have described two major types of actions to take after the tendency is detected. The first is alerting the business analyst using natural language. In this way, the understanding and reaction time is reduced, decreasing the risk of a business threat and increasing the probability of being able to take advantage of a market opportunity. The second type of action is oriented to reacting against a business threat by automatically updating the process flow to intercept the potentially dangerous process instances.
Having reached these objectives, we can provide an answer to the problem definition: it is possible to manage tendencies in automated business processes, in order to reduce the risk of decreasing business results and to identify business opportunities.
ACKNOWLEDGEMENTS
The authors would like to thank the members of the Lithium Software Research Team, and the Process Automation Working Group of the Universidad Católica del Uruguay, for rich discussions of the concepts presented in this chapter.

Defining Easy Business Rules for
Accomplishing the Basel II Risk
Handling in Banks
Dr. Juan J. Trilles, AuraPortal BPMS, Spain
ABSTRACT
Practically all banks in the world are urged to implement, during the coming years starting in 2007, the new risk-covering procedures established by Basel II. To comply with this regulation it is expected that banks will have to invest huge amounts of money, due to the new financial software systems required and the difficulties involved in their implementation. However, an intelligent design of Business Rules supported on a BPM platform can be a new way to solve the problem, with enormous savings in money and time. The present work outlines the way to design those rules, with a practical example.
INTRODUCTION
The Framework for International Convergence of Capital Measurements and Capital Standards, known as Basel II, published by the Basel Committee on Banking Supervision, requires banks to align their capital adequacy assessments with underlying credit, market and operational risk to accurately reflect the adequacy of their capital reserves.
Basel II is built around three mutually reinforcing pillars. Pillar I describes the calculation of regulatory capital related to credit, market and operational risks, aligning minimum capital requirements more closely to each bank's risk of economic loss. Pillar II provides for supervisory review of an institution's activities and risk profile to determine the adequacy of its capital and internal assessment processes. Pillar III addresses improved market discipline and transparency through specific public disclosure and reporting requirements to encourage safe and sound banking practices.
Basel II takes effect starting in 2007 for banks located in most parts of the world, but it is anticipated that considerable delays will arise, given the deep and broad effects that the new regulations will have on banking practices; this will lead the national supervisory authorities to allow for longer periods of time in order to achieve true implementation of Basel II in their supervisory areas.
This paper does not intend to analyze the Basel II regulations or their impact on the banking world. Its purpose is to offer a way of including, in daily banking operations, the practices that Basel II promotes as a general set of guidelines that all banks should embrace irrespective of regulatory requirements, as stated in paragraph 444 of the framework document, and at the same time to reduce to a reasonable level the awesome cost (according to most financial consultants and banking experts) that this adoption may involve.
BASIC CONCEPTS
Basel II comprises several methods for determining the Regulatory Capital (the capital required to cover the risks associated with credit, market and operating conditions), one for each of the classified positions. Concerning the Credit Risk, there are two methods: Standard and IRB (Internal Ratings-Based Approach). Of these, the IRB is the most desirable because it can be tailored with great precision to the particular credit operations of each bank.
One of the key issues in adopting the Basel II IRB method in a bank's daily operations is the calculation of the Expected Loss (EL) associated with each credit operation and the Unexpected Loss (UL) evaluated for the whole set of credit assets.
The Expected Losses (EL) are to be registered as forecasted costs and, as such, must be provisioned in the balance sheet. The EL does not need a particular allocation of Regulatory Capital. EL is formulated as follows:
EL = PD * LGD * EAD, where:
• PD = Probability of Default (probability that the borrower will not honour his payment obligations).
• LGD = Loss Given Default (amount of money that will be lost if default occurs, after considering the guarantees and collateral that will be realized in case of default).
• EAD = Exposure At Default (amount that is subject to being defaulted).
The Unexpected Losses (UL) do require the allocation of Regulatory Capital to
cover the statistically calculated default risks.
RAROC (Risk Adjusted Return On Capital)
The inclusion of the credit risk component, as well as the consideration of the regulatory capital as the denominator in calculating the economic return of credit assets, has led to the definition of the term RAROC (Risk Adjusted Return on Capital). The formulation for RAROC follows.
RAROC = (GM - OE - EL + PRC) / RC, where:
• GM = Gross Margin (revenues minus borrowing costs).
• OE = Operating Expenses (proportional operating cost burden attributable to the particular operation, line of business, etc.).
• EL = Expected Loss (of the operation, line of business, etc.).
• RC = Regulatory Capital allocated for this operation, line of business, etc.
• PRC = Profit from Regulatory Capital (income produced by the regulatory capital placed in government-guaranteed deposits).
According to Basel II, the EL and RC components of the aforementioned RAROC formula have specific calculating methods, as follows:
EL = PD * LGD * EAD
RC = K * EAD
Of these, PD, EAD, LGD and consequently EL are to be calculated by the bank's own models. The calculation of K is accomplished by specific formulas provided by Basel II, as will be seen below.
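As a plain numeric illustration of these definitions (all input values are invented sample figures; K is simply taken as given here, since its formula appears later):

# Illustrative computation of EL, RC and RAROC from the definitions above.
PD, LGD, EAD = 0.02, 0.40, 100000    # outputs of the bank's own models
K = 0.05                             # regulatory capital coefficient (see below)
GM, OE = 6000, 1500                  # gross margin and operating expenses
deposit_rate = 0.03                  # return on the invested regulatory capital

EL = PD * LGD * EAD                  # expected loss: 800
RC = K * EAD                         # regulatory capital: 5000
PRC = deposit_rate * RC              # profit from regulatory capital: 150
RAROC = (GM - OE - EL + PRC) / RC    # (6000 - 1500 - 800 + 150) / 5000 = 0.77

print(EL, RC, PRC, RAROC)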
BUSINESS RULES
The above formulations for calculating RAROC according to the Basel II directives can easily be accomplished by defining a set of Business Rules to be used in the BPM processes devoted to handling credit risk in daily bank operations. To determine which rules will intervene, it is advisable to first analyze the model for calculating the formula components: PD, LGD, EAD, EL and RAROC.
In the present article, four types of business rules are used:
Textual Rule
This type of rule contains text that needs to be interpreted by a person in order to apply it. The interpretation and application occur within a personal task in the workflow generated by a process in the BPM system. The task executor reads the content of the rule and takes it as the instructions to be followed in that task.
Assignment Rule
This type of rule holds the values of several heterogeneous parameters that are used across the company independently of the processes. However, the processes can read the values of one or more of these parameters for their particular needs, at any time.
Calculation Rule
This type of rule performs a calculation (mathematical, date-based or literal) using any variables, constants, functions and operators. The result is saved in a field of a process panel as defined, either at rule definition time, at rule execution time, or both.
Inference Rule
This type of rule generates a value by inferring it from an unlimited set of criteria qualifications arranged in order in a matrix. The first combination that satisfies all conditions identifies the rule result. This result can also be used as a criterion by another rule; the former rule is then called a nested rule of the latter.
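To make the inference-rule idea tangible, here is a minimal sketch (the matrix contents and the rating scale are invented): the first row whose condition is fully satisfied determines the result.

# Illustrative inference rule: an ordered decision matrix; the first row whose
# condition holds yields the result (here, a borrower rating).
MATRIX = [
    # (combined condition over solvency and liquidity points, resulting rating)
    (lambda s, l: s >= 8 and l >= 8, "A"),
    (lambda s, l: s >= 5 and l >= 5, "B"),
    (lambda s, l: True,              "C"),   # catch-all row
]

def infer(solvency, liquidity):
    for condition, result in MATRIX:
        if condition(solvency, liquidity):
            return result

print(infer(9, 8), infer(6, 5), infer(2, 9))   # A B C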
THE RULES SCHEMES
In its purest and most powerful conception, a business rule has a purely declarative nature. It contains information about how something must (or should) be done but does not perform any action to achieve it. The action is to be performed by either:
• A personal task within the BPM process, in case the rule needs to be interpreted by human intelligence. This is the case for textual rules.
• A system task within the BPM process, in case the information in the rule does not need human interpretation. This is the case for assignment, calculation and inference rules.
• A divergent gateway within the BPM process, in similar circumstances as explained in the last point.
In the following paragraphs, the rules that intervene in the risk handling process are presented together with the functions they regulate. The figures show the rules as if they were executing the calculations, inferences, etc., but as just explained, one should not be misled by that representation. The actions are not performed by the rules but rather by the tasks and gateways in the processes that use them, as will be shown later.
Probability of Default (PD)
The model for calculating PD through business rules follows.

[Figure: business-rule model for calculating PD, showing the Launching Form inputs (Net Assets, Credit Amount, Yearly Income, Yearly Expenses), the Internal and External Earnestness Appraisals (personal task, rule RT.422), and rules RC.420, RC.421, RI.431, RI.432, RI.433 and RC.434 leading to the Borrower Points, the Borrower Rating and PD_Probability of Default.]

Here, the calculation of the PD (Probability of Default) is made by rule RI.436. This is an inference rule that obtains its value based on the Borrower Rating, which is obtained from another inference rule, RI.435, which in turn bases its result on the Borrower Points obtained through the calculation rule RC.434.
The rule RC.434 calculates the Borrower Points by means of a formula that multiplies the values of the Solvency Rating, Liquidity Rating and Earnestness Rating, obtained through the rules RI.431, RI.432 and RI.433 respectively.
The rule RI.431 derives the Solvency Rating based on the Solvency Margin, which is provided by rule RC.420. The rule RI.432 derives the Liquidity Rating based on the Liquidity Margin provided by rule RC.421. The rule RI.433 derives the Earnestness Rating directly from the values of the Internal Earnestness Appraisal and the External Earnestness Appraisal, both evaluated manually following the textual rule RT.422.
The values for Net Assets, Credit Amount, Yearly Income and Yearly Expenses related to the borrower are introduced manually by the applicant in the form used for the credit request (Launching Form).
Loss Given Default (LGD)
The LGD is the amount of money that will be lost if default occurs, after considering the guarantees and collateral that will be realized in case of payment default.
In order to evaluate the LGD it is assumed that the borrower will not be able to cover the debt and therefore the bank has to call in the collateral guarantees, for example, selling the mortgaged real estate offered as collateral. Therefore, an estimate of the value that will be recovered in case of executing the collateral must be known in advance in order to calculate LGD.
In the example presented here, it is assumed that several kinds of collateral can be used in the credit contract: Pawn, Bank Guarantee, Mortgage and Other Collateral. Therefore, all these possibilities must be contemplated.
The model showing the business rules that carry out the calculation of LGD is the following.

As shown in the figure, the LGD is obtained in rule RC.449 by multiplying the four mitigation components (Pawn, Bank, Mortgage and Other Collateral).
These mitigation components are obtained through the corresponding calculation rules RC.451, RC.452, RC.453 and RC.454.
The calculating formula in each rule multiplies the coverage provided by the collateral (obtained manually in the personal task that uses the textual rule RT.425 as instructions) by its weight factor.
The weight factors are obtained through the inference rules RI.455, RI.456, RI.457 and RI.458 on the basis of the rating values given manually to each collateral in the personal task that contains the set of instructions given in rule RT.424 (textual).
All the calculations envisaged in the business rules mentioned above are automatically and quickly performed by a particular system task within the process that takes care of the whole credit approval operation, as will be seen later.

EAD_Exposition
The value of EAD_Exposition normally (although not always) coincides with the
nominal Credit Amount. The rule RC.447 makes the calculation.





EL_Expected Loss
Once the risk components PD_Probability of Default, EAD_Exposition and LGD_Loss Given Default are known, the calculation rule RC.450 obtains the value of EL_Expected Loss by means of the formula EL = PD * EAD * LGD, as can be seen in the following figure.

K_Regulatory Capital Coefficient
The value of K is used for determining the Regulatory Capital, as explained above. The formulas for calculating K are credit-class dependent. There are six different alternatives:
1. Corporate, Sovereign and Bank exposures.
2. Small and Medium-sized Entities (SME) exposures.
3. High Volatility Commercial Real Estate (HVCRE) exposures.
4. Residential Mortgage (Retail) exposures.
5. Qualifying Revolving Retail exposures.
6. Other Retail exposures.
In this work, the chosen example deals with the credit class related to Residential Mortgage (Retail) exposures. The formula for determining K in this case is:
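For reference, the Basel II risk-weight function for retail exposures takes the form

K = LGD * N[ (1 - R)^(-0.5) * G(PD) + (R / (1 - R))^(0.5) * G(0.999) ] - PD * LGD

where N is the cumulative distribution function of the standard normal distribution, G is its inverse, and R is the asset correlation.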

To solve this formula, the values of R (the correlation between each particular asset and the whole set of assets or the economic environment), PD and LGD are required. The value of R for this class of credit is fixed by Basel II at 0.15. The PD and LGD values are calculated by the rules shown above.
With all these data, the rule RC.444 computes the value of K (K_Regulatory Capital Coefficient):
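A minimal numeric sketch of that computation, assuming the retail risk-weight function quoted above (the sample PD and LGD values are invented):

# Illustrative computation of K for a residential mortgage exposure (R = 0.15).
from statistics import NormalDist

def capital_coefficient(pd, lgd, r=0.15):
    n = NormalDist()            # standard normal distribution
    g = n.inv_cdf               # inverse of its cumulative distribution function
    return lgd * n.cdf((1 - r) ** -0.5 * g(pd)
                       + (r / (1 - r)) ** 0.5 * g(0.999)) - pd * lgd

K = capital_coefficient(pd=0.02, lgd=0.40)
print(round(K, 4))              # roughly 0.06 for these sample inputs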

RC_Regulatory Capital
The value of RC_Regulatory Capital is obtained by the calculation rule RC.445, which says RC = K * EAD:

GM_Gross Margin
The Gross Margin is the difference between the revenues and the borrowing cost for the bank to obtain the deposit. The corresponding calculations are carried out by the interconnected rules RC.403 and RC.443, using the Credit Amount (from the Launching Form) and the Capital Cost Factor that resides in the assignment rule RA.400, as shown:

OE_Operating Expenses
Similarly, the Operating Expenses charge is calculated by rule RC.442, which takes the values of the Credit Amount (from the Launching Form) and the Operating Expenses Factor that resides in the assignment rule RA.400, as shown:

PRC_Profit from Regulatory Capital
The regulatory capital cannot be used as funds for loans, but it can be invested in 100 percent secure positions such as state-guaranteed deposits. The interest rate for these deposits is registered in the assignment rule RA.400. With this information and the RC_Regulatory Capital determined by rule RC.445, the PRC_Profit from Regulatory Capital is derived by rule RC.446:

RAROC_ Risk Adjusted Return On Capital
At this stage, all components of RAROC are known and therefore its calculation is
made by the rule RC.460 as follows.
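To make the calculation chain concrete, the following Python sketch strings the steps together end to end. The factor values, the function names, and the final formula RAROC = (GM - OE - EL + PRC) / RC are illustrative assumptions used only to show how the rule outputs combine; they are not the actual AuraBank rules.

from math import sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def capital_coefficient(pd, lgd, r=0.15):
    """Basel II retail (residential mortgage) K formula with R fixed at 0.15 (cf. rule RC.444)."""
    return lgd * N.cdf(N.inv_cdf(pd) / sqrt(1 - r)
                       + sqrt(r / (1 - r)) * N.inv_cdf(0.999)) - pd * lgd

def raroc(credit_amount, pd, lgd,
          gross_margin_factor=0.030,       # hypothetical stand-in for the factor in RA.400
          operating_expense_factor=0.005,  # hypothetical
          state_deposit_rate=0.02):        # hypothetical
    ead = credit_amount                    # EAD taken equal to the nominal amount (RC.447)
    el = pd * ead * lgd                    # Expected Loss (RC.450)
    rc = capital_coefficient(pd, lgd) * ead          # Regulatory Capital (RC.445)
    gm = credit_amount * gross_margin_factor         # Gross Margin (RC.403 / RC.443)
    oe = credit_amount * operating_expense_factor    # Operating Expenses (RC.442)
    prc = rc * state_deposit_rate                    # Profit from Regulatory Capital (RC.446)
    return (gm - oe - el + prc) / rc       # assumed RAROC definition (cf. RC.460)

print(round(raroc(credit_amount=100_000, pd=0.01, lgd=0.25), 4))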

THE PROCESS
What has been explained up to this point is insufficient to describe the whole picture. Something has to be said about how the system described here is implemented. The best way to handle risk in a bank is to take all risk-related aspects into account from the first moment a credit is requested, and, needless to say, the natural environment for this kind of job is a BPMS arena in which the process tasks are in charge of calling and executing the business rules. Most rules call for automatic execution of their contents and are therefore performed by system tasks without the need for human intervention. Other rules need interpretation and should be carried out by means of human intervention.
In this work, the example is modelled using the widely adopted BPMN standard (the notation developed by BPMI.org and supported today by the OMG).
The imaginary bank is AuraBank and the Class of Process that will take care of
the credit request is AuraBank_Credit Request.
The Diagram of the Model for this Class of Process is shown below.

Depending on the credit class, the gateway DX will direct the flow to the appropriate subprocess. Each subprocess starts by placing a personal task in the workflow queue of the person with the role responsible for filling in the form with all data related to the credit request. In the example followed here, the subprocess Loans to Residential Mortgages has been chosen. Its expanded notation is the following.

In this subprocess, after the first task, Launching Form, has been completed by filling in the data related to the credit operation (applicant's name and attributes, loan amount, interest, reimbursement quotes, collateral, etc.), the flow arrives at the subprocess Risk Analysis. If the analysis produces a positive outcome, the loan is granted and the flow reaches the subprocess Loan Contract and Documentation. If the outcome is negative, the subprocess Process Cancellation informs about the loan denial and cancels the whole process.
The subprocess Risk Analysis has the following expanded notation.

In this subprocess, the nested subprocess Risk Calculations performs all the calculations according to the business rules explained above. The results are examined by the gateway DX.68, which decides which one of the two approval circuits applies.
The subprocess Risk Calculations shown here can be invoked from any subprocess, regardless of the credit class being handled. Its expanded notation is the following.

As can be seen, this subprocess contains all the system tasks that perform the required calculations. The business rules used for those calculations are also shown in the diagram.
CREATING BUSINESS RULES
Nothing has been said yet about how to define business rules. Naturally, different applications from different vendors will offer different facilities for creating the rules and for establishing the automatic connections with the BPM processes that will execute actions following their directions.
As a sample, several reproductions of rule-creation windows are shown below.
Textual Rule

Calculation Rule


Inference Rule

CONCLUSION
Taking into account the length limitations of a work that must fit into a single chapter of this handbook, I hope that the concepts outlined here will suffice to help the reader discover the use of business rules, in solid combination with powerful and flexible BPMS platforms, as an optimal solution for handling credit risk in any financial institution wishing to comply with the regulations contained in the Basel II framework. And that, at a fraction of the otherwise expected high cost.


MSCWV: Cyclic Workflow
Verification Algorithm for
Workflow Graphs
Sinnakkrishnan Perumal¹ and Ambuj Mahanti,
Indian Institute of Management Calcutta, India
Abstract: Workflow verification is an area of increasing concern in workflow management. It deals with verifying the structural correctness of workflows. This chapter reviews the existing algorithms and their limitations, and then provides an algorithm that can verify all types of workflow graphs. The theoretical complexity of the proposed algorithm is O(E²), where E is the number of edges in the workflow graph. The algorithm is described through an illustrative business example. Finally, we summarize the implementation details.
1. INTRODUCTION
A workflow generally cuts across various functional divisions of any organization (or
a set of organizations) thus mapping all the tasks that are to be performed for
achieving a given objective. Workflow Management Systems (WfMSs) deal with
automating such workflows. Workflows can be represented in one of the many representations available, including UML diagrams, workflow graphs and Petri nets. Further, workflows can be depicted from several perspectives during design time, such as the functional, resource, information, and control perspectives.
In this chapter, we deal with verifying structural correctness in the workflow graph representation of workflows. Various structural errors can occur in workflow graphs, such as deadlock, lack of synchronization and infinite loops. These structural conflicts are explained in detail in a later section.
1.1 Business Importance
In any workflow corresponding to a business process, especially one involving hundreds of tasks, there is a good chance that a cycle is present. This could be due to repetition of tasks for correction, adherence to business rules, etc. Errors in workflows need to be identified before the workflows are deployed in a business environment; otherwise, they can lead to issues such as customer dissatisfaction, employee frustration with the systems, reduced profits, etc. This problem is especially severe in workflows with cycles, owing to the complexity involved in identifying errors in such workflows through human inspection.
1.3 Workflow graph representation
In the workflow graph representation, five types of node constructs are used: Se-
quence node, AND-split node, AND-join node, OR-split node and OR-join node. Se-
quence nodes, AND-split nodes and AND-join nodes are represented as rectangles;
OR-split nodes and OR-join nodes are represented as circles. (Table-1) gives the
node constructs and their descriptions.

1 This research was partially supported by Infosys Technologies Limited, Bangalore, under the Infosys Fellowship Award.
Table-1: Node Constructs
1. Sequence node: A sequence node can have a maximum of one incoming edge and a maximum of one outgoing edge.
2. AND-split node: An AND-split node has more than one outgoing edge and can have a maximum of one incoming edge. An AND-split node is used to concurrently execute several threads of tasks.
3. AND-join node: An AND-join node has more than one incoming edge and can have a maximum of one outgoing edge. An AND-join node is used to merge the several threads of tasks initiated by AND-split node(s). An AND-join node waits for all the threads represented by all its incoming edges before processing the subsequent node.
4. OR-split node: An OR-split node has more than one outgoing edge and can have a maximum of one incoming edge. An OR-split node is used to choose exactly one thread out of several threads of tasks.
5. OR-join node: An OR-join node has more than one incoming edge and can have a maximum of one outgoing edge. An OR-join node is used to merge the threads of tasks arising from OR-split node(s). An OR-join node waits for exactly one of the threads represented by its incoming edges before processing the subsequent node.
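As a concrete illustration, a workflow graph built from these five constructs can be captured with a very small data structure. The sketch below is an illustrative Python representation of our own; it is not the authors' implementation (which, as described later, was written in C).

from dataclasses import dataclass, field
from enum import Enum

class NodeType(Enum):
    SEQUENCE = "sequence"
    AND_SPLIT = "and-split"
    AND_JOIN = "and-join"
    OR_SPLIT = "or-split"
    OR_JOIN = "or-join"

@dataclass
class WorkflowGraph:
    node_type: dict = field(default_factory=dict)    # node name -> NodeType
    successors: dict = field(default_factory=dict)   # node name -> list of child node names

    def add_node(self, name, kind):
        self.node_type[name] = kind
        self.successors.setdefault(name, [])

    def add_edge(self, source, target):
        self.successors[source].append(target)

# Tiny example: OR-split C1 chooses between tasks B and C; OR-join C2 merges them.
g = WorkflowGraph()
for name, kind in [("A", NodeType.SEQUENCE), ("C1", NodeType.OR_SPLIT),
                   ("B", NodeType.SEQUENCE), ("C", NodeType.SEQUENCE),
                   ("C2", NodeType.OR_JOIN), ("D", NodeType.SEQUENCE)]:
    g.add_node(name, kind)
for u, v in [("A", "C1"), ("C1", "B"), ("C1", "C"), ("B", "C2"), ("C", "C2"), ("C2", "D")]:
    g.add_edge(u, v)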
1.4 Structural Conflict Errors in Workflow Graphs
(Table-2) lists various structural errors that can arise while depicting the workflows
as workflow graphs. (Table-3) shows example workflow graphs for various struc-
tural conflicts.


Table-2: Structural Conflict Errors in Workflow Graphs
1. Deadlock: Deadlock occurs when not all the incoming edges representing the various threads are received at an AND-join node, thus leading to the AND-join node waiting forever.
2. Lack of synchronization: Lack of synchronization occurs when more than one concurrent thread from AND-split node(s) merges at an OR-join node, leading to multiple executions of the paths following the OR-join node.
3. Infinite loops: An infinite loop occurs when the loop will be executed forever, irrespective of the decisions made on choosing the child nodes in the various OR-split nodes within the loop.
4. Infinite loop and deadlock: This structural conflict occurs when a loop is triggered by an AND-split node, and when the loop thread merges with the main thread in an AND-join node.
5. Nodes that cannot be activated: This structural conflict occurs when there are nodes in the workflow graph which cannot be reached from the start node.
6. Nodes that cannot be terminated: This structural conflict occurs when there is no path from some node to the terminal node (also called the end node).
7. Bushy loops: Bushy loops are finite loops with extraneous nodes connected to them which are not part of the loop.
7 (a). Loops with extraneous links to nodes outside the loop: This structural conflict occurs when an outgoing edge from an AND-split node within a loop goes to a node outside the loop. It leads to multiple executions of the nodes outside the loop.
7 (b). Loops with extraneous links from nodes outside the loop: This structural conflict occurs when there is an edge from a node outside the loop to an AND-join node within the loop. This leads to a deadlock problem at the AND-join node.
Table-3: Example workflow graphs for various structural conflicts
Figure-1: Example for Deadlock
Figure-2: Another Example for Deadlock
Figure-3: Example for Lack of synchronization
Figure-4: Example for Infinite loop
Figure-5: Example for Infinite loop and deadlock
Figure-6: Example for Nodes that cannot be activated
Figure-7: Example for Nodes that cannot be terminated
Figure-8: Example for Loops with extraneous links to nodes outside the loop
Figure-9: Example for Loops with extraneous links from nodes outside the loop
1.5 Scope defined through Patterns
[1] describes twenty-one workflow patterns that can be used for building business processes. Of these patterns, the following five were identified as Basic Control Patterns:
Sequence (WfMC calls this pattern Sequential Routing)
Parallel Split (WfMC calls this pattern AND-Split)
Synchronization (WfMC calls this pattern AND-Join)
Exclusive Choice (WfMC calls this pattern OR-Split)
Simple Merge (WfMC calls this pattern OR-Join)
This chapter proposes an algorithm called the Mahanti-Sinnakkrishnan Cyclic Workflow Verification (MSCWV) Algorithm to verify workflow graphs that are created with the above-mentioned five basic control patterns and the pattern Arbitrary Cycles. The Arbitrary Cycles pattern refers to unstructured loops, and thus it enables complex loop structures in the workflow graph; in particular, it allows loops with multiple entry points and multiple exit points.
2 LITERATURE REVIEW
Workflow verification was solved for the Petri Net representation of workflows as given in [2] and [3]. A restricted version of Petri Nets called WF Nets is used to represent workflows. The advantages of this method are that (1) it uses the theoretical background of Petri Nets to analyze the workflows, and (2) it verifies workflows with loops as well. The disadvantages are that (1) representation of the workflow in Petri Nets is not intuitive and easy to understand, (2) verification is very complicated as it uses the theoretical foundation of Petri Nets, and (3) only Free-Choice WF Nets and well-structured WF Nets can be verified in polynomial time [4].
The graph reduction rule method for verifying workflows represented as workflow graphs was first given in [5]. In this method, a set of graph reduction rules is repeatedly applied to the workflow graph to reduce it to an empty graph. If the workflow graph does not have any structural conflicts, then it reduces to an empty graph; otherwise it does not. The advantages of this method are that (1) it is simple to use, and (2) it uses the workflow graph representation, which is easier to understand. Its disadvantage is that it does not give correct results when certain overlapped constructs (as proved in [6]) are present in workflow graphs. Thus, to verify all acyclic workflow graphs, [6] came up with a set of graph reduction rules which can reduce any correct acyclic workflow graph into an empty graph, while no incorrect acyclic workflow graph can be reduced with this set of rules.
The propositional logic method (as given in [7]) uses a logic-based representation of the workflow. This method uses the traditional propositional logic way of reducing logic formulae to a conclusion, with modifications to suit the constraints imposed by workflow. The complexity of this method is O(N²), where N refers to the number of tasks in the workflow process. The first disadvantage of this method is that it does not give correct results when certain overlapped constructs are present in workflow graphs. Another disadvantage is that it does not handle nested cycles.
A method for workflow verification based on block reduction, and on partitioning cyclic workflow graphs into acyclic subgraphs, was given in [8]. At each iteration, this method searches for reducible blocks in the complete workflow graph. Reducible blocks are those blocks (a set of nodes and associated links) within the workflow graph which can potentially be reduced to a block activity node without changing the structural properties of the workflow graph in terms of structural conflicts. The worst-case complexity of this method is O((N+T)²·N²), where T refers to the set of edges and N refers to the set of nodes in the workflow graph. This method does not give correct results for certain workflow graphs that could lead to unintentional multiple executions of the nodes outside the cycle. This problem is explained in [9].
The Mahanti-Sinnakkrishnan algorithm (detailed in [10], [11], and [12]) is an iterative graph-search based algorithm that verifies workflows represented as workflow graphs. It is simple compared to other workflow verification algorithms: it works directly on workflow graphs (which are themselves simple), it uses graph search techniques, and it is modularized using three simple procedures. It is also efficient compared to other workflow verification algorithms, and it verifies all types of acyclic workflow graphs, including overlapped workflow graphs. However, it cannot verify cyclic workflow graphs. This chapter gives an extension of the Mahanti-Sinnakkrishnan algorithm, called the Mahanti-Sinnakkrishnan Cyclic Workflow Verification (MSCWV) algorithm, to verify both acyclic and cyclic workflow graphs.
3 RESEARCH GAPS
From the discussion in section 2, it is clear that no existing algorithm can individually verify all types of cyclic workflow graphs correctly. Even though the method proposed in [2] for WF Nets verifies cyclic workflows, it does not verify workflows represented as workflow graphs. Although an algorithm is given in [13] for converting workflow graphs into WF Nets and vice versa, it is a tedious process. Hence, in this chapter, we propose a new algorithm, called the Mahanti-Sinnakkrishnan Cyclic Workflow Verification (MSCWV) algorithm, designed to verify all types of cyclic and acyclic workflow graphs.
4 MAHANTI-SINNAKKRISHNAN CYCLIC WORKFLOW VERIFICATION (MSCWV) ALGORITHM
4.1 Approach
The MSCWV algorithm verifies all types of cyclic and acyclic workflow graphs. The idea behind the algorithm is as follows. Decision points in workflow processes are those moments at which the manager has to decide which option to choose for the given case. Decision points are denoted by OR-split nodes in the workflow process. If these decisions are made in such a way that there is no repetition of tasks in the workflow process, then the result is a base subgraph (defined below). If, at a decision point, the manager decides to repeat a set of tasks, the result is an OR-split triggered loop (also defined below). All such base subgraphs and OR-split triggered loops have to be identified and verified separately in order to verify the workflow graph. However, similar to the Mahanti-Sinnakkrishnan algorithm for acyclic workflow graphs, the MSCWV algorithm checks only a subset of the base subgraphs and OR-split triggered loops, and still verifies the cyclic workflow graph correctly. Each iteration of the MSCWV algorithm corresponds to the identification and verification of either a base subgraph or an OR-split triggered loop.
We define a Base Subgraph as follows. The start node of the workflow graph belongs to every base subgraph. If a node n belongs to a base subgraph and n is an OR-split node, then exactly one child node of n, and the edge connecting n to this child node, belong to the base subgraph. If a node n belongs to a base subgraph and n is not an OR-split node, then all child nodes of n, and the edges connecting n to these child nodes, belong to the base subgraph. No OR-split node that belongs to a base subgraph can trigger a loop within that base subgraph.
Similarly, we define an OR-split triggered loop as follows. An OR-split triggered loop is triggered by an OR-split node which we call the loop-trigger-node. If a node n belongs to an OR-split triggered loop and n is an OR-split node, then exactly one child node of n, and the edge connecting n to this child node, belong to the OR-split triggered loop. If a node n belongs to an OR-split triggered loop and n is not an OR-split node, then all child nodes of n, and the edges connecting n to these child nodes, belong to the OR-split triggered loop. No OR-split node in an OR-split triggered loop, other than the loop-trigger-node, can trigger a loop within that OR-split triggered loop.
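To illustrate the base subgraph definition, the following Python sketch collects the nodes of one base subgraph by following exactly one (given) child at every OR-split and all children everywhere else. The data layout and names are assumptions of this sketch, not the published implementation.

def base_subgraph_nodes(successors, or_splits, start, or_split_choice):
    """successors: node -> list of children; or_splits: set of OR-split nodes;
    or_split_choice: OR-split node -> the single child selected for it."""
    visited, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        children = [or_split_choice[node]] if node in or_splits else successors.get(node, [])
        stack.extend(children)
    return visited

# Example: OR-split C1 chooses branch B, so branch C stays out of this base subgraph.
succ = {"A": ["C1"], "C1": ["B", "C"], "B": ["C2"], "C": ["C2"], "C2": ["D"], "D": []}
print(base_subgraph_nodes(succ, {"C1"}, "A", {"C1": "B"}))  # {'A', 'C1', 'B', 'C2', 'D'}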
4.2 Description of the algorithm
The MSCWV algorithm is an iterative algorithm. (Table-4) gives the MSCWV algorithm at an abstract level. The various procedures of the MSCWV algorithm are described in section 4.3; they are:
CISC
Check_if_loop
VISC
VL
Prepare_for_next_instance
The initial base subgraph of the cyclic workflow graph, created in the first iteration, is obtained by traversing from the start node of the workflow graph. In subsequent iterations, OR-split triggered loops and base subgraphs are obtained by traversing from a specially identified OR-split node called the PED-OR node. The PED-OR node is a partially explored, deepest OR-split node with respect to the explicit graph obtained in the previous iteration.
Traversal in any procedure happens as follows. From a traversed OR-split node, exactly one of its child nodes is chosen for further traversal. From any other traversed node, all its child nodes are chosen for further traversal. Depth First Search is used for traversal.
Table-4: MSCWV algorithm at an abstract level
Step A: Check for nodes that cannot be activated:
  If found, the algorithm reports an error and stops
Step B: Check for nodes that cannot be terminated:
  If found, the algorithm reports an error and stops
Step C: Check for any back edge (a back edge connects a node to its ancestor) with destination node as an AND-join node:
  If found, the algorithm reports an error and stops
Step D:
Do
  Step 1: Call CISC
    o Creates a base subgraph or an OR-split triggered loop
  Step 2: Call Check_if_loop
    o Checks whether Step 1 created a base subgraph or an OR-split triggered loop
    o This procedure may be called once or twice in any iteration
  Step 3:
    If an OR-split triggered loop was created in Step 1:
      o Call VL
        - Verifies the OR-split triggered loop
        - Deadlock, Lack of synchronization and Loops with extraneous links to nodes outside the loop are identified
        - The algorithm reports an error and stops if a structural conflict was found
    Otherwise:
      o Call VISC
        - Verifies the base subgraph
        - Deadlock and Lack of synchronization are identified
        - The algorithm reports an error and stops if a structural conflict was found
  Step 4: Call Prepare_for_next_instance
    o Prepares data structures for the next iteration
    o Also identifies the PED-OR node for the next iteration
    o The algorithm reports the graph correct and stops if no PED-OR node is found
While (PED-OR node found);
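The control flow of Table-4 can be sketched in Python as follows. The parameter and procedure names are illustrative stand-ins for the steps above (the five procedures are passed in as callables rather than implemented), and this is not the authors' C implementation.

def mscwv_verify(graph, initial_checks, cisc, check_if_loop, visc, vl,
                 prepare_for_next_instance):
    """Skeleton of the MSCWV main loop; the procedures themselves are supplied by the caller."""
    for check in initial_checks:                          # Steps A, B and C
        error = check(graph)
        if error:
            return "error: " + error
    or_split_stack = []                                   # partially explored OR-split nodes
    ped_or_node = None                                    # first iteration traverses from the start node
    while True:
        subgraph = cisc(graph, ped_or_node, or_split_stack)                 # Step 1
        loop_trigger = check_if_loop(graph, ped_or_node, or_split_stack)    # Step 2
        error = vl(graph, subgraph) if loop_trigger else visc(graph, subgraph)  # Step 3
        if error:
            return "error: " + error
        ped_or_node = prepare_for_next_instance(graph, or_split_stack)      # Step 4
        if ped_or_node is None:
            return "correct"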
4.3 Procedures in the algorithm
4.3.1 CISC
CISC creates either a base subgraph or an OR-split triggered loop.
In the first iteration, CISC expands from the start node. In subsequent iterations, CISC expands from the PED-OR node. CISC does not traverse through any already expanded node.
From the PED-OR node, the child node that has the lower distance to the end node, and for which the edge from the PED-OR node has not yet been traversed, is chosen for traversal. If an unexplored OR-split node is found during this traversal, CISC chooses its child node with the lower distance to the end node.
4.3.2 Check_if_loop
The Check_if_loop procedure checks whether the PED-OR node is:
involved in a base subgraph, or
involved in an already-found OR-split triggered loop, or
triggering a new OR-split triggered loop.
The initial preparations made in the main procedure before calling Check_if_loop are as follows. The Loop_tag of the PED-OR node in OR_split_stack is set to OFF (OR_split_stack is a stack for storing partially explored OR-split nodes; Loop_tag is a Boolean kept for each OR-split node in OR_split_stack). Any node in OR_split_stack with Loop_tag ON will be an ancestor of the PED-OR node, in reverse order from the top of the stack.
Check_if_loop traverses beginning from the node given to it as input parameter. It is called the first time with the PED-OR node as input parameter. It checks whether the PED-OR node is visited again, and whether any of the OR-split nodes with Loop_tag ON in OR_split_stack is visited. If the PED-OR node is visited again, this means that the PED-OR node is in an OR-split triggered loop. However, if any of the OR-split nodes with Loop_tag ON in OR_split_stack is also visited, this means that the PED-OR node is in an already-found OR-split triggered loop, and is not triggering a new loop. For this check, each OR-split node in OR_split_stack with Loop_tag ON, taken in bottom-to-top order in OR_split_stack, is examined for having been visited. If an OR-split node is found to be visited in this check, then the Loop_tag of all OR-split nodes in OR_split_stack above this node is set to OFF. If the PED-OR node is found to be triggering a new loop, then VL is called to verify the OR-split triggered loop starting from the PED-OR node, and Loop_tag is set to ON for the PED-OR node. Else, if the PED-OR node is found to be in an already-found OR-split triggered loop, then VL is called to verify that OR-split triggered loop. Else, Check_if_loop is called again in this iteration, as follows.
If there is any OR-split node in OR_split_stack with Loop_tag ON, then the Check_if_loop procedure is called a second time with, as input parameter, the top-most OR-split node in OR_split_stack with Loop_tag ON. If this call finds a loop, VL is called to verify the OR-split triggered loop.
If the Check_if_loop procedure did not identify any loop in the above call(s), then the Loop_tag of all OR-split nodes in OR_split_stack is set to OFF, and VISC is called to verify the base subgraph.
4.3.3 VISC
VISC verifies the base subgraph by traversing from the start node. If an OR-join node is visited more than once, then VISC reports a Lack of synchronization structural conflict and the algorithm stops. After visiting all the nodes of the base subgraph, VISC checks whether each visited AND-join node was visited through all of its incoming edges. If any visited AND-join node was not visited through all of its incoming edges, then VISC reports a Deadlock structural conflict and the algorithm stops.
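A simplified sketch of these two checks is shown below. It assumes the traversal has already recorded, for each node, how many times it was reached and through which incoming edges; the data layout is an assumption of this sketch, not the authors' implementation.

def visc_checks(node_type, incoming_edges, visit_count, visited_via):
    """node_type: node -> 'or-join', 'and-join' or other; incoming_edges: node -> set of edges;
    visit_count: node -> times the node was reached; visited_via: node -> set of edges used."""
    for node, count in visit_count.items():
        if node_type.get(node) == "or-join" and count > 1:
            return "Lack of synchronization at OR-join " + node
    for node, count in visit_count.items():
        if node_type.get(node) == "and-join" and count > 0:
            if visited_via.get(node, set()) != incoming_edges[node]:
                return "Deadlock at AND-join " + node
    return None  # the base subgraph is structurally correct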
4.3.4 VL
VL verifies the OR-split triggered loop by traversing from the OR-split node that triggers the loop identified in the Check_if_loop procedure. If an OR-join node is visited more than once, then VL reports a Lack of synchronization structural conflict and the algorithm stops. After visiting all the nodes of the OR-split triggered loop, VL checks whether each visited AND-join node was visited through all of its incoming edges. If any visited AND-join node was not visited through all of its incoming edges, then VL reports a Deadlock structural conflict and the algorithm stops. Also, after visiting all nodes of the OR-split triggered loop, VL checks whether the end node of the workflow graph has been visited. If the end node has been visited, then VL reports an Extraneous link to end node structural conflict and the algorithm stops.
4.3.5 Prepare_for_next_instance
Prepare_for_next_instance updates OR_split_stack and identifies the PED-OR node. It removes the top node(s) of OR_split_stack if all their outgoing edges have been traversed. While removing any node from OR_split_stack, Prepare_for_next_instance checks whether the OR-split node has its Loop_tag ON in OR_split_stack. If the Loop_tag is ON, then it changes the marking for the OR-split node to the outgoing edge corresponding to the last non-loop edge. The last non-loop edge for any OR-split node is the last traversed outgoing edge that did not trigger a loop. If OR_split_stack is not empty, then the top node of OR_split_stack is chosen as the PED-OR node. If OR_split_stack is empty, then Prepare_for_next_instance reports that there are no structural conflict errors in the workflow graph and the algorithm stops.
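A simplified sketch of this stack maintenance is given below; the entry fields mirror the columns of Table-7 (node, Loop_tag, last non-loop edge), while the fully_traversed test and the marking dictionary are assumptions of the sketch rather than the published implementation.

def prepare_for_next_instance(or_split_stack, fully_traversed, marking):
    """or_split_stack: list of dicts {'node', 'loop_tag', 'last_non_loop_edge'}, top at the end;
    fully_traversed(node) is True once all outgoing edges of the node have been traversed."""
    while or_split_stack and fully_traversed(or_split_stack[-1]["node"]):
        entry = or_split_stack.pop()
        if entry["loop_tag"]:
            # reset the marking to the last outgoing edge that did not trigger a loop
            marking[entry["node"]] = entry["last_non_loop_edge"]
    if or_split_stack:
        return or_split_stack[-1]["node"]   # the new PED-OR node
    return None  # empty stack: no structural conflicts were found, the algorithm stops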
4.4 Definition of Level of Loop
The number of OR-split nodes with Loop_tag ON in OR_split_stack gives the level of the loop. A workflow graph can have many loops at the same level.
4.5 Worked Example
(Figure-10) gives a business workflow graph with cycles, used to illustrate the MSCWV algorithm step by step (more examples of how the MSCWV algorithm works can be found in [9]). This sales order workflow process example was adopted from [14] and changed to suit the needs of this chapter. The example has several base subgraphs and several OR-split triggered loops. (Table-6) gives the iteration table, which shows the PED-OR node, the sequence of nodes expanded by CISC, and the traversals by Check_if_loop, VISC and VL. For brevity, iterations 2 and 3 are shown through text.


Figure-10: Example Cyclic Workflow Graph for Illustration of the Algorithm
(Nodes: T1: New order; T2: Data verify; T3: Order submitted for approval; T4: Credit check; T5: Order ready to ship; T6: Order cancelled; T7: Ship order; T8: End order processing; C1 to C6: OR-split and OR-join nodes.)
Table-5: Distance from start node and distance to end node for each node of the cyclic workflow graph in (Figure-10)
Node | Distance from start node | Distance to end node
T1 | 0 | 7
C1 | 1 | 6
T2 | 2 | 5
C2 | 3 | 4
T3 | 4 | 5
C3 | 5 | 4
T4 | 6 | 5
C4 | 7 | 4
C5 | 4 | 3
T5 | 8 | 3
T6 | 5 | 2
T7 | 9 | 2
C6 | 6 | 1
T8 | 7 | 0
It can be seen from (Table-5) that all nodes in the workflow graph have a finite distance to the end node. If any node had an infinite distance to the end node, this would indicate a nodes that cannot be terminated error. Since the example workflow graph has no such nodes, it is free from this error. Similarly, the distance from the start node for each node is used to check for the nodes that cannot be activated error.
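The distance-to-end-node values of Table-5 can be obtained with a reverse breadth-first search from the end node, as in the illustrative sketch below (distances counted in edges; this reading of the table is an assumption of the sketch).

from collections import deque

def distance_to_end(successors, end):
    """Shortest distance (in edges) from every node to the end node, by reverse BFS.
    Nodes left at infinity cannot reach the end node: 'nodes that cannot be terminated'."""
    nodes = set(successors) | {v for children in successors.values() for v in children}
    predecessors = {n: [] for n in nodes}
    for u, children in successors.items():
        for v in children:
            predecessors[v].append(u)
    dist = {n: float("inf") for n in nodes}
    dist[end] = 0
    queue = deque([end])
    while queue:
        node = queue.popleft()
        for p in predecessors[node]:
            if dist[p] == float("inf"):
                dist[p] = dist[node] + 1
                queue.append(p)
    return dist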
There are two back edges in this workflow graph: the first is from node C3 to node C1, and the second is from node C4 to node C1. In both cases, the destination node of the back edge is node C1, which is an OR-join node. If any destination node of a back edge were an AND-join node, there would be a deadlock error. Since there is no such problem in this example workflow graph, it is free from this error.
(Table-6) and (Table-7) present the iteration-wise trace of the algorithm and the updates of OR_split_stack made by the algorithm. The following text should be read in conjunction with these two tables. Broad guidelines for reading (Table-7) are presented just above that table.
Iteration 1: CISC begins from the start node, and the nodes expanded are T1, C1, T2, C2, C5, T6, C6, T8, in this order. Of these nodes, node C2 is an OR-split node. The child node of C2 with the lower distance to the end node, and which has not yet been reached through C2, is node C5; hence node C5 was chosen for further expansion at node C2. The Check_if_loop procedure is called with the start node and does not identify any OR-split triggered loop in this iteration. Hence, VISC is called to verify the base subgraph. For this base subgraph, in VISC, all the visited OR-join nodes are visited just once. Further, there is no AND-join node in this base subgraph. Hence, this base subgraph is verified to be correct.
Iteration 2: PED-OR node is the topmost OR-split node in the OR-split stack,
hence PED-OR node for the second iteration is C2. In the second iteration, CISC
starts from the PED-OR node C2, and the nodes expanded are C2, T3, C3, C5.
Check_if_loop procedure is called with the PED-OR node C2 and it traverses the
nodes, C2, T3, C3, C5, T6, C6 and T8, and does not identify any OR-split triggered
loop. Hence, VISC is called to verify the base subgraph. VISC verifies this base sub-
graph as a correct one. As there was no OR-split triggered loop in this iteration, last
non-loop edge for PED-OR node C2 is set as T3 corresponding to the marking from
it.
Iteration 3: PED-OR node for the third iteration is C3. CISC expands the following
nodes, C3, T4, C4, T5, T7, and C6. Check_if_loop procedure is called with the PED-
OR node C3 and it traverses the nodes, C3, T4, C4, T5, T7, C6 and T8, and does
not identify any OR-split triggered loop. Hence, VISC is called to verify the base
subgraph. VISC verifies this base subgraph as a correct one. As there was no OR-
split triggered loop in this iteration, last non-loop edge for PED-OR node C3 is set as
T4 corresponding to the marking from it.
Iteration 4: PED-OR node for the fourth iteration is C4. CISC expands the follow-
ing nodes, C4 and C5. Check_if_loop procedure is called with the PED-OR node C4
and it traverses the nodes, C4, C5, T6, C6 and T8, and does not identify any OR-
split triggered loop. Hence, VISC is called to verify the base subgraph. VISC verifies
this base subgraph as a correct one. As there was no OR-split triggered loop in this
iteration, last non-loop edge for PED-OR node C4 is set as C5 corresponding to the
marking from it.
Iteration 5: PED-OR node for the fifth iteration is C4. CISC expands the following
nodes, C4 and C1. Check_if_loop procedure is called with the PED-OR node C4 and
it traverses the nodes, C4, C1, T2, C2, T3, C3, T4 and C4, and finds that this is an
OR-split triggered loop triggered by node C4. VL is called to verify this OR-split trig-
gered loop. For this OR-split triggered loop, in VL, all the visited OR-join nodes are
visited just once. Further, there is no AND-join node in this loop and no loop-
related structural conflicts were found in this iteration. Hence, this OR-split trig-
gered loop is verified to be correct. As a new OR-split triggered loop starting from
PED-OR node was identified in this iteration, Loop_tag is set as ON for PED-OR
node C4 in the OR-split stack.
Iteration 6: PED-OR node for the sixth iteration is C3. CISC expands the following
nodes, C3 and C1. Check_if_loop procedure is called with the PED-OR node C3 and
it traverses the nodes, C3, C1, T2, C2, T3 and C3, and finds that this is an OR-split
triggered loop triggered by node C3. VL is called to verify this OR-split triggered
loop. VL verifies this OR-split triggered loop as a correct one. As a new OR-split trig-
gered loop starting from PED-OR node was identified in this iteration, Loop_tag is
set as ON for PED-OR node C3 in the OR-split stack. This OR-split triggered loop is
verified to be correct. In Prepare_for_Next_Instance procedure, OR-split stack is up-
dated by removing OR-split nodes which are completely expanded. This leads to an
empty OR-split stack. Hence, the workflow graph is reported to be free from struc-
tural conflict errors and the algorithm stops.
Table-6: Iteration Table for the workflow graph given in (Figure-10)
Iteration 1 (PED-OR node = NIL):
  CISC: nodes expanded are T1, C1, T2, C2, C5, T6, C6, T8
  Check_if_loop: Result: False
  VISC: nodes traversed are T1, C1, T2, C2, C5, T6, C6, T8
Iteration 2 (PED-OR node = C2):
  CISC: nodes expanded are C2, T3, C3, C5
  Check_if_loop: nodes traversed are C2, T3, C3, C5, T6, C6, T8; Result: False
  VISC: nodes traversed are T1, C1, T2, C2, T3, C3, C5, T6, C6, T8
Iteration 3 (PED-OR node = C3):
  CISC: nodes expanded are C3, T4, C4, T5, T7, C6
  Check_if_loop: nodes traversed are C3, T4, C4, T5, T7, C6, T8; Result: False
  VISC: nodes traversed are T1, C1, T2, C2, T3, C3, T4, C4, T5, T7, C6, T8
Iteration 4 (PED-OR node = C4):
  CISC: nodes expanded are C4, C5
  Check_if_loop: nodes traversed are C4, C5, T6, C6, T8; Result: False
  VISC: nodes traversed are T1, C1, T2, C2, T3, C3, T4, C4, C5, T6, C6, T8
Iteration 5 (PED-OR node = C4):
  CISC: nodes expanded are C4, C1
  Check_if_loop: nodes traversed are C4, C1, T2, C2, T3, C3, T4, C4; Result: True; loop triggered by C4
  VL: nodes traversed are C4, C1, T2, C2, T3, C3, T4
Iteration 6 (PED-OR node = C3):
  CISC: nodes expanded are C3, C1
  Check_if_loop: nodes traversed are C3, C1, T2, C2, T3, C3; Result: True; loop triggered by C3
  VL: nodes traversed are C3, C1, T2, C2, T3
(Table-7) shows three updates of OR_split_stack for each iteration: (1) updates after calling CISC, (2) updates after incorporating the results of Check_if_loop, and (3) updates at the end of the iteration. The entries updated at each stage can be seen by comparing consecutive stages. CISC adds new OR-split nodes into OR_split_stack. If Check_if_loop detects an OR-split triggered loop, then Loop_tag is set to ON for the PED-OR node; otherwise the last non-loop edge is updated for the PED-OR node. At the end of an iteration, completely explored OR-split nodes are removed from the top of OR_split_stack.
Table-7: Iteration-wise OR_split_stack updates for the workflow graph in (Figure-10)
Each stack entry is shown as: OR-split node, Loop_tag, last non-loop edge (top of the stack first).
Iteration 1 (PED-OR node = NIL):
  After CISC:           C2, OFF, C5
  After Check_if_loop:  C2, OFF, C5
  End of iteration:     C2, OFF, C5
Iteration 2 (PED-OR node = C2):
  After CISC:           C3, OFF, C5 | C2, OFF, C5
  After Check_if_loop:  C3, OFF, C5 | C2, OFF, T3
  End of iteration:     C3, OFF, C5 | C2, OFF, T3
Iteration 3 (PED-OR node = C3):
  After CISC:           C4, OFF, T5 | C3, OFF, C5 | C2, OFF, T3
  After Check_if_loop:  C4, OFF, T5 | C3, OFF, T4 | C2, OFF, T3
  End of iteration:     C4, OFF, T5 | C3, OFF, T4 | C2, OFF, T3
Iteration 4 (PED-OR node = C4):
  After CISC:           C4, OFF, T5 | C3, OFF, T4 | C2, OFF, T3
  After Check_if_loop:  C4, OFF, C5 | C3, OFF, T4 | C2, OFF, T3
  End of iteration:     C4, OFF, C5 | C3, OFF, T4 | C2, OFF, T3
Iteration 5 (PED-OR node = C4):
  After CISC:           C4, OFF, C5 | C3, OFF, T4 | C2, OFF, T3
  After Check_if_loop:  C4, ON, C5 | C3, OFF, T4 | C2, OFF, T3
  End of iteration:     C3, OFF, T4 | C2, OFF, T3
Iteration 6 (PED-OR node = C3):
  After CISC:           C3, OFF, T4 | C2, OFF, T3
  After Check_if_loop:  C3, ON, T4 | C2, OFF, T3
  End of iteration:     (empty)
5 ALGORITHM IMPLEMENTATION AND RANDOM GRAPH GENERATION
The MSCWV algorithm was implemented in the C language to run on a Linux server. A random graph generator was used to create workflow graphs of various sizes and types, in order to check the correctness of the algorithm and to find the execution time of the workflow verification algorithm for graphs of various sizes. The random graph generator was also coded in C to run on a Linux server.
The random graph generator uses constructs to build a workflow graph. The constructs used in the random graph generator are: AND construct, OR construct, AND cluster construct, OR cluster construct, Loop construct, First Level Overlapped construct, Wrong AND construct, Wrong OR construct, and Wrong Loop construct. Except for the loop constructs, all constructs replace a sequence node or an edge in order to add the construct to the workflow graph; loop constructs are added by adding a feedback edge to the workflow graph. In this set, the first six constructs are correct constructs, i.e., adding them to the workflow graph does not change its structural correctness properties. The last three constructs, viz., the Wrong AND construct, Wrong OR construct, and Wrong Loop construct, are wrong constructs: adding them introduces structural conflict errors into the workflow graph. Wrong constructs are added only when an incorrect workflow graph is required for testing correctness and gauging the execution time of the algorithm. To generate a correct workflow graph, correct constructs are chosen randomly and added to the workflow graph until it reaches the required minimum size. The procedure for generating a wrong workflow graph is similar, except that one of the wrong constructs is additionally chosen at random and added exactly once to the workflow graph.
6 EXECUTION TIME
The execution time for verifying the randomly generated workflow graphs was computed through CPU time measurement, using the function clock(). Workflow graphs of sizes 1000, 2000, 5000 and 10000 nodes were used for finding the average execution time. The average time for verifying the workflow graphs of each size was computed from the execution times for verifying one hundred workflow graphs of that size. A wrap-around program was executed to prepare the files required for workflow graph verification, to call the workflow verification program, and to capture the results obtained. (Table-8) gives the average time taken for verifying correct and wrong workflow graphs of each size.
Table-8: Execution time for verifying correct and wrong workflow graphs
Size of workflow graph (nodes) | Graph type | Number of graphs | Average time per graph (seconds) | Standard deviation (seconds)
1000 | Correct | 100 | 0.7602 | 0.2419
1000 | Wrong | 100 | 0.0470 | 0.1094
2000 | Correct | 100 | 3.0874 | 1.0020
2000 | Wrong | 100 | 0.1989 | 0.4720
5000 | Correct | 100 | 19.9509 | 6.4237
5000 | Wrong | 100 | 1.1527 | 3.0691
10000 | Correct | 100 | 85.2293 | 28.4461
10000 | Wrong | 100 | 4.3205 | 12.4024
7 CORRECTNESS AND COMPLETENESS PROOF
The structural conflicts detailed in (Table-2) are detected as Deadlock, Lack of synchronization, Loops with extraneous links to nodes outside the loop, a back edge whose destination node is an AND-join node, nodes that cannot be activated, or nodes that cannot be terminated.
The initial tests of the MSCWV algorithm detect nodes that cannot be activated and nodes that cannot be terminated. The initial tests also detect any back edge whose destination node is an AND-join node. Any edge that is detected as a back edge will behave as a back edge for at least one OR-split triggered loop or one base subgraph of the workflow graph. For such an OR-split triggered loop or base subgraph, a back edge whose destination node is an AND-join node leads to a Deadlock error.
If the workflow graph is found to be correct in all the initial tests, then the main algorithm verifies it for Deadlock, Lack of synchronization, and Loops with extraneous links to nodes outside the loop. This is achieved by verifying the various OR-split triggered loops and base subgraphs of the workflow graph.
The distance to the end node, calculated for each node as part of the initial tests, is used to separate base subgraphs from OR-split triggered loops, and to separate the various OR-split triggered loops in nested cycles. This is done as follows. While traversing any OR-split node that is not yet expanded and choosing its child node for further traversal, priority is given to the child node with the lowest distance to the end node. Also, while choosing the child node of the PED-OR node for traversal, priority is given to the child node with a lower distance to the end node that is not yet expanded. Thus, for a traversed OR-split node in an n-th level OR-split triggered loop, at least one of its child nodes that leads to a loop of the same level gets a higher priority over child nodes that lead to (n+1)-th level OR-split triggered loops.
The algorithm first verifies the initial base subgraph of the workflow graph in its first iteration. In subsequent iterations, the child node of the PED-OR node chosen for further traversal determines whether a new base subgraph or a new OR-split triggered loop will be created. At any moment during the execution of the algorithm, there is a base subgraph which forms the foundation. After this, if the outgoing edge from the PED-OR node triggers an OR-split triggered loop, this first-level OR-split triggered loop is created over the base subgraph and verified.
Consider any iteration in which an n-th level OR-split triggered loop is created. For the subsequent iteration, one of the following three possibilities occurs:
if the markings on the edges lead to a different path in one of the n OR-split triggered loops (corresponding to the n levels), then the OR-split triggered loop through the new path is created and verified;
if the markings on the edges lead to an (n+1)-th level OR-split triggered loop, then the new OR-split triggered loop so obtained is created and verified;
else, a new base subgraph is created and verified.
Thus, at any point of time, at most one stack of OR-split triggered loops is maintained for verifying the workflow graph.
The different cases executed for a workflow graph arise from the choices made among the child nodes of OR-split nodes. If the workflow graph is correct, then the MSCWV algorithm traverses through all child nodes of all OR-split nodes in the workflow graph before declaring it correct. Also, it creates and verifies only those OR-split triggered loops and base subgraphs that are minimally required to verify the workflow graph.
The MSCWV algorithm uses the deepest OR-split invariance property given in [10] for the Mahanti-Sinnakkrishnan algorithm. For acyclic workflow graphs, this property states that if there is an OR-split node that is deepest in the workflow graph along all the paths through its child nodes, and a complete set of acyclic instance subgraphs that differ only by the selection of the child node of this OR-split node is verified to be correct, then the OR-split node under consideration becomes neutral to structural conflicts. In the MSCWV algorithm, this property is used both for base subgraphs and for OR-split triggered loops. For an OR-split triggered loop triggered by an OR-split node, deepest refers to the deepest OR-split nodes in the feedback paths triggered by that OR-split node.
In each iteration, a not-yet-traversed outgoing edge from an OR-split node (corresponding to the PED-OR node of that iteration) is traversed. Hence, the number of iterations executed by the algorithm for a workflow graph is less than or equal to the sum of the numbers of child nodes of all OR-split nodes in the workflow graph. O(E) computations are made in each execution of CISC, as no edge is traversed more than once within one execution of CISC. Similarly, O(E) computations are made in each execution of VISC, VL and Check_if_loop. Let E_OSi denote the number of child nodes of the i-th OR-split node, and let N_OS denote the total number of OR-split nodes in the workflow graph. Then the total complexity of the MSCWV algorithm is

O(E) * Σ_{i=1}^{N_OS} E_OSi ≤ O(E) * O(E) = O(E²)
8 CONCLUSION
We have described the various structural conflict errors that can occur in cyclic workflow graphs, and discussed the limitations of existing algorithms and methods for verifying cyclic workflow graphs with respect to identifying these structural conflicts.
Further, we have presented an algorithm called the Mahanti-Sinnakkrishnan Cyclic Workflow Verification (MSCWV) algorithm to verify cyclic and acyclic workflow graphs. This algorithm is an extension of the Mahanti-Sinnakkrishnan algorithm originally designed for verifying acyclic workflow graphs. We have also illustrated the working of the MSCWV algorithm with a real-life business example. Finally, we have commented on the execution time, and on the correctness and completeness proof, of the proposed MSCWV algorithm.
9 REFERENCES
[1] W. M. P. van der Aalst, A. H. M. ter Hofstede, B. Kiepuszewski, and A. P. Barros, "Workflow Patterns," Distributed and Parallel Databases, vol. 14, pp. 5-51, 2003.
[2] W. M. P. van der Aalst, "The Application of Petri Nets to Workflow Management," Journal of Circuits, Systems, and Computers, vol. 8, pp. 21-66, 1998.
[3] H. M. W. Verbeek and W. M. P. van der Aalst, "Woflan 2.0: A Petri-Net-Based Workflow Diagnosis Tool," in ICATPN, 2000, pp. 475-484.
[4] H. M. W. Verbeek, T. Basten, and W. M. P. van der Aalst, "Diagnosing Workflow Processes using Woflan," Computer Journal, vol. 44, pp. 246-279, 2001.
[5] W. Sadiq and M. E. Orlowska, "Applying Graph Reduction Techniques for Identifying Structural Conflicts in Process Models," in CAiSE, 1999, pp. 195-209.
[6] H. Lin, Z. Zhao, H. Li, and Z. Chen, "A Novel Graph Reduction Algorithm to Identify Structural Conflicts," in HICSS, 2002, p. 289.
[7] H. H. Bi and J. L. Zhao, "Applying Propositional Logic to Workflow Verification," Information Technology and Management, vol. 5, pp. 293-318, 2004.
[8] Y. Choi and J. L. Zhao, "Decomposition-Based Verification of Cyclic Workflows," in ATVA, 2005, pp. 84-98.
[9] S. Perumal and A. Mahanti, "Cyclic Workflow Verification Algorithm for Workflow Graphs," Indian Institute of Management Calcutta Working Paper Series, WPS No. 599, pp. 1-58, 2006.
[10] S. Perumal and A. Mahanti, "A Simple and Efficient Algorithm for Verifying Workflow Graphs," in Workflow Handbook 2005, L. Fischer, Ed. Lighthouse Point: Future Strategies Inc., 2005, pp. 233-256.
[11] S. Perumal and A. Mahanti, "A Graph-Search Based Algorithm for Verifying Workflow Graphs," in DEXA, 2005, pp. 992-996.
[12] S. Perumal and A. Mahanti, "Applying Graph Search Techniques for Workflow Verification," in HICSS, 2007, p. 48.
[13] W. M. P. van der Aalst, A. Hirnschall, and H. M. W. (Eric) Verbeek, "An Alternative Way to Analyze Workflow Graphs," in CAiSE, 2002, pp. 535-552.
[14] "Case Study," Geneva Systems, Inc., 2004. Retrieved from: http://www.genevasystems.com/highres/HTML/ProductInfoCenter/WFACaseStudy.htm, accessed February 01, 2007.





Business Process Architecture and
the Workflow Reference Model
Chris Lawrence, Old Mutual, South Africa
INTRODUCTION
This paper is a response to David Hollingsworth's The Workflow Reference Model 10 Years On.¹ He refers to a lack of agreement on a common meta-methodology for modeling the business process. That is what this paper offers: a process meta-model. The meta-model is derived not by abstracting business process architecture from workflow or Business Process Management (BPM) implementation technology, but by analysing the business process in terms of its core characteristics.²

Business rules play an important part in the meta-model, and the relationship between process and rule is compared with that of Ronald Ross's Business Rule Approach.
Implications of the meta-model are explored to see to what extent they illuminate some of the process issues Hollingsworth raises in his paper.
BUSINESS PROCESS ARCHITECTURE
Hollingsworth claims the Workflow Reference Model (hereafter abbreviated to WfRM) was an attempt to
construct an abstract view of the business process in terms of its core characteristics, separated from the technologies that could be used to deliver its functionality in a real world situation.³
I am not sure it has done quite this. What it might have done instead is abstract general principles from workflow technology applications, in pursuit of effective interoperability standards for the industry. But this is not constructing an abstract view of the business process in terms of its core characteristics. The latter would call for a conceptual meta-model of what it is to be a business process, not just what it is to be a workflow system. Neither the WfRM nor the equivalent Business Process Management (BPM) reference model Hollingsworth proposes provides an analysis of what it is to be a business process. They both seem to take it for granted that we know what a business process is.
This paper offers something which I think is more like an abstract view of the business process, one which in particular seems to fit the kind of rule-intensive contexts where workflow technology predominates, such as insurance, banking, legal and general administration, as well as some classes of industrial and manufacturing applications.⁴


1 David Hollingsworth: The Workflow Reference Model 10 Years On, included in the WfMC Handbook 2004, Future Strategies Inc, Lighthouse Point, FL, 2004.
2 David Hollingsworth: ibid.
3 David Hollingsworth: ibid.
4 David Hollingsworth: The Workflow Management Coalition Specification: The Workflow Reference Model, Document Number TC00-1003, Issue 1.1, 19 January 1995.
I would argue the WfRM is an abstraction from workflow technology, not a conceptual analysis of the process/workflow space in business terms. Either that problem space was deliberately seen as out of scope, which is fine; or it was assumed that abstracting from workflow technology offerings is the same as conceptually analysing the process/workflow problem space, which is not so fine. By extension the BPM reference model could end up as an abstraction from BPM technology offerings rather than the conceptual analysis of the process space in business terms which it needs to be.
Hollingsworth's paper⁵ displays other clues to what seems a generally (deliberately?) un-analytical approach to process, as if assuming that conceptual analysis may not get us far. Process fragment is a case in point:
more emphasis is required on the decomposition of processes into fragments and their consolidation in various ways to support more dynamic operational business processes. The original model identified various ways in which process fragments could interact (hierarchic subprocess, parallel synchronised processes, etc.) and did develop runtime models for binding them in execution terms. However, it did not attempt to develop anything beyond a primitive choreography capability in the area of process definition support for interactions between process fragments.
Later on the overall process is seen as a
combination of process fragments which can be recombined in various ways to deliver new or modified business capability.
But what is a fragment? We approach an answer in the discussion of its internal and external aspects, which suggests it may be more a functionality component than a process component:
Some form of choreography is required to identify the valid sequences of messages and responses across and between the participating process fragments. The choreography requires each process fragment to exhibit a set of prescribed external behaviours in response to such message sequences.
I do not expect my suggestion above to go unchallenged, but I will explain later what I mean by process as opposed to functionality component. For now I will merely say that a process fragment seems like a piece of broken pottery, defined by what it is a fragment of. Conversely a process seems little more than a set of process fragments linked by messaging technology. Process fragment is something smaller than a process, but that is all. It is not an analytical component.
Another apparently equally arbitrary concept, but of something bigger than a process, appears in the later discussion of the conceptual model phase, which
needs to focus on the position of the process within an end-to-end process delivery chain, involving interfaces with existing or planned processes within other organizational domains.
So a process can be part of an end-to-end process delivery chain. But why would a process not be an end-to-end process delivery chain, or vice versa?
We hear that in both the WfRM and the equivalent BPM reference model the
abstraction of the business process from the implementation technology re-
mains an important goal, because organizations tend to think differently, us-
ing different skills and methodologies, in the two domains.

5 David Hollingsworth (2004), op cit.
I agree the WfRM abstracts something from implementation technology, but I am unconvinced that what it abstracts is the business process. The WfRM offers little conceptual analysis of the business process itself, because it sees it in terms of given system functionality. Interestingly, these are the two domains in which organisations think differently. Why should there be two domains? Is it because organisations see their business processes in terms of the systems they happen to have? Because they are always encouraged to? But the problem is that, with rare exceptions, systems are so fragmented in process terms: collections of point solutions. Could this be why the business process is seen as something in a separate domain linking those systems and components together?
A more fruitful paradigm might be to see the business process as something existing logically prior to those systems, but as implemented in them, with varying degrees of success. In which case there would not be two domains at all: there just seem to be two when the degree of success is low.
A currently favoured implementation technology is web services. Hollingsworth is concerned that
emphasis on web services architecture ... [could] constrain the development of the business process architecture. [T]his raises the danger of ignoring the organizational and human aspects of the business process, in favour of a resource model wholly based on web services.
This could miss the point. The danger is not so much of ignoring the organizational and human aspects of the business process, but of overlooking the business process itself. The natural technology choice of web services represents yet another solution architecture capable of obscuring logical architecture.
What do I mean by logical architecture? What is it, what is special about it, why
is it needed? Is it the same as the conceptual model?
Hollingsworth sees the conceptual model as "the formulation of the business process in terms of business component objects and their interactions."
But if business component objects are things like Customer, Supplier - maybe also Department, Product etc - the snag is that some of these are necessary to the process while others are contingent. Customer may be part of the what (logical architecture) but Department will ultimately be part of the how (solution architecture). The business process may be implemented in terms of business component objects and their interactions, but we could miss a valuable step if we see the conceptual model (ie the highest and most abstract level) as something formulated in terms which include components of solution architecture, broadly conceived. This missing step is important for automation and integration, both of which call for the right representation in the right generic structures. We need a meta-model which promotes this kind of rigour.
Hollingsworth provides a clue as the discussion turns to translating the conceptual model into an executable model:
"It has proved relatively difficult to develop a fully standardised framework model to support this function within the BPM model. Not only do different BPM/workflow products have different internal representation structures but a large amount of local detail needs to be defined to develop the complete internal representation of the executable model."
This is not surprising, with so little basis for shared understanding. We can string robotic components together with messages obeying specific standards. But by what criteria do we see one string as a business process but not another string?
Hollingsworth continues to talk in solution architecture terms:
"XPDL attempts to step around this problem through the use of extended attributes for bi-lateral agreement. In practice many BPM products have process design tools that are closely coupled to the execution capabilities of the BPMS. Generating a common approach across multiple product environments may depend more on agreement on a common meta-methodology for modeling the business process." [My emphasis.]
That is what I offer in this paper: a candidate meta-methodology for modelling business processes at logical level. But we will see it starts somewhere other than the solution-architectural level of Service Interactions and Choreography:
"the choreography could be likened to a very high level process definition that links together process fragments by providing a set of high level business rules and identifiers for the locations of the resource implementing each process fragment."
There are two different things which should not be conflated. One is conceptual analysis into what it is to be a business process, leading to an agreed shared meta-model. The other is a technology framework and standard protocols to let business processes be implemented in a variety of ways, including (where appropriate) participation by parties and services not identified or known at design time. The meta-model should guide, but should not be, that framework.
Hollingsworth asks:
WHAT BPM STANDARDS ARE REQUIRED?
"At the heart of any BPM reference architecture lies the methodology and standards for representing the business process... [which needs] ...to be considered at two levels:
(i) a lower level, internal view of each process fragment that is similar to the traditional workflow process model
(ii) a higher level view concentrating on modeling the overall process flow, linking re-usable process fragments together via a choreography. This is a view of the external behaviors of the process fragments..."
But both of these are solution level: how, not what. If we want a common meta-methodology for modeling the business process we should see the business process as first of all a what, as something which can be implemented in a how. Then we can represent the business process as a business process, not as a piece of automated machinery - even though that may be how we want to implement it.
Hollingsworth surveys a number of existing approaches to Internal Process Definition - ie (i) above - which
"all provide a means of representing process flow, events and decision points, and the classification of various process context data associated with executing processes. Some standards also provide a means of specifying the work resources associated with the process work items, or activities.
The purpose of these tools is to enable the integration of different process design products with different execution products or to allow simple migration of existing process definitions to a different design/execution product.
[But a] particular problem has been that different vendor products in the BPM space tend to use different process design paradigms."
Again, not surprising, without agreement on what a business process is. The following design paradigms are discussed:
Transition Based Representation:
"typically derived from Petri Net methodology; a process is represented graphically as a network of activity nodes and explicit transitions between them."
A powerful and familiar paradigm, sharing features with the meta-model I will introduce later. But clarity is needed as to what level the transitions are at, what sorts of things are making the transitions, what the transitions are between, and what the activity nodes actually are.
"A significant practical distinction is whether the transition logic allows [backward] transition to earlier [preceding] activities, allowing cycling through a set of nodes, or constrains transition flow to acyclic paths only."
This is one of the reasons proprietary generic workflow can struggle to achieve real process control. In many contexts the question is whether or not the business logic and logistics at any particular point make a return to a prior state sensible. This in turn depends on what has happened before and what is happening now to the real business entities. Sometimes looping back is inappropriate because it would mean rewriting history - eg pretending we had not written to the customer first time round. Terms like transition, cycling and acyclic paths beg the question as to what exactly is going through the process. This will be a key piece of the meta-model. If there is a thing going through the process, the question of whether or not looping makes sense at a particular point typically depends on what is happening to the thing at that point.
Block Structured Decomposition:
Any single node in a process model may be decomposed to a lower level of
underlying process (a paradigm based upon the hierarchic sub-process
model).
This seems to refer to the familiar nesting construct in which (say) a process can
be a subprocess or activity in another process, enabling flexible reuse. But is this
nesting a feature of the logical model or a desirable feature of any solution archi-
tecture? The difference will be explored later.
Activity Pre- & Post-conditions:
In this approach no explicit transitions between activities are declared. The
process is defined as a set of activities each having entry (pre-) and exit (post-)
conditions; parallelism is implicit and when pre-conditions are met the activity
is initiated, independently of the status of other activities within the process.
To provide sequential operation, pre-conditions may relate to the completion of
a particular prior activity
This, like the transition-based representation, may inform a successful solution
architecture. But neither seems to illuminate what a business process is at a logi-
cal level, and what is going through it. They could both apply to a natural process
or an automated control function just as well as to a business process.
Role Activity Diagrams
"RADs focus on defining the process through the actions that are taken within abstract roles associated with the organization, and the interactions between roles. In this way the process flow is modeled as the combined actions associated with the cooperating roles and their interactions. ... Modeling of data and documents is not normally handled."
But what is passing through the process, if the process itself is the combined actions associated with the cooperating roles and their interactions? Organisational roles (however abstract) are how, not what. They are there because processes are implemented as they are: with a different implementation the roles might be different, or may not even exist. Modelling a process in terms of roles and role interactions could be putting the cart before the horse.
Having summarised these different paradigms, Hollingsworth concludes:
The problem for the systems integrator is that it is not easy to transfer process
information between design tools and/or workflow control software based
upon the different design paradigms. A very large amount of work has been
undertaken by both industry and academia in attempting to define a common
representation of a business process which may be translated between these
different paradigms, and, by extension, the different process modeling and
definition tools associated with the paradigm.
...The recent work on BPMN represents an interesting approach... By encour-
aging adoption of a common modeling notation to express the core compo-
nents of process structure it reduces some of the variety that constrains the
production of a common machine interpretable process definition.
But how far does agreement on notation take us, without agreeing what the nota-
tion expresses? It is as if we have all agreed what the core components of process
structure are, as we have all apparently agreed what a business process is.
BPMN most readily depicts an as-is or to-be implemented process (how), rather than the what of the business process at logical level. For example the use of pools and lanes to contextualise the process is most relevant once the physical implementation (how) has been decided. But BPMN is methodologically agnostic: "BPMN is independent of any specific process modeling methodology." [6] And there is thankfully little to stop it being used to depict business processes at logical level, as my diagrams below should show.
It is no surprise that transferring process information between different para-
digms is a struggle. The attempts to define a common representation of a busi-
ness process appear to be abstractions from different technology approaches, not
different logical analyses of what a business process is. It is like doing data con-
version and integration in the days before the relational model, or like transferring
accounting information without considering double entry.
Having surveyed existing standards for internal process definition Hollingsworth
turns to Choreography & External Process Interactions. Focus has mostly been
on extending internal process definition approaches to external business-to-
business (B2B) and business-to-customer (B2C) contexts; or, as e-business gets
increasingly sophisticated, on
structured sequences of messages and the process implications behind such
sequences... [all the time] ...remembering that the prime requirement is to
support processes intermeshed across organizational boundaries.
A revealing reminder. It is one thing to imagine, design and implement a process intermeshed across organizational boundaries. But what is the business-architectural status of such a process? Who owns it, governs it, who is accountable for what aspect of it? More fundamentally, what is it there for? To get to an effective logical model we need to steer clear of the solution level: Petri Nets, organisational roles, functional decomposition, rules engines, interoperability, choreography, message exchange, process fragments. We must start further back.

[6] Stephen A. White, Introduction to BPMN (Introduction to BPMN.pdf), IBM Corporation, 2004.
A PROCESS META-MODEL
Request and outcome
The model I want to present has already been outlined elsewhere. [7] [8]
We start with a familiar schema:
input → process → output
Process re-engineering and quality initiatives often assume a model like this
implicitly or explicitly. It is not false, but it is too generic. It can apply to a natural
process like photosynthesis or a computer system process like computation. It
says nothing specific about a business process.
Derek Miers alludes to a key feature of a business process by the concept of "process as purpose". [9] A thread of intentionality runs through it.
We can express this by making the first and last components more specific:
request → process → outcome
There may be other inputs and outputs. But what makes it a business process is
that it is initiated by an implicit or explicit request. In the paradigm case the re-
quest is explicit, from an external customer. But the request could be implicit,
and the customer could be internal rather than external. Where goods are manu-
factured and/or distributed for stock, the request itself is pre-empted. But even if
the request lies in the future, it is still the reason the process exists.
The customer could even be a supplier who, say, asks an actual or potential cus-
tomer to supply a new contract, transaction breakdown, or overdue payment. We
need to construe customer very broadly, as any entity likely to request some-
thing the supplier might agree to provide. Ultimately the initiating event is de-
scribed as a request to reflect the intentionality which characterises a business
process. The initiating event is a request which the supplier intentionally re-
sponds to as a request.
Specifying outcome rather than output highlights the continuity of this thread of
intentionality. The input-process-output model has no necessary identity between
input and output: the process takes in, transforms or uses the inputs, and gener-
ates outputs which may be different from the inputs. But in the request-process-
outcome model the request and outcome are in an important sense the same. The
request is for the outcome; the outcome is the thing requested.
This may seem an over-complex way of stating the obvious. But it has architec-
tural implications. A business process does not just happen. Nor is it just a series
of causally related events like photosynthesis. It is a structured and intentional
response triggered by something seen as a request for a specific outcome. The
request initiates the process, and in the middle of the process it is still there.

[7] Chris Lawrence, Make Work Make Sense, Future Managers, Cape Town, 2005.
[8] Chris Lawrence, "Integrated function and workflow", in: Fischer, L. (Ed.), Workflow Handbook 2005, Lighthouse Point, Florida: Future Strategies Inc., 2005.
[9] Derek Miers, Issues and Best Practices for the BPM and SOA Journey, Enix Consulting Limited, 2006.
Some steps have taken place to meet the request, but not all. The request is at a particular status. At the end of the process is the request's last status transition, which realises the outcome. The request is a structural element of the process.
This structure is apparent in administration or service processes like insurance, banking, legal and general administration, [10] but it is also obvious in (say) purchase order processing. An order is a paradigm case of request, which initiates and then travels through the process. This order is not a paper document, or its digitised image. It is the request entity itself - a dataset of request parameters.
In a life assurance new business process the request entity is the application. In
an insurance claim process it is the claim. Either may or may not be on a physi-
cal form, which may or may not then be digitised.
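To make the request entity more concrete, here is a minimal illustrative sketch in Python. It is my own, not part of the WfRM or of any product, and the class and status names are hypothetical, loosely anticipating the purchase order example developed below.

from dataclasses import dataclass, field

@dataclass
class Request:
    """The request entity: a dataset of request parameters plus its current business status."""
    request_type: str                        # e.g. "purchase_order"
    parameters: dict = field(default_factory=dict)
    status: str = "received"

# At logical level the process is the set of rules applied to the request;
# the statuses below simply record how far through those rules it has travelled.
ORDER_STATUSES = [
    "received",
    "awaiting Check order",
    "awaiting Check credit rating",
    "awaiting Match against stock",
    "awaiting Authorise order",
    "awaiting Despatch order",
    "outcome reached",
]

def advance(request: Request) -> None:
    """Move the request to its next business status; the last transition realises the outcome."""
    i = ORDER_STATUSES.index(request.status)
    if i < len(ORDER_STATUSES) - 1:
        request.status = ORDER_STATUSES[i + 1]

The point of the sketch is only that it is the request, not an actor or a system, which travels through the process from initiation to well-formed outcome.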
Rules
Having identified the request entity the next step is straightforward. This is to de-
fine the rules at logical level, including rules specifying the sequence in which the
other rules should be applied. A typical sequence might be:
First: rules about what is and is not a well-formed request. Until these rules have
been met it may not be clear what type of request it is, and therefore what type of
process should start.
Second: authorization rules. Each type of request may have rules about who can
authorise it and what proof is needed. For example an order may need to be on
an order form signed by a known representative of a customer organisation.
Third: rules about carrying out the request.
Rules like these define the process at logical level. At this level the process is the rules. A process diagram could show six steps in (eg) BPMN symbols:
[Figure 1: BPMN-style flow of six subprocesses: Take order → Check order → Check credit rating → Match against stock → Authorise order → Despatch order]
But these are rules which could equally be stated in words:
First, take the order.
Then, check the order.
Then, check credit rating.
...etc.
Within each step will be rules prescribing how to check the order, how to check
credit rating etc. To qualify as a well-formed order it may need to be either from a
known customer or, if from a new customer, it may need certain customer details.
Check credit rating may also include IF...THEN... rules, eg:
IF
  Cash with order
THEN
  Pass to Match against stock
ELSE
  IF
    Order amount <= available credit
  THEN
    Pass to Match against stock
  ELSE
  ...etc.

[10] David Hollingsworth (1995), op. cit.
The relationship between the rules and the request entity is key. The rules must
allow for every possible value of the relevant attributes of the extended dataset
defined by the request entity. For example an order will have a date, a customer,
details of what is ordered. The customer will have a current available credit and
maybe a discount calculation rule. The products will have a price. If the rules do
not cover every possible value of every attribute, the process will not cover every
possible request. There could be manual and/or discretionary catch-alls, eg:
ELSE
Refer to credit manager for approval
...etc.
This is a perfectly sound rule, but it will need to be unpacked:
How is a credit manager identified?
What happens if approval is not granted?
...etc.
Eventually the rules will be complete, when they cover every possible contingency. There will be rules within rules - rules inside the boxes Take order, Check order etc. The steps may change when the rules are defined at a more granular level.
For example the Figure 1 process flow may need to change to:
[Figure 2: as Figure 1, but with Check credit rating and Match against stock as parallel paths after Check order, both completing before Authorise order]
Here Check credit rating and Match against stock are in parallel, and both must complete before Authorise order. It is not that one process flow is better than the other - it is purely what the organisation decides its rules are.
I must acknowledge that some of what is being said here is at odds with current business rules orthodoxy. For example, to quote Ronald Ross:
"Processes are processes, and rules are rules. They are not the same. A fundamental principle of the business rule approach is that each is a primitive. Formally, one primitive can never decompose to another primitive. So processes never decompose into rules, and rules never decompose into processes..." [11]
And:
'...business rules are not "process" in any sense of the word. Roger Burlton recently expressed the business rule message this way: "Separate the know from the flow." Business rules represent the "know" part of that -- the stuff that guides the "flow." Guidance means following rules, of course -- hence the name "business rules."' [12]
What idea of business process does this assume? Ross's best definition is:
"Business process: the tasks required for an enterprise to satisfy a planned response to a business event from beginning to end with a focus on the roles of actors, rather than the actors' day-to-day job." [13]
Rules and business processes interact (Ross quoting Roger Burlton again):
'...business processes "...transform inputs into outputs according to guidance -- policies, standards, rules, etc...."' [14]
The process is the flow - a script - and the rules are the "know" which guide the flow. So for example in making a cake, the script might be:
"1. Combine flour, water, milk, and eggs in a large bowl.
2. Mix until batter is consistent but not entirely free of lumps.
This recipe represents a perfectly acceptable (albeit very simple) procedure or script. ... Now let's ask, what rules do we need?
Potential rules to provide appropriate control might include the following:
- Milk must be fresh.
- Bowl must be large enough so that contents do not spill out when stirred.
- Batter may be considered 'entirely free of lumps' only if there are no visible masses of congealed batter larger than 2 cm in diameter.
These rules represent business knowledge that must be present when the procedure or script is performed. ... I want both a script to follow ... and rules to guide me in doing the work. But most importantly, I want the script and the rules to be separate." [15]
There is also the concept of surrogates:
'How does any model of the business (including its processes) differ from any model for the design of an information/knowledge system (including its processes)? John Zachman [16] describes the crucial difference this way. A business model "... is about real-world things." A system model, in contrast, "... involves surrogates for the real-world things so that the real-world things can be managed on a scale and at a distance that is not possible in the real world. These surrogates are the things making up ... systems...." [emphasis added]. The most obvious kind of surrogate for real world things is data. A system process includes actions that manipulate data in various ways...
...A process in an information/knowledge system ... can manipulate other kinds of surrogates as well, for example:
- The supervisor's work queue is actually a surrogate for a face-to-face interaction between a supervisor and an order clerk each time a special order is received.
- The supervisor's GUI for displaying orders in the queue is actually a surrogate for the flesh-and-blood order clerk.
A system process then is all about manipulating surrogates standing in for real-world things. A business process, in contrast, should never include tasks or steps for manipulating surrogates. That's a big difference...' [17]

[11] Ronald G. Ross, "Do Rules Decompose to Processes or Vice Versa?", Business Rules Journal, Vol. 4, No. 12 (Dec. 2003), URL: http://www.BRCommunity.com/a2003/b155.html
[12] Ronald G. Ross, "What Is a Business Rule?", March 2000, URL: http://www.brcommunity.com/b005.php
[13] Ronald G. Ross, "How Rules and Processes Relate ~ Part 2. Business Processes", Business Rules Journal, Vol. 6, No. 11 (Nov. 2005), URL: http://www.BRCommunity.com/a2005/b256.html. The quotation is from Janey Conkey Frazier.
[14] Ronald G. Ross, ibid.
[15] Ronald G. Ross, Principles of the Business Rule Approach, Addison Wesley Professional, Boston, MA, 2003.
[16] John A. Zachman, The Zachman Framework: A Primer for Enterprise Engineering and Manufacturing (electronic book), Zachman International, 2002.
I have no problem with the idea of surrogate as representation - in the sense that (say) a customer data record is a surrogate for a flesh-and-blood customer. But it does not follow that a supervisor's work queue is a surrogate for "a face-to-face interaction between a supervisor and an order clerk each time a special order is received", or that a supervisor's GUI for displaying orders in the queue is a surrogate for "the flesh-and-blood order clerk". The analogy does not hold. The flesh-and-blood customer exists, and the customer record represents the real-world entity. But the customer record does not replace, or provide an alternative implementation of, the real customer. To see system functionality as primarily representing concrete entities and their relationships pertaining to a specific (and perhaps historically prior) process implementation is to fall into the trap of automating the as-is which has bedevilled IT since its early days.
Some of the rules for the supplier's order process will be criteria for deciding if an order is special. At a point in its history the supplier may have implemented its order process by having a flesh-and-blood order clerk and a flesh-and-blood supervisor. The order clerk would have applied those criteria and referred special orders to the supervisor via face-to-face interaction. But consider an alternative where the order clerk only captures the order, and stored system rules decide if an order is special and, if so, route it to the supervisor. There is no face-to-face interaction for the supervisor's work queue to be a surrogate for. Or another implementation where customers capture their own orders, there is no flesh-and-blood order clerk, but there is a flesh-and-blood supervisor complete with work queue. Or no flesh-and-blood order clerk and no flesh-and-blood supervisor, and instead of routing to the supervisor's work queue, special orders invoke special processing requesting references direct from banks and/or special credit checks direct from credit agencies.
Sequencing the scenarios like this may suggest an evolution of improvement. History may have been like this, but it may not have been. There may never have been an order clerk or a supervisor. These are just different implementations with different features, costs and technical feasibilities in different contexts. Even if things had happened in the order the scenarios were presented, a later scenario is not a surrogate for an earlier one. Something cannot be a surrogate for something which may never have existed.

[17] Ronald G. Ross, "How Rules and Processes Relate ~ Part 4. Business Processes vs. System Processes", Business Rules Journal, Vol. 7, No. 1 (Jan. 2006), URL: http://www.BRCommunity.com/a2006/b265.html
The false analogy of process surrogates is of a piece with the "best definition" of business process quoted above, in terms of "tasks required for an enterprise to satisfy a planned response to a business event from beginning to end with a focus on the roles of actors."
Why should there be actors? There may be actors, and if so they are likely to have roles. But to assume actors and roles are necessary to the business process is as unwarranted as to assume system functionality is necessarily a surrogate for a more concrete (or flesh-and-blood!) implementation which must have preceded it. It is possible that assumptions like these are behind the statement that a business process should never include tasks or steps for manipulating surrogates. (Is there an echo here of Hollingsworth's two domains?)
What do seem to be necessary components of business processes, on the other hand, are rules. To that extent I agree with Ross and Burlton that business processes transform inputs into outputs according to guidance - policies, standards, rules, etc. But I see no residue of script which is not rules. Flow is also "know". It is arbitrary to see "Milk must be fresh" as a rule but "Combine flour, water, milk, and eggs in a large bowl" as part of a script, and therefore not a rule - just because "I want both a script to follow ... and rules to guide me in doing the work" and "I want the script and the rules to be separate".
At an empirical level, there may be contexts where "The really rapid change is in the rules ... not in the business processes" [18], where the steps themselves and their sequence do not change much, but things like authorisation limits and role assignments do. But equally, and particularly in pursuit of automation efficiency, a process implementation could change significantly (eg withdrawing cash from an ATM rather than cashing a cheque) while the fundamental rules stay the same (account and amount must be specified; withdrawal must be by a person authorised to withdraw from that account; withdrawal amount subject to limits; etc).
Far from separating process from rules a more helpful paradigm would be to see
process completely in terms of rules. In terms of strict logical typing it may be
that process and rule are different primitives, as computation is not number
and vice versa. But it does not follow that the steps and their sequence which
make up the entire content of the process are not rules. (In fact I lean to the view
that perhaps a business process itself is a rule at a logical level. Structurally there
can be rules within rules. And for an organisation to see the data set an event
represents as a request which it then intentionally honours is effectively to follow
a rule it has been given or has given itself.)
A more helpful distinction would be between the process itself at logical (rule) level and any particular implementation of it - in terms of systems, people, roles, manuals, rule books etc. It is not that the process stays the same and the rules change. Some rules are volatile; some hardly change. Sometimes the physical implementation stays unchanged for a long time: because it works; because it is too expensive or disruptive to change it; because no one thought of changing it or that it could change. Sometimes - particularly for cost and/or quality (and therefore competition) reasons - the physical implementation has to change.
[18] Roger T. Burlton, quoted in: Ronald G. Ross, "How Rules and Processes Relate ~ Part 2. Business Processes", Business Rules Journal, Vol. 6, No. 11 (Nov. 2005), URL: http://www.BRCommunity.com/a2005/b256.html
To show how artificial it is to separate process from rule, consider a familiar administration process. There could be a rule that a particular document or data item must be available. What if it is missing? We do not suddenly move out of a rule domain ("x is needed") into a process domain - which would then detail how to get the missing information. The stipulation that the information is required is a rule; and the statement of where to get it from is also a rule. We could imagine an implementation where, on finding the information is missing, the requirement rule immediately leads to generating correspondence to request the outstanding information from the intended recipient. This is process, but it is also rules.
To return to the process meta-model, talk of rules not satisfied means we should
qualify the statement that the outcome is precisely the thing requested. Yes the
request is for the outcome in that, for example, an application (request) for a
passport is an application for a passport (outcome). But not every passport appli-
cation leads to a passport. There are eligibility rules, and if the applicant cannot
eg produce a birth certificate proving place of birth the passport will not be
granted. Strictly speaking the outcome is not necessarily a passport, but a well-
formed result in line with passport application rules. The paradigm well-formed
outcome will be a passport. But because it must be possible for rules to be not
satisfied, the requested outcome may not be the actual outcome.
Far from refuting the model this is actually a crucial implication. Exceptions and
error conditions are the bread and butter of business process, because rules are.
So what are rules? We have said a lot about them but not yet tried to define them.
This was deliberate. The meta-model only needs an operational definition: some-
thing is a rule if it is used as a rule. This is almost circular but not quite. It re-
flects the same thread of intentionality as request does.
This may be heresy to some business rule theorists. I have no objection to the view that there is a significant subset of business rules which are best expressed declaratively; which can more formally be expressed in predicate calculus format; which can be categorised as rejectors, projectors and producers; and for which it is true that rules build on facts, and facts build on concepts as expressed by terms. [19] But I am unconvinced that there is anything about "first, take the order, then check the order, then check credit rating" which disqualifies it as a statement of rules. The development of systems and therefore the implementation of business processes may be facilitated by a specially designed logicbase (consisting of a factbase plus a rulebase) whose contents conform to rules about format, atomicity etc. But this assumes we know what our process (our script) already is, and our intention is to make it as efficient and as flexible as possible by providing optimised support and guidance from an optimised store of rules. This is fine as long as the script itself is optimal. If it is not, how do we know? Because it is more expensive or slower than our competitors'? Because its error rate is higher? Because it is therefore not best practice? But this may only focus on how the script is followed, not on the script itself. Where does the script come from? As soon as we ask that question we open ourselves to possible ways of analysing the process itself; and I suggest that a helpful way to do this is in terms of rules, without presupposing any criteria as to what a rule is, other than its being used as a rule.
There are many reasons a statement (typically in imperative or indicative mood) may be used as a rule. It could be an internal fact about the organisation or a fact about the universe beyond it. It could be something true by definition or something the organisation or some other body wants to hold true by definition. It could be a control the organisation wants in place, temporarily or permanently, to mitigate risk, to ensure survival or compliance, to monitor or maintain or improve its competitive position or its quality or its profitability. And so on.

[19] Ronald G. Ross, Principles of the Business Rule Approach, Addison Wesley Professional, Boston, MA, 2003.
Many rules will be reasons why current implemented processes are how they are.
Some rules may be better served (now or in future) if the processes were different.
Some processes may be how they are because the rules are believed to be what
they are. Some explicit or implicit rules may no longer make sense or may never
have made sense.
Clearly the intention should be - at least ideally, and assuming the opportunity exists - to establish what the rules are and should be, regardless of what is actually taking place in the organisation, including everything that is or should be used as a rule, and excluding nothing on grounds of content or format. Then the processes can be logically defined.
If formulations like "first, take the order, then check the order, then check credit rating" can qualify as rules, then there can be rules within rules. There can also be rules within rules within rules, and so on. But this does not mean the logical process flow itself needs to allow indefinite hierarchies. The request-process-outcome model provides a compelling argument for recognising just three levels.
Process, subprocess and task
The top level, process, is that of the model itself. A request of a particular type (eg
purchase order) initiates an instance of a particular process type (eg order proc-
ess). The model allows us to define a process unambiguously. It is not an arbi-
trary string of process fragments, nor an arbitrary stretch of an end-to-end proc-
ess delivery chain. It is from the request to the well-formed outcome.
The second level, subprocess, is that of the steps already identified: Take order, Check order etc. Subprocess here means one of these steps. High-level rules like "First, take the order; then check the order; then check the customer's credit rating" etc indicate status changes which are important to the business. These business status changes apply to all orders, irrespective of their attributes. It is this generality which defines the subprocess level.
This is key. The request must qualify as a request of a particular type. Because it
is of that type, a particular set of rules applies to it. That set of rules is the proc-
ess. Because the rules are sequenced (according to sequencing rules!), the request
will undergo changes in business status, reflecting what rules have already been
applied, and what rules are yet to be applied. The subprocess level is the level ap-
plying to all requests of the particular type, regardless of any characteristics or
contingencies of the individual request itself.
The third level, task, reflects those individual characteristics and contingencies.
Different request datasets will follow different nested sets of rules and different
routing because of their different attribute values. Every order will make the sub-
process-level transition from status awaiting Check credit rating to status await-
ing Match against stock (or, in Figure 2, status awaiting Authorise order). But
different orders may make that transition in different ways.
Assume for example that these are the rules for Check credit rating:
1. An order may pass Check credit rating if accompanied by a cash payment greater than or equal to the total order value.
2. An order may pass Check credit rating if the customer is not in arrears and the total order value is less than or equal to the customer's total current unused credit.
3. Any credit increase where the total order value is greater than the customer's total current unused credit and the customer has not sent cash with the order (but the customer is not in arrears and the order is accompanied by a bank reference justifying the credit increase) must be manually approved.
4. A customer who is in arrears and has sent in an order unaccompanied by cash must be written to requesting payment before the order can be accepted.
5. A customer who is not in arrears, but has sent in an order greater than the total current unused credit and unaccompanied by sufficient cash, must provide a bank reference justifying the credit increase before the order can be accepted.
Some of these rules allow the subprocess to be passed automatically - for example if the order meets the criteria of either rule 1 or rule 2. Because of rule 3, the subprocess will need to accommodate manual approval. Rules 4 and 5 need the subprocess to include writing to the customer; and therefore both recording and acting on the customer's reply or, if the customer does not reply, some sort of follow-up. The subprocess could therefore have a task structure as in Figure 3.
Automatic credit check, Manual credit check, Manual record documents and Automatic follow up are tasks. As the names suggest, Automatic credit check and Automatic follow up are automatic tasks, needing the application of business rules but not human intervention. Manual credit check and Manual record documents on the other hand are tasks requiring human intervention.
We said above that the task level is there because the request datasets going through the subprocess may be different, and therefore different things may need to happen to them before the objective of the subprocess is achieved.
A subprocess like Check credit rating could start with an automatic task which processes the requests it can process, and re-routes the ones it cannot.
So Automatic credit check applies all five rules. If the order passes either 1 or 2, it can go on to the next subprocess, Match against stock. If the order meets the criteria of rule 3, then it would route to a manual task to allow the increased credit to be approved.
[Figure 3: task structure of the Check credit rating subprocess. From Check order, the Automatic credit check task passes orders meeting rule 1 or 2 on to Match against stock; orders meeting the criteria of rule 3, 4 or 5 route to the Manual credit check task; approval under rule 3 routes the order on towards Match against stock; where the customer is written to (rule 4 or 5), the order routes to Manual record documents, supported by an Automatic follow up task.]
For simplicity it will be assumed that the same manual task could allow for both approval (rule 3) and writing to the customer (rules 4 and 5). So if the order met the criteria for either rule 4 or rule 5, it would route to the same manual task. In the case of approval alone (rule 3), the order would route back to the automatic task, which would then route it to the next subprocess. (Or this action could be taken in the manual task.) If the customer needs to be written to (rules 4 and 5), then we need two additional tasks: a manual task to act on the customer's reply; and an automatic follow-up to loop if no reply after n days.
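As a purely illustrative sketch - my own reading of the five rules, not part of the meta-model, the WfRM or any product - the routing decision made by the Automatic credit check task might look roughly like this in Python (all field and destination names are hypothetical):

from dataclasses import dataclass

@dataclass
class Order:
    total_value: float
    cash_with_order: float
    customer_in_arrears: bool
    unused_credit: float
    bank_reference_supplied: bool

def automatic_credit_check(order: Order) -> str:
    # Apply rules 1-5 and return the next routing destination for the order.
    if order.cash_with_order >= order.total_value:                                   # rule 1
        return "Match against stock"
    if not order.customer_in_arrears and order.total_value <= order.unused_credit:   # rule 2
        return "Match against stock"
    if not order.customer_in_arrears and order.bank_reference_supplied:              # rule 3
        return "Manual credit check: approve credit increase"
    if order.customer_in_arrears:                                                    # rule 4
        return "Manual credit check: write requesting payment"
    return "Manual credit check: request bank reference"                             # rule 5

The point is only that the number and nature of the tasks fall out of the rules and the range of possible order attributes, not out of any prior assumption about actors or roles.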
The details of the tasks need not concern us. The number, nature and configura-
tion of tasks and the routing between them will be determined by the business
rules governing the process itself, and applicable to the particular subprocess.
With this meta-model we do not start with what people do and the order they do it
in. We do not start with actors or roles, however abstract. This is where we may
end up. But we start with rules.
This example also shows that although the subprocess level is important for
analysis and design, as far as implementation is concerned (other than for man-
agement information and control purposes) it is technically redundant. All the
work of the subprocess is contained in its tasks, and the routing between them. A
process is a set of tasks strung together with routing pathways.
Can the task break down further? Yes and no. If we consider its functionality
then clearly the task could break into components and components of compo-
nents. But from a pure process and control perspective a task is a unit. A task
cannot partially complete, since the request (eg the order) must exit in a well-
formed way so as to route to the next task correctly. This is because everything
that happens in a task can happen together: in the case of an automatic task it is
a set of rules which can be applied in one go (in a defined sequence if necessary);
in the case of a manual task it is a set of things it makes logical sense for one per-
son at a particular skill and authority level to do in the same session.
In the case of a manual task it is obviously possible for a user to stop halfway - perhaps capture only part of the data the manual task is there for. It must be possible to exit the task at that point, and either lose everything done so far or leave things in a partially-updated state. The task itself has completed in a well-formed way, and subsequent routing will take account of the abandoned or partial update - eg by looping back to the same task to redo or finish the update.
We have now introduced an important feature of the meta-model: the distinction between the process component (the task) and any functionality component(s) needed to implement it. From a process and control perspective a task, like a subprocess, is a transition in the business status of the request. The task is a transition from one status to one or more possible statuses representing possible subsequent routings. To achieve those transitions functionality is needed - and there could be many different ways of implementing the task.
The task as process component is a unit with attributes like:
- Task name
- Input status and possible output statuses
- Whether automatic or manual
- If manual: what roles and/or skill levels are required
and so on.
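One rough way to picture the task as a process component, kept separate from the functionality that implements it, is a small record type along the following lines. This is an illustrative sketch only; the field names are my own and follow on from the earlier Request sketch.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Task:
    # Task as process component: a status transition, not the functionality performing it.
    name: str
    input_status: str
    output_statuses: List[str]                    # possible subsequent routings
    automatic: bool
    required_roles: Optional[List[str]] = None    # only relevant for manual tasks
    functionality_components: List[str] = field(default_factory=list)  # referenced, not contained

manual_credit_check = Task(
    name="Manual credit check",
    input_status="awaiting manual credit check",
    output_statuses=["awaiting Match against stock", "awaiting customer reply"],
    automatic=False,
    required_roles=["credit supervisor"],
)

The functionality components are referenced rather than contained, reflecting the loose, potentially many-to-many coupling described below.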
Automatic and manual tasks will call for different types of functionality. A manual
task may display and accept data via an interface, while both classes of task may
run rules and update data. But as nodes in a process flow they are equivalent, as
in taking the request from one input status to one of a number of possible output
statuses. In theory at least a manual task is replaceable by an automatic task
with the same outcome, or vice versa. (In practice incremental automation is more
likely to involve increasing the proportion of requests handled by automatic tasks,
than replacing manual tasks one-for-one by automatic tasks. Removing a manual
task will more likely involve rationalising routing within the whole subprocess
rather than just replacing the task with a single equivalent automatic task.)
We should therefore think of task-as-process-component as something separate from, but loosely coupled with, the functionality component(s) which implement it. The relationship could be many-to-many: one task could be implemented by a set of functionality components (possibly linked together hierarchically) and/or the same functionality component could implement more than one task. (Task almost corresponds to the more familiar WfMC activity, but task here is unambiguously a process component rather than something incorporating both process and functionality - or something which is primarily functionality.)
It could be objected that at this task level the model stops being purely logical (what) and becomes physical (how). Perhaps the process could be implemented with different tasks; or in a different way entirely, not needing tasks? Perhaps. But we recall that task arose in the meta-model by considering the impact of the subprocess rules on the range of possible attribute values of the request dataset passing through the subprocess (including relevant external contingencies). An additional assumption is that task - like process and subprocess - is a control unit at the appropriate level. The process is the control unit responsible for taking the request through to the well-formed outcome. The subprocess takes the request from one business status to the next, defined by business rules applicable to all requests of the initiating type, regardless of individual case characteristics. Tasks are the minimum control units required so that any possible request can make the status transition defined at subprocess level.
Control unit should be construed at a management/process information/state
description level. Since these are business processes there will be different stake-
holders interested in knowing for example where each request is, or perhaps just
knowing this information exists.
If task seems more solution and less purely logical, this could come from the real-world nature of the last two axioms - including all possible cases (and therefore all the more complex aspects such as handling exceptions and various error conditions [20]); and the nodes being unambiguously defined control units.
But it is really a question of what the meta-model is, and so what is inside it and
what is outside it. The meta-model is a coherent set of inter-related concepts
which can be used to design process models at logical level, models which can
then be implemented in solution architecture.
We have already alluded to the two analogies of the relational data meta-model
and the double-entry paradigm in accounting.
The relational meta-model provides a set of inter-related concepts (entity, relationship, attribute, primary key, foreign key etc) which can be used to design data models at logical level, models which can then be implemented in physical databases. It guides, but does not prescribe, design choices. It does not say what entities to recognise. A real-world data domain could be represented in two or more equally valid logical data models, equally aligned with relational principles. The different data models could reflect different business and strategic priorities, and different assumptions about future flexibility. But there would be a vast and indefinite number of invalid data designs, ignoring or flouting relational principles. The strength of the meta-model is its principles for identifying the relatively small number of good data models and choosing between them.

[20] David Hollingsworth (2004), op. cit.
Like the relational model (and also like the double-entry paradigm, which does
not impose a particular chart of accounts but guides accounting design) the proc-
ess meta-model provides building blocks for creating a logical process design. It
does not say what the rules should be; or what requests an organisation should
honour and so what processes it should support. It does not say what controls
and break points and therefore what status transitions to recognise. It does not
prescribe ranges of values for request data sets. But it provides a related set of
concepts which, given an organisations business context and objectives, help it
make rational choices as to what its processes should be, how they should inter-
act, what intervention would repay technology investment, and how processes
should be designed and implemented to maximise return on that investment.
It does this not by viewing technology implementations in increasingly generic
and abstract terms, but by providing tools to identify and then represent the fun-
damental logic of its processes in terms of their rules. The eventual configuration
of tasks will depend on the scope of the particular process initiative and the stra-
tegic priorities of the organisation. Just as there may be more than one possible
logical data model for a given context there could be more than one possible proc-
ess model. But none need assume a particular technology implementation.
IMPLICATIONS OF THE META-MODEL
The meta-model implies and facilitates a logical process architecture. More simply: it is a logical process architecture. To illustrate this I shall indicate how the model might inform some of the process issues raised in Hollingsworth's paper.
Process fragment
First, the process fragment. The process itself is seen as
"a combination of process fragments which can be recombined in various ways to deliver new or modified business capability. This has become important to support business agility and follows from the increasingly integrated business relationships between trading organizations.
The internal view defines the actual or intended internal behavior of the process fragment - it includes not just the activities and transitions between them but also [significantly] the internal resources required to support enactment of the process. It will also identify the boundaries of the fragment in terms of interactions with other process fragments or outside objects.
The external view defines the behaviour of the fragment as a black box, addressed through its interfaces. This view sees the process fragment very much as a source and sink of messages or events of different types."
Other than the need for agility because of increasingly integrated business rela-
tionships, there is little discussion of business context. What about ownership,
governance, accountability? We need to think through what integrated business
relationships might mean in the real world.
A possible way to integrate two or more businesses is where a process owned by one business needs participation by another business. Life assurance company A decides to outsource its new business underwriting to company B. This could translate into the meta-model by having a new business process owned, defined and implemented by A, but with manual underwriting tasks available to users in company B. Company A may pursue greater agility by outsourcing the underwriting of different products to different underwriting companies. For example manual underwriting task instances for products 1, 3 and 5 might be available to users in company B and those for products 2 and 4 to users in company C. The meta-model would support this - but not impose a particular solution.
In this scenario (eg where users access task instances over the web) the functionality used by companies B and C might not qualify as process fragments, as the whole process is supplied and owned by company A. But it could be different. Company A could send details of new business applications to companies B and C by post or email, and get back underwriting decisions - again by an agreed medium. This would be similar to a company A process generating correspondence to a customer or agent and then handling the response. It would be more like where process fragments take place at company B and company C.
If the details of the new business application were in a file or data packet input into (say) company B's underwriting system, then this may be even closer to a process fragment - as company B's system functionality would be performing part of company A's process. Perhaps company B also handles company A's claims underwriting, and interacts in the same way. Then the data packets from company A might need to contain process type indicators so company B can identify them as new business or claims. If company B did underwriting for other life assurance companies then the data packet might need a client attribute indicating which life assurance company it was from. This approaches a world where standards are needed - so multiple clients like company A and multiple suppliers like B and C can interact in the same way. We may need a choreography
"to identify the valid sequences of messages and responses across and between the participating process fragments. The choreography requires each process fragment to exhibit a set of prescribed external behaviours in response to such message sequences."
But company A may want a more structured and arm's-length relationship with B and C, more like the customer-supplier relationship it in fact is. A disadvantage of the B2B acronym is that it can obscure business-architectural features by over-focusing on technology and the drive to agility. There is a risk of overlooking B2B fundamentals like ownership, accountability and governance.
Figures 4 and 5 below show alternative ways in which companies B and C might
interact with company A in respect of new business underwriting. Both diagrams
use BPMN symbols, and both are aligned with the proposed meta-model.
[Figure 4: BPMN diagram with pool lanes for Company A, Company B and Company C. Company A's lane holds the subprocesses Capture application, Check documents, Check application, Underwriting and Issue policy. Within Underwriting, an Automatic underwriting task passes cases with no manual underwriting outstanding onwards, routes cases needing manual underwriting by B to a Manual underwriting: company B task in company B's lane, and routes cases needing manual underwriting by C to a Manual underwriting: company C task in company C's lane.]
Figure 4 fits a scenario where all new business process functionality is supplied
and owned by company A. An automatic underwriting task identifies cases need-
ing no manual underwriting (or which have now been underwritten) and passes
them to the Issue policy subprocess. It splits the others between companies B
and C according to rules, perhaps based on product type. The two manual un-
derwriting tasks are made available to companies B and C respectively.
[Figure 5: BPMN diagram with pool lanes for Company A, Company B and Company C. Company A's process (Capture application, Check documents, Check application, Underwriting, Issue policy) contains only an Automatic underwriting task and an Auto receive response task. Underwriting request messages flow from company A to separate Underwriting: company B and Underwriting: company C processes in the other lanes, and underwriting outcome messages flow back to company A.]
Figure 5 shows a different relationship. Here company A's new business process excludes manual underwriting. The automatic underwriting task again splits the cases into ones needing no manual underwriting (or which have now been underwritten), ones for company B, and ones for company C. But instead of routing to manual underwriting tasks in the same process, underwriting request messages travel to company B or C as appropriate. These initiate instances of underwriting processes in B or C respectively, and generate outcome messages which travel back to company A, where the Auto receive response task processes them.
The two processes Underwriting: company B and Underwriting: company C can
be completely black box as far as company A is concerned, as long as they can
read the right request messages and generate the right outcome messages. Un-
derwriting: company B and Underwriting: company C can also be completely
different from each other internally.
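As a purely illustrative sketch (the message fields and names are my own assumptions, not a WfMC, XPDL or BPMN standard), the request and outcome messages exchanged in Figure 5 might carry something like the following:

from dataclasses import dataclass

@dataclass
class UnderwritingRequest:
    # Message sent by company A's automatic underwriting task to company B or C.
    client: str            # which life assurance company the request is from, e.g. "Company A"
    process_type: str      # e.g. "new business" or "claims"
    case_id: str           # identifies the application within company A's process
    applicant_data: dict   # the request parameters the underwriter needs

@dataclass
class UnderwritingOutcome:
    # Message returned to company A and handled by its Auto receive response task.
    case_id: str
    decision: str          # e.g. "accept", "accept with loading", "decline"
    terms: dict            # any special terms attached to the decision

Because company A depends only on this message contract, Underwriting: company B and Underwriting: company C remain black boxes and can differ from each other internally.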
Examples like these show an important feature of the meta-model in B2B and
B2C contexts. Process relationships can be depicted at a logical, business-
architectural, level without (at one extreme) showing the process so abstractly
that it and all its components float in space, not supplied, owned or governed by
any entity; or (at the other extreme) showing ownership etc only in terms of physi-
cally implemented processes. This is because the request is from a customer, and
the outcome is for a customer: the process itself is fundamentally from the sup-
plier perspective. But customer and supplier are construed broadly enough to
cover any B2B or B2C relationship.
So the meta-model allows us to simplify and sharpen our language. In Figure 5
company A's new business process (from Capture application to Issue policy) is
a process. A process has an internal logical structure (subprocess, then task). A
process can interact with other processes and/or with itself. Company A's process
initiates process Underwriting: company B and process Underwriting: company
C by generating appropriate requests. If it has issued a request message for
company B it will suspend itself until it receives an appropriate outcome message
from company B, after which it carries on to completion. We do not need to call
company A's process an "end-to-end process delivery chain", or see Underwriting:
company B and Underwriting: company C as "process fragments". The meta-
model defines them all as (business) processes.
Nesting and re-use
Some models allow nested or hierarchical relationships at subprocess or task (ac-
tivity) level. For example the WfRM's nested subprocess construct
allows a process executed in a particular workflow domain to be completely encap-
sulated as a single task within a (superior) process executed in a different workflow
domain. A hierarchic relationship exists between the superior process and the en-
capsulated process, which in effect forms a sub-process of the superior.[21]
The meta-model proposed here may at first sight appear more restrictive. It ac-
knowledges nesting between functionality components, and re-use of functionality
components between tasks. But the only nesting it allows at process component
level is between processes themselves, as in the process interactions mentioned
above. The reason is logical. The process is everything from initiating request to
outcome; the subprocess is a transition in business status of the request across
all possible requests; the task structure allows for status transitions for all possi-
ble attribute values of the extended request dataset. A task is therefore a status
transition of a specific request type, at a particular point in a particular process. It
cannot at the same time be another transition of another request type, or of the
same request type at a different point in the process. Nor would anything be
gained by allowing this kind of re-use. The actual functionality (by definition) is
contained in the functionality component(s) implementing the task, and function-
ality components can be re-used and nested with complete freedom.
[21] David Hollingsworth (1995), op. cit.
There could be nesting between task and process, in effect at process level. The
model defines process as everything from initiating request to outcome, so a
process needs a request entity to start it. It is perfectly in order for a task P1Tn in
process P1 (initiated by request R1) to create another request R2 initiating process
P2. (See Figure 7 under Process interactions and agility below.) But P2 cannot
be initiated by the same request R1, otherwise R1's outcome (O1) could be gener-
ated by P2, and by definition it must be generated by P1. This is true purely logi-
cally, and therefore cannot be a practical limitation. It is more a strength of the
meta-model that it insists on definition of request entities. It achieves the same
practical result as the WfRM's nested subprocess construct, but in the context of
formal rigour. (See also Process interactions and agility below.)
Work management
A perhaps more clearly practical strength of the model is its incorporation of in-
ternal work management into the same control structure as external interactions
with customers, agents, suppliers and other partners. In Figures 4 and 5 sub-
processes Capture application and Check application represent data entry and
validation work internal to company A, and subprocess Check documents repre-
sents that part of the process (again internal to company A) where applications
are checked to ensure no supporting documents are outstanding. A mature proc-
ess architecture would integrate these into the same work management/work
delivery functionality. With web deployment there is no reason why (in Figure 4)
this should not also extend to tasks Manual underwriting: company B and
Manual underwriting: company C. Figure 5 assumes different interaction and
ownership choices, but the interactions themselves (generation of requests and
processing of outcomes) can be incorporated into the same (internal) work man-
agement/work delivery functionality as the rest of company A's process. There is
no logical or architectural reason why B2B and B2C interactions should be in a
different control domain to internal work management, and every reason why
they should not. We do not have to risk
ignoring the organizational and human aspects of the business process, in fa-
vour of a resource model wholly based on web services.
An effective process model can cover every part of every business process,
including both (i) fully internal and (ii) web-deployed interactive and indetermi-
nately resourced processes and parts of processes.
Insofar as the meta-model implies (is) a logical process architecture, it can also
inform and guide a physical solution architecture capable of integrating process
control, work management and application system functionality in the most
immediate way, such that the business process design and the application system
design are one and the same.[22] There only need to be two domains when imple-
mentation technology determines (and therefore constrains) business process.
When the horse is before the cart, and business process determines implementa-
tion technology, there is only one domain (process implemented in technology),
calling for one holistic mindset, one holistic skill set, one holistic methodology.
[22] Chris Lawrence, Make work make sense, Future Managers, Cape Town, 2005.
Late binding
Figure 5 also indicates how the model can handle "late binding of resource to task"
and/or "dynamic discovery [of services] at fragment execution time". Trawling cy-
berspace for underwriting services may be far-fetched, but if it did make business
sense then the u/w request message would simply need not to be specific to either
company B or company C; it would instead conform to a standard format and be
transmitted to an appropriate broking service (resource directory) which would
link it up with an underwriting service provider capable of handling the request
and generating a u/w outcome message in an equally standard format.
For a less far-fetched example, subprocess Match against stock in the earlier or-
der process (Figure 1 or Figure 2) could be not an internal subprocess but a set of
tasks for issuing standardised request messages destined for appropriate whole-
sale suppliers (again via an appropriate broking service/resource directory) and
deciding (on eg price and lead-time criteria) which outcome messages to confirm
and therefore which (perhaps previously unknown) suppliers to contract with.
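As a rough sketch of this kind of late binding (the broking-service interface and the price/lead-time criteria below are hypothetical, introduced only for illustration), the requesting process addresses a standard-format request to a broker rather than to a named supplier, and the supplier is chosen at execution time.

from dataclasses import dataclass
from typing import List

@dataclass
class Provider:
    name: str
    products: List[str]
    price: float
    lead_time_days: int

class BrokingService:
    # A resource directory: capable providers are discovered at run time.
    def __init__(self, directory: List[Provider]):
        self.directory = directory

    def select(self, product: str) -> Provider:
        # Pick a capable provider, here on price and lead-time criteria (eg only).
        capable = [p for p in self.directory if product in p.products]
        if not capable:
            raise LookupError(f"no provider can supply {product}")
        return min(capable, key=lambda p: (p.price, p.lead_time_days))

broker = BrokingService([
    Provider("wholesaler X", ["widgets"], price=9.50, lead_time_days=3),
    Provider("wholesaler Y", ["widgets", "gadgets"], price=9.20, lead_time_days=5),
])
chosen = broker.select("widgets")     # a perhaps previously unknown supplier
print(chosen.name)                    # wholesaler Y (cheapest capable provider)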
Process and data
Under the heading "Information and its relationship to process and organiza-
tion" Hollingsworth sees a distinction between process-based and information-
based architectures:
Process-based architectures tend to emphasise process as the dominant di-
mension; processes consume, generate or transform information, behaving in
accordance with a set of corporate governance rules. By contrast, information
based architectures emphasise the information dimension, viewing processes
as operations that are triggered as a result of information change.[23]
[23] David Hollingsworth (2004), op. cit.
Does this have to be true? Must we choose between process- and information-based
architectures? Remembering the distinction between task (as process component)
and the functionality component(s) implementing the task, then what sort of thing
is a task? In an implemented process-architected solution the relevant entities
might be related as in a logical data model, something like Figure 6.
[Figure 6: logical data model relating RequestType/RequestInstance, ProcessType/ProcessInstance, SubprocessType/SubprocessInstance, TaskType/TaskInstance, FunctionalityComponent, Task-Component, User, Access and Routing]
Note the repeating type/instance pattern. A process model (of a business or
business area) is at type level, where the rules are. Each instance is an entity
created as required for the individual case (RequestInstance) and conforming to
the rules for its type. For example an automatic credit check TaskInstance (for eg
order number 001) will behave as determined by the automatic credit check
TaskType.
Stepping through this data model, a request (RequestInstance) is of a particular
RequestType, identifying the correct ProcessType. A ProcessInstance is created of
that ProcessType. The ProcessType defines a set of SubprocessTypes which are
the subprocesses for that process. (SubprocessType and SubprocessInstance are
shown in broken lines because of their possible redundancy at implementation
level.) SubprocessType and/or ProcessType identify the set of TaskTypes for each
relevant SubprocessType/ProcessType. TaskInstances are created as required, as
not every ProcessInstance will need instances of every TaskType. But typically a
TaskInstance for the first TaskType will be needed to start the process.
The (first) TaskInstance will run by causing the relevant FunctionalityCompo-
nent(s) to run. The relationship between TaskType and FunctionalityComponent
is many-to-many, but there would typically be one controlling FunctionalityCom-
ponent responsible for starting the task and calling other FunctionalityCompo-
nents as required. If the TaskType is a manual task then Access will define the
User(s) able to perform the TaskInstance.
Routing is an intersection entity with rules as to what next task(s) are possible
for each task. Attributes would be eg From TaskType (ie this TaskType), To
TaskType (ie next TaskType) and Condition (ie exit condition under which the
From TaskType would lead to the To TaskType).
In a physical implementation many of these actions and links would be by generic
process functionality operating against process control type data and creating
process control instance data as required. This generic functionality, along with
the type and instance process data, is typically referred to as a process engine.
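A minimal sketch of this idea, assuming nothing beyond the entity names of Figure 6 (the Python structures, field names and example task names below are illustrative, not a reference implementation): the type-level entities hold the rules, and generic engine functions create and route the instance-level entities.

from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class TaskType:
    name: str
    manual: bool = False

@dataclass
class Routing:
    # Intersection entity: which task type may follow which, under what exit condition.
    from_task: str
    to_task: str
    condition: Callable[[dict], bool]      # evaluated against the (extended) request dataset

@dataclass
class ProcessType:
    name: str
    first_task: str
    task_types: Dict[str, TaskType]
    routing: List[Routing]

@dataclass
class TaskInstance:
    task_type: str
    status: str = "open"

@dataclass
class ProcessInstance:
    process_type: ProcessType
    request_data: dict                     # the request dataset for this individual case
    tasks: List[TaskInstance] = field(default_factory=list)

def start_process(ptype: ProcessType, request_data: dict) -> ProcessInstance:
    # Generic engine behaviour: a request creates a ProcessInstance plus the
    # TaskInstance for the first TaskType.
    pi = ProcessInstance(ptype, request_data)
    pi.tasks.append(TaskInstance(ptype.first_task))
    return pi

def complete_task(pi: ProcessInstance, task: TaskInstance) -> Optional[TaskInstance]:
    # Close a task and create the next TaskInstance, chosen purely from the
    # Routing control data and the current request dataset.
    task.status = "complete"
    for route in pi.process_type.routing:
        if route.from_task == task.task_type and route.condition(pi.request_data):
            nxt = TaskInstance(route.to_task)
            pi.tasks.append(nxt)
            return nxt
    return None                            # no route matched: the process has reached an outcome

# A simplified, hypothetical order process expressed purely as type-level data.
order_process = ProcessType(
    name="Order process",
    first_task="Take order",
    task_types={t: TaskType(t, manual=True) for t in ("Take order", "Check order", "Authorise order")},
    routing=[Routing("Take order", "Check order", lambda d: True),
             Routing("Check order", "Authorise order", lambda d: d["complete"])],
)
pi = start_process(order_process, {"order_no": "001", "complete": True})
complete_task(pi, pi.tasks[0])             # the engine creates the "Check order" TaskInstance

Changing a Routing row or its condition alters how every new ProcessInstance behaves without any change to the generic functionality; that is what treating process as data buys.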
A summary like this demonstrates that components like process, subprocess and
task are best thought of as data entities to control, guide and implement the be-
haviour of a process-based system. So a sophisticated process-based architec-
ture is one treating process as data at logical level and implementing process as
data at physical level. Conversely a sophisticated information-based architecture
is one which extends its reach to include process entities. If process is a subset
of information, why distinguish between process- and information-based archi-
tectures? An information-based architecture which viewed processes only as "op-
erations triggered as a result of information change" could be described as an
incomplete information-based architecture. Conversely a process-based architec-
ture which saw processes only as things which
consume, generate or transform information, behaving in accordance with a
set of corporate governance rules
(thereby missing the point that business processes themselves can and should
be modelled and implemented as data) could be described as an incomplete proc-
ess-based architecture.
We do not have to compromise. We can take business process seriously, by tak-
ing seriously its representation in data. Hollingsworth admits the WfRM
does embrace all three dimensions [process, information and organization]
but takes a relatively simplistic view of the information dimension.
This "simplistic view" could be a reason it is often a challenge to extend proprietary
workflow packages to provide true process control functionality, despite vendor
claims. Process control is about rules, and rules are data.
Process interactions and agility
This is a convenient point to return to the discussion of process interactions,
important because of its implications for process agility.
The model lets us distinguish two kinds of interaction. One (Figure 7) is men-
tioned under Nesting and re-use above, where eg a task P1Tn in process P1 cre-
ates a request R2 to initiate process P2.
[Figure 7: task P1Tn in process P1 creates request R2, which initiates process P2.]
[Figure 8: task P3Tn in process P3 updates business data read by task P4Tn in process P4; with the update the case proceeds from P4Tn to the next subprocess, without it the case routes via task P4Tm.]
Another (Figure 8) is where task P3Tn in process P3 merely creates or changes
business data so that when a rule in task P4Tn in process P4 runs it has a differ-
ent result than would have happened before the update.
On the functionality component level there is no distinction, as P1Tn in process
P1 creating request R2 could be described as creating or changing business
data. The difference is of course that the business data in Figure 7 is a request,
and request is a structural component of the process meta-model.
Consider now a familiar version of Figure 8 where the data task P3Tn updates is
control data implementing a business rule. An example might be where process
P4 is an order process, the affected subprocess is Authorise order, task P4Tn is
Automatic authorisation and task P4Tm is Manual authorisation.
In Figure 9 the automatic authorisation task passes orders where the values are
within limits stored on the Authorisation limits file. Authorisation limits could
hold maximum values for, say, different product types and/or customer catego-
ries. Any order within these limits passes to the next subprocess. Any order ex-
ceeding one or more limits routes to a manual authorisation task.
The authorisation limits themselves are updated by a separate Parameter update
process. For a particular product/customer category combination the Parameter
update process could increase the limit from 100 to 120. Before the update an
order for 110 for that same product/customer category combination routes to
manual authorisation. After the update an identical order would pass automatic
authorisation.
[Figure 9: in the order process, subprocess Authorise order contains an Auto authorisation task (within limits: on to the next subprocess) and a Manual authorisation task (exceeds limit(s)); Auto authorisation reads the Authorisation limits data, which is updated by a separate Parameter update process.]
The meta-model supports this kind of agility. The order process itself has not
changed, but its manifest behaviour has because a rule implemented as control
data has been changed dynamically.
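A small illustration of that point, using the values of the Figure 9 example (the data structure and function below are hypothetical): the rule lives in control data, so the Parameter update process changes the routing behaviour of the order process without the order process definition itself changing.

# (product type, customer category) -> maximum order value for automatic authorisation
authorisation_limits = {("widgets", "retail"): 100}

def auto_authorisation(order: dict) -> str:
    # Route the order: within limits -> next subprocess; exceeds limit(s) -> manual authorisation.
    limit = authorisation_limits.get((order["product"], order["category"]), 0)
    return "next subprocess" if order["value"] <= limit else "manual authorisation"

order = {"product": "widgets", "category": "retail", "value": 110}
print(auto_authorisation(order))             # manual authorisation: 110 exceeds the limit of 100

# The separate Parameter update process raises the limit from 100 to 120.
authorisation_limits[("widgets", "retail")] = 120
print(auto_authorisation(order))             # next subprocess: same order, same process, new behaviour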
It is perhaps a relatively tame kind of agility. But even so its dynamic nature has
implications which need acknowledging. Do we worry about equitable treatment?
Two orders from two customers in the same category could arrive on the same
day for the same amount of the same product. One is delayed by an earlier sub-
process so that by the time it gets to subprocess: Authorise order it is automatically
approved, because of the update. The other order is not delayed, and has to
be manually approved because it gets in before the update. Does this matter? Or
should the effective date of the parameter update be compared with the date the
order is received, rather than the parameter update taking effect at run time? If so
the old authorisation limits would need to persist for a while alongside the new.
The impact of rule and control data changes will depend on business context.
Questions like these are commonplace in administration, and are unproblematic
when human beings provide the integration between work management and
administration systems. The challenge comes with increasing automation: an
architecture integrating business process with business functionality.
Consider the impact of an agile change to the behaviour of the automatic under-
writing task in Figure 5. Perhaps we want to add another company (D) to share
the underwriting on product 4, and also move the whole of product 1 from com-
pany B to company D.
In Figure 10 the U/W company parameters file holds details about what com-
pany can underwrite what product, and rules about how to split underwriting on
a product between more than one underwriting company. The Automatic under-
writing task will need to read this file to know which company to generate the
underwriting request for. The Auto receive response task may also need to read
the file to validate an underwriting outcome.
Company B may have received underwriting requests for product 1 but not yet
completed them. So although, following the update, company B is no longer al-
lowed to underwrite product 1, an outcome from company B for a product 1 re-
quest issued before the update should be treated as valid.
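One possible way of handling that validation, sketched below under the assumption that each U/W company parameter row carries an effective-date range (the structure, names and dates are illustrative, not prescribed by the meta-model): the outcome is validated against the parameters as they stood when the request was issued, so work already in flight at the underwriting companies survives the update.

from datetime import date

# Each parameter row carries an effective-date range (to = None means still in force).
uw_company_parameters = [
    {"product": "product 1", "company": "B", "from": date(2006, 1, 1), "to": date(2007, 3, 31)},
    {"product": "product 1", "company": "D", "from": date(2007, 4, 1), "to": None},
]

def allowed(product: str, company: str, as_at: date) -> bool:
    # Was this company allowed to underwrite this product on the given date?
    for row in uw_company_parameters:
        if (row["product"] == product and row["company"] == company
                and row["from"] <= as_at and (row["to"] is None or as_at <= row["to"])):
            return True
    return False

# A request issued to company B before the update; the outcome arrives after it.
request_issued = date(2007, 3, 20)
outcome_received = date(2007, 4, 10)
print(allowed("product 1", "B", outcome_received))   # False: B no longer underwrites product 1
print(allowed("product 1", "B", request_issued))     # True: so the in-flight outcome is still valid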
But there could be more to it than this in a real business context. If company A
changes its rules so that underwriting work for products 1 and 4 is now out-
sourced to company D, company D must expect to receive these requests, and
companies B and C must also expect changes in their incoming requests. Do we
assume the U/W parameter update process handles some or all of this corre-
spondence, or is it just an internal data update process? What does just an in-
ternal data update process mean? What if the strategic intent is to implement as
many business processes as possible seamlessly, transparently and explicitly in
application technology, as this was seen to maximise effectiveness and efficiency?
[Figure 10: BPMN diagram of the new business process with pools for companies A, B, C and D. Company A's process (Capture application, Check application, Check documents, Underwriting with Automatic underwriting and Auto receive response, Issue policy) reads the U/W company parameters data to decide which company needs to do manual underwriting, sends u/w request messages accordingly, and receives u/w outcome messages from processes Underwriting: company B, Underwriting: company C and Underwriting: company D; a separate U/W parameter update process updates the U/W company parameters.]
The overall business process could be like this:
[Figure 11: process Change/new outsource contract, from Request for new/changed contract to New/changed contract implemented.]
The "customer" in this case is actually the supplier or potential supplier. Or per-
haps the customer is internal to company A: a department responsible for con-
tracting with underwriting providers. The point is that the Underwriting parame-
ter update process of Figure 10 could be seen more holistically as part of a busi-
ness process to set up a new outsource contract and/or change an existing one.
(The big difference between system process and business process is not so much
an ontological distinction between surrogate and what the surrogate represents.[24]
It is more a question of how holistically we choose to see the process.)
There may need to be three instances of the process: one to remove product 1
from company B's contract; one to set up a new contract for company D to be sole
underwriter for product 1 and pooled underwriter for product 4; and one to
change company C from sole underwriter to pooled underwriter for product 4.
The process could also be responsible for generating and receiving correspon-
dence: eg new/amended contracts for signature and return.
[24] Ronald G. Ross, "How Rules and Processes Relate ~ Part 4. Business Processes vs. System Processes," Business Rules Journal, Vol. 7, No. 1 (Jan. 2006), URL: http://www.BRCommunity.com/a2006/b265.html
The amount of automation (and system support generally) appropriate for a busi-
ness process like this has more to do with its economics than with its logic or
purpose. Company A may have functionality to update underwriting company
parameters in one of its IT systems, but no functionality other than that to sup-
port changes in outsource contracting. But its business processes should not be
seen in terms of the systems it happens to have, unless of course they were suc-
cessfully process-architected, in which case the processes and their system im-
plementation would be one and the same. I accept the idea that a data entity is a
surrogate for the real-world entity it represents, but it does not follow that
any model of the business (including its processes)... [has to] ...differ from any
model for the design of an information/knowledge system (including its proc-
esses).[25]
[25] Ronald G. Ross, ibid.
The reason for discussing scenarios like these is to stress that it is often not so
much what technology we need to maximise agility, but how exhaustive we want
(or can afford) our impact analysis to be. Automation at any level ushers in a new
world. Run-time opportunities for human intervention and ingenuity are reduced
by design, so the intervention and ingenuity must be supplied at design time.
Two things are often expected from a process engine or a BPMS. One is process
flexibility: being able to change a process at the click of a button. The other is
rule-based routing: being able to store and change business rules controlling
how work travels through a business. Both are possible, but come at a price.
We have spoken about the run-time impact of rule and control data changes, and
said that process, subprocess and task themselves are best considered as data.
With process being treated as type and instance data entities manipulated by
generic functionality, a useful approach to agility (and to constraints against agil-
ity) is by way of data-model crow's feet (Figure 12).
[Figure 12: a one-to-many (crow's foot) relationship between a Higher entity and a Lower entity.]
In general the higher an entity is, the more impact changing an attribute value has,
as it may affect lower entities. In a one:many relationship the "one" affects the
"many", but the "many" tends not to affect the "one".
[Figure 13: an Even higher entity above the Higher and Lower entities of Figure 12.]
In Figure 13 Even higher will impact Higher, which in turn will impact Lower.
So Even higher will generally indirectly impact Lower as well.
A classic example of an Even higher entity is Frequency, with values of monthly,
annual, half-yearly etc. Frequency is important in many business contexts but it
is not a process entity like ProcessType, TaskInstance etc. Consider the impact
of adding a new frequency, eg quarterly. In many financial services and general
commercial contexts this would not be possible without adding or changing func-
tionality, eg for calculating charges and/or interest. Another complication is ex-
ternal impact: there may be an implicit assumption that an internal Frequency
value is also allowable externally, eg for bank debit orders or standing orders. A
new internally allowable Frequency value may not be allowable in all external
contexts. Functionality changes may therefore need to be made to prevent un-
foreseen consequences. In view of the position of Frequency in the data model the
changes could be extensive.
Removing a Frequency value could have even greater impact. If we make half-
yearly invalid, what happens to historic half-yearly transactions? We cannot pre-
tend they never happened. There will also be customers who have been used to
having (say) half-yearly statements and must now be told they cannot any longer.
The impact of changing processes is similar to that of changing (other) control
data. A process change can mean different things depending on what actual data
is changing.
Often all that is intended is the sort of parameter change where an authorisation
limit is increased. This has process implications, but relatively untraumatic ones.
Another fairly innocuous type of process change would be a change to the Access
entity in Figure 6, eg removing access for a particular TaskType from one set of
users and granting it to another. Such a change could have high impact: re-
routing categories of work from one department to another, say. So it should be
made in a coherent and controlled way, with full regard to training, accountability
etc. But it would be unproblematic from an architectural perspective.
What happens though if we want to change the process structure itself, for exam-
ple the sequence of subprocesses in the order process in Figure 1? Some changes
would make little sense, eg putting Check order before Take order, unless of
course the change had a special meaning:
Check order = manual check that the order was correct and complete;
Take order = capturing the order in a system, which can only be done (or
which we only want to do) with orders which have been manually checked
to ensure they are correct and complete.
What if Despatch order came before Authorise order? This would mean implicitly
authorising every order, and make the final Authorise order redundant. Unless
the supplier had a special relationship with its customers, who would be happy to
return any order not authorised after despatch.
A more plausible example might be if Match against stock and Authorise order
were to change places, as this might represent an equally valid (if more cautious)
way of doing business. There are a number of areas of possible impact.
What about previous process instances, before the change? Are our systems,
databases and people able to handle two types of history records: old ones con-
flicting with the new process and new ones matching it? Is there anything about
any aspect of the business infrastructure to force old cases to be interpreted as
new ones? Consider a customer claiming not to have got exactly what was or-
dered, and whether a supplier employee might try to resolve the query differently
if it was an old case or a new case. What if it was hard to tell old from new?
Another issue is process instances already started but not finished at the time of
the process change. A sequence could be:
1. Order matched against stock.
2. Order process changed.
3. Order authorised.
4. Order matched against stock (again).
Easily resolved (in concept at least) by only applying the process change to new
orders. But this means either having two process models in existence for a time,
or having to empty the system of all old orders first before making the change.
This could delay new orders while old ones are completing.
Quite apart from timing issues there is the business impact of the change itself.
In the old process someone responsible for manual authorisation would know
the order had already been matched against stock. We shall assume this means
both ensuring there was stock to fulfil the order and allocating specific stock so
that a subsequent order cannot be matched against the exact same stock. In the
old process an authorised order could be immediately despatched. If a customer
queried his order status a supplier employee could see the order was authorised
and confirm it was on its way. An order which was not authorised however would
need its stock allocation de-allocated, to make it available for subsequent orders.
The new process has different logistics. The supplier employee answering the cus-
tomer query cannot confirm the order is on its way just because it has been
authorised, as the stock may not yet be available. An order which is not author-
ised however will not need stock de-allocated, as none would have been allocated.
Scenarios like these are fairly mundane but they are what process and process
changes are about. The more process control is automated and integrated the
more agility is about thinking through the logistics of what a process change
means in practice. Not so much how agile can we be, and what technology solu-
tion will make us more so, but what do we need in place and what thinking and
designing and testing must we do so that agility (like changing a few data items
on the Routing entity in Figure 6) does not result in chaos.
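To make "changing a few data items on the Routing entity" concrete (using a hypothetical routing table, simplified here to (from task, to task) pairs rather than the fuller structure sketched earlier), swapping Match against stock and Authorise order really is just an update to a few rows of control data; the hard part is the impact analysis described above, not the mechanics.

# Hypothetical order-process structure expressed as Routing rows (from task -> to task).
routing = [
    ("Take order", "Check order"),
    ("Check order", "Match against stock"),
    ("Match against stock", "Authorise order"),
    ("Authorise order", "Despatch order"),
]

def reroute(routing, first, second):
    # Swap the positions of two tasks by rewriting the affected rows.
    swap = {first: second, second: first}
    return [(swap.get(frm, frm), swap.get(to, to)) for frm, to in routing]

new_routing = reroute(routing, "Match against stock", "Authorise order")
# [('Take order', 'Check order'), ('Check order', 'Authorise order'),
#  ('Authorise order', 'Match against stock'), ('Match against stock', 'Despatch order')]
# The data change is trivial; the old instances, in-flight instances and business
# logistics discussed above are what actually need thinking through.
print(new_routing)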
USING THE META-MODEL
This section considers ways in which the meta-model can guide process design
and translation between different process representations and implementations.
Guiding process design
Much of the material in previous sections has been about this. In summary: the
model provides concepts and criteria for identifying processes, and for knowing
where they start and stop and how they interact. The key is to define the request
entity for each process. This sets the level the process is at (account, customer,
claim etc); what constitutes completion (outcome); and what sequence and struc-
ture of rules are needed on the way to completion. A process may trigger other
processes, and the triggering process may or may not depend on their completion.
The request entity will undergo status transitions related to the sequential and
structured application of rules. The meta-model guides the identification of these
status transitions first at subprocess then at task level. So it is a coherent meth-
odology for designing task structure. Once the tasks have been designed at a
process logic level, the functionality they require can be designed and built
(and/or found and adapted from existing applications). The details will vary de-
pending on what solution architecture is assumed.
The request entity identifies an extended data set. The implications of the interac-
tions between that data set and the process rules are what determine the process
design. The request entity is therefore a crucial process design component.
Treatment of process is aligned and intertwined with treatment of data.
Because a subset of tasks may be manual, the meta-model draws work man-
agement into the same holistic logical architecture. (There are further implica-
tions for strategic change, benefits realisation etc which are explored elsewhere.[26])
[26] Chris Lawrence, Make work make sense, Future Managers, Cape Town, 2005.
Process translation
Examples of this are
to enable the integration of different process design products with different
execution products or to allow simple migration of existing process definitions
to a different design/execution product.[27]
Because the meta-model derives from an analysis of what it is to be a business
process, rather than what it is to be a workflow system or a BPMS, it provides
grammar and semantics to represent the essential structure of a business proc-
ess at logical level. Because a process implementation is a "how" to the "what" of the
logical representation, the grammar and semantics of the meta-model can also be
used to represent the implemented process. How this might be done would de-
pend on the implemented process and the intention behind the representation.
Translation involves mapping. The process meta-model will support translation if
different process designs (including both conceptual and executable models) can
map to it. The intention could be (i) to store the essential features of an imple-
mented process, however successful or unsuccessful, in an independent language
or format so as to implement that same process in different technology and/or
architecture. A translation like this would need to go via a conceptual model. Or,
more in line with Hollingsworth's diagram of a BPM component model[28], the in-
tention could be (ii) to have a common conceptual model capable of translation
into different executable models appropriate to different implementation architec-
tures. Or in a B2B context (iii) two or more organisations interacting at the proc-
ess level might want to share an understanding of their interactions.
A first observation is that scenarios like these will rarely feature a single proc-
ess, with process defined as the meta-model defines it. In all three scenarios the
unit will typically be a set of linked processes, as this would be a more likely
scope of a practical business intervention than a single process.
In scenario (i) the essential features of an implemented process set need to be
stored in an independent format so as to support a different implementation of
that same process set. The implemented process set is effectively to be reverse-
engineered into a conceptual model. Regardless of what the implementation was
like (perhaps a single, explicit, generic process engine controlled the process set
in its entirety, or choreographed messaging was used) I would want to under-
stand the process set in business terms. This would involve identifying the re-
quest entities, and therefore the outcomes, and therefore where each process be-
gins and ends. In the case of a messaging solution, it is possible that some of the
messages and responses may be requests and outcomes, but perhaps not all.
Also not every request and outcome may be explicit in the implementation.
Just because that was what I would want to do does not mean it has to be done.
It depends on the objective behind the translation. A description of an imple-
mented process set could, in certain circumstances, be translated completely me-
chanically into the terms of the meta-model. Anything identified as a process
would be translated as a process, and anything which starts a process would be
translated as a request.
[27] David Hollingsworth (2004), op. cit.
[28] David Hollingsworth (2004), op. cit.
[Figure 14: an as-is implementation running from Request R1 through manual functions MF1-MF4 and automatic functions AF1-AF2 (with a Yes/No branch) to Outcome O1, shown alongside its one-to-one translation into manual tasks MT1-MT4 and automatic tasks AT1-AT2 between the same request and outcome.]
The result may not be a very good conceptual model, or much of an advance on
the previous description. But it would be in a conceptual format where coherent
methodological principles could be applied to improving and optimising it.
Having identified the request entities and start and end points of the processes in
the process set (either analytically or mechanically) I would identify subprocess-
level status transitions and then the task structure.
But again how analytical this would be would depend on the translation objec-
tives, and how much of a conceptual model or specification already existed. If
there was no intention to rationalise or improve the process set, then the as-is
components and routing could be transferred one-to-one into manual and auto-
matic tasks, perhaps with no attempt at logical analysis via subprocesses. Rules
would not be exposed; they would just be assumed to be correct and to belong
where they are in the implemented components: see Figure 14.
[Figure 16: Task-Component intersection: MF1→MT1, AF1→AT1, MF2→MT2, MF1→MT3, MF4→MT4, AF2→AT2 (the component MF1 is re-used for tasks MT1 and MT3).]
In accordance with the meta-model however the as-is components would not
transfer across as tasks themselves: an automatic or manual task would corre-
spond to each component, but with a Task-Component (or equivalent) record
linking it to the appropriate Task record (see Figure 6 and Figure 15).
It may however be that the components for (say) tasks MT1 and MT3 are or could
be identical. The correspondence could then be rationalised as in Figure 16.
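The rationalisation can be seen purely as a change in the Task-Component intersection data, as in this small illustration (the Python dictionaries below are a hypothetical rendering of Figures 15 and 16): the tasks remain distinct control units while the functionality component MF1 is re-used.

# Task-Component intersection data before and after the Figure 15 -> Figure 16
# rationalisation: tasks stay distinct control units; components are re-used.
task_component_fig15 = {
    "MT1": "MF1", "AT1": "AF1", "MT2": "MF2",
    "MT3": "MF3", "MT4": "MF4", "AT2": "AF2",
}

# MT1 and MT3 turn out to need identical functionality, so both map to MF1;
# MF3 is no longer needed.
task_component_fig16 = {
    "MT1": "MF1", "AT1": "AF1", "MT2": "MF2",
    "MT3": "MF1", "MT4": "MF4", "AT2": "AF2",
}

reused = {c for c in task_component_fig16.values()
          if list(task_component_fig16.values()).count(c) > 1}
print(reused)   # {'MF1'}: one functionality component now implements two tasks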
Further changes could be made, depending on whether there was any need or
intention to improve or optimise the model. For example the interaction between
implemented rules and process flow might be improved by altering the configura-
tion of AT1 and MT2 as in Figure 17.
It might then be decided that further improvement would be served by recognising
a subprocess level, as in Figure 18. But none of these changes are mandatory,
and they can be made incrementally. The extent of the rationalisation would de-
pend on the scope and objectives of the exercise.
Reverse-engineering a logical data model from a physical file structure is often an
occasion for critiqueeg discovering that a repeating group inappropriately limits
entity instances, or that denormalisation compromises data integrity. In the same
way deriving a rigorous conceptual model from an as-is implementation will often
expose anomalies, incompleteness and faults. For example a common omission in
process environments based on proprietary workflow implementations is that
automatic tasks are under-represented, awkwardly or inexplicitly engineered, or
just plain absent. This can happen when process is seen in terms of sequences
of manual activities rather than sequences of rules.
[Figure 15: Task-Component intersection: MF1→MT1, AF1→AT1, MF2→MT2, MF3→MT3, MF4→MT4, AF2→AT2.]
[Figure 17: the task model of Figure 14 with the configuration of automatic task AT1 and manual task MT2 altered.]
[Figure 18: the task model further grouped into Subprocess 1 to Subprocess 4.]
The more reverse-engineering is mechanised the less opportunity there would be
for this.
I would express the conceptual model in BPMN (assuming BPMN continues as the
standard), but in the knowledge that the same information could be expressed in a
different format, eg in a populated database (see Figure 6). One implication of
using BPMN is that process, subprocess and task all use the same symbol. So each
box must be carefully named to indicate whether it is a process, subprocess or
task.
Again depending on the objectives of the exercise there are possible extensions.
One would be to document human actors and/or roles involved in the process,
simply by populating entities equivalent to Access and User in Figure 6. Figure 19
shows a possible population of Access.
[Figure 19: a possible population of Access: MT1→Clerk1, MT2→Supervisor, MT3→Clerk2, MT4→Clerk2.]
Taken together these two extensions would be
specifying the work resources associated with the process work items, or ac-
tivities.[29]
Much of what has been said is also relevant to scenario (ii), except that reverse-
engineering does not apply. For that reason it should be safe to assume an inten-
tion to get as pure a conceptual model as possible. It should therefore not be too
simplistic to see translation into different executable models as in theory mostly a
matter of different population of entities like Access, User, FunctionalityCom-
ponent and Task-Component (Figures 6, 15, 16 and 19).
Adopting a meta-model is ultimately a choice to see a particular domain in a par-
ticular way. So in the B2B scenario (iii) it is for the participants to decide how they
want to see their process interactions.
Although the shared scope will also typically be a set of linked processes, we need
not and cannot assume one conceptual model both to represent that set of proc-
esses and be shared by the participating organisations. What matters more is that
where organisations interact, they understand the interactions in the same way, and
that the interactions make sense to each participant in terms of its own individual
process model. This is more likely if the participants share the same meta-model,
either explicitly or implicitly (by subscribing to the same agreed standards).
Figure 4 and Figure 5 show two different ways in which the participants might
interact. It is worth drawing out some implications for meta-model sharing.
In Figure 4 the process in a sense belongs to company A. A familiar implementa-
tion would be where the new business process from Capture application to Issue
policy is defined in company A's process or workflow application. The two tasks
Manual underwriting: company B and Manual underwriting: company C would
be part of that same application belonging to company A. It is just the users with
access to the two tasks that belong to company B and company C respectively.
Even if the linking between the tasks in the new business process was based on
(say) messaging, then Figure 4 could still apply: the diagram does not assume any
particular technology.
It does however make an ownership assumption. Even if the process control
technology used messaging, Figure 4 assumes company A owns (supplies, de-
fines, supports etc) the two manual underwriting tasks. Companies B and C are
suppliers to company A but (in this context) not by way of a business process.
They supply resources. The mechanism by which this was set up was no doubt a
business process (eg contracting), but in Figure 4 companies B and C are not re-
ceiving a request from company A which they are meeting by starting a process.
Instead they are participating in company A's process, as that is the agreement.
Figure 5 could also be physically implemented using messaging technology, but
the relationships between the interacting companies are different.
As far as meta-model sharing is concerned, in Figure 4 companies B and C may
or may not share, understand, or subscribe to the process meta-model company
A employs in its business architecture. As long as B and C supply underwriters
and fulfil their service agreement, the process will work. The scenario in Figure 5
however would be helped if companies A, B and C shared the same meta-model
at least implicitly, by conforming to the same standards. It would be even better if
the participants explicitly shared, and knew they were explicitly sharing, the
meta-model underpinning those standards and providing their rationale.
[29] David Hollingsworth, ibid.
Two organisations interacting commercially generally share an understanding of
double-entry. Communication would be more difficult if not. On a slightly less
universal level, communication between two organisations transferring data be-
tween each other is easier if both can assume they both understand the princi-
ples of the relational model in the same way.
Two organisations interacting with each other's processes are more likely to find
those interactions go smoothly if, implicitly or explicitly, they subscribe to the
same process meta-model. At the very least, seeing interactions in terms of re-
quests and outcomes means identifying customer and supplier roles and, by
extension, ownership and accountability. In a business process outsource (BPO)
context sharing a meta-model is almost mandatory, not only between the BPO
provider (providing the white-label process shell) and its clients; but also between
the separate clients who all use the same process shell as if it was their own.
SUMMARY
It is possible to derive an effective logical and generic meta-model for business
processes from the concept of a request. The request is implicit or explicit, from
an internal or external customer. Because it is for a specific outcome and of a
specific type it unambiguously determines the end-to-end process. The process
starts at the request and ends at either the requested outcome or an alternative
outcome according to the rules of the process.
By applying process rules to the request dataset regardless of the range of possi-
ble attribute values, the process is analysed to subprocess level. Each subprocess
is a business status transition of the request. So is each task. The task structure
is defined from the minimum set of status transitions required for all possible re-
quests to achieve the subprocess status transition. (Subprocess level is important
for analysis and design and valuable for management information and control,
but is technically superfluous in production.) As status transitions, process, sub-
process and task are primarily control units. Conceptually at least the task would
be loosely coupled to any functionality component(s) which implement it.
Process components (request, process, subprocess, task) are seen as data: both
type entities (RequestType, ProcessType etc) and instance entities (RequestIn-
stance, ProcessInstance etc). The process model of an organisation is defined at
type level (ProcessType, SubprocessType, TaskType, plus all routing and interac-
tions), and is itself ultimately a structure of rules.
Processes can interact with each other (and with themselves) in a number of
ways. One type of interaction happens when functionality in one process updates
data which influences what happens in another process. But a task in one proc-
ess can also generate a request to initiate another process instance: a new in-
stance of the same process type or of another process type.
The meta-model is independent of any implementation technology. However since
it is a logical business process architecture, it can guide the successful design
and implementation of process-architected business solutions.[30] Different imple-
mented processes and process-modelling formats can also be mapped to it, be-
cause it represents the underlying "what" of the business process.
[30] Chris Lawrence, Make Work Make Sense, Future Managers, Cape Town, 2005.

Section 3

Appendices
WfMC Structure and
Membership Information
WHAT IS THE WORKFLOW MANAGEMENT COALITION?
The Workflow Management Coalition, founded in August 1993, is a non-
profit, international organization of workflow vendors, users, analysts and
university/research groups. The Coalition's mission is to promote and de-
velop the use of workflow through the establishment of standards for soft-
ware terminology, interoperability and connectivity among BPM and work-
flow products. Comprising more than 250 members spread throughout the
world, the Coalition is the primary standards body for this software market.
WORKFLOW STANDARDS FRAMEWORK
The Coalition has developed a framework for the establishment of workflow
standards. This framework includes five categories of interoperability and
communication standards that will allow multiple workflow products to co-
exist and interoperate within a user's environment. Technical details are in-
cluded in the white paper entitled "The Work of the Coalition," available at
www.wfmc.org.
ACHIEVEMENTS
The initial work of the Coalition focused on publishing the Reference Model
and Glossary, defining a common architecture and terminology for the in-
dustry. A major milestone was achieved with the publication of the first ver-
sions of the Workflow API (WAPI) specification, covering the Workflow Client
Application Interface, and the Workflow Interoperability specification.
In addition to a series of successful tutorials across the U.S., Asia and
Europe, the WfMC spent many hours over 2006 helping to drive awareness,
understanding and adoption of XPDL. As a result, it has been cited as the
most deployed BPM standard by a number of industry analysts, and contin-
ues to receive a growing amount of media attention.
In "Open Formats and Transparency in Business Process Definition" pub-
lished in the Enterprise Open Source Journal, WfMC Executive Director Na-
thaniel Palmer discusses the merits of XPDL as a means for ensuring process
definition transparency and portability. XPDL is being adopted as a require-
ment for BPM workflow RFPs, with the most recent examples cited being a large
federal government project and that of a telecommunications firm.
In November the WfMC Interface 1 Committee met to review enhancements
and extensions for XPDL 2.1 to be released in the first half of 2007.
WORKFLOW MANAGEMENT COALITION STRUCTURE
The Coalition is divided into three major committees, the Technical Commit-
tee, the External Relations Committee, and the Steering Committee. Small
working groups exist within each committee for the purpose of defining
workflow terminology, interoperability and connectivity standards, confor-
mance requirements, and for assisting in the communication of this informa-
tion to the workflow user community.
The Coalition's major committees meet three times per calendar year for
three days at a time, with meetings usually alternating between a North
American and a European location. The working group meetings are held
during these three days, and as necessary throughout the year.
MEMBERSHIP STRUCTURE AND DETAILS
Coalition membership is open to all interested parties involved in the crea-
tion, analysis or deployment of workflow software systems. Membership is
governed by a Document of Understanding, which outlines meeting regula-
tions, voting rights etc. Membership material is available at www.wfmc.org.
COALITION WORKING GROUPS
The Coalition has established a number of Working Groups, each working on
a particular area of specification. The working groups are loosely structured
around the Workflow Reference Model, which provides the framework for
the Coalition's standards program. The Reference Model identifies the com-
mon characteristics of workflow systems and defines five discrete functional
interfaces through which a workflow management system interacts with its
environment: users, computer tools and applications, other software ser-
vices, etc. Working groups meet individually, and also under the umbrella of
the Technical Committee, which is responsible for overall technical direction
and co-ordination.
WORKFLOW REFERENCE MODEL DIAGRAM
[Workflow Reference Model diagram]
WHY YOU SHOULD JOIN
Being a member of the Workflow Management Coalition gives you the unique
opportunity to participate in the creation of standards for the workflow in-
dustry as they are developing. Your contributions to our community ensure
that progress continues in the adoption of royalty-free workflow and process
standards.
MEMBERSHIP CATEGORIES
The Coalition has three major categories of membership per the membership
matrix following. All employees worldwide are welcome to attend all meet-
ings, and will be permitted access to the Members Only area of our web site.
Full Membership is appropriate for Workflow and Business Process Man-
agement (BPM) vendors, analysts and consultants. You may include up to
three active members from your organization on your application and these
may be replaced at any time by notifying us accordingly.

Membership matrix (Full Member / Associate or Academic Member / Individual Member / Fellow, by election only / Visitor):
Annual fee: $3500 / $1500 / $500 / $0 / $100 per day
Hold office: Yes / Yes / Yes / Yes / No
Nominate somebody for office: Yes / Yes / No / No / No
Committee membership: Yes / Yes / Yes / Yes / Observer
Voting right on standards: Yes / Yes / Active Participants only / Active Participants only / No
Voting right on WfMC.org business: Yes / Current officers only / Current officers only / Current officers only / No
Company reps in meetings without visitor fee: 4 (transferable) / 1 (transferable) / individual only / individual only / Fee required
FULL MEMBERSHIP
This corporate category offers exclusive visibility in this sector at events and
seminars across the world, enhancing your customers' perception of you as
an industry authority: on our web site, in the Coalition Handbook and
CDROM, and through speaking opportunities. It also brings access to the
Members Only area of our web site, attendance at the Coalition meetings
and, most importantly, participation in the workgroups, where through
discussion, personal involvement and your voting power you can contribute
actively to the development of standards and interfaces.
Full member benefits include:
Financial incentives: 50 percent discount on all brochure-ware (such as
our annual CDROM Companion to the Workflow Handbook and advertising
on our sister site www.e-workflow.org), and a $500 credit toward next year's fee
for at least 60 percent meeting attendance per year or if you serve as an
officer of the WfMC.
Web Visibility: a paragraph on your company services/products with
links to your own company website.
User RFIs (Requests for Information): an exclusive privilege of full
members. We often have queries from user organizations looking for
specific workflow solutions. These valuable leads can result in real
business benefits for your organization.
Publicity: full members may choose to have their company logos and
collateral displayed along with WfMC material at conferences and expos
we attend. You may also list corporate events and press releases
(relating to WfMC issues) on the relevant pages of the website, and have
a company entry in the annual Coalition Workflow Handbook.
Speaking Opportunities: we frequently receive calls for speakers at
industry events because many of our members are recognized experts in
their fields. These opportunities are forwarded to Full Members for their
direct response to the respective conference organizers.
ASSOCIATE AND ACADEMIC MEMBERSHIP
Associate and Academic Membership is appropriate for those (such as IT
user organizations) who need to keep abreast of workflow developments, but
who are not workflow vendors. It allows voting on decision-making issues,
including the publication of standards and interfaces, but does not provide
anywhere near the visibility or incentives available to a Full Mem-
ber. You may include up to three active members from your organization on
your application.
INDIVIDUAL MEMBERSHIP
Individual Membership is appropriate for self-employed persons or small
user companies. Employees of workflow vendors, academic institutions or
analyst organizations are not typically eligible for this category. Individual
membership is held in one person's name only, is not a corporate member-
ship, and is not transferable within the company. If three or more people
within a company wish to participate in the WfMC, it would be cost-effective
to upgrade to corporate Associate Membership whereby all employees world-
wide are granted membership status.
FELLOWS
The WfMC recognizes individuals from within its existing membership who
have made sustained and outstanding contributions to WfMC objectives far
and above that expected from normal member representation.
VISITORS
We welcome visitors at our meetings; it is an excellent opportunity for you to
observe first hand the process of creating standards and to network with
members of the Coalition. Your role will be as an observer only, and you are
not eligible for a password, or for special offers available to WfMC members.
You must pre-register and prepay your Visitor attendance fee. If you decide
to join WfMC within 30 days of the meeting, your membership dues will be
credited with your visitor fee.
HOW TO JOIN
Complete the form on the Coalition's website, or contact the Coalition Secre-
tariat at the address below. All members are required to sign the Coalition's
Document of Understanding, which sets out the contractual rights and ob-
ligations between members and the Coalition.
THE SECRETARIAT
Workflow Management Coalition (WfMC)
Nathaniel Palmer, Executive Director,
99 Derby Street, Suite 200
Hingham, MA 02043
+1-781-923-1411 (t), +1-781-735-0491 (f)
wfmc@wfmc.org.
WfMC Officers 2007

STEERING COMMITTEE
Chairman: Jon Pyke, Fellow, UK
Vice Chairman (Europe): Justin Brunt, TIBCO, UK
Vice Chairman (Americas): Keith Swenson, Fujitsu Computer Systems, USA
Vice Chairman (Asia-Pacific): Yoshihisa Sadakane, NEC Soft, Japan

TECHNICAL COMMITTEE
Chair Emeritus: David Hollingsworth, Fujitsu Software, UK
Committee Chair: Keith Swenson, Fujitsu Computer Systems, USA
Vice Chairman (Europe): Philippe Betschart, W4, France
Vice Chairman (Americas): Mike Marin, Fellow, USA
Vice Chairman (Asia-Pacific): Dr. Yang Chi-Tsai, Flowring, Taiwan

EXTERNAL RELATIONS COMMITTEE
Chairman: Ken Mei, Global 360, USA
Vice Chairman (Europe): Martin Ader, W&GS, France
Vice Chairman (Americas): Bob Puccinelli, DST Systems, USA
Vice Chairman (Asia-Pacific): Dr Kwang-Hoon Kim, BPM Korea, South Korea

SECRETARY / TREASURER: Cor Visser, Work Management Europe, Netherlands
INDUSTRY LIAISON CHAIR: Philip Larson, Appian Corporation, USA
USER LIAISON CHAIR: Charlie Plesums, Fellow, USA
WfMC Country Chairs
ARGENTINA
Federico Silva
PECTRA Technology, Inc.
USA: +1 (713) 335 5562
ARG (BA): +54 (11) 4590 0000
ARG (CBA): +54 (351) 410 4400
fsilva@pectra.com
AUSTRALIA & NEW ZEALAND
Carol Prior
MAESTRO BPE Limited
Tel: +61 2 9844 8222
caprior@ozemail.com.au
BRAZIL
Vinícius Amaral
iProcess
Phone: +55 51 3211-4036
vinicius.amaral@iprocess.com.br
CANADA
(Open)
FRANCE
Raphaël Syren
W4 Global
Tel: +(331) 64 53 17 65
raphael.syren@w4global.com
GERMANY
Tobias Rieke
University of Muenster
Tel: +49 251-8338-072
istori@wi.uni-muenster.de
ITALY
Luca Oglietti
Stratos
Tel. +39.011.9500000 r.a.
l.oglietti@gruppostratos.com
JAPAN
Yoshihisa Sadakane
NEC Soft
Tel: +81-3-5569-3399
sadakane@mxw.nes.nec.co.jp
KOREA
Kwang-Hoon Kim
Kyonggi University
Tel: 82-31-249-9679
Fax: 82-31-249-9673
kwang@kyonggi.ac.kr
POLAND
Boguslaw Rutkowski
PB Polsoft
Tel: +48 61 853 10 51
Boguslaw.Rutkowski
@pbpolsoft.com.pl
SINGAPORE & MALAYSIA
Ken Loke
Bizmann System (S) Pte Ltd
Tel: +65 - 6271 1911
kenloke@bizmann.com
SOUTH AFRICA
Marco Gerazounis
TIBCO Software Inc.
Tel: +27 (0)11 467 3111
mgerazou@tibco.com
SPAIN
Elena Rodríguez Martín
Fujitsu Software, Madrid.
Tel. +34 91 784 9565
ermartin@mail.fujitsu.es
THE NETHERLANDS
Fred van Leeuwen
DCE Consultants
Tel: +31 20 44 999 00
leeuwen@dceconsultants.com
TAIWAN
Erin Yang
Flowring Technology Co. Ltd.
Tel: +886-3-5753331 ext. 316
erin_yang@flowring.com
UNITED KINGDOM
(Open)

USA (WEST)
Bob Puccinelli
DST Systems
Tel: +1 816-843-8148
rjpuccinelli@dstsystems.com
USA (EAST)
Betsy Fanning
AIIM International
Tel: 1 301 755 2682
bfanning@aiim.org
WfMC Technical Committee
Working Group Chairs 2007
WG1 - PROCESS DEFINITION INTERCHANGE MODEL AND APIS
Chair: Robert Shapiro, Fellow
Email: rshapiro@capevisions.com

WG2/3 - CLIENT / APPLICATION APIS
(Open)

WG4 - WORKFLOW INTEROPERABILITY
Chair: Keith Swenson, Fujitsu Computer Systems, USA
Email: kswenson@us.fujitsu.com

WG5 - ADMINISTRATION & MONITORING
Chair: Michael zur Muehlen, Stevens Institute of Technology
Email: mzurmuehlen@stevens.edu

WG ON OMG
Chair: Ken Mei, Global 360
Email: ken.mei@global360.com

CONFORMANCE WG
Chair: Michael zur Muehlen, Stevens Institute of Technology
Email: mzurmuehlen@stevens.edu

WGRM - REFERENCE MODEL
Chair: Dave Hollingsworth, Fujitsu
Email: david.hollingsworth@uk.fujitsu.com

WG9 - RESOURCE MODEL
Chair: Michael zur Muehlen, Stevens Institute of Technology
Email: mzurmuehlen@stevens.edu



WfMC Fellows
The WfMC recognizes individuals who have made sustained and outstanding
contributions to WfMC objectives far and above that expected from normal
member representation.
WFMC FELLOW FACTORS:
To be considered as a candidate, the individual must have participated in the WfMC for a period of not less than two years and be elected by majority vote within the nominating committee.
Rights of a WfMC Fellow: Receives guest member level of email support from the Secretariat; pays no fee when attending WfMC meetings; may participate in the work of the WfMC (workgroups, etc.); may hold office.

Martin Ader, France
Robert Allen, United Kingdom
Mike Anderson, United Kingdom
Wolfgang Altenhuber, Austria
Richard Bailey, United States
Justin Brunt, United Kingdom
Emmy Botterman, United Kingdom
Katherine Drennan, United States
Layna Fischer, United States
Mike Gilger, United States
Michael Grabert, United States
Shirish Hardikar, United States
Paula Helfrich, United States
Hideshige Hasegawa, Japan
Dr. Haruo Hayami, Japan
Nick Kingsbury, United Kingdom
Klaus-Dieter Kreplin, Germany
Mike Marin, United States
Emma Matejka, Austria
Dan Matheson, United States
Akira Misowa, Japan
Roberta Norin, United States
Sue Owen, United Kingdom
Jon Pyke, United Kingdom
Charles Plesums, United States
Harald Raetzsch, Austria
Michele Rochefort, Germany
Joseph Rogowski, United States
Michael Rossi, United States
Sunil Sarin, United States
Robert Shapiro, United States
Dave Shorter (Chair Emeritus), United States
David Stirrup, United Kingdom
Keith Swenson, United States
Tetsu Tada, United States
Austin Tate, United Kingdom
Cor Visser, The Netherlands
Rainer Weber, Germany
Alfons Westgeest, Belgium
Marilyn Wright, United States
Michael zur Muehlen, United States


Workflow Management Coalition
Membership Directory
WfMC's membership comprises a wide range of organizations. All members in good standing as of January 2007 are listed here. There are currently two main classes of paid membership: Full Members and Associate Members (which includes Academic). Individual Members are not listed. Each company has only one primary point of contact for purposes of the Membership Directory. Within this Directory, many Full Members have used their privilege to include information about their organization or products. The current list of members and membership structure can be found on our website, wfmc.org.
ADOBE SYSTEMS INC.
Full Member
345 Park Avenue, San Jose CA 95110, USA
Steve Rotter, Senior Product Marketing Manager
Tel: [1] 408-536-6000
srotter@adobe.com
Adobe revolutionizes how the world engages with ideas and information. For more than two decades, the company's award-winning software and technologies have redefined business, entertainment, and personal communications by setting new standards for producing and delivering content that engages people virtually anywhere at anytime. From rich images in print, video, and film to dynamic digital content for a variety of media, the impact of Adobe solutions is evident across industries and felt by anyone who creates, views, and interacts with information. With a reputation for excellence and a portfolio of many of the most respected and recognizable software brands, Adobe is one of the world's largest and most diversified software companies. As demand for digital content skyrocketed, Adobe solutions provided a catalyst for moving ideas from concept through creation to delivery across any digital device. The appointment of Bruce Chizen as Adobe's Chief Executive Officer in 2000 further strengthened the company's market leadership, as Adobe delivered on strategies to move from a desktop software company to a platform provider for enterprises. With its acquisition of Macromedia, Inc. in 2005 (developer of the ubiquitous Flash technology and a pioneer in multimedia and web development), Adobe expanded its strong technology foundation and portfolio of customer solutions.
ADVANTYS SOLUTIONS LTD.
Full Member
1250 Rene Levesque West, Suite 2200 Montreal, Quebec, H3B 4W8 Canada
Alain Bezancon, President
Tel: [1] 514-989-3700
alain.bezancon@advantys.com
Since 1995, ADVANTYS has provided organizations worldwide with a range of innovative, powerful, robust, affordable and easy-to-use web-based software through a practical approach to technology. ADVANTYS' solutions are used daily by hundreds of customers worldwide to automate processes, publish web sites, collaborate and develop web applications.
Today, ADVANTYS' flagship product is the WorkflowGen BPM / Workflow software, already deployed by more than 250 clients internationally. The WorkflowGen BPM / Workflow software is fully web based and enables end users to complete and monitor processes online. .Net Web Forms and PDF forms can be used as electronic forms. The design and implementation of the workflows are realized online via a graphical mapping interface without programming. The WorkflowGen BPM / Workflow software is based around a practical technical solution integrating tried and tested development standards. WorkflowGen's scalable architecture, and its ability to incorporate additional development, provides for easy integration with existing databases and applications, including Microsoft SharePoint and SAP. WorkflowGen is available in 10 languages and distributed in 30 countries.
AIIM INTERNATIONAL
Full Member
1100 Wayne Avenue, Suite 1100, Silver Spring, MD, 20910 United States
www.aiim.org
Betsy Fanning, Director, Standards & Content Development
Tel: [1] 240-494-2682 / Fax:[1] 301-587-2711
bfanning@aiim.org
AIIM International is the global authority on Enterprise Content Management (ECM): the technologies, tools and methods used to capture, manage, store, preserve and deliver information to support business processes. AIIM promotes the understanding, adoption, and use of ECM technologies through education, networking, marketing, research, standards and advocacy programs.
APPIAN CORPORATION
Full Member
8000 Towers Crescent Drive, 16th Floor, Vienna, VA. 22182 United States
www.appian.com
Philip Larson, Director of Product Management
Tel: [1] 703-442-1057
larson@appian.com
Founded in 1999 and headquartered in Vienna, VA, Appian is the first business process management (BPM) company to deliver advanced process, knowledge management, and analytics capabilities in a fully-integrated suite. Designed to extend the value of your existing systems, Appian's process-centric, context-driven solutions align business strategy with execution, and drive quantifiable improvements in business performance. Fortune 500 companies, government agencies, and non-governmental organizations have deployed Appian's award-winning platform, Appian Enterprise, to gain unprecedented visibility and control over their strategic business processes and enable customers to make better-informed decisions about their business.
ARMA INTERNATIONAL
Associate Member
13725 West 109th Street Suite 101, Lenexa, KS 66215 United States
Peter R Hermann, Executive Director & CEO
Tel: [1]913-217-6025 / Fax: [1]913-341-3742
phermann@arma.org
BEA SYSTEMS
Full Member
2315 North First St., San Jose, California, 95131 United States
www.bea.com
Mike Amend, Deputy CTO
Tel: [1] 408-570-8000 / Fax:[1] 408-570-8901
mike.amend@bea.com
BEA Systems, Inc. (NASDAQ: BEAS) is a world leader in enterprise infrastructure software.
BEA delivers the unified SOA platform for business transformation and optimization in order
to improve cost structures and grow new revenue streams.
BEA AquaLogic BPM Suite, an integrated component of BEA's SOA platform, is a market-leading software suite that allows enterprises to integrate modeling, execution and measurement of end-to-end business processes involving complex interactions between people
and IT systems. BEA customers across the world have achieved greater efficiency, control
and agility by using AquaLogic BPM Suite to optimize the business process lifecycle and
improve alignment between business and IT.
BIZMANN SYSTEM (S) PTE LTD
Associate Member
73 Science Park Drive, #02-05, CINTECH I, Singapore Science Park I, Singapore 118254
www.bizmann.com
Ken Loke, Director
Tel: [65] +65-62711911
kenloke@bizmann.com
Bizmann System (S) Pte Ltd is a Singapore-based company with development offices in Singapore and Malaysia, developing business process management (BPM) solutions and providing business process consultation services within the Asia region. Bizmann develops and implements business improvement solutions based on leading development engines such as the award-winning BPM software BizFlow. To further increase functionality and to provide complete end-to-end deliverables, Bizmann enhances the BizFlow development engine by developing additional intelligent features and integration connectors. Bizmann System has set up a Regional PROCESS KNOWLEDGE HUB for the Asia market. Bizmann introduces Best Practices through the Process Knowledge Hub and emphasizes quick deployment. All business process designs/templates are developed by Bizmann as well as imported from the United States and other advanced countries to facilitate cross-knowledge transfers. Bizmann develops and implements BPM applications across all industries. Unlike conventional solutions, BPM solutions address the fundamental process challenges that all companies face. They allow companies to automate and integrate real and disparate business processes safely, and securely extend processes to a wide variety of users via the Web. Bizmann BPM solutions rapidly accelerate time-to-value with configure-to-fit process templates and Bizmann's best-in-class business services, designed to address the unique challenges that companies face.
BPM KOREA SOFTWARE INDUSTRY ASSOCIATION [KOSA]
Full Member
Green B/D. 11F, 79-2, Garakbon-Dong, Songpa-Gu, Seoul 138-711 South Korea
www.sw.or.kr
Kwang-Hoon Kim
Tel: [82] (2) 4054535 / Fax: [82] 2-405-4501
kosainfo@mail.sw.or.kr
BRAINWARE STRATEGIES CONSULTING GMBH
Associate Member
Sonnengass 15 Grafenstein, Carinthia, A-9131 Austria
Roel Krageten
Tel: [43] 664.3070865
brainware@brainware-at.com
CACI PRODUCTS COMPANY
Associate Member
Advanced Simulation Lab, 1455 Frazee Road Suite #700, San Diego, CA 92108
Mike Engiles, SIMPROCESS Product Mgr
Tel: [1] 703-679-3874
mengiles@caci.com
CAPTARIS
Full Member
10085 N.E. 4th Street, Suite 400, Bellevue, WA. 98004 United States
www.captaris.com
Eric Bean, Senior Director, Products Group
Tel: [1] 425-638-4181
ericbean@captaris.com
Captaris, Inc. is a leading provider of software products that automate business processes, manage documents electronically and provide efficient information delivery. Our product suite of Captaris RightFax, Captaris Workflow and Captaris Alchemy Document Management is distributed through a global network of leading technology partners. RightFax is a proven market leader in enterprise fax server and electronic document delivery solutions. Alchemy gives organizations the power to manage and use all of their fixed content, including images, faxes, email, PDFs, and COLD, throughout the information lifecycle management (ILM) stages, with an integrated and scalable set of tools that are easy to deploy and even easier to use. And Workflow provides easy, flexible and integrated business process workflow for organizations, enabling productivity, accountability and compliance.
CCLRC
Associate Member
Rutherford Appleton Laboratory, Chilton Didcot Oxon OX11 0QX United Kingdom
www.cclrc.ac.uk
Trudy Hall, Solutions Developer
Tel: [44] 1235-821900
t.a.hall@rl.ac.uk
CONSOLIDATED CONTRACTORS INTL. COMPANY
Associate Member
62B Kifissias, Marroussi Athens Attiki 15125 Greece
www.ccc.gr
Aref Boualwan, Product Manager
Tel : [30] 6932415177
aboualwan@ccc.gr
DST TECHNOLOGIES, INC.
Full Member
330 W. 9th Street, Kansas City, Missouri 64105 United States
www.dstawd.com
Bob Puccinelli, Director of Marketing AWD
Tel: [1] 816 843-8148 / Fax: [1] 816 843-8197
rjpuccinelli@dstawd.com
AWD (Automated Work Distributor) is a comprehensive business process management,
imaging, workflow, and customer management solution designed to improve productivity
and reduce costs. AWD captures all communication channels, streamlines processes, pro-
vides real-time reporting, and enables world-class customer service. For more than a dec-
ade, AWD has been improving processes in industries including banking, brokerage,
healthcare, insurance, mortgage, mutual funds, and video/broadband. Today, AWD is li-
censed by more than 400 companies worldwide with nearly 140,000 active users. AWD is
provided by the DST Technologies subsidiary of DST Systems, Inc. In business since 1969,
DST Systems was ranked as one of America's Most Admired Companies by Fortune magazine for 2006.
FLOWRING TECHNOLOGY CO. LTD.
Full Member
12F,No.120, Sec.2, Gongdao 5th Rd., Hsinchu City, 300 Taiwan
www.flowring.com
Chi-Tsai Yang, VP and CTO
Tel: [886] 3-5753331 / Fax:[886] 3-5753292
jjyang@flowring.com
FORNAX CO
Associate Member
Taltos u. 1. Budapest 1123 Hungary
www.fornax.hu
Balazs Ferenc Toth, Business Development Consultant
Tel: [36] 1-457-3000 / Fax:[36] 1-212-0111
tamas.vizmathy@fornax.hu
FUJITSU SOFTWARE CORPORATION
Full Member
3055 Orchard Drive, San Jose, CA, 95134-2022, United States
www.i-flow.com
Keith Swenson, Chief Architect
Tel: [1] 408-456-7963 / Fax: [1] 408-456-7821
kswenson@us.fujitsu.com
Fujitsu Software Corporation, based in San Jose, California, is a wholly owned subsidiary
of Fujitsu Limited. Fujitsu Software Corporation leverages Fujitsu's international scope and
expertise to develop and deliver comprehensive technology solutions. The company's prod-
ucts include INTERSTAGE(tm), an e-Business infrastructure platform that includes the
INTERSTAGE Application Server and i-Flow(tm); and Fujitsu COBOL. i-Flow streamlines,
automates and tracks business processes to help enterprises become more productive, re-
sponsive, and profitable. Leveraging such universal standards as J2EE and XML, i-Flow
delivers business process automation solutions that are easy to develop, deploy, integrate
and manage. i-Flow has a flexible architecture that seamlessly integrates into existing envi-
ronments. This allows you to leverage your IT infrastructure investments and allows you to
easily adapt to future technologies.
GLOBAL 360, INC
Full Member
2911 Turtle Creek Blvd. Suite 1100 Dallas, TX 75219 United States
www.global360.com
Ken Mei, Director, International Sales Support
Tel: 1-603-459-0924
ken.mei@global360.com
Global 360 BOS is Process Intelligence for BPM, providing bottom-line BPM benefits with-
out the risk and cost of a BI project, and without relying on a competing application infra-
structure that attempts to obviate existing investments. While most BPM Suites are not
designed to address the management of processes that lie outside of their direct control,
Global 360 BOS is unique because it offers an independent layer that can integrate with BPM Suites and other applications to provide end-to-end process visibility and alignment. Global 360 BOS benefits are focused in four distinct areas:
Visibility: End-to-end visibility into processes that span multiple organizational
functions and supporting system infrastructures.
Alignment: Alignment of operational processes with strategic business goals and
key performance indicators.
Efficiency: Identify optimal tradeoffs between time (service level) and cost, as well
as identify opportunities to increase utilization of human resources.
Agility: React to changing business conditions in real time and ultimately predict
and proactively address issues such as service level degradation.
HANDYSOFT GLOBAL CORPORATION
Full Member
1952 Gallows Road, Suite 200, Vienna, VA 22182, USA
www.handysoft.com
Robert Cain, Product Manager
Tel:[1] 703-442-5635
rcain@handysoft.com
HandySoft Global Corporation is leading the way for companies worldwide to develop new
strategies for conducting business through the improvement, automation, and optimization
of their business processes. As a leading provider of Business Process Management (BPM)
software and services, we deliver innovative solutions to both the public and private sectors.
Proven to reduce costs while improving quality and productivity, our foundation software
platform, BizFlow, is an award-winning BPM suite of tools used to design, analyze, auto-
mate, monitor, and optimize business processes. By delivering a single-source solution,
capable of improving all types of business processes, HandySoft empowers our clients to
leverage their investment across whole departments and the entire enterprise, making
BizFlow the Strategic Choice for BPM.
HITACHI LTD. SOFTWARE DIVISION
Full Member
5030 Totsuka-Chou, Tosuka-Ku, Yokohama, 2448555, Japan
Ryoichi Shibuya, Senior Manager
Tel: [81] 45 826 8370 / Fax:[81] 45 826 7812
shibuya@itg.hitachi.co.jp
Hitachi offers a wide variety of integrated products for groupware systems such as e-mail and document information systems. One of these products is Hitachi's workflow system Groupmax. The powerful engine of Groupmax effectively automates office business such as the circulation of documents. Groupmax provides the following powerful tools and facilities: a visual status monitor shows the route taken and present location of each document in a business process definition; cooperative facilities between servers provide support for a wide-area workflow system; and Groupmax supports application processes such as consultation, send back, withdrawal, activation, transfer, stop and cancellation. Groupmax is rated to be the most suitable workflow system for typical business processes in Japan and has provided a high level of customer satisfaction. Groupmax workflow supports WfMC Interface 4.
INTERWOVEN INC
Full Member
18 East 41st Street, 17th Floor, New York, New York 10017 United States
www.interwoven.com
Steve Tynor
Tel: [1] 212-213-5056 / Fax:[1] 212-213-5352
steve.tynor@scrittura.com
Interwoven Inc. delivers an integrated Java-language, web-based suite of Business Process
Management (BPM), Workflow and Document Management components that are designed
to optimize and streamline the legal, trading and operations areas of financial services insti-
tutions that are burdened with high levels of complex contractual documentation.
IVYTEAM - SORECO GROUP
Associate Member
Alpenstrasse 9, P.O. BOX, CH-6304, Zug, Switzerland
www.ivyteam.ch
Tel: +41 41 710 80 20
heinz.lienhard@ivyteam.ch
Standards-based independent BPMS: easy deployment of the run-time solution directly from the BPMN graphical process model. Rich Web interfaces besides classical HTML make it the tool of choice for process-oriented Rich Internet Applications.
METODA S.P.A.
Associate Member
Via San Leonardo, 52, Salerno 84131 Italy
Giuseppe Callipo
Tel : [39] 0893067-111 / Fax : [39] 0893067-112
g.callipo@lineargruppo.it
NEC SOFT LTD.
Full Member
1-18-6, Shinkiba, Koto-ku, Tokyo, 136-8608, JAPAN
www.nec.com
Yoshihisa Sadakane, Sales and Marketing Senior Manager
Tel: [81]3-5569-3399 / Fax: [81]3-5569-3286
sadakane@mxw.nes.nec.co.jp
OBJECT MANAGEMENT GROUP, INC.
Association Member
140 Kendrick Street, Bldg A, Suite 300 Needham, MA. 02494 United States
Jamie Nemiah
www.omg.org
Tel: [1] 781-444-0404
nemiah@omg.org
OPENWORK
Full Member
Via Conservatorio 22, Milano, 20122 Italy
www.openworkBPM.com
Francesco Battista, Marketing Director
Tel: [39] 02-77297558
francesco.battista@openworkBPM.com
openwork is a pure Independent Software Vendor concentrating all efforts exclusively on its openwork Business Process Management suite. openwork features an original methodology that uses everyday, non-technical business language and approach, introducing high-abstraction tools to map, share and maintain organizations' shape and working rules. Those agile tools also allow organizations' evolutions to be reflected, keeping them always aligned with changing business needs. openwork is then able to act as an interpreter of the graphic representation of organizations' shape and working rules, enabling paper manual processes to come alive as finalized real-world web applications. openwork is the solution to a crucial problem: modelling business organizations and processes while at the same time obtaining suitable, fitting-like-a-glove BPM web applications, cutting down low added-value activities, technical complexity and costs. The openwork suite, which also includes Workflow Management, Document Management and Business Activity Monitoring capabilities, has already been used to build hundreds of complete solutions for customer companies of any sector and size.
PECTRA TECHNOLOGY, INC.
Full Member
2425 West Loop South Suite 200, Houston TX 77027, USA
www.pectra.com
Federico Silva, Marketing Manager
fsilva@pectra.com
Tel: [1] 713-335-5562
PECTRA Technology's award-winning Business Process Management system, PECTRA BPM Suite, is a powerful set of tools enabling discovery, design, implementation, maintenance, optimization and analysis of business processes for different kinds of organizations. PECTRA BPM Suite is an application that automates the processes and the most critical tasks in the organization, generating optimum levels of operational effectiveness. It fulfills all requirements demanded by today's organizations, quickly and efficiently. Furthermore, it
increases the return on previous investments made in technology by integrating all existing
applications. Based on BPM technology it incorporates the concepts of: BAM (Business Ac-
tivity Monitoring) providing management with user-friendly graphic monitoring tools, to
follow up any deviation in the organization's critical success factors, with capabilities to
control and coordinate the organization's performance by means of graphic management
indicators; WORKFLOW offering powerful tools to automate and speed the organization's
business processes, improving communication and work-flow between people working in
different areas; carrying out the work more efficiently and producing customer satisfaction,
lower levels of bureaucracy and cost-reductions in day-to-day operations; EAI (Enterprise
Application Integration) enabling integration with all existing technologies in the organiza-
tion, regardless of their origin or platform, coordinating them to help the organization
achieve its goals more efficiently; and B2Bi (Business to Business Integration) enabling the
control and coordination of each and every link in the organization's value chain, providing
robust tools for business process management, and enterprise application integration, mak-
ing it possible to totally integrate suppliers, clients and partners in an easy and flexible way.
PEGASYSTEMS INC.
Full Member
101 Main Street, Cambridge, MA 02142 United States
www.pegasystems.com
Rosalind Morville, Sr. PR Manager
Tel: [1] 617-374-9600 ext. 6029
pr@pega.com
Pegasystems Inc. (Nasdaq: PEGA) provides software to automate complex, changing busi-
ness processes. Pegasystems, the leader in unified process and rules technology, gives
business people and IT departments the ability to use best processes across the enterprise
and outperform their competition. Our new class of Business Process Management (BPM)
technology makes enterprise systems easy to use and easy to change. By automating policy
manuals, system specifications and lines of manual coding with dynamically responsive
updates, Pegasystems powers the world's most sophisticated organizations to "build for change." Pegasystems' award-winning, standards-based BPM suite is complemented with
best-practice solution frameworks to help leaders in the financial services, insurance,
healthcare, life sciences, government and other markets drive growth and productiv-
ity. Headquartered in Cambridge, MA, Pegasystems has regional offices in North America,
Europe and the Pacific Rim.
PEARSON PLC
Associate Member
1 Lake Street, #3F18, Upper Saddle River NJ 07458 United States
www.pearson.com
Yonah Hirschman, Senior Analyst
Tel: [1] 201-236-7836
Yonah.Hirschman@PearsonEd.Com
PERSHING LLC
Associate Member
One Pershing Plaza, 8th Fl., Jersey City, NJ. 07399 United States
Regina DeGennaro, VP - Workflow Solutions
Tel: 201-413-4588
rdegennaro@pershing.com
PROJEKTY BANKOWE POLSOFT
Full Member
Plac Wolnosci 18, Poznan, 61-739 Poland
Boguslaw Rutkowski
Tel: [48] 61-859-93-11
boguslaw.rutkowski@pbpolsoft.com.pl
Projekty Bankowe Polsoft is part of ComputerLand Group, the second-largest IT company group in Poland, offering high-quality services and products for industry and the public sector. PB Polsoft offers BPB Workflow technology, a highly scalable, standards-based workflow system with strong EAI capabilities, especially for Web Services, rich process and form (XForms) modeling tools, as well as a set of components for building workflow client portal applications based on Java portlet technology. BPB Workflow was developed in J2EE technology and can be used as either an embedded or a standalone server working in an EJB container. It has a well-defined Java WAPI and Web Services interface and offers web-based administration and modeling tools. The workflow engine is cluster-aware and has robust built-in process fail-over and recovery capabilities.
SIEMENS MEDICAL SOLUTIONS
Associate Member
51 Valley Stream Parkway, Mail Stop B9C, Malvern, PA. 19355 United States
Anup Raina
Tel: [1] 610-219-6300
anup.raina@siemens.com
SOFT SOLUTIONS
Full Member
East 33rd Street, 9th Floor, New York, NY 10016, USA
4950, Yonge Street, Ste 400, Toronto, Ontario M2N 6K1, Canada
2, Allee Lavoisier, Villeneuve d'Ascq, France 59650
http://softsolutionsus.com
Walid Daccache, Project Manager
Tel: [33] (3) 2 04 14 19 0 / Fax: [33] (3) 2 04 14 19 9
daccache.walid@softsolutions.fr
Founded in 1989, Soft Solutions is today a leading provider of web-based retail merchan-
dise management and decision-support software, with more than 80,000 active users
worldwide.
Soft Solutions Suite includes a full range of business applications that support the mer-
chandise management functions in the retail business, including Assortment planning,
Pricing and Promotions management, Global Data Synchronization, Forecasting, Advanced
analytics, Optimization and Vendor Funds management. These applications share a com-
mon retail business model foundation, the most flexible and the most comprehensive in the
industry. And, by way of ensuring consistent and auditable execution of business func-
tions, Soft Solutions also provides a fully integrated set of tools such as User and Security
Management, Workflow Management, Data integration, Reporting and Translation.
The company's worldwide client base consists of multi-divisional, multi-format, Tier-1 retailers such as Carrefour, CVS pharmacy, Canadian Tire, Auchan, FNAC, Groupe Louis Delhaize, Intermarché, Casino, Galeries Lafayette, Castorama, Kingfisher, Champion, Match, Provera, Pinault-Printemps-Redoute, Monoprix, and Cora. Soft Solutions' application suite is completely based on Java/J2EE technology, conforms to the latest industry and technology standards (including GS1), and is compatible with multiple databases, industry application server packages and EAI solutions.
SOURCECODE TECHNOLOGY HOLDINGS, INC.
Full Member
4042 148th Ave NE, Redmond, WA 98039, United States
http://www.k2workflow.com
Leah Clelland
Tel: 1 (425) 883-4200 / Fax: 1 (425) 671-0411
Leah@k2workflow.com
TELECOM ITALIA SPA
Associate Member
Via G. Reiss Romoli 274, Torino, Italy 10148
Giovanna Sacchi
Tel: [0039] (01) 12288040
giovanna.sacchi@telecomitalia.it
TIBCO SOFTWARE, INC.
Full Member
3303 Hillview Avenue, Palo Alto, CA 94304 USA
http://www.tibco.com/software/process_management/default.jsp
Justin Brunt, Sr. Product Manager
Tel: [44] (0) 1793 441300 / Fax: [44] (0) 1793 441333
jbrunt@tibco.com
TIBCO Software Inc. (NASDAQ:TIBX) provides enterprise software that helps companies
achieve service-oriented architecture (SOA) and business process management (BPM) suc-
cess. With over 3,000 customers, TIBCO has given leading organizations around the world better awareness and agility (what TIBCO calls "The Power of Now"). TIBCO provides one of the most complete offerings for enterprise-scale BPM, with powerful software that is capable
of solving not just the challenges of automating routine tasks and exception handling sce-
narios, but also the challenges of orchestrating sophisticated and long-lived activities and
transactions that involve people and systems across organizational and geographical
boundaries.
TOGETHER TEAMLÖSUNGEN GMBH
Associate Member
Elmargasse 2-4, Wien, A-1191, Austria
www.together.at
Alfred Madl, Geschäftsführer
Tel: [43] 5 04 04 122 / Fax:[43] 5 04 04 11 122
a.madl@together.at
UNISYS CORPORATION
Full Member
1000 Cedar Hollow Road
Malvern, PA 19335
Shane C. Gabie, Technology Research Director
Tel: +1-610-648-2731
Shane.Gabie@unisys.com
Unisys Corporation (NYSE: UIS) is a worldwide technology services and solutions company.
Our consultants apply Unisys expertise in consulting, systems integration, outsourcing,
infrastructure, and server technology to help our clients achieve secure business opera-
tions. We build more secure organizations by creating visibility into clients' business opera-
tions. Leveraging the Unisys 3D Visible Enterprise approach, we make visible the impact of
their decisions ahead of investments, opportunities and risks. For more information, visit
www.unisys.com.
VIGNETTE CORPORATION
Full Member
1601 South MoPac Expressway, Building 2
Austin, TX, 78746-5776, United States
www.vignette.com
Clay Johnson, Staff Engineer
Tel: [1] 512-741-1133 / Fax:[1] 512-741-4500
chjohnson@vignette.com
Vignette is the leading provider of content management solutions used by the most suc-
cessful organizations in the world to interact online with their customers, employees and
partners. By combining content management with integration and analysis applications,
Vignette enables organizations to deliver personalized information wherever it is needed,
integrate online and enterprise systems and provide real-time analysis of the customer ex-
perience. Vignette products are focused around three core capabilities that meet the needs
of today's e-business organizations: Content Management - the ability to manage and de-
liver content to every electronic touch-point. Content Integration - the ability to integrate a
variety of e-business applications within and across enterprises. Content Analysis - the
ability to provide actionable insight into the customer's relationship to a business.
W4 (WORLD WIDE WEB WORKFLOW)
Full Member
4 rue Emile Baudot, 91873 Palaiseau Cedex, France
www.w4global.com
Philippe Betschart, Marketing Director
Tel: [33] 1 64 53 19 12 / Fax:[33] 1 64 53 28 98
Philippe.Betschart@w4global.com
W4, one of the leading European software vendors specialized in Business Process Management, supplies more than 270 customers, serving more than 1 million people. For almost 10 years W4 has been widely acclaimed for its expertise in Human Workflow, which guarantees, transparently via its functional architecture, task follow-up and traceability: who does what, when and how. Whatever the particular need, there is a package available allowing customers to take full advantage of the powerful W4 technology. Manage the automation of any kind of process: W4 BPM Suite 2006 is a complete package, from modelling to monitoring, dedicated to enterprise process automation. This BPM package is capable of managing the automation of complex work procedures involving high volumes of users, as well as support procedures (finance, HR, etc.) and company-specific procedures. Dedicated to end-users, W4 BPM Suite 2006 provides them with an easy tool for modelling their processes. It also offers managers reporting and supervision functionalities. Optimize how internal business needs are handled: W4 Ready for Business are out-of-the-box packages that capitalize upon the powerful W4 BPM Suite and consulting services to deliver applications that correspond well to each individual customer's particular way of doing business. Thanks to W4 Ready for Business, companies can optimize how they handle purchase, training and recruitment requests.
Embed an OEM component: W4 Embedded Edition is a collection of embeddable software components for business process management (BPM), targeted at software vendors working on all standard market platforms (ERP, Content management, EAI). Just some of those who accelerate their business every day thanks to W4 are Barclays Bank, BNP Paribas, Alcatel Space, Siemens Transportation Systems, Volvo Portugal, EMI Music France, Cap Gemini, Teleca Solutions Italia, TNT, etc.
WORK MANAGEMENT EUROPE
Associate Member
Postbus 168, 3830 AD Leusden, The Netherlands
www.wmeonline.com
Cor H. Visser, Managing Director
Tel: [31] (33) 433 2223 / Fax: [31] (33) 433 2224
cvisser@wmeonline.com
WORKFLOW & GROUPWARE STRATEGIES
Associate Member
37 rue Bouret, Paris 75019 France
Martin Ader
Tel: [33] (1) 4 23 80 81 5
martin.ader@wngs.com
XIAMEN LONGTOP SYSTEM CO., LTD
Associate Member
15/F, Block A, Chuangxin Building, Software Park, Xiamen, 361005, P.R. China
Shou Sheng Ye, R&D Director
Tel: [86] (5) 922-396888
xyyang@longtop.com
Author Biographies
Our sincere thanks go to the authors who kindly gave their time, effort and expertise to contributing papers that cover methods, concepts, case studies and standards in busi-
ness process management and workflow. These international industry experts and
thought leaders present significant new ideas and concepts to help you plan a successful
future for your organization. We also extend our thanks and appreciation to the members
of WfMC Review Committee who volunteered many hours of their valuable time in the
selection of the final submissions and who helped guide the content of the book.
LUIS JOYANES AGUILAR
[ljoyanes@fpablovi.org]
Universidad Pontificia de Salamanca, Campus Madrid, Spain.
Pº Juan XXIII, 3 - 28040 Madrid (Spain)
Phone: (34) 667438400
Luis Joyanes Aguilar (Jaén, Spain) holds a PhD in Computer Science (1997) and in Sociology (1996) from Universidad Pontificia de Salamanca en Madrid. He holds bachelor's degrees in Physics (1977) and Military Sciences (1970). Since 1991, he has been Director of the Languages and Systems and Software Engineering Department at the Universidad Pontificia de Salamanca en Madrid, and since 2001 he has been Director of its Master and PhD Programs. He currently leads research groups in Multidisciplinary Computer Research, Knowledge Management and the Knowledge Society. He is the author of more than 30 books and more than 70 papers in scientific journals and conferences.
FRANCESCO BATTISTA
[francesco.battista@openworkBPM.com]
Marketing Director,
openwork
Via Conservatorio, 22, Milan 20122 Italy
Francesco Battista, a Computer Science graduate, worked from 1994 for Procter & Gamble as a Management Systems Project Manager on Italian and international projects, then moved to major IT consultancy companies with account and project management responsibilities for process reorganizations involving SAP and web technologies. Since 2004 Francesco has been Marketing Director for openwork, the Italian BPM company, where he also leads corporate communications. A published author, he is an active speaker at BPM-related events.
ARNAUD BEZANCON
[arnaud.bezancon@advantys.com]
Chief Technical Officer,
ADVANTYS Solutions Ltd., Canada
1250 Rene Levesque West, Suite 2200 Montreal, Quebec H3B 4W8
Arnaud Bezancon, IT Engineer (Orsay, France), is the CTO of ADVANTYS (France), which he co-founded in 1995 with his brother Alain (BBA, HEC Montreal). ADVANTYS is a leading ISV offering the Smart Enterprise Suite (SES). As CTO, Arnaud manages the software division of ADVANTYS and launched the SES listed in a Gartner Magic Quadrant in 2004. Arnaud designed ADVANTYS' flagship product, the WorkflowGen workflow management system. Arnaud has developed a pragmatic approach to technology based on real-life experience. ADVANTYS' software is used by hundreds of major corporations worldwide.
ROBERT CAIN
[rcain@handysoft.com]
Sr. Product Manager,
HandySoft Global Corporation
1952 Gallows Road, Suite 200,Vienna, VA. 22182 USA
Robert Cain is the Sr. Product Manager at HandySoft Global Corporation, where he is responsible for managing the delivery of BizFlow, the company's award-winning BPM platform, as well as the Compliance (SOX) and OfficeEngine™ BPM solutions. Robert plays a lead role in defining the product roadmap for customers based on leading BPM standards, methodologies and current customer needs. Robert has provided consulting on requirements, re-engineering, implementation and maintenance of BPM projects, and has delivered training on leading BPM methodologies. Prior to joining HandySoft Global Corporation, he held positions at Booz Allen Hamilton and Parsons as Project Manager for the US Air Force Environmental Restoration Program. His experience with environmental restoration and IT technology was used to support various congressional and presidential budgets. Robert holds an MS (Environmental Management) and a BS (Biology) from George Mason University, and an MS from Marymount University.
JORGE CARDOSO
[jcardoso@uma.pt]
Departamento de Matemática e Engenharias
Universidade da Madeira 9000-390
Funchal Portugal
Jorge Cardoso joined the University of Madeira (Portugal) in March 2003. He received his
Ph.D. in Computer Science from the University of Georgia (USA) in 2002. Dr. Cardoso
organized the First, Second, Third, and Fourth International Workshop on Semantic and
Dynamic Web Processes. He has published over 65 refereed papers in the areas of work-
flow management systems, semantic Web, and related fields. He has also edited 3 books
on semantic Web and Web services. He is also on the Editorial Board of several journals.
LINUS K CHOW
[linus.chow@bea.com]
Principal Systems Engineer, Business Interaction Division,
BEA Systems, Inc.
8444 Westpark Drive 6th Floor McLean, VA 22102
Linus Chow is the Principal Systems Engineer for Business Process Management (BPM) for the Public Sector at BEA Systems. He has over 14 years of leadership and management experience in information technology internationally, with 7 years in workflow and BPM. Currently, Linus leads the adoption of BPM solutions for BEA Public Sector customers. Prior to BEA he held senior management positions in BPM services delivery, product management, marketing, and general management. He has played crucial roles in expanding the growth of BPM and workflow adoption, first in the US and then internationally from Australia to Switzerland.
He is a published author and an active speaker on the best practices of BPM worldwide. He is very active with the BPM industry, frequently engaging with Brainstorm, WfMC, BPMI, BPMG, IQPC, AIIM, and other industry organizations. A decorated former US Army Officer, Linus has an MBA, an MS in Management Information Systems, and a BS in Mathematics, as well as several industry certifications.
JONATHAN EMANUELE
[jonathan.emanuele@siemens.com]
Lead Analyst
Siemens Medical Solutions
51 Valley Stream Pkwy, Malvern, PA 19355 USA
Jonathan Emanuele has a B.S. in Operations Research & Industrial Engineering and a
M.Eng. in Computer Science from Cornell University. He is currently the lead analyst for
the Siemens healthcare workflow management team. His poster entitled "Workflow Tech-
nology in Healthcare" won a distinguished poster award at the 2006 AMIA Spring Con-
gress.
LAYNA FISCHER
[layna@wfmc.org]
Editor and Publisher,
Future Strategies Inc.,
2436 North Federal Highway, #374, Lighthouse Point, FL 33064 USA
www.futstrat.com
As the Official Editor and Publisher to the WfMC and Director of the annual Global Awards for Excellence in BPM and Workflow, Layna Fischer works closely with the WfMC to promote its mission with respect to industry awareness and educational content. Ms. Fischer was also the Executive Director of the Business Process Management Initiative (now merged with OMG) and is on the board of BPM Focus (previously WARIA, the Workflow And Reengineering International Association), where she had been CEO since 1994. Future Strategies Inc. (www.futstrat.com), publisher of books and papers on business process thought leadership, specializes in the dissemination of information about BPM and workflow technology, business process redesign and electronic commerce. As such, the company contracts and works closely with individuals and corporations throughout the USA and the world.
Future Strategies Inc. is also the publisher of the business book series New Tools for New Times, as well as the annual Excellence in Practice volumes of award-winning case studies and the annual Workflow Handbook, published in collaboration with the WfMC.
Her experience in the computer industry includes being the president and CEO of a multi-million dollar high-technology export company for seven years, during which time she also founded an offshore franchise distribution company called Computer Direct. Ms. Fischer was also a senior editor of a leading computer publication for four years and has been involved in international computer journalism and publishing for over 20 years. She was a founding director of the United States Computer Press Association in 1985.
GABRIEL FRANCIOSI
[gfranciosi@pectra.com]
Consulting Manager,
PECTRA Technology, Inc., USA
2425 West Loop South, Suite 200, Houston, Texas. USA
Gabriel Franciosi is Consulting Manager of PECTRA Technology. He is an IT specialist with more than 8 years of experience in the field and currently manages PECTRA Technology's Consulting Services worldwide. He is also one of the leading mentors of PECTRA Technology and is responsible for the consulting and deployment areas of PECTRA BPM Suite. He has also worked on several IT projects in Spain, Chile, Colombia, Mexico, and Argentina. Gabriel Franciosi is 27 years old and holds a degree in Systems Engineering from Córdoba State University. He has lectured and given courses on BPM, EAI, and Workflow in different countries, such as Spain, Chile, Colombia and Mexico. He also teaches at different universities in Argentina, and has attended events, seminars, training courses, and conferences all over the world.
DR. SETRAG KHOSHAFIAN
[setrag@pega.com]
VP of BPM Technology,
Pegasystems Inc.
101 Main Street Cambridge, MA. 02142 United States
Setrag is one of the earliest pioneers and recognized experts in Business Process Man-
agement. He has been a senior executive for the past 15 years. Currently he is Vice Presi-
dent of BPM Technology at Pegasystems Inc. He is the strategic BPM technology and
thought leader at Pega. He is involved in numerous initiatives, including BPM Technology
Directions, Enterprise Content Management and BPM; Enterprise Service Bus and BPM;
Collaboration and BPM; Enterprise Performance Management and BPM; Service Oriented
Architectures (SOA); Six Sigma for continuous improvements in BPM projects; and BPM
Maturity Models. Setrag is the lead author of nine books and has numerous publications
in business as well as technical periodicals. He has given seminars and presentations at conferences to technical and business communities. Setrag has also been a professor for the past 20 years. He has taught graduate and undergraduate courses in several universities around the world, where he provides his students a unique combination of academic depth and industry experience. Setrag holds a PhD in Computer Science from the University of Wisconsin-Madison. His latest book is Service Oriented Enterprises, Auerbach Publications, ISBN 0849353602.
RAY HESS
[rhess@cchosp.com]
V.P., Information Management,
The Chester County Hospital
701 E. Marshall Street W. Chester, PA 19380 United States
Ray Hess is the Vice President for Information Management at The Chester County Hospital, a 225-bed community hospital in suburban Philadelphia, PA. He has clinical experience, having provided patient care and rehabilitation for eight years, and administrative experience, having managed Cardiology Services for four years, and he holds a master's degree in Healthcare Administration. He has an Information Management and Process Improvement background, having overseen Decision Support Services for fourteen years, Health Information Management for six years, and Workflow Automation/Clinical Decision Support efforts for over six years. He has spoken to audiences throughout the United States and internationally on Healthcare Process Management theory, concepts, and implementation techniques. He is currently devoting his efforts to workflow automation and clinical decision support efforts associated with the installation of a comprehensive EMR and CPOE system at Chester County Hospital.
LAURA KOETTER
[laura.koetter@siemens.com]
Software Product Analyst,
Siemens Medical Solutions
51 Valley Stream Pkwy, Malvern, PA 19355, USA
Laura Koetter has a B.B.A. in Operations & Information Technology and was The College of William and Mary's Most Distinguished Graduate. She is currently an analyst for Sie-
mens healthcare workflow management, developing clinical process workflows for the
hospital community. Laura previously worked as the lead analyst in a clinical environ-
ment, defining and implementing process optimization and patient safety improvement
workflows for Riverside Health System, a multi-hospital enterprise. Laura has presented
at conferences on the topic of healthcare workflows and has been heavily involved in BPM
efforts in the healthcare industry for several years.
CHRISTIAN KUPLICH
[christian.kuplich@boc-de.com]
Senior Consultant at BOC Germany
Vossstrasse 22, 10117 Berlin, Germany
Christian Kuplich studied Information Systems in Berlin, Germany. Since 2002 he has
been responsible for the customizing and development activities of BOC Germany. As a senior consultant he is involved in several projects in the context of process-based application development, mainly developing methods for model-based specification and implementation of process-based applications following the MDA approach.
SALVATORE LATRONICO
[salvatore.latronico@openworkBPM.com]
BPM Consultancy Director, openwork
Via Conservatorio, 22, Milan 20122 Italy
Salvatore Latronico, a Physics graduate, is one of the founders and inventors of openwork, the Italian BPM company born in 1998. Starting from methodology and architecture definition, he has always played a fundamental role in the evolution of the openwork BPM suite, both in the definition of theoretical and high-abstraction tools and in technical matters. Salvatore is now BPM Consultancy Director for openwork, coordinating all BPM project resources. A published author, he is an active speaker at BPM-related events.
CHRIS LAWRENCE
[ergonology@iafrica.com]
Business architecture consultant,
Old Mutual South Africa
PO Box 66, Cape Town 8000, South Africa
Chris left England for Cape Town in 1996 to co-found Global Edge, a business-
enablement competency in support of international financial services group Old Mutual.
He is now an independent business architecture consultant, specializing in business
process architecture and holistic delivery and change methodologies. His book 'Make
Work Make Sense' was published in 2005 (Future Managers, Cape Town;
www.makeworkmakesense.com), and he has also contributed to international publica-
tions on workflow and enterprise architecture. Chris has designed and implemented solu-
tions in the UK, US and Southern Africa, and holds Master's degrees from Cambridge and London.
KYEONG EON LEE
[kelee@ktf.com]
Manager, e-Management Team
KTF, IT Service Group at IT Planning & Operation Office
He is currently responsible for managing and developing KTF's BPM solution using HandySoft BizFlow. He has interests in standardizing business resources and developing a reposi-
tory of business processes which can be utilized like a library. He received a Bachelor's
degree in Computer Engineering from ChungBuk University and, in 2001, joined
KTFreetel due to the corporate merger with KTM.com where he was responsible for man-
aging EIP (Enterprise Information Portal) and Workflow system. BPM-related projects
include EKP (Enterprise Knowledge Portal), upgrade of BPM system (HandySoft
BizFlow), Project Leader of BPM Advancement Project, Project Leader for Process Asset
Library project, and team member of the e-Management Team in the IT Service Depart-
ment at KTF.
HEINZ LIENHARD
[heinz.lienhard@ivyteam.ch]
Founder of ivyTeam,
Soreco-ivyteam
Alpenstrasse 9, P.O. Box, Zug CH-6304, Switzerland
Heinz Lienhard lives and works in Switzerland at the lovely lake of Zug. With ivyTeam he has successfully brought together the web application and the workflow worlds. He received a Master's degree in electrical engineering from the ETH (Switzerland), a Master's degree in mathematical statistics from Stanford University (California, USA) and a Dr. h.c. from the informatics department of the ETH Lausanne (Switzerland). For many years he headed the central R&D labs of Landis & Gyr Corp., now part of the Siemens group, where he built up important R&D activities in system theory, automatic control, informatics and microtechnology.
DR. AMBUJ MAHANTI
[am@iimcal.ac.in]
Dean (Planning and Administration) and Professor - Management Information Systems,
Indian Institute of Management Calcutta, Diamond Harbour Road, Joka, Kolkata -
700104, West Bengal, India.
Dr. Ambuj Mahanti works as a Professor of Computer Science and Management Informa-
tion Systems at Indian Institute of Management Calcutta, Kolkata, India. His specializa-
tions are in the area of Artificial Intelligence and Heuristic Programming, and Business
Intelligence. He has published extensively in international journals and conferences in-
cluding several publications in Journal of ACM, Theoretical Computer Science, Artificial
Intelligence, Information Processing Letters, etc. His current research interest includes
Combinatorial Auctions, Data Mining, Recommendation Systems, Business Intelligence,
and Workflow Verification.
CHARLES "FLIP" MEDLEY
[charles.medley@bea.com]
Senior Engineer, World Wide Field Operations,
BEA Systems
8444 Westpark Drive, McLean, VA 22105 USA
Charles "Flip" Medley is a Senior Engineer for Service Oriented Architecture (SOA) solu-
tions at BEA Systems. He has been managing and supporting successful SOA, J2EE,
and Java solutions since 1995. Currently at BEA, he guides the strategic SOA invest-
ments made by Federal Agencies and ensures their success across an entire stack of SOA
products from service bus, to directory services, security, etc. He regularly engages with
systems integrators large and small on multi-year projects of significant value.
JAN MENDLING
[jan.mendling@wu-wien.ac.at]
Institute of Information Systems and New Media, WU Wien - Vienna University of Economics and Business Administration, Augasse 2-6, A-1090 Vienna, Austria
Jan Mendling is a PhD student at the Institute of Information Systems and New Media at
the Vienna University of Economics and Business Administration. His research interests
include business process management, enterprise modeling, and workflow standardiza-
tion. Jan has published more than 50 papers and articles and served as a program commit-
tee member in several workshops and conferences. He is co-author of the EPC Markup
Language (EPML) and co-organizer of the XML4BPM workshop series.
DR. ING. JUAN JOSÉ MORENO
[jmoreno@lithium.com.uy]
Director.
LITHIUM Software.
Av. Libertador Lavalleja 1532 Of. 1526. Montevideo - Uruguay
Juan J. Moreno is cofounder of Lithium Software (www.lithium.com.uy), a Business
Process Management and Workflow focused company, holding the intellectual property of
its DocFlow Workflow Management System and PROSimfony Business Process Manage-
ment Suite. He is also a professor and researcher at the Engineering and Technologies Faculty of the Universidad Católica del Uruguay. He holds a PhD in Computer Science, specialized in Software Engineering, from the Universidad Pontificia de Salamanca in Spain. He has dozens of technical and refereed publications, and was awarded third prize as Innovator of the Year 2003 in his country, Uruguay.
NATHANIEL PALMER
[nathaniel@wfmc.org]
President, Transformation+Innovation and
Executive Director, Workflow Management Coalition
99 Derby Street, Suite 200, Hingham, MA 02043 USA
Nathaniel Palmer is President of Transformation+Innovation, as well as the Executive
Director of the Workflow Management Coalition. Previously he was Director, Business
Consulting for Perot Systems Corp, and also spent over a decade with Delphi Group as
Vice President and Chief Analyst. He is the author of over 200 research studies and pub-
lished articles, as well as The X-Economy (Texere, 2001). Nathaniel has been featured
in numerous media ranging from Fortune to The New York Times. He is on the advisory
boards of many relevant industry publications, such as E-DOC, Business Integration
Journal and Business Transformation Journal, as well as the Board of Directors of Asso-
ciation of Information Management (AIIM) NE, and was nominated to represent the Gov-
ernor of Massachusetts on the Commonwealth's IT Advisory Board.
SINNAKKRISHNAN PERUMAL
[krish@iimcal.ac.in]
Doctoral Student,
Indian Institute of Management Calcutta
C-302, Tagore Hall, IIM Calcutta, Diamond Harbour Road,
Joka Kolkata West Bengal 700104 India
Sinnakkrishnan Perumal is currently doing doctoral research in the area of workflow
verification at Indian Institute of Management Calcutta, Kolkata, India. He has published
many book chapters, and presented papers in various conferences. His research interests
include Workflow Verification, Business Process Management, Process Improvement, E-
Governance, Enterprise Architecture and Artificial Intelligence.
ALEXANDER PETZMANN
[alexander.petzmann@boc-eu.com]
Mag.
BOC
Wipplingerstraße 1, Vienna, 1010 Austria
Alexander Petzmann was born in 1971 and studied business administration in Vienna, Austria and Los Angeles, USA. In 1997, he joined BOC and has since focused, as a consultant, on business process management from the business view as well as from the technical view. Currently Mr. Petzmann is focusing on business process and performance portals and takes responsibility for a team of consultants in this field at BOC.
MICHAEL PUNCOCHAR
[michael.puncochar@boc-at.com]
Senior Consultant, Managing Director
BOC Unternehmensberatung GmbH,
Rabensteig 2, Vienna, 1010 Austria
Michael Puncochar was born in 1973 and studied business informatics in Vienna, Austria. After his studies he worked in a process-oriented e-learning project at the department for knowledge engineering (University of Vienna). In this project Michael was responsible for the implementation of a metamodeling tool, which was used to design learning processes. In 2003, he joined BOC and focused on projects in the context of process-oriented software engineering, mostly based on the MDA approach. Besides this he built up a competence team for IT Service and Architecture Management at BOC, as well as a team focusing on process-oriented software engineering.
JON PYKE
[jpyke@theprocessfactory.com]
Chairman WfMC
CTO, TheProcessFactory
Faris Lane, Woodham Surrey KT15 3DN United Kingdom
Jon was the Chief Technology Officer and a main board director of Staffware Plc from August 1992 until the company was acquired by Tibco in 2004. He combines the skills of a business and people manager with those of a technician who has a highly developed sense of where technologies fit and how they should be used. Jon is a world-recognized industry figure, an exceptional public speaker and a seasoned quoted-company executive.
As CTO and Executive Vice President of Staffware Plc, Jon was responsible for a development team split geographically across two countries (the USA and the UK) and four locations. Jon also had overall executive responsibility for product strategy, positioning and public speaking. Finally, as a main board director he was heavily involved in PLC board activities, including mergers and acquisitions and corporate governance, and served as a board director of several subsidiaries. Jon's final piece of work for Staffware was to conceive, design and oversee the development of the iProcess Engine.
Jon has over 30 years' experience in the field of software development. During his career he has worked for a number of software and hardware companies as well as user organizations. When Jon joined Staffware in 1992 as Technical Director, Staffware was a privately held company employing approximately 18 people with an annual turnover of £1.2 million. During Jon's tenure with the company, it was taken public by floating on the London Alternative Investment Market (AIM) in 1995, followed by a full listing on the London Stock Exchange in 2000. The company achieved year-on-year growth of more than 50% (CAGR), employing some 400 people in 16 countries with a £40 million turnover.
Jon has written and published a number of articles on the subjects of Office Automation, BPM and Workflow Technology. More recently, Jon has co-authored a book covering both technical and business aspects of BPM, published by Cambridge University Press under the title Mastering Your Organization's Processes.
Jon is a frequent speaker at international events and is regularly quoted in the national and industry press. He has excellent relationships with the analyst community and senior figures in the computer industry.
Jon co-founded and chairs the Workflow Management Coalition. He is an AIIM Laureate for Workflow and was awarded the Marvin L. Manheim Award for Excellence in Workflow in 2003.
HAJO A. REIJERS
[h.a.reijers@tm.tue.nl]
Technische Universiteit Eindhoven,Department of Technology Management,
PO Box 513, 5600 MB Eindhoven, The Netherlands
Hajo A. Reijers is an assistant professor of Business Process Management at
Eindhoven University of Technology. He received his Ph.D. and M.Sc. in
Computer Science from the same institute and has worked for several
management consultancy firms, most recently as a manager within Deloitte.
His research interests include Business Process Redesign, workflow
management systems, and discrete event simulation. He has published articles in
Information Systems, Journal of Management Information Systems, Computer
Supported Cooperative Work, Omega: The International Journal of Management Science,
Computers in Industry, and other journals.
CLAY RICHARDSON
[crichardson@ppc.com]
Practice Leader, Business Process Management Practice,
Project Performance Corporation
1760 Old Meadow Rd. 4th Floor, McLean, VA 22102 USA
Clay Richardson currently leads Project Performance Corporation's award-winning business process management practice, where he directs process improvement and automation efforts for public and private sector clients, including the U.S. Department of Housing and Urban Development, the Government of Bermuda, and the U.S. Patent and Trademark Office. Prior to joining Project Performance Corporation, Mr. Richardson served as Director of Professional Services with HandySoft Global Corporation, a pure-play BPM software vendor. Mr. Richardson also served as Principal and Co-founder of StrictlyBizness, an e-business consultancy that specialized in developing automated web-based solutions for private and public sector clients. Mr. Richardson is a graduate of Boston University's highly regarded Business Process Management Certificate Program and is a regular presenter at BPM industry conferences and events. In addition, he regularly facilitates business process strategy and architecture workshops for public- and private-sector clients.
STEVE ROTTER
[srotter@adobe.com]
Director of Product Marketing,
Adobe Systems Inc
345 Park Avenue, San Jose CA 95110
Steve Rotter is the Director of Product Marketing for Adobe Systems' Enterprise and Developer Solutions Business Unit, where he helps drive the marketing strategy for Adobe's process management and Rich Internet Application technologies. He has been helping organizations with their business process management initiatives for almost two decades. Prior to joining Adobe, Mr. Rotter was co-founder and Vice President of Marketing of Q-Link Technologies (which Adobe acquired in 2004), one of the pioneers in Business Process Management software. Previously, Mr. Rotter was Managing Partner with Paradigm Research, a business consulting firm specializing in process management and reengineering strategies for the Global 2000. At Paradigm Research, Mr. Rotter led the organization's practice area focused on developing process reengineering methodologies and delivering process redesign solutions. Mr. Rotter has also held numerous industry management positions, including Worldwide Marketing Manager within Motorola's Cellular Infrastructure Group, where his organization received the prestigious CEO Quality Award.
Mr. Rotter is a published author and frequently speaks on the subject of process management at industry events. He holds a Master's degree from Northwestern University's Kellogg Graduate School of Management and serves as a volunteer on the Marketing Advisory Board and Vision Council for World Vision, a non-profit Christian relief and development organization dedicated to helping children and their communities worldwide. Mr. Rotter can be reached at srotter@adobe.com.
DAVID ORENSANZ SANTOS
[david.orensanz@boc-es.com]
Senior Consultant and Managing Director
BOC, Spain
Velázquez 71, Madrid, 28006 Spain
David Orensanz was born in 1973 and studied Computer Sciences at the Universidad Pontificia de Salamanca, Madrid. In 1998, he joined BOC to develop the company in Spain, working as a business and IT consultant. Currently Mr. Orensanz is responsible for projects throughout Spain, Portugal and Latin America regarding Business Processes, ITIL and Performance Management (Balanced Scorecards and KPIs).
MOHAMMED SHAIKH
[mohammed@imagexx.com]
Image X
6144 Calle Real #200, Santa Barbara, CA 93117
Mohammed Shaikh received his Bachelor of Technology and Master of Technology degrees from the Indian Institute of Technology, Bombay, India. He received his MBA and PhD from the University of Utah and also completed the Master of Information Technology requirements offered by AIIM. He is currently President of Image X and has been responsible for the technical development of Image X's Document Management and Workflow system. He received patent no. 7,035,830 for "Method and apparatus for remote filing and recordation of documents"; using the patented technology, Image X has developed Electronic Filing. Image X also developed a "Digital Advanced Care Directive" in collaboration with the CMA (California Medical Association) and MedePass.
FEDERICO SILVA
[fsilva@pectra.com]
Marketing Manager,
PECTRA Technology, Inc., USA
2425 West Loop South, Suite 200, Houston, Texas. USA
Federico Silva is Marketing Manager of PECTRA Technology, Inc. He is responsible for the marketing strategy and corporate communications of PECTRA Technology and leads the overall operations dealing with the positioning of all products, services and solutions offered in its different markets. Federico carries out marketing-related activities and actions, including corporate communications, internal communications, branding, promotions and advertising.
Federico joined PECTRA Technology in April 2003 to assist and collaborate with the US and Canada operations, reporting directly to the CEO. In January 2004, he began researching new markets and reorganizing the company's communications content and channels. In May 2004, he became head of the marketing area.
Before joining PECTRA Technology, Federico worked in the media: for over 10 years he was responsible for editing, producing and hosting several TV and radio shows, as well as writing for the press.
A graduate of Córdoba State University (Argentina), Federico holds a degree in Communications, specializing in print media and institutional communications. He is 30 years old and is fluent in Spanish, English and Italian.
KAI A. SIMON, PHD
[kai@instant-science.net]
Business Process Manager
ALTANA Pharma AG - a Nycomed company
Byk-Gulden-Str. 2, 78467 Konstanz, Germany
Kai Simon has been a member of the BPM community for almost 15 years. He has held
positions in academia, research, consulting and industry in various European countries.
Kai holds a doctoral degree in Informatics from Göteborg University (Sweden).
KEITH SWENSON
[kswenson@us.fujitsu.com]
VP of R&D,
Fujitsu Computer Systems
1250 Arques Ave. , Sunnyvale, CA. 94085 USA
Keith Swenson is Vice President of Research and Development at Fujitsu Computer Systems Corporation for the Interstage family of products. A pioneer in web services, he has helped develop standards such as WfMC Interface 2, OMG Workflow Interface, SWAP, Wf-XML, AWSP and WSCI, and is currently working on standards such as XPDL and ASAP. He has led efforts to develop software products to support work teams at MS2, Netscape, and Ashton Tate. He is currently the Chairman of the Technical Committee of the Workflow Management Coalition. In 2004 he was awarded the Marvin L. Manheim Award for outstanding contributions in the field of workflow.
Mr. Swenson holds a Master's degree in Computer Science and a Bachelor's degree in Physics from the University of California, San Diego. From 1995 to 1997 he served as Vice Chairman of the ACM Special Interest Group on Group Support Systems (SIGGROUP). In 1996, he was elected a Fellow of the Workflow Management Coalition.
DR. JUAN J. TRILLES
[juanjo.trilles@grupoauraportal.com]
President,
AuraPortal BPMS
Germanías 84, Gandía 46702, Spain
Dr. Juan J. Trilles holds a Ph.D. in Engineering and Science and a Master's in Business Administration from Madrid University. In 1980 he founded Dimoni Software, which became a highly respected Spanish software company.
Today, he is President and Chief Software Architect of AuraPortal BPMS (www.auraportal.com), a company with a presence in more than 15 countries in Europe and America that for the last six years has been developing a BPMS with Business Rules and Document Handling that executes processes directly from their model without any programming effort. Important customers such as PEMEX (Petróleos Mexicanos), one of the 50 largest corporations in the world, use AuraPortal BPMS to manage their processes.
Dr. Trilles has authored several papers on innovative ideas (e.g., Domain Cellular Accounting) applied today in enterprise management, which have been published in several countries.
WIL VAN DER AALST
[w.m.p.v.d.aalst@tue.nl]
Technische Universiteit Eindhoven, Department of Mathematics and Computer Science
PO Box 513, 5600 MB Eindhoven, The Netherlands
Prof.dr.ir. Wil van der Aalst is a full professor of Information Systems at the Technische Universiteit Eindhoven (TU/e), holding a position in both the Department of Mathematics and Computer Science and the Department of Technology Management. Currently he is also an adjunct professor at Queensland University of Technology (QUT), working within the BPM group. His research interests include workflow management, process mining, Petri nets, business process management, process modeling, and process analysis. Professor Wil van der Aalst has published more than 60 journal papers, 10 books (as author or editor), 150 refereed conference publications, and 20 book chapters.
IRENE VANDERFEESTEN
[i.t.p.vanderfeesten@tm.tue.nl]
Technische Universiteit Eindhoven, Department of Technology Management,
PO Box 513, 5600 MB Eindhoven, The Netherlands
Irene Vanderfeesten is a Ph.D. student at Eindhoven University of Technology. She received her Master of Science degree in Computer Science from the same university (2004) and is now working on the project "Intelligent software tools for workflow process design" in the Information Systems group of the Department of Technology Management. Her research interests include workflow management, business process redesign, product-based workflow design, and human aspects of information systems. She received the BPTrends Best Student Paper Award 2004 and has published papers at several conferences and workshops.
Additional Workflow and
BPM Resources
NON-PROFIT ASSOCIATIONS AND RELATED STANDARDS RESEARCH ONLINE
AIIM (Association for Information and Image Management)
http://www.aiim.org
AIS Special Interest Group on Process Automation and
Management (SIGPAM)
http://www.sigpam.org
BPR On-Line Learning Center
http://www.prosci.com
BPM Focus
http://bpmfocus.org/
Business Process Management Initiative
http://www.bpmi.org (see Object Management Group)
IEEE (Institute of Electrical and Electronics Engineers, Inc.)
http://www.ieee.org
Institute for Information Management (IIM)
http://www.iim.org
ISO (International Organization for Standardization)
http://www.iso.ch
Object Management Group
http://www.omg.org
Open Document Management Association
http://nfocentrale.net/dmware
Organization for the Advancement of Structured Information
Standards
http://www.oasis-open.org
Society for Human Resource Management
http://www.shrm.org
Society for Information Management
http://www.simnet.org
Wesley J. Howe School of Technology Management
http://attila.stevens.edu/workflow
Workflow And Reengineering International Association (WARIA)
http://www.waria.com (now BPM Focus)
Workflow Management Coalition (WfMC)
http://www.wfmc.org
Workflow Portal
http://www.e-workflow.org
Index
A
Advance Medical Directives, 174
ALTANA Pharma, 147, 150, 155
ASAP, 11, 13
Asynchronous Service Access Protocol. See ASAP
B
BAM, see Business Activity Monitoring
Base Subgraph, 228
Basel II, 211, 212, 216, 222
Bed Management workflow, 134
BPDM, 11
BPEL, see Business Process Execution Language
BPEL4People, 28
BPM Center of Excellence, 73-79, 81, 84
BPM COE, see BPM Center of Excellence
BPM Maturity Model Strategy, 83
BPM Maturity Models, 73
BPM Reference Model, 8
BPMN, see Business Process Modeling Notation
BPMS Suite, 75
BPRI, see Business Process Runtime Interfaces
Business Activity Monitoring, 69, 127, 136, 147, 209
Business Agility, 128, 129
Business Architecture, 91
Business Intelligence, 73
Business Monitoring Framework, 110
Business Performance Management, 67
Business Process Execution Language, 10, 11, 12, 13, 14, 179, 186
Business Process Modeling Notation, 9, 10, 11, 12, 13, 14, 179, 193, 195, 248, 250, 261, 277
Business Process Management Suites, 75, 85, 93, 203
Business Process Oriented Architecture, 85
Business Process Reengineering, 147, 156
Business Process Runtime Interfaces, 11
Business Rules, 211, 212, 221
C
Case Based Reasoning, 204
Clinical Decision Support, 158
COE Roles, 76
Cohesion/cohesion metric, 180, 182, 190
Conformance checking, 185
Control-Flow Complexity, 182
Core processes, 56
Coupling /Coupling-cohesion ratio, 180, 182
CRM, 60
D
Decision points, 228
Definition
Business Process, 18
Business Process Management, 19
Workflow, 18
Degree metrics, 181
Density, 181
departmental workflows, 86
Digital Certificate, 167
Document Management System, 118
E
ebXML BPSS, see ebXML Business Process Specification Schema
ebXML Business Process Specification Schema, 11, 170
EKP, see Enterprise Knowledge Portal
electronic signature, 118
Emergency Department workflows, 146
Enterprise Architecture, 191, 192, 193, 202
Enterprise Knowledge Portal, 60
Enterprise Performance Management, 60, 66, 73, 84
enterprise service bus, 46
ERP, see Enterprise Performance Management
Execution Graph, 108
F
Federal Enterprise Architecture, 52
Forrester Research, 17, 27
G
Gartner, Inc., 159
Global Excellence Awards for BPM and Workflow, 133
Global Justice XML Data Model, 170
Graph reduction rule method, 227
H
Health Level 7 Organization, 170
Healthcare BPM Process Opportunities, 161
healthcare industry, 157, 158, 159, 164, 165, 167, 170, 171
HIPAA compliance, 167
Hollingsworth, David, 8, 243
Human Interaction Management, 30
Human Workflow, 85
I
Infection Control workflow, 134, 139, 140, 145
Infinite loops, 225
Integrated Design Environment, 39
Integration with BPM, 59-66
Iteration table, 232
K
Kane County, 100
Key Performance Indicators, 67
Knowledge Maps, 205
Knowledge-Intensive Business Processes, 28
KT Freetel Co. Ltd., 55, 65
L
Lack of synchronization, 225, 231, 239
Lines of Code (LOC), 183
M
Mahanti-Sinnakkrishnan Cyclic Workflow Verification (MSCWV) Algorithm,
227, 228
Managing process change, 58
McCabe's cyclometric number, 182
MDA, 105
Model-based analysis, 185
Modularity, 180
N
natural language, 208
Node Constructs, 224
O
OASIS Open Document Format for Office Applications (ODF), 9
Object Management Group, 105
Object Oriented Programming, 35
ODF. See OASIS Open Document Format for Office Applications
OOP. See Object Oriented Programming
open standards, role of, 9
Oversight and Performance Assurance, 59
P
Patterns, 205
performance indicators, 72
Petri Net, 227
pharmaceutical industry, 147
PMLC, 103, 104
Process Classification, 57
process instance pattern, 205
Process Management Life Cycle, 103
Process mining, 185
Process Prioritization, 57
Process Selection criteria, 57
processes to automate first, 56
process-is-software approach, 129
Product Data Model, 182
R
Random graph generator, 238
RAROC, 212, 218
real-time prototyping, 117
retirement tsunami, 33
RIAs. See Rich Internet Applications
Rich Internet Applications, 97-102, 202
S
sales order workflow process, 231
Sarbanes-Oxley Act, 149
Service Oriented Architecture, 28, 33-37, 44-53, 73, 81, 84, 85, 91-96, 106,
147, 193
Service Oriented Enterprise (SOE), 73
Services Process Management, 28
Shorter, David, 7
SOA, see Service Oriented Architecture
SOAP, 92
standards
workflow, 283
stovepiped technologies, 60
Structural Conflict Errors, 224
structural correctness, 223
T
The Workflow Reference Model 10 Years On, 243
TNT Global Express Italy, 117
U
Unified Modeling Language, 105
W
Weighting significance, 57
WF Nets, 227
Wf-XML, see Workflow eXtensible Markup Language
Workflow API (WAPI), 283
Workflow eXtensible Markup Language, 11, 13
Workflow Graph, 108
workflow graph, 223
workflow patterns, 227
Workflow verification, 223
Workflow Management Coalition (WfMC)
definition, 283
membership, 284
X
XML Process Definition Language, 160, 193, 246
XPDL. See XML Process Definition Language