BPM
Oracle BPM 12c Gateways
ADF Runtime Interface Generator for BPM Human Tasks
Top 10 Things You Should Know About BPM 11g/12c
Edition IV
January 2015
The SOA Magazine IV edition focuses on Industrial SOA articles, which show that SOA is much more than a web service. In their SOA Myth Busters article, Rolando Carrasco and Arturo Viveros trace the evolution of the SOA Suite towards a complete platform over the last 10 years. An industrial SOA platform also contains API management to secure web services, as well as B2B as a trading hub between external partners.
What is the use case for Business Process Management versus Service-Oriented Architecture? SOA is mainly used for data mediation and process orchestration between different IT systems. BPM is focused on the automation of human-based processes, such as an employee holiday request. The key is to reuse your SOA web services in your BPM deployment. In our holiday request example, we can reuse a web service from the HR system which informs the employee of the available holidays. On the BPM side, in this magazine edition you will learn more about gateways and how Link Consulting is using a generator to create human tasks. Thanks to Mark Foster, who highlights 10 best practices for SOA Suite and BPM Suite in his article: a must-read!
This quarterly newsletter is for both customers and partners who are active in the SOA space. The content includes articles for IT decision makers, architects and developers. The goal of the newsletter is to update you on the latest SOA technologies, market studies, trainings & certifications and conferences. We publish the newsletter in English, and some articles will be in Portuguese, especially for the Brazilian market, thanks to Ricardo Puttini and his team. Contribution is open to everybody, and we want to publish your content: case studies, best practices, technical examples and solutions, and conference reports. Feel free to submit your content to Marcello Marinho (Portuguese articles) or Jürgen Kress (English articles).
SOA Magazine IV
Table of Contents
BPEL vs OSB
In this episode, we will dive into one of the hottest debates Oracle SOA practitioners have been having over the years: BPEL versus Service Bus. Can and should they work together? Is one of them better than the other? Are there any well-founded guidelines that I can rely on in order to decide between them? And what about SOA Suite 12c? Around this subject there seem to be plenty of myths, misunderstandings and misconceptions, so let's get to it and uncover as much of the truth as possible.
Now, it happens that Oracle has a product by the name of Oracle BPEL Process Manager and another named Oracle Service Bus. That's what can add a little more confusion to the comparison. And the question "should I use BPEL or Service Bus?" is a common one among Oracle SOA professionals and, we think, among the professionals of any other software company that supports these two technologies.
We remember our first years with these technologies, back in 2003-2004, when Oracle first released Oracle BPEL PM. In those days, Oracle already had something to offer in the integration space: Oracle InterConnect. But it was not that popular. When Oracle released BPEL PM, it changed everything. It changed many professional careers, and it changed the market. Oracle didn't yet have a product quite resembling an ESB, but it did have Oracle BPEL PM. With that product it was able to compete in the orchestration space against players like BEA WebLogic Integration, TIBCO and webMethods.
Ever since those days, Oracle BPEL PM (we're referring to the tool, not the standard) was capable of routing calls, enriching messages, virtualizing access to other services, and transforming data models and formats through adapters to EIS systems like SAP, Oracle EBS, PeopleSoft, Siebel, mainframes, etc. Oracle hadn't yet started its spree of buying PeopleSoft, Siebel, etc., so there was no tight integration with them. It was just adapters and this tool, which was able to communicate with them easily.
SOA Magazine IV
As time went by, a combination of market pressure and industry trends caused Oracle to include in the 10g release of SOA Suite (in 2007) something called Oracle Enterprise Service Bus (OESB). Then Oracle could finally say: we do have an Enterprise Service Bus as part of our stack.
As fancy as it sounded, this early version of Oracle's ESB was somewhat limited and never really qualified as a best-of-breed product. Furthermore, it never sat well with architects, developers and administrators, who found it a tad complicated and unfriendly.
So, it seems like we've identified a first milestone in this BPEL vs OSB discussion: an early version of the SOA Suite we've come to know (10g), which included:
- An already robust and well-liked product based on an up-and-coming industry standard (BPEL)
- A below-average ESB as an optional component of the suite
At the time, Oracle ESB (OESB) wound up not being heavily used in most integration projects outside of AIA implementations. Given the state of industry requirements at the time, BPEL could exist and function mostly on its own.
Then in 2008 Oracle bought BEA Systems, and with this acquisition we come to a second relevant milestone: the release of Oracle SOA Suite 11g in mid-2009. This new and revamped stack introduced some very relevant changes:
1. Oracle WebLogic Server became the runtime platform for all of the tools included in the Suite.
2. Oracle Service Bus (based on the former BEA AquaLogic Service Bus) came along as a true ESB, capable of positioning itself as one of the market-leading players.
3. 10g's ESB (OESB) was rebranded as Mediator and stayed as an optional component of the Suite.
This proved to be a great move by Oracle, and from the technology/architecture standpoint it gave each component its proper weight. Now the Enterprise Service Bus standards and capabilities were very well identified inside Oracle Service Bus, and if your SOA initiative included orchestration, business activity monitoring or business rules, the Oracle SOA Suite with the rest of the components was an excellent option.
With this new stack, OSB could even be licensed on its own for clients that wanted the ESB alone as a cornerstone of their SOA implementation, though the most usual scenario is for customers to have both the SOA Suite and OSB as parts of a multifaceted and well-complemented SOA toolkit.
SOA Magazine IV
On the other hand, OESB's rebranding stirred some confusion, and otherwise felt a little like a demotion for an incumbent product that didn't quite stick and was replaced by a more powerful tool.
Finally, coming back to the present, we find ourselves months removed from a third big milestone: SOA Suite 12c. In this new major release, the stack itself has been realigned according to current industry trends (Industrial SOA, developer productivity, cloud and mobile). The tools are now more integrated than ever, and Oracle's intention is clearly to revolutionize the way we approach and implement SOA.
In SOA Suite 12c it has become much more evident where and when to use BPEL or Service Bus, and when to use them together. According to Oracle, there should be less confusion around this. And given the way the IDEs, engines and monitoring are integrated in this release, the architecture definitions should be clearer than ever.
So let's summarize everything that we just mentioned:
Year 2003-2007. BPEL: acquisition of Collaxa by Oracle; Oracle BPEL PM is released and quickly becomes a robust orchestration product. ESB: no dedicated ESB product in the stack yet.
Year 2007-2009. ESB: Oracle Enterprise Service Bus (OESB) ships as an optional component of SOA Suite 10g.
Year 2009-2013. BPEL: the JDeveloper BPEL designer is vastly improved, facilitating the SOA developer's work. ESB: Oracle Service Bus (OSB) leverages the former ALSB and is positioned immediately as a full-fledged ESB with lots of capabilities; the design-time tool for OSB is Eclipse-based (OEPE).
Year 2014 and onwards. Oracle BPEL PM and OSB have finally converged into an integrated developer environment (JDeveloper) with the release of SOA Suite 12c. The concepts of Industrial SOA, Developer Productivity, Mobility and Cloud Integration drive the improvements in the products, and have been applied equally to both tools, making them more compatible and complementary than ever.
Looking at this timeline, one can easily see where a lot of confusion could have arisen despite Oracle's best efforts, especially before 12c:
- BPEL developers that have been working with the tool for a long time just love it, and have seen the product grow and evolve in an orderly and standardized fashion, with a constant IDE. This is in contrast with an ESB that has gone through many changes and has not been as easy to get familiar with. This can lead to radical opinions that are not at all uncommon, like "BPEL is unquestionably better than the Service Bus", where a sound product like OSB ends up being disqualified for all the wrong reasons.
- SOA professionals that were already used to working with BEA WebLogic Server, WLS Workbench, AquaLogic, Fuego, etc., are much more prone to like OSB and to understand its potential and capabilities. They may even find it easier and more dynamic than BPEL PM due to familiarity with the IDE and web console. "OSB is just friendlier than BPEL and Eclipse is a much better IDE than JDev": it wouldn't be strange to hear an assessment like this from an accomplished developer with a background in BEA technology.
- People that are newer to Oracle FMW's stack always seem to be wondering which one of the products is the best alternative, whether they are making the right choice, and even whether they are over-utilizing or under-utilizing one of them. In this case, rather than understanding the products as complementary to each other, they end up unnecessarily stressing out and questioning themselves over which one to choose.
- Architects and programmers that used to work with the tools from a different vendor (IBM, Software AG, TIBCO, etc.) and are now working with Oracle FMW usually have trouble identifying the stack at first glance, so they tend to gravitate towards the product which seems more familiar and less problematic to them, and to stay away from the rest based on their past good or bad experiences. This kind of scenario often leads to the most extreme opinions, such as: "BPEL just doesn't work at all and its use implies a terrible danger for the client; we'll be better off using exclusively a Service Bus". Just imagine the turmoil a statement like this can produce, especially when pronounced by an expert with a long track record in integration projects.
But is this controversy really just a matter of perception, past experience, or even post-traumatic stress disorder? We've already dived enough into the history, so let's now have a look at some other kinds of facts.
Here we'll have a look at some of these patterns and whether they are better addressed by BPEL, by Service Bus, or by both:
SOA design patterns considered in the comparison between BPEL and OSB (each supported by one tool or by both):
- Data Model Transformation
- Data Format Transformation
- State Repository
- Rules Centralization
- Process Abstraction
- Process Centralization
- Asynchronous Queuing
- Intermediate Routing
- Event-Driven Messaging
- Protocol Bridging
- Atomic Service Transaction
- Compensating Service Transaction
- Reliable Messaging
- Policy Centralization
SOA Magazine IV
As we can see in the table, each tool has its own set of relevant capabilities, some of them shared by
both. JCA Adapters can also help us extend the native functionality of the products, especially in the
case of BPEL.
Based on the information we've just analyzed, we can come up with a collection of fairly accurate guidelines regarding the use of one product or the other for certain scenarios:
BPEL
Service virtualization using OSB
Asynchronous queuing scenario with one-way invocation of a BPEL process through OSB
Oracle SOA Suite 12c, if anything, encourages SOA Practitioners to use BPEL and OSB together
in order to design and implement robust solutions with the ability to provide the required
flexibility, SLA compliance and transactional capacity the industry looks for.
The following improvements (among many others) seem to be the most relevant in this regard:
Conclusion
While there indeed was a time when one product was more robust and mature than the other in the Oracle FMW stack (BPEL > Service Bus), this is not true anymore. Even in 11g this point was arguable to some degree, but in 12c both tools have been equally upgraded and aligned with the industry trends mentioned before.
Some of the controversy between these tools, especially among developers, had to do with the presence of different IDEs for BPEL PM and OSB. This proved to be a very impractical situation that has already been taken care of in 12c. JDeveloper is now the IDE of choice for developing Oracle SOA Suite components, whether with BPEL or Service Bus.
Even though BPEL and Service Bus have always been able to work together, in the past there was no pressing need to design solutions based on such a combination. Moreover, the separation of concerns between the tools was not as clear as it is now, thanks to the maturation of SOA methodology. Nowadays, especially with SOA Suite 12c, BPEL and OSB make a terrific and necessary combination: a dynamic duo whose combined capabilities will let us face and overcome the multiple challenges of Industrial SOA, as well as reap its significant benefits.
We certainly hope that this body of work has been interesting, useful and enjoyable for you, appreciated reader. Let's meet again in our next episode! The SOA Myth Busters
Rolando Carrasco
Rolando Carrasco is a SOA architect, co-founder and part of the S&P Solutions team in Mexico and Latin America. He's been working with Oracle SOA since 2003/2004, and his professional career has been focused on the integration space. He worked for HP and Oracle.
At Oracle he was part of the Product Management team, with responsibilities in the Latin American region. Rolando is also co-director of ORAMEX, the Oracle Users Group in Mexico. This user group is focused on Oracle technology and since 2012 has been coordinating activities oriented to delivering events for the community, among other things coordinating the OTN Tour.
The projects in which Rolando has participated involve the usage of SOA, BPM, WebCenter, Identity Management, WebLogic and Exalogic. All of those technologies have been implemented successfully for different customers, industries and even countries (Costa Rica, Peru, and Honduras).
Rolando has several published articles and videos on his own blog (oracleradio.blogspot.com) and also on the Oracle Technology Network in Spanish.
Rolando is also an Oracle ACE.
Contact:
Blog
Arturo Viveros
Arturo is an outstanding professional currently based in Mexico City, with 10 years of experience in the development, design, architecture and delivery of IT projects for a variety of industries. He is also a regular speaker at technology conferences, both in Mexico and abroad. He is an Oracle ACE Associate and works as a senior IT architect at S&P Solutions.
Arturo is also part of the coordinating committee of ORAMEX (the Oracle User Group in Mexico) and has recently achieved the Oracle SOA Certified IT Architect certification, as well as the Cloud Certified Architect and SOA Certified Architect grades from Arcitura. He is a certified trainer authorized to deliver the SOA School and Cloud School modules in both English and Spanish.
Arturo's technical articles are frequently published on Oracle OTN (Spanish), and his blog (oracletechnocore.blogspot.com) also includes several articles about Oracle technology in both languages.
Contact:
SOA Suite/B2B as a Critical Mission Hub for a High Volume Message Use
Case
Introduction
Stop. Think. In the meantime two seconds have passed, and 250 more messages were processed by a mission-critical hub built with Oracle B2B and SOA Suite, which connects thousands of trading partners and processes millions of messages per day, handling 40% of global air cargo traffic.
In this article you will find a description of the high-availability solution architecture, covering B2B and core SOA Suite components such as BPEL, along with Business Rules, Mediator and BAM integration, as well as the lessons learned in conducting such a complex and mission-critical project starting from a set of legacy applications.
Now imagine how many messages will have been processed by the time you finish reading it.
Mission and scope
The mission was to build a highly available and performant message hub between the different participants in the air cargo industry, integrating 15 thousand trading partners exchanging 3 billion messages every year, and to execute complex document validation, multi-factor identification, correlation and batching, dynamic routing, transformations and monitoring.
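As a back-of-the-envelope check, the yearly volume quoted above can be converted into the daily and per-second rates mentioned in the introduction. The 3-billion figure comes from the article; the class and method names below are ours:

```java
public class HubThroughput {
    // Volume figure from the article: ~3 billion messages exchanged per year.
    static final long MESSAGES_PER_YEAR = 3_000_000_000L;

    // Roughly 8.2 million messages per day ("millions of messages per day").
    static long perDay() {
        return MESSAGES_PER_YEAR / 365L;
    }

    // Roughly 95 messages per second, the same order of magnitude as the
    // "250 messages in 2 seconds" quoted in the introduction.
    static long perSecond() {
        return MESSAGES_PER_YEAR / (365L * 24L * 60L * 60L);
    }

    public static void main(String[] args) {
        System.out.println(perDay() + " messages/day, " + perSecond() + " messages/second");
    }
}
```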
The document standards involved made the core B2B product the natural choice for the implementation, due to its enhanced document-management features. The main document standards are standards defined by the air cargo market, such as IATA Cargo Interchange Message Procedures (CIMP), EDI CargoFACT and CargoXML, as well as other customer-tailored documents based on flat files, XML and other custom formats, totalling 238 different document types. Another important and distinctive requirement was the relation between the envelope types and the exchanged messages. In this respect, except for the standardized formats (EDI, XML), which implicitly define the envelope type to follow, all the others, including CIMP, are decoupled from the type of envelope that can wrap the message; and we are talking about 99 different and customizable envelope types, built to be used with any document format.
The transport channels that can be used for message exchange are quite varied as well: trading partners can communicate with the platform using 8 different transport protocols, both inbound and outbound.
All of these enhancements were made without neglecting the flexibility needed to evolve the platform: onboarding new customers, message types and envelope types, achieving the extensibility required in a constantly changing market where the speed of adding capabilities is paramount.
- Support for multiple document protocols such as EDI, XML, flat file and custom-defined formats
- Document definition tools with the possibility to implement rich and complex validation rules (Oracle Document Editor, plus custom editors for custom documents)
On the document management side, it is the ideal platform to leverage business standard documents and manage agreements. However, Oracle B2B acts as a gateway, requiring an architecture to manage the complete end-to-end business processes. This is where Oracle SOA Suite Service Component Architecture (SCA) components come into the picture. Oracle BPEL, Mediator and Business Rules play the major roles as back-end pieces in the global architecture.
Oracle B2B is in fact a component of Oracle SOA Suite, which makes the integration quite simple and efficient. The integration between the two is ensured via an out-of-the-box adapter implemented as a binding component in an SCA composite. All of this is leveraged by the SOA Metadata Services repository (MDS), which allows sharing artefacts such as XML schemas or WSDL files among the different architectural components, and acts as a central repository for B2B artefacts and configuration.
SCA composites were ultimately used to implement the following capabilities:
Oracle B2B and SOA Suite, together provide a natural and integrated architecture that enables a unified
platform with end-to-end instance capabilities, empowering standardization, governance, and security.
Solution architecture
B2B as a hub acts as both the inbound and the outbound gateway, sharing the same configuration for both directions. Trading partners are connected via inbound listening channels, and via trading partner channels for delivery. In the message exchange process, B2B is responsible for identifying both the sender and the recipient(s) of the message, identifying the message type, and verifying that the sender is allowed to send that type of message, that the recipient is able to receive it, and in which format. Message validation and parsing (from raw to XML) for inbound messages and message construction (from XML to raw) for outbound messages are also performed in the B2B domain.
The host trading partner is the virtual receiver of all incoming documents and the virtual sender of all outgoing documents, eliminating any point-to-point relation between trading partners and documents. All of this enables document normalization, allowing the definition of message classes and facilitating the usage of common objects between message types and formats.
Apart from the functionalities mapped to B2B and the SOA SCA composites, integration across the SOA platform naturally allows implementing end-to-end process tracking and exception handling, with all exceptions delivered to the SCA layer to be handled properly. In this particular case, the solution covers acknowledging to the sender that the message failed, together with the reasons behind the exception. This is an extremely valuable solution feature, since most exceptions are typically generated from invalid documents. It allows the sender to correct and fix the messages interactively, reducing the time needed to implement such corrections.
This also helps customers conform to the market standards, reducing custom-made message definitions.
Inbound
Transport callouts are assigned to the inbound/outbound B2B channels and are responsible for parsing the envelope into a native B2B internal properties structure. Then, two possible paths can be followed for document parsing:
This functionality offers the possibility to implement custom classes to deal with document types that are not covered out-of-the-box. In this implementation, a custom parser was built to generate XSD and Java classes from the standard Augmented Backus-Naur Form (ABNF) language; these can be associated as a B2B document callout to dynamically execute parsing, construction and validation of documents. Please bear in mind that a good architecture for these callouts is paramount to guarantee adequate performance when processing such documents.
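The dispatch idea behind such document callouts can be sketched as a small registry of per-document-type parsers. This is an illustrative simplification, not the Oracle B2B callout API; the interface and class names below are ours:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for a document callout: each parser knows how to
// validate one document type and convert it from its raw wire format to XML.
interface DocumentParser {
    String parseToXml(String raw);   // raw wire format -> canonical XML
    boolean validate(String raw);    // structural validation of the raw document
}

class DocumentParserRegistry {
    private final Map<String, DocumentParser> parsers = new HashMap<>();

    void register(String docType, DocumentParser parser) {
        parsers.put(docType, parser);
    }

    // Dispatch parsing dynamically by document type, as the callout layer does.
    String parse(String docType, String raw) {
        DocumentParser p = parsers.get(docType);
        if (p == null || !p.validate(raw)) {
            throw new IllegalArgumentException("Unsupported or invalid document: " + docType);
        }
        return p.parseToXml(raw);
    }
}
```

Keeping the parsers stateless and registering them once at startup is one way to meet the performance concern raised above, since no per-message lookup or compilation work is repeated.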
SCA Core processing
The message is delivered via B2BAdapter to a composite and so entering the core domain of the
solution. It is then in this domain that the message is handled and all the necessary core functionalities
are provided:
- Mapping from the message type structure to a canonical structure. Normalizing the message allows reusing functionalities that become agnostic to every different message type to be processed. Those mappings are implemented using XSLTs that are read dynamically from the MDS
- Business rules are verified and deliver an outcome list of functionalities that affect the message exchange processing
- The message is orchestrated through the different core functionalities, guided by the rules
- Communication with the B2B repository for enrichment and operations, using the provided API
- The message is mapped from the canonical structure back to the message type structure and delivered via the B2BAdapter to B2B
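The first step, applying an XSLT to map a message into the canonical structure, can be sketched with the standard JAXP API. In the real solution the stylesheet is read from the MDS; here it is passed in as a string so the sketch is self-contained, and the class and method names are ours:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class CanonicalMapper {
    // Applies an XSLT stylesheet (in the real solution, loaded dynamically
    // from the MDS) to a message and returns the transformed XML as a string.
    public static String toCanonical(String xslt, String messageXml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xslt)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(messageXml)), new StreamResult(out));
        return out.toString();
    }
}
```

Because each XSLT is selected at runtime per message type, new message formats can be onboarded by adding a stylesheet, without redeploying the composite.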
Outbound
Similar to the inbound flow: the message and envelope are constructed using the xEngine or a document callout, and delivered to the final recipient using the configured trading partner channel.
BAM Integration
The integration between the components and BAM is established on the B2B side by native integration based on Advanced Queuing (AQ), and from the SOA SCA composites via sensors.
Exception Handling
The exception handling, implemented transversally, allows end-to-end coverage of exceptions. B2B exceptions are communicated natively to an SCA composite via the B2B exception handling queue. The EDN infrastructure, together with B2B exception handling and the SOA Fault Handling Framework, is used for exception handling purposes, guaranteeing 100% message reliability on the platform.
What about performance?
For performance, the message processing was implemented without the need for any custom database connection, leveraging all configuration and persistence on the Oracle SOA infrastructure; this significantly reduced the interactions with the DB.
Cherry-picking the components to reduce dehydration and persistence made it possible to maintain reliability and adequate performance. On the composite side, a one-way delivery policy exchange pattern was defined as a core pattern, allowing the composites to be fine-tuned in order to reduce persistence and improve performance.
Since B2B and the SOA Suite SCA composites were running in the same domain, the integration adapter between B2B and SCA was configured to use in-memory methods, a very effective option in terms of performance and reliability. Other options such as JMS or AQ can be used in case B2B and SOA are not running in the same domain.
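For reference, the one-way delivery policy mentioned above is set per BPEL component in composite.xml through the documented bpel.config.oneWayDeliveryPolicy property. The component and source names in this fragment are illustrative; async.persist is the default, while async.cache and sync reduce persistence at the cost of delivery guarantees:

```xml
<!-- Illustrative composite.xml fragment; component and src names are made up. -->
<component name="ProcessMessage">
  <implementation.bpel src="ProcessMessage.bpel"/>
  <!-- Trade durability for throughput: keep invocation messages in memory. -->
  <property name="bpel.config.oneWayDeliveryPolicy">async.cache</property>
</component>
```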
Bottom line
This was a demanding project, and an important challenge for a platform that exercises the full range of Oracle B2B capabilities. It was a winning strategy: most of the core functionalities were already present, and a close collaboration between the implementation team and Oracle's Product Management and Development teams elevated B2B to a more complete solution, meeting recent market demands.
Success Factors:
Final remarks
Access all of these Oracle B2B functionalities on Oracle SOA Suite 12c or, for version 11g, by downloading and installing the SOA bundle patch 19190139 (11.1.1.7.5) or the merge patch 18952479 via My Oracle Support.
For more details and guidance on how to use some of these B2B functionalities, feel free to have a look at the contents of one of our blogs: http://fusionbpmsoa.blogspot.co.uk/ (article title: Oracle SOA Suite/B2B as a Critical Mission Hub for a High Volume Message Use Case).
Blog
Abstract
Oracle Fusion Middleware (OFM) is a leading business innovation platform that enables enterprises to create and run agile, intelligent business applications while providing a wide range of features. These enable operational efficiency and agility at process development time; in this way, organizations can reach process quality faster and better.
The OFM platform includes a wide range of tools and technologies to satisfy different needs. In our case we have made use of two important technologies: BPM, to streamline customers' business processes, and ADF, which simplifies development by providing out-of-the-box infrastructure services and a visual and declarative development experience.
As far as BPM technology is concerned, human task activities are an important player regarding efficiency and effectiveness, since they make it possible to model an easy interaction with the end user in a BPM process.
The visual and declarative experience of ADF allows us to create these human tasks in two ways. The first approach is to use the out-of-the-box option to auto-generate the human task form. The second is to manually create as many UI pages as there are human task forms designed.
The first approach generates as many separate projects as there are human tasks; the latter creates as many user interface pages as there are human task forms. Based on these observations, we developed a custom ADF library that generates the human task interface at runtime. We achieved this by using only one task flow, decreasing our development time significantly.
Complementary to this task flow, we implemented a Java library that creates an abstraction layer over the BPM interactions. In this library we added some processing of the human task payloads, and provided basic caching of services and user contexts. This mechanism has shown us that having one single interface can decrease the development time of new processes by at least 85.7%, as well as being cost-effective in hours. In this article, learn how we implemented our human task interface generator for major customers in core business areas such as retail, finance and energy.
Introduction
The projects that we have developed in Oracle BPM involve plenty of interaction through human tasks, and the options available for their development can bring huge complexity by requiring several projects, or several user interface pages, to manage.
"Develop once, reuse often" has always been our motto when developing with Oracle ADF. Based on this principle, and on our knowledge of BPM's middleware, we focused our efforts on centralizing all the human tasks we would generate in a single interface, the Human Task Generator (HTG). On this single interface we can display whatever ADF Faces components offer, with dependencies between them, process the human task with custom or system actions, enable or disable user interaction based on permissions, and display process information as well as help topics, so that the user is provided with context.
To achieve this goal, we developed our HTG based on a custom-made library (BpmLib) that abstracts BPM's middleware Java classes. In BpmLib we expose all the methods needed to get the human task payload, to reply with actions, and so on.
In the next chapters of this article we describe in more detail how we implemented our HTG, as well as the BPM library.
Architecture
Our HTG was engineered on the premise that it should be reusable and extensible regardless of the application, project or even customer that uses it, so that if our clients' requirements change, or the target application changes, we can easily adjust to new requests.
We divided our implementation into three layers: (1) Data Layer, (2) Business Layer and (3) View Layer, as shown in Figure 1.
The Data Layer encloses two levels of information: the BPM data on which BPM Suite resides, and the custom-made data model which supports the customer's business logic. In the latter case, the amount and type of information depend on each customer. For instance, for one of our clients we did not need to implement any data logic at all, while for another we had a fairly complex database model.
The Business Layer clusters and abstracts the features and requirements of the Human Task Generator. Our BpmLib library collects the data for each process and its related human tasks through BPM's middleware API and exposes it to be consumed by the Model and/or Application Module applications, as shown in Figure 1. In no case do we bypass BPM's middleware API to access BPM data directly in the database. By following this rule we protect our code from any change that may occur in BPM's data structures in future releases of Oracle's BPM product. The Model application exposes data through View Objects. The Application Module application clusters and exposes them; single methods are also made available here to the View Layer.
The View Layer consumes the methods and View Objects exposed by the Business Layer and displays data through the Human Task Generator application. The HTG can be reused in different types of environments, for example: (1) ADF applications, (2) ADF Mobile Browser applications, and (3) MAF applications. Depending on the target application, you may choose which HTG interface to use. Each one is an application deployed as a shared library to allow isolation and reusability, providing two immediate advantages: (1) it is available to be deployed as a shared library to any WebLogic Server, and (2) it can be incorporated into any final ADF application.
processes. So, as we implement more and more processes, the number of new types for each new process rapidly decreases, allowing us to reduce implementation complexity and time.
Then, in BpmLib, we created our own Java class representing a payload item, containing the name of the item (a key to a resource file), its value, its type, whether it is editable or read-only, and whether it is visible. Lastly, we expose two methods:
GetTaskPayload Method
The getTaskPayload method starts by calling the following code:
ITaskMetadataService.getTaskDefinition(context,task).getWorkflowConfiguration().getPayload().getMessageAttribute();
It returns a list of MessageAttributeType. Each MessageAttributeType represents one entry of the task
payload that we see in the .task file. While this gives us some information (name, type, and whether it is
updatable), it is not enough, so we also need the actual schema of the payload. This schema is
available in the MDS, and there are several ways to access it. We chose to use the URL
http://[host]:[port]/soa-infra/services/[partition]/[composite]/[path_to_file]/[name_of_task]Payload.xsd
and get an object representing the payload. From here it is just a matter of iterating over the
MessageAttributeType list and, for each item, getting the corresponding XMLElement of the payload and
building our own PayloadItem object to return.
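The merge of task metadata with the payload schema can be sketched as follows. This is not the BpmLib code itself: the real implementation iterates Oracle's MessageAttributeType list and resolves each entry against the XSD fetched from MDS, while in this self-contained sketch both inputs are plain maps and all names (PayloadItemBuilder, buildPayload) are illustrative.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Simplified sketch of the getTaskPayload merge step: task metadata and the
// payload schema are represented as plain maps instead of Oracle's API objects.
public class PayloadItemBuilder {

    // Our own representation of one task payload entry, as described in the text.
    public static class PayloadItem {
        public final String name;   // key to a resource file
        public final String type;   // resolved from the payload schema
        public final boolean updatable;
        public PayloadItem(String name, String type, boolean updatable) {
            this.name = name;
            this.type = type;
            this.updatable = updatable;
        }
    }

    // metadata: attribute name -> updatable flag (from MessageAttributeType)
    // schemaTypes: attribute name -> XSD type (from [name_of_task]Payload.xsd)
    public static List<PayloadItem> buildPayload(Map<String, Boolean> metadata,
                                                 Map<String, String> schemaTypes) {
        List<PayloadItem> items = new ArrayList<>();
        for (Map.Entry<String, Boolean> e : metadata.entrySet()) {
            String xsdType = schemaTypes.getOrDefault(e.getKey(), "string");
            items.add(new PayloadItem(e.getKey(), xsdType, e.getValue()));
        }
        return items;
    }

    public static void main(String[] args) {
        Map<String, Boolean> meta = new LinkedHashMap<>();
        meta.put("employeeName", false);
        meta.put("holidayDays", true);
        Map<String, String> types = new LinkedHashMap<>();
        types.put("employeeName", "string");
        types.put("holidayDays", "int");
        for (PayloadItem i : buildPayload(meta, types)) {
            System.out.println(i.name + ":" + i.type + ":" + i.updatable);
        }
    }
}
```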
UpdateTaskPayload
The updateTaskPayload method receives the values to update in a Map<String, Object>, where the String
(key) is the name of the payload item and the Object (value) is its value. The actual class of the Object
depends on the type of the payload item. For simple types it is the closest Java equivalent of the
type, for example a String, an Integer, or a Date. For complex types it is a String with the XML value that
needs to be stored. The method starts by calling Task.getPayloadAsElement() to get an XMLElement
with the current payload. Then it gets the XMLSchema in the same way as getTaskPayload; this
is needed to know the type of each payload item. It then updates the XMLElement with the values
passed in the parameter and calls ITaskService().updateTask(context, task) to save the
changes.
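The update step can be sketched with standard DOM classes. The real method reads the payload via Task.getPayloadAsElement() and persists it with ITaskService().updateTask(context, task); this self-contained sketch only shows how a payload element could be updated from the Map<String, Object> argument, for simple types, and the class and method names are assumptions.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Self-contained sketch of the in-between step of updateTaskPayload:
// applying a Map<String, Object> of new values to a payload DOM element.
public class PayloadUpdater {

    public static Element parsePayload(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            return doc.getDocumentElement();
        } catch (Exception e) {
            throw new IllegalStateException("invalid payload XML", e);
        }
    }

    // For each map entry, find the payload child element with that tag name
    // and replace its text content with the new value (simple types only).
    public static void updatePayload(Element payload, Map<String, Object> values) {
        NodeList children = payload.getChildNodes();
        for (int i = 0; i < children.getLength(); i++) {
            if (children.item(i) instanceof Element) {
                Element child = (Element) children.item(i);
                Object newValue = values.get(child.getTagName());
                if (newValue != null) {
                    child.setTextContent(String.valueOf(newValue));
                }
            }
        }
    }
}
```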
Our HTG transformation model engine is structured around Java classes that represent human task attributes.
The Java classes we created for that purpose are: HumanTaskAttribute,
HumanTaskAttributeFacade, and HumanTaskPayload. Their relationship can be viewed in Figure 3.
The HumanTaskAttribute Java class represents each attribute of the human task payload, regardless of
whether it is based on a primitive type, a simple type or a complex type. When the attribute is a
complex type, each XSD element is represented as a HumanTaskAttribute. The variables we have
defined for this Java class are:
name: the internal attribute name. If the attribute is a complex type, the name is the
concatenation of the human task's attribute name and the name of the element inside the
XSD.
value: the value the attribute is to be set to.
type: the type of the attribute, i.e. STRING for string attributes, INTEGER for
integer attributes, and so on. For complex types the type is similar to the XSD name in order
to be easily recognized, e.g. ADDRESS for attributes based on the address XSD.
isReadOnly: true if the attribute is not editable, otherwise false.
isRequired: true if the attribute is mandatory, otherwise false.
isHidden: true if the attribute must not be displayed on the interface, otherwise false.
This type of attribute behaves like an auxiliary variable.
isVisible: true if the attribute should be visible, otherwise false. This variable is useful
for setting conditional visibility of attributes on the interface.
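A minimal sketch of the attribute model just described follows. The article does not show the class body, so the accessors and the "." separator used to concatenate complex-type names are assumptions; only the field names come from the text.

```java
// Sketch of the HumanTaskAttribute model; field names follow the article's list.
public class HumanTaskAttribute {
    private final String name;     // complex types: attribute name + "." + XSD element (assumed separator)
    private Object value;
    private final String type;     // e.g. "STRING", "INTEGER", "ADDRESS"
    private final boolean readOnly;
    private final boolean required;
    private final boolean hidden;  // auxiliary variable, never rendered
    private boolean visible;       // conditional visibility on the interface

    public HumanTaskAttribute(String name, String type,
                              boolean readOnly, boolean required, boolean hidden) {
        this.name = name;
        this.type = type;
        this.readOnly = readOnly;
        this.required = required;
        this.hidden = hidden;
        this.visible = !hidden;    // hidden attributes are never shown
    }

    public String getName() { return name; }
    public String getType() { return type; }
    public Object getValue() { return value; }
    public void setValue(Object value) { this.value = value; }
    public boolean isReadOnly() { return readOnly; }
    public boolean isRequired() { return required; }
    public boolean isHidden() { return hidden; }
    public boolean isVisible() { return visible; }
    public void setVisible(boolean visible) { this.visible = visible; }

    // Helper for complex types: one HumanTaskAttribute per XSD element.
    public static String complexName(String attributeName, String xsdElement) {
        return attributeName + "." + xsdElement;
    }
}
```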
The HumanTaskAttributeFacade Java class encloses all attributes of each human task payload attribute,
regardless of its type (primitive, simple or complex). When the attribute is based on a
complex type, this class holds as many HumanTaskAttributes as there are elements in the XSD. In the other
cases, primitive and simple attributes, the class holds a single HumanTaskAttribute. This
situation can be viewed in Figure 3. The variables and methods we have defined for this Java class are:
type: contains the same value as the type variable of the HumanTaskAttribute class.
This variable exists only for code simplicity in later data access.
attributes: the list of HumanTaskAttributes (List<HumanTaskAttribute>).
getValue: gets a human task attribute's value based on the attribute's internal name. If the
internal name does not exist, NULL is returned.
setValue: sets a new value for the attribute's internal name.
hasKey: returns true if the human task's internal name exists, otherwise returns false.
getValues: gets all human task payload attributes. The return value is a java.util.Map
where the Key contains the attribute's internal name and the Value the value of the
attribute.
Regarding HumanTaskPayload, this Java class encloses all of the human task payload's attributes, regardless of
their complexity (primitive, simple or complex type). For this class we have defined three methods
that are based on HumanTaskAttribute's methods:
getValues: gets all human task payload attributes. The return value is a java.util.Map
where the Key contains the attribute's internal name and the Value the value of the
attribute.
getValue: gets a human task attribute's value based on the attribute's internal name. If the
internal name does not exist, NULL is returned.
setValue: sets a new value for the attribute's internal name.
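The facade and payload accessors described above can be sketched as follows. This is a simplified, self-contained model: the real classes wrap HumanTaskAttribute objects, while here a facade keeps a plain name-to-value map, and the "Sketch" suffix marks every name as illustrative.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Self-contained sketch: a facade keeps one entry per XSD element (complex
// types) or a single entry (primitive/simple types); the payload aggregates
// all facades and delegates to their methods, as described in the text.
public class HumanTaskPayloadSketch {

    public static class Facade {
        public final String type;
        private final Map<String, Object> attributes = new LinkedHashMap<>();
        public Facade(String type) { this.type = type; }
        public void put(String name, Object value) { attributes.put(name, value); }
        public boolean hasKey(String name) { return attributes.containsKey(name); }
        public Object getValue(String name) { return attributes.get(name); } // null if absent
        public void setValue(String name, Object value) {
            if (attributes.containsKey(name)) attributes.put(name, value);
        }
        public Map<String, Object> getValues() { return new LinkedHashMap<>(attributes); }
    }

    public static class Payload {
        private final List<Facade> facades = new ArrayList<>();
        public void add(Facade f) { facades.add(f); }
        public Object getValue(String name) {
            for (Facade f : facades) if (f.hasKey(name)) return f.getValue(name);
            return null; // NULL when the internal name does not exist
        }
        public void setValue(String name, Object value) {
            for (Facade f : facades) if (f.hasKey(name)) f.setValue(name, value);
        }
        public Map<String, Object> getValues() {
            Map<String, Object> all = new LinkedHashMap<>();
            for (Facade f : facades) all.putAll(f.getValues());
            return all;
        }
    }
}
```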
In addition to the previous Java classes we have two more of no less importance: HumanTaskUtilities and
HumanTaskGeneratorAMImpl. The first is a utility class that helps parse the XSD's complex types,
while the second is the implementation class of the HumanTaskGenerator application module,
where the method getHumanTaskPayload is exposed and returns all human task payload attributes
after transformation.
The HumanTaskUtilities Java class has several methods to perform the transformation.
During iteration over the human task payload attributes returned by BpmLib, the type of the current
attribute is determined. Depending on its type, one of these methods is called:
The declarative components developed so far are: Image, Address, BetweenDates, and SearchModal.
Our experience tells us that many other declarative components can be developed, with different
levels of complexity depending on each customer's data display needs.
Our HTG module's execution flow has three steps: Input Parameters, Preprocessing, and Runtime
Component Design. The module is driven by a set of input parameters, which are used to determine
which human task to show. Then a preprocessing step is performed in order to: (1) get the human task payload
based on our custom-made BpmLib library and (2) determine whether the user trying to acquire the task
has permission to interact with it. After these two steps are completed, each human task attribute
is rendered according to its type.
The generated human tasks have system and custom actions. System actions are the same for all tasks and
are those available in BPM Workspace; custom actions are defined for each task during process development.
Regarding the last step of our execution flow, Runtime Component Design, we have developed a task
flow for rendering the human task attributes. The task flow is implemented as depicted in Figure 4.
Everything starts by getting the human task's payload attributes with the previous transformations already done.
Then it is time to set attribute visibility. This is the conditional kind of visibility, i.e. if
any attribute's visibility is based on another attribute's value, it is defined at this
moment. The next step is to get the values that populate the selectOneChoice ADF Faces
component. Last but not least, the HTGPayload fragment is responsible for rendering all human task
attributes of any possible human task, as shown in Attachment 1.
Note: this task flow can be enhanced for cases where an ADF Faces control needs to search
information in a modal dialog in order to set a human task attribute's value.
In the HTGPayload fragment, an iteration over all human task payload attributes is performed and,
depending on its type, each attribute is rendered by one of the underlying components available inside
the af:forEach ADF Faces component. Any new component needed should be added here.
Each component is populated based on the data structure previously presented, which represents all human
task payload attributes regardless of their complexity. For more detail about how to assign component
properties you can find an example here.
Simple Component level. For example: input fields that accept numeric types only.
Custom Component level. For example, dependent fields in a custom component.
Task submission level. For example, submission of the task form with validation rules for
individual components that have not been filled in.
Regarding the first type of validation, Simple Components, it is defined in the component's
validator property, as depicted in Figure 5.
In order to centralize all validations of this type we created the ValidatorsHTG Java class. This class
is instantiated with session scope in the Managed Beans tab of the faces-config.xml file. Figure 5 shows
two component validations of two different types in the ValidatorsHTG Java class.
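The kind of checks such a class centralizes can be sketched in plain Java. The real ValidatorsHTG methods follow the JSF validator signature (FacesContext, UIComponent, Object) and throw ValidatorException on failure; the method names below are illustrative and only the validation logic is shown.

```java
// Plain-Java sketch of two simple validations of the kind ValidatorsHTG
// centralizes; in the real class these would be JSF validator methods.
public class ValidatorsHTGSketch {

    // Numeric-only input field validation.
    public static boolean isValidInteger(String input) {
        if (input == null || input.trim().isEmpty()) {
            return false;
        }
        try {
            Integer.parseInt(input.trim());
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    // Mandatory-field validation for required attributes.
    public static boolean isFilled(String input) {
        return input != null && !input.trim().isEmpty();
    }
}
```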
Regarding validation for Custom Components, it can be done in two ways: (1) in the validator
property of the declarative component, for basic validation, or (2) in custom classes for more
complex validation. For example, imagine that the Address declarative component is only valid if the Country
field is filled in, as well as Address and Postal Code; otherwise an error message is raised to the
frontend. For this kind of behavior we developed the HTGComplexTypesValidation Java class, where
all generic validations are available for future reuse. Figure 6 shows the Address complex type
validation.
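The Address rule just described can be sketched as follows. The field keys and error messages are assumptions, and the real class works on the declarative component's values rather than a plain map.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Sketch of the Address complex type rule: valid only when "country",
// "address" and "postalCode" are all filled; otherwise the returned list
// carries the error messages raised to the frontend. Keys are illustrative.
public class HTGComplexTypesValidationSketch {

    public static List<String> validateAddress(Map<String, String> address) {
        List<String> errors = new ArrayList<>();
        for (String field : new String[] {"country", "address", "postalCode"}) {
            String v = address.get(field);
            if (v == null || v.trim().isEmpty()) {
                errors.add("Field '" + field + "' is mandatory for Address");
            }
        }
        return errors; // empty list means the complex type is valid
    }
}
```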
This approach encloses a component's behavior and complexity in a single place and therefore
promotes more and better reuse.
Finally, validation at task submission reuses the complex type validations as well as any other validation.
The developer has full freedom and control to perform any validation for any human task of any BPM process.
The validation engine is structured as depicted in Figure 7, where ProcessToValidateA and
ProcessToValidateB are Java classes for real future BPM processes.
HTGValidations (Figure 8) is the top-level Java class called just before the user replies to the human
task. The human task is replied to if and only if all attributes (simple or complex) and all
dependencies between attributes are valid. The method validateHumanTask is the main method
called to perform the validation of a process's human tasks.
Human task validation is performed inside each process's Java class, created for that purpose as shown in
Figure 8. For example, the human tasks of ProcessToValidateA are validated inside the
ProcessToValidateA Java class (Figure 9).
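The dispatch described above can be sketched as a registry of per-process validators. The registry mechanism and method signatures are assumptions; only the class roles (HTGValidations as the entry point, ProcessToValidateA as a per-process validator) come from the text.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the validation dispatch: validateHumanTask routes the task to the
// Java class registered for its process, mirroring Figure 7's structure.
public class HTGValidationsSketch {

    public interface ProcessValidator {
        // Returns true only if all attributes and their dependencies are valid.
        boolean validate(String humanTaskName, Map<String, Object> payload);
    }

    private final Map<String, ProcessValidator> validators = new HashMap<>();

    public void register(String processName, ProcessValidator v) {
        validators.put(processName, v);
    }

    // The human task is only replied to when this returns true.
    public boolean validateHumanTask(String processName, String humanTaskName,
                                     Map<String, Object> payload) {
        ProcessValidator v = validators.get(processName);
        return v == null || v.validate(humanTaskName, payload);
    }
}
```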
(1) How useful can our approach be in the development process of a new application?
Since we developed our HTG in well-defined building layers (Data Layer, Business Layer and View Layer),
where each one communicates only with its nearest layer, they can be developed independently of each
other, as long as the contracts of the methods in the layers above are not broken.
Each project inside each layer is available as a Shared Library and is therefore deployed independently
to the WebLogic server. This approach brings the same advantage as described before, along with not
having to deploy all the projects to the server when only one has actually changed.
When a new application that needs to provide human task interfaces is ready to be developed,
independently of the business area, the developer just imports the library into the application and easily
drags and drops the HTG wherever it is needed. It behaves as a regular ADF Faces component.
The user is presented with a standardized interface for all human tasks, independently of the process.
Moreover, the developer will have fewer interventions regarding layouts, forgotten fields, etc., and
therefore a lower probability of errors.
(2) How reusable can it be? Are there dependencies on the customer and their business area?
The implementation of the HTG at three customers in different business areas provided us with insight
into the level of reusability of our engineered solution. The next table (Table 2) aggregates the data by
the customer's business area, as well as the total number of processes and human tasks created using
the HTG versus human tasks created from scratch.
Table 2 - Processes and human tasks per business area

                              Retail   Finance   Energy   Total
N Processes                     36        -         -       48
N Human Tasks                   96        34        5      135
Human Task Generator (HTG)      81        34        5      120
From scratch                    15         0        0       15
% using HTG                   84.4%     100%     100%    88.9%
Another overall perspective of the previous results can be seen in the next graphic, Figure 10.
The previous table depicts the success of using our solution: in the banking industry the HTG satisfies all
customer needs; in the retail sector there is a high percentage of satisfaction; and in the last case
the customer's needs were satisfied, but only a few small processes were implemented.
One of the leading reasons for these results was our ability to evolve the Complex Component
Types as the customer needed them, with the underlying validations for each component and the
overall human task. The modularization of our HTG in different layers was another advantage, since we
never mixed business logic into the view layer and vice versa.
(3) Is there a real-world benefit in spending more time at the beginning implementing a
generic methodology instead of following the regular way Oracle designed its product to be used?
All our customers wanted their own custom-made Workspace, integrated within a single application,
with a customized look-and-feel and custom business rules associated with both the process and the interface.
Based on this starting point, in every case we needed to develop a library to interact with BPM's
middleware: the BpmLib. Once we had developed our own library, and after some experiments with
developing human task interfaces, we saw that building a custom generator for
human task interfaces was not that far away.
After understanding all these facts, we tried to measure the impact an HTG would have. Therefore we
ran some tests to compare our HTG with generating human task interfaces using Oracle's out-of-the-box
approach. The results are based on our customers' requirements and the level of customization needed
to achieve the same final result in both cases. The following results regard the development of each
new human task interface (Table 3).
Table 3 - Time spent setting up a human task interface using the HTG versus without it (in hours)
Implement Validations
Data submission
Total
As Table 3 depicts, using the HTG saves 6 hours in the development of each new human task interface,
independently of the process.
Another important conclusion is the cost-effectiveness we obtained by using the HTG instead of the regular
approach. In the next table we present these results:
Using HTG
Total human tasks which used HTG
120
840
85.7%
As you can see, by using our engineered solution we achieved a cost-effectiveness of 85.7% in
development hours, with fewer interventions and therefore a lower probability of development
errors.
Conclusions
Based on our experience in major projects in different industries, we engineered an accelerator to
automatically generate the interfaces for a process's human tasks. This mechanism has proved to
decrease implementation time by at least 85.7%, thus increasing productivity.
In order to achieve our main goal we developed the HTG (Human Task Generator) based on a custom
library, BpmLib, that abstracts the interactions with BPM's middleware. We followed the convention
that the HTG should be reusable and extensible regardless of the application, project or even
customer.
We structured our HTG in three layers: Data Layer, Business Layer and View Layer, each one with a
well-defined scope and purpose, so that future changes or enhancements can be accommodated easily.
The entire module (BpmLib, Model, Application Module and HTG) is decoupled from the application's
business logic, and thus can be used in other BPM applications. Not only can the entire module be
reused, but also its components independently (BpmLib, Model-AppModule).
We have answered some important questions in detail, such as: (1) how useful our approach can be in
the development process of a new application, (2) how reusable it can be depending on the customer
and their business area, and (3) whether there is a real-world benefit in spending more time at the beginning
developing a generic methodology instead of following the regular use of the product.
In conclusion, this approach of developing a reusable module for human task generation is only feasible
if, at the starting point, we know that there is a significant number of processes and human tasks
to be developed. Considering that, after the module is developed, every change is centralized,
saving development time and achieving a better return on investment.
Pedro Gabriel
Pedro Gabriel is an ADF and BPM developer at Link Consulting, a
Portuguese Oracle Platinum Partner. He is responsible for developing and
architecting ADF solutions for the retail and finance sectors. He is also
involved in BPM projects, where he has implemented well-received solutions.
Pedro's interests continue to focus on these technologies and on spreading
his knowledge across different Oracle technologies. Before that, he worked
with Microsoft technologies.
Contact:
Blog
Danilo Manmohanlal
Danilo has worked at Link Consulting since 2010 and is one of its leading ADF
architects. He is involved in enterprise-level BPM/ADF projects and has
implemented solutions for the retail and finance sectors.
Contact:
Blog
Diogo Henriques
Diogo Henriques is the BPM Technical Leader at Link Consulting, a Portuguese Oracle Platinum Partner.
He has worked with Oracle BPM for almost 5 years, since the 11.1.1.3 beta, including a few large-scale
projects. Before that, he worked with Oracle Workflow for 3 years.
Contact:
Blog
Attachments
I'm not going to bore you with the details of the installation by giving an installation guide. It
took me about 40 minutes from scratch (excluding download time). The steps are described
in the installation guide Oracle provides. OAC is part of the OER 12c installation jar, but can be
licensed and installed, as its own managed domain, without licensing and installing OER.
The high-level steps to take (from scratch):
1. Download and install Oracle Database, Fusion Middleware Infrastructure 12c, Oracle Enterprise
Repository 12c, RCU patch 18791727 and WebLogic patch 18718889 (the last two are
important, otherwise you can't install OAC).
2. Run RCU (oracle_common/bin/rcu.sh|bat) and create the OAC repository.
3. Run the WebLogic domain creation tool (weblogic/bin/config.sh|bat) and create a new domain which
includes OAC.
4. After installation and startup of WebLogic and the managed server, you can find the OAC console at
http://serverhost:8111/oac
Note: if you are harvesting from another WebLogic server (like SOA Suite 12c), the WebLogic patch
should also be installed there.
Taking the first steps
When taking your first steps, the official Getting Started guide helps a lot. OAC has four
high-level features. OAC collects services: it has a harvester which creates API assets in OAC.
After harvesting, you can add metadata to the API assets, like a description, tags and
documentation. After harvesting and editing an API asset, it can be published so that it is visible for
application development. Published APIs can be discovered and used through the API Catalog
console and via the Oracle Enterprise Repository plug-in for Oracle JDeveloper.
Logging in for the first time
After installing FMW & OAC 12c and starting the WebLogic server, you can log in to the
OAC console, running (by default) at http://serverhost:8111/oac.
OAC12c: Admin page for configuring users, departments, sessions and change system settings
Collecting / harvesting services
The first activity is to harvest APIs into OAC. The harvester is used to populate OAC with API
assets from SOA Suite, Service Bus or other deployed services. The harvesting process can
be run from the command line or integrated into the build process; the latter can be
used to harvest automatically at build time. Harvested API assets get a Draft state
and won't be visible to developers yet.
For this blog I will use the command-line harvester. There are two types of harvesters: one for
SOA Suite, OER & file-based assets, and one for harvesting OSB. I will use the first harvester
against an integrated SOA Suite 12c environment.
weblogic -remote_server_type SOASuite
The result of running the harvest command should look something like the image below
(in preview mode).
After harvesting your first asset(s) you can log in to the OAC console to see the result. Harvested
assets get the Draft status; that's why, after logging in to the OAC console again, no
APIs are published and visible on the dashboard. Search on Service Type = Draft to view the newly
harvested APIs.
Link to details page: get the URL of the API asset details page you are currently looking at, so you can
share it. Example:
http://soabpm:8111/oac/index.jsp?assetid=50003&renderMinMaxButton=false&renderNextPrevButtons=false
Toggle between tabs and view all: you can view the API asset details in tabs (one tab per category) or
view them on a single page, as displayed above.
Add to My APIs: click to bookmark the API. After bookmarking, the API is visible on the My APIs page.
Edit details: page to edit the API asset details. More about this below.
Delete items: delete the API asset from the repository.
To add metadata or edit other details of an API asset, click on the edit icon/button to open the
Edit Asset page. On this page you can change the name and version, add keywords and
descriptions, and also add a link to a documentation URL and assign/upload an icon. To publish
an API, change the API status from Draft to Published. It is also possible to set the API active
status from Active to Retired.
OAC12c: on the Edit Asset page, details like keywords, description and version can be changed
Publish a draft API
To publish an API you just need to change the API status to Published and save the asset.
On the overview page the API status is changed and the API details can be exported to Excel
and PDF.
After taking the first steps, the APIs can be shared with other users. To do so, new users can be
added with specific roles. As the admin user, click on the Admin menu item and choose the Users
section (selected by default). From this page new users can be created, and you can search for,
edit and delete users.
To create a new user, click the Add User icon at the top-right of the Users section.
The Create New User page is displayed. The page is divided into three sub-sections:
user information, roles and departments. The user information section contains form fields for
the username, password, real name, and email & phone information. It is also possible to force
a password change and to allow the password to never expire. A user can have one of four
statuses: Active, Inactive, Lockout (after 3 failed logins) and Unapproved.
To switch to a different user, sign out of the OAC console by selecting the Sign Out
option under your user menu. Just click the arrow on the right side of your name.
Logging in as a Developer gives even fewer options. A developer can only search published APIs
and add APIs to their favorites.
As a user you can add APIs to your favorites (My APIs). To add an API to your favorites, go to the API
asset details page and click the Add this to My APIs button.
After adding it to My APIs, the details page is updated and shows how many times the API has been added
(usage) in the past 6 months. Users that have added the API to their My APIs can write a review of the
API asset.
OAC12c: My APIs showing all added APIs and actions to review or delete an API
To write a review of an API, click the first (review) icon on the right side of the API's name.
A pop-up opens with a form the user can use to submit a review. The user can give a
rating and a comment (max 4000 characters). After submitting the review, the page is
redirected to a page which displays all reviews.
The administrator can perform some other tasks that I haven't discussed yet. On the Sessions
section page all (active) sessions are displayed, and an administrator can look into the details of
a session. An administrator can also delete sessions, which means that if a user is active in the
console, the user will be logged out.
An administrator can change a lot of settings that alter the behavior of the OAC console. The
system settings page is divided into four main sections: Functional Settings, Server Settings,
Enterprise Authentication, and Import and Export settings (not the actual import/export).
The functional settings section contains settings for search results and printing details via PDF:
The server settings section contains settings for embedding HTML in asset details:
The Enterprise Authentication section contains settings for connecting to an LDAP server for user
management:
The import/export of the repository can be done in the corresponding section, but instead of
being done in the web UI it uses Java Web Start.
My Verdict
To conclude this blog post I will give my verdict on Oracle API Catalog 12c. In the last week I
had some ups but also some downs. I started on my Windows laptop and installed everything
locally. At first everything seemed fine, I could harvest my first asset, but when trying to view
the details I got one error after another. In the days that followed I was helped by Oracle
Development, in particular by Mark Dutra, but we couldn't figure out what the problem was. I
think it has something to do with security settings and how the laptop is configured
(domain-wise).
After creating a Linux VM and installing everything again I finally had success, and the tables
were turned. I really like the interface: it is very clean and uses a common workflow on every
page. The use of a separate harvester (command-line or ANT task) is in one way a great
solution, because you can use it in your existing build process, but I missed the option
to do this from the OAC console, since you don't always have access to scripting.
The harvested APIs are added as Draft, and an admin or curator can edit the information and
publish the API. This version is a great start with lots of information already, like the endpoint
and a WSDL/WADL summary including methods and message payload. But I miss the possibility to
register dependencies between APIs. If APIs are harvested from one service, the separate APIs
are not linked to each other.
Adding APIs to My APIs and the possibility to write a review can be handy. The simple
metrics tell you about the usage and who uses the API, which is already useful; hopefully
more metrics will be available in OAC in the future.
As an administrator you can perform a lot of tasks using the UI: adding users, looking at active
sessions and importing/exporting the repository. It worked fine on Linux, but on Windows I had no
luck. The possibility to connect to an LDAP server will also make it much easier to add users and
departments.
Pros:
Cons:
Siebel, OSM & BRM, all three of which were acquired by Oracle in 2006
Optionally a Product Master, PH4C (Product Hub for Communications), even though RODOD can
work without a Product Master. PH4C is an Oracle E-Business Suite instance
An out-of-the-box integration: Application Integration Architecture (AIA) with Process
Integration Packs (PIPs) implemented in Oracle SOA with BPEL.
Master Data Management (MDM) PIP: sends product data from PH4C to Siebel, BRM
(OSM); optional
Agent Assisted Billing Care (AABC) PIP: provides BRM customer billing information to
Siebel; optional but usually implemented
Order To Cash (O2C) PIP: integrates Siebel, OSM and BRM for the order flow
All applications can be used independently of RODOD, including AIA. Only the three listed Process
Integration Packs are RODOD-specific
RODOD can work without the MDM & AABC PIPs but not without the O2C PIP
In virtually any RODOD implementation you will also find an Enterprise Service Bus (for example Oracle
Service Bus) to connect RODOD to external systems. In particular, OSM will need to talk to the network
systems, and Siebel will usually talk to systems which render e.g. client information.
RODOD's equivalent on the telecommunications network side (Operations Support Systems) is Oracle's
Rapid Service Design and Order Delivery (RSDOD), consisting of OSM, Unified Inventory Management
(UIM), Automated Service Activation Program (ASAP) and Oracle Communications IP Service Activator.
RODOD, as a telecommunications order management system, supports three main functional areas
with several order types.
The order flow over these systems is facilitated via the integration layer, the so-called Application
Integration Architecture Order To Cash Process Integration Pack (AIA O2C PIP).
AIA deploys composites in WebLogic, which can be monitored in Oracle Enterprise Manager. The
SOA Suite runs on a database with specific schemas; important tables are the so-called
XREF tables, where concepts common to the applications, like orders, accounts and products, are linked.
The Metadata Services repository holds AIA artifacts like EBOs, EBSs, WSDLs etc.
An important concept of the Application Integration Architecture is the so-called Enterprise
Business Object (EBO). EBOs offer a generic data model to which concepts that are common to all
applications (products, customers, orders) can easily be mapped. AIA connects to applications via
adapters, called Application Business Connector Services (ABCSs), which are usually written in BPEL.
AIA essentially connects to applications, translates system-specific messages to its generic data
model, and organizes the message exchange with different technologies (queues, web services) and
integration patterns.
An integration layer needs to be able to map equivalent concepts in different systems to each
other. This is done via so-called cross references. Cross reference (XREF) tables are a construct
of the Oracle SOA layer designed to map values for equivalent entities created in different
applications. An XREF table assigns a common global key to data objects from different applications
which need to be mapped to each other.
In RODOD the AIA layer uses such tables to map Siebel, OSM, BRM and PH4C identifiers and keys for
products, customers and orders.
An example, for a product existing in Siebel and BRM:
A customer wants to add an optional mobile internet subscription to his home
internet service.
The call center agent enters a change order to add the mobile internet product to his
existing home internet subscription.
When OSM order management sends a fulfillment order billing request to AIA, the O2C
PIP looks up the cross reference based on the Siebel Asset Integration ID to find the BRM
service POID.
The O2C PIP then calls the correct BRM API with the correct arguments to add the product
to the service in BRM.
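The cross-reference lookup in this example can be sketched as an in-memory XREF table. The column names and identifier formats below are illustrative; the real XREF tables live in the SOA Suite database schema and are accessed through the SOA layer's cross-reference functions.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the XREF idea: a common global key links each application's own
// identifier for the same entity (e.g. a subscribed product).
public class XrefTableSketch {

    // common global key -> (application column -> application identifier)
    private final Map<String, Map<String, String>> rows = new HashMap<>();

    public void addRow(String commonId, Map<String, String> appIds) {
        rows.put(commonId, new HashMap<>(appIds));
    }

    // Given a known value in one application's column, return the equivalent
    // value in another column (e.g. Siebel asset id -> BRM service POID).
    public String lookup(String fromColumn, String fromValue, String toColumn) {
        for (Map<String, String> row : rows.values()) {
            if (fromValue.equals(row.get(fromColumn))) {
                return row.get(toColumn);
            }
        }
        return null; // no cross reference registered
    }
}
```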
The following schema lists the main concepts, using the example of Siebel and BRM communicating via the
O2C PIP in RODOD.
SOA Magazine IV
69
Calling and orchestrating Enterprise Business Services to fulfil a complex task like an order flow
Managing cross-referencing of instance identifiers, validating and enriching content (if required)
Most AIA ABCS use BPEL (stateful conversations); XSLT is used for transformations.
O2C PIP
The O2C AIA PIP connects RODOD (Siebel, BRM, OSM) via five flows:
- Process Sales Order Fulfilment: send orders from Siebel to OSM COM
- Synchronize Fulfilment Order Billing Account: synchronize customer accounts between Siebel and BRM via OSM COM
- Bill Fulfilment: initiate billing from OSM COM towards BRM
- Provision Order and Update Fulfilment Order: send the order from OSM COM (Fulfilment) to OSM SOM (Provisioning) to talk to the OSS network layer
- Update Sales Order: update the order status from OSM COM (Fulfilment) back to Siebel
OSM COM / Fulfilment: orchestrates order tasks in the RODOD BSS systems. It generates a plan for how the order should be executed and coordinates the calls.
OSM SOM / Provisioning: talks to the OSS network systems and coordinates the order provisioning downstream.
Note that O2C can also synchronize product model data. You can use O2C instead of the MDM PIP in case you plan to use RODOD without the PH4C Product Master.
O2C - Process Sales Order Fulfilment:
Send Orders From Siebel to OSM COM. Siebel CRM creates an application business message
(SalesOrderABM) with sales order details and enqueues the ABM in the AIA_SALESORDERJMSQUEUE
queue.
ProcessSalesOrderFulfillmentSiebelCommsJMSConsumer dequeues the ABM from the queue and passes it on to ProcessSalesOrderFulfillmentSiebelCommsReqABCSImpl.
ProcessSalesOrderFulfillmentSiebelCommsReqABCSImpl transforms the ABM into an enterprise business message (ProcessSalesOrderFulfillmentEBM) and routes it to ProcessSalesOrderFulfillmentOSMCFSCommsJMSProducer.
ProcessSalesOrderFulfillmentOSMCFSCommsJMSProducer wraps the EBM into the OSM CreateOrder message format and enqueues the CreateOrder message into AIA_CRTFO_IN_JMSQ.
The store-and-forward mechanism forwards the CreateOrder message from the AIA WebLogic server to the OSM WebLogic server.
1.
2. CommsProcessFulfillmentOrderBillingAccountListEBF:
3. CommsProcessBillingAccountListEBF:
4. QueryCustomerPartyListSiebelProvABCSImplV2:
6. CommsProcessBillingAccountListEBF:
7. SyncCustomerPartyListBRMCommsProvABCSImpl sends the SyncCustomerPartyListResponseEBM message back to CommsProcessBillingAccountListEBF (asynchronous delayed response mode), which sends the ProcessBillingAccountListResponseEBM response message back to CommsProcessFulfillmentOrderBillingAccountListEBF (asynchronous delayed response mode).
8. ProcessFulfillmentOrderBillingAccountListRespOSMCFSCommsJMSProducer:
1. ProcessFulfillmentOrderBillingOSMCFSCommsJMSConsumer:
2. ProcessFulfillmentOrderBillingBRMCommsProvABCSImpl: billing artifacts, purchased products, purchased discounts
3. ProcessFulfillmentOrderBillingResponseOSMCFSCommsJMSProducer: forwards the ProcessFulfillmentOrderBillingResponseEBM message in the AIA_UPDBO_IN_JMSQ queue.
The store-and-forward mechanism forwards the message from the AIA WebLogic server to the OSM WebLogic server.
routed to ProcessProvisioningOrderOSMPROVJMSProducer.
2. ProcessProvisioningOrderOSMPROVJMSProducer:
The store-and-forward mechanism places the message towards OSM SOM, so it can dequeue it for further processing.
During provisioning, update messages are enqueued by OSM SOM into OSM WebLogic and moved to the AIA_FOPROV_OUT_JMSQ queue using store-and-forward.
3. ProcessFulfillmentOrderUpdateOSMPROVCommsJMSConsumer routes to ProcessFulfillmentOrderUpdateOSMCFSCommsJMSProducer.
4. ProcessFulfillmentOrderUpdateOSMCFSCommsJMSProducer
OSM SOM picks up the message, initiates the necessary actions in the OSS provisioning systems and updates the status of the order.
routes it to the UpdateSalesOrderSiebelCommsProvABCSImpl service.
UpdateSalesOrderOSMCFSCommsJMSConsumer works with a sequencer. If an update causes a system or business error, further updates to the account are locked in the sequencer table. A business error must be removed from the sequencer table to unlock the account. In case of a system error, the message must be resubmitted.
2. UpdateSalesOrderSiebelCommsProvABCSImpl: converts the UpdateSalesOrderEBM message into a Siebel application business message (ABM).
The Order Status field in the Siebel user interface represents the overarching status throughout order capture and order fulfillment. Fulfillment Status is a sub-status of a Status of Open in Siebel.
Benedikt Herudek
As an integration and telecommunications expert, Benedikt works at
Accenture, the Netherlands. His area of expertise spans from different
Oracle applications (E-Business Suite, Siebel) over the database to Oracle
SOA & AIA middleware technologies. He is part of Accenture's RODOD
expert pool and has implemented the solutions in different European countries
in different roles as analyst, technical architect and team lead. Benedikt is a
certified Oracle & SOA Implementation Specialist as well as a certified
Siebel Specialist and tries in his work to understand complex solutions
consisting of different technologies from an E2E perspective. Benedikt is also a specialist in Oracle
Enterprise Manager and how to use the tool to monitor the Oracle infrastructure and application stack.
Contact
Exclusive Gateway
Exclusive and Inclusive gateways consist of two outbound sequence flows: a default sequence flow representing the normal path between two objects and a conditional sequence flow to control the process flow based on the evaluation of an expression.
The "Exclusive Gateway" is one of the most commonly used gateways, allowing you to split your process into two or more paths. When a token reaches an exclusive gateway, each of the conditional outbound sequence flows is evaluated in the order you specified at design time when configuring the exclusive gateway, and the first conditional flow that evaluates to true is taken. If none of the conditional outbound sequence flows evaluates to true, the token moves down the default sequence flow. Please note that if you do not specify a default outbound sequence flow on an "Exclusive Gateway" you will get an error at design time and will not be able to compile and deploy your process.
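The evaluation order can be sketched in a few lines of Python. This is an illustrative model of the semantics, not Oracle code; the flow names and the 100 threshold are taken from the demo scenario in this article.

```python
# Illustrative model of exclusive gateway semantics: conditional flows are
# evaluated in design-time order, the FIRST one that is true wins; if none
# is true, the token follows the mandatory default sequence flow.
def exclusive_gateway(token, conditional_flows, default_flow):
    for condition, target in conditional_flows:
        if condition(token):
            return target          # first true condition wins
    return default_flow            # fallback: default sequence flow

flows = [(lambda t: t["orderTotal"] > 100, "Set Order Status to Pending")]

print(exclusive_gateway({"orderTotal": 250}, flows, "Auto Approve"))
# → Set Order Status to Pending
print(exclusive_gateway({"orderTotal": 40}, flows, "Auto Approve"))
# → Auto Approve
```

Exactly one outgoing flow is ever taken, which is what distinguishes the exclusive gateway from the inclusive gateway discussed later.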
So let's see how you use an "Exclusive Gateway" to control the flow of a process. I have created a new BPM application using the "BPM Application" JDeveloper template and in the "Project SOA Settings" step I have selected "Composite with BPMN Process".
This will bring up the "BPMN 2.0 Process Wizard" where you are prompted to specify a process name
and the service type. In this demo I have selected "Asynchronous Service".
In the arguments step I have created two input arguments, OrderId of type int and OrderTotal of type decimal, and an output argument, Status of type string.
When you click finish it will open the process. Using the structure window I have created three process
data objects to store the input arguments I have created above and hold the output argument value
(orderId of type int, orderTotal of type decimal and status of type string).
Next I assigned the two input arguments (OrderId and OrderTotal) to the process data objects (orderId,
orderTotal) by double-clicking on the "Start" activity, going to the "Implementation" tab and selecting
"Data Associations".
Please note that I did the same thing for the "End" activity, but this time I have mapped the status process data object to the Status output argument.
Just for demo purposes I came up with a very simple scenario where I will use an exclusive gateway to auto-approve an order if the order total is less than 100 and to set the status to pending if the order total is greater than 100. I would like to stress that this is just for demo purposes, as I would highly recommend that you use a business rules component to store the order total threshold rather than hard-coding it in the process.
Furthermore I dropped two script tasks, one between the exclusive gateway and the end activity and
one just above the first script task and mapped the two status values (auto approved and pending)
respectively using the "Data Associations" on each of the script tasks.
When you dropped the exclusive gateway between the start and end activity it automatically joined the
exclusive gateway with the end activity using a default outbound sequence flow. Next I have created a
conditional outbound sequence flow from the exclusive gateway to the "Set Order Status to Pending"
script task and defined an XPath expression to check whether the orderTotal is greater than 100.
I finally provided some labels on the two outbound flows to make them more readable and joined the second script task with the end activity as follows.
Deploy your process on the integrated Weblogic server and run a test using a small order (having an
order total of less than 100). The process should follow the default outbound flow and the order should
be auto-approved.
Now run another test with an order having an order total of greater than 100. The process should follow
the conditional outbound flow and the order should be flagged as pending.
An exclusive gateway can also be used to define a loop to check for conditions and re-execute previous steps. To define a loop using an exclusive gateway, just connect a sequence flow to a previous object.
Inclusive Gateway
The inclusive gateway, just like the exclusive gateway, enables you to split your process into two or
more paths. The intrinsic difference between an exclusive gateway and an inclusive gateway is that in an
exclusive gateway, the process only continues down one of several paths (if multiple outgoing sequence
flows are present) while in an inclusive gateway a process will follow all conditional paths whose
expressions are evaluated to true.
Furthermore, in an inclusive gateway a process will follow the default path only if no conditional expressions evaluate to true. Because of this particular characteristic, the notation of an inclusive gateway consists of a split and a merge inclusive gateway.
An inclusive gateway split can consist of multiple outgoing conditional sequence flows. However, an inclusive gateway must define a default sequence flow. All conditional expressions that evaluate to true are executed; otherwise the default sequence flow is executed.
At run time, the BPM engine generates a token for each conditional sequence flow that evaluates to true. If none of the conditional sequence flows evaluates to true, then a token is generated for the default sequence flow. The process will pause and resume only when all tokens have reached the merge gateway.
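The split behaviour can be modeled in a few lines of Python. This is an illustrative sketch only, not Oracle code; the flow names are taken from the Banking Supervision demo in this article.

```python
# Illustrative model of inclusive gateway split semantics: a token is created
# for EVERY conditional flow whose expression is true; only if none is true
# does the default flow fire. The merge then waits for all created tokens.
def inclusive_split(data, conditional_flows, default_flow):
    tokens = [target for cond, target in conditional_flows if cond(data)]
    return tokens or [default_flow]

flows = [
    (lambda d: d["nonComplianceLetter"], "Generate Non-Compliance Letter"),
    (lambda d: d["letterToCentralBank"], "Generate Letter to Central Bank"),
]

# Both conditions true -> two tokens, both branches run:
print(inclusive_split({"nonComplianceLetter": True, "letterToCentralBank": True},
                      flows, "Do not Generate Any Letters"))
# Neither true -> one token on the default flow:
print(inclusive_split({"nonComplianceLetter": False, "letterToCentralBank": False},
                      flows, "Do not Generate Any Letters"))
```

Contrast this with the exclusive gateway, where at most one branch is ever taken.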
So let's see how you can use an inclusive gateway in a process. I have created a new BPM application using the "BPM Application" JDeveloper template and in the "Project SOA Settings" step I have selected "Composite with BPMN Process".
This will bring up the "BPMN 2.0 Process Wizard" where you are prompted to specify a process name
and the service type. In this demo I have selected "Asynchronous Service".
In this demo I will be simulating a Banking Supervision process where a specific department of a Central Bank is responsible for overseeing its financial institutions and, based on certain decision points, various documents are required to be generated.
Therefore I will create two input arguments, NonComplianceLetter and LetterToCentralBank, both of type boolean, to denote whether these two types of documents are required to be generated, and an output string argument, DocumentsGenerated, to act as a confirmation of which documents were generated.
When you click finish it will open the process. Using the structure window I have created three process
data objects to store the input arguments I have created above and hold the output argument value
(nonComplianceLetter and letterToCentralBank both of type boolean and documentsGenerated of type
string).
Next I assigned the two input arguments (NonComplianceLetter and LetterToCentralBank) to the
process data objects (nonComplianceLetter, letterToCentralBank) by double-clicking on the "Start"
activity, going to the "Implementation" tab and selecting "Data Associations".
Please note that I did the same thing for the "End" activity, but this time I have mapped the documentsGenerated process data object to the DocumentsGenerated output argument.
As already mentioned, I will be simulating a Banking Supervision process where, based on the two boolean input arguments, I will generate either both documents (Non-Compliance Letter and Letter to Central Bank), one of the documents (Non-Compliance Letter or Letter to Central Bank) or no documents at all. To implement such a scenario I will use an inclusive gateway.
Please note how JDeveloper automatically adds an inclusive merge gateway with every inclusive split gateway.
Furthermore I dropped three script tasks, one between the inclusive split and merge gateway, one
above the inclusive gateway and one below the inclusive gateway.
On each script task, using the "Data Associations", I provided some static text to the documentsGenerated process data object to display whether a document is generated (please note that in the second script task I used the concat function to concatenate the string value from the first script task).
Because I want to have the third script task (Do not Generate Any Letters) marked as the default
sequence flow I will delete the default sequence flow from the second script task (Generate Letter to
Central Bank) and redefine the sequence flows.
For the first two conditional outbound sequence flows I have defined an XPath expression to check whether the matching document is selected. For the "Generate Non-Compliance Letter" conditional flow you should have an XPath condition similar to "xp20:matches(string(bpmn:getDataObject('nonComplianceLetter')), '\s*(?i:true|1)\s*')". For the "Generate Letter to Central Bank" conditional outbound sequence flow your XPath should be similar to "xp20:matches(string(bpmn:getDataObject('letterToCentralBank')), '\s*(?i:true|1)\s*')".
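The regular expression in these conditions simply accepts "true" or "1", case-insensitively, with optional surrounding whitespace. The same pattern can be exercised in Python (an illustrative check; like xp20:matches(), re.search is unanchored, so it looks for a match anywhere in the string):

```python
import re

# Same pattern the XPath xp20:matches() calls use: accept "true" or "1",
# case-insensitively, with optional surrounding whitespace.
BOOL_TRUE = re.compile(r"\s*(?i:true|1)\s*")

for value in ["true", " TRUE ", "1", "false", "0"]:
    print(repr(value), bool(BOOL_TRUE.search(value)))
# → 'true' True, ' TRUE ' True, '1' True, 'false' False, '0' False
```

Matching on the string form keeps the condition tolerant of how the boolean is serialized in the payload.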
Deploy your process on the integrated Weblogic server and run a test with just one of the documents
selected (let's say LetterToCentralBank set to true). The second conditional sequence flow is evaluated
to true and if you inspect the "End" message DocumentsGenerated output element you should see that
the Letter to Central Bank document has been generated.
Now test your process without selecting any of the two documents. The process should have followed
the default sequence flow. You can confirm this by inspecting the "End" message DocumentsGenerated
output element; it should read "No documents generated".
If you switch to graphical view you will see that the process followed the default outbound sequence
flow.
Parallel Gateway
The parallel gateway enables you to perform multiple tasks simultaneously, allowing your process to follow several unconditional paths in parallel. When your process token reaches a parallel gateway activity, the parallel gateway will create a token for each outgoing parallel sequence flow. Your process will wait until all tokens have arrived at the parallel gateway merge activity before resuming with the rest of the activities.
You should be very careful with the parallel gateway activity because if one of the tokens created by the parallel gateway doesn't arrive at the parallel gateway merge activity, then your process will freeze.
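The wait-for-all semantics can be sketched in Python using threads. This is an illustrative model only; request_quotation is a hypothetical stand-in for the human tasks used in the demo, and the supplier names and value are invented.

```python
# Illustrative model of parallel gateway semantics: every branch starts
# unconditionally, and the merge blocks until ALL branch tokens arrive.
# If one branch never completed, the join below would block forever --
# the "frozen process" case warned about above.
from concurrent.futures import ThreadPoolExecutor

def request_quotation(supplier):
    # Stand-in for a human task / service call returning a quotation.
    return (supplier, 100)

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(request_quotation, s)
               for s in ("Supplier A", "Supplier B")]
    # Parallel gateway merge: wait for every token (future) to arrive.
    quotations = [f.result() for f in futures]

print(sorted(quotations))
```

Note the contrast with the complex gateway later in the series, which can fire before all tokens have arrived.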
So let's see how you can use the parallel gateway in a process. Let's assume that you are implementing an order process and that at some point in your process you want to request quotations from two different suppliers. Once you have received both quotations your process should resume (ideally picking the lowest quotation, but I will not be implementing this part).
I created a BPM application with a default BPM project (named both application and project
"ParallelGatewayDemo") having an empty composite.
Next I created an asynchronous BPMN process and named it OrderProcess without any arguments.
As already stated above, in this demo I will be simulating an order process where I need to send two supplier requests for quotation. So I will use the parallel gateway. Please note how JDeveloper automatically adds a parallel gateway merge activity when you drop a parallel activity on your process.
To simulate a supplier's request for quotation I used a human task component. I created a single human
task for both suppliers accepting all the defaults and just changing the human task title for each case.
For simplicity reasons I changed the human task assignment type from "Lane Participants" to "Names
and expressions" and assigned the "weblogic" user as a new user participant.
So let's test our process. Since we didn't specify any arguments you can directly invoke your process. If
you open the flow trace you will see that two human tasks have been created and are pending.
Go to the BPM workspace, login using the weblogic user and, if you followed my exact steps, you should see two tasks. Approve one of them.
If you go back to the flow trace you should see that the process is waiting for the second human task.
Approve the second human task (from the BPM workspace) and go back to the flow trace; your process should be completed.
Both tokens from both outgoing parallel sequence flows have reached the parallel gateway merge activity, which signaled to the process that all parallel activities have been completed.
Download sample application: Parallel Gateway
Complex Gateway
In this fourth article of a five-part series we will go through the complex gateway, a gateway that is
similar to an inclusive gateway but at the same time allows you to define a voting pattern to determine
whether your instance should continue execution even if not all outgoing sequence flows have been
completed.
As with all the other gateways that we've seen until now, the complex gateway involves a split activity, which can be either an inclusive gateway (see the "Inclusive and Complex" pattern) or a parallel gateway (see the "Parallel and Complex" pattern), and a complex merge gateway.
Similar to an inclusive gateway, a process implementing the "Inclusive and Complex" pattern will follow
all conditional paths whose expressions are evaluated to true. All conditional expressions that evaluate
to true are executed; otherwise the default sequence flow is executed.
A process implementing the "Parallel and Complex" pattern will follow all unconditional paths defined at
design time.
What is really interesting with the complex gateway is that it allows you to define a condition on the complex gateway merge activity to control whether the instance should continue even if not all of the complex gateway paths have been completed.
When a process reaches a complex gateway activity it will create a token for each outgoing sequence flow that evaluates to true. You can configure the complex gateway merge activity to continue execution even if not all of the tokens have arrived at the complex gateway merge activity. For example, if you have three parallel sequence flows executed, you can configure the complex gateway to continue after two or more tokens have arrived at the merge activity. This is really handy when you want to implement a voting-based pattern on process activities.
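The voting behaviour can be modeled in a few lines of Python. This is an illustrative sketch only; activationCount is the only name taken from BPM, and the supplier values are invented for the example.

```python
# Illustrative model of a complex gateway merge with an activation condition:
# each arriving token bumps activationCount; once the expression
# (here: activationCount >= 2) is true, the merge fires and -- with
# "Abort pending flows" selected -- the remaining branches are cancelled.
def complex_merge(arrivals, total_branches, condition):
    activation_count = 0
    for _token in arrivals:
        activation_count += 1
        if condition(activation_count):
            aborted = total_branches - activation_count
            return activation_count, aborted
    return activation_count, 0     # condition never met: wait for all tokens

fired_after, aborted = complex_merge(
    ["Supplier A quotation", "Supplier C quotation"],  # tokens that arrived
    total_branches=3,
    condition=lambda n: n >= 2,    # "any two out of three" vote
)
print(fired_after, aborted)  # → 2 1 : fires after 2 tokens, 1 pending flow aborted
```

Setting the condition to require all tokens would reduce this to the plain parallel merge from the previous article.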
Let's see how you can use the complex activity in a demo process. I will base my demo on the process I
created in part three of my five-part series, Oracle BPM 12c Gateways (Part 3 of 5): Parallel Gateway.
Let's assume that you are implementing an order process and that at some point in your process you want to request quotations from three different suppliers. Your process shouldn't wait for a response from all three suppliers; it is sufficient that you receive quotations from just two. Once you have received quotations from any two out of the three suppliers your process should continue (ideally picking the lowest quotation, but I will not be implementing this part).
So let's start with creating a BPM application with a default BPM project (named both application and
project "ComplexGatewayDemo") having an empty composite.
I created next an asynchronous BPMN process and named it OrderProcess (without any arguments).
As already mentioned, in this demo I will be simulating an order process where I need to send three
supplier quotations. The process shouldn't wait for all three quotations but just any two. So I will use the
complex gateway and please notice how JDeveloper will automatically use the "Inclusive and Complex"
pattern, adding an inclusive gateway split activity and a complex gateway merge activity.
Since I want to send a request to all three suppliers I will change the complex gateway pattern to
"Parallel and Complex". You can do so by right-clicking on the inclusive split activity and choosing from
the context menu "Change Gateway configuration to -> Parallel and Complex".
To simulate a supplier's request for quotation I used a human task component. I created a single human
task for all three suppliers accepting all the defaults and just changing the human task title for each case.
I created three user tasks on the process creating default sequence flows from the parallel split activity
to all three user tasks and from each user task a default activity to the complex gateway merge activity.
You should notice that there is a warning on the complex gateway merge activity that the expression from node is empty. If you go to the implementation details of the complex gateway activity you will see that you can define either a simple or an XPath expression to control, by selecting the "Abort pending flows" check box, when the process should continue even if not all tokens have reached the complex gateway.
There is a special predefined variable in a complex gateway called "activationCount" that returns the number of tokens that have arrived at the merge gateway. You can use this with other custom data objects to form complex expressions.
In my demo, I only care for any two quotations so I will just use the activationCount variable to count
the number of tokens that have reached the complex gateway. If two or more tokens have reached the
complex gateway merge activity then my process shouldn't wait for the third token but instead continue
with its execution.
For simplicity reasons I changed the human task assignment type from "Lane Participants" to "Names
and expressions" and assigned the "weblogic" user as the new user participant.
Let's test our process. Since we didn't specify any arguments you can directly invoke your process. If you
open the flow trace you will see that three human tasks have been created and are pending.
Go to the BPM workspace, login as weblogic (assuming that you have assigned the tasks to weblogic)
and you should see three tasks. Submit one of them and go back to the flow trace. You should see that
one human task has been completed and the other two are still pending.
Go back to the BPM workspace, submit one of the remaining two tasks and, once the task is submitted, click on the refresh icon. You should see that the third task gets removed from your inbox. This is because of the expression we have defined on the complex gateway (to abort all pending flows if that expression was evaluated to true).
If you go back to the flow trace you should see that all three human tasks have been completed and the
process ended.
Even though only two tokens reached the complex gateway, it was enough to signal the process to continue with its execution.
Download sample application: Complex Gateway
Event-based Gateway
My last article on gateways is on the event-based gateway, another type of gateway supported by Oracle BPM 12c to provide divergence in processes.
The event-based gateway is conceptually very similar to the exclusive gateway in the sense that we can have various outgoing sequence flows but only one branch is followed. The operational difference, though, is that, as its name suggests, the event-based gateway uses events rather than data-specific conditions for defining the branching conditions and decisions.
An event-based gateway can consist of multiple events; however, the first event that occurs will determine the execution path that will be followed.
Using the order process as an example, once an order is received and validated it needs to be processed.
Assuming that we have multiple warehouses, the order will be processed by the warehouse that can
process all inventory items ordered and responds first. The process will wait until one of the warehouses
responds to the request for processing the order. However, the process cannot wait indefinitely.
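The first-event-wins semantics, including the timeout branch, can be sketched in Python. This is an illustrative model only; the 30-second timer and the warehouse event name anticipate the demo below, and the queue is a stand-in for the BPM event infrastructure.

```python
# Illustrative model of an event-based gateway: the token parks and waits;
# the FIRST event to arrive (a warehouse reply, or the timer) picks the branch.
import queue
import threading

events = queue.Queue()

def warehouse_b_replies():
    events.put("warehouseB")       # stand-in for a catch message event

threading.Thread(target=warehouse_b_replies).start()

try:
    branch = events.get(timeout=30)   # timer catch event: 30-second limit
except queue.Empty:
    branch = "timeout"                # no warehouse replied in time

print(branch)  # → warehouseB
```

Once one branch fires, the other event subscriptions are dropped, which mirrors the exclusive (one path only) character of this gateway.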
So let's see how we can implement the above scenario using the event-based gateway.
Let's start with creating the basic BPM application and BPM project (named both application and project
"EventBasedGatewayDemo") and selecting "Composite with BPMN Process" in step 3.
This will bring up the "BPMN 2.0 Process Wizard" where you are prompted to specify a process name
and the service type. Give your process a name, for example OrderProcess and select "Asynchronous
Service" as the service type.
In step 3 of the BPMN 2.0 Process Wizard you are prompted to specify the input and output process
arguments. A typical order definition would consist of an order id, a collection of order items, the order
total, customer details and bill and ship address details. For simplicity reasons my demo process will only
consist of the order id (of type int) and the order total (of type decimal) as input arguments and a status
argument of type string as output.
The event-based gateway is composed of the event-based split gateway and two or more target events.
These events can be either message catch events, timer catch events or receive tasks. Please note that
you cannot mix message catch events and receive tasks within the same event-based gateway.
Select and drop the "Event Based" gateway on the default sequence flow between the "Start" and "End" activities. You should notice that JDeveloper has automatically added three activities: an event-based split gateway activity, a catch message activity and a timer catch activity.
There is an error on the Timer activity because it doesn't have an outgoing default sequence flow. Give
the event based gateway split activity and the catch message activity some meaningful names and add a
new catch message activity just below the default created catch message activity and define default
sequence flows for all activities.
Next set the timer activity to wait for 30 seconds for a reply from the warehouses and define an
implementation on the two catch message activities. Again for simplicity and demo reasons I have used
the "Define Interface" message exchange type and created only a single argument, order id of type int.
The implementation of both catch message activities are exactly the same, the only difference is the
operation name.
So we have a process that expects some events to arrive from the two warehouses while it is being executed. This means that the main process, which in our case is the order process, needs a way to correlate these intermediate events with itself. To do that we need to create a correlation property (an attribute used to correlate the main process with the intermediate events) and a correlation key (which defines the properties to use in the correlation and consists of one or more correlation properties).
I will use the order id to correlate the events that will be flowing from the warehouses into the main order process; therefore I have created a correlation property named "orderId" of type int and a correlation key named "ck_orderId" with the orderId correlation property selected.
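Conceptually, a correlation set behaves like a lookup table from the correlation key value to the waiting process instance. A minimal Python model of that idea (illustrative names only, not the BPM engine's implementation):

```python
# Illustrative model of correlation: "Initiates" registers the key value for
# an instance; a catch event with the same key value finds that instance.
class Correlation:
    def __init__(self):
        self.instances = {}        # orderId -> waiting process instance

    def initiate(self, order_id, instance):
        # Message start activity with "Initiates" selected.
        self.instances[order_id] = instance

    def deliver(self, order_id, event):
        # Catch message activity: route the event to the matching instance.
        self.instances[order_id].append(event)

correlation = Correlation()
order_1001 = []                    # stand-in for a process instance's event log
correlation.initiate(1001, order_1001)
correlation.deliver(1001, "warehouseB reply")
print(order_1001)  # → ['warehouseB reply']
```

If no instance had initiated the key, the engine would have no target for the event, which is why the start activity must initiate and the catch events must not.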
We now need to instruct the process and intermediary events to use the orderId correlation property we created. The main process will be responsible for initiating the correlation and matching events coming in from the warehouses with the main process. Therefore, on the message start activity, under "Implementation" select "Correlations", then select the orderId correlation property and map it to the orderId process input argument. Ensure that you have selected the "Initiates" option to instruct the process to correlate an instance of the order process with the supplied order id.
The same correlation definitions need to be applied on the two catch message activities. Please ensure
that the "Initiates" option is not selected since this will be done upon the instantiation of the process.
Deploy your process and run a test instance using the "start" operation. If you inspect the audit trail you
should see that the instance is waiting at the event-based split gateway activity.
Run a new test instance of your order process but this time use one of the warehouse operations (for
example, use the "warehouseB" operation). If you inspect the audit trail you should notice that the
order is now completed and that the process followed the second branch.
The biggest challenge in modeling and implementing processes is knowing the component set. I hope that by the end of this five-part series you have a clearer view of gateways and how to use each gateway to define your control and deviation points within your processes.
Antonis Antoniou
Antonis Antoniou is a Technical Director working for Oracle Platinum Partner
eProseed. He is a Fusion Middleware Expert, a specialist in the areas of
Enterprise 2.0, Business Process Management and Service Oriented
Architecture and a certified professional on Oracle Application Grid, Oracle
WebCenter Portal, Oracle WebCenter Content, ADF, Oracle BPM and Oracle
SOA. Antonis has long-time experience as a developer, coach, trainer and
architect and has led multiple complex projects on Oracle Fusion
Middleware across Europe and the Middle East and across various industries
(telecom, financial services, public sector). Antonis started his professional
career at Deloitte where he worked for 8 years, reaching the level of senior manager in charge of the
technology integration service line before joining eProseed. Antonis is an avid technology evangelist and
a regular speaker at various Oracle conferences and events.
Contact:
Blog
Call for content SOA Magazine & Service Technology Magazine & OTN
We want to publish your SOA & BPM content in the Service Technology Magazine, the SOA Magazine and OTN! Do you write about SOA and BPM best practices, code samples, reference cases, governance, mobile integration or cloud integration? Let us know; we are very keen to publish your articles! Please send them to:
Service Technology Magazine
OTN
SOA Magazine
2. Payload Size
It can often be simpler at the time of BPM process design to have one large payload schema that
includes all elements for every possible interaction within the lifetime of an instance, and pass this
everywhere within the instance, including to human tasks and their UIs.
HOWEVER The cost of this, both at runtime and in terms of the number and size of database rows, can
be large. The whole payload must be written to the SOAINFRA database at dehydration points within the
lifetime of a process instance & in-between these dehydration points, data objects associated with this
payload are held in memory.
BUT Appropriate design of the payload schema (flatter & simpler) can reduce the size considerably.
The optimal solution would be to pass only key values in the payload and retrieve detail values as-and-when
needed inside the process; however, this can lead to over-complicating the process design with
technical services. A sensible balance is always the best approach.
SEE XML_DOCUMENT Table Growth
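As a sketch of the "keys only" approach described above (element and type names here are hypothetical, not from the article), a flat payload schema might carry little more than identifiers, with detail values retrieved by a service call only when a step actually needs them:

```xml
<!-- Hypothetical key-only process payload: identifiers only, no detail data -->
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            targetNamespace="http://example.com/order"
            xmlns="http://example.com/order"
            elementFormDefault="qualified">
  <xsd:element name="orderProcessPayload">
    <xsd:complexType>
      <xsd:sequence>
        <!-- keys used to look up detail on demand -->
        <xsd:element name="orderId"    type="xsd:string"/>
        <xsd:element name="customerId" type="xsd:string"/>
        <!-- minimal state the process itself tracks -->
        <xsd:element name="status"     type="xsd:string"/>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>
```

Only this small document is then dehydrated and held in memory between dehydration points; full order details stay in the system of record.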
3. Partitioning / Purging
BPM audits heavily; this can be extremely useful for business insight.
HOWEVER The SOAINFRA database growth can be larger than expected.
BUT Partitioning & purging are critical to limiting database growth. Test purging thoroughly as part of a
normal stress/load test cycle. Determine whether loop purge outside of the online window is
sufficient; if not, consider also using parallel purge during quiet periods of the online day.
Partitioning is a good option in most cases, in 11g SOAINFRA must be partitioned post-installation but in
12c it is an installation option.
SEE SOA 11g Database Growth Management Strategy Paper & SOA Partitioning
4. Negative Testing
SOA Suite provides a comprehensive fault policy framework & BPM has inbuilt fault-handling
constructs, allowing the vast majority of technical and business exceptions to be handled gracefully.
HOWEVER Failure to properly negative test potential exceptions, individually & in bulk, can lead to
inadequate operational guidelines & faults occurring in production which can be hard to recover from.
BUT Ensure that thorough negative testing happens in a like-live pre-production environment. Use this
testing as a basis for building a robust fault-policy framework. Involve operations staff in this testing
process in order to build an appropriate Operations Guide.
SEE Fault Policy Framework
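As a hedged sketch of such a fault-policy framework (the policy id, the fault chosen, and the retry counts and intervals are illustrative, not prescriptive), a SOA Suite fault-policies.xml might retry a remote fault a few times and then escalate to human intervention:

```xml
<faultPolicies xmlns="http://schemas.oracle.com/bpel/faultpolicy">
  <faultPolicy version="2.0.1" id="DefaultPolicy">
    <Conditions>
      <!-- catch remote (network/endpoint) faults -->
      <faultName xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
                 name="bpelx:remoteFault">
        <condition>
          <action ref="ora-retry"/>
        </condition>
      </faultName>
    </Conditions>
    <Actions>
      <!-- retry 3 times with exponential backoff, then escalate -->
      <Action id="ora-retry">
        <retry>
          <retryCount>3</retryCount>
          <retryInterval>5</retryInterval>
          <exponentialBackoff/>
          <retryFailureAction ref="ora-human-intervention"/>
        </retry>
      </Action>
      <!-- leave the faulted instance for an operator in Enterprise Manager -->
      <Action id="ora-human-intervention">
        <humanIntervention/>
      </Action>
    </Actions>
  </faultPolicy>
</faultPolicies>
```

Negative tests in the like-live environment then exercise exactly these paths, so the Operations Guide can document what operators will actually see.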
8. Process Versioning
BPM is highly flexible in the management of process versions and the in-flight instances that run on
them. Old and new revisions can run in parallel in a co-existence strategy. Small fixes to a revision can
have the associated in-flight instances patched automatically. New revisions of a process can have
some or all in-flight instances migrated from an older revision.
HOWEVER Not all changes to the design of a process will allow automatic patching or migration of in-flight instances.
BUT It is possible to design around these limitations, and also possible in some cases to force
deployment and/or use Alter Flow post-deployment to massage in-flight instances to the appropriate
activity with the correct instance data.
SEE Restrictions on patching of in-flight instances (11g) & Restrictions on migration of in-flight
instances (11g) & Alter Flow (11g) & Instance Patching & Instance Patching Revisited
Mark Foster
Mark Foster is a Consulting Solution Architect in the FMW Architects
Team. He is focused on BPM & ACM and is widely acknowledged as
one of the go-to experts in these products having helped customers
worldwide, from architecture reviews to escalations. He has over 25
years' experience in the IT industry and over 10 years as an Integration
& Solutions Architect. Prior to Oracle, Mark worked for Sun
Microsystems & SeeBeyond in very similar roles, and before this Mark
worked as an Integration Architect at Royal Bank of Scotland /
NatWest Bank. Outside of work Mark is a keen runner & triathlete
having competed at distances up to IronMan. Mark lives in France,
just south of Strasbourg at the foot of the Vosges mountains.
Contact:
Blog
Adaptive Case Management is ultimately about allowing knowledge workers to work the way that they
want to work and to provide them with the tools and information they
need to do so effectively.
As Surendra Reddy points out in his foreword:
Imagine a fully integrated ACM system layered into the value stream of
an enterprise. The customer support team is able to focus on customer
needs, with easy access to the entire company's repertoire of
knowledge, similar cases, information, and expertise, as if it were a
service. To truly accommodate customers, companies must vest real
power and authority in the people and systems that interact directly
with customers, at the edge of the organization and beyond. ACM
augments business processes to deliver true data-driven process
infrastructure entering enterprises into the age of intelligent machines
and intelligent processes. ACM empowers the knowledge worker to collaborate, derive new insights,
and fine-tune the way of doing business by placing customers right at the center where they belong, to
drive innovation and organizational efficiencies across the global enterprise.
ACM also helps organizations focus on improving or optimizing the line of interaction where our people
and systems come into direct contact with customers. It's a whole different thing; a new way of doing
business that enables organizations to literally become one living-breathing entity via collaboration and
adaptive data-driven biological-like operating systems. ACM is not just another acronym or business fad.
ACM is the process, strategy, framework, and set of tools that enables this evolution and maturity.
ACM, in my opinion, is the future blueprint for the way of doing business.
Thriving on Adaptability describes the work of managers, decision makers, executives, doctors, lawyers,
campaign managers, emergency responders, strategists, and many others who have to think for a living.
These are people who figure out what needs to be done, at the same time that they do it.
In award-winning case studies covering industries as diverse as law enforcement, transportation,
insurance, banking, state services, and healthcare, you will find instructive examples for how to
transform your own organization.
This important book follows the ground-breaking best-sellers, Empowering Knowledge Workers, Taming
the Unpredictable, How Knowledge Workers Get Things Done, and Mastering the Unpredictable and
provides important papers by thought-leaders in this field, together with practical examples, detailed
ACM case studies and product reviews. Get the book here.
40% governance
36% service marketplace
Gartner concludes: "By 2015, CSBs will represent the single-largest category of growth in cloud
computing, moving from a sub-$1 billion market in 2010 to a composite market counted in the hundreds
of billions of dollars." Currently SaaS and IaaS are the largest categories in cloud computing. As the
market potential is huge, more and more leading cloud providers offer a complete platform. For example,
Salesforce.com started with CRM services, added Force.com as a PaaS platform and offers Salesforce Identity,
a cloud service intermediation. When a market opportunity becomes interesting and large enough, large
providers like Google, Salesforce, Microsoft, IBM or Oracle will offer services. These providers
have the market power to offer complete and proprietary cloud economies.
An interesting development is the joint partnerships between leading cloud vendors. Oracle and
Microsoft announced a partnership in June 2013. As part of this partnership, cloud consumers can
deploy Oracle databases and Oracle WebLogic middleware on the Microsoft Azure cloud. Salesforce and
Oracle plan to standardize and integrate their SaaS cloud services; Salesforce CRM should be integrated
with Oracle HCM and Financials. These new partnerships will reduce the need for CSBs. Cloud
consumers might profit from better integrated solutions and better interoperability between cloud
providers. Alternatively, these large vendors might enable cloud brokerage between their cloud
solutions out of the box. Additionally, more and more IT companies offer cloud service brokers out of the
box. For example, Oracle launched the Cloud Adapters, pre-built cloud service intermediation
brokers between solutions like Salesforce.com, RightNow and on-premises applications like SAP R/3.
Eventually these integration services will also become available in the Oracle Cloud Integration Service
planned for mid-2015.
Challenges for cloud service brokers
Brokering between several cloud providers might increase the overall SLA. For example, a
CSB brokers two credit check services from different cloud providers. If one provider fails, the other
cloud provider might still be available; the CSB re-routes to this provider's service and the cloud consumer can still
get credit check results. On the other hand, if the CSB fails, none of the brokered cloud services are available to the
cloud consumer, so the CSB itself might be a single point of failure. The cloud consumer might
decrease this risk by using several CSBs for the same cloud services, but this might increase the complexity
significantly. Imagine the use of several CRM solutions integrated with several e-mail solutions, all
synchronized and managed by several CSBs.
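To put rough, purely illustrative numbers on this trade-off (the availability figures below are assumptions, not from the article):

```latex
P(\text{credit check up}) = 1 - (1 - 0.99)^2 = 0.9999
\qquad
P(\text{end-to-end up}) \le P(\text{CSB up}) \times 0.9999 = 0.999 \times 0.9999 \approx 0.9989
```

Two redundant 99%-available providers lift the credit check function to 99.99% availability, yet a single 99.9%-available CSB caps the end-to-end figure near its own availability, which is exactly why the broker becomes the single point of failure.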
Examining the cloud broker market opportunity & role of IP
Cloud brokers need to offer their consumers added value. Intellectual property (IP) is the foundation for a
CSB. In today's model, integration between systems is often delivered on a project basis by a system
integrator (SI). These SIs can leverage this knowledge and re-use it to build and become a CSB. Cloud
providers support SIs with a marketing platform to promote their CSBs. For example, Oracle gives its partners
the opportunity to promote re-usable assets like CSBs within the Oracle Cloud Marketplace.
Cloud broker IaaS
An emerging area of CSB is the brokerage of IaaS. Solutions deployed on a defined stack like Red Hat
OpenStack can be shifted between private and different public clouds. Red Hat offers a technology
called CloudForms to manage the stack and broker the workload. Other vendors like Oracle offer similar
technology, e.g. Nimbula. Cloud consumers get the freedom to choose the IaaS provider.
References
Source Defining Cloud Services Brokerage: Taking Intermediation to the Next Level
Source: http://www.oracle.com/us/corporate/press/1964592
Source: http://www.oracle.com/us/corporate/press/1964798
Jürgen Kress
As a middleware expert Jürgen works at Oracle EMEA Alliances and Channels,
responsible for Oracle's EMEA fusion middleware partner business. He is the
founder of the Oracle SOA & BPM and the WebLogic Partner Communities
and the global Oracle Partner Advisory Councils. With more than 5000
members from all over the world the Middleware Partner Community is the
most successful and active community at Oracle. Jürgen manages the
community with monthly newsletters, webcasts and conferences. He hosts
his annual Fusion Middleware Partner Community Forums and the Fusion
Middleware Summer Camps, where more than 200 partners get product
updates, roadmap insights and hands-on training, supplemented by many web 2.0 tools like Twitter,
discussion forums, online communities, blogs and wikis. For the SOA & Cloud Symposium by Thomas Erl,
Jürgen is a member of the steering board. He is also a frequent speaker at conferences like the SOA &
BPM Integration Days, JAX, UKOUG, OUGN and OOP.
Contact:
Blog
ABNF: a meta-language often used as a definition language for rules associated with document exchange.
http://en.wikipedia.org/wiki/Augmented_Backus%E2%80%93Naur_Form
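For illustration (this example is ours, not from the glossary), a minimal ABNF grammar for a date in YYYY-MM-DD form, using the ABNF core rule DIGIT:

```
date  = year "-" month "-" day   ; e.g. 2015-01-31
year  = 4DIGIT
month = 2DIGIT
day   = 2DIGIT
```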