
Introduction

Oracle Application Development Framework Business Components (ADF-BC) are typically
used to connect a database as a data source to a front-end such as JavaServer Faces (JSF). This
is useful when developing 3-tier applications against a database, but in the XTP (extreme
transaction processing) world it's all about throughput performance and linear scaling, and this
is where Oracle Coherence offers significant advantages over a database.

This article describes how to integrate Oracle ADF-BC and Coherence Data Grid to achieve
maximum throughput for XTP usage.

Laying the technical foundation


Before diving into the implementation of the glue code, let's review the main building-block
technologies that will be involved.

Oracle Application Development Framework Business Components (ADF-BC)

ADF-BC is an Object-Relational (OR) mapping framework that provides the developer with
visual aids to connect, filter, and update data from a database. This is accomplished through three
layered component types [1]:

• Entity object
An entity object represents a row in a database table and simplifies data modification by
handling all Data Manipulation Language (DML) operations. It can encapsulate business
logic for the row to ensure that business rules are consistently enforced. A given entity
object is associated with others to reflect relationships in the underlying database schema
in order to create a layer of business domain objects to reuse in multiple applications.

• View object
A view object represents a SQL query and simplifies working with the results of that
query. The full power of the familiar SQL language is used to join, project, filter, sort,
and aggregate data into the exact "shape" required by the end-user task at hand. This
includes the ability to link a view object with others to create master/detail hierarchies of
any complexity. When end users modify data in the user interface, your view objects
collaborate with entity objects to consistently validate and save the changes.

• Application module
An application module is the transactional component that UI clients use to work with
application data. It defines an updateable data model and top-level procedures and
functions (called service methods) for a logical unit of work related to an end-user task.

Let's consider a simple, API-driven example to show how ADF-BC is used.


Creating and inserting a row:
// create the am from a configuration
ApplicationModule appModule =
    Configuration.createRootApplicationModule(appModName, config);
ViewObject employeeVO = appModule.findViewObject(empVoName);
// use the generic interface, untyped
Row employee = employeeVO.createRow();
employeeVO.insertRow(employee);
employee.setAttribute("Empno", new Number(10));
employee.setAttribute("Ename", "Clemens");
// commit the transaction
appModule.getTransaction().commit();

Finding a row through view criteria:


// create a view criteria based on the view object def
ViewCriteria searchForEmpCriteria =
    employeeVO.createViewCriteria();
// create the filter criteria
Row searchRow =
    searchForEmpCriteria.createViewCriteriaRow();
searchForEmpCriteria.insertRow(searchRow);
// set the values to search by something other than the key
searchRow.setAttribute("Ename", "Clemens");
// apply it
employeeVO.applyViewCriteria(searchForEmpCriteria);
// execute the query
employeeVO.executeQuery();
if (employeeVO.hasNext()) {
    Row foundEmp = employeeVO.next();
    System.out.println("employee: " + foundEmp.getAttribute(0));
}

Oracle Coherence Data Grid

Oracle Coherence Data Grid provides a very simple interface to a powerful, scalable data-cache
underneath, storing plain old Java objects (POJOs).

Inserting a new employee into the cache:


// the name of the cache
String cacheName = "EmployeeCache";
NamedCache employeeCache =
    CacheFactory.getCache(cacheName);
// create the pojo
Employee emp = new Employee();
emp.setEmpno(new Number(10));
emp.setName("Clemens");
// save the pojo in the cache with a unique key
employeeCache.put(emp.getEmpno(), emp);

Finding the row by using the Filter framework:


// create a filter for equals
EqualsFilter findByName =
    new EqualsFilter(
        "getName" /* the method to get the name */,
        "Clemens" /* the value to evaluate it against */
    );
// search the cache by filter
Set foundEmpSet = employeeCache.entrySet(findByName);
// the set returned contains Map.Entry elements
Map.Entry empEntry =
    (Map.Entry)foundEmpSet.iterator().next();
Object empKey = empEntry.getKey();
Employee foundEmp = (Employee)empEntry.getValue();

Notable differences between the two frameworks

At first glance the two frameworks look a lot alike. However, there are major differences, even
in this simple example.

One difference is that while ADF-BC is built around the concept of ViewRows (which are
untyped, generic value holders), Coherence only works with serializable POJOs.

Another difference is that given the proximity to the database, ADF-BC is closely aligned to
SQL, while retrieval of data from the data grid is done either by key or through the use of the
proprietary Filter framework.

These differences posed major challenges to the authors when writing the glue code, including
the conversion from a ViewRow to a POJO (and the dynamic creation of the POJO), and the
conversion of ViewCriteria to the Filter framework.
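The criteria-to-filter conversion can be sketched in plain Java. The sketch below is conceptual only: it uses java.util.function.Predicate as a stand-in for Coherence's EqualsFilter/AllFilter combination and a Map-based row as a stand-in for the cached POJO; all names are illustrative.

```java
import java.util.*;
import java.util.function.Predicate;

public class CriteriaToFilter {

    // Build one equality predicate per criteria attribute and AND them
    // together -- conceptually what the glue code does when it turns a
    // ViewCriteria row into an AllFilter wrapping EqualsFilters.
    static Predicate<Map<String, Object>> fromCriteria(Map<String, Object> criteria) {
        Predicate<Map<String, Object>> all = row -> true;
        for (Map.Entry<String, Object> c : criteria.entrySet()) {
            all = all.and(row -> Objects.equals(row.get(c.getKey()), c.getValue()));
        }
        return all;
    }

    public static void main(String[] args) {
        // two "cached rows" and a criteria that matches only one of them
        List<Map<String, Object>> cache = List.of(
            Map.of("Empno", 10, "Ename", "Clemens"),
            Map.of("Empno", 20, "Ename", "Steve"));
        Predicate<Map<String, Object>> filter =
            fromCriteria(Map.of("Ename", "Clemens"));
        long hits = cache.stream().filter(filter).count();
        System.out.println(hits); // prints 1
    }
}
```

The AND-composition mirrors the AllFilter semantics used later in this article: a row matches only if every criteria attribute matches.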

Working with (dynamic) class structures at runtime

Since the integration should be dynamic, and since object types (classes) are unknown at compile
time, Java reflection is used to read and write attribute values on the instance objects that are
stored in and retrieved from the cache.

// create and fill it the normal (static) way
Employee emp = new Employee();
emp.setEmpNo(10);
Class empClass = Employee.class;
// get the method through reflection
Method methodToGetEmpno =
    empClass.getMethod(
        "getEmpNo" /* the name of the method */,
        new Class[] {} /* parameters if any */
    );
Object empno =
    methodToGetEmpno.invoke(
        emp /* the instance object */,
        null /* no params */
    );

This will help to convert non-serializable ViewRows into serializable POJOs — and vice-versa
— at runtime.
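The reverse direction, writing values into a POJO, works the same way by looking up the setter and invoking it. Below is a minimal, self-contained sketch; the Employee bean here is a hypothetical stand-in for the generated class:

```java
import java.lang.reflect.Method;

public class ReflectiveWrite {

    // A plain static bean standing in for the dynamically generated POJO.
    public static class Employee {
        private Integer empNo;
        public Integer getEmpNo() { return empNo; }
        public void setEmpNo(Integer empNo) { this.empNo = empNo; }
    }

    public static void main(String[] args) throws Exception {
        Employee emp = new Employee();
        // look up the setter by name and parameter type, as when filling
        // a POJO from ViewRow attribute values at runtime
        Method setter = Employee.class.getMethod("setEmpNo", Integer.class);
        setter.invoke(emp, 10);
        System.out.println(emp.getEmpNo()); // prints 10
    }
}
```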

Creating dynamic classes at runtime


Last but not least, ViewRows are not serializable and therefore can't be stored on the data grid.
To overcome this, we turned to ASM to generate dynamic classes that represent an arbitrary row
as a POJO, and used sample code by Edson Tirelli of the Drools team to generate those classes
as needed [2].
// Create a class definition for
// test.example.EmployeePojo that implements
// java.io.Serializable
ClassDefinition newEmployeePojoDefinition =
    new ClassDefinition(
        "test.example.EmployeePojo" /* the name */,
        "java.lang.Object" /* the super class */,
        new String[] {"java.io.Serializable"} /* interfaces */
    );
// add a new field
FieldDefinition empnoDef =
    new FieldDefinition(
        "empno" /* the name of the field */,
        "java.lang.Integer" /* the type */
    );
// add the new field definition
newEmployeePojoDefinition.addField(empnoDef);

// create a new ClassBuilder
ClassBuilder builder = new ClassBuilder();
Class dynamicEmployee =
    builder.buildAndLoadClass(
        newEmployeePojoDefinition
    );
// get a real instance to use
Object empInstance = dynamicEmployee.newInstance();

This simpler interface allowed us to use publicly available assets rather than re-invent the wheel.

Extending ADF-BC to use Coherence rather than the Database

Steve Muench came up with a sample that illustrates the use of a Web Service to provide/save
data outside of ADF-BC. 3

That sample provided a good foundation for understanding which methods must be overridden
and extended in order to use Coherence behind the scenes.

For simplicity's sake, only insert and query are shown; the other methods follow the same
scheme with little deviation.
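For illustration, the remaining DML cases can be sketched against a plain java.util.Map, which works as a stand-in because Coherence's NamedCache implements the Map interface. The operation codes and the handleDml helper below are hypothetical names invented for this sketch; the real implementation reacts to the operation flag that ADF passes into doDML():

```java
import java.util.HashMap;
import java.util.Map;

public class DmlSketch {

    // Stand-in operation codes; ADF passes a comparable operation
    // flag into EntityImpl.doDML().
    static final int INSERT = 0, UPDATE = 1, DELETE = 2;

    // Insert and update both become put() on the entity key;
    // delete becomes remove(). The Map stands in for the NamedCache.
    static void handleDml(int op, Map<Object, Object> cache, Object key, Object pojo) {
        switch (op) {
            case INSERT:
            case UPDATE: cache.put(key, pojo); break;
            case DELETE: cache.remove(key);   break;
        }
    }

    public static void main(String[] args) {
        Map<Object, Object> cache = new HashMap<>();
        handleDml(INSERT, cache, 10, "Clemens");  // stored under key 10
        handleDml(DELETE, cache, 10, null);       // removed again
        System.out.println(cache.isEmpty()); // prints true
    }
}
```

Because the cache is keyed by the entity key, update and insert collapse into the same put() call, which is why the remaining methods deviate so little from the insert case shown next.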

Implementing the CoherenceEntityImpl class to store data

In order to store data on the grid during the commit lifecycle of the ApplicationModule,
oracle.jbo.server.EntityImpl.doDML(int operation, TransactionEvent e) was overridden.

The basic flow of the insert case is as follows:

1. Use the NamedCache API to access the shared cache.


2. Create and load the definition of the POJO based on the entity definition, and store it in
the classloader used by the cache accessor. This is required for de/serialization of the
dynamic class.

3. Create a new instance of the dynamic class and populate it with data.

4. Store it on the cache and release the cache accessor.

Retrieving / creating the cache accessor and its classloader is fairly simple:
// retrieve / create a cache based on the name of the
// EntityDefinition, attaching a postfix ('-cache')
NamedCache entityCache = CacheFactory.getCache(
    getEntityDef().getName() + CACHE_POSTFIX);
// retrieve the classloader
ClassLoader loader =
    entityCache.getCacheService().getContextClassLoader();

The next step is to create and load the POJO based on the entity definition:
// create and load the pojo
try {
    loader = new Utils().createDynamicPojosAndClassLoader(
        new EntityDefImpl[] {
            getEntityDef() /* the utils class creates the pojo */
        },
        loader /* the current cache classloader */
    );
} catch (Exception eCreatePojo) {
    throw new JboException("[CoherenceEO] doDML: " +
        eCreatePojo.getMessage());
}
Class pojoClass =
    loader.loadClass(
        // the full entity definition name (model.eo.SomeEO)
        getEntityDef().getFullName() + "_Bean"
    );

The createDynamicPojosAndClassLoader() method checks to see if the class already exists, and if
necessary creates it on the fly.

First, check if the class is there:


// load the class - if it's not there - it's brand new and
// we create it on the fly.
String className = eDef.getFullName() + POJO_POSTFIX;
try {
    origLoader.loadClass(className);
} catch (ClassNotFoundException eLoad) {
    // create a new pojo in bytes from the entity def
    byte[] pojoClassDef = createPojoClassFromEODef(eDef);
    // load it in there
    ((DynamicByteClassLoader)origLoader).
        loadClassFromBytes(pojoClassDef, 0);
    Diagnostic.println("[ByteClassLoader] loaded class into " +
        "loader: " + className);
}

If the class is not found, create it on the fly based on the definition:
ClassBuilder builder = new ClassBuilder();
ClassDefinition classDef =
    new ClassDefinition(
        pDef.getFullName() + POJO_POSTFIX,
        "java.lang.Object" /* the super class */,
        new String[] {"java.io.Serializable"} /* interfaces */
    );
AttributeDef[] allDefs = pDef.getAttributeDefs();
// for each attribute the helpers create fields and accessors
for (int iAtts = 0; iAtts < allDefs.length; iAtts++) {
    AttributeDef single = allDefs[iAtts];
    Diagnostic.println(" -> Converting: " + single.getName() +
        " type: " + single.getJavaType());
    if ("oracle.jbo.RowIterator".equals(
        single.getJavaType().getName()))
    {
        Diagnostic.println(" -> Found rowiterator accessor, cont.");
        continue;
    }
    FieldDefinition newField =
        new FieldDefinition(single.getName().toLowerCase(),
            single.getJavaType().getName());
    classDef.addField(newField);
}
// creates the class and the accessors
Class dynamicPojo = builder.buildClass(classDef);

Next, we get the values from the internal entity attributes and fill them through reflection into the
dynamic bean:
// get all attribute defs
AttributeDef[] allDefs = this.getEntityDef().getAttributeDefs();
for (int iAtts = 0; iAtts < allDefs.length; iAtts++) {
    AttributeDef single = allDefs[iAtts];
    if ("oracle.jbo.RowIterator".equals(
        single.getJavaType().getName()))
    {
        continue;
    }
    try {
        // get the value from the EO we are in
        Object valFromEo =
            getAttributeInternal(single.getIndex());
        // use reflection to find the method on the pojo
        Method m = newObject.getClass().
            getMethod("set" + single.getName(),
                new Class[] {single.getJavaType()});
        // set the value
        m.invoke(newObject, new Object[] {valFromEo});
    } catch (Exception eFillAttributes) {
        throw new JboException("Could not update pojo: " +
            eFillAttributes.getMessage());
    }
}

Finally, store the new POJO, with the entity-key as its primary key:
entityCache.put(getKey(), pojo);
Diagnostic.println("Stored entity pojo with key: " + getKey());
entityCache.release();

Implementing the CoherenceViewObjectImpl class to retrieve data from the grid


Now that data can be transparently stored on the grid, it's time to implement the data retrieval
through the ViewObject.

Every time the framework needs to retrieve rows, it will call
oracle.jbo.server.ViewObjectImpl.executeQueryForCollection(). Therefore, this is the place to
put the retrieval logic:
List filterList = new ArrayList();
Set foundRows = null;
// get the currently set view criteria
ViewCriteria vc = getViewCriteria();
Row vcr = vc.first();
// get all attributes and check which ones are filled
for (AttributeDef attr : getAttributeDefs()) {
    Object s = vcr.getAttribute(attr.getName());
    if (s != null && !"".equals(s)) {
        // construct an EqualsFilter
        EqualsFilter filter = new EqualsFilter("get" + attr.getName(), s);
        // add it to the list
        filterList.add(filter);
    }
}
if (filterList.size() > 0) {
    // create the final filter, enclosing the
    // single filters with an AllFilter
    Filter[] finalEqualsFilterArray =
        (EqualsFilter[])filterList.toArray(new EqualsFilter[] {});
    AllFilter filter = new AllFilter(finalEqualsFilterArray);
    foundRows = cache.entrySet(filter);
} else {
    foundRows = cache.entrySet();
}

Once the rows are found they can be stored together with the resultset on the QueryCollection:
ViewObjectImpl.setUserDataForCollection(qc, foundRows);

The next step for the framework is to create rows for its internal result set cache. This is done in
oracle.jbo.server.ViewObjectImpl.createRowFromResultSet().

The flow of the implementation is as follows:

1. Get the next POJO from the set returned from the cache.

2. Run through the ViewObject attributes and populate the underlying fields with data.
Return the row when done.
// create a new row in the collection
ViewRowImpl r = super.createNewRowForCollection(qc);
AttributeDefImpl[] allDefs = (AttributeDefImpl[])getAttributeDefs();
// get the next entry (this map entry comes from coherence and
// contains the object as value)
Map.Entry entry = (Map.Entry)foundRowsIterator.next();
// loop through all attrs and fill them from the right entity usage
Object dynamicPojoFromCache = entry.getValue();
Class dynamicPojoClass = dynamicPojoFromCache.getClass();
for (int iAtts = 0; iAtts < allDefs.length; iAtts++) {
    AttributeDefImpl single = allDefs[iAtts];
    try {
        Method m = dynamicPojoClass.getMethod(
            "get" + single.getName(),
            new Class[] {}
        );
        Object result =
            m.invoke(dynamicPojoFromCache, null);
        // populate the attributes
        super.populateAttributesForRow(r, iAtts, result);
    } catch (Exception e) {
        Diagnostic.println("createRowEx: " + e.getMessage());
    }
}
Diagnostic.println("Returning newly created row: " + r.getKey());
return r;

Conclusion
This article illustrates the advantages of using a high-performance, in-memory data grid for
commonly used lookup data as an alternative to the database. Preliminary tests conducted
during the coding for this article indicated a 50% increase in retrieval speed with Coherence
versus the database.

However, there are challenges. One such challenge is the integration of ADF Business
Components with a queryable object cache based on serializable Plain Old Java Objects (POJOs).
ADF Business Components are built around the ViewObject and ViewRow concepts, while a
queryable object cache requires object conversion, generation, and other advanced techniques.
Another challenge is posed by the conversion between SQL as a query language and a FilterChain
as provided by Coherence.

The use of publicly available extension points in Oracle's middleware products to expand their
breadth to include data grids showcases Oracle's vision and enthusiasm for the use of cutting-edge
technology, and is part of a larger initiative within the middleware group.

References

1. Declarative Data Access and Validation with ADF Business Components

2. Drools and Dynamic Class Generation

3. Entity and ViewObject based on Web Service
