Red Mavericks articles

To break the rules you must first master them

ADF Naming Conventions – Part II

Hi all,

In my previous post, ADF Naming Conventions – Part I, I focused my attention on:

  • Application & Project Naming
  • Package Naming
  • Business Components Naming

Today I will talk about Model and ViewController project naming.

 

Model Naming

For new ADF applications we are asked to set the names of the Model and ViewController projects. The Model project should follow this naming convention:

<PROJECT_NAME> + <MODULE_NAME> + Model

Example: RMKMyAdfLibModel

 

For the package structure it should be configured as follows:

<DOMAIN_NAME> + . + <PROJECT_NAME> + . + <MODULE_NAME> + . + model

Example: red.mavericks.rmk.my.adf.lib.model

 

As you can see, the package structure follows the same structure defined in my previous post, plus “model”.

View Controller Naming

In ViewController projects there is a wider range of names to define, since there are multiple features we can take advantage of. For that reason, we divided this topic into sub-topics.

 

Project Name

 

The name for the project should be defined as follows:

<PROJECT_NAME> + <MODULE_NAME> + Controller

Example: RMKMyAdfLibController

By using the “Controller” suffix we are able to immediately identify the project’s type and purpose.

 

Project Default Package Structure

 

The package structure for the ViewController project should be defined as follows:

<DOMAIN_NAME> + . + <PROJECT_NAME> + . + <MODULE_NAME> + . + view

Example: red.mavericks.rmk.my.adf.lib.view

As you can see, the package structure follows the same structure defined in my previous post, plus “view”.

 

Images, CSS and JavaScript Directories

 

Images, CSS and JavaScript directories should be defined right under the “Web Content” folder. The folders should have the following names:

 

  • Images: resources/images
  • CSS files: resources/css
  • JavaScript files: resources/js

 

Example:

ImagesCssJavascript

 

Inside the “web.xml” file of the ViewController project, set the following mappings:

<servlet-mapping>  
   <servlet-name>resources</servlet-name>  
   <url-pattern>resources/images/*</url-pattern>  
</servlet-mapping>  
<servlet-mapping>  
   <servlet-name>resources</servlet-name>  
   <url-pattern>resources/css/*</url-pattern>  
</servlet-mapping>  
<servlet-mapping>  
   <servlet-name>resources</servlet-name>  
   <url-pattern>resources/js/*</url-pattern>  
</servlet-mapping>

 

Bounded Task Flows, JSFF, JSPX Directories

 

Reusable bounded task flows can be published in ADF Libraries and consumed by other ADF applications. For this use case it is important, and in fact mandatory, that all bounded task flows have unique names; otherwise ADF has no way to distinguish between task flows with the same name.

Before creating bounded task flows you should set up a folder structure under the “WEB-INF” folder. After this folder structure is created you can start creating your bounded task flows.

Folder structure should be created according to the following convention:

 

 <DOMAIN_NAME> + <PROJECT_NAME> + <MODULE_NAME> + view + <BUSINESS_AREA>

Example:

taskFlowsFolderStructure

JSF, JSFF and JSPX files must be saved in the same folder as the task flow.

Page definitions will be automatically generated with the same package structure under the Application Sources folder. Example: red.mavericks.rmk.my.adf.lib.view.financial or red.mavericks.rmk.my.adf.lib.view.retail

Managed Beans for Task Flows should be created under the Application Sources folder with the following package structure:

<DOMAIN_NAME> + <PROJECT_NAME> + <MODULE_NAME> + <BUSINESS_AREA>

Example:

taskFlowsManagedBeans

 

We decided to create Managed Beans outside the “view” package so as not to mix page definitions and Java classes.

 

In my next post I will focus my attention on:

  • Task flows, templates, and more regarding View Controller projects
  • JAR, WAR, EAR files

 

Don’t miss my next post 🙂

Cheers,

Pedro Gabriel

@PedrohnGabriel

 

This is a cross post with LinkConsulting. For more Oracle Middleware related posts, please visit http://www.linkconsulting.com/oracle

Post Header photo by Lefteris Heretakis

ADF Naming Conventions - Part I

ADF Naming Conventions – Part I

Hi all,

Today I’m focusing my attention on ADF naming conventions.

Besides this post I will write two more, in order to cover as many areas of this subject as possible. In the last post I will provide a PDF with all the information covered in this series of posts.

Motivation

During ADF application development we may encounter many challenges. One of them is implementing a naming convention to be used by all developers involved in the project.

Each developer has his own background and his own ideas about how things should be implemented. We want them to have freedom of thought in order to reach the best approaches, but what we really don’t want is to have multiple ways of doing the same thing; otherwise we might face really difficult challenges in the future, namely around software maintenance and bug tracing.

Also, the developer roster may change during project development, and newcomers need proper training. If we follow conventions we will have shorter training periods, and new developers will be brought up to speed more quickly while familiarizing themselves with the application.

After the application is deployed in the production environment, we face a new challenge, Maintenance and Support. Big headaches usually appear right there, and they can be even bigger if we don’t follow these important naming conventions in our applications’ code.

I have found some information here about this topic, but we needed more, and we needed to adapt it to our projects, so we decided to define our own ADF Naming Conventions, to be used organization-wide on our ADF projects.

In this post I will share my experience and our ADF naming convention rules regarding the following topics:

  • Application & Project Naming
  • Package Naming
  • Business Components Naming

Abbreviations

Consider the following terms used in this series of posts:

  • PROJECT_NAME – Customer or project’s short name. Try to use only the characters needed to describe your customer/project so that everyone can understand it. Example: RMK (Red Mavericks)
  • MODULE_NAME – A name that describes the target of the project (module). Example: MyAdfLib
  • DOMAIN_NAME – Internet domain name. Example: red.mavericks
  • BUSINESS_AREA – Business area of the feature (retail, financial, etc.). Example: retail/financial

Application & Project Naming

For Applications and Projects we defined the following naming rules:

Application: <PROJECT_NAME> + <MODULE_NAME> + App

Example: RMKMyAdfLibApp

Project: <PROJECT_NAME> + <MODULE_NAME>

Example: RMKMyAdfLib

IMPORTANT NOTE: For both previous naming rules, decide if they should be in upper case, lower case or camel case. Once you do, stick to it!

Package Naming

In the wizard for creating a new application, you are prompted to set up the “Application Package Prefix“. This package prefix will define the root package for the projects contained in this new application.

The default “Application Package Prefix” for new applications should be:

<DOMAIN_NAME> + . + <PROJECT_NAME>

Example: red.mavericks.rmk

 

For each new project, you will inherit the application’s root package structure. Nevertheless, you should configure it to have a name distinct from other projects; with this, you are able to identify your modules/libraries. The package structure for your project should follow this pattern:

<DOMAIN_NAME> + . + <PROJECT_NAME> + . + <MODULE_NAME>

Example: red.mavericks.rmk.my.adf.lib

Business Components Naming

For this topic, we will provide our proposed package structure and file naming rules for Business Components.

IMPORTANT NOTE: Files should be named in camel case.

Package Structure

For Business Components projects, we should gather Entity Objects, View Objects, View Links and Associations under the following packages:

  • Entity Associations: <DOMAIN_NAME> + . + <PROJECT_NAME> + . + <MODULE_NAME> + . + adfc.entity.associations
  • Entity Objects: <DOMAIN_NAME> + . + <PROJECT_NAME> + . + <MODULE_NAME> + . + adfc.entity.objects
  • View Links: <DOMAIN_NAME> + . + <PROJECT_NAME> + . + <MODULE_NAME> + . + adfc.view.links
  • View Objects: <DOMAIN_NAME> + . + <PROJECT_NAME> + . + <MODULE_NAME> + . + adfc.view.objects
  • Model JPX File: <DOMAIN_NAME> + . + <PROJECT_NAME> + . + <MODULE_NAME> + . + adfc

Example:

  • Entity Associations: red.mavericks.rmk.my.adf.lib.adfc.entity.associations
  • Entity Objects: red.mavericks.rmk.my.adf.lib.adfc.entity.objects
  • View Links: red.mavericks.rmk.my.adf.lib.adfc.view.links
  • View Objects: red.mavericks.rmk.my.adf.lib.adfc.view.objects
  • Model’s JPX File: red.mavericks.rmk.my.adf.lib.adfc

File Naming

For each Business Component type, you should use the following suffixes:

  • Entity Associations – suffix Assoc (e.g. SomeNameAssoc)
  • Entity Objects – suffix EO (e.g. SomeNameEO)
  • View Links – suffix VL (e.g. SomeNameVL)
  • View Objects – suffix VO (e.g. SomeNameVO)
  • List Of Values – suffix LOV (e.g. SomeNameLOV)
  • Model JPX File – equal to the project’s name

View Link File Naming

View Link names should be self-explanatory so we can easily identify their purpose. Based on this, we defined the following pattern:

<ViewObjectSourceName> + <ViewObjectDestinationName> + VL

Example: EmployeeEmployeeBranchVL

Application Modules

You may have multiple “Application Modules” in your application. In this case we should be able to identify their purpose as well as their target business area. Based on this, we follow this pattern:

<PROJECT_NAME> + <MODULE_NAME> + AM

Example: RMKMyAdfLibAM

End of Part I – Next episodes…

In my next post I will talk about:

  • Model & View Controller Projects

Cheers,

Pedro Gabriel

@PedrohnGabriel

Post image by Tim Green

This is a cross post with Link Consulting, for which Pedro currently works. If you need any help regarding Oracle Middleware projects, visit http://www.linkconsulting.com and drop them a line

The Great Migration

ADF Applications – Migration from 12.1.3 to 12.2.1

Hi all and welcome to a new article on Red Mavericks!

This article presents some problems, and their respective solutions, encountered when migrating ADF applications from 12.1.3 to 12.2.1. In this case, we used an example application that consists of a framework with several ADF applications and dependencies between them. In the current version (12.1.3), the framework works without issues.

ADF 12.2.1 comes with several corrections of customer-reported and internal (unpublished) errors, along with new features that can be useful when developing applications:

  • Responsive Layout (Masonry Layout, af:matchMediaBehavior)
  • Visualization components (NBox, new charts and enhancements)
  • ADF Business Components (creating RESTful Web Services with Application Modules)

For more information about the changes in ADF 12.2.1, click here to visit the Oracle documentation.

To clarify any potentially misleading concepts, I will assume the following meanings in this document:

  • Application: Group of ADF applications
  • ADF application: Group of ADF projects
  • ADF project: Single ADF project

 

JDeveloper wizard migration

In order to migrate to the new ADF version, we used JDeveloper 12.2.1 to open all ADF projects and followed the steps in the migration wizard:

MigrationsStep1

MigrationsStep2

MigrationsStep3

Repeat these steps for every ADF project in your application.

In this process, we completed the initial migration required by JDeveloper successfully in all projects.

Advice – Debug Problems

In my migration case, I had several problems with ADF projects that contain declarative components.

In complex ADF applications with multiple dependencies it is hard to know what is going on. So my advice is to start with simple ADF projects, with fewer dependencies, and then move on to the more complex ones. If there are declarative components that you can insert into a test ADF application to try them out, do it, and see whether the test runs or a problem shows up.

Validate, Update and Add ADF projects missing dependencies

For each ADF project you migrate to the new version,  check the libraries in the ADF project properties:

  1. Right-click the ADF project and select “Project Properties…” (or double-click the ADF project)projectDependenciesStep1
  2. Check all the library tabs, including Facelets Tag Libraries, JSP Tag Libraries, and Libraries and Classpath:projectDependenciesStep3
  3. Sometimes JDeveloper doesn’t correctly migrate all libraries to the new versions. In this case, you need to remove the faulty libraries and add them manually.projectDependenciesStep4

Another aspect to take a quick look at is the ADF Model projects. Sometimes during the ADF project migration, JDeveloper generates ViewController files in the Model project. My advice is to delete these new files using the file explorer, because they can be a source of problems when trying to run your ADF application (refresh the application in JDeveloper after removing the files).

ADFModelStep1

ADFModelStep2

Also update the BPM, UCM and other libraries you have inside your ADF projects. To do this, copy the libraries you need from <oracle_home>/oracle_common/modules, where <oracle_home> is the home directory where your JDeveloper 12.2.1 was installed.

BPM_UCM_libs

 

Then you have to check and update (in case the library has a different name) any of these library dependencies inside your project properties.

Clean ADF projects before compiling and deploying

To make sure all the ADF projects will compile using the new configuration, I recommend the following steps:

  1. For each ADF project, go to the “Build” menu and select “Clean All”

projectClean

  2. You can also delete any existing compiled code (.jar, .war, …) using the file explorer.

jarsWarsClean

Classes not found errors

  1. application.ModuleException: java.lang.ClassNotFoundException: oracle.adfinternal.view.faces.facelets.rich.IncludeHandler and oracle.adf.view.faces.bi.webapp.GraphServlet

If you are stuck with this exception and have already done all the recommended steps in “Validate, Update and Add ADF projects missing dependencies”, try to add a new empty page to your adfc-config.xml (you can remove it afterwards). This will automatically add some libraries to the project properties. In my case it added “ADF DVT Core Runtime” and “Oracle BI Graph”, and corrected some other dependencies.

 

  2. application.ModuleException: java.lang.ClassNotFoundException: oracle.dss.rules.ComponentTypeConverter

This is typically associated with ADF projects that contain declarative components. I usually leave this one for the end, because the problem is more complex to track down.

The first thing I like to do is clean up the dependencies and add the ones that are used when we create a new declarative components ADF project. To guide you, try to remove the dependencies that aren’t in the following list and add the ones that your project is missing:

  1. In project->properties->Libraries and Classpath:declarativeComponents1
  2. In project->properties->JSP Tag Libraries:declarativeComponents2
  3. In project->properties->Facelets Tag Libraries:declarativeComponents3

If this doesn’t solve your problem, let’s try another approach. Before this, make a quick backup of your project so that if anything goes wrong you can roll back without losing your application.

First, close the declarative components ADF application in JDeveloper. You need to do this because you will change the .jpr ADF project file.

Search your project file for the “<hash>” tag followed by “<value n="displayName" v="WEB-INF/lib"/>”. In my declarative component, I solved the problem by removing a lot of the .jars that are referenced there:

    1. Not working:
<hash>
     <value n="displayName" v="WEB-INF/lib"/>
     <hash n="filters">
        <list n="rules">
           <hash>
              <value n="pattern" v="xml-apis-ext.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="prefuse.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="dvt-utils.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="dvt-shared-js.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="dvt-facesbindings.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="dvt-databindings.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="dvt-basemaps.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="batik-xml.jar"/>
              <value n="type" v="1"/>
           </hash>
          (…)
           <hash>
              <value n="pattern" v="jarUtilities.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="com.bea.core.apache.commons.collections_1.0.0.0_3-2.jar"/>
              <value n="type" v="1"/>
           </hash>
           <hash>
              <value n="pattern" v="**"/>
              <value n="type" v="0"/>
           </hash>
        </list>
     </hash>
     <value n="internalName" v="libraries"/>
     <list n="selectedLibraries">
        <string v="ADF Faces Runtime 11"/>
        <string v="ADF DVT Faces Runtime"/>
     </list>
     <value n="targetWithinJar" v="WEB-INF/lib"/>
     <value n="type" v="3"/>
  </hash>
    2. Working:
<hash>
     <value n="displayName" v="WEB-INF/lib"/>
     <hash n="filters">
        <list n="rules">
           <hash>
              <value n="pattern" v="**"/>
              <value n="type" v="0"/>
           </hash>
        </list>
     </hash>
     <value n="internalName" v="libraries"/>
     <value n="targetWithinJar" v="WEB-INF/lib"/>
     <value n="type" v="3"/>
     <list n="unselectedLibraries">
        <string v="JarUtilities.jar"/>
        <string v="AdfLibResourceBundles.jar"/>
     </list>
 </hash>

Compile and Run

Now you can compile all ADF projects and deploy them to the WebLogic server. Don’t forget to check, for each ADF project, that the deployment ends successfully.

projectCompileStatus

 

Conclusion

Sometimes an ADF application migration is not a simple button press, so when you are thinking of migrating your ADF application, take into consideration that you may spend some time getting all your code working in the new ADF version. However, I hope that reading this document helps you and speeds up the resolution of any problems you face in this process.

Keep checking Red Mavericks for additional tips on Oracle Middleware technology.

Cheers,
Pedro Curto

(this article is also published at Link Consulting’s website)

Post image by Steve Corey

Error in JDeveloper 12.2.1 - OSB project with XQuery changes to SOA project

Error in JDeveloper 12.2.1 – OSB project with XQuery changes to SOA project

Hi all,

After a quick dive into JDeveloper 12.2.1, I came across a rather interesting “feature”: OSB projects started to get converted to SOA projects! Being clueless about the reason for this strange phenomenon, I conducted a quick investigation and concluded it was due to adding XQuery transformations to the project. Below are my findings and how to work around this problem, if you have the misfortune of coming across it as well.

Symptoms

While working on an OSB project in JDeveloper 12.2.1, after adding an XQuery transformation, JDeveloper changes it to a SOA project. The project structure will change and an error will be produced by JDeveloper when completing the XQuery creation. This will also occur when importing an OSB project that contains XQuery transformations into JDeveloper 12.2.1. Basically, as long as there is an XQuery file in the project, it will be converted to a SOA project. When trying to deploy the project, JDeveloper will open the wizard for SOA project deployment, which will fail, because what we have is in fact an OSB project.

Adding an XQuery file to the project

Below is an example of what happens when adding an XQuery file to an existing OSB Project.

Sample OSB project before adding XQuery

 

Sample OSB project after adding XQuery

Meanwhile, in the JDeveloper log, the below error can be seen as well:

Uncaught exception
java.lang.NullPointerException
at oracle.tip.tools.ide.fabric.addin.SCAProjectConfigurator.configSCAProject(SCAProjectConfigurator.java:216)
at oracle.tip.tools.ide.fabric.addin.SCAProjectConfigurator.configSCAProject(SCAProjectConfigurator.java:131)
at oracle.tip.tools.ide.fabric.addin.SCAProjectConfigurator.configSCAProject(SCAProjectConfigurator.java:126)
at oracle.tip.tools.ide.fabric.addin.wizard.CompositeCreator.createComposite(CompositeCreator.java:284)
at oracle.tip.tools.ide.fabric.addin.wizard.CompositeCreator.createEmptyComposite(CompositeCreator.java:202)
at oracle.tip.tools.ide.fabric.addin.SCATechnologyListener$2.run(SCATechnologyListener.java:123)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:311)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:756)
at java.awt.EventQueue.access$500(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.awt.EventQueue$3.run(EventQueue.java:703)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:75)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:726)
at oracle.javatools.internal.ui.EventQueueWrapper._dispatchEvent(EventQueueWrapper.java:169)
at oracle.javatools.internal.ui.EventQueueWrapper.dispatchEvent(EventQueueWrapper.java:151)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)

Trying to deploy the project

Let’s say we are brave enough to ignore what happened and go ahead and try to deploy the project. We will be presented with the Deployment Wizard for SOA projects, which will fail if we go all the way through, since what we have in hand is an OSB project. Below you can see highlighted an indication that JDeveloper is mistakenly treating the project as a SOA project.

Trying to deploy affected OSB project

Solution

There is a patch to fix this problem. To retrieve it, go to Oracle Support and refer to Doc ID 2090174.1. Be sure to shut down JDeveloper before applying the patch, and also to recreate the Default Domain in your installation (if you happen to be using the Integrated Server). As for the projects already affected, the cleanest approach is to recreate them and import the old files into them. As an alternative, it is also possible to delete all SOA-related artefacts from the projects and manually edit the .jpr files to remove the SOA line shown below (the <string v="SOA"/> entry in the technology scope).

<hash n="oracle.ide.model.TechnologyScopeConfiguration">
  <list n="technologyScope">
    <string v="SOA"/>
    <string v="ServiceBusTechnology"/>
    <string v="XML"/>
    <string v="XML_SCHEMA"/>
  </list>
</hash>

Applies to

JDeveloper 12.2.1.0.0, when creating XQuery files in OSB projects
JDeveloper 12.2.1.0.0 when importing OSB projects containing XQuery files

And there you go. I hope this helps you to tackle this odd problem.

 

Carlos Pona (@carlospona84)

Carlos Pona is a SOA Technical Leader at Link Consulting.

Post header image by lees bus pics

You no longer need to be a genius to do pattern matching

Tapping into life – An Introduction to Stream Analytics

Dear Readers,

Welcome to a new stream (no pun intended) of Red Mavericks articles. This time, we’ll be doing an introduction to Oracle’s new Stream Analytics.

We’ll be guiding you through this new, and very cool, product, showing what it is and what it can do to leverage the largely untapped resource that is event stream analysis. In fact, streams are everywhere and are becoming more and more open and accessible. If you “wiretap” them, listen to them and understand the behavioral patterns, you can build extremely valuable applications that will help you deliver more to your customers.

It’s a whole new ball game. I hope you find this interesting.

What is Oracle Stream Analytics?

Oracle Stream Analytics (previously Oracle Stream Explorer) is, in fact, an application builder platform focused on applications that process events coming from the most varied systems, internal or external to the organization, thus enabling business insight and deriving relevant data from these events.

Stream Analytics - Login Screen

Stream Analytics – Welcome to Fast Data Business Insight

It works using an Event Processing Engine to perform Fast Data Analysis over a large number of events that typically appear in a given timeframe.

It also provides a run-time platform that will allow you to run and manage the applications you built.

It’s not a new Oracle Event Processor. It uses OEP as the underlying Event Processing Engine (you can also use Apache Spark as the processing engine, if you prefer; more on this in other articles).

The real power of Oracle Stream Analytics is, curiously, in its UI. As an application builder, it went to great lengths to keep the UI really easy to use. The result is, in my view, very well achieved, with enough simplicity that business users, provided they have a bit of technical knowledge, can actually build applications on their own or with little help from IT.

Concepts and Ideas

But to be able to build these applications, you must first understand the concepts and rules behind them. We’ll explain these by mixing real-life concepts and their representations on the platform (Oracle Stream Analytics). Let’s start with the main concepts…

Event

An Event is the representation of something that happened at a particular time. This is most important, as events must always be correlated with a notion of time, of when they happened.

Shape

A Shape is the data structure representation of an event. It describes the actual information structure of an event, to ensure at least a minimum of data coherence between events that represent the same occurrence type. If you have a bit of technical knowledge, try to think of the shapes as the XSD of the event.

Events that represent the same type of occurrence should use the same Shape. Events that represent different types of occurrences should use different shapes.

Stream

A Stream is a sequence of data elements (in this case Events) made available over time. These data elements have shapes that must be known beforehand to allow proper processing. The easiest way to visualize a Stream is to think of a food processing plant conveyor belt transporting vegetables from one point to another inside the plant.

A Bell Pepper Stream

A Bell Pepper “Stream” – Photo by the US Department of Agriculture

As the vegetables go through the conveyor belt they will be made available at a given time at the output of the belt. This will be the point where the person or the system will collect the bell peppers and process them.

Source

A Source represents the system that is making a given stream available. Typically it represents a system that is producing its own data streams or “proxying” data streams from other systems. Stream Analytics will connect to Sources by making Connections to them.

Target

A Target is a channel to which Stream Analytics sends the result of the event processing work. A Target connects downstream to other systems and obeys a given Shape.

Exploration

An Exploration is Stream Analytics’ way to process events. It allows events to be filtered, combined and enriched with additional data, as well as allowing for event data manipulation and conversion when suited, thus producing new events which are the result of all this processing.

Explorations can use the output of other Explorations as their inputs, as well as Streams and Reference Data Tables (simply called References in Stream Analytics), which are used to enrich the Exploration outputs.

For instance, a Stream can contain the status of a given vending machine, identified by an internal vending machine ID, while its GPS coordinates are stored in a reference database table. This way, the vending machine doesn’t have to send the GPS coordinates every 5 seconds along with the status, as this information will not change frequently or by itself.

Pattern

A Pattern is, well… a pattern 🙂 a repetitive regularity that can be identified by some means.

Stream Analytics allows you to create new Explorations based on given patterns, such as trends over time, geospatial boundary checks, Top/Bottom N matches, etc., and, if there are matches, pass the results on to Targets.

Stream Analytics – Patterns Palette

Timeframe

A Timeframe defines the time window reference for a given Exploration’s event processing. Stream Analytics allows you to define two characteristics of the Timeframe:

  • Range – The universe of events that will be considered in the Exploration’s processing, for instance when using aggregate functions. In plain English, the range is used to limit the events considered when calculating averages, maximum values or event counts (e.g. number of events of type A happening in the last 30 minutes). As there can be too many events, it’s essential to have some kind of boundary within which the analysis makes sense.
    • If a sensor states that it is operating below a given threshold, it’s important to know whether it’s a sporadic event that happens once a year or something that has been happening every 2 minutes for the last hour.
  • Eval. Frequency – The frequency at which events are passed on to the Exploration. Sometimes it’s important to collect the data from the Exploration inputs not every millisecond, but at bigger intervals. This stipulates the cadence at which a given Exploration produces results (and thus pushes them to Targets).

Although some of these concepts may seem confusing and unclear, as we go through the next articles and use them, they’ll become second nature.

 

So that’s a wrap on this article.

In our next article, we’ll start building our example application. Be prepared to have some fun doing it. Until then…

Maverick (José Rodrigues)

Document metadata Set/Get

WebCenter Content User Interface – Get and Set Document’s Metadata

Hi all,

In my previous post about the WebCenter Content User Interface I wrote an overview of the available features of this new UI, how to start customizing it, and my experience and important milestones regarding installation. You can find my previous post here.

Today I’m going a little further, focusing on how to get and set a document’s current properties and metadata attributes using the API available in the “WccAdfCustomization” application.

In my last post I dragged and dropped a button onto the “docInfoTabs.jsff” page, which you can find inside the “WccAdfLibrary.jar” library of the ViewController project. This is how it looks:MyCustomButton

Now what we want is to get both the document’s properties and its metadata, and then set the comments for this same document when we click the button. To do this we need to perform the following steps:

  1. Open JDeveloper with the “Default Role”. Then go to the ViewController project and create the “MyCustomClass” Java class. Create an action event listener method to be used by our button.MyCustomClass
  2. Assign the “MyCustomClass” Java class to the managed beans of the “adfc-config.xml” file. From now on this class is available in the context of the “docInfoTabs.jsff” page.ManagedBean
  3. Go to your “docInfoTabs.jsff.xml” file and set the actionListener property as shown in the next image:ActionListener
  4. Our “docInfoTabs.jsff” page is invoked inside the “wccdoc.xml” bounded task flow. If you go to the Managed Beans tab you can see the available classes inside it. We will take advantage of the “DocInfoBean” class.WccdocTaskFlow
  5. Let’s return to our “MyCustomClass” Java class and insert the following code inside “myCustomButtonAction”.
package wcc.custom.view;

import java.io.Serializable;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

import javax.faces.component.UIComponent;
import javax.faces.component.UIViewRoot;
import javax.faces.context.FacesContext;
import javax.faces.event.ActionEvent;

import oracle.wcc.adf.model.Attachment;
import oracle.wcc.adf.model.Revision;
import oracle.wcc.adf.model.form.Field;
import oracle.wcc.adf.model.form.Form;
import oracle.wcc.adf.model.form.Item;
import oracle.wcc.adf.vc.DocInfoBean;
import oracle.wcc.adf.vc.DocumentOperationHelper;
import oracle.wcc.adf.vc.DynamicFormBean;

public class MyCustomClass {
    public MyCustomClass() {
        super();
    }
    
    public void myCustomButtonAction(javax.faces.event.ActionEvent p1) {
        
        //Getting document's properties
        
        //Getting the instance of 'DocInfoBean' class.
        DocInfoBean docInfoBean = (DocInfoBean)ADFUtils.getValueFromViewScope("wccDocInfoBean");
        //Getting the current revision of the selected document.
        Revision revision = docInfoBean.getRevision();
        System.out.println("Document Title:" + revision.getDDocTitle());
        System.out.println("Document Author: " + revision.getDDocAuthor());
        System.out.println("Document Last Modified Date: " + revision.getDReleaseDate());
        System.out.println("Document Status: " + revision.getLocalizedDStatus());
        System.out.println("Document Revision Number:" + revision.getDRevLabel());
        System.out.println("Document Comments: " + revision.getXComments());
        System.out.println("Document Profile: " + revision.getXIdcProfile());
        System.out.println("Document Type: " + revision.getDDocType());
        System.out.println("Document Format: " + revision.getDFormat());
        System.out.println("Document File Size: " + revision.getDFileSize());
        
        //Getting all metadata of the current revision of the selected document.
        DynamicFormBean dynamicFormBean = docInfoBean.getMetadataFormBean();
        Form form = dynamicFormBean.getForm();
        HashMap<String, Serializable> fieldValues = form.getFieldValues();
                
        Iterator iter = fieldValues.entrySet().iterator();
        while (iter.hasNext())
        {
            Map.Entry pair = (Map.Entry)iter.next();
            System.out.println(pair.getKey() + " = " + pair.getValue());
        }
        
        revision.setXComments("My Custom Comment");
        docInfoBean.setRevision(revision);
        //Set document's metadata for the current revision.
        DocumentOperationHelper.updateRevision(revision);
    }
}

6. The “getValueFromViewScope” method in my ADFUtils class looks like this:

//Requires: import oracle.adf.view.rich.context.AdfFacesContext;
public static Object getValueFromViewScope(String variable) {
      return AdfFacesContext.getCurrentInstance().getViewScope().get(variable);
}

7. Finally run your application and test the button.

 

Cheers,

Pedro Gabriel

@PedrohnGabriel

Post image by Cross Duck with small changes.

Webcenter UI Customization

WebCenter Content User Interface – Overview/Installation/Customization

Hi all,

Welcome to a new Red Mavericks article.

Today I’m writing about the WebCenter Content User Interface. I will provide an overview of this renovated interface, some considerations regarding my installation experience, and what we can customize.

This article refers to WebCenter Content 11.1.1.9. The 12c version of WebCenter is already available, but 11g is still widely used, including in most of our projects.

So, the Oracle WebCenter Content User Interface offers a revamped, more user-friendly interface, with sophisticated features such as:

  • Searching capabilities
  • Upload of multiple documents at once, each one with its own metadata form
  • Preview of documents with the related contents, metadata and revisions
  • Selection of document’s revisions via tabs
  • Tagging a document as Favorite
  • Perform actions over one or multiple documents in the list of documents

This new interface without a doubt improves the user experience and usability, as well as offering a simple way to store, secure, retrieve, and share any type of document.

I think this new user interface makes WebCenter Content more in tune with the reality of current Oracle web applications and brings it to a new level of experience.

The WebCenter Content User Interface was entirely redesigned using ADF and can be easily customized. From a developer’s point of view it is much easier to develop new features targeted at specific customer needs, since we just need to use ADF instead of Idoc Script.

Administration configuration remains accessible only in the default native WebCenter Content interface, but from my point of view this is not an issue, since we can think of that interface as a back office and the new one as the interface for end users.

 

Installation

In order to use the WebCenter Content User Interface you must already have WebCenter Content 11.1.1.8 or 11.1.1.9 installed.

Then you will need to perform a second installation for the new WebCenter Content User Interface, with both WebLogic Server and Oracle ADF. This second installation requires a separate domain.

You can perform this second installation in two ways:

  1. On a single host, i.e., on the same host where you previously installed the WebCenter Content domain.wcccui
  2. On completely separate machines.wccui2

In both cases this new domain will then connect to WebCenter Content Server. You can follow the installation instructions here.

During my installation I felt there was a lack of information regarding the MDS configuration for this new domain that will contain the WebCenter Content User Interface. Basically, before you can run the scripts mentioned in the installation instructions in the WebLogic shell to create the metadata partition (point 6 of Chapter 12.5), you will have to:

  1. Download the Repository Creation Utility (RCU) 11.1.1.8 via Oracle E-Delivery
  2. Run ./rcu located in rcuHome/bin folder
  3. When you reach “Step 3 of 7: Select Components” you should select:
    • “WebCenter Content”
      • If the database you are connecting to is 12c, you should unselect the option “Oracle Information Rights Management”, since it is not supported.
    • “Oracle Platform Security Services” inside “AS Common Schemas”

Customization

After a successful installation of this new domain you are ready to use it and customize it based on your customer requirements.

On the installation machine you can find the full ADF application for this new interface, at this location:

WCCUI_MW_HOME/oracle_common/webcenter/wccadf/WccAdfCustomization.zip

 

In order to use this application you will have to install JDeveloper version 11.1.2.4. You can find more details here.

Inside the “WccAdfCustomization” application you should select “Show Libraries” in the Application Navigator in order to have access to important JARs that you can take advantage of in your developments.

This application contains two projects:

  • Model – Gathers the data from the WebCenter Content Server. In this project you should turn your attention to the “WccAdfMdsIntegration.jar” library.WccAdfMdsIntegration
  • ViewController – This is the project where you will do interface customizations. Here you have the “WccAdfLibrary.jar” library, in which you can find all the pages of the WebCenter Content User Interface.

 

WccAdfLibrary WccAdfLibrary2

During your customization you cannot directly edit the pages from the “WccAdfLibrary.jar” library. You will have to create what Oracle calls a “Seeded Customization Layer”.

You can have as many seeded customization layers as you want.

 

So what really is a “Seeded Customization Layer”?

A “Seeded Customization Layer” is a container for one type of customization. In most cases one layer is sufficient, but if multiple layer values are defined, the customizations for those values are applied in the order in which they are defined.

For example, imagine that you want different customizations for each deployment environment or for different customers. The layer values you create will sit under the “customer” layer.

 

Where can I create a customization layer?

In order to create a customization layer you need to open the “adf-config.xml” file under “Application Resources”. Find the “customerCustomizationLayerValues” property and insert the name you want for your layer.CustomizationLayer

Then open the “CustomizationLayerValues.xml” file, also under “Application Resources”. Find the “customer” layer and set the same name you have given before in the “value” property.

CustomizationLayer2
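
For reference, the entry you are looking for in “CustomizationLayerValues.xml” looks roughly like the sketch below. This is only an illustrative sketch based on the generic MDS customization layer format; the attribute values (in particular the id-prefix attributes) are examples, and the exact content shipped with the “WccAdfCustomization” application may differ:

<cust-layers xmlns="http://xmlns.oracle.com/mds/dt">
   <!-- The "customer" layer referenced in adf-config.xml -->
   <cust-layer name="customer" id-prefix="c">
      <!-- The layer value we configured: "Demo" -->
      <cust-layer-value value="Demo" display-name="Demo" id-prefix="d"/>
   </cust-layer>
</cust-layers>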

 

How to customize a page?

To customize a page you need to open JDeveloper in the “Customization Developer” role.CustomizationDeveloper

 

After JDeveloper starts, you will notice that the “customer” customization layer is selected with the “Demo” value that we previously set in the configuration files. If you define more than one layer value, you can select the one on which you want to perform your customizations.CustomizationLayer3

Expand the “WccAdfLibrary.jar” library in the “ViewController” project and open the “docInfoTabs.jsff” page from the “wcc” package. As you will notice, you cannot directly edit the code, but you can drag and drop new controls or even use the “Property Inspector” to edit them.

For this example try to drag-and-drop a button as shown in the next image.

CustomizationLayer4

After you have inserted the new button, you will notice that a new file containing it is generated for you in the “ViewController” project. From now on you can edit the button’s properties in the new “docInfoTabs.jsff.xml” file.

CustomizationLayer5

The generated package structure identifies the “Seeded Customization Layer”.

 

How can I test my customizations?

Before you can start testing your application in the Integrated WebLogic Server, you need to set the IDC connection and admin user in the “connections.xml” file under “Application Resources”. This configuration specifies the default, local WebCenter Content Server connection.

In the next post we’ll explore how to use this same button to get the document’s current properties and metadata, change one of the metadata attributes, and save it using the API available in the “WccAdfCustomization” application.

connections

 

Cheers,

Pedro Gabriel

@PedrohnGabriel

 Image post by Chad Horwedel

Process Timers

Process Timers – Controlling the time in which your process executes

Hello everybody,

Following up on a series of questions about setting timers in the Oracle Community forums, I decided to write this article to guide their use and show how timers can be used to control process execution.

Let’s start!

The Use Case

We’ll begin by setting up the scenario in which we’ll have to control our process flow.

Imagine that you want a part of your process to execute immediately if the current time is between 08:00am and 04:00pm (16:00 hours for us Europeans), or to wait until 08:00am if it’s outside that interval.

It’s common to have this kind of control in parts of a process, for instance when you want to send SMS messages to your customers. You certainly don’t want to do it at 03:00am.

How will we do this?

We should use a Catch Timer event, of course, and XPATH’s DateTime functions to check the current time and to set the timer to wait for next morning’s 08:00.

The Catch Timer event has several ways to be configured (triggered at specific dates and times, on a specific schedule – every day at 10:28:00 (repeatable), or in a time cycle – every 2 minutes), but we’ll focus on the one where we configure the timer to wait for a specific time and date. More on the others perhaps in another article.

We’ll illustrate the use of timers with an example process. You can, of course, adapt it to your needs.

Defining the execution conditions 

So you start by defining a gateway that will split the execution between:

  • Immediate
  • Wait for 08:00am
    • This will have to be split into prior to midnight and after midnight, but for now we’ll consider the scenario with only two options.

So, you set the expression on the conditional flow that will do the immediate execution, leaving the condition that must wait for 08:00 as the unconditional (default) branch.

The expression should be something like this:

Timer Setting

xp20:hours-from-dateTime(xp20:current-dateTime()) >= 8  and xp20:hours-from-dateTime(xp20:current-dateTime()) <= 16

The function xp20:current-dateTime() gets the current Date and Time of when the decision is evaluated.

The function xp20:hours-from-dateTime(xs:dateTime) gets the ‘Hours’ integer from a dateTime object.

So you check if the current time is after 08:00am and before 04:00pm.

  • If it is, it follows the Green Light path, i.e. the immediate execution path.
  • If not, it will follow the Red Light path, and will wait on the timer for a green light (until 08:00am the next day, as per the requirements)
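
If it helps to reason about this check outside of BPM, here is a minimal, purely illustrative Java sketch of the same office-hours test (this is not part of the BPM project; the class and variable names are mine):

import java.time.LocalTime;

public class OfficeHoursCheck {

    public static void main(String[] args) {
        // Equivalent of xp20:hours-from-dateTime(xp20:current-dateTime())
        int hour = LocalTime.now().getHour();

        // Same check as the conditional flow: hour between 8 and 16
        boolean greenLight = (hour >= 8 && hour <= 16);

        System.out.println(greenLight ? "Green Light: execute immediately"
                                      : "Red Light: wait for 08:00am");
    }
}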

To understand the test process, check the process flow below.

Timer Test Process

So, only one more step to go: setting the timer to the next available 08:00am.

This is achieved by setting the Timer implementation first to Type=Time Date (red arrow) and then setting the appropriate XPATH expression (orange arrow)

Timer Type Setting

The XPATH expression is as follows:

xp20:add-dayTimeDuration-to-dateTime(xp20:current-date(), 'P01DT08H')

The add-dayTimeDuration-to-dateTime(xs:dateTime, durationString) function adds a date/time interval to a dateTime object.

The interval is set using the format ‘PyyYmmMddDThhHmmMssS‘.

The xp20:current-date() function returns the current date without the time associated, meaning it considers time = 00:00:00.

So, we’re stating that we want to add to the current date the amount of 01 day and 08 hours.
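
To make the date arithmetic concrete, here is a small, purely illustrative Java sketch (again, not part of the BPM project) of what this expression computes, using java.time:

import java.time.LocalDate;
import java.time.LocalDateTime;

public class TimerTargetExample {

    public static void main(String[] args) {
        // Equivalent of xp20:current-date(): today's date with time 00:00:00
        LocalDateTime startOfToday = LocalDate.now().atStartOfDay();

        // Equivalent of adding 'P01DT08H': 1 day and 8 hours
        LocalDateTime nextMorningEight = startOfToday.plusDays(1).plusHours(8);

        // If the token reaches the timer at, say, 22:30, this prints tomorrow at 08:00
        System.out.println("Timer will fire at: " + nextMorningEight);
    }
}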

Warning

You would think that this solves the issue, but not quite. It solves the issue if the process reaches the decision point before midnight. After midnight, xp20:current-date() already returns the new day, so you can’t add a whole new day plus another 8 hours, or the timer would fire a day later than intended.

So you should split your flow further to handle these two scenarios:

  • Wait occurs prior to midnight => XPATH expression interval = 'P01DT08H'
  • Wait occurs after midnight => XPATH expression interval = 'P00DT08H'

I’m pretty sure there are other ways to do it, but I decided to do it like this:

Timer Process Final

In which I set the XPATH for Time Date type of the other Timer (Red Light / After Midnight) as

xp20:add-dayTimeDuration-to-dateTime(xp20:current-date(), 'P00DT08H')

So this should solve the case.

I added the project file for this:

CommunityTimerDefinition

Cheers

Maverick (José Rodrigues)

Post Header image by Henrique Simplicio

ADF Listeners - Part III

ADF Application Event Listeners, QueryString Parameters reinjection between sessions – Part III

In ADF applications, as in any web application, you can make use of QueryString parameters to perform custom actions inside your application. But what happens to your QueryString parameters when the current session expires? In ADF they are lost between sessions. So, how can we reinject the QueryString parameters into the new session? You have two options:

  • You can manually call the application again with the same QueryString parameters. But what if your application is being used inside another one? This option probably won’t work.
  • You can try to catch the session timeout. But you need to catch it before the session actually expires, otherwise you lose the QueryString parameters. How do you do that?

The answer to the previous question lies in the ADF application event listeners. In my previous posts you can find details about ADF application event listeners. Take a look here.

In this post I will show you how to accomplish this task.

 

Solution

In order to gather the current session’s QueryString parameters and reinject them into the new session before the session timeout occurs, you need to create the following custom components:

  • Custom Phase Listener
  • Custom Filter

Inside the custom phase listener class we listen to the lifecycle phases of our application, gather the QueryString parameters and save them in a cookie, so we are able to access them later when the session timeout happens. The following code shows how to do it.

package view.listeners.session;

import java.util.Map;

import javax.faces.context.ExternalContext;
import javax.faces.context.FacesContext;
import javax.faces.event.PhaseEvent;
import javax.faces.event.PhaseId;
import javax.faces.event.PhaseListener;

import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class CustomPhaseListener implements PhaseListener{
    public CustomPhaseListener() {
        super();
    }

    public void afterPhase(PhaseEvent phaseEvent) {
        System.out.println("CustomPhaseListener");
    }

    public void beforePhase(PhaseEvent phaseEvent) {
        System.out.println("CustomPhaseListener");
        FacesContext vFacesContext = phaseEvent.getFacesContext();
        ExternalContext vExternalContext = vFacesContext.getExternalContext();
        HttpSession session = (HttpSession) vExternalContext.getSession(false);
        boolean newSession = (session == null) || (session.isNew());
        boolean postback = !vExternalContext.getRequestParameterMap().isEmpty();
        boolean timedout = postback && newSession;
        
        if (!timedout)
        {
            //Returns all QueryString parameters, but you are only interested in
            //your own parameters.
            Map<String, Object[]> queryStringMap = ((HttpServletRequest)vExternalContext.getRequest()).getParameterMap();
            
            //This utility class detects which QueryString parameters are
            //generated by ADF and which have been created by you.
            QueryStringUtils qsConstants = new QueryStringUtils();
            //Returns a string with all QueryString parameters created by you.
            String queryString = qsConstants.getStringQueryStringParameters(queryStringMap);
            
            if (queryString != null && !queryString.isEmpty() &&
                !vExternalContext.getRequestCookieMap().containsKey(CookieConstants.COOKIE_QUERY_STRING_NAME))
            {
                //Inserts the QueryString parameters in a COOKIE.
                HttpServletResponse vResponse = (HttpServletResponse)vExternalContext.getResponse();
                Cookie cookie = new Cookie(CookieConstants.COOKIE_QUERY_STRING_NAME, queryString);
                vResponse.addCookie(cookie);
            }
        }
    }

    public PhaseId getPhaseId() {
        return PhaseId.ANY_PHASE;
    }
}

You can find the code for the QueryStringUtils here:

package view.listeners.session;

import java.net.URLDecoder;
import java.net.URLEncoder;

import java.util.ArrayList;
import java.util.Hashtable;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;

public class QueryStringUtils {
    
    //Example of QueryString parameters that the application accepts.
    public static final String QUERY_STRING_1 = "_qs1";
    public static final String QUERY_STRING_2 = "_qs2";
    
    private List listOfQueryStringConstants;

    public QueryStringUtils(){
        listOfQueryStringConstants = new ArrayList();
        listOfQueryStringConstants.add(QUERY_STRING_1);
        listOfQueryStringConstants.add(QUERY_STRING_2);
    }
    
    public List getQueryStringConstantsNames() {
        return listOfQueryStringConstants;
    }
    
    public Hashtable getHashQueryStringParameters(Map<String, Object[]> queryStringParams) {
        Hashtable result = new Hashtable();
        
        for(Entry<String, Object[]> entry : queryStringParams.entrySet()) {
              if (getQueryStringConstantsNames().contains(entry.getKey())) 
              {
                  Object propertyValue = entry.getValue()[0];
                  if (propertyValue != null)
                  {
                      try
                      {
                          String propertyValueAux = URLDecoder.decode(propertyValue.toString(), "UTF-8");
                          result.put(entry.getKey(), propertyValueAux);
                      }
                      catch(Exception ex) {
                          ex.printStackTrace();
                      }
                  }    
              }
        }
        
        return result;
    }

    public String getStringQueryStringParameters(Map<String, Object[]> queryStringParams) {
        String result = "";
        int i = 0;
        
        for(Entry<String, Object[]> entry : queryStringParams.entrySet()) {
              if (getQueryStringConstantsNames().contains(entry.getKey())) 
              {
                  Object propertyValue = entry.getValue()[0];
                  if (propertyValue != null)
                  {
                      try
                      {
                          String propertyValueAux = URLEncoder.encode(propertyValue.toString(), "UTF-8");
                          if (i == 0)
                            result += entry.getKey() + "=" + propertyValueAux;
                          else
                            result += "&" + entry.getKey() + "=" + propertyValueAux;
                          i ++;
                      }
                      catch(Exception ex) {
                          ex.printStackTrace();
                      }
                  }    
              }
        }
        
        return result;
    }
}

In the class CookieConstants you have:

package view.listeners.session;

public class CookieConstants {
    
    public static final String COOKIE_QUERY_STRING_NAME = "QUERY_STRING_COOKIE_EXAMPLE";
}

In the custom filter we create a custom ServletRequest so we can insert the QueryString parameters into each new request. When the session timeout happens, the custom filter is invoked and the QueryString parameters saved in the cookie are read and inserted into the request of the newly created session. Here is the code for the custom filter:

package view.listeners.session;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;


public class CustomFilter implements Filter{
    
    private FilterConfig _filterConfig = null;
    
    public void init(FilterConfig filterConfig) {
        _filterConfig = filterConfig;
    }

    public void doFilter(ServletRequest servletRequest,
                         ServletResponse servletResponse,
                         FilterChain filterChain) throws java.io.IOException, 
                                                         javax.servlet.ServletException{
        CustomHttpServletRequestWrapper requestWrapper = new CustomHttpServletRequestWrapper((HttpServletRequest)servletRequest);
        filterChain.doFilter(requestWrapper, servletResponse);
    }

    public void destroy() {
        _filterConfig = null;
    }
}
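
For these classes to be picked up at runtime they also have to be registered. The snippets below are a minimal registration sketch, assuming the standard JSF and servlet descriptors and the package name used above; the filter name and URL pattern are only examples, and the filter’s position relative to the ADF filters already declared in web.xml may need to be adjusted for your application:

<!-- faces-config.xml: register the phase listener -->
<lifecycle>
   <phase-listener>view.listeners.session.CustomPhaseListener</phase-listener>
</lifecycle>

<!-- web.xml: register and map the custom filter -->
<filter>
   <filter-name>CustomFilter</filter-name>
   <filter-class>view.listeners.session.CustomFilter</filter-class>
</filter>
<filter-mapping>
   <filter-name>CustomFilter</filter-name>
   <url-pattern>/*</url-pattern>
</filter-mapping>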

In the custom ServletRequest we will manipulate the request data and reinject the QueryString parameters. The code for the custom ServletRequest is presented next:

package view.listeners.session;

import java.util.ArrayList;
import java.util.Enumeration;
import java.util.Hashtable;

import java.util.Map;

import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletRequestWrapper;

public class CustomHttpServletRequestWrapper extends HttpServletRequestWrapper{
    
    private Hashtable<String,String> queryStringParameters;
    
    public CustomHttpServletRequestWrapper(HttpServletRequest request) {
        super(request);
        setHashQueryStringParameters();
    }

    private String getCookieQueryStringParameters() {
        String result = null;
        Cookie[] cookies = super.getCookies();
        
        //getCookies() returns null when the request does not carry any cookies.
        if (cookies != null) {
            for(int i = 0; i < cookies.length; i++) {
                if (cookies[i].getName().equalsIgnoreCase(CookieConstants.COOKIE_QUERY_STRING_NAME)) {
                    result = cookies[i].getValue();
                }
            }
        }
        
        return result;
    }

    private void setHashQueryStringParameters() {
        queryStringParameters = new Hashtable<String, String>();
        String parameters = getCookieQueryStringParameters();
        
        if (parameters != null && !parameters.isEmpty()) 
        {
            String[] splitParameters = parameters.split("&");
            
            for (int j = 0; j < splitParameters.length; j++)
            {
                String[] keyValue = splitParameters[j].split("=");
                //Ignore malformed entries that have no value.
                if (keyValue.length == 2)
                    queryStringParameters.put(keyValue[0], keyValue[1]);
            }
        }
    }

    public ArrayList<String> getQueryStringParametersNames() {
        return new ArrayList<String>(queryStringParameters.keySet());
    }

    @Override
    public Map getParameterMap() {
        return super.getParameterMap();
    }

    @Override
    public Enumeration getParameterNames() {
        Enumeration enumeration = super.getParameterNames();
        
        if(queryStringParameters.size() != 0){
            //Custom Enumeration to set the default ADF QueryString parameters 
            //and our custom parameters.
            CustomEnumeration myEnumeration = new CustomEnumeration(enumeration, getQueryStringParametersNames());
            enumeration = myEnumeration.getEnumeration();
        }
        return enumeration;
    }

    @Override
    public String[] getParameterValues(String string) {
        String[] result = super.getParameterValues(string);
        
        if (queryStringParameters.size() != 0 && queryStringParameters.containsKey(string)){
            result = new String[1];
            result[0] = queryStringParameters.get(string);
        }
        
        return result;
    }

    @Override
    public String getParameter(String string) {
        String result = super.getParameter(string);
        
        if (queryStringParameters.containsKey(string))
            result = queryStringParameters.get(string);
        
        return result;
    }
}

The code for the CustomEnumeration class is presented next:

package view.listeners.session;

import java.util.ArrayList;
import java.util.Enumeration;

import weblogic.utils.enumerations.IteratorEnumerator;

public class CustomEnumeration {
    private Enumeration enumeration;
    private ArrayList<String> queryStringParametersNames;
    
    public CustomEnumeration(Enumeration enumeration, ArrayList<String> queryStringParametersNames) {
        this.enumeration = enumeration;
        this.queryStringParametersNames = queryStringParametersNames;
    }
    
    public Enumeration getEnumeration() {
        ArrayList<String> arrayList = new ArrayList<String>(); 
        while (enumeration.hasMoreElements()) {
            arrayList.add((String)enumeration.nextElement());
        }
        
        arrayList = addqueryStringParameters(arrayList);
        
        Enumeration result = new IteratorEnumerator(arrayList.iterator()); 
        return result;
    }
    
    private ArrayList<String> addqueryStringParameters(ArrayList<String> arrayList) {
        
        for (int i = 0; i < queryStringParametersNames.size(); i++) {
            String value = queryStringParametersNames.get(i);
            if (!arrayList.contains(value))
                arrayList.add(value);
        }
        return arrayList;
    }
}
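
Note that IteratorEnumerator is a WebLogic-internal utility class. If you prefer to avoid that dependency, the same Enumeration can be built with the standard JDK only. Here is a minimal alternative sketch of getEnumeration (my own variation, not the original code), using java.util.Collections:

    //Equivalent to the method above but without the WebLogic-specific IteratorEnumerator.
    public Enumeration<String> getEnumeration() {
        ArrayList<String> arrayList = new ArrayList<String>();
        while (enumeration.hasMoreElements()) {
            arrayList.add((String)enumeration.nextElement());
        }
        
        arrayList = addqueryStringParameters(arrayList);
        
        //Collections.enumeration wraps the list in a plain JDK Enumeration.
        return java.util.Collections.enumeration(arrayList);
    }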

Conclusion

By intercepting the application lifecycle phases and the page requests with a custom phase listener and a custom filter, we have been able to reinject QueryString parameters between sessions.

 

Cheers,

Pedro Gabriel

 

@PedrohnGabriel

Post original image by Fe Ilya


ADF Application Event Listeners – Part II

Hi all,

In my previous post ADF Application Event Listeners – Part I I focused on some of the ADF application event listeners you can listen to and how to do it. In this post I will show you examples of the data you can get from those event listeners and, in some cases, how to rewrite it.

 

HTTP Session Events

After an application session is created the sessionCreated method is triggered. From the HttpSessionEvent input parameter of that method you are able to access data from these classes:

  • servlet.http.HttpSession
  • servlet.internal.session.MemorySessionData
  • servlet.internal.session.MemorySessionContext
  • servlet.internal.WebAppServletContext

It is not recommended to make any changes to these internal classes; however, you are able to access them for your own purposes.

To get the classes listed above you just need the following:

public void sessionCreated(HttpSessionEvent httpSessionEvent) {
   HttpSession httpSession = httpSessionEvent.getSession();   

   MemorySessionData msd = (MemorySessionData)httpSessionEvent.getSource();
                
   MemorySessionContext msc = (MemorySessionContext)msd.getContext();

   WebAppServletContext appServletContext = msd.getWebAppServletContext();
}

 

The next table presents a summary of what you may find.

Class Method Description
HttpSession getId() Returns the unique identifier assigned to the created session.
getCreationTime() Returns the time when the session was created, measured in milliseconds.
getMaxInactiveInterval() Returns the maximum time interval, in seconds, that the servlet container will keep the session open between client accesses. If configured, the value will be the same as the “Session Timeout” property in the “web.xml” file.
getLastAccessedTime() Returns the last time the client sent a request associated with this session, measured in milliseconds.
invalidate() Invalidates this session, then unbinds any objects bound to it.
MemorySessionData getConcurrentRequestCount() Returns the number of concurrent requests for the application.
MemorySessionContext getCurrOpenSessionsCount() Returns the number of concurrent open sessions for the application.
getPersistentStoreType() The scope type where the session data is stored. This value will be the same as the “Store Type” property configured in the “weblogic-application.xml” file.
getTotalOpenSessionsCount() Returns the total number of open sessions for the application.
getWebAppServletContext() Returns the servlet context of the current application. We will look at this class in further detail.
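
To make the table more concrete, here is a minimal sketch (my own example, limited to logging) that reads some of these properties inside the sessionCreated method shown above:

public void sessionCreated(HttpSessionEvent httpSessionEvent) {
   HttpSession httpSession = httpSessionEvent.getSession();

   //Standard HttpSession data.
   System.out.println("Session id: " + httpSession.getId());
   System.out.println("Created at (ms): " + httpSession.getCreationTime());
   System.out.println("Timeout (s): " + httpSession.getMaxInactiveInterval());

   //WebLogic internal classes, accessed for read-only/monitoring purposes.
   MemorySessionData msd = (MemorySessionData)httpSessionEvent.getSource();
   MemorySessionContext msc = (MemorySessionContext)msd.getContext();
   System.out.println("Concurrent requests: " + msd.getConcurrentRequestCount());
   System.out.println("Open sessions: " + msc.getCurrOpenSessionsCount());
}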

Servlet Context Listener

The Servlet Context is initialized during the first access to the application after deployment. For subsequent accesses to the application the servlet context is reused, regardless of the new sessions that are created.

After the servlet context is initialized the contextInitialized method is triggered. From the ServletContextEvent input parameter of that method you are able to access data from these classes:

  • servlet.internal.WebAppServletContext
  • application.internal.ApplicationContextImpl
  • management.configuration.AppDeploymentMBean
  • servlet.internal.WebAppConfigManager
  • management.configuration.WebAppComponentMBeanImpl
  • management.runtime.ServletRuntimeMBean

It is not recommended to make any changes to these internal classes; however, you are able to access them for your own purposes.

To get the classes listed above you just need the following:

public void contextInitialized(ServletContextEvent servletContextEvent) {
   WebAppServletContext wasc = (WebAppServletContext)servletContextEvent.getSource();
        
   ApplicationContextImpl acimpl = (ApplicationContextImpl)wasc.getApplicationContext();
   
   AppDeploymentMBean appDeploymentMBean = acimpl.getAppDeploymentMBean();
        
   WebAppComponentMBeanImpl appComponentMBean = (WebAppComponentMBeanImpl)wasc.getMBean();
        
   ServletRuntimeMBean[] srmb = wasc.getServletRuntimeMBeans();
}

 

The next table presents a summary of what you may find.

Class Method Description
WebAppServletContext getAppDisplayName() Returns the application display name.
getMBean() Returns the WebAppComponentMBeanImpl class.
getServletRuntimeMBeans() Returns the ServletRuntimeMBean classes.
getClasspath() Returns the path for all JARs included in the application.
getDocroot() Returns the path for the application’s WAR file.
getMajorVersion() Returns the major version of the application.
getMinorVersion() Returns the minor version of the application.
getApplicationSecurityRealmName() Returns the security realm name of the logged-in user.
getApplicationParameters() Returns the application configuration parameters. These are further detailed next.
getAbsoluteSourcePath() Returns the path of the application’s installation.
getServer() Returns the name of the server where the application is deployed.
getServerInfo() Returns information about the server where the application is deployed. We will provide an example next.
ApplicationContextImpl getAuthRealmName() Returns the realm name of the logged-in user.
getDefaultEncoding() Returns the application’s default encoding.
getDispatchPolicy() Returns the class of work executed in WebLogic’s queues.
getMimeTypeDefault() Returns the application’s default MIME type.
AppDeploymentMBean getSessionCookieName() Returns the internal name of the session cookie.
WebAppConfigManager getSessionJDBCConnectionTimeoutSecs() Returns the JDBC connection timeout in seconds.
getSessionPersistentStoreDir() In my tests I have found the “session_db” value.
getSessionPersistentStoreTable() In my tests I have found the “wl_servlet_sessions” value.
WebAppComponentMBeanImpl getSessionPersistentStoreType() In my tests I have found the “memory” value.
getSessionTimeoutSecs() Returns the session timeout in seconds.
ServletRuntimeMBean getName() Returns the name of the servlet.
getType() Returns the type of the servlet.
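
As a quick illustration, here is a minimal sketch (my own example, logging only) that reads a few of the values listed above inside the contextInitialized method:

public void contextInitialized(ServletContextEvent servletContextEvent) {
   WebAppServletContext wasc = (WebAppServletContext)servletContextEvent.getSource();

   //A few of the properties listed in the table, logged for inspection only.
   System.out.println("Application: " + wasc.getAppDisplayName());
   System.out.println("Server: " + wasc.getServer());
   System.out.println("Server info: " + wasc.getServerInfo());

   WebAppComponentMBeanImpl appComponentMBean = (WebAppComponentMBeanImpl)wasc.getMBean();
   System.out.println("Session timeout (s): " + appComponentMBean.getSessionTimeoutSecs());

   //List the servlets registered for the application.
   for (ServletRuntimeMBean servletRuntime : wasc.getServletRuntimeMBeans()) {
      System.out.println(servletRuntime.getName() + " - " + servletRuntime.getType());
   }
}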

 

In the getServerInfo() method from the WebAppServletContext class you can find something like this:

“WebLogic Server 10.3.5.0 Fri Apr 1 20:20:06 PDT 2011 1398638 Oracle WebLogic Server Module Dependencies 10.3 Thu Mar 3 14:37:52 PST 2011 Oracle WebLogic Server on JRockit Virtual Edition Module Dependencies 10.3 Thu Feb 3 16:30:47 EST 2011”

 

In the getApplicationParameters() method from the ApplicationContextImpl class you can find for example the following parameters:

Parameter Definition
“weblogic.app.rmiGracePeriod” The amount of time, in seconds, that the work manager accepts and schedules RMI calls until there are no more RMI requests arriving within the RMI grace period during a graceful shutdown or a retirement.
“weblogic.app.ignoreSessions” Immediately places the application into Administration mode without waiting for current HTTP sessions to complete.
“weblogic.app.adminMode” Indicates that a running application should switch to Administration mode and accept only Administration requests via a configured Administration channel. If this option is not specified, the running application is stopped and cannot accept Administration or client requests until it is restarted.

 

In the ServletRuntimeMBean class you can find for example the following servlets:

Name Type
BIGRAPHSERVLET ServletRuntime
JspServlet ServletRuntime
Faces Servlet ServletRuntime
FileServlet ServletRuntime
resources ServletRuntime
adw ServletRuntime
WebServiceServlet ServletRuntime
MapProxyServlet ServletRuntime
BIGAUGESERVLET ServletRuntime
GatewayServlet ServletRuntime

Servlet Request Listener

The Servlet Request Listener is triggered whenever a new request is made to the server for the current application. In those cases the requestInitialized method is invoked with a ServletRequestEvent as its input parameter.

ServletRequestEvent and ServletContextEvent extend the same Java class, “java.util.EventObject”. You will be able to find the same data as detailed for the “Servlet Context Listener”.
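
As an illustration, here is a minimal requestInitialized sketch (my own example, not code from the original post) that obtains the servlet context and the request that triggered the event:

public void requestInitialized(ServletRequestEvent servletRequestEvent) {
   //The event source is the servlet context, just like in the Servlet Context Listener.
   WebAppServletContext wasc = (WebAppServletContext)servletRequestEvent.getSource();

   //The request that triggered the event.
   HttpServletRequest request = (HttpServletRequest)servletRequestEvent.getServletRequest();
   System.out.println("Request for " + wasc.getAppDisplayName() + ": " + request.getRequestURI());
}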

Phase Listeners

You can listen to all lifecycle phases. From the beforePhase and afterPhase methods you are able to access and manipulate data from these classes:

  • servlet.http.HttpServletRequest
  • servlet.http.HttpServletResponse

You can get these classes as detailed next:

public void beforePhase(PhaseEvent phaseEvent) {
   //You get the current phase for which the event has been triggered.
   System.out.println("getPhaseId: " + phaseEvent.getPhaseId());
        
   FacesContext vFacesContext = phaseEvent.getFacesContext();    
   ExternalContext vExternalContext = vFacesContext.getExternalContext();
        
   HttpServletRequest vRequest = (HttpServletRequest)vExternalContext.getRequest();
        
   Map<String, Object[]> queryStringMap = vRequest.getParameterMap();
        
   HttpServletResponse vResponse = (HttpServletResponse)vExternalContext.getResponse();
}

 

Note: When creating your own custom Phase Listener, don’t forget to implement the getPhaseId method so that it triggers for all phases, or only for the phases that you really want.

public PhaseId getPhaseId() {
  return PhaseId.ANY_PHASE;
}
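
Putting the two snippets together, a custom Phase Listener is simply a class that implements javax.faces.event.PhaseListener. Here is a minimal skeleton (the class name and the logging are my own, and the listener still needs to be registered, for example in faces-config.xml):

package view.listeners.session;

import javax.faces.context.ExternalContext;
import javax.faces.context.FacesContext;
import javax.faces.event.PhaseEvent;
import javax.faces.event.PhaseId;
import javax.faces.event.PhaseListener;

import javax.servlet.http.HttpServletRequest;

public class CustomPhaseListener implements PhaseListener {

    public void beforePhase(PhaseEvent phaseEvent) {
        System.out.println("beforePhase: " + phaseEvent.getPhaseId());

        FacesContext facesContext = phaseEvent.getFacesContext();
        ExternalContext externalContext = facesContext.getExternalContext();
        HttpServletRequest request = (HttpServletRequest)externalContext.getRequest();
        System.out.println("Request URI: " + request.getRequestURI());
    }

    public void afterPhase(PhaseEvent phaseEvent) {
        System.out.println("afterPhase: " + phaseEvent.getPhaseId());
    }

    //Listen to every phase; narrow this down to the phases you actually need.
    public PhaseId getPhaseId() {
        return PhaseId.ANY_PHASE;
    }
}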

 

If you create a custom Filter you are able to manipulate the request and response data through the servletRequest and servletResponse input parameters of the doFilter method.

To set the data from the servletRequest class, follow these steps:

  1. Create a class that extends the “servlet.http.HttpServletRequestWrapper” class.
  2. Override the methods you want to customize.
  3. Set the new class in the doFilter method.
    public void doFilter(ServletRequest servletRequest,
                         ServletResponse servletResponse,
                         FilterChain filterChain) throws java.io.IOException, javax.servlet.ServletException{
            
       CustomHttpServletRequestWrapper requestWrapper = new CustomHttpServletRequestWrapper((HttpServletRequest)servletRequest);
            
       filterChain.doFilter(requestWrapper, servletResponse);
    }
    
    

 

In the next post we’ll explore how to use these application events to get custom QueryString parameters and reinject them into new sessions after a session timeout.

Cheers,

Pedro Gabriel

 

@PedrohnGabriel

Feature Image by Melvin Gaal