Tuesday, 19 November 2013

Working with DVM in SOA 11g




       Hi, in this post I want to discuss Domain Value Maps (DVM) in Oracle SOA Suite 11g.

Domain Value Maps (DVM) enable you to map from one vocabulary used in a given domain to another vocabulary used in a different domain. For example, one domain may represent a city with a long name (Mumbai), while another domain may represent the same city with a short name (MB). In such cases, you can directly map the values by using a domain value map.

Domain value map values are static. You specify the domain value map values at design time using Oracle JDeveloper, and then at runtime, the domain value map columns are looked up for values.

Domain Value Map Features

Qualifier Support

Qualifiers qualify mappings. A mapping may not be valid unless qualified with additional information. For example, a domain value map containing a city code-to-city name mapping may have multiple mappings from KN to Kensington, because Kensington is a city in both Canada and the USA. Therefore, this mapping requires a qualifier (USA or Canada) to specify when the mapping is valid.

Qualifier Support Example
Country (Qualifier) CityCode CityName
USA BO Boston
USA BELG_NC Belgrade
USA BELG_MN_Streams Belgrade
USA NP Northport
USA KN Kensington
Canada KN Kensington

Note: You can also specify multiple qualifiers for a domain value map.

Qualifier Order Support: A qualifier order is used to find the best match during lookup at run time. Domain value maps support hierarchical lookup. If you specify a qualifier value during a lookup and no exact match is found, then the lookup mechanism tries to find a more generalized match by setting the higher-order qualifiers to "". It proceeds until a match is found, or until no match is found even with all qualifiers set to "".

One-to-Many Mapping Support: You can map one value to multiple values in a domain value map. For example, a domain value map for payment terms can map each payment term to three values, such as discount percentage, discount period, and total payment period.
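Under the covers, a domain value map is an XML file with a .dvm extension, stored in the project or in MDS. As a rough sketch (the file name CityMap.dvm is an assumption, and JDeveloper generates and maintains this file for you, so attribute details may differ), the city mapping with the Country qualifier from the table above would look something like this:

<dvm name="CityMap" xmlns="http://xmlns.oracle.com/dvm">
  <description>City code to city name mapping</description>
  <columns>
    <!-- Country is marked as a qualifier column -->
    <column name="Country" qualifier="true" order="1"/>
    <column name="CityCode"/>
    <column name="CityName"/>
  </columns>
  <rows>
    <row>
      <cell>USA</cell>
      <cell>BO</cell>
      <cell>Boston</cell>
    </row>
    <row>
      <cell>USA</cell>
      <cell>KN</cell>
      <cell>Kensington</cell>
    </row>
    <row>
      <cell>Canada</cell>
      <cell>KN</cell>
      <cell>Kensington</cell>
    </row>
  </rows>
</dvm>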
Scenario:
     In my example, we read a file and write it out using DVM logic.


   

In the above example, the Mediator routes all incoming files to the file write location based on the XSD below:

<?xml version="1.0" encoding="windows-1252" ?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns="http://www.example.org"
            targetNamespace="http://www.example.org"
            elementFormDefault="qualified">
  <xsd:element name="ListofOrder">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="Order">
          <xsd:complexType>
            <xsd:sequence>
              <xsd:element name="OrderID" type="xsd:string"/>
              <xsd:element name="basedata">
                <xsd:complexType>
                  <xsd:sequence>
                    <xsd:element name="CityCode" type="xsd:string"/>
                    <xsd:element name="CityName" type="xsd:string"/>
                    <xsd:element name="State" type="xsd:string"/>
                    <xsd:element name="Country" type="xsd:string"/>
                  </xsd:sequence>
                </xsd:complexType>
              </xsd:element>
              <xsd:element name="PricingData" maxOccurs="unbounded">
                <xsd:complexType>
                  <xsd:sequence>
                    <xsd:element name="Freightamount" type="xsd:string"/>
                    <xsd:element name="PricingData" type="xsd:string"/>
                    <xsd:element name="Taxamount" type="xsd:string"/>
                  </xsd:sequence>
                </xsd:complexType>
              </xsd:element>
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>
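For clarity, an input document conforming to this schema could look like the following (the values are illustrative only):

<?xml version="1.0" encoding="UTF-8"?>
<ListofOrder xmlns="http://www.example.org">
  <Order>
    <OrderID>ORD-1001</OrderID>
    <basedata>
      <CityCode>MB</CityCode>
      <CityName>Mumbai</CityName>
      <State>MH</State>
      <Country>India</Country>
    </basedata>
    <PricingData>
      <Freightamount>150</Freightamount>
      <PricingData>2500</PricingData>
      <Taxamount>75</Taxamount>
    </PricingData>
  </Order>
</ListofOrder>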
Now I will write the file data based on a DVM file.
Step 1:
Create a .dvm file: right-click the project -> SOA -> Transformation -> Domain Value Map. Give the domain value map file a name, name the domain fields, and click OK.
Step 2: Define the file name and description for the DVM file you created. Enter the domain names and domain values, assign each domain value to its domain name, and click OK.



Step 3: Create the domains and domain values as below.



Step 4: While you are creating a domain, the Create Domain popup window is displayed.
Enter the name in the Name field.
Set Qualifier to True if you want the column to act as a qualifier.
In the Qualifier Order field, enter a qualifier number (this field is enabled only when Qualifier is set to True).
Click OK (as discussed in Domain Value Map Features above).

Repeat the same steps to create a Country domain with Qualifier set to True.


Step 5: Create the domain value map file as below.



Step 6: Edit the transformation file in the Mediator to add the DVM lookup.

If you select the Advanced tab, you will find the DVM Functions.
Now drag lookupValue onto the element you want to look up.



Step 7: When you double-click the DVM function, a popup window appears.

 In our case, let us get CityName while passing CityCode as input, only when the qualifier matches.




Specify values for the following fields in the Edit Function – lookupValue dialog (an example of the resulting expression is shown after this list):
  • In the dvmLocation field, enter the location URI of the domain value map file, or click Browse to the right of the dvmLocation field to select a domain value map file. You can select an already deployed domain value map from the Metadata Service (MDS), including the shared location in MDS, by selecting the Resource Palette.
  • In the sourceColumnName field, enter the name of the domain value map column that is associated with the source element value, or click Browse to select a column name from the columns defined for the domain value map you previously selected.
  • In the sourceValue field, enter a value or press Ctrl-Space to use the XPath Building Assistant. Press the up and down arrow keys to locate an object in the list, and press Enter to select an item.
  • In the targetColumnName field, enter the name of the domain value map column that is associated with the target element value, or click Browse to select the name from the columns defined for the domain value map you previously selected.
  • In the qualifierColumnName field, enter the name of the domain value map column that is associated with the qualifier element value, or click Browse to select a column name from the columns you previously defined as qualifiers.
  • In the qualifierValue field, enter a value or press Ctrl-Space to use the XPath Building Assistant. Press the up and down arrow keys to locate an object in the list, and press Enter to select an item.
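When you click OK, the wizard inserts a dvm:lookupValue call into the XSLT (the dvm prefix is declared at the top of the generated map). A sketch of what the expression might look like for our CityCode-to-CityName lookup with the Country qualifier is shown below; the oramds path, the ns0 prefix, and the NOT_FOUND default value are illustrative assumptions, not the exact values your wizard will produce:

<CityName>
  <xsl:value-of select='dvm:lookupValue("oramds:/apps/dvm/CityMap.dvm",
                                        "CityCode", /ns0:ListofOrder/ns0:Order/ns0:basedata/ns0:CityCode,
                                        "CityName", "NOT_FOUND",
                                        "Country", /ns0:ListofOrder/ns0:Order/ns0:basedata/ns0:Country)'/>
</CityName>

The fifth argument is the default value returned when no match is found; qualifier column/value pairs are appended after it.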


Step 8: Finally, the transformation looks like below.




For our test case working with qualifiers, add two more fields to the DVM file.


So, based on the values passed in State and Country, the CityCode value is decided.

If the test case contains State as AP and Country as India, the DVM lookup gives you HYD_AP.
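With both qualifiers in place, the lookup would look roughly like the sketch below; the map location and column names are assumptions based on the table above, and the two qualifier column/value pairs are simply appended to the argument list:

dvm:lookupValue("oramds:/apps/dvm/CityMap.dvm",
                "CityName", /ns0:ListofOrder/ns0:Order/ns0:basedata/ns0:CityName,
                "CityCode", "NOT_FOUND",
                "State", /ns0:ListofOrder/ns0:Order/ns0:basedata/ns0:State,
                "Country", /ns0:ListofOrder/ns0:Order/ns0:basedata/ns0:Country)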


Using DVM in Assign Activity


We can also use the DVM functions through an Assign activity in BPEL.
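In the Assign activity's Expression Builder, the same lookup can be entered as an XPath expression. A sketch is shown below, using BPEL 1.1 syntax with bpws:getVariableData; the MDS path, prefix, variable, and element names are illustrative assumptions:

dvm:lookupValue("oramds:/apps/dvm/CityMap.dvm",
                "CityCode",
                bpws:getVariableData("inputVariable", "payload",
                                     "/ns0:ListofOrder/ns0:Order/ns0:basedata/ns0:CityCode"),
                "CityName", "NOT_FOUND")

The result can then be copied to the target element of the output variable.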



      Deploy the service; we are ready to go with DVM.

                                               

                                                                                                             By DeepthiReddy

Sunday, 17 November 2013

File Adapter to Process Files Using a Mediator



      Hi, in this post we will discuss the JCA adapter for files. The Oracle File and FTP Adapters enable a BPEL process or a Mediator to exchange (read and write) files on local and remote file systems (the latter through the File Transfer Protocol, FTP). The file contents can be in both XML and non-XML data formats.


The Oracle File and FTP Adapters are based on the JCA 1.5 architecture. JCA provides a standard architecture for integrating heterogeneous enterprise information systems (EIS). The JCA Binding Component of the Oracle File and FTP Adapters exposes the underlying JCA interactions as services (WSDL with JCA binding) for Oracle BPEL PM integration.

Overview: In this tutorial, I will explain how to read a multi-record .txt file using a read File Adapter and then write it to a .txt file using a write File Adapter. There are not two separate File Adapters; we simply refer to them as the read File Adapter or the write File Adapter based on the configured operation.
The read File Adapter receives the input data from the file, translates it based on the file configuration defined, and posts XML messages.
The write File Adapter receives the XML messages, translates them back into the native data format, and writes them to a file.


We are now going to implement this first with a Mediator and then with BPEL.

 Step 1: Configuring the File Adapter to read a local .txt file


Create an SOA project FilereadWrite with an Empty Composite.
Drag and drop a File Adapter from the Component Palette onto the Exposed Services swimlane.




Step 2: Defining the File Adapter to read a test file

Service Name: Multipleread; click Next.

Interface: Define from operation and schema; click Next.

Select Read File as the Operation Type, as we need to read from a file.



Ignore the options Do Not Read File Content, Use File Streaming, and Read File As Attachment; we will discuss them in the next blog.

click next





Select the ‘Physical Path’ option (‘Logical Path’ is used when we are deploying our service to a remote server).
Select the path where your incoming .txt file is expected; I have placed my .txt file in the ‘C:\FILEADAPTER\MultipleRead’ directory.
Check ‘Process files recursively’.
Archive processed files to D:\Archive_Files\MultipleArchive.
Select the option ‘Delete files after successful retrieval’.

Note: It is best practice to maintain different directories for incoming and archive files.


Click Next





I want to define which specific files should be read by my File Adapter.
Select the ‘File Wildcards’ option.
Enter the pattern in ‘Include Files with Name Pattern’: test*.txt
Enter the pattern in ‘Exclude Files with Name Pattern’: exclude*.txt


Note: Files matching the include name pattern are picked up, while files matching the exclude name pattern are ignored.

I have entered test*.txt, which means that all the files starting with the prefix ‘test’ will be picked for processing.
Ex:
test.txt will be picked
test23.txt will be picked
test43423test.txt will be picked
testexclude.txt     will be picked

soatest.txt will not be picked
1test.txt     will not be picked
exclude.txt  will not be picked
exclude43.txt will not be picked

click next



  Polling Frequency defines the time interval at which the folder is polled for new incoming files.
  Minimum File Age defines the minimum age a file must reach before it is read.

In our case, the incoming folder will be polled every 60 seconds for files aged a minimum of 0 seconds.
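For reference, the choices made in this wizard are recorded in the adapter's .jca configuration file. A simplified sketch of what it might contain for this example is shown below; the property names follow a typical generated file, and values such as the regex form of the wildcard patterns may differ slightly in your version:

<adapter-config name="Multipleread" adapter="File Adapter"
                wsdlLocation="Multipleread.wsdl"
                xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
  <connection-factory location="eis/FileAdapter"/>
  <endpoint-activation portType="Read_ptt" operation="Read">
    <activation-spec className="oracle.tip.adapter.file.inbound.FileActivationSpec">
      <!-- directory and archive settings from the previous pages -->
      <property name="PhysicalDirectory" value="C:\FILEADAPTER\MultipleRead"/>
      <property name="Recursive" value="true"/>
      <property name="DeleteFile" value="true"/>
      <!-- wildcard patterns are stored as regular expressions -->
      <property name="IncludeFiles" value="test.*\.txt"/>
      <property name="ExcludeFiles" value="exclude.*\.txt"/>
      <!-- polling settings from this page (in seconds) -->
      <property name="PollingFrequency" value="60"/>
      <property name="MinimumAge" value="0"/>
    </activation-spec>
  </endpoint-activation>
</adapter-config>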

click next
          

 Now we are going to define the schema for the incoming messages; click Next.





Select the option to create a new schema, and select the file type as Delimited, which contains data separated by commas.

sample file 

1340,Deepthi,Hr,developer,20.000
1341,lasya,Hr,developer,40.000
1342,Madhavi,Hr,developer,80.000 



 Give the location of the file that contains a sample of the incoming file data.
I have created a sample .txt file, expecting 200 rows in the incoming file; all the rows should be processed.
Character set: US-ASCII.

Click Next


 
  Now we can specify the namespace of the schema, and enter a name for the element that represents a record.

Click Next and Finish.
An .xsd (XML Schema Definition) file is created by default.
Change the column names and types as needed, and name the schema as below.
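The Native Format Builder generates a schema annotated with nxsd attributes that tell the adapter how to parse the delimited data. A rough sketch of what the generated schema for the sample rows above might look like is shown below; the element and type names are illustrative, not the exact ones the wizard produces:

<?xml version="1.0" encoding="UTF-8"?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
            xmlns:tns="http://TargetNamespace.com/Multipleread"
            targetNamespace="http://TargetNamespace.com/Multipleread"
            elementFormDefault="qualified"
            nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="US-ASCII">
  <xsd:element name="Root-Element">
    <xsd:complexType>
      <xsd:sequence>
        <!-- one Employee record per line of the delimited file -->
        <xsd:element name="Employee" minOccurs="1" maxOccurs="unbounded">
          <xsd:complexType>
            <xsd:sequence>
              <xsd:element name="EmpId" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy=","/>
              <xsd:element name="Name" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy=","/>
              <xsd:element name="Dept" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy=","/>
              <xsd:element name="Role" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy=","/>
              <xsd:element name="Salary" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}"/>
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>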


We are done with the File Adapter read operation.


Step 3: Configuring the File Adapter to write the text file read in the step above


 Drag and drop a File Adapter from the Component Palette onto the External References swimlane.


Service Name: Multiplewrite; click Next.

Now select Write File as the Operation Type, as we need to write to a file.


Click Next


 Define the path to which we are writing the file; in my case I am writing to D:\FILEADAPTER\Multiplewrite.

The File Naming Convention sets the naming pattern for the written files:

Multiplewrite_%SEQ%.txt

%SEQ% increments the number from 1 onwards, which means that if you have 10 files to write, your file names will be Multiplewrite_1.txt, Multiplewrite_2.txt, Multiplewrite_3.txt, ... Multiplewrite_10.txt.
Click Next    



Select the schema created in Step 2, testfile3_2.xsd, with 'Root-Element' as the element.
Click Next and Finish

Step 4: Insert a Mediator in the components swimlane and connect it to the read and write File Adapters we created in Steps 2 and 3.




Our composite looks like below


Edit the Mediator to define the routing rules.


Create a mapping file that transforms the incoming file data to the outgoing data.



Click OK and save.

Edit the Mapping File




We are now ready to test our data


Place a sample delimited .txt file named testfile23.txt in D:\FILEADAPTER\Multipleread:

1381,Deepthi,Hr,developer,20.000
1382,Madhavi,Hr,sr.developer,20.000
1383,Sarada,Hr,jr.developer,60.000
1384,Swathi,Hr,sr.developer,40.000
1385,Saranya,Hr,jr.developer,80.000
1386,ramya,Hr,jr.developer,50.000 



Now check D:\FILEADAPTER\Multiplewrite; you will find a file Multiplewrite_1.txt

containing the above delimited data.



Happy learning



                                                                                                           By DeepthiReddy

Friday, 15 November 2013

Native Logging Mechanism In SOA 11g



     In this blog, I want to discuss the native logging mechanism, which allows us to track our instance flow.

Have you ever wanted to generate log messages in SOA Suite's own log files (diagnostics.log)?
It lets you write input payload messages or any status messages to the log files at specific points in the BPEL process.

From JDeveloper, with a BPEL flow opened, do this:

Step 1: Insert some imports at the top of the source (first ones inside process tag):

<bpelx:exec import="java.util.logging.Logger" /> 
<bpelx:exec import="java.util.logging.Level" /> 
<bpelx:exec import="oracle.fabric.logging.LogFormatter" />


Step 2 : Insert a Java Embedding action with the following code:

        "  Logger logger = Logger.getLogger("oracle.soa.Logger");
           LogFormatter.configFormatter(logger);
            logger.log(Level.WARNING, "Default log method");  "

That's it, basically. The trick is to configure the logger instance using SOA Fabric's LogFormatter.


To avoid doing these steps every time you need to log something, you can use a class.
 
Step 1: Create a new Java Class inside your composite project, with the code below:


package info.deepthi.soa.logging;

import java.util.logging.Level;
import java.util.logging.Logger;
import oracle.fabric.logging.LogFormatter;

public class MyCustomLog {
  private static final Logger logger = Logger.getLogger("oracle.soa.Logger");

  static {
       LogFormatter.configFormatter(logger);
  }

  public static final void log(String message) {
     logger.log(Level.WARNING, message);
  }
}

Step 2: Call the function from your Java Embedding actions:
  
info.deepthi.soa.logging.MyCustomLog.log("Logging class method");
 

After building, deploying and running your process, this is what you're going to see from Enterprise Manager's Log page



My code has "info.invaders.Logger" instead of "oracle.soa.Logger" as shown above. If you want to know why, keep reading this post :-)


There's one detail you may have noticed that deserves some explanation: I passed a specific package and class (oracle.soa.Logger) in order to retrieve a logger instance. I did this because this logger name is configured by default when the product is installed, which saves us some configuration.

But, if you want to use a specific log level for your messages without messing with anything else, the best way to do so is to define a new package and configure it accordingly. Here's how.

  • Define the package you want to use. I'm going with "info.invaders".

  • Locate the file logging.xml of your server and open it. Inside the domain folder, navigate to /config/fmwconfig/servers/server_name.

    Warning: you have to configure the logging.xml file of each SOA instance (server). Not the most cluster-friendly procedure, but that's the way it is.

  • Include an XML block like this, specifying your package and the lowest log level you want:



    <logger name="info.invaders " level="WARNING:1">
          <handler name="odl-handler"/>
          <handler name="wls-domain"/>
          <handler name="console-handler"/>
        </logger>



                                                                                                        By DeepthiReddy

Thursday, 14 November 2013

BPEL Sensors (Variable, Fault, Activity)



  In this post, we will see how to define BPEL sensors and the different ways we can use them.
Sensors are used to declare interest in specific events throughout the life cycle of a BPEL process instance. In a business process, that can be the activation and completion of a specific activity or the modification of a variable value in the business process.
When a sensor is triggered, a specific sensor value is created. For example, if a sensor declares interest in the completion of a BPEL scope, the sensor value consists of the name of the BPEL scope and a time stamp of when the activity was completed. If a sensor declares interest in a BPEL process variable, then the sensor value consists of the value of the variable at the moment it was modified, a time stamp of when the variable was modified, and the name and type of the activity that modified the BPEL variable.
The data format for sensor values is normalized and well-defined using XML schema.
A sensor action is an instruction on how to process sensor values. When a sensor is triggered by Oracle BPEL Process Manager, a new sensor value for that sensor is created. After that, all the sensor actions associated with that sensor are performed. A sensor action typically persists the sensor value in a database or sends the normalized sensor value data to a JMS queue or topic. For integration with Oracle Business Activity Monitoring, the sensor value can be sent to the BAM adapter.
You can define the following types of sensors through Oracle JDeveloper:
1) Variable sensors ---- used to monitor variables (or parts of a variable) of a BPEL process.
2) Fault sensors ---- used to monitor BPEL faults.
3) Activity sensors ---- used to monitor the execution of activities within a BPEL process.

Once we create a sensor, we can associate one or more sensor actions with it, as described above.
When we create sensors, the following files are generated:

bpel_process_name_sensor.xml --- defines the sensor definitions of a BPEL process.
bpel_process_name_sensorAction.xml --- defines the sensor action definitions of a BPEL process.
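As a rough illustration only (JDeveloper generates and maintains these files, and some attributes are omitted here), a variable sensor entry in bpel_process_name_sensor.xml might look something like this:

<sensors xmlns="http://xmlns.oracle.com/bpel/sensor">
  <sensor sensorName="InputVariableSensor" kind="variable"
          classname="oracle.tip.pc.services.reports.dca.agents.BpelVariableSensorAgent">
    <!-- the target XPath points at the variable (or part of it) being monitored -->
    <variable outputNamespace="http://www.w3.org/2001/XMLSchema"
              outputDataType="string"
              target="$inputVariable/payload/client:process/client:input"/>
  </sensor>
</sensors>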
Using sensors in BPEL:

Step 1: Create an SOA project with a BPEL component (it can be any template: empty, synchronous, or asynchronous).
We will create variable, activity, and fault sensors, in that order.
Now open the BPEL process, go to the Structure window, right-click Variable Sensors, and select Create Variable Sensor as below.
Step 2: Give the sensor a proper name, select the variable for which you want to create the sensor, and finish the creation.
Step 3: Now right-click Activity Sensors and select Create Activity Sensor as below.

Give the sensor a proper name, select the activity for which you want to create the sensor, and finish the creation.



Step 4: To create a fault sensor, right-click Fault Sensors and select Create. Now provide the name and select the fault for which you want to create the sensor (in this example we are going with selectionFailure), as below.


Step 5: Now we are done with the creation of sensors and are about to create the sensor actions.
For this, right-click Sensor Actions and select Create, then Sensor Action.

Give it a name and select the publish type as Database. (There are other options; in this sample we are going with Database. In fact, in the BPEL console we will see the sensors with the action publish type Database.)


Database: Publishes the sensor data to the reports schema in the database. The sensor data can then be queried using SQL.
JMS Queue: Publishes the sensor data to a JMS queue.
JMS Topic: Publishes sensor data to JMS topic.
Custom: Publishes the data to a custom Java class.
JMS Adapter: Publishes to remote queues or topics. (The JMS Queue and JMS Topic publish types above publish only to local JMS destinations.)
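For completeness, the corresponding entry in bpel_process_name_sensorAction.xml for a Database publish type would look roughly like the sketch below; again, JDeveloper maintains this file for you, and the publishType value shown is the one the Database option typically maps to:

<actions xmlns="http://xmlns.oracle.com/bpel/sensor">
  <action name="DatabaseSensorAction" enabled="true"
          publishType="BpelReportsSchema" publishName="" filter="">
    <!-- the action is associated with the sensor(s) it should process -->
    <sensorName>InputVariableSensor</sensorName>
  </action>
</actions>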




Step 6: Now edit all the sensors we created above and select the sensor action that we just created, as below.

Step 7: Now deploy the composite application and initiate it from the EM page.
        Now open the instance and go to the Sensor Values tab.
    


On the Sensor Values page, we can see the variable, activity, and fault sensors with their values, as in the pictures below.


                                                                                                              By DeepthiReddy