Real-Time Synchronization of Suppliers Leveraging Event-Driven Architecture

Use Case

Many organizations are not in a position to adopt the entire suite of Oracle SaaS applications and completely replace their existing application portfolio. Instead, they can successfully pursue a coexistence strategy: selectively adopting cloud offerings that serve as a logical extension to their existing solutions.

Oracle Sourcing is the next-generation application for smarter negotiations with suppliers. Sourcing Cloud is an integral part of Procurement Cloud and is generally implemented with other Procurement Cloud products such as Purchasing and Supplier Portal. It can also be implemented as a standalone offering alongside other cloud or on-premises procurement applications, which makes for an ideal coexistence scenario.

In such coexistence scenarios, organizations face the typical challenge of how to create and synchronize master data and transactional data across the different systems within the organization. Supplier data can flow through a number of interconnected systems. A unique ID assigned by Oracle Procurement Cloud to identify suppliers internally often needs to be cross-referenced in many downstream applications.

This use case covers real-time synchronization of supplier data from Oracle Procurement Cloud to downstream applications. We will use Oracle Integration (OIC) to build the integration, in conjunction with an enterprise messaging platform, Apache Kafka, and ATP (Autonomous Transaction Processing).

Background

Oracle Fusion Supplier Model provides a Supplier Outbound Synchronization Service that generates a supplier record snapshot for any external supplier repository to consume.

Real-time synchronization of Fusion supplier records to downstream applications can be implemented in a couple of ways in Oracle Integration:

1.       Using the Supplier Outbound Synchronization Service

When any update is made to a supplier record, the synchronization service generates a snapshot of the supplier record as an XML file that models the Fusion supplier structure and transports the document over HTTPS to a single server destination.

2.      Using the out-of-the-box Supplier Create/Update events in Procurement Cloud

The business event is triggered on each update to a supplier record, including all child entities such as:

·       Business Classifications

·       Products and Services

·       Transaction Tax

·       Payments

·       Addresses

·       Sites

·       Contacts

We will use the event-driven approach to implement the supplier synchronization use case. At the end of this blog we will discuss a few disadvantages of implementing with the Outbound Synchronization Service instead of events.

An event-driven architecture for data and applications is a modern design approach centered around data that describes “events”. An event-driven architecture enables applications to act on these events as they occur. This is in contrast to traditional architectures, which largely dealt with data as batches of information to be inserted and updated at some defined interval, and responded to user-initiated requests rather than to new information. With a scalable event-driven architecture, you can both create and respond to a large number of events in real time. An event-driven architecture is particularly well suited to loosely coupled applications built in a producer/consumer style.

[Figure: Logical architecture – ERP Cloud business events flow through an OIC producer integration into a Kafka topic; an OIC consumer integration delivers the data to downstream applications such as ATP]

The above logical architecture represents a common event-driven design pattern in which ERP Cloud events feed actions in applications and repositories. An event is generated by some action, the ERP Cloud Adapter receives it, and it is fed through the OIC Kafka Adapter (producer) into the Kafka platform; finally, the result is delivered to a downstream application or repository. The whole use case can be achieved with out-of-the-box capabilities in OIC, using the ERP Cloud Adapter in conjunction with the Kafka Adapter's producer/consumer capabilities. The key advantage of this architecture is that the producer flow fits easily into an existing enterprise messaging platform (e.g., Oracle Streaming Service, Kafka). Moreover, the consumer flow captures the real-time events published by the producer flow (e.g., supplier updates, new POs) at a defined interval, transforms them, and passes them on to downstream systems to maintain a uniform system of record across applications.

Configure

Enable Supplier Events

The ERP Cloud Adapter supports subscribing to Supplier Create/Update events. This is a new feature in ERP Cloud and needs to be enabled.

These events include the SupplierNumber and SupplierID attributes in the output payload. Oracle Integration uses these event attributes to invoke the Suppliers REST API and retrieve the required supplier information from Procurement Cloud.

In ERP Cloud, navigate to Navigator -> My Enterprise -> New Features -> Available Features and search for the “Suppliers” functional area.

Select Enable and, in Edit Suppliers, select the check box Enable Outbound Supplier Profile Integration Using Oracle Integration Cloud.

Installing Kafka

Since we will be using Kafka as the enterprise messaging platform to publish supplier information, let's go through a quick set of steps to set up Kafka.

1.       Download Kafka from https://kafka.apache.org/downloads

2.      Unzip the gzipped Kafka binary archive


3.      From a command prompt, run <kafka_home>\bin\windows\kafka-topics.bat and verify that you get usage output

4.      Add <kafka_home>\bin\windows to the PATH environment variable so you can execute the .bat files from anywhere

5.      Kafka uses ZooKeeper to coordinate the broker/cluster topology, and ZooKeeper provides a consistent store for configuration information. So we first start ZooKeeper and then Kafka

6.      Create a “data” directory in <kafka_home>

7.      Create the below two directories under <kafka_home>\data to hold the Kafka and ZooKeeper data
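A minimal sketch, from a command prompt in <kafka_home> (the exact directory names are an assumption; any names work as long as the properties files in steps 8 and 10 point to them):

> mkdir data\zookeeper
> mkdir data\kafka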

8.      Edit <kafka_home>\config\zookeeper.properties and modify the dataDir as below
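A sketch of the edited property, assuming the data\zookeeper directory created in step 7 (Kafka's properties files accept forward slashes on Windows):

dataDir=<kafka_home>/data/zookeeper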

9.      To start ZooKeeper, execute from a command line:

zookeeper-server-start.bat <kafka_home>\config\zookeeper.properties

10.  Edit <kafka_home>\config\server.properties and modify the log.dirs as below
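Again a sketch, assuming the data\kafka directory from step 7:

log.dirs=<kafka_home>/data/kafka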

11.    To start Kafka, execute from a command line:

kafka-server-start.bat <kafka_home>\config\server.properties

Creating Kafka Topic

1.       From a command prompt execute the below to create “supplier_topic”

> kafka-topics.bat --zookeeper 127.0.0.1:2181 --topic supplier_topic --create --partitions 3 --replication-factor 1

You should see a message that the topic was created. Then list the topics:

> kafka-topics.bat --zookeeper 127.0.0.1:2181 --list

You should see the list of topics, including “supplier_topic”.
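Note: the --zookeeper flag is deprecated in newer Kafka releases and removed in Kafka 3.x; on those versions the same commands talk to the broker directly:

> kafka-topics.bat --bootstrap-server 127.0.0.1:9092 --topic supplier_topic --create --partitions 3 --replication-factor 1

> kafka-topics.bat --bootstrap-server 127.0.0.1:9092 --list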

Installing Connectivity Agent

Installing the connectivity agent is mandatory in this case because Kafka runs on a local machine behind the firewall.

To install the connectivity agent, follow the steps at https://docs.oracle.com/en/cloud/paas/integration-cloud/agent.html

Complete the documented steps to install the connectivity agent, then create an agent group and associate the agent with the agent group.

Provisioning an ATP Database

In our use case we want to store the supplier information in a downstream application, in this case ATP. Here are the high-level steps to provision ATP and create the database objects.

Create an ATP database

Select Create Autonomous Database

ATPDev instance created

Select DB Connection and download the Instance Wallet

https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/connect-download-wallet.html#GUID-B06202D2-0597-41AA-9481-3B174F75D4B1

Navigate to Service Console -> Development -> Database Actions 

Select SQL Development Tile

Run the following SQL script:

DROP TABLE v_Supplier;

CREATE TABLE v_Supplier (
  v_SupplierId number NOT NULL PRIMARY KEY,
  v_SupplierName varchar2(20),
  v_SupplierNumber number,
  v_BusinessRelationship varchar2(20),
  v_Status varchar2(20),
  v_DUNSNumber int,
  v_TaxRegistrationNumber varchar2(20),
  v_TaxpayerId varchar2(20),
  v_TaxpayerCountryCode varchar2(20),
  v_SupplierTypeCode varchar2(20),
  v_TaxOrganizationType varchar2(20)
);


Creating Connections in OIC

a.      Create Apache Kafka Adapter Connection

Create a new connection, selecting the Apache Kafka Adapter, and provide the connection configuration.

b.      Create ATP Adapter Connection

Connection Configuration

Provide the service name from the tnsnames.ora file available in the Wallet.zip downloaded previously.

Choose the security scheme JDBC Over SSL and provide the authentication-related information:

·       Wallet

·       Wallet Password

·       DB Username and Password

Identify the service name from tnsnames.ora in the wallet file (e.g., atpdev_low).
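For reference, an entry in tnsnames.ora looks roughly like this (the host, service name, and certificate details below are placeholders, not values from this environment):

atpdev_low = (description=
  (address=(protocol=tcps)(port=1522)(host=adb.<region>.oraclecloud.com))
  (connect_data=(service_name=<prefix>_atpdev_low.adb.oraclecloud.com))
  (security=(ssl_server_cert_dn="CN=...")))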

Check the adapter documentation if you plan to use other flavors of connectivity.

Implementation of Integration Flow

We need to implement two integration flows as per the logical architecture defined above:

  1. Producer Flow
  2. Consumer Flow

Let's walk through the high-level steps to implement both integration flows.

Producer Flow in OIC

Logic

  •  Business event raised from ERP Cloud is captured in OIC (ERP Cloud Adapter)
  •  Enrich the supplier info by invoking the Suppliers REST service (ERP Cloud Adapter)
  •  Produce to the Kafka topic (Kafka Adapter)


1.       updateSupplier (ERP Cloud Adapter Connection) – select the Supplier Updated event. This public event signals when a supplier is updated.

2.      getSupplierInfo (ERP Cloud Adapter Connection) – invoke the Suppliers REST resource to get the enriched supplier info, as sketched below.
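Logically, the enrichment is a GET on the Suppliers REST resource, keyed by the SupplierId received in the event (the host and API version here are placeholders; check the Suppliers REST documentation for the exact resource path):

GET https://<fusion-host>/fscmRestApi/resources/11.13.18.05/suppliers/{SupplierId}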

3.       Map to getSupplierInfo – map SupplierId to the REST API template parameters.

4.      produceSupplierMessage  (Kafka Adapter Connection) –

a.      Select Publish record to Kafka Topic

b.      Select the Supplier topic created previously

c.       Provide a sample supplier response JSON message structure to publish, and map the required elements. A minimal sample is shown below.
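A minimal sample structure (field names mirror the v_Supplier table created earlier; the values are hypothetical, and the actual Suppliers REST response carries many more attributes):

{
  "SupplierId": 300100123456789,
  "SupplierName": "Acme Supplies Inc",
  "SupplierNumber": 12345,
  "BusinessRelationship": "Spend Authorized",
  "Status": "ACTIVE",
  "DUNSNumber": 123456789,
  "TaxRegistrationNumber": "GB123456789",
  "TaxpayerId": "98-7654321",
  "TaxpayerCountryCode": "US",
  "SupplierTypeCode": "Supplier",
  "TaxOrganizationType": "Corporation"
}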

Consumer Flow in OIC

Logical flow

  •  Consume from the Kafka topic (Kafka Adapter)
  •  Upsert the supplier information into the database (ATP Adapter)

1.       consumeSupplierUpdate (Kafka Adapter Connection) – consume from supplier_topic, providing a consumer group and a polling frequency.
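While testing, the Kafka console consumer is handy for confirming that the producer flow is actually writing to the topic:

> kafka-console-consumer.bat --bootstrap-server 127.0.0.1:9092 --topic supplier_topic --from-beginning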

2.      Provide the same sample JSON document used in the producer flow.

3.      upsertSupplier (ATP Adapter Connection) – upsert the supplier record into the ATP database table.
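The adapter generates the upsert once you select the operation and table; conceptually it is equivalent to a MERGE such as the following (abbreviated to three columns, with illustrative bind names):

MERGE INTO v_Supplier t
USING (SELECT :SupplierId AS id FROM dual) s
ON (t.v_SupplierId = s.id)
WHEN MATCHED THEN
  UPDATE SET t.v_SupplierName = :SupplierName, t.v_Status = :Status
WHEN NOT MATCHED THEN
  INSERT (v_SupplierId, v_SupplierName, v_Status)
  VALUES (:SupplierId, :SupplierName, :Status);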

Then select the supplier table created earlier.

Testing the End-to-End Flow

Log in to ERP Cloud with the Procurement Manager role and select Suppliers.

Select the Manage Suppliers task, search for a supplier, and update a field, for example the DUNS number.
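After both flows complete, the update can be verified from the ATP SQL worksheet (the query assumes the v_Supplier table created earlier):

SELECT v_SupplierNumber, v_SupplierName, v_DUNSNumber FROM v_Supplier;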

Monitoring

Navigate to the OIC monitoring tab and observe that the two flows were triggered.

Producer Flow

Consumer Flow

Implementation Best Practice

At the start of this blog we mentioned two options for implementing real-time supplier synchronization in OIC. Let's look at some pros and cons and the recommended design pattern.

Disadvantages of Implementing the Supplier Outbound Sync Service

  • You need to create an explicit interface implementing the Supplier Outbound Sync Service WSDL -> printXMLAsync operation
  • The response XML string from the service needs to be converted into a mapper-aware XML message, which requires additional orchestration steps in the OIC integration flow
  • Most importantly, on an event the supplier record XML payload is sent to the OIC endpoint in a fire-and-forget pattern; Fusion does not track or retry failed deliveries

The recommendation is to leverage the out-of-the-box supplier business events supported by the OIC ERP Cloud Adapter:

  • Business events are natively supported by the Oracle ERP Cloud Adapter, which insulates the integration from any changes in the ERP Cloud WSDL interface.
  • When an event-based integration is activated, OIC creates a subscription entry in Oracle ERP Cloud. If for some reason the event-based integration flow is not reachable, ERP Cloud retries the business events multiple times with an exponential back-off.