Real-time Synchronization of Suppliers Leveraging Event-Driven Architecture


Many organizations are not in a position to adopt the entire suite of Oracle SaaS applications and completely replace their existing application portfolio. Instead, they can successfully pursue a coexistence strategy: a selective uptake of cloud offerings that serve as a logical extension of their existing solutions.

Oracle Sourcing is the next-generation application for smarter negotiations with suppliers. Sourcing Cloud is an integral part of Procurement Cloud and is generally implemented with other Procurement Cloud products such as Purchasing and Supplier Portal. Sourcing Cloud can also be implemented as a standalone offering alongside other cloud or on-premises procurement applications, which makes an ideal coexistence scenario.

In such coexistence scenarios, organizations across the world face a typical challenge: how to create and synchronize master data and transactional data across the different systems within the organization. Supplier data can flow through a number of interconnected systems. A unique ID assigned by Oracle Procurement Cloud, which identifies suppliers internally, often needs to be cross-referenced in many downstream applications.

This use case covers real-time synchronization of supplier data from Oracle Procurement Cloud to downstream applications. We will use Oracle Integration (OIC) to build the integration, in conjunction with an enterprise messaging platform, Apache Kafka, and an Autonomous Transaction Processing (ATP) database.


Oracle Fusion Supplier Model provides a Supplier Outbound Synchronization Service feature that generates a supplier record snapshot for any external supplier repository to consume.

Real-time synchronization of Fusion supplier records to downstream applications can be implemented in Oracle Integration in a couple of ways:

1.       Implementing using the Supplier Outbound Synchronization Service

When any update is made to a supplier record, the synchronization service generates a snapshot of the supplier record as an XML file that models the Fusion supplier structure, and transports the document over HTTPS to a single server destination.

2.      Implementing using the out-of-the-box Supplier Create/Update events in Procurement Cloud

The business event is triggered on each update to a supplier record, including updates to all child entities such as:

·       Business Classifications

·       Products and Services

·       Transaction Tax

·       Payments

·       Addresses

·       Sites

·       Contacts

We will use the event-driven approach to implement the supplier synchronization use case. At the end of this blog we will discuss a few disadvantages of implementing with the Outbound Synchronization Service instead of events.

An event-driven architecture for data and applications is a modern design approach centered around data that describes "events". An event-driven architecture enables applications to act on these events as they occur. This is in contrast to traditional architectures, which largely dealt with data as batches of information to be inserted and updated at some defined interval, and responded to user-initiated requests rather than to new information. With a scalable event-driven architecture, you can both create and respond to a large number of events in real time. An event-driven architecture is particularly well suited for a loosely coupled application architecture with a producer and consumer style.

The above logical architecture represents a common event-driven design pattern in which ERP Cloud events feed actions in applications and repositories. An event is generated by some action in ERP Cloud, the ERP Cloud Adapter in OIC receives it, and the OIC Kafka Adapter (producer) publishes it to the Kafka platform; finally, a result is delivered to a downstream application or repository. The whole use case can be achieved with out-of-the-box capabilities in OIC, leveraging the ERP Cloud Adapter in conjunction with the Kafka Adapter's producer/consumer capabilities. The key advantage of this architecture is that the producer flow fits easily into an existing enterprise messaging platform (for example, Oracle Streaming Service or Kafka). Moreover, the consumer flow captures and transforms the real-time events from the producer flow (for example, supplier updates or new purchase orders) at a defined interval and passes them on to downstream systems, maintaining a uniform system of record across applications.
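Conceptually, the producer flow maps the event payload to a message and publishes it to the topic. Below is a minimal Python sketch of that logic; the field names and the use of the kafka-python client are illustrative assumptions, not the internals of the OIC adapters.

```python
import json

def build_supplier_message(event):
    """Map a supplier-updated event payload to the Kafka message body.
    Field names here are illustrative, not the exact ERP Cloud event schema."""
    return {
        "SupplierId": event["SupplierId"],
        "SupplierName": event.get("SupplierName"),
        "SupplierNumber": event.get("SupplierNumber"),
        "Status": event.get("Status", "ACTIVE"),
    }

def publish_supplier(event, topic="supplier_topic", brokers="localhost:9092"):
    """Publish the mapped message to Kafka.
    Requires a running broker and the kafka-python package."""
    from kafka import KafkaProducer
    producer = KafkaProducer(
        bootstrap_servers=brokers,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(topic, build_supplier_message(event))
    producer.flush()
```

In OIC, the mapping step corresponds to `build_supplier_message` and the Kafka Adapter invoke corresponds to `publish_supplier`.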


Installing Kafka

Since we will be using Kafka as the enterprise messaging platform to publish supplier information, let's go through a quick set of steps to set up Kafka.

1.       Download Kafka from

2.      Unzip the gzipped Kafka binary file

3.      From a command prompt, run <kafka_home>\bin\windows\kafka-topics.bat and verify that you get usage output

4.      Add <kafka_home>\bin\windows to the PATH environment variable so the .bat scripts can be executed from anywhere

5.      Kafka uses ZooKeeper to coordinate the brokers/cluster topology and as a consistent store for configuration information. So, we first start ZooKeeper and then Kafka

6.      Create a “data” directory in <kafka_home>

7.      Create two directories under <kafka_home>\data (for example, zookeeper and kafka) to hold the ZooKeeper and Kafka data


8.      Edit <kafka_home>\config\ and set dataDir to the ZooKeeper data directory created in step 7
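For reference, the relevant entry in the ZooKeeper properties file (typically zookeeper.properties in a standard Kafka distribution) would look like the following; the directory path is illustrative:

```properties
# zookeeper.properties -- on Windows, use forward slashes (or escaped
# backslashes); the directory name below is illustrative
dataDir=C:/<kafka_home>/data/zookeeper
```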

9.      To start ZooKeeper, from the command line execute

zookeeper-server-start.bat <kafka_home>\config\

10.  Edit <kafka_home>\config\ and set log.dirs to the Kafka data directory created in step 7
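The corresponding entry in the broker properties file (typically server.properties) would look like the following; again, the directory path is illustrative:

```properties
# server.properties -- broker log/data directory; path is illustrative
log.dirs=C:/<kafka_home>/data/kafka
```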

11.    To start Kafka, from the command line execute

kafka-server-start.bat <kafka_home>\config\

Creating Kafka Topic

1.       From a command prompt, execute the below to create "supplier_topic":

> kafka-topics.bat --zookeeper --topic supplier_topic --create --partitions 3 --replication-factor 1

You should see a message that the topic was created

> kafka-topics.bat --zookeeper --list

You should see a list of topics, including "supplier_topic"

Installing Connectivity Agent

Installing the connectivity agent is mandatory in this case because Kafka is running on a local machine behind a firewall.

To install the connectivity agent, follow the steps @

Complete the highlighted steps above to install the connectivity agent, create an agent group, and associate the agent with the agent group.

Provisioning an ATP Database

In our use case we want to store the supplier information in a downstream application, in this case ATP. Here are the high-level steps to provision ATP and create the database objects.

Create an ATP database


Select Create Autonomous Database

ATPDev instance created

Select DB Connection and download the Instance Wallet

Navigate to Service Console -> Development -> Database Actions 

Select SQL Development Tile

Run the SQL script

DROP TABLE v_Supplier;

CREATE TABLE v_Supplier (

  v_SupplierId number NOT NULL PRIMARY KEY,

  v_SupplierName varchar2(20),

  v_SupplierNumber number,

  v_BusinessRelationship varchar2(20),

  v_Status varchar2(20),

  v_DUNSNumber int,

  v_TaxRegistrationNumber varchar2(20),

  v_TaxpayerId varchar2(20),

  v_TaxpayerCountryCode varchar2(20),

  v_SupplierTypeCode varchar2(20),

  v_TaxOrganizationType varchar2(20)

);


Creating Connections in OIC

a.      Create Apache Kafka Adapter Connection

Create a new connection selecting Apache Kafka Adapter and provide the connection configuration

b.      Create ATP Adapter Connection

Connection Configuration

Provide the service name from the tnsnames.ora file available in the wallet downloaded previously

Choose the security auth scheme JDBC Over SSL and provide the auth-related information:

·       Wallet

·       Wallet Password

·       DB Username and Password

Identify the service name from tnsnames.ora in the Wallet file (ex: atpdev_low)

Check the adapter docs if you plan to use other flavors of connectivity

Implementation of Integration Flow

We need to implement two integration flows as per the logical architecture defined:

  1. Producer Flow
  2. Consumer Flow

Let's look at the high-level steps to implement both integration flows.

Producer Flow in OIC


  •  Business event raised from ERP Cloud is captured in OIC (ERP Cloud Adapter)
  •  Enrich supplier info by invoking the Supplier REST service (ERP Cloud Adapter)
  •  Produce to the Kafka topic (Kafka Adapter)


1.       updateSupplier (ERP Cloud Adapter connection) – select the Supplier Updated event. This public event signals when a supplier is updated

2.      getSupplierInfo (ERP Cloud Adapter connection) – invoke the Suppliers REST resource to get enriched supplier info


3.       Map to getSupplierInfo – map SupplierId to the REST API template parameters
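For reference, the enrichment call in step 2 targets the Suppliers REST resource. A small sketch of how such a resource URL is composed; the /fscmRestApi path and version segment follow the usual Fusion REST convention and should be verified against your own instance's API catalog:

```python
def supplier_resource_url(base_url, supplier_id, version="11.13.18.05"):
    """Compose the Suppliers REST resource URL for one supplier.
    The path and version segment are assumptions based on the common
    Fusion REST convention -- verify against your pod."""
    return f"{base_url}/fscmRestApi/resources/{version}/suppliers/{supplier_id}"
```

In the OIC mapper, this corresponds to binding the SupplierId from the event payload to the REST template parameter.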

4.      produceSupplierMessage  (Kafka Adapter Connection) –

a.      Select Publish record to Kafka Topic

b.      Select the Supplier topic created previously

c.       Provide a sample supplier response JSON message structure to publish, and map the required elements
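A hypothetical sample of such a message structure, with fields aligned to the ATP table defined earlier (the actual element names and values depend on the Suppliers REST response you map from):

```json
{
  "SupplierId": 300000001,
  "SupplierName": "Acme Corp",
  "SupplierNumber": 12345,
  "BusinessRelationship": "SPEND_AUTHORIZED",
  "Status": "ACTIVE",
  "DUNSNumber": 123456789,
  "TaxRegistrationNumber": "GB123456789",
  "TaxpayerId": "12-3456789",
  "TaxpayerCountryCode": "US",
  "SupplierTypeCode": "SUPPLIER",
  "TaxOrganizationType": "CORPORATION"
}
```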

Consumer Flow in OIC

Logical flow

  •  Consume from the Kafka topic (Kafka Adapter)
  •  Upsert supplier information into the database (ATP Adapter)

1.       consumeSupplierUpdate (Kafka Adapter connection) – consume from supplier_topic and provide a consumer group and a polling frequency

2.      Provide a sample JSON document used in Producer Flow

3.      upsertSupplier (ATP Adapter connection) – upsert the supplier record into the ATP database table

Then select the supplier table
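An upsert against the table created earlier is equivalent to a SQL MERGE. The ATP Adapter generates this for you, but the equivalent statement (shown here with illustrative bind names, covering a subset of the columns) looks roughly like:

```sql
MERGE INTO v_Supplier t
USING (SELECT :SupplierId AS id FROM dual) s
  ON (t.v_SupplierId = s.id)
WHEN MATCHED THEN UPDATE SET
  t.v_SupplierName = :SupplierName,
  t.v_Status       = :Status,
  t.v_DUNSNumber   = :DUNSNumber
WHEN NOT MATCHED THEN INSERT
  (v_SupplierId, v_SupplierName, v_Status, v_DUNSNumber)
  VALUES (:SupplierId, :SupplierName, :Status, :DUNSNumber);
```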


Testing the End-to-End Flow

Log in to ERP Cloud with the Procurement Manager role and select Suppliers

Select the Manage Suppliers task, search for a supplier, and update a field, for example the DUNS number.


Navigate to the OIC Monitoring tab and observe that two flows were triggered

Producer Flow

Consumer Flow




Implementation Best Practice

At the start of this blog we mentioned two options to implement real-time supplier synchronization in OIC. Let's look at some pros and cons, and the recommended design pattern.

Disadvantages of Implementing Supplier Outbound Sync Service

  • You need to create an explicit interface implementing the Supplier Outbound Sync Service WSDL (printXMLAsync operation)
  • The response XML string from the service needs to be converted to a mapper-aware XML message, which requires additional orchestration steps in the OIC integration flow
  • Most importantly, on an event the supplier record XML payload is sent to the OIC endpoint in a fire-and-forget pattern on the Fusion side

The recommendation is to leverage the out-of-the-box supplier business events supported by the OIC ERP Cloud Adapter:

  • Business events are natively supported by the Oracle ERP Cloud Adapter, which protects you from any changes in the ERP Cloud WSDL interface.
  • When an event-based integration is activated, OIC creates a subscription entry in Oracle ERP Cloud. If the event-based integration flow is not reachable for some reason, ERP Cloud will retry the business events multiple times with an exponentially increasing time gap.

