
Integrating Oracle Social Data and Insight Cloud Service with Oracle Business Intelligence Cloud Service (BICS)


Introduction

 

This article outlines how to integrate Oracle Social Data and Insight Cloud Service with Oracle Business Intelligence Cloud Service (BICS). Two primary data movement patterns are described:

(a) JSON data is retrieved from an external cloud source using REST Web Services.

(b) Data is inserted into the Schema Service database with Apex PL/SQL functions.

The above topics have previously been discussed in past A-Team BICS blogs. What makes this article unique is that it retrieves and displays the results in real time. The results are stored temporarily in the database while they are viewed via a Dashboard, and the data could then be permanently archived if desired.

The eighteen steps below have been broken into two parts. Part A follows a similar pattern to that covered in “Integrating Oracle Service Cloud (RightNow) with Oracle Business Intelligence Cloud Service (BICS) – Part 2”. Part B incorporates ideas covered in “Executing a Stored Procedure from Oracle Business Intelligence Cloud Service” and “Using the Oracle Business Intelligence Cloud Service (BICS) REST API to Clear Cached Data”.


PART A – Retrieve & Load Data


1)    Review REST API Documentation

2)    Build REST Search Endpoint URL

3)    Run apex_web_service.make_rest_request

4)    Formulate JSON Path

5)    Create BICS Tables

6)    Parse JSON

7)    Execute PL/SQL

8)    Review Results


PART B – Trigger Results Real-Time


9)    Add Clear BI Server Cache Logic

10)  Create Function – to execute Stored Procedure

11) Test Function – that executes Stored Procedure

12)  Create Dummy Table (to reference the EVALUATE function)

13)  Create Repository Variables

14)  Create Expression in Data Modeler (that references EVALUATE function)

15)  Create Analysis – that executes EVALUATE function

16)  Create Analysis – to display results

17)  Create Dashboard Prompt

18)  Create Dashboard

Main Article

 

Part A – Retrieve & Load Data

 

Step 1 – Review REST API Documentation

 

Begin by reviewing the REST APIs for Oracle Social Data and Insight Cloud Service documentation. This article only covers the /v2/search endpoint, which is used to retrieve a list of companies or contacts that match given criteria. There are many other endpoints available in the API that may be useful for various integration scenarios.

Step 2 – Build REST Search Endpoint URL

 

Access Oracle Social Data and Insight Cloud Service from Oracle Cloud My Services.

The Service REST Endpoint (Company and Contact Data API) will be listed.

For Example: https://datatrial1234-IdentityDomain.data.us9.oraclecloud.com/data/api

Append v2/search to the URL.

For Example: https://datatrial1234-IdentityDomain.data.us9.oraclecloud.com/data/api/v2/search

Step 3 – Run apex_web_service.make_rest_request

 

1)    Open SQL Workshop from Oracle Application Express

Snap2

2)    Launch SQL Commands

Snap3

3)    Use the code snippet below as a starting point to build your PL/SQL.

Run the final PL/SQL in the SQL Commands Window.

Replace the URL, username, password, identity domain, and body parameters.

For a text version of the code snippet click here.

For detailed information on all body parameters available click here.

DECLARE
l_ws_response_clob CLOB;
l_ws_url VARCHAR2(500) := 'YourURL/data/api/v2/search';
l_body CLOB;
BEGIN
l_body := '{"objectType":"People","limit":"100","filterFields":
[{"name":"company.gl_ult_dun","value":"123456789"},
{"name":"person.management_level","value":"0"},
{"name":"person.department","value":"3"}],"returnFields":
["company.gl_ult_dun","person.parent_duns","person.first_name",
"person.last_name","person.department","person.management_level",
"person.gender_code","person.title","person.standardized_title",
"person.age_range","person.company_phone","person.company_phone_extn",
"name.mail","person.co_offical_id"]}';
--use rest to retrieve the Data Service Cloud - Social Data
apex_web_service.g_request_headers(1).name := 'Content-Type';
apex_web_service.g_request_headers(1).value := 'application/json';
apex_web_service.g_request_headers(2).name := 'X-ID-TENANT-NAME';
apex_web_service.g_request_headers(2).value := 'Identity Domain';
l_ws_response_clob := apex_web_service.make_rest_request
(
p_url => l_ws_url,
p_username => 'Username',
p_password => 'Password',
p_body => l_body,
p_http_method => 'POST'
);
dbms_output.put_line(dbms_lob.substr(l_ws_response_clob,12000,1));
dbms_output.put_line(dbms_lob.substr(l_ws_response_clob,12000,12001));
dbms_output.put_line(dbms_lob.substr(l_ws_response_clob,12000,24001));
END;

4)    Run the query. A subset of the JSON results should be displayed in the Results section of SQL Commands.

Additional dbms_output.put_line calls may be added should further debugging be required.

It is not necessary at this stage to view the entire result set. The key to this exercise is to prove that the URL is correct and can successfully be run through apex_web_service.make_rest_request.

5)    Currently the body parameter filterFields only accepts “value” and not “DisplayValue”; thus, it may be necessary to create dimension look-up tables. For this exercise two dimension look-up tables are used.

Look-up values may change over time and should be re-confirmed prior to table creation.

“Step 5 – Create BICS Tables” describes how to create the two look-up tables below.

Department – Lookup Values

Value    DisplayValue
0        Administration
1        Consulting
2        Education
3        Executive
4        Facilities
5        Finance
6        Fraternal Organizations
7        Government
8        Human Resources
9        Operations
10       Other
11       Purchasing
12       Religion
13       Research & Development
14       Sales & Marketing
15       Systems

Management Level – Lookup Values

Value    DisplayValue
0        C-Level
1        Vice-President
2        Director
3        Manager
4        Other

 

Step 4 – Formulate JSON Path


1)    When formulating the JSON path expression, it may be useful to use an online JSON Path Expression Tester.

There are many different free JSON tools available online. The one below is: https://jsonpath.curiousconcept.com

2)    For this exercise the values below will be extracted from the JSON.

Each path was tested in the JSON Path Expression Tester.

The attribute numbers 1-12 are associated with the order in which returnFields have been specified in the body parameter. Thus, attribute numbers may differ from the example if:

a) Fields are listed in a different sequence.

b) An alternative number or combination of fields is defined.

Value                             JSON Path Expression 
totalHits                         ‘totalHits’
company.gl_ult_dun                 parties[*].attributes[1].value
person.parent_duns                 parties[*].attributes[2].value
person.first_name                  parties[*].attributes[3].value
person.last_name                   parties[*].attributes[4].value
person.department                  parties[*].attributes[5].value
person.management_level            parties[*].attributes[6].value
person.gender_code                 parties[*].attributes[7].value
person.title                       parties[*].attributes[8].value
person.standardized_title          parties[*].attributes[9].value
person.age_range                   parties[*].attributes[10].value
person.company_phone               parties[*].attributes[11].value
person.company_phone_extn          parties[*].attributes[12].value
name.mail                          parties[*].email
person.co_offical_id               parties[*].id
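
For reference, the overall shape of the JSON returned by /v2/search is sketched below. This is an illustrative, heavily truncated example only: the values are placeholders, each attributes entry carries (at least) a value field, and the attributes appear in the order of the returnFields requested in the body. Always verify the real structure against the output captured in Step 3.

{
  "totalHits": 1,
  "parties": [
    {
      "id": "1234567890",
      "email": "jane.smith@example.com",
      "attributes": [
        { "value": "123456789" },
        { "value": "987654321" },
        { "value": "Jane" },
        { "value": "Smith" }
      ]
    }
  ]
}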

Step 5 –  Create BICS Tables

 

1)    Open SQL Workshop from Oracle Application Express

Snap2

2)    Launch SQL Commands

Snap3

3)    Create the SOCIAL_DATA_CONTACTS table in the BICS database.

To view the SQL in plain text click here.

CREATE TABLE SOCIAL_DATA_CONTACTS(
COMPANY_DUNS_NUMBER VARCHAR2(500),
CONTACT_DUNS_NUMBER VARCHAR2(500),
FIRST_NAME VARCHAR2(500),
LAST_NAME VARCHAR2(500),
DEPARTMENT VARCHAR2(500),
MANAGEMENT_LEVEL VARCHAR2(500),
GENDER VARCHAR2(500),
JOB_TITLE VARCHAR2(500),
STANDARDIZED_TITLE VARCHAR2(500),
AGE_RANGE VARCHAR2(500),
COMPANY_PHONE VARCHAR2(500),
COMPANY_PHONE_EXT VARCHAR2(500),
EMAIL_ADDRESS VARCHAR2(500),
INDIVIDUAL_ID VARCHAR2(500));

4)    Create and populate the DEPARTMENT_PROMPT look-up table in the BICS database.

CREATE TABLE DEPARTMENT_PROMPT(DEPT_NUM VARCHAR(500), DEPT_NAME VARCHAR2(500));

INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('0','Administration');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('1','Consulting');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('2','Education');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('3','Executive');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('4','Facilities');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('5','Finance');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('6','Fraternal Organizations');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('7','Government');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('8','Human Resources');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('9','Operations');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('10','Other');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('11','Purchasing');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('12','Religion');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('13','Research & Development');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('14','Sales & Marketing');
INSERT INTO DEPARTMENT_PROMPT(DEPT_NUM, DEPT_NAME) VALUES('15','Systems');

5)    Create and populate the MANAGEMENT_LEVEL_PROMPT look-up table in the BICS database.

CREATE TABLE MANAGEMENT_LEVEL_PROMPT(ML_NUM VARCHAR(500), ML_NAME VARCHAR2(500));

INSERT INTO MANAGEMENT_LEVEL_PROMPT(ML_NUM, ML_NAME) VALUES('0','C-Level');
INSERT INTO MANAGEMENT_LEVEL_PROMPT(ML_NUM, ML_NAME) VALUES('1','Vice-President');
INSERT INTO MANAGEMENT_LEVEL_PROMPT(ML_NUM, ML_NAME) VALUES('2','Director');
INSERT INTO MANAGEMENT_LEVEL_PROMPT(ML_NUM, ML_NAME) VALUES('3','Manager');
INSERT INTO MANAGEMENT_LEVEL_PROMPT(ML_NUM, ML_NAME) VALUES('4','Other');

Step 6 – Parse JSON


For a text version of the PL/SQL snippet click here.

Replace URL, username, password, identity domain, and body parameters.

The code snippet has been highlighted in different colors, grouping the various logical components.

Blue

Rest Request that retrieves the data in JSON format as a clob.

Yellow

Look up “Value” codes based on the user's selection of “DisplayValue” descriptions.

Purple

Logic to handle ‘All Column Values’ when run from BICS. (This could be handled in many other ways … and is just a suggestion.)

Green

Convert JSON clob to readable list -> Parse JSON values and insert into database.

Code advice: Keep in mind that p_path is expecting a string. Therefore, it is necessary to concatenate any dynamic variables such as the LOOP / Count ‘i’ variable.

Red

Array to handle entering multiple duns numbers.

Grey

Left pad Duns numbers with zeros – as this is how they are stored in Oracle Social Data and Insight Cloud Service.

CREATE OR REPLACE PROCEDURE SP_LOAD_SOCIAL_DATA_CONTACTS(
p_company_duns varchar2
,p_department varchar2
,p_management_level varchar2
) IS
l_ws_response_clob CLOB;
l_ws_url VARCHAR2(500) := 'YourURL/data/api/v2/search';
l_body CLOB;
l_num_contacts NUMBER;
v_array apex_application_global.vc_arr2;
l_filter_fields VARCHAR2(500);
l_pad_duns VARCHAR2(9);
l_department_num VARCHAR2(100);
l_management_level_num VARCHAR2(100);
BEGIN
DELETE FROM SOCIAL_DATA_CONTACTS;
--lookup department code
IF p_department != 'All Column Values' THEN
SELECT MAX(DEPT_NUM) into l_department_num
FROM DEPARTMENT_PROMPT
WHERE DEPT_NAME = p_department;
END IF;
--lookup management level code
IF p_management_level != 'All Column Values' THEN
SELECT MAX(ML_NUM) into l_management_level_num
FROM MANAGEMENT_LEVEL_PROMPT
WHERE ML_NAME = p_management_level;
END IF;
--loop through company duns numbers
v_array := apex_util.string_to_table(p_company_duns, ',');
for j in 1..v_array.count LOOP
--pad duns numbers with zeros - as they are stored in the system this way
l_pad_duns := LPAD(v_array(j),9,'0');
--logic to handle All Column Values
IF p_department != 'All Column Values' AND p_management_level != 'All Column Values' THEN
l_filter_fields := '"filterFields":[{"name":"company.gl_ult_dun","value":"' || l_pad_duns || '"},{"name":"person.department","value":"' || l_department_num || '"},{"name":"person.management_level","value":"' || l_management_level_num || '"}]';
ELSE
IF p_department = 'All Column Values' AND p_management_level != 'All Column Values' THEN
l_filter_fields := '"filterFields":[{"name":"company.gl_ult_dun","value":"' || l_pad_duns || '"},{"name":"person.management_level","value":"' || l_management_level_num || '"}]';
ELSE
IF p_department != 'All Column Values' AND p_management_level = 'All Column Values' THEN
l_filter_fields := '"filterFields":[{"name":"company.gl_ult_dun","value":"' || l_pad_duns || '"},{"name":"person.department","value":"' || l_department_num || '"}]';
ELSE
l_filter_fields := '"filterFields":[{"name":"company.gl_ult_dun","value":"' || l_pad_duns || '"}]';
END IF;
END IF;
END IF;
--build dynamic body
l_body := '{"objectType":"People","limit":"100",' || l_filter_fields || ',"returnFields":["company.gl_ult_dun","person.parent_duns","person.first_name","person.last_name","person.department","person.management_level","person.gender_code","person.title","person.standardized_title","person.age_range","person.company_phone","person.company_phone_extn","name.mail","person.co_offical_id"]}';
--use rest to retrieve the Data Service Cloud - Social Data
apex_web_service.g_request_headers(1).name := 'Content-Type';
apex_web_service.g_request_headers(1).value := 'application/json';
apex_web_service.g_request_headers(2).name := 'X-ID-TENANT-NAME';
apex_web_service.g_request_headers(2).value := 'identity domain';
l_ws_response_clob := apex_web_service.make_rest_request
(
p_url => l_ws_url,
p_username => 'UserName',
p_password => 'Password',
p_body => l_body,
p_http_method => 'POST'
);
--parse the clob as JSON
apex_json.parse(l_ws_response_clob);
--get total hits
l_num_contacts := CAST(apex_json.get_varchar2(p_path => 'totalHits') AS NUMBER);
--loop through total hits and insert JSON data into database
IF l_num_contacts > 0 THEN
for i in 1..l_num_contacts LOOP

INSERT INTO SOCIAL_DATA_CONTACTS(COMPANY_DUNS_NUMBER, CONTACT_DUNS_NUMBER,FIRST_NAME,LAST_NAME,DEPARTMENT,MANAGEMENT_LEVEL,GENDER,JOB_TITLE,STANDARDIZED_TITLE,AGE_RANGE,COMPANY_PHONE,COMPANY_PHONE_EXT,EMAIL_ADDRESS,INDIVIDUAL_ID)
VALUES
(
v_array(j),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[2].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[3].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[4].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[5].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[6].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[7].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[8].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[9].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[10].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[11].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].attributes[12].value'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].email'),
apex_json.get_varchar2(p_path => 'parties[' || i || '].id')
);
end loop; --l_num_contacts
END IF;   --greater than 0

end loop; --v_array.count
commit;
END;

Step 7 – Execute PL/SQL

 

Run the PL/SQL in Apex SQL Commands. Test various combinations, including DUNS numbers with fewer than 9 digits.

SP_LOAD_SOCIAL_DATA_CONTACTS('123456789','All Column Values','All Column Values');

SP_LOAD_SOCIAL_DATA_CONTACTS('123456789','Administration','All Column Values');

SP_LOAD_SOCIAL_DATA_CONTACTS('123456789','All Column Values','Manager');

SP_LOAD_SOCIAL_DATA_CONTACTS('123456789','Administration','Manager');

SP_LOAD_SOCIAL_DATA_CONTACTS('1234567','All Column Values','All Column Values');

Step 8 – Review Results

 

Confirm data was inserted as expected.

SELECT * FROM SOCIAL_DATA_CONTACTS;

Part B – Trigger Results Real-Time

 

Oracle Social Data and Insight Cloud Service data will be retrieved by the following sequence of events.

 

a)    A BICS Consumer selects required input parameters via a Dashboard Prompt.

b)    Selected Dashboard Prompt values are passed to Request Variables.

(The Request Variable temporarily changes the state of the Repository Variable for that Session.)

c)    Session Variables (NQ_SESSION) are read by the Modeler Expression using the VALUEOF function.

d)    EVALUATE is used to call a Database Function and pass the Session Variable values to the Database Function.

e)    The Database Function calls a Stored Procedure, passing parameters from the Function to the Stored Procedure.

f)    The Stored Procedure uses apex_web_service to call the Rest API and retrieve data in JSON format as a clob.

g)    The clob is parsed and results are returned and inserted into a BICS database table.

h)    Results are displayed on a BICS Analysis Request, and presented to the BICS Consumer via a Dashboard.


Step 9 – Add Clear BI Server Cache Logic


For a text version of the PL/SQL snippets in step 9-11 click here.

This step is optional depending on how cache is refreshed / recycled.

Replace BICS URL, BICS User, BICS Pwd, and BICS Identity Domain.

Add cache code after insert commit (after all inserts / updates are complete).

Repeat testing undertaken in “Step 7 – Execute PL/SQL”.

DECLARE
l_bics_response_clob CLOB;
BEGIN
--clear BICS BI Server Cache
apex_web_service.g_request_headers(1).name := 'X-ID-TENANT-NAME';
apex_web_service.g_request_headers(1).value := 'BICS Identity Domain';
l_bics_response_clob := apex_web_service.make_rest_request
(
p_url => 'https://BICS_URL/bimodeler/api/v1/dbcache',
p_http_method => 'DELETE',
p_username => 'BICS UserName',
p_password => 'BICS Pwd'
);
--dbms_output.put_line('Status:' || apex_web_service.g_status_code);
END;


Step 10 – Create Function – to execute Stored Procedure

 

CREATE OR REPLACE FUNCTION LOAD_SOCIAL_DATA_CONTACTS
(
p_company_duns IN VARCHAR2,
p_department IN VARCHAR2,
p_management_level IN VARCHAR2
) RETURN INTEGER
IS PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
SP_LOAD_SOCIAL_DATA_CONTACTS(p_company_duns,p_department,p_management_level);
COMMIT;
RETURN 1;
END;

Step 11 – Test Function – that executes Stored Procedure

 

SELECT LOAD_SOCIAL_DATA_CONTACTS('123456789','All Column Values','All Column Values') FROM DUAL;

 

Step 12 – Create Dummy Table – to reference the EVALUATE function


For a text version of the PL/SQL snippet click here


1)    Create Table

CREATE TABLE DUMMY_REFRESH
(REFRESH_TEXT VARCHAR2(255));

2)    Insert descriptive text into table

INSERT INTO DUMMY_REFRESH (REFRESH_TEXT)
VALUES ('Hit Refresh to Update Data');

3)    Confirm insert was successful

SELECT * FROM DUMMY_REFRESH;

 

Step 13 – Create Repository Variables


Create a Repository Variable in the BICS Modeler tool for each parameter that needs to be passed to the function and stored procedure. In this example, three repository variables are created: r_company_duns, r_department, and r_management_level. These names are referenced by the EVALUATE expression in Step 14 and must match the Request Variables set in the Dashboard Prompt in Step 17.

Snap12

Snap13

Snap14

 

Step 14 –  Create Expression in Data Modeler – that references EVALUATE function

 

Create the expression in the BICS Modeler tool using EVALUATE to call the function and pass necessary parameters to the function and stored procedure.

EVALUATE('LOAD_SOCIAL_DATA_CONTACTS(%1,%2,%3)',VALUEOF(NQ_SESSION."r_company_duns"),VALUEOF(NQ_SESSION."r_department"),VALUEOF(NQ_SESSION."r_management_level"))

 

Snap2

 

Step 15 – Create Analysis – that executes EVALUATE function


Create an Analysis and add both fields from the DUMMY_REFRESH table (the REFRESH_TEXT field and the EVALUATE expression created in Step 14). Hide both fields so that nothing is returned.

 Snap4

Snap5

Snap6

Step 16 – Create Analysis – to display results

 
Add all of the fields from the SOCIAL_DATA_CONTACTS table, or just those desired.

 Snap16

Step 17 – Create Dashboard Prompt


For each Prompt set the corresponding Request Variable.

*** These must exactly match the names of the repository variables created in “Step 13 – Create Repository Variables” ***

Snap9

Snap10

Snap11

Snap15

For each prompt, manually add the text for 'All Column Values' and exclude NULLs.

Snap19

Snap20

The Dashboard also contains a workaround to deal with multiple DUNS numbers. Currently VALUELISTOF is not available in BICS; therefore, it is not possible to pass multiple values from a Prompt to a request / session variable, since VALUEOF can only handle a single value.

A suggested workaround is to put the multi-selection list into a single comma-delimited string using LISTAGG. The single string can then be read by VALUEOF, and the logic in the stored procedure can split the string and loop through the values.

CAST(EVALUATE_AGGR('LISTAGG(%1,%2) WITHIN GROUP (ORDER BY %1 DESC)',"DUNS_NUMBERS"."COMPANY_DUNS_NUMBER",',') AS VARCHAR(500))

 
Step 18 – Create Dashboard


There are many ways to design the Dashboard for the BICS Consumer. One suggestion is below:

The Dashboard is processed in seven clicks.

1)    Select Duns Number(s).

2)    Select Confirm Duns Number (only required for multi-select workaround described in Step 17).

3)    Select Department or run for ‘All Column Values’.

4)    Select Management Level or run for ‘All Column Values’.

5)    Click Apply. *** This is a very important step as Request Variables are only read once Apply is hit ***

6)    Click Refresh – to kick off Refresh Analysis Request (built in Step 15).

7)    Click Get Contact Details to display Contact Analysis Request (built in Step 16).

Snap7

Ensure the Refresh Report Link is made available on the Refresh Analysis Request – to allow the BICS Consumer to override cache.

Snap17

Optional: Use the "Link – Within the Dashboard" option on the Contact Analysis Request to create the "Get Contact Details" link.

Snap18

Further Reading


Click here for the Application Express API Reference Guide – MAKE_REST_REQUEST Function.

Click here for the Application Express API Reference Guide – APEX_JSON Package.

Click here for the REST APIs for Oracle Social Data and Insight Cloud Service guide.

Click here for more A-Team BICS Blogs.

Summary


This article provided a set of examples that leverage the APEX_WEB_SERVICE API to integrate Oracle Social Data and Insight Cloud Service with Oracle Business Intelligence Cloud Service (BICS) using REST web services.

The use case shown was for BICS and Oracle Social Data and Insight Cloud Service integration. However, many of the techniques referenced could be used to integrate Oracle Social Data and Insight Cloud Service with other Oracle and non-Oracle applications.

Similarly, the Apex MAKE_REST_REQUEST and APEX_JSON examples could be easily modified to integrate BICS or standalone Oracle Apex with any other REST web service that is accessed via a URL and returns JSON data.

Techniques referenced in this blog could be useful for those building BICS REST ETL connectors and plug-ins.

Key topics covered in this article include: Oracle Business Intelligence Cloud Service (BICS), Oracle Social Data and Insight Cloud Service, Oracle Apex API, APEX_JSON, apex_web_service.make_rest_request, PL/SQL, BICS Variables (Request, Repository, Session), BICS BI Server Cache, BICS Functions (EVALUATE, VALUEOF, LISTAGG), and Cloud to Cloud integration.


Oracle HCM Cloud – Bulk Integration Automation Using SOA Cloud Service


Introduction

Oracle Human Capital Management (HCM) Cloud provides a comprehensive set of tools, templates, and pre-packaged integration to cover various scenarios using modern and efficient technologies. One of the patterns is the batch integration to load and extract data to and from the HCM cloud. HCM provides the following bulk integration interfaces and tools:

HCM Data Loader (HDL)

HDL is a powerful tool for bulk-loading data from any source to Oracle Fusion HCM. It supports important business objects belonging to key Oracle Fusion HCM products, including Oracle Fusion Global Human Resources, Compensation, Absence Management, Performance Management, Profile Management, Global Payroll, Talent and Workforce Management. For detailed information on HDL, please refer to this.

HCM Extracts

HCM Extract is an outbound integration tool that lets you select HCM data elements, extract them from the HCM database, and archive these data elements as XML. This archived raw XML data can be converted into a desired format and delivered to recipients over supported channels.

Oracle Fusion HCM provides the above tools with comprehensive user interfaces for initiating data uploads, monitoring upload progress, and reviewing errors, with real-time information provided for both the import and load stages of upload processing. Fusion HCM provides the tools, but additional orchestration is still required, such as generating the FBL or HDL file, uploading these files to WebCenter Content, and initiating the FBL or HDL web services. This post describes how to design and automate these steps leveraging Oracle Service Oriented Architecture (SOA) Cloud Service deployed on Oracle's cloud Platform as a Service (PaaS) infrastructure. For more information on SOA Cloud Service, please refer to this.

Oracle SOA is the industry's most complete and unified application integration and SOA solution. It transforms complex application integration into agile and re-usable service-based components to speed time to market, respond faster to business requirements, and lower costs. SOA facilitates the development of enterprise applications as modular business web services that can be easily integrated and reused, creating a truly flexible, adaptable IT infrastructure. For more information on getting started with Oracle SOA, please refer to this. For developing SOA applications using SOA Suite, please refer to this.

These bulk integration interfaces and patterns are not applicable to Oracle Taleo.

Main Article

 

HCM Inbound Flow (HDL)

Oracle WebCenter Content (WCC) acts as the staging repository for files to be loaded and processed by HDL. WCC is part of the Fusion HCM infrastructure.

The loading process for FBL and HDL consists of the following steps:

  • Upload the data file to WCC/UCM using WCC GenericSoapPort web service
  • Invoke the “LoaderIntegrationService” or the “HCMDataLoader” to initiate the loading process.

However, the above steps assume the existence of an HDL file and do not provide a mechanism to generate an HDL file for the respective objects. In this post we will use a sample use case in which we receive a data file from the customer, transform the data to generate an HDL file, and then initiate the loading process.

The following diagram illustrates the typical orchestration of the end-to-end HDL process using SOA cloud service:

 

hcm_inbound_v1

HCM Outbound Flow (Extract)

The “Extract” process for HCM has the following steps:

  • An Extract report is generated in HCM either by user or through Enterprise Scheduler Service (ESS)
  • Report is stored in WCC under the hcm/dataloader/export account.

 

However, the report must then be delivered to its destination depending on the use cases. The following diagram illustrates the typical end-to-end orchestration after the Extract report is generated:

hcm_outbound_v1

 

For an HCM bulk integration introduction, including security, roles and privileges, please refer to my blog Fusion HCM Cloud – Bulk Integration Automation using Managed File Transfer (MFT) and Node.js. For an introduction to WebCenter Content integration services using SOA, please refer to my blog Fusion HCM Cloud Bulk Automation.

 

Sample Use Case

Assume that a customer periodically receives benefits data from their partner in a comma separated value (CSV) file. This data must be converted into HDL format for the “ElementEntry” object, and the loading process then initiated in Fusion HCM Cloud.

This is a sample source data:

E138_ASG,2015/01/01,2015/12/31,4,UK LDG,CRP_UK_MNTH,E,H,Amount,23,Reason,Corrected all entry value,Date,2013-01-10
E139_ASG,2015/01/01,2015/12/31,4,UK LDG,CRP_UK_MNTH,E,H,Amount,33,Reason,Corrected one entry value,Date,2013-01-11

This is the HDL format of the ElementEntry object that needs to be generated based on the above sample file:

METADATA|ElementEntry|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|ElementName|EntryType|CreatorType
MERGE|ElementEntry|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|E|H
MERGE|ElementEntry|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|E|H
METADATA|ElementEntryValue|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|ElementName|InputValueName|ScreenEntryValue
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Amount|23
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Reason|Corrected all entry value
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Date|2013-01-10
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Amount|33
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Reason|Corrected one entry value
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Date|2013-01-11

SOA Cloud Service Design and Implementation

A canonical schema pattern has been implemented to design the end-to-end inbound bulk integration process – from the source data file to generating the HDL file and initiating the loading process in HCM Cloud. The XML schema of the HDL object “ElementEntry” is created. The source data is mapped to this HDL schema, and SOA activities generate the HDL file.

Having a canonical pattern automates the generation of the HDL file, and it becomes a reusable asset for various interfaces. The developer or business user only needs to focus on mapping the source data to this canonical schema. All other activities, such as generating the HDL file, compressing and encrypting the file, uploading the file to WebCenter Content, and invoking the web services, need to be developed only once; once developed, they also become reusable assets.

Please refer to Wikipedia for the definition of Canonical Schema Pattern

The design considerations are as follows:

1. Convert source data file from delimited format to XML

2. Generate Canonical Schema of ElementEntry HDL Object

3. Transform source XML data to HDL canonical schema

4. Generate and compress HDL file

5. Upload a file to WebCenter Content and invoke HDL web service

 

Please refer to SOA Cloud Service Develop and Deploy for introduction and creating SOA applications.

SOA Composite Design

This is a composite based on the above implementation principles:

hdl_composite

Convert Source Data to XML

“GetEntryData” in the above composite is a File Adapter service. It is configured to use native format builder to convert CSV data to XML format. For more information on File Adapter, refer to this. For more information on Native Format Builder, refer to this.

The following provides detailed steps on how to use Native Format Builder in JDeveloper:

In the Native Format Builder, select the delimited format type and use the source data as a sample to generate an XML schema. Please see the following diagrams:

FileAdapterConfig

nxsd1

nxsd2_v1 nxsd3_v1 nxsd4_v1 nxsd5_v1 nxsd6_v1 nxsd7_v1

Generate XML Schema of ElementEntry HDL Object

A similar approach is used to generate ElementEntry schema. It has two main objects: ElementEntry and ElementEntryValue.

ElementEntry Schema generated using Native Format Builder

<?xml version = ‘1.0’ encoding = ‘UTF-8’?>
<xsd:schema xmlns:xsd=”http://www.w3.org/2001/XMLSchema” xmlns:nxsd=”http://xmlns.oracle.com/pcbpel/nxsd” xmlns:tns=”http://TargetNamespace.com/GetEntryHdlData” targetNamespace=”http://TargetNamespace.com/GetEntryHdlData” elementFormDefault=”qualified” attributeFormDefault=”unqualified” nxsd:version=”NXSD” nxsd:stream=”chars” nxsd:encoding=”UTF-8″>
<xsd:element name=”Root-Element”>
<xsd:complexType>
<xsd:sequence>
<xsd:element name=”Entry” minOccurs=”1″ maxOccurs=”unbounded”>
<xsd:complexType>
<xsd:sequence>
<xsd:element name=”METADATA” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”ElementEntry” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”EffectiveStartDate” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”EffectiveEndDate” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”AssignmentNumber” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”MultipleEntryCount” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”LegislativeDataGroupName” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”ElementName” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”EntryType” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”CreatorType” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”${eol}” nxsd:quotedBy=”&quot;”/>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
<xsd:annotation>
<xsd:appinfo>NXSDSAMPLE=/ElementEntryAllSrc.dat</xsd:appinfo>
<xsd:appinfo>USEHEADER=false</xsd:appinfo>
</xsd:annotation>
</xsd:schema>

ElementEntryValue Schema generated using Native Format Builder

<?xml version = ‘1.0’ encoding = ‘UTF-8’?>
<xsd:schema xmlns:xsd=”http://www.w3.org/2001/XMLSchema” xmlns:nxsd=”http://xmlns.oracle.com/pcbpel/nxsd” xmlns:tns=”http://TargetNamespace.com/GetEntryValueHdlData” targetNamespace=”http://TargetNamespace.com/GetEntryValueHdlData” elementFormDefault=”qualified” attributeFormDefault=”unqualified” nxsd:version=”NXSD” nxsd:stream=”chars” nxsd:encoding=”UTF-8″>
<xsd:element name=”Root-Element”>
<xsd:complexType>
<xsd:sequence>
<xsd:element name=”EntryValue” minOccurs=”1″ maxOccurs=”unbounded”>
<xsd:complexType>
<xsd:sequence>
<xsd:element name=”METADATA” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”ElementEntryValue” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”EffectiveStartDate” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”EffectiveEndDate” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”AssignmentNumber” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”MultipleEntryCount” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”LegislativeDataGroupName” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”ElementName” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”InputValueName” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”|” nxsd:quotedBy=”&quot;”/>
<xsd:element name=”ScreenEntryValue” type=”xsd:string” nxsd:style=”terminated” nxsd:terminatedBy=”${eol}” nxsd:quotedBy=”&quot;”/>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
<xsd:annotation>
<xsd:appinfo>NXSDSAMPLE=/ElementEntryAllSrc.dat</xsd:appinfo>
<xsd:appinfo>USEHEADER=false</xsd:appinfo>
</xsd:annotation>
</xsd:schema>

In the Native Format Builder, change the “|” separator to “,” in the sample file, and then change it back to “|” for each element in the generated schema.

Transform Source XML Data to HDL Canonical Schema

Since we are using a canonical schema, all we need to do is map the source data appropriately and the Native Format Builder will convert each object into the HDL output file. The transformation could be complex depending on the source data format and the organization of data values. In our sample use case, each row has one ElementEntry object and 3 ElementEntryValue sub-objects.

The following provides the organization of the data elements in a single row of the source:

Entry_Desc_v1

The main ElementEntry entries are mapped to each respective row, but the ElementEntryValue attributes are located at the end of each row. In this sample, that results in 3 ElementEntryValue entries per row. This can be achieved easily by splitting and transforming each row with different mappings, as follows:

<xsl:for-each select=”/ns0:Root-Element/ns0:Entry”> – map pair of columns “1” from above diagram

<xsl:for-each select=”/ns0:Root-Element/ns0:Entry”> – map pair of columns “2” from above diagram

<xsl:for-each select=”/ns0:Root-Element/ns0:Entry”> – map pair of columns “3” from above diagram

 

Metadata Attribute

The most common use case is to use the “merge” action for creating and updating objects. In this use case the action is hard-coded to “merge”, but it could be made dynamic if the source data row carries this information. The “delete” action removes the entire record and must not be used with a “merge” instruction for the same record, as HDL cannot guarantee the order in which the instructions will be processed. It is highly recommended to correct the data rather than to delete and recreate it using the “delete” action, as deleted data cannot be recovered.
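
For illustration only, a delete instruction follows the same pipe-delimited METADATA/instruction layout as the MERGE lines shown earlier. The exact attributes required to identify the record are object-specific, so the lines below are a hypothetical sketch and should be verified against the HDL documentation before use:

METADATA|ElementEntry|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|ElementName|EntryType|CreatorType
DELETE|ElementEntry|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|E|H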

 

This is the sample XSL transformation, developed in JDeveloper, that splits each row into 3 rows for the ElementEntryValue object:

<xsl:template match=”/”>
<tns:Root-Element>
<xsl:for-each select=”/ns0:Root-Element/ns0:Entry”>
<tns:Entry>
<tns:METADATA>
<xsl:value-of select=”‘MERGE'”/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select=”‘ElementEntryValue'”/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select=”ns0:C2″/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select=”ns0:C3″/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select=”ns0:C1″/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select=”ns0:C4″/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select=”ns0:C5″/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select=”ns0:C6″/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select=”ns0:C9″/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select=”ns0:C10″/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
<xsl:for-each select=”/ns0:Root-Element/ns0:Entry”>
<tns:Entry>
<tns:METADATA>
<xsl:value-of select=”‘MERGE'”/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select=”‘ElementEntryValue'”/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select=”ns0:C2″/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select=”ns0:C3″/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select=”ns0:C1″/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select=”ns0:C4″/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select=”ns0:C5″/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select=”ns0:C6″/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select=”ns0:C11″/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select=”ns0:C12″/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
<xsl:for-each select=”/ns0:Root-Element/ns0:Entry”>
<tns:Entry>
<tns:METADATA>
<xsl:value-of select=”‘MERGE'”/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select=”‘ElementEntryValue'”/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select=”ns0:C2″/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select=”ns0:C3″/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select=”ns0:C1″/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select=”ns0:C4″/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select=”ns0:C5″/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select=”ns0:C6″/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select=”ns0:C13″/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select=”ns0:C14″/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
</tns:Root-Element>
</xsl:template>

BPEL Design – “ElementEntryPro…”

This is the BPEL component where all the major orchestration activities are defined. In this sample, all the activities after the transformation are reusable and can be moved to a separate composite. A separate composite could be developed just for transformation and data enrichment, which would then invoke the reusable composite to complete the loading process.

 

hdl_bpel_v2

 

 

SOA Cloud Service Instance Flows

The following diagram shows an instance flow:

ElementEntry Composite Instance

instance1

BPEL Instance Flow

audit_1

Receive Input Activity – receives the delimited data, converted to XML format through the Native Format Builder, using the File Adapter

audit_2

Transformation to Canonical ElementEntry data

Canonical_entry

Transformation to Canonical ElementEntryValue data

Canonical_entryvalue

Conclusion

This post demonstrates how to automate HCM inbound and outbound patterns using SOA Cloud Service. It shows how to convert the customer's data to HDL format and then initiate the loading process. This process can also be replicated for other Fusion Applications pillars, such as Oracle Enterprise Resource Planning (ERP).

Behind the Delete Trigger in Sales Cloud Application Composer


Cautionary Details on Delete Trigger Behavior with TCA Objects

Developers and technically-inclined users who have ever needed to extend Oracle Sales Cloud are probably familiar with Application Composer (known as App Composer to its friends) — the built-in, browser-based collection of tools that makes it possible to extend Sales Cloud safely without requiring a level of system access that would be inconsistent with and unsafe for the cloud infrastructure. Likewise, many who have built App Composer extensions probably know about object triggers and how to add custom Groovy scripts to these events. Object trigger logic is a major part of most Sales Cloud extensions, especially when there is a need to communicate with external systems. With current Sales Cloud releases (Rel9 and Rel10), harnessing these trigger events and arming them with Groovy scripts arguably has become one of the more effective strategies for developing point-to-point data synchronization extensions.

Existing documentation on trigger usage in App Composer is located in two places: the Groovy Scripting Reference  and the guide for Customizing Sales. But due to the scope of these documents and the extent of topics that require coverage, the authors were unable to provide detailed information on triggers.  These reference guides were never meant to offer best practice recommendations on the use of specific triggers for different purposes. Given this need — that Sales Cloud extension developers need more guidance in order to be more proficient when using object triggers — there are numerous areas requiring deeper technical investigation.  These topics can, and probably should, be covered in detail in the blog arena.

One area requiring additional clarification and vetting of options is a somewhat obscure anomaly in the behavior of delete triggers across different objects in Sales Cloud. By design, objects belonging to Oracle’s Trading Community Architecture (TCA) data model – for example Accounts, Addresses, Contacts, Households, and more – are never deleted physically from the database, at least not through the native Sales Cloud application UI. Therefore, delete triggers do not fire as expected for these objects. In other words, any piece of App Composer extension logic touching TCA objects that includes a delete trigger as one of its components will probably fail. For non-TCA objects, triggers specific to delete actions work as designed. This post will explore differences in delete trigger behavior between TCA and non-TCA data objects in Sales Cloud. The illustrative use case used for this post is a requirement to keep a rudimentary audit trail of deleted object activity in Sales Cloud, tracking the object deleted along with user and timestamp values.

Prerequisites for the Use Case

A custom object, “DeleteAuditLog”, will act as the container for storing the archived object, user, and timestamp details. It has the following attributes:

Field Name        Field Type  Standard/Custom  Additional Info
CreatedBy         Text        Standard         value used to determine who performed the delete
CreationDate      Date        Standard         date of deletion
RecordName        Text        Standard
LastUpdateDate    Date        Standard
LastUpdatedBy     Text        Standard
Id                Number      Standard
ObjectId          Number      Custom           holds id of object deleted
ObjectType        Text        Custom           holds type of object deleted
ObjectDetail      Text        Custom           holds details for object deleted

Granted, the data elements that will be saved to the custom object do not represent an extremely meaningful audit trail; the intent is not to implement a complete solution but rather to demonstrate the potential of what is possible with trigger scripts.

Although not absolutely required, a global function that takes care of the DeleteAuditLog record creation and assignment of attribute values eliminates duplicate code.  Having it in place as a global function is consistent with modular coding best practices. Here are the details for this global function:

triggers8
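
The screenshot above shows the global function definition. As a rough sketch only (not a copy of the function shown in the screenshot), the body of CreateDeleteAuditLog could be written in Groovy as below; the API names DeleteAuditLog_c, ObjectType_c, ObjectId_c, and ObjectDetail_c are assumed names for the custom object and its fields and would need to match your own configuration.

// Global function: Boolean CreateDeleteAuditLog(String objType, Long objId, String objDtl)
// Creates one DeleteAuditLog record capturing what was deleted; CreatedBy and
// CreationDate are populated automatically as standard fields.
def auditView = newView('DeleteAuditLog_c')  // accessor for the custom object (assumed API name)
def auditRow = auditView.createRow()         // create a new, not-yet-inserted row
auditRow.setAttribute('ObjectType_c', objType)
auditRow.setAttribute('ObjectId_c', objId)
auditRow.setAttribute('ObjectDetail_c', objDtl)
auditView.insertRow(auditRow)                // queue the row for insert in the current transaction
return true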

Trigger Overview

For readers who have not yet ventured into the world of App Composer triggers, a short introduction is in order. Creating Groovy scripts for event triggers is a way to extend Sales Cloud by telling the system to do something extra when object-related system events occur. That “something extra” can take a variety of forms: validating a field, populating a custom field, calling out to an external web service, creating new instances of objects (standard or custom), or reacting to a given value in an object field and doing some extra processing. Different sequences of triggers fire when objects are created, modified, or deleted. Some triggers fire and are shared across multiple UI actions; others are single-purpose.

There are up to fifteen different object trigger events exposed in App Composer. Not all of these fifteen trigger events are exposed across all objects, however.  For example, the “Before Commit to the Database” trigger is not exposed for objects belonging to the Sales application container. To access and extend trigger events, navigate to the object of interest in the left navigation frame after selecting the correct application container, and then expand the object accordion, which will expose the “Server Scripts” link.

triggers1

Clicking the Server Scripts link populates the App Composer content window with the Server Scripts navigator for the selected object, one component of which is the tab for Triggers. (There are additional tabs in the content window for Validation Rules and Object Functions.) Selecting the Trigger tab exposes areas for Object Triggers and Field Triggers. New, edit, and delete actions for triggers are available through icons or through the Action drop-down menus for Object Triggers and Field Triggers.

triggers2

The majority of triggers are related to database events: before/after insert, before/after update, before/after delete, before/after commit, before/after rollback, and after changes are posted to the database. There are several remaining triggers related to ADF page life cycle events: after object create, before modify, before invalidate, and before remove.

For investigative purposes, it can be instructive to create Groovy scripts for these trigger events in order to reveal firing order and other trigger behaviors; in fact, that was the strategy used here to clarify trigger behavior across different types of objects. Thus, a typical trigger script that does nothing other than log the trigger event might consist of the following two lines:

println 'Entering AfterCreateTrigger ' + adf.util.AddMilliTimestamp()
println 'Exiting AfterCreateTrigger ' + adf.util.AddMilliTimestamp()

(NOTE: AddMilliTimestamp is a global function that displays time formatted with an added milliseconds component.)
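
The AddMilliTimestamp source is not shown in this post; as a minimal sketch of what such a helper might look like, a Groovy global function could simply format the current time with a milliseconds component:

// Global function: String AddMilliTimestamp()
// Returns the current time formatted with milliseconds, e.g. 14:05:37.123
return new Date().format('HH:mm:ss.SSS')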

After Groovy trigger scripts are in place for the object trigger events in focus, it then becomes possible to test multiple actions (e.g. object creates, updates, deletes) across different objects in the Sales Cloud user interface. This results in debug-style statements getting written to server logs, which can then be examined in the App Composer Run Time Messages applet to discover end-to-end trigger behavior for various UI actions. The logs for commonly-performed UI actions on Sales Cloud objects follow below. (Run Time Messages content was exported to spreadsheet format to allow selecting the subset of messages applicable to each UI action).

Object create followed by Save and Close:

triggers3

Object modify followed by Save and Close:

triggers4

Object (non-TCA) delete followed by User Verification of the Delete Action:

triggers5

Normal Delete Trigger Behavior

From the above listings, delete triggers work pretty much as expected, at least for non-TCA objects. BeforeRemove, BeforeInvalidate, and BeforeModify events occur before the database-related events – Before/After Delete and AfterCommit – fire. Given this sequence of events, if the goal of the Sales Cloud extension is to log details of the delete event, then it probably makes the most sense to target the unique trigger that fires as soon as the object deletion becomes a known sure thing but before the transaction commit in order to get the log record created in the same database transaction. In this case, therefore, focusing on the AfterDelete event should be optimal; it only fires for the delete action and it occurs, by definition, after the delete event occurs in the database.

There is a behavior unique to the delete action and its chain of triggers, however, that makes implementation of this straightforward approach a bit more complicated.  After the BeforeModify trigger event, which occurs fairly early in the event chain, getting a handle to the to-be-deleted record becomes impossible.  If a need exists, therefore, to read any of the record’s attribute values, it has to be done during or before the BeforeModify event.  After that event the system treats the record as deleted so effectively it is no longer available.

Because the audit log requirement requires reading the value of an object id and then writing that to the audit trail record, hooking into the BeforeModify event is required.  But because the BeforeModify trigger is no longer unique to the delete UI action, the script would somehow have to include a check to make sure that the trigger is a part of the delete chain of events and not being fired for a normal update action.  There does not seem to be a way to perform this check using any native field values, so one option might be to push field values onto the session stack in the BeforeModify trigger, and then pull them off the session stack in the AfterDelete trigger.

Script for the BeforeModify trigger event:

println 'Entering BeforeModify ' + adf.util.AddMilliTimestamp()
adf.userSession.userData.put('ObjectId', SalesAccountPartyId)
adf.userSession.userData.put('ObjectDetail', 'Name: ' + Name + ', Org: ' + OrganizationName)
println 'Exiting BeforeModify ' + adf.util.AddMilliTimestamp()

Script for the AfterDelete trigger event:

println 'Entering AfterDelete ' + adf.util.AddMilliTimestamp()
println 'Creating Delete Audit Log Record'
def objType = 'Sales Lead'
def objId = adf.userSession.userData.ObjectId
def objDtl = adf.userSession.userData.ObjectDetail
def logResult = adf.util.CreateDeleteAuditLog(objType, objId, objDtl) ? 
  'Delete Log Record created OK' : 'Delete Log Record create failure'
println logResult
println 'Exiting AfterDelete ' + adf.util.AddMilliTimestamp()

TCA Objects and the Delete Trigger

Implementing a similar trigger for a TCA object (e.g. Account) leads to a far different outcome. The failure to log the Account UI delete event becomes clear when the debug Run Time Messages associated with the event are examined.

The delete action on the Account object results in the following series of triggers getting fired:

triggers6

The above listing of fired triggers is representative of what takes place when a TCA object record is “deleted” in the UI.  By design, soft deletes occur instead of physical deletes, so the sequence of trigger events looks more like the objects are being modified than deleted. And actually, record updates are indeed what occur.  For TCA objects, instead of being physically deleted they are marked as inactive by changing the value of the PartyStatus (or similar) field to ‘I’. This value tells the system to treat records as if they no longer exist in the database.

Therefore, hooking into delete trigger events for TCA objects will never have the desired effect.  What can be done about the audit log use case?  Now that this behavior is known for TCA objects, and knowing that the value of the PartyStatus (or equivalent) field can be used to check for the delete UI action, all of the audit log logic can be contained in the BeforeModify trigger event.  There is no need to push and pull values off of the session.  Hooking into the BeforeModify trigger event remains viable for TCA objects even though the chain of triggers is far different.

Here is a sample script, which shares some pieces of the delete script for Sales Lead above, for the Account (TCA) object:

println 'Entering BeforeModifyTrigger ' + adf.util.AddMilliTimestamp()
// check for delete activity
if (isAttributeChanged('PartyStatus') && PartyStatus == 'I') {
  println 'Creating Delete Audit Log Record'
  def objType = 'Account'
  def objId = PartyId
  def objDtl = OrganizationName
  def logResult = adf.util.CreateDeleteAuditLog(objType, objId, objDtl) ? 
    'Delete Log Record created OK' : 'Delete Log Record create failure'
  println logResult
} else {
  println 'Record Change other than a delete attempt'
}
println 'Exiting BeforeModifyTrigger ' + adf.util.AddMilliTimestamp()

Short-Term Workarounds and Long-Term Prognosis

Multiple customer service requests related to the delete trigger anomaly for TCA objects have been filed across various versions of Sales Cloud, and this activity has resulted in at least one KnowledgeBase article (Unable To Restrict Deletion Of Contacts Via Groovy, Doc ID 2044073.1) getting published about the behavior. A workaround entirely consistent with what was presented here for TCA objects is discussed in the article. For the longer term, enhancement request #21557055 has been filed and approved for a future release of Sales Cloud.

Integrating Oracle Document Cloud and Oracle Sales Cloud, maintaining data level business object security


Introduction

When customers see the rich functionality available in Oracle Documents Cloud they often ask if they can use this functionality within their Oracle Fusion SaaS applications to store and share documents. At first the integration appears to be quite straightforward: use the Documents Cloud Web API, embed an iFrame that points to the relevant opportunity id folder in Oracle Documents Cloud, and all should be good. However, the implementer will soon realise that this approach, on its own, would not respect the Oracle Fusion Applications data security model (i.e. who has authorization to see the documents?) and is therefore not production worthy.

The remainder of this blog article describes the challenges faced when integrating Oracle Documents Cloud with Oracle Sales Cloud and proposes a design pattern which solves many of the issues encountered, whilst ensuring the pattern is flexible enough to be used by multiple SaaS applications.

Once Oracle Documents Cloud is embedded within Oracle Sales Cloud, the user will have many additional features available to them, such as:

  • Hierarchical folders
  • In Place viewing of documents of different types (Word, Excel, Powerpoint etc)
  • Drag and Drop interface for adding new documents
  • And more

For more information on Oracle Documents Cloud features please see Oracle Documents Cloud webpage

What are the challenges embedding Oracle Documents Cloud with Oracle Sales Cloud?

There are a number of considerations which need to be taken into account when embedding Oracle Documents Cloud into Oracle Sales Cloud, namely:

Angelo_DocCld_sampleimage

Screenshot of Documents Cloud Embedded within a Sales Cloud

  • How do you embed the Oracle Documents Cloud user interface in Oracle Sales Cloud?
  • Security: how are you going to ensure documents are secured based on SaaS data security policies? These normally do not map to PaaS security roles
  • How do you ensure the integration is easily maintainable?

It is possible that you will have multiple Oracle SaaS applications installed (Sales, HCM, ERP, Service, etc.), and ideally you would want to use the same integration pattern for all of your SaaS integrations. This raises its own set of challenges, given that different SaaS products offer different extensibility tools (Application Composer, .NET, PHP, etc.) and differing levels of extensibility. To ensure the highest level of reuse, you will need to use the lowest common denominator extensibility option and move common logic to a common location, such as a middle tier.

Architecture

BlogArchitecture

High Level Architecture

Oracle Java Cloud – SaaS Extensions is a specific variant of the Oracle Java Cloud server, designed specifically to integrate with Oracle SaaS Applications. JCS-SX's main differentiators are identity association with Fusion Applications and low cost due to its multi-tenanted architecture. For more information please see the Oracle Java Cloud – SaaS Extensions (JCS-SX) website.

The implementation of this pattern uses the Oracle Java Cloud – SaaS Extensions (JCS-SX) service to host some Java code, in the form of a JEE servlet. The functionality of this servlet includes:

  • Ensuring only Fusion Applications authenticated users can access a document folder
  • If an object doesn't already have a folder, creating the folder in Oracle Documents Cloud and then storing its GUID back in Oracle Sales Cloud, so that subsequent calls are more efficient
  • Allowing access only to folders the user can "see" from the SaaS application – thus ensuring the solution respects SaaS visibility rules
  • Returning a generated HTML page which displays the Oracle Documents Cloud folder for the Sales Cloud object requested

Historically, a commonly used pattern involved Oracle Sales Cloud Groovy code calling a SOAP façade hosted on JCS-SX, which in turn communicated with Oracle Documents Cloud over its REST API to create folders. This pattern is not used here for a number of reasons:

  • It would require additional customizations/Groovy code on the SaaS side to call our SOAP-REST façade, and one of the goals of this integration pattern is to reduce the amount of SaaS-side customization as much as possible so that it can support as many SaaS products as possible with a single code base.
  • It would make the JCS-SX code more complex than it really needs to be; we would need multiple entry points (createFolder, shareFolder, generateAppLink, etc.).
  • Finally, although Oracle Sales Cloud has great outbound SOAP service support, it is often best practice to encapsulate multiple, potentially complex, SOAP calls in a middle-tier layer, in our case a servlet hosted on JCS-SX.

The added benefit is that, because the bulk of the business logic sits in our Java layer, the integration can easily be used for other SaaS applications, like HCM, ERP, Service Cloud, or even non-Oracle SaaS products. These offer different levels of customization and different languages, but fundamentally they all support the concept of embedding an iFrame and passing parameters.

The Integration Flow

As the saying goes, "A picture is worth a thousand words", so at this point let's look at the integration flow implemented by the pattern. We will do this by means of a flow diagram, going through each step to explain the functionality, highlight the design considerations the developer should be aware of, and offer some limited code samples.

Angelo_DocCld_flow

SalesCloud to DocCloud Integration Flow

Step 1 is the action of calling the integration from within an Oracle Sales Cloud tab. Here we are embedding the integration servlet within Sales Cloud using the Application Composer framework and some Groovy scripting. The Groovy script below calls the JCS-SX servlet, passing data (the object type, the object number, a JWT security token and, optionally, an Oracle Documents Cloud folder GUID stored in a custom field), and the response of the Java servlet is an HTML page which redirects to the specific Oracle Documents Cloud folder.

def jwt = (new oracle.apps.fnd.applcore.common.SecuredTokenBean()).getTrustToken();
def docCloudIntegrationURL = oracle.topologyManager.client.deployedInfo.DeployedInfoProvider.getEndPoint("DocSalesCloudIntegration");
def url = docCloudIntegrationURL+"?objectnumber="+OptyNumber+"&objecttype=OPPORTUNITY&jwt="+jwt+"&folderGUID="+folderGUID;
return url;

TIP : In the above example we have used a feature of Oracle Fusion Applications called "Topology Manager" to store the endpoint of our JCS-SX hosted servlet. This is good practice as it allows a level of indirection: the developer can store the physical hostname/URL of the servlet in one place and reuse it in many places, e.g. a tab for Opportunities, a tab for Accounts, and so on.

For more information please refer to the Oracle Documentation : Working with user Tokens & Topology Manager

For specific steps on how to create a tab in the Sales Cloud simplified UI, please see this documentation link or this short YouTube video by our Fusion Applications Developer Relations group.

Step 2 is now being executed from within our servlet running on JCS-SX. This is where we check that the user who has requested to see the object's documents actually has permission to do so. To accomplish this we issue a REST query back to Oracle Sales Cloud asking if this user (identified by the JWT security token passed) can query this specific object using the object number.

To check the object is accessible we issue the following REST call in our Java code

GET <SalesCloudServerURL>/salesApi/resources/latest/opportunities/123456?fields=Name&onlyData=true


TIP : For efficiency purposes, when querying data from Oracle Fusion Applications using the REST or SOAP API, ensure that you only return the data you require. For the REST API, simply add the "fields" parameter with a comma-separated list of field names. For the SOAP API, you can add <findAttribute> tags with the fields you wish to return.

If the query to Oracle Sales Cloud returns data (i.e. a single row) then we can assume the user has valid access to the object, at that time, and thus can proceed. If we get a “no data found” (i.e. a 404 error) then we simply return a security exception to the user interface as the user has no access. In practice the user should never receive this error as the URL call is generated based on them navigating to a screen with access but for security reasons this check is necessary.

The principal advantage of this approach is that we are using SaaS application functionality to determine if a record is visible to a user, rather than trying to determine it from the Java servlet. We assume that if you can query the object from the SaaS API then you have access. Additionally, this technique will work for any SaaS application regardless of how it "secures" data visibility; e.g. Oracle Sales Cloud uses territory management to determine visibility whereas HCM Cloud uses job roles. For non-Fusion applications we are assuming that the SaaS application's API respects its user interface's visibility rules in the same way as Oracle Fusion Applications does.
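To make this step concrete, below is a minimal Java sketch of such a visibility check, using only the JDK's HttpURLConnection. It is illustrative rather than the actual servlet code from this pattern: the class and method names are made up, and it assumes the Fusion Applications REST endpoint will accept the JWT passed from the UI as a bearer token. A real implementation would add SSL configuration, logging and more robust error handling.

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class ObjectAccessChecker {

    // Returns true if the Sales Cloud user identified by the JWT can read the
    // given opportunity, false if the REST query returns 404 (no access).
    public static boolean canAccessOpportunity(String salesCloudUrl,
                                               String optyNumber,
                                               String jwt) throws IOException {
        URL url = new URL(salesCloudUrl
                + "/salesApi/resources/latest/opportunities/" + optyNumber
                + "?fields=Name&onlyData=true");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        // Pass the JWT handed over by the Sales Cloud Groovy script as a bearer token
        conn.setRequestProperty("Authorization", "Bearer " + jwt);

        int status = conn.getResponseCode();
        conn.disconnect();
        if (status == HttpURLConnection.HTTP_OK) {
            return true;   // a row came back: the user can see the object
        }
        if (status == HttpURLConnection.HTTP_NOT_FOUND) {
            return false;  // "no data found": treat as a security failure
        }
        throw new IOException("Unexpected HTTP status " + status);
    }
}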

Step 3 is concerned with checking for, or finding, the folder in Oracle Documents Cloud where the object's documents are stored (e.g. files for an opportunity). This integration pattern stores the object's documents in a hierarchy within Oracle Documents Cloud. The application root directory contains a collection of folders, one for each object type (e.g. Opportunity, Account, etc.), and then within that a sub-folder (<ObjectNumber>-<ObjectName>) for the object we're looking for.

If a folder GUID is passed in as a parameter to the Java servlet then we simply need to check, using the GUID, that the folder exists in Documents Cloud and then move on to Step 4 of the process. If we are not passed a folder GUID then we need to perform an in-order traversal of the hierarchy and find the folder, whose name would be <ObjectNumber>-<ObjectName>. This second method is not going to be very efficient, as we could have a scenario with thousands of folders to trawl through. Thankfully it should only occur the first time a folder is accessed; subsequent requests will be quicker because we store the folder GUID in Sales Cloud in a custom field.

This second approach does, however, have a big advantage in that it can be used for SaaS applications where it is not possible to store a custom field in the SaaS application and pass it in the context of the user interface. So although less efficient, it will continue to work and gives us more options. If we were in this scenario then I would strongly recommend one of the following strategies be implemented to reduce the folder count to only include a subset of all documents.

  • A data aging mechanism where objects relating to inactive/old Accounts/Opportunities etc. are archived off
  • Or the use of a database store, like Database Cloud Service, to hold a mapping table of DocCloud GUIDs to Oracle Sales Cloud objects, indexed by the Oracle Sales Cloud ObjectID/Type.
Angelo_DocCld_heirarchy

Example of hierarchy stored in Document Cloud

Example Object type hierarchy within Documents Cloud

Step 4 is only executed if the folder does not exist. The lack of the folder implies that no documents have been stored for this object and the next step is to create the folder. This would normally happen the first time a user shows the tab in Oracle Sales Cloud. It is also worth noting that the folder only gets created in Oracle Documents Cloud when, and only when, the Documents Cloud tab in Oracle Sales Cloud is selected, this way we don’t get empty folders in Oracle Documents Cloud.

If you had used groovy scripts in Oracle Sales Cloud and created a folder on the creation event of a Oracle Sales Cloud object (e.g. creation of a new opportunity), you would cause the creation of a number of empty folders and require more customizations to be done in Oracle Sales Cloud.

The Documents Cloud REST call for creating a folder is as follows :

POST <DocCloudServerURL>/folders/FF4729683CD68C1CCCCC87DT001111000100000001
{
    "name": "1234-Vision-Website",
    "description": "Folder for Vision Software Opportunity 1234 Website Deal"
}

The long hex number is the parent folder GUID, which is either passed in from Sales Cloud or discovered by an in-order traversal of the Documents Cloud folder hierarchy.
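For illustration, here is a hedged Java sketch of how the servlet might issue this folder-creation call. It mirrors the URL form shown above, uses basic authentication purely for simplicity (a production servlet would more likely use an application identity), and the class and parameter names are assumptions rather than the actual sample code.

import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.Scanner;

public class DocCloudFolderClient {

    // Creates a sub-folder under the given parent folder GUID and returns the raw JSON response.
    public static String createFolder(String docCloudServerUrl, String parentGuid,
                                      String folderName, String description,
                                      String user, String password) throws IOException {
        URL url = new URL(docCloudServerUrl + "/folders/" + parentGuid);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        String credentials = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + credentials);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        String body = "{\"name\":\"" + folderName + "\",\"description\":\"" + description + "\"}";
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        // Read the JSON response (it contains the new folder's GUID)
        try (Scanner sc = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A")) {
            return sc.hasNext() ? sc.next() : "";
        }
    }
}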

Step 5: In Step 3 we walked through a fixed hierarchy to find the folder using a combination of Object Number and Object Name (e.g. "1234-Vision-Website"). As mentioned earlier this approach isn't efficient, so at this stage we store the discovered, or newly created, folder GUID in Oracle Sales Cloud for use in subsequent requests.

TIP : For more information on how to create custom fields in Sales Cloud, check out this YouTube viewlet from Oracle Fusion Applications Developer Relations.

Other Oracle Fusion Applications products also allow the storage of custom fields in the SaaS instance via a technology called "FlexFields"; if you're interested, check out this A-Team Chronicles blog article on Descriptive FlexFields.

Step 6 is only reached when we've found the folder in Oracle Documents Cloud and have checked that the user has access to it. Now all we need to do is generate an HTML page which will show the documents. Specifically, we want to show the user the folder with the documents but, importantly, not allow them to navigate in or out of that folder. This can be achieved by using a feature of Oracle Documents Cloud called "AppLinks". An AppLink is a generated URL which gives temporary access to an Oracle Documents Cloud folder, or item. In this step the Java servlet generates some HTML and JavaScript code which is sent back to the browser iFrame; the iFrame executes it and subsequently redirects to the previously generated AppLink URL in Oracle Documents Cloud.

The REST call which generates a folder AppLink is shown below. In this example we are using the role of "contributor", as we want to give our users the ability to add, and remove, files.

POST <serverURL>/folders/FF4729683CD68C1CCCCC87DT001111000100000001
{
    "assignedUser": "Any User",
    "role": "contributor"
}

For more information on AppLinks please see this article on Oracle Documents Cloud and AppLinks on the A-Team Chronicles website.

Step 7 is the final phase of the servlet, which is to return an HTML page back to the iFrame embedded within Oracle Sales Cloud.
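A simplified sketch of what this response generation could look like inside the servlet is shown below. It assumes an appLinkUrl value has already been obtained from the AppLink REST response; a production implementation would typically also handle the access tokens returned with the AppLink and escape the URL before embedding it in the page.

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServletResponse;

public class AppLinkRedirectWriter {

    // Writes a tiny HTML/JavaScript page that redirects the embedded iFrame to the AppLink URL.
    public static void sendRedirectPage(HttpServletResponse response, String appLinkUrl)
            throws IOException {
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("<html><head><script type=\"text/javascript\">");
        // The JavaScript redirect keeps the navigation inside the Sales Cloud iFrame
        out.println("window.location.replace('" + appLinkUrl + "');");
        out.println("</script></head><body>Loading documents...</body></html>");
        out.flush();
    }
}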

Step 8 is the page being rendered in the iFrame in Oracle Sales Cloud. At this point the HTML/JavaScript is executed and redirects the iFrame to the appropriate page in Oracle Documents Cloud.

 

Conclusion

From the above pattern you can see that it is perfectly possible to integrate Oracle Sales Cloud and Oracle Documents Cloud in a manner that is not only functional and efficient but, importantly, maintains the complex security model enjoyed by SaaS applications like Oracle Sales Cloud. This pattern also highlights many micro-patterns which are used to create this integration (with links to other blog entries).

 

A complete downloadable asset, containing the sample code above ready to be deployed into Oracle JCS-SX, will be made available soon.

 

 

 

Automating Data Loads from Taleo Cloud Service to BI Cloud Service (BICS)


Introduction

This article will outline a method for extracting data from Taleo Cloud Service, and automatically loading that data into BI Cloud Service (BICS).  Two tools will be used, the Taleo Connect Client, and the BICS Data Sync Utility.   The Taleo Connect Client will be configured to extract data in CSV format from Taleo, and save that in a local directory.  The Data Sync tool will monitor that local directory, and once the file is available, it will load the data into BICS using an incremental load strategy.  This process can be scheduled to run, or run on-demand.

 

Main Article

This article will be broken into 3 sections.

1. Set-up and configuration of the Taleo Connect Client,

2. Set-up and configuration of the Data Sync Tool,

3. The scheduling and configuration required so that the process can be run automatically and seamlessly.

 

1. Taleo Connect

The Taleo Connect Tool communicates with the Taleo backend via web services and provides an easy to use interface for creating data exports and loads.

Downloading and Installing

Taleo Connect tool can be downloaded from Oracle Software Delivery Cloud.

a. Search for ‘Oracle Taleo Platform Cloud Service – Connect’, and then select the Platform.  The tool is available for Microsoft Windows and Linux.

1

 

b. Click through the agreements and then select the ‘Download All’ option.

 

1

c. Extract the 5 zip files to a single directory.

d. Run the ‘Taleo Connect Client Application Installer’

2

e. If specific Encryption is required, enter that in the Encryption Key Server Configuration screen, or select ‘Next’ to use default encryption.

f. When prompted for the Product Packs directory, select the ‘Taleo Connect Client Application Data Model’ folder that was downloaded and unzipped in the previous step, and then select the path for the application to be installed into.

 

Configuring Taleo Connect

a. Run the Taleo Connect Client.  By default on Windows, it is installed in the "C:\Taleo Connect Client" directory.  The first time the tool is run, a connection needs to be defined.  Subsequent times this connection will be used by default.

b. Enter details of the Taleo environment and credentials.  Important – the user must have the ‘Integration Role’ to be able to use the Connect Client.

c. Select the Product and correct version for the Taleo environment.  In this example ‘Recruiting 14A’.

d. Select ‘Ping’ to confirm the connection details are correct.

3

 

Creating Extracts from Taleo

Exporting data with Taleo Connect tool requires an export definition as well as an export configuration.  These are saved as XML files, and can then be run from a command line to execute the extract.

This article will walk through very specific instructions for this use case.  More details on the Connect Client can be found in this article.

1. Create The Export Definition

a. Under the ‘File’ menu, select ‘New Export Wizard’

1

b. Select the Product and Model, and then the object that you wish to export.  In this case ‘Department’ is selected.

Windows7_x64

c. To select the fields to be included in the extract, choose the 'Projections' workspace tab, as shown below, and then drag the fields from the Entity Structure into that space.  In this example the whole 'Department' tree is dragged into the Projections section, which brings all the fields in automatically.

 

Windows7_x64

d. There are options to Filter and Sort the data, as well as Advanced Options, which include using sub-queries, grouping, joining, and more advanced filtering.  For more information on these, see the Taleo Product Documentation.  In the case of a large transaction table, it may be worth considering building a filter that only extracts the last X period of data, using the LastModifiedDate field, to limit the size of the file created and processed each time.  In this example, the Dataset is small, so a full extract will be run each time.

 

Windows7_x64

e. Check the ‘CSV header present’ option.  This adds the column names as the first row of the file, which makes it easier to set up the source in the Data Sync tool.

Windows7_x64

f. Once complete, save the Export Definition with the disk icon, or under the ‘File’ menu.

 

2. Create The Export Configuration

a. Create the Export Configuration, by selecting ‘File’ and the ‘New Configuration Wizard’.

6

b. Base the export specification on the Export Definition created in the last step.

Windows7_x64

c. Select the Default Endpoint, and then ‘Finish’.

8

d. By default the name of the Response, or output file, is generated using an identifier, with the Identity name – in this case Department – and a timestamp.  While the Data Sync tool can handle this type of file name with a wildcard, in this example the ‘Pre-defined value’ is selected so that the export creates the same file each time – called ‘Department.csv’.

Windows7_x64

e. Save the Export Configuration.  This needs to be done before the schedule and command line syntax can be generated.

f. To generate the operating system dependent syntax to run the extract from a command line, check the ‘Enable Schedule Monitoring’ on the General tab, then ‘Click here to configure schedule’.

g. Select the operating system, and interval, and then ‘Build Command Line’.

h. The resulting code can be Copied to the clipboard.  Save this.  It will be used in the final section of the article to configure the command line used by the scheduler to run the Taleo extract process.

Windows7_x64

i.  Manually execute the job by selecting the ‘gear’ icon

 

Menubar

 

j. Follow the status in the monitoring window to the right hand side of the screen.

In this example, the Department.csv file was created in 26 seconds.  This will be used in the next step with the Data Sync tool.

Windows7_x64

 

2. Data Sync Tool

The Data Sync Tool can be downloaded from OTN through this link.

For more information on installing and configuring the tool, see this post that I wrote last year.  Use this to configure the Data Sync tool, and to set up the TARGET connection for the BICS environment where the Taleo data will be loaded.

 

Configuring the Taleo Data Load

a. Under “Project” and “File Data”, create a new source file for the ‘Department.csv’ file created by the Taleo Connect tool.

1

Windows7_x64

b. Under ‘Import Options’, manually enter the following string for the Timestamp format.

yyyy-MM-dd'T'HH:mm:ssX

This is the format that the Taleo Extract uses, and this needs to be defined within the Data Sync tool so that the CSV file can be parsed correctly.
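As a quick sanity check of the format string, the small Java snippet below (illustrative only, with a made-up sample value) confirms that the same pattern parses the 'Z'-suffixed timestamps found in the Taleo extract.

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class TaleoTimestampCheck {
    public static void main(String[] args) throws ParseException {
        // The same pattern entered in the Data Sync 'Import Options' screen
        SimpleDateFormat taleoFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ssX");
        // Sample value in the shape Taleo Connect writes to the CSV extract
        Date parsed = taleoFormat.parse("2016-03-01T14:30:00Z");
        System.out.println(parsed);
    }
}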

1

c. Enter the name of the Target table in BICS.  In this example, a new table called ‘TALEO_DEPARTMENT’ will be created.

Windows7_x64

d. The Data Sync tool samples the data and makes a determination of the correct file format for each column.  Confirm these are correct and change if necessary.

Windows7_x64

e. If a new table is being created in BICS as part of this process, it is often a better idea to let the Data Sync tool create that table so it has the permissions it requires to load data and create any necessary indexes.  Under ‘Project’ / ‘Target Tables’ right click on the Target table name, and select ‘Drop/Create/Alter Tables’

Windows7_x64

f. In the resulting screen, select ‘Create New’ and hit OK.  The Data Sync tool will connect to the BICS Target environment and execute the SQL required to create the TALEO_DEPARTMENT target table

2

g. If an incremental load strategy is required, select the ‘Update table’ option as shown below

Windows7_x64

h. Select the unique key on the table – in this case ‘Number’

Windows7_x64

i. Select the ‘LastModifiedDate’ for the ‘Filters’ section.  Data Sync will use this to identify which records have changed since the last load.

Windows7_x64

In this example, the Data Sync tool suggests a new Index on the target table in BICS.  Click ‘OK’ to let it generate that on the Target BICS database.

Windows7_x64

 

Create Data Sync Job

Under ‘Jobs’, select ‘New’ and name the job.  Make a note of the Job name, as this will be used later in the scheduling and automation of this process

 

Windows7_x64

 

Run Data Sync Job

a. Execute the newly created Job by selecting the ‘Run Job’ button

Windows7_x64

b. Monitor the progress under the ‘Current Jobs’ tab.

Windows7_x64

c. Once the job completes, go to the 'History' tab, select the job, and then in the bottom section of the screen select the 'Tasks' tab to confirm everything ran successfully.  In this case the 'Status Description' confirms the job 'Successfully completed' and that 1164 rows were loaded into BICS, with 0 Failed Rows.  Investigate any errors and make changes before continuing.

Windows7_x64

 

3. Configuring and Scheduling Process

As an overview of the process, a ‘.bat’ file will be created and scheduled to run.  This ‘bat’ file will execute the extract from Taleo, with that CSV file being saved to the local file system.  The second step in the ‘.bat’ file will create a ‘touch file’.  The Data Sync Tool will monitor for the ‘touch file’, and once found, will start the load process.  As part of this, the ‘touch file’ will automatically be deleted by the Data Sync tool, so that the process is not started again until a new CSV file from Taleo is generated.

a. In a text editor, create a ‘.bat’ file.  In this case the file is called ‘Taleo_Department.bat’.

b. Use the syntax generated in step ‘2 h’ in the section where the ‘Taleo Export Configuration’ was created.

c. Use the ‘call’ command before this command.  Failure to do this will result in the extract being completed, but the next command in the ‘.bat’ file not being run.

d. Create the ‘touch file’ using an ‘echo’ command.  In this example a file called ‘DS_Department_Trigger.txt’ file will be created.

Windows7_x64

e. Save the ‘bat’ file.

f. Configure the Data Sync tool to look for the Touch File created in step d, by editing the ‘on_demand_job.xml’, which can be found in the ‘conf-shared’ directory within the Data Sync main directory structure.

Windows7_x64

g. At the bottom of the file in the ‘OnDemandMonitors’ section, change the ‘pollingIntervalInMinutes’ to be an appropriate value. In this case Data Sync will be set to check for the Touch file every minute.

h. Add a line within the <OnDemandMonitors> section to define the Data Sync job that will be Executed once the Touch file is found, and the name and path of the Touch file to be monitored.

Windows7_x64

In this example, the syntax looks like this

<TriggerFile job="Taleo_Load" file="C:\Users\oracle\Documents\DS_Department_Trigger.txt"/>

 

The Data Sync tool can be configured to monitor for multiple touch files, each that would trigger a different job.  A separate line item would be required for each.

i. The final step is to schedule the '.bat' file to run at a suitable interval.  Within Windows, the 'Task Scheduler' can be found beneath the 'Accessories' / 'System Tools' section under the 'All Programs' menu.  In Linux, use the 'crontab' command.

 

Summary

This article walked through the steps for configuring the Taleo Connect Client to download data from Taleo and save to a location to be automatically consumed by the Data Sync tool, and loaded to BICS.

 

Further Reading

Taleo Product Documentation

Getting Started with Taleo Connect Client

Configuring the Data Sync Tool for BI Cloud Services

EDI Processing with B2B in hybrid SOA Cloud Cluster integrating On-Premise Endpoints


Executive Overview

SOA Cloud Service (SOACS) can be used to support the B2B commerce requirements of many large corporations. This article discusses a common use case of EDI processing with Oracle B2B within SOA Cloud Service in a hybrid cloud architecture. The documents are received and sent from on-premise endpoints using SFTP channels configured using SSH tunnels.

Solution Approach

Overview

The overall solution is described in the diagram shown here.

B2BCloudFlow

An XML file with PurchaseOrder content is sent to a SOACS instance running in Oracle Public Cloud (OPC) from an on-premise SFTP server.

The XML file is received by an FTP Adapter in a simple composite for hand-off to B2B. The B2B engine within SOACS then generates the actual EDI file and transmits it over an SFTP delivery channel back to an on-premise endpoint.

In reality, the endpoint can be any endpoint inside or outside the corporate firewall. Communication with an external endpoint is trivial and hence left out of the discussion here. Using the techniques of SSH tunnels, the objective here is to demonstrate the ease by which any on-premises endpoint can be seamlessly integrated into the SOA Cloud Service hybrid solution architecture.

Our environment involves a SOACS domain on OPC with 2 managed servers. Hence, the communication with an on-premise endpoint is configured using SSH tunnels as described in my team-mate, Christian Weeks’ blog on SSH tunnel for on-premises connectivity in SOA Cloud clusters[1].

If the SOACS domain contains only a single SOACS node, then a simpler approach can also be used to establish the on-premise connectivity via SSH tunneling, as described in my blog on simple SSH tunnel connectivity for on-premises databases from SOA Cloud instance[2].

The following sections walk through the details of setting up the flow for a PurchaseOrder XML document from an on-premise back-end application, like eBusiness Suite to the 850 X12 EDI generated for transmission to an external trading partner.

Summary of Steps

  • Copy the private key of SOACS instance to the on-premise SFTP server
  • Update the whitelist for SOACS compute nodes to allow traffic flow between the SOACS compute nodes and the on-premise endpoints via the intermediate gateway compute node, referred to as CloudGatewayforOnPremTunnel in the rest of this post. This topic has also been extensively discussed in Christian's blog[1].
  • Establish an SSH tunnel from the on-premise SFTP server (OnPremSFTPServer) to the Cloud Gateway listener host identified within the SOA Cloud Service compute nodes (CloudGatewayforOnPremTunnel). The role of this host in establishing the SSH tunnel for a cluster has been extensively discussed in Christian's blog[1]. This SSH tunnel, as described, will specify a local port and a remote port. The local port will be the listening port of the SFTP server (default is 22) and the remote port can be any port that is available within the SOACS instance (e.g. 2522).
  • Update FTP Adapter’s outbound connection pool configuration to include the new endpoint and redeploy. Since we have a cluster within the SOA Cloud service, the standard JNDI entries for eis/ftp/HAFtpAdapter should be used.
  • Define a new B2B delivery channel for the OnPremise SFTP server using the redirected ports for SFTP transmission.
  • Develop a simple SOA composite to receive the XML  payload via FTP adapter and hand-off to B2B using B2B Adapter.
  • Deploy the B2B agreement and the SOA composite.
  • Test the entire round-trip flow for generation of an 850 X12 EDI from a PurchaseOrder XML file.

sftpTunnel

Task and Activity Details

The following sections will walk through the details of individual steps. The environment consists of the following key machines:

  • SOACS cluster with 2 managed servers and all the dependent cloud services within OPC.
  • A compute node within SOACS instance is identified to be the gateway listener for the SSH tunnel from on-premise hosts (CloudGatewayforOnPremTunnel)
  • Linux machine inside the corporate firewall, used for hosting the On-Premise SFTP Server (myOnPremSFTPServer)

I. Copy the private key of SOACS instance to the on-premise SFTP server

When a SOACS instance is created, a public key file is uploaded for establishing SSH sessions. The corresponding private key has to be copied to the SFTP server. The private key can then be used to start the SSH tunnel from the SFTP server to the SOACS instance.

Alternatively, a private/public key can be generated in the SFTP server and the public key can be copied into the authorized_keys file of the SOACS instance. In the example here, the private key for the SOACS instance has been copied to the SFTP server. A transcript of a typical session is shown below.

slahiri@slahiri-lnx:~/stage/cloud$ ls -l shubsoa_key*
-rw------- 1 slahiri slahiri 1679 Dec 29 18:05 shubsoa_key
-rw-r--r-- 1 slahiri slahiri 397 Dec 29 18:05 shubsoa_key.pub
slahiri@slahiri-lnx:~/stage/cloud$ scp shubsoa_key myOnPremSFTPServer:/home/slahiri/.ssh
slahiri@myOnPremSFTPServer's password:
shubsoa_key                                                                                100% 1679        1.6KB/s     00:00
slahiri@slahiri-lnx:~/stage/cloud$

On the on-premise SFTP server, login and confirm that the private key for SOACS instance has been copied in the $HOME/.ssh directory.

[slahiri@myOnPremSFTPServer ~/.ssh]$ pwd
/home/slahiri/.ssh
[slahiri@myOnPremSFTPServer ~/.ssh]$ ls -l shubsoa_key
-rw-------+ 1 slahiri g900 1679 Jan  9 06:39 shubsoa_key
[slahiri@myOnPremSFTPServer ~/.ssh]$

II. Create whitelist entries to allow communications between different SOACS compute nodes and on-premise SFTP server

The details about creation of a new security application and rule have been discussed extensively in Christian’s blog[1]. For the sake of brevity, just the relevant parameters for the definition are shown here. These entries are created from the Compute Node Service Console under Network tab.

Security Application
  • Name: OnPremSFTPServer_sshtunnel_sftp
  • Port Type: tcp
  • Port Range Start: 2522
  • Port Range End: 2522
  • Description: SSH Tunnel for On-Premises SFTP Server
Security Rule
  • Name: OnPremSFTPServer_ssh_sftp
  • Status: Enabled
  • Security Application: OnPremSFTPServer_sshtunnel_sftp (as created in last step)
  • Source: Security Lists – ShubSOACS-jcs/wls/ora-ms (select entry that refers to all the managed servers in the cluster)
  • Destination: ShubSOACS-jcs/lb/ora_otd (select the host designated to be CloudGatewayforOnPremTunnel, which could be either the DB or LBR VM)
  • Description: ssh tunnel for On-Premises SFTP Server

III. Create an SSH Tunnel from On-Premise SFTP Server to the CloudGatewayforOnPremTunnel VM’s public IP

Using the private key from Step I, start an SSH session from the on-premise SFTP server host to the CloudGatewayforOnPremTunnel, specifying the local and remote ports. As mentioned earlier, the local port is the standard port for SFTP daemon, e.g. 22. The remote port is any suitable port that is available in the SOACS instance. The syntax of the ssh command used is shown here.

ssh -R :<remote-port>:<host>:<local port> -i <private keyfile> opc@<CloudGatewayforOnPremTunnel VM IP>

The session transcript is shown below.

[slahiri@myOnPremSFTPServer ~/.ssh]$ ssh -v -R :2522:localhost:22 -i ./shubsoa_key opc@CloudGatewayforOnPremTunnel
[opc@CloudGatewayforOnPremTunnel ~]$ netstat -an | grep 2522
tcp        0      0 127.0.0.1:2522              0.0.0.0:*                   LISTEN
tcp        0      0 ::1:2522                         :::*                            LISTEN
[opc@CloudGatewayforOnPremTunnel ~]$

After establishing the SSH tunnel, the netstat utility can confirm that the remote port 2522 is enabled in listening mode within the Cloud Gateway VM. This remote port, 2522 and localhost along with other on-premises SFTP parameters can now be used to define an endpoint in FTP Adapter’s outbound connection pool in Weblogic Adminserver (WLS) console.

IV. Define a new JNDI entry for FTP Adapter that uses the on-premise SFTP server via the SSH  tunnel

From WLS console, under Deployments, update FtpAdapter application by defining parameters for the outbound connection pool JNDI entry for clusters, i.e eis/Ftp/HAFtpAdapter.

The remote port from Step II is used in defining the port within the JNDI entry for FTP Adapter. It should be noted that the host specified will be CloudGatewayforOnPremTunnel instead of the actual on-premise hostname or address of the SFTP server, since the port forwarding with SSH tunnel is now enabled locally within the SOACS instance in Step III.

It should be noted that SOA Cloud instances do not use any shared storage. So, the deployment plan must be copied to the file systems for each node before deployment of the FTP Adapter application.

The process to update the FtpAdapter deployment is fairly straightforward and follows the standard methodology. So, only the primary field values that are used in the JNDI definition are provided below.

  • JNDI under Outbound Connection Pools: eis/Ftp/HAFtpAdapter
  • Host:CloudGatewayforOnPremTunnel
  • Username: <SFTP User>
  • Password: <SFTP User Password>
  • Port:2522
  • UseSftp: true

V. Configure B2B Metadata

Standard B2B configuration will be required to set up the trading partners, document definitions and agreements. The unique configuration pertaining to this test case involves setting up the SFTP delivery channel to send the EDI document to SFTP server residing on premises inside the corporate firewall. Again, the remote port from Step III is used in defining the port for the delivery channel. The screen-shot for channel definition is shown below.

edicloud6

After definition of the metadata, the agreement for outbound 850 EDI is deployed for runtime processing.

VI. Verification of SFTP connectivity

After the deployment of the FTP Adapter, another quick check of netstat for port 2522 may show additional entries indicating an established session corresponding to the newly created FTP Adapter. The connections are established and disconnected based on the polling interval of the FTP Adapter. Another way to verify the SFTP connectivity is to manually launch an SFTP session from the command line, as shown here.

[opc@shubsoacs-jcs-wls-1 ~]$ sftp -oPort=2522 slahiri@CloudGatewayforOnPremTunnel
Connecting to CloudGatewayforOnPremTunnel...
The authenticity of host '[cloudgatewayforonpremtunnel]:2522 ([10.196.240.130]:2522)' can't be established.
RSA key fingerprint is 93:c3:5c:8f:61:c6:60:ac:12:31:06:13:58:00:50:eb.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[cloudgatewayforonpremtunnel]:2522' (RSA) to the list of known hosts.
slahiri@cloudgatewayforonpremtunnel's password:
sftp> quit
[opc@shubsoacs-jcs-wls-1 ~]$

While this SFTP session is connected, a quick netstat check on the CloudGatewayforOnPremTunnel host will confirm the established session for port 2522 from the SOACS compute node.

[opc@CloudGatewayforOnPremTunnel ~]$ netstat -an | grep 2522
tcp        0       0 0.0.0.0:2522                       0.0.0.0:*                               LISTEN
tcp        0      0 10.196.240.130:2522         10.196.246.186:14059        ESTABLISHED
tcp        0       0 :::2522                                 :::*                                       LISTEN
[opc@CloudGatewayforOnPremTunnel ~]$

VII. Use the newly created JNDI to develop a SOA composite containing FTP Adapter and B2B Adapter to hand-off the XML payload from SFTP Server to B2B engine

The simple SOA composite diagram built in JDeveloper for this test case is shown below.

The JNDI entry created in step IV (eis/ftp/HAFtpAdapter) is used in the FTP Adapter Wizard session within JDeveloper to set up a receiving endpoint from the on-premises SFTP server. A simple BPEL process is included to transfer the input XML payload to B2B. The B2B Adapter then hands-off the XML payload to the B2B engine for generation of the X12 EDI in native format.

edicloud4

Deploy the composite via EM console to complete the design-time activities. We are now ready for testing.

VIII. Test the end-to-end EDI processing flow

After deployment, the entire flow can be tested by copying a PurchaseOrder XML file in the polling directory for incoming files within the on-premise SFTP server. An excerpt from the sample XML file used as input file to trigger the process, is shown below.

[slahiri@myOnPremSFTPServer cloud]$ more po_850.xml
<Transaction-850 xmlns="http://www.edifecs.com/xdata/200" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" XDataVersion="1.0" Standard="X12" Version="V4010" CreatedDate="2007-04-10T17:16:24" CreatedBy="ECXEngine_837">
     <Segment-ST>
           <Element-143>850</Element-143>
           <Element-329>16950001</Element-329>
      </Segment-ST>
      <Segment-BEG>
           <Element-353>00</Element-353>
           <Element-92>SA</Element-92>
           <Element-324>815455</Element-324>
           <Element-328 xsi:nil="true"/>
           <Element-373>20041216</Element-373>
        </Segment-BEG>
–More–(7%)

The FTP Adapter of the SOA composite from SOACS instance will pick up the XML file via the SSH tunnel and process it in Oracle Public Cloud within Oracle B2B engine to generate the EDI. The EDI file will then be transmitted back to the on-premise SFTP server via the same SSH tunnel.

Results from the completed composite instance should be visible in the Enterprise Manager, as shown below.

edicloud2

Content of the EDI file along with the SFTP URL used to transmit the file can be seen in the B2B console, under Wire Message Reports section.

edicloud1

Summary

The test case described here is a quick way to demonstrate that SOA Cloud Service can easily be used in a hybrid architecture for modelling common B2B use cases that require access to on-premise endpoints. The EDI generation process and all the business-layer orchestration can be done in Oracle Public Cloud (OPC) with SOA Suite. Most importantly, integration with on-premise server endpoints can be enabled as needed via SSH tunnels to provide a hybrid cloud solution.

Acknowledgements

SOACS Product Management and Engineering teams have been actively involved in the development of this solution for many months. It would not have been possible to deliver such a solution to the customers without their valuable contribution.

References

  1. Setting up SSH tunnels for cloud to on-premise with SOA Cloud Service clusters – Christian Weeks, A-Team
  2. SOA Cloud Service – Quick and Simple Setup of an SSH Tunnel for On-Premises Database Connectivity – Shub Lahiri, A-Team

Using Oracle BI Answers to Extract Data from HCM via Web Services


Introduction

Oracle BI Answers, also known as ‘Analyses’ or ‘Analysis Editor’, is a reporting tool that is part of the Oracle Transactional Business Intelligence (OTBI), and available within the Oracle Human Capital Management (HCM) product suite.

This article will outline an approach in which a BI Answers report will be used to extract data from HCM via web services.  This provides an alternative method to the file-based loader process (details of which can be found here).

This can be used for both Cloud and On-Premise versions of Oracle Fusion HCM.

Main Article

During regular product updates to Oracle HCM, underlying data objects may be changed.  As part of the upgrade process, these changes will automatically be updated in the pre-packaged reports that come with Oracle HCM, and also in the OTBI ‘Subject Areas’ – a semantic layer used to aid report writers by removing the need to write SQL directly against the underlying database.

As a result it is highly recommended to use either a pre-packaged report, or to create a new report based on one of the many OTBI Subject Areas, to prevent extracts subsequently breaking due to the changing data structures.

Pre-Packaged Reports

Pre-packaged reports can be found by selecting ‘Catalog’, expanding ‘Shared Folders’ and looking in the ‘Human Capital Management’ sub-folder.  If a pre-packaged report is used, make a note of the full path of the report shown in the ‘Location’ box below.  This path, and the report name, will be required for the WSDL.

Windows7_x64

Ad-Hoc Reports

To create an Ad-Hoc report, a user login with the minimum of BI Author rights is required.

a. Select ‘New’ and then ‘Analysis’

Windows7_x64

b. Select the appropriate HCM Subject Area to create a report.

Windows7_x64

c. Expand the folders and drag the required elements into the report.

d. Save the report into a shared location.  In this example this is being called ‘Answers_demo_report’ and saved into this location.

/Shared Folders/Custom/demo

This path will be referenced later in the WSDL.

Edit_Post_‹_ATeam_Chronicles_—_WordPress

Building Web Service Request

To create and test the Web Service, this post will use the opensource tool SoapUI.  This is free and can be downloaded here:

https://www.soapui.org

Within SoapUI, create a new SOAP project.  For the Initial WSDL address, use the Cloud or On-Premise URL, appending '/analytics-ws/saw.dll/wsdl/v7'.

For example:

https://cloudlocation.oracle.com/analytics-ws/saw.dll/wsdl/v7

or

https://my-on-premise-server.com/analytics-ws/saw.dll/wsdl/v7

This will list the available WSDLs

 

Calling the BI Answers report is a 2 step process

1. Within SoapUI, expand out the ‘SAWSessionService’ and then ‘logon’.  Make a copy of the example ‘Request’ WSDL, then update it to add the username and password for a user with credentials to run the BI Answers report.

Run that WSDL and a sessionID is returned:

SoapUI_4_6_4

2. In SoapUI expand ‘XmlViewService’ / ‘executeXMLQuery’.  Make a copy of the example ‘Request’ WSDL.  Edit that, insert the BI Answers report name and path into the <v7:reportPath> variable, and the SessionID from the first step into the <v7:sessionID> variable.

Note that while in the GUI the top level in the path was called 'Shared Folders', in the WSDL that is replaced with 'shared'.  The rest of the path will match the format from the GUI.

You will notice a number of other options available.  For this example we are going to ignore those.

You can then execute the web service request.  The report returns the data as an XML stream, which can then be parsed by your code.
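Once the two requests work in SoapUI, the same two-step flow can be automated. The Java sketch below is illustrative only: it assumes the logon and executeXMLQuery envelopes have been saved from SoapUI as local template files (with a SESSION_ID_PLACEHOLDER token in the second one), and that the two service endpoint URLs are copied from the SoapUI project; those file names, the placeholder and the argument handling are all assumptions, not part of the product documentation.

import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Scanner;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class BiAnswersExtract {

    // Posts a SOAP envelope (prepared in SoapUI) and returns the response body as a string.
    static String callService(String endpoint, String envelope) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(envelope.getBytes(StandardCharsets.UTF_8));
        }
        try (Scanner sc = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A")) {
            return sc.hasNext() ? sc.next() : "";
        }
    }

    public static void main(String[] args) throws IOException {
        String sessionServiceUrl = args[0];  // endpoint SoapUI shows for SAWSessionService
        String xmlViewServiceUrl = args[1];  // endpoint SoapUI shows for XmlViewService

        // Step 1: logon using the envelope saved from SoapUI (credentials already filled in)
        String logonRequest = new String(
                Files.readAllBytes(Paths.get("logon-request.xml")), StandardCharsets.UTF_8);
        String logonResponse = callService(sessionServiceUrl, logonRequest);

        // Pull the sessionID element out of the logon response
        Matcher m = Pattern.compile("<(?:\\w+:)?sessionID[^>]*>([^<]+)<").matcher(logonResponse);
        if (!m.find()) {
            throw new IllegalStateException("No sessionID found in logon response");
        }
        String sessionId = m.group(1);

        // Step 2: executeXMLQuery using a template with a placeholder for the session id
        String queryRequest = new String(
                Files.readAllBytes(Paths.get("executeXMLQuery-request.xml")), StandardCharsets.UTF_8)
                .replace("SESSION_ID_PLACEHOLDER", sessionId);
        String reportXml = callService(xmlViewServiceUrl, queryRequest);

        // The rowset returned by the BI Answers report is embedded in this response
        System.out.println(reportXml);
    }
}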

3

Summary

This post demonstrated a simple method to leverage BI Answers and the underlying OTBI Subject Areas within Oracle HCM, to create and call a report via web service to extract data for a downstream process.

Using Oracle BI Publisher to Extract Data From Oracle Sales and ERP Clouds


Introduction

Many Cloud products such as Oracle Sales Cloud, and Oracle ERP Cloud come packaged with Oracle Transaction Business Intelligence (OTBI). OTBI allows users to create and run reports against both Transactional and Warehouse databases.  At times there may be a need to extract that data to an external system. On-Premise customers can create ETL jobs to run against the Database, but would need to figure out how everything joins together – the RPD does a great job in obfuscating that for the end user.  For Cloud customers, it’s going to be even more complicated getting access to the source databases.

This post will cover a method for calling a BI Publisher report (a component of OTBI), via SOAP web services, and returning an XML file with the report data (embedded as a Base64 encoded content).

This approach can be used as an alternative to the standard SOAP APIs documented in https://fusionappsoer.oracle.com/oer/ and as a basis to extract data automatically from your OTBI instance, whether in the Cloud or On-Premise.

Main Article

For this demonstration, the assumption is being made that the reader has access to a user with a minimum of BI Author rights to create and run BI Publisher reports.

Create a simple BI Publisher report and name it ‘BIP_demo_report’ and save it into the shared location:

e.g. /Shared Folders/Custom/demo. Make note of this path, as it will be referenced later in the WSDL.

Untitled

 

Go to XMLPServer

Within the XMLPServer, edit the BI Publisher report.

In the upper right, select ‘View a List’

Add_New_Post_‹_ATeam_Chronicles_—_WordPress

And then in the ‘Output Formats’ drop down – deselect everything except for ‘XML’:

Add_New_Post_‹_ATeam_Chronicles_—_WordPress

Make sure you ‘Save’ the changes.

 

Building Web Service Request

To create and test the Web Service, this post will use the opensource tool SoapUI.  This is free and can be downloaded from https://www.soapui.org

 

In SoapUI create a new Soap Project. The WSDL you should use is the Cloud, or On-Premise URL with the BIP suffix “xmlpserver/services/ExternalReportWSSService?wsdl” appended

 

For example:

https://cloudlocation.oracle.com/xmlpserver/services/ExternalReportWSSService?wsdl

or

https://my-on-premise-server.com/xmlpserver/services/ExternalReportWSSService?wsdl

This will generate a Soap UI project and  list the available methods available from the WSDL.

Take a copy of the 'runReport' command and edit it.  There are a number of parameters available, but in this example we will remove the majority of those and just leave the basic attributes, as shown in this example:

Add_New_Post_‹_ATeam_Chronicles_—_WordPress

This includes the path of the report.  There is an assumption that the report is in the /Shared folder, so that’s not included in the path.  The suffix ‘.xdo’ is also required.

This service follows Oracle’s OWSM policy.  Within SoapUI you’ll also need to enter the user authentication details in the request properties:

Add_New_Post_‹_ATeam_Chronicles_—_WordPress

Run the WSDL and the report is returned as an XML file, encoded in Base64 format, within the <reportBytes> tag:

Add_New_Post_‹_ATeam_Chronicles_—_WordPress

The Base64 payload can easily be decoded by most programming languages; for example, in Java you could use the Apache Commons Codec Base64 class (Base64.decodeBase64(base64String)) or the JDK's built-in java.util.Base64 decoder.
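As an illustration, the sketch below pulls the reportBytes element out of a saved runReport response and decodes it using only the JDK; the input and output file names are assumptions, and a production consumer would use a proper XML/SOAP parser rather than a regular expression.

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DecodeReportBytes {
    public static void main(String[] args) throws Exception {
        // runReport-response.xml holds the SOAP response captured from SoapUI (or your own HTTP call)
        String soapResponse = new String(
                Files.readAllBytes(Paths.get("runReport-response.xml")), StandardCharsets.UTF_8);

        // Grab the Base64 payload between the <reportBytes> tags (namespace prefix optional)
        Matcher m = Pattern.compile("<(?:\\w+:)?reportBytes>([^<]+)<").matcher(soapResponse);
        if (!m.find()) {
            throw new IllegalStateException("reportBytes element not found");
        }
        // The MIME decoder tolerates line breaks inside the encoded payload
        byte[] decoded = Base64.getMimeDecoder().decode(m.group(1));

        // The decoded bytes are the report output - XML here, since only XML output was enabled
        Files.write(Paths.get("BIP_demo_report_output.xml"), decoded);
        System.out.println("Wrote " + decoded.length + " bytes of report output");
    }
}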

Summary

This post demonstrated a method which can be used to leverage BI Publisher within an Oracle Application that is packaged with OTBI – be it cloud or on-premise – and extract the data through a web service for a downstream process.


Cloud Security: Federated SSO for Fusion-based SaaS


Introduction

To get you easily started with Oracle Cloud offerings, they come with their own user management. You can create users, assign roles, change passwords, etc.

However, real-world enterprises already have existing Identity Management solutions and want to avoid maintaining the same information in many places. To avoid duplicate identities and the related security risks (such as out-of-sync passwords, outdated user information, or rogue or locked user accounts), single sign-on solutions are mandatory.

This post explains how to setup Federated Single Sign-on with Oracle SaaS to enable users present in existing Identity Management solutions to work with the Oracle SaaS offerings without additional user setup. After a quick introduction to Federated Single Sign-on based on SAML, we explain the requirements and the setup of Oracle SaaS for Federated Single Sign-on.

Federated Single Sign-on

Federated Single Sign-on or Federated SSO based on SAML Web Browser Single Sign-on is a widely-used standard in many enterprises world-wide.

The SAML specification defines three roles: the principal (typically a user), the Identity Provider, and the Service Provider. The Identity Provider and the Service Provider form a Circle of Trust and work together to provide a seamless authentication experience for the principal.

SAML Login Flows

The most commonly used SAML login flows are Service Provider Initiated Login and Identity Provider Initiated Login, as shown below.

Service Provider Initiated Login

The Service Provider Initiated Login is the most common login flow and is used without the user explicitly starting it. Pointing the browser to an application page is usually all that is needed.

Here the principal requests a service from the Service Provider. The Service Provider requests and obtains an identity assertion from the Identity Provider and decides whether to grant access to the service.

SAML_IdP_Initiated_Login_0

Identity Provider Initiated Login

SAML allows multiple Identity Providers to be configured for the same Service Provider. Deciding which of these Identity Providers is the right one for the principal is possible but not always easy to set up. The Identity Provider Initiated Login allows the principal to help here by picking the correct Identity Provider as a starting point. The Identity Provider creates the identity assertion and redirects to the Service Provider, which is now able to decide whether to grant access to the service.

SAML_IdP_Initiated_Login_0

Oracle SaaS and Federated SSO

Here Oracle SaaS acts as the Service Provider and builds a Circle of Trust with a third-party, on-premise Identity Provider. This setup applies to all Fusion Applications based SaaS offerings (like Oracle Sales Cloud, Oracle HCM Cloud, or Oracle ERP Cloud) and looks like this.

SaaS_SP_OnPrem_IDP
The setup requires a joint effort of the customer and Oracle Cloud Support.

Scenario Components

The components of this scenario are:

  • Oracle SaaS Cloud (based on Fusion Applications, for example, Oracle Sales Cloud, Oracle HCM Cloud, Oracle ERP Cloud)
  • Any supported SAML 2.0 Identity Provider, for example:
    • Oracle Identity Federation 11g+
    • Oracle Access Management 11gR2 PS3+
    • AD FS 2.0+
    • Shibboleth 2.4+
    • Okta 6.0+
    • Ping Federate 6.0+
    • Ping One
    • etc.

The list of the supported SAML 2.0 Identity Providers for Oracle SaaS is updated regularly, and is available as part of the Fusion Applications Technology: Master Note on Fusion Federation (Support Doc ID 1484345.1).

Supported SAML 2.0 Identity Providers

The Setup Process

To set up this scenario, Oracle Cloud Support and the customer work together to create an operational environment.

Setup of the On-Premise Identity Provider

To start the setup, the on-premise Identity Provider must be configured to fulfill these requirements:

  • It must implement SAML 2.0 of the Federation protocol.
  • The SAML 2.0 browser artifact SSO profile has been configured.
  • The SAML 2.0 Assertion NameID element must contain one of the following:
    • The user’s email address with the NameID Format being Email Address
    • The user’s Fusion uid with the NameID Format being Unspecified
  • All Federation Identity Provider endpoints must use SSL.
Setup of the Oracle SaaS Cloud Service Provider

Once the on-premise Identity Provider has been configured successfully, the following table outlines the process to request the setup of Oracle SaaS as Service Provider for Federated SSO with the customers on-premise Identity Provider:

Step Customer Oracle
1. File a Service Request to enable the required Oracle SaaS instance as Service Provider. The Service Request must follow the documented requirements.
(See Support Doc ID 1477245.1 or 1534683.1 for details.)
2. Approves the Service Request
3. Receives a document that describes how to configure the on-premise Identity Provider for the Service Provider.
4. When the conformance check has been done successfully, the Identity Provider Metadata File as XML file must be uploaded to the Service Request.
5. Configures the Service Provider in a non-production SaaS environment. When this is completed the Service Provider Metadata will be attached to the Service Request as an XML file for the customer. This file includes all the required information to add the Service Provider as a trusted partner to the Identity Provider.
6. Download the Service Provider metadata file and import it into the Identity Provider.
7. Adds the provided Identity Provider metadata to the Service Provider setup.
8. After the completion of the Service Provider setup, publishes a verification link in the Service Request.
9. Uses the verification link to test the features of Federated SSO.

Note: No other operations are allowed during this verification.

10. When the verification has been completed, update the SR to confirm the verification.
11. Finalize the configuration procedures.
12. Solely responsible for authenticating users.

When Federated SSO has been enabled, only those users whose identities have been synchronized between the on-premise Identity Provider and Oracle Cloud will be able to log in. To support this, Identity Synchronization must be configured (see below).

Identity Synchronization

Federated SSO only works correctly when users of the on-premise Identity Store and of the Oracle SaaS identity store are synchronized. The following sections outline the steps in general. The detailed steps will be covered in a later post.

Users are First Provisioned in Oracle SaaS

The general process works as follows:

Step Oracle SaaS On-premise Environment
1. Setup Extraction Process
2. Download Data
3. Convert Data into Identity Store Format
4. Import Data into Identity Store

Users are First Provisioned in On-Premise Environment

It is very common that users already exist in on-premise environments. To allow these users to work with Oracle SaaS, they have to be synchronized into Oracle SaaS. The general process works as follows:

Step Oracle SaaS On-premise Environment
1. Extract Data
2 Convert data into supported file format
3 Load user data using supported loading methods

References

Oracle Service Cloud Bulk Data Import – Best Practices


This blog is part of the series of blogs the A-Team has been running on Oracle Service Cloud (OSvC).

In the previous blog I went through an introduction to the Bulk APIs in OSvC and briefly touched upon throughput considerations (if you haven't read it, please do so first). In this blog I build on the previous post and delve deeper into various operational factors to consider when implementing bulk data load into OSvC.

For the purpose of this blog, I define bulk import (or bulk load) as scenarios where records on the order of millions need to be imported into Service Cloud. For smaller-scale data loads, OSvC has an import utility, described here.

Following is an overview of this blog post:

  1. I first discuss the pros and cons of bulk-loading data into OSvC.
  2. Then I present a high-level architecture of any bulk-import implementation.
  3. After that I present the results of a series of performance tests that demonstrate how well the OSvC APIs scale.
  4. Finally, based on the performance tests and other findings, I provide a few recommendations that will ensure high throughput during the data import.

1. Should I Bulk-Load ?

The first rule of OSvC bulk data load is: don't, unless you absolutely have to. In an OSvC implementation an initial bulk load may seem like an obvious requirement, but it puts a lot of stress on the Service Cloud database and other resources (especially considering that OSvC is multi-tenant), and it can potentially degrade the overall performance. Bulk import may be avoided by designing use cases so that OSvC data is created on demand, only when needed.

However, sometimes it may be imperative to perform an initial bulk-load. For example, any lead time needed to fetch and create data on-demand may be undesirable, especially when the Service Agent is on call with the customer.

For the rest of this blog it is assumed that bulk-import is indeed a key requirement.

2. High Level Architecture

Any bulk-load implementation mainly requires two things: An API that allows bulk-import, and an Integration layer where data is prepped and massaged before importing.

API

OSvC’s Connect Web Services API enables bulk data import via the batch operation (explained in the previous blog), which lets multiple heterogeneous operations and objects be batched into one request.

Integration Layer

The Integration layer is used to validate, massage, transform and orchestrate data before importing into Service Cloud. It is depicted below:


IntegFramework

The integration layer can be SOA Cloud Service, SOA on-premise or any other integration tool. For this blog I assume an on-premise SOA environment.

3. Performance Testing

Given the architecture above, for high-throughput data import it is important that the integration layer as well as the Batch API performs well. Tuning the integration layer is specific to the product chosen, and there may be plenty of ways to do that.

However, there isn’t much documentation available on the CWSS APIs’ performance under stress. So, I conducted a performance-testing exercise to evaluate how well the CWSS APIs (batch operation) scale under various conditions. I will now present the results of those tests.

Test Conditions

Following the integration framework above, I used a two-node Oracle SOA 12.1.3 cluster on a VM with a 24-core 3GHz processor and 140GB RAM.
I then developed a BPEL process that acted as the bulk-import process. The SOA Composite is shown below:

BPEL

The first BPEL process, BPELProcess1, takes in a parameter called ‘NumOfThreads’ and creates that many BPELProcess2 instances in a loop, without waiting for a response. BPELProcess2 in turn invokes the OSvC Batch API. This is how we simulated ‘n’ concurrent threads. We can also simulate this by using an inbound DB Adapter’s NumberOfThreads parameter.

Varying batch sizes and record sizes were simulated using XSLTs that generated an OSvC Batch request payload of the desired size, commit points, etc.

The following four areas were tested:

  1. How well the API scales with an increasing number of client threads.
  2. How much overhead is added with custom OSvC attributes.
  3. The effect of varying batch sizes.
  4. Performance over a period of time.

Test Results

The graphs below depict the results. The values on the y-axis are relative (x, 10x, 20x, etc.). Please note that these graphs and any numbers in them are only indicative of the observed trend, and no other conclusions should be derived from them.

Increasing Number of Threads

Graph1
Fig. 1 – Contacts created per second with increasing number of threads.


Implication:

  • Increasing the number of concurrent request threads increases the overall throughput

Standard vs Custom Attributes

Graph2
Fig. 2 – Contacts created per second with standard and custom attributes.

25 OSvC custom attributes (for the Contact object) were created to test the custom attributes scenario. Also, the batch size was set to 100.

Implication:

  • Batch API throughput (contacts created per second) decreases as the number of attributes increases

Batch Size Analysis

Graph3
Fig. 3 – Contacts created per second with varying batch sizes and increasing number of threads.

Implication:

  • Throughput increases with increasing batch sizes. A batch size of 1 is almost always a BAD idea. Also, even though the results seem to be best with a batch size of 100, other test/environment conditions might result in a different number being optimal.

Long Running Tests


Graph4
Fig. 4 – Long running data import process, contacts created per 10 minutes.

Implication:

  • Contacts were created at a fairly constant rate over a period of time (1 hour)

4. Operational Recommendations

I will now list a few recommendations which will ensure that the data import process is consistently high-performing without affecting overall Service Cloud performance. These recommendations are based on the performance tests above and other factors.

Record Size

In Service Cloud, custom attributes and custom fields can be defined for standard data objects. Each record to be inserted can contain standard as well as custom fields’ data, thus increasing the record size. As we saw, larger record sizes lead to higher API response times.
This should be kept in mind when designing the Service Cloud data model so that we don’t create unnecessary custom attributes. Also, if possible, the initial data load can be done with the minimum required data so that auxiliary data can be added later.

Batching Records

A key component of the CWSS Batch operation is the batch size, i.e. the number of records being sent in the Batch request, to be committed in a single transaction. As the results above showed, committing one record per transaction is a BAD idea. It adds a lot of unnecessary and repetitive I/O on the database. Instead, larger batch sizes should be used. Some testing may be necessary to find the ideal batch size.

Parallel API Invocations

As the tests demonstrated, CWSS scales well with multiple parallel invocations. This should be leveraged when designing any bulk-load interface. The degree of parallelism will depend on your actual use-case, how well the client scales, and other factors.
At the same time, this does not imply we should create a large number of concurrent client threads and pump as much data as possible. As the number of concurrent API requests increases, more and more Service Cloud database threads get busy with data insertion, and that may interfere with the Agent (Customer Service Reps using the Agent Desktop) experience.
Hence, the number of concurrent client threads inserting data should be configurable, so that we can increase the number when agent activity is low and vice versa. For example, concurrent requests can be kept low during the Service Cloud Agents’ working hours and increased when the data load runs overnight.

Disabling Triggers within Service Cloud

OSvC allows Custom Process Models (also called Custom Processes or Events) and Business Rules to be configured for a given object. Following is a brief description:
Custom Process Models (CPMs): CPMs, also known as Custom Processes, are used to execute custom logic after an object is created/modified/deleted. They can be attached to any standard or custom object, and they are written in PHP.
CPMs can be synchronous or asynchronous. Synchronous CPMs, as the name suggests, are executed immediately, i.e. if an Update Contact API is invoked and a corresponding Sync CPM exists, then the API won’t return until the CPM is executed. Asynchronous CPMs’ execution is decoupled from the API execution, so the API response time is slightly better.
Business Rules: Business Rules offer the ability to automate tasks such as incident or queue assignments, escalations etc. From a technical perspective these rules are executed in the same transaction as the API execution, hence directly add to the API response time.
For bulk-load scenarios Rules and CPMs may not be needed, and suppressing them significantly improves the overall performance. This can be done by setting Processing Options appropriately in the input payload.
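For illustration only, a create or batch request that suppresses both could carry a fragment along the lines shown below. SuppressExternalEvents and SuppressRules are part of the Connect Web Services ProcessingOptions type, but verify the exact element names and namespace prefix against the WSDL version you are using:

<!-- hedged sketch: namespace prefix assumed, align with your CWSS WSDL -->
<rnm:ProcessingOptions>
    <rnm:SuppressExternalEvents>true</rnm:SuppressExternalEvents>
    <rnm:SuppressRules>true</rnm:SuppressRules>
</rnm:ProcessingOptions>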

Cleansing the Data Before Insertion

When CWSS insert is invoked, OSvC performs a number of validations on the data, such as if the email address is valid (using regex), uniqueness constraints, upper and lower limits of the data, etc. OSvC rejects the payload whenever such a validation fails. In a batch request payload, even a small percentage of data errors can lead to the entire batch being rejected. This can potentially bring the bulk-load interface to a crawl.
Validation errors would be minimal if we could sanitize the data with OSvC validations before trying to insert. This can be done by using the getMetadata API. This API exposes all the metadata associated with standard and custom objects such as data types, nullability, uniqueness, regex patterns, etc. The metadata is returned in the form of a ‘MetaDataClass’ for each object.
Below is a sample response payload from getMetadata :

<n0:MetaDataClass>
	<n1:Attributes>
	   <n1:MetaDataAttributeList>
		  <n1:DataType>STRING</n1:DataType>
		  <n1:IsDeprecated>false</n1:IsDeprecated>
		  <n1:Description>Authentication user name</n1:Description>
		  <n1:MaxLength>80</n1:MaxLength>
		  <n1:Name>Login</n1:Name>
		  <n1:Nullable>true</n1:Nullable>
		  <n1:Pattern>[^ \t\n<>"]*</n1:Pattern>
		  <n1:UsageOnCreate>ALLOWED</n1:UsageOnCreate>
		  <n1:UsageOnDestroy>IGNORED</n1:UsageOnDestroy>
		  <n1:UsageOnGet>ALLOWED</n1:UsageOnGet>
		  <n1:UsageOnUpdate>ALLOWED</n1:UsageOnUpdate>
		  <n1:UsedAsName>false</n1:UsedAsName>
	   </n1:MetaDataAttributeList>
	   <n1:MetaDataAttributeList>
		  <n1:DataType>STRING</n1:DataType>
		  <n1:IsDeprecated>false</n1:IsDeprecated>
		  <n1:Description>Social or professional title (e.g. Mrs. or Dr.)</n1:Description>
		  <n1:MaxLength>80</n1:MaxLength>
		  <n1:Name>Title</n1:Name>
		  <n1:Nullable>true</n1:Nullable>
		  <n1:Pattern>[^\n]*</n1:Pattern>
		  <n1:UsageOnCreate>ALLOWED</n1:UsageOnCreate>
		  <n1:UsageOnDestroy>IGNORED</n1:UsageOnDestroy>
		  <n1:UsageOnGet>ALLOWED</n1:UsageOnGet>
		  <n1:UsageOnUpdate>ALLOWED</n1:UsageOnUpdate>
		  <n1:UsedAsName>false</n1:UsedAsName>
	   </n1:MetaDataAttributeList>
	</n1:Attributes>
	<n1:CanCreate>true</n1:CanCreate>
	<n1:CanDestroy>true</n1:CanDestroy>
	<n1:CanGet>true</n1:CanGet>
	<n1:CanUpdate>true</n1:CanUpdate>
	<n1:DerivedFrom>
	   <n2:Namespace>urn:base.ws.rightnow.com/v1_2</n2:Namespace>
	   <n2:TypeName>RNObject</n2:TypeName>
	</n1:DerivedFrom>
	<n1:MetaDataLink>Contact</n1:MetaDataLink>
	<n1:Name>
	   <n2:Namespace>urn:objects.ws.rightnow.com/v1_2</n2:Namespace>
	   <n2:TypeName>Contact</n2:TypeName>
	</n1:Name>
</n0:MetaDataClass>

For example, the response payload above suggests that the ‘Login’ field in the Contact object is nullable, and has a maximum length of 80. Knowing this, we can ensure that the bulk-load process rejects data where Login field has more than 80 characters.
In order to use this API, we can either manually analyze the getMetadata response and keep the validation checks updated, or we can create a process that periodically fetches these validations rules and applies them on the data before inserting.
There are four flavors of this API: GetMetadata, GetMetadataForClass, GetMetadataForOperation, and GetMetadataSinceLastDateTime. They are described here.
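As a rough sketch of that second approach, the PL/SQL block below applies the Login rules from the sample metadata above (maximum length 80 and the pattern shown) to rows in a hypothetical CONTACTS_STAGE staging table before the batch payload is built. The table and its LOAD_STATUS column are assumptions for illustration; a real implementation would fetch and cache the rules from getMetadata rather than hard-code them.

DECLARE
  -- Rules taken from the getMetadata sample above; in practice fetch these at runtime
  c_login_max_len CONSTANT PLS_INTEGER   := 80;
  c_login_pattern CONSTANT VARCHAR2(100) := '^[^ ' || CHR(9) || CHR(10) || '<>"]*$';
  l_rejected      PLS_INTEGER := 0;
BEGIN
  -- CONTACTS_STAGE is a hypothetical staging table holding the records to be bulk-loaded
  FOR r IN (SELECT rowid AS rid, login FROM contacts_stage WHERE load_status = 'NEW') LOOP
    IF LENGTH(r.login) > c_login_max_len
       OR NOT REGEXP_LIKE(r.login, c_login_pattern) THEN
      UPDATE contacts_stage SET load_status = 'REJECTED' WHERE rowid = r.rid;
      l_rejected := l_rejected + 1;
    END IF;
  END LOOP;
  DBMS_OUTPUT.put_line('Rows rejected before building the batch payload: ' || l_rejected);
END;
/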

Network Latency

There is network latency in the CWSS payload travelling from the integration layer to OSvC and back. Latency depends not only on the distance the network packets travel, but also on whether the packets travel over the public internet or a VPN.
Thorough testing and network analysis may be needed to ensure network latency isn’t high. Also, it makes sense to keep the integration layer physically close to OSvC.

Conclusion

In this blog I discussed whether bulk import is always necessary. I also demonstrated that the Service Cloud APIs perform well under stress conditions. Finally, I provided a few recommendations for running bulk imports into Service Cloud.
Following those recommendations will go a long way toward ensuring the data import runs smoothly.

Integrating Oracle Sales Cloud (OSC) with Oracle Database as a Service (DBaaS) using PL/SQL


Introduction


This article describes how to integrate Oracle Sales Cloud (OSC) with Oracle Database as a Service (DBaaS) using PL/SQL.

The code snippet provided uses the REST API for Oracle Sales Cloud to create a new OSC contact from DBaaS. The PL/SQL uses UTL_HTTP commands to call the REST API for Oracle Sales Cloud.

A sample use case for this code snippet: display a list of contacts in an Oracle or external application, then allow the application consumer to select the relevant contacts to push to OSC as potential opportunities.

Alternative OSC integration patterns have been discussed in the previously published articles listed below:


Integrating Oracle Sales Cloud with Oracle Business Intelligence Cloud Service (BICS) – Part 1

Integrating Oracle Sales Cloud with Oracle Business Intelligence Cloud Service (BICS) – Part 2


The primary difference between the past and current articles is:


The prior two articles focused on using the APEX_WEB_SERVICE.

This current article uses UTL_HTTP and has no dependency on Apex.


That said, DBaaS does come with Apex, and the previously published solutions above are 100% supported with DBaaS. However, some DBaaS developers may prefer to keep all functions and stored procedures in DBaaS – using native PL/SQL commands through Oracle SQL Developer or SQL*Plus. This article addresses that need.

Additionally, the article explains the prerequisite steps required for calling the REST API for OSC from DBaaS. These steps include: configuring the Oracle Wallet and importing the OSC certificate into the wallet.

The techniques referenced in this blog can be easily altered to integrate with other components of the OSC API. Additionally, they may be useful for those wanting to integrate DbaaS with other Oracle and non-Oracle products using PL/SQL.

There are four steps to this solution:


1. Create Oracle Wallet

2. Import OSC Certificate into Wallet

3. Create Convert Blob to Clob Function

4. Run PL/SQL Sample Snippet


Main Article


1. Create Oracle Wallet

 

The Schema Service Database is pre-configured with the Oracle Wallet and 70+ common root CA SSL certificates. It is completely transparent to developers when building declarative web services in APEX or when using APEX_WEB_SERVICE API.

DbaaS, on the other hand, does not come pre-configured with the Oracle Wallet. Therefore, a wallet must be created and the OSC certificate imported into the Wallet.


Using PuTTY (or another SSH and Telnet client) log on to the DbaaS instance as oracle or opc.

If set, enter the passphrase.

If logged on as opc, run:

sudo su - oracle

Set any necessary environment variables:

. oraenv

Create the Wallet

orapki wallet create -wallet . -pwd Welcome1

 

2. Import OSC Certificate into Wallet

 

From a browser (these examples use Chrome) go to the crmCommonApi contacts URL. The screen will be blank.

https://abc1-cloud1234-crm.oracledemos.com/crmCommonApi/resources/11.1.10/contacts

In R10 you may need to use the latest/contacts URL:

https://abc1-cloud1234-crm.oracledemos.com/crmCommonApi/resources/latest/contacts

Click on the lock

Snap1

Click Connection -> Certificate Information

Snap2

Click “Certification Path”. Select “GeoTrust SSL CA – G3”.

Snap3

Click Details -> Copy to File

Snap4

 

Click Next

Snap5

Select “Base-64 encoded X.509 (.CER)”

Snap6

Save locally as oracledemos.cer

Snap7

Click Finish

Snap8

Snap9

Copy the oracledemos.cer file from the PC to the Cloud server. This can be done using SFTP.

Alternatively, follow the steps below to manually create the oracledemos.cer using vi editor and cut and paste between the environments.


Return to PuTTY.

Using the vi editor create the certificate file.

vi oracledemos.cer

Open the locally saved certificate file in NotePad (or other text editor). Select all and copy.

Return to the vi Editor and paste the contents of the certificate into the oracledemos.cer file.

Hit “i” to insert
“Right Click” to paste
Hit “Esc”
Type “wq” to save
Type “ls -l” to confirm oracledemos.cer file was successfully created

Run the following command to add the certificate to the wallet.

orapki wallet add -wallet . -trusted_cert -cert /home/oracle/oracledemos.cer -pwd Welcome1

 Confirm the certificate was successfully added to the wallet.

orapki wallet display -wallet . -pwd Welcome1

Snap11

3. Create Convert Blob to Clob Function


I used this v_blobtoclob function created by Burleson Consulting to convert the blob to a clob.

There are many other online code samples using various methods to convert blobs to clobs that should work just fine as well.

This function isn’t actually required to create the OSC contact.

It is however necessary to read the response – since the response comes back as a blob.
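If you would rather not take a dependency on an external example, a minimal equivalent can be written with DBMS_LOB.CONVERTTOCLOB. The sketch below is only an assumption of what such a helper could look like (it is not the Burleson version) and converts using the database default character set:

CREATE OR REPLACE FUNCTION v_blobtoclob (p_blob IN BLOB) RETURN CLOB
IS
  l_clob         CLOB;
  l_dest_offset  INTEGER := 1;
  l_src_offset   INTEGER := 1;
  l_lang_context INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warning      INTEGER;
BEGIN
  -- Create a temporary CLOB and convert the BLOB into it using the database character set
  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  DBMS_LOB.CONVERTTOCLOB(dest_lob     => l_clob,
                         src_blob     => p_blob,
                         amount       => DBMS_LOB.LOBMAXSIZE,
                         dest_offset  => l_dest_offset,
                         src_offset   => l_src_offset,
                         blob_csid    => DBMS_LOB.DEFAULT_CSID,
                         lang_context => l_lang_context,
                         warning      => l_warning);
  RETURN l_clob;
END;
/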

 

4. Run PL/SQL Sample Snippet


Replace the highlighted items with those of your environment:


(a) Wallet Path

(b) Wallet Password

(c) OSC crmCommonAPI Contacts URL

(d) OSC User

(e) OSC Pwd

(f) Blob to Clob Function Name


Run the code snippet in SQL Developer or SQL*Plus.

DECLARE
l_http_request UTL_HTTP.req;
l_http_response UTL_HTTP.resp;
l_response_text VARCHAR2(32766);
l_response_raw RAW(32766);
l_inflated_resp BLOB;
l_body VARCHAR2(30000);
l_clob CLOB;
BEGIN
-- JSON body for the new contact
l_body := '{"FirstName": "Jay","LastName": "Pearson","Address": [{"Address1": "100 Oracle Parkway","City": "Redwood Shores","Country": "US","State": "CA"}]}';
-- Point UTL_HTTP at the wallet created in step 1
UTL_HTTP.set_wallet('file:/home/oracle', 'Welcome1');
l_http_request := UTL_HTTP.begin_request('https://abc1-cloud1234-crm.oracledemos.com:443/crmCommonApi/resources/11.1.10/contacts', 'POST', 'HTTP/1.1');
UTL_HTTP.set_authentication(l_http_request, 'User', 'Pwd');
UTL_HTTP.set_header(l_http_request, 'Content-Type', 'application/vnd.oracle.adf.resourceitem+json');
UTL_HTTP.set_header(l_http_request, 'Transfer-Encoding', 'chunked');
UTL_HTTP.set_header(l_http_request, 'Cache-Control', 'no-cache');
utl_http.write_text(l_http_request, l_body);
l_http_response := UTL_HTTP.get_response(l_http_request);
dbms_output.put_line('status code: ' || l_http_response.status_code);
dbms_output.put_line('reason phrase: ' || l_http_response.reason_phrase);
-- The response comes back gzipped; uncompress it and convert the BLOB to a CLOB to read it
UTL_HTTP.read_raw(l_http_response, l_response_raw, 32766);
DBMS_OUTPUT.put_line('>> Response (gzipped) length: '||utl_raw.length(l_response_raw));
l_inflated_resp := utl_compress.lz_uncompress(to_blob(l_response_raw));
DBMS_OUTPUT.put_line('>> Inflated Response: '||dbms_lob.getlength(l_inflated_resp));
l_clob := v_blobtoclob(l_inflated_resp);
dbms_output.put_line(dbms_lob.substr(l_clob,24000,1));
UTL_HTTP.end_response(l_http_response);
END;
/
sho err

Dbms Output should show status code: 201 – Reason Phrase: Created.

However, I have found that status 201 is not 100% reliable.

That is why it is suggested to return the response, so you can confirm that the contact was actually created and get the PartyNumber.

Snap12

Once you have the PartyNumber, Postman can be used to confirm the contact was created and exists in OSC.

https://abc1-abc1234-crm.oracledemos.com:443/crmCommonApi/resources/11.1.10/contacts/345041

Snap13


Further Reading


Click here for the REST API for Oracle Sales Cloud guide.

Click here for Oracle Database UTL_HTTP commands.

Click here for more A-Team BICS Blogs.

 
Summary

 

This article provided a code snippet showing how to use UTL_HTTP PL/SQL commands in DBaaS to create an OSC contact using the REST API for Oracle Sales Cloud.

Additionally, the article provided the prerequisite steps to create an Oracle Wallet and import the OSC certificate. This is required for accessing the OSC API externally – in this case using PL/SQL run in SQL Developer.

The techniques referenced in this blog can be easily altered to integrate with other components of the OSC API. Additionally, they may be useful for those wanting to integrate DbaaS with other Oracle and non-Oracle products using PL/SQL.

Round Trip On-Premise Integration (Part 1) – ICS to EBS


One of the big challenges with adopting Cloud Services Architecture is how to integrate the on-premise applications when the applications are behind the firewall. A very common scenario that falls within this pattern is cloud integration with Oracle E-Business Suite (EBS). To address this cloud-to-ground pattern without complex firewall configurations, DMZs, etc., Oracle offers a feature with the Integration Cloud Service (ICS) called Connectivity Agent (additional details about the Agent can be found under New Agent Simplifies Cloud to On-premises Integration). Couple this feature with the EBS Cloud Adapter in ICS and now we have a viable option for doing ICS on-premise integration with EBS. The purpose of this A-Team blog is to detail the prerequisites for using the EBS Cloud Adapter and walk through a working ICS integration to EBS via the Connectivity Agent where ICS is calling EBS (EBS is the target application). The blog is also meant to be an additional resource for the Oracle documentation for Using Oracle E-Business Suite Adapter.

The technologies at work for this integration include ICS (inbound REST Adapter, outbound EBS Cloud Adapter), Oracle Messaging Cloud Service (OMCS), the ICS Connectivity Agent (on-premise), and Oracle EBS R12.  The integration is a synchronous (request/response) call to EBS where a new employee is created via the EBS HR_EMPLOYEE_API. The flow consists of a REST call to ICS with a JSON payload containing the employee details.  These details are then transformed in ICS from JSON to XML for the EBS Cloud Adapter. The EBS adapter then sends the request to the on-premise Connectivity Agent via OMCS. The agent then makes the call to EBS, and the results are passed back to ICS via OMCS. The EBS response is transformed to JSON and returned to the invoking client. The following is a high-level view of the integration:

ICSEBSCloudAdapter-Overview

Prerequisites

1. Oracle E-Business Suite 12.1.3* or higher.
2. EBS Configured for the EBS Cloud Adapter per the on-line document: Setting Up Oracle E-Business Suite Adapter from Integration Cloud Service.
a. ISG is configured for the EBS R12 Environment.
b. EBS REST services are configured in ISG.
c. Required REST services are deployed in EBS.
d. Required user privileges granted for the deployed REST services in EBS.
3. Install the on-premise Connectivity Agent (see Integration Cloud Service (ICS) On-Premise Agent Installation).

* For EBS 11 integrations, see another A-Team Blog E-Business Suite Integration with Integration Cloud Service and DB Adapter.

Create Connections

1. Inbound Endpoint Configuration.
a. Start the connection configuration by clicking on Create New Connection in the ICS console:
ICSEBSCloudAdapter-Connections_1-001
b. For this blog, we will be using the REST connection for the inbound endpoint. Locate and Select the REST Adapter in the Create Connection – Select Adapter dialog:
ICSEBSCloudAdapter-Connections_1-002
c. Provide a Connection Name in the New Connection – Information dialog:
ICSEBSCloudAdapter-Connections_1-003
d. The shell of the REST Connection has now been created. The first set of properties that needs to be configured is the Connection Properties. Click on the Configure Connectivity button and select REST API Base URL for the Connection Type. For the Connection URL, provide the ICS POD host since this is an incoming connection for the POD. A simple way to get the URL is to copy it from the browser location of the ICS console being used to configure the connection:
ICSEBSCloudAdapter-Connections_1-004
e. The last set of properties that need to be configured are the Credentials. Click on the Configure Credentials button and select Basic Authentication for the Security Policy. The Username and Password for the basic authentication will be a user configured on the ICS POD:
ICSEBSCloudAdapter-Connections_1-005
f. Now that we have all the properties configured, we can test the connection. This is done by clicking on the Test icon at the top of the window. If everything is configured correctly, the message The connection test was successful! is displayed:
ICSEBSCloudAdapter-Connections_1-006
2. EBS Endpoint Connection
a. Create another connection, but this time select Oracle E-Business Suite from the Create Connection – Select Adapter dialog:
ICSEBSCloudAdapter-Connections_2-001
b. Provide a Connection Name in the New Connection – Information dialog:
ICSEBSCloudAdapter-Connections_2-002
c. Click on the Configure Connectivity button and for the EBS Cloud Adapter there is only one property, the Connection URL. This URL will be the hostname and port where the EBS metadata provider has been deployed. This metadata is provided by Oracle’s E-Business Suite Integrated SOA Gateway (ISG) and the setup/configuration of ISG can be found under the Prerequisites for this blog (item #2). The best way to see if the metadata provider has been deployed is to access the WADL using a URL like the following: http://ebs.example.com:8000/webservices/rest/provider?WADL where ebs.example.com is the hostname of your EBS metadata provider machine. The URL should provide something like the following:
<?xml version = '1.0' encoding = 'UTF-8'?>
<application name="EbsMetadataProvider" targetNamespace="http://xmlns.oracle.com/apps/fnd/soaprovider/pojo/ebsmetadataprovider/" xmlns:tns="http://xmlns.oracle.com/apps/fnd/soaprovider/pojo/ebsmetadataprovider/" xmlns="http://wadl.dev.java.net/2009/02" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:tns1="http://xmlns.oracle.com/apps/fnd/rest/provider/getinterfaces/" xmlns:tns2="http://xmlns.oracle.com/apps/fnd/rest/provider/getmethods/" xmlns:tns3="http://xmlns.oracle.com/apps/fnd/rest/provider/getproductfamilies/" xmlns:tns4="http://xmlns.oracle.com/apps/fnd/rest/provider/isactive/">
   <grammars>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getinterfaces_post.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getmethods_post.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getproductfamilies_post.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=isactive_post.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
   <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getinterfaces_get.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getmethods_get.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getproductfamilies_get.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=isactive_get.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
   </grammars>
   <resources base="http://ebs.example.com:8000/webservices/rest/provider/">
      <resource path="getInterfaces/{product}/">
         <param name="product" style="template" required="true" type="xsd:string"/>
         <method id="getInterfaces" name="GET">
            <request>
               <param name="ctx_responsibility" type="xsd:string" style="query" required="false"/>
               <param name="ctx_respapplication" type="xsd:string" style="query" required="false"/>
               <param name="ctx_securitygroup" type="xsd:string" style="query" required="false"/>
               <param name="ctx_nlslanguage" type="xsd:string" style="query" required="false"/>
               <param name="ctx_language" type="xsd:string" style="query" required="false"/>
               <param name="ctx_orgid" type="xsd:int" style="query" required="false"/>
               <param name="scopeFilter" type="xsd:string" style="query" required="true"/>
               <param name="classFilter" type="xsd:string" style="query" required="true"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns1:getInterfaces_Output"/>
               <representation mediaType="application/json" type="tns1:getInterfaces_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getInterfaces/">
         <method id="getInterfaces" name="POST">
            <request>
               <representation mediaType="application/xml" type="tns1:getInterfaces_Input"/>
               <representation mediaType="application/json" type="tns1:getInterfaces_Input"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns1:getInterfaces_Output"/>
               <representation mediaType="application/json" type="tns1:getInterfaces_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getMethods/{api}/">
         <param name="api" style="template" required="true" type="xsd:string"/>
         <method id="getMethods" name="GET">
            <request>
               <param name="ctx_responsibility" type="xsd:string" style="query" required="false"/>
               <param name="ctx_respapplication" type="xsd:string" style="query" required="false"/>
               <param name="ctx_securitygroup" type="xsd:string" style="query" required="false"/>
               <param name="ctx_nlslanguage" type="xsd:string" style="query" required="false"/>
               <param name="ctx_language" type="xsd:string" style="query" required="false"/>
               <param name="ctx_orgid" type="xsd:int" style="query" required="false"/>
               <param name="scopeFilter" type="xsd:string" style="query" required="true"/>
               <param name="classFilter" type="xsd:string" style="query" required="true"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns2:getMethods_Output"/>
               <representation mediaType="application/json" type="tns2:getMethods_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getMethods/">
         <method id="getMethods" name="POST">
            <request>
               <representation mediaType="application/xml" type="tns2:getMethods_Input"/>
               <representation mediaType="application/json" type="tns2:getMethods_Input"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns2:getMethods_Output"/>
               <representation mediaType="application/json" type="tns2:getMethods_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getProductFamilies/">
         <method id="getProductFamilies" name="GET">
            <request>
               <param name="ctx_responsibility" type="xsd:string" style="query" required="false"/>
               <param name="ctx_respapplication" type="xsd:string" style="query" required="false"/>
               <param name="ctx_securitygroup" type="xsd:string" style="query" required="false"/>
               <param name="ctx_nlslanguage" type="xsd:string" style="query" required="false"/>
               <param name="ctx_language" type="xsd:string" style="query" required="false"/>
               <param name="ctx_orgid" type="xsd:int" style="query" required="false"/>
               <param name="scopeFilter" type="xsd:string" style="query" required="true"/>
               <param name="classFilter" type="xsd:string" style="query" required="true"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns3:getProductFamilies_Output"/>
               <representation mediaType="application/json" type="tns3:getProductFamilies_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getProductFamilies/">
         <method id="getProductFamilies" name="POST">
            <request>
               <representation mediaType="application/xml" type="tns3:getProductFamilies_Input"/>
               <representation mediaType="application/json" type="tns3:getProductFamilies_Input"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns3:getProductFamilies_Output"/>
               <representation mediaType="application/json" type="tns3:getProductFamilies_Output"/>
            </response>
         </method>
      </resource>
      <resource path="isActive/">
         <method id="isActive" name="GET">
            <request>
               <param name="ctx_responsibility" type="xsd:string" style="query" required="false"/>
               <param name="ctx_respapplication" type="xsd:string" style="query" required="false"/>
               <param name="ctx_securitygroup" type="xsd:string" style="query" required="false"/>
               <param name="ctx_nlslanguage" type="xsd:string" style="query" required="false"/>
               <param name="ctx_language" type="xsd:string" style="query" required="false"/>
               <param name="ctx_orgid" type="xsd:int" style="query" required="false"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns4:isActive_Output"/>
               <representation mediaType="application/json" type="tns4:isActive_Output"/>
            </response>
         </method>
      </resource>
      <resource path="isActive/">
         <method id="isActive" name="POST">
            <request>
               <representation mediaType="application/xml" type="tns4:isActive_Input"/>
               <representation mediaType="application/json" type="tns4:isActive_Input"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns4:isActive_Output"/>
               <representation mediaType="application/json" type="tns4:isActive_Output"/>
            </response>
         </method>
      </resource>
   </resources>
</application>

 

If you don’t get something like the above XML, here are some general troubleshooting steps:
1. Log in to the EBS console.
2. Navigate to Integrated SOA Gateway –> Integration Repository.
3. Click on the “Search” button on the right.
4. Enter “oracle.apps.fnd.rep.ws.service.EbsMetadataProvider” in the “Internal Name” field.
5. Click “Go”. (If this doesn’t list anything, you are missing a patch on the EBS instance. Please follow Note 1311068.1.)
6. Click on “Metadata Provider”.
7. Click on the “REST Web Service” tab.
8. Enter “provider” as is in the “Service Alias” field and click the “Deploy” button.
9. Navigate to the “Grants” tab and give grants on all methods.
If the WADL shows that the metadata provider is deployed and ready, the Connection URL is simply the host name and port where the metadata provider is deployed. For example, http://ebs.example.com:8000
ICSEBSCloudAdapter-Connections_2-003
d. The next set of properties that need to be configured are the Credentials. Click on the Configure Credentials button and select Basic Authentication for the Security Policy. The Username and Password for the basic authentication will be a user configured on the on-premise EBS environment granted privileges to access the EBS REST services:
ICSEBSCloudAdapter-Connections_2-004
NOTE: The Property Value for Username in the screen shot above shows the EBS sysadmin user. This will most likely “not” be the user that has grants on the EBS REST service. If you use the sysadmin user here and your integration (created later) “fails at runtime” with a “Responsibility is not assigned to user” error from EBS, either the grants on the EBS REST service are not created or a different EBS user needs to be specified for this connection. Here is an example error you might get:
<ISGServiceFault>
    <Code>ISG_USER_RESP_MISMATCH</Code>
    <Message>Responsibility is not assigned to user</Message>
    <Resolution>Please assign the responsibility to the user.</Resolution>
    <ServiceDetails>
        <ServiceName>HREmployeeAPISrvc</ServiceName>
        <OperationName>CREATE_EMPLOYEE</OperationName>
        <InstanceId>0</InstanceId>
    </ServiceDetails>
</ISGServiceFault>
e. Finally, we need to associate this connection with the on-premise Connectivity Agent that was configured as a Prerequisite. To do this, click on the Configure Agents button and select the agent group that contains the running on-premise Connectivity Agent:
ICSEBSCloudAdapter-Connections_2-005
f. Now that we have all the properties configured, we can test the connection. This is done by clicking on the Test icon at the top of the window. If everything is configured correctly, the message The connection test was successful! is displayed:
ICSEBSCloudAdapter-Connections_2-006
3. We are now ready to construct our cloud-to-ground integration using ICS and the connections that were just created.

Create Integration

1. Create New Integration.
a. Navigate to the Integrations page of the Designer section. Then click on Create New Integration:
ICSEBSCloudAdapter-CreateIntegration_1-001
b. In the Create Integration – Select a Pattern dialog, locate the Map My Data and select it:
ICSEBSCloudAdapter-CreateIntegration_1-002
c. Give the new integration a name and click on Create:
ICSEBSCloudAdapter-CreateIntegration_1-003
2. Configure Inbound Endpoint.
a. The first thing we will do is create our inbound endpoint (the entry point to the ICS integration). In the Integration page that opened from the previous step, locate the Connections section and find the REST connection configured earlier. Drag-and-drop that connection onto the inbound (left-hand side) of the integration labeled “Drag and Drop a Trigger”:
ICSEBSCloudAdapter-CreateIntegration_2-001
b. Since the focus of this blog is on the EBS Adapter, we will not go into the details of setting up this endpoint. The important details for this integration is that the REST service will define both the request and the response in JSON format:

Example Request:

{
  "CREATE_EMPLOYEE_Input": {
    "RESTHeader": {
      "Responsibility": "US_SHRMS_MANAGER",
      "RespApplication": "PER",
      "SecurityGroup": "STANDARD",
      "NLSLanguage": "AMERICAN",
      "Org_Id": "204"
    },
    "InputParameters": {
      "HireDate": "2016-01-01T09:00:00",
      "BusinessGroupID": "202",
      "LastName": "Sled",
      "Sex": "M",
      "Comments": "Create From ICS Integration",
      "DateOfBirth": "1991-07-03T09:00:00",
      "EMailAddress": "bob.sled@example.com",
      "FirstName": "Robert",
      "Nickname": "Bob",
      "MaritalStatus": "S",
      "MiddleName": "Rocket",
      "Nationality": "AM",
      "SocialSSN": "555-44-3333",
      "RegisteredDisabled": "N",
      "CountryOfBirth": "US",
      "RegionOfBirth": "Montana",
      "TownOfBirth": "Missoula"
    }
  }
}

Example Response:

{
  "CreateEmployeeResponse": {
    "EmployeeNumber": 2402,
    "PersonID": 32871,
    "AssignmentID": 34095,
    "ObjectVersionNumber": 2,
    "AsgObjectVersionNumber": 1,
    "EffectiveStartDate": "2016-01-01T00:00:00.000-05:00",
    "EffectiveEndDate": "4712-12-31T00:00:00.000-05:00",
    "FullName": "Sled, Robert Rocket (Bob)",
    "CommentID": 1304,
    "AssignmentSequence": null,
    "AssignmentNumber": 2402,
    "NameCombinationWarning": 0,
    "AssignPayrollWarning": 0,
    "OrigHireWarning": 0
  }
}
ICSEBSCloudAdapter-CreateIntegration_2-002
3. Configure Outbound Endpoint.
a. Now we will configure the endpoint to EBS. In the Integration page, locate the Connections section and find the E-Business Suite adapter connection configured earlier. Drag-and-drop that connection onto the outbound (right-hand side) of the integration labeled “Drag and Drop an Invoke”:
ICSEBSCloudAdapter-CreateIntegration_3-001
b. The Configure Oracle E-Business Suite Adapter Endpoint configuration window should now be open. Provide a meaningful name for the endpoint and press Next >. If the window hangs or errors out, check to make sure the Connectivity Agent is running and ready. This endpoint is dependent on the communication between ICS and EBS via the Connectivity Agent.
ICSEBSCloudAdapter-CreateIntegration_3-002
c. At this point, the adapter has populated the Web Services section of the wizard with Product Family and Product metadata from EBS. For this example, the Product Family will be Human Resources Suite and the Product will be Human Resources. Once those are selected, the window will be populated with API details.
ICSEBSCloudAdapter-CreateIntegration_3-003
d. Next to API label is a text entry field where the list of APIs can be searched by typing values in that field. This demo uses the HR_EMPLOYEE_API, which can be found by typing Employee in the text field and selecting Employee from the list:
ICSEBSCloudAdapter-CreateIntegration_3-004
e. The next section of the configuration wizard is the Operations. This will contain a list of “all” operations for the API including operations that have not yet been deployed in the EBS Integration Repository. If you select an operation and see a warning message indicating that the operation has not been deployed, you must go to the EBS console and deploy that operation in the Integration Repository and provide the appropriate grants.
ICSEBSCloudAdapter-CreateIntegration_3-005
f. This demo will use the CREATE_EMPLOYEE method of the HR_EMPLOYEE_API. Notice that there is no warning when this method is selected:
ICSEBSCloudAdapter-CreateIntegration_3-006
g. The Summary section of the configuration wizard shows all the details from the previous steps. Click on Done to complete the endpoint configuration.
ICSEBSCloudAdapter-CreateIntegration_3-007
h. Check point – the ICS integration should look something like the following:
ICSEBSCloudAdapter-CreateIntegration_3-008
4. Request/Response Mappings.
a. The mappings for this example are very straightforward in that the JSON was derived from the EBS input/output parameters, so the relationships are fairly intuitive. Also, the number of data elements has been minimized to simplify the mapping process. It is also a good idea to provide a Fault mapping:

Request Mapping:

ICSEBSCloudAdapter-CreateIntegration_4-001

Response Mapping:

ICSEBSCloudAdapter-CreateIntegration_4-002

Fault Mapping:

ICSEBSCloudAdapter-CreateIntegration_4-003
5. Set Tracking.
a. The final step to getting the ICS integration to 100% is to Add Tracking. This is done by clicking on the Tracking icon at the top right-hand side of the Integration window.
ICSEBSCloudAdapter-CreateIntegration_5-001
b. In the Business Identifiers For Tracking window, drag-and-drop fields that will be used for tracking purposes. These fields show up in the ICS console in the Monitoring section for the integration.
ICSEBSCloudAdapter-CreateIntegration_5-002
c. There can be up to 3 fields used for the tracking, but only one is considered the Primary.
ICSEBSCloudAdapter-CreateIntegration_5-003
6. Save (100%).
a. Once the Tracking is configured, the integration should now be at 100% and ready for activation. This is a good time to Save all the work that has been done thus far.
ICSEBSCloudAdapter-CreateIntegration_6-001

Test Integration

1. Make sure the integration is activated, then open the endpoint URL located by clicking on the Information icon.
ICSEBSCloudAdapter-Test-001
2. Review the details of this page since it contains everything needed for the REST client that will be used for testing the integration.
ICSEBSCloudAdapter-Test-002
3. Open a REST test client and provide all the necessary details from the endpoint URL. The important details from the page include:
Base URL: https://[ICS POD Host Name]/integration/flowapi/rest/HR_CREATE_EMPLOYEE/v01
REST Suffix: /hr/employee/create
URL For Test Client: https://[ICS POD Host Name]/integration/flowapi/rest/HR_CREATE_EMPLOYEE/v01/hr/employee/create
REST Method: POST
Content-Type application/json
JSON Payload:
{
  "CREATE_EMPLOYEE_Input": {
    "RESTHeader": {
      "Responsibility": "US_SHRMS_MANAGER",
      "RespApplication": "PER",
      "SecurityGroup": "STANDARD",
      "NLSLanguage": "AMERICAN",
      "Org_Id": "204"
    },
    "InputParameters": {
      "HireDate": "2016-01-01T09:00:00",
      "BusinessGroupID": "202",
      "LastName": "Demo",
      "Sex": "M",
      "Comments": "Create From ICS Integration",
      "DateOfBirth": "1991-07-03T09:00:00",
      "EMailAddress": "joe.demo@example.com",
      "FirstName": "Joseph",
      "Nickname": "Demo",
      "MaritalStatus": "S",
      "MiddleName": "EBS",
      "Nationality": "AM",
      "SocialSSN": "444-33-2222",
      "RegisteredDisabled": "N",
      "CountryOfBirth": "US",
      "RegionOfBirth": "Montana",
      "TownOfBirth": "Missoula"
    }
  }
}
The last piece that is needed for the REST test client is authentication information. Add Basic Authentication to the header with a user name and password for an authorized “ICS” user. The user that will be part of the on-premise EBS operation is specified in the EBS connection that was configured in ICS earlier. The following shows what all this information looks like using the Firefox RESTClient add-on:
ICSEBSCloudAdapter-Test-003
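As an alternative to a browser REST client, and in keeping with the PL/SQL examples elsewhere in this document, the same test call could be issued from a database schema with APEX installed. This is only a hedged sketch: the host name, credentials, and payload below are placeholders to substitute, and on DBaaS wallet parameters may also be needed so the database trusts the ICS certificate.

DECLARE
  l_response CLOB;
  l_body     CLOB := '{ ...same JSON payload as shown above... }';  -- placeholder
BEGIN
  apex_web_service.g_request_headers(1).name  := 'Content-Type';
  apex_web_service.g_request_headers(1).value := 'application/json';
  l_response := apex_web_service.make_rest_request(
                  p_url         => 'https://[ICS POD Host Name]/integration/flowapi/rest/HR_CREATE_EMPLOYEE/v01/hr/employee/create',
                  p_http_method => 'POST',
                  p_username    => 'ics_user',      -- authorized ICS user (placeholder)
                  p_password    => 'ics_password',  -- placeholder
                  p_body        => l_body);
  -- On DBaaS, add p_wallet_path/p_wallet_pwd if the ICS certificate is not already trusted
  dbms_output.put_line(dbms_lob.substr(l_response, 4000, 1));
END;
/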
4. Before we test the integration, we can log in to the EBS console as the HRMS user. Then, navigating to Maintaining Employees, we can search for our user Joseph Demo by his last name. Notice that nothing comes up for the search:
ICSEBSCloudAdapter-Test-004
5. Now we send the POST from the RESTClient and review the response:
ICSEBSCloudAdapter-Test-005
6. We can compare what was returned from EBS to ICS in the EBS application. Here is the search results for the employee Joseph Demo:
ICSEBSCloudAdapter-Test-006
7. Here are the details for Joseph Demo:
ICSEBSCloudAdapter-Test-007
8. Now we return to the ICS console and navigate to the Tracking page of the Monitoring section. The integration instance shows up with the primary tracking field of Last Name: Demo
ICSEBSCloudAdapter-Test-008
9. Finally, by clicking on the tracking field for the instance, we can view the details:
ICSEBSCloudAdapter-Test-009

Hopefully this walkthrough of how to do an ICS integration to an on-premise EBS environment has been useful. I am looking forward to any comments and/or feedback you may have. Also, keep an eye out for the “Part 2” A-Team Blog that will detail EBS business events surfacing in ICS to complete the ICS/EBS on-premise round trip integration scenarios.

Loading Data into Oracle Cloud ERP R10 using the new LoadAndImportData operation


 

Introduction

As part of Oracle ERP cloud release 10 a new SOAP function has been made available to our customers which greatly simplifies the loading of ERP data using the batch oriented SOAP Services.

This article aims to give the reader details of this new SOAP service and how it helps in loading data files into Oracle ERP Cloud.

Assuming the input file has already been produced, loading the data into the Oracle ERP Cloud service is traditionally a multi-step process.

The typical “happy” path is :

  1. Load the file into the Oracle Fusion ERP UCM service.
  2. Execute the first ESS job, which transfers the file from UCM to the Oracle ERP interface tables.
  3. Using a polling technique, check to see when the ESS job has finished transferring the file into the interface tables.
  4. Execute a second ESS job, which transfers the file from the Oracle ERP interface tables to the Oracle ERP data object tables.
  5. Use a polling technique to check to see when the file has been processed.
  6. Finally, execute a call to the downloadESSJobExecutionDetails() operation to download a log file so you can check for success, or any errors, which need dealing with.

Whilst this approach appears attractive, as it allows the developer a great deal of control over the process, in truth this internal processing should be something that the SaaS application [Oracle ERP Cloud] manages, providing feedback to the developer when processing finishes.

New SOAP method in R10

As of Oracle ERP Cloud Release 10 there is a new API called “loadAndImportData”, which is held within the ErpIntegrationService (https://(FinancialDomain,Financial Common)/publicFinancialCommonErpIntegration/ErpIntegrationService?WSDL). This service has been specifically created to simplify the loading of data into the Oracle ERP Cloud service by allowing you to submit a file which is then automatically taken through the various stages of processing within Oracle ERP Cloud, without the user needing to execute each step of the process manually.

The operation takes the following parameters :

document (Document Information SDO): List of elements, each containing the details of the file to be uploaded. The details include the file content, file name, content type, file title, author, security group, and account.
jobList (Process Details SDO): List of elements, each containing the details of the Enterprise Scheduling Service job to be submitted to import and process the uploaded file. The details include the job definition name, job package name, and list of parameters.
interfaceDetails (string): The interface whose data is to be loaded.
notificationCode (string): A two-digit number that represents the manner and timing in which a notification is sent.
callbackURL (string): The callback URL of the service implemented by customers to receive the Enterprise Scheduling Service job status on completion of the job.

 

Diving into the Details

A sample SOAP payload, which imports journal records, looks like the following:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:typ="http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService/types/" xmlns:erp="http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService/">
   <soapenv:Header/>
   <soapenv:Body>
      <typ:loadAndImportData>
         <typ:document>
            <erp:Content>  UEsDBBQAAAAIAMG2b0hvJGqkiAAAAKsBAAAPAAAAR2xJbnRlcmZhY2UuY3N2tY+xDoJADIZ3E9+hD9BIexojIwQWB0wU49yQqgMcyYnv7wELiYEwQIe2/9+mzZelD2Q0xMeA9gExxlKKLRRyJ/bzVIdXrepmoO+3ZLgfoU9EhGzYtA11ajTCcNePc5UKIuhKLE3xPnnz4r+8FM7111kpW+c/eOp8miXbzXJQB2aeAQU91rpUP1BLAQIUABQAAAAIAMG2b0hvJGqkiAAAAKsBAAAPAAAAAAAAAAAAIAAAAAAAAABHbEludGVyZmFjZS5jc3ZQSwUGAAAAAAEAAQA9AAAAtQAAAAAA</erp:Content>
            <erp:FileName>LoadGLData1.zip</erp:FileName>
            <erp:ContentType>zip</erp:ContentType>
            <erp:DocumentTitle>ImportJournalEntry</erp:DocumentTitle>
            <erp:DocumentAuthor></erp:DocumentAuthor>
            <erp:DocumentSecurityGroup>FAFusionImportExport</erp:DocumentSecurityGroup>
            <erp:DocumentAccount>fin$/journal$/import$</erp:DocumentAccount>
         </typ:document>
         <typ:jobList>
      
           <erp:JobName>oracle/apps/ess/financials/generalLedger/programs/common,JournalImportLauncher</erp:JobName>
           <erp:ParameterList>1061,Balance Transfer,1,123,N,N,N</erp:ParameterList>
         </typ:jobList>
         <typ:interfaceDetails>15</typ:interfaceDetails>
         <typ:notificationCode>50</typ:notificationCode>
         <typ:callbackURL>http://somecallbackserver.domain.com/mycallback</typ:callbackURL>
      </typ:loadAndImportData>
   </soapenv:Body>
</soapenv:Envelope>

Now let’s dive into each element and explain what it represents and, more importantly, where you derive the data from:

  • Document : This element contains the details of the document to be uploaded
    • content : This is the document itself, base64-encoded and in-lined in the SOAP payload. There are many tools on the internet to base64-encode a document, and in Java there is a helper, Base64.Encoder, which does this for you (a hedged PL/SQL alternative is sketched after this list).
    • contentType : This value should be set to “zip”; this means your files must be zipped before base64-encoding and in-lining them above
    • documentTitle : A title for the document, this is so you can find it in UCM later if you need to.
    • documentSecurityGroup : Needs to be set to a security group that secures the document, for our example we’ve used FAFusionImportExport
    • documentAccount : This needs to be set to the correct account depending on the data which is being loaded. For our journal import we need to set the account to fin$/journal$/import$. This is the same account used when you “manually” upload files into Oracle ERP for loading. If you don’t know what UCM account your data should be loaded into, you can find it by going into the File Based Data Import for Financials Cloud documentation and searching for your data object. In our case the object is “Journal Import” and the documentation states that the UCM account is fin/journal/import. For our SOAP service each path segment is suffixed with a “$”.
  • jobList : This element contains data describing the job which needs to be executed for this batch upload
    • jobName : This is the “package name” of the ESS job which loads the data into Oracle ERP Cloud. You can find this in FusionAppsOER or in the documentation. The format for the field is “packageName,jobName”
    • parameterList : This is the list of parameters which the job requires to execute. The parameters depend on the ESS job being executed. In our case the ESS job is for journals and the parameters are Data Access Set ID, Source (Balance Transfer), LedgerID, GroupID (aka BatchID), etc.
    • journalimport

      Example from FusionOER

  • interfaceDetails :  This is set to 15 for journals  (no longer needed in R11)
  • notificationCode : This is set to 50 (no longer needed in R11)
  • callBackURL :  The magic of this service is that it executes all of the ESS jobs in the background and then executes a callback to your service when it’s finished. The response contains the “last” ESS job ID executed, so you can then query the status of the jobs using the downloadESSJobExecutionDetails method.
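The Content element above is simply the zip file base64-encoded. The article mentions Java’s Base64.Encoder; if you are assembling the payload from the database instead (as in the PL/SQL examples elsewhere in this document), a hedged sketch of a BLOB-to-base64 helper might look like the following (the function name and approach are illustrative, not part of the ERP API):

CREATE OR REPLACE FUNCTION blob_to_base64 (p_blob IN BLOB) RETURN CLOB
IS
  -- Chunk size is a multiple of 3 so each chunk encodes to base64 without padding issues
  c_chunk CONSTANT PLS_INTEGER := 12000;
  l_clob  CLOB;
  l_raw   RAW(32767);
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  FOR i IN 0 .. TRUNC((DBMS_LOB.GETLENGTH(p_blob) - 1) / c_chunk) LOOP
    l_raw := DBMS_LOB.SUBSTR(p_blob, c_chunk, i * c_chunk + 1);
    -- Encode the chunk and strip the CR/LF line breaks that UTL_ENCODE inserts
    DBMS_LOB.APPEND(
      l_clob,
      TO_CLOB(REPLACE(REPLACE(
        UTL_RAW.CAST_TO_VARCHAR2(UTL_ENCODE.BASE64_ENCODE(l_raw)),
        CHR(13), ''), CHR(10), '')));
  END LOOP;
  RETURN l_clob;  -- this value is what goes inside the <erp:Content> element
END;
/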

 

Handling the callback

  • As mentioned earlier, the loadAndImportData operation does all the heavy lifting and orchestration within Oracle Fusion ERP SaaS. The only thing the developer needs to implement (optional, but very desirable) is a web service endpoint which handles the callback generated by the ESS framework. This service needs to implement the ESS onJobCompletion operation, which delivers three pieces of data: the requestId of the ESS job which completed, the state of the process, and a status message. For more information on handling the ESS callbacks please see this documentation link, and additionally, if you are using BPEL to execute the SOAP service, then this documentation link may be of interest (Section 11.7.7: Receive the Job Completion Status).

 

Conclusion

The new LoadAndImportData operation will most certainly make importing data into Oracle ERP a much simpler process. Its biggest advantage is that developers can trigger the import with a single SOAP call, without the need to worry about orchestration. There are, however, scenarios where you would probably use the traditional step-by-step method, for example when you want to notify external providers as each step is executed at the macro level, or when the import file is very large (>100MB). In the latter case you might want to upload the file into Oracle UCM using UCM’s native IdcWebService, which supports MTOM, and then execute the ESS jobs in order as we have traditionally done.

 

 

Using Event Handling Framework for Outbound Integration of Oracle Sales Cloud using Integration Cloud Service


Introduction:

Oracle’s iPaaS solution is the most comprehensive cloud-based integration platform in the market today. Integration Cloud Service (ICS) gives customers an elevated user experience that makes complex integration simple to implement.

Oracle Sales Cloud (OSC) is a SaaS application and is a part of the comprehensive CX suite of applications. Since OSC is usually the customer master and the center for all sales-related activities, integration with OSC is a requirement in most use cases.

Although OSC provides useful tools for outbound as well as inbound integration, it is a common practice to use ICS as a tool to integrate OSC and other SaaS as well as on-premises applications. In this article, I will explore this topic in detail and also demonstrate the use of Event Handling Framework (EHF) in OSC to achieve the same.

Main Article:

Within ICS you can leverage the OSC adapter to create an integration flow. OSC can act both as a source (inbound) and as a target (outbound) for integration with other SaaS or on-premises applications, with ICS in the middle acting as the integration agent. While the inbound integration flow is triggered by the source application, invoking the outbound flow is the responsibility of OSC.

InboundIntegration OurboundIntegration

In this article, I will discuss the outbound flow, where OSC acts as the source and other applications serve as the target. There are essentially 2 ways of triggering this integration:

  • Invoking the ICS integration every time the object which needs to be integrated is created or updated. This can be achieved by writing groovy code inside create/update triggers of the object and invoking the flow web service by passing in the payload.
  • Using the Event Handling Framework (EHF) to generate an update or create event on the object and notify the subscribers. In this case, ICS registers itself with OSC and gets notified when the event gets fired along with the payload

 

OSC supports events for most important business objects such as Contact, Opportunities, Partners etc. More objects are being enabled with EHF support on a continuous basis.

In this article, I will demonstrate how to use EHF to achieve an outbound integration. We will create a flow in ICS which subscribes to the “Contact Created” event and on being notified of the event, updates the newly created contact object. While this integration is quite basic, it demonstrates the concept. While we use Update Contact as a target for our integration, you can use another SaaS application (for example Siebel or Service Cloud) as the target and create a Contact there.

Integration

 

Detailed steps:

Before starting, let’s identify some URLs. For this example, we will need 2 URLs: one for the CommonDomain and one for the CRMDomain. You can find these from Review Topology under Setup and Maintenance.

CRM_URL FS_URL

The URLs will be of the following form:

CommonDomain: https://<instance_name>.fs.us2.oraclecloud.com

CRMDomain: https://<instance_name>.crm.us2.oraclecloud.com

I will refer to these URLs as COMMON_DOMAIN_URL and CRM_DOMAIN_URL in the rest of the article.

Let’s now move on to configuring our environment and creating an example integration based on events.

The first step is to create a CSF key so that Sales Cloud can connect to ICS and invoke the subscriptions. In R11, this can be achieved through SOA Composer. To access SOA Composer, navigate to <CRM_DOMAIN_URL>/soa/composer

Inside SOA Composer, click on “Manage Security” to open the “Manage Credentials” dialog. The name of the csf-key should be the same as the identity domain of the ICS instance. Provide the username and password of the user that OSC should use to invoke ICS subscriptions.

Note: Customers didn’t have this ability in R10 and it had to be done by the operations team.

001_CSF_Key

Login to the ICS Console and, on the home page, click on Create Connections followed by Create New Connection.

01_ICS_Home_Page

02_Connections

Click Select under Oracle Sales Cloud

 

03_Create_Connection

Provide a unique name and identifier for the connection. Optionally, provide a detailed description. Click Create

04_New_Connection

 

You will see a prompt that the connection was created successfully and you will automatically go to the connection details page, which tracks your progress as well. Click Configure Connectivity.

05_Connection_Created

In the Connection Properties page, provide details as follows:

OSC Services Catalog WSDL URL: <COMMON_DOMAIN_URL>/fndAppCoreServices/ServiceCatalogService?wsdl

OSC Events Catalog URL: <CRM_DOMAIN_URL>/soa-infra

06_Connection_Properties

Click Configure Credentials

07_Configure_Credential

Provide the credentials of the service user that will be used for integration and click OK.

08_Credentials

The Connection details page shows the connection is 85% complete. The only step remaining at this point is to test the connection to make sure all the details provided are correct. Click on Test.

09_Test

If all the provided details are correct, you will see a message confirming the test was successful. The progress indicator also shows 100%. At this point, click Save and then Exit Integration.

10_Test_Successful

You see a confirmation that the connection was saved successfully. You can also see the new connection in the list.

11_Connections

The next step is to use this connection to create an integration. Click on Integrations followed by Create New Integration.

12_Create_Integration

In the Create Integration – Select a Pattern dialog, click Select under Map My Data. You may choose a different pattern based on your integration requirements but for this example, we will use Map My Data pattern.

13_Select_Pattern

In the New Integration – Information dialog provide the unique name and identifier for this integration, an appropriate version number, and optionally a package name and description.

14_Integration_Information

Drag and drop the connection that we created onto the source. This opens the Configure Sales Cloud Endpoint wizard.

15_Integration_Created

In the Configure Sales Cloud Endpoint wizard, provide the name, and optionally a description of the endpoint. Click Next.

16_Configure_Sales_Cloud_EP

In the section titled Configure a Request, choose With Business Events to create this integration using Business Events in OSC. For this example, we will use the Contact Created event, which fires when a contact is created in OSC. Click Next.

17_Pick_Event

In the next screen under section titled Response Type, choose None and click Next.

18_Response

The wizard shows the endpoint summary. Review the details and click Done.

19_EP_Summary

Now we have to create a target endpoint. Usually this target will be another application that we are integrating with OSC. For our example, we will simply use OSC as a target application itself. Drag and drop the OSC connection we created earlier into the target.

20_EP1_Done

In the Configure Sales Cloud Endpoint wizard, provide the name, and optionally a description of the endpoint. Click Next.

21_Configure_Sales_Cloud_EP

Under the section titled Select a Business Object, find the Contact object and click on it. The drop-down below shows the operations this object supports. For this example, choose updateContact and click Next.

22_Pick_Business_Object

The wizard shows the endpoint summary. Review the details and click Done.

23_EP2_Summary

Now we need to map the source payload to the target payload. Click on the Map icon followed by the “+” icon to create a mapping.

24_EP1_Done

In the mapping wizard, you can specify the appropriate mapping. For our example, we will use a very simple mapping to update the PreviousLastName with the value of LastName we received in the payload. This doesn’t add a lot of value, but serves the purpose of illustrating an end-to-end integration. Drag and drop PartyId to PartyId from source to target and LastName to PreviousLastName from source to target. Click Save and Exit Mapper.

25_Map1

The integration details page shows our integration is 77% complete. One final step is to add tracking fields which allow us to identify various instances of integration. Click on Tracking.

26_Tracking

Drag and drop appropriate fields from Source into tracking fields and click Done.

27_Tracking_Identifiers

Now our integration is 100% complete. We can optionally choose an action for the response and fault. For our example, we will skip this step. Click on Save followed by Exit Integration.

28_integration_Complete

The ICS console shows the integration was saved successfully. The newly created integration also shows up in the list of integrations. Click Activate to activate this integration.

29_Integration_Saved

In the confirmation dialog, click Yes.

30_Activation_Confirmation

Once the integration is active, a subscription for it is created in OSC. You can review this subscription, as well as all the other subscriptions by invoking the following URL from your browser:

<CRM_DOMAIN_URL>/soa-infra/PublicEvent/subscriptions

31_Subscriptions

You can now create a Contact in Sales Cloud and it will almost instantaneously be updated with the new value of Previous Last Name.

 

Accessing Fusion Data from BI Reports using Java


Introduction

In a recent article on A-Team Chronicles, Richard Williams explained how you can execute a BI Publisher report from a SOAP Service and retrieve the report, as XML, as part of the response of the SOAP call. This article serves as a follow-on, providing a tutorial-style walkthrough of how to implement the above procedure in Java.

This article assumes you have already followed the steps in Richard’s blog article and created your report in BI Publisher, exposed it as a SOAP Service and tested this using SOAPUI, or another SOAP testing tool.

Following Richard’s guidance we know that the correct SOAP call could look like this:

<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
   <soap:Header/>
   <soap:Body>
      <pub:runReport>
         <pub:reportRequest>
            <pub:reportAbsolutePath>/~angelo.santagata@oracle.com/Bi report.xdo</pub:reportAbsolutePath>
            <pub:reportRawData xsi:nil="true" >true</pub:reportRawData>
            <pub:sizeOfDataChunkDownload>-1</pub:sizeOfDataChunkDownload>
            <pub:flattenXML>true</pub:flattenXML>
            <pub:byPassCache>true</pub:byPassCache>
         </pub:reportRequest>
         <pub:appParams/>
      </pub:runReport>
   </soap:Body>
</soap:Envelope>

Tip: One easy way to determine the report’s location is to run the report and then examine the URL in the browser.

 

Implementing the SOAP call using JDeveloper 11g

We now need to implement the Java SOAP client to call our SOAP Service. For this blog we will use JDeveloper 11g, the IDE recommended for extending Oracle Fusion; however you are free to use your IDE of choice, e.g. NetBeans, Eclipse, VI, Notepad etc., although the steps will obviously be different.

Creating the project

Within JDeveloper 11g start by creating a new Application and within this application create two generic projects. Call one project “BISOAPServiceProxy” and the other “FusionReportsIntegration”. The “BISOAPServiceProxy” project will contain a SOAP proxy we are going to generate from JDeveloper 11g and the “FusionReportsIntegration” project will contain our custom client code. It is good practice to create separate projects so that the SOAP proxy resides in its own separate project; this allows us to regenerate the proxy from scratch without affecting any other code.

Generating the SOAP Proxy

For this example we will be using the SOAP Proxy wizard as part of JDeveloper. This functionality generates a static proxy for us, which in turn makes it easier to generate the required SOAP call later.

  1. With the BISOAPServiceProxy project selected, start the JDeveloper SOAP Proxy wizard:
    File-> New-> Business Tier-> Web Services-> Web Service Proxy
    Proxy1
  2. Click Next to skip the welcome screen.
  3. In step 2, select JAX-WS Style as the type of SOAP proxy you wish to generate. In step 3, enter the WSDL of your Fusion Applications BI Publisher web service. It’s best to check this URL returns a WSDL document in your web browser before entering it here. The WSDL location will normally be something like : http://<your fusion Applications Server>/xmlpserver/services/ExternalReportWSSService?wsdl
    Proxy2
    It’s recommended that you leave the “copy WSDL into project” check-box selected.
  4. Give a package name; unless you need to, it’s recommended to leave the Root Package for generated types blank.
    proxy3
  5. Now hit Finish.

Fixing the project dependencies

We now need to make sure that the “FusionReportsIntegration” project is able to see the classes generated by the “BISOAPServiceProxy” proxy. To resolve this in JDeveloper we simply need to set up a dependency between the two projects.

  1. With the FusionReportsIntegration project selected, right-mouse-click on the project and select “Project Properties”.
  2. In the properties panel select Dependencies.
  3. Select the little pencil icon and in the resulting dialog select “Build Output”. This selection tells JDeveloper that this project depends on the successful build output of the other project.
  4. Save the dialog.
    dependancies1
  5. Close [OK] the Project Properties dialog.
  6. Now is a good time to hit compile and make sure the SOAP proxy compiles without any errors; given we haven’t written any code yet it should compile just fine.

Writing the code to execute the SOAP call

With the SOAP proxy generated and the project dependency set up, we’re now ready to write the code which will call the BI Server using the generated SOAP proxy.

  1. With the FusionReportsIntegration project selected, right-mouse-click -> New -> Java -> Java Class
    javacode
  2. Enter a name, and a Java package name, for your class.
  3. Ensure that “Main Method” is selected. This is so we can execute the code from the command line; you will want to change this depending on where you execute your code from, e.g. a library, a servlet etc.
  4. Within the main method you will need to enter the following code snippet. Once this code snippet is pasted you will need to correct and resolve the imports for your project.
  5. 1.	ExternalReportWSSService_Service externalReportWSSService_Service;
    2.	// Initialise the SOAP Proxy generated by JDeveloper based on the following WSDL xmlpserver/services/ExternalReportWSSService?wsdl
    3.	externalReportWSSService_Service = new ExternalReportWSSService_Service();
    4.	// Set security Policies to reflect your fusion applications
    5.	SecurityPoliciesFeature securityFeatures = new SecurityPoliciesFeature(new String[]
    6.	{ "oracle/wss_username_token_over_ssl_client_policy" });
    7.	// Initialise the SOAP Endpoint
    8.	ExternalReportWSSService externalReportWSSService = externalReportWSSService_Service.getExternalReportWSSService(securityFeatures);
    9.	// Create a new binding, this example hardcodes the username/password, 
    10.	// the recommended approach is to store the username/password in a CSF keystore
    11.	WSBindingProvider wsbp = (WSBindingProvider)externalReportWSSService;
    12.	Map<String, Object> requestContext = wsbp.getRequestContext();
    13.	//Map to appropriate Fusion user ID, no need to provide password with SAML authentication
    14.	requestContext.put(WSBindingProvider.USERNAME_PROPERTY, "username");
    15.	requestContext.put(WSBindingProvider.PASSWORD_PROPERTY, "password");
    16.	requestContext.put(WSBindingProvider.ENDPOINT_ADDRESS_PROPERTY, "https://yourERPServer:443/xmlpserver/services/ExternalReportWSSService");
    
    17.	// Create a new ReportRequest object using the generated ObjectFactory
    18.	ObjectFactory of = new ObjectFactory();
    19.	ReportRequest reportRequest = of.createReportRequest();
    20.	// reportAbsolutePath contains the path+name of your report
    21.	reportRequest.setReportAbsolutePath("/~angelo.santagata@oracle.com/Bi report.xdo");
    22.	// We want raw data
    23.	reportRequest.setReportRawData("");
    24.	// Get all the data
    25.	reportRequest.setSizeOfDataChunkDownload(-1); 
    26.	// Flatten the XML response
    27.	reportRequest.setFlattenXML(true);
    28.	// ByPass the cache to ensure we get the latest data
    29.	reportRequest.setByPassCache(true);
    30.	// Run the report
    31.	ReportResponse reportResponse = externalReportWSSService.runReport(reportRequest, "");
    32.	// Display the output, note the response is an array of bytes, you can convert this to a String
    33.	// or you can use a DocumentBuilder to put the values into a XLM Document object for further processing
    34.	System.out.println("Content Type="+reportResponse.getReportContentType());
    35.	System.out.println("Data ");
    36.	System.out.println("-------------------------------");
    37.	String data=new String (reportResponse.getReportBytes());
    38.	System.out.println(data);
    39.	System.out.println("-------------------------------");
  6. Going through the code

  7.  
    Line What does it do
    1-3 This is the instantiation of a new class containing the WebService Proxy object. This was generated for us earlier
    5 Initialise a new instance of a security policy object, with the correct security policy, for your Oracle Fusion server. The most common security policy is “oracle/wss_username_token_over_ssl_client_policy”, however your server may be set up differently
    8 Calls the factory method to initialise a SOAP endpoint with the correct security features set
    9-16 These lines set up the SOAP binding so that it knows which endpoint to execute (i.e. the hostname+URI of your web service, which is not necessarily the endpoint where the SOAP proxy was generated), the username and the password. In this example we are hard coding the details because we are going to be running this example on the command line. If this code is to be executed on a JEE server, e.g. WebLogic, then we recommend this data is stored in the credential store as CSF keys.
    17-19 Here we create a reportRequest object and populate it with the appropriate parameters for the SOAP call. Although not mandatory, it’s recommended that you use the ObjectFactory generated by the SOAP proxy wizard in JDeveloper.
    21 This sets the report path parameter, including the path to the report
    23 This line ensures we get the raw data without decoration, layouts etc.
    25 By default BI Publisher publishes data on a range basis, e.g. 50 rows at a time; for this use case we want all the rows, and setting this to -1 will ensure this
    27 Tells the webservice to flatten out the XML which is produced
    29 This is an optional flag which instructs the BI Server to bypass the cache and go direct to the database
    30 This line executes the SOAP call, passing the “reportRequest” object we previously populated as a parameter. The return value is a reportResponse object
    34-39 These lines print out the results from the BI Server. Of notable interest is the XML document is returned as a byte array. In this sample we simply print out the results to the output, however you would normally pass the resulting XML into Java routines to generate a XML Document.

 

 

Because we are running this code from the command line as a Java client, we need to import the Fusion Applications certificate into the Java key store. If you run the code from within JDeveloper then the Java key store used is <JDeveloperHome>\wlserver_10.3\server\lib\DemoTrust.jks

Importing certificates

 

  1. Download the Fusion Applications SSL certificate: using a browser such as Internet Explorer, navigate to the SOAP WSDL URL.
  2. Mouse-click on the security icon, which will bring you to the certificate details.
  3. View the certificate.
  4. Export the certificate as a CER file.
  5. From the command line we now need to import the certificate into our DemoTrust.jks file using the following command:
    keytool -import -alias fusionKey -file fusioncert.cer -keystore DemoIdentity.jks

jks

Now ready to run the code!

With the runReport.java file selected press the “Run” button, if all goes well then the code will execute and you should see the XML result of the BI Report displayed on the console.

 


Oracle Data Integrator (ODI) for HCM-Cloud: a Knowledge Module to Generate HCM Import Files


Introduction

For batch imports, Oracle Cloud’s Human Capital Management (HCM) uses a dedicated file format that contains both metadata and data. As far as the data is concerned, the complete hierarchy of parent and children records must be respected for the file content to be valid.

To load data into HCM with ODI, we are looking here into a new Integration Knowledge Module (KM). This KM allows us to leverage ODI to prepare the data and generate the import file. Then traditional Web Services connections can be leveraged to load the file into HCM.

Description of the Import File Format

HCM uses a structured file format that follows a very specific syntax so that complex objects can be loaded. The complete details of the syntax for the import file are beyond the scope of this article; we only provide an overview of the process here. For more specific instructions, please refer to Oracle Human Capital Management Cloud: Integrating with Oracle HCM Cloud.

The loader for HCM (HCL) uses the following syntax:

  • Comments are used to make the file easier to read by humans. All comment lines must start with the keyword COMMENT
  • Because the loader can be used to load all sorts of business objects, the file must describe the metadata of the objects being loaded. This includes the object names along with their attributes. Metadata information must be prefixed by the keyword METADATA.
  • The data for the business objects can be inserted or merged. The recommended approach is to merge the incoming data: in this case data to be loaded is prefixed with the keyword MERGE, immediately followed by the name of the object to be loaded and the values for the different attributes.

The order in which the different elements are listed in the file is very important:

  • Metadata for an object must always be described before data is provided for that object;
  • Parent objects must always be described before their dependent records.

In the file example below we are using the Contact business object because it is relatively simple and makes for easier descriptions of the process. The Contact business object is made of multiple components: Contact, ContactName, ContactAddress, etc. Notice that in the example the Contact components are listed before the ContactName components, and that data entries are always placed after their respective metadata.

COMMENT ##############################################################
COMMENT HDL Sample files.
COMMENT ##############################################################
COMMENT Business Entity : Contact
COMMENT ##############################################################
METADATA|Contact|SourceSystemOwner|SourceSystemId|EffectiveStartDate|EffectiveEndDate|PersonNumber|StartDate
MERGE|Contact|ST1|ST1_PCT100|2015/09/01|4712/12/31|STUDENT1_CONTACT100|2015/09/01
MERGE|Contact|ST1|ST1_PCT101|2015/09/01|4712/12/31|ST1_CT101|2015/09/01
COMMENT ##############################################################
COMMENT Business Entity : ContactName
COMMENT ##############################################################
METADATA|ContactName|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|EffectiveStartDate|EffectiveEndDate|LegislationCode|NameType|FirstName|MiddleNames|LastName|Title
MERGE|ContactName|ST1|ST1_CNTNM100|ST1_PCT100|2015/09/01|4712/12/31|US|GLOBAL|Emergency||Contact|MR.
MERGE|ContactName|STUDENT1|ST1_CNTNM101|ST1_PCT101|2015/09/01|4712/12/31|US|GLOBAL|John||Doe|MR.

Figure 1: Sample import file for HCM

The name of the file is imposed by HCM (the file must have the name of the parent object that is loaded). Make sure to check the HCM documentation for the limits in size and number of records for the file that you are creating. We will also have to zip the file before uploading it to the Cloud.

Designing the Knowledge Module

Now that we know what needs to be generated, we can work on creating a new Knowledge Module to automate this operation for us. If you need more background on KMs, the ODI documentation has a great description available here.

With the new KM, we want to respect all the constraints imposed by the loader for the file format. We also want to simplify the creation of the file as much as possible.

Our reasoning was that if ODI is used to prepare the file, the environment would most likely be such that:

  • Data has to be aggregated, augmented from external sources or somehow processed before generating the file;
  • Some of the data is coming from a database, or a database is generally available.

We designed our solution by creating database tables that match the components of the business object that can be found in the file. This gives us the ability to enforce referential integrity: once primary keys and foreign keys are in place in the database, parent records are guaranteed to be available in the tables when we want to write a child record to the file. Our model is the following for the Contact business object:

Data Model for HCM load

Figure 2: Data structure created in the database to temporarily store and organize data for the import file

We are respecting the exact syntax (case sensitive) for the table names and columns. This is important because we will use these metadata to generate the import file.

The metadata need to use the proper case in ODI – depending on your ODI configuration, this may result in mixed case or all uppercase table names in your database. Either case works for the KM.

At this point, all we need is for our KM to write to the file when data is written to the tables. If the target file does not exist, the KM creates it with the proper header. If it does exist, the KM appends the metadata and data for the current table to the end of the file. Because of the referential integrity constraints in the database, we have to load the parent tables first… this will guarantee that the records are added to the file in the appropriate order. All we have to do is to use this KM for all the target tables of our model, and to load the tables in the appropriate order.

For an easy implementation, we took the IKM Oracle Insert and modified it as follows:

  • We added two options: one to specify the path where the HCM import file must be generated, the other for the name of the file to generate;
  • We created a new task to write the content of the table to the file, once data has been committed to the table. This task is written in Groovy and shown below in figure 3:

import groovy.sql.Sql
File file = new File('<%=odiRef.getOption("HCM_IMPORT_FILE_FOLDER")%>/<%=odiRef.getOption("HCM_IMPORT_FILE_NAME")%>')
if (!file.exists()){
file.withWriterAppend{w->
w<<"""COMMENT ##################################################################

COMMENT File generated by ODI
COMMENT Based on HDL Desktop Integrator- Sample files.
"""
  }
}
file.withWriterAppend{w->
  w<<"""
COMMENT ##########################################################################
COMMENT Business Entity : <%=odiRef.getTargetTable("TABLE_NAME")%>
COMMENT ###########################################################################
"""
  }
file.withWriterAppend{w->
w<<"""METADATA|<%=odiRef.getTargetTable("TABLE_NAME")%>|<%=odiRef.getTargetColList("", "[COL_NAME]", "|", "")%>
""".replace('"', '')
  }
// Connect to the target database
def db = [url:'<%=odiRef.getInfo("DEST_JAVA_URL")%>', user:'<%=odiRef.getInfo("DEST_USER_NAME")%>', password:'<%=odiRef.getInfo("DEST_PASS")%>', driver:'<%=odiRef.getInfo("DEST_JAVA_DRIVER")%>']
def sql = Sql.newInstance(db.url, db.user, db.password, db.driver)
// Retrieve data from the target table and write the data to the file
sql.eachRow('select * from  <%=odiRef.getTable("L","TARG_NAME","D")%>') { row ->
     file.withWriterAppend{w->
w<<"""MERGE|<%=odiRef.getTargetTable("TABLE_NAME")%>|<%=odiRef.getColList("","${row.[COL_NAME]}", "|", "", "")%>
""".replace('null','')
  }
 }
sql.close()

Figure 3: Groovy code used in the KM to create the import file

If you are interested in this implementation, the KM is available here for download.

Now all we have to do is to use the KM in our mappings for all target tables.

HCM KM in Use

Figure 4: The KM used in a mapping

We can take advantage of the existing options in the KM to either create the target tables if they do not exist or truncate them if they already exist. This guarantees that we only add new data to the import file.

Testing the Knowledge Module

To validate that the KM is creating the file as expected, we have created a number of mappings that load the 6 tables of our data model. Because one of our source files contains data for more than just one target table, we create a single mapping to load the first three tables. In this mapping, we specify the order in which ODI must process these loads as shown in figure 5 below:

Ensure data load order

Figure 5: Ensuring load order for the target tables… and for the file construction.

The remaining table loads can be designed either as individual mappings or consolidated in a single mapping if the transformations are really basic.

We can then combine these mappings in a package that waits for incoming data (incoming files or changes propagated by GoldenGate). The Mappings process the data and create the import file. Once the file is created, we can zip it to make it ready for upload and import with web services, a subject that will be discussed in our next blog post. The complete package looks like this:

HCM Load package

Figure 6: Package to detect arriving data, process them with the new KM and generate an import file for HCM, compress the file and invoke the necessary web services to upload and import the file.

With this simple package, you can start bulk loading business objects into HCM-Cloud with ODI.
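Since the generated file must be compressed before it is uploaded, a small Java sketch of that step is shown below. The file name Contact.dat and the paths are assumptions for illustration, and within the package above this step could equally be implemented with an ODI tool or a Groovy task.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipHcmImportFile {
    public static void main(String[] args) throws Exception {
        String importFile = "/tmp/hcm/Contact.dat";   // assumed name/location of the generated import file
        String zipFile = "/tmp/hcm/Contact.zip";      // archive that will be uploaded to HCM

        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(zipFile));
             FileInputStream in = new FileInputStream(importFile)) {
            // The entry inside the archive keeps the business-object file name
            zos.putNextEntry(new ZipEntry("Contact.dat"));
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) > 0) {
                zos.write(buffer, 0, read);
            }
            zos.closeEntry();
        }
        System.out.println("Created " + zipFile);
    }
}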

The web service to import data into HCM requires the use of OWSM security policies. To configure OWSM with ODI, please see Connecting Oracle Data Integrator (ODI) to the Cloud: Web Services and Security Policies

Conclusion

With relatively simple modifications to an out-of-the-box ODI Knowledge Module, the most advanced features of ODI can now be leveraged to generate an import file for HCM and to automate the load of batch data into the cloud.

For more Oracle Data Integrator best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-team Chronicles for Oracle Data Integrator.

Acknowledgements

Special thanks to Jack Desai and Richard Williams for their help and support with HCM and its load process.


HCM Atom Feed Subscriber using SOA Cloud Service


Introduction

HCM Atom feeds provide notifications of Oracle Fusion Human Capital Management (HCM) events and are tightly integrated with REST services. When an event occurs in Oracle Fusion HCM, the corresponding Atom feed is delivered automatically to the Atom server. The feed contains details of the REST resource on which the event occurred. Subscribers who consume these Atom feeds use the REST resources to retrieve additional information about the resource.

For more information on Atom, please refer to this.

This post focuses on consuming and processing HCM Atom feeds using Oracle Service Oriented Architecture (SOA) Cloud Service. Oracle SOA Cloud Service provides a PaaS computing platform solution for running Oracle SOA Suite, Oracle Service Bus, and Oracle API Manager in the cloud. For more information on SOA Cloud Service, please refer to this.

Oracle SOA is the industry’s most complete and unified application integration and SOA solution. It transforms complex application integration into agile and re-usable service-based connectivity to speed time to market, respond faster to business requirements, and lower costs. SOA facilitates the development of enterprise applications as modular business web services that can be easily integrated and reused, creating a truly flexible, adaptable IT infrastructure.

For more information on getting started with Oracle SOA, please refer to this. For developing SOA applications using SOA Suite, please refer to this.

 

Main Article

Atom feeds enable you to keep track of any changes made to feed-enabled resources in Oracle HCM Cloud. For any updates that may be of interest for downstream applications, such as new hire, terminations, employee transfers and promotions, Oracle HCM Cloud publishes Atom feeds. Your application will be able to read these feeds and take appropriate action.

Atom Publishing Protocol (AtomPub) allows software applications to subscribe to changes that occur on REST resources through published feeds. Updates are published when changes occur to feed-enabled resources in Oracle HCM Cloud. The following are the primary Atom feeds:

Employee Feeds

New hire
Termination
Employee update

Assignment creation, update, and end date

Work Structures Feeds (Creation, update, and end date)

Organizations
Jobs
Positions
Grades
Locations

The above feeds can be consumed programmatically. In this post, SOA Cloud Service is used to consume the “Employee New Hire” feed, but the design and development are similar for all the supported objects in HCM.

 

HCM Atom Introduction

For Atom “security, roles and privileges”, please refer to my blog HCM Atom Feed Subscriber using Node.js.

 

Atom Feed Response Template

 

AtomFeedSample_1

SOA Cloud Service Implementation

Refer to my blog on how to invoke secured REST services using SOA. The following diagram shows the patterns to subscribe to HCM Atom feeds and process them for downstream applications that may have either web service or file based interfaces. Optionally, all entries from the feeds could be staged either in a database or in the messaging cloud before processing, for situations such as the downstream application being unavailable or throwing system errors. This provides the ability to consume the feeds, but hold the processing until downstream applications are available. Enterprise Scheduler Service (ESS), a component of SOA Suite, is leveraged to invoke the subscriber composite periodically.

 

soacs_atom_pattern

The following diagram shows the implementation of the above pattern for Employee New Hire:

soacs_atom_composite

 

Feed Invocation from SOA

Although the HCM Cloud feed is in an XML representation, the media type of the payload response is “application/atom+xml”. This media type is not supported by the REST Adapter at this time, so use the following Java embedded activity in your BPEL component:

Once the built-in REST Adapter supports the Atom media type, the Java embedded activity can be replaced, further simplifying the solution.

try {

String url = "https://mycompany.oraclecloud.com";
String lastEntryTS = (String)getVariableData("LastEntryTS");
String uri = "/hcmCoreApi/atomservlet/employee/newhire";

//Generate URI based on last entry timestamp from previous invocation
if (!(lastEntryTS.isEmpty())) {
uri = uri + "?updated-min=" + lastEntryTS;
}

java.net.URL obj = new URL(null,url+uri, new sun.net.www.protocol.https.Handler());

javax.net.ssl.HttpsURLConnection conn = (HttpsURLConnection) obj.openConnection();
conn.setRequestProperty("Content-Type", "application/vnd.oracle.adf.resource+json");
conn.setDoOutput(true);
conn.setRequestMethod("GET");

String userpass = "username" + ":" + "password";
String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes("UTF-8"));
conn.setRequestProperty ("Authorization", basicAuth);

String response="";
int responseCode=conn.getResponseCode();
System.out.println("Response Code is: " + responseCode);

if (responseCode == HttpsURLConnection.HTTP_OK) {

BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));

String line;
String contents = "";

while ((line = reader.readLine()) != null) {
contents += line;
}

setVariableData("outputVariable", "payload", "/client:processResponse/client:result", contents);

reader.close();

}

} catch (Exception e) {
e.printStackTrace();
}

 

The following are things to consider when consuming feeds:

Initial Consumption

When you subscribe for the first time, you can invoke the resource with the query parameters to get all the published feeds, or use the updated-min or updated-max arguments to filter the entries in a feed to begin with.

For example the invocation path could be /hcmCoreApi/Atomservlet/employee/newhire or /hcmCoreApi/Atomservlet/employee/newhire?updated-min=<some-timestamp>

After the first consumption, the “updated” element of the first entry must be persisted to use it in the next call and avoid duplication. In this prototype, the “/entry/updated” timestamp value is persisted in a database cloud (DBaaS).

This is the sample database table

create table atomsub (
id number,
feed_ts varchar2(100) );

For initial consumption, keep the table empty or add a row with the value of feed_ts to consume initial feeds. For example, the feed_ts value could be “2015-09-16T09:16:00.000Z” to get all the feeds after this timestamp.

In the SOA composite, you will update the above table to persist the “/entry/updated” timestamp in the feed_ts column of the “atomsub” table.
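Inside the composite this update is normally done with a database adapter; purely to illustrate the persistence logic (for example when testing it outside SOA), here is a minimal JDBC sketch. It assumes the Oracle JDBC driver is on the classpath and uses placeholder connection details and timestamp values.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PersistFeedTimestamp {
    public static void main(String[] args) throws Exception {
        // Placeholder DBaaS connection details
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/PDB1", "atom_user", "password")) {

            String latestUpdated = "2015-09-16T09:16:00.000Z"; // value taken from /entry/updated of the first entry

            // Keep a single row in ATOMSUB holding the latest timestamp
            try (PreparedStatement upd = conn.prepareStatement(
                    "UPDATE atomsub SET feed_ts = ? WHERE id = 1")) {
                upd.setString(1, latestUpdated);
                if (upd.executeUpdate() == 0) {
                    // No row yet: insert the initial one
                    try (PreparedStatement ins = conn.prepareStatement(
                            "INSERT INTO atomsub (id, feed_ts) VALUES (1, ?)")) {
                        ins.setString(1, latestUpdated);
                        ins.executeUpdate();
                    }
                }
            }
        }
    }
}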

 

Next Call

In the next call, read the updated timestamp value from the database and generate the URI path as follows:

String uri = "/hcmCoreApi/atomservlet/employee/newhire";
String lastEntryTS = (String)getVariableData("LastEntryTS");
if (!(lastEntryTS.isEmpty())) {
uri = uri + "?updated-min=" + lastEntryTS;
}

The above step is done in a Java embedded activity, but it could be done in SOA using <assign> expressions.

Parsing Atom Feed Response

The Atom feed response is in XML format as shown previously in the diagram. In this prototype, the feed response is stored in an output variable as a string. The following expression in an <assign> activity will convert it to XML:

oraext:parseXML($outputVariable.payload/client:result)


Parsing Each Atom Entry for Downstream Processing

Each entry has two major elements as mentioned in Atom response payload structure.

Resource Link

This contains the REST employee resource link to get Employee object. This is a typical REST invocation from SOA using REST Adapter. For more information on invoking REST services from SOA, please refer my blog.

 

Content Type

This contains selected resource data in JSON format. For example: {"Context": [{"EmployeeNumber": "212", "PersonId": "300000006013981", "EffectiveStartDate": "2015-10-08", "EffectiveDate": "2015-10-08", "WorkEmail": "phil.davey@mycompany.com", "EmployeeName": "Davey, Phillip"}]}

In order to use the above data, it must be converted to XML. The BPEL component provides a Translate activity to transform JSON to XML. Please refer to the SOA Development document, section B1.8 – doTranslateFromNative.

 

The <Translate> activity syntax to convert the above JSON string from <content> is as follows:

<assign name="TranslateJSON">
<bpelx:annotation>
<bpelx:pattern>translate</bpelx:pattern>
</bpelx:annotation>
<copy>
 <from>ora:doTranslateFromNative(string($FeedVariable.payload/ns1:entry/ns1:content), 'Schemas/JsonToXml.xsd', 'Root-Element', 'DOM')</from>
 <to>$JsonToXml_OutputVar_1</to>
 </copy>
</assign>

This is the output:

jsonToXmlOutput

The following provides detailed steps on how to use Native Format Builder in JDeveloper:

In native format builder, select JSON format and use above <content> as a sample to generate a schema. Please see the following diagrams:

JSON_nxsd_1JSON_nxsd_2JSON_nxsd_3

JSON_nxsd_5

 

One and Only One Entry

Each entry in an Atom feed has a unique ID. For example: <id>Atomservlet:newhire:EMP300000005960615</id>

In target applications, this ID can be used as one of the keys or lookups to prevent reprocessing. The logic can be implemented in your downstream applications or in the integration space to avoid duplication.
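A minimal Java sketch of such a duplicate check is shown below, assuming the processed entry IDs are kept in memory; in practice the lookup would typically be a key in the target application or a row in a staging table.

import java.util.HashSet;
import java.util.Set;

public class AtomEntryDeduplicator {
    // IDs of entries that have already been processed
    private final Set<String> processedIds = new HashSet<>();

    /** Returns true only the first time a given Atom entry id is seen. */
    public boolean shouldProcess(String entryId) {
        return processedIds.add(entryId);
    }

    public static void main(String[] args) {
        AtomEntryDeduplicator dedup = new AtomEntryDeduplicator();
        String id = "Atomservlet:newhire:EMP300000005960615"; // example id from the feed
        System.out.println(dedup.shouldProcess(id)); // true  -> process the entry
        System.out.println(dedup.shouldProcess(id)); // false -> skip, already processed
    }
}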

 

Scheduler and Downstream Processing

Oracle Enterprise Scheduler Service (ESS) is configured to invoke the above composite periodically. At present, SOA Cloud Service is not provisioned with ESS, but refer to this to extend your domain. Once the feed response message is parsed, you can process it for downstream applications based on your requirements or use cases. For guaranteed transactions, each feed entry can be published to the Messaging Cloud or an Oracle Database to stage all the feeds. This provides global transactions and recovery when downstream applications are not available or throw errors.

The following diagram shows how to create a job definition for a SOA composite. For more information on ESS, please refer to this.

ess_3

SOA Cloud Service Instance Flows

First invocation without updated-min argument to get all the feeds

 

soacs_atom_instance_json

Atom Feed Response from above instance

AtomFeedResponse_1

 

Next invocation with updated-min argument based on last entry timestamp

soacs_atom_instance_noentries

 

Conclusion

This post demonstrates how to consume HCM Atom feeds and process it for downstream applications. It provides details on how to consume new feeds (avoid duplication) since last polled. Finally it provides an enterprise integration pattern from consuming feeds to downstream applications processing.

 

Sample Prototype Code

The sample prototype code is available here.

 

soacs_atom_composite_1

 

 

Creating custom Fusion Applications User Interfaces using Oracle JET


Introduction

JET is Oracle’s new toolkit, specifically written to help developers build client-side applications using JavaScript. Oracle Fusion Applications implementers are often given the requirement to create mobile, or desktop browser, based custom screens for Fusion Applications. There are many options available to the developer, for example Oracle ADF (Java based) and Oracle JET (JavaScript based). This blog article gives the reader a tutorial-style document on how to build a hybrid application using data from Oracle Fusion Sales Cloud. It is worth highlighting that although this tutorial is using Sales Cloud, the technique below is equally applicable to HCM Cloud, or any other Oracle SaaS cloud product which exposes a REST API.

Main Article

Pre-Requisites

It is assumed that you’ve already read the getting started guide on the Oracle JET website and installed all the pre-requisites. In addition, if you are going to create a mobile application then you will also need to install the mobile SDKs from either Apple (Xcode) or Android (Android SDK).

 

You must have an Apple Mac to be able to install the Apple iOS developer kit (Xcode); it is not possible to run Xcode on a Windows PC.

Dealing with SaaS Security

Before building the application itself we need to start executing the REST calls and getting our data, and security is going to be the first hurdle we need to cross. Most Sales Cloud installations allow “Basic Authentication” to their APIs, so in REST this involves creating an HTTP header called “Authorization” with the value “Basic <your username:password>”, with the <username:password> section encoded as Base64. An alternative approach, used when embedding the application within Oracle SaaS, is to use a generated JWT token. This token is generated by Oracle SaaS using either Groovy or Expression Language. When embedding the application in Oracle SaaS you have the option of passing parameters; the JWT token would be one of these parameters and can subsequently be used instead of the <username:password>. When using a JWT token the Authorization string changes slightly so that instead of “Basic” it becomes “Bearer”.

 

Usage Header Name Header Value
Basic Authentication Authorization Basic <your username:password base64 encoded>
JWT Authentication Authorization Bearer <JWT Token>
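As a minimal Java sketch of how these two header values can be produced, the snippet below builds both variants; the credentials and token shown are placeholders.

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AuthHeaderExample {
    public static void main(String[] args) {
        // Basic authentication: base64-encode "username:password"
        String userPass = "john.doe" + ":" + "Welcome1";          // placeholder credentials
        String basicHeader = "Basic "
                + Base64.getEncoder().encodeToString(userPass.getBytes(StandardCharsets.UTF_8));

        // JWT authentication: the token itself is generated by Fusion SaaS (see the scripts below)
        String jwtToken = "<JWT Token>";                           // placeholder token
        String bearerHeader = "Bearer " + jwtToken;

        System.out.println("Authorization: " + basicHeader);
        System.out.println("Authorization: " + bearerHeader);
    }
}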

 

Groovy Script in SalesCloud to generate a JWT Token

def thirdpartyapplicationurl = oracle.topologyManager.client.deployedInfo.DeployedInfoProvider.getEndPoint("My3rdPartyApplication" )
def crmkey= (new oracle.apps.fnd.applcore.common.SecuredTokenBean().getTrustToken())
def url = thirdpartyapplicationurl +"?jwt="+crmkey
return (url)

Expression Language in Fusion SaaS (HCM, Sales, ERP etc) to generate a JWT Token

#{EndPointProvider.externalEndpointByModuleShortName['My3rdPartApplication']}?jwt=#{applCoreSecuredToken.trustToken}

Getting the data out of Fusion Applications using the REST API

When retrieving  data from Sales Cloud we need to make sure we get the right data, not too much and not too little. Oracle Sales Cloud, like many other Oracle SaaS products, now supports the REST API for inbound and outbound data access. Oracle HCM also has a REST API but at the time of writing this article, the API is in controlled availability.

Looking at the documentation hosted at the Oracle Help Center: http://docs.oracle.com/cloud/latest/salescs_gs/FAAPS/

The REST call to get all Sales Cloud Opportunities looks like this :

https://yourCRMServer/salesApi/resources/latest/opportunities

If you executed the above REST call you will notice that the resulting payload is large, some would say huge. There are good reasons for this: namely that the Sales Cloud Opportunity object contains a large number of fields, secondly the result not only contains data but also contains metadata, and finally the request above is a select-all query. The metadata includes links to child collections, links to Lists of Values, what tabs are visible in Sales Cloud, custom objects, flexfields etc. Additionally, the query we just executed is the equivalent of a select * from table, i.e. it brings back everything, so we’ll also need to fix that.

 

Example snippet of a SalesCloud Opportunity REST Response showing custom fields,tabs visible, child collections etc

"Opportunity_NewQuote_14047462719341_Layout6": "https://mybigm.bigmachines.com/sso/saml_request.jsp?RelayState=/commerce/buyside/document.jsp?process=quickstart_commerce_process_bmClone_4%26formaction=create%26_partnerOpportunityId=3000000xxx44105%26_partnerIdentifier=fusion%26_partnerAccountId=100000001941037",
  "Opportunity_NewQuote_14047462719341_Layout6_Layout7": "https://mybigMmachine.bigmachines.com/sso/saml_request.jsp?RelayState=/commerce/buyside/document.jsp?process=quickstart_commerce_process_bmClone_4%26formaction=create%26_partnerOpportunityId=300000060xxxx5%26_partnerIdentifier=fusion%26_partnerAccountId=100000001941037",
  "ExtnFuseOpportunityEditLayout7Expr": "false",
  "ExtnFuseOpportunityEditLayout6Expr": "false",
  "ExtnFuseOpportunityCreateLayout3Expr": "false",
  "Opportunity_NewQuote_14047462719341_Layout8": "https://mybigm-demo.bigmachines.com/sso/saml_request.jsp?RelayState=/commerce/buyside/document.jsp?process=quickstart_commerce_process_bmClone_4%26formaction=create%26_partnerOpportunityId=300000060744105%26_partnerIdentifier=fusion%26_partnerAccountId=100000001941037",
  "ExtnFuseOpportunityEditLayout8Expr": "false",
  "CreateProject_c": null,
  "Opportunity_DocumentsCloud_14399346021091": "https://mydoccloud.documents.us2.oraclecloud.com/documents/embed/link/LF6F00719BA6xxxxxx8FBEFEC24286/folder/FE3D00BBxxxxxxxxxxEC24286/lyt=grid",
  "Opportunity_DocsCloud_14552023624601": "https://mydocscserver.domain.com:7002/SalesCloudDocCloudServlet/doccloud?objectnumber=2169&objecttype=OPPORTUNITY&jwt=eyJhxxxxxy1pqzv2JK0DX-xxxvAn5r9aQixtpxhNBNG9AljMLfOsxlLiCgE5L0bAI",
  "links": [
    {
      "rel": "self",
      "href": "https://mycrmserver-crm.oracledemos.com:443/salesApi/resources/11.1.10/opportunities/2169",
      "name": "opportunities",
      "kind": "item",
      "properties": {
        "changeIndicator": "ACED0005737200136A6176612E7574696C2E41727261794C6973747881D21D99C7619D03000149000473697A65787000000002770400000010737200116A6176612E6C616E672E496E746567657212E2A0A4F781873802000149000576616C7565787200106A6176612E6C616E672E4E756D62657286AC951D0B94E08B020000787200106A6176612E6C616E672E4F626A65637400000000000000000000007870000000017371007E00020000000178"
      }
    },
    {
      "rel": "canonical",
      "href": "https://mycrmserver-crm.oracledemos.com:443/salesApi/resources/11.1.10/opportunities/2169",
      "name": "opportunities",
      "kind": "item"
    },
    {
      "rel": "lov",
      "href": "https://mycrmserver-crm.oracledemos.com:443/salesApi/resources/11.1.10/opportunities/2169/lov/SalesStageLOV",
      "name": "SalesStageLOV",
      "kind": "collection"
    },

Thankfully we can tell the REST API that we:

  • Only want to see the data, achieved by adding onlyData=true parameter
  • Only want to see the following fields OpportunityNumber,Name,CustomerName (TargetPartyName), achieved by adding a fields=<fieldName,fieldname> parameter
  • Only want to see a max of 10 rows, achieved by adding the limit=<value> parameter
  • Only want to see open opportunities, achieved by adding the q= parameter with a query string, in our case StatusCode=OPEN

If we want to get the data in pages/blocks we can use the offset parameter. The offset parameter tells the REST service to get the data “from” this offset. Using offset and limit we can effectively page through the data returned by Oracle Fusion Applications REST Service.

Our final REST request URL would look like :

https://myCRMServeroracledemos.com/salesApi/resources/latest/opportunities?onlyData=true&fields=OptyNumber,Name,Revenue,TargetPartyName,StatusCode&q=StatusCode=OPEN&offset=0&limit=10
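To make the offset/limit paging concrete, here is a rough Java sketch that requests the opportunities resource page by page using the URL pattern above. The host name and credentials are placeholders, and a real client would parse the JSON response and stop once a page returns fewer than limit items.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class OpportunityPager {
    public static void main(String[] args) throws Exception {
        String host = "https://yourCRMServer.domain.com";          // placeholder host
        String auth = "Basic " + Base64.getEncoder()
                .encodeToString("username:password".getBytes(StandardCharsets.UTF_8));
        int limit = 10;

        // Fetch a few pages; a real client would stop when a page holds fewer than 'limit' items
        for (int offset = 0; offset < 30; offset += limit) {
            String query = "/salesApi/resources/latest/opportunities?onlyData=true"
                    + "&fields=OptyNumber,Name,Revenue,TargetPartyName,StatusCode"
                    + "&q=StatusCode=OPEN&limit=" + limit + "&offset=" + offset;

            HttpURLConnection conn = (HttpURLConnection) new URL(host + query).openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("Authorization", auth);

            StringBuilder body = new StringBuilder();
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    body.append(line);
                }
            }
            System.out.println("Page at offset " + offset + ": " + body);
        }
    }
}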

The Oracle Fusion Applications REST API is documented in the relevant Oracle Fusion Applications Documentation, e.g. for Sales Cloud, http://docs.oracle.com/cloud/latest/salescs_gs/FAAPS/ but it is also worth noting that the Oracle Fusion Applications REST Services are simply an implementation of the Oracle ADF Business Components REST Services, these are very well documented here  https://docs.oracle.com/middleware/1221/adf/develop/GUID-8F85F6FA-1A13-4111-BBDB-1195445CB630.htm#ADFFD53992

Our final tuned JSON result from the REST service will look something like this (truncated) :

{
  "items": [
    {
      "Name": "Custom Sentinel Power Server @ Eagle",
      "OptyNumber": "147790",
      "StatusCode": "OPEN",
      "TargetPartyName": "Eagle Software Inc",
      "Revenue": 104000
    },
    {
      "Name": "Ultra Servers @ Beutelschies & Company",
      "OptyNumber": "150790",
      "StatusCode": "OPEN",
      "TargetPartyName": "Beutelschies & Company",
      "Revenue": 175000
    },
    {
      "Name": "Diablo Technologies 1012",
      "OptyNumber": "176800",
      "StatusCode": "OPEN",
      "TargetPartyName": "Diablo Technologies",
      "Revenue": 23650
    }
}

Creating the Hybrid Application

Now we have our data source defined we can start to build the application. We want this application to be available on a mobile device and therefore we will create a “Mobile Hybrid” application using Oracle JET, using the navDrawer template.

yo oraclejet:hybrid OSCOptyList --template=navDrawer --platforms=android

Once the yeoman script has built your application, you can test the (basic) application using the following two commands.

grunt build --platform=android 
grunt serve --platform=android --web=true

The second grunt serve command has a web=true parameter at the end; this is telling the script that we’re going to be testing this in our browser and not on the device itself. When this is run you should see a basic shell (empty) application in your browser window.


Building Our JavaScript UI

Now that we have our data source defined we can get on to the task of building the JET user interface. Previously you executed the yo oraclejet:hybrid command; this created a hybrid application for you using a template. Opening the resulting project in an IDE, like NetBeans, we can see that the project template has created a collection of files and that one of them is “dashboard.html” (marked 1 in the image). Edit this file using your editor.

dashboard.html

 

Within the file delete everything and replace it with this snippet of html code

<div class="oj-hybrid-padding">
    <div class="oj-flex">
        <div class="oj-flex-item">
            <button id= "prevButton" 
                    data-bind="click: previousPage, 
                       ojComponent: { component: 'ojButton', label: 'Previous' }">
            </button>
            <button id= "nextButton"
                    data-bind="click: nextPage, 
                       ojComponent: { component: 'ojButton', label: 'Next' }">
            </button>
        </div>
    </div>
    <div class="oj-flex-item">    
        <div class="oj-panel oj-panel-alt1 oj-margin">
            <table id="table" summary="Opportunity List" aria-label="Opportunity List"
                   data-bind="ojComponent: {component: 'ojTable', 
                                data: opportunityDataSource, 
                                columnsDefault: {sortable: 'none'}, 
                                columns: [{headerText: 'Opty Number', 
                                           field: 'OptyNumber'},
                                          {headerText: 'Name', 
                                           field: 'Name'},
                                          {headerText: 'Revenue', 
                                           field: 'Revenue'},
                                          {headerText: 'Customer Name', 
                                           field: 'TargetPartyName'},
                                          {headerText: 'Status Code', 
                                           field: 'StatusCode'}
           ]}">
            </table>
        </div>    
    </div>
</div>

The above piece of html adds a JET table to the page; for prettiness we’ve wrapped the table in a decorative panel and added next and previous buttons. The table definition tells Oracle JET that the data is coming from a JavaScript object called “opportunityDataSource“; it also defines the columns, the column header text and that the columns are not sortable. The button definitions reference two functions in our JavaScript (to follow) which will paginate the data.

Building The logic

We can now move onto the JavaScript side of things, that is the part where we get the data from Sales Cloud and make it available to the table object in the html file. For this simplistic example we’ll get the data direct from Sales Cloud and display it in the table, with no caching and nothing fancy like collection models for pagination.

Edit the dashboard.js file, this is marked as 2 in the above image. This file is a RequireJS AMD (Asynchronous Module Definition) file and is pre-populated to support the dashboard.html page.

Within this file, cut-n-paste the following JavaScript snippet.

define(['ojs/ojcore', 'knockout', 'jquery', 'ojs/ojtable', 'ojs/ojbutton'],
        function (oj, ko, $) {
            function DashboardViewModel() {
                var self = this;
                var offset = 0;
                var limit = 10;
                var pageSize = 10;
                var nextButtonActive = ko.observable(true);
                var prevButtonActive = ko.observable(true);
                //
                self.optyList = ko.observableArray([{Name: "Fetching data"}]);
                console.log('Data=' + self.optyList);
                self.opportunityDataSource = new oj.ArrayTableDataSource(self.optyList, {idAttribute: 'Name'});
                self.refresh = function () {
                    console.log("fetching data");
                    var hostname = "https://yourCRMServer.domain.com";
                    var queryString = "/salesApi/resources/latest/opportunities?onlyData=true&fields=OptyNumber,Name,Revenue,TargetPartyName,StatusCode&q=StatusCode=OPEN&limit=10&offset=" + offset;
                    console.log(queryString);
                    $.ajax(hostname + queryString,
                            {
                                method: "GET",
                                dataType: "json",
                                headers: {"Authorization": "Basic " + btoa("username:password")},
                                // Alternative Headers if using JWT Token
                                // headers : {"Authorization" : "Bearer "+ jwttoken; 
                                success: function (data)
                                {
                                    self.optyList(data.items);
                                    console.log('Data returned ' + JSON.stringify(data.items));
                                    console.log("Rows Returned"+self.optyList().length);
                                    // Enable / Disable the next/prev button based on results of query
                                    if (self.optyList().length < limit)
                                    {
                                        $('#nextButton').attr("disabled", true);
                                    } else
                                    {
                                        $('#nextButton').attr("disabled", false);
                                    }
                                    if (self.offset === 0)
                                        $('#prevButton').attr("disabled", true);
                                },
                                error: function (jqXHR, textStatus, errorThrown)
                                {
                                    console.log(textStatus, errorThrown);
                                }
                            }
                    );
                };
                // Handlers for buttons
                self.nextPage = function ()
                {

                    offset = offset + pageSize;
                    console.log("off set=" + offset);
                    self.refresh();
                };
                self.previousPage = function ()
                {
                    offset = offset - pageSize;
                    if (offset < 0)
                        offset = 0;
                    self.refresh();
                };
                // Initial Refresh
                self.refresh();
            }
            
            return new DashboardViewModel;
        }
);

Let’s examine the code:

Line 1: Here we’ve modified the standard define so that it includes an ojs/ojtable reference. This tells RequireJS, which the JET toolkit uses, that this piece of JavaScript uses a JET Table object.
Lines 8 & 9: These lines maintain variables that indicate whether the buttons should be enabled or not.
Line 11: Here we create a variable called optyList; importantly, it is created as a Knockout observableArray.
Line 13: Here we create another variable called “opportunityDataSource“, which is the variable the HTML page will reference. The main difference here is that this variable is of type oj.ArrayTableDataSource, wrapping optyList, with its idAttribute (the row key) set to Name.
Lines 14-47: Here we define a function called “refresh”. When this JavaScript function is called we execute a REST call back to Sales Cloud using jQuery’s ajax function. This call retrieves the data and then populates the optyList Knockout data source with the results. Note that we don’t assign the response to optyList directly but purposely pass its child array called “items”. If you execute the REST call we discussed previously, you’ll see that the data is actually returned in an array called items.
Line 23: This line defines the headers; specifically, we define a header called “Authorization”, with the username and password formatted as “username:password” and then base64 encoded.
Lines 24-25: These lines define an alternative header which would be appropriate if a JWT token were being used. That token would be passed in as a parameter rather than being hard-coded.
Lines 31-40: These inspect the results of the query and determine whether the next and previous buttons should be enabled, using jQuery to toggle the disabled attribute.
Lines 50-63: These handle the next/previous button events.
Finally, on line 65, we execute the refresh() method when the module is initialized.

Running the example on your mobile

To run the example on your mobile device, execute the following commands:

grunt build --platform=android 
grunt serve --platform=android

or if you want to test on a device

grunt serve --platform=android --destination=[device or emulator name]

If all is well you should see a table of data populated from Oracle Sales Cloud

 

For more information on building JavaScript applications with the Oracle JET toolkit, make sure to check out our other blog articles on JET here, the Oracle JET website here and the excellent Oracle JET YouTube channel here

Running the example on the browser and CORS

If you try to run the example in your browser you’ll find it probably won’t work. If you look at the browser console (Ctrl+Shift+I on most browsers) you’ll probably see an error like “XMLHttpRequest cannot load…”

cors

This is because the code has violated cross-origin scripting rules. In a nutshell, a JavaScript application cannot access a resource that was not served up by the same server the application itself was served from. In my case the application was served up by NetBeans on http://localhost:8090, whereas the REST service from Sales Cloud is on a different server. Thankfully there is a solution called “CORS”. CORS stands for Cross-Origin Resource Sharing and is a standard for solving this problem; for more information on CORS see the Wikipedia article, or other articles on the internet.

Configuring CORS in Fusion Applications

For our application to work in a web browser we need to enable CORS in Fusion Applications, which we do with the following steps:

  1. Log into Fusion Applications (Sales Cloud, HCM etc.) using a user who has access to “Setup and Maintenance”
  2. Access the Setup and Maintenance screens
  3. Search for Manage Administrator Profile Values and then navigate to that task
  4. Search for the “Allowed Domains” profile name (case sensitive!)
  5. Within this profile name you will see a profile option called “site“; this profile option has a profile value
  6. Within the profile value add the hostname, and port number, of the application hosting your JavaScript application. If you want to allow “ALL” domains set this value to “*” (a single asterisk). WARNING: Ensure you understand the security implications of allowing ALL domains using the asterisk notation!
  7. Save and Close and then retry running your JET Application in your browser.
setupandMaiteanceCORS

CORS Settings in Setup and Maintenance (Click to enlarge)

If all is well, when you run the application in your browser or on your mobile device you’ll now see it running correctly.

JETApplication

Running JET Application (Click to enlarge)

 

Final Note on Security

To keep this example simple, the username/password was hard-coded in the mobile application, which is not suitable for a real-world application. For a real application you would create a configuration screen, or use system preferences, to collect and store the username, password and Sales Cloud server URL, which would then be used by the application.

If the JET Application is to be embedded inside a Fusion Applications Page then you will want to use JWT Token authentication. Modify the example so that the JWT token is passed into the application URL as a parameter and then use that in the JavaScript (lines 24-25) accordingly.
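
As a rough sketch of that approach (assuming, purely for illustration, that the token is appended to the application URL as a query parameter named jwt; the parameter name and helper below are not part of the JET template):

// Illustrative sketch only: read a JWT passed on the URL, e.g. index.html?jwt=<token>,
// and use it in place of the hard-coded basic-auth header shown earlier.
var jwtToken = (function () {
    // "jwt" is an assumed parameter name for this example
    var match = window.location.search.match(/[?&]jwt=([^&]*)/);
    return match ? decodeURIComponent(match[1]) : null;
}());

$.ajax(hostname + queryString,
        {
            method: "GET",
            dataType: "json",
            headers: {"Authorization": "Bearer " + jwtToken},
            success: function (data) { self.optyList(data.items); },
            error: function (jqXHR, textStatus, errorThrown) { console.log(textStatus, errorThrown); }
        });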

For more information on JWT Tokens in Fusion Applications see these blog entries (Link 1, Link 2) and of course the documentation

Conclusion

As we’ve seen above, it’s quite straightforward to create mobile and browser applications using the Oracle JET framework. The above example was quite simple and only queried data; a real application would also have write/delete/update operations, and for those you would want to look at the JET Common Model and Collection Framework (DocLink) instead. Additionally, in the above example we queried data directly from a single Sales Cloud instance and did no processing on it. It’s very likely that a single mobile application will need to get its data from multiple data sources and require some backend services to pre-process and probably post-process the data, in essence providing an API. We call this backend an “MBaaS”, i.e. Mobile Backend as a Service, and Oracle provides an MBaaS in its PaaS suite of products called “Mobile Cloud Service”.

In a future article we will explore how to use Oracle Mobile Cloud Service (Oracle MCS) to query Sales Cloud and Service Cloud and provide an API to the client, which would use the more advanced JET Common Model/Collection framework.

 

 

Using Oracle Data Integrator (ODI) to Bulk Load Data into HCM-Cloud


Introduction

With its capacity to handle complex transformations and large volumes of data, and its ability to orchestrate operations across heterogeneous systems, ODI is a great tool to prepare and upload bulk data into HCM Cloud.

In this post, we are looking at the different steps required to perform this task.

Overview of the integration process

There are three steps that are required to prepare and upload data into HCM:

  • Transform data and prepare a file that matches the import format expected by HCM. Then ZIP the generated file;
  • Upload the file to UCM-Cloud using the appropriate web service call;
  • Invoke HCM-Cloud to trigger the import process, using the appropriate web service call.

We will now see how these different steps can be achieved with ODI.

Preparing the data for import

We will not go into the details of how to transform data with ODI here: this is a normal use of the product and as such it is fully documented.

For HCM to be able to import the data, ODI needs to generate a file formatted according to HCM specifications. For ODI to generate the proper file, the most effective approach is to create a custom Knowledge Module (KM). The details for this Knowledge Module as well as an introduction to the HCM file format are available here: Oracle Data Integrator (ODI) for HCM-Cloud: a Knowledge Module to Generate HCM Import Files. Using this KM, data can be prepared from different sources, aggregated and augmented as needed. ODI will simply generate the import file as data is loaded into a set of tables that reflect the HCM file’s business objects components.

Once the file has been generated with regular ODI mappings, the ODI tool OdiZip can be used to compress the data. You need to create a package to define the sequence of mappings to transform the data and create the import file. Then add an OdiZip step in the package to compress the file.

ODIZip

Note that the name of the import file is imposed by HCM, but the ZIP file can have any name, which can be very convenient if you want to generate unique file names.

Uploading the file to UCM-Cloud

The web service used to upload the file to UCM is relatively straightforward. The only element we have to be careful with is the need to timestamp the data by setting a start date and a nonce (unique ID) in the header of the SOAP message. We use ODI to generate these values dynamically by creating two variables: StartDate and Nonce.  Both variables are refreshed in the package.

The refresh code for the StartDate variable is the following:

select to_char(sysdate,'YYYY-MM-DD') || 'T' || to_char(systimestamp,'HH:MI:SSTZH:TZM')
from dual

This formats the date like this: 2016-05-15T04:38:59-04:00

The refresh code for the Nonce variable is the following:

select dbms_random.string('X', 9) from dual

This gives us a 9 character random alphanumeric string, like this: 0O0Q3LRKM

We can then set the parameters for the UCM web service using these variables. When we add the OdiInvokeWebService tool to a package, we can take advantage of the HTTP Analyzer to get help with the parameter settings.

HTTP Analyzer

To use the HTTP Analyzer, we first need to provide the WSDL for the service we want to access. Then we click the HTTP Analyzer button: ODI will read the WSDL and build a representation of the service that lets us view and set all possible parameters.

Note that, for the Analyzer to work, you need to be able to connect to the WSDL.

The Analyzer lets us set the necessary parameters for the header, where we use the variables that we have previously defined:

UCM soap header

We can then set the rest of the parameters for the web service. To upload a file with UCM, we need the following settings:

IdcService: CHECKIN_UNIVERSAL (for more details on this and other available services, check out the Oracle Fusion Middleware Services Reference Guide for Oracle Universal Content Management)

FieldArray: we use the following fields:

  • dDocName: Contact.zip (the name of our file)
  • dDocAuthor: HCM_APP_ADMIN_ALL (our user name in UCM)
  • dDocTitle: Contact File for HCM (label for the file)
  • dDocType: Document
  • dSecurityGroup: Public
  • doFileCopy: TRUE (keep the file on disk after copy)
  • dDocAccount: ODIAccount

 

The screenshot below shows how these parameters can be entered into the Analyzer:

HTTP Analyzer IdcService

In the File Array we can set the name of the file and its actual location:

HTTP Analyzer - File and send

At this point we can test the web service with the Send Request button located at the bottom of the Analyzer window: you see the response from the server on the right-hand side of the window.

If you want to use this test feature, keep in mind that:
– Your ODI variables need to have been refreshed so that they have a value
– The ODI variables need to be refreshed between subsequent calls to the service: you cannot use the same values twice in a row for StartDate and Nonce (or the server would reject your request).

A few comments on the execution of the web service: a successful call to the web service does not guarantee that the operation itself was successful. You want to review the response returned by the service to validate the success of the operation; to do this, make sure that you set the name of the Response File when you set the parameters for the OdiInvokeWebService tool.

All we need to validate in this response file is the content of the element StatusMessage: if it contains ‘Successful’ then the file was loaded successfully; if it contains ‘not successful’ then you have a problem. It is possible to build an ODI mapping for this (creating a model for the XML file, reverse-engineering the file, then building the mapping…) but a very simple Groovy script (in an ODI procedure, for instance) gets us there faster and can surface in the ODI Operator log the exact error message returned by the web service in case of problems:

import groovy.util.XmlSlurper

// You can replace the following hard-coded values with ODI variables. For instance:
// inputFile = '#myProject.MyResponseFile'
inputFile = 'D:/TEMP/HCM/UCMResponse.xml'   // path to the response file saved by OdiInvokeWebService
XMLTag = 'StatusMessage'
fileContents = new File(inputFile).getText('UTF-8')
def xmlFile = new XmlSlurper().parseText(fileContents)
// Find the StatusMessage element anywhere in the response document
def responseStatus = xmlFile.'**'.find { node -> node.name() == XMLTag }.text()
if (responseStatus.contains('Successful')) {
    // some action
} else {
    throw new Exception(responseStatus)
}

That said, if all parameters are set correctly and you have the necessary privileges on UCM Cloud, at this point the file is loaded on UCM. We can now import the file into HCM Cloud.

Invoking the HCM-Cloud loader to trigger the import process

The HCM web service uses OWSM security policies. If you are not familiar with OWSM security policies, or if you do not know how to set up ODI to leverage OWSM, please refer to Connecting Oracle Data Integrator (ODI) to the Cloud: Web Services and Security Policies. This blog post also describes how to define a web service in ODI Topology.

Once we have the web service defined in ODI Topology, invoking the web service is trivial. When you set the parameters for the ODI tool OdiInvokeWebService in your package, you only need to select a Context as well as the logical schema that points to the web service. Then you can use the HTTP Analyzer to set the parameters for the web service call:

HCM web service call

In our tests we set the ContentId to the name of the file that we want to load, and the Parameters to the following values:

FileEncryption=NONE, DeleteSourceFile=N.

You can obviously change these values as you see fit. The details for the parameters for this web service are available in the document HCM Data Loader Preparation.

Once we have set the necessary parameters for the payload, we just have to set the remainder of the parameters for OdiInvokeWebService. In particular, we need a response file to store the results from the invocation of the web service.

Here again we can use Groovy code to quickly parse the response file and make sure that the load started successfully (this time we are looking for an element named result in the response file).
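
As a sketch, the earlier Groovy script can be reused almost as-is; only the response file name and the element we search for change (the file path below is illustrative, and the exact success values should be checked against the HCM Data Loader documentation):

import groovy.util.XmlSlurper

// Sketch only: check the HCM import response for an element named "result".
// The response file path is illustrative; in a real package it would come from an ODI variable.
inputFile = 'D:/TEMP/HCM/HCMResponse.xml'
XMLTag = 'result'
fileContents = new File(inputFile).getText('UTF-8')
def xmlFile = new XmlSlurper().parseText(fileContents)
def result = xmlFile.'**'.find { node -> node.name() == XMLTag }?.text()
if (result == null || result.trim().isEmpty()) {
    throw new Exception('No result element found in the HCM response file')
}
// Otherwise inspect the content of result and raise an exception if the import did not start successfully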

Make sure that the user you are using to connect and initiate the import process has enough privileges to perform this operation. One easy way to validate this is with the HCM user interface: if you can import the files manually from the HCM user interface, then you have enough privileges to execute the import process with the web service.

The final ODI package looks like this:

HCM Load Final Package

This final web service call initiates the import of the file. You can make additional calls to check on the status of the import (running, completed, aborted) to make sure that the file is successfully imported. The process to invoke these additional web services is similar to what we have done here to import the file.

Conclusion

The features available in ODI 12.2.1 make it relatively easy to generate a file, compress it, upload it to the cloud and import it into HCM-Cloud: we have generated an import file in a proprietary format with a quick modification of a standard Knowledge Module; we have edited the header and the payload of web services without ever manipulating XML files directly; we have set up security policies quickly by leveraging the ability to define web services in ODI Topology. Now all we have to do is design all the transformations that will be needed to generate the data for HCM-Cloud!

For more Oracle Data Integrator best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-team Chronicles for Oracle Data Integrator.

References

SaaS workflow extensions using Process Cloud Service.


Introduction

Oracle Process Cloud Service (PCS), a Platform-as-a-Service offering, enables human workflow capabilities in the cloud with an easy-to-use composer and workspace. PCS allows authoring of processes using business-analyst-friendly BPMN notation over swim-lanes. PCS eliminates the burden of building on-premise process management platforms, while allowing enterprises to leverage knowledge from on-premise implementations.

 

Key features of Process Cloud Service include:

  • Invoke workflows through Web forms, SOAP service and REST service.
  • Invoke external SOAP and REST/JSON services.
  • Synchronous and Asynchronous service invocations.
  • Ability to import existing BPMN based workflows.

With the rapid adoption of Oracle SaaS applications, PCS comes in handy as an option to extend SaaS with human-task workflows. Here are some scenarios where PCS is a strong candidate:

  • Workflow customizations to SaaS products are necessary to meet enterprise needs.
  • Workflow capabilities need to be enabled rapidly for on-premise or Cloud applications.
  • Orchestration use cases with heavy use of human tasks.

Let’s look at extending Supply Chain Management Cloud with a workflow in PCS to capture, review and submit sales orders.

Sample Workflow

In this scenario, users in an enterprise submit orders in a PCS workflow. PCS then sends the orders to Supply Chain Management Cloud’s Distributed Order Orchestration (DOO) web services. The status of the SCM Cloud sales order is retrieved for the user’s review before the workflow ends. This sample demonstrates the capabilities of PCS using basic functions of PCS and SCM Cloud, and it could be extended for advanced use cases. Figure 1 shows the high-level workflow.

Figure 1

Workflow overview

 

Environment requirements for the sample

Below are the requirements to enable the sample workflow between PCS and SCM Cloud.

Process Cloud Service

  • Access to PCS composer and workflow provisioned.
  • Network connectivity between PCS and SCM verified.

Supply Chain Management Cloud (R11)

  • Access to SCM Cloud with implementation privileges provisioned.
  • SCM Order Management features provisioned and implemented.
  • Order Management Order Capture service endpoint.
  • Relevant configuration for Source systems and item relationships.
  • Collection of order reference data enabled.

The SCM Cloud Order Management module should be implemented in order for the order capture services to function properly. For more information on configuring Order Management, refer to the white papers listed at the bottom of this post. These documents might require access to the Oracle support portal.

 

SCM Cloud order management services

For the sample, a test instance of Oracle Supply Chain Management Cloud Release 11 was used. The order capture service accepts orders from upstream capture systems through ProcessOrderRequest calls. It also provides details of an order through GetOrderDetails calls. XML payloads for both services were captured from the sample workflow and are provided below; for the sake of brevity, detailed instructions on Order Management configuration are left to the support documentation.

As of Release 11, SCM Cloud only exposes SOA services for order capture. The SOA service endpoint for R11 is not listed in the catalog. The endpoint is

https://<hostname>:<port>/soa-infra/services/default/DooDecompReceiveOrderExternalComposite/ReceiveOrderRequestService. Append “?WSDL” to the end of the endpoint to retrieve the WSDL.

 

Sample payload for ProcessOrderRequest:

<?xml version = '1.0' encoding = 'UTF-8'?>
<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:wsa="http://www.w3.org/2005/08/addressing">
   <env:Header>
      <wsa:To>https://eczc-test.scm.em2.oraclecloud.com:443/soa-infra/services/default/DooDecompReceiveOrderExternalComposite/ReceiveOrderRequestService</wsa:To>
      <wsa:Action>ProcessOrderRequestSync</wsa:Action>
      <wsa:MessageID>urn:eebb5147-2840-11e6-9a89-08002741191a</wsa:MessageID>
      <wsa:RelatesTo>urn:eebb5147-2840-11e6-9a89-08002741191a</wsa:RelatesTo>
      <wsa:ReplyTo>
         <wsa:Address>http://www.w3.org/2005/08/addressing/anonymous</wsa:Address>
         <wsa:ReferenceParameters>
            <orasoa:EndpointAddress xmlns:orasoa="http://xmlns.oracle.com/soa">http://localhost:7003/soa-infra/services/testing/SalesOrderProcess!595*soa_8366f568-20d7-4a6a-ad68-58effa7a29e3/SCMWebService%23SCMSalesOrderProcess/Services.Externals.SCMWebService.reference</orasoa:EndpointAddress>
            <orasoa:PortType xmlns:ptns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/DooDecompReceiveOrderExternalComposite" xmlns:orasoa="http://xmlns.oracle.com/soa">ptns:ReceiveOrderRequestServiceCallback</orasoa:PortType>
            <instra:tracking.ecid xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">297bed2c-1dde-4850-8a34-cf2da58d19ca-0001398a</instra:tracking.ecid>
            <instra:tracking.conversationId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">urn:eebb5147-2840-11e6-9a89-08002741191a</instra:tracking.conversationId>
            <instra:tracking.FlowEventId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">40879</instra:tracking.FlowEventId>
            <instra:tracking.FlowId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">40047</instra:tracking.FlowId>
            <instra:tracking.CorrelationFlowId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">0000LKDtQPbFw000jzwkno1NJbIb0000LQ</instra:tracking.CorrelationFlowId>
         </wsa:ReferenceParameters>
      </wsa:ReplyTo>
      <wsa:FaultTo>
         <wsa:Address>http://www.w3.org/2005/08/addressing/anonymous</wsa:Address>
         <wsa:ReferenceParameters>
            <orasoa:EndpointAddress xmlns:orasoa="http://xmlns.oracle.com/soa">http://localhost:7003/soa-infra/services/testing/SalesOrderProcess!595*soa_8366f568-20d7-4a6a-ad68-58effa7a29e3/SCMWebService%23SCMSalesOrderProcess/Services.Externals.SCMWebService.reference</orasoa:EndpointAddress>
            <orasoa:PortType xmlns:ptns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/DooDecompReceiveOrderExternalComposite" xmlns:orasoa="http://xmlns.oracle.com/soa">ptns:ReceiveOrderRequestServiceCallback</orasoa:PortType>
         </wsa:ReferenceParameters>
      </wsa:FaultTo>
   </env:Header>
   <env:Body>
      <process xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/DooDecompReceiveOrderExternalComposite">
         <OrchestrationOrderRequest>
            <SourceTransactionIdentifier xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">1154551RBWM</SourceTransactionIdentifier>
            <SourceTransactionSystem xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">OPS</SourceTransactionSystem>
            <SourceTransactionNumber xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">1154551RBWM</SourceTransactionNumber>
            <BuyingPartyName xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">Computer Service and Rentals</BuyingPartyName>
            <TransactionalCurrencyCode xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">USD</TransactionalCurrencyCode>
            <TransactionOn xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">2016-06-01T14:36:42.649-07:00</TransactionOn>
            <RequestingBusinessUnitIdentifier xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">300000001548368</RequestingBusinessUnitIdentifier>
            <PartialShipAllowedFlag xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">false</PartialShipAllowedFlag>
            <OrchestrationOrderRequestLine xmlns:ns2="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/" xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">
               <ns2:SourceTransactionLineIdentifier>1</ns2:SourceTransactionLineIdentifier>
               <ns2:SourceTransactionScheduleIdentifier>1</ns2:SourceTransactionScheduleIdentifier>
               <ns2:SourceTransactionLineNumber>1</ns2:SourceTransactionLineNumber>
               <ns2:SourceTransactionScheduleNumber>1</ns2:SourceTransactionScheduleNumber>
               <ns2:ProductNumber>AS54888</ns2:ProductNumber>
               <ns2:OrderedQuantity>1</ns2:OrderedQuantity>
               <ns2:OrderedUOMCode>zzx</ns2:OrderedUOMCode>
               <ns2:OrderedUOM>EA</ns2:OrderedUOM>
               <ns2:RequestingBusinessUnitIdentifier>300000001293806</ns2:RequestingBusinessUnitIdentifier>
               <ns2:ParentLineReference/>
               <ns2:RootParentLineReference/>
               <ns2:ShippingInstructions>BM Ship Instructions- Ship it in a day</ns2:ShippingInstructions>
               <ns2:PackingInstructions/>
               <ns2:RequestedShipDate>2016-12-26T00:00:00</ns2:RequestedShipDate>
               <ns2:PaymentTerms/>
               <ns2:TransactionCategoryCode>ORDER</ns2:TransactionCategoryCode>
               <ns2:BillToCustomerName>Computer Service and Rentals</ns2:BillToCustomerName>
               <ns2:BillToAccountSiteUseIdentifier>300000001469016</ns2:BillToAccountSiteUseIdentifier>
               <ns2:BillToCustomerIdentifier>300000001469002</ns2:BillToCustomerIdentifier>
               <ns2:PartialShipAllowedFlag>false</ns2:PartialShipAllowedFlag>
               <ns2:UnitListPrice>100.0</ns2:UnitListPrice>
               <ns2:UnitSellingPrice>100.0</ns2:UnitSellingPrice>
               <ns2:ContractEndDate>2018-12-13</ns2:ContractEndDate>
               <ns2:ExtendedAmount>100.0</ns2:ExtendedAmount>
               <ns2:TaxExempt>S</ns2:TaxExempt>
               <ns2:ShipSetName>{{SHIPSET}}</ns2:ShipSetName>
               <ns2:OrigSysDocumentReference>ORIGSYS</ns2:OrigSysDocumentReference>
               <ns2:OrigSysDocumentLineReference>ORIGSYSLINE</ns2:OrigSysDocumentLineReference>
               <ns2:LineCharge>
                  <ns2:ChargeDefinitionCode>QP_SALE_PRICE</ns2:ChargeDefinitionCode>
                  <ns2:ChargeSubtypeCode>ORA_PRICE</ns2:ChargeSubtypeCode>
                  <ns2:PriceTypeCode>ONE_TIME</ns2:PriceTypeCode>
                  <ns2:PricedQuantity>1</ns2:PricedQuantity>
                  <ns2:PrimaryFlag>true</ns2:PrimaryFlag>
                  <ns2:ApplyTo>PRICE</ns2:ApplyTo>
                  <ns2:RollupFlag>false</ns2:RollupFlag>
                  <ns2:SourceChargeIdentifier>SC2</ns2:SourceChargeIdentifier>
                  <ns2:ChargeTypeCode>ORA_SALE</ns2:ChargeTypeCode>
                  <ns2:ChargeCurrencyCode>USD</ns2:ChargeCurrencyCode>
                  <ns2:SequenceNumber>2</ns2:SequenceNumber>
                  <ns2:PricePeriodicityCode/>
                  <ns2:GsaUnitPrice/>
                  <ns2:ChargeComponent>
                     <ns2:ChargeCurrencyCode>USD</ns2:ChargeCurrencyCode>
                     <ns2:HeaderCurrencyCode>USD</ns2:HeaderCurrencyCode>
                     <ns2:HeaderCurrencyExtendedAmount>150.0</ns2:HeaderCurrencyExtendedAmount>
                     <ns2:PriceElementCode>QP_LIST_PRICE</ns2:PriceElementCode>
                     <ns2:SequenceNumber>1</ns2:SequenceNumber>
                     <ns2:PriceElementUsageCode>LIST_PRICE</ns2:PriceElementUsageCode>
                     <ns2:ChargeCurrencyUnitPrice>150.0</ns2:ChargeCurrencyUnitPrice>
                     <ns2:HeaderCurrencyUnitPrice>150.0</ns2:HeaderCurrencyUnitPrice>
                     <ns2:RollupFlag>false</ns2:RollupFlag>
                     <ns2:SourceParentChargeComponentId/>
                     <ns2:SourceChargeIdentifier>SC2</ns2:SourceChargeIdentifier>
                     <ns2:SourceChargeComponentIdentifier>SCC3</ns2:SourceChargeComponentIdentifier>
                     <ns2:ChargeCurrencyExtendedAmount>150.0</ns2:ChargeCurrencyExtendedAmount>
                  </ns2:ChargeComponent>
                  <ns2:ChargeComponent>
                     <ns2:ChargeCurrencyCode>USD</ns2:ChargeCurrencyCode>
                     <ns2:HeaderCurrencyCode>USD</ns2:HeaderCurrencyCode>
                     <ns2:HeaderCurrencyExtendedAmount>150.0</ns2:HeaderCurrencyExtendedAmount>
                     <ns2:PriceElementCode>QP_NET_PRICE</ns2:PriceElementCode>
                     <ns2:SequenceNumber>3</ns2:SequenceNumber>
                     <ns2:PriceElementUsageCode>NET_PRICE</ns2:PriceElementUsageCode>
                     <ns2:ChargeCurrencyUnitPrice>150.0</ns2:ChargeCurrencyUnitPrice>
                     <ns2:HeaderCurrencyUnitPrice>150.0</ns2:HeaderCurrencyUnitPrice>
                     <ns2:RollupFlag>false</ns2:RollupFlag>
                     <ns2:SourceParentChargeComponentId/>
                     <ns2:SourceChargeIdentifier>SC2</ns2:SourceChargeIdentifier>
                     <ns2:SourceChargeComponentIdentifier>SCC1</ns2:SourceChargeComponentIdentifier>
                     <ns2:ChargeCurrencyExtendedAmount>150.0</ns2:ChargeCurrencyExtendedAmount>
                  </ns2:ChargeComponent>
               </ns2:LineCharge>
            </OrchestrationOrderRequestLine>
         </OrchestrationOrderRequest>
      </process>
   </env:Body>
</env:Envelope>

Sample payload for GetOrderDetails:

<?xml version = '1.0' encoding = 'UTF-8'?>
<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:wsa="http://www.w3.org/2005/08/addressing">
   <env:Header>
      <wsa:To>https://eczc-test.scm.em2.oraclecloud.com:443/soa-infra/services/default/DooDecompReceiveOrderExternalComposite/ReceiveOrderRequestService</wsa:To>
      <wsa:Action>GetOrderDetailsSync</wsa:Action>
      <wsa:MessageID>urn:109ed9ec-2841-11e6-9a89-08002741191a</wsa:MessageID>
      <wsa:RelatesTo>urn:eebb5147-2840-11e6-9a89-08002741191a</wsa:RelatesTo>
      <wsa:ReplyTo>
         <wsa:Address>http://www.w3.org/2005/08/addressing/anonymous</wsa:Address>
         <wsa:ReferenceParameters>
            <orasoa:EndpointAddress xmlns:orasoa="http://xmlns.oracle.com/soa">http://localhost:7003/soa-infra/services/testing/SalesOrderProcess!595*soa_8366f568-20d7-4a6a-ad68-58effa7a29e3/SCMWebService%23SCMSalesOrderProcess/Services.Externals.SCMWebService.reference</orasoa:EndpointAddress>
            <instra:tracking.ecid xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">297bed2c-1dde-4850-8a34-cf2da58d19ca-0001398a</instra:tracking.ecid>
            <instra:tracking.conversationId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">urn:eebb5147-2840-11e6-9a89-08002741191a</instra:tracking.conversationId>
            <instra:tracking.FlowEventId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">40886</instra:tracking.FlowEventId>
            <instra:tracking.FlowId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">40047</instra:tracking.FlowId>
            <instra:tracking.CorrelationFlowId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">0000LKDtQPbFw000jzwkno1NJbIb0000LQ</instra:tracking.CorrelationFlowId>
         </wsa:ReferenceParameters>
      </wsa:ReplyTo>
      <wsa:FaultTo>
         <wsa:Address>http://www.w3.org/2005/08/addressing/anonymous</wsa:Address>
         <wsa:ReferenceParameters>
            <orasoa:EndpointAddress xmlns:orasoa="http://xmlns.oracle.com/soa">http://localhost:7003/soa-infra/services/testing/SalesOrderProcess!595*soa_8366f568-20d7-4a6a-ad68-58effa7a29e3/SCMWebService%23SCMSalesOrderProcess/Services.Externals.SCMWebService.reference</orasoa:EndpointAddress>
         </wsa:ReferenceParameters>
      </wsa:FaultTo>
   </env:Header>
   <env:Body>
      <GetOrderDetailsProcessRequest xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/orderDetailServices/DooDecompOrderDetailSvcComposite">
         <SourceOrderInput xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:dood="http://xmlns.oracle.com/apps/scm/doo/decomposition/orderDetailServices/DooDecompOrderDetailSvcComposite" xmlns:mod="http://xmlns.oracle.com/apps/scm/doo/decomposition/orderDetailServices/model/">
            <mod:SourceOrderSystem>OPS</mod:SourceOrderSystem>
            <mod:SalesOrderNumber>1154660RBWM</mod:SalesOrderNumber>
         </SourceOrderInput>
      </GetOrderDetailsProcessRequest>
   </env:Body>
</env:Envelope>

PCS workflow in detail

The sample PCS workflow has submission and approval human tasks, associated web forms, a basic gateway rule, a REST web service call to dynamically populate a drop-down list, and two SOAP web service calls to Order Management services. The Order Management service in this case is secured with HTTP basic authentication and is accessible only over TLS. Figure 2 shows the swim-lane representation of the workflow with self-describing flow element names. We’ll focus on the SCM Cloud-specific aspects of the process flow. Process Cloud Service provides several pre-built samples and pattern templates, both of which can be used for quick development of a flow.

Figure 2

processflow



Adding a connector to SCM Cloud web service

In order to use a web service in a process, a web service connector, along with its WSDL and associated schema files, must be added to the PCS project in the composer. Figure 3 shows how to create a connector. Once a connector is created, it is available for implementation in a Service flow element. As part of the connector setup, the composer allows security to be configured with options such as HTTP basic authentication or WS-Security username token. Note that these settings can be changed on the customization page when the flow is deployed.

Figure 3

WebServiceConnector

 

Associating Data between PCS flow elements

As the PCS flow transitions between flow elements, data input to and output from each element needs to be associated with suitable data objects. Data objects can be based on pre-built types such as int or String, or on one of the types defined in imported XML schema files. XML schema types can be imported under the “Business Objects” section of the composer. Figure 4 shows the data association that captures input for the order capture service. As shown, some elements are captured from a web form submitted by a user and many others are hard-coded for this sample flow.

Figure 4

dataassociation

 

Building web forms from pre-defined business types

Human tasks in PCS are represented by a web form-based UI. Web forms can be built quickly from pre-defined data types, such as XML complex types. Web form elements inherit the data constraints defined in the complex type. Once a form is generated, fields can be re-arranged to improve the UI. Figure 5 shows a web form generated based on the output of the GetOrderDetails web service call, with details returned by SCM Cloud.

Figure 5

OrderDetailsStatus

 

Customizing process during deployment

Process Cloud Service supports deployments from the composer to a test environment and then to production or subsequent test environments. During the deployment, environment-specific information such as endpoints and credentials can be updated. All web service connectors used by the project are available to be configured. Figure 6 shows the customization page for a test deployment.

Figure 6

DeploymentCustomization

 

Conclusion

Process Cloud Service offers a quick and reliable way to author and deploy workflows that orchestrate human tasks and system interactions using industry-standard notations and protocols. This is very useful for integrating and extending SaaS applications from Oracle and other vendors. PCS allows enterprises to leverage in-house expertise in process development without the hassle of building and maintaining the platform or having to master process flow implementation techniques in multiple SaaS products. This article covered a specific use case where PCS captures orders through rules and approval tasks, sends the order to SCM Cloud’s order capture service, and, finally, obtains order status and other details from SCM Cloud.

 

References
