
Integrating with Sales Cloud using SOAP web services and REST APIs (Part 3)


This is part 3 of the blog series that covers SOAP and REST integration with Sales Cloud. In part 1 and part 2, I covered SOAP services. In this part, I'll cover Sales Cloud REST APIs.

Sales Cloud provides REST APIs that allow external applications to view and edit Sales Cloud data. A complete catalog of these REST APIs is available at the Oracle Help Center.

In this catalog you'll notice several details, as shown in the screenshot below.

[Screenshot: REST API catalog]

Sales Cloud REST APIs are automatically available in your Sales Cloud instance. No additional configuration or setup is required to enable them.

Sales Cloud REST APIs are also free of charge, regardless of the number of invocations you make.

For each Sales Cloud object, the GET, POST, PATCH, and DELETE HTTP methods are supported, corresponding to CRUD operations on these sales objects. These appear in the API catalog, in the left navigation, under the Accounts entry. The request and response payload structures, along with sample payloads, are all provided in this one-stop shop.
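
A schematic sketch of this mapping with curl (credentials and payloads omitted; 3008 is the sample account ID used later in this post, and POST/PATCH additionally require a JSON body with the Content-Type application/vnd.oracle.adf.resourceitem+json):

curl -X GET    https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest/accounts/3008
curl -X POST   https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest/accounts
curl -X PATCH  https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest/accounts/3008
curl -X DELETE https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest/accounts/3008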

The REST APIs also directly correlate to the Sales Cloud UI. That is, if a field is required or is of a specific data type, then the REST API automatically reflects those validations.

In addition to the parent object, the REST APIs also support child objects. The API catalog screenshot above shows the Accounts object and its corresponding child objects.

The simplest way to get started is to use a tool like SOAPUI or Postman to execute a "Get all" action on the Account object. I'll use Postman and the /crmCommonApi/resources/latest/accounts resource. Since the latest version is 11.1.11, this is the same as /crmCommonApi/resources/11.1.11/accounts. The only other header I provided is for authentication, using Basic Auth.

[Screenshot: "Get all" accounts request in Postman]
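
The equivalent call with curl looks like this (a sketch; substitute your own instance, datacenter, and credentials):

curl -u john.doe:password \
  "https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest/accounts"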

You can then query for a specific account, as described in the "Get an Account" task of the catalog. The URL will be https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/11.1.11/accounts/3008

If you want to understand more about each field within the account object, use https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/11.1.11/accounts/describe

It is possible to filter the results so that you see only the data you need, using the "fields" parameter:

https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/11.1.11/accounts/3008?fields=PartyUniqueName,PrimaryContactName

Even after specifying only two fields, you will notice several other "links" returned in the JSON response. These links are primarily used to navigate to child objects, self-references, and LOVs. In conformance with the HATEOAS constraints, Oracle Sales Cloud REST APIs provide these links to facilitate dynamic navigation from user interfaces. For example, it is easy to obtain the list of values associated with a given record, to access the child records, or, during pagination, to navigate back and forth between sets of records. These links can optionally be suppressed using the onlyData=true parameter.
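
As a sketch, this combines the fields and onlyData parameters on the single-account resource; the commented response is illustrative only:

curl -u john.doe:password \
  "https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/11.1.11/accounts/3008?fields=PartyUniqueName,PrimaryContactName&onlyData=true"
# Illustrative response shape:
# { "PartyUniqueName": "Acme Corp", "PrimaryContactName": "Jane Smith" }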

Several parameters such as query, limit, total count, and order by are available when doing a GET. I won't go into more detail here as these are clearly described in the API catalog at https://docs.oracle.com/cloud/latest/salescs_gs/FAAPS/Resource_Actions.html. It is very easy to play with these parameters. Try them out!
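
For instance, a sketch combining several of them (curl's -G/--data-urlencode handles the URL encoding; the filter value is illustrative):

curl -G -u john.doe:password \
  --data-urlencode "q=OrganizationName LIKE 'Acme%'" \
  --data-urlencode "limit=5" \
  --data-urlencode "orderBy=PartyUniqueName" \
  --data-urlencode "totalResults=true" \
  "https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/11.1.11/accounts"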

Custom Fields, Custom Objects, and REST APIs

All custom objects automatically have a REST API enabled. So, if you create a custom object called PromotionalMaterial, you obtain an API /../PromotionalMaterial that exposes all fields of this object. No additional work is required. Any relationships modeled between any combination of custom objects and standard objects are also exposed in the API immediately. Similarly, when custom fields are added to an existing standard object (for which a REST API is available), those custom fields are also exposed immediately.
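
As a sketch, the describe pattern shown earlier works for custom objects too. The resource name below is an assumption: Application Composer generates the name, and custom artifacts typically carry a _c suffix, so verify the exact name in your instance's catalog first:

curl -u john.doe:password \
  "https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest/PromotionalMaterial_c/describe"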

Important: Since you will typically be adding custom fields and custom objects while in an active sandbox, these API changes will be visible only to users of that sandbox. Once the changes are published to the mainline, all users will be able to access the new/modified APIs.

Authentication

All Sales Cloud REST APIs are protected using an OWSM policy – oracle/multi_token_over_ssl_rest_service_policy. This policy currently supports the following client authentication methods (a curl sketch of the first and third options follows the list):

  1. Basic Auth over SSL (which we used earlier in this post)
  2. SAML 2.0 (provided a SAML trust is already established between the Sales Cloud and the calling service)
  3. JWT Token (A bearer token instead of Basic Auth. If trust is set up between Sales Cloud and the calling service, the JWT token can be issued by the calling service)
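
A minimal curl sketch of options 1 and 3 (the JWT value is a placeholder; how you obtain a token depends on your trust setup):

# 1. Basic Auth over SSL
curl -u john.doe:password \
  "https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest/accounts"

# 3. JWT bearer token issued by a trusted party
curl -H "Authorization: Bearer <jwt-token>" \
  "https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest/accounts"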

Authorization

The data returned by the API is governed by Sales Cloud role and data-level security. For example, if John Doe does a GET on all Opportunities, authenticating with Basic Auth as John Doe, then Sales Cloud will check whether John Doe is allowed to access Opportunities through the web service, and will also determine which Opportunities John Doe should be able to view, based on standard data-level security for Opportunities in Sales Cloud (more details here).

Additionally, clients can pre-verify the level of role access for a given user by using /describe. For example, when John Doe executes /describe and it returns the content below, it is clear that the GET and POST operations are allowed on this specific object. Child objects can be protected separately, so each will have its own "actions" element with the allowed methods.

"actions": [
          {
            "name": "get",
            "method": "GET",
            "responseType": [
              "application/json",
              "application/vnd.oracle.adf.resourcecollection+json"
            ]
          },
          {
            "name": "create",
            "method": "POST",
            "requestType": [
              "application/vnd.oracle.adf.resourceitem+json"
            ],
            "responseType": [
              "application/json",
              "application/vnd.oracle.adf.resourceitem+json"
            ]
          }
        ] 

Use Cases for SOAP vs REST APIs

Although Sales Cloud REST APIs can be used for both data integration and UI extension use cases, it is likely that they will be used more for the latter. Sales Cloud REST APIs use JSON by default, making them a great choice for UI development (though not the only reason). Sales Cloud SOAP services, on the other hand, use XML, and since there are several mature integration tools based on XML technologies such as XSLT and XQuery, it is easier for integration developers to continue using the SOAP services. Additionally, Sales Cloud events and file-based integration choices are also available for data integration.

REST APIs are a huge boon for custom UI development, which could take the form of building specific extensions to Sales Cloud UIs, fully standalone browser-based JavaScript UIs, or mobile UIs powered by Sales Cloud REST APIs. Since the REST APIs encapsulate all core Sales Cloud business logic and security logic, custom UIs need not duplicate them.

If you would like to get started on building your first UI based on Sales Cloud REST APIs, Angelo has written a very nice blog where he shows how to use Oracle JET to build a custom UI that invokes Sales Cloud REST APIs.

Using Sales Cloud with Oracle Mobile Cloud Service when building Custom UIs

While Sales Cloud REST APIs are very powerful and intuitive to use, you may have requirements where it makes more sense to have a wrapper/proxy API that shapes these Sales Cloud APIs to be more tailored to your custom UI. For example, your UI requirements may dictate invoking multiple Sales Cloud APIs or even external applications, and shaping the data before it can be displayed meaningfully in your custom UI. When developing Single Page Applications (SPAs), you may want to minimize the number of server calls made to render data and use getters and setters that bring only the data relevant to your current screen.

In such cases, as discussed above, you will typically consider a wrapper/proxy layer. A good choice for implementing this shaping/orchestration/proxy layer is any commercial product that serves as an API manager.

I'll talk next about using Oracle Mobile Cloud Service to implement this layer. Mobile Cloud Service is an Oracle offering which not only offers the power of creating and managing API endpoints, but also provides a Node.js engine to perform this orchestration, and connectors to connect to several end systems. In addition to serving as an API management layer, Mobile Cloud Service (as the name suggests) provides several mobile-centric features such as push notifications, offline support, caching support, and location support.

The focus of this blog is not to describe all Mobile Cloud Service (MCS) features, but to illustrate the usage of MCS with Sales Cloud using a simple example. We’ll also touch upon expanded security options when using this approach.
Note: If you would like to learn more about MCS, please refer to the list of A-Team blogs on MCS.

I'll next walk through a simple use case of using MCS with Sales Cloud REST APIs.

Imagine a sales rep walking into a customer site for a meeting. This rep would like a UI that provides details about the customer. Let's assume the sales rep wants basic account details, open issues with existing products that this customer uses, and the current stock ticker of the company. As you may realize, each piece of information typically comes from a different system – account details come from Sales Cloud, issue details (or incidents) come from Service Cloud, and the stock ticker probably comes from Google Finance APIs. The developer of this custom UI, however, would like a single REST API called /Customers providing all this information.

Here are some high-level steps to achieve this:

Step 1: Create a new API in MCS and decide on a contract with the UI developer

Step 2: Create a REST connector in MCS to connect to Sales Cloud

Step 3: Create connectors to Service Cloud for incidents, and Google Finance for stock ticker

Step 4: Implement logic in MCS Node.js layer to orchestrate these calls

Step 5: Test the service, attach it to an MCS Mobile Backend, and expose the MCS APIs securely

I'll only discuss Step 2, Step 4, and Step 5. For Step 1, refer to the Oracle MCS documentation. Step 3 is similar to Step 2.
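
To give a flavor of Step 1, a hypothetical /Customers response contract might look like the sketch below; every field name here is illustrative and would be agreed with the UI developer:

{
  "account":       { "PartyUniqueName": "...", "PrimaryContactName": "..." },
  "openIncidents": [ { "incidentId": "...", "subject": "...", "status": "..." } ],
  "stockQuote":    { "symbol": "...", "price": "..." }
}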

For Step 2, refer to the image below. Notice that when creating a connector you provide only the endpoint of the top-level Sales Cloud resource, i.e., https://<instance>.crm.<datacenter>.oraclecloud.com/crmCommonApi/resources/latest, with no reference to a specific Sales Cloud object. This connector will later be referenced in the Node.js implementation, and specific resources will be requested by the implementation code – for example, Connector/Accounts. This decoupling is very useful when changing any connection details, such as pointing the APIs to a different Sales Cloud instance or using different authentication credentials.

In the future, Oracle plans to release pre-built connectors to Sales Cloud which will also allow you to introspect the endpoints and browse different resources.

[Screenshot: creating the Sales Cloud REST connector in MCS]

In terms of security, notice in the image below that I've chosen the SAML security policy when creating the connector. Since MCS and Sales Cloud are both Oracle Cloud products, SSO and SAML trust is pre-established. By choosing SAML for the MCS-to-Sales Cloud communication, I ensure that the identity of the user invoking the /Customers MCS API will automatically be propagated to Sales Cloud. Passing the user context is very important because, as you may recollect, the Sales Cloud API returns results specific to a user's role and data-level security access. For SAML to work, MCS and Sales Cloud should be in the same Oracle Cloud identity domain, which inherently sets up the SSO and SAML trust between these services.

[Screenshot: connector security policy set to SAML]

Now step 4 requires me to invoke the Sales Cloud connector that I created, using Node.js code in an MCS custom API. I'm not going into the details of how to attach a Node.js implementation to your MCS REST endpoint; this is explained very well in the Oracle MCS documentation. I'll just focus on the code that calls the Sales Cloud MCS connector we created earlier. A sample code snippet looks like this:

module.exports = function(service) {
	service.get('/mobile/custom/Customers/CustomerId', function(req,res) {
		req.oracleMobile.connectors.ArvindSalesCloud.get('accounts',null,{qs: {fields: 'PartyUniqueName,StockSymbol',onlyData:'true',limit:'10'}}).then(
		// Other code to gather the Service Cloud and stock price data goes here.
		// Typically a series of Node.js calls chained with promises or async.waterfall.
		function(result){
			res.send(result.statusCode, result.result);
		},
		function(error){
			res.send(500, error.error);
		}
	);
	});
};

After the implementation is complete, this API is attached to a Mobile Backend and published. The API is then protected for authentication using Basic Auth or OAuth. You can then call this API from the custom UI, or from SOAPUI/Postman for testing. Remember to pass the MCS Mobile Backend ID in your HTTP headers in addition to providing the authentication headers. More details are available in the Oracle MCS documentation. Among other things, the Mobile Backend acts as an OAuth client whose credentials are used by all mobile apps that use the MCS APIs. If the called MCS API includes calls to other MCS APIs (chaining) within the same backend, the identity and credentials of the original caller are propagated through the chain of calls automatically.
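
A test call might look like this sketch (the backend ID is found on the Mobile Backend's settings page; the path matches the custom API implemented above):

curl -u mobileuser:password \
  -H "Oracle-Mobile-Backend-Id: <mobile-backend-id>" \
  "https://<mcs-instance>.oraclecloud.com/mobile/custom/Customers/CustomerId"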

As discussed before, when Sales Cloud and MCS are used together, they are typically provisioned in the same Oracle Cloud identity domain, inherently establishing a SAML trust. This is why MCS was able to invoke Sales Cloud using SAML when configuring the connector. In such environments, Sales Cloud is typically the IdP for federation. In this setup, as soon as the end user of the mobile app logs in with Sales Cloud credentials, he/she is able to request an OAuth token for the Mobile Backend, which will be used for all subsequent invocations. This ensures end-to-end identity propagation.

As a closing remark, I'd like to point out that Sales Cloud REST APIs are very powerful, available out of the box, and free of charge. This post just introduces the APIs and points out some common implementation patterns. Expect to see more blogs from the A-Team on this topic.

 


Handling Unknown File Formats with Oracle Big Data Preparation Cloud Service (BDP)


Introduction

Recently one of our customers shared with us a Splunk file and asked if we could handle this with Oracle Big Data Preparation Cloud Service (BDP), so we tried it! The product will handle this format out of the box in its next release (coming soon, stay tuned!), but in the meantime we wanted to take this opportunity to see how BDP could help us process a file format that it did not understand out-of-the-box. We are sharing the result of this experience here.

Understanding the file format

A very good introduction to Splunk is available here and can be summarized as: "Splunk is a log aggregation tool [...where you] write out your logs into comma delimited key/value pairs". The file we had to interpret had over 6 different formats for the log entries, with different key/value pairs that needed extraction. The following example has a structure similar to that file (in the original post, color coding and bold text differentiated the formats):

2016-05-10-08:00:00 action=macdetected equipment=WD23sp001 macaddress=01:11:25:5a:92:a1 accesspoint=sbmalexingctr

2016-05-10-08:00:00 action=macdetected equipment=LS11ad145 macaddress=1a:22:bc:fe:21:b2 accesspoint=sbmalexingctr

2016-05-10-08:00:00 action=macdetected equipment=WD23sp001 macaddress=45:f1:32:2a:bb:11 accesspoint=sbmalexingctr

2016-05-10-08:00:00 action=macdetected equipment=WD23sq002 macaddress=c1:14:f1:66:48:8b accesspoint=sbmabulwc

2016-05-10-08:01:12 action=signup id=johndoe device=android ip=128.12.25.01 equipment=LS11ad145 macaddress=1a:22:bc:fe:21:b2 accesspoint=sbmalexingctr

2016-05-10-08:01:15 action=webaccess url=my.yahoo.com equipment=LS11ad145 macaddress=1a:22:bc:fe:21:b2 accesspoint=sbmalexingctr

2016-05-10-08:02:23 action=webaccess url=https://www.google.com/search?q=johndoe&ie=utf-8&oe=utf-8 equipment=LS11ad145 macaddress=1a:22:bc:fe:21:b2 accesspoint=sbmalexingctr

2016-05-10-08:03:21 action=maclost reason=outofrange equipment=WD23sq002 macaddress=c1:14:f1:66:48:8b accesspoint=sbmabulwc

2016-05-10-08:03:22 action=macdetected equipment=WD23sp001 macaddress=77:a8:da:c5:33:58 accesspoint=sbmalexingctr

2016-05-10-08:03:22 action=signup id=janedoe device=iphone ip=128.12.25.02 equipment=WD23sp001 macaddress=77:a8:da:c5:33:58 accesspoint=sbmalexingctr

2016-05-10-08:03:22 action=iMessage message=connect server=imessage.apple.com equipment=WD23sp001 macaddress=77:a8:da:c5:33:58 accesspoint=sbmalexingctr

2016-05-10-08:04:45 action=iMessage message=connectResponse server=imessage.apple.com equipment=WD23sp001 macaddress=77:a8:da:c5:33:58 accesspoint=sbmalexingctr

2016-05-10-08:04:47 action=iMessage message=Push Topic server=imessage.apple.com equipment=WD23sp001 macaddress=77:a8:da:c5:33:58 accesspoint=sbmalexingctr

2016-05-10-08:04:49 action=iMessage message=Push Notification server=imessage.apple.com equipment=WD23sp001 macaddress=77:a8:da:c5:33:58 accesspoint=sbmalexingctr

2016-05-10-08:05:05 action=macdetected equipment=WD23sp001 macaddress=33:fb:b8:34:55:04 accesspoint=sbmalexingctr

2016-05-10-08:05:05 action=iMessage message=Push Notification Response server=imessage.apple.com equipment=WD23sp001 macaddress=77:a8:da:c5:33:58 accesspoint=sbmalexingctr

2016-05-10-08:06:57 action=signup id=JimmyFoo device=OSX ip=128.12.25.03 equipment=LS11ad145 macaddress=1a:22:bc:fe:21:b2 accesspoint=sbmalexingctr

2016-05-10-08:07:35 action=data protocol=ftp equipment=LS11ad145 macaddress=1a:22:bc:fe:21:b2 accesspoint=sbmalexingctr

2016-05-10-08:07:36 action=disconnect id=johndoe device=android ip=128.12.25.01 equipment=LS11ad145 macaddress=1a:22:bc:fe:21:b2 accesspoint=sbmalexingctr

2016-05-10-08:07:37 action=maclost reason=poweroff equipment=WD23sq002 macaddress=c1:14:f1:66:48:8b accesspoint=sbmabulwc

2016-05-10-08:08:01 action=macdetected equipment=WD23sq002 macaddress=c1:14:f1:66:48:8b accesspoint=sbmabulwc

There are two challenges with this type of format:

  • First, the shape of the records changes from line to line
  • Second, the records mix metadata (keys or labels) and data (values). We want to separate the two for easier display with reporting tools, or to feed the data to ETL tools for further processing.

With BDP, we want to extract all the labels and use them as column names. Once we have identified and extracted the column names, we have a separation between data and metadata… and we can display the information efficiently.

In this early version of BDP, we only see a single column for the original file (Col_0001 in the screenshot below).

[Screenshot: Extract from Col_0001]

The first thing we do is to ask BDP to list all labels available with a very basic regular expression that lists all strings that end with the ‘=’ sign:

([A-z]+=)

You can see below that BDP provides immediate feedback for the result of the expression (see Regex Result) as we are building it:

[Screenshot: Labels regular expression]

This generates a new pseudo column with only the label names. If we ask BDP to display all distinct values for this new column, we now have a list of all possible shapes for the records… along with the exhaustive list of possible labels:

[Screenshot: List of labels]

Now that we know which column names we want to create, we further take advantage of BDP to extract the information that we need from the file and prepare it for our users.

Extracting Column names

For each of the columns of interest to us, we can extract the data with a very simple extract expression. For instance, if we want a column named AccessPoint to store the data from the accesspoint label, all we have to do is use the expression:

accesspoint=([A-z]+)

When we enter this expression, once again BDP gives us immediate feedback on the values found in the data file (sbmalexingctr in the example below):

[Screenshot: AccessPoint regular expression]

A column such as Mac Address would use an expression like this:

macaddress=([a-f0-9:]+)

[Screenshot: Mac address regular expression]
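
As a quick sanity check outside of BDP, the same expressions can be previewed on the command line. This sketch uses grep -oE with [A-Za-z], the stricter spelling of the [A-z] range used above:

$ echo "2016-05-10-08:03:21 action=maclost reason=outofrange equipment=WD23sq002 macaddress=c1:14:f1:66:48:8b accesspoint=sbmabulwc" | grep -oE '[A-Za-z]+='
action=
reason=
equipment=
macaddress=
accesspoint=

$ echo "... macaddress=c1:14:f1:66:48:8b ..." | grep -oE 'macaddress=[a-f0-9:]+'
macaddress=c1:14:f1:66:48:8b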

For each column that we create, BDP updates profiling statistics immediately to give us a detailed view of the distribution of the data. In the screenshot below, we can see the distribution for the Action column: distinct values and their count in the table on the left, and a bubble chart at the bottom right.

[Screenshot: Actions profiling]

This allows us to keep or discard columns as we move along. Columns with only a single value across the board might not be of much interest, for instance…

Publishing the result

Once we have identified all the columns of interest, we can publish the data – either directly to Business Intelligence Cloud Service (BICS), or to a file for further consumption. At this point, removing the original columns (‘Col_0001’ and ‘Labels’) makes sense. We can always retrieve them later if needed since BDP allows us to reverse any operation by clicking on the ‘X’ next to the operation itself in the list of transforms:

[Screenshot: Transforms list]

We can now execute our script. The progress of the operations is visible in the Activity Stream which can be found on the right-hand side of the dashboard as seen in the screenshot below:

[Screenshot: Dashboard with Activity Stream]

We can then review the result of our operations:

[Screenshot: Published results]

We did our first exercise with a small version of the original file. This allowed us to design the solution while the much larger (4 GB) original file was being loaded to the Cloud. Since BDP runs on a Hadoop cluster (leveraging Spark/Scala), it scales easily to massive content.

By the time we had completed this first run, the original file was available in the Cloud. We ran the exact same job definition on the larger file and experienced immediate success: once the transform script is defined, we can run it against any file that more or less matches the original format – labels can be out of order, added, or missing. None of these will prevent the transformations from running.

These jobs can be operationalized via the scheduler to process new files, or can be based on the detection of new files arriving in the source system.

Conclusion

Even with a file format that is not recognized by BDP (at least in this early release), we can take advantage of its ability to dynamically prepare and profile data to extract relevant information. We can then publish it in a structured format for further consumption, with very little effort and no programming skills required.

For more Data Integration best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-team Chronicles for Data Integration.

Acknowledgements

Special thanks to Luis Rivas and Sandrine Riley for continuous support on BDP.

Oracle Sales Cloud REST APIs – Handling Child Objects


Introduction

Oracle Sales Cloud provides a comprehensive set of customization tools and configuration options to implement customer-specific business cases. In this article I would like to put a spotlight on an imaginary situation:

  • The customer is active in an industry like high tech, mechanical engineering, or tooling, and uses Oracle Sales Cloud for their sales processes
  • As a new feature, they would like to capture competitive intelligence for every existing account in Sales Cloud
  • The idea is to allow service technicians to collect information about the installed base of competitor products at their customers' sites
  • This information can be gathered by service technicians via a mobile device (tablet, smartphone, etc.) using a custom application connected to Sales Cloud
  • Sales employees can enrich the information later via the Oracle Sales Cloud UI – the first step is the creation by the post-sales team of a catalogue that forms a base of competitive intelligence
  • We can imagine the benefits of such information in a later sales cycle or for upsell activities

Such an extension can be easily implemented via a custom UI, Sales Cloud Application Composer, and Sales Cloud REST interfaces. For the custom UI, multiple options exist, such as Oracle JET or Ionic/HTML5 with AngularJS. My teammate Angelo Santagata recently published a great article describing the interface between Oracle Sales Cloud and Oracle JET. Please refer to that article if you want to learn more about mobile UI creation via Oracle JET. In this article I'll show a generic approach to what a custom user interface built in HTML and JavaScript could look like. The focus will also be on Sales Cloud Child Objects and addressing them via REST interfaces.

High Level Steps

Before we start implementing such a solution, our customer must be clear about the functional requirements:

  • which competitive data do they want to collect, and how will it be used later?
  • which data are mandatory at the moment new records are created by service engineers, and which data can be enriched later offline?
  • will the competitive information exist in the context of a certain account, or rather represent a standalone data entity?

In a next step, the new extension objects must be set up in App Composer in Sales Cloud. Eventually a new page will be created that allows the creation and maintenance of the desired competitive information. So far no coding has to be done, as all of these steps work on a declarative basis.

These new objects will be addressable by REST APIs, and the creation of a custom UI follows as a technical implementation step. A basic decision has to be made about the technical implementation:

  • is there already a PaaS solution (MCS, JCS, etc.) in use, and should a custom UI run on any of those platforms?
  • what are the preferred devices the service engineers use at their customers' sites?
  • are there any restrictions in terms of roles and policies set up for Accounts and their competitive data (who sees what)?
  • does our customer have the right skills in the development team to create a custom solution?
  • do we have all the REST APIs we need?

Setting up Account related Child Objects in AppComposer

The best way to add Account-related custom information is to create a Child Object in App Composer. At the beginning we must decide whether to create the data structures for this extension as a Child Object or a Standalone Object. Standalone Objects are new objects that exist autonomously as new data entities in OSC, while Child Objects have a fixed relation to an existing Standard Object like Accounts, Addresses, Contacts, Sales Orders, or others. In our specific use case, the definition of a Child Object under Accounts makes most sense, as the additional information is tied to specific accounts and exists in that specific context. Standalone Objects would cause more overhead for creating a relationship to existing Accounts, so there would be no benefit to using them, while Child Objects fulfill exactly the requirements of this business case.

The picture below shows the solution once a Child Object extension for storing competitive intelligence information has been set up. An additional icon represents a custom page to enter and maintain the competitive information for a specific Account. No coding is required, as explained further down in this article: creating such an extension is a purely declarative activity, and the data structures will survive lifecycle maintenance operations.

[Screenshot: custom Competitors icon on the Account page]

Once the Child Objects have been created and the user clicks the custom icon above, a new page opens to enter all known information about competitor products installed at this Account. Only a minimal set of fields is mandatory, such as Competitor Name and a Remark with some qualifying information. A service engineer onsite might be under time pressure or have no access to other sources of competitive intelligence, like pricing. We want him to enter the information he is able to gather – sales employees can enrich it at a later stage. Once a field in the Child Object structure is registered as mandatory, it will also be a required field in the REST structure. For better flexibility, it is sufficient to make only those fields mandatory that really qualify the information.

[Screenshot: competitive information entry page]

The starting point for creation of a Child Object is the Objects menu in App Composer, as shown in the screenshot below. Choose Standard Objects and Accounts to create a custom structure for competitive intelligence related to a customer record.

In our case we provide information for field definitions and access management, as we must enable our service engineers to enter information.

The new Child Objects will exist in a user-specific sandbox, but can also be published to the mainline once the solution is final. Why is the usage of sandboxes crucial for the development of extensions like this? As the name says, a sandbox is an isolated runtime environment for every user subscribing to it. A user can subscribe to any sandbox, but cannot use more than one sandbox at a time. Using sandboxes decreases the risk of harming the entire system: if issues appear as a result of customizations and extensions, they affect only the isolated environment and do not harm other business processes, UIs, or other functionality. Only when a solution in a sandbox has been accurately tested and certified for more common usage should it be published and made available globally, as part of a careful maintenance activity. It's important to mention that all steps for registration of Child Objects and usage of their REST APIs are tied to a specific sandbox, and only those users who have subscribed to the sandbox where they were created benefit from them.

[Screenshot: creating a Child Object in App Composer]

As shown in the screenshot below, we've created a record name containing the competitor's name for our custom record.

[Screenshot: Child Object record name definition]

For the record structure we've chosen the following fields:

  • CompetitorInfo ⇒ context information about the specific record – mandatory, but can be filled with CompetitorName if no other information is available
  • ProductNameInstalled ⇒ name or type of our competitor's product as found on site
  • NumberOfItemsInstalled ⇒ whatever the service engineers see at the customer's site regarding installed competitor items
  • ProductInstallationDate ⇒ if known, this field will hold the installation date
  • ProductExpirationDate ⇒ knowing the expiration date would be useful for our sales team to initiate our own upsell activity
  • ProductValue ⇒ if known, we can enter the value of competitor items sold to our customer

[Screenshot: Child Object field definitions]

We should bear in mind that our service engineers are not usually the audience working with core customer data. For well-fitting access management, we should consider creating a special role for our service engineers and granting them exactly the rights they need to manage competitive information.

[Screenshot: Child Object security settings]

Once done, we're ready to collect the competitive information per account. This is already available, as shown in the screenshot above, meaning the members of the sales team can start entering and maintaining the information. However, the Sales Cloud UI might involve too many interactions for service engineers. For that audience we will create a more simplified UI that creates the information via the REST interface of our Child Object.

Child Object representation in Sales Cloud REST APIs

The REST structure of Sales Cloud objects is documented here. For a full introduction to Sales Cloud REST APIs, refer to Arvind's blog!

By using the URL https://<mysalescloud.oraclecloud.com>/crmCommonApi/resources/latest/accounts/describe we retrieve information about the structure of the Account REST API, as shown below in Google Chrome's Postman extension.

[Screenshot: describe call in Postman]

While the information above is probably known, we might wonder how to address the custom Child Object for competitive information via REST. When editing the Child Object in App Composer, we find the internal name set for the child collection: "CompetitorCollection_c" – this name is usually derived from the Display Name of the Child Object ("Competitor") concatenated with "Collection_c". The child structure in REST uses the same name concatenated with some context information: "OrganizationDEO_CompetitorCollection_c", as shown in the screenshot below.

[Screenshot: child collection in the REST describe output]

Where does this context information come from? As a hint, you might want to check the main page for our Child Object in App Composer. There you will find "OrganizationProfile" as the parent object. This means the Child Objects are all linked to the Organization Profile as additional information for this Standard Object. This link is fixed and cannot be changed to another node in the REST structure hierarchy.

As shown above, we received the entire Account structure in REST by using the "describe" qualifier as part of the URL. Our Child Object description is embedded in that huge output, so we have to search for it. Once found, the declaration looks like this:

...
{
    "rel": "child",
    "href": "https://<mysalescloud.oraclecloud.com>:443/crmCommonApi/resources/11.1.11/accounts/{id}/child/OrganizationDEO_CompetitorCollection_c",
    "name": "OrganizationDEO_CompetitorCollection_c",
    "kind": "collection",
    "cardinality":
    {
        "value": "1 to *",
        "sourceAttributes": "OrganizationProfileId",
        "destinationAttributes": "OrganizationProfile_Id_c"
    }
},
...

Further down in REST structure we find more details about our child object – below shown for the Child Object structure and the field definition for “ProductNameInstalled” as a sample:

...
},
"children": {
...
    "OrganizationDEO_CompetitorCollection_c": 
    {
        "discrColumnType": false,
        "title": "Competitor",
        "titlePlural": "Competitors",
        "attributes": [
        {
            "name": "Id",
            "type": "integer",
            "updatable": false,
            "mandatory": true,
            "queryable": true,
            "allowChanges": "never",
            "precision": 32,
            "hasDefaultValueExpression": true,
            "title": "Record ID",
            "properties": 
            {
                "fnd:FND_AUDIT_ATTR_ENABLED": "false"
            }
        },
        {
                "name": "RowType",
                "type": "string",
        },
        ...
        {
            "name": "ProductNameInstalled_c",
            "type": "string",
            "updatable": true,
            "mandatory": false,
            "queryable": true,
            "allowChanges": "always",
            "precision": 80,
            "title": "Product Name Installed",
            "maxLength": "80",
            "properties": {
            "protectionObjectTitle": "Competitor",
            "fnd:OSN_ENABLED_ATTR": "true",
            "TOOLTIP": "Name of competitors product installed at our clients side",
            "protectionKey": "Competitor_c.ProductNameInstalled_c",
            "DISPLAYWIDTH": "50",
            "description": "Name of competitors product installed at our clients side",
            "protectionState": "TOKENIZED",
            "AttributeType": "Text",
            "ExtnCustom": "Y"
        }
    },
    ...

As the definition shows, the information and fields for every specific Child Object are related to an {id} in the hierarchy. In our case, for the Standard Object "Accounts", the unique key "PartyNumber" of the parent object represents this ID. This means: without knowing the value of "PartyNumber" for a specific Account, we can't address the attached Child Object.
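
Putting this together, the child collection of a specific account is addressed by a URL of this shape, with the PartyNumber serving as the {id}:

https://<mysalescloud.oraclecloud.com>/crmCommonApi/resources/latest/accounts/{PartyNumber}/child/OrganizationDEO_CompetitorCollection_c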

Using the REST interface for Child Objects to view/edit data

With the knowledge above about REST structures and addressing, we will show in a sample how to retrieve and enter data for competitive information via REST.

One choice to evaluate the data would be Postman as a Google Chrome extension. As described in the standard Oracle docs, we can add parameters to our REST call to add some filter conditions. The screenshot below shows a sample where we look for a customer called "Willis Towers". In the JSON result we find the value for PartyNumber (2nd item below): 34014. Now we have the information we need to access our Child Objects!

[Screenshot: account query in Postman]

Those who prefer a command-line call can retrieve the data using curl with the following parameters:

curl -u <user>:<passwd> -H "Content-Type:application/json" \
-H "Accept: application/json" -k -X GET \
https://<mysalescloud.oraclecloud.com>/crmCommonApi/resources/latest/accounts?onlyData=true\&limit=200\&q=OrganizationName="Willis%20Towers"

The resulting JSON will look like this:

{ "items" : [ 
    { 
        "PartyId" : 300000007548087, 
        "PartyNumber" : "34014", 
        "SourceSystem" : null, 
        "SourceSystemReferenceValue" : null, 
        "OrganizationName" : "Willis Towers", 
        "UniqueNameSuffix" : null, 
        "PartyUniqueName" : "Willis Towers", 
        "Type" : "ZCA_PROSPECT", 
        "OwnerPartyId" : 300000006885963, 
        "OwnerPartyNumber" : "31005", 
        "OwnerEmailAddress" : "john.doe@foo.com", 
        "OwnerName" : "John Doe", 
        ...
}

Coming back to our sample of service engineers using a custom application to collect competitive information, we can follow this approach as a sample:

  • Use an identifier as a filter for the Accounts the service engineers are allowed to see and manage in terms of competitive data
  • In our sample below we used the field OwnerName, but in real-life cases any other filter condition would work as well or even better, such as a specific address (customer location = "Chicago"), a specific organization, or similar
  • The custom app will provide a choice list only for the customers the service engineer has been given permission for via the filter
  • We bear in mind that the service engineer's job is the maintenance of our products, so entering competitive intelligence must be straightforward and quick – as said at the top of this article, ideally working as a mobile app on a smartphone

As shown in the Postman screenshot below, we request only PartyNumber and PartyUniqueName in our call. The name value will be shown in the custom app later, while the number will be used to address the Child Object.

[Screenshot: filtered account query in Postman]

The call via curl would look like this:

curl -u <user>:<passwd> -H "Content-Type:application/json" -H "Accept: application/json" -k -X GET \
https://<mysalescloud.oraclecloud.com>/crmCommonApi/resources/latest/accounts?onlyData=true\&limit=200\&q=OwnerName="Ulrich%20Janke"\&fields=PartyNumber,PartyUniqueName

It is worth mentioning that we get an overview of the number of records found, the limit on the number of records in the result set as defined in the REST call, and an indication of whether more records exist:

{ 
    "items" : [ 
        { 
            "PartyId" : 300000007548087, 
            "PartyNumber" : "34014",
            ...
        } ], 
    "count" : 2, 
    "hasMore" : false, 
    "limit" : 200, 
    "offset" : 0, 
    "links" : [ 
        { 
            "rel" : "self", 
            "href" : "https://<mysalescloud.oraclecloud.com>/:443/crmCommonApi/resources/11.1.11/accounts", 
            "name" : "accounts", 
            "kind" : "collection" 
        } ]
...

It's a business decision, but also a question of usability, whether a long list of customers makes sense in the custom UI. It would rather make sense to restrict the list to fewer than 200 entries and to use another filter condition. Solution details are not in the scope of this article, but an SR number or something similar might be a better choice.

Once our service engineer has opened the custom app and entered the competitive intelligence data, the final action is to trigger a REST call that updates the values of the corresponding Child Object, by pressing the Submit button.

For this action two things are important to mention (also explained in detail in Oracle Docs):

  • We must use PATCH as the operation to insert a new Child Object
  • The Content-Type must be “application/vnd.oracle.adf.resourceitem+json”

The screenshots below show a sample REST call in Postman:

[Screenshot: PATCH request in Postman]

[Screenshot: PATCH request body in Postman]

With curl the same REST call would look like this:

curl -u <user>:<passwd> -H "Content-Type: application/vnd.oracle.adf.resourceitem+json" \
 -H "Accept: application/json" -k -X PATCH \
 -d '{
         "RecordName":"Bad & Expensive",
         "CompetitorName_c":"Deal late in 2014",
         "ProductNameInstalled_c":"UJ/890-7896/ABC-890",
         "NumberOfItemsInstalled_c":2,
         "ProductInstallationDate_c":"2014-11-01",
         "ProductExpirationDate_c":"2019-11-01"
     }' \
 https://<mysalescloud.oraclecloud.com>/crmCommonApi/resources/latest/accounts/{partyID}/child/OrganizationDEO_CompetitorCollection_c

Once the call has executed successfully, the server will send back the complete new child record containing the competitive information.

Finally, the screenshot below shows a sample UI written in plain HTML and JavaScript.

[Screenshot: sample competitive information entry UI]

As mentioned in the beginning, any JS framework, including Oracle JET in combination with Oracle Mobile Cloud Service or Java Cloud Service, would make sense to host such a custom app. The reason for providing a generic code sample is that this article focuses on the logical structure for handling access to Child Objects. A plain HTML sample is more compact and might be easier to read, or to reuse in your own test application, than a framework-specific solution.

Below you can find the sample implementation in HTML:

<!DOCTYPE html>
<html>
<head>
    <title>Oracle Sales Cloud - Rest Access for Child Objects</title>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <style>
        form {
            width: 40em;
        }
 
        #h2 {
            font-weight: bold;
            font-size: 150%;
            width: 90%;
            margin-left: 30%;
            margin-bottom: 5%;
            height: 10%;
            float: left;
        }

        input,
        label.input {
            float: left;
            width: 40%;
        }
 
        select,
        select.option {
            width: 45%;
            margin-left: 30%;
        }
 
        select {
            margin: 0 0 1em .2em;
            padding: .2em .5em;
            background-color: #aaeee0;
            border: 1px solid #e7c157;
            font-weight: bold;
        }
 
        input {
            margin: 0 0 1em .2em;
            padding: .2em .5em;
            background-color: #fffbf0;
            border: 1px solid #e7c157;
        }
 
        label.input {
            text-align: right;
            line-height: 1.5;
            font-weight: bold;
        }
 
        label.input::after {
            content: ": ";
        }
 
        button {
            float: right;
            width: 30%;
        }
    </style>
</head>
<body>
    <main>
        <form id="competitiveInfo">
            <label id="h2" form="compInfo">Competitive Information</label>
            <label class="input" for="customerList">Customer</label>
            <select id="customerList" name="customerList"></select>
            <label class="input" for="compInfo">Competitor Name</label>
            <input type="text" id="compName" maxlength="100" required>
            <label class="input" for="compInfo">Additional Competitor Info</label>
            <input type="text" id="compInfo" maxlength="100">
            <label class="input" for="prodName">Product Name/Type</label>
            <input type="text" id="prodName" maxlength="100" required>
            <label class="input" for="numItems">Number of items</label>
            <input type="number" id="numItems" min="1" max="100" required> 
            <label class="input" for="installDate">Installation Date</label> 
            <input type="date" id="installDate"> 
            <label class="input" for="expireDate">Expiration Date</label> 
            <input type="date" id="expireDate"> 
            <button type="submit" id="mySubmit">Submit</button> 
            <button type="reset" id="myReset">Clear</button> 
        </form> 
    </main> 
    <script> 
        var handleSubmit = document.getElementById("mySubmit"); 
        var getCustData = null; 
        var sendCustData = null; 
        handleSubmit.addEventListener ('click', doSubmit); 
        document.addEventListener("DOMContentLoaded", doPastDocLoad);   

        function doPastDocLoad() { 
            var custDataURL = "https://<mysalescloud.oraclecloud.com>/crmCommonApi/resources/latest/accounts?onlyData&limit=200&q=OwnerName=John%20Doe&fields=PartyNumber,PartyUniqueName"; 
            getCustData = new XMLHttpRequest();   
            getCustData.onreadystatechange = processCustDataRequest; 
            getCustData.open( "GET", custDataURL, true ); 
            getCustData.setRequestHeader("content-type", "application/json"); 
            getCustData.setRequestHeader("accept", "application/json"); 
            getCustData.setRequestHeader("Allow-Control-Allow-Origin", "*" ); 
            getCustData.send( null ); 
        }   

        function processCustDataRequest() { 
            if ( getCustData.readyState === XMLHttpRequest.DONE && getCustData.status === 200 ) { 
                var myCusts = getCustData.responseText; 
                var custData = JSON.parse(myCusts);   
                
                if( custData.hasMore ) 
                    alert("More than " + custData.limit + " customers existing! Just showing first " + custData.count + " records ...");   

                var custDataOpts = '';   
               
                if( custData.count === 0 ) { 
                    custDataOpts = '<option value=0>NO CUSTOMER ASSIGNED</option>'; 
                    document.getElementById('customerList').innerHTML = custDataOpts; 
                    document.getElementById('mySubmit').setAttribute("disabled", "true"); 
                } 
                else { 
                    for (var i = 0; i < custData.count; i++) { 
                        custDataOpts += '<option value="'+ custData.items[i].PartyNumber + '">' + custData.items[i].PartyUniqueName + '</option>'; 
                    } 
                    document.getElementById('customerList').innerHTML = custDataOpts; 
                } 
            } 
        }
     
        function doSubmit() { 
            var custList = document.getElementById("customerList"); 
            var partyID = custList.options[custList.selectedIndex].value; 
            var compName = document.getElementById("compName").value; 
            var compInfo = document.getElementById("compInfo").value; 
            var prodName = document.getElementById("prodName").value; 
            var numItems = document.getElementById("numItems").value; 
            var installDate = document.getElementById("installDate").value; 
            var expireDate = document.getElementById("expireDate").value;
         
            if ( compInfo === '') 
                compInfo = compName;   
            
            var patchBodyString = '{ "RecordName": "' + compName + '",' + '"CompetitorName_c": "' + compInfo + '",' + '"ProductNameInstalled_c": "' + prodName + '",
                   ' + '"NumberOfItemsInstalled_c": ' + numItems; 
            
            if (installDate.toString() !== '') 
                patchBodyString = patchBodyString + ', "ProductInstallationDate_c": "' + installDate + '"'; 

            if (expireDate.toString() !== '') 
                patchBodyString = patchBodyString + ', "ProductExpirationDate_c": "' + expireDate + '"'; 

            patchBodyString = patchBodyString + ' }'; 
            var patchBody = JSON.parse(patchBodyString);   // sanity check: throws if the assembled string is not valid JSON 
            
            try { 
                sendCustData = new XMLHttpRequest(); 
                sendCustData.onreadystatechange = procCompUpdReq; 
                var updCustDataURL = "https://<mysalescloud.oraclecloud.com>/crmCommonApi/resources/latest/accounts/" + partyID + "/child/OrganizationDEO_CompetitorCollection_c"; 
                sendCustData.open( "PATCH", updCustDataURL, true );   
                sendCustData.setRequestHeader("Content-Type", "application/vnd.oracle.adf.resourceitem+json"); 
                sendCustData.setRequestHeader("Allow-Control-Allow-Origin", "*" ); 
                sendCustData.send( patchBodyString );   // send the JSON string; passing the parsed object would transmit "[object Object]" 
            }   
        
            catch ( e ) { 
                if ( e instanceof TypeError ) 
                    console.log("TypeError occurred!");   

                // DOMException subtypes are identified by name, not by global constructors 
                if ( e.name === "SecurityError" ) 
                    console.log("SecurityError occurred!");   

                if ( e.name === "InvalidAccessError" ) 
                    console.log("InvalidAccessError occurred!");   

                console.log(e.message); 
                console.log(e.name); 
                console.log(e.fileName);      // non-standard, Firefox only 
                console.log(e.lineNumber);    // non-standard, Firefox only 
                console.log(e.columnNumber);  // non-standard, Firefox only 
                console.log(e.stack); 
                alert("Failure: " + e.message); 
            } 
        }   
        
        function procCompUpdReq() { 
            if ( sendCustData.readyState === XMLHttpRequest.DONE ) { 
                var respText = sendCustData.responseText; 
                var respData = JSON.parse(respText);   

                if (sendCustData.status !== 200) { 
                    alert("Failed! Status= " + sendCustData.status + " readyState=" + sendCustData.readyState); 
                } 
                else { 
                    alert("Competitive Information successfully updated!"); 
                } 
            } 
        } 
    </script> 
</body> 
</html>

Troubleshooting

In this article I didn’t handle some topics as they were already mentioned in Angelo’s blog post about Oracle JET and Sales Cloud:

  • Security considerations and login are well explained in that blog and are also valid for the solution shown above. In my explanations above I didn't provide any special instructions for user authentication and authorization. Please follow the instructions in Angelo's blog.
  • When testing the custom app above from a development environment like Google Chrome you might run into some serious issues with CORS and pre-flight checks. By default OSC won’t answer OPTIONS calls that are sent in advance of a PATCH operation when testing from a browser. As a workaround you might follow the instructions in Angelo’s blog and deploy the code to a registered server.

Cloud Security: Using Fusion Application Web Services with Message Protection


Introduction

Oracle Fusion Applications offers a number of WebServices to allow other applications to incorporate the Fusion Applications functionality. To prevent data leakage, these WebServices follow a common security pattern that requires access authentication and message protection using message signing and/or message encryption.

To use such a WebService, the WSDL of each service provides all the information that tells the WebService client what needs to be provided to call it successfully.

For Fusion Applications, nearly all WebServices use message protection, i.e., message signing and/or message encryption, to ensure that the message arrives as it has been sent by the client. Like many Oracle products, Fusion Applications implements this by using Oracle WebService Manager (OWSM). OWSM also publishes the WebService’s base64-encoded public key certificate in the WSDL.

This article explains how to get this public key certificate and its related signing root certificate, and how to put them into the correct keystore.

Background

WebServices are a well-known technique used by many applications to expose APIs that allow building bigger applications in service-oriented architectures (SOA). WebServices may receive or send sensitive information and should be secured to avoid unauthorized usage. To let a WebService developer work on the service implementation only, Oracle provides a security layer for WebServices called Oracle Web Service Manager (OWSM). This layer allows service administrators to configure security measures – profiles in OWSM parlance – at runtime. The WebService security profile is implemented by OWSM adapters that intercept the WebService's incoming and outgoing traffic, freeing developers from implementing all possible security measures and allowing them to focus fully on the service implementation.

Certificates, Certificates, Certificates

Well secured WebServices require a number of certificates for proper message protection and secured transport. For Fusion Application Web Services these certificate types are very common:

  • Transport Level Security Certificate – The certificate used for securing the transport level (i.e., HTTPS). It can easily be retrieved from the browser session when inspecting the WSDL file. It is usually stored in the JDK truststore.
  • The Owner Certificate – This certificate is part of the WSDL, is used for message signing and/or encryption, and may be installed in the client keystore.
  • The Issuer Certificate – (Optional) The Issuer Certificate name is part of the WSDL description. In later versions of OWSM, the Issuer Certificate is included in the WSDL, too.

Finding the OWSM Security Policy

A WebService WSDL protected by OWSM lists the security policies used for the WebService. The implementer of a client for the WebService can easily spot the security-related content.

The <wsp:Policy> tags may include OWSM policies, if the wsu:Id attribute specifies OWSM policy names like these:

<wsp:Policy wsu:Id="wss11_saml_or_username_token_with_message_protection_service_policy">
<wsp:Policy wsu:Id="wss11_saml_token_with_message_protection_client_policy">

If the <wsp:Policy wsu:Id> attribute includes the text message_protection, the related information, i.e., the certificates, for the message protection policies must be found.

Message protection uses X.509 certificates for a public key which can be used to encrypt and/or sign the SOAP message (see X.509 for a detailed description). An OWSM protected WebService may include one or more X.509 certificates, which can be found within the <wsdl:service> tag.

<wsdl:service name="FinancialUtilService">

The <wsdl:service> tag has a few subtags. The most interesting is the <wsid:Identity> tag.

<wsid:Identity>
 <dsig:KeyInfo>
  <dsig:X509Data>
   <dsig:X509Certificate>MIICUDCCAbmgAwIBAgIIcIrTEM228yQwDQYJKoZIhvcNAQEFBQAw
    VzETMBEGCgmSJomT8ixkARkWA2NvbTEWMBQGCgmSJomT8ixkARkWBm9yYWNsZTEVMBMGCgmSJomT8ix
    ...
    uJZwkAwdUZXpk7GfIo136l6wQDtmCl/k=</dsig:X509Certificate>
   <dsig:X509IssuerSerial>
    <dsig:X509IssuerName>CN=Cloud9CA, DC=cloud, DC=oracle, DC=com</dsig:X509IssuerName>
    <dsig:X509SerialNumber>8109526148158255908</dsig:X509SerialNumber>
   </dsig:X509IssuerSerial>
   <dsig:X509SubjectName>CN=FAEncryption, DC=cloud, DC=oracle, DC=com</dsig:X509SubjectName>
   <dsig:X509SKI>epsQzG3qkIZbd7Ia5NzRiQDfb3g=</dsig:X509SKI>
  </dsig:X509Data>
 </dsig:KeyInfo>
</wsid:Identity>

The important tags here are <dsig:X509Certificate>, <dsig:X509IssuerName>, <dsig:X509SubjectName>. The tag <dsig:X509Certificate> holds the actual certificate required for message protection. The tag <dsig:X509SubjectName> specifies the name of the certificate. And finally, the tag <dsig:X509IssuerName> specifies the name of the certificate that was used for signing the certificate in <dsig:X509Certificate>. This is usually the name of the root certificate.

If the <dsig:X509SubjectName> and the <dsig:X509IssuerName> match, the <dsig:X509Certificate> is a self-signed certificate and only this certificate is needed.

Extract the Certificate

Once the certificate has been located, it needs to be extracted and stored in a Java keystore (jks file). To do this, select the value between <dsig:X509Certificate> and </dsig:X509Certificate> and copy it into an editor. Before saving the content to a file, add a line -----BEGIN CERTIFICATE----- before the certificate and a line -----END CERTIFICATE----- after it. It should look like this:

-----BEGIN CERTIFICATE-----
MIICUDCCAbmgAwIBAgIIcIrTEM228yQwDQYJKoZIhvcNAQEFBQAwVzETMBEGCgmSJomT8ixkARkWA2Nv
bTEWMBQGCgmSJomT8ixkARkWBm9yYWNsZTEVMBMGCgmSJomT8ixkARkWBWNsb3VkMREwDwYDVQQDEwhD
...
OfGDtW/MLQpL2i8dL+SgEmjGUGtZuqEojTRE1IB/G+UuJZwkAwdUZXpk7GfIo136l6wQDtmCl/k=
-----END CERTIFICATE-----

When done, this should be stored in a file (for example owner_cert.cer).

Setting up the Keystore

Next, this certificate should be imported into a JKS keystore file. Although there are many tools for doing this, the standard JDK keytool is a reliable tool for this task. To import the certificate, use the command keytool -importcert.

Note: Searching the Internet for a certificate import procedure often turns up a two-step process. However, even if no keystore file exists yet, the keytool -importcert command creates a keystore containing just the new certificate.
$ keytool -importcert -trustcacerts -alias orakey -keystore client.jks -file owner_cert.cer
Enter keystore password:
Re-enter new password:
Owner: CN=service, DC=us, DC=oracle, DC=com
Issuer: CN=CertGenCA, OU=FOR TESTING ONLY, O=MyOrganization, L=MyTown, ST=MyState, C=US
Serial number: 15633202a8b
Valid from: Thu Jul 28 22:09:21 CEST 2016 until: Tue Jul 27 22:09:21 CEST 2021
Certificate fingerprints:
         MD5:  B3:58:A8:61:A1:97:A2:DB:A6:5F:B3:EB:36:41:87:73
         SHA1: 9A:95:96:23:60:06:55:30:17:58:51:75:AF:2D:A4:A0:AF:65:1F:B9
         SHA256: EC:48:17:95:E6:6C:3A:7D:29:22:3E:21:9A:60:43:06:F5:57:DF:A6:E8:0B:FD:B9:4B:07:8B:E6:6A:73:35:FE
         Signature algorithm name: SHA256withRSA
         Version: 3

Extensions:

#1: ObjectId: 2.5.29.14 Criticality=false
SubjectKeyIdentifier [
KeyIdentifier [
0000: A8 67 A3 DA E8 52 2E D6   0D 07 93 83 96 3F 9E 09  .g...R.......?..
0010: EF E8 2F 56                                        ../V
]
]

Trust this certificate? [no]:  yes
Certificate was added to keystore
$

The values of Owner and Issuer match their respective values of the <dsig:X509SubjectName> and <dsig:X509IssuerName> tags.

Owner vs Issuer

Owner and Issuer are the values that help to identify a certificate. The Owner is the party the certificate was issued to, and the Issuer is the party that created (signed) the certificate. Both values can match or be distinct. If they match, we have a root or self-signed certificate.

Root certificates are issued by a Certificate Authority (CA). These play a role similar to government authorities that issue ID cards and passports: they can be trusted. A self-signed certificate, on the other hand, can be issued by anyone and may not be trusted.

If the imported certificate has distinct values for Owner and Issuer, at least two different certificates are required:

  • the certificate of the Owner
  • the certificate of the Issuer

Normally, OWSM includes the Owner certificate in the WSDL file.

Getting the Issuer Certificate

If the Owner certificate is not a self-signed certificate there are two options to get the Issuer certificate:

Getting the Issuer Certificate From the WSDL

In later versions of OWSM, the Issuer certificate can be included in the WSDL, too. Here is what the <wsid:Identity> tag of such a WSDL looks like:

<wsid:Identity>
 <dsig:KeyInfo>
  <dsig:X509Data>
   <dsig:X509Certificate>
    MIIDbDCCAlSgAwIBAgIGAVYzICqLMA0GCSqGSIb3DQEBCwUAMHgxCzAJBgNVBAYTAlVTMRAwDgY
    DVQQIEwdNeVN0YXRlMQ8wDQYDVQQHEwZNeVRvd24xFzAVBgNVBAoTDk15T3JnYW5pemF0aW9uMR
    ...
    yHnI/gfr19XWPAtSWVr0XqkTKmBtdtw4AwmEZB5bF08PIh+Ew==</dsig:X509Certificate>
   <dsig:X509IssuerSerial>
    <dsig:X509IssuerName>CN=CertGenCA, OU=FOR TESTING ONLY, O=MyOrganization, L=MyTown, ST=MyState, C=US</dsig:X509IssuerName>
    <dsig:X509SerialNumber>1469736561291</dsig:X509SerialNumber>
   </dsig:X509IssuerSerial>
   <dsig:X509SubjectName>CN=service, DC=us, DC=oracle, DC=com</dsig:X509SubjectName>
   <dsig:X509SKI>qGej2uhSLtYNB5ODlj+eCe/oL1Y=</dsig:X509SKI>
   <dsig:X509Certificate>
    MIIDvzCCAqegAwIBAgIQQARIhsRB7ztkOoBmQJr8oDANBgkqhkiG9w0BAQsFADB4MQswCQYDVQQ
    GEwJVUzEQMA4GA1UECAwHTXlTdGF0ZTEPMA0GA1UEBwwGTXlUb3duMRcwFQYDVQQKDA5NeU9yZ2
    ...
    4OTPTZgMX</dsig:X509Certificate>
  </dsig:X509Data>
 </dsig:KeyInfo>
</wsid:Identity>

The second <dsig:X509Certificate> tag may hold the certificate of the Issuer. This certificate should be copied into a file as described above. To import the Issuer certificate see the import command in the Developer Tasks below.

Getting the Issuer Certificate From the Administrator

The Administrator can be any person who manages a Fusion Applications environment, either on-premises or in Oracle Cloud. The Administrator steps are the same for both installation options. The steps for the WebService client developer are similar but involve different routes:

  • Cloud – File a Service Request on My Oracle Support
  • On-premises – File a Service Request internally

The steps for the Administrator are as follows:

  • Open the Domain FMW Console
  • Navigate to WebLogic Domain > DomainName > Security > Keystore > system >
  • Select castore
  • Click on Manage
  • Find the certificate which has the matching Issuer name (in column Subject Name)
  • Select the certificate
  • Click on Export
  • Click on Export Certificate
  • Send the file to the WebService client developer

Developer Tasks

A developer needs to do these steps:

  • Ask the Administrator to get the certificate whose Subject Name matches the value of the Issuer
  • When the Issuer certificate has been received from the Administrator, these steps are needed:
    • Import the Issuer certificate into the client keystore:
      $ keytool -importcert -trustcacerts -alias democa -keystore client.jks -file issuer_cert.cer
      Enter keystore password:
      Owner: CN=CertGenCA, OU=FOR TESTING ONLY, O=MyOrganization, L=MyTown, ST=MyState, C=US
      Issuer: CN=CertGenCA, OU=FOR TESTING ONLY, O=MyOrganization, L=MyTown, ST=MyState, C=US
      Serial number: 40044886c441ef3b643a8066409afca0
      Valid from: Sat Dec 01 04:07:51 CET 2012 until: Thu Dec 02 04:07:51 CET 2032
      Certificate fingerprints:
               MD5:  F2:33:C1:AF:A6:95:8B:A3:5C:CE:DF:0D:16:05:07:AD
               SHA1: CA:61:71:5B:64:6B:02:63:C6:FB:83:B1:71:F0:99:D3:54:6A:F7:C8
               SHA256: 57:10:7C:2C:B3:07:B9:8B:F8:FD:EB:69:99:36:53:03:7A:E1:E7:CB:D3:7A:E7:CF:30:F3:B3:ED:F3:42:0A:D7
               Signature algorithm name: SHA256withRSA
               Version: 3
      
      Extensions:
      
      #1: ObjectId: 2.5.29.19 Criticality=true
      BasicConstraints:[
        CA:true
        PathLen:1
      ]
      
      #2: ObjectId: 2.5.29.15 Criticality=true
      KeyUsage [
        Key_CertSign
      ]
      
      #3: ObjectId: 2.5.29.14 Criticality=false
      SubjectKeyIdentifier [
      KeyIdentifier [
      0000: 34 38 FD 45 D8 80 CF C7   D2 E8 DF 1D F8 A1 39 B0  48.E..........9.
      0010: 11 88 00 6A                                        ...j
      ]
      ]
      
      Trust this certificate? [no]:  yes
      Certificate was added to keystore
      $
      

Using the Certificate

Finally, when every certificate is in place, the WebService client code can be written in any programming language, but it should use the certificates stored in the client keystore when calling the WebService.

Java code for OWSM Client

One of the best ways to implement a Java Client is to use the JDeveloper WebService Proxy generator. The code uses the OWSM client libraries and frees the developer from coding the details for the security policies.

Once the code has been created, the following code lines help to get started. (Your mileage for the authentication may vary; for simplicity, username/password authentication is used here.)

// (Optional) If the SSL certificate is not present in the standard truststore,
// we may override it with these lines:
if (overrideTruststore) {
  System.setProperty("javax.net.ssl.trustStore", trustStore);
  System.setProperty("javax.net.ssl.trustStorePassword", trustStorePassword);
}
// ...
WSBindingProvider wsbp = (WSBindingProvider)service;
Map<String, Object> requestContext = wsbp.getRequestContext();
requestContext.put(BindingProvider.USERNAME_PROPERTY, username);
requestContext.put(BindingProvider.PASSWORD_PROPERTY, password);
requestContext.put(BindingProvider.ENDPOINT_ADDRESS_PROPERTY, endPointURL);
requestContext.put(ClientConstants.WSSEC_KEYSTORE_TYPE, "JKS");
requestContext.put(ClientConstants.WSSEC_KEYSTORE_LOCATION, clientJKS);
requestContext.put(ClientConstants.WSSEC_KEYSTORE_PASSWORD, clientJKSPassword);

Note: If the OWSM client library finds an Owner certificate in the WSDL, it may override the certificate stored in the WSSEC_KEYSTORE_LOCATION keystore. This is a pretty useful feature and frees the client developer from importing the Owner certificate. However, if the keystore is shared between different WebService client implementations, it must contain both certificates. In any case, the Issuer certificate must be present in the WSSEC_KEYSTORE_LOCATION keystore.


Oracle Service Cloud – Outbound Integration Approaches


Introduction

This blog is part of the series of blogs the A-Team has been running on Oracle Service Cloud (Rightnow).

In the previous blogs we went through various options for importing data into Service Cloud. In this article I will first describe two main ways of subscribing to outbound events, as data is created/updated/deleted in Rightnow. These notifications are real-time and meant only for real-time or online use-cases.
Secondly, I will briefly discuss a few options for bulk data export.

This blog is organized as follows :

  1. Event Notification Service (ENS) – The recently introduced mechanism for receiving outbound events
    • a. Common Setup Required – for using ENS
    • b. Registering a Generic Subscriber with ENS
    • c. Using Integration Cloud Service – the automated way of subscribing to ENS
  2. Rightnow Custom Process Model(CPM) – The more generic, PHP-cURL based outbound invocation mechanism
  3. Bulk Export
    • a. Rightnow Object Query Language (ROQL) and ROQL based providers
    • b. Rightnow Analytics Reports
    • c. Third-party providers

1. The Event Notification Service

Since the May 2015 release, Rightnow has a new feature called the Event Notification Service, documented here.
This service currently allows any external application to subscribe to Create/Update/Delete events for Contact, Incident and Organization objects in Service Cloud. More objects/features may be added in upcoming releases.

I will now demonstrate how to make use of this service to receive events. Essentially there are two ways, using the Notification Service as is (the generic approach) or via Integration Cloud Service (ICS).

a. Common Setup

In order to receive event notifications the following steps have to be completed in the Rightnow Agent Desktop. These steps need to be completed for both generic as well as the ICS approaches below.

  1. In the Agent Desktop go to Configuration -> Site Configuration-> Configuration Settings. In the Search page that comes up, in the ‘Configuration Base’ section select ‘Site’ and click Search.
  2. In the ‘Key’ field enter ‘EVENT%’ and click Search.
  3. Set the following keys:
    • EVENT_NOTIFICATION_ENABLED – Set it to ‘Yes’ for the Site. This is the global setting that enables ENS.
    • EVENT_NOTIFICATION_MAPI_USERNAME – Enter a valid Service Cloud username.
    • EVENT_NOTIFICATION_MAPI_PASSWORD – Enter the corresponding password.
    • EVENT_NOTIFICATION_MAPI_SEC_IP_RANGE – This can be used for specifying whitelisted subscriber IP Addresses. All IPs are accepted if kept blank.
    • EVENT_NOTIFICATION_SUBSCRIBER_USERNAME – Enter the Subscriber service’s username. ENS sends these credentials as part of the outgoing notification, in the form of a WS-Security Username-Password token.
    • EVENT_NOTIFICATION_SUBSCRIBER_PASSWORD – Enter the password.

01

b. Registering a Generic Subscriber

Now that the Event Notifications have been enabled, we need to create a subscriber and register it. The subscriber endpoint should be reachable from Rightnow, and in most cases any publicly available endpoint should be good.

For the purpose of this blog I defined a generic subscriber by creating a Node.js based Cloud9 endpoint accessible at https://test2-ashishksingh.c9users.io/api/test . It’s a dummy endpoint that accepts any HTTP POST and prints the body on the Cloud9 terminal. It doesn’t require any authentication either.
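
As a rough illustration of such a catch-all subscriber, the sketch below uses the JDK’s built-in HTTP server; the port is an assumption, and no authentication or WS-Security validation is performed:

import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class DummySubscriber {
    public static void main(String[] args) throws Exception {
        // Listen on /api/test, mirroring the Cloud9 endpoint above (the port is an assumption)
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/api/test", exchange -> {
            try (InputStream in = exchange.getRequestBody()) {
                // Print the SOAP event notification that Rightnow sends
                System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8));
            }
            exchange.sendResponseHeaders(200, -1); // acknowledge with an empty response
            exchange.close();
        });
        server.start();
    }
}

In practice the endpoint must be reachable from Rightnow, and it should validate the WS-Security UsernameToken carrying the SUBSCRIBER credentials configured above.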

In order to register this endpoint, following steps must be followed :

  1. Rightnow manages subscriptions by using an object called ‘EventSubscription’. By instantiating this object an ‘endpoint’ can be registered as a subscriber to listen to an object (Contact/Organization/Incident) for a particular operation (Create/Update/Delete). The object also tracks the username/password to be sent to the endpoint as part of the notification.
  2. In order to create an EventSubscription object the usual Connect Web Services Create operation can be used. Below is a sample XML request payload for the Create operation, that registers a Contact Update event to the Cloud9 endpoint.
  3. <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:v1="urn:messages.ws.rightnow.com/v1_3" xmlns:v11="urn:base.ws.rightnow.com/v1_3">
       <soapenv:Body>
          <v1:Create>
             <v1:RNObjects xmlns:ns4="urn:objects.ws.rightnow.com/v1_3" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="ns4:EventSubscription"> <!--specify the subscription object-->
    			<ns4:EndPoint>https://test2-ashishksingh.c9users.io/api/test</ns4:EndPoint> <!--endpoint info-->
    			<ns4:EventType>
    				<ID id="2" xmlns="urn:base.ws.rightnow.com/v1_3" /> <!--1=Create,2=Update,3=Delete-->
    			</ns4:EventType>
    			<ns4:IntegrationUser>
    				<ID id="1" xmlns="urn:base.ws.rightnow.com/v1_3" /> <!--1 = the seeded SUBSCRIBER_USERNAME and PWD above-->
    			</ns4:IntegrationUser>
    			<ns4:Name>TestContactSubscription</ns4:Name>  <!--Name of the subscription-->
    			<ns4:ObjectShape xsi:type="Contact"/>   <!--Name of the object to subscribe-->
    			<ns4:Status>
    				<ID id="1" xmlns="urn:base.ws.rightnow.com/v1_3" /> <!--1=Active,2=Paused,3=Inactive-->
    			</ns4:Status>
             </v1:RNObjects>
          </v1:Create>
       </soapenv:Body>
    </soapenv:Envelope>

     
    Note: The OWSM security policy username_token_over_ssl_client_policy can be used to invoke the web service, passing valid Rightnow credentials. However, the SOAP Security Header shouldn’t contain a Timestamp element; Rightnow will discard requests containing a Timestamp element in the SOAP Header.

  4. That’s it. The endpoint is now registered, and whenever a contact is updated, Rightnow will invoke the registered endpoint with details. The message sent out is an XML SOAP message that contains object/event details and conforms to the Rightnow Event WSDL available at https:///cgi-bin/.cfg/services/soap?wsdl=event . This message also contains the SUBSCRIBER_USERNAME/PWD in the SOAP Header, in the form of a WS-Security UsernameToken. For now our Cloud9 endpoint doesn’t care about validating the Username token.
  5. In order to test, let’s update a Contact in Agent Desktop
  6. 02

  7. Voila! We see the corresponding EventNotification XML message in the Cloud9 console.
  8. 03

    For reference I have attached the formatted XML message here.

c. Using ICS Service Cloud Adapter

The Oracle Integration Cloud Service (ICS), the tool of choice for SaaS integrations, automates all of the steps in section 1.b above into a simple GUI-based integration definition.
Below are the steps for receiving Rightnow events in ICS. It is assumed that the reader is familiar with ICS and knows how to use it.
Please note that the steps in section 1.a still need to be followed, and this time the SUBSCRIBER_USERNAME/PWD ‘Configuration Setting’ should be the ICS account’s username/password.

  1. Create and save an Oracle Rightnow connection in ICS.
  2. Create an Integration by the name ‘receive_contacts’. For this blog I chose the ‘Publish to ICS’ integration type.
  3. 05

  4. Open the integration and drag the Rightnow connection on the source-side. Name the endpoint and click ‘Next’
  5. 06

  6. On the ‘Request’ page select ‘Event Subscription’, and select the desired event. Click Next.
  7. 07

  8. On the ‘Response’ page select ‘None’, although you could select a callback response if the use-case required it. Click Next.
  9. 08

  10. Click ‘Done’. Complete the rest of the integration and activate it.
  11. 09

  12. During activation ICS creates an endpoint and registers it as an EventSubscription object, as described in section 1.b above. But all of that happens in the background, providing a seamless experience to the user.
  13. If a Contact is updated in Agent Desktop now, we’d receive it in ICS.
  14. 10

2. Rightnow Custom Process Model

As discussed above, the Event Notification Service supports only Contact, Organization and Incident objects. But sometimes use-cases may require Custom Objects or other Connect Common Objects. In such cases Service Cloud’s Custom Process Model feature can be used for outbound events. I will now describe how to use them.

First, a few key terms:

  • Object Event Handler : A PHP code snippet that is executed whenever Create/Update/Delete events occur in the specified Rightnow objects. The snippet is used to invoke external endpoints using the cURL library.
  • Process Designer / Custom Process Model (CPM) : A component of the Rightnow Agent Desktop that is used to configure Object Event Handlers.

Below are the steps :

  1. Using any text editor, create a file called ContactHandler.php (or any other name) with the following code. The code basically defines a Contact create/update handler, loads the PHP cURL module, and invokes a web service I wrote using Oracle BPEL. I have provided explanations at various places in the code as ‘[Note] :’
  2. <?php
    /**
     * CPMObjectEventHandler: ContactHandler // [Note] : Name of the file.
     * Package: RN
     * Objects: Contact // [Note] : Name of the object.
     * Actions: Create, Update // [Note] : Name of the operations on the object above for which the PHP code will be executed
     * Version: 1.2 // [Note] : Version of the Rightnow PHP API
     * Purpose: CPM handler for contact create and update. It invokes a web service.
     */
    use \RightNow\Connect\v1_2 as RNCPHP;
    use \RightNow\CPM\v1 as RNCPM; 
    /**
     * [Note] : Below is the main code, defining the handler class for the CPM . Like java, the class name should match the file name, and it implements the ObjectEventHandler class. The 'use' statements above define aliases for the \RightNow\Connect\v1_2 'package' .
     */
    class ContactHandler implements RNCPM\ObjectEventHandler
    {
        /**
         * Apply CPM logic to object.
         * @param int $runMode
         * @param int $action
         * @param object $contact
         * @param int $cycles
         */
    // [Note] : Below is the actual function that gets executed on Contact Create/Update.
        public static function apply($runMode, $action, $contact, $cycle)
        {
            if($cycle !== 0) return ;
    		// [Note] : The snippet below declares the URL and the XML Payload to be invoked
                $url = "http://10.245.56.67:10613/soa-infra/services/default/RnContact/bpelprocess1_client_ep?WSDL" ;
                $xml = '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
            <soap:Header>
                    <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" mustUnderstand="1">
                <wsse:UsernameToken>
                    <wsse:Username>HIDDEN</wsse:Username>
                    <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">HIDDEN</wsse:Password>
                </wsse:UsernameToken>
            </wsse:Security>
            </soap:Header>
            <soap:Body>
                    <ns1:process xmlns:ns1="http://xmlns.oracle.com/Application6/RnContact/BPELProcess1">
                            <ns1:input>'.$contact->Name->First.' '.$contact->Name->Last .'</ns1:input>
            </ns1:process>
        </soap:Body>
    </soap:Envelope>' ;
      
    
                $header[0]= "Content-Type: text/xml;charset=UTF-8";
                $header[1]= 'SOAPAction: "process"';
    			
    			// [Note] :The invocation requires and makes use of the cURL module.
                load_curl();
                $curl = curl_init();
                curl_setopt_array($curl,array(
                  CURLOPT_URL => $url,            
                  CURLOPT_HEADER => 0,
                  CURLOPT_HTTPHEADER => $header,  
                  CURLOPT_FOLLOWLOCATION => 1, 
                  CURLOPT_RETURNTRANSFER => 1,
                  CURLOPT_CONNECTTIMEOUT => 20,
                  CURLOPT_SSL_VERIFYPEER => 0,
                  CURLOPT_SSL_VERIFYHOST => 0,
     
                ));
                curl_setopt($curl,CURLOPT_POST,TRUE);
                curl_setopt($curl,CURLOPT_POSTFIELDS, $xml);
                $content = curl_exec($curl);
        }
    }
    /**
     * CPM test harness
     */
    // [Note] : These are unit test functions, needed by the RN PHP framework.
    class ContactHandler_TestHarness
            implements RNCPM\ObjectEventHandler_TestHarness
    {
    static $contactOneId = null;
    static $contactTwoId = null;
        /**
         * Set up test cases.
         */
        public static function setup()
        {
            // First test
            $contactOne = new RNCPHP\Contact;
            $contactOne->Name->First = "First";
            $contactOne->save();
            self::$contactOneId = $contactOne->ID;
            // Second test
            $contactTwo = new RNCPHP\Contact;
            $contactTwo->Name->First = "Second";
            $contactTwo->save();
            self::$contactTwoId = $contactTwo->ID;
        }
        /**
         * Return the object that we want to test with. You could also return
         * an array of objects to test more than one variation of an object.
         * @param int $action
         * @param class $object_type
         * @return object | array
         */
        public static function fetchObject($action, $object_type)
        {
            $contactOne = $object_type::fetch(self::$contactOneId);
            $contactTwo = $object_type::fetch(self::$contactTwoId);
            return array($contactOne, $contactTwo);
        }
        /**
         * Validate test cases
         * @param int $action
         * @param object $contact
         * @return bool
         */
        public static function validate($action, $contact)
        {
            echo "Test Passed!!";
            return true;
        }
        /**
         * Destroy every object created by this test. Not necessary since in
         * test mode and nothing is committed, but good practice if only to
         * document the side effects of this test.
         */
        public static function cleanup()
        {
            if (self::$contactOneId)
            {
                $contactOne = RNCPHP\Contact::fetch(self::$contactOneId);
                $contactOne->destroy();
                self::$contactOneId = null;
            }
            if (self::$contactTwoId)
            {
                $contactTwo = RNCPHP\Contact::fetch(self::$contactTwoId);
                $contactTwo->destroy();
                self::$contactTwoId = null;
            }
        }
    }
    ?>
  3. Log on to Agent Desktop. Click on Configuration-> Site Configuration-> Process Designer, and click ‘New’.
  4. 11

  5. Upload the ContactHandler.php file. Check the ‘Execute Asynchronously’ checkbox; the lib_curl module is available for async CPMs only.
  6. 12

  7. Click ‘Save’ on the Home Ribbon, and then click the ‘Test’ button. Clicking Test executes the ‘validate’ function in the code. Make sure it executes fine and that the output looks OK.
  8. 13

  9. Click OK, followed by ‘Yes’, and then Save again. Now go to the Contact object under OracleServiceCloud and assign the newly created ContactHandler to the Create and Update events. Then Save again.
  10. 14

  11. Now click ‘Deploy’ on the Ribbon to upload and activate all the changes to the RN Server
  12. 15

  13. In order to test, create a new contact called ‘John Doe’ in Service Cloud, and the BPEL process gets instantiated.
  14. 16

This ends our discussion on configuring and consuming outbound real-time events. Before moving on to bulk data export, note that Rightnow event subscriptions and CPMs are inherently transient: durable subscriptions are not available, although Rightnow does have a retry mechanism with exponential back-off for error scenarios.
If durability is a key requirement, the subscriber must be made highly available and durability must be built into the subscriber design, for example by persisting messages in a queue immediately upon receiving them.

3. Bulk Export

So far we have discussed various ways of receiving real-time events/notifications from Rightnow. These can be used for online integration scenarios, but not for bulk-export use-cases.
We’ll now discuss a few options for bulk export:

a. ROQL

ROQL, or Rightnow Object Query Language, is the simplest tool for extracting data, using SQL-like queries against Rightnow.
ROQL queries can be executed using Connect Web Services, Connect REST Services, and Connect PHP Services.

ROQL comes in two flavors, Object Query and Tabular Query:

  • Object Query : This is when Rightnow Objects are returned as response to the query. This is the simpler form of queries, available in SOAP API as the QueryObjects operation, or in REST API as the ?q= URL parameter.
  • Tabular Query : Tabular queries are more advanced queries, which allow operators like ‘ORDER BY’, USE, aggregate functions, limits on returned items, pagination, etc. These are available in the SOAP API as the queryCSV operation, or in the REST API as the queryResults resource.

Between the two, Tabular Query is the more efficient way of extracting data, as it returns the required dataset in a single database query. Two great resources to get started on tabular queries are the A-Team blogs here and here. They explain how to use SOAP and REST-based Tabular queries to extract data from Service Cloud and import into Oracle BI Cloud Service.
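
To give a flavor of the REST variant, below is a minimal Java sketch that runs a tabular query against the queryResults resource using Basic authentication; the instance URL, interface version, credentials, and query text are all assumptions:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RoqlTabularQuery {
    public static void main(String[] args) throws Exception {
        // Hypothetical Service Cloud instance and ROQL tabular query
        String roql = "SELECT ID, LookupName FROM Contacts LIMIT 100";
        String url = "https://mysite.custhelp.com/services/rest/connect/v1.3/queryResults"
                + "?query=" + URLEncoder.encode(roql, StandardCharsets.UTF_8);

        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        String creds = Base64.getEncoder()
                .encodeToString("user:password".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + creds);

        // The response is a JSON document containing the tabular result set
        try (InputStream in = conn.getInputStream()) {
            System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8));
        }
    }
}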

b. Analytics Report

For more advanced querying needs Rightnow Analytics Reports can be defined in the Agent Desktop, and they can be executed using the SOAP RunAnalyticsReport operation, or REST analyticsReportResults resource.

c. Third Party Providers

A number of third-party providers, including the Progress ODBC and JDBC drivers, also allow bulk extract of Rightnow data. These providers internally use the same ROQL-based approach, but provide a higher level of abstraction by automating pagination and other needs.

Conclusion

In this blog we looked at a couple of ways to receive outbound events from Service Cloud, and how ICS can be used to seamlessly receive the events in a UI-driven fashion.
We also saw how PHP-based Rightnow Custom processes can be used as object triggers.
Finally, we saw a few options available for bulk data export from Rightnow, using ROQL, Rightnow Analytics and third-party providers.

A pattern for managing segments across Oracle apps


Background

A large number of Oracle and 3rd party CX applications include segmentation as a key part of their feature set. Trouble is, most of these implementations are siloed and proprietary. Further, very few of these segment-aware applications expose services to the underlying segment mechanism, making integrations — with regard to segments — difficult if not impossible in most cases. This leads to inevitable “left hand not knowing what the right hand is doing” scenarios, thereby complicating the lives of marketers within an enterprise who must manually orchestrate the use of (and definition of) segments across an increasing number of apps. Worse, such behavior reinforces the perception that our products (even those that are otherwise “integrated”) don’t work well together.

This document explores a pattern that could help marketers manage their segments across a wide range of products that can span both cloud and on premise.

Some obvious use cases

  • An unknown visitor arrives at a bespoke webpage and reveals their email address (either via global cookie, device fingerprinting, submitting a “more information” form, a redirect via campaign landing page, or just simple registration/provisioning). Webpage/app then inquires to a cloud endpoint: “what segments does xxx@yyy.com belong to?” and gets a response of 0-N segments. Webpage/app then renders segment-appropriate content for that visitor going-forward. Caching/expiring the response is entirely up to the webpage/app.
  • A known visitor/contact arrives at a webpage served by an Oracle app that supports segment calculation natively (e.g. Commerce, Eloqua, Sites). The framework serving the webpage can then optionally inquire to a cloud endpoint: “what other segments does xxx@yyy.com belong to as calculated by other Oracle apps?” and gets a response of 0-N segments identifying both the segment name and the app that calculated it. The webpage framework can then use this information to further enrich local segment calculation or ignore it altogether.
  • An application that cannot be extended to request REST services needs visitor/contact data that described pre-calculated segments for known visitors/contacts. This application could be for internet or intranet or whatever. Access to ICS can be assumed.

You will note that the above use cases assume that specifically *how* the segment was calculated is hidden from the requestor (as it should be as it is meant to be a service). As such, the requestor can simply trust the calculation and either leverage it or ignore it as appropriate. Think of segments exposed this way as a form of denormalization.

Trouble in Paradise

Everything would be fine if all of our segment-aware applications behaved in the same way and had exposed services around their segment engines. Unfortunately, nothing could be further from the truth. At the very minimum we have the following types of segment-aware applications:

  • those that publicly (at least partially) expose their segment mechanism as services or APIs (e.g. Webcenter Sites)
  • those that hide their segment services/APIs (e.g. Commerce)
  • those that don’t have segment services/APIs at all (e.g. Eloqua)
  • those that use other mechanisms (e.g. “categories”, “personas”)

Further, existing integrations offer very little with regard to segments. For example: The latest “Eloqua —> BlueKai” integration (as of June 2016) allows mapping a (known contact) segment/list to a BlueKai (anonymous visitor) category. But enrichment data is not shared with Eloqua — the expectation is that BlueKai will handle subsequent paid advertisements once Eloqua determines the right scenario (e.g. contact doesn’t respond to email). BlueKai likewise is unaware of why an Eloqua contact is “in segment”.

Worse, segment calculations are invariably proprietary. e.g.

  • WC Sites: based on visitor attributes. Anonymous users are more difficult to ascertain and require custom front-end logic to do anything meaningful. (80%+ website visitors are anonymous)
  • Eloqua: segments are just lists of contacts. Such lists can be created manually or based on contact’s attributes. Run-time behaviors are not supported. (100% of email recipients are known)
  • Commerce: segments are (typically) based on visitor behaviors (note: while 80% of website visitors may be anonymous, 100% eventually become known once they complete their transaction)
  • BICS: segments can be hierarchical. Very “chart of accounts”-like

One might come to the conclusion that “orchestration across all segment-aware apps can’t be done” and simply give up. I believe otherwise and further, that as more and more applications become segment-aware, the need for a single method of defining and using segments is becoming increasingly critical to marketers.

Segment “Management” not merely orchestration

As technologists, it is easy to figure out ways to “glue” things together — that’s our job. However, that is not only what is needed here. Rather we need to empower an organization’s marketing team to make sure that all their various web and mobile apps are using a common set of segments and their definitions — even though each app likely will have its own way of calculating a segment for any visitor. As such there is a need for a single management console.

One other observation: while applications such as Commerce, WC Sites, and others tout their ability to do dynamic segment calculation on-the-fly, many implementations and marketing teams treat segments as mostly “static” in that a marketing person or service representative places a known contact into one or more segments based on attributes (or behaviors). Consequently, the relationship rarely changes. Example: if I am a Gold Member I might eventually become a Platinum Member but such changes are rare and don’t happen daily. As such, for many use-cases, storing pre-calculated segments per known contact in a central repository can provide a sufficiently rich customer experience that the benefit of centrally managing outweighs any perceived negatives.

Rather than building our own custom app (which is still allowed under this pattern) I propose instead leveraging a new feature of WebCenter Sites v12 — its Visitor Services (a.k.a. SVS). (see https://docs.oracle.com/middleware/1221/wcs/use/GUID-EA7FEA17-C9C3-483D-B246-2F88E2BA17CC.htm#WBCSU8599). Not only do we not have to build very much to complete the pattern, but SVS provides a rich environment that allows marketers to design the rules for aggregating visitor attributes across a wide range of repositories. More to the point, using SVS to prove out the viability of the proposed pattern is a low-risk approach that is readily presentable.

Assumptions

  • This pattern need only manage “static” segment calculations performed by various applications and stored in a central repository as “static”, pre-calculated values. Any real-time calculations (that perhaps build upon and enrich the static segment assignment) are to be performed by the application serving the webpage or mobile app.
  • There will be a service endpoint that will allow querying on behalf of known visitors/contacts
  • The service endpoint can also be queried to inquire as to the current list of enumerated segment names per application
  • Applications that cannot expose their segment calculation mechanism to remote querying via REST (e.g. Eloqua) must observe the following rule when creating segments locally: in order to maintain synchronization with other apps across the enterprise, segments should ideally only ever be created based on the value of a custom field per visitor. This field value will be populated by updates from the central repository. Ideally, for end-users of the application it would be expressed in the GUI as an enumerated multi-select field named “segments”.
  • Example: 
 Contact=michael@abc.com
 Segments=Gold,FrequentTraveler,CollegeEducatedMale.

The segment array payload

The service endpoint should return an aggregated lightweight profile for each visitor/contact styled like the following:

{
  "visitor_segments": {
    "email": "msullivan@xyz.com",
    "contexts": [
      { "context": "commerceSite", "segments": ["A", "C"] },
      { "context": "campaign2016", "segments": ["B", "D"] }
    ]
  }
}

The segments array should be able to be created/updated by any contributing system that is segment-aware, and readable by any subscribing application.

This makes it possible for a consuming client to fetch and build logic around any combination of segments without losing the origin/original intent of segment. As newer systems are added as part of the integration, they just end up being new contexts and integration is streamlined.
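
To make the consuming side concrete, here is a minimal Java sketch of client-side types modeling the payload above, together with a membership check; all names are illustrative assumptions:

import java.util.List;

public class VisitorSegments {
    // One entry per contributing application ("context")
    public record ContextSegments(String context, List<String> segments) {}

    private final String email;
    private final List<ContextSegments> contexts;

    public VisitorSegments(String email, List<ContextSegments> contexts) {
        this.email = email;
        this.contexts = contexts;
    }

    /** True if any contributing system placed the visitor in the given segment. */
    public boolean isInSegment(String segment) {
        return contexts.stream().anyMatch(c -> c.segments().contains(segment));
    }

    public static void main(String[] args) {
        VisitorSegments vs = new VisitorSegments("msullivan@xyz.com", List.of(
                new ContextSegments("commerceSite", List.of("A", "C")),
                new ContextSegments("campaign2016", List.of("B", "D"))));
        // Render segment-appropriate content if the visitor is in segment "A"
        System.out.println(vs.isInSegment("A")); // true
    }
}

Because the origin context is preserved, a client can either merge all segments or weigh them according to the application that calculated them.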

Diagram

SVS exposed services

Outbound Features:

  • For those apps that can efficiently request additional visitor attributes on-the-fly (e.g. being able to cache the payload response), a read-only, lightweight REST endpoint for querying single visitor records. Assume the app doing the querying has some sort of key to initiate the search, whether it be email, visitorid, or whatever.
  • For those apps that have security or latency issues, a bulk mechanism to update visitor/contact records via export from Sites Visitor Services. One obvious model would be to use ICS for this. Note however that until such time that ICS supports batch processing, there will likely need to be a custom servlet deployed in the SVS Weblogic instance to handle the batch orchestration.
    As raw visitor attributes are aggregated via an aggregation template, this template can optionally call out to the WC Sites Engage mechanism to calculate the segment based on the current visitor attribute values
  • Similar to the above, the template could also call out to any other segmentation engine as appropriate

Inbound Features:

  • Any OOTB or custom providers as allowed for and supported by WC Sites

Final Thoughts

Ideally, such a service would be hosted “in the cloud” and as such, using Sites Visitor Services will not be a long-term solution. But that is not the point of this pattern anyway. The suggestion of using SVS was simply to “bootstrap” any POC so as to validate and tweak the pattern to your client’s particular use cases. Would love to hear your thoughts.

Cloud Security: Seamless Federated SSO for PaaS and Fusion-based SaaS


Introduction

Oracle Fusion-based SaaS Cloud environments can be extended in many ways. While customization is the standard activity to setup a SaaS environment for your business needs, chances are that you want to extend your SaaS for more sophisticated use cases.

In general this is not a problem and Oracle Cloud offers a great number of possible PaaS components for this. However, user and login experience can be a challenge. Luckily, many Oracle Cloud PaaS offerings use a shared identity management environment to make the integration easier.

This article describes how the integration between Fusion-based SaaS and PaaS works in general and how easy the configuration can be done.

Background

At the moment, Oracle Fusion-based SaaS comes with its own identity management stack. This stack can be shared between Fusion-based SaaS offerings like Global Human Capital Management, Sales Cloud, Financials Cloud, etc.

On the other hand, many Oracle PaaS offerings use a shared identity management (SIM-protected PaaS) and can share it if they are located in the same data center and identity domain. If done right, integration of SIM-protected PaaS and Fusion-based SaaS for Federated SSO can be done quite easily.

Identity Domain vs Identity Management Stack

In Oracle Cloud environments the term identity is used for two different concepts and can be quite confusing.

  • Identity Domain – Oracle Cloud environments are part of an Identity Domain that governs service administration, for example, start and restart of instances, user management, etc. The user management always applies to the service administration UI but may not apply to the managed environments.
  • Identity Management Stack – Fusion-based SaaS has its own Identity Management Stack (or IDM Stack) and is also part of an Identity Domain (for managing the service).

Federated Single Sign-On

As described in Cloud Security: Federated SSO for Fusion-based SaaS, Federated Single Sign-on is the major user authentication solution for Cloud components.

Among its advantages are a single source for user management, single location of authentication data and a chance for better data security compared to multiple and distinct silo-ed solutions.

Components

In general, we have two component groups we want to integrate:

  • Fusion-based SaaS Components – HCM Cloud, Sales Cloud, ERP Cloud, CRM Cloud, etc.
  • SIM-protected PaaS Components – Developer Cloud Service, Integration Cloud Service, Messaging Cloud Service, Process Cloud Service, etc.

Each component group should share the Identity Domain. For seamless integration both groups should be in the same Identity Domain.

Integration Scenarios

The integration between both component groups follows two patterns. The first pattern shows the integration of both component groups in general. The second pattern is an extension of the first, but allows the usage of a third-party Identity Provider solution. The inner workings for both patterns are the same.

Federated Single Sign-On

This scenario can be seen as a “standalone” or self-contained scenario. All users are maintained in the Fusion-based IDM stack and synchronized with the shared identity management stack. The SIM stack acts as the Federated SSO Service Provider and the Fusion IDM stack acts as the Identity Provider. Login of all users and for all components is handled by the Fusion IDM stack.

SaaS-SIM-1

Federated Single Sign-On with Third Party Identity Provider

If an existing third-party Identity Provider should be used, the above scenario can be extended as depicted below. The Fusion IDM stack will act as a Federation Proxy and redirect all authentication requests to the third-party Identity Provider.

SaaS-SIM-IdP-2

User and Role Synchronization

User and Role synchronization is the most challenging part of Federated SSO in the Cloud. Although manageable, it can become really challenging if the number of identity silos is too high; the lower the number of identity silos, the better.

User and Role Synchronization between Fusion-based SaaS and SIM-protected PaaS is expected to be available in the near future.

Requirements and Setup

To get the seamless Federated SSO integration between SIM-protected PaaS and Fusion-based SaaS these requirements have to be fulfilled:

  • All Fusion-based SaaS offerings should be in the same Identity Domain and environment (i.e., sharing the same IDM stack)
  • All SIM-based PaaS offerings should be in the same Identity Domain and data center
  • Fusion-based SaaS and SIM-based PaaS should be in the same Identity Domain and data center

After all, these are just a few manageable requirements which must be mentioned during the ordering process. Once this is done, the integration between Fusion-based SaaS and SIM-protected PaaS will be done automatically.

Integration of a third-party Identity Provider is still an on-request, Service Request based task (see Cloud Security: Federated SSO for Fusion-based SaaS). When requesting this integration, adding the Federation SSO Proxy setup explicitly to the request is strongly recommended!

Note: The seamless Federated SSO integration is a packaged deal and comes with a WebService-level integration that also sets up the Identity Provider as the trusted SAML issuer. You can’t get the one without the other.


Loading Data into Oracle BI Cloud Service using BI Publisher Reports and SOAP Web Services


Introduction

This post details a method of loading data that has been extracted from Oracle Business Intelligence Publisher (BIP) into the Oracle Business Intelligence Cloud Service (BICS). The BIP instance may either be Cloud-Based or On-Premise.

It builds upon the A-Team post Using Oracle BI Publisher to Extract Data from Oracle Sales and ERP Clouds. This post uses SOAP web services to extract data from an XML-formatted BIP report.

The method uses the PL/SQL language to wrap the SOAP extract, XML parsing commands, and database table operations. It produces a BICS staging table which can then be transformed into star-schema object(s) for use in modeling.  The transformation processes and modeling are not discussed in this post.

Additional detailed information, including the complete text of the procedure described, is included in the References section at the end of the post.

Rationale for using PL/SQL

PL/SQL is the only procedural tool that runs on the BICS / Database Schema Service platform. Other wrapping methods, e.g. Java or ETL tools, require a platform outside of BICS to run on.

PL/SQL can utilize native SQL commands to operate on the BICS tables. Other methods require the use of the BICS REST API.

Note: PL/SQL is very good at showcasing functionality. However, it tends to become prohibitively resource-intensive when deployed in an enterprise production environment.

For the best enterprise deployment, an ETL tool such as Oracle Data Integrator (ODI) should be used to meet these requirements and more:

* Security

* Logging and Error Handling

* Parallel Processing – Performance

* Scheduling

* Code re-usability and Maintenance

The steps below depict how to load a BICS table.

About the BIP Report

The report used in this post is named BIP_DEMO_REPORT and is stored in a folder named Shared Folders/custom as shown below:

BIP Report Location

The report is based on a simple analysis with three columns and output as shown below:

BIP Demo Analysis

Note: The method used here requires all column values in the BIP report to be NOT NULL for two reasons:

1. The XPATH parsing command signals either the end of a row or the end of the data when a null result is returned.

2. All columns being NOT NULL ensures that the result set is dense and not sparse. A dense result set ensures that each column is represented in each row. Additional information regarding dense and sparse result sets may be found in the Oracle document Database PL/SQL Language Reference.

One way to ensure a column is not null is to use the IFNULL function in the analysis column definition as shown below:

BIP IFNULL Column Def
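
For example, a column formula such as IFNULL("Base Facts"."Revenue", 0) guarantees that a value is always returned (the table and column names here are illustrative).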

Call the BIP Report

The SOAP API request used here is similar to the one detailed in Using Oracle BI Publisher to Extract Data from Oracle Sales and ERP Clouds.

The SOAP API request should be constructed and tested using a SOAP API testing tool e.g. SoapUI.

This step uses the APEX_WEB_SERVICE package to issue the SOAP API request and store the XML result in an XMLTYPE variable. The key inputs to the package call are:

* The URL for the Report Request Service

* The SOAP envelope the Report Request Service expects.

* Optional Headers to be sent with the request

* An optional proxy override

Note: Two other BI Publisher report services exist in addition to the one shown below. The PublicReportService_v11 should be used for BI Publisher 10g environments, and the ExternalReportWSSService should be used when stringent security is required. An example URL is below:

https://hostname/xmlpserver/services/v2/ReportService

An example Report Request envelope is below:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:v2="http://xmlns.oracle.com/oxp/service/v2">
   <soapenv:Header/>
   <soapenv:Body>
      <v2:runReport>
         <v2:reportRequest>
            <v2:byPassCache>true</v2:byPassCache>
            <v2:flattenXML>false</v2:flattenXML>
            <v2:reportAbsolutePath>/custom/BIP_DEMO_REPORT.xdo</v2:reportAbsolutePath>
            <v2:sizeOfDataChunkDownload>-1</v2:sizeOfDataChunkDownload>
         </v2:reportRequest>
         <v2:userID>'||P_AU||'</v2:userID>
         <v2:password>'||P_AP||'</v2:password>
      </v2:runReport>
   </soapenv:Body>
</soapenv:Envelope>

An example of setting a SOAP request header is below:

apex_web_service.g_request_headers(1).name := 'SOAPAction';
apex_web_service.g_request_headers(1).value := '';

An example proxy override is below:

www-proxy.us.oracle.com

 Putting this together, example APEX statements are below:

apex_web_service.g_request_headers(1).name := 'SOAPAction';
apex_web_service.g_request_headers(1).value := '';
f_xml := apex_web_service.make_request( p_url => p_report_url, p_envelope => l_envelope, p_proxy_override => l_proxy_override );

Note: The SOAP header used in the example above was necessary for the call to the BI Publisher 11g implementation used in a demo Sales Cloud instance. If it were not present, the error LPX-00216: invalid character 31 (0x1F) would appear. This message indicates that the response received from the server was gzip-encoded, which is not valid input for the XMLTYPE data type.

Parse the BIP Report Result Envelope

This step parses the XML returned by the SOAP call for the data stored in the tag named reportBytes that is encoded in Base64 format.

The XPATH expression used below should be constructed and tested using an XPATH testing tool e.g. freeformatter.com

This step uses the APEX_WEB_SERVICE package to issue the parsing command and store the result in a CLOB variable. The key inputs to the package call are:

* The XML returned from BIP SOAP call above

* The XML Path Language (XPATH) expression to find the reportBytes data

An example of the Report Response envelope returned is below:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><soapenv:Body><runReportResponse xmlns="http://xmlns.oracle.com/oxp/service/v11/PublicReportService"><runReportReturn>        <reportBytes>PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPCEtLUdlbmVyYXRlZCBieSBPcmFjbGUgQkkgUHVibGlzaGVyIDEyLjIuMS4xLjAgLURhdGFlbmdpbmUsIGRhdGFtb2RlbDpfY3VzdG9tX0JJUF9ERU1PX01PREVMX3hkbSAtLT4KPERBVEFfRFM+PFNBVy5QQVJBTS5BTkFMWVNJUz48L1NBVy5QQVJBTS5BTkFMWVNJUz4KPEdfMT4KPENPTFVNTjA+QWNjZXNzb3JpZXM8L0NPTFVNTjA+PENPTFVNTjE+NTE2MTY5Ny44NzwvQ09MVU1OMT48Q09MVU1OMj40ODM3MTU8L0NPTFVNTjI+CjwvR18xPgo8R18xPgo8Q09MVU1OMD5BdWRpbzwvQ09MVU1OMD48Q09MVU1OMT43MjM3MzYyLjM8L0NPTFVNTjE+PENPTFVNTjI+NjI3OTEwPC9DT0xVTU4yPgo8L0dfMT4KPEdfMT4KPENPTFVNTjA+Q2FtZXJhPC9DT0xVTU4wPjxDT0xVTU4xPjY2MTQxMDQuNTU8L0NPTFVNTjE+PENPTFVNTjI+NDAzNzQ0PC9DT0xVTU4yPgo8L0dfMT4KPEdfMT4KPENPTFVNTjA+Q2VsbCBQaG9uZXM8L0NPTFVNTjA+PENPTFVNTjE+NjMyNzgxOS40NzwvQ09MVU1OMT48Q09MVU1OMj40Nzg5NzU8L0NPTFVNTjI+CjwvR18xPgo8R18xPgo8Q09MVU1OMD5GaXhlZDwvQ09MVU1OMD48Q09MVU1OMT44ODA3NzUzLjI8L0NPTFVNTjE+PENPTFVNTjI+NjU1MDY1PC9DT0xVTU4yPgo8L0dfMT4KPEdfMT4KPENPTFVNTjA+SW5zdGFsbDwvQ09MVU1OMD48Q09MVU1OMT40MjA4ODQxLjM5PC9DT0xVTU4xPjxDT0xVTU4yPjY2MTQ2OTwvQ09MVU1OMj4KPC9HXzE+CjxHXzE+CjxDT0xVTU4wPkxDRDwvQ09MVU1OMD48Q09MVU1OMT43MDAxMjUzLjI1PC9DT0xVTU4xPjxDT0xVTU4yPjI2OTMwNTwvQ09MVU1OMj4KPC9HXzE+CjxHXzE+CjxDT0xVTU4wPk1haW50ZW5hbmNlPC9DT0xVTU4wPjxDT0xVTU4xPjQxMjAwOTYuNDk8L0NPTFVNTjE+PENPTFVNTjI+NTI3Nzk1PC9DT0xVTU4yPgo8L0dfMT4KPEdfMT4KPENPTFVNTjA+UGxhc21hPC9DT0xVTU4wPjxDT0xVTU4xPjY2Njk4MDguODc8L0NPTFVNTjE+PENPTFVNTjI+Mjc4ODU4PC9DT0xVTU4yPgo8L0dfMT4KPEdfMT4KPENPTFVNTjA+UG9ydGFibGU8L0NPTFVNTjA+PENPTFVNTjE+NzA3ODE0Mi4yNTwvQ09MVU1OMT48Q09MVU1OMj42MzcxNzQ8L0NPTFVNTjI+CjwvR18xPgo8R18xPgo8Q09MVU1OMD5TbWFydCBQaG9uZXM8L0NPTFVNTjA+PENPTFVNTjE+Njc3MzEyMC4zNjwvQ09MVU1OMT48Q09MVU1OMj42MzMyMTE8L0NPTFVNTjI+CjwvR18xPgo8L0RBVEFfRFM+</reportBytes><reportContentType>text/xml</reportContentType><reportFileID xsi:nil="true"/><reportLocale xsi:nil="true"/></runReportReturn></runReportResponse></soapenv:Body></soapenv:Envelope>

An example of the XPATH expression to retrieve just the value of reportBytes is below:

//*:reportBytes/text()

Putting these together, an example APEX statement is below:

f_report_bytes := apex_web_service.parse_xml_clob( p_xml => f_xml, p_xpath => '//*:reportBytes/text()' );

Decode the Report Bytes Returned

This step uses the APEX_WEB_SERVICE package to decode the Base64 result from above into a BLOB variable and then uses the XMLTYPE function to convert the BLOB into a XMLTYPE variable.

Decoding of the Base64 result should first be tested with a Base64 decoding tool e.g. base64decode.org

An example of the APEX decode command is below:

f_blob := apex_web_service.clobbase642blob(f_base64_clob);

 An example of the XMLTYPE function is below:

f_xml := xmltype (f_blob, 1);

The decoded XML output looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<!--Generated by Oracle BI Publisher 12.2.1.1.0 -Dataengine, datamodel:_custom_BIP_DEMO_MODEL_xdm -->
<DATA_DS><SAW.PARAM.ANALYSIS></SAW.PARAM.ANALYSIS>
<G_1>
<COLUMN0>Accessories</COLUMN0><COLUMN1>5161697.87</COLUMN1><COLUMN2>483715</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Audio</COLUMN0><COLUMN1>7237362.3</COLUMN1><COLUMN2>627910</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Camera</COLUMN0><COLUMN1>6614104.55</COLUMN1><COLUMN2>403744</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Cell Phones</COLUMN0><COLUMN1>6327819.47</COLUMN1><COLUMN2>478975</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Fixed</COLUMN0><COLUMN1>8807753.2</COLUMN1><COLUMN2>655065</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Install</COLUMN0><COLUMN1>4208841.39</COLUMN1><COLUMN2>661469</COLUMN2>
</G_1>
<G_1>
<COLUMN0>LCD</COLUMN0><COLUMN1>7001253.25</COLUMN1><COLUMN2>269305</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Maintenance</COLUMN0><COLUMN1>4120096.49</COLUMN1><COLUMN2>527795</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Plasma</COLUMN0><COLUMN1>6669808.87</COLUMN1><COLUMN2>278858</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Portable</COLUMN0><COLUMN1>7078142.25</COLUMN1><COLUMN2>637174</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Smart Phones</COLUMN0><COLUMN1>6773120.36</COLUMN1><COLUMN2>633211</COLUMN2>
</G_1>
</DATA_DS>

Create a BICS Table

This step uses a SQL command to create a simple staging table that has 20 identical varchar2 columns. These columns may be transformed into number and date data types in a future transformation exercise that is not covered in this post.

A When Others exception block allows the procedure to proceed if an error occurs because the table already exists.

A shortened example of the create table statement is below:

execute immediate 'create table staging_table ( c01 varchar2(2048), ... , c20 varchar2(2048) )';

Load the BICS Table

This step uses SQL commands to truncate the staging table and insert rows from the BIP report XML content.

The XML content is parsed using an XPATH command inside two LOOP commands.

The first loop processes the rows by incrementing a subscript.  It exits when the first column of a new row returns a null value.  The second loop processes the columns within a row by incrementing a subscript. It exits when a column within the row returns a null value.

The following XPATH examples are for a data set that contains 11 rows and 3 columns per row:

//G_1[2]/*[1]/text()   -- Returns the value of the first column of the second row

//G_1[2]/*[4]/text()   -- Returns a null value for the 4th column, signaling the end of the row

//G_1[12]/*[1]/text()  -- Returns a null value for the first column of a new row, signaling the end of the data set

After each row is parsed, it is inserted into the BICS staging table.
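
For illustration only, the same two-loop pattern is sketched below in Java using the JDK XPath API; in the actual procedure the equivalent logic is written in PL/SQL around the XPATH parsing command, and the file name here is an assumption:

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class ParseBipRows {
    public static void main(String[] args) throws Exception {
        // Decoded BIP report XML saved to a file (the file name is an assumption)
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(new File("report.xml"));
        XPath xpath = XPathFactory.newInstance().newXPath();

        // Outer loop: rows; exits when the first column of a new row is empty
        for (int row = 1; ; row++) {
            List<String> values = new ArrayList<>();
            // Inner loop: columns; exits when a column within the row is empty
            for (int col = 1; ; col++) {
                String v = xpath.evaluate("//G_1[" + row + "]/*[" + col + "]/text()", doc);
                if (v == null || v.isEmpty()) break;
                values.add(v);
            }
            if (values.isEmpty()) break; // end of the data set
            System.out.println(values);  // here the PL/SQL procedure inserts a staging row
        }
    }
}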

An image of the staging table result is shown below:

BIP Table Output

Summary

This post detailed a method of loading data that has been extracted from Oracle Business Intelligence Publisher (BIP) into the Oracle Business Intelligence Cloud Service (BICS).

Data was extracted and parsed from an XML-formatted BIP report using SOAP web services wrapped in the Oracle PL/SQL APEX_WEB_SERVICE package.

A BICS staging table was created and populated. This table can then be transformed into star-schema objects for use in modeling.

For more BICS and BI best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-Team Chronicles for BICS.

References

Complete Text of Procedure Described

Using Oracle BI Publisher to Extract Data from Oracle Sales and ERP Clouds

Database PL/SQL Language Reference

Reference Guide for the APEX_WEB_SERVICE

SOAP API Testing Tool

XPATH Testing Tool

Base64 Decoding and Encoding Testing Tool


Integrating Commerce Cloud using ICS and WebHooks


Introduction:

Oracle Commerce Cloud is a SaaS application and is a part of the comprehensive CX suite of applications. It is the most extensible, cloud-based ecommerce platform offering retailers the flexibility and agility needed to get to market faster and deliver desired user experiences across any device.

Oracle’s iPaaS solution is the most comprehensive cloud-based integration platform in the market today. Integration Cloud Service (ICS) gives customers an elevated user experience that makes complex integration simple to implement.

Commerce Cloud provides various webhooks for integration with other products. A webhook sends a JSON notification to URLs you specify each time an event occurs. External systems can implement the Oracle Commerce Cloud Service API to process the results of a webhook callback request. For example, you can configure the Order Submit webhook to send a notification to your order management system every time a shopper successfully submits an order.

In this article, we will explore how ICS can be used for such integrations. We will use the Abandoned Cart Web Hook which is triggered when a customer leaves the shopping cart idle for a specified period of time. We will use ICS to subscribe to this Web Hook.

ICS provides pre-defined adapters, an easy-to-use visual mechanism for transforming and mapping data, and a fan-out mechanism to send data to multiple endpoints. It also provides the ability to orchestrate and enrich the payload.

Main Article:

For the purpose of this example, we will create a task in Oracle Sales Cloud (OSC) when the Idle Cart Web Hook is triggered.

The high level steps for creating this integration are:

  1. Register an application in Commerce Cloud
  2. Create a connection to Commerce Cloud in ICS
  3. Create a connection to Sales Cloud in ICS
  4. Create an integration using the 2 newly created connections
  5. Activate the integration and register its endpoint with Abandoned Cart Web Hook

Now let us go over each of these steps in detail

 

Register an application in Commerce Cloud

Log in to the Admin UI of Commerce Cloud and click Settings

01_CCDashBoard

Click on Web APIs

02_CCSettings

Click on Registered Applications

03_CCWebAPIs

Click on Register Application

04_CCWebAPIsRegisteredApps

Provide a name for the application and click Save

05_CCNewApp

A new application is registered, and a unique application ID and key are created. Click "Click to reveal" to view the application key

06_CCNewAppKey1

Copy the application key that is revealed. This will later be provided while configuring the connection to Commerce Cloud in ICS

07_CCNewAppKey2

You can see the new application is displayed in the list of Registered Applications

08_CCWebAPIsRegisteredApps2

Create a connection to Commerce Cloud in ICS

From the ICS Dashboard, click Connections to get to the connections section

01_ICSDashboard

Click Create New Connection

02_Connections

The Create Connection – Select Adapter page is displayed. This page lists all the available adapters

03_ICSCreateNewConn

Search for Oracle Commerce Cloud and click Select

04_ICSNewConnCC

Provide a connection name and click Create

05_ICSNewConnCCName

ICS displays a message that the connection was created successfully. Click Configure Connectivity

06_ICSNewConnCCCreated

Provide the Connection base URL. It is of the format https://<site_hostname>:443/ccadmin/v1. Click OK

07_ICSNewConnCCURL

Click Configure Security

08_ICSNewConnCCConfigureSecurity

Provide the Security Token. This is the value we copied after registering the application in Commerce Cloud. Click OK

09_ICSNewConnCCOAuthCreds

The final step is to test the connection. Click Test

10_ICSNewConnCCTest

ICS displays a message if the connection test is successful. Click Save

11_ICSNewConnCCTestResult

Create a connection to Sales Cloud in ICS

For details about this step and optionally how to use Sales Cloud Events with ICS, review this article

Create an integration using the 2 newly created connections

From the ICS Dashboard, click Integrations to get to the integrations area

01_Home

Click Create New Integration

02_CreateIntegration

Under Basic Map My Data, click Select

03_Pattern

Provide a name for the integration and click Create

04_Name

Drag the newly created Commerce Cloud connection from the right to the trigger area on the left

05_SourceConn

Provide a name for the endpoint and click Next

06_EP1

Here you can choose various business objects that are exposed by the Commerce Cloud adapter. For the purpose of this integration, choose idleCart and click Next

07_IdleCartEvent

Review the endpoint summary page and click Done

08_EP1ConfigSummary

Similarly, drag and drop a Sales Cloud connection to the invoke area

09_TargetConn

Provide a name for the endpoint and click Next

10_EP2Name

Choose ActivityService and the createActivity operation, and click Next

11_CreateActivity

Review the summary and click Done

12_EP2Summary

Click the icon to create a map, then click the “+” icon

This opens the mapping editor. You can create the mapping as desired. For the purpose of this article, a very simple mapping was created:

ActivityFunctionCode was assigned a fixed value of TASK. Subject was mapped to orderId from the idleCart event.

22_ICSCreateIntegration

Add tracking fields to the integration and save it

25_ICSCreateIntegration

Activate the integration and register its endpoint with Abandoned Cart Web Hook

On the main integrations page, click Activate next to the newly created integration

26_ICSCreateIntegration

Optionally, check the box to enable tracing and click Yes

27_ICSCreateIntegration

ICS displays the message that the activation was successful. You can see the status as Active.

28_ICSCreateIntegration

Click the information icon for the newly activated integration. This displays the endpoint URL for this integration. Copy the URL. Remove the “/metadata” at the end of the URL. This URL will be provided in the Web Hook configuration of Commerce Cloud.

29_ICSCreateIntegration

In the Commerce Cloud admin UI, navigate to Settings -> Web APIs -> Webhook tab -> Event APIs -> Cart Idle – Production. Paste the URL and provide the ICS credentials for Basic Authorization

Webhook

By default, the abandoned cart event fires after 20 minutes. This and other settings can be modified: navigate to Settings -> Extension Settings -> Abandoned Cart Settings and configure the number of minutes until the webhook fires. For testing, you can set it to a low value.

CCAbandonedCartSettings

This completes all the steps required for this integration. Now every time a customer adds items to a cart and leaves it idle for the specified time, this integration will create a task in OSC.

 

References / Further Reading:

Using Commerce Cloud Web Hooks

Using Event Handling Framework for Outbound Integration of Oracle Sales Cloud using Integration Cloud Service

Publishing update events from SCM cloud R11 using BI publisher reports and scheduled job on Oracle cloud.


Introduction

In the Supply Chain Management (SCM) cloud’s work execution module, when a manufacturing work order’s steps are completed or its status changes, customers might want to capture and propagate the changes to other target systems. These target systems might use the propagated data to take actions such as starting equipment or performing analytics. SCM cloud, as of R11, does not publish events for work order status changes or work order step completions. This article elaborates on an approach to generate events using a combination of BI Publisher reports in SCM cloud and a scheduled job deployed in Oracle cloud.

Main Article

SCM cloud keeps audits of actions performed on work orders. Reports in XML format can be generated from the contents of the audit tables using SCM’s out-of-the-box BI Publisher. BI Publisher reports can be invoked through a web service. Coupled with a Quartz scheduler job, these reports can be used to emulate published events. The scheduler job can take actions such as posting the SCM changes to a queue, updating a database table, or invoking a web service of a third-party application. Note that this is only suitable for solutions that do not need events in real time, given the inherent delays in updating the audit tables and the interval between subsequent runs of the job. Intermediate-level knowledge of BI Publisher, WebLogic, and Java is required to follow the instructions provided in subsequent sections.

Figure 1 – Overview of the solution

BI Publisher reports in XML format

BI publisher reports are based on data models. So, first create a data model and ensure that the resulting data meets the requirements. Here are the overall steps for creating a data model from SCM tables.

  • Navigate to “Reports and Analytics” from main menu.
  • On initial BIP page, click on “Browse Catalog” on the top.
  • On catalog page, click on “New” and then “Data Model”.
  • Create a SQL query with SCM data source, view the results and save.

Below is a sample query to obtain work order status changes. The term “:LastObservedDate” in the WHERE clause is a parameter that must be passed, either at the BI Publisher prompt or in the web service call’s XML payload. This query returns results sorted by date and time of change in descending order. The next time this query is run, the latest change date and time should be passed as “:LastObservedDate”, so that events captured in the previous run are excluded.

SELECT "WO"."WORK_ORDER_NUMBER" as "WORK_ORDER_NUMBER",
 "WOSTATHIST".STATUS_CHANGE_DATE as "STATUS_CHANGE_DATE",
 "WOSTATUS".WO_STATUS_CODE as NEW_WO_STATUS_CODE   
FROM "FUSION"."WIE_WORK_ORDERS_B" "WO",
 "FUSION"."WIE_WO_STATUS_HISTORY" "WOSTATHIST",
"FUSION"."INV_ORGANIZATION_DEFINITIONS_V"  "ORG",
 "FUSION".WIE_WO_STATUSES_B  "WOSTATUS"
WHERE  WOSTATHIST.WORK_ORDER_ID = WO.WORK_ORDER_ID AND
         ORG.ORGANIZATION_ID = WO.ORGANIZATION_ID AND
 WOSTATHIST.STATUS_CHANGE_DATE > :LastObservedDate AND
 WOSTATHIST.NEW_STATUS_ID=WOSTATUS.WO_STATUS_ID 
ORDER BY WOSTATHIST.STATUS_CHANGE_DATE DESC

 

View the data output and save it as sample data, by clicking “Save as sample data”. Save the data model.

To create a XML report, navigate to Catalog, click “New”, then select “Report”.

  • Choose the data model by clicking “Use Data Model” and select the data model created in the previous steps.

02

  • Choose table format, then drag and drop fields into the report layout. Save the report.

03

04

  • Once the report is saved, modify the report to produce only XML output.
  • Click on Catalog in the menu and locate the report we just saved.
  • Click “Edit” for the report. On the next page, click on “View a list”.

05

  • In the list of layouts, click on “Output formats” for the layout.

06

  • In the list of formats, uncheck all options except “Data (XML)”. Save the report.

07

Now, the report is ready to produce the status changes of work orders in XML format.
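
The XML data output follows BI Publisher’s data-model structure, similar to the DATA_DS/G_1 sample shown earlier in this document. A hypothetical fragment for this report, with illustrative values and element names derived from the query’s column aliases, might look like this:

<DATA_DS>
 <G_1>
  <WORK_ORDER_NUMBER>WO-1001</WORK_ORDER_NUMBER>
  <STATUS_CHANGE_DATE>2016-08-30T18:56:49</STATUS_CHANGE_DATE>
  <NEW_WO_STATUS_CODE>COMPLETED</NEW_WO_STATUS_CODE>
 </G_1>
</DATA_DS>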

Here is another query that lists the completion of steps in a work order. This query, similar to the previous one, accepts a parameter to list only the completions that occurred since the last run.

SELECT    "WOOPTX"."ORGANIZATION_ID" as "ORGANIZATION_ID",
 "WOOPTX"."WORK_ORDER_ID" as "WORK_ORDER_ID",
 "WOOPTX"."WO_OPERATION_ID" as "WO_OPERATION_ID",
 "WOOPTX"."TRANSACTION_TYPE_CODE" as "TRANSACTION_TYPE_CODE",
 "WOOPTX"."TRANSACTION_DATE" as "TRANSACTION_DATE",
 "WOOP"."OPERATION_SEQ_NUMBER" as "OPERATION_SEQ_NUMBER",
 "WO"."WORK_ORDER_NUMBER" as "WORK_ORDER_NUMBER",
 "WO"."WORK_ORDER_ID" as "WORK_ORDER_ID_1",
 FROM      "FUSION"."WIE_WORK_ORDERS_B" "WO",
 "FUSION"."WIE_WO_OPERATIONS_B" "WOOP",
 "FUSION"."WIE_OPERATION_TRANSACTIONS" "WOOPTX",
 "FUSION"."INV_ORGANIZATION_DEFINITIONS_V" "ORG"
 WHERE     WOOPTX.WORK_ORDER_ID = WO.WORK_ORDER_ID AND
 WOOPTX.WO_OPERATION_ID = WOOP.WO_OPERATION_ID AND
 WOOPTX.TRANSACTION_TYPE_CODE='OP_COMPLETION' AND
 ORG.ORGANIZATION_ID = WO.ORGANIZATION_ID AND
 WOOPTX.TRANSACTION_DATE > :LASTOBSERVEDDATE
 ORDER BY WOOPTX.TRANSACTION_DATE DESC

 

Running BI Publisher reports through web service

The next step is to ensure that the report output could be obtained through BI publisher web service.

The web service URL is typically https://<BI_hostname>:<port>/xmlpserver/services/v2/ReportService.  Substitute <BI_hostname> and <port> with values suitable for a specific environment.

A sample XML payload to run the report is provided below, with parameter and security tokens. Note the date and time value passed for the parameter LastObservedDate. The response contains Base64-encoded XML data from the BI Publisher report.

 

<S:Envelope xmlns:S="http://schemas.xmlsoap.org/soap/envelope/">
   <S:Body>
     <runReport xmlns="http://xmlns.oracle.com/oxp/service/v2">
         <reportRequest>
           <XDOPropertyList xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <attributeCalendar xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <attributeFormat>xml</attributeFormat>
           <attributeLocale xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <attributeTemplate xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <attributeTimezone xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <attributeUILocale xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <byPassCache>false</byPassCache>
           <dynamicDataSource xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <flattenXML>false</flattenXML>
           <parameterNameValues>
               <listOfParamNameValues>
                 <item>
                    <UIType xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                    <dataType xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                     <dateFormatString xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                     <dateFrom xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                     <dateTo xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                     <defaultValue xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                    <fieldSize xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                     <label xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                     <lovLabels xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
                     <multiValuesAllowed>false</multiValuesAllowed>
                     <name>LastObservedDate</name>
                     <refreshParamOnChange>false</refreshParamOnChange>
                     <selectAll>false</selectAll>
                     <templateParam>false</templateParam>
                     <useNullForAll>false</useNullForAll>
                     <values>
                       <item>2016-08-30T18:56:49.232+00:00</item>
                    </values>
                 </item>
               </listOfParamNameValues>
           </parameterNameValues>
           <reportAbsolutePath>/~SCM_IMPL/OOW-demo-reports/WorkOrderStatusHistory.xdo</reportAbsolutePath>
           <reportData xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <reportOutputPath xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <reportRawData xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
           <sizeOfDataChunkDownload>-1</sizeOfDataChunkDownload>
         </reportRequest>
         <userID>scm_username</userID>
         <password>scm_password</password>
     </runReport>
   </S:Body>
</S:Envelope>

 

Quartz Scheduler and job

The final task is to develop a job that can be triggered periodically by a Quartz scheduler. This job invokes the BI Publisher web service shown in the previous section, gets the XML data from the report, updates the target systems, and stores the last observed date and time of status change for look-up by the next run of the job. The Java snippets provided were part of code deployed to a WebLogic container hosted on an Oracle cloud compute node. The code could also be deployed to an Oracle Java Cloud Service (JCS) instance.

Quartz is an open-source scheduler library available for several platforms, among them Java. Quartz libraries for Java are available here.

The scheduler and job could perform these tasks:

  1. A web service façade to start and stop the scheduler. This is essential for controlling the job remotely.
  2. A job (Java class) that is triggered by the scheduler.
  3. A module in the job for each of these functions:
     • Invoke the BI Publisher web service.
     • Perform the relevant publishing and updating functions for target systems.
     • Store the last observed status change date-time to be used for the next run.


For the sake of brevity, code snippets for essential aspects of the solution are provided.

Quartz scheduler’s usage is well-documented in Quartz blogs and forums. This snippet shows how to schedule a job that runs every 10 seconds:

// statusjob is the JobDetail built elsewhere for the polling job,
// e.g. with JobBuilder.newJob(<your Job class>).build()
statustrigger = TriggerBuilder.newTrigger()
    .withSchedule(SimpleScheduleBuilder.simpleSchedule()
        .withIntervalInSeconds(10)   // numeric literal; Integer.parseInt is not needed here
        .repeatForever())
    .build();
statussch = schFactory.getScheduler();
statussch.start();
statussch.scheduleJob(statusjob, statustrigger);

Code for invoking BI Publisher web service:

try 
{
 ReportRequest reportRequest = new ReportRequest();
 reportRequest.setAttributeFormat("xml");
 reportRequest.setReportAbsolutePath("/~SCM_IMPL/demo-reports/WorkOrderStatusHistory.xdo");
 ArrayOfParamNameValue pNameValue = new ArrayOfParamNameValue();
 ParamNameValue nameValue = new ParamNameValue();
 ParamNameValues paramNameValues = new ParamNameValues();
 nameValue.setName("LastObservedDate");
 ArrayOfString aos = new ArrayOfString();
 
 //set the static var so that the file is not read next time.
 strLastObservedDate = strDateTime;

 //Set the value as parameter for the report        
 aos.getItem().add(strDateTime);         
 nameValue.setValues(aos);  
 pNameValue.getItem().add(nameValue);
 paramNameValues.setListOfParamNameValues(pNameValue);
 reportRequest.setParameterNameValues(paramNameValues);
 reportRequest.setSizeOfDataChunkDownload(-1);
 ReportResponse response = runReport(reportRequest, "scm username", "scm password");      
 String strReport = new String(response.getReportBytes());
 System.out.println("WOSTATUS-getreport-Report length is " + strReport.length());

} catch (Exception e) {
 e.printStackTrace();
}

Summary

This article explained how to publish events out of Supply Chain Management cloud using out-of-the-box reporting tools in SCM cloud and a scheduled job deployed to either JCS or an Oracle cloud compute node. This approach is suitable for R11 of SCM cloud. Subsequent releases of SCM cloud might offer equivalent or better event-publishing capabilities out of the box. Refer to product documentation for later versions before implementing a solution based on this article.

Publishing business events from Supply Chain Cloud’s Order Management through Integration Cloud Service


Introduction

In Supply Chain Cloud (SCM) Order Management, as a sales order’s state changes or it becomes ready for fulfillment, events can be generated for external systems. Integration Cloud Service offers pub/sub capabilities that can be used to reliably integrate SaaS applications. In this post, let’s take a close look at these capabilities in order to capture Order Management events for fulfillment and other purposes. Instructions provided in this post are applicable to SCM Cloud R11 and ICS R16.4.1.

Main Article

SCM Cloud Order Management allows registering endpoints of external systems and assigning these endpoints to various business events generated during order orchestration. For more information on business event features and order orchestration in general, refer to the R11 document at this link. ICS is Oracle’s enterprise-grade iPaaS offering, with adapters for Oracle SaaS and other SaaS applications, and native adapters that allow connectivity to all SaaS and on-premise applications. To learn more about ICS, refer to the documentation at this link. Figure 1 provides an overview of the solution described in this post.

000

Figure 1 – Overview of the solution

Implementation of the solution requires the following high-level tasks.

  • Download the WSDL for business events from SCM Cloud.
  • Implement an ICS ‘Basic Publish to ICS’ integration with a trigger defined using the WSDL downloaded in the previous step.
  • Optionally, implement one or more ICS ‘Basic Subscribe to ICS’ integrations for external systems that desire event notification.
  • Configure SCM Cloud to generate events to the ‘Basic Publish to ICS’ endpoint.
  • Verify generation of business events.

For the solution to work, network connectivity from SCM Cloud to ICS and from ICS to external systems, including any on-premise systems, must be enabled. ICS agents can easily enable connectivity to on-premise systems.

Downloading WSDL for business events

Order Management provides two WSDL definitions for integration with external systems, one for fulfillment systems and another for other external systems that wish to receive business events. One example for use of business events is generation of invoices by an ERP system, upon fulfillment of an order. For the solution described in this post, a Business Event Connector is implemented. To download WSDLs, follow these steps.

  • Log into SCM Cloud instance.
  • Navigate to ‘Setup and maintenance’ page, by clicking the drop-down next to username on top right of the page.
  • In the search box of ‘Setup and maintenance’ page, type in ‘Manage External Interface Web Service Details’ and click on search button or hit enter.
  • Click on ‘Manage External Interface Web Service Details’ task in search results.
  • On ‘Manage External Interface Web Service Details’ page, click on ‘Download WSDL for external integration’.
  • Two download options are provided as shown in Figure 2.
  • Download ‘Business Event Connector’.

001

Figure 2 – Download Business Event connector WSDL.

Implementing an ICS ‘Basic Publish to ICS’ integration

ICS allows publishing of events through an ICS trigger endpoint. Events published to ICS can be forwarded to one or more registered subscribers. For this solution, the business event connector WSDL downloaded in the previous section is configured as a trigger connection for the ‘Basic Publish to ICS’ integration. These are the overall tasks to build the integration:

  • Create a connection and configure WSDL and security.
  • Create new integration using the previously created connection as trigger and ‘ICS Messaging Service’ as invoke.
  • Activate the Integration and test.

Follow these instructions to configure the integration:

  • Navigate to ‘Designer’ tab and click ‘Connections’ from menu on left.
  • Click on ‘New Connection’. Enter values for required fields.

002

Upload the WSDL file previously downloaded from SCM Cloud.

004

  • Configure security by selecting “Username Password Token” as the security policy. Note that the username and password entered on this page are irrelevant for a trigger connection. Since a trigger connection is used to initiate an integration in ICS, an ICS username and password must be provided in the SCM configuration.

005

  • Save the connection and test it. The connection is now ready for use in an integration.
  • Navigate to “Integrations” page. Click “New Integration” to create a new integration.
  • Select “Basic Publish to ICS” pattern for new integration.

006

  • In the integration editor, a “Publish to ICS” flow is displayed. On the left of the flow is the trigger, the entry into the flow. Drag the connection created previously to the trigger.

007

 

  • Configure the trigger. The steps are straightforward, as shown in the following screenshots.

008

  • Configure SOAP Operation.

009

  • Click ‘Done’ on summary page.

010

  • Drag and drop ‘ICS Messaging Service’ to the right of the integration flow. No mappings are necessary for this integration pattern.
  • Add a business identifier for tracking and save the integration.

011

  • Add a field that could help uniquely identify the message.

012

  • Activate the integration by clicking the slider button as shown.

013

  • Note the URL of the integration, by clicking on the info icon. This URL will be used by SCM Cloud as an external web service endpoint.

014

ICS integration to receive business events from SCM Cloud is ready for use.

Implementing an ICS ‘Subscribe to ICS’ integration

Subscribing to events published to ICS can be done through a few simple steps. Events can be sent to a target connection, for example a DB connection or a web service endpoint. Here are the steps to receive events in a web service.

  • Ensure that there is a “Basic Publish to ICS” integration activated and an Invoke connection to receive events is active.
  • Create a new integration in ICS and pick “Basic Subscribe to ICS” pattern. Enter a name and description for the integration.
  • ICS prompts to select one of available “Basic Publish to ICS” integrations. Select an integration and click on “Use”.

015

  • The integration editor shows a flow with “ICS messaging service” as the trigger on the left. Drag the invoke connection to the right of the flow. The following screenshot shows how to define a REST connection for the invoke. ICS displays several screens to configure the connection. The steps to configure the connection depend on the type of connection receiving the events.

016

  • Complete request and response mappings.
  • Add a tracking field, save and activate the integration. It is now ready to receive events.

Configure SCM Cloud to generate business events

The final task is to configure SCM Cloud to trigger Business Events. Follow these instructions:

 

  • Log into SCM and navigate to Setup and Maintenance.
  • Search for “Manage External Interface Web Service Details”.
  • Click on “Manage External Interface Web Service Details”.

SCM-config-001

  • Add an entry for the external interface web service. Use the endpoint of the “Basic Publish to ICS” integration. Enter ICS credentials as the username and password.

SCM-config-002

  • Search for “Manage Business Event Trigger Points” and click on the result.
  • Select “Hold” as a trigger for the business event.
  • Check the “Active” checkbox next to “Hold”.
  • Click on the hold and add a connector under “Associated Connectors”.
  • Under “Associated Connectors”, “Actions”, select “Add Row”.
  • Select the “SCM_BusinessEvent” external web service added in previous steps.

SCM-config-004

  • Save the configuration and close.
  • SCM Cloud is now configured to send business events.

Verify generation of Business Events

The solution is ready for testing. SCM Cloud and the “Basic Publish to ICS” integration are sufficient to test the solution. If an ICS subscription flow is implemented, ensure that the event has been received in the target system as well.

 

  • Navigate to “Order Management” work area in SCM Cloud.

Test001

  • Select a sales order and apply hold.

Test002

  • Log into ICS and navigate to “Monitoring” and then to “Tracking” page.
  • Verify that the event has been received under “Tracking”.

Test003

ICS has received a SOAP message from Order Management similar to this one:

<Body xmlns="http://schemas.xmlsoap.org/soap/envelope/">
    <results xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/DooDecompositionOrderStatusUpdateComposite" xmlns:ns4="http://xmlns.oracle.com/apps/scm/doo/decomposition/DooDecompositionOrderStatusUpdateComposite">
        <ns4:OrderHeader>
            <ns4:EventCode>HOLD</ns4:EventCode>
            <ns4:SourceOrderSystem>OPS</ns4:SourceOrderSystem>
            <ns4:SourceOrderId>300000011154333</ns4:SourceOrderId>
            <ns4:SourceOrderNumber>39050</ns4:SourceOrderNumber>
            <ns4:OrchestrationOrderNumber>39050</ns4:OrchestrationOrderNumber>
            <ns4:OrchestrationOrderId>300000011154333</ns4:OrchestrationOrderId>
            <ns4:CustomerId xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:nil="true"/>
            <ns4:OrderLine>
                <ns4:OrchestrationOrderLineId>300000011154334</ns4:OrchestrationOrderLineId>
                <ns4:OrchestrationOrderLineNumber>1</ns4:OrchestrationOrderLineNumber>
                <ns4:SourceOrderLineId>300000011154334</ns4:SourceOrderLineId>
                <ns4:SourceOrderLineNumber>1</ns4:SourceOrderLineNumber>
                <ns4:OrderFulfillmentLine>
                    <ns4:SourceOrderScheduleId>1</ns4:SourceOrderScheduleId>
                    <ns4:FulfillmentOrderLineId>300000011154335</ns4:FulfillmentOrderLineId>
                    <ns4:FulfillmentOrderLineNumber>1</ns4:FulfillmentOrderLineNumber>
                    <ns4:HoldCode>TD_OM_HOLD</ns4:HoldCode>
                    <ns4:HoldComments>Mani test hold </ns4:HoldComments>
                    <ns4:ItemId>300000001590006</ns4:ItemId>
                    <ns4:InventoryOrganizationId>300000001548399</ns4:InventoryOrganizationId>
                </ns4:OrderFulfillmentLine>
            </ns4:OrderLine>
        </ns4:OrderHeader>
    </results>
</Body>

Summary

This post explained how to publish Order Management events out of Supply chain Management cloud and use ICS publish and subscribe features to capture and propagate those events. This approach is suitable for R11 of SCM cloud and ICS R16.4.1. Subsequent releases of these products might offer equivalent or better event-publishing capabilities out-of-box. Refer to product documentation for later versions before implementing a solution based on this post.

Bulk import of sales transactions into Oracle Sales Cloud Incentive Compensation using Integration Cloud Service


Introduction

The Sales Cloud Incentive Compensation application provides an API to import sales transactions in bulk. These could be sales transactions exported out of an ERP system. Integration Cloud Service (ICS) offers extensive data transformation and secure file transfer capabilities that can be used to orchestrate, administer, and monitor file transfer jobs. In this post, let’s look at an ICS implementation to transform and load sales transactions into Incentive Compensation. Instructions provided in this post are applicable to Sales Cloud Incentive Compensation R11 and ICS R16.4.1 or higher.

Main Article

Figure 1 provides an overview of the solution described in this post. A text file contains sales transactions, in CSV format, exported out of ERP Cloud. ICS imports the file from a file server using SFTP, transforms the data to a format suitable for Incentive Compensation and submits an import job to Sales Cloud. The data transfer is over encrypted connections end-to-end. ICS is Oracle’s enterprise-grade iPaaS offering, with adapters for Oracle SaaS and other SaaS applications and native adapters that allow connectivity to most cloud and on-premise applications. To learn more about ICS, refer to documentation at this link.

Figure1

Figure 1 – Overview of the solution

Implementation of the solution requires the high-level tasks described in the following sections.

For the solution to work, ICS should be able to connect with Sales Cloud and the file server. ICS agents can easily enable connectivity if one of these systems is located on-premise, behind a firewall.

Configuring a file server to host ERP export file and enable SFTP

A file server is an optional component of the solution. If the source ERP system that produces the CSV file allows secure FTP access, ICS can connect to it directly. Otherwise, a file server can host the files exported from the ERP system. One way to quickly achieve this is to provision a compute node on Oracle Public Cloud and enable SFTP access to a staging folder with read/write access for the ERP system and ICS.

Defining data mapping for file-based data import service

File-based data import service requires that each import job specify a data mapping. This data mapping helps the import service assign the fields in the input file to fields in the Incentive Compensation Transaction object. There are two ways to define such a mapping.

  • Import mapping from a Spreadsheet definition
  • Define a new import by picking and matching fields on UI

Here are the steps to complete import mapping:

  • Navigate to “Setup and Maintenance”.

Figure2

  • Search for “Define File Import” task list.

Figure3

  • Click on “Manage File Import Mappings” task from list.

Figure4

  • On the next page, there are options to look up an existing mapping or create a new one for a specified object type. The two options, import from file and create a new mapping, are highlighted.

Figure5

  • If you have an Excel mapping definition, click “Import Mapping”, provide the information, and click “OK”.

Figure6

  • Otherwise, create a new mapping by clicking “Actions” -> “Create”.

Figure7

  • The next page allows field-by-field mapping between the CSV file’s fields and the fields under “Incentive Compensation Transactions”.

Figure8

The new mapping is now ready for use.

Identifying Endpoints

Importing sales transactions requires a file import web service and, optionally, another web service to collect transactions.

  • Invoke file-based data import and export service with transformed and encoded file content.
  • Invoke ‘Manage Process Submission’ service with a date range for transactions.

The file-based data import and export service can be used to import data into, and export data out of, all applications on Sales Cloud. For this solution we’ll use the “submitImportActivity” operation. The WSDL is typically accessible at this URL for Sales Cloud R11.

https://<Sales Cloud CRM host name>:<CRM port>/mktImport/ImportPublicService

The next task can be performed by logging into the Incentive Compensation application or by invoking a web service. The ‘Manage Process Submission’ service is specific to the Incentive Compensation application. The file-based import processes the input and loads the records into staging tables. The ‘submitCollectionJob’ operation of the ‘Manage Process Submission’ service then initiates the processing of the staged records into Incentive Compensation. This service is typically accessible at the URL below. Note that this action can also be performed in the Incentive Compensation UI, as described in the final testing section of this post.

https://<IC host name>:<IC port number>/publicIncentiveCompensationManageProcessService/ManageProcessSubmissionService

Implementing an ICS Orchestration

An ICS orchestration glues the other components together in a flow. ICS orchestrations can be invoked in flexible ways, such as through a scheduled trigger or an API interface. Orchestrations can perform a variety of tasks and implement complex integration logic. For the solution described in this post, ICS needs to perform the following tasks:

  • Connect to the file server and import files that match a specified filename pattern.
  • Parse through the file contents and transform each record to the format required by Incentive Compensation.
  • Convert the transformed file contents to Base64 format and store them in a string variable.
  • Invoke the file-based data import web service with the Base64-encoded data. Note that this service triggers the import process but does not wait for its completion.
  • Optionally, invoke the “Manage Submission Service” after a delay, to ensure that the file-based import has completed in Sales Cloud.

For the sake of brevity, only the important parts of the orchestration are addressed in detail here. Refer to ICS documentation for more information on building orchestrations.

 

FTP adapter configuration

FTP adapters can be used with the ‘Basic Map my data’ or Orchestration patterns. To create a new FTP connection, navigate to the “Connections” tab, click “New Connection”, and choose FTP as the type of connection.

On the “Configure Connection” page, set the “SFTP” drop-down to “Yes”. The FTP adapter allows login through an SSL certificate or a username and password.

Figure9

On the “Configure Security” page, provide credentials, such as a username and password or the password for an SSL certificate. The FTP adapter also supports PGP encryption of content.

Figure10

Transforming the source records to destination format

Source data from ERP could be in a different format than the format required by the target system. ICS provides a sophisticated mapping editor to map fields of the source record to the target record. Mapping can be as easy as dragging and dropping fields from source to target, or can be implemented with complex logic using the XML style sheet language (XSLT). Here is a snapshot of the mapping used for the transformation, primarily to convert a date string from one format to another.

Figure15

Mapping for SOURCE_EVENT_DATE requires a transformation, which is done using the transformation editor, as shown.

Figure16

Converting file content to a Base64-encoded string

File-based data import service requires the content of a CSV file to be Base64-encoded. This encoding can be done using a simple XML schema in the FTP invoke task of the orchestration. Here is the content of the schema.

<schema targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/opaque/" xmlns="http://www.w3.org/2001/XMLSchema">
<element name="opaqueElement" type="base64Binary"/>
</schema>

To configure the FTP invoke, drag and drop the connection configured previously.
Figure11

Select the operation settings as shown.
Figure12

Choose options to select an existing schema.

Figure13

Pick the schema file containing the schema.

Figure14

The FTP invoke is ready to get a file via SFTP and return the contents to the orchestration as a Base64-encoded string. Map the content to a field in the SOAP message to be sent to Incentive Compensation.

Testing the solution

To test the solution, place a CSV-formatted file in the staging folder on the file server. Here is sample content from a source file.

SOURCE_TRX_NUMBER,SOURCE_EVENT_DATE,CREDIT_DATE,ROLLUP_DATE,TRANSACTION_AMT_SOURCE_CURR,SOURCE_CURRENCY_CODE,TRANSACTION_TYPE,PROCESS_CODE,BUSINESS_UNIT_NAME,SOURCE_BUSINESS_UNIT_NAME,POSTAL_CODE,ATTRIBUTE21_PRODUCT_SOLD,QUANTITY,DISCOUNT_PERCENTAGE,MARGIN_PERCENTAGE,SALES_CHANNEL,COUNTRY
TRX-SC1-000001,1/15/2016,1/15/2016,1/15/2016,1625.06,USD,INVOICE,CCREC,US1 Business Unit,US1 Business Unit,90071,SKU1,8,42,14,DIRECT,US
TRX-SC1-000002,1/15/2016,1/15/2016,1/15/2016,1451.35,USD,INVOICE,CCREC,US1 Business Unit,US1 Business Unit,90071,SKU2,15,24,13,DIRECT,US
TRX-SC1-000003,1/15/2016,1/15/2016,1/15/2016,3033.83,USD,INVOICE,CCREC,US1 Business Unit,US1 Business Unit,90071,SKU3,13,48,2,DIRECT,US

After ICS fetches this file and transforms its content, it invokes the file-based data import service with the payload shown below.

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:typ="http://xmlns.oracle.com/oracle/apps/marketing/commonMarketing/mktImport/model/types/" xmlns:mod="http://xmlns.oracle.com/oracle/apps/marketing/commonMarketing/mktImport/model/">
 <soapenv:Header/>
 <soapenv:Body>
 <typ:submitImportActivity>
 <typ:importJobSubmitParam>
 <mod:JobDescription>Gartner demo import</mod:JobDescription>
 <mod:HeaderRowIncluded>Y</mod:HeaderRowIncluded>
 <mod:FileEcodingMode>UTF-8</mod:FileEcodingMode>
 <mod:MappingNumber>300000130635953</mod:MappingNumber>
 <mod:ImportMode>CREATE_RECORD</mod:ImportMode>
 <mod:FileContent>U09VUkNFX1.....JUkVDVCxVUw==</mod:FileContent>
 <mod:FileFormat>COMMA_DELIMITER</mod:FileFormat>
 </typ:importJobSubmitParam>
 </typ:submitImportActivity>
 </soapenv:Body>
</soapenv:Envelope>


At this point, the import job has been submitted to Sales Cloud. The status of the file import job can be tracked in Sales Cloud under ‘Setup and Maintenance’, by opening “Manage File Import Activities”. As shown below, several Incentive Compensation file imports have been submitted, with status ‘Base table upload in progress’.

Figure17

Here is a more detailed view of one job, opened by clicking on the status column of the job. This job has imported records into a staging table.

Figure18

To complete the job and see the transactions in Incentive Compensation, follow one of these two methods.

  • Navigate to “Incentive Compensation” -> “Credits and Earnings” and click on “Collect Transactions” to import data
  • OR, Invoke ‘Manage Process Submission’ service with payload similar to sample snippet below.
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:typ="http://xmlns.oracle.com/apps/incentiveCompensation/cn/processes/manageProcess/manageProcessSubmissionService/types/">
   <soapenv:Header/>
   <soapenv:Body>
      <typ:submitCollectionJob>
         <typ:scenarioName>CN_IMPORT_TRANSACTIONS</typ:scenarioName>
         <typ:scenarioVersion>001</typ:scenarioVersion>
         <typ:sourceOrgName>US1 Business Unit</typ:sourceOrgName>
         <typ:startDate>2016-01-01</typ:startDate>
         <typ:endDate>2016-01-31</typ:endDate>
         <typ:transactionType>Invoice</typ:transactionType>
      </typ:submitCollectionJob>
   </soapenv:Body>
</soapenv:Envelope>

Finally, verify that the transactions are visible in Incentive Compensation by navigating to “Incentive Compensation” -> “Credits and Earnings” from the home page and clicking “Manage Transactions”.

Figure19

Summary

This post explained a solution to import transactions into Incentive Compensation using web services provided by Sales Cloud and Incentive Compensation application. It also explained several features of Integration Cloud Service utilized to orchestrate the import. The solution discussed in this post is suitable for Sales Cloud R11 and ICS R16.4.1. Subsequent releases of these products might offer equivalent or better capabilities out-of-box. Refer to product documentation for later versions before implementing a solution based on this post.

 

 

Loading Data from Oracle Field Service Cloud into Oracle BI Cloud Service using SOAP

$
0
0

Introduction

This post details a method of extracting and loading data from Oracle Field Service Cloud (OFSC) into the Oracle Business Intelligence Cloud Service (BICS).

A compelling reason to use such a method is when data is required that is not in the standard daily extract. Such data might be planning (future) data or data recently provided in new releases of the application.

This post uses SOAP web services to extract XML-formatted data responses. It also uses the PL/SQL language to wrap the SOAP extract, XML parsing commands, and database table operations in a Stored Procedure. It produces a BICS staging table and a staging view which can then be transformed into star-schema object(s) for use in modeling. The transformation processes and modeling are not discussed in this post.

Finally, an example of a database job is provided that executes the Stored Procedure on a scheduled basis.

The PL/SQL components are for demonstration purposes only and are not intended for enterprise production use. Additional detailed information, including the complete text of the PL/SQL procedure described, is included in the References section at the end of this post.

Rationale for Using PL/SQL

PL/SQL is the only procedural tool that runs on the BICS / Database Schema Service platform. Other wrapping methods, e.g. Java or ETL tools, require a platform outside of BICS to run on.

PL/SQL may also be used in a DBCS that is connected to BICS.

PL/SQL can utilize native SQL commands to operate on the BICS tables. Other methods require the use of the BICS REST API.

Note: PL/SQL is very good at showcasing functionality. However, it tends to become prohibitively resource-intensive when deployed in an enterprise production environment. For the best enterprise deployment, an ETL tool such as Oracle Data Integrator (ODI) should be used to meet these requirements and more:

* Security

* Logging and Error Handling

* Parallel Processing and Performance

* Scheduling

* Code Re-usability and Maintenance

Using Oracle Database Cloud Service

Determining Security Protocol Requirements

If the web service requires a security protocol, key exchange, or cipher not supported by the default BICS Schema Database Service, another Oracle Database Cloud Service (DBCS) may be used.

An example security protocol is TLS version 1.2 which is used by the OFSC web service accessed in this post.

Note: For TLSv1.2, specify a database version of 11.2.0.4.10 or greater, or any version of 12c. If the database is not at the required version, PL/SQL may throw the following error: ORA-29259: end-of-input reached

To detect what protocol a web service uses, open the SOAP WSDL page in a browser, click the lock icon, and navigate to the relevant security section. A Chrome example from an OFSC WSDL page is below:

1

Preparing the DBCS

If a DBCS other than the default Schema Service is used, the following steps need to be performed.

Create a BICS user in the database. The use of Jobs and the DBMS_CRYPTO package shown in the example below are discussed later in the post. Example SQL statements are below:

-- USER SQL
CREATE USER "BICS_USER" IDENTIFIED BY password
DEFAULT TABLESPACE "USERS"
TEMPORARY TABLESPACE "TEMP"
ACCOUNT UNLOCK;
-- QUOTAS
ALTER USER "BICS_USER" QUOTA UNLIMITED ON USERS;
-- ROLES
ALTER USER "BICS_USER" DEFAULT ROLE "CONNECT","RESOURCE";
-- SYSTEM PRIVILEGES
GRANT CREATE VIEW TO "BICS_USER";
GRANT CREATE ANY JOB TO "BICS_USER";
-- OBJECT PERMISSIONS
GRANT EXECUTE ON SYS.DBMS_CRYPTO TO BICS_USER;

Create an entry in a new or existing Oracle database wallet for the trusted public certificate used to secure connections to the web service via the Internet. A link to the Oracle Wallet Manager documentation is included in the References section. Note the location and password of the wallet, as they are used to issue the SOAP request.

The need for a trusted certificate is detected when the following error occurs: ORA-29024: Certificate validation failure.

An example certificate path found using Chrome browser is shown below. Both of these trusted certificates need to be in the Oracle wallet.

2
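
Assuming the orapki command-line tool is used to manage the wallet, creating it and adding a trusted certificate can look like the following sketch (paths, password, and certificate file are illustrative):

orapki wallet create -wallet /u01/app/oracle/wallet -pwd WalletPasswd123 -auto_login
orapki wallet add -wallet /u01/app/oracle/wallet -trusted_cert -cert /tmp/trusted_root_ca.cer -pwd WalletPasswd123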

Preparing the Database Schema

Two objects need to be created prior to compiling the PL/SQL stored procedure.

The first is a staging table comprising a set of identical columns. This post uses a staging table named QUOTA_STAGING_TABLE. The columns are named consecutively as C01 through Cnn. This post uses 50 staging columns. The SQL used to create this table may be viewed here.

The second is a staging view named QUOTA_STAGING_VIEW built over the staging table. The view column names are the attribute names used in the API WSDL. The SQL used to create this view may be viewed here. The purpose of the view is to relate an attribute name found in the SOAP response to a staging table column based on the view column’s COLUMN_ID in the database. For example, if a response attribute name of bucket_id is detected and the COLUMN_ID of the corresponding view column is 3, then the staging table column populated with the attribute value would be C03.
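
A minimal sketch of that lookup, assuming the object names used in this post and an illustrative variable V_ATTRIBUTE_NAME holding the attribute name, is below:

SELECT COLUMN_ID
INTO   V_COLUMN_ID
FROM   USER_TAB_COLUMNS
WHERE  TABLE_NAME  = 'QUOTA_STAGING_VIEW'
AND    COLUMN_NAME = UPPER(V_ATTRIBUTE_NAME); -- e.g. 'BUCKET_ID' returns 3, so the value is loaded into staging column C03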

Ensuring the Web Services are Available

To ensure that the web services are available in the required environment, type a form of the following URL into a browser:

https://hostname/soap/capacity/?wsdl

Note: If you are unable to reach the website, the services may not be offered or the URL may have changed. Discuss this with the service administrator.

Using API Testing Tools

The SOAP Request Envelope should be developed in an API testing tool such as SoapUI or Postman. The XPATH expressions for parsing should be developed and tested in an XPATH expression testing tool such as FreeFormatter. Links to these tools are provided in the References section.

Note: API testing tools such as SoapUI, FreeFormatter, Postman, and so on are third-party tools for using SOAP and REST services. Oracle does not provide support for these tools or recommend a particular tool for its APIs. You can select the tool based on your requirements.

Preparing the SOAP Request

This post uses the get_quota_data method of the Oracle Field Service Cloud Capacity Management API. Additional information about the API is included as a link in the References section.

Use a browser to open the WSDL page for this API. An example URL for the page is: https://hostname/soap/capacity/?wsdl. This page provides important information regarding the request and response envelopes used by the API.

The request envelope is comprised of the following sections. Note: To complete the envelope creation, the sections are concatenated together to provide a single request envelope. An example of a complete request envelope may be viewed here.

Opening

The Opening section is static text as shown below:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="urn:toa:capacity">
<soapenv:Header/>
<soapenv:Body>
<urn:get_quota_data>

User

The User section is dynamic and comprises the following components:

Now

The now component is the current time in the UTC time zone. An example is: <now>2016-12-19T09:13:10+00:00</now>. It is populated by the following command:

SELECT TO_CHAR (SYSTIMESTAMP AT TIME ZONE 'UTC', 'YYYY-MM-DD"T"HH24:MI:SS"+00:00"' ) INTO V_NOW FROM DUAL;

Login

The login component is the user name.

Company

The company component is the company for which data is being retrieved.

Authorization String

The auth_string component is the MD5 hash of the concatenation of the now component with the MD5 hash of the user password. In pseudo-code it would be md5 (now + md5 (password)). It is populated by the following command:

SELECT
  LOWER (
    DBMS_CRYPTO.HASH (
      V_NOW ||
      LOWER ( DBMS_CRYPTO.HASH (V_PASSWORD, 2) )
      , 2
    )
  )
INTO V_AUTH_STRING FROM DUAL;

Note: '2' is the code for MD5 (the value of the DBMS_CRYPTO.HASH_MD5 constant).

An example is:

<auth_string>b477d40346ab40f1a1a038843d88e661fa293bec5cc63359895ab4923051002a</auth_string>

Required Parameters

There are two required parameters: date and resource_id. Each may have multiple entries. However, the sample procedure in this post allows only one resource ID. It also uses just one date to start with and then issues the request multiple times for the number of consecutive dates requested.

In this post, the starting date is the current date in Sydney, Australia. An example is below:

<date>2016-12-21</date> <resource_id>Test_Resource_ID</resource_id>

The starting date and subsequent dates are populated by this command:

CASE WHEN P_DATE IS NULL
THEN SELECT TO_CHAR (SYSTIMESTAMP AT TIME ZONE 'Australia/Sydney', 'YYYY-MM-DD') INTO P_DATE FROM DUAL;
ELSE P_DATE := TO_CHAR (TO_DATE (P_DATE, 'YYYY-MM-DD') + 1, 'YYYY-MM-DD'); -- Increments the day by 1
END CASE;

Aggregation

The aggregation component specifies whether to aggregate the results. Since BI will do this automatically, aggregation and totals are set to 0 (no). An example is:

<aggregate_results>0</aggregate_results> <calculate_totals>0</calculate_totals>

Field Requests

This section may be passed as a parameter and it lists the various data fields to be included in the extract. An example is below:

<day_quota_field>max_available</day_quota_field>
<time_slot_quota_field>max_available</time_slot_quota_field>
<time_slot_quota_field>quota</time_slot_quota_field>
<category_quota_field>used</category_quota_field>
<category_quota_field>used_quota_percent</category_quota_field>
<work_zone_quota_field>status</work_zone_quota_field>

Closing

The Closing section is static text as shown below:

</urn:get_quota_data>
</soapenv:Body>
</soapenv:Envelope>

Calling the SOAP Request

The APEX_WEB_SERVICE package is used to populate a request header and issue the SOAP request. The header requests that the web service return the contents in a non-compressed text format as shown below:

 

APEX_WEB_SERVICE.G_REQUEST_HEADERS(1).NAME := 'Accept-Encoding';
APEX_WEB_SERVICE.G_REQUEST_HEADERS(1).VALUE := 'identity';

For each date to be processed the SOAP request envelope is created and issued as shown below:

F_XML := APEX_WEB_SERVICE.MAKE_REQUEST(
  P_URL         => F_SOAP_URL
, P_ENVELOPE    => F_REQUEST_ENVELOPE
, P_WALLET_PATH => 'file:<wallet location>'
, P_WALLET_PWD  => '<wallet password>' );

Troubleshooting the SOAP Request Call

Common issues are the need for a proxy, the need for a trusted certificate (if using HTTPS), and the need to use the TLS security protocol. Note: This post uses DBCS so the second and third issues have been addressed.

The need for a proxy may be detected when the following error occurs: ORA-12535: TNS:operation timed out. Adding the optional p_proxy_override parameter to the call may correct the issue. An example proxy override is:

www-proxy.us.oracle.com

 

Parsing the SOAP Response

For each date to be processed the SOAP response envelope is parsed to obtain the individual rows and columns.

The hierarchy levels of the capacity API are listed below:

Bucket > Day > Time Slot > Category > Work Zone

Each occurrence of every hierarchical level is parsed to determine attribute names and values. Both the name and the value are then used to populate a column in the staging table.

When a hierarchical level is completed and no occurrences of a lower level exist, a row is inserted into the BICS staging table.

Below is an example XML response element for one bucket.

<bucket>
<bucket_id>TEST Bucket ID</bucket_id>
<name>TEST Bucket Name</name>
<day>
<date>2016-12-21</date>
<time_slot>
<label>7-10</label>
<quota_percent>100</quota_percent>
<quota>2520</quota>
<max_available>2520</max_available>
<used_quota_percent>0</used_quota_percent>
<category>
<label>TEST Category</label>
<quota_percent>100</quota_percent>
<quota>2520</quota>
<max_available>2340</max_available>
<used_quota_percent>0</used_quota_percent>
</category>
</time_slot>
<time_slot>
<label>10-14</label>
<quota_percent>100</quota_percent>
<quota>3600</quota>
<max_available>3600</max_available>
<used_quota_percent>0</used_quota_percent>
<category>
<label>TEST Category</label>
<quota_percent>100</quota_percent>
<quota>3600</quota>
<max_available>3360</max_available>
<used_quota_percent>0</used_quota_percent>
</category>
</time_slot>
<time_slot>
<label>14-17</label>
<quota_percent>100</quota_percent>
<quota>2220</quota>
<max_available>2220</max_available>
<used_quota_percent>0</used_quota_percent>
<category>
<label>TEST Category</label>
<quota_percent>100</quota_percent>
<quota>2220</quota>
<max_available>2040</max_available>
<used_quota_percent>0</used_quota_percent>
</category>
</time_slot>
</day>
</bucket>

The processing of the bucket element is as follows:

Occurrences 1 and 2 of the bucket level are parsed to return attribute names of bucket_id and name. The bucket_id attribute is used as-is and the name attribute is prefixed with “bucket_” to find the corresponding column_ids in the staging view. The corresponding columns in the staging table, C03 and C04, are then populated.

Occurrence 3 of the bucket level returns the day level element tag. Processing then continues at the day level.

Occurrence 1 of the day level returns the attribute name of date. The attribute name is prefixed with “day_” to find the corresponding column_id in the staging view. The corresponding column in the staging table, C05, is then populated with the value ‘2016-12-21’.

Occurrence 2 of the day level returns the first of three time_slot level element tags. Processing for each continues at the time-slot level. Each time_slot element contains 5 attribute occurrences followed by a category level element tag.

Each category level contains 5 attribute occurrences. Note: there is no occurrence of a work_zone level element tag in the category level. Thus after each category level element is processed, a row is written to the staging table.

The end result is that 3 rows are written to the staging table for this bucket. The table below describes the XML to row mapping for the first row.

Attribute Name      Attribute Value    View Column Name              Table Column Name
bucket_id           TEST Bucket ID     BUCKET_ID                     C03
name                TEST Bucket Name   BUCKET_NAME                   C04
day                 2016-12-21         DAY_DATE                      C05
label               7-10               TIME_SLOT_LABEL               C18
quota_percent       100                TIME_SLOT_QUOTA_PERCENT       C19
quota               2520               TIME_SLOT_QUOTA               C21
max_available       2520               TIME_SLOT_MAX_AVAILABLE       C26
used_quota_percent  0                  TIME_SLOT_USED_QUOTA_PERCENT  C29
label               TEST Category      CAT_LABEL                     C32
quota_percent       100                CAT_QUOTA_PERCENT             C33
quota               2520               CAT_QUOTA                     C35
max_available       2340               CAT_MAX_AVAILABLE             C42
used_quota_percent  0                  CAT_USED_QUOTA_PERCENT        C44

 

In PL/SQL, the processing is accomplished using the LOOP command. There is a loop for each hierarchical level. Loops end when no results are returned for a parse statement.

XPATH statements are used for parsing. Additional information regarding XPATH statements may be found in the References section. Examples are below:

Statement Returns
/bucket[5] The entire fifth bucket element in the response. If no results then all buckets have been processed.
/bucket/*[1] The first bucket attribute or element name.
/bucket/*[2]/text() The second bucket attribute value.
/bucket/day/*[6] The sixth day attribute or element name.
/bucket/day[1]/*[6]/text() The sixth day attribute value.
/bucket/day/time_slot[2]/*[4] The fourth attribute or element name of the second time_slot.
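To make this concrete, below is a minimal PL/SQL sketch of the outer bucket loop, assuming the response body has already been converted into an XMLType variable. The variable names and the use of the XMLType existsNode/extract methods are illustrative simplifications, not the actual procedure text (the complete procedure is linked in the References section).

DECLARE
v_resp  XMLTYPE;         -- assumed to hold the parsed SOAP response body
v_name  VARCHAR2(100);
v_value VARCHAR2(4000);
i       PLS_INTEGER := 1;
BEGIN
LOOP
-- no result for the next bucket means all buckets have been processed
EXIT WHEN v_resp.existsNode('/bucket[' || i || ']') = 0;
-- first attribute name and value of this bucket, e.g. bucket_id
v_name  := v_resp.extract('/bucket[' || i || ']/*[1]').getRootElement();
v_value := v_resp.extract('/bucket[' || i || ']/*[1]/text()').getStringVal();
-- ... similar nested loops follow for the day, time_slot and category levels,
-- populating staging columns and inserting a row when a level completes
i := i + 1;
END LOOP;
END;
/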

 

Scheduling the Procedure

The procedure may be scheduled to run periodically through the use of an Oracle Scheduler job. A link to the Scheduler documentation may be found in the References section.

A job is created using the CREATE_JOB procedure by specifying a job name, type, action and a schedule. Setting the enabled argument to TRUE enables the job to automatically run according to its schedule as soon as you create it.

An example of a SQL statement to create a job is below:

BEGIN
DBMS_SCHEDULER.CREATE_JOB (
JOB_NAME        => 'OFSC_SOAP_QUOTA_EXTRACT',
JOB_TYPE        => 'STORED_PROCEDURE',
ENABLED         => TRUE,
JOB_ACTION      => 'BICS_OFSC_SOAP_INTEGRATION',
START_DATE      => '21-DEC-16 10.00.00 PM Australia/Sydney',
REPEAT_INTERVAL => 'FREQ=HOURLY; INTERVAL=24' -- this will run the job every 24 hours
);
END;
/

Note: If using the BICS Schema Service database, the package name is CLOUD_SCHEDULER rather than DBMS_SCHEDULER.

The job log and status may be queried using the *_SCHEDULER_JOBS views. Examples are below:

SELECT JOB_NAME, STATE, NEXT_RUN_DATE from USER_SCHEDULER_JOBS;
SELECT LOG_DATE, JOB_NAME, STATUS from USER_SCHEDULER_JOB_LOG;

 

Summary

This post detailed a method of extracting and loading data from Oracle Field Service Cloud (OFSC) into the Oracle Business Intelligence Cloud Service (BICS).

The post used SOAP web services to extract the XML-formatted data responses. It used a PL/SQL Stored Procedure to wrap the SOAP extract, XML parsing commands, and database table operations. It loaded a BICS staging table and a staging view which can be transformed into star-schema object(s) for use in modeling.

Finally, an example of a database job was provided that executes the Stored Procedure on a scheduled basis.

For more BICS and BI best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-Team Chronicles for BICS.

References

Text of Complete Procedure

OFSC Capacity API Document

OFSC Capacity API WSDL

Scheduling Jobs with Oracle Scheduler

Database PL/SQL Language Reference

Reference Guide for the APEX_WEB_SERVICE

SOAP API Testing Tool

XPATH Testing Tool

Base64 Decoding and Encoding Testing Tool

Using Oracle Wallet Manager

Oracle Business Intelligence Cloud Service Tasks

 

Loading Data from Oracle Field Service Cloud into Oracle BI Cloud Service using REST


Introduction

This post details a method of extracting and loading data from Oracle Field Service Cloud (OFSC) into the Oracle Business Intelligence Cloud Service (BICS) using RESTful services. It is a companion to the A-Team post Loading Data from Oracle Field Service Cloud into Oracle BI Cloud Service using SOAP. Both this post and the SOAP post offer methods to complement the standard OFSC Daily Extract described in Oracle Field Service Cloud Daily Extract Description.

One case for using this method is analyzing trends regarding OFSC events.

This post uses RESTful web services to extract JSON-formatted data responses. It also uses the PL/SQL language to call the web services, parse the JSON responses, and perform database table operations in a Stored Procedure. It produces a BICS staging table which can then be transformed into star-schema object(s) for use in modeling. The transformation processes and modeling are not discussed in this post.

Finally, an example of a database job is provided that executes the Stored Procedure on a scheduled basis.

The PL/SQL components are for demonstration purposes only and are not intended for enterprise production use. Additional detailed information, including the complete text of the PL/SQL procedure described, is included in the References section at the end of this post.

Rationale for Using PL/SQL

PL/SQL is the only procedural tool that runs on the BICS / Database Schema Service platform. Other wrapping methods, e.g. Java or ETL tools, require a platform outside of BICS to run on.

PL/SQL may also be used in a DBaaS (Database as a Service) that is connected to BICS.

PL/SQL can utilize native SQL commands to operate on the BICS tables. Other methods require the use of the BICS REST API.

Note: PL/SQL is very good at showcasing functionality. However, it tends to become prohibitively resource intensive when deployed in an enterprise production environment. For the best enterprise deployment, an ETL tool such as Oracle Data Integrator (ODI) should be used to meet these requirements and more:

* Security

* Logging and Error Handling

* Parallel Processing – Performance

* Scheduling

* Code Re-usability and Maintenance

About the OFSC REST API

The document REST API for Oracle Field Service Cloud Service should be used extensively, especially the Authentication, Paginating, and Working with Events sections. Terms described there such as subscription, page, and authorization are used in the remainder of this post.

In order to receive events, a subscription is needed listing the specific events desired. The creation of a subscription returns both a subscription ID and a page number to be used in the REST calls to receive events.

At this time, a page contains 0 to 100 items (events) along with the next page number to use in a subsequent call.

The following is a list of supported event types available from the REST API:

Activity Events
Activity Link Events
Inventory Events
Required Inventory Events
User Events
Resource Events
Resource Preference Events

This post uses the following subset of events from the Activity event type:

activityCreated
activityUpdated
activityStarted
activitySuspended
activityCompleted
activityNotDone
activityCanceled
activityDeleted
activityDelayed
activityReopened
activityPreworkCreated
activityMoved

The process described in this post can be modified slightly for each different event type. Note: the columns returned for each event type differ slightly and require modifications to the staging table and parsing section of the procedure.

Using Oracle Database as a Service

This post uses the new native support for JSON offered by the Oracle 12c database. Additional information about these new features may be found in the document JSON in Oracle Database.

These features provide a solution that overcomes a limitation in the APEX_JSON package: the maximum length of JSON values in that package is limited to 32K characters, and some of the field values in OFSC events exceed this length. (Update: as of the APEX 5.1 release in December 2016, the APEX_JSON package has removed this 32K limitation.)

Preparing the DBaaS Wallet

Create an entry in a new or existing Oracle database wallet for the trusted public certificates used to secure connections to the web service via the Internet. A link to the Oracle Wallet Manager documentation is included in the References section. Note the location and password of the wallet as they are used to issue the REST request.

The need for a trusted certificate is detected when the following error occurs: ORA-29024: Certificate validation failure.

An example certificate path found using Chrome browser is shown below. Both of these trusted certificates need to be in the Oracle wallet.


Creating a BICS User in the Database

The complete SQL used to prepare the DBaaS may be viewed here.

Example SQL statements are below:

CREATE USER "BICS_USER" IDENTIFIED BY password
DEFAULT TABLESPACE "USERS"
TEMPORARY TABLESPACE "TEMP"
ACCOUNT UNLOCK;
-- QUOTAS
ALTER USER "BICS_USER" QUOTA UNLIMITED ON USERS;
-- ROLES
ALTER USER "BICS_USER" DEFAULT ROLE "CONNECT","RESOURCE";
-- SYSTEM PRIVILEGES
GRANT CREATE VIEW TO "BICS_USER";
GRANT CREATE ANY JOB TO "BICS_USER";

Creating Database Schema Objects

Three tables need to be created prior to compiling the PL/SQL stored procedure. These tables are:

*     A staging table to hold OFSC Event data

*     A subscription table to hold subscription information.

*     A JSON table to hold the JSON responses from the REST calls

The staging table, named OFSC_EVENT_ACTIVITY, has columns described in the OFSC REST API for the Activity event type. These columns are:

PAGE_NUMBER — for the page number the event was extracted from
ITEM_NUMBER — for the item number within the page of the event
EVENT_TYPE
EVENT_TIME
EVENT_USER
ACTIVITY_ID
RESOURCE_ID
SCHEDULE_DATE
APPT_NUMBER
CUSTOMER_NUMBER
ACTIVITY_CHANGES — To store all of the individual changes made to the activity

The subscription table, named OFSC_SUBSCRIPTION_PAGE, has the following columns:

SUBSCRIPTION_ID     — for the supported event types
NEXT_PAGE                — for the next page to be extracted in an incremental load
LAST_UPDATE            — for the date of the last extract
SUPPORTED_EVENT — for the logical name for the subscription event types
FIRST_PAGE               — for the first page to be extracted in a full load

The JSON table, named OFSC_JSON_TMP, has the following columns:

PAGE_NUMBER — for the page number extracted
JSON_CLOB       — for the JSON response received for each page
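For reference, below is a minimal sketch of the DDL for these three tables. The datatypes and sizes shown are illustrative assumptions; the complete SQL used is linked in the References section.

-- Illustrative DDL only; datatypes and sizes are assumptions
CREATE TABLE OFSC_EVENT_ACTIVITY (
PAGE_NUMBER      VARCHAR2(50),
ITEM_NUMBER      NUMBER,
EVENT_TYPE       VARCHAR2(50),
EVENT_TIME       VARCHAR2(50),
EVENT_USER       VARCHAR2(100),
ACTIVITY_ID      NUMBER,
RESOURCE_ID      VARCHAR2(100),
SCHEDULE_DATE    VARCHAR2(50),
APPT_NUMBER      VARCHAR2(100),
CUSTOMER_NUMBER  VARCHAR2(100),
ACTIVITY_CHANGES CLOB
);

CREATE TABLE OFSC_SUBSCRIPTION_PAGE (
SUBSCRIPTION_ID VARCHAR2(100),
NEXT_PAGE       VARCHAR2(50),
LAST_UPDATE     DATE,
SUPPORTED_EVENT VARCHAR2(100),
FIRST_PAGE      VARCHAR2(50)
);

CREATE TABLE OFSC_JSON_TMP (
PAGE_NUMBER VARCHAR2(50),
JSON_CLOB   CLOB
);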

Using API Testing Tools

The REST requests should be developed in API testing tools such as cURL and Postman. The JSON expressions for parsing should be developed and tested in a JSON expression testing tool such as CuriousConcept. Links to these tools are provided in the References section.

Note: API testing tools such as SoapUI, CuriousConcept, Postman, and so on are third-party tools for using SOAP and REST services. Oracle does not provide support for these tools or recommend a particular tool for its APIs. You can select the tool based on your requirements.

Subscribing to Receive Events

Create subscriptions prior to receiving events. A subscription specifies the types of events that you want to receive. Multiple subscriptions are recommended. For use with the method in this post, a subscription should only contain events that have the same response fields.

The OFSC REST API document describes how to subscribe using a cURL command. Postman can also easily be used. Either tool will provide a response as shown below:

{
"subscriptionId": "a0fd97e62abca26a79173c974d1e9c19f46a254a",
"nextPage": "160425-457,0",
"links": [ ... omitted for brevity ]
}

Note: The default next page is for events after the subscription is created. Ask the system administrator for a starting page number if a past date is required.

Use SQL*Plus or SQL Developer and insert a row for each subscription into the OFSC_SUBSCRIPTION_PAGE table.

Below is an example insert statement for the subscription above:

INSERT INTO OFSC_SUBSCRIPTION_PAGE
(
SUBSCRIPTION_ID,
NEXT_PAGE,
LAST_UPDATE,
SUPPORTED_EVENT,
FIRST_PAGE
)
VALUES
(
'a0fd97e62abca26a79173c974d1e9c19f46a254a',
'160425-457,0',
sysdate,
'Required Inventory',
'160425-457,0'
);

Preparing and Calling the OFSC RESTful Service

This post uses the events method of the OFSC REST API.

This method requires the Basic framework for authorization and mandates a base64 encoded value for the following information: user-login “@” instance-id “:” user-password

An example encoded result is:

dXNlci1sb2dpbkBpbnN0YW5jZS1pZDp1c2VyLXBhc3N3b3Jk

The authorization header value is the concatenation of the string ‘Basic’ with the base64 encoded result discussed above. The APEX_WEB_SERVICE package is used to set the header as shown below:

v_authorization_token := 'dXNlci1sb2dpbkBpbnN0YW5jZS1pZDp1c2VyLXBhc3N3b3Jk';
apex_web_service.g_request_headers(1).name  := 'Authorization';
apex_web_service.g_request_headers(1).value := 'Basic '||v_authorization_token;
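Rather than hard-coding the encoded value, the token may also be computed in PL/SQL. Below is a minimal sketch using the UTL_ENCODE package, with illustrative credential values:

DECLARE
v_credentials         VARCHAR2(200) := 'user-login@instance-id:user-password'; -- illustrative only
v_authorization_token VARCHAR2(400);
BEGIN
-- base64-encode the credential string as required by the Basic framework
v_authorization_token := utl_raw.cast_to_varchar2(
utl_encode.base64_encode(utl_raw.cast_to_raw(v_credentials)));
apex_web_service.g_request_headers(1).name  := 'Authorization';
apex_web_service.g_request_headers(1).value := 'Basic '||v_authorization_token;
END;
/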

The wallet path and password discussed in the Preparing the DBaaS Wallet section are also required. An example path from a Linux server is:

/u01/app/oracle

Calling the Events Request

The events request is called for each page available for each subscription stored in the OFSC_SUBSCRIPTION_PAGE table using a cursor loop as shown below:

For C1_Ofsc_Subscription_Page_Rec In C1_Ofsc_Subscription_Page
Loop
V_Subscription_Id := C1_Ofsc_Subscription_Page_Rec.Subscription_Id;
Case When P_Run_Type = 'Full' Then
V_Next_Page := C1_Ofsc_Subscription_Page_Rec.First_Page;
Else
V_Next_Page := C1_Ofsc_Subscription_Page_Rec.Next_Page;
End Case; ... End Loop;

The URL is modified for each call. The subscription_id and the starting page are from the table.

For the first call only, if the parameter / variable p_run_type is equal to ‘Full’, the staging table is truncated and the page value is populated from the FIRST_PAGE column in the OFSC_SUBSCRIPTION_PAGE table. Otherwise, the staging table is not truncated and the page value is populated from the NEXT_PAGE column.
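A minimal sketch of this first-call logic is below; the v_first_call flag is illustrative only:

-- v_first_call: illustrative flag set for the first subscription processed
IF p_run_type = 'Full' AND v_first_call THEN
EXECUTE IMMEDIATE 'TRUNCATE TABLE OFSC_EVENT_ACTIVITY';
END IF;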

Subsequent page values come from parsing the nextPage value in the responses.

An example command to create the URL from the example subscription above is:

f_ws_url := v_base_url||'/events?subscriptionId=' ||v_subscription_id|| chr(38)||'page=' ||v_next_page;

The example URL result is:

https://ofsc-hostname/rest/ofscCore/v1/events?subscriptionId=a0fd97e62abca26a79173c974d1e9c19f46a254a&page=160425-457,0

An example call using the URL is below:

f_ws_response_clob := apex_web_service.make_rest_request (
p_url => f_ws_url
,p_http_method => 'GET'
,p_wallet_path => 'file:/u01/app/oracle'
,p_wallet_pwd => 'wallet-password' );

Storing the Event Responses

Each response (page) is processed using a while loop as shown below:

While V_More_Pages
Loop
Extract_Page;
End Loop;

Each page is parsed to obtain the event type of the first item. A null (empty) event type signals an empty page and the end of the data available. An example parse to obtain the event type of the first item is below. Note: for usage of the JSON_Value function below see JSON in Oracle Database.

select json_value (f_ws_response_clob, '$.items[0].eventType' ) into f_event_type from dual;

If there is data in the page, the requested page number and the response clob are inserted into the OFSC_JSON_TMP table and the response is parsed to obtain the next page number for the next call as shown below:

f_json_tmp_rec.page_number := v_next_page; -- this is the requested page number
f_json_tmp_rec.json_clob := f_ws_response_clob;
insert into ofsc_json_tmp values f_json_tmp_rec;
select json_value (f_ws_response_clob, '$.nextPage' ) into v_next_page from dual;

Parsing and Loading the Events Responses

Each response row stored in the OFSC_JSON_TMP table is retrieved and processed via a cursor loop statement as shown below:

for c1_ofsc_json_tmp_rec in c1_ofsc_json_tmp
loop
process_ofsc_json_page (c1_ofsc_json_tmp_rec.page_number);
end loop;

An example response is below with only the first item shown:

{
"found": true,
"nextPage": "170110-13,0",
"items": [
{
"eventType": "activityUpdated",
"time": "2017-01-04 12:49:51",
"user": "soap",
"activityDetails": {
"activityId": 1297,
"resourceId": "test-resource-id",
"resourceInternalId": 2505,
"date": "2017-01-25",
"apptNumber": "82994469003",
"customerNumber": "12797495"
},
"activityChanges": {
"A_LastMessageStatus": "SuccessFlag - Fail - General Exception: Failed to update FS WorkOrder details. Reason: no rows updated for: order_id = 82994469003 service_order_id = NULL"
}
}
],
"links": [

]
}

Each item (event) is retrieved and processed via a while loop statement as shown below:

while f_more_items loop
process_item (i);
i := i + 1;
end loop;

For each item, a dynamic SQL statement is prepared and submitted to return the columns needed to insert a row into the OFSC_EVENT_ACTIVITY staging table as shown below (the details of creating the dynamic SQL statement have been omitted for brevity):

An example of a dynamically prepared SQL statement is below. Note: for usage of the JSON_Table function below see JSON in Oracle Database.

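Below is a simplified illustration of what such a statement might look like for the Activity columns described earlier. The column mappings shown are assumptions for illustration; the actual procedure assembles the statement text, the item index, and the page number at runtime:

-- Illustrative only: the item index ($.items[0]) and page number bind are
-- substituted per event by the procedure's dynamic SQL builder
SELECT jt.*
FROM OFSC_JSON_TMP t,
JSON_TABLE(t.JSON_CLOB, '$.items[0]'
COLUMNS (
EVENT_TYPE       VARCHAR2(50)   PATH '$.eventType',
EVENT_TIME       VARCHAR2(50)   PATH '$.time',
EVENT_USER       VARCHAR2(100)  PATH '$.user',
ACTIVITY_ID      NUMBER         PATH '$.activityDetails.activityId',
RESOURCE_ID      VARCHAR2(100)  PATH '$.activityDetails.resourceId',
SCHEDULE_DATE    VARCHAR2(50)   PATH '$.activityDetails.date',
APPT_NUMBER      VARCHAR2(100)  PATH '$.activityDetails.apptNumber',
CUSTOMER_NUMBER  VARCHAR2(100)  PATH '$.activityDetails.customerNumber',
ACTIVITY_CHANGES VARCHAR2(4000) FORMAT JSON PATH '$.activityChanges'
)) jt
WHERE t.PAGE_NUMBER = :page_number;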

The execution of the SQL statement and the insert are shown below:

execute immediate f_sql_stmt into ofsc_event_activity_rec;
insert into ofsc_event_activity values ofsc_event_activity_rec;

Verifying the Loaded Data

Use SQL*Plus, SQL Developer, or a similar tool to display the rows loaded into the staging table.

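For example, a query such as the one below (using the staging columns described earlier) lists the loaded events in page and item order:

SELECT PAGE_NUMBER, ITEM_NUMBER, EVENT_TYPE, EVENT_TIME, ACTIVITY_ID
FROM OFSC_EVENT_ACTIVITY
ORDER BY PAGE_NUMBER, ITEM_NUMBER;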

Troubleshooting the REST Calls

Common issues are the need for a proxy, the need for an ACL, the need for a trusted certificate (if using HTTPS), and the need to use the correct TLS security protocol. Note: This post uses DBaaS, so all but the first issue have been addressed.

The need for a proxy may be detected when the following error occurs: ORA-12535: TNS:operation timed out. Adding the optional p_proxy_override parameter to the call may correct the issue. An example proxy override is:

www-proxy.us.oracle.com

Scheduling the Procedure

The procedure may be scheduled to run periodically through the use of an Oracle Scheduler job as described in Scheduling Jobs with Oracle Scheduler.

A job is created using the DBMS_SCHEDULER.CREATE_JOB procedure by specifying a job name, type, action and a schedule. Setting the enabled argument to TRUE enables the job to automatically run according to its schedule as soon as you create it.

An example of a SQL statement to create a job is below:

BEGIN
dbms_scheduler.create_job (
job_name => 'OFSC_REST_EVENT_EXTRACT',
job_type => 'STORED_PROCEDURE',
enabled => TRUE,
job_action => 'BICS_OFSC_REST_INTEGRATION',
start_date => '12-JAN-17 11.00.00 PM Australia/Sydney',
repeat_interval => 'freq=hourly;interval=24' -- this will run once every 24 hours
);
END;
/

Note: If using the BICS Schema Service database, the package name is CLOUD_SCHEDULER rather than DBMS_SCHEDULER.

The job log and status may be queried using the *_SCHEDULER_JOBS views. Examples are below:

SELECT JOB_NAME, STATE, NEXT_RUN_DATE from USER_SCHEDULER_JOBS;
SELECT LOG_DATE, JOB_NAME, STATUS from USER_SCHEDULER_JOB_LOG;

Summary

This post detailed a method of extracting and loading data from Oracle Field Service Cloud (OFSC) into the Oracle Business Intelligence Cloud Service (BICS) using RESTful services.

The method extracted JSON-formatted data responses and used the PL/SQL language to call the web services, parse the JSON responses, and perform database table operations in a Stored Procedure. It also produced a BICS staging table which can then be transformed into star-schema object(s) for use in modeling.

Finally, an example of a database job was provided that executes the Stored Procedure on a scheduled basis.

For more BICS and BI best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-Team Chronicles for BICS.

References

Complete Procedure

JSON in Oracle Database

REST API for Oracle Field Service Cloud Service

Scheduling Jobs with Oracle Scheduler

Database PL/SQL Language Reference

APEX_WEB_SERVICE Reference Guide

APEX_JSON Reference Guide

Curious Concept JSON Testing Tool

Postman Testing Tool

Base64 Decoding and Encoding Testing Tool

Using Oracle Wallet Manager

Oracle Business Intelligence Cloud Service Tasks

 

Eloqua ICS Integration


Introduction

Oracle Eloqua, part of Oracle’s Marketing Cloud suite of products, is a cloud based B2B marketing platform that helps automate the lead generation and nurture process. It enables the marketer to plan and execute marketing campaigns while delivering a personalized customer experience to prospects.

In this blog I will describe how to integrate Eloqua with other SaaS applications using Oracle's iPaaS platform, the Integration Cloud Service (ICS).
ICS provides an intuitive web-based integration designer for point-and-click integration between applications and a rich monitoring dashboard that provides real-time insight into the transactions, all of it running on a standards-based, mature runtime platform on Oracle Cloud. ICS boasts a large library of SaaS, application, and technology adapters that add to its versatility.

One such adapter is the Eloqua adapter, which allows synchronizing accounts, contacts and custom objects with other applications. The Eloqua Adapter can be used in two ways in ICS:

  • As the target of an integration where external data is sent to Eloqua,
  • Or as the source of an integration where contacts (or other objects) flowing through a campaign or program canvas in Eloqua are sent out to any external application.

This blog provides a detailed functional as well as technical introduction to the Eloqua Adapter’s capabilities.
The blog is organized as follows:

  a. Eloqua Adapter Concepts
  b. Installing the ICS App in Eloqua
  c. Creating Eloqua connection
  d. Designing the Inbound->Eloqua flows
  e. Designing the Eloqua->Outbound flows

This blog assumes that the reader has basic familiarity with ICS as well as Eloqua.

a. Eloqua Adapter concepts

In this section we’ll go over the technical underpinnings of the ICS Eloqua adapter.

ICS Adapter is also referred to as ICS Connector; the two terms mean the same thing.

The Eloqua adapter can be used in ICS integrations both for triggering the integration and as a target (Invoke) within an integration.

When used as a target:

  • The adapter can be used to create/update Account, Contact and custom objects defined within Eloqua.
  • Under the hood the adapter uses the Eloqua Bulk 2.0 APIs to import data into Eloqua. More on this later.

When used as a trigger:

  • The Eloqua Adapter allows instantiating an ICS integration when a campaign or program canvas runs within Eloqua.
  • The adapter must be used in conjunction with a corresponding ‘ICS App’ installed within Eloqua.

    Installing the ICS App is mandatory for triggering ICS integrations. The next section describes the installation.

    The marketer in Eloqua uses this app as a step in his campaign, and the app in turn invokes the ICS endpoint at runtime. The image below shows a sample ICS App in use in a campaign canvas within Eloqua:


  • The Eloqua ICS App resides within the Eloqua AppCloud, and complements the ICS Eloqua Adapter such that contacts and other objects flow out from the campaign, into the ICS App and eventually to the ICS integration. The image below describes this.

b. Installing the ICS App in Eloqua

As explained above, installing the ICS App in Eloqua is mandatory for the Eloqua->Outbound scenarios.

The app is available in Oracle Marketplace, and the installation is straightforward:

  • Open the ICS App on Oracle Marketplace at https://cloud.oracle.com/marketplace/app/AppICS
  • Click ‘Get App’. Accept the terms and conditions in the popup. Click ‘Next’. This will redirect you to your Eloqua login page. Sign in, and click ‘Accept and Install’

  • The next page takes you to the ICS configuration, where you need to provide the ICS URL, username and password. Click ‘Save’.

  • Click ‘Sign In’ on the next page, thus providing the app access to Eloqua on your behalf (OAuth2).

  • Click ‘Accept’ on the next page.
  • The ICS App is now installed and ready to use as an ‘Action’ in Eloqua Canvas.

Now we will look at creating Eloqua connections and integrations in ICS.

c. Creating Eloqua connection in ICS

  1. Log on to the ICS home page. Click on ‘Create Connections’, then ‘New Connection’, and choose ‘Eloqua’.
  2. Name the connection appropriately.

  3. The Connection Role can be:
    • a. Trigger, used in integrations where the connection is only used to trigger the integration.
    • b. Invoke, used in integrations where the connection is only used as target.
    • c. Or Trigger and Invoke, which can be used either way.
  4. Click ‘Create’. Click on the ‘Configure Security’ button, and enter the Eloqua Company name, username and password. Then click on ‘Test’.
  5. At this point ICS authenticates with Eloqua using the credentials provided above. The authentication process depends on the connection role:

  • a. If ‘Invoke’ role, then ICS performs an HTTP Basic Authentication to https://login.eloqua.com using the base64-encoded “<company>\<username>:<password>” string. This process is described in more detail here.
  • b. If ‘Trigger’ or ‘Trigger and Invoke’ role, then along with the above test, ICS also reaches out to the Eloqua AppCloud and checks whether the Eloqua ICS App has been installed. If it is not installed, the connection test will fail.

Once the connection test is successful, save the connection.

Now that the connection has been defined, we can use the Eloqua adapter in an ICS integration to sync data. Let’s take a look at designing the Inbound->Eloqua use cases, i.e. where Eloqua is the target application.

    d. Designing the Inbound->Eloqua flows

    The Eloqua adapter for inbound->Eloqua flows only relies on the Bulk 2.0 APIs and doesn’t need the ICS App to be installed in Eloqua.
    Below are the steps to configure the adapter.

    Design time:

    • Create an ICS integration, and drag the Eloqua connection on the target or as an invoke activity in an orchestration.
    • Name your endpoint and click Next.
    • On the operations page, you can choose the Eloqua business object that needs to be created/updated, as well as fields within the object. You can choose the field to be uniquely matched on, etc.

    • You can also set the Auto-Sync time interval such that periodically the Eloqua data inserted into staging area will be synced to actual Eloqua tables.
    • Finish the wizard, complete the rest of the integration, and then activate it.

    At runtime, since we know that under the hood the Bulk Import APIs are being used, the following specific events happen:

    • Depending on the business object and the fields chosen, an import definition is created by POSTing to the “/bulk/2.0/<object>/imports/” Eloqua endpoint.
    • This returns a unique URI in the response, which is used to POST the actual data to Eloqua. Thus, as data gets processed through the ICS integration, it reaches the Eloqua Invoke activity, which internally uses the URI returned above to POST the data to Eloqua. The data is now in the Eloqua staging area, ready to be synced into Eloqua.
    • Now, depending on the ‘Auto-Sync’ interval defined in design-time, periodically the ‘/bulk/2.0/syncs’ endpoint is invoked which moves the data from the staging area to Eloqua database tables.

    The Bulk API steps above are described in more detail here.

    e. Designing the Eloqua->Outbound flows

    Design time :

    • Create an ICS integration, and drag the Eloqua connection as the source of the integration.
    • Select the business object, select the fields, followed by selecting the response fields.
    • Finish the wizard. Complete the integration and activate it.

    When the integration is activated, ICS makes a callout to the Eloqua ICS App, registering the integration name, its ICS endpoint, and request and response fields chosen above.

    At this point, back in the Eloqua UI, the marketer can configure the ICS App in her campaign by choosing among the activated ICS integrations and configuring them appropriately. For example, the screenshot below shows the ICS App’s ‘cloud action’ configuration screen from a sample Eloqua campaign, after an integration called ‘eloqua_blog’ with the Eloqua Adapter as source is activated:

    The Marketer now runs her campaign. Contacts start flowing through various campaign steps, including the ICS App step, at which point the ICS App gets invoked, which in turn invokes the configured ICS integration.


    Integrating Oracle Project Cloud with Documents Cloud Service using REST APIs and business object-level security


    Introduction

    Oracle Documents Cloud Service (DCS) enables collaboration through a rich set of social and mobile-optimized features. Customers often come across requirements to integrate DCS with Oracle ERP Cloud. Such integration improves productivity by taking advantage of the features of DCS. In this post, let’s take a look at integrating Project Management Cloud, a part of Oracle ERP Cloud, with DCS. Contents of this post are applicable to R11 of Project Management Cloud and R16.4.5 of DCS.

    Main Article

    Project Cloud and Document Cloud both provide secure REST APIs for integration. In addition, Document Cloud offers UI integration through applinks, short-lived links accessible through an HTML IFRAME. Project Cloud offers UI customization through Page Composer, which is sufficient to implement this solution. See links to the documentation for these APIs in the References section below. The solution described in this post uses the aforementioned APIs and tools and a custom integration service deployed to JCS-SX. It leverages parts of the design described in another blog post on integrating DCS and Sales Cloud (link provided in the References section). Below is a high-level depiction of the solution.


    Figure 1 – Overview of the solution

     

    JCS-SX is a PaaS-for-SaaS offering usually deployed alongside the Oracle SaaS application and pre-integrated with SaaS through Single-Sign-on. Guidance to implement this solution is split into subsections. For ease of comprehension, these instructions are abstracted. Click on one of the links below to jump to a subsection of interest.

    Documents Cloud REST API

    The following actions need to be performed through the API:

    • Query whether a sub-folder exists in DCS for the selected project.
    • Create a sub-folder for the project, based on project name.
    • Get an appslink to the sub-folder

    Get contents of a folder, in order to verify existence of sub folder with same name as project:
    Request:

    GET /documents/api/1.1/folders/F7A4AF94F58A48892821654E3B57253386C697CACDB0/items HTTP/1.1
    Host: <DocsCloudHostName:port>
    Authorization: Basic am9obi5kdW5iYXI6VmlzaW9uMTIzIQ==
    .....
    

    Response:

    ....
    {
    "type": "folder",
    "id": "FE4E22621CBDA1E250B26DD73B57253386C697CACDB0",
    "parentID": "F7A4AF94F58A48892821654E3B57253386C697CACDB0",
    "name": "Cloud based HCM",
    "ownedBy": {
    "displayName": "John Doe",
    "id": "UDFE5D9A1F50DAA96DA5F4723B57253386C6",
    "type": "user"
    }
    ...

    Create a new sub-folder:

    Request:

    POST /documents/api/1.1/folders/F7A4AF94F58A48892821654E3B57253386C697CACDB0 HTTP/1.1
    Host: <hostname:port>
    Authorization: Basic am9obi5kdW5iYXI6VmlzaW9uMTIzIQ==
    …..
    {
        "name": "TestFolder1",
        "description": "TestFolder"
    }

    Response:

    HTTP/1.1 201 Created
    Date: Tue, 24 Jan 2017 22:14:50 GMT
    Location: https://docs-gse00000310.documents.us2.oraclecloud.com/documents/api/1.1/folders/F073C821561724BDA2E6B6C73B57253386C697CACDB0
    ….

    Create appslink to a subfolder:
    Request:

    POST /documents/api/1.1/applinks/folder/F7A4AF94F58A48892821654E3B57253386C697CACDB0 HTTP/1.1
    Host: <DCS host:port>
    Authorization: Basic am9obi55iYXI6VmlzaW9uMTIzIQ==
    ....
    
    {
        "assignedUser": "casey.brown",
        "role":"contributor"
    }

    Response:

    HTTP/1.1 200 OK
     Date: Wed, 25 Jan 2017 00:52:40 GMT
     Server: Oracle-Application-Server-11g
     .....
    
    {
     "accessToken": "eDkMUdbNQ2ytyNTyghBbyj43yBKpY06UYhQer3EX_bAQKbAfv09d4T7zuS5AFHa2YgImBiecD2u-haE_1r3SYA==",
     "appLinkID": "LF0fW2LLCZRsnvk1TVNcz5UhiqDSflq_2Kht39UOZGKsglZo_4WT-OkR1kEA56K91S1YZxSa8pBpQZD6BSWYCnAXZZKAZaela3IySlgJaaAvJrijCvWTazDqCeY56DvyYgHNjAoZPSy2dL0DzaCWi0XA==",
     "appLinkUrl": "https://docs-gse00000310.documents.us2.oraclecloud.com/documents/embed/link/app/LF0fW2LLCZRsnvk1TVNcz5UhiqDSflq_2Kht39UOZGKsglZo_4WT-OkR1kEA56K91S1YZxSa8pBpQZD6BSWYCnAXZZKAZaela3IySlgJaaAvJrijCvWTazDqCeY56DvyYgHNjAoZPSy2dL0DzaCWi0XA==/folder/F7A4AF94F58A48892821654E3B57253386C697CACDB0/_GruppFinancial",
     "errorCode": "0",
     "id": "F7A4AF94F58A48892821654E3B57253386C697CACDB0",
     "refreshToken": "LugYsmKWK6t5aCfAb8-lgdmp7jgF8v3Q9aEtits4oy0Oz9JtaYnL9BOs8q4lwXK8",
     "role": "contributor",
     "type": "applink"
     }

    Project Cloud REST API

    The JCS-SX service in the solution ensures that only users with access to a project can access the corresponding folder in DCS. This is achieved by invoking the Project API with the JWT passed to the service by Project Cloud. Without a valid token, the JCS-SX service returns an error.

    Here is the sample payload for the service.
    Request:

    GET /projectsFinancialsApi/resources/11.1.11/projects/300000058801556?fields=ProjectId,ProjectName&onlyData=true HTTP/1.1
    Host: <Project Cloud>
    Authorization:Bearer <JWT token>
    ...

    Response:

    HTTP/1.1 200 OK
    Server: Oracle-Application-Server-11g
    …
    {
      "ProjectId" : 300000058801556,
      "ProjectName" : "Dixon Financials Upgrade"
    }

    Security

    There are several aspects of security addressed by this solution.

    • Project Cloud and JCS-SX integration is secured by single-sign-on infrastructure of which both systems are participants. Single sign-on is enabled for JCS-SX instances and their co-located Fusion SaaS applications. This integration only ensures that the service is invoked on behalf of a valid user of ERP Cloud.
    • The API calls from JCS-SX to Project Cloud are secured by JWT tokens supplied by Project Cloud upon invoking the JCS-SX service. This JWT token is bound to the currently logged-in Project Cloud user. JWT tokens are issued with a predetermined expiry time.
    • JCS-SX to DCS integration in this solution is secured by basic authentication. Federation of identity domains could allow seamless authentication and authorization of users between these two systems, with additional effort.

    JCS-SX Service

    This is a Java EE servlet that takes Project Name, Project ID and a JWT token as query string parameters. The functions of the service are as follows:

    • Using the supplied JWT token and Project ID, try to get information about the project using the Project Cloud REST API. If the request fails, stop processing and return an “HTTP 401 unauthorized” error.
    • If the previous step succeeds, query DCS for a sub-folder with the supplied project name. The root folder ID in DCS and basic authentication credentials are available to the servlet.
    • If a sub-folder does not exist, create a new sub-folder.
    • Create an appslink to the sub-folder. Generate HTML content with an IFRAME element pointing to the appslink returned by DCS API.

    Customizing Project Cloud

    For this integration, Project Cloud must be customized for the following:

    • Invoke JCS-SX service
    • Pass Project information such as Name and Id, along with a JWT token to JCS-SX service.
    • Display the appslink content from DCS.

    Project Cloud does not yet provide the app composer tool available in Sales Cloud at the time of publishing this post. However, page composer’s features are sufficient for this integration.  Here are the steps to implement:

    • Create and activate a sandbox, if the current user does not have one already.
    • Navigate to an appropriate page of Project Management Cloud where Document Cloud’s content could be displayed. For this solution, let’s navigate to Home->Projects->Project Financial Management. Then, search for projects and click on a project, then click on the Documents tab.


    • Click on the top right menu and select “Customize Pages”. Page Composer is now activated on the current page.
    • Click on a section of page where DCS appslink should be displayed.
    • On top left menu of Page Composer, click on “View” and select “Source”. Click on “Add Content”, click on “Components” and select “Web Page” widget.
    • Once the widget is displayed, drag the edges to the desired size. Then, while the web page widget is selected, click on “Edit” in the Page Composer menu, on the top left. The Web Page component’s property dialog is displayed. Click the drop-down next to the “Source” field and select “Expression Builder”. Enter the appropriate JCS-SX host and service URI for the JCS-SX service. Notice the binding variables for project information and the JWT token supplied through the query string; these variables are available to the page by default.
      https://<JCS-SX HOST>:<PORT>/doccloud?projectID=#{bindings.ProjectId.inputValue}&projectName=#{bindings.Name.inputValue}&buname=#{bindings.Name3.inputValue}&customername=#{bindings.Customer.inputValue}&jwt=#{applCoreSecuredToken.trustToken}


    • Click OK to submit and click “Apply” on the Component Properties page. If the integration works end-to-end, the DCS page should be displayed, with a sub-folder named after the project in focus. Users can drag and drop documents into the widget to add documents.


    Summary

    This article explains how to integrate Oracle Project Management Cloud and DCS using REST API and JCS-SX.  It provides API snippets, instructions for customizing Project Cloud and the overall logic of the service deployed on JCS-SX. This approach is suitable for R11 of ERP cloud and R16.4.5 of DCS. Subsequent releases of these products offer equivalent or better integration capabilities. Refer to product documentation for later versions before implementing a solution based on this article. 

    References

    DCS REST API:

    http://docs.oracle.com/cloud/latest/documentcs_welcome/WCDRA/index.html

    Project Portfolio Management Cloud REST API:

    http://docs.oracle.com/cloud/latest/projectcs_gs/FAPAP/

    Blog on Sales Cloud to DCS integration:

    http://www.ateam-oracle.com/integrating-oracle-document-cloud-and-oracle-sales-cloud-maintaining-data-level-business-object-security/

     

     

    Accessing Fusion Data from BI Reports using Java


    Introduction

    In a recent article on A-Team Chronicles, Richard Williams explained how you can execute a BI Publisher report from a SOAP service and retrieve the report, as XML, as part of the response of the SOAP call. This blog article serves as a follow-on, providing a tutorial-style walk-through of how to implement the above procedure in Java.

    This article assumes you have already followed the steps in Richard’s blog article and created your report in BI Publisher, exposed it as a SOAP Service and tested this using SOAPUI, or another SOAP testing tool.

    Following Richard’s guidance, we know that the correct SOAP call could look like this:

    <soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
       <soap:Header/>
       <soap:Body>
          <pub:runReport>
             <pub:reportRequest>
                <pub:reportAbsolutePath>/~angelo.santagata@oracle.com/Bi report.xdo</pub:reportAbsolutePath>
                <pub:reportRawData xsi:nil="true" >true</pub:reportRawData>
                <pub:sizeOfDataChunkDownload>-1</pub:sizeOfDataChunkDownload>
                <pub:flattenXML>true</pub:flattenXML>
                <pub:byPassCache>true</pub:byPassCache>
             </pub:reportRequest>
             <pub:appParams/>
          </pub:runReport>
       </soap:Body>
    </soap:Envelope>
    
    
    Tip: One easy way to determine the report’s location is to run the report and then examine the URL in the browser.

     

    Implementing the SOAP call using JDeveloper 11g

    We now need to implement the Java SOAP client to call our SOAP service. For this blog we will use JDeveloper 11g, the IDE recommended for extending Oracle Fusion; however, you are free to use your IDE of choice, e.g. NetBeans, Eclipse, vi, Notepad, etc., though the steps will obviously be different.

    Creating the project

    Within JDeveloper 11g start by creating a new Application and within this application create two generic projects. Call one project “BISOAPServiceProxy” and the other “FusionReportsIntegration”. The “BISOAPServiceProxy” project will contain a SOAP Proxy we are going to generate from JDeveloper 11g and the “FusionReportsIntegration” project will contain our custom client code. It is good practice to create separate projects so that the SOAP Proxies resides in its own separate project, this allows us to regenerate the proxy from scratch without affecting any other code.

    Generating the SOAP Proxy

    For this example we will be using the SOAP Proxy wizard as part of JDeveloper. This functionality generates a static proxy for us, which in turn makes it easier to generate the required SOAP call later.

    1. With the BISOAPServiceProxy project selected, start the JDeveloper SOAP Proxy wizard: File -> New -> Business Tier -> Web Services -> Web Service Proxy.
    2. Click Next.
    3. Skipping the first welcome screen, in step 2 select JAX-WS Style as the type of SOAP proxy you wish to generate. In step 3 enter the WSDL of your Fusion Applications BI Publisher web service. It’s best to check this URL returns a WSDL document in your web browser before entering it here. The WSDL location will normally be something like: http://<your fusion Applications Server>/xmlpserver/services/ExternalReportWSSService?wsdl. It’s recommended that you leave the copy WSDL into project check-box selected.
    4. Give a package name; unless you need to, it’s recommended to leave the Root Package for generated types blank.
    5. Now hit Finish.

    Fixing the project dependencies

    We now need to make sure that the “FusionReportsIntegration” project is able to see classes generated by the “BISOAPServiceProxy” project. To resolve this in JDeveloper we simply need to set up a dependency between the two projects.

    1. With the FusionReportsIntegration project selected, right-mouse click on the project and select “Project Properties”.
    2. In the properties panel select Dependencies.
    3. Select the little pencil icon and in the resulting dialog select “Build Output”. This selection tells JDeveloper that “this project depends on the successful build output” of the other project.
    4. Save the dialog.
    5. Close [OK] the Project Properties dialog.
    6. Now is a good time to hit compile and make sure the SOAP proxy compiles without any errors; given we haven’t written any code yet, it should compile just fine.

    Writing the code to execute the SOAP call

    With the SOAP proxy generated and the project dependency set up, we’re now ready to write the code which will call the BI server using the generated SOAP proxy.

    1. With the FusionReportsIntegration project selected, right-mouse click -> New -> Java -> Java Class.
    2. Enter a name, and a Java package name, for your class.
    3. Ensure that “Main Method” is selected. This is so we can execute the code from the command line; you will want to change this depending on where you execute your code from, e.g. a library, a servlet, etc.
    4. Within the main method you will need to enter the following code snippet. Once this code snippet is pasted you will need to correct and resolve imports for your project.
      1.	ExternalReportWSSService_Service externalReportWSSService_Service;
      2.	// Initialise the SOAP Proxy generated by JDeveloper based on the following WSDL xmlpserver/services/ExternalReportWSSService?wsdl
      3.	externalReportWSSService_Service = new ExternalReportWSSService_Service();
      4.	// Set security Policies to reflect your fusion applications
      5.	SecurityPoliciesFeature securityFeatures = new SecurityPoliciesFeature(new String[]
      6.	{ "oracle/wss_username_token_over_ssl_client_policy" });
      7.	// Initialise the SOAP Endpoint
      8.	ExternalReportWSSService externalReportWSSService = externalReportWSSService_Service.getExternalReportWSSService(securityFeatures);
      9.	// Create a new binding, this example hardcodes the username/password, 
      10.	// the recommended approach is to store the username/password in a CSF keystore
      11.	WSBindingProvider wsbp = (WSBindingProvider)externalReportWSSService;
      12.	Map<String, Object> requestContext = wsbp.getRequestContext();
      13.	//Map to appropriate Fusion user ID, no need to provide password with SAML authentication
      14.	requestContext.put(WSBindingProvider.USERNAME_PROPERTY, "username");
      15.	requestContext.put(WSBindingProvider.PASSWORD_PROPERTY, "password");
      16.	requestContext.put(WSBindingProvider.ENDPOINT_ADDRESS_PROPERTY, "https://yourERPServer:443/xmlpserver/services/ExternalReportWSSService");
      
      17.	// Create a new ReportRequest object using the generated ObjectFactory
      18.	ObjectFactory of = new ObjectFactory();
      19.	ReportRequest reportRequest = of.createReportRequest();
      20.	// reportAbsolutePath contains the path+name of your report
      21.	reportRequest.setReportAbsolutePath("/~angelo.santagata@oracle.com/Bi report.xdo");
      22.	// We want raw data
      23.	reportRequest.setReportRawData("");
      24.	// Get all the data
      25.	reportRequest.setSizeOfDataChunkDownload(-1); 
      26.	// Flatten the XML response
      27.	reportRequest.setFlattenXML(true);
      28.	// ByPass the cache to ensure we get the latest data
      29.	reportRequest.setByPassCache(true);
      30.	// Run the report
      31.	ReportResponse reportResponse = externalReportWSSService.runReport(reportRequest, "");
      32.	// Display the output, note the response is an array of bytes, you can convert this to a String
      33.	// or you can use a DocumentBuilder to put the values into a XLM Document object for further processing
      34.	System.out.println("Content Type="+reportResponse.getReportContentType());
      35.	System.out.println("Data ");
      36.	System.out.println("-------------------------------");
      37.	String data=new String (reportResponse.getReportBytes());
      38.	System.out.println(data);
      39.	System.out.println("-------------------------------");
    Going through the code

      Line What does it do
      1-3 This is the instantiation of a new class containing the WebService Proxy object. This was generated for us earlier
      5 Initialise a new instance of a security policy object, with the correct security policy, for your Oracle Fusion server. The most common security policy is that of “oracle/wss_username_token_over_ssl_client_policy”; however your server may be set up differently.
      8 Calls the factory method to initialise a SOAP endpoint with the correct security features set
      9-16 These lines set up the SOAP binding so that it knows which endpoint to execute (i.e. the hostname + URI of your web service, which is not necessarily the endpoint where the SOAP proxy was generated), the username and the password. In this example we are hard-coding the details because we are going to be running this example on the command line. If this code is to be executed on a JEE server, e.g. WebLogic, then we recommend this data is stored in the Credential store as CSF keys.
      17-19 Here we create a reportRequest object and populate it with the appropriate parameters for the SOAP call. Although not mandatory, it’s recommended that you use the objectFactory generated by the SOAP proxy wizard in JDeveloper.
      21 This set the ReportPath parameter, including path to the report
      23 This line ensures we get the raw data without decoration, layouts etc.
      25 By default BI Publisher publishes data on a range basis, e.g. 50 rows at a time; for this use case we want all the rows, and setting this to -1 ensures that.
      27 Tells the webservice to flatten out the XML which is produced
      29 This is an optional flag which instructs the BI Server to bypass the cache and go direct to the database
      30 This line executes the SOAP call, passing the “reportRequest” object we previously populated as a parameter. The return value is a reportResponse object.
      34-39 These lines print out the results from the BI Server. Of notable interest is that the XML document is returned as a byte array. In this sample we simply print out the results to the output; however, you would normally pass the resulting XML into Java routines to generate an XML Document object for further processing.

     

     

    Because we are running this code from the command line as a Java client, we need to import the Fusion Apps certificate into the Java key store. If you run the code from within JDeveloper then the Java keystore is stored in <JDeveloperHome>\wlserver_10.3\server\lib\DemoTrust.jks.

    Importing certificates

     

    1. Download the Fusion Applications SSL certificate: using a browser like Internet Explorer, navigate to the SOAP WSDL URL.
    2. Mouse click on the security icon, which will bring you to the certificate details.
    3. View the certificate.
    4. Export the certificate as a CER file.
    5. From the command line we now need to import the certificate into our DemoTrust.jks file using the following command: keytool -import -alias fusionKey -file fusioncert.cer -keystore DemoTrust.jks


    Now ready to run the code!

    With the runReport.java file selected press the “Run” button. If all goes well the code will execute and you should see the XML result of the BI report displayed on the console.

     

    Loading Data from Oracle Field Service Cloud into Oracle BI Cloud Service using REST

    $
    0
    0

    Introduction

    This post details a method of extracting and loading data from Oracle Field Service Cloud (OFSC) into the Oracle Business Intelligence Cloud Service (BICS) using RESTful services. It is a companion to the A-Team post Loading Data from Oracle Field Service Cloud into Oracle BI Cloud Service using SOAP . Both this post and the SOAP post offer methods to complement the standard OFSC Daily Extract described in Oracle Field Service Cloud Daily Extract Description.

    One case for using this method is analyzing trends regarding OFSC events.

    This post uses RESTful web services to extract JSON-formatted data responses. It also uses the PL/SQL language to call the web services, parse the JSON responses, and perform database table operations in a Stored Procedure. It produces a BICS staging table which can then be transformed into star-schema object(s) for use in modeling. The transformation processes and modeling are not discussed in this post.

    Finally, an example of a database job is provided that executes the Stored Procedure on a scheduled basis.

    The PL/SQL components are for demonstration purposes only and are not intended for enterprise production use. Additional detailed information, including the complete text of the PL/SQL procedure described, is included in the References section at the end of this post.

    Update: As of December, 2016 the  APEX 5.1 APEX_JSON package has removed the limitation of 32K lengths for JSON values. A new section has been added to this post named Parsing Events Responses using APEX_JSON.

    Rationale for Using PL/SQL

    PL/SQL is the only procedural tool that runs on the BICS / Database Schema Service platform. Other wrapping methods e.g. Java, ETL tools, etc. require a platform outside of BICS to run on.

    PL/SQL may also be used in a DBaaS (Database as a Service) that is connected to BICS.

    PL/SQL can utilize native SQL commands to operate on the BICS tables. Other methods require the use of the BICS REST API.

    Note: PL/SQL is a very good at showcasing functionality. However, it tends to become prohibitively resource intensive when deploying in an enterprise production environment. For the best enterprise deployment, an ETL tool such as Oracle Data Integrator (ODI) should be used to meet these requirements and more:

    * Security

    * Logging and Error Handling

    * Parallel Processing – Performance

    * Scheduling

    * Code Re-usability and Maintenance

    About the OFSC REST API

    The document REST API for Oracle Field Service Cloud Service should be used extensively, especially the Authentication, Paginating, and Working with Events sections. Terms described there such as subscription, page, and authorization are used in the remainder of this post.

    In order to receive events, a subscription is needed listing the specific events desired. The creation of a subscription returns both a subscription ID and a page number to be used in the REST calls to receive events.

    At this time, a page contains 0 to 100 items (events) along with the next page number to use in a subsequent call.

    The following is a list of supported events types available from the REST API:

    Activity Events
    Activity Link Events
    Inventory Events
    Required Inventory Events
    User Events
    Resource Events
    Resource Preference Events

    This post uses the following subset of events from the Activity event type:

    activityCreated
    activityUpdated
    activityStarted
    activitySuspended
    activityCompleted
    activityNotDone
    activityCanceled
    activityDeleted
    activityDelayed
    activityReopened
    activityPreworkCreated
    activityMoved

    The process described in this post can be modified slightly for each different event type. Note: the columns returned for each event type differ slightly and require modifications to the staging table and parsing section of the procedure.

    Using Oracle Database as a Service

    This post uses the new native support for JSON offered by the Oracle 12c database. Additional information about these new features may be found in the document JSON in Oracle Database.

    These features provide a solution that overcomes a current limitation in the APEX_JSON package. The maximum length of JSON values in that package is limited to 32K characters. Some of the field values in OFSC events exceed this length.

    Preparing the DBaaS Wallet

    Create an entry in a new or existing Oracle database wallet for the trusted public certificates used to secure connections to the web service via the Internet. A link to the Oracle Wallet Manager documentation is included in the References section. Note the location and password of the wallet as they are used to issue the REST request.

    The need for a trusted certificate is detected when the following error occurs: ORA-29024: Certificate validation failure.

    An example certificate path found using Chrome browser is shown below. Both of these trusted certificates need to be in the Oracle wallet.

    • 2

    Creating a BICS User in the Database

    The complete SQL used to prepare the DBaaS may be viewed here.

    Example SQL statements are below:

    CREATE USER “BICS_USER” IDENTIFIED BY password
    DEFAULT TABLESPACE “USERS”
    TEMPORARY TABLESPACE “TEMP”
    ACCOUNT UNLOCK;
    — QUOTAS
    ALTER USER “BICS_USER” QUOTA UNLIMITED ON USERS;
    — ROLES
    ALTER USER “BICS_USER” DEFAULT ROLE “CONNECT”,”RESOURCE”;
    — SYSTEM PRIVILEGES
    GRANT CREATE VIEW TO “BICS_USER”;
    GRANT CREATE ANY JOB TO “BICS_USER”;

    Creating Database Schema Objects

    Three tables need to be created prior to compiling the PL/SQL stored procedure. These tables are:

    *     A staging table to hold OFSC Event data

    *     A subscription table to hold subscription information.

    *     A JSON table to hold the JSON responses from the REST calls

    The staging table, named OFSC_EVENT_ACTIVITY, has columns described in the OFSC REST API for the Activity event type. These columns are:

    PAGE_NUMBER — for the page number the event was extracted from
    ITEM_NUMBER — for the item number within the page of the event
    EVENT_TYPE
    EVENT_TIME
    EVENT_USER
    ACTIVITY_ID
    RESOURCE_ID
    SCHEDULE_DATE
    APPT_NUMBER
    CUSTOMER_NUMBER
    ACTIVITY_CHANGES — for all of the individual changes made to the activity
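
    A minimal DDL sketch for this staging table is below. The datatypes are assumptions, since the complete SQL linked above is the authoritative source:

    CREATE TABLE ofsc_event_activity (
      page_number      VARCHAR2(32),    -- page the event was extracted from
      item_number      NUMBER,          -- item number within the page
      event_type       VARCHAR2(64),
      event_time       VARCHAR2(32),
      event_user       VARCHAR2(64),
      activity_id      VARCHAR2(32),
      resource_id      VARCHAR2(64),
      schedule_date    VARCHAR2(32),
      appt_number      VARCHAR2(64),
      customer_number  VARCHAR2(64),
      activity_changes VARCHAR2(4000)   -- individual changes made to the activity
    );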

    The subscription table, named OFSC_SUBSCRIPTION_PAGE, has the following columns:

    SUBSCRIPTION_ID — for the ID of the subscription (one per group of supported event types)
    NEXT_PAGE — for the next page to be extracted in an incremental load
    LAST_UPDATE — for the date of the last extract
    SUPPORTED_EVENT — for the logical name of the subscription's event types
    FIRST_PAGE — for the first page to be extracted in a full load

    The JSON table, named OFSC_JSON_TMP, has the following columns:

    PAGE_NUMBER — for the page number extracted
    JSON_CLOB       — for the JSON response received for each page
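
    Hedged DDL sketches for these two tables are below, again with assumed datatypes:

    CREATE TABLE ofsc_subscription_page (
      subscription_id VARCHAR2(64),
      next_page       VARCHAR2(32),
      last_update     DATE,
      supported_event VARCHAR2(64),
      first_page      VARCHAR2(32)
    );

    CREATE TABLE ofsc_json_tmp (
      page_number VARCHAR2(32),
      json_clob   CLOB
    );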

    Using API Testing Tools

    The REST requests should be developed in API testing tools such as cURL and Postman. The JSON expressions for parsing should be developed and tested in a JSON expression testing tool such as Curious Concept. Links to these tools are provided in the References section.

    Note: API testing tools such as SoapUI, Curious Concept, Postman, and so on are third-party tools for using SOAP and REST services. Oracle does not provide support for these tools or recommend a particular tool for its APIs. You can select a tool based on your requirements.

    Subscribing to Receive Events

    Create subscriptions prior to receiving events. A subscription specifies the types of events that you want to receive. Multiple subscriptions are recommended: for use with the method in this post, each subscription should contain only events that share the same response fields.

    The OFSC REST API document describes how to subscribe using a cURL command. Postman can also easily be used. Either tool will provide a response as shown below:

    {
      "subscriptionId": "a0fd97e62abca26a79173c974d1e9c19f46a254a",
      "nextPage": "160425-457,0",
      "links": [ ... omitted for brevity ]
    }
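
    For reference, the create-subscription call itself might look like the following sketch. The request body shown is illustrative only; consult the OFSC REST API document for the exact schema:

    curl -u 'user-login@instance-id:user-password' \
         -X POST \
         -H 'Content-Type: application/json' \
         -d '{ "events": ["activityCreated", "activityUpdated"] }' \
         'https://ofsc-hostname/rest/ofscCore/v1/events/subscriptions'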

    Note: The default next page is for events after the subscription is created. Ask the system administrator for a starting page number if a past date is required.

    Use SQL*Plus or SQL Developer and insert a row for each subscription into the OFSC_SUBSCRIPTION_PAGE table.

    Below is an example insert statement for the subscription above:

    INSERT INTO OFSC_SUBSCRIPTION_PAGE
    (
      SUBSCRIPTION_ID,
      NEXT_PAGE,
      LAST_UPDATE,
      SUPPORTED_EVENT,
      FIRST_PAGE
    )
    VALUES
    (
      'a0fd97e62abca26a79173c974d1e9c19f46a254a',
      '160425-457,0',
      sysdate,
      'Required Inventory',
      '160425-457,0'
    );

    Preparing and Calling the OFSC RESTful Service

    This post uses the events method of the OFSC REST API.

    This method uses Basic authentication and requires a base64-encoded value of the following string: user-login "@" instance-id ":" user-password

    An example encoded result is:

    dXNlci1sb2dpbkBpbnN0YW5jZS1pZDp1c2VyLXBhc3N3b3Jk
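
    If desired, this value can be produced in the database itself with the UTL_ENCODE and UTL_RAW packages, substituting the real credential string:

    SELECT utl_raw.cast_to_varchar2(
             utl_encode.base64_encode(
               utl_raw.cast_to_raw('user-login@instance-id:user-password'))) AS token
      FROM dual;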

    The authorization header value is the concatenation of the string 'Basic ' with the base64 encoded result discussed above. The APEX_WEB_SERVICE package is used to set the header as shown below:

    v_authorization_token := 'dXNlci1sb2dpbkBpbnN0YW5jZS1pZDp1c2VyLXBhc3N3b3Jk';
    apex_web_service.g_request_headers(1).name  := 'Authorization';
    apex_web_service.g_request_headers(1).value := 'Basic '||v_authorization_token;

    The wallet path and password discussed in the Preparing the DBaaS Wallet section are also required. An example path from a Linux server is:

    /u01/app/oracle

    Calling the Events Request

    The events request is called for each page available for each subscription stored in the OFSC_SUBSCRIPTION_PAGE table using a cursor loop as shown below:

    For C1_Ofsc_Subscription_Page_Rec In C1_Ofsc_Subscription_Page
    Loop
      V_Subscription_Id := C1_Ofsc_Subscription_Page_Rec.Subscription_Id;
      Case When P_Run_Type = 'Full' Then
        V_Next_Page := C1_Ofsc_Subscription_Page_Rec.First_Page;
      Else
        V_Next_Page := C1_Ofsc_Subscription_Page_Rec.Next_Page;
      End Case;
      ...
    End Loop;

    The URL is modified for each call. The subscription_id and the starting page are from the table.

    For the first call only, if the parameter / variable p_run_type is equal to ‘Full’, the staging table is truncated and the page value is populated from the FIRST_PAGE column in the OFSC_SUBSCRIPTION_PAGE table. Otherwise, the staging table is not truncated and the page value is populated from the NEXT_PAGE column.

    Subsequent page values come from parsing the nextPage value in the responses.

    An example command to create the URL from the example subscription above is:

    f_ws_url := v_base_url||'/events?subscriptionId='||v_subscription_id||chr(38)||'page='||v_next_page;

    The example URL result is:

    https://ofsc-hostname/rest/ofscCore/v1/events?subscriptionId=a0fd97e62abca26a79173c974d1e9c19f46a254a&page=160425-457,0

    An example call using the URL is below:

    f_ws_response_clob := apex_web_service.make_rest_request (
      p_url         => f_ws_url
    , p_http_method => 'GET'
    , p_wallet_path => 'file:/u01/app/oracle'
    , p_wallet_pwd  => 'wallet-password' );

    Storing the Event Responses

    Each response (page) is processed using a while loop as shown below:

    While V_More_Pages
    Loop
    Extract_Page;
    End Loop;

    Each page is parsed to obtain the event type of the first item. A null (empty) event type signals an empty page and the end of the data available. An example parse to obtain the event type of the first item is below. Note: for usage of the JSON_Value function below see JSON in Oracle Database.

    select json_value (f_ws_response_clob, '$.items[0].eventType') into f_event_type from dual;

    If there is data in the page, the requested page number and the response clob are inserted into the OFSC_JSON_TMP table and the response is parsed to obtain the next page number for the next call as shown below:

    f_json_tmp_rec.page_number := v_next_page; -- this is the requested page number
    f_json_tmp_rec.json_clob := f_ws_response_clob;
    insert into ofsc_json_tmp values f_json_tmp_rec;
    select json_value (f_ws_response_clob, '$.nextPage') into v_next_page from dual;

    Parsing and Loading the Events Responses

    Each response row stored in the OFSC_JSON_TMP table is retrieved and processed via a cursor loop statement as shown below:

    for c1_ofsc_json_tmp_rec in c1_ofsc_json_tmp
    loop
    process_ofsc_json_page (c1_ofsc_json_tmp_rec.page_number);
    end loop;

    An example response is below with only the first item shown:

    {
      "found": true,
      "nextPage": "170110-13,0",
      "items": [
        {
          "eventType": "activityUpdated",
          "time": "2017-01-04 12:49:51",
          "user": "soap",
          "activityDetails": {
            "activityId": 1297,
            "resourceId": "test-resource-id",
            "resourceInternalId": 2505,
            "date": "2017-01-25",
            "apptNumber": "82994469003",
            "customerNumber": "12797495"
          },
          "activityChanges": {
            "A_LastMessageStatus": "SuccessFlag - Fail - General Exception: Failed to update FS WorkOrder details. Reason: no rows updated for: order_id = 82994469003 service_order_id = NULL"
          }
        }
      ],
      "links": [ ]
    }

    Each item (event) is retrieved and processed via a while loop statement as shown below:

    while f_more_items loop
    process_item (i);
    i := i + 1;
    end loop;

    For each item, a dynamic SQL statement is prepared and submitted to return the columns needed to insert a row into the OFSC_EVENT_ACTIVITY staging table as shown below (the details of creating the dynamic SQL statement have been omitted for brevity):

    An example of a dynamically prepared SQL statement is below. Note: for usage of the JSON_Table function below see JSON in Oracle Database.

    DYN_SQL
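
    A hedged reconstruction of such a statement, for item 0 of the example page, is below; the column sizes and the literal page and item values are assumptions:

    select '160425-457,0' page_number,
           0 item_number,
           jt.*
      from ofsc_json_tmp t,
           json_table(t.json_clob, '$.items[0]'
             columns (
               event_type       varchar2(64)   path '$.eventType',
               event_time       varchar2(32)   path '$.time',
               event_user       varchar2(64)   path '$.user',
               activity_id      varchar2(32)   path '$.activityDetails.activityId',
               resource_id      varchar2(64)   path '$.activityDetails.resourceId',
               schedule_date    varchar2(32)   path '$.activityDetails.date',
               appt_number      varchar2(64)   path '$.activityDetails.apptNumber',
               customer_number  varchar2(64)   path '$.activityDetails.customerNumber',
               activity_changes varchar2(4000) format json path '$.activityChanges'
             )) jt
     where t.page_number = '160425-457,0';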

    The execution of the SQL statement and the insert are shown below:

    execute immediate f_sql_stmt into ofsc_event_activity_rec;
    insert into ofsc_event_activity values ofsc_event_activity_rec;

    Parsing Events Responses using APEX_JSON

    Update: As of December 2016, the APEX 5.1 APEX_JSON package no longer limits JSON values to 32K characters. This update allows the continued use of an Oracle 11g database if desired. This new section demonstrates the usage.

    Each page response clob is parsed with the APEX_JSON.PARSE procedure as shown below. This procedure stores all the JSON elements and values in an internal array which is accessed via JSON Path statements.

    apex_json.parse(F_Ws_Response_Clob);

    Each page is tested to see if it is an empty last page. A page is deemed empty when the first event has a null event type as shown below.

    apex_json.parse(F_Ws_Response_Clob);
    F_Event_Type := apex_json.get_varchar2(p_path => 'items[1].eventType');
    Case When F_Event_Type Is Null
    Then V_More_Pages := False; ...

    An example response is shown in the section above.

    Each item (event) is retrieved and processed via a while loop statement as shown below:

    while f_more_items loop
    process_item_JParse (i);
    i := i + 1;
    end loop;

    For each item (event), the event is parsed into a variable row record as shown below:

    OFSC_EVENT_ACTIVITY_rec.PAGE_NUMBER := F_Page_Number;
    OFSC_EVENT_ACTIVITY_rec.ITEM_NUMBER := Fi;
    OFSC_EVENT_ACTIVITY_rec.EVENT_TYPE := apex_json.get_varchar2(p_path => 'items[' || Fi || '].eventType');
    OFSC_EVENT_ACTIVITY_rec.EVENT_TIME := apex_json.get_varchar2(p_path => 'items[' || Fi || '].time');
    OFSC_EVENT_ACTIVITY_rec.EVENT_USER := apex_json.get_varchar2(p_path => 'items[' || Fi || '].user');
    OFSC_EVENT_ACTIVITY_rec.ACTIVITY_ID := apex_json.get_varchar2(p_path => 'items[' || Fi || '].activityDetails.activityId');
    OFSC_EVENT_ACTIVITY_rec.RESOURCE_ID := apex_json.get_varchar2(p_path => 'items[' || Fi || '].activityDetails.resourceId');
    OFSC_EVENT_ACTIVITY_rec.SCHEDULE_DATE := apex_json.get_varchar2(p_path => 'items[' || Fi || '].activityDetails.date');
    OFSC_EVENT_ACTIVITY_rec.APPT_NUMBER := apex_json.get_varchar2(p_path => 'items[' || Fi || '].activityDetails.apptNumber');
    OFSC_EVENT_ACTIVITY_rec.CUSTOMER_NUMBER := apex_json.get_varchar2(p_path => 'items[' || Fi || '].activityDetails.customerNumber');
    OFSC_EVENT_ACTIVITY_rec.ACTIVITY_CHANGES := Get_Item_ACTIVITY_CHANGES(Fi);

    The insert of the row is shown below:

    insert into ofsc_event_activity values ofsc_event_activity_rec;

    Verifying the Loaded Data

    Use SQL*Plus, SQL Developer, or a similar tool to display the rows loaded into the staging table.
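
    For example, a query such as the following lists the loaded events in page and item order:

    SELECT page_number, item_number, event_type, event_time, activity_id
      FROM ofsc_event_activity
     ORDER BY page_number, item_number;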

    A sample set of rows is shown below:

    tabResults

    Troubleshooting the REST Calls

    Common issues are the need for a proxy, the need for an ACL, the need for a trusted certificate (if using HTTPS), and the need to use the correct TLS security protocol. Note: This post uses DBaaS, so all but the first issue have already been addressed.

    The need for a proxy may be detected when the following error occurs: ORA-12535: TNS:operation timed out. Adding the optional p_proxy_override parameter to the call may correct the issue. An example proxy override is:

    www-proxy.us.oracle.com
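
    For example, the earlier call with the optional proxy parameter added:

    f_ws_response_clob := apex_web_service.make_rest_request (
      p_url            => f_ws_url
    , p_http_method    => 'GET'
    , p_wallet_path    => 'file:/u01/app/oracle'
    , p_wallet_pwd     => 'wallet-password'
    , p_proxy_override => 'www-proxy.us.oracle.com' );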

    Scheduling the Procedure

    The procedure may be scheduled to run periodically through the use of an Oracle Scheduler job as described in Scheduling Jobs with Oracle Scheduler.

    A job is created using the DBMS_SCHEDULER.CREATE_JOB procedure by specifying a job name, type, action and a schedule. Setting the enabled argument to TRUE enables the job to automatically run according to its schedule as soon as you create it.

    An example of a SQL statement to create a job is below:

    BEGIN
    dbms_scheduler.create_job (
      job_name        => 'OFSC_REST_EVENT_EXTRACT',
      job_type        => 'STORED_PROCEDURE',
      enabled         => TRUE,
      job_action      => 'BICS_OFSC_REST_INTEGRATION',
      start_date      => '12-JAN-17 11.00.00 PM Australia/Sydney',
      repeat_interval => 'freq=hourly;interval=24' -- this will run once every 24 hours
    );
    END;
    /

    Note: If using the BICS Schema Service database, the package name is CLOUD_SCHEDULER rather than DBMS_SCHEDULER.

    The job log and status may be queried using the *_SCHEDULER_JOBS views. Examples are below:

    SELECT JOB_NAME, STATE, NEXT_RUN_DATE from USER_SCHEDULER_JOBS;
    SELECT LOG_DATE, JOB_NAME, STATUS from USER_SCHEDULER_JOB_LOG;

    Summary

    This post detailed a method of extracting and loading data from Oracle Field Service Cloud (OFSC) into the Oracle Business Intelligence Cloud Service (BICS) using RESTful services.

    The method extracted JSON-formatted data responses and used the PL/SQL language to call the web services, parse the JSON responses, and perform database table operations in a Stored Procedure. It also produced a BICS staging table which can then be transformed into star-schema object(s) for use in modeling.

    Finally, an example of a database job was provided that executes the Stored Procedure on a scheduled basis.

    For more BICS and BI best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-Team Chronicles for BICS.

    References

    Complete Procedure

    Complete Procedure using APEX_JSON

    JSON in Oracle Database

    REST API for Oracle Field Service Cloud Service

    Scheduling Jobs with Oracle Scheduler

    Database PL/SQL Language Reference

    APEX_WEB_SERVICE Reference Guide

    APEX_JSON Reference Guide

    Curious Concept JSON Testing Tool

    Postman Testing Tool

    Base64 Decoding and Encoding Testing Tool

    Using Oracle Wallet Manager

    Oracle Business Intelligence Cloud Service Tasks

     

    Integrating Sales Cloud and Service Cloud using ICS – troubleshooting issues with security configuration


    Introduction

    This blog post covers a few "gotchas" encountered when integrating Oracle Sales Cloud (OSC) and Oracle Service Cloud (OSvC) using Oracle's iPaaS platform, Integration Cloud Service (ICS).
    The idea is to provide a ready reckoner for some common issues, so that customers can hit the ground running when integrating OSvC and OSC using ICS.

     

    ICS Integrations for OSC OSvC

    Pre-built ICS integrations are available from Oracle for certain objects and can be downloaded from My Oracle Support. Contact Oracle Support to download the pre-built integrations and the documentation that accompanies them.

    The pre-built integration provides out of the box standard integration for the following –

    •     Integrate Account and Contact objects from Sales Cloud to Service Cloud

    OSC_SVC_integrations

    •    Integrate Organization and Contact objects from Service Cloud to Sales Cloud

    SVC_OSC_integrations
    The pre-built integration is built using ICS and provides a few standard field mappings. It can serve as a template, and users can update any custom field mappings as needed.
    The ICS pre-built integrations also serve as a reference for building other custom integrations between OSC and OSvC using ICS. ICS integrations can be built to integrate more objects, such as the Partner and Opportunity objects from OSC. Similarly, flows can be created to integrate the Asset and Incident objects from OSvC. Refer to the Sales Cloud Adapter documentation and the OSvC Adapter documentation here for capabilities that can be used to build custom integrations.

     

    ICS Credential in Sales Cloud

    One issue users may face after following the steps in the pre-built integrations document and activating the ICS integrations is that the Account and Contact subscriptions do not flow from OSC to ICS.
    This is usually due to issues with creating the ICS credentials in OSC.
    Note that a csfKey entry in the Sales Cloud infrastructure stores the ICS credentials used by Sales Cloud. This key is used to connect to ICS and invoke the subscription-based integrations at runtime.

    Refer to this excellent blog post from my colleague Naveen Nahata, which gives simple steps to create the csfKey. The SOA Composer page where the csfKey and its values are updated is shown below.

    001_CSF_Key

    Note that OSC 'R12' and 'R11' customers can now create the csfKey themselves in the SOA Composer app using the steps from Naveen's blog above.
    R10 customers, however, should create a support SR for the csfKey creation. Refer to the steps in the implementation guide document within the R10 pre-built integration download package.

    Invalid Field Errors in OSvC

    Further, when testing the integration of Contact or Account from OSC to OSvC, the ICS instances may go to a failed state, as shown below.

    OSC_SVC_ACCT_Created_Error_2
    Tracking the failed instance further may show error message as seen below

     

    ErrorMessage
    If the OSC_SVC_ACCOUNT_CREATED integration is 'TRACE ENABLED', then the Activity Stream/Diagnostic log file can be downloaded from ICS to further inspect the message payloads flowing through the integration instance.
    Searching the logs for request/response message payloads using the failed ICS instance ID shows that the issue is not really at the createOriginalSystemReference stage of the flow but at the BatchResponse stage from Service Cloud.

     Error:  Invalid Field While processing Organization->ExternalReference(string)

    The response payload from OSvC will look as below

    <nstrgmpr:Create>
    <nstrgmpr:RequestErrorFault xmlns:nstrgmpr="urn:messages.ws.rightnow.com/v1_3">
    <n1:exceptionCode xmlns:nstrgmpr="http://xmlns.oracle.com/cloud/adapter/rightnow/OrganizationCreate_REQUEST/types">INVALID_FIELD</n1:exceptionCode>
    <n1:exceptionMessage xmlns:nstrgmpr="http://xmlns.oracle.com/cloud/adapter/rightnow/OrganizationCreate_REQUEST/types">Invalid Field While processing Organization-&gt;ExternalReference(string).</n1:exceptionMessage>
    </nstrgmpr:RequestErrorFault>
    </nstrgmpr:Create>

    Solution:

    Ensure that the credentials specified in the EVENT_NOTIFICATION_MAPI_USERNAME and EVENT_NOTIFICATION_MAPI_PASWD settings in OSvC do not refer to a 'real' OSvC user. OSvC user credentials may not have the rights to update External Reference fields. It is important that a dummy username/password is created in the EVENT_NOTIFICATION_MAPI_* fields in OSvC. And remember to use this credential when configuring the OSvC connection in ICS.

    ICS Credential in Service Cloud

    Another crucial part of the OSvC configuration is setting the credentials to use for outgoing requests from OSvC to ICS. This is done by setting the EVENT_NOTIFICATION_SUBSCRIBER_USERNAME and EVENT_NOTIFICATION_SUBSCRIBER_PASSWD parameters in OSvC. This credential is used by OSvC to connect to and execute ICS integrations and must point to a 'real' user on ICS. This user should have the "Integration Cloud Service Runtime Role" granted to it.

    References:

    Using Event Handling Framework for Outbound Integration of Oracle Sales Cloud using Integration Cloud Service
    Service Cloud
    Sales Cloud

     

    Using Oracle Managed File Transfer (MFT) to Push Files to ICS for Processing


    Introduction

    In a previous article I discussed using the Enterprise Scheduler Service (ESS) to poll for and read files from MFT on a scheduled basis, and how to process the many files that have been posted to the FTP server. At the end of that article I mentioned the use of the push pattern for file processing.

    This article will cover how to implement that push pattern with Managed-File Transfer (MFT) and the Integration Cloud Service (ICS).  We’ll walk through the configuration of MFT, creating the connections in ICS, and developing the integration in ICS.

    The following figure is a high-level diagram of this file-based integration using MFT, ICS, and an Oracle SaaS application.

    mft2ics

     

    Create the Integration Cloud Service Flow

    This integration will be a basic integration with an orchestrated flow. The purpose is to demonstrate how the integration is invoked and how the message is processed as it enters the ICS application. For this implementation we only need to create two endpoints. The first is a SOAP connection that MFT will invoke; the second is a connection back to MFT to write the file to an output directory.

    The flow could include other endpoints but for this discussion additional endpoints will not add any benefits to understanding the push model.

    Create the Connections

    The first thing to do is create the connections to the endpoints required for the integration. For this integration we will create the two required connections.

     

    1. SOAP connection: This connection will be used by MFT to trigger the integration as soon as the file arrives in the specified directory within MFT (covered in the MFT section of this article).
    2. FTP connection: This connection will be used to write the file to an output directory within the FTP server. This second connection only serves to demonstrate the flow: processing the file and then writing it to an endpoint. It could have been any endpoint; for instance, we could have used the input file to invoke a REST, SOAP, or one of many other endpoints.

    Let’s define the SOAP connection.

    SOAP_Identifier

    Figure 1

    Identifier: Provide a name for the connection

    Adapter: When selecting the adapter type choose the SOAP Adapter

    Connection Role: There are three choices for the connection role: Trigger, Invoke, and Trigger and Invoke. We will use the Trigger role, since MFT will be triggering our integration.

    SOAPConnectionProperties

    Figure 2

    Figure 2 shows the properties that define the endpoint.  The WSDL URL may be added by specifying the actual WSDL as shown above, or the WSDL can be consumed by specifying the host:port/uri/?WSDL.

    In this connection the WSDL was retrieved from the MFT embedded server.  This can be found at $MW_HOME/mft/integration/wsdl/MFTSOAService.wsdl.

    The suppression of the timestamp is specified as true, since the policy being used at MFT does not require the timestamp to be passed.

    Security Policy

    SOAP_Security

     

    Figure 3

    For this scenario we will be using the username-password token policy.  The policy specified on this connection needs to match the policy that is specified for the MFT SOAP invocation.

    The second connection, as mentioned previously, exists to demonstrate an end-to-end flow; it is not essential to the push pattern itself. It is simply a connection back to the MFT server.

    MFT_FTP_Identifier

    Figure 4

    Identifier: Provide a unique name for the connection

    Adapter: When selecting the adapter type choose the FTP Adapter

    Connection Role: For this connection we will specify “Trigger and Invoke”.

    Connection Properties

    MFT_FTP_Connection_Properties

    Figure 5

    FTP Server Host Address:  The IP address of the FTP server.

    FTP Server Port: The listening port of the FTP Server

    SFTP Connection:  Specify "Yes", since the invocation will be over sFTP

    FTP Server Time Zone: The time zone where the FTP server is located.

    Security Policy

    MFT_FTP_Security

    Figure 6

    Security Policy:  FTP Server Access Policy

    User Name:  The name of the user that has been created in the MFT environment.

    Password: The password for the specified user.

    Create the Integration

    Now that the connections have been created we can begin to create the integration flow.  When the flow is triggered by the MFT SOAP request the file will be passed by reference.  The file contents are not passed, but rather a reference to the file is passed in the SOAP request.  When the integration is triggered the first step is to capture the size of the file.  The file size is used to determine the path to traverse through the flow.  A file size of greater than one megabyte is the determining factor.

    integration

     

    Figure 7

    When MFT passes the file reference it also passes the size of the file, and the integration uses this size to select the path to take. Why do we want to do this?

    If the file is of significant size then reading the entire file into memory could cause an out-of-memory condition.  Keep in mind that memory requirements are not just about reading the file but also the XML objects that are created and the supporting objects needed to complete any required transformations.

    ICS provides a feature to prevent an OOM condition when reading large files. The top path shown in Figure 7 demonstrates how to handle the processing of large files. When processing a file of significant size it is best to download the file to ICS (an option provided by the FTP adapter when configuring the workflow). After downloading the file to ICS it is processed using a "stage" action. The stage action is able to chunk the large file and read it across multiple threads. This article does not provide an in-depth discussion of the stage action; to better understand it, refer to the Oracle ICS documentation.

    The "otherwise" path in the execution flow above is taken when the file size is less than the configured maximum file size. For the scenario in this blog, I set the maximum size to one megabyte.

    The use case being demonstrated involves passing the file by reference. Therefore, in order to read or download the file we must obtain the reference location from MFT. The incoming request provides this reference location, which, along with the target filename, must be supplied to the read or download operation. This is done with the XSLT mapping shown in Figure 8.

    FileReferenceMapping

    Figure 8

    The result mapping is shown in Figure 9.

    MappingPage

    Figure 9

     

    The mapping of the fields is provided below.

    Headers.SOAPHeaders.MFTHeader.TargetFilename -> DownloadFileToICS.DownloadRequest.filename

    substring-before(
      substring-after(InboundSOAPRequestDocument.Body.MFTServiceInput.FTPReference.URL, '7522'),
      InboundSOAPRequestDocument.Headers.SOAPHeaders.MFTHeader.TargetFilename
    ) -> DownloadFileToICS.DownloadRequest.directory

    As previously stated, this is a basic scenario intended to demonstrate the push process.  The integration flow may be as simple or complex as necessary to satisfy your specific use case.

    Configuring MFT

    Now that the integration has been completed it is time to implement the MFT transfer and configure the SOAP request for the callout.  We will first configure the MFT Source.

    Create the Source

    The source specifies the location of the incoming file.  For our scenario the directory we place our file in will be /users/zern/in.  The directory location is your choice but it must be relative to the embedded FTP server and one must have permissions to read from that directory.  Figure 10 shows the configuration for the MFT Source.

    MFT_Source

    Figure 10

    As soon as the file is placed in the directory an “event” is triggered for the MFT target to perform the specified action.

    Create the Target

    The MFT target specifies the endpoint of the service to invoke.  In figure 11, the URL has been specified to the ICS integration that was implemented above.

    MFT_Target_Location

     

    Figure 11

    The next step is to specify the security policy. This policy must match the policy specified for the connection defined in the ICS platform. We are specifying the username_token_over_ssl_policy, as seen in Figure 12.

    MFT_Target_Policy

     

    Figure 12

    Besides specifying the security policy, we must also specify that the timestamp in the response be ignored. Since the policy is the username_token policy, the request must also include the credentials. The credentials are retrieved from the keystore by providing the csf-key value.

    Create the Transfer

    The last step in this process is to bring the source and target together in the transfer. It is within the transfer configuration that we specify the delivery preferences. In this example we set the "Delivery Method" to "Reference" and the "Reference Type" to "sFTP".

     

    MFT_Transfer_Overview

    Figure 13

    Putting it all together

    1. A ".csv" file is dropped at the source location, /users/zern/in.
    2. MFT invokes the ICS integration via a SOAP request.
    3. The integration is triggered.
    4. The integration determines the size of the incoming file and thus the path of execution.
    5. The file is either downloaded to ICS or read into memory, depending on the path of execution.
    6. The file is transformed and then written back to the output directory specified by the FTP write operation.
    7. The integration is completed.

    Push versus Polling

    There is no right or wrong when choosing either a push or poll pattern. Each pattern has its benefits. I've listed a couple of points to consider for each pattern.

    Push Pattern

    1. The file gets processed as soon as it arrives in the input directory.
    2. You need to create two connections; one SOAP connection and one FTP connection.
    3. Normally used to process only one file.
    4. The files can arrive at any time and there is no need to setup a schedule.

    Polling Pattern

    1. You must create a schedule to consume the file(s).  The polling schedule can be at either specific intervals or at a given time.
    2. You only create one connection for the file consumption.
    3. Many files can be placed in the input directory and the scheduler will make sure each file is consumed by the integration flow.
    4. File processing is delayed by up to the maximum interval of the polling schedule.

    Summary

    Oracle offers many SaaS cloud applications, such as Fusion ERP, and several of these SaaS solutions provide file-based interfaces. These products require the input files to be in a specific format for each interface. The Integration Cloud Service is an integration gateway that can enrich and/or transform these files and then pass them along directly to an application, or to an intermediate storage location like UCM where the file is staged as input to SaaS applications like Fusion ERP and HCM.

    With potentially many source systems interacting with Oracle SaaS applications it is beneficial to provide a set of common patterns to enable successful integrations.  The Integration Cloud Service offers a wide range of features, functionality, and flexibility and is instrumental in assisting with the implementation of these common patterns.

     
