Sunday 17 April 2016

FDMEE Hybrid and the REST Part 2

In the last post I went through the necessary configuration to get up and running with FDMEE hybrid functionality; this was based on 11.1.2.4.200, which is the first patch release of FDMEE to support hybrid.

Now I could have left it at one post because, to be honest, once the cloud target application details have been added in FDMEE, the integration process and functionality are pretty much the same as with an on-premise application.

As I have spent a fair bit of time recently covering web services, I thought it would only be right to go into more detail on what is happening behind the scenes with the integration between on-premise FDMEE and the cloud.

All the interaction between on-premise and the cloud uses a REST API; some of the REST resources have been around for a while and I have covered them in previous posts, while some of them are new.

If you have not read any of my REST posts then it might be worth having a look at them because I am not going to repeat already covered ground.

In this post I am going to go through loading data from on-premise to a PBCS application and highlight the steps in the process which are using the REST API.

So let us start with adding a target PBCS application which is achieved by selecting the Cloud option.


Next PBCS credentials are entered.


If the required configuration steps that I carried out in the last post have been completed and the FDMEE web application server has https access to the PBCS instance, you should be presented with the option to select a planning application.


At present only one application will be returned.

So how was the application name returned? Was it just magic, like most are happy to accept? No, this is the point where REST joins the party.

A GET request is executed against the following REST resource:

https://<pbcs_instance>/aif/rest/<api_version>/applications

I can demonstrate this by using a REST client browser add-on and the following response is returned from the request:


The JSON formatted response contains the application which is used to populate the drop-down menu.
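If you want to try the same request outside of a browser add-on, a rough PowerShell sketch would look something like the following; the instance URL, credentials and API version (I am assuming V1 here) are just placeholders to swap for your own values.

# rough sketch of calling the applications resource - replace the placeholders with your own values
$pbcsInstance = "https://<pbcs_instance>"          # your PBCS instance URL
$apiVersion   = "V1"                               # assumed aif REST API version
$username     = "<identity_domain>.<user>"         # PBCS credentials
$password     = "<password>"

# build a basic authentication header from the credentials
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("$($username):$($password)"))
$headers = @{ "Authorization" = "Basic $encoded" }

# GET the list of applications and display the JSON response used to populate the drop-down
$apps = Invoke-RestMethod -Uri "$pbcsInstance/aif/rest/$apiVersion/applications" -Method Get -Headers $headers
$apps | ConvertTo-Json -Depth 10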

Once the application is selected, the application details are then populated in FDMEE.


Once again a REST resource is used to return the dimension information:

https://<pbcs_instance>/aif/rest/<api_version>/applications/<app_name>

The only difference this time is that the resource URL requires the application name, which we know has already been retrieved and stored.

Executing a GET request returns a whole load of information about the application.
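Reusing the authentication header from the previous sketch, the same request could be made like this; the application name is whatever was returned by the applications resource.

# sketch of retrieving the application details - reuses $pbcsInstance, $apiVersion and $headers from above
$appName = "<app_name>"     # the application name returned by the applications resource
$appInfo = Invoke-RestMethod -Uri "$pbcsInstance/aif/rest/$apiVersion/applications/$appName" -Method Get -Headers $headers
$appInfo | ConvertTo-Json -Depth 10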


Just like with an on-premise application there are application options that can be set and this is where the cloud credentials are stored if they ever need to be updated.


Now that the PBCS application has been added the steps to load data from a source are exactly the same as if working with an on-premise target.

When creating an import format you would select the target in the same way.


In my example I am loading from a simple flat file but it could be any of the available sources.


It is also possible to have an EPM application as the source.


In the next post I will be covering loading data from a PBCS application to an on-premise EPM application.

Anyway, when you get to the data load mapping stage there is a slight difference between an on-premise and a PBCS target application.

With on-premise you have a button to refresh members.


If it is a PBCS application there is currently no button.


I am not sure if this is the way it is meant to be or a bug.

This does not stop you from being able to refresh the members from a PBCS application, as there is a button to do this if you go back into the target application setup.


The “Refresh Metadata” button will refresh the dimension information using the REST resources I have already covered.

The “Refresh Members” button will fire off a process, and if you look at the process details in on-premise FDMEE you will see the following process name.


The process steps cover in detail what is happening.


First of all, the members are refreshed from the PBCS planning application to PBCS data management; once this is complete, the members are retrieved for each dimension from PBCS data management and then imported and inserted into the on-premise FDMEE relational database.

Now if you go back into the on-premise “Data Load Mapping” member selector the members will be available.


So what is happening behind the scenes? Well, the first stage of refreshing the members from the PBCS planning application to data management is carried out using the following REST resource:

https://<pbcs_instance>/aif/rest/<api_version>/jobs

This time it is a POST method and requires a JSON input payload with a job name and job type.


The job name value is the name of the application and the type is “REFRESH_APP_DIMENSIONS”.

The response will contain information about the process that has been executed in data management.
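As a rough sketch, the job could be posted like this; the jobName and jobType property names are my assumption based on the payload described above.

# sketch of the refresh members job - the jobName/jobType property names are assumptions
$refreshPayload = @{
    "jobName" = $appName                    # the name of the application
    "jobType" = "REFRESH_APP_DIMENSIONS"    # refresh members from planning to data management
} | ConvertTo-Json

$refreshJob = Invoke-RestMethod -Uri "$pbcsInstance/aif/rest/$apiVersion/jobs" -Method Post -Headers $headers -Body $refreshPayload -ContentType "application/json"
$refreshJob | ConvertTo-Json -Depth 10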


Process details should show a process matching the one returned in the REST response.



Now that the members have been refreshed from PBCS planning to data management, the following REST resource is executed for each dimension in the application:

https://<pbcs_instance>/aif/rest/<api_version>/applications/<app_name>/dimension/<dim_name>

A GET request is made like the following example:


The response contains information for each of the members in the requested dimension.
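A quick sketch of the same request, which in practice would be repeated for each dimension in the application; the dimension name below is just an example.

# sketch of retrieving the members for a single dimension - "Account" is an example dimension name
$dimName    = "Account"
$dimMembers = Invoke-RestMethod -Uri "$pbcsInstance/aif/rest/$apiVersion/applications/$appName/dimension/$dimName" -Method Get -Headers $headers
$dimMembers | ConvertTo-Json -Depth 10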


This information is processed and loaded into the on-premise FDMEE relational database, so when the member selector is used it will contain the target PBCS application's members.

Moving on to exporting the data from on-premise FDMEE to the PBCS application, in my example the data has already been successfully imported and validated.


Once an export has been initiated and completed, the on-premise process details will contain the following information:



And in the PBCS process details:



In my target application options I had drill region enabled so this information is created in the target PBCS application.

As an example I have created a form in the PBCS application which contains data loaded from on-premise FDMEE.


Selecting the drill through to source option will open a new tab in workspace and then open a new tab in the browser.


The new tab will connect to on-premise FDMEE and display the drill-through summary.


Authentication is required before the drill through summary is displayed as there is no single sign on in operation.

Nice, data has been loaded from an on-premise source, validated, exported and loaded to a PBCS application with the ability to drill through if required.

Let me walk through what is happening behind the web interface.

First of all data is exported from the FDMEE database and a data file is created in the application outbox directory.


This file is then uploaded to the PBCS inbox directory using the following REST resource:

https://server/interop/rest/{api_version}/applicationsnapshot/{applicationSnapshotName}/contents?q={"isLast":"boolean","extDirPath":"upload_dir","chunkSize":"sizeinbytes","isFirst":"boolean"}

I covered uploading files using the REST API here and here.

Here is an example of using the upload resource with a simple PowerShell script.
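Along those lines, a minimal sketch might look like the following, assuming the file is small enough to be sent as a single chunk and reusing the authentication header from the earlier examples; the file path, upload directory and interop API version are placeholders, so take the exact details from the posts linked above.

# sketch of uploading the exported data file to the PBCS inbox in a single chunk
$dataFile   = "C:\FDMEE\outbox\<data_file>.dat"     # data file created in the application outbox
$fileName   = [System.IO.Path]::GetFileName($dataFile)
$fileSize   = (Get-Item $dataFile).Length
$interopVer = "<api_version>"                       # interop REST API version for your instance
$query      = '{"isLast":"true","extDirPath":"inbox","chunkSize":"' + $fileSize + '","isFirst":"true"}'
# the query string may need URL encoding depending on the client being used
$uploadUri  = "$pbcsInstance/interop/rest/$interopVer/applicationsnapshot/$fileName/contents?q=$query"

Invoke-RestMethod -Uri $uploadUri -Method Post -Headers $headers -InFile $dataFile -ContentType "application/octet-stream"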


Once the file has been uploaded you should be able to view it in data management.


If “Purge Data File” has been set to “Yes” in the target application options, the on-premise file will now be deleted.

As the data file now exists in PBCS, a REST resource is accessed to load the data from the file directly into the plan type (Essbase) database.

This time it will be a POST method and the REST resource URL is the same as was previously used to refresh members:

https://<pbcs_instance>/aif/rest/<api_version>/jobs

A JSON formatted input is required which provides information about the job.


The job name will be the name of the planning application, the job type is “DIRECTIMPORT” and the load method will always be “ESSFILE”.

The remaining property values depend on the application options and, obviously, the name of the file to load.

In the above example I have set no drill region just because it is large and looks quite messy on-screen which I don’t think your eyes could have coped with :)

The response includes information about the process that has been executed, and it is possible to keep checking the job status by adding the id to the end of the same URL.
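Putting that together, a rough sketch of posting the load job and then checking its status could look like this; the payload property names and the jobId property in the response are assumptions based on the description above.

# sketch of the direct import job - property names are assumptions based on the description above
$loadPayload = @{
    "jobName"       = $appName          # name of the planning application
    "jobType"       = "DIRECTIMPORT"    # load the uploaded file directly into the plan type database
    "loadMethod"    = "ESSFILE"         # the load method is always ESSFILE
    "fileName"      = $fileName         # the data file uploaded to the PBCS inbox
    "purgeDataFile" = "true"            # delete the data file once the load has completed
} | ConvertTo-Json

$loadJob = Invoke-RestMethod -Uri "$pbcsInstance/aif/rest/$apiVersion/jobs" -Method Post -Headers $headers -Body $loadPayload -ContentType "application/json"

# keep checking the job status by adding the returned id to the end of the same jobs URL
$status = Invoke-RestMethod -Uri "$pbcsInstance/aif/rest/$apiVersion/jobs/$($loadJob.jobId)" -Method Get -Headers $headers
$status | ConvertTo-Json -Depth 10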


The value of the property “purgeDataFile” determines whether the data file is deleted once the load has completed.

Checking the process details confirms the process is the same as if it had been run from on-premise FDMEE, just like earlier.



In the next part I will cover what happens when extracting data from a PBCS application and loading to an on-premise EPM application.

2 comments:

  1. Another great post John. Cheers.

  2. Dear John, an excellent article as always. I am curious to know the internals of how the drill-back to on-premise FDMEE works from PBCS - more specifically what kind of drill-back URL is configured as part of drill region creation in PBCS. Could you kindly share some insight into this aspect?

