Abstract
The enterprise IT industry is undergoing a massive transformation, strongly influenced by the way software is delivered and used. Traditional on-premise software, which has been the backbone of the industry for decades, is transforming into ready-to-use services available in the cloud. However, we are at the beginning of that long journey and companies are taking it one step at a time. In the upcoming years, it will be common to see hybrid IT landscapes incorporating both on-premise and on-demand solutions, tightly integrated to ensure smooth operations for the business. Along those lines, this blog discusses how you can use SAP HANA Cloud Platform, SAP’s in-memory platform as a service, to integrate and analyze such data. In our sample scenario we use human capital management (HCM) data and the respective SAP products, but you can apply a similar approach to other enterprise systems as well.
Bill of Materials
The following table summarizes the SAP products we are using for this scenario:
Product | Use in the Scenario
SAP ERP (HCM) | Hosts on-premise HCM data.
SuccessFactors | Hosts on-demand HCM data.
SAP HANA Cloud Integration for data services (HCI-DS) | Enables moving data from the on-premise system (SAP ERP) to the cloud (SAP HANA@HCP).
HCI-DS Agent (SAP Data Services Agent) | Installed at the ERP system to enable HCI for data services.
SAP HANA on SAP HANA Cloud Platform | Hosts the combined data from the on-premise and on-demand systems.
SAP HANA Live | Add-on for SAP HANA that visualizes data from the SAP ERP system.
SAP Lumira (SAP Lumira Server) | Generates reports from the combined data in SAP HANA.
System Setup
The following diagram illustrates the components of the system and their interaction.
The transfer of ERP HCM data to the SAP HANA database is facilitated by HCI-DS and the HCI-DS Agent. SAP HANA Live then provides pre-defined views on top of the replicated data.
The transfer of SuccessFactors data to SAP HANA is done over a REST API, using a scheduled XS job.
From the two data tables, we build a calculation view, which we then visualize using SAP Lumira Server.
Procedure
Data replication from On-Premise HCM
In our scenario for data replication we are using SAP HANA Cloud Integration for data services (HCI-DS). The procedure is basically the following:
- Setup the HCI-DS Agent on the on-premise side:
The HCI-DS Agent usually runs on its own machine. You have to check the agent’s PAM (Product Availability Matrix). In our scenario we used Windows 2000 Server.
- Create a TCP/IP destination at ERP for program ID ‘SAPDS’
- Use transaction SM59 – create destination
- Select “TCP/IP Connections” – “create” button
- Use following connection setup:
RFC Destination: SAPDS
Connection Type: T (TCP/IP connection)
Description (Optional): Description of the destination
Technical Settings tab:
Activation Type: Registered Server Program
Program ID: SAPDS
- Allow registration of the external RFC server program – file location: \usr\sap\<SID>\DVEBMGS00\data\reginfo or on Windows: D:\usr\sap\<SID>\DVEBMGS00\data\reginfo.DAT
- If reginfo file does not exist create it using transaction SMGW -> GoTo -> Expert Functions -> External Security -> Create (reginfo)
- Open the file ‘reginfo’ and add a line like:
P TP=SAPDS* HOST=my.system.with.agent.com CANCEL=local ACCESS=*
where HOST is the host from which the external program (SAP Data Services Agent) will be registering
- Use transaction SMGW and reread the configuration: "Goto --> Expert Functions --> External Security --> Reread"
- The ERP system is now ready to be added as a data source in the HCI-DS cockpit
- Add ERP as “Data Store” at HCI-DS
- Bind HCI-DS to your SAP HANA at HCP
- Find your schema (e.g. NEO_p123456789sapdev.jdy_DEV). Execute the following SQL statement from SAP HANA studio:
SELECT DISTINCT ROLE_NAME FROM GRANTED_ROLES WHERE ROLE_NAME LIKE 'NEO_%'
- Connect your HANA schema with the HCI-DS service
- Open a command shell on your computer and execute the following command
neo grant-schema-access -h hana.ondemand.com -a <your account name> -u <your user name> -i <your schema name> -b hcidstest:dstest
As a result you will get an access token similar to: uzyhj9b7lwmzhku15gelj3j3heawn0wxszy9a7co0hq5pw8qr
- Add the HCP SAP HANA system as “Data Store” at HCI-DS
- Set up HCI-DS project for data replication from ERP to SAP HANA at HCP
Keep in mind that the tables you will replicate must already exist in SAP HANA. In the HCI-DS project you have to import the tables’ metadata and create the mappings between them.
- Promote the project for data replication
Data replication from On-Demand HCM (SuccessFactors)
Replication of SuccessFactors (SFSF) employee data to SAP HANA is done using the SuccessFactors OData REST API. A scheduled XS job periodically triggers an XSJS function that queries SFSF and updates the users’ data, as sketched below.
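To make the setup more concrete, here is a minimal XSJS sketch of such a function. It assumes a hypothetical HTTP destination "sfsf" in package "demo.hcm" pointing to the SuccessFactors OData endpoint, and a hypothetical target table "SFSF_EMPLOYEES" with "USER_ID" as primary key; the actual artifact names, selected fields, and schedule used in our scenario differ and are not shown here.

// sfsf_sync.xsjs – sketch only: pulls employee data from the SuccessFactors OData API
// and upserts it into a HANA table. Destination and table names are assumptions.
function replicateEmployees() {
    // HTTP destination holding host, port, and credentials for the SFSF OData endpoint
    var destination = $.net.http.readDestination("demo.hcm", "sfsf");
    var client = new $.net.http.Client();
    var request = new $.net.http.Request($.net.http.GET,
        "/odata/v2/User?$select=userId,department,division&$format=json");
    client.request(request, destination);
    var body = client.getResponse().body.asString();
    var users = JSON.parse(body).d.results;

    var conn = $.db.getConnection();
    // UPSERT ... WITH PRIMARY KEY inserts new employees and updates existing ones
    var pstmt = conn.prepareStatement(
        'UPSERT "SFSF_EMPLOYEES" ("USER_ID", "DEPARTMENT", "DIVISION") ' +
        'VALUES (?, ?, ?) WITH PRIMARY KEY');
    for (var i = 0; i < users.length; i++) {
        pstmt.setString(1, users[i].userId);
        pstmt.setString(2, users[i].department);
        pstmt.setString(3, users[i].division);
        pstmt.addBatch();
    }
    pstmt.executeBatch();
    pstmt.close();
    conn.commit();
    conn.close();
}

The function is then referenced from an .xsjob definition (for example, action demo.hcm:sfsf_sync.xsjs::replicateEmployees with an xscron schedule), which is what makes the replication run periodically without manual intervention.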
Pre-defined industry data models from SAP HANA Live
Having the HCM data in SAP HANA at HCP, we can now analyze it. For this purpose SAP offers ready-made SAP HANA calculation views on top of the ERP HCM tables (Virtual Data Models for HR Renewal). This gives us out-of-the-box analysis capabilities for our scenario. Because we want to count the employees in our organization, and be able to drill down by organization and department, we use the ‘sap.hba.apps.hcmrenewal.views::HeadcountForManagerByEmployeeAndDate’ calculation view. On the other hand, we want to count the SuccessFactors employees as well. For that, we have created a custom calculation view that joins the ‘HeadcountForManagerByEmployeeAndDate’ view with the SFSF employees’ data table. The custom calculation view exposes the number of employees as a measure, and organization and department as drill-down attributes. You may see the results in the data preview below.
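As an illustration of how such a combined view can be consumed, the following XSJS sketch reads the headcount per organization and department from a custom calculation view. The view name demo.hcm.views/CombinedHeadcount and the column names are placeholders, since the actual artifact names are not listed in this blog.

// headcount.xsjs – sketch only: queries a combined headcount calculation view.
// View and column names below are placeholders, not the real artifacts from the scenario.
function getHeadcount() {
    var conn = $.db.getConnection();
    var pstmt = conn.prepareStatement(
        'SELECT "ORGANIZATION", "DEPARTMENT", CAST(SUM("EMPLOYEE_COUNT") AS INTEGER) AS "HEADCOUNT" ' +
        'FROM "_SYS_BIC"."demo.hcm.views/CombinedHeadcount" ' +
        'GROUP BY "ORGANIZATION", "DEPARTMENT" ORDER BY "ORGANIZATION", "DEPARTMENT"');
    var rs = pstmt.executeQuery();
    var rows = [];
    while (rs.next()) {
        rows.push({
            organization: rs.getString(1),
            department: rs.getString(2),
            headcount: rs.getInteger(3)
        });
    }
    rs.close();
    pstmt.close();
    conn.close();

    // Return the aggregated data as JSON, e.g. for a quick data preview
    $.response.contentType = "application/json";
    $.response.setBody(JSON.stringify(rows));
    $.response.status = $.net.http.OK;
}

getHeadcount();

In the scenario itself, SAP Lumira connects to the calculation view directly; the snippet is only meant to show the shape of the data the view exposes.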
Powerful insights with SAP Lumira
SAP Lumira gives us the power to analyze the data provided by SAP HANA calculation views. In our scenario, we have created a custom calculation view that joins the data for employees in ERP and SFSF. Business analysts can easily access this custom calculation view from SAP Lumira and apply filters, drill down, and slice and dice. SAP Lumira also allows creating reports with live data and sharing them with colleagues. In the picture below you may see a sample SAP Lumira report from our use case.
Summary
In this blog, we described how a smooth on-demand/on-premise integration for an HCP reference scenario can be achieved. The powerful combination of tailored HCP products, the design paradigms described above, and the constantly growing SAP HANA Live content (currently there are 22 industry-specific variants available) allows many valuable use cases and scenarios beyond the current scenario and industry to be covered in a standardized manner with low effort.
Contributors:
References:
SAP HANA Cloud Integration for data services: http://help.sap.com/hci_ds/
SAP HANA Cloud Integration for data services Tutorials: http://scn.sap.com/docs/DOC-47511
SAP Lumira 1.22: http://help.sap.com/lumira
Virtual Data Models for HR Renewal (New): http://help.sap.com/saphelp_hba/helpdata/en/78/885451d600ff50e10000000a445394/frameset.htm
SAP HANA Platform: http://help.sap.com/hana_platform/#section6