Use Azure Data Factory to push data to SAP BW using APIs

Context:


In various business scenarios, there is often a requirement to extract data from different source systems and push it to an SAP BW system for reporting and other purposes. Such integrations often take a file-based approach. I would like to share how we can use Azure Data Factory (ADF) to push data to an SAP BW system using the write interface REST APIs for an ADSO.



Why ADF?


There are various methods that can be used when implementing such interfaces. You can also configure a standard SAP Cloud Platform Integration (SCPI) iFlow for the same purpose; the details are well described in the blog SAP Cloud Platform Integration (CPI) to push data into SAP BW/4HANA.

However, ADF has the advantage of automatic triggers when Azure Blob Storage is used for the files, so the integration becomes event based: as soon as a file lands in the Azure blob container, the ADF pipeline runs and picks up the file for processing. That alone is a good enough reason to use ADF.



Scenario:


Let’s understand the scenario here:

  1. The source system sends a file to an Azure blob container.
  2. An ADF pipeline picks up the file on the storage event and pushes the data to the BW system using the write interface APIs.



Steps:​

  • Enable write interface for your ADSO in SAP BW:


Please follow the blog SAP BW/4HANA write interface-enabled ADSO connected to a 3rd Party Tool to enable the write interface for your ADSO. This will give you the API endpoints.



  • Setting up ADF Pipeline:​


The data format accepted by the BW APIs is JSON, so let us assume we have a JSON-formatted file for our scenario.

Note: In case you receive a CSV-formatted file, you can add an additional step to convert the CSV to a JSON file, or have a separate pipeline that converts CSV to JSON and stores the result in a different folder.
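For reference, here is a minimal Python sketch of such a CSV-to-JSON conversion (file names and field layout are hypothetical; inside ADF itself this would typically be a Copy activity or a separate pipeline):

import csv
import json

# Hypothetical file names; the field names come from the CSV header row.
with open("sales_data.csv", newline="", encoding="utf-8") as src:
    records = list(csv.DictReader(src))  # one dict per CSV row

# Write a JSON array of records, which is the shape the Lookup activity
# will read and the push API will receive as the request body.
with open("sales_data.json", "w", encoding="utf-8") as dst:
    json.dump(records, dst, indent=2)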





Pipeline Steps:​


Let's go through the pipeline steps to push the data to BW. I will not cover setting up the linked services and datasets for your environment.

Set variables: set the API endpoints (without the host name) for getting the CSRF token and for pushing data:

Variable 1 (get CSRF token), e.g.: /sap/bw4/v1/push/dataStores/zadso/requests

Variable 2 (push data), e.g.: /sap/bw4/v1/push/dataStores/zadso/dataSend

You can refer to SAP BW/4HANA write interface-enabled ADSO connected to a 3rd Party Tool for the endpoints.



Read JSON file content: use a Lookup activity to read the JSON file content.
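For orientation, this is what the Lookup step does, expressed as a small Python sketch (file name is hypothetical). With "First row only" switched off, the Lookup activity wraps the array as {"count": <n>, "value": [...]}, which is why the body expression later references .output.value.

import json

# Local equivalent of the Lookup activity: load the JSON array of records.
with open("sales_data.json", encoding="utf-8") as f:
    records = json.load(f)  # list of dicts, one per record

print(len(records), "records to push")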

Get CSRF Token: use a Web activity to get the CSRF token (a Python sketch of this call follows the settings below).




  • URL: define dynamic content that concatenates your host global parameter with the variable for the get-token endpoint. The reason for keeping the host URL as a global parameter is that we can change it per environment in deployment pipelines for QA and Prod.

@concat(pipeline().globalParameters.bw_host, variables('get_token_endpoint'))



  • METHOD: GET
  • HEADERS: Name: x-csrf-token, Value: fetch
  • Pass Authentication details.
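Outside ADF, the same token handshake looks roughly like this in Python; the host, ADSO name, and credentials are placeholders:

import requests

# Placeholders; in ADF these come from the global parameter and pipeline variables.
bw_host = "https://my-bw-host:44300"
get_token_endpoint = "/sap/bw4/v1/push/dataStores/zadso/requests"
auth = ("BW_USER", "BW_PASSWORD")  # basic authentication

# A GET with the header x-csrf-token: fetch returns the token in the response headers.
resp = requests.get(bw_host + get_token_endpoint,
                    headers={"x-csrf-token": "fetch"}, auth=auth)
resp.raise_for_status()
csrf_token = resp.headers["x-csrf-token"]
session_cookie = resp.headers.get("Set-Cookie")  # needed for the push call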



Push Data to BW system: use a Web activity to push the data to the BW system (again, a Python sketch follows the settings below).

  • URL: as before, define dynamic content that concatenates the host global parameter with the variable for the push-data endpoint.

@concat(pipeline().globalParameters.bw_host, variables('PUSH_DATA_ENDPOINT'))

  • METHOD: POST
  • HEADERS: you will need to pass the CSRF token fetched in the earlier step along with the session cookie. In ADF the session cookie is not set automatically for the next step the way it is in SCPI, so we have to set it manually. Below are the headers we need to pass.

x-csrf-token: @activity('<your_get_token_activity_name>').output.ADFWebActivityResponseHeaders['x-csrf-token']

cookie: @activity('<your_get_token_activity_name>').output.ADFWebActivityResponseHeaders['Set-Cookie']

  • BODY: pass the data read from the Read JSON file step (the Lookup activity output).

@activity('<your_read_json_activity_name>').output.value

  • Pass Authentication details.
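Continuing the Python sketch from the token step (it reuses bw_host, auth, csrf_token, session_cookie, and the records list), the push call replays the token and the session cookie manually, exactly as the Web activity has to:

import requests

push_data_endpoint = "/sap/bw4/v1/push/dataStores/zadso/dataSend"  # placeholder ADSO

resp = requests.post(bw_host + push_data_endpoint,
                     headers={"x-csrf-token": csrf_token,   # token from the GET call
                              "Cookie": session_cookie},    # replayed manually, as in ADF
                     json=records,                          # the JSON array of records
                     auth=auth)
resp.raise_for_status()
print(resp.status_code, resp.text)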





You can now set up a storage event trigger for your Azure storage account and publish the artifacts. Upload a JSON file to the storage container and your pipeline will be triggered automatically.



Congratulations, you have successfully pushed data to BW. As a last step, you can move the file to the respective archive/error folder by checking the response from the Push Data activity.
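In ADF this would be a Copy activity plus a Delete activity behind an If Condition; as a rough illustration only, here is the equivalent move in Python with the Azure Blob SDK (container, folder, and file names are hypothetical):

from azure.storage.blob import BlobServiceClient

client = BlobServiceClient.from_connection_string("<storage-connection-string>")
container, name = "bw-inbound", "sales_data.json"
push_succeeded = True  # e.g. resp.ok from the push sketch above

# Copy the blob into archive/ or error/, then remove the original.
target = ("archive" if push_succeeded else "error") + "/" + name
src = client.get_blob_client(container, blob=name)
dst = client.get_blob_client(container, blob=target)
dst.start_copy_from_url(src.url)  # server-side copy within the same account
src.delete_blob()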

Special mention to Leng Peang, as it was his idea to use ADF for this scenario.

