
SAP connection

To connect to BW, fill in the following parameters on the tSAPConnection component. A default context is already created for all BW systems.

The connection user must be RFC_TAL_WBP, which is the RFC user for Talend (DE team). It is already saved in the global context g_CNX_SAP_WBP.

Variable         SBX                 DEV                 QAS                 PRE                 PRD
sapClient        400                 110                 400                 400                 400
sapUser          RFC_TAL_WBP         RFC_TAL_WBP         RFC_TAL_WBP         RFC_TAL_WBP         RFC_TAL_WBP
sapHost          wb1.eua.solvay.com  wbd.eua.solvay.com  wbq.eua.solvay.com  wbv.eua.solvay.com  wbp.eua.solvay.com
sapSystemNumber  20                  48                  32                  85                  00
sapUserPwd       get it from Keepass
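As a minimal sketch, the context values above can be expressed as a lookup per environment that assembles the parameter set the tSAPConnection component expects. The dictionary and the rfc_params() helper are illustrative, not part of the Talend project; the password is deliberately not stored in code and must be fetched from Keepass.

```python
RFC_USER = "RFC_TAL_WBP"  # same RFC user on every BW system

# Values copied from the connection table above.
SAP_BW_SYSTEMS = {
    "SBX": {"client": "400", "host": "wb1.eua.solvay.com", "sysnr": "20"},
    "DEV": {"client": "110", "host": "wbd.eua.solvay.com", "sysnr": "48"},
    "QAS": {"client": "400", "host": "wbq.eua.solvay.com", "sysnr": "32"},
    "PRE": {"client": "400", "host": "wbv.eua.solvay.com", "sysnr": "85"},
    "PRD": {"client": "400", "host": "wbp.eua.solvay.com", "sysnr": "00"},
}

def rfc_params(env: str, password: str) -> dict:
    """Assemble the tSAPConnection parameters for one environment.

    The password is intentionally not hard-coded -- retrieve it from Keepass.
    """
    system = SAP_BW_SYSTEMS[env]
    return {
        "sapHost": system["host"],
        "sapSystemNumber": system["sysnr"],
        "sapClient": system["client"],
        "sapUser": RFC_USER,
        "sapUserPwd": password,
    }
```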




RFC Connection

This connection is required when FTP-Batch is used. When loading a large volume of data (more than 10,000 records), the data must be split and saved temporarily to a folder; the QAS folder can be used to store the files.

The connection is also already saved in the global context g_CNX_FTP_BW.

Variable   Value
ftpHost    wbq.eua.solvay.com
ftpFolder  /exploit/BW/test
ftpPort    22
ftpUser    sftpbw
ftpPwd     get it from Keepass

This function is available only on some components.


GCP Connection

Variable                                           Meaning
l_VAR_GCS_CSV_TO_STAGING_BUCKETNAME                Bucket name in GCS
l_VAR_FOLDER                                       Local path
l_VAR_FILE                                         Local filename
l_VAR_LIMITROW                                     Maximum number of rows per file (used with tSAPTableInput)
l_PATHDIR_GCS_CSV_TO_STAGING_SERVICE_ACCOUNT_PATH  Path to the service account key file (JSON)
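As a sketch of how these variables fit together, the local file path and the target GCS object can be derived from the context as below. The sample context values are hypothetical; the actual upload would authenticate with the service-account JSON file referenced by l_PATHDIR_GCS_CSV_TO_STAGING_SERVICE_ACCOUNT_PATH (not shown here).

```python
import os

# Hypothetical context values, mirroring the variables in the table above.
context = {
    "l_VAR_FOLDER": "/tmp/bw_extract",
    "l_VAR_FILE": "masterdata_0001.csv",
    "l_VAR_GCS_CSV_TO_STAGING_BUCKETNAME": "bw-staging-bucket",
    "l_PATHDIR_GCS_CSV_TO_STAGING_SERVICE_ACCOUNT_PATH": "/secure/sa-key.json",
}

def staging_paths(ctx):
    """Return (local file path, target GCS URI) built from the context.

    Authentication against GCS would use the service-account JSON file
    whose path is held in l_PATHDIR_GCS_CSV_TO_STAGING_SERVICE_ACCOUNT_PATH.
    """
    local = os.path.join(ctx["l_VAR_FOLDER"], ctx["l_VAR_FILE"])
    gcs_uri = "gs://{}/{}".format(
        ctx["l_VAR_GCS_CSV_TO_STAGING_BUCKETNAME"], ctx["l_VAR_FILE"]
    )
    return local, gcs_uri
```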


Components to Connect to BW

OBJECT               Talend component     COMMENT
SAP Tables           tSAPTableInput       Delta: No / Filter: Yes / FTP-Batch: Yes
InfoObject           tSAPInfoobjectInput  Delta: No / Filter: No (for texts, better use the SAP table) / FTP-Batch: No
DSO (old model)      tSAPDSOInput         Delta: No / Filter: Yes / FTP-Batch: Yes
InfoCube (old model) tSAPCubeInput        Delta: No / Filter: Yes / FTP-Batch: Yes
ADSO                 tSAPADSOInput        Delta: Yes / Filter: Yes / FTP-Batch: No
MultiProvider        tSAPCubeInput        Delta: No / Filter: Yes / FTP-Batch: Yes
Composite Provider   tSAPADSOInput        Delta: Yes / Filter: Yes / FTP-Batch: No
BW Queries           tHttpRequest         Used with Xtract; the best solution for queries is Xtract with tHttpRequest


1. SAP Table

2. InfoObject (Master data)

2.1 Retrieve metadata

Metadata > SAP Connection > (Server) > right-click and select Retrieve SAP Table.

Search for the table and click Finish. Do this twice: once for the attribute table (/BIC/M) and once for the text table (/BIC/T).

2.2 Copy the reference job J022_BW_MasterData_to_GCS to the new job.

2.3 Objects to change:

2.3.1 tSetGlobalVar: set the batch size (number of rows per file) via context.l_VAR_LIMITROW.

2.3.2 Component tSAPTableInput (attribute)

Drag and drop the metadata table object into the new job and check that the schema, table name, filter, and number of records are correct.

2.3.3 Component tSAPTableInput (text)

Drag and drop the metadata table object into the new job and check that the schema, table name, filter, and number of records are correct.

The text table must not be limited to the maximum row count, since it is used to map the names.

2.4 Click Sync columns to get the new schema on tLogRow if you want to see the output during the job run; otherwise delete this component.

2.5 Add the context g_CNX_SAP_xxx for the server that needs to be connected.

2.6 Enter the context parameters to connect to GCP.

The rest of the job extracts the data from SAP, generates the files in the local folder, and uploads them to the bucket entered in step 2.6.
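The reason the text table must be read without a row limit is that every attribute row needs its description looked up; a missing text row would leave a blank name. The mapping step amounts to a left join, sketched below in plain Python. The column names KEY and TXTMD are illustrative assumptions, not the actual schema.

```python
def map_texts(attribute_rows, text_rows, key="KEY", text_field="TXTMD"):
    """Attach the description from the text table (/BIC/T*) to each
    attribute row (/BIC/M*). A plain dict lookup acts as a left join;
    rows with no matching text get an empty description.
    """
    texts = {row[key]: row[text_field] for row in text_rows}
    return [
        {**attr, text_field: texts.get(attr[key], "")}
        for attr in attribute_rows
    ]
```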

3 DSO

4. InfoCube

Reference job is J024_BW_InfoCube_to_GCS

tSAPInfoCubeInput configuration

5. ADSO

6. MultiProvider

Reference job is J026_BW_MultiProvider_to_GCS

tSAPInfoCubeInput configuration



7. Composite Provider

8. BW Query



