The following parameters must be filled in on the tSAPConnection component in order to connect to BW. A default context is already created for all BW systems.

The connection user must be RFC_TAL_WBP, the RFC user for Talend (DE team). It is already saved in the global context g_CNX_SAP_WBP.
| Variable | SBX | DEV | QAS | PRE | PRD |
|---|---|---|---|---|---|
| sapClient | 400 | 110 | 400 | 400 | 400 |
| sapUser | RFC_TAL_WBP | RFC_TAL_WBP | RFC_TAL_WBP | RFC_TAL_WBP | RFC_TAL_WBP |
| sapHost | wb1.eua.solvay.com | wbd.eua.solvay.com | wbq.eua.solvay.com | wbv.eua.solvay.com | wbp.eua.solvay.com |
| sapSystemNumber | 20 | 48 | 32 | 85 | 00 |
| sapUserPwd | get it from KeePass | get it from KeePass | get it from KeePass | get it from KeePass | get it from KeePass |
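The per-environment values above can be expressed as a small lookup. This is an illustration only: in Talend these values live in the g_CNX_SAP_WBP context, and the `sap_connection` helper below is hypothetical (the password always comes from KeePass, never from code).

```python
# Per-environment SAP BW connection parameters, transcribed from the table above.
# Illustration only: in Talend these live in the g_CNX_SAP_WBP context.
SAP_BW = {
    "SBX": {"sapClient": "400", "sapHost": "wb1.eua.solvay.com", "sapSystemNumber": "20"},
    "DEV": {"sapClient": "110", "sapHost": "wbd.eua.solvay.com", "sapSystemNumber": "48"},
    "QAS": {"sapClient": "400", "sapHost": "wbq.eua.solvay.com", "sapSystemNumber": "32"},
    "PRE": {"sapClient": "400", "sapHost": "wbv.eua.solvay.com", "sapSystemNumber": "85"},
    "PRD": {"sapClient": "400", "sapHost": "wbp.eua.solvay.com", "sapSystemNumber": "00"},
}

def sap_connection(env: str, user: str = "RFC_TAL_WBP") -> dict:
    """Hypothetical helper: return the connection parameters for one environment.

    The sapUserPwd is intentionally absent; it must be fetched from KeePass.
    """
    params = dict(SAP_BW[env])
    params["sapUser"] = user
    return params
```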
FTP-Batch is required when loading a large volume of data (more than 10,000 records): the data is split and temporarily saved to a folder. For the RFC connection, the QAS folder can be used to keep the files.
The connection is already saved in the global context g_CNX_FTP_BW.
| Variable | Value |
|---|---|
| ftpHost | wbq.eua.solvay.com |
| ftpFolder | /exploit/BW/test |
| ftpPort | 22 |
| ftpUser | sftpbw |
| ftpPwd | get it from KeePass |
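The FTP-Batch rule above can be sketched as a small planning function. This is a hypothetical helper for illustration, not part of the Talend job; the 10,000-record threshold comes from the note above, and `limit_rows` corresponds to the row limit per temporary file.

```python
import math

# From the note above: FTP-Batch is needed when loading more than 10,000 records.
FTP_BATCH_THRESHOLD = 10_000

def ftp_batch_plan(total_rows: int, limit_rows: int) -> dict:
    """Hypothetical helper: decide whether FTP-Batch is needed and how many
    temporary files the split would produce on the SFTP folder."""
    if total_rows <= FTP_BATCH_THRESHOLD:
        return {"use_ftp_batch": False, "files": 1}
    return {"use_ftp_batch": True, "files": math.ceil(total_rows / limit_rows)}
```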
This function is available only for some components.


| Variable | Meaning |
|---|---|
| l_VAR_GCS_CSV_TO_STAGING_BUCKETNAME | Bucket Name in GCS |
| l_VAR_FOLDER | Local path |
| l_VAR_FILE | Local filename |
| l_VAR_LIMITROW | Limit on the number of rows per file when using tSAPTableInput |
| l_PATHDIR_GCS_CSV_TO_STAGING_SERVICE_ACCOUNT_PATH | Location path and file of service account (JSON file) |
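The bucket, local path, and filename variables above combine into a local file path and a target GCS object. The helper below is a hypothetical sketch of that combination (the actual upload is done by the Talend job using the service-account JSON).

```python
from pathlib import PurePosixPath

def staging_paths(bucket: str, folder: str, filename: str) -> dict:
    """Hypothetical helper: derive the local CSV path and the target GCS URI from
    l_VAR_GCS_CSV_TO_STAGING_BUCKETNAME, l_VAR_FOLDER, and l_VAR_FILE."""
    local = str(PurePosixPath(folder) / filename)
    return {"local": local, "gcs": f"gs://{bucket}/{filename}"}
```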
| Object | Talend component | Delta | Filter | FTP-Batch | Comment |
|---|---|---|---|---|---|
| SAP Tables | tSAPTableInput | No | Yes | Yes | |
| InfoObject | tSAPInfoobjectInput | No | No | No | For texts, it is better to use the SAP table |
| DSO (old model) | tSAPDSOInput | No | Yes | Yes | |
| InfoCube (old model) | tSAPCubeInput | No | Yes | Yes | |
| ADSO | tSAPADSOInput | Yes | Yes | No | |
| MultiProvider | tSAPCubeInput | No | Yes | Yes | |
| Composite Provider | tSAPADSOInput | Yes | Yes | No | |
| BW Queries | tHttpRequest | | | | The best solution for queries is Xtract, used via tHttpRequest |
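The capability matrix above can be transcribed into a lookup, so a job designer can pick the right component per BW object type programmatically. This is a hypothetical illustration of the table, not part of any Talend job.

```python
# Component capabilities per BW object type, transcribed from the table above.
CAPABILITIES = {
    "SAP Tables":         ("tSAPTableInput",      {"delta": False, "filter": True,  "ftp_batch": True}),
    "InfoObject":         ("tSAPInfoobjectInput", {"delta": False, "filter": False, "ftp_batch": False}),
    "DSO":                ("tSAPDSOInput",        {"delta": False, "filter": True,  "ftp_batch": True}),
    "InfoCube":           ("tSAPCubeInput",       {"delta": False, "filter": True,  "ftp_batch": True}),
    "ADSO":               ("tSAPADSOInput",       {"delta": True,  "filter": True,  "ftp_batch": False}),
    "MultiProvider":      ("tSAPCubeInput",       {"delta": False, "filter": True,  "ftp_batch": True}),
    "Composite Provider": ("tSAPADSOInput",       {"delta": True,  "filter": True,  "ftp_batch": False}),
}

def component_for(obj_type: str) -> str:
    """Return the Talend input component to use for a given BW object type."""
    return CAPABILITIES[obj_type][0]
```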
1.1 Retrieve metadata
Metadata > SAP Connection > (Server) > right click and select Retrieve SAP Table
Search for the table and click Finish. Do this twice: once for the attribute table (/BIC/M…) and once for the text table (/BIC/T…).
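The two table names to retrieve follow directly from the InfoObject name, as described in the step above. The helper and the example InfoObject name below are hypothetical illustrations of that naming pattern.

```python
def masterdata_tables(infoobject: str) -> tuple:
    """Hypothetical helper: derive the attribute (/BIC/M...) and text (/BIC/T...)
    table names to retrieve for a given InfoObject, per the step above."""
    return (f"/BIC/M{infoobject}", f"/BIC/T{infoobject}")
```

For example, a (hypothetical) InfoObject `ZMATERIAL` would give `/BIC/MZMATERIAL` and `/BIC/TZMATERIAL`.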
1.2 Copy the reference job J022_BW_MasterData_to_GCS to a new job

1.3 Objects to change:
1.3.1 tSetGlobalVar: change the batch size, i.e. the number of rows per generated file, via context.l_VAR_LIMITROW
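What context.l_VAR_LIMITROW controls can be sketched as a simple chunking step: the extracted rows are cut into batches of at most that many rows, one batch per generated file. The function below is a hypothetical illustration, not the Talend implementation.

```python
def split_into_files(rows: list, limit_rows: int) -> list:
    """Hypothetical sketch of what l_VAR_LIMITROW controls: chunk the extracted
    rows into batches of at most limit_rows rows, one batch per generated file."""
    return [rows[i:i + limit_rows] for i in range(0, len(rows), limit_rows)]
```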

1.3.2 Component tSAPTableInput (attribute)
Drag and drop the attribute table metadata object into the new job and check that the schema, table name, filter, and number of records are correct.

1.3.3 Component tSAPTableInput (text)
Drag and drop the text table metadata object into the new job and check that the schema, table name, filter, and number of records are correct.

The text extraction must not be limited to a maximum number of rows, since it is used to map the names (descriptions) onto the attribute rows.
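The reason for not row-limiting the text flow can be sketched as a lookup join: each attribute row fetches its description from the text table by key, so any text row dropped by a limit leaves its attribute row without a name. The helper and the field names `KEY`/`TXTMD` below are illustrative assumptions, not the actual job schema.

```python
def attach_texts(attribute_rows: list, text_rows: list, key: str = "KEY") -> list:
    """Hypothetical sketch of the attribute/text mapping: each attribute row
    looks up its description in the text table; rows missing from the (possibly
    truncated) text flow end up with an empty name."""
    texts = {t[key]: t["TXTMD"] for t in text_rows}
    return [dict(row, TXTMD=texts.get(row[key], "")) for row in attribute_rows]
```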
1.4 Click "Sync columns" to get the new schema on tLogRow if you want to see the output during the job run; otherwise remove the tLogRow component.
1.5 Add the g_CNX_SAP_xxx context for the server that needs to be connected.
1.6 Enter the context parameters to connect to GCP.
The rest of the job extracts the data from SAP, generates the file in the local folder, and then uploads it to the bucket entered in step 1.6.
Reference job: J024_BW_InfoCube_to_GCS

tSAPInfoCubeInput configuration

Reference job: J026_BW_MultiProvider_to_GCS

tSAPInfoCubeInput configuration
