...
The user to connect must be RFC_TAL_WBP, which is the RFC user for Talend (DE team). It is already saved in the global context g_CNX_SAP_WBP.
| Variable | SBX | DEV | QAS | PRE | PRD |
|---|---|---|---|---|---|
| sapClient | 400 | 110 | 400 | 400 | 400 |
| sapUser | RFC_TAL_WBP | RFC_TAL_WBP | RFC_TAL_WBP | RFC_TAL_WBP | RFC_TAL_WBP |
| sapHost | wb1.eua.solvay.com | wbd.eua.solvay.com | wbq.eua.solvay.com | wbv.eua.solvay.com | wbp.eua.solvay.com |
| sapSystemNumber | 20 | 48 | 32 | 85 | 00 |
| sapUserPwd | get it from KeePass (same for all environments) | | | | |
...
To run a test, the DE can select the system to run against; the job will automatically connect to that system and return the result.
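For reference outside Talend, the same context values can be exercised with the SAP NetWeaver RFC SDK through the pyrfc package. This is only a minimal sketch using the QAS values from the table above; the password placeholder is hypothetical and must be taken from KeePass.

```python
# Minimal sketch: open an RFC connection to BW with the values stored in the
# g_CNX_SAP_WBP context (QAS values shown). Requires the SAP NW RFC SDK and
# the pyrfc package; the password placeholder must come from KeePass.
from pyrfc import Connection

conn = Connection(
    ashost="wbq.eua.solvay.com",   # sapHost (QAS)
    sysnr="32",                    # sapSystemNumber (QAS)
    client="400",                  # sapClient (QAS)
    user="RFC_TAL_WBP",            # sapUser (Talend / DE team RFC user)
    passwd="<from KeePass>",       # sapUserPwd
)

# Quick connectivity check: ping the system and print the system ID.
conn.ping()
print(conn.get_connection_attributes()["sysId"])
conn.close()
```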
FTP Connection
It is required when FTP-Batch is used. FTP-Batch is needed when we load a huge volume of data (more than 10,000 records): the data must be split and temporarily saved to a folder. Whichever system we connect to via RFC, we can use the QAS folder to keep the files.
It is also already saved in the global context g_CNX_FTP_BW.
| Variable | Value |
|---|---|
| ftpHost | wbq.eua.solvay.com |
| ftpFolder | /exploit/BW/test |
| ftpPort | 22 |
| ftpUser | sftpbw |
| ftpPwd | get it from KeePass |
This function is available only for some components.
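Since the port is 22, the connection is SFTP rather than plain FTP. Below is a minimal sketch of the same transfer outside Talend, using the paramiko library; the local file name is a hypothetical example of a split batch file.

```python
# Minimal sketch: push a temporary batch file to the QAS SFTP folder using
# the g_CNX_FTP_BW values (port 22 means SFTP, not plain FTP).
import paramiko

transport = paramiko.Transport(("wbq.eua.solvay.com", 22))          # ftpHost, ftpPort
transport.connect(username="sftpbw", password="<from KeePass>")     # ftpUser, ftpPwd
sftp = paramiko.SFTPClient.from_transport(transport)

# Hypothetical split file produced by an FTP-Batch load of >10,000 records.
sftp.put("batch_part_001.csv", "/exploit/BW/test/batch_part_001.csv")  # ftpFolder

sftp.close()
transport.close()
```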
GCP Connection
| Variable | Meaning |
|---|---|
| l_VAR_GCS_CSV_TO_STAGING_BUCKETNAME | Bucket Name in GCS |
| l_VAR_FOLDER | Local path |
| l_VAR_FILE | Local filename |
| l_VAR_LIMITROW | Limit on the number of rows per file. Required only for tSAPTableInput |
| l_PATHDIR_GCS_CSV_TO_STAGING_SERVICE_ACCOUNT_PATH | Path to the service account key file (JSON) |
More detail on how to get a GCP service account.
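As an illustration of what the GCS upload step does with these variables, here is a minimal sketch using the google-cloud-storage client. All literal values are hypothetical placeholders standing in for the context variables above.

```python
# Minimal sketch: upload the CSV files produced by a job from the local folder
# (l_VAR_FOLDER / l_VAR_FILE) to the GCS bucket
# (l_VAR_GCS_CSV_TO_STAGING_BUCKETNAME), authenticating with the service
# account JSON (l_PATHDIR_GCS_CSV_TO_STAGING_SERVICE_ACCOUNT_PATH).
# All literal values are hypothetical placeholders.
import glob
import os

from google.cloud import storage

service_account_path = "/path/to/service_account.json"
bucket_name = "my-staging-bucket"
local_folder = "/tmp/export"
file_pattern = "material_*.csv"   # several files when l_VAR_LIMITROW splits the output

client = storage.Client.from_service_account_json(service_account_path)
bucket = client.bucket(bucket_name)

for local_path in glob.glob(os.path.join(local_folder, file_pattern)):
    blob = bucket.blob(os.path.basename(local_path))
    blob.upload_from_filename(local_path)
    print(f"uploaded {local_path} -> gs://{bucket_name}/{blob.name}")
```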
Components to Connect to BW
| OBJECT | Talend component | COMMENT |
|---|---|---|
| InfoObject (master data) | tSAPTableInput | Delta No, Filter Yes, FTP-Batch Yes |
| InfoObject | tSAPInfoobjectInput | Delta No, Filter No (with text, better use the SAP table), FTP-Batch No |
| DSO (old model) | tSAPDSOInput | Delta No, Filter Yes, FTP-Batch Yes |
| InfoCube (old model) | tSAPCubeInput | Delta No, Filter Yes, FTP-Batch Yes |
| ADSO | tSAPADSOInput | Delta Yes, Filter Yes, FTP-Batch No |
| MultiProvider | tSAPCubeInput | Delta No, Filter Yes, FTP-Batch Yes |
| Composite Provider | tSAPADSOInput | Delta Yes, Filter Yes, FTP-Batch No |
| BW Query | tHttpRequest (used for Xtract) | The best solution for a query is Xtract with tHttpRequest |
1. SAP Table
It is the same as an ECC SAP table; the only difference is that the connection points to BW.
2. InfoObject (Master data)
Although Talend has a component to get data from an InfoObject, it is better to get it from the table, since tSAPInfoobjectInput cannot apply a filter and has no FTP-Batch or load-by-starting-row option like tSAPTableInput. Therefore, it is better to use tSAPTableInput.
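To make the difference concrete, the sketch below shows a filtered, row-windowed table read outside Talend, using pyrfc with the generic RFC_READ_TABLE function module as a stand-in for what tSAPTableInput offers; the table name and WHERE clause are hypothetical examples.

```python
# Sketch: filtered read with a starting row and a row limit, the two features
# that tSAPTableInput exposes and tSAPInfoobjectInput lacks.
# Uses the standard RFC_READ_TABLE function module via pyrfc; the table name
# and WHERE clause below are hypothetical examples.
from pyrfc import Connection

conn = Connection(ashost="wbq.eua.solvay.com", sysnr="32", client="400",
                  user="RFC_TAL_WBP", passwd="<from KeePass>")

result = conn.call(
    "RFC_READ_TABLE",
    QUERY_TABLE="/BI0/PMATERIAL",          # hypothetical InfoObject master data table
    OPTIONS=[{"TEXT": "OBJVERS = 'A'"}],   # filter (WHERE clause)
    ROWSKIPS=0,                            # starting row
    ROWCOUNT=10000,                        # batch size, like the FTP-Batch chunks
)

for row in result["DATA"]:
    print(row["WA"])

conn.close()
```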
3. DSO
4. InfoCube
Reference job is J024_BW_InfoCube_to_GCS
tSAPInfoCubeInput configuration
5. ADSO
6. MultiProvider
Reference job = J026_BW_MultiProvider_to_GCS
tSAPInfoCubeInput configuration
7. Composite Provider
...
8. BW Query
There are 3 main servers that we need to set up:
8.1 BW Server
A query that will be available in the Xtract program needs the Remote Access flag set to “By OLE DB for OLAP”.
A query with this flag should have a technical name starting with:
- QV_ for Qlik
- DO_ for Talend loads to GCP (the new data ocean concept)
The flag must NOT be set on the core BW query, whose name starts with BW_QRY_xxxx.
This has to be done by a BW developer and transported from WBD to WBP.
8.2 Xtract
1. Open Xtract Designer on a virtual machine that has this application installed, such as bgsoa-wks116. Connect to the Xtract server (XU server: ACEW1DXTRAXUS01:8082).
2. Create a job in Xtract (no need to create a connection, since all connections should already exist; you only have to decide which one to use).
2.1 Click on New (or Edit, then go to step 3.5).
2.2 Select the BW server. The DE should use the connection that uses the user RFC_TAL_WBP, which is dedicated to Talend and the DE team, not the one for Qlik.
2.3 Enter the job name. The naming convention is TALEND_<Landscape>_<QueryName>, for example TALEND_PRD_DO_BW_QRY_XXXX (a query name to be connected via Talend must start with DO_ (data ocean), so it can be differentiated from Qlik queries, whose names start with QV_).
<Landscape>
- PRD = Production
- PRE = Pre-production
- QAS = Quality
- DEV = Development
2.4 To connect to a BW query, it is required to select BW Cube.
3. Define data source
For Xtract jobs that will be used by Talend, please use WBD_TALEND, WBQ_TALEND, or WBP_TALEND. Avoid using bw_qlikview.
3.1 Click on Search.
3.2 Select MDX (or BICS) and enter the Query name to search
3.3 Select the query
3.4 Click on Load
3.5 It will get the metadata of the query. Select the characteristics and key figures that are required. The default value at the parent node is the text. If we need the key of a characteristic, we need to select it in the properties.
Edit the variables to filter if required. Be careful: the MDX and BICS inputs can differ in some fields, such as the date format.
If you cannot find the query in MDX, select BICS (Beta). In this version, BICS does not allow selecting both key and text for the characteristics.
Click on "Load live preview" to check that the setup is OK.
3.6 Click on Destination to configure the output and the column headers.
3.6.1 If the destination is http-csv (this works with the Talend component tFileOutputDelimited) → the result from the BW query goes to Talend directly.
If the destination is CSVFiles (this works with the Talend component tFileInputDelimited) → recommended option. This option loads the data from the BW query and saves it on the Xtract server.
3.6.2 The column name should be "Code and Text". We need the code to create a field in Google BigQuery and the text to explain to the DA and DE what they are.
By default, the column will be "Code".
"Code and Text" will be:
The field name in BQ should be the one in []
3.6.3 File Splitting: this is required when the output from BW is too huge and the files need to be split. This option is available only when the destination is CSVFiles. If this option is selected, the FileInput component has to read all the files; they can be written to a specific folder on the Xtract server and read with the tFileList component (see the Python sketch after these steps).
3.7 Click on Run in the top menu. The job details will appear, along with the URL that Talend requires.
3.8 (Optional) Click on Run again on the screen below to test that the job is OK.
Now the job is running and ready for Talend to get the data.
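As an illustration of what the Talend side then does with the URL from step 3.7, here is a minimal sketch of both destination options from step 3.6: streaming the http-csv output directly (what tHttpRequest + tFileOutputDelimited do) or reading every split file when the CSVFiles destination is used (what tFileList + tFileInputDelimited do). The URL, folder, file pattern, and delimiter are hypothetical placeholders.

```python
# Sketch of the two ways Talend consumes the Xtract job output.
# The URL, folder, file pattern, and delimiter below are hypothetical placeholders.
import csv
import glob

import requests

# Option 1: http-csv destination - the run URL from step 3.7 streams the
# query result directly to the caller.
xtract_url = "http://ACEW1DXTRAXUS01:8082/run/TALEND_QAS_DO_BW_QRY_XXXX"
response = requests.get(xtract_url, stream=True, timeout=600)
response.raise_for_status()
with open("bw_query_result.csv", "wb") as out:
    for chunk in response.iter_content(chunk_size=65536):
        out.write(chunk)

# Option 2: CSVFiles destination with File Splitting - read every split file
# from the folder on the Xtract server.
for path in sorted(glob.glob("/xtract/output/TALEND_QAS_DO_BW_QRY_XXXX_*.csv")):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f, delimiter=";"):   # delimiter is an assumption
            pass  # hand each row to the next processing step
```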
Move to prod
Differentiate the Xtract jobs that will be used in DEV, TEST, and PRE-PROD from the one used in PROD. For production you can:
- Duplicate the Xtract query and clearly specify the prod environment (the name must start with TALEND_PROD)
- Switch the Destination to point to the corresponding production one
- If needed, adjust the file name to have the same prefix as dev
8.3 Talend
Copy the job J028_BW_Query_csv_to_GCS to create the new job.


















