1. Retrieve metadata

Metadata > SAP Connection > (Server) > right-click and select Retrieve SAP Table

Search for the table and click Finish. Do this twice: once for the attribute table (/BIC/M) and once for the text table (/BIC/T).

2. Copy the reference job J022 BW_MasterData_to_GCS to create the new job

3. The components to change:

3.1 Component tSAPTableInput (attribute)

Drag and drop the metadata table object into the new job and check that the schema, table name, filter, and number of records are correct.

Note: the filter OBJVERS = 'A' is always required on the attribute table.
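As a minimal illustration (plain Python, not Talend-generated code), the OBJVERS = 'A' filter keeps only the active version of each row; the table and field names below are hypothetical:

```python
# Hypothetical attribute rows from a /BIC/M table: an object can exist in an
# active ('A') and a modified ('M') version; only the active one should load.
rows = [
    {"MATERIAL": "1000", "OBJVERS": "A", "MATL_GROUP": "01"},
    {"MATERIAL": "1000", "OBJVERS": "M", "MATL_GROUP": "02"},
    {"MATERIAL": "2000", "OBJVERS": "A", "MATL_GROUP": "03"},
]

# Equivalent of the tSAPTableInput filter OBJVERS = 'A'
active = [r for r in rows if r["OBJVERS"] == "A"]
```

Without the filter, the same object would appear once per version and produce duplicate keys downstream.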

3.2 Component tSAPTableInput (text)

Drag and drop the metadata table object into the new job and check that the schema, table name, filter, and number of records are correct.

Do not limit the text table to a maximum number of rows, since it is used to map the names.

3.3 Click Sync columns to pick up the new schema on tReplace, and on tLogRow if you want to see the output while the job runs (otherwise delete the tLogRow component).

tMap joins the attribute and text data into one file. Also check the join key whenever the table changes: the main flow must be the attribute table and the lookup must be the text table.
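The tMap join above can be sketched as a left join in plain Python (keys and field names are hypothetical, not taken from the actual job):

```python
# Sketch of the tMap join: attribute rows are the main flow, text rows are
# the lookup keyed on MATERIAL (hypothetical key field).
attributes = [
    {"MATERIAL": "1000", "MATL_GROUP": "01"},
    {"MATERIAL": "2000", "MATL_GROUP": "03"},
]
texts = [
    {"MATERIAL": "1000", "TXTMD": "Steel bolt"},
    # No text row for 2000: a left join keeps the attribute row anyway.
]

# Index the lookup (text) table by its key, like tMap's lookup flow.
text_by_key = {t["MATERIAL"]: t["TXTMD"] for t in texts}

# Main flow drives the output; missing texts become an empty string.
joined = [
    {**a, "TXTMD": text_by_key.get(a["MATERIAL"], "")}
    for a in attributes
]
```

This is why the attribute table must be the main flow: every attribute row must reach the output even when no matching text row exists.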

3.4 Add the context g_CNX_SAP_xxx for the server that needs to be connected.

3.5 Enter the context parameters to connect to GCP

The rest of the job extracts the data from SAP, generates the file in a local folder, and then uploads it to the bucket entered in the GCP context parameters.

Warning: the job does not delete files in GCP; it overwrites them. Be careful: if the next run produces fewer files than the previous one, the old files will remain in the bucket and may be loaded into staging as stale data. It is safer to include a timestamp in the file name so that nothing is overwritten by mistake.
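A minimal sketch of the timestamp naming suggested above (the base name and extension are placeholders, not the job's actual values):

```python
from datetime import datetime, timezone


def timestamped_name(base: str, ext: str = "csv") -> str:
    """Build a name like base_20240101T120000Z.csv so that each run
    writes a new object in the bucket instead of overwriting old ones."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return f"{base}_{stamp}.{ext}"


name = timestamped_name("masterdata_material")
```

With unique names per run, stale files from a previous run are still visible in the bucket but can no longer be confused with the current extract, and the staging load can select only the latest timestamp.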
