The log_tables table is used to log the results of the different steps of a flow.
The job J002_Write_log_tables is a standard job that you can drag and drop into your flow to log the results of its steps. This job is already integrated into the J001_STAGING_TO_ODS job, so you only need to add it to your own developments from ODS to data mart.
The job reads input context variables and uses them to create new lines in the log_tables table. The variables to pass are:
- l_LOCAL_VAR_RUN_ID: the system id of your job
- l_LOCAL_VAR_GCS_CSV_TO_STAGING_PROJECTID: the Google Cloud Platform project id
- l_LOCAL_VAR_GCS_CSV_TO_STAGING_DATASET: the dataset where the log_tables table is located (most probably the Staging dataset)
- l_LOCAL_VAR_LOGS_TABLE: the name of the log_tables table
- l_LOCAL_VAR_SOURCE_SYSTEM: the source from which data is extracted (for a BW extraction it could be the name of the query; for an ODS to DM calculation it is the name of the ODS table...)
- l_LOCAL_VAR_TARGET_SYSTEM: the name of the target table where the data is inserted
- l_LOCAL_VAR_GCS_to_STAGING_NB_LINES_PROCESSED: the number of lines extracted or processed
- l_LOCAL_VAR_GCS_to_STAGING_NB_LINES_KO: the number of lines that could not be inserted. The job considers the status not OK if this value is greater than 0.
- l_LOCAL_VAR_STEP: the step at which data is being loaded. Example values: Source to GCS, Bucket to staging, Staging to ODS, ODS to DW, DW to DM
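As a rough sketch, the variables above can be seen as doing two jobs: the project id, dataset, and table name locate the log_tables table, while the remaining variables become the new log line. The Python function below illustrates that mapping and the status rule (not OK when NB_LINES_KO > 0). The log column names (`run_id`, `status`, etc.) are assumptions for illustration, not the actual schema, and the write to the table itself is omitted.

```python
from datetime import datetime, timezone


def build_log_entry(context: dict) -> tuple[str, dict]:
    """Assemble the fully qualified log table reference and one log row
    from the job's input context variables."""
    # Project id + dataset + table name locate the log_tables table.
    table_ref = ".".join([
        context["l_LOCAL_VAR_GCS_CSV_TO_STAGING_PROJECTID"],
        context["l_LOCAL_VAR_GCS_CSV_TO_STAGING_DATASET"],
        context["l_LOCAL_VAR_LOGS_TABLE"],
    ])

    nb_ko = int(context.get("l_LOCAL_VAR_GCS_to_STAGING_NB_LINES_KO", 0))
    row = {
        # Column names below are hypothetical; adapt to the real schema.
        "run_id": context["l_LOCAL_VAR_RUN_ID"],
        "source_system": context["l_LOCAL_VAR_SOURCE_SYSTEM"],
        "target_system": context["l_LOCAL_VAR_TARGET_SYSTEM"],
        "step": context["l_LOCAL_VAR_STEP"],
        "nb_lines_processed": int(context["l_LOCAL_VAR_GCS_to_STAGING_NB_LINES_PROCESSED"]),
        "nb_lines_ko": nb_ko,
        # Status rule from the description: not OK when NB_LINES_KO > 0.
        "status": "KO" if nb_ko > 0 else "OK",
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }
    return table_ref, row
```

For example, a run of the ODS to DW step with 100 lines processed and 0 lines in error would produce a row with status OK in `my-project.staging.log_tables`.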

