Talend project: MECANO
GCP project: prj-data-dm-industrial-[env]
GCP product: prj-data-maintenance-dash-[env]
Bucket: cs-ew1-prj-data-dm-industrial-[env]-staging/MECANO/


To extract data via Xtract to the bucket

1. Compute the query parameters.
The job checks whether to reload or to load incrementally using l_LOCAL_VAR_RECOVERY_MODE (true = reload, false = incremental). See the reload section for more detail.
There are 3 parameter groups (groups of loading variables) for the BW queries:
Day(yyyyMMdd,yyyyMMdd) : QVMECANOBW_QRY_MVPMOR01_0002
Week (yyyyww) : The rest of queries
Month(yyyy0MM,yyyy0MM) : QVMECANO_BW_QRY_MVPMCO01_0001
If l_LOCAL_VAR_RECOVERY_MODE = false (the normal case: a weekly load on Monday):
Day: previous Monday to current Monday
Week: last week (last Monday to Sunday)
Month: last month to current month
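The three windows above can be sketched as follows. This is a minimal illustration using java.time, assuming the run date is the Monday of the weekly schedule; the class and method names are hypothetical, not the actual Talend code:

```java
import java.time.DayOfWeek;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.temporal.IsoFields;
import java.time.temporal.TemporalAdjusters;

public class QueryWindows {
    static final DateTimeFormatter DAY = DateTimeFormatter.ofPattern("yyyyMMdd");

    // Day group (yyyyMMdd,yyyyMMdd): previous Monday to current Monday.
    static String[] dayWindow(LocalDate runDate) {
        LocalDate currentMonday = runDate.with(TemporalAdjusters.previousOrSame(DayOfWeek.MONDAY));
        LocalDate previousMonday = currentMonday.minusWeeks(1);
        return new String[] { previousMonday.format(DAY), currentMonday.format(DAY) };
    }

    // Week group (yyyyww): last ISO week (last Monday to Sunday).
    static String weekWindow(LocalDate runDate) {
        LocalDate lastMonday = runDate.with(TemporalAdjusters.previousOrSame(DayOfWeek.MONDAY)).minusWeeks(1);
        int weekYear = lastMonday.get(IsoFields.WEEK_BASED_YEAR);
        int week = lastMonday.get(IsoFields.WEEK_OF_WEEK_BASED_YEAR);
        return String.format("%04d%02d", weekYear, week);
    }

    // Month group (yyyy0MM,yyyy0MM): last month to current month,
    // in the BW fiscal-period style with a literal 0 before the month.
    static String[] monthWindow(LocalDate runDate) {
        LocalDate lastMonth = runDate.minusMonths(1);
        return new String[] {
            String.format("%04d0%02d", lastMonth.getYear(), lastMonth.getMonthValue()),
            String.format("%04d0%02d", runDate.getYear(), runDate.getMonthValue())
        };
    }
}
```

For example, for a run on Monday 2025-01-13 this yields the day range 20250106–20250113, the week 202502, and the month range 2024012–2025001.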
2. List of queries to extract

This is controlled by a set of parameters:
Xtract job = l_VAR_mecano_xtract_[query name]_job = the Xtract job name
Valid = l_VAR_mecano_xtract_[query name]_valid. If it is 1, the query is extracted (the condition is checked on the "if" link)
tJava2 does not contain anything.
3. tJava3 only prints the "Current Query" so that it appears in the log in TMC.
4. A reference job is used to call the BW query and save the file to GCS.
5. Move the file from the staging bucket to the MECANO folder.
The separator in Xtract must be "‰", since the data inside BW can contain all kinds of characters ($, |, %, etc.) and a common character would split the data incorrectly. It is required to use 2 chars |.
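To see why a common character fails as a separator, here is a minimal illustration with a hypothetical sample record (not real BW data): splitting on | breaks a value that itself contains |, while ‰ does not occur in the data and splits cleanly.

```java
public class SeparatorDemo {
    // Hypothetical BW record whose second field contains $, | and %.
    static final String[] FIELDS = { "4500001234", "PUMP $A|B 50%", "20250106" };

    // Join the record with a separator, then split it again as a file reader would.
    static int roundTripColumns(String sep, String sepRegex) {
        return String.join(sep, FIELDS).split(sepRegex).length;
    }

    public static void main(String[] args) {
        // | collides with the data: 4 columns come back instead of 3.
        System.out.println(roundTripColumns("|", "\\|"));
        // ‰ never occurs in the data: the 3 original columns survive.
        System.out.println(roundTripColumns("‰", "‰"));
    }
}
```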
To load the file from GCS to ODS

1. Control the filename, STG and ODS table names, valid flag, and separator

It uses the same context as job F100_MEC_EXTRACT_BW_QUERY_TO_GSC, with the separator hardcoded to ‰.
2. Upload the file from GCS to the ODS table. This is copied from the reference job; only this part is different.

To load ODS to DM layer for FACT_maintenance_strategic. There is only 1 query: ODS_BWH_0000_F016_F_W_mec_mvpmco01_0001

1. Select data from ODS with the condition:
from "+context.l_CNX_GOOGLE_MECANO_ProjectID+"."+context.l_LOCAL_GOOGLE_MECANO_DATASET_ODS+".ODS_BWH_0000_F016_F_W_mec_mvpmco01_0001
inner join "+context.l_CNX_GOOGLE_MECANO_ProjectID+"."+context.l_LOCAL_GOOGLE_MECANO_DATASET_STAGING+".log_files on ODS_BWH_0000_F016_F_W_mec_mvpmco01_0001.meta_run_id=log_files.meta_run_id and log_files.meta_file_name LIKE 'MEC_IT_0000_0000_F016_"+context.Business_date+"_0000_F_W_MEC_MVPMCO01_0001%.csv' and log_files.meta_status in ('OK','NOK')
2. Save the result of the FACT joined with the dimension tables to context.l_LOCAL_PATHDIR_MECANO_DATA+context.l_LOCAL_PATHDIR_MECANO_DM+"FACT_maintenance_strategic.csv"
3. Put the file to the bucket: context.l_LOCAL_PATHDIR_MECANO_DATA+context.l_LOCAL_PATHDIR_MECANO_DM+"FACT_maintenance_strategic.csv"
4. Upload the file to WDL.FACT_maintenance_dash_strategic_cost_budget_tmp
5. Delete the file from step 2
6. Delete the file from step 4
7. Delete the existing months' data (matching the ODS selection) from DM.FACT_maintenance_strategic
DELETE FROM DM.FACT_maintenance_strategic
WHERE ref_date IN (SELECT DISTINCT CAST(ref_date as Date format 'MON YYYY') as ref_date FROM WDL.FACT_maintenance_dash_strategic_cost_budget_tmp)
8. Insert the data from ODS to DM.FACT_maintenance_strategic
There is no need to delete data manually on a reload, which is different from efficiency and effectiveness.
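Steps 7–8 are a delete-then-insert pattern per month, which makes the load idempotent: re-running the same month replaces its rows instead of duplicating them. A minimal in-memory sketch of the idea (hypothetical row type, not the actual Talend/SQL code):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class MonthlyReload {
    // The DM table modeled as month -> rows (e.g. "JAN 2025" -> values).
    static Map<String, List<String>> dm = new LinkedHashMap<>();

    // Step 7: delete every month present in the tmp table; step 8: insert the tmp rows.
    static void load(Map<String, List<String>> tmp) {
        for (String month : tmp.keySet()) {
            // DELETE ... WHERE ref_date IN (SELECT DISTINCT ref_date FROM ..._tmp)
            dm.remove(month);
        }
        for (Map.Entry<String, List<String>> e : tmp.entrySet()) {
            // INSERT the fresh rows for that month
            dm.computeIfAbsent(e.getKey(), k -> new ArrayList<>()).addAll(e.getValue());
        }
    }
}
```

Running the same month twice leaves only the rows of the latest run, which is why no manual cleanup is needed on reload.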
To load ODS to DM layer for FACT_maintenance_effectiveness. There are only 2 queries
To load ODS to DM layer for FACT_maintenance_efficiency. There are 9 queries
Summary
Group | Tables / Dimension | KPI | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | Pri | US | SS |
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
Strategic | ODS_BWH_0000_F016_F_W_mec_mvpmco01_0001 | 1.10 | x | x | x | x | x | x | x | x | ||||
Effectiveness | ODS_BWH_0000_F006_F_W_mec_mvpmcl01_0001 | 3.02 | x | x | x | x | x | x | x | x | x | x | ||
Effectiveness | ODS_BWH_0000_F014_F_W_mec_mvpmor01_0002 | 3.03 | x | x | x | x | x | x | x | x | x | |||
Efficiency | ODS_BWH_0000_F013_F_W_mec_mvpmor04_0001 | 2.03/2.12/2.14 | x | x | x | x | x | x | x | x | x | x | ||
Efficiency | ODS_BWH_0000_F015_F_W_mec_mvpmop02_0010 | 2.11 | x | x | x | x | x | x | x | - | ||||
Efficiency | ODS_BWH_0000_F007_F_W_mec_mvpmop04_0001 | 2.06/2.07/2.09 | x | x | x | x | x | x | x | x | xx | xx | ||
Efficiency | ODS_BWH_0000_F008_F_W_mec_mvpmop04_0002 | 2.04 | x | x | x | x | x | x | x | x | x | x | ||
Efficiency | ODS_BWH_0000_F009_F_W_mec_mvpmop04_0003 | 2.06 | x | x | x | x | x | ? | x | x | x | xx | xx | |
Efficiency | ODS_BWH_0000_F011_F_W_mec_mvpmop04_0005 | 2.08 | x | x | x | x | x | x | x | |||||
Efficiency | ODS_BWH_0000_F012_F_W_mec_mvpmop04_0006 | 2.13 | x | x | x | x | x | x | x | x | xx | xx | ||
Efficiency | ODS_BWH_0000_F004_F_W_mec_mvpmno04_0001 | 2.01/2.02/2.05_hist | x | x | x | x | x | x | x | |||||
Efficiency | ODS_BWH_0000_F005_F_W_mec_mvpmno04_0002 | 2.01/2.02/2.05_det | x | x | x | x | x | x | x |
Note: columns 1–9 are the dimensions; Pri = priority, US = user status, SS = system status.
Query ODS_BWH_0000_F012_F_W_mec_mvpmor04_0006 is also loaded in the Daily flow. It has a special component that selects max(reference_date) from DM.FACT_maintenance_efficiency WHERE kpi_no='2.13'; this applies only to this query.
The selection from ODS then has a condition to select only the data newer than that max reference_date:
WHERE PARSE_DATE('%d.%m.%Y', SnapShot__Date)>'" + TalendDate.formatDate("yyyy-MM-dd",((java.util.Date)globalMap.get("max_ref_date")))+"'"
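The condition above is assembled as a Java string inside the job. A minimal reconstruction of that string building, using java.text.SimpleDateFormat in place of Talend's TalendDate.formatDate (same pattern semantics), with a hypothetical method name:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class IncrementalCondition {
    // Build the WHERE clause appended to the ODS selection, given the
    // max(reference_date) read from DM.FACT_maintenance_efficiency (kpi_no = '2.13').
    static String build(Date maxRefDate) {
        String formatted = new SimpleDateFormat("yyyy-MM-dd").format(maxRefDate);
        return "WHERE PARSE_DATE('%d.%m.%Y', SnapShot__Date)>'" + formatted + "'";
    }
}
```

For a max reference_date of 2025-03-10 this produces: WHERE PARSE_DATE('%d.%m.%Y', SnapShot__Date)>'2025-03-10'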
New development in 2025 to add new KPIs
This loads the maintenance operation orders with the important dates, such as date_file_posted (C_FPDAT), end_real_date (c_PM_OPE2_C_FINREEL), shee_creation_date (C_SEDAT), and date_rejection (C_SPDAT), from a BW query. These dates are used to calculate the KPIs Posted File Delay, Technical Validation Delay, and Validation Rejected %.
This loads the table SRGBTBREL (Generic Object Services) from PF1 and WP1 to determine whether or not a notification has an attachment.
Generic Object Services (GOS) provides functionality for handling attachments, notes, and links to various objects in SAP transactions. Relationships in this environment refer to how different objects (e.g., documents, business objects) are linked or related within the system.
There are many object types in this table. However, the notification object is
TYPEID_A (type_a_of_objects_in_persistent_object_references) = 'BUS2038'. It is filtered on the DM layer.
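The DM-layer filter on the notification object type can be sketched as follows. The real filter is simply the SQL predicate TYPEID_A = 'BUS2038'; the row type and field names here are a hypothetical reduction of SRGBTBREL for illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class GosFilter {
    // A SRGBTBREL relationship row, reduced to the two fields used here.
    static class Relation {
        final String typeidA;  // object type of side A (e.g. 'BUS2038' = notification)
        final String objkeyA;  // object key of side A (e.g. the notification number)
        Relation(String typeidA, String objkeyA) {
            this.typeidA = typeidA;
            this.objkeyA = objkeyA;
        }
    }

    // Keep only notification relationships: these are the notifications with attachments.
    static List<String> notificationKeys(List<Relation> rows) {
        List<String> keys = new ArrayList<>();
        for (Relation r : rows) {
            if ("BUS2038".equals(r.typeidA)) {
                keys.add(r.objkeyA);
            }
        }
        return keys;
    }
}
```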
This job runs a SQL query to delete prj-data-maintenance-dash-[env].DM.FACT_maintenace_dash and refill the table from the view prj-data-maintenance-dash-[env].DM.FACT_maintenace_dash, using the parameter l_LOCAL_VAR_FACT_HISTORY_NUM_MONTH to control how many months the final dashboard table keeps. Normally it is 6 months.
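The history window driven by l_LOCAL_VAR_FACT_HISTORY_NUM_MONTH can be sketched as computing a cutoff date and keeping only rows on or after it. This is an assumption about the exact semantics (the real job does this in SQL when refilling the table from the view), and the names are hypothetical:

```java
import java.time.LocalDate;

public class HistoryWindow {
    // Cutoff for the dashboard table: first day of the month, numMonths back.
    static LocalDate cutoff(LocalDate runDate, int numMonths) {
        return runDate.withDayOfMonth(1).minusMonths(numMonths);
    }

    // A row survives the refill if its reference date is on or after the cutoff.
    static boolean keep(LocalDate refDate, LocalDate runDate, int numMonths) {
        return !refDate.isBefore(cutoff(runDate, numMonths));
    }
}
```

With the default of 6 months and a run on 2025-07-15, the cutoff would be 2025-01-01.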
To update authorization from gsheet
To trigger Tableau to load data from GCP