- Created by Menghetti, Mattéo, last modified by POLOSSON-ext, Damien on May 26, 2025
4.1 - Talend Integration
Source data integration with Talend ETL tool
Xtract Server (WBP):
- Talend connects to the Xtract server and triggers extraction jobs using HTTP calls.
- Xtract generates results as CSV files stored in memory.
- Talend retrieves the files, applies transformations, and then loads them into Google Cloud Storage.
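The extraction trigger described above can be sketched as follows. This is an illustrative sketch only: the endpoint URL, job name, and response format are assumptions, not the actual Xtract API.

```python
# Minimal sketch of the Xtract extraction step; the endpoint URL and the
# job naming are assumptions, not the real Xtract server API.
import csv
import io
import urllib.request

XTRACT_URL = "http://xtract-server.example/jobs/{job}/run"  # hypothetical endpoint


def trigger_extraction(job_name: str) -> str:
    """Trigger an Xtract extraction job over HTTP and return the CSV body."""
    with urllib.request.urlopen(XTRACT_URL.format(job=job_name)) as resp:
        return resp.read().decode("utf-8")


def parse_csv(body: str) -> list:
    """Parse the in-memory CSV payload into dict rows before staging to Cloud Storage."""
    return list(csv.DictReader(io.StringIO(body)))
```

In the real flow the parsed files are written to Google Cloud Storage rather than kept in memory.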
Google Sheets (Hedges, wap solid fuels, CO2):
- Talend integrates with Google Sheets to extract static files that are used to apply mappings or lookups to BW tables.
- It retrieves this data directly from Google Sheets.
Data Transformation and Loading to Google BigQuery:
- Once data from all sources is available in Google Cloud Storage as files, Talend proceeds with data transformation and loading.
- Talend performs data transformations as needed, including cleansing, mapping, and structuring the data for consistency.
- The transformed data is loaded into staging, operational data store (ODS), and data mart tables within Google BigQuery.
- These tables are organized to facilitate efficient querying and reporting for energy optimization purposes.
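The cleansing and mapping step above can be sketched as a row-level transformation. The field names (`plant`, `region`, `volume`) and the mapping table are invented for illustration, not the real GSC data model; in the actual flow the mapping would come from the Google Sheets lookups described earlier.

```python
# Illustrative sketch of the cleansing/mapping step; field names and the
# plant-to-region mapping are assumptions, not the real GSC model.
def transform_rows(rows, plant_mapping):
    """Cleanse and map raw CSV rows before loading them into BigQuery tables."""
    out = []
    for row in rows:
        plant = row.get("plant", "").strip().upper()
        if not plant:  # cleansing: drop rows without a plant code
            continue
        out.append({
            "plant": plant,
            "region": plant_mapping.get(plant, "UNKNOWN"),  # lookup (e.g. from Sheets)
            "volume": float(row.get("volume", 0) or 0),     # structuring: typed column
        })
    return out
```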
By using Talend for data extraction, transformation, and loading (ETL), the web app ensures that data from diverse sources is collected, processed, and structured for analysis and reporting in Google BigQuery, enabling users to make informed decisions based on up-to-date, accurate data.
4.2 - Source Data Extraction
**Main jobs for source extraction**

| Job description by steps | Job design |
|---|---|
| The flow triggers the extraction of all sources needed for the Global Supply Chain dashboard. | |
4.3 - Load Staging and ODS
**Main jobs for loading Staging**

| Job description by steps | Job design |
|---|---|
**Main jobs for loading the ODS**

| Job description by steps | Job design |
|---|---|
| Given that the staging and ODS tables are identical and are always truncated, this step is unnecessary. | |
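The truncate-and-reload pattern used between staging and ODS can be sketched as a helper that builds the two BigQuery statements. The dataset and table names here are placeholders, not the real GSC objects.

```python
# Sketch of the truncate-and-load pattern between staging and ODS;
# the table names passed in are placeholders for illustration.
def truncate_and_load_sql(staging_table: str, ods_table: str) -> list:
    """Return the BigQuery statements for a full truncate-and-insert reload.

    Because both tables share an identical schema, a plain SELECT * copy
    is sufficient; no column mapping is needed.
    """
    return [
        f"TRUNCATE TABLE {ods_table};",
        f"INSERT INTO {ods_table} SELECT * FROM {staging_table};",
    ]
```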
4.4 - Prepare DM files
**Main jobs for building DM**

| Job description by steps | Job design |
|---|---|
**Prepare DM file**

| Job description by steps | Job design |
|---|---|
| The job computes steps 19-28 of the specification document. | |
**Main jobs for source extraction**

| Job description by steps | Job design |
|---|---|
| The job computes steps 41-49 of the specification document. | |
**Main jobs for source extraction**

| Job description by steps | Job design |
|---|---|
| The job computes steps 33, 34, and 35-49 of the specification document. | |
**Main jobs for source extraction**

| Job description by steps | Job design |
|---|---|
| The job computes steps 33, 34, and 60-69 of the specification document. The job uses the tUnpivotRow component (you can find it here). | |
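What the tUnpivotRow component does can be sketched in plain Python: fixed key columns are kept, and every remaining column becomes a (column, value) row. The column names in the example are illustrative only.

```python
# Rough Python equivalent of Talend's tUnpivotRow behavior: wide rows are
# turned into long (key..., column, value) rows. Column names here are
# illustrative, not the real GSC schema.
def unpivot(rows, key_columns):
    """Unpivot wide dict rows into long rows, one per non-key column."""
    out = []
    for row in rows:
        keys = {k: row[k] for k in key_columns}
        for col, val in row.items():
            if col in key_columns:
                continue
            out.append({**keys, "column": col, "value": val})
    return out
```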
4.5 - Compute Perfect Order Rate
**Main jobs for source extraction**

| Job description by steps | Job design |
|---|---|
| The job creates the table FACT_POR, which is displayed by the Perfect Order Rate dashboard. | |
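A perfect-order-rate calculation of the kind FACT_POR feeds can be sketched as below. The criteria flags used here (`on_time`, `complete`, `damage_free`) are assumptions for illustration; the actual criteria are defined in the specification document.

```python
# Hedged sketch of a perfect-order-rate calculation; the order flags are
# assumed for illustration, since the real criteria live in the spec.
def perfect_order_rate(orders) -> float:
    """Share of orders meeting every 'perfect order' criterion."""
    if not orders:
        return 0.0
    perfect = sum(
        1 for o in orders
        if o.get("on_time") and o.get("complete") and o.get("damage_free")
    )
    return perfect / len(orders)
```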
4.6 - Load to DM (calculations and transformations)
**Main jobs for source extraction**

| Job description by steps | Job design |
|---|---|
| This job is used to call the job J501_GSC_ODS_to_DTM_FACT. | |

**501_GSC_ODS_to_DTM_FACT**

| Job description by steps | Job design |
|---|---|
4.7 - Scheduling and Automation
TMC - PL_QV_TO_TABLEAU_GSC - Daily at 9 PM.
4.8 - Data Validation
Responsible & contact points:
- Application Owner - Mathieu Pourqué
- Delivery Manager - Donia Rachidi
- Project Manager - Vitaly
- Data Architect - Joao Fonseca
- Tableau Dev - Maxime Marboeuf
- Data Engineer - Virgil Lissassi ; replaced by Matteo Menghetti