...
PROJECT - PCM
COMP_PCM
History
In Mar 2020, the data flow was updated to PCM 2.0.
In 2021, BW changed the source system from SDK_PCM (SQL Server) to BigQuery (#4749200).
In Feb 2023, the Po2 project required segregating Eco/Sco data; the company master data file was therefore modified to add C_AUTHMA (authorization scope) to the openhub OH_PCM_05.
In Dec 2023, the Po2 spin-off project introduced a new credit controlling area (CCA) = SYEN. The DTP of DBFIAR20 had to be extended with CCA = SOLV and SYEN, and the master data of c_compcde and c_custid needed the c_scope field added in order to calculate interco in BQ.
Roles & Access
Aligned on FIAR Roles & Access
Dataflow overview
...
The dataflow can be divided into 3 main steps:
...
- Check that no other run is in progress (a), then extract data from BW to the PCM SQL database:
- Extract master data in full mode to csv files through openhub (b)
- Load FIAR and CAMS data into specific DSOs built for the PCM project (DBFIAR20 and DBFIAR21), then extract to csv through openhub
- Launch the ksh script (c) to execute the treatment on the PCM SQL database side (through OS Command (d))
- Treatment on the PCM SQL database side
- Load data from the PCM SQL database back to BW: when the OS command is finished (meaning that the treatment is finished), load the status table (e); if the code in the status table is 0, load PCM data into 2 specific DSOs, DPFIAR10 and DPFIAR11, else regenerate the file or put the process chain in error
DBFIAR20 loading process
DPFIAR10 and DPFIAR11
- Load the data from the file via Talend
- Treatment on PCM GCP
- Full load with deletion of DPFIAR13 from BigQuery with the new data source DTS_FI_PCM_01 via PC_FI_PCM_10 (SQL Server was replaced by BigQuery)
BW to Talend
BigQuery to BW
DPFIAR13 data loading
Data source DTS_FI_PCM_01 reads from the remote source GBQ_Predict_Credit, table core_perimeter_future_bw (GCP project predict-credit-mgt-v2-prod).
The report is retrieved from BigQuery after calculation by the Data Science team.
With version 2.0, BW retrieves only 3 key figures:
PRED_NOT_PIM → K_PFNPI
PROBA_NOT_PIM → K_PPNPI
AMOUNT_EUR → 0DEB_CRE_LC
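The BigQuery-to-BW field mapping above can be sketched as a simple rename step. This is an illustrative sketch only; the helper function and the row shape are assumptions, not part of the actual DTP transformation.

```python
# Mapping of BigQuery columns (core_perimeter_future_bw) to BW key figures,
# as documented for PCM 2.0. The rename helper itself is illustrative.
BQ_TO_BW = {
    "PRED_NOT_PIM": "K_PFNPI",
    "PROBA_NOT_PIM": "K_PPNPI",
    "AMOUNT_EUR": "0DEB_CRE_LC",
}

def rename_row(bq_row: dict) -> dict:
    """Keep only the three key figures BW loads, renamed to BW InfoObjects."""
    return {bw: bq_row[bq] for bq, bw in BQ_TO_BW.items() if bq in bq_row}
```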
Note: DPFIAR14 is obsolete
Architecture Overview
...
- Extraction of BW data (master and transactional data) into flat files (csv with "|" separator)
- Copy and loading of these files into a SQL database
- Calculation of predictive information on the SQL side to produce two tables, which define Priority per Amount and Priority per Payer
- Loading of these data into BW to be used in standard reporting
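The file exchange above uses pipe-delimited csv. A minimal sketch of reading such a file in Python; the sample columns and values are hypothetical, not the actual openhub layout:

```python
import csv
import io

# Sample content in the openhub export format: "|" separated, one record per
# line. The column names here are made up for illustration.
sample = "COMPANY|CUSTOMER|AMOUNT_EUR\nS001|C42|1250.00\n"

def read_pipe_csv(text: str) -> list[dict]:
    """Parse a pipe-delimited csv export into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text), delimiter="|"))
```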
Program (a)
...
Two programs update the TVARVC variable Z_BW_PCM_PC_STATUS.
The purpose of these programs is to avoid launching the process chain twice if the previous run is still active.
If the process chain is already running, the new run will fail, requiring deeper analysis (normal duration should be less than one hour; only the init run should exceed this duration).
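The guard above behaves like a semaphore on the TVARVC variable. A minimal Python sketch of the pattern, using an in-memory dict as a stand-in for the variable store; the helper names and status values are illustrative assumptions:

```python
# In-memory stand-in for the TVARVC store; in BW this is the variable
# Z_BW_PCM_PC_STATUS read and updated by the two ABAP programs.
tvarvc = {"Z_BW_PCM_PC_STATUS": "FREE"}

def start_run(store: dict) -> bool:
    """Mark a run as started, or refuse if a run is already in progress."""
    if store["Z_BW_PCM_PC_STATUS"] == "RUNNING":
        return False  # previous run still active: the process chain fails
    store["Z_BW_PCM_PC_STATUS"] = "RUNNING"
    return True

def end_run(store: dict) -> None:
    """Release the semaphore at the end of the process chain."""
    store["Z_BW_PCM_PC_STATUS"] = "FREE"
```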
Openhub
...
Initially, 7 openhubs had been created for master data and 2 for transactional data.
...
Updated on 12 Mar 2021: 18 Openhub (2 Transactional and 16 Master data)
| Open Hub | Open Hub Name | Source of Open Hub | Seq in PC | Process Chain | Filename | Detail of file |
|---|---|---|---|---|---|---|
| OH_PCM_01 | PCM V2: DBFIAR20 | DBFIAR20 | 17 | PC_FI_PCM_V2 | dbfiar20_delta.csv | FIAR: Line Item with Delta - PCM - All systems |
| OH_PCM_02 | PCM: GL_ACCOUNT | ATTRIBUTES 0GL_ACCOUNT | 1 | PC_FI_PCM_01 | gl_account_full.csv | Attribute of GL account |
| OH_PCM_03 | PCM: C_CUSTID | ATTRIBUTES C_CUSTID | 2 | PC_FI_PCM_01 | c_custid_full.csv | Attribute of Customer |
| OH_PCM_04 | PCM: COUNTRY | ATTRIBUTES 0COUNTRY | 6 | PC_FI_PCM_01 | country_full.csv | Attribute of Country |
| OH_PCM_05 | PCM: C_COMPCDE | ATTRIBUTES C_COMPCDE | 3 | PC_FI_PCM_01 | c_compcde_full.csv | Attribute of Company |
| OH_PCM_06 | PCM: C_CST_CA2 | ATTRIBUTES C_CST_CA2 | 4 | PC_FI_PCM_01 | c_cst_ca2_full.csv | Attribute of Customer credit control area |
| OH_PCM_07 | PCM: TCURR | DTS_TCURR | 5 | PC_FI_PCM_01 | tcurr_full.csv | Attribute of Currency |
| OH_PCM_08 | PCM: DBFIAR21 | DBFIAR21 | 18 | PC_FI_PCM_01 | dbfiar21_delta.csv | FIAR: Credit blocked - PCM - All systems |
| OH_PCM_09 | PCM: G_CWWE01 | ATTRIBUTES 0G_CWWE01 | 7 | PC_FI_PCM_01 | g_cwwe01_full.csv | Attribute of sub activity (IECRA) |
| OH_PCM_10 | PCM: 0CLM_CLSP Texts | TEXTS 0CSM_USER | 8 | PC_FI_PCM_01 | 0CLM_CLSP_TEXT.csv | Text of Collection specialist |
| OH_PCM_11 | PCM: C_COMPCDE Texts | TEXTS C_COMPCDE | 9 | PC_FI_PCM_01 | c_compcde_text.csv | Text of Company |
| OH_PCM_12 | PCM: C_SALEMP Texts | TEXTS C_SALEMP | 10 | PC_FI_PCM_01 | c_salemp_text.csv | Text of Sales employee |
| OH_PCM_13 | PCM: C_PMNTTRM Texts | TEXTS C_PMNTTRM | 11 | PC_FI_PCM_01 | C_PMNTTRM_TEXT.csv | Text of payment term |
| OH_PCM_14 | PCM: CPFCTR1_2 Texts | TEXTS CPFCTR1_2 | 12 | PC_FI_PCM_01 | CPFCRT1_2_TEXT.csv | Text of GBU |
| OH_PCM_15 | PCM: CPFCTR2_2 Texts | TEXTS C_FACUBU | 13 | PC_FI_PCM_01 | CPFCTR2_2_TEXT.csv | Text of BFC group of activity |
| OH_PCM_16 | PCM: 0G_CWWE01 Texts | TEXTS 0G_CWWE01 | 14 | PC_FI_PCM_01 | 0G_CWWE01_TEXT.csv | Text of sub activity (IECRA) |
| OH_PCM_17 | PCM: C_PM_MTHD Texts | TEXTS C_PM_MTHD | 15 | PC_FI_PCM_01 | C_PM_MTHD_TEXT.csv | Text of payment method |
| OH_PCM_18 | PCM: 0REPR_GROUP Texts | TEXTS 0REPR_GROUP | 16 | PC_FI_PCM_01 | 0REPR_GROUP_TEXT.csv | Text of credit management representatives group |
All master data openhubs are loaded in full mode. All extract data from master data, except the TCURR one, which is connected to a datasource.
...
All openhubs use logical file names defined through the FILE transaction. All files are stored in the following folder
KSH script (c) → Obsolete
The KSH scripts have been built by the D3S/Adagio team.
They are stored in the following folder
pacm-ping.ksh is the initial test script for the PCM project (not used anymore).
pacm-reset_db.ksh allows resetting the SQL database from BW.
pacm-run.ksh is the script which launches the treatment on the SQL database side. It generates a log in the following folder: /exploit/TA/tmp/${STAMP}_log_run_$server.log
This script expects two parameters:
- server: WDCPAAP21.eua.solvay.com (in WBP, see OS command for more detail) or WDCPAAT21.eua.solvay.com (in WBQ / WBD)
- user: PA_SVC_PRD (in WBP, see OS command for more detail) or PA_SVC_TST (in WBQ / WBD)
OS Command (d) → Obsolete
(created in SM49)
The ZPCM_RUN command launches a .bat file which loads the extracted files into the D3S SQL database. It is included in the process chain through the following variant:
This variant has been modified directly in WBP to launch the treatment on the SQL production server.
The ZPCM_RESET_DB command resets the database. It is not included in any process chain.
Status table (e)
The status table is loaded into the DSO DPFIAR12.
C_COMMAND should be used to run "ZPCM_RUN" or "ZPCM_RESET" depending on the value sent by the SQL database. This functionality is not used: by default the process chain always launches the RUN. If the RESET is required, it needs to be launched manually.
C_CODSTAT can take 3 different values, which are checked in the process chain:
- 0: "normal" process
- 1: problem with the generated file; the process chain will try to generate the file again (only once)
- 2: other error; the process chain will fail and deeper analysis is required
These values are checked in the process chain through an "Enhanced decision" step.
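The branching on C_CODSTAT can be sketched as a small decision function. The action strings below are hypothetical labels for the process chain outcomes, not actual step names:

```python
# Illustrative sketch of the C_CODSTAT branching checked by the
# "Enhanced decision" in the process chain.
def decide(codstat: int, retries_done: int = 0) -> str:
    """Map the status code from DPFIAR12 to the next process chain action."""
    if codstat == 0:
        return "load DPFIAR10/DPFIAR11"   # normal process
    if codstat == 1 and retries_done == 0:
        return "regenerate file"          # file problem: retried only once
    return "fail"                         # code 2, or code 1 after one retry
```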
Dependencies with other applications
PCM data are loaded on top of FIAR data.
Data loadings
Info providers and objects loaded
...
PC_FI_PCM_10 deletes the DSO, then loads data into DPFIAR13 from BigQuery and DPFIAR14 from SDK_PCM, via job RSPROCESS, on weekdays at 11:00 AM CET.
PC_FI_PCM_V2 generates 1 file, /exploit/BW/PREDICTCM/V2/dbfiar20_<DATE>_<TIME>_delta.csv, from an Open Hub, via job RSPROCESS, 3 times per weekday (7 AM, 1 PM, 7 PM CET).
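The V2 file name above follows a <DATE>_<TIME> pattern. A minimal sketch of building it, assuming DATE is YYYYMMDD and TIME is HHMMSS; the exact formats are not documented here:

```python
from datetime import datetime

def delta_filename(now: datetime) -> str:
    """Build the V2 openhub file name; the date/time formats are assumptions."""
    return f"dbfiar20_{now:%Y%m%d}_{now:%H%M%S}_delta.csv"
```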
Target folder
Loading frequency
PC_FI_PCM_01 loads via job RSPROCESS on workdays at 10:00 AM CET.
PC_FI_PCM_10 loads via job RSPROCESS on workdays at 11:00 AM CET.
PC_FI_PCM_V2 loads via job RSPROCESS on workdays at 7 AM, 1 PM and 7 PM CET.
The 7 AM run is scheduled by the process chain; the 1 PM and 7 PM runs are scheduled by program via the job names BI_PROCESS_TRIGGER_1PM and BI_PROCESS_TRIGGER_7PM.
Average performance
| Key Figure | Estimation |
|---|---|
| ~ Average Process Chain Runtime | |
| ~ Average nb of rows loaded per load | |
| ~ Total nb of rows loaded (if full) | |
| ~ Average Runtime for 10k lines |
...
Known bugs
Recurring procedure
If the service account bqtobw@predict-credit-mgt-v2-dev.iam.gserviceaccount.com in Google Cloud Platform is expired, then the configuration of the remote source GBQ_Predict_Credit in BW/HANA should be adapted accordingly:
- Copy the new JSON key file to the folder defined in gbqadapter_prod.properties on the DP agent server (acew1dhcahca1)
- Change the KeyFilePath to the folder of the new JSON file (the JSON file is provided by the DataOps team: dataops@solvay.com)
The remote source configuration in BW does not need to change.
The properties file on the DP agent server needs to change only if the project name or the location of the JSON file changes.
The new JSON file name must match the details given in the properties file at KeyFilePath.
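A sketch of the relevant entry in gbqadapter_prod.properties. Only the KeyFilePath key is named in this document; the path below is purely illustrative:

```properties
# Hypothetical example: the path is illustrative, not the actual folder.
KeyFilePath=/path/to/new-service-account-key.json
```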