...
- Define which payer is a priority based on its history and the due amount.
Tool Leader: David TONDA
IT leader of the application: Guillaume THEVENET
...
PROJECT - PCM
COMP_PCM
History
In Mar 2020, the data flow was updated to PCM 2.0.
In 2021, BW changed the source system from SDK_PCM (SQL) to BigQuery (#4749200).
In Feb 2023, the Po2 project required segregating Eco/Sco data; the company master data file was therefore modified to add C_AUTHMA (authorization scope) to the openhub OH_PCM_05.
In Dec 2023, the Po2 spin-off project introduced a new credit controlling area (CCA) = SYEN. The DTP of DBFIAR20 had to be extended with CCA = SOLV and SYEN, and the C_SCOPE field had to be added to the master data of C_COMPCDE and C_CUSTID so that interco can be calculated in BQ.
Roles & Access
Aligned on FIAR Roles & Access
Dataflow overview
Summary
The dataflow can be divided into 3 main steps:
- Check that no other run is in progress (a), then extract data from BW to the PCM SQL database
- Extract master data in full mode to csv files through openhubs (b)
- Load FIAR and CAMS data into DSOs built specifically for the PCM project (DBFIAR20 and DBFIAR21), then extract them to csv through openhubs
- Launch a ksh script (c) to execute the treatment on the PCM SQL database side (through an OS command (d))
- Treatment on the PCM SQL database side
- Load data from the PCM SQL database back to BW
- When the OS command is finished (meaning the treatments are finished), load the status table (e)
- If the code in the status table is 0, load the PCM data into 2 dedicated DSOs, DPFIAR10 and DPFIAR11; otherwise regenerate the files or set the process chain in error
DBFIAR20 loading process
Architecture Overview
Reporting documentation drive folder:
- Load the data from the files via Talend
- Treatment on the PCM GCP side
- Full load with deletion of DPFIAR13 from BigQuery with the new data source DTS_FI_PCM_01, by PC_FI_PCM_10 (SQL Server was replaced by BigQuery)
BW to Talend
Big Query to BW
DPFIAR13 data loading
The data source DTS_FI_PCM_01 reads from the remote source GBQ_Predict_Credit, table core_perimeter_future_bw (GCP project predict-credit-mgt-v2-prod).
This is the report retrieved back from BigQuery after the Data Science calculation.
With version 2.0, BW gets only 3 key figures:
PRED_NOT_PIM → K_PFNPI
PROBA_NOT_PIM → K_PPNPI
AMOUNT_EUR → 0DEB_CRE_LC
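For reference, the three-column mapping above can be expressed as a simple lookup. This is a sketch only: the column and key-figure names come from the table above, while the `to_bw_fields` helper is hypothetical.

```python
# Mapping of BigQuery output columns (table core_perimeter_future_bw)
# to the 3 BW key figures loaded since PCM 2.0, as listed above.
BQ_TO_BW_KEYFIGURES = {
    "PRED_NOT_PIM": "K_PFNPI",
    "PROBA_NOT_PIM": "K_PPNPI",
    "AMOUNT_EUR": "0DEB_CRE_LC",
}

def to_bw_fields(bq_row: dict) -> dict:
    """Keep only the three key figures BW loads, renamed to BW objects."""
    return {bw: bq_row[bq] for bq, bw in BQ_TO_BW_KEYFIGURES.items() if bq in bq_row}
```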
Note: DPFIAR14 is obsolete.
Architecture Overview
Obsolete
SQL Server was replaced by BigQuery; after the files are generated, they are loaded by Talend by the Data Engineering team: https://drive.google.com/drive/folders/0B0qn89R0RGdqYkZZOFZyYXlXVkE
Functional and Technical rules on Workbench + Reporting
...
- Extraction of BW data (master and transactional data) into flat files (csv with "|" separator)
- Copy and loading of these files into a SQL database
- Calculation of predictive information on the SQL side to produce two tables which define Priority per Amount and Priority per Payer
- Loading of these data back into BW to be used in standard reporting
Program (a)
(created in SE38)
Two programs update the TVARVC variable Z_BW_PCM_PC_STATUS.
The purpose of these programs is to avoid launching the process chain twice while a previous run is still active.
If the process chain is already running, it will fail, requiring deeper analysis (the normal duration should be less than one hour; only the init run should exceed it).
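The guard logic of these two programs can be illustrated outside ABAP as follows. This is a sketch under assumptions: Z_BW_PCM_PC_STATUS is the real TVARVC variable, but the function names, the "RUNNING" value, and the in-memory dict standing in for table TVARVC are all hypothetical.

```python
class ProcessChainAlreadyRunning(Exception):
    """Raised when Z_BW_PCM_PC_STATUS indicates an active run."""

# In the real system the flag lives in table TVARVC; a dict stands in here.
tvarvc = {"Z_BW_PCM_PC_STATUS": ""}

def set_running():
    """Program 1: fail if a previous run is still active, else mark RUNNING."""
    if tvarvc["Z_BW_PCM_PC_STATUS"] == "RUNNING":
        raise ProcessChainAlreadyRunning("previous run still active - deeper analysis required")
    tvarvc["Z_BW_PCM_PC_STATUS"] = "RUNNING"

def clear_running():
    """Program 2: release the flag at the end of the chain."""
    tvarvc["Z_BW_PCM_PC_STATUS"] = ""
```

Failing instead of silently skipping forces an operator to investigate why the previous run exceeded the expected one-hour duration.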
Openhub (b)
7 openhubs have been created for master data and 2 for transactional data.
Openhub
Updated on 12 Mar 2021: 18 openhubs (2 transactional and 16 master data).
| Open Hub | Open Hub Name | Source of Open Hub | Process Chain | Filename | Detail of file |
|---|---|---|---|---|---|
| OH_PCM_21 | PCMV2: DBFIAR20 | DBFIAR20 | PC_FI_PCM_V2 | dbfiar20_delta.csv | FIAR: Line Item with Delta - PCM - All systems |
| OH_PCM_02 | PCM: GL_ACCOUNT | ATTRIBUTES 0GL_ACCOUNT | PC_FI_PCM_01 | gl_account_full.csv | Attribute of GL account |
| OH_PCM_03 | PCM: C_CUSTID | ATTRIBUTES C_CUSTID | PC_FI_PCM_01 | c_custid_full.csv | Attribute of Customer |
| OH_PCM_04 | PCM: COUNTRY | ATTRIBUTES 0COUNTRY | PC_FI_PCM_01 | country_full.csv | Attribute of Country |
| OH_PCM_05 | PCM: C_COMPCDE | ATTRIBUTES C_COMPCDE | PC_FI_PCM_01 | c_compcde_full.csv | Attribute of Company |
| OH_PCM_06 | PCM: C_CST_CA2 | ATTRIBUTES C_CST_CA2 | PC_FI_PCM_01 | c_cst_ca2_full.csv | Attribute of Customer credit control area |
| OH_PCM_07 | PCM: TCURR | DTS_TCURR | PC_FI_PCM_01 | tcurr_full.csv | Attribute of Currency |
| OH_PCM_08 | PCM: DBFIAR21 | DBFIAR21 | PC_FI_PCM_01 | dbfiar21_delta.csv | FIAR: Credit blocked - PCM - All systems |
| OH_PCM_09 | PCM: G_CWWE01 | ATTRIBUTES 0G_CWWE01 | PC_FI_PCM_01 | g_cwwe01_full.csv | Attribute of sub activity (IECRA) |
| OH_PCM_10 | PCM: 0CLM_CLSP Texts | TEXTS 0CSM_USER | PC_FI_PCM_01 | 0CLM_CLSP_TEXT.csv | Text of Collection specialist |
| OH_PCM_11 | PCM: C_COMPCDE Texts | TEXTS C_COMPCDE | PC_FI_PCM_01 | c_compcde_text.csv | Text of Company |
| OH_PCM_12 | PCM: C_SALEMP Texts | TEXTS C_SALEMP | PC_FI_PCM_01 | c_salemp_text.csv | Text of Sales employee |
| OH_PCM_13 | PCM: C_PMNTTRM Texts | TEXTS C_PMNTTRM | PC_FI_PCM_01 | C_PMNTTRM_TEXT.csv | Text of payment term |
| OH_PCM_14 | PCM: CPFCTR1_2 Texts | TEXTS CPFCTR1_2 | PC_FI_PCM_01 | CPFCRT1_2_TEXT.csv | Text of GBU |
| OH_PCM_15 | PCM: CPFCTR2_2 Texts | TEXTS C_FACUBU | PC_FI_PCM_01 | CPFCTR2_2_TEXT.csv | Text of BFC group of activity |
| OH_PCM_16 | PCM: 0G_CWWE01 Texts | TEXTS 0G_CWWE01 | PC_FI_PCM_01 | 0G_CWWE01_TEXT.csv | Text of sub activity (IECRA) |
| OH_PCM_17 | PCM: C_PM_MTHD Texts | TEXTS C_PM_MTHD | PC_FI_PCM_01 | C_PM_MTHD_TEXT.csv | Text of payment method |
| OH_PCM_18 | PCM: 0REPR_GROUP Texts | TEXTS 0REPR_GROUP | PC_FI_PCM_01 | 0REPR_GROUP_TEXT.csv | Text of credit management representatives group |
...
All master data openhubs are loaded in full mode. All of them extract from master data objects, except the TCURR one, which is connected to a datasource.
All transactional data openhubs are loaded in delta mode (note: delta is not possible with a multiprovider).
All openhubs use a "|" separator because some fields already contain ";" in their values and the openhub does not encapsulate (quote) data.
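The separator choice can be reproduced with Python's csv module. A sketch: C_CUSTID is a real object from the table above, but the NAME column and the sample values are illustrative.

```python
import csv
import io

# Example rows: the NAME value contains ";", which would break a
# semicolon-separated file - hence the "|" separator.
rows = [["C_CUSTID", "NAME"], ["4711", "Acme; Chemicals"]]

# Write the extract the way the openhub does: "|"-separated; no quoting is
# emitted as long as no value contains "|" itself.
buf = io.StringIO()
csv.writer(buf, delimiter="|", lineterminator="\n").writerows(rows)
extract = buf.getvalue()

# Reading it back with the same delimiter restores the fields intact.
parsed = list(csv.reader(io.StringIO(extract), delimiter="|"))
```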
...
All openhubs use logical file names defined through the FILE transaction. All files are stored in the following folder:
KSH script (c)
The KSH scripts have been built by the D3S/adagio team.
They are stored in the following folder:
pacm-ping.ksh was the initial test for the PCM project (not used anymore).
pacm-reset_db.ksh allows resetting the SQL database from BW.
pacm-run.ksh is the script which launches the treatment on the SQL database side. It generates a log in the following folder: /exploit/TA/tmp/${STAMP}_log_run_$server.log
This script expects two parameters:
- server
- WDCPAAP21.eua.solvay.com (in WBP, see OS command for more detail)
- WDCPAAT21.eua.solvay.com (in WBQ / WBD)
- and user
- PA_SVC_PRD (in WBP, see OS command for more detail)
- PA_SVC_TST (in WBQ / WBD)
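The per-environment invocation of pacm-run.ksh can be sketched as follows. The server and user values come from the lists above; the wrapper function, the environment keys, and the bare script name (the real call goes through the OS command and a full path) are assumptions.

```python
def build_pacm_run_command(environment: str) -> list:
    """Assemble the pacm-run.ksh call for a BW environment (WBP/WBQ/WBD).
    Parameter values are the documented server and user per environment."""
    params = {
        "WBP": ("WDCPAAP21.eua.solvay.com", "PA_SVC_PRD"),
        "WBQ": ("WDCPAAT21.eua.solvay.com", "PA_SVC_TST"),
        "WBD": ("WDCPAAT21.eua.solvay.com", "PA_SVC_TST"),
    }
    server, user = params[environment]
    return ["pacm-run.ksh", server, user]
```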
OS Command (d)
(created in SM49)
The ZPCM_RUN command launches a .bat file which loads the extracted files into the D3S SQL database. It is included in the process chain through the following variant:
This variant has been modified directly in WBP to launch the treatment on the SQL production server.
The ZPCM_RESET_DB command resets the database. It is not included in the process chain.
Status table (e)
The status table is loaded into the DSO DPFIAR12.
C_COMMAND was meant to trigger either "ZPCM_RUN" or "ZPCM_RESET" depending on the value sent by the SQL database. This functionality is not used; by default, the process chain always launches the RUN. If a RESET is required, it needs to be launched manually.
C_CODSTAT can take 3 different values, which are checked in the process chain:
- 0: "normal" process
- 1: problem with a generated file; the process chain will try to generate the file again (only once)
- 2: other error; the process chain will fail and deeper analysis is required
These values are checked in the process chain through an "Enhanced decision".
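The "Enhanced decision" on C_CODSTAT can be paraphrased in code. A sketch: the function name, the `retried` flag, and the action strings are illustrative; only the three status codes and their meanings come from the documentation.

```python
def decide_next_step(c_codstat: int, retried: bool = False) -> str:
    """Mirror the decision on C_CODSTAT described above."""
    if c_codstat == 0:
        # "Normal" process: continue with the PCM loads.
        return "load PCM data into DPFIAR10 / DPFIAR11"
    if c_codstat == 1 and not retried:
        # File problem: the chain regenerates the file, but only once.
        return "regenerate files (one retry only)"
    # Code 2, or a second failure with code 1: stop and investigate.
    return "fail process chain - deeper analysis required"
```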
Dependencies with other applications
PCM data are loaded on top of FIAR data.
Data loadings
Info providers and objects loaded
PC_FI_PCM_07 (main PC), modified by ticket 5915436
PC_FI_PCM_01 generates 18 files from openhubs, by job RSPROCESS on weekdays at 10:00 AM CET
PC_FI_PCM_11 is the same as PC_FI_PCM_01, used as a fallback in case 01 cannot generate the files completely
PC_FI_PCM_10 deletes the DSOs, then loads data into DPFIAR13 from BigQuery and into DPFIAR14 from SDK_PCM, by job RSPROCESS on weekdays at 11:00 AM CET
PC_FI_PCM_V2 generates 1 file: /exploit/BW/PREDICTCM/V2/dbfiar20_<DATE>_<TIME>_delta.csv from an openhub, by job RSPROCESS 3 times per weekday (7 AM, 1 PM, 7 PM CET)
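The timestamped file name produced by PC_FI_PCM_V2 can be sketched as follows. The path comes from the documentation; the exact `<DATE>`/`<TIME>` formats (YYYYMMDD / HHMMSS) and the helper name are assumptions.

```python
from datetime import datetime

def dbfiar20_filename(now: datetime) -> str:
    """Build the delta file name produced by PC_FI_PCM_V2.
    The <DATE>/<TIME> formats are assumed to be YYYYMMDD and HHMMSS."""
    return f"/exploit/BW/PREDICTCM/V2/dbfiar20_{now:%Y%m%d}_{now:%H%M%S}_delta.csv"
```

One file per run (7 AM, 1 PM, 7 PM) therefore gets a unique name, so a late run cannot overwrite an earlier extract.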
Target folder
Loading frequency
PC_FI_PCM_01 is loaded by job RSPROCESS on workdays at 10:00 AM CET
PC_FI_PCM_10 is loaded by job RSPROCESS on workdays at 11:00 AM CET
PC_FI_PCM_V2 is loaded by job RSPROCESS on workdays at 7 AM, 1 PM and 7 PM CET
It is scheduled by the PC at 7 AM and scheduled by program on job names BI_PROCESS_TRIGGER_1PM and BI_PROCESS_TRIGGER_7PM
Average performance
...
| Key Figure | Estimation |
|---|---|
| ~ Average Process Chain Runtime | |
| ~ Average nb of rows loaded per load | |
| ~ Total nb of rows loaded (if full) | |
| ~ Average Runtime for 10k lines | |
Record Keeping
The DSOs are loaded with the full history, but only the last 3 years (today - 3 years) are sent to the SQL server.
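The "today - 3 years" cutoff can be sketched as follows. The function names and the leap-day handling (29 Feb clamped to 28 Feb) are assumptions; only the 3-year retention rule comes from the documentation.

```python
from datetime import date

def retention_cutoff(today: date) -> date:
    """Records on or after this date are sent to the SQL server;
    older records stay only in the BW DSOs."""
    try:
        return today.replace(year=today.year - 3)
    except ValueError:  # 29 Feb when the target year is not a leap year
        return today.replace(year=today.year - 3, day=28)

def is_sent_to_sql(posting_date: date, today: date) -> bool:
    return posting_date >= retention_cutoff(today)
```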
...
Queries End User Documentation
...
Main queries
Main functionalities
...
Known bugs
Recurring procedure
If the service account bqtobw@predict-credit-mgt-v2-dev.iam.gserviceaccount.com in Google Cloud Platform has expired, the configuration of the remote source GBQ_Predict_Credit in BW/HANA must be adapted accordingly:
- Copy the new JSON file to the folder defined in gbqadapter_prod.properties on the DP agent server (acew1dhcahca1)
- Change the KeyFilePath to the folder of the new JSON file (JSON file provided by the DataOps team: dataops@solvay.com)
The remote source configuration in BW does not need to be changed.
The properties file on the DP agent server needs to be changed only if the project name or the location of the JSON file changes.
The new JSON file must match the details given in the properties file at KeyFilePath.
Planned Evolution
...