Access Management
List of application roles and menu roles, with an explanation wherever several application roles carry special rules.
Aligned on FIAR Roles & Access
| Role Code | Role Description | Explanation |
|---|---|---|
List of authorization objects mandatory for the application.
| Authorization object | Explanation |
|---|---|
DataFlow
Overview
The dataflow can be divided into 3 main steps:
- Check that no other run is in progress, then extract data from BW to PCM:
  - Extract master data in full mode to csv files through openhub
  - Load FIAR and CAMS data into DSOs built for the PCM project (DBFIAR20 and DBFIAR21), then extract them to csv through openhub
- Load the data from the files via Talend, then process it on PCM GCP
- Load DPFIAR13 in full mode with deletion from BigQuery, via the new data source DTS_FI_PCM_01 in process chain PC_FI_PCM_10 (SQL Server was replaced by BigQuery)
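The initial check that no other run is in progress could be sketched, for illustration only, with a simple lock file; the path and function names below are hypothetical and not the actual BW mechanism:

```python
import os
import tempfile

# Hypothetical lock-file location; the real process chain uses BW's own scheduling.
LOCK_FILE = os.path.join(tempfile.gettempdir(), "pcm_extract.lock")

def acquire_run_lock():
    """Return True if no other run is in progress, atomically claiming the lock."""
    try:
        fd = os.open(LOCK_FILE, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False  # another extraction run holds the lock

def release_run_lock():
    """Release the lock so the next scheduled run can start."""
    if os.path.exists(LOCK_FILE):
        os.remove(LOCK_FILE)
```

A second run started while the lock exists simply exits instead of extracting in parallel.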
BW to Talend
DBFIAR20 loading process
Big Query to BW
DPFIAR13 data loading
Data source DTS_FI_PCM_01 reads from remote source GBQ_Predict_Credit, table core_perimeter_future_bw (GCP project predict-credit-mgt-v2-prod)
The report data is sent back from BigQuery after being calculated by Data Science
With version 2.0, BW gets only 3 key figures
PRED_NOT_PIM → K_PFNPI
PROBA_NOT_PIM → K_PPNPI
AMOUNT_EUR → 0DEB_CRE_LC
Note: DPFIAR14 is obsolete
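The key-figure mapping above is a plain column rename; the helper below is only an illustration (in reality the renaming is done inside the BW transformation, and `map_record` is a hypothetical name):

```python
# Mapping of BigQuery result columns to BW key figures (PCM 2.0), per the table above.
FIELD_MAP = {
    "PRED_NOT_PIM": "K_PFNPI",
    "PROBA_NOT_PIM": "K_PPNPI",
    "AMOUNT_EUR": "0DEB_CRE_LC",
}

def map_record(row):
    """Rename BigQuery columns to their BW key-figure names; unknown columns pass through."""
    return {FIELD_MAP.get(key, key): value for key, value in row.items()}
```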
Architecture Overview
Obsolete
SQL Server was replaced by BigQuery; once the files are generated, they are loaded by Talend by the Data Engineering team
Objective of the application
Provide predictive information on Credit Management.
- Define which payer is a priority based on its history and the due amount.
Tool Leader: David TONDA
IT leader of the application: Guillaume THEVENET
Name of project: PCM Predictive Credit Management
PMO Project: 6958 Big Data for Credit Management
Reporting Coordinator: David TONDA
Usage information
Number of users: tbd
Critical period: none
Geographical perimeter: worldwide
InfoArea:
- All objects are stored in FIAR infoarea: IA_FMCO_FIAR
Process chain Display Component:
- 134 - PROJECT - PCM
COMP_PCM
History
In Mar 2020, the data flow was updated to PCM 2.0
In 2021, BW changed the source system from SDK_PCM (SQL) to BigQuery (#4749200)
In Feb 2023, the Po2 project required segregating Eco/Sco data; the company master data file was therefore modified to add C_AUTHMA (authorization scope) to the openhub OH_PCM_05
Technical Rules on Workbench
The whole process can be summarized as follows:
- Extraction of BW data (master and transactional data) into flat files (csv with "|" separator)
- Copy and load of these files into a SQL database
- Calculation of predictive information on the SQL side to produce two tables that define Priority per Amount & Priority per Payer
- Loading of these data back into BW to be used in standard reporting
Openhub
Updated on 12 Mar 2021: 18 Openhubs (2 transactional and 16 master data)
| Open Hub | Open Hub Name | Source of Open Hub | Process Chain | Filename | Detail of file |
|---|---|---|---|---|---|
| OH_PCM_21 | PCMV2: DBFIAR20 | DBFIAR20 | PC_FI_PCM_V2 | dbfiar20_delta.csv | FIAR: Line Item with Delta - PCM - All systems |
| OH_PCM_02 | PCM: GL_ACCOUNT | ATTRIBUTES 0GL_ACCOUNT | PC_FI_PCM_01 | gl_account_full.csv | Attribute of GL account |
| OH_PCM_03 | PCM: C_CUSTID | ATTRIBUTES C_CUSTID | PC_FI_PCM_01 | c_custid_full.csv | Attribute of Customer |
| OH_PCM_04 | PCM: COUNTRY | ATTRIBUTES 0COUNTRY | PC_FI_PCM_01 | country_full.csv | Attribute of Country |
| OH_PCM_05 | PCM: C_COMPCDE | ATTRIBUTES C_COMPCDE | PC_FI_PCM_01 | c_compcde_full.csv | Attribute of Company |
| OH_PCM_06 | PCM: C_CST_CA2 | ATTRIBUTES C_CST_CA2 | PC_FI_PCM_01 | c_cst_ca2_full.csv | Attribute of Customer credit control area |
| OH_PCM_07 | PCM: TCURR | DTS_TCURR | PC_FI_PCM_01 | tcurr_full.csv | Attribute of Currency |
| OH_PCM_08 | PCM: DBFIAR21 | DBFIAR21 | PC_FI_PCM_01 | dbfiar21_delta.csv | FIAR: Credit blocked - PCM - All systems |
| OH_PCM_09 | PCM: G_CWWE01 | ATTRIBUTES 0G_CWWE01 | PC_FI_PCM_01 | g_cwwe01_full.csv | Attribute of sub activity (IECRA) |
| OH_PCM_10 | PCM: 0CLM_CLSP Texts | TEXTS 0CSM_USER | PC_FI_PCM_01 | 0CLM_CLSP_TEXT.csv | Text of Collection specialist |
| OH_PCM_11 | PCM: C_COMPCDE Texts | TEXTS C_COMPCDE | PC_FI_PCM_01 | c_compcde_text.csv | Text of Company |
| OH_PCM_12 | PCM: C_SALEMP Texts | TEXTS C_SALEMP | PC_FI_PCM_01 | c_salemp_text.csv | Text of Sales employee |
| OH_PCM_13 | PCM: C_PMNTTRM Texts | TEXTS C_PMNTTRM | PC_FI_PCM_01 | C_PMNTTRM_TEXT.csv | Text of payment term |
| OH_PCM_14 | PCM: CPFCTR1_2 Texts | TEXTS CPFCTR1_2 | PC_FI_PCM_01 | CPFCRT1_2_TEXT.csv | Text of GBU |
| OH_PCM_15 | PCM: CPFCTR2_2 Texts | TEXTS C_FACUBU | PC_FI_PCM_01 | CPFCTR2_2_TEXT.csv | Text of BFC group of activity |
| OH_PCM_16 | PCM: 0G_CWWE01 Texts | TEXTS 0G_CWWE01 | PC_FI_PCM_01 | 0G_CWWE01_TEXT.csv | Text of sub activity (IECRA) |
| OH_PCM_17 | PCM: C_PM_MTHD Texts | TEXTS C_PM_MTHD | PC_FI_PCM_01 | C_PM_MTHD_TEXT.csv | Text of payment method |
| OH_PCM_18 | PCM: 0REPR_GROUP Texts | TEXTS 0REPR_GROUP | PC_FI_PCM_01 | 0REPR_GROUP_TEXT.csv | Text of credit management representatives group |
All master data openhubs are loaded in full mode. All extract directly from master data, except TCURR, which is connected to a datasource.
All transactional data openhubs are loaded in delta mode (remark: delta is not possible with a multiprovider).
All openhubs use a "|" separator because some fields already contain ";" in their values and the openhub doesn't encapsulate data.
All openhubs use logical file names defined through the FILE transaction. All files are stored in the following folder
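For illustration, producing a pipe-separated, unquoted extract in the same spirit as the openhub output might look like this in Python (`write_extract` and the sample fields are hypothetical):

```python
import csv
import io

def write_extract(rows, header):
    """Write rows as "|"-separated csv with no quoting, as the openhub files are.

    QUOTE_NONE mirrors the openhub behaviour: values are not encapsulated,
    which is why ";" (common inside field values) cannot be the separator.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="|", quoting=csv.QUOTE_NONE)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()
```

A value such as `A;B` passes through unharmed, whereas a ";"-separated file would split it into two columns.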
DBFIAR20 flow
Transformations before DBFIAR20 → DSO from other applications:
Technical Documentation - RTR - FI Working Capital for Solvay Group (WBP)
Technical Documentation - ITC - Credit Management
ODSO DBFIAR20 -> DEST OH_PCM_01:
Delete data where C_LGSYSAF = PF1_020 and C_COMPCAF = 6059.
ODSO DBFIAR20 -> DEST OH_PCM_21
Delete data where C_LGSYSAF = PF1_020 and C_COMPCAF = 6059.
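The deletion rule applied on both openhub destinations is a simple row filter; a minimal sketch (the function name is illustrative, the actual rule lives in the BW transformation):

```python
def keep_row(row):
    """Apply the PCM exclusion rule: drop rows where
    C_LGSYSAF = PF1_020 AND C_COMPCAF = 6059 (per the transformation above)."""
    return not (row.get("C_LGSYSAF") == "PF1_020" and row.get("C_COMPCAF") == "6059")
```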
DBFIAR21 flow
TRSF: ZZV_KPI_CREDITBL2 Rhodia -> DPFIAR05 & TRSF: ZZV_KPI_CREDITBL2 Solvay -> DPFIAR06
Conversion from time stamps to dates.
TRSF: DPFIAR06 -> DBFIAR09 & TRSF: DPFIAR05 -> DBFIAR10
The number of hours to release is the difference between the time stamps.
C_CUSTPRS and C_CUSTPR are attributes of C_CUSTID.
C_CST_CAT2 comes from master data C_CST_CAT2 (derived using c_ctr_area and c_custpr).
C_CRDACC comes from master data C_CST_CAT2 (derived using c_ctr_area and c_cst_ca2).
C_COMPCDE comes from master data C_COMPCDE (derived using the fields logsys and c_compprs).
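The "number of hours to release" derivation is a plain timestamp difference; a sketch, assuming BW-style `YYYYMMDDHHMMSS` timestamp strings (the function name and format are assumptions, not the actual transformation code):

```python
from datetime import datetime

def hours_to_release(blocked_ts, released_ts, fmt="%Y%m%d%H%M%S"):
    """Hours between the credit-block timestamp and the release timestamp."""
    delta = datetime.strptime(released_ts, fmt) - datetime.strptime(blocked_ts, fmt)
    return delta.total_seconds() / 3600
```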
MVFIAR11 flow
RSDS DTS_PRIORITY_PER_AMOUNT_PRD SDK_PCM -> ODSO DPFIAR14
DPFIAR14 has not been loaded since September 2019
If C_CUSTID, CPFCTR1_2, C_ZONEFI = #, the result is empty.
C_CUSTPR comes from C_CUSTID
MVFIAR10 flow
RSDS DTS_FI_PCM_01 BQFIPCM_01 -> ODSO DPFIAR13: no specific rules.
RSDS DTS_PRIORITY_PER_PAYER_PRD SDK_PCM -> ODSO DPFIAR13
Not used since June 2021
C_CST_CA2 comes from C_CUSTID
Reporting
| Query | Description |
|---|---|
| BW_QRY_MVFIAR10_0001 | PCM: predictive analytics (GBU) |
| BW_QRY_MVFIAR11_0001 | PCM: predictive analytics (Detail) |
Dependencies with other applications
Document here where the application sends or receives data (e.g. APD, open hub).
Data Loading
Info Providers and objects loaded
PC_FI_PCM_07 (Main PC) modified by ticket 5915436
PC_FI_PCM_01 generates 18 files from Open Hub via job RSPROCESS on weekdays at 10:00 AM CET
PC_FI_PCM_11 is the same as PC_FI_PCM_01, kept as a fallback in case PC_FI_PCM_01 cannot generate all files
PC_FI_PCM_10 deletes the DSO then loads data into DPFIAR13 from BigQuery and DPFIAR14 from SDK_PCM, via job RSPROCESS on weekdays at 11:00 AM CET
PC_FI_PCM_V2 generates 1 file, /exploit/BW/PREDICTCM/V2/dbfiar20_<DATE>_<TIME>_delta.csv, from Open Hub via job RSPROCESS 3 times per weekday (7am, 1pm, 7pm CET)
Target folder
Loading frequency
PC_FI_PCM_01 is loaded by job RSPROCESS on workdays at 10:00 AM CET
PC_FI_PCM_10 is loaded by job RSPROCESS on workdays at 11:00 AM CET
PC_FI_PCM_V2 is loaded by job RSPROCESS on workdays at 7am, 1pm and 7pm CET
Average performance
| Key Figure | Estimation |
|---|---|
| ~ Average Process Chain Runtime | |
| ~ Average nb of rows loaded per load | |
| ~ Total nb of rows loaded (if full) | |
| ~ Average Runtime for 10k lines | |
Record Keeping
DSOs are loaded with full historical data, but only the last 3 years (today - 3 years) are sent to SQL Server.
Detail of process chains: list, links between them, and any special events done for the loading
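The "today - 3 years" retention window can be computed as a simple date cutoff; a sketch with hypothetical helper names (the actual cutoff is applied inside the extraction flow):

```python
from datetime import date

def retention_cutoff(today=None, years=3):
    """Cutoff date for the data sent downstream: today minus 3 years."""
    today = today or date.today()
    try:
        return today.replace(year=today.year - years)
    except ValueError:
        # 29 Feb shifted onto a non-leap year: fall back to 28 Feb.
        return today.replace(year=today.year - years, month=2, day=28)

def in_window(posting_date, today=None):
    """True if the record's date falls inside the 3-year window."""
    return posting_date >= retention_cutoff(today)
```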
| Main Process Chain | Final Provider Loading | Frequency | Time start | Duration |
|---|---|---|---|---|
Data Quality Control
Operational Documentation
Procedures
If the service account bqtobw@predict-credit-mgt-v2-dev.iam.gserviceaccount.com in Google Cloud Platform expires, the configuration of the remote source GBQ_Predict_Credit in BW/HANA must be adapted accordingly:
- Copy the new JSON key file to the folder defined in gbqadapter_prod.properties on the DP Agent server (acew1dhcahca1)
- Change KeyFilePath to the folder of the new JSON file (the JSON file is provided by the DataOps team: dataops@solvay.com)
The remote source configuration in BW does not need to be changed.
The properties file on the DP Agent server needs to be changed only if the project name or the location of the JSON file changes.
The new JSON file name must match the one referenced by KeyFilePath in the properties file.
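For reference, the relevant entry in gbqadapter_prod.properties might look like the excerpt below; only the KeyFilePath key is confirmed by this document, and the path is an illustrative placeholder, not the production value:

```properties
# gbqadapter_prod.properties (illustrative excerpt)
KeyFilePath=/path/to/service-account-keys/bqtobw-key.json
```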
Scheduling
<Describe the scheduling in place for the application (eg. existing jobs, trigger time/event based, dependencies)>
Monitoring
<Describe the monitoring checks to confirm the application is performing well (eg. check the overall status, check performance metrics like runtime/data volume/memory/disk/CPU, maintain and react to alerts/notifications)>
Error Handling
<Describe how to handle errors (eg. error codes, description and respective resolution, alert users)>
Known Bugs
<List the existing bugs, its criticity, workarounds and resolution plan.>