General presentation

Objective of the application

Provide predictive information on Credit Management.


Tool Leader: David TONDA

IT leader of the application: Guillaume THEVENET

Name of project: PCM Predictive Credit Management

PMO Project: 6958 Big Data for Credit Management

Reporting Coordinator: David TONDA

Usage information

Number of users: tbd

Critical period: none

Geographical perimeter: worldwide

InfoArea:

Process chain Display Component:


History

In Mar 2020, the data flow was updated to PCM 2.0

In 2021, BW changed the source system from SDK_PCM (SQL) to BigQuery (#4749200)

In Feb 2023, the Po2 project required segregating Eco/Sco data; the company master data file was therefore modified to add C_AUTHMA (authorization scope) to the open hub OH_PCM_05

Roles & Access

Aligned on FIAR Roles & Access

Dataflow overview

Summary

The dataflow can be divided into 3 main steps:

  1. Check that no other run is in progress (a), then extract data from BW to PCM 
    1. Extract master data in full mode to csv files through open hubs
    2. Load FIAR and CAMS data into DSOs built specifically for the PCM project (DBFIAR20 and DBFIAR21), then extract them to csv through open hubs
    3. Load the data from the files via Talend 
  2. Processing on PCM GCP
  3. Full load with deletion of DPFIAR13 from BigQuery, with the new DataSource DTS_FI_PCM_01, via PC_FI_PCM_10 (SQL Server was replaced by BigQuery)


BW to Talend

DBFIAR20 loading process

 

Big Query to BW

DPFIAR13 and DPFIAR14  data loading

Reports are retrieved from BigQuery after calculation by the Data Science team.

With version 2.0, BW receives only 3 key figures:

PRED_NOT_PIM     → K_PFNPI
PROBA_NOT_PIM  → K_PPNPI
AMOUNT_EUR      → 0DEB_CRE_LC
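The mapping above can be captured as a simple column-rename step. The Python sketch below is illustrative only: the real renaming happens in the BW transformation, and the sample values are invented.

```python
# Column mapping from the BigQuery output to BW key figures (see table above).
BQ_TO_BW = {
    "PRED_NOT_PIM": "K_PFNPI",
    "PROBA_NOT_PIM": "K_PPNPI",
    "AMOUNT_EUR": "0DEB_CRE_LC",
}

def to_bw_row(bq_row):
    """Rename the 3 BigQuery key figure columns to their BW names."""
    return {BQ_TO_BW.get(col, col): val for col, val in bq_row.items()}

sample = {"PRED_NOT_PIM": 1, "PROBA_NOT_PIM": 0.82, "AMOUNT_EUR": 1500.0}
to_bw_row(sample)
# -> {'K_PFNPI': 1, 'K_PPNPI': 0.82, '0DEB_CRE_LC': 1500.0}
```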


Architecture Overview

Obsolete 

SQL Server was replaced by BigQuery; after the files are generated, they are loaded via Talend by the Data Engineering team.

Functional and Technical rules on Workbench + Reporting

Rules & Explanations

The whole process can be summarized as follows:

Program (a) → Obsolete

(created in SE38)

Two programs update the TVARVC variable Z_BW_PCM_PC_STATUS.

The purpose of these programs is to avoid launching the process chain twice while a previous run is still active.

If the process chain is already running, the new run will fail, requiring deeper analysis (normal duration should be less than one hour; only the init run should exceed this duration).
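The guard logic can be sketched as follows. This is a Python illustration only: the real programs are ABAP reports created in SE38, and the status values "RUNNING"/"FREE" are assumptions, not the actual TVARVC values.

```python
# Illustration of the double-launch guard on TVARVC variable Z_BW_PCM_PC_STATUS.
# Status values "RUNNING"/"FREE" are assumed for this sketch.
class AlreadyRunningError(Exception):
    """Raised when a previous process chain run is still flagged as active."""

tvarvc = {"Z_BW_PCM_PC_STATUS": "FREE"}  # stand-in for the TVARVC table

def start_run():
    """First program: fail if a run is active, otherwise flag the chain as running."""
    if tvarvc["Z_BW_PCM_PC_STATUS"] == "RUNNING":
        raise AlreadyRunningError("Process chain already running - analyse before restarting")
    tvarvc["Z_BW_PCM_PC_STATUS"] = "RUNNING"

def end_run():
    """Second program: release the flag at the end of the chain."""
    tvarvc["Z_BW_PCM_PC_STATUS"] = "FREE"
```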

Openhub (b)

7 open hubs were initially created for master data and 2 for transactional data.

Updated on 12 Mar 2021: 18 open hubs (2 transactional and 16 master data).

Open Hub | Open Hub Name | Source of Open Hub | Seq in PC_FI_PCM_01 | Filename | Detail of file
OH_PCM_01 | PCM: DBFIAR20 | DBFIAR20 | 17 | dbfiar20_delta.csv | FIAR: Line Item with Delta - PCM - All systems
OH_PCM_02 | PCM: GL_ACCOUNT | ATTRIBUTES 0GL_ACCOUNT | 1 | gl_account_full.csv | Attribute of GL account
OH_PCM_03 | PCM: C_CUSTID | ATTRIBUTES C_CUSTID | 2 | c_custid_full.csv | Attribute of Customer
OH_PCM_04 | PCM: COUNTRY | ATTRIBUTES 0COUNTRY | 6 | country_full.csv | Attribute of Country
OH_PCM_05 | PCM: C_COMPCDE | ATTRIBUTES C_COMPCDE | 3 | c_compcde_full.csv | Attribute of Company
OH_PCM_06 | PCM: C_CST_CA2 | ATTRIBUTES C_CST_CA2 | 4 | c_cst_ca2_full.csv | Attribute of Customer credit control area
OH_PCM_07 | PCM: TCURR | DTS_TCURR | 5 | tcurr_full.csv | Attribute of Currency
OH_PCM_08 | PCM: DBFIAR21 | DBFIAR21 | 18 | dbfiar21_delta.csv | FIAR: Credit blocked - PCM - All systems
OH_PCM_09 | PCM: G_CWWE01 | ATTRIBUTES 0G_CWWE01 | 7 | g_cwwe01_full.csv | Attribute of sub activity (IECRA)
OH_PCM_10 | PCM: 0CLM_CLSP Texts | TEXTS 0CSM_USER | 8 | 0CLM_CLSP_TEXT.csv | Text of Collection specialist
OH_PCM_11 | PCM: C_COMPCDE Texts | TEXTS C_COMPCDE | 9 | c_compcde_text.csv | Text of Company
OH_PCM_12 | PCM: C_SALEMP Texts | TEXTS C_SALEMP | 10 | c_salemp_text.csv | Text of Sales employee
OH_PCM_13 | PCM: C_PMNTTRM Texts | TEXTS C_PMNTTRM | 11 | C_PMNTTRM_TEXT.csv | Text of payment term
OH_PCM_14 | PCM: CPFCTR1_2 Texts | TEXTS CPFCTR1_2 | 12 | CPFCRT1_2_TEXT.csv | Text of GBU
OH_PCM_15 | PCM: CPFCTR2_2 Texts | TEXTS C_FACUBU | 13 | CPFCTR2_2_TEXT.csv | Text of BFC group of activity
OH_PCM_16 | PCM: 0G_CWWE01 Texts | TEXTS 0G_CWWE01 | 14 | 0G_CWWE01_TEXT.csv | Text of sub activity (IECRA)
OH_PCM_17 | PCM: C_PM_MTHD Texts | TEXTS C_PM_MTHD | 15 | C_PM_MTHD_TEXT.csv | Text of payment method
OH_PCM_18 | PCM: 0REPR_GROUP Texts | TEXTS 0REPR_GROUP | 16 | 0REPR_GROUP_TEXT.csv | Text of credit management representatives group


All master data open hubs are loaded in full mode. All extract directly from master data, except the TCURR one, which is connected to a DataSource.

All transactional data open hubs are loaded in delta mode (note: delta is not possible with a MultiProvider).


All open hubs use a "|" separator because some fields already contain ";" in their values and the open hub does not encapsulate (quote) data.
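The impact of the separator choice can be shown with a short Python sketch (field names and values below are invented): because values are written unquoted, an embedded ";" shifts columns, while "|" parses back cleanly.

```python
# Open hub style output: values are written as-is, with no quoting.
header = ["COMPANY", "COMPANY_TEXT"]
row = ["C001", "ACME; EUROPE"]  # the text value already contains ";"

def dump(sep):
    """Serialize header and row with the given separator, no quoting."""
    return sep.join(header) + "\n" + sep.join(row)

# With ";" as separator the embedded ";" creates a spurious third column:
dump(";").splitlines()[1].split(";")   # -> ['C001', 'ACME', ' EUROPE']
# With "|" the row parses back into the expected two columns:
dump("|").splitlines()[1].split("|")   # -> ['C001', 'ACME; EUROPE']
```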

All open hubs use logical file names defined through the FILE transaction. All files are stored in the following folder

KSH script (c) → Obsolete

The KSH scripts were built by the D3S/Adagio team.

They are stored in the following folder

pacm-ping.ksh was the initial test for the PCM project (not used anymore).

pacm-reset_db.ksh allows resetting the SQL database from BW.

pacm-run.ksh is the script which launches the processing on the SQL database side. It generates a log in the following folder: /exploit/TA/tmp/${STAMP}_log_run_$server.log

This script expects two parameters:

OS Command (d) → Obsolete

(created in SM49)

The ZPCM_RUN command launches a .bat file which loads the extracted files into the D3S SQL database. It is included in the process chain through the following variant:

This variant was modified directly in WBP to launch the processing on the SQL production server.

The ZPCM_RESET_DB command resets the database. It is not included in the process chain.

Status table (e) → Obsolete 

The status table is loaded into DSO DPFIAR12.

C_COMMAND should be used to run either "ZPCM_RUN" or "ZPCM_RESET", depending on the value sent by the SQL database. This functionality is not used: by default the process chain always launches the RUN. If the RESET is required, it needs to be launched manually.

C_CODSTAT can take 3 different values, which are checked in the process chain:

These values are checked in the process chain through an "Enhanced decision" step.

Dependencies with other applications

PCM data are loaded on top of FIAR data.

Data loadings

Info providers and objects loaded

PC_FI_PCM_07 (Main PC), modified by ticket 5915436:

  PC_FI_PCM_01 generates 18 files from the open hubs, via job RSPROCESS on weekdays at 10:00 AM CET

  PC_FI_PCM_11 same as PC_FI_PCM_01, used as a fallback in case PC_FI_PCM_01 cannot generate all files

  PC_FI_PCM_10 deletes the DSO contents, then loads DPFIAR13 from BigQuery and DPFIAR14 from SDK_PCM, via job RSPROCESS on weekdays at 11:00 AM CET

  PC_FI_PCM_V2 generates 1 file (/exploit/BW/PREDICTCM/V2/dbfiar20_<DATE>_<TIME>_delta.csv) from an open hub, via job RSPROCESS 3 times per weekday (7 AM, 1 PM, 7 PM CET)
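As a sketch, the timestamped V2 file name could be built as below. The exact <DATE>/<TIME> formats are assumptions (YYYYMMDD and HHMMSS are used here for illustration only).

```python
from datetime import datetime

def v2_filename(now):
    """Build the PC_FI_PCM_V2 output path; date/time formats are assumed."""
    return "/exploit/BW/PREDICTCM/V2/dbfiar20_{}_{}_delta.csv".format(
        now.strftime("%Y%m%d"), now.strftime("%H%M%S"))

v2_filename(datetime(2023, 2, 1, 7, 0, 0))
# -> '/exploit/BW/PREDICTCM/V2/dbfiar20_20230201_070000_delta.csv'
```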

Target folder


Loading frequency

  PC_FI_PCM_01 loads via job RSPROCESS on workdays at 10:00 AM CET

  PC_FI_PCM_10 loads via job RSPROCESS on workdays at 11:00 AM CET

  PC_FI_PCM_V2 loads via job RSPROCESS on workdays at 7 AM, 1 PM and 7 PM CET

Average performance

If possible, give some information on average process chain duration, amount of data loaded and total data volume. Example: daily process chain loaded in 30 min, weekly chain loaded in 1h15, with around 2k to 10k lines in DELTA mode for a total of 10M lines in the cube. The purpose is to give a general overview of the volume of data managed by the application.


Key Figure | Estimation
Average Process Chain Runtime | ~
Average nb of rows loaded per load | ~
Total nb of rows loaded (if full) | ~
Average Runtime for 10k lines | ~

Record Keeping

DSOs are loaded with full historical data, but only the last 3 years (today − 3 years onwards) are sent to the SQL server.
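The "today − 3 years" cut can be sketched as a date filter (a Python illustration; the field name PSTNG_DATE is an assumption):

```python
from datetime import date

def cutoff(today):
    """Earliest date kept in the extract: today minus 3 years."""
    try:
        return today.replace(year=today.year - 3)
    except ValueError:  # 29 Feb shifted into a non-leap year
        return today.replace(year=today.year - 3, day=28)

def keep(row, today):
    """Keep only rows from the last 3 years (field name is illustrative)."""
    return row["PSTNG_DATE"] >= cutoff(today)

keep({"PSTNG_DATE": date(2021, 5, 1)}, today=date(2023, 2, 1))    # True
keep({"PSTNG_DATE": date(2019, 12, 31)}, today=date(2023, 2, 1))  # False
```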

Reporting

Queries End User Documentation

Query end-user documentation should be created in the public "Customer Support Wiki" space under the corresponding BW application page: BW - Application. Technical query documentation, if necessary, should be added as a sub-page of this documentation using the BW Technical Query Documentation template.


Main queries

List only the most important and complex queries, with a link to the documentation.

Main functionalities

Give details on all complex functionalities: list the most important and/or complex KPIs, query jumps and alerts.

Broadcast

All Credit Management broadcasts (and associated process chains) were deactivated at the Business's request.

Maintenance

Known bugs

List and explain the known, unsolved bugs.

Recurring procedure

If the key of the service account bqtobw@predict-credit-mgt-v2-dev.iam.gserviceaccount.com in Google Cloud Platform has expired, the configuration of the remote source GBQ_Predict_Credit in BW/HANA must be adapted accordingly:

  1. Copy the new JSON file to the folder defined in gbqadapter_prod.properties
  2. Update the KeyFilePath property to point to the folder of the new JSON file
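After rotating the key, the adapter configuration would contain an entry similar to the excerpt below. The path is a placeholder; only the KeyFilePath property is documented here, so refer to gbqadapter_prod.properties for the actual folder.

```
# gbqadapter_prod.properties (excerpt) - the path below is a placeholder
KeyFilePath=/path/to/new/json/key/folder
```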

Planned Evolution

Detail planned major evolutions if already known. Example: complete decommissioning of the application is planned in 2017 / extension to the Solvay perimeter planned in the 2nd semester of 2016.