
Access Management 


Roles & Access

List of application roles and menu roles, with an explanation when several application roles have special rules.


Aligned on FIAR Roles & Access



Role Code | Role Description | Explanation




Authorization Objects

List of authorization objects mandatory for the application.

Authorization object | Explanation


DataFlow

Overview

The dataflow can be divided into 3 main steps:

  1. Check that no other run is in progress (a), then extract data from BW to PCM:
    1. Extract master data in full mode to csv files through openhub
    2. Load FIAR and CAMS data into specific DSOs built for the PCM project (DBFIAR20 and DBFIAR21), then extract to csv through openhub
    3. Load the data from the files via Talend
  2. Treatment on PCM GCP
  3. Full load with deletion of DPFIAR13 from BigQuery via the new data source DTS_FI_PCM_01, run by PC_FI_PCM_10 (SQL Server was replaced by BigQuery)
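The pre-flight check in step 1 can be sketched as follows. This is an illustrative sketch only, not the actual BW implementation; the run statuses and chain names shown are assumptions.

```python
# Hedged sketch: refuse to start a new extraction while another PCM run is
# still active or queued (step 1a above). States/names are illustrative.

RUNNING_STATES = {"ACTIVE", "SCHEDULED"}

def can_start_extraction(process_chain_runs):
    """Return True only when no other PCM run is in progress."""
    return not any(run["status"] in RUNNING_STATES for run in process_chain_runs)

runs = [
    {"chain": "PC_FI_PCM_01", "status": "FINISHED"},
    {"chain": "PC_FI_PCM_V2", "status": "FINISHED"},
]
print(can_start_extraction(runs))  # True: safe to extract
```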

BW to Talend

DBFIAR20 loading process

 

Big Query to BW

DPFIAR13 data loading

Data source DTS_FI_PCM_01 reads from the remote source GBQ_Predict_Credit and fetches the table core_perimeter_future_bw (GCP project predict-credit-mgt-v2-prod).

The report is retrieved from BigQuery after calculation by Data Science.

 

With version 2.0, BW retrieves only 3 key figures:

PRED_NOT_PIM  → K_PFNPI
PROBA_NOT_PIM → K_PPNPI
AMOUNT_EUR    → 0DEB_CRE_LC
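The column mapping above can be expressed as a simple lookup. The sketch below is illustrative only; the row shape is an assumption, and the real transfer happens in the BW transformation.

```python
# Illustrative sketch: rename the 3 BigQuery columns to their BW key-figure
# names, dropping anything else. Input row shape is an assumption.

BQ_TO_BW = {
    "PRED_NOT_PIM": "K_PFNPI",
    "PROBA_NOT_PIM": "K_PPNPI",
    "AMOUNT_EUR": "0DEB_CRE_LC",
}

def to_bw_row(bq_row):
    """Keep only the mapped key figures, renamed for BW."""
    return {bw: bq_row[bq] for bq, bw in BQ_TO_BW.items()}

row = {"PRED_NOT_PIM": 1, "PROBA_NOT_PIM": 0.87, "AMOUNT_EUR": 1200.5, "EXTRA": "x"}
print(to_bw_row(row))
```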

Note: DPFIAR14 is obsolete.

Architecture Overview

Obsolete 

SQL Server was replaced by BigQuery; after the files are generated, they are loaded via Talend by the Data Engineering team.

Objective of the application

Provide predictive information on Credit Management.

  • Define which payer is a priority based on its history and the due amount.


Tool Leader: David TONDA

IT leader of the application: Guillaume THEVENET

Name of project: PCM Predictive Credit Management

PMO Project: 6958 Big Data for Credit Management

Reporting Coordinator: David TONDA

Usage information

Number of users: tbd

Critical period: none

Geographical perimeter: worldwide

InfoArea:

  • All objects are stored in FIAR infoarea: IA_FMCO_FIAR

Process chain Display Component:

  • 134 - PROJECT - PCM
    COMP_PCM


History

In Mar 2020, the data flow was updated to PCM 2.0.

In 2021, BW changed the source system from SDK_PCM (SQL) to BigQuery (#4749200).

In Feb 2023, the Po2 project required segregating Eco/Sco data; therefore, the company master data file was modified to add C_AUTHMA (authorization scope) to the openhub OH_PCM_05.



Technical Rules on Workbench

The whole process can be summarized as follows:

  • Extraction of BW data (master and transactional data) into flat files (csv with "|" separator)
  • Copy and loading of these files into a SQL database
  • Calculation of predictive information on the SQL side to produce two tables which define Priority per Amount & Priority per Payer
  • Loading of these data into BW to be used in standard reporting
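The "Priority per Payer" calculation can be illustrated as below. This is a hedged sketch only, assuming a simple rank-by-total-due rule; the production logic is the Data Science model, and all field names here are hypothetical.

```python
# Hedged sketch, NOT the production model: show the shape of the
# "Priority per Payer" output by ranking payers on total due amount.
from collections import defaultdict

def priority_per_payer(line_items):
    """Aggregate due amounts per payer, rank descending (rank 1 = top priority)."""
    totals = defaultdict(float)
    for item in line_items:
        totals[item["payer"]] += item["due_amount"]
    ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    return [{"payer": p, "total_due": t, "priority": i + 1}
            for i, (p, t) in enumerate(ranked)]

items = [
    {"payer": "P001", "due_amount": 500.0},
    {"payer": "P002", "due_amount": 1200.0},
    {"payer": "P001", "due_amount": 300.0},
]
print(priority_per_payer(items))
```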

Openhub

Updated on 12 Mar 2021: 18 Openhubs (2 transactional and 16 master data)

Open Hub | Open Hub Name | Source of Open Hub | Process Chain | Filename | Detail of file
OH_PCM_01 | PCM: DBFIAR20 | DBFIAR20 (replaced by v2) | PC_FI_PCM_01 | dbfiar20_delta.csv | FIAR: Line Item with Delta - PCM - All systems
OH_PCM_21 | PCMV2: DBFIAR20 | DBFIAR20 | PC_FI_PCM_V2 | dbfiar20_delta.csv | FIAR: Line Item with Delta - PCM - All systems
OH_PCM_02 | PCM: GL_ACCOUNT | ATTRIBUTES 0GL_ACCOUNT | PC_FI_PCM_01 | gl_account_full.csv | Attribute of GL account
OH_PCM_03 | PCM: C_CUSTID | ATTRIBUTES C_CUSTID | PC_FI_PCM_01 | c_custid_full.csv | Attribute of Customer
OH_PCM_04 | PCM: COUNTRY | ATTRIBUTES 0COUNTRY | PC_FI_PCM_01 | country_full.csv | Attribute of Country
OH_PCM_05 | PCM: C_COMPCDE | ATTRIBUTES C_COMPCDE | PC_FI_PCM_01 | c_compcde_full.csv | Attribute of Company
OH_PCM_06 | PCM: C_CST_CA2 | ATTRIBUTES C_CST_CA2 | PC_FI_PCM_01 | c_cst_ca2_full.csv | Attribute of Customer credit control area
OH_PCM_07 | PCM: TCURR | DTS_TCURR | PC_FI_PCM_01 | tcurr_full.csv | Attribute of Currency
OH_PCM_08 | PCM: DBFIAR21 | DBFIAR21 | PC_FI_PCM_01 | dbfiar21_delta.csv | FIAR: Credit blocked - PCM - All systems
OH_PCM_09 | PCM: G_CWWE01 | ATTRIBUTES 0G_CWWE01 | PC_FI_PCM_01 | g_cwwe01_full.csv | Attribute of sub activity (IECRA)
OH_PCM_10 | PCM: 0CLM_CLSP Texts | TEXTS 0CSM_USER | PC_FI_PCM_01 | 0CLM_CLSP_TEXT.csv | Text of Collection specialist
OH_PCM_11 | PCM: C_COMPCDE Texts | TEXTS C_COMPCDE | PC_FI_PCM_01 | c_compcde_text.csv | Text of Company
OH_PCM_12 | PCM: C_SALEMP Texts | TEXTS C_SALEMP | PC_FI_PCM_01 | c_salemp_text.csv | Text of Sales employee
OH_PCM_13 | PCM: C_PMNTTRM Texts | TEXTS C_PMNTTRM | PC_FI_PCM_01 | C_PMNTTRM_TEXT.csv | Text of payment term
OH_PCM_14 | PCM: CPFCTR1_2 Texts | TEXTS CPFCTR1_2 | PC_FI_PCM_01 | CPFCRT1_2_TEXT.csv | Text of GBU
OH_PCM_15 | PCM: CPFCTR2_2 Texts | TEXTS C_FACUBU | PC_FI_PCM_01 | CPFCTR2_2_TEXT.csv | Text of BFC group of activity
OH_PCM_16 | PCM: 0G_CWWE01 Texts | TEXTS 0G_CWWE01 | PC_FI_PCM_01 | 0G_CWWE01_TEXT.csv | Text of sub activity (IECRA)
OH_PCM_17 | PCM: C_PM_MTHD Texts | TEXTS C_PM_MTHD | PC_FI_PCM_01 | C_PM_MTHD_TEXT.csv | Text of payment method
OH_PCM_18 | PCM: 0REPR_GROUP Texts | TEXTS 0REPR_GROUP | PC_FI_PCM_01 | 0REPR_GROUP_TEXT.csv | Text of credit management representatives group



All master data openhubs are loaded in full mode. All extract data from master data, except the TCURR one, which is connected to a datasource.

All transactional data openhubs are loaded in delta mode (note: delta is not possible with a multiprovider).

All openhubs use a "|" separator because some fields already contain ";" in their values and openhub does not encapsulate data.

All openhubs use a logical file name defined through the FILE transaction. All files are stored in the following folder
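Because the openhub does not encapsulate (quote) fields, downstream readers must split on "|" with quote handling disabled. A minimal sketch with Python's csv module (the sample data is illustrative):

```python
# Minimal sketch: parse an openhub extract that uses "|" as separator and no
# text encapsulation, so quote processing must be disabled. Sample data only.
import csv
import io

sample = 'C_COMPCDE|TXTMD\n0001|Solvay; Brussels\n'  # ";" inside a value is safe

reader = csv.reader(io.StringIO(sample), delimiter='|', quoting=csv.QUOTE_NONE)
rows = list(reader)
print(rows)  # [['C_COMPCDE', 'TXTMD'], ['0001', 'Solvay; Brussels']]
```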



Reporting

Dependencies with other applications

This section should state where the application sends or receives information (e.g. APD, open hub).

Data Loading

Info Providers and objects loaded 

PC_FI_PCM_07 (main PC), modified by ticket 5915436

  PC_FI_PCM_01 generates 18 files from Open Hub via job RSPROCESS on weekdays at 10:00 AM CET

  PC_FI_PCM_11 is the same as PC_FI_PCM_01, kept as a fallback in case 01 cannot generate the files completely

  PC_FI_PCM_10 deletes the DSO, then loads data into DPFIAR13 from BigQuery and DPFIAR14 from SDK_PCM, via job RSPROCESS on weekdays at 11:00 AM CET

  PC_FI_PCM_V2 generates 1 file: /exploit/BW/PREDICTCM/V2/dbfiar20_<DATE>_<TIME>_delta.csv from Open Hub via job RSPROCESS 3 times per weekday (7am, 1pm, 7pm CET)
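Building the PC_FI_PCM_V2 file name can be sketched as follows. The YYYYMMDD/HHMMSS format is an assumption; check the logical file name in the FILE transaction for the actual `<DATE>`/`<TIME>` format.

```python
# Illustrative only: fill the dbfiar20_<DATE>_<TIME>_delta.csv pattern.
# The timestamp format is an ASSUMPTION, not taken from the BW configuration.
from datetime import datetime

def v2_filename(ts):
    return f"dbfiar20_{ts:%Y%m%d}_{ts:%H%M%S}_delta.csv"

print(v2_filename(datetime(2021, 3, 12, 7, 0, 0)))  # dbfiar20_20210312_070000_delta.csv
```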

Target folder


Loading frequency

  PC_FI_PCM_01 is loaded by job RSPROCESS on workdays at 10:00 AM CET

  PC_FI_PCM_10 is loaded by job RSPROCESS on workdays at 11:00 AM CET

  PC_FI_PCM_V2 is loaded by job RSPROCESS on workdays at 7am, 1pm, 7pm CET

Average performance



Key Figure

Estimation

~ Average Process Chain Runtime
~ Average nb of rows loaded per load
~ Total nb of rows loaded (if full)
~ Average Runtime for 10k lines

Record Keeping

DSOs are loaded with full historical data, but we only send today minus 3 years to the SQL server.
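The rolling 3-year extraction window can be sketched as below; this is an illustration of the rule, not the actual extraction filter, and the function names are hypothetical.

```python
# Hedged sketch: keep full history in the DSO but extract only records newer
# than today minus 3 years, as stated above. Names are illustrative.
from datetime import date

def within_retention(record_date, today, years=3):
    """True if record_date falls inside the rolling 3-year window."""
    cutoff = today.replace(year=today.year - years)
    return record_date >= cutoff

today = date(2023, 6, 1)
print(within_retention(date(2021, 1, 15), today))   # True: inside the window
print(within_retention(date(2019, 12, 31), today))  # False: older than 3 years
```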

Detail of process chain, list + link between or special event done for the loading

Main Process Chain | Final Provider | Loading Frequency | Time start | Duration


Data Quality Control


Operational Documentation

Procedures

If the service account bqtobw@predict-credit-mgt-v2-dev.iam.gserviceaccount.com in Google Cloud Platform expires, the configuration of the remote source GBQ_Predict_Credit in BW/HANA must be adapted accordingly:

  1. Copy the new JSON file to the folder defined in gbqadapter_prod.properties on the DP agent server (acew1dhcahca1)
  2. Change the KeyFilePath to the folder of the new JSON file (the JSON file is provided by the DataOps team: dataops@solvay.com)

The remote source configuration in BW does not need to change.

The properties file on the DP agent server only needs to change if the project name or the location of the JSON file changes.

The new JSON file must match the details referenced in the properties file at KeyFilePath.
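After rotating the key, it is worth verifying that KeyFilePath in the properties file points at the new JSON file. A minimal sketch, assuming a simple key=value properties format (the sample content is illustrative; the real gbqadapter_prod.properties may contain other entries):

```python
# Hedged sketch: parse a simple key=value properties file and read out
# KeyFilePath so it can be checked against the rotated JSON file's location.
# Sample content is illustrative, not the real gbqadapter_prod.properties.

def read_properties(text):
    """Parse key=value lines, ignoring comments and blank lines."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
    return props

props = read_properties("# adapter config\nKeyFilePath=/etc/gbq/key.json\n")
print(props["KeyFilePath"])  # /etc/gbq/key.json
# In a real check, confirm the file exists: os.path.isfile(props["KeyFilePath"])
```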

Scheduling

<Describe the scheduling in place for the application (eg. existing jobs, trigger time/event based, dependencies)>

Monitoring

<Describe the monitoring checks to confirm the application is performing well (eg. check the overall status, check performance metrics like runtime/data volume/memory/disk/CPU, maintain and react to alerts/notifications)>

Error Handling

<Describe how to handle errors (eg. error codes, description and respective resolution, alert users)>

Known Bugs

<List the existing bugs, its criticity, workarounds and resolution plan.>

Roadmap