Provide predictive information on Credit Management.
Tool Leader: David TONDA
IT leader of the application: Guillaume THEVENET
Name of project: PCM Predictive Credit Management
PMO Project: 6958 Big Data for Credit Management
Reporting Coordinator: David TONDA
Number of users: tbd
Critical period: none
Geographical perimeter: worldwide
InfoArea:

Process chain Display Component:

Provide some history of the application: when was it created? What was the initial project? Who was the original requestor? If possible, provide a link to the project information.
Aligned on FIAR Roles & Access
Summary
The dataflow can be divided into three main steps:
DBFIAR20 loading process

Architecture Overview

Reporting documentation drive folder:
https://drive.google.com/drive/folders/0B0qn89R0RGdqYkZZOFZyYXlXVkE
The whole process can be summarized as follows:
(created in SE38)
Two programs update the TVARVC variable Z_BW_PCM_PC_STATUS.
The purpose of these programs is to avoid launching the process chain twice if the previous run is still active.


If the process chain is already running, the new run will fail and require deeper analysis (normal duration is less than one hour; only the init run should exceed it).
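The guard above can be sketched in shell. This is only an illustration: the real implementation is the two ABAP programs updating the TVARVC variable, and the file name and status values below are assumptions.

```shell
#!/bin/sh
# Hypothetical sketch of the Z_BW_PCM_PC_STATUS guard. A plain file stands in
# for the TVARVC variable; status values "RUNNING"/"DONE" are assumed.
STATUS_FILE=/tmp/Z_BW_PCM_PC_STATUS   # stand-in for the TVARVC entry
rm -f "$STATUS_FILE"                  # clean start for the demo

start_chain() {
    if [ -f "$STATUS_FILE" ] && [ "$(cat "$STATUS_FILE")" = "RUNNING" ]; then
        echo "process chain already running - aborting" >&2
        return 1                      # second launch fails, as documented
    fi
    echo "RUNNING" > "$STATUS_FILE"   # first program: mark the chain active
    return 0
}

finish_chain() {
    echo "DONE" > "$STATUS_FILE"      # second program: clear the flag
}

start_chain && echo "first launch OK"
start_chain || echo "second launch rejected"
finish_chain
```

Running the sketch shows the first launch succeeding and the second being rejected until `finish_chain` resets the flag.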
Seven OpenHubs have been created for master data and two for transactional data.

All master data OpenHubs are loaded in full mode. All extract data from master data objects, except the TCURR one, which is connected to a DataSource.
All transactional data OpenHubs are loaded in delta mode (note: delta is not possible with a MultiProvider).
All OpenHubs use a "|" separator because some fields already contain ";" in their values and OpenHub does not encapsulate (quote) data.
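The separator choice can be illustrated with a one-line example. The record below is fabricated; field names and content are assumptions.

```shell
#!/bin/sh
# Why the OpenHubs use "|": values may themselves contain ";" and OpenHub
# does not quote fields, so a ";" separator would split a value in two.
line='0001|ACME; Inc.|EUR'   # hypothetical extract record: key|name|currency

# Splitting on ";" breaks the name field across two columns:
semi_count=$(echo "$line" | awk -F';' '{print NF}')

# Splitting on "|" keeps the three fields intact:
pipe_count=$(echo "$line" | awk -F'|' '{print NF}')

echo "fields if split on ';': $semi_count"   # 2
echo "fields if split on '|': $pipe_count"   # 3
```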

All OpenHubs use logical file names defined through the FILE transaction. All files are stored in the following folder:

KSH scripts have been built by the D3S/Adagio team.
They are stored in the following folder:

pacm-ping.ksh was the initial test script for the PCM project (no longer used).
pacm-reset_db.ksh allows resetting the SQL database from BW.
pacm-run.ksh is the script that launches the treatment on the SQL database side. It generates a log file: /exploit/TA/tmp/${STAMP}_log_run_$server.log
This script expects two parameters:

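The log naming used by pacm-run.ksh can be sketched from the documented path pattern. The STAMP format and the server name below are assumptions for illustration only.

```shell
#!/bin/sh
# Sketch of the pacm-run.ksh log naming, following the documented pattern
# /exploit/TA/tmp/${STAMP}_log_run_$server.log.
LOGDIR=/tmp                   # stand-in for /exploit/TA/tmp
STAMP=$(date +%Y%m%d%H%M%S)   # assumed timestamp format
server=sqlprod01              # hypothetical SQL server name

LOGFILE="$LOGDIR/${STAMP}_log_run_$server.log"
echo "run started" > "$LOGFILE"
echo "$LOGFILE"
```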
(created in SM49)

ZPCM_RUN launches a .bat file which loads the extracted files into the D3S SQL database. It is included in the process chain through the following variant:

This variant has been modified directly in WBP to launch the treatment on the SQL production server.

ZPCM_RESET_DB resets the database. It is not included in the process chain.
The status table is loaded into DSO DPFIAR12.

C_COMMAND should be used to run either "ZPCM_RUN" or "ZPCM_RESET", depending on the value sent by the SQL database. This functionality is not used: by default the process chain always launches the RUN. If the RESET is required, it must be launched manually.
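The (unused) C_COMMAND dispatch can be sketched as follows. This is a shell stand-in for the real SM49 external commands; only the command names come from the documentation, the messages and default branch are assumptions.

```shell
#!/bin/sh
# Hypothetical sketch of the C_COMMAND dispatch. In practice the process
# chain always launches the RUN; the RESET is triggered manually.
dispatch() {
    case "$1" in
        ZPCM_RUN)   echo "launching RUN treatment" ;;
        ZPCM_RESET) echo "resetting SQL database" ;;
        *)          echo "launching RUN treatment" ;;   # current default
    esac
}

dispatch ZPCM_RUN     # value sent by the SQL database (illustrative)
dispatch ""           # no value: the chain falls back to the RUN
```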
C_CODSTAT can take three different values, which are checked in the process chain:
These values are checked in the process chain through an "Enhanced decision" step.

PCM data are loaded on top of FIAR data.
Detail the process chains: list them, the links between them, and any special events used for the loading.

Detail the frequency: monthly, weekly, or other.
If possible, give information on the average process chain duration, the amount of data loaded, and the total data volume. Example: daily process chain loaded in 30 min, weekly chain loaded in 1h15, with around 2k to 10k lines in DELTA mode for a total of 10M lines in the cube. The purpose is to give a general overview of the volume of data managed by the application.
| Key Figure | Estimation |
|---|---|
| ~ Average Process Chain Runtime | |
| ~ Average nb of rows loaded per load | |
| ~ Total nb of rows loaded (if full) | |
| ~ Average Runtime for 10k lines | |
DSOs are loaded with full historical data, but we only send today minus 3 years to the SQL server.
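The "today minus 3 years" filter can be sketched as below. The position of the date field and its YYYYMMDD format are assumptions; only the 3-year horizon and the "|" separator come from this document.

```shell
#!/bin/sh
# Sketch of the 3-year cutoff applied before sending data to the SQL server:
# the DSOs keep full history, but only recent rows are extracted.
cutoff="$(( $(date +%Y) - 3 ))$(date +%m%d)"   # YYYYMMDD, three years back

# Keep only lines whose first field (posting date, assumed) is on or after
# the cutoff; extract files use the "|" separator.
awk -F'|' -v c="$cutoff" '$1 >= c' <<EOF
20150101|old row - filtered out
20990101|recent row - kept
EOF
```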
Query end-user documentation should be created in the public "Customer Support Wiki" space under the corresponding BW application page: BW - Application. Technical query documentation, if necessary, should be added as a sub-page of this documentation using the BW Technical Query Documentation template.
List only the most important and complex queries, with a link to the documentation.
Give details on all complex functionalities: list the most important and/or complex KPIs, query jumps, and alerts.
All Credit Management broadcasts (and associated PCs) were deactivated on Business demand
Give the list of, and an explanation for, the known unsolved bugs.
List recurring procedures
Detail planned major evolutions if already known. Example: complete decommissioning of the application planned in 2017 / extension to the Solvay perimeter planned in the 2nd semester of 2016.