Based on the Bank Statements, this application enhances the Treasury Postings with additional information, collected differently depending on the system of origin, on the following documents:
in order to get details on the categories of payments (i.e. checks vs. transfers), on the processing methods (automatic vs. manual clearing), on the life cycle of the payment, etc.
This application gathers data from 3 different SAP ECC systems (PI1, PF1 and WP1), which is then made available in a MultiProvider on the BW system.
Name of project or main leader for the application + reporting coordinator
How many users? Which user category? Is there a critical period? Which geographical perimeter? Describe in one sentence, e.g. around 50 users worldwide, or 200 users on a European site.
Provide some history of the application: When was it created? What was the initial project? Who was the original requestor? If possible, provide a link to the project information.
List of application roles + menu roles, with an explanation if there are several application roles with special rules.
| Role Code | Role Description | Explanation |
|---|---|---|
List of authorization objects mandatory for the application.
| Authorization object | Explanation |
|---|---|
Bank Statement details are extracted from the Bank Statement tables (FEBKO/FEBEP) and enhanced with data from the generated accounting documents (table BKPF) and the clearing information (tables BSAD and BSAS).
Reporting documentation drive folder:
https://drive.google.com/drive/folders/0B0qn89R0RGdqYkZZOFZyYXlXVkE
Flow Chart :

The data is extracted from each system via a generic DataSource, ZZFCM_AR_PAYMENTS, based on a function module.
The function module reads the tables FEBKO/FEBEP within the selected range of statement dates, then enhances the payments
with accounting document information from table BKPF and with clearing information from tables BSAS and BSAD.
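The extraction logic described above can be sketched as follows. This is a minimal Python illustration of the joins the function module performs, not the actual ABAP implementation: the table names (FEBKO, FEBEP, BKPF, BSAS/BSAD) come from the text, while the field names and sample records are simplified assumptions for illustration only.

```python
from datetime import date

# Assumed, simplified record layouts (illustrative field names).
# FEBKO: bank statement headers, selected by statement date (AZDAT).
febko = [
    {"KUKEY": "0001", "AZDAT": date(2016, 3, 1)},
    {"KUKEY": "0002", "AZDAT": date(2016, 6, 1)},
]
# FEBEP: bank statement line items, linked to headers via KUKEY.
febep = [
    {"KUKEY": "0001", "ESNUM": 1, "BELNR": "4900000001"},
    {"KUKEY": "0002", "ESNUM": 1, "BELNR": "4900000002"},
]
# BKPF: accounting document headers, keyed by document number.
bkpf = {"4900000001": {"BLART": "SB", "BUDAT": date(2016, 3, 1)}}
# BSAS/BSAD: cleared items, keyed by document number (merged here).
clearing = {"4900000001": {"AUGBL": "2000000001", "AUGDT": date(2016, 3, 5)}}

def extract_payments(date_from, date_to):
    """Mimic the DataSource: select statement headers in the date range,
    join the line items, then enrich each payment with accounting
    document and clearing details."""
    headers = {h["KUKEY"]: h
               for h in febko if date_from <= h["AZDAT"] <= date_to}
    result = []
    for item in febep:
        if item["KUKEY"] not in headers:
            continue  # statement outside the selected date range
        row = dict(item)
        row.update(bkpf.get(item["BELNR"], {}))      # accounting document info
        row.update(clearing.get(item["BELNR"], {}))  # clearing info
        result.append(row)
    return result

payments = extract_payments(date(2016, 1, 1), date(2016, 3, 31))
```

With the sample data above, only the statement dated 2016-03-01 falls in the selection range, so a single enriched payment row is returned.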
Dependencies with other applications (BW, QV, external tools like PVelocity or Dynasis, etc.)
Detail of process chains: list + links between them, or any special event used for the loading
Detail of frequency: monthly, weekly or other
If possible, give some information on the average process chain duration, the amount of data loaded and the total data volume. Example: daily process chain loaded in 30 min, weekly chain loaded in 1h15, with around 2k to 10k lines in DELTA mode for a total of 10M lines in the cube. The purpose is to give a general overview of the volume of data managed by the application.
| Key Figure | Estimation |
|---|---|
| ~ Average Process Chain Runtime | |
| ~ Average nb of rows loaded per load | |
| ~ Total nb of rows loaded (if full) | |
| ~ Average Runtime for 10k lines | |
Give details if any historisation is done. Example: keep only data from the beginning of Y-3 onwards.
Query end-user documentation should be created in the public "Customer Support Wiki" space under the corresponding BW application page: BW - Application. Technical query documentation, if necessary, should be added as a sub-page of this documentation using the BW Technical Query Documentation template.
List only the most important and complex queries, with a link to their documentation.
Give details on all complex functionalities: list the most important and/or complex KPIs, query jumps, alerts.
Indicate if there are broadcasts and give some details on the broadcast settings.
Give the list and explanation of the known, unsolved bugs.
List recurring procedures
Detail planned major evolutions if already known. Example: complete decommissioning of the application is planned in 2017 / extension to the Solvay perimeter planned in the 2nd semester of 2016.