Status

Owner: VIDWANS-ext, Sauradh
Stakeholders

Purpose

The purpose of this document is to define the conversion approach to create Facilities Transactions in S/4 HANA.

Facilities are borrowing arrangements with either one or multiple banks. A long-term borrowing facility with a single bank is called a Bilateral Facility. Under this facility, you can define the fund requirements, and the bank will earmark these funds to be drawn at certain intervals. A facility with multiple banks is called a Syndicated Facility. In a syndicate there is normally a Lead Banker, who arranges the lending commitment from the other lenders and also charges Lead Banker fees.

These instruments are created under a certain Product Type (instrument number), with a Transaction Type (the kind of action, i.e. buying or selling), against a counterparty (BP number), in a Company Code.

Conversion Scope

The scope of this document covers the approach for converting active Facilities from Legacy Source System (Quantum) into S/4HANA.

Relevancy criteria are not applicable for this object, as the relevant data will be provided by the business in a DCT (Data Collection Template).

List of source systems and approximate number of records


Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records
Quantum | Live Facility Contracts | 100 <TBC> | S/4HANA | 100 <TBC>

Additional Information

Multi-language Requirement

Summarize Multi-language Requirement/s, if any

Document Management

Summarize Document Management requirement, if any

Legal Requirement

Summarize Legal Requirement/s, if any

Special Requirements


As the conversion involves a non-SAP source system, the below areas need special handling:

  1. Extraction of Facility data from the source system (Quantum) is to be done manually.
  2. Cleansing by the business team in the source system (Quantum) is needed to ensure that only active and in-use Facility records are provided in the DCT.
  3. The extracted records are to be converted into a DCT and stored in Syniti for transformation.
  4. The below configuration check tables need to be referenced in Syniti to validate the data in the DCT:
    1. Counterparty to Business Partner - BUT100
    2. Company Code - T001
    3. Product Type - TZPA
    4. Transaction Type - AT10
    5. Trader - TZDEA
    6. Portfolio Code - TWPOB
    7. General Valuation Class - TRGC_COM_VALCL
    8. Interest Calculation Method - custom check table with value fields given in the below CNV
  5. Amount and date formats from the source system are to be aligned with S/4HANA.
  6. As the target structure in S/4HANA has dependencies on various Configuration and BP Master Data objects (CNV 3040), all of this data needs to be loaded before the transaction data loads.
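The format-alignment and check-table validation steps above can be sketched as follows. This is a minimal illustration, not the Syniti implementation; the Quantum export formats (DD/MM/YYYY dates, comma thousands separators) and the sample company codes are assumptions to be confirmed with the business:

```python
from datetime import datetime
from decimal import Decimal

# Illustrative extract of a configuration check table (company codes from T001);
# real values come from the S/4HANA configuration tables listed above.
COMPANY_CODES = {"1000", "2000"}

def to_s4_date(quantum_date: str) -> str:
    """Convert an assumed DD/MM/YYYY Quantum date to the S/4HANA DATS format (YYYYMMDD)."""
    return datetime.strptime(quantum_date, "%d/%m/%Y").strftime("%Y%m%d")

def to_s4_amount(quantum_amount: str) -> Decimal:
    """Strip assumed thousands separators so the amount parses as a plain decimal."""
    return Decimal(quantum_amount.replace(",", ""))

def validate_company_code(code: str) -> str:
    """Flag DCT values that do not exist in the configuration check table."""
    if code not in COMPANY_CODES:
        raise ValueError(f"Company code '{code}' not found in T001 extract")
    return code

print(to_s4_date("31/03/2025"))        # 20250331
print(to_s4_amount("1,250,000.00"))    # 1250000.00
print(validate_company_code("1000"))   # 1000
```

Records failing such checks would be reported back to the business for cleansing in Quantum rather than corrected in flight.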


Custom Check Tables for Interest Calculation Method, Frequency and Update Rule are as under:

Interest Calculation Method:

Interest Calculation Method | Text
1 | 360E/360
B | 360E/365
L | 360/360 (German)
F | 360/360 (ISDA)
H | 360/365 (ISDA)
7 | 360/360
2 | act/360
E | act/364
3 | act/365
R | act/365.25
4 | act/366
M | act/365P
5 | act/actP (ICMA)
6 | act/actY (ISDA)
G | act/actE (AFB)
Q | act/actEP (AFB)
N | act/365L
S | act/365Y
A | actW/252
D | 365/360
C | 365/365
I | 360E/actY
8 | 30.42E/360
9 | *365.25/360
P | 01-01-2026
J | m+act/360
K | m+30/360
0 | Not Specified
T | act/act[M]
U | actW/actW[M]


Update Rule:

Rule Number | Text
0 | Pro Rata Temporis
1 | Include Fully
2 | Do Not Include


Frequency:

Frequency | Text
1 | At End of Term
5 | On First Day of Month
2 | On Last Day of Month
3 | Monthly
4 | Daily
0 | Manual Input
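The custom check tables above can be represented as simple lookups for Syniti-side validation. A sketch (the entries are taken from the tables above; only a subset of the Interest Calculation Method table is shown):

```python
# Update Rule check table (from the document)
UPDATE_RULE = {"0": "Pro Rata Temporis", "1": "Include Fully", "2": "Do Not Include"}

# Frequency check table (from the document)
FREQUENCY = {
    "1": "At End of Term",
    "5": "On First Day of Month",
    "2": "On Last Day of Month",
    "3": "Monthly",
    "4": "Daily",
    "0": "Manual Input",
}

# A few Interest Calculation Method entries (subset of the full table above)
INTEREST_CALC_METHOD = {"1": "360E/360", "2": "act/360", "3": "act/365", "7": "360/360"}

def validate(code: str, check_table: dict) -> str:
    """Return the descriptive text, or raise so the DCT row is flagged as an error."""
    if code not in check_table:
        raise ValueError(f"Value '{code}' not found in check table")
    return check_table[code]

print(validate("3", INTEREST_CALC_METHOD))  # act/365
```

A DCT row carrying a code outside these tables would be rejected during transformation rather than passed to the load.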




Target Design


For a Transaction Load DCT, since there are no standard LTMC objects that can be used for loading this data, BAPIs will be used for this purpose. A BAPI contains many fields, not all of which need to be utilized as part of the DCT.

In this section, we will see the full list of fields that are part of the DCT. This section will also explain the field-by-field mapping of the DCT structure and where the data for each field will come from.

The technical design of the target structure for this conversion approach is as follows:


Table Name | Field | Field Description | Data Type | Length | Requirement
VTBFHA | BUKRS | COMPANY_CODE | CHAR | 4 | Required
VTBFHA | SGSART | PRODUCT_TYPE | CHAR | 3 | Required
VTBFHA | SFHAART | TRANSACTION_TYPE | CHAR | 3 | Required
VTBFHA | KONTRH | PARTNER | CHAR | 10 | Required
VTBFHA | DBLFZ | CONTRACT_DATE | DATS | 8 | Required
VTBFHAZU | RDEALER | TRADER | CHAR | 12 | Optional
VTBFHA | RPORTB | PORTFOLIO | CHAR | 10 | Optional
VTBFHA | ZUOND | ASSIGNMENT | CHAR | 18 | Optional
VTBFHAZU | NORDEXT | EXTERNAL_REFERENCE | CHAR | 16 | Optional
VTBFHA | REFER | INTERNAL_REFERENCE | CHAR | 16 | Optional
VTBFHA | MERKM | CHARACTERISTICS | CHAR | 25 | Required
VTBFHA | TPM_COM_VAL_CLASS | VALUATION_CLASS | NUMC | 4 | Required
VTBFHA | DBLFZ | Term Start | DATS | 8 | Required
VTBFHA | DELFZ | Term End | DATS | 8 | Required
NA | NA | Calculation Period: Start Inclusive vs. End Inclusive | CHAR | 1 | Required
VTBFHA | WGSCHFT | Transaction Currency | CUKY | 5 | Required
VTB_ASGN_LIMIT | LIMIT_DATE | FTR Assignment Management: Limit Date | DATS | 8 | Required
VTB_ASGN_LIMIT | LIMIT_CHG_AMOUNT | FTR Assignment Management: Drawing Amount | CURR | 17 | Required
VTB_ASGN_LIMIT | LIMIT_CURRENCY | FTR Assignment Management: Limit Currency | CUKY | 5 | Required
VTB_ASGN_LIMIT | LIMIT_POS_AMOUNT | FTR Assignment Management: Drawing Amount (BAPI) | DEC | 23 | Required
VTBFINKO | SRHYTHM | Frequency | NUMC | 3 | Optional
VTBFINKO | AMMRHY or ATTRHY (pick whichever field has a value: if the Frequency Indicator is set to Months, AMMRHY will have a value; if it is set to Days, ATTRHY will have a value) | Unit for Frequency | NUMC | 1 | Optional
VTBFHAPO | PKOND | Percentage rate for condition items | DEC | 10 | Optional
NA | NA | Calculation method for first period of condition item | NUMC | 1 | Optional
NA | NA | Calculation method for last period of condition item | NUMC | 1 | Optional
NA | NA | Interest Calculation Method | CHAR | 1 | Optional
VTBFHAPO | SKALIDWT | Interest Calendar | CHAR | 2 | Optional
NA | NA | Update for Calculation/Due Date for Interest | NUMC | 2 | Optional
VTBFHAPO | DVALUT | Calculation Date | DATS | 8 | Optional
NA | NA | Inclusive Indicator for Calculation Date | CHAR | 1 | Optional
NA | NA | Month-End Indicator for Calculation Date | CHAR | 1 | Optional
VTBFINKO | SVWERK | Working Day Shift for Calculation Day | NUMC | 1 | Optional
VTBFINKO | DFAELL | Due Date | DATS | 8 | Optional
NA | NA | Month-End Indicator for Due Date | CHAR | 1 | Optional
VTBFINKO | SFWERK | Working Day Shift for Due Date | NUMC | 1 | Optional
NA | NA | Shift Due Date Back to End of Term | CHAR | 1 | Optional


There are 5 main tables in consideration here:

  1. VTBFHA
  2. VTBFHAPO
  3. VTBFINKO
  4. VTB_ASGN_LIMIT
  5. VTBFHAZU

The reference between the tables is the combination of Company Code and Transaction Number; this composite key is used to join these tables.

In addition to this, the "Characteristics" field will hold the legacy Quantum deal number. This forms the linkage between deals loaded in the system and Quantum deals.
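The composite-key relationship between the tables can be illustrated as below. This is a minimal sketch with made-up data; the field names BUKRS (Company Code), RFHA (Transaction Number) and MERKM (Characteristics) follow the target-design table, while the sample values are hypothetical:

```python
# Header rows keyed by (Company Code, Transaction Number) -- the composite primary key
vtbfha = {
    ("1000", "0000000001"): {"SGSART": "55A", "MERKM": "QTM-12345"},
}

# Condition rows carry the same composite key, linking them to their header
vtbfinko = [
    {"BUKRS": "1000", "RFHA": "0000000001", "SRHYTHM": "003"},
]

# Join each condition row to its header via the composite key
for row in vtbfinko:
    header = vtbfha[(row["BUKRS"], row["RFHA"])]
    # MERKM (Characteristics) holds the legacy Quantum deal number
    print(header["MERKM"], row["SRHYTHM"])
```

The same join applies to VTBFHAPO, VTB_ASGN_LIMIT and VTBFHAZU rows, and the MERKM value provides the reconciliation handle back to Quantum.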

The table above lists the fields in the DCT. Next, we will look at the fields in the BAPI and the logic for how each corresponds to the DCT.


Data Cleansing

All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.

If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.

ID | Criticality | Error Message/Report Description | Rule | Output | Source System


Conversion Process

The high-level process is represented by the diagram below:

Summarize High-Level Process. Include diagrams, where applicable. Include information supporting details of Extract, Transform and Load specific to the Data Object


Data Privacy and Sensitivity

Summarize Data Privacy and Sensitivity Requirements, if any


Extraction

Extract data from a source into Syniti Migrate. There are 2 possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are 3 methods:
    1. Perform full data extraction from relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, data is loaded into the repository from the provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate. This is to be conducted using a DCT (Data Collection Template) in Syniti Migrate.
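Where the third method applies (a source extract/report is provided), staging the file can be sketched as below. The file layout and column names are illustrative assumptions, not the actual Quantum report format:

```python
import csv
import io

# Illustrative Quantum extract content; in practice this is a file
# provided by the business when Syniti Migrate cannot connect directly.
extract = io.StringIO(
    "DealNumber,Counterparty,StartDate\n"
    "QTM-12345,BANK01,31/03/2025\n"
)

# Stage each extract row as a dict for downstream transformation
staged = list(csv.DictReader(extract))
print(staged[0]["DealNumber"])  # QTM-12345
```

From this staged form, the records would be carried into the DCT structure for transformation.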

The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.

Extraction Run Sheet

Req # | Requirement Description | Team Responsible


Selection Screen

If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set


Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the Data Object data, with the exception of some fields that require transformation as mentioned in the transformation rules.

<Object> DCT Rules

Field Name | Field Description | Rule


Extraction Dependencies

List the steps that need to occur before extraction can commence

Item # | Step Description | Team Responsible


Transformation

The target fields are mapped to the applicable legacy fields that will be their source. This is a 3-way activity involving the Business, the Functional team and the Data team. It identifies the transformation activity required to allow Syniti Migrate to make the data target ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform Pre-Load Data Validation.

Transformation Run Sheet

Item # | Step Description | Team Responsible


Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic


Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
Mapping Table Name | Mapping Table Description








Transformation Dependencies

List the steps that need to occur before transformation can commence
Item # | Step Description | Team Responsible


Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Accuracy

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Business

The following pre-load validations will be performed by the business.

Completeness

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Accuracy

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually
  2. Once the data is loaded into the target system, it will be extracted and prepared for Post-Load Data Validation
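Assembling a target-ready load record from a transformed DCT row can be sketched as below. The exact BAPI and its parameter structures must be confirmed in the system, so this only shows gathering and checking the required header fields from the target-design table:

```python
def build_header(dct_row: dict) -> dict:
    """Assemble the required header fields (per the target-design table),
    rejecting rows where any required field is missing or empty."""
    required = ["COMPANY_CODE", "PRODUCT_TYPE", "TRANSACTION_TYPE",
                "PARTNER", "CONTRACT_DATE"]
    missing = [f for f in required if not dct_row.get(f)]
    if missing:
        raise ValueError(f"DCT row missing required fields: {missing}")
    return {f: dct_row[f] for f in required}

# Hypothetical transformed DCT row (values are illustrative only)
row = {"COMPANY_CODE": "1000", "PRODUCT_TYPE": "55A",
       "TRANSACTION_TYPE": "100", "PARTNER": "BP0000001",
       "CONTRACT_DATE": "20250331"}
print(build_header(row)["COMPANY_CODE"])  # 1000
```

Rows rejected here would be raised as load defects rather than passed to the BAPI call.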

Load Run Sheet

Item # | Step Description | Team Responsible


Load Phase and Dependencies

Identify the phase in which the load for this object will occur <Pre-Cutover, Cutover, Post-Cutover> and list the steps that need to occur before the load can commence.

Configuration

List the Configurations required before loading can commence

Item # | Configuration Item






Conversion Objects

Object # | Preceding Object Conversion Approach

list the exact title of the conversion object of only the immediate predecessor – this will then confirm the DDD (Data Dependency Diagram)




Error Handling

The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.

Error Type | Error Description | Action Taken

Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Accuracy

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Business

The following post-load validations will be performed by the business.

Completeness

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Accuracy

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Key Assumptions

  • Master Data Standard is up to date as of the date of documenting this conversion approach and the data load.
  • Data Object is in scope based on data design and any exception requested by business.

Any additional key assumptions.


See also

Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.

Change log