Status

Owner: Prasad Naidu
Stakeholders

Purpose

The purpose of this document is to define the approach, scope, and rules for the conversion of Bank Master Data from legacy systems into SAP S/4HANA. The Bank Master includes information about House Banks and Bank Directory (External Banks) required for processing payment transactions such as vendor payments, customer refunds, and bank reconciliations.

This conversion ensures that all bank-related master data is available and consistent in the S/4HANA system to support business-critical financial operations after go-live. The data migration encompasses external banks (from tables such as BNKA), which store details of banks globally (e.g., SWIFT code, bank address, bank key).

For Syensqo, the bank master data will be loaded initially; post go-live, it will be maintained by loading delta changes on a monthly basis.

Conversion Scope

Bank Master data will be migrated from the PI2 client to S4.


Relevancy Criteria:

1) All Bank keys in BNKA table are in scope from PI2

2) If there are any duplicate keys in the source system:

    a) Read all the records from table BNKA

    b) Eliminate duplicates: group the records from step (a) on BANKS and BANKL and keep a single record per key (e.g., select MAX of the remaining fields grouped by BANKS, BANKL)

3) Business is responsible for Data cleansing.
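The duplicate-elimination step in criterion 2 can be sketched as follows. This is an illustrative Python sketch only, not the Syniti Migrate implementation; the record layout is assumed, and "keep the last record per key" stands in for the MAX aggregation described above.

```python
# Illustrative sketch of relevancy criterion 2: group BNKA records by
# the composite key (BANKS, BANKL) and keep a single record per key.
# Records are modeled as dicts; field names follow the BNKA layout.

def eliminate_duplicates(records):
    """Return one record per (BANKS, BANKL) key, keeping the last seen."""
    deduped = {}
    for rec in records:
        key = (rec["BANKS"], rec["BANKL"])
        deduped[key] = rec  # later rows overwrite earlier duplicates
    return list(deduped.values())

rows = [
    {"BANKS": "DE", "BANKL": "10070000", "BANKA": "Deutsche Bank"},
    {"BANKS": "DE", "BANKL": "10070000", "BANKA": "Deutsche Bank AG"},
    {"BANKS": "US", "BANKL": "021000021", "BANKA": "JPMorgan Chase"},
]
clean = eliminate_duplicates(rows)
```

In SQL terms this corresponds to grouping on BANKS and BANKL and aggregating the remaining columns, as described in step (b).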


List of source systems and approximate number of records:

Source System | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records
PI2 | Bank Master | ~374,000 | S4 | ~374,000

Additional Information

Multi-language Requirement

Summarize Multi-language Requirement/s, if any

Document Management

Summarize Document Management requirement, if any

Legal Requirement

Summarize Legal Requirement/s, if any

Special Requirements

Specify any special requirements or considerations that may impact the data conversion process based on specific locations, regulatory compliance or system limitations. Clearly outline any regional or localization requirements such as country-specific data formats, legal reporting obligations or industry standards that must be adhered to (e.g., localization rules for countries like China).

If the data conversion involves third-party systems or external data sources, such as Icertis, describe any additional requirements related to data mapping, transformation logic, validation rules or security measures that must be followed.




Target Design

With Functional input, document the technical design of the target fields that are in the scope of this document.

The technical design of the target fields for this conversion approach is given below.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
BNKA | BANKS | BANKS | Bank Country/Region Key | CHAR | 3 | Mandatory
BNKA | BANKL | BANKK | Bank Keys | CHAR | 15 | Mandatory
BNKA | BANKA | BANKA | Name of Financial Institution | CHAR | 60 | Mandatory
BNKA | PROVZ | REGIO | Region (State, Province, County) | CHAR | 3 | Optional
BNKA | STRAS | STRAS_GP | Street and House Number | CHAR | 35 | Optional
BNKA | ORT01 | ORT01_GP | City | CHAR | 35 | Optional
BNKA | SWIFT | SWIFT | SWIFT-BIC for International Payments | CHAR | 11 | Optional
BNKA | BGRUP | BGRUP | Bank Group (bank network) | CHAR | 2 | Optional
BNKA | XPGRO | XPGRO | Post Office Bank Current Account Flag | CHAR | 1 | Optional
BNKA | LOEVM | LOEVM | Deletion Indicator | CHAR | 1 | Optional
BNKA | BNKLZ | BANKL | Bank Number | CHAR | 15 | Mandatory
BNKA | ADRNR | AD_ADDRNUM | Address Number | CHAR | 10 | Optional
BNKA | BRNCH | BRNCH | Bank Branch | CHAR | 40 | Optional
BNKA | CHKME | CHECKMETH | Check Digit Calculation Method | CHAR | 4 | Optional
BNKA | VERS | VERS_BF | Format of File with Bank Data | CHAR | 3 | Optional
BNKA | BICKY | PRQ_BICKY | Key of a BIC+ data record (Swift) | CHAR | 12 | Optional
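A pre-load check against the target design can be sketched as below. The field names, lengths, and mandatory flags are taken from the design table; the record shape (a plain dict of strings) is an assumption for illustration and not the Syniti Migrate data model.

```python
# Validate a target-ready BNKA record against the target design:
# mandatory fields must be present and non-blank, and no field may
# exceed its defined CHAR length.
FIELD_LENGTHS = {"BANKS": 3, "BANKL": 15, "BANKA": 60, "PROVZ": 3,
                 "STRAS": 35, "ORT01": 35, "SWIFT": 11, "BGRUP": 2,
                 "XPGRO": 1, "LOEVM": 1, "BNKLZ": 15, "ADRNR": 10,
                 "BRNCH": 40, "CHKME": 4, "VERS": 3, "BICKY": 12}
MANDATORY = {"BANKS", "BANKL", "BANKA", "BNKLZ"}

def validate_record(rec):
    """Return a list of error strings for one BNKA record."""
    errors = []
    for field in MANDATORY:
        if not rec.get(field, "").strip():
            errors.append(f"{field}: mandatory field is empty")
    for field, value in rec.items():
        limit = FIELD_LENGTHS.get(field)
        if limit is not None and len(value) > limit:
            errors.append(f"{field}: value exceeds CHAR {limit}")
    return errors

errs_ok = validate_record({"BANKS": "DE", "BANKL": "10070000",
                           "BANKA": "Deutsche Bank", "BNKLZ": "10070000"})
errs_bad = validate_record({"BANKS": "DE"})  # missing BANKL, BANKA, BNKLZ
```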


Data Cleansing

The data will be migrated from ECC (PI2) to S4. The following cleansing criteria should be applied; the cleansing is to be performed by the business/cleansing team.


ID | Criticality | Error Message/Report Description | Rule | Output | Source System
1 | High | Bank key not unique within country | BANKS + BANKL must be unique | Flag as duplicate; reject or merge records | PI2
2 | High | Missing or invalid SWIFT/BIC code | SWIFT field must be 8 or 11 alphanumeric characters | Flag and request update from a valid bank directory | PI2
3 | Medium | Bank name contains special characters or inconsistent casing | BANKA field should follow naming conventions and avoid special characters | Normalize or correct bank name formatting | PI2
4 | High | Obsolete bank record not linked to any business partner | Check that BANKL is not referenced in any related table (e.g., LFBK, KNBK) | Mark for archival or deletion after verification | PI2
7 | High | Duplicate bank key entries detected | BANKL must be unique per BANKS; fuzzy match for duplicates | Flag duplicates; consolidate or clean up | PI2
10 | High | Bank key format invalid per country-specific rules | BANKL format must match country-specific structure rules (e.g., DE = 8 digits) | Flag and correct according to rulebook or SAP note | PI2
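Cleansing rules 2 and 10 can be expressed as pattern checks. The sketch below is illustrative: it uses the standard ISO 9362 BIC structure (4-letter bank code, 2-letter country code, 2-character location code, optional 3-character branch code) for rule 2, and the 8-digit German Bankleitzahl as the only country-specific example for rule 10; the full country rulebook is not reproduced here.

```python
import re

# Rule 2: SWIFT/BIC must be 8 or 11 characters following the ISO 9362
# structure (bank code, country code, location code, optional branch).
BIC_RE = re.compile(r"^[A-Z]{4}[A-Z]{2}[A-Z0-9]{2}([A-Z0-9]{3})?$")

# Rule 10: country-specific bank key formats (illustrative subset only).
BANKL_RES = {"DE": re.compile(r"^\d{8}$")}  # DE = 8 digits

def check_swift(swift):
    """Rule 2: True if the SWIFT/BIC code is structurally valid."""
    return bool(BIC_RE.match(swift))

def check_bankl(banks, bankl):
    """Rule 10: True if BANKL matches the rule for country BANKS
    (countries without a configured rule pass by default)."""
    pattern = BANKL_RES.get(banks)
    return True if pattern is None else bool(pattern.match(bankl))
```

Records failing either check would be flagged for correction as described in the Output column above.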

Cleansing is not applicable if the bank master file is procured from the third-party SWIFTRef and no legacy bank master needs to be migrated.

Conversion Process

The high-level process is represented by the diagram below:

Data Privacy and Sensitivity

Summarize Data Privacy and Sensitivity Requirements, if any


Extraction

Data will be extracted from PI2 and migrated to S4.

Extraction Run Sheet

Req # | Requirement Description | Team Responsible

NA










Selection Screen

If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
Selection Ref ScreenParameter NameSelection TypeRequirementValue to be entered/set





















Data Collection Template (DCT)

No DCT for this object


Extraction Dependencies

List the steps that need to occur before extraction can commence

Item # | Step Description | Team Responsible

NA










Transformation

The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, Functional team, and Data team, and it identifies the transformation activity required for Syniti Migrate to make the data target-ready:

  1. Perform value mapping and data transformation rules.
    a. Legacy values are mapped to the to-be values (this could include a default value).
    b. Values are transformed according to the rules defined in Syniti Migrate.
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform pre-load data validation.
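Step 1 above (value mapping with an optional default) can be sketched as follows. The mapping-table shape and the sample region codes are assumptions for illustration; they are not the Syniti Migrate mapping format or actual signed-off values.

```python
# Illustrative value-mapping step: replace a legacy value with its
# signed-off to-be value, falling back to a default (or passing the
# value through unchanged) when no mapping entry exists.

def map_value(value, mapping, default=None):
    """Look up a legacy value; return the to-be value, the default,
    or the original value when neither applies."""
    if value in mapping:
        return mapping[value]
    return default if default is not None else value

region_map = {"BAY": "BY", "NRW": "NW"}  # hypothetical legacy -> S/4 codes
mapped = map_value("BAY", region_map)    # mapped to the to-be value
passthru = map_value("HH", region_map)   # unmapped value passes through
```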

Transformation Run Sheet

Item # | Step description | Team responsible
1 | Ensure all the fields that require value mapping, as stipulated in the mapping tables, have the latest signed-off mapping files imported into Syniti Migrate. | Data team
2 | In the tool, select the object Bank Master. | Data team
3 | Go to Process Area Launch and process the object Bank Master. | Data team
4 | Launch the objects to execute transformation. | Data team
5 | Monitor the transformation progress and ensure performance and completion are within the allowed timeframe. | Data team
6 | Generate pre-load reports. | Data team
7 | Generate the data load count. | Data team
8 | Log errors as defects, if any, and address resolutions. Close defects. | Data team
9 | Re-transform and re-validate the pre-load reports if necessary. | Data team
10 | Validate the transformed file as part of pre-load validation; raise data defects or provide the pre-load sign-off. | Business
11 | Analyse and resolve any pre-load defects logged by the business. | Data team
12 | Repeat steps 6 to 11 if necessary. | Data team
13 | Proceed to pre-load validations. | Data team


Transformation Rules


Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic
1 | ECC / SWIFTREF | BNKA | BANKS | Bank Country/Region Key | S/4HANA | BNKA | BANKS | Bank Country/Region Key | Copy as-is from ECC
2 | ECC / SWIFTREF | BNKA | BANKL | Bank Keys | S/4HANA | BNKA | BANKL | Bank Keys | Copy as-is from ECC
3 | ECC / SWIFTREF | BNKA | BANKA | Name of Financial Institution | S/4HANA | BNKA | BANKA | Name of Financial Institution | Copy as-is from ECC
4 | ECC / SWIFTREF | BNKA | PROVZ | Region (State, Province, County) | S/4HANA | BNKA | PROVZ | Region (State, Province, County) | Copy as-is from ECC
5 | ECC / SWIFTREF | BNKA | STRAS | Street and House Number | S/4HANA | BNKA | STRAS | Street and House Number | Copy as-is from ECC
6 | ECC / SWIFTREF | BNKA | ORT01 | City | S/4HANA | BNKA | ORT01 | City | Copy as-is from ECC
7 | ECC / SWIFTREF | BNKA | SWIFT | SWIFT-BIC for International Payments | S/4HANA | BNKA | SWIFT | SWIFT-BIC for International Payments | Copy as-is from ECC
8 | ECC / SWIFTREF | BNKA | BGRUP | Bank Group (bank network) | S/4HANA | BNKA | BGRUP | Bank Group (bank network) | Copy as-is from ECC
9 | ECC / SWIFTREF | BNKA | XPGRO | Post Office Bank Current Account Flag | S/4HANA | BNKA | XPGRO | Post Office Bank Current Account Flag | Copy as-is from ECC
10 | ECC / SWIFTREF | BNKA | LOEVM | Deletion Indicator | S/4HANA | BNKA | LOEVM | Deletion Indicator | Copy as-is from ECC
11 | ECC / SWIFTREF | BNKA | BNKLZ | Bank Number | S/4HANA | BNKA | BNKLZ | Bank Number | Copy as-is from ECC
12 | ECC / SWIFTREF | BNKA | ADRNR | Address Number | S/4HANA | BNKA | ADRNR | Address Number | Copy as-is from ECC
13 | ECC / SWIFTREF | BNKA | BRNCH | Bank Branch | S/4HANA | BNKA | BRNCH | Bank Branch | Copy as-is from ECC
14 | ECC / SWIFTREF | BNKA | CHKME | Check Digit Calculation Method | S/4HANA | BNKA | CHKME | Check Digit Calculation Method | Copy as-is from ECC
15 | ECC / SWIFTREF | BNKA | VERS | Format of File with Bank Data | S/4HANA | BNKA | VERS | Format of File with Bank Data | Copy as-is from ECC
16 | ECC / SWIFTREF | BNKA | BICKY | Key of a BIC+ data record (Swift) | S/4HANA | BNKA | BICKY | Key of a BIC+ data record (Swift) | Copy as-is from ECC


Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
Mapping Table Name | Mapping Table Description








Transformation Dependencies

List the steps that need to occur before transformation can commence
Item # | Step Description | Team Responsible













Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

Task | Action
Generation of pre-load reports | Reconciliation of total record count: summary of record counts from the constructed dataset, and the valid and invalid record sets.




Accuracy

Task | Action
Mandatory field mapping and transformation | Obtain a list of the fields to be populated with values from mapping files and ensure all these fields contain S/4HANA values. Review the data report to ensure no mapping value is missing in the tool. Capture errors in the Data Error report.

Business

The following pre-load validations will be performed by the business.

Completeness

Task | Action
Verify record count in pre-load reports | Ensure the third-party SWIFTREF file used for Bank Master is up to date and is the latest version.





Accuracy

Task | Action
(title) | (specific details of what and how the task needs to be performed, e.g., which reports are being used)





Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
  2. Once the data is loaded into the target system, it will be extracted and prepared for post-load data validation.

Load Run Sheet

Item # | Step description | Team responsible
1 | Ensure the load tools are transported into the correct tool instance. | Data team
2 | Ensure DCTs and all required mappings are submitted and complete. | Data team
3 | Ensure pre-load sign-offs are obtained. | Data team
4 | Execute the tool Trial Balance Upload. | Data team
5 | Generate the post-load reports in the tool. | Data team
6 | Log errors as defects, if any, and address resolutions. Close defects. | Data team
7 | Resolve defects by re-uploading and re-generate post-load reports if necessary. | Data team
8 | Validate the post-load files as part of post-load validation; raise data defects or provide the post-load sign-off. | Business
9 | Repeat steps 5 to 7 if necessary. | Data team

Load Phase and Dependencies

Identify the phase as to “when” the load for this object will occur. <Pre-Cutover, Cutover, Post Cutover> and list the steps that need to occur before the load can commence

Configuration

List the Configurations required before loading can commence

Item # | Configuration Item






Conversion Objects

Object # | Preceding Object | Conversion Approach

list the exact title of the conversion object of only the immediate predecessor – this will then confirm the DDD (Data Dependency Diagram)




Error Handling

The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.

Error Type | Error Description | Action Taken










Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action
Reconciliation of record count | Confirm all the Bank Master records listed in the construction file (DCT) / from the third-party SWIFTREF file are loaded.





Accuracy

Task | Action
Check values in key fields for accuracy | Post-load reports will have the same structure as the load file, plus additional columns as required to facilitate the post-load validation. Any mismatch will be reported in the Post-Load Error report. Each field will have a source column, a target column, and a matching status (XXXX_ECC, XXXX_S4HANA, XXXX_MATCH). A loading-status field shows whether the record is loaded in S/4HANA, and a further field lists the unmatched fields in the post-load file.
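The post-load report row described above (XXXX_ECC, XXXX_S4HANA, XXXX_MATCH columns per field, plus loading status and unmatched-field list) can be sketched as follows; this is an illustrative Python sketch, and the record shape and "X" flag convention are assumptions, not the report tool's actual output format.

```python
# Build one post-load comparison row: for each field, emit the source
# value, the loaded S/4HANA value, and a match flag, plus a loading
# status and the list of unmatched fields. Column naming follows the
# XXXX_ECC / XXXX_S4HANA / XXXX_MATCH convention described above.

def compare_record(ecc_rec, s4_rec, fields):
    row, unmatched = {}, []
    for f in fields:
        src, tgt = ecc_rec.get(f, ""), s4_rec.get(f, "")
        match = src == tgt
        row[f"{f}_ECC"] = src
        row[f"{f}_S4HANA"] = tgt
        row[f"{f}_MATCH"] = "X" if match else ""
        if not match:
            unmatched.append(f)
    row["LOADED"] = "X" if s4_rec else ""          # record exists in S/4HANA
    row["UNMATCHED_FIELDS"] = ",".join(unmatched)  # fields that differ
    return row

row = compare_record(
    {"BANKA": "Deutsche Bank", "SWIFT": "DEUTDEFF"},
    {"BANKA": "Deutsche Bank", "SWIFT": "DEUTDEFF500"},
    ["BANKA", "SWIFT"],
)
```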






Business

The following post-load validations will be performed by the business.

Completeness

Task | Action
Record count check | Review the record count report from the Data team and ensure it is correct by cross-checking against the record count confirmed during pre-load business validations. Business will check the BNKA table using transaction SE16N.





Accuracy

Task | Action
Complete check | The post-load file contains the matching status of each field.
Spot check |


Key Assumptions

  • The Master Data Standard is up to date as of the date of documenting this conversion approach and data load.
  • The data object is in scope based on the data design and any exceptions requested by the business.

Any additional key assumptions.


See also

Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.

Change log

Workflow history