Status

Owner: BAJAJ-ext, Manoj
Stakeholders

Purpose

The purpose of this document is to define the conversion approach for Object 1201 - BP Group Assignment for Inbound Correspondence.

The primary function of this object is to facilitate the assignment of inbound financial messages/deals (coming from external systems such as trading platforms) to existing or new TRM transactions in S/4HANA. The assignment is made to a business partner group. It is maintained in the system using transaction code FTR_INB_ASSIGN and is different from the correspondence tab of the BP Counterparty Company Code Data.

This document details the field list, extraction criteria, scoping, mapping, transformation logic, generation of the pre-load file, and loading of that file for object 1201 - BP Group Assignment for Inbound Correspondence, to be used in the treasury module of S/4HANA.

Key tables and fields: 

The main table used for object BP Group Assignment for Inbound Correspondence is TCORT_INB_ASSIGN and its key fields are as below:

  • PARTNER - Partner - This denotes the Business Partner Number.
  • COMPANY_CODE - Company Code - The company code for which the inbound correspondence assignment is maintained for a group of business partners.
  • RECIPIENT_TYPE - Recipient Type - The business partner's role type, which determines whether the correspondence is to be sent or received. In Syensquo's context this is either
    • Counterparty, or
    • Depository
  • PRODUCT_CAT - Product Category - An internal key that controls processing of the objects defined by the product type. The user must specify the product categories to which the assignment is to be extended.
  • PRODUCT_TYPE - Product Type - The product type allows the subdivision of a product category. Here the user must specify the product types to which the assignment is to be extended. Product types enable company-specific naming of products, e.g. stocks can be grouped into domestic stocks and foreign stocks; bonds can be divided into fixed-interest bonds, variable-interest bonds, and zero bonds.
  • TRANSACTION_TYPE - Transaction Type - The financial transaction type specifies which types of transactions can be carried out with a certain product type. The transaction types to which the assignment is to be extended must be specified, such as
    • investment and borrowing of fixed-term deposits,
    • deposits at notice,
    • purchase and sale of bonds.
  • FUNCTION_IN - Correspondence Class - Denotes the type of treasury-related correspondence between a company code and a business partner assigned with treasury roles. The correspondence class is related to the business transactions of the available message formats, such as SWIFT messages. Correspondence classes are used as configuration terms that control the channel, format, and other attributes of TRM (Treasury and Risk Management) correspondence communication.
  • BPGROUP - Business Partner Group ID - An ID that is created for a group of business partners and assigned to a correspondence class for a combination of Product Category, Product Type, and Transaction Type. The business partner grouping is based on the role of the business partner; in Syensquo's context this applies to business partners with Counterparty or Depository roles.
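To make the key fields concrete, the sketch below shows one hypothetical assignment record shaped after the TCORT_INB_ASSIGN columns described above. All values are invented examples for illustration; real values come from the DCT and the Master Data Standard.

```python
# Hypothetical BP Group Assignment record, mirroring the TCORT_INB_ASSIGN key fields.
# Every value below is an invented example, not actual Syensquo data.
record = {
    "PARTNER": "1000000042",            # Business Partner Number
    "COMPANY_CODE": "1000",             # Company Code
    "RECIPIENT_TYPE": "COUNTERPARTY",   # or "DEPOSITORY" in Syensquo's context
    "PRODUCT_CAT": "510",               # Product Category (numeric key)
    "PRODUCT_TYPE": "51A",              # Product Type (subdivision of the category)
    "TRANSACTION_TYPE": "100",          # Transaction Type for the product type
    "FUNCTION_IN": "MT300",             # Correspondence Class (e.g. a SWIFT format)
    "BPGROUP": "CPTY_GROUP_01",         # Business Partner Group ID (mandatory)
}

# The group ID is the one strictly mandatory field in the target design.
assert record["BPGROUP"], "BPGROUP must never be empty"
```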

The treasury function in legacy is carried out in the Quantum system. For the conversion of this object, the data/details therefore need to be provided by the business in a Data Collection Template (hereinafter referred to as DCT), transformed as applicable for S/4HANA in Syniti, and loaded using a custom program <TBC>, as no LTMC is available for this object.


Conversion Scope

The scope of this document covers the approach for maintaining active assignments of inbound correspondence to a Treasury Business Partner, or a group of Business Partners with a Counterparty or Depository role, for the object 1201 - BP Group Assignment for Inbound Correspondence
in S/4HANA, following "DD-FUN-050 Master Data Standard_1201 - BP Group Assignment for Inbound Correspondence".


List of source systems and approximate number of records

Source | Source Scope | Source Approx No. of Records | Target System | Target Approx No. of Records
Quantum | Business Partner Group Assignment for Inbound Correspondence | NA | S4HANA | 100 <TBC>

Additional Information

Multi-language Requirement

Not applicable

Document Management

Not applicable

Legal Requirement

Not applicable

Special Requirements

As the conversion involves a source system which is non-SAP (Quantum), the below areas need special handling:

  1. The relevant records are to be provided in a DCT and stored in Syniti for transformation.

Target Design

The table below shows the technical design of the target for this conversion approach.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
TCORT_INB_ASSIGN | PARTNER | BP_PARTNR_NEW | Business Partner Number | CHAR | 10 | Conditional
TCORT_INB_ASSIGN | COMPANY_CODE | BUKRS | Company Code | CHAR | 4 | Conditional
TCORT_INB_ASSIGN | RECIPIENT_TYPE | FTR_RECIPIENT_TYPE | Recipient/Sender Type | CHAR | 15 | Conditional
TCORT_INB_ASSIGN | PRODUCT_CAT | SANLF | Product Category | NUMC | 3 | Conditional
TCORT_INB_ASSIGN | PRODUCT_TYPE | VVSART | Product Type | CHAR | 3 | Conditional
TCORT_INB_ASSIGN | TRANSACTION_TYPE | TB_SFHAART | Transaction Type | CHAR | 3 | Conditional
TCORT_INB_ASSIGN | FUNCTION_IN | FTR_FUNCTION | Correspondence Class | CHAR | 15 | Conditional
TCORT_INB_ASSIGN | BPGROUP | FTR_BPGRP | Business Partner Group ID | CHAR | 15 | Mandatory
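The field lengths, data types, and requirement flags in the target design above can be enforced as a simple pre-load check. The sketch below is an illustrative Python version of such a check (the kind of rule that would actually be configured in Syniti Migrate, not a delivered program):

```python
# SPEC mirrors the Target Design table: (data type, length, mandatory?).
SPEC = {
    "PARTNER":          ("CHAR", 10, False),
    "COMPANY_CODE":     ("CHAR", 4,  False),
    "RECIPIENT_TYPE":   ("CHAR", 15, False),
    "PRODUCT_CAT":      ("NUMC", 3,  False),
    "PRODUCT_TYPE":     ("CHAR", 3,  False),
    "TRANSACTION_TYPE": ("CHAR", 3,  False),
    "FUNCTION_IN":      ("CHAR", 15, False),
    "BPGROUP":          ("CHAR", 15, True),
}

def validate_row(row: dict) -> list:
    """Return a list of error messages for one DCT row (empty list = valid)."""
    errors = []
    for field, (dtype, length, mandatory) in SPEC.items():
        value = str(row.get(field, "") or "")
        if mandatory and not value:
            errors.append(f"{field}: mandatory field is empty")
        if len(value) > length:
            errors.append(f"{field}: value '{value}' exceeds length {length}")
        if dtype == "NUMC" and value and not value.isdigit():
            errors.append(f"{field}: NUMC field must be numeric")
    return errors
```

A row passing this check is structurally target-ready; the "Conditional" requirements from the Master Data Standard would still need their own business rules on top.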


Data Cleansing

All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.

If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.

ID | Criticality | Error Message/Report Description | Rule | Output | Source System


























Conversion Process

The high-level process is represented by the diagram below:

Summarize High-Level Process. Include diagrams, where applicable. Include information supporting details of Extract, Transform and Load specific to the Data Object


Data Privacy and Sensitivity

Summarize Data Privacy and Sensitivity Requirements, if any


Extraction

Extract data from a source into Syniti Migrate. There are 2 possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are 3 methods:
    1. Perform full data extraction from relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, data is loaded to the repository from the provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template).

The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.
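As a sketch of that relevancy step: since this object only covers business partners with Counterparty or Depository roles (per the Conversion Scope), a filter of that shape could look as follows. The criteria and field values here are assumptions for illustration; the agreed criteria govern.

```python
# Assumed relevancy criteria: keep only rows for the two in-scope recipient roles.
RELEVANT_ROLES = {"COUNTERPARTY", "DEPOSITORY"}

def apply_relevancy(extracted_rows):
    """Keep only DCT rows whose RECIPIENT_TYPE matches the agreed relevancy criteria."""
    return [
        row for row in extracted_rows
        if str(row.get("RECIPIENT_TYPE", "")).upper() in RELEVANT_ROLES
    ]
```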

Extraction Run Sheet

Req # | Requirement Description | Team Responsible













Selection Screen

If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
Selection Ref | Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set





















Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the Data Object data, with the exception of some fields which require transformation as mentioned in the transformation rules.

<Object> DCT Rules

Field Name | Field Description | Rule












Extraction Dependencies

List the steps that need to occur before extraction can commence

Item # | Step Description | Team Responsible













Transformation

The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, the Functional team, and the Data team. It identifies the transformation activity required to allow Syniti Migrate to make the data target-ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform Pre-load Data Validation.
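Step 1 (value mapping with a default) can be sketched as below. The legacy Quantum codes, target values, and default shown are invented placeholders; the real mapping tables are maintained in Syniti Migrate and listed under Transformation Mapping.

```python
# Hypothetical legacy-to-target value mapping for one field (Product Type).
# Codes and values are invented examples, not actual Quantum or S/4HANA data.
PRODUCT_TYPE_MAP = {
    "FXFWD": "60A",   # e.g. FX forward
    "MMDEP": "51A",   # e.g. money market deposit
}
DEFAULT_PRODUCT_TYPE = "51A"  # assumed default for unmapped legacy values

def map_product_type(legacy_value: str) -> str:
    """Map a legacy Quantum product code to its to-be S/4HANA product type."""
    return PRODUCT_TYPE_MAP.get(legacy_value, DEFAULT_PRODUCT_TYPE)
```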

Transformation Run Sheet

Item # | Step Description | Team Responsible













Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic









































Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
Mapping Table Name | Mapping Table Description








Transformation Dependencies

List the steps that need to occur before transformation can commence
Item # | Step Description | Team Responsible













Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Accuracy

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Business

The following pre-load validations will be performed by the business.

Completeness

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Accuracy

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
  2. Once the data is loaded to the target system, it will be extracted and prepared for Post-Load Data Validation.
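Since the custom load program is still <TBC>, the pre-load file format below is only an assumption for illustration: a delimited file with the TCORT_INB_ASSIGN field names as header. Delimiter, column order, and file naming must be confirmed against the final program specification.

```python
import csv

# Column order assumed to follow the Target Design section; confirm against <TBC>.
FIELDS = [
    "PARTNER", "COMPANY_CODE", "RECIPIENT_TYPE", "PRODUCT_CAT",
    "PRODUCT_TYPE", "TRANSACTION_TYPE", "FUNCTION_IN", "BPGROUP",
]

def write_load_file(rows, path):
    """Write target-ready rows to a semicolon-delimited pre-load file with header."""
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=FIELDS, delimiter=";")
        writer.writeheader()
        writer.writerows(rows)
```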

Load Run Sheet

Item # | Step Description | Team Responsible













Load Phase and Dependencies

Identify the phase as to “when” the load for this object will occur. <Pre-Cutover, Cutover, Post Cutover> and list the steps that need to occur before the load can commence

Configuration

List the Configurations required before loading can commence

Item # | Configuration Item






Conversion Objects

Object # | Preceding Object Conversion Approach

list the exact title of the conversion object of only the immediate predecessor – this will then confirm the DDD (Data Dependency Diagram)




Error Handling

The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.

Error Type | Error Description | Action Taken










Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Accuracy

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Business

The following post-load validations will be performed by the business.

Completeness

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Accuracy

Task | Action
title | specific details of what and how the task needs to be performed, e.g. which reports are being used etc.





Key Assumptions

  • Master Data Standard is up to date as on the date of documenting this conversion approach and data load.
  • Data Object is in scope based on data design and any exception requested by business.

Any additional key assumptions.


See also

Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.

Change log