
Status

  Lead Approval

Owner
Stakeholders

Purpose

The purpose of this document is to define the conversion approach for creating Business Partners - Customer (Sales and Service) - FLCU01 in S/4HANA.

In SAP ECC, the Customer Sales View is part of the Customer Master Data, which is used to store customer-related information for sales transactions. It includes details such as sales area, pricing, delivery preferences, and billing information. The setup typically involves maintaining customer records separately for different sales organizations, distribution channels, and divisions.
In SAP S/4HANA, the Customer Sales View is integrated into the Business Partner (BP) model, which replaces the traditional customer/vendor objects from ECC. The Business Partner serves as a central entity, allowing a single record to hold multiple roles (e.g., customer and vendor). The Customer Sales View in S/4HANA is represented under the BP role FLCU01, which contains sales-specific data such as sales area assignments, pricing conditions, and delivery preferences.


Conversion Scope

The scope of this document covers the approach for converting active Customer Sales View records from the legacy source systems into S/4HANA, following the Business Partners - Customer (Sales and Service) - FLCU01 Master Data Design Standard.


The data from the legacy systems includes records that meet all of the following criteria:

  1. The BP general record has already been migrated.
  2. The sales area under which the sales view data is maintained for the customer is within the S/4HANA scope.
  3. There is no Central Sales Block (KNA1-CASSD) in the customer general master data (this field blocks all sales view data for the customer).
  4. There is no Deletion flag for the customer at sales level (KNVV-LOEVM).
  5. There is no Customer order block at sales area level (KNVV-AUFSD).
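As an illustration, the inclusion criteria above can be expressed as a single filter over the extracted records. The ECC field names (KNA1-CASSD, KNVV-LOEVM, KNVV-AUFSD) come from the list above; the function and the dict-based record structures are hypothetical, not part of the actual tooling.

```python
def is_in_scope(kna1, knvv, s4_sales_areas):
    """Apply the inclusion criteria to one customer sales view.

    kna1/knvv are dicts for the ECC general and sales-area records;
    s4_sales_areas is the set of (VKORG, VTWEG, SPART) tuples in scope.
    """
    sales_area = (knvv["VKORG"], knvv["VTWEG"], knvv["SPART"])
    if sales_area not in s4_sales_areas:
        return False            # criterion 2: sales area not in S/4HANA scope
    if kna1.get("CASSD"):
        return False            # criterion 3: central sales block is set
    if knvv.get("LOEVM"):
        return False            # criterion 4: deletion flag at sales level
    if knvv.get("AUFSD"):
        return False            # criterion 5: order block at sales area level
    return True
```

A record passes only when the sales area is in scope and none of the three blocking fields is populated.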

The data from the legacy systems excludes:

  1. Sales views whose sales organization is out of scope, such as the Oil & Gas and Aroma specific sales organizations.


List of source systems and approximate number of records

Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records
WP2 | Customer Sales View | 80,000 | S4 | 80,000
PF2 | Customer Sales View | 40,000 | S4 | 40,000










Additional Information

Multi-language Requirement

N/A

Document Management

N/A

Legal Requirement

N/A

Special Requirements

Due to compliance requirements, there will be one SAP instance for the Rest of the World (ROW) and one specifically for China. For entities in China, the data will be loaded into the SAP China instance, while the migration process itself remains the same as for the rest of the world.

To identify whether a record belongs to the SAP China instance, the following logic is used.

Customer Master Data - General Information

  1. The customer has sales area data in the entities listed below.
  2. If the customer is used in both China entities and ROW entities, the sales data needs to be created in both the SAP China and ROW instances.
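A minimal sketch of this routing, assuming the set of China-specific sales organizations is known. The example values CN01/CN02 are placeholders; the actual values come from the sales organization table below.

```python
# Hypothetical China-specific sales organizations; the real values come
# from the "SAP China Instance specific Sales Organization" table.
CHINA_SALES_ORGS = {"CN01", "CN02"}

def target_instances(customer_sales_orgs):
    """Decide which instance(s) a customer's sales data is created in.

    customer_sales_orgs is the set of sales organizations in which the
    customer has sales area data.
    """
    targets = set()
    if customer_sales_orgs & CHINA_SALES_ORGS:
        targets.add("SAP China")      # at least one China-specific sales org
    if customer_sales_orgs - CHINA_SALES_ORGS:
        targets.add("ROW")            # at least one non-China sales org
    return targets
```

A customer used in both China and ROW sales organizations is returned with both targets, matching rule 2 above.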


SAP China Instance specific Sales Organization








Target Design

This section describes the technical design of the target structures for this conversion approach.

TableFieldData ElementField DescriptionData TypeLengthRequirement






















Data Cleansing

IDCriticalityError Message/Report DescriptionRuleOutputSource System


























Conversion Process

The high-level process is represented by the diagram below:

The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, i.e. the Syensqo ECC systems (WP2/PF2). The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.


3. Loading
The transformed data is then loaded into the target S/4HANA system.
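As a minimal illustration, the three steps above can be sketched as a single pipeline function. All names here are illustrative and not part of the Syniti Migrate tooling.

```python
def run_etl(extract, transform, load):
    """Minimal skeleton of the Extract -> Transform -> Load steps above."""
    staged = extract()                             # 1. pull raw records (e.g. from WP2/PF2)
    target_ready = [transform(r) for r in staged]  # 2. cleanse, map, apply business rules
    return load(target_ready)                      # 3. load into the S/4HANA target
```

In the actual migration, each callable corresponds to a configured Syniti Migrate step rather than a Python function.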


[Diagram: high-level ETL process]

Data Privacy and Sensitivity

N/A


Extraction

Extract data from a source into Syniti Migrate. There are two possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are three methods:
    1. Perform full data extraction from the relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, data is loaded into the repository from a provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a Data Collection Template (DCT).

The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.

Extraction Run Sheet

Req #Requirement DescriptionTeam Responsible
Extraction Scope Definition- Identify the source systems and databases involved.
- Define the data objects (tables, fields, records) to be extracted.
- Establish business rules for data selection.
Syniti
Extraction Methodology- Specify the extraction approach (full, incremental, or delta extraction).
- Determine the tools and technologies used.
- Define data filtering criteria to exclude irrelevant records.
Syniti
Extraction Execution Plan- Establish execution timelines and batch processing schedules.
- Assign responsibilities for extraction monitoring.
- Document dependencies on other migration tasks.
Syniti
Data Quality and Validation- Define error handling mechanisms for extraction failures.
Syniti


Selection Screen

Selection Ref ScreenParameter NameSelection TypeRequirementValue to be entered/set
N/A



















Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the data, with the exception of some fields which require transformation as described in the transformation rules.

DCT Rules

Field NameField DescriptionRule












Extraction Dependencies

Item #Step DescriptionTeam Responsible
1

Source System Availability

  • Ensure that the source database or application is accessible.
  • Confirm that the necessary credentials and permissions are granted.
Syensqo IT
2

Data Structure

  • Identify relationships between tables, views, and stored procedures.
Syniti
3

Referential Integrity

  • Ensure dependent records are extracted together.
Syniti
4

Extraction Methodology

  • Define whether extraction is full, incremental, or delta-based.
  • Establish batch processing schedules for large datasets.
Syniti
5

Performance and Scalability Considerations

  • Optimize extraction queries to prevent system overload.
  • Ensure network bandwidth supports data transfer volumes.
Syniti
6

Security and Compliance

  • Adhere to regulatory standards for sensitive information, if applicable.
Syniti


Transformation

The target fields are mapped to the applicable legacy fields that will be their source; this is a three-way activity involving the Business, Functional, and Data teams. This identifies the transformation activity required to allow Syniti Migrate to make the data target ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform Pre-Load Data Validation.
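As a minimal sketch of step 1 (value mapping with an optional default), assuming the mapping tables are available as simple dictionaries. The MAP_ZTERM entries shown are illustrative only; the actual mappings live in Syniti Migrate.

```python
# Hypothetical excerpt of a payment-terms mapping (cf. MAP_ZTERM below).
MAP_ZTERM = {"Z030": "NT30", "Z060": "NT60"}

def map_value(mapping, legacy_value, default=None):
    """Map a legacy value to its to-be value, optionally falling back
    to an agreed default when no mapping entry exists (rule 1.1)."""
    return mapping.get(legacy_value, default)
```

Unmapped values surface as the default (or None), which makes missing mapping entries easy to detect during pre-load validation.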

Transformation Run Sheet

Item #Step DescriptionTeam Responsible
1Transformation Scope Definition
- Identify the source and target data structures.
- Define business rules for data standardization.
- Establish data cleansing requirements to remove inconsistencies.
Data Team
2Data Mapping and Standardization
- Align source fields with target fields.
- Ensure unit consistency (e.g., currency, measurement units)
Data Team
3Business Rule Application
- Implement data enrichment/collection if applicable
- Apply conditional transformations based on predefined logic/business rules
Data Team
4Transformation Execution Plan
- Define batch processing schedules.
- Assign responsibilities for monitoring execution.
- Establish error-handling mechanisms
Syniti


Transformation Rules

Rule #Source systemSource TableSource FieldSource DescriptionTarget SystemTarget TableTarget FieldTarget DescriptionTransformation Logic









































Transformation Mapping

Mapping Table NameMapping Table Description
MAP_VKORGSales Organization Mapping Table
MAP_VTWEGDistribution Channel Mapping Table
MAP_SPARTDivision Mapping table
MAP_ZTERMPayment terms Mapping table
MAP_PARVWPartner Function Mapping table 


Transformation Dependencies

The steps below need to occur before transformation can commence:
Item #Step DescriptionTeam Responsible
1Source Data Integrity
- Ensure extracted data is complete, accurate, and consistent.
- Validate that data types and formats align with transformation requirements.
Syniti
2Referential Integrity
- Ensure dependent records are transformed together or in advance, such as CNV-3007 Business Partner General.
Syniti
3Transformation Logic and Mapping
- Define data mapping rules between source and target schemas.
Data Team
4Performance and Scalability Considerations
- Optimize transformation processes for large datasets.
- Ensure system resources can handle transformation workloads.
Syniti
5Logging and Error Handling
- Maintain detailed logs of transformation activities.
- Define error-handling procedures for failed transformations.
Syniti


Pre-Load Validation

Project Team

Completeness

TaskAction





Accuracy

TaskAction





Business

Completeness

TaskAction





Accuracy

TaskAction





Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
  2. Once the data is loaded into the target system, it is extracted and prepared for Post-Load Data Validation.

Load Run Sheet

Item #Step DescriptionTeam Responsible













Load Phase and Dependencies

Configuration

Item #Configuration Item






Conversion Objects

Object #Preceding Object Conversion Approach
3007Business Partner General (Role 000000)

Employee Personal Information
3011Business Partners - Contact Persons (BUP001)


Error Handling

Error TypeError DescriptionAction Taken
Configuration / Data Transformation: The value XXX for field XXX doesn't exist
  1. Check that the mapping/conversion is done properly in the load file.
  2. Validate that the target value is configured/transported in the target system.
  3. Reach out to the functional team to validate the configuration.
Configuration: Mandatory field XXX is missing
  1. Validate in the MDS whether the fields are set as mandatory.
  2. Validate whether there is a value in the pre-load file.
  3. Validate whether the configuration for the mandatory fields is done properly.




Post-Load Validation

Project Team

Completeness

TaskAction





Accuracy

TaskAction





Business

Completeness

TaskAction





Accuracy

TaskAction





Key Assumptions

  • The Master Data Standard is up to date as of the date of this conversion approach document and the data load.
  • is in scope based on data design and any exception requested by business.


Change log

Version Published Changed By Comment
CURRENT (v. 3) Apr 08, 2026 13:46 RUAN-ext, Eric CR0455 update
v. 58 Apr 07, 2026 14:40 RUAN-ext, Eric CR0455 update the common DC logic for export/domestic
v. 57 Mar 23, 2026 09:56 RUAN-ext, Eric *20260323 change for shipping condition report
v. 56 Mar 17, 2026 14:54 RUAN-ext, Eric *20260317 Mapping table update
v. 55 Mar 16, 2026 14:03 RUAN-ext, Eric *20260316 mapping table update only
v. 54 Mar 14, 2026 14:09 RUAN-ext, Eric *20260314 update for the account group info in the cleansing report
v. 53 Mar 14, 2026 13:58 RUAN-ext, Eric *20260314 update for the cleansing report logic
v. 52 Mar 05, 2026 16:36 RUAN-ext, Eric *20260305 update to descope some fields
v. 51 Mar 03, 2026 11:46 RUAN-ext, Eric *20260303 update to be more precise when selecting relevant KNVV for migration
v. 50 Feb 22, 2026 13:40 RUAN-ext, Eric *20260222 Remove CUI from ETL graph
