
Status: Lead Approval

Owner:
Stakeholders:

Purpose

The purpose of this document is to define the conversion approach for creating Business Partners - Customer (Sales and Service) - FLCU01 in S/4HANA.

In SAP ECC, the Customer Sales View is part of the Customer Master Data, which is used to store customer-related information for sales transactions. It includes details such as sales area, pricing, delivery preferences, and billing information. The setup typically involves maintaining customer records separately for different sales organizations, distribution channels, and divisions.
In SAP S/4HANA, the Customer Sales View is integrated into the Business Partner (BP) model, which replaces the traditional customer/vendor objects from ECC. The Business Partner serves as a central entity, allowing a single record to hold multiple roles (e.g., customer and vendor). The Customer Sales View in S/4HANA is represented under the BP role FLCU01, which contains sales-specific data such as sales area assignments, pricing conditions, and delivery preferences.


Conversion Scope

The scope of this document covers the approach for converting active customer sales view records from Legacy Source Systems into S/4HANA, following the Business Partners - Customer (Sales and Service) - FLCU01 Master Data Design Standard.


The data from the legacy system includes records where:

  1. The BP general data (customer general data such as the organization name and address) is migrated.
  2. The sales area under which the sales view data is maintained for the customer is within the scope of S/4HANA.
  3. There is no Central Sales Block (KNA1-CASSD) in the customer general master data, which would block all the sales view data of the customer.
  4. There is no Deletion flag for the customer at sales level (KNVV-LOEVM).
  5. There is no Customer order block at sales area level (KNVV-AUFSD).


Criteria | Relevancy Rule | Technical Details
1 | The BP general is migrated | Select where KNVV-KUNNR = KNA1-KUNNR in scope
2 | AND the sales area under which the sales view data is maintained for the customer is within the scope of S/4HANA | AND KNVV-VKORG in (Sales Org in Scope)
3 | AND there is no Central Sales Block (KNA1-CASSD) in the customer general master data, which would block all the sales view data of the customer | AND KNA1-CASSD is initial
4 | AND there is no Deletion flag for the customer at sales view level | AND KNVV-LOEVM is initial
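The relevancy criteria above can be expressed as a single record-level filter. The sketch below is a minimal Python illustration, assuming the KNA1/KNVV extracts are available as dictionaries; the in-scope sales organization list is a hypothetical placeholder:

```python
# Hypothetical in-scope sales organizations (illustrative only;
# the real list comes from the scope definition).
SALES_ORGS_IN_SCOPE = {"1000", "2000"}

def is_relevant(knvv_row, kna1_by_kunnr):
    """Apply relevancy criteria 1-4 to one KNVV (sales view) record."""
    kna1_row = kna1_by_kunnr.get(knvv_row["KUNNR"])    # 1. BP general is migrated
    if kna1_row is None:
        return False
    if knvv_row["VKORG"] not in SALES_ORGS_IN_SCOPE:   # 2. sales org in scope
        return False
    if kna1_row.get("CASSD"):                          # 3. no central sales block
        return False
    if knvv_row.get("LOEVM"):                          # 4. no deletion flag at sales level
        return False
    return True
```

A record passes only when all four criteria hold, mirroring the AND-chained rules in the table.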

The data from the legacy system excludes:

  1. Sales views whose sales organization is out of scope, such as Oil & Gas and Aroma specific sales organizations.
  2. Sales areas marked with the deletion indicator.


List of source systems and approximate number of records

Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records
WP2 | Customer Master Data Sales View (extract from KNVV/KNVI/KNVP etc.) | 50671 | S4 Hana ROW | 50671
PF2 | Customer Master Data Sales View (extract from KNVV/KNVI/KNVP etc.) | 30963 | S4 Hana ROW | 30963
WP2 | Customer Master Data Sales View (extract from KNVV/KNVI/KNVP etc.) | 8552 | S4 Hana China | 8552
PF2 | Customer Master Data Sales View (extract from KNVV/KNVI/KNVP etc.) | 8045 | S4 Hana China | 8045
WP2 | Customer Master Data Sales View (extract from KNVV/KNVI/KNVP etc.) | 10753 | S4 Hana CUI | 10753
PF2 | Customer Master Data Sales View (extract from KNVV/KNVI/KNVP etc.) | N/A | S4 Hana CUI | N/A


Additional Information

Multi-language Requirement

N/A

Document Management

N/A

Legal Requirement

CMMC 2.0 is a mandatory DoD cybersecurity certification for contractors handling Controlled Unclassified Information (CUI) and Federal Contract Information (FCI). CUI includes sensitive technical data (e.g., design specs, system info) related to U.S. military and space applications. The Composites Business handles CUI and is therefore within CMMC scope. Without certification, the business risks disqualification from existing and future DoD programs.
It is mandatory to implement CMMC-compliant systems and processes for all organizations that handle CUI.

Therefore, there will be one SAP instance specifically for CUI-related entities. The migration for CUI-related entities will be covered by a US-based data consultant using separate tools.

Special Requirements

A. Different SAP Instance Migration Approach

Due to compliance requirements, there will be one SAP instance for the Rest of the World (ROW), one for China, and one for CUI.

  1. For entities in China, the data will be loaded into the SAP China instance, while the entire migration process remains the same as for the rest of the world.
  2. For entities which reside in the CUI instance, the migration will be handled by a US-based data consultant.

Please refer to the link for the entity mapping for each instance. SAP instance mapping based on company code

 - To identify which SAP instance a record belongs to, the following logic applies. If the same customer is mapped to multiple instances, the customer sales view data will be migrated into each respective SAP instance.

  1. For customers with a sales view in scope, map the sales org to the company code first, then map the company code to the SAP instance based on the mapping file.
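The instance-routing logic above reduces to two lookups. The mapping values below are hypothetical placeholders; the real mapping comes from the "SAP instance mapping based on company code" file:

```python
# Hypothetical mapping data for illustration only; real values live in
# the "SAP instance mapping based on company code" file.
SALES_ORG_TO_COMPANY_CODE = {"1000": "CC01", "3000": "CC02"}
COMPANY_CODE_TO_INSTANCE = {"CC01": "ROW", "CC02": "China"}

def target_instance(vkorg):
    """Sales org -> company code -> SAP instance (ROW/China/CUI)."""
    company_code = SALES_ORG_TO_COMPANY_CODE[vkorg]
    return COMPANY_CODE_TO_INSTANCE[company_code]
```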


B. One Sales Organization per GBU 

As elaborated in KDD060 - Sales Enterprise Structure - Sales Organization, in the S/4HANA design one sales organization will be mapped to one GBU. Therefore, it is possible that one sales organization in the legacy system is mapped to multiple sales organizations in S/4HANA. When such a scenario happens, one record of customer master data sales view will be split into multiple records based on the mapping.
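The split described above can be sketched as follows; the legacy-to-S/4 sales organization mapping shown is a hypothetical placeholder for the KDD060-based mapping:

```python
# Hypothetical legacy -> S/4 sales org mapping; one legacy sales org can
# map to several GBU-specific sales orgs (per KDD060).
LEGACY_TO_S4_SALES_ORGS = {"1000": ["S100", "S200"]}

def split_sales_view(record):
    """Emit one target sales view record per mapped S/4 sales organization."""
    targets = LEGACY_TO_S4_SALES_ORGS.get(record["VKORG"], [record["VKORG"]])
    return [{**record, "VKORG": s4_org} for s4_org in targets]
```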


C. Distribution Channel Transformation

In the S/4HANA design, there will be three distribution channels defined.

  • Domestic
  • Export
  • Intercompany

The Domestic/Export distribution channels are used for external customers and are determined from the departure country (from the plant, or from the sales organization/company code country if plant information is not available) and the destination country (ship-to party country).

The Intercompany distribution channel is determined by the nature of the business partner: if it is a Syensqo entity or an affiliated company, it is assigned the Intercompany distribution channel.

To identify the Domestic/Export distribution channel, the following logic applies:

  1. Fetch the sales history of the customer within the sales organizations in scope (VBAK/VBAP/VBPA).
  2. Derive the matrix of Sales Organization/Plant/Plant country/Ship-to party/Ship-to country.
  3. If the plant country is the same as the ship-to country, then for the key combination of Sales Organization/Ship-to party, the transformed distribution channel will be Domestic.
  4. If the plant country is different from the ship-to country, then for the key combination of Sales Organization/Ship-to party, the transformed distribution channel will be Export.
  5. Apply the result to the rest of the customers in the same sales document, e.g., Sold-to customer, Payer, Bill-to etc. This yields a matrix of Sales Org/Customer/Distribution Channel (after transformation).
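Steps 2-4 above reduce to a country comparison per Sales Organization/Ship-to party key. A minimal Python sketch, assuming the joined sales history rows are already available (field names are illustrative, not actual extract columns):

```python
def classify_dc(plant_country, ship_to_country):
    """Domestic if departure and destination country match, else Export."""
    return "Domestic" if plant_country == ship_to_country else "Export"

def dc_matrix(sales_history):
    """Derive (sales org, ship-to party) -> distribution channel from
    sales history rows (VBAK/VBAP/VBPA joined upstream)."""
    matrix = {}
    for row in sales_history:
        key = (row["VKORG"], row["ship_to"])
        matrix[key] = classify_dc(row["plant_country"], row["ship_to_country"])
    return matrix
```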


If there is no sales history because the customer is new, similar logic is applied using the partner data from the customer master data sales view:

  1. Fetch the partner information from the customer master data sales view for the sales organizations in scope (KNVV/KNVP).
  2. Derive the matrix of Sales Organization/Country/Ship-to party/Ship-to country.
  3. If the sales organization country is the same as the ship-to country, then for the key combination of Sales Organization/Ship-to party, the transformed distribution channel will be Domestic.
  4. If the sales organization country is different from the ship-to country, then for the key combination of Sales Organization/Ship-to party, the transformed distribution channel will be Export.
  5. Apply the result to the rest of the customers in the same customer master data, e.g., Sold-to customer, Payer, Bill-to etc. This yields a matrix of Sales Org/Customer/Distribution Channel (after transformation).

Consolidate all the entries and remove duplicate records. The result is a matrix of Sales Organization/Distribution Channel/Customer, which will be the basis for migrating the customer sales view data (such as KNVV/KNVP etc.).
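The consolidation step can be sketched as a simple de-duplicating union of the two matrices (history-based and master-data-based), here with entries modeled as (Sales Org, Distribution Channel, Customer) tuples:

```python
def consolidate(history_entries, master_data_entries):
    """Union of both derivation sources, de-duplicated, as the final
    Sales Organization / Distribution Channel / Customer matrix."""
    return sorted(set(history_entries) | set(master_data_entries))
```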



Target Design

The technical design of the target for this conversion approach.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
Data Cleansing

ID | Criticality | Error Message/Report Description | Rule | Output | Source System

3003-001 | C-1 | Identify customers not used in the existing sales area
  Rule: The general view and sales view are active, and the sales view was created more than 2 years ago, but there has been no sales transaction within the sales area for more than 2 years for this customer.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Last usage
  Last usage is determined from:
    1. Sales Document (Select MAX(VBAK-ERDAT) WHERE VBAK-VBELN=VBPA-VBELN AND VBPA-KUNNR = KNVV-KUNNR AND VBAK-VKORG = KNVV-VKORG AND VBAK-VTWEG = KNVV-VTWEG AND VBAK-SPART = KNVV-SPART)
    2. Delivery Document (Select MAX(LIKP-ERDAT) WHERE LIKP-VBELN=VBPA-VBELN AND VBPA-KUNNR = KNVV-KUNNR AND VBAK-VKORG = KNVV-VKORG AND VBAK-VTWEG = KNVV-VTWEG AND VBAK-SPART = KNVV-SPART AND LIKP-VBELN = LIPS-VBELN AND LIPS-VGBEL = VBAK-VBELN)
    3. Billing Document (Select MAX(VBRK-ERDAT) WHERE VBRK-VBELN=VBPA-VBELN AND VBPA-KUNNR = KNVV-KUNNR AND VBRK-VKORG = KNVV-VKORG AND VBRK-VTWEG = KNVV-VTWEG AND VBRK-SPART = KNVV-SPART)

3003-002 | C-2 | Fill in mandatory fields based on master data standards - 1. Incoterm
  Rule: For all sold-to parties (partner function SP or AG) with an active sales view, no incoterm is maintained.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Incoterm1/Incoterms 2

3003-003 | C-1 | Fill in mandatory fields based on master data standards - 2. Payment term
  Rule: For all payer parties (partner function PY) with an active sales view, no payment term is maintained.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Payment Term

3003-004 | C-2 | Fill in mandatory fields based on master data standards - 3. Shipping condition
  Rule: For all ship-to parties (partner function SH or WE) with an active sales view, no shipping condition is maintained.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Shipping condition

3003-005 | C-1 | 4. Validate non-ISO incoterms in use, such as COL, CPU, DAT (replaced by DPU), PPA, PPD
  Rule: For all active customers in migration scope, an incoterm is maintained and its value is included in the list [COL, CPU, DAT (replaced by DPU), PPA, PPD].
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Incoterm1/Incoterms 2

3003-006 | C-2 | 5. Validate obsolete payment terms maintained
  Rule: For all active customers in migration scope, a payment term is maintained and its value is not in the S/4HANA design.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Payment term/Payment term description

3003-007 | C-2 | 6. Update obsolete CSR as business partner
  Rule: For all active customers in migration scope, partner function ZI/ZN exists for WP2, or VE/VW for PF2.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Partner function/Personnel Number/Name

3003-008 | C-3 | 7. Incoterm part 2 with "." maintained
  Rule: For all active customers in migration scope, the Incoterm 2 value is ".".
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Incoterm1/Incoterms 2

3003-009 | C-2 | Fill in mandatory fields based on master data standards - 8. Missing sales group

3003-010 | C-2 | Fill in mandatory fields based on master data standards - 9. Missing sales office
  Rule: For all active customers in migration scope, the sales office field is blank.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Sales Office

3003-011 | C-3 | 10. Check non-standard currency codes in use, such as US$
  Rule: For all active customers in migration scope, the currency field has a value such as US$.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Currency

3003-012 | | Due to the sales area definition change, multiple sales view records might merge into one record. If multiple records have different values, pick the main record.

3003-013 | C-1 | For sales areas marked with a Customer Order Block, validate whether the sales view should be blocked
  Rule: For all active customers in migration scope, a Customer Order block is set.
  Output: Customer/Name/Country/Sales Org/Distribution Channel/Division/Customer order block/Last usage
  Last usage is determined from:
    Sales Document (Select MAX(VBAK-ERDAT) WHERE VBAK-VBELN=VBPA-VBELN AND VBPA-KUNNR = KNVV-KUNNR AND VBAK-VKORG = KNVV-VKORG AND VBAK-VTWEG = KNVV-VTWEG AND VBAK-SPART = KNVV-SPART)
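The last-usage logic of rule 3003-001 can be sketched as below; a minimal Python illustration assuming the ERDAT dates from the three document selects have already been fetched, with the two-year threshold stated in the rule:

```python
from datetime import date, timedelta

def last_usage(sales_dates, delivery_dates, billing_dates):
    """Latest ERDAT across sales, delivery and billing documents for one
    customer/sales-area combination (None if never used)."""
    all_dates = list(sales_dates) + list(delivery_dates) + list(billing_dates)
    return max(all_dates) if all_dates else None

def is_unused(created_on, usage, today, years=2):
    """Report the customer when the sales view is older than the threshold
    and the last usage (if any) is also older than the threshold."""
    cutoff = today - timedelta(days=365 * years)
    return created_on < cutoff and (usage is None or usage < cutoff)
```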




Conversion Process

The high-level process is represented by the diagram below:

The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:

1. Extraction
The process begins with extracting metadata and raw data from the source systems, such as the Syensqo ECC systems (i.e., WP2/PF2), on a periodic basis. The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.


3. Loading
The transformed data is then loaded into the target S/4HANA system using the Migration Cockpit.





Data Privacy and Sensitivity



Extraction

Extract data from a source into Syniti Migrate for SAP ROW and SAP China relevant entities. There are 2 possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are 3 methods:
    1. Perform full data extraction from relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, data is loaded to the repository from the provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template) in Syniti Migrate.

The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.

Extraction Run Sheet

Req # | Requirement Description | Team Responsible
1 | Extraction Scope Definition | Syniti / LTC Data team
  - Identify the source systems and databases involved.
  - Define the data objects (tables, fields, records) to be extracted.
  - Establish business rules for data selection.
2 | Extraction Methodology | Syniti
  - Specify the extraction approach (full, incremental, or delta extraction).
  - Determine the tools and technologies used.
  - Define data filtering criteria to exclude irrelevant records.
3 | Extraction Execution Plan | Syniti
  - Establish execution timelines and batch processing schedules.
  - Assign responsibilities for extraction monitoring.
  - Document dependencies on other migration tasks.
4 | Data Quality and Validation | Syniti
  - Define error handling mechanisms for extraction failures.


Selection Screen

Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set
N/A

Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the data, with the exception of some fields which require transformation as mentioned in the transformation rules.

DCT Rules

Field Name | Field Description | Rule

Extraction Dependencies

Item # | Step Description | Team Responsible
1 | Source System Availability | Syensqo IT
  - Ensure that the source database or application is accessible.
  - Confirm that the necessary credentials and permissions are granted.
2 | Data Structure | Syniti
  - Identify relationships between tables, views, and stored procedures.
3 | Referential Integrity | Syniti
  - Ensure dependent records are extracted together.
4 | Extraction Methodology | Syniti
  - Define whether extraction is full, incremental, or delta-based.
  - Establish batch processing schedules for large datasets.
5 | Performance and Scalability Considerations | Syniti
  - Optimize extraction queries to prevent system overload.
  - Ensure network bandwidth supports data transfer volumes.
6 | Security and Compliance | Syniti
  - Adhere to regulatory standards for sensitive information, if applicable.


Transformation

The Target fields are mapped to the applicable Legacy fields that will be their source; this is a three-way activity involving the Business, the Functional team, and the Data team. It identifies the transformation activity required to allow Syniti Migrate to make the data target ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format that is required for loading via prescribed Load Tool. This step also produces the load data ready for business to perform Pre-load Data Validation
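The value-mapping step above (legacy values mapped to to-be values, possibly falling back to a default) can be sketched as a dictionary lookup. The MAP_ZTERM content shown is hypothetical; real values live in the Syniti Migrate mapping tables:

```python
# Hypothetical content for the MAP_ZTERM payment term mapping table;
# actual values are maintained in Syniti Migrate.
MAP_ZTERM = {"Z001": "N030", "Z015": "N015"}
DEFAULT_ZTERM = "N030"

def map_value(legacy_value, mapping, default):
    """Map a legacy value to its to-be value, falling back to a default
    when no explicit mapping entry exists."""
    return mapping.get(legacy_value, default)
```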

Transformation Run Sheet

Item # | Step Description | Team Responsible
1 | Transformation Scope Definition | Data Team
  - Identify the source and target data structures.
  - Define business rules for data standardization.
  - Establish data cleansing requirements to remove inconsistencies.
2 | Data Mapping and Standardization | Data Team
  - Align source fields with target fields.
  - Ensure unit consistency (e.g., currency, measurement units).
3 | Business Rule Application | Data Team
  - Implement data enrichment/collection if applicable.
  - Apply conditional transformations based on predefined logic/business rules.
4 | Transformation Execution Plan | Syniti
  - Define batch processing schedules.
  - Assign responsibilities for monitoring execution.
  - Establish error-handling mechanisms.


Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic

Transformation Mapping

Mapping Table Name | Mapping Table Description
MAP_VKORG | Sales Organization Mapping Table
MAP_VTWEG | Distribution Channel Mapping Table
MAP_SPART | Division Mapping Table
MAP_ZTERM | Payment Terms Mapping Table
MAP_PARVW | Partner Function Mapping Table
MAP_VKGRP | Sales Group Mapping to New Partner Function

Transformation Dependencies

The following steps need to occur before transformation can commence:

Item # | Step Description | Team Responsible
1 | Source Data Integrity | Syniti
  - Ensure extracted data is complete, accurate, and consistent.
  - Validate that data types and formats align with transformation requirements.
2 | Referential Integrity | Syniti
  - Ensure dependent records are transformed together or in advance, such as CNV-3007 Business Partner General.
3 | Transformation Logic and Mapping | Data Team
  - Define data mapping rules between source and target schemas.
4 | Performance and Scalability Considerations | Syniti
  - Optimize transformation processes for large datasets.
  - Ensure system resources can handle transformation workloads.
5 | Logging and Error Handling | Syniti
  - Maintain detailed logs of transformation activities.
  - Define error-handling procedures for failed transformations.


Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

Task | Action
Compare Data Counts
  1. Verify row counts between source and target databases.
  2. Identify missing or duplicated records.
Validate the Mandatory Fields
  Validate that there is a value for all mandatory fields.
Validate Primary Keys and Unique Constraints
  1. Check for duplicate or missing primary key values, i.e., whether the same BP number occurs more than once.
  2. Ensure unique constraints are maintained.
Test Referential Integrity
  Confirm dependent records exist in related tables.
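The completeness checks above (row counts, duplicate/missing keys, mandatory fields) can be sketched as one report function; a minimal illustration with hypothetical field names:

```python
def completeness_report(source_rows, target_rows, key, mandatory_fields):
    """Summarize row-count match, missing/duplicate keys, and empty
    mandatory fields for a pre-load completeness check."""
    src_keys = [r[key] for r in source_rows]
    tgt_keys = [r[key] for r in target_rows]
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_keys": sorted(set(src_keys) - set(tgt_keys)),
        "duplicate_keys": sorted({k for k in tgt_keys if tgt_keys.count(k) > 1}),
        "mandatory_gaps": [
            (r[key], f)
            for r in target_rows
            for f in mandatory_fields
            if not r.get(f)
        ],
    }
```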

Accuracy

Task | Action
Validate the Transformation
  Validate that fields requiring transformation carry the transformed value instead of the original field value.
Check Data Consistency
  1. Compare field values across systems.
  2. Validate data formats and structures.



Business

Pre-load validation is a critical step in data migration, ensuring that the data prepared for loading is accurate, complete, and functional for the target system.

1. Ensuring Data Integrity
Before loading, data must be consistent with its source structure. Pre-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating data before the load ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data before migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.


Completeness

Task | Action

Accuracy

Task | Action

Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually
  2. Once the data is loaded to the target system, it will be extracted and prepared for Post Load Data Validation

Load Run Sheet

Item # | Step Description | Team Responsible
1 | Load Scope Definition | Data team
  - Identify the target system and database structure.
  - Define data objects (tables, fields, records) to be loaded.
  - Establish business rules for data validation.
2 | Load Methodology | Syniti
  - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program).
3 | Data Quality and Validation | Syniti
  - Ensure data integrity checks (null values, duplicates, format validation).
  - Perform pre-load validations to verify completeness.
  - Define error handling mechanisms for load failures.
4 | Load Execution Plan | Syniti
  - Establish execution timelines and batch processing schedules.
  - Assign responsibilities for monitoring execution.
  - Document dependencies on other migration tasks.
5 | Logging and Reporting | Syniti
  - Maintain detailed logs of loading activities.
  - Generate summary reports on loaded data volume and quality.
  - Define escalation procedures for errors.


Load Phase and Dependencies

The Business Partner General will be loaded in the pre-cutover period.

Before loading, there is a dependency on the following configuration.

Configuration

Item # | Configuration Item
1 | Sales Area Definition
2 | Sales Office Definition
3 | Sales Group Definition
4 | Payment Term Definition
5 | Define Tax Determination Rule


Conversion Objects

Object # | Preceding Object Conversion Approach
3007 | Business Partner General (Role 000000)
     | Employee Personal Information
3011 | Business Partners - Contact Persons (BUP001)

Error Handling

Error Type | Error Description | Action Taken
Configuration / Data Transformation | The value XXX for field XXX doesn't exist
  1. Check that the mapping/conversion is done properly in the loading file.
  2. Validate that the target value is configured/transported in the target system.
  3. Reach out to the functional team to validate the configuration.
Configuration | A mandatory field XXX is missing
  1. Validate in the MDS whether the fields are set as mandatory.
  2. Validate whether there is a value in the pre-loading file.
  3. Validate whether the configuration for the mandatory fields is done properly.




Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action
Perform Source-to-Target Comparisons
  1. Validate that migrated data matches source records.
  2. Check for discrepancies in numerical values, text fields, and timestamps.

Accuracy

Task | Action
Execute Sample Queries and Reports
  1. Run queries to validate business logic.
  2. Generate reports to compare expected vs. actual results.
Conduct Post-Migration Reconciliation
  Generate reports comparing pre- and post-migration data.
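The source-to-target comparison above can be sketched as a field-by-field diff over records keyed by customer; a minimal illustration with hypothetical field names:

```python
def field_mismatches(source_by_key, target_by_key, fields):
    """List (key, field, source value, target value) for every field that
    differs between records present in both systems."""
    diffs = []
    for key in source_by_key.keys() & target_by_key.keys():
        for f in fields:
            s, t = source_by_key[key].get(f), target_by_key[key].get(f)
            if s != t:
                diffs.append((key, f, s, t))
    return sorted(diffs)
```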



Business

Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.

1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.


Completeness

Task | Action

Accuracy

Task | Action

Key Assumptions

  • The Master Data Standard is up to date as of the date of documenting this conversion approach and the data load.
  • The BP Customer sales view is in scope based on the data design and any exceptions requested by the business.
  • There will be 3 SAP instances: one for ROW, one for China, and one for CUI only.
  • For the SAP CUI instance, the migration activity will be handled by a US-based data consultant.
  • One sales org will represent one GBU as captured in KDD060 - Sales Enterprise Structure.


See also

Change log

Version Published Changed By Comment
CURRENT (v. 16) Apr 08, 2026 13:46 RUAN-ext, Eric CR0455 update
v. 58 Apr 07, 2026 14:40 RUAN-ext, Eric CR0455 update the common DC logic for export/domestic
v. 57 Mar 23, 2026 09:56 RUAN-ext, Eric *20260323 change for shipping condition report
v. 56 Mar 17, 2026 14:54 RUAN-ext, Eric *20260317 Mapping table update
v. 55 Mar 16, 2026 14:03 RUAN-ext, Eric *20260316 mapping table update only
v. 54 Mar 14, 2026 14:09 RUAN-ext, Eric *20260314 update for the account group info in the cleansing report
v. 53 Mar 14, 2026 13:58 RUAN-ext, Eric *20260314 update for the cleansing report logic
v. 52 Mar 05, 2026 16:36 RUAN-ext, Eric *20260305 update to descope some fields
v. 51 Mar 03, 2026 11:46 RUAN-ext, Eric *20260303 update to be more precise when selecting relevant KNVV for migration
v. 50 Feb 22, 2026 13:40 RUAN-ext, Eric *20260222 Remove CUI from ETL graph
