
Status: In Progress
Owner: RUAN-ext, Eric
Stakeholders:

Purpose

The purpose of this document is to define the conversion approach for creating the Customer Hierarchy in S/4HANA.

In SAP ECC, the customer hierarchy is a tree-like structure in which each node is a customer (including parent and child customers). Its primary purpose is pricing, rebates, and reporting across related customers. It is maintained via transaction code VDH1N.

In SAP S/4HANA, customers are managed as Business Partners (BP), enabling a more flexible and integrated data model. Hierarchy nodes are created as BPs with sales area data. It is still maintained via VDH1N (or Fiori app Display/Maintain Customer Hierarchy).


Conversion Scope

The scope of this document covers the approach for converting active Customer Hierarchy data from the Legacy Source Systems into S/4HANA, following the Customer Hierarchy Master Data Design Standard.


The data from the legacy system is included when:

  1. For the parent, the sales area data has a sales organization in scope and the BP general data/sales view data is migrated.
  2. For the child, the sales area data has a sales organization in scope and the sales view is still active.
  3. The valid-to date is after the go-live date.
  4. A pricing or rebate indicator exists in the source system.

The data from the legacy system is excluded when:

  1. The BP general data is not in migration scope; or
  2. The sales view for the child customer is not in migration scope; or
  3. The valid-to date is before the go-live date; or
  4. Both the parent customer and all child customers carry the sales area deletion indicator.
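As an illustration only, the inclusion and exclusion rules above can be sketched as a relevancy filter. The record shape, the flag names, and the go-live date below are assumptions made for the example, not the actual Syniti Migrate rule definitions.

```python
from dataclasses import dataclass
from datetime import date

GO_LIVE = date(2026, 1, 1)  # illustrative go-live date, not the project date

@dataclass
class HierarchyRecord:
    parent_in_bp_scope: bool         # parent BP general/sales view data is migrated
    child_sales_view_in_scope: bool  # child sales view is migrated and still active
    valid_to: date
    has_pricing_or_rebate: bool      # pricing or rebate indicator in the source
    parent_deleted: bool             # parent sales area deletion indicator
    all_children_deleted: bool       # every child has the sales area deletion indicator

def in_scope(rec: HierarchyRecord) -> bool:
    """Apply the inclusion/exclusion rules from the conversion scope."""
    if not rec.parent_in_bp_scope or not rec.child_sales_view_in_scope:
        return False                     # exclusions 1 and 2
    if rec.valid_to <= GO_LIVE:
        return False                     # exclusion 3: valid-to before go-live
    if rec.parent_deleted and rec.all_children_deleted:
        return False                     # exclusion 4: whole branch marked for deletion
    return rec.has_pricing_or_rebate     # inclusion 4
```

In practice, these checks are implemented as relevancy criteria inside Syniti Migrate; the sketch is only meant to make the rule order explicit.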


List of source systems and approximate number of records

Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records
WP2 | Customer Hierarchy | 1856 | S/4HANA ROW | 1856
WP2 | Customer Hierarchy | N/A | S/4HANA China | N/A
WP2 | Customer Hierarchy | N/A | S/4HANA CUI | N/A

Additional Information

Multi-language Requirement

N/A

Document Management

N/A

Legal Requirement

CMMC 2.0 is a mandatory DoD cybersecurity certification for contractors handling Controlled Unclassified Information (CUI) and Federal Contract Information (FCI). CUI includes sensitive technical data (e.g., design specs, system info) related to U.S. military and space applications. The Composites Business handles CUI and is therefore within CMMC scope. Without certification, the business risks disqualification from existing and future DoD programs.
It is mandatory to implement CMMC-compliant systems and processes for all organizations that handle CUI.

Therefore, there will be one SAP instance specifically for CUI related entities. 

Special Requirements

Different SAP Instance Migration Approach

Please refer to the following link for the entity mapping for each instance: SAP instance mapping based on company code

To identify which SAP instance a record belongs to, the following logic is applied:

  1. Map the sales organization to a company code first, then map the company code to the SAP instance based on the mapping file.
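The two-step lookup above can be sketched as follows. The mapping values are purely hypothetical placeholders; the real values come from the "SAP instance mapping based on company code" file referenced above.

```python
# Hypothetical mapping tables; real values come from the
# "SAP instance mapping based on company code" mapping file.
SALES_ORG_TO_COMPANY_CODE = {"1000": "CC10", "2000": "CC20", "3000": "CC30"}
COMPANY_CODE_TO_INSTANCE = {"CC10": "S4_ROW", "CC20": "S4_CHINA", "CC30": "S4_CUI"}

def target_instance(sales_org: str) -> str:
    """Two-step lookup: sales org -> company code -> SAP instance."""
    company_code = SALES_ORG_TO_COMPANY_CODE[sales_org]
    return COMPANY_CODE_TO_INSTANCE[company_code]
```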





Target Design

This section describes the technical design of the target structure for this conversion approach.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
KNVH | MANDT | MANDT | Client | CLNT | 3 | Internal
KNVH | HITYP | HITYP | Cust.hierarchy type | CHAR | 1 | Mandatory
KNVH | KUNNR | KUNNR | Customer | CHAR | 10 | Mandatory
KNVH | VKORG | VKORG | Sales Organization | CHAR | 4 | Mandatory
KNVH | VTWEG | VTWEG | Distribution Channel | CHAR | 2 | Mandatory
KNVH | SPART | SPART | Division | CHAR | 2 | Mandatory
KNVH | DATAB | DATAB | Valid from | DATS | 8 | Mandatory
KNVH | DATBI | DATBI | Valid to | DATS | 8 | Mandatory
KNVH | HKUNNR | HKUNNR | Higher-level customer | CHAR | 10 | Mandatory
KNVH | HVKORG | HVKORG | Higher-lev.SalesOrg | CHAR | 4 | Mandatory
KNVH | HVTWEG | HVTWEG | HgLv distrib.channel | CHAR | 2 | Mandatory
KNVH | HSPART | HSPART | Higher-level division | CHAR | 2 | Mandatory
KNVH | GRPNO | GRPNO | Routine Number | NUMC | 3 | Not in use
KNVH | BOKRE | BOKRE | Rebate | CHAR | 1 | Not in use
KNVH | PRFRE | PRFRE | Price determination | CHAR | 1 | Internal
KNVH | HZUOR | HZUOR | Hierarchy assignment | NUMC | 2 | Not in use
KNVH | NODE_GUID | NODE_GUID | Customer Hier. Node GUID | CHAR | 32 | Not in use
KNVH | NODE_ID | NODE_ID | Customer Hierarchy Node ID | CHAR | 20 | Not in use


Data Cleansing

ID | Criticality | Error Message/Report Description | Rule | Output | Source System
3005-1 | C1 | Remove obsolete child customer | Child customer has general data marked for deletion | Higher Customer Number/Name, Child Customer Number/Name, Sales Organization, Distribution Channel, Division, Deletion Indicator | WP2
3005-2 | C1 | Remove child customer with obsolete sales data | Child customer has sales area data marked for deletion | Higher Customer Number/Name, Child Customer Number/Name, Sales Organization, Distribution Channel, Division, Deletion Indicator | WP2
3005-3 | C1 | Parent customer with central deletion indicator | Higher customer has general data marked for deletion | Higher Customer Number/Name, Child Customer Number/Name, Sales Organization, Distribution Channel, Division, Deletion Indicator | WP2
3005-4 | C1 | Parent customer with sales area deletion indicator | Higher customer has sales area data marked for deletion | Higher Customer Number/Name, Child Customer Number/Name, Sales Organization, Distribution Channel, Division, Deletion Indicator | WP2
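The four cleansing rules above all test a deletion indicator on either the parent or the child, at either the general-data or sales-area level. A minimal sketch, assuming boolean flags already derived from the legacy deletion indicators (the flag names are illustrative, not the actual KNA1/KNVV field names):

```python
def cleansing_findings(rec: dict) -> list:
    """Return the IDs of the cleansing rules (3005-1 .. 3005-4) that a
    hierarchy record violates. Keys are illustrative flags assumed to be
    pre-derived from the legacy deletion indicators."""
    rules = [
        ("3005-1", "child_general_deletion"),   # child general data marked for deletion
        ("3005-2", "child_sales_deletion"),     # child sales area data marked for deletion
        ("3005-3", "parent_general_deletion"),  # parent general data marked for deletion
        ("3005-4", "parent_sales_deletion"),    # parent sales area data marked for deletion
    ]
    return [rule_id for rule_id, flag in rules if rec.get(flag)]
```

A record with no findings passes cleansing; each finding would be reported with the output columns listed in the table above.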



Conversion Process

The high-level process is represented by the diagram below:

The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:

1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, such as the Syensqo ECC system (i.e., WP2/PF2). The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.


3. Loading
The transformed data is then loaded into the target S/4HANA system.
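The three stages above can be sketched as a minimal pipeline skeleton. The `transform` and `load` callables below stand in for the Syniti Migrate rule engine and the load tool; they are placeholders, not real APIs.

```python
def run_etl(source_rows, transform, load):
    """Minimal sketch of the three ETL stages: stage the raw rows,
    cleanse and transform them, then hand them to the load step."""
    # 1. Extraction: stage the raw rows (here, simply copy them).
    staged = list(source_rows)

    # 2. Transformation: cleanse (drop empty rows) and apply the rules.
    cleansed = [row for row in staged if row]
    target_ready = [transform(row) for row in cleansed]

    # 3. Loading: pass the target-ready rows to the load step.
    return load(target_ready)
```

For example, `run_etl(rows, lambda r: {**r, "MANDT": "100"}, lambda rows: rows)` would stamp a client value onto every non-empty row and return the result unchanged by the dummy load step.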




Data Privacy and Sensitivity



Extraction

Extract data from a source into Syniti Migrate for the SAP ROW and SAP China relevant entities. There are two possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are 3 methods:
    1. Perform full data extraction from relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, data is loaded to the repository from the provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template).

The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.

Extraction Run Sheet

Req # | Requirement Description | Team Responsible
1 | Extraction Scope Definition: identify the source systems and databases involved; define the data objects (tables, fields, records) to be extracted; establish business rules for data selection. | Syniti / LTC Data team
2 | Extraction Methodology: specify the extraction approach (full, incremental, or delta extraction); determine the tools and technologies used; define data filtering criteria to exclude irrelevant records. | Syniti
3 | Extraction Execution Plan: establish execution timelines and batch processing schedules; assign responsibilities for extraction monitoring; document dependencies on other migration tasks. | Syniti
4 | Data Quality and Validation: define error handling mechanisms for extraction failures. | Syniti


Selection Screen

Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set
N/A

Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the data, with the exception of some fields that require transformation, as described in the transformation rules.

DCT Rules

Field Name | Field Description | Rule
N/A

Extraction Dependencies

Item # | Step Description | Team Responsible
1 | Source System Availability: ensure that the source database or application is accessible; confirm that the necessary credentials and permissions are granted. | Syensqo IT
2 | Data Structure: identify relationships between tables, views, and stored procedures. | Syniti
3 | Referential Integrity: ensure dependent records are extracted together. | Syniti
4 | Extraction Methodology: define whether extraction is full, incremental, or delta-based; establish batch processing schedules for large datasets. | Syniti
5 | Performance and Scalability Considerations: optimize extraction queries to prevent system overload; ensure network bandwidth supports data transfer volumes. | Syniti
6 | Security and Compliance: adhere to regulatory standards for sensitive information, if applicable. | Syniti


Transformation

The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, the Functional team, and the Data team, and it identifies the transformation activity required for Syniti Migrate to make the data target-ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data, ready for the business to perform pre-load data validation.

Transformation Run Sheet

Item # | Step Description | Team Responsible
1 | Transformation Scope Definition: identify the source and target data structures; define business rules for data standardization; establish data cleansing requirements to remove inconsistencies. | Data Team
2 | Data Mapping and Standardization: align source fields with target fields; ensure unit consistency (e.g., currency, measurement units). | Data Team
3 | Business Rule Application: implement data enrichment/collection if applicable; apply conditional transformations based on predefined logic/business rules. | Data Team
4 | Transformation Execution Plan: define batch processing schedules; assign responsibilities for monitoring execution; establish error-handling mechanisms. | Syniti


Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic
1 | WP2 | KNVH | MANDT | Client | S4 | KNVH | MANDT | Client | Internal
2 | WP2 | KNVH | HITYP | Cust.hierarchy type | S4 | KNVH | HITYP | Cust.hierarchy type | Mapping
3 | WP2 | KNVH | KUNNR | Customer | S4 | KNVH | KUNNR | Customer | Mapping: map based on the S4 BP
4 | WP2 | KNVH | VKORG | Sales Organization | S4 | KNVH | VKORG | Sales Organization | Rule: follow Higher-lev.SalesOrg (HVKORG)
5 | WP2 | KNVH | VTWEG | Distribution Channel | S4 | KNVH | VTWEG | Distribution Channel | Rule: follow the higher-level distribution channel (e.g., if the higher-level DC is export, then this is export)
6 | WP2 | KNVH | SPART | Division | S4 | KNVH | SPART | Division | Default: 01 (Product)
7 | WP2 | KNVH | DATAB | Valid from | S4 | KNVH | DATAB | Valid from | Copy
8 | WP2 | KNVH | DATBI | Valid to | S4 | KNVH | DATBI | Valid to | Copy
9 | WP2 | KNVH | HKUNNR | Higher-level customer | S4 | KNVH | HKUNNR | Higher-level customer | Mapping: map based on the S4 BP
10 | WP2 | KNVH | HVKORG | Higher-lev.SalesOrg | S4 | KNVH | HVKORG | Higher-lev.SalesOrg | Mapping: refer to MAP_VKORG; when one legacy VKORG is mapped to multiple VKORGs based on the mapping table, the KNVH record is multiplied
11 | WP2 | KNVH | HVTWEG | HgLv distrib.channel | S4 | KNVH | HVTWEG | HgLv distrib.channel | Rule: get the distribution channel from the S4 KNVV records based on the legacy KUNNR/VKORG value combinations
12 | WP2 | KNVH | HSPART | Higher-level division | S4 | KNVH | HSPART | Higher-level division | Default: 01 (Product)
13 | WP2 | KNVH | GRPNO | Routine Number | S4 | KNVH | GRPNO | Routine Number | Not in use
14 | WP2 | KNVH | BOKRE | Rebate | S4 | KNVH | BOKRE | Rebate | Not in use
15 | WP2 | KNVH | PRFRE | Price determination | S4 | KNVH | PRFRE | Price determination | Internal
16 | WP2 | KNVH | HZUOR | Hierarchy assignment | S4 | KNVH | HZUOR | Hierarchy assignment | Not in use
17 | | | | | S4 | KNVH | NODE_GUID | Customer Hier. Node GUID | Not in use
18 | | | | | S4 | KNVH | NODE_ID | Customer Hierarchy Node ID | Not in use
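Rule 10 is the only rule that changes the record count: when one legacy sales organization maps to several target sales organizations, the KNVH record is multiplied. A sketch of that fan-out, with hypothetical MAP_VKORG values (the real values live in the mapping table):

```python
# Hypothetical mapping; the real values live in the MAP_VKORG mapping table.
MAP_VKORG = {"0001": ["1010", "1020"], "0002": ["2010"]}

def fan_out_by_vkorg(record: dict) -> list:
    """Rule 10 sketch: produce one copy of the KNVH record per target
    sales organization mapped from the legacy HVKORG. Each copy also
    sets VKORG to follow HVKORG, per rule 4."""
    copies = []
    for target_vkorg in MAP_VKORG[record["HVKORG"]]:
        copy = dict(record)
        copy["HVKORG"] = target_vkorg
        copy["VKORG"] = target_vkorg  # rule 4: VKORG follows Higher-lev.SalesOrg
        copies.append(copy)
    return copies
```

So a record with legacy HVKORG "0001" would become two target records, one per mapped sales organization, while "0002" would stay a single record.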


Transformation Mapping

Mapping Table Name | Mapping Table Description
MAP_VKORG | Sales Organization Mapping Table
MAP_VTWEG | Distribution Channel Mapping Table
MAP_SPART | Division Mapping Table


Transformation Dependencies

The steps below need to occur before the transformation can commence.

Item # | Step Description | Team Responsible
1 | Source Data Integrity: ensure extracted data is complete, accurate, and consistent; validate that data types and formats align with transformation requirements. | Syniti
2 | Referential Integrity: ensure dependent records are transformed together or in advance. | Syniti
3 | Transformation Logic and Mapping: define data mapping rules between the source and target schemas. | Data Team
4 | Performance and Scalability Considerations: optimize transformation processes for large datasets; ensure system resources can handle transformation workloads. | Syniti
5 | Logging and Error Handling: maintain detailed logs of transformation activities; define error-handling procedures for failed transformations. | Syniti


Pre-Load Validation

Project Team

Completeness

Task | Action
Compare data counts | Verify row counts between the source and target databases; identify missing or duplicated records.
Validate the mandatory fields | Validate that there is a value for all mandatory fields.
Validate primary keys and unique constraints | Check for duplicate or missing primary key values (e.g., the same BP number); ensure unique constraints are maintained.
Test referential integrity | Confirm dependent records exist in the related tables.
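The completeness tasks above (count comparison, mandatory-field checks, duplicate-key detection) can be sketched as one small check routine. The row/key shapes are assumptions for the example; in the project these checks run inside Syniti Migrate.

```python
from collections import Counter

def completeness_checks(source_rows, target_rows, key_fields):
    """Return a list of human-readable completeness issues: row-count
    mismatches, duplicate keys, and rows with empty mandatory key fields."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(
            f"row count mismatch: {len(source_rows)} source vs {len(target_rows)} target"
        )
    keys = [tuple(row.get(f) for f in key_fields) for row in target_rows]
    duplicates = [k for k, n in Counter(keys).items() if n > 1]
    if duplicates:
        issues.append(f"duplicate keys: {duplicates}")
    incomplete = [row for row in target_rows if any(not row.get(f) for f in key_fields)]
    if incomplete:
        issues.append(f"{len(incomplete)} rows with empty mandatory key fields")
    return issues
```

An empty result means the target extract passed all three completeness checks.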

Accuracy

Task | Action
Validate the transformation | Validate that the fields requiring transformation hold the transformed value rather than the original field value.
Check data consistency | Compare field values across systems; validate data formats and structures.


Business

The following pre-load validations will be performed by the business.

Completeness

Task | Action
Compare data counts | Verify row counts between the source and target databases; identify missing or duplicated records.
Review populated templates for missing or incorrect values | Use checklists to verify completeness and correctness before submission.



Accuracy

Task | Action
N/A

Load

The load process includes:

  1. Executing the automated data load into the target system using the load tool, or producing the load file if the load must be done manually.
  2. Once the data is loaded into the target system, it is extracted and prepared for post-load data validation.

Load Run Sheet

Item # | Step Description | Team Responsible
1 | Load Scope Definition: identify the target system and database structure; define the data objects (tables, fields, records) to be loaded; establish business rules for data validation. | Data team
2 | Load Methodology: specify the loading tools and technologies (LSMW). | Syniti
3 | Data Quality and Validation: ensure data integrity checks (null values, duplicates, format validation); perform pre-load validations to verify completeness; define error handling mechanisms for load failures. | Syniti
4 | Load Execution Plan: establish execution timelines and batch processing schedules; assign responsibilities for monitoring execution; document dependencies on other migration tasks. | Syniti
5 | Logging and Reporting: maintain detailed logs of loading activities; generate summary reports on loaded data volume and quality; define escalation procedures for errors. | Syniti


LSMW Upload Template


Field | Description | Type | Field Length
HITYP | Customer hierarchy type | CHAR | 1
S_ERDAT | Valid-from date | CHAR | 10
S_KUNNR | Higher-level customer | CHAR | 10
HKUNNR | Customer number of the higher-level customer hierarchy | CHAR | 10
HVKORG | Higher-level sales organization | CHAR | 4
HVTWEG | Higher-level distribution channel | CHAR | 2
HSPART | Higher-level division | CHAR | 2
KUNNR | Customer | CHAR | 10
VKORG | Sales Organization | CHAR | 4
VTWEG | Distribution Channel | CHAR | 2
SPART | Division | CHAR | 2
DATAB | Start of validity period for the assignment | DATS | 10
DATBI | End of validity period for the assignment | DATS | 10
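As a rough sketch, one row of the upload file could be serialized in the template's column order as follows. The tab delimiter and the dict-based record shape are assumptions for the example; the real file format depends on how the LSMW project is defined.

```python
# Column order taken from the LSMW upload template above.
LSMW_COLUMNS = ["HITYP", "S_ERDAT", "S_KUNNR", "HKUNNR", "HVKORG",
                "HVTWEG", "HSPART", "KUNNR", "VKORG", "VTWEG",
                "SPART", "DATAB", "DATBI"]

def lsmw_row(record: dict) -> str:
    """Serialize one hierarchy assignment into a delimited line in the
    template's column order; missing fields become empty strings. The
    tab delimiter is an assumption, not the defined LSMW file format."""
    return "\t".join(str(record.get(col, "")) for col in LSMW_COLUMNS)
```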


Load Phase and Dependencies

The Customer Hierarchy will be loaded in the pre-cutover period.

Loading depends on the configuration: the configuration needs to be transported into the respective system first.

Configuration

Item # | Configuration Item
1 | Assign sales area to customer hierarchy type




Conversion Objects

Object # | Preceding Object Conversion Approach
3007 | Business Partners - General (Role 000000)
3003 | Business Partners - Customer (Sales and Service) - FLCU01


Error Handling

Error Type | Error Description | Action Taken
Configuration Error | The error message "Sales area assignment is not permitted" appears when assigning the child customer to the parent customer | Raise the issue with the functional team so that the configuration is transported







Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action
Perform source-to-target comparisons | Validate that migrated data matches the source records; check for discrepancies in numerical values, text fields, and timestamps.





Accuracy

Task | Action
Execute sample queries and reports | Run queries to validate business logic; generate reports to compare expected vs. actual results.
Conduct post-migration reconciliation | Generate reports comparing pre- and post-migration data.



Business

Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.

1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.


Completeness

Task | Action
Perform source-to-target comparisons | Validate that migrated data matches the source records; check for discrepancies in numerical values, text fields, and timestamps.
Conduct post-migration reconciliation | Go through reports comparing pre- and post-migration data.



Accuracy

Task | Action
Perform manual testing | Conduct manual spot-checks for additional assurance.





Key Assumptions

  • The Master Data Standard is up to date as of the date of documenting this conversion approach and data load.
  • Customer hierarchy is in scope based on the data design and any exceptions requested by the business.
  • There will be 3 SAP instances: one for ROW, one for China, and one for CUI only.
  • For the SAP CUI instance, the migration activity will be handled by a US-based data consultant.



Change log

Version | Published | Changed By | Comment
CURRENT (v. 15) | Nov 12, 2025 10:53 | PUN-ext, Eddy |
v. 17 | Nov 03, 2025 07:04 | RUAN-ext, Eric |
v. 16 | Sept 24, 2025 13:32 | RUAN-ext, Eric |
v. 15 | Sept 14, 2025 08:46 | RUAN-ext, Eric |
v. 14 | Sept 08, 2025 08:21 | RUAN-ext, Eric |
v. 13 | Sept 08, 2025 08:05 | RUAN-ext, Eric |
v. 12 | Sept 08, 2025 07:48 | RUAN-ext, Eric |
v. 11 | Aug 14, 2025 08:41 | RUAN-ext, Eric |
v. 10 | Aug 13, 2025 15:09 | RUAN-ext, Eric |
v. 9 | Aug 12, 2025 10:13 | RUAN-ext, Eric |
