| Status | Revision in Progress |
|---|---|
| Owner | LIU-ext, Ekawati |
| Stakeholders | |
Purpose
The purpose of this document is to define the conversion approach for creating the Materials - Sales view with sales long text in S/4HANA.
Conversion Scope
The scope of this document covers the approach for converting the active Material Master Sales View, including the sales long text, from the legacy source systems into S/4HANA, following the Material Sales View Master Data Design Standard.
From the current system landscape, Material Master data exists separately in the legacy systems (WP2 and PF2), with potential discrepancies in both organizations. Harmonization and validation are required to ensure accurate and consolidated data in S/4HANA. While WP2 and PF2 serve as source systems, extensive mapping and transformation logic will be necessary to produce properly formatted load templates in line with the target design.
The data from the legacy systems includes:
- Materials without a sales view block (both X-distr.chain status and DChain-spec. status are blank or 63, Available for Sale) AND with sales transactions (e.g. sales order, delivery, or billing) within the in-scope sales organizations in the last 5 years.
- Or materials without a sales view block (as above) AND without sales transactions within the in-scope sales organizations in the last 5 years, but with stock within an in-scope sales organization.
The data from the legacy systems excludes:
- Materials with a sales view block: X-distr.chain status or DChain-spec. status is neither blank nor 63 (Available for Sale).
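The relevancy rules above can be sketched as a simple predicate. This is an illustrative sketch only: the field names (`x_distr_chain_status`, `has_sales_txn_5y`, `has_stock_in_scope`) are assumed stand-ins for the actual extract columns, not the real table fields.

```python
def is_sales_view_relevant(rec: dict) -> bool:
    """Apply the inclusion/exclusion rules to one extracted material record.

    A material is blocked (excluded) if either distribution-chain status
    holds a value other than blank or 63 (Available for Sale).
    """
    unblocked = (rec["x_distr_chain_status"] in ("", "63")
                 and rec["dchain_spec_status"] in ("", "63"))
    if not unblocked:
        return False  # sales view block -> excluded from migration
    # Included if it has sales transactions in the last 5 years, or no
    # transactions but stock within an in-scope sales organization.
    return rec["has_sales_txn_5y"] or rec["has_stock_in_scope"]
```

A record with a set status (e.g. 01) is excluded regardless of transactions or stock; an unblocked record needs at least one of the two activity conditions to be included.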
List of source systems and approximate number of records
*MVKE records: Sales Organization × 3 Distribution Channels
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Materials - Sales View | 93,000 | S/4HANA ROW | 93,000 |
| PF2 | Materials - Sales View | | S/4HANA ROW | |
| WP2 | Materials - Sales View | 13,500 | S/4HANA China | 13,500 |
| PF2 | Materials - Sales View | | S/4HANA China | |
| WP2 | Materials - Sales View | 63,000 | S/4HANA CUI | 63,000 |
| PF2 | Materials - Sales View | | S/4HANA CUI | |
Additional Information
Multi-language Requirement
Not Applicable
Document Management
Not Applicable
Legal Requirement
As per the US Department of Defense (DoD), Cybersecurity Maturity Model Certification (CMMC 2.0) is mandatory for all contractors handling Controlled Unclassified Information (CUI) and Federal Contract Information (FCI). CUI includes sensitive technical data (e.g. design specs, system info) related to U.S. military and space applications. The Composites Business handles CUI and is therefore within CMMC scope. Without certification, the business risks disqualification from existing and future DoD programs.
It is mandatory to implement CMMC-compliant systems and processes for all organizations dealing with CUI.
Therefore, there will be a dedicated SAP instance for CUI-related entities. The migration for CUI-related entities will be handled by a US-based data consultant using separate tools.
Special Requirements
Target Design
Materials - Sales View with sales long text data strictly adheres to the Master Data Standard. The complete definition of the key fields that hold the Materials - Sales View with sales long text information is given in the Master Data Standard document.
The table below defines the technical design of the target for this conversion approach:
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
Data Cleansing
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
2003-001 | PF2/ WP2 | ||||
2003-002 | PF2/ WP2 | ||||
2003-003 | PF2/ WP2 | ||||
Set sales view block for material not eligible for migration | Check the usage of material sales view not in use for 5 years and without stock, but still active |
Conversion Process
The high-level process is represented by the diagram below:
Data Privacy and Sensitivity
Extraction
Extract data from the source into the data repository. There are 2 possibilities:
- The data exists. The migration tool connects to the source and loads the data into the repository. There are 3 methods:
- Perform full data extraction from relevant tables in the source system(s).
- Perform extraction through the application layer.
- Only if the tool cannot connect to the source, load the data into the repository from a provided source system extract/report.
- The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in the repository, using a DCT (Data Collection Template).
The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.
Extraction Run Sheet
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| 1 | Extraction Scope Definition - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti |
| 2 | Extraction Methodology - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| 3 | Extraction Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| 4 | Data Quality and Validation - Define error handling mechanisms for extraction failures. | Syniti |
Selection Screen
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
Data Collection Template (DCT)
A target-ready Data Collection Template will be created for the data, with the exception of some fields which require transformation as described in the transformation rules.
Materials - Sales view with sales long text DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
Extraction Dependencies
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
| 7 | Data cleansing of legacy Material Master - Sales View with sales long text data must be completed. If standardization within the DCT begins using relevant data from PF2 and WP2 before the cleansing is finalized, it is understood that the business will take due diligence to ensure any subsequent delta cleansing is verified and aligned within the DCT. | Business |
Transformation
The target fields are mapped to the applicable legacy fields that will be their source. This is a 3-way activity involving the Business, Functional team, and Data team, and it identifies the transformation activity required to make the data target ready:
- Perform value mapping and data transformation rules.
- Legacy values are mapped to the to-be values (this could include a default value).
- Values are transformed according to the defined transformation rules.
- Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform pre-load data validation.
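The value-mapping step above (legacy value to to-be value, optionally with a default) can be sketched as a dictionary lookup. The table name and values here are illustrative examples only, not the endorsed mappings from the Transformation Mapping section.

```python
# Example mapping table: legacy distribution channel -> to-be value.
# These entries are placeholders, not agreed mapping content.
MAP_VTWEG_EXAMPLE = {"01": "10", "02": "10", "05": "20"}

def map_value(mapping, legacy_value, default=None):
    """Translate a legacy value via a mapping table.

    Falls back to `default` when no entry exists; raises when neither a
    mapping entry nor a default is available, so unmapped values surface
    as errors instead of silently passing through.
    """
    mapped = mapping.get(legacy_value, default)
    if mapped is None:
        raise ValueError(f"No mapping for legacy value {legacy_value!r}")
    return mapped

row = {"VTWEG": "02"}
row["VTWEG"] = map_value(MAP_VTWEG_EXAMPLE, row["VTWEG"])
```

Raising on unmapped values mirrors the error-handling expectation in the run sheets: gaps in the mapping tables should be reported back to the Functional team rather than defaulted invisibly.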
Transformation Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units) | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable - Apply conditional transformations based on predefined logic/business rules | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
Transformation Mapping
| Mapping Table Name | Mapping Table Description |
|---|---|
| MAP_VKORG | Sales Organization Mapping |
| MAP_VTWEG | Distribution Channel Mapping SyWay - Sales Area.pptx --> All the actual distribution channels and divisions won't exist in the to-be solution Check current Dist.Ch. and discuss with Functional team and how to do the mapping |
| MAP_MEINS | Unit of Measure Mapping MVKE_VRKME shares the same mapping table with Basic view (MARA_MEINS). Share the same mapping table with S2P. To check with Jasleen Madhok, John Hancock, Angelo Buosi |
| MAP_MSTAV | Material Status - Sales Mapping MVSTAV - Cross Distribution Chain VMSTA - Distribution Chain Specific Check the configuration in ECC & S/4 HANA if we need this mapping or not? Or if status <> blank then excluded from ECC download? |
| MAP_WERKS | Plant Mapping WERKS - Plant DWERK - Delivering Plant Share the same mapping table with S2P. To check with Jasleen Madhok, John Hancock, Angelo Buosi |
| MAP_TATYP | Tax Condition Type Mapping |
| MAP_TAXKM | Tax Classification - Material Mapping |
| MAP_VERSG | Statistics Group - Material Mapping TBD |
| MAP_PRDHA | Product Hierarchy Mapping MARA_PRDHA - Basic Data 1 MVKE_PRODH - Sales: Sales Org. 2 Prod. Hierarchies' levels are the same. Share the same mapping table with Basic view (MARA_PRDHA). Share the same mapping table with S2P. To check with Jasleen Madhok, John Hancock, Angelo Buosi |
| MAP_KONDM | Material Price Group Mapping |
| MAP_KTGRM | Account Assignment Group - Material Mapping |
| MAP_MTPOS | Item Category Group Mapping MTPOS_MARA - Basic Data 1 MTPOS - Sales: Sales Org. 2 Share the same mapping table with S2P. To check with Jasleen Madhok, John Hancock, Angelo Buosi |
| MAP_MVGR1 | Material Group 1 Mapping Use in QF2 |
| MAP_MVGR2 | Material Group 2 Mapping Use in QF2 and WQ2 |
| MAP_MVGR3 | Material Group 3 Mapping Use in QF2 |
| MAP_MVGR5 | Material Group 5 Mapping Use in WQ2 |
Transformation Dependencies
List the steps that need to occur before transformation can commence:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source Data Integrity - Ensure extracted data is complete, accurate, and consistent. - Validate that data types and formats align with transformation requirements. | Syniti |
| 2 | Referential Integrity - Ensure dependent records are transformed together or in advance | Syniti |
| 3 | Transformation Logic and Mapping - Define data mapping rules between source and target schemas. | Data Team |
| 4 | Performance and Scalability Considerations - Optimize transformation processes for large datasets. - Ensure system resources can handle transformation workloads | Syniti |
| 5 | Logging and Error Handling - Maintain detailed logs of transformation activities. - Define error-handling procedures for failed transformations | Syniti |
Pre-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Compare Data Counts | |
| Validate the mandatory fields | Validate there is a value for all the mandatory fields |
| Validate Primary Keys and Unique Constraints | |
| Test Referential Integrity | Confirm dependent records exist in related tables |
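The project-team completeness checks above (record counts, mandatory fields, primary-key uniqueness) can be sketched as one validation pass over the load data. Field names in the usage are assumed examples; the real mandatory and key field lists come from the target design table.

```python
from collections import Counter

def completeness_checks(source_rows, load_rows, mandatory, key_fields):
    """Return a list of error strings; an empty list means all checks passed."""
    errors = []
    # 1. Compare data counts between extracted and load-ready data.
    if len(source_rows) != len(load_rows):
        errors.append(f"count mismatch: {len(source_rows)} vs {len(load_rows)}")
    # 2. Validate that every mandatory field has a value.
    for i, row in enumerate(load_rows):
        for f in mandatory:
            if not row.get(f):
                errors.append(f"row {i}: mandatory field {f} is empty")
    # 3. Validate primary-key uniqueness over the key fields.
    keys = Counter(tuple(r[f] for f in key_fields) for r in load_rows)
    errors += [f"duplicate key {k}" for k, n in keys.items() if n > 1]
    return errors
```

For the sales view, the natural key would be along the lines of material, sales organization, and distribution channel (the MVKE key), so `key_fields` would list those columns.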
Accuracy
| Task | Action |
|---|---|
| Validate the transformation | Validate that fields requiring transformation contain the transformed value rather than the original field value |
| Check Data Consistency | |
Business
Completeness
| Task | Action |
|---|---|
| Verify Record Count | Business Data Owner/s to verify that the total number of relevant records from the DCT is equal to the total number of records in the Preload and Load Sheets. |
Accuracy
| Task | Action |
|---|---|
| Conversion Accuracy | Business Data Owner/s to verify that all the data in the load table/file is accurate as per the endorsed transformation/mapping rules (and signed-off DCT data). |
Load
The load process includes:
- Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
- Once the data is loaded into the target system, it is extracted and prepared for post-load data validation.
Load Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data Team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors | Syniti |
Load Phase and Dependencies
The Materials - Sales views with sales long text will be loaded in the pre-cutover period.
Before loading, there is a dependency on configuration and on conversion object 2019 - Materials - Basic View (required material types ROH/FERT/HALB/SRV/DIEN/ZDIE/Packaging/Spares). The configuration must first be transported into the respective system, including manual configuration such as the number range setup, and the 2019 Materials - Basic View records must be loaded into the respective system first.
Configuration
| Item # | Configuration Item |
|---|---|
| 1 | Sales Organization |
| 2 | Distribution Channel |
| 3 | Unit of Measure |
| 4 | Material Status - Sales |
| 5 | Plant |
| 6 | Tax Condition Type |
| 7 | Tax Classification - Material |
| 8 | Statistics Group - Material |
| 9 | Product Hierarchy |
| 10 | Material Price Group |
| 11 | Account Assignment Group - Material |
| 12 | Item Category Group |
| 13 | Material Group 1 |
| 14 | Material Group 2 |
| 15 | Material Group 3 |
| 16 | Material Group 5 |
Conversion Objects
| Object # | Preceding Object Conversion Approach |
|---|---|
| 2019 | Materials - Basic View (required Material Types: ROH/FERT/HALB/SRV/DIEN/ZDIE/Packaging/Spares) |
Error Handling
| Error Type | Error Description | Action Taken |
|---|---|---|
| Configuration / Data Transformation | The value XXX for field XXX doesn't exist | |
| Configuration | Mandatory field XXX is missing | |
Post-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Validate Record count in the backend | Validate that the main tables, such as MVKE, have the same record counts as the load file |
| Display Records | Pick a few random material numbers and run the Material Report to validate that the Material Sales View with sales text can be displayed without error |
| Perform Source-to-Target Comparisons | |
Accuracy
| Task | Action |
|---|---|
| Execute Sample Queries and Reports | |
| Conduct Post-Migration Reconciliation | Generate reports comparing pre- and post-migration data. |
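The source-to-target comparison and post-migration reconciliation can be sketched as a keyed field-by-field diff. A minimal sketch, assuming records are keyed MVKE-style by (material, sales organization, distribution channel); the compared field names are examples.

```python
def reconcile(source_by_key, target_by_key, fields):
    """Compare pre- and post-migration records keyed by the same tuple.

    Returns a list of (key, description) differences: records missing in
    the target, plus any field whose value changed unexpectedly.
    """
    diffs = []
    for key, src in source_by_key.items():
        tgt = target_by_key.get(key)
        if tgt is None:
            diffs.append((key, "missing in target"))
            continue
        for f in fields:
            if src.get(f) != tgt.get(f):
                diffs.append((key, f"{f}: {src.get(f)!r} != {tgt.get(f)!r}"))
    return diffs
```

Fields that were deliberately transformed (e.g. mapped distribution channels) should be compared after applying the endorsed mapping to the source side, so only genuine load errors show up as differences.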
Business
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
Perform Manual Testing | Conduct manual spot-checks for additional assurance. |
Key Assumptions
- The Master Data Standard is up to date as of the date of this conversion approach document and the data load.
- Materials - Sales views with sales long text are in scope based on the data design and any exceptions requested by the business.
See also
Change log
Workflow history

