| Status | |
|---|---|
| Owner | LIU-ext, Ekawati |
| Stakeholders | |
The purpose of this document is to define the conversion approach to create the Materials - Sales View with sales long text in S/4HANA.
The scope of this document covers the approach for converting the active Material Master Sales View, including the sales long text, from the legacy source systems into S/4HANA, following the Material Sales View Master Data Design Standard.
From the current system landscape, Material Master data exists separately in the legacy systems (WP2 and PF2), with potential discrepancies in both organizations. Harmonization and validation are required to ensure accurate and consolidated data in S/4HANA. While WP2 and PF2 serve as source systems, extensive mapping and transformation logic will be necessary to produce properly formatted load templates in line with the target design.
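As an illustration of the harmonization step (not part of the endorsed design), the two legacy extracts can be compared to flag materials whose key attributes differ between WP2 and PF2. All field names below are hypothetical:

```python
# Hypothetical sketch: flag discrepancies between WP2 and PF2 material
# extracts. Field names ("material", "base_uom") are illustrative only.
def find_discrepancies(wp2_records, pf2_records, key="material", fields=("base_uom",)):
    """Return materials present in both systems whose compared fields differ."""
    pf2_by_key = {r[key]: r for r in pf2_records}
    issues = []
    for r in wp2_records:
        other = pf2_by_key.get(r[key])
        if other is None:
            continue  # material exists only in WP2
        diffs = {f: (r.get(f), other.get(f)) for f in fields if r.get(f) != other.get(f)}
        if diffs:
            issues.append((r[key], diffs))
    return issues

wp2 = [{"material": "M001", "base_uom": "KG"}, {"material": "M002", "base_uom": "EA"}]
pf2 = [{"material": "M001", "base_uom": "LB"}]
print(find_discrepancies(wp2, pf2))  # M001 differs on base_uom
```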
The data from the legacy systems includes:
The data from the legacy systems excludes:

List of source systems and approximate number of records
*MVKE records = Sales Organization x 3 Distribution Channels

| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Materials - Sales View | 93,000 | S/4HANA ROW | 93,000 |
| PF2 | Materials - Sales View | | S/4HANA ROW | |
| WP2 | Materials - Sales View | 13,500 | S/4HANA China | 13,500 |
| PF2 | Materials - Sales View | | S/4HANA China | |
| WP2 | Materials - Sales View | 63,000 | S/4HANA CUI | 63,000 |
| PF2 | Materials - Sales View | | S/4HANA CUI | |
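As a sanity check on the volumes above, the MVKE record count is the product of materials, sales organizations, and distribution channels (per the footnote). The figures below are illustrative only, not actual extract counts:

```python
# Illustrative check of MVKE volume: one record per material, per sales
# organization, per distribution channel (3 channels per the footnote).
materials = 31_000      # hypothetical count of distinct materials in scope
sales_orgs = 1          # hypothetical number of sales organizations
dist_channels = 3       # per the footnote above

mvke_records = materials * sales_orgs * dist_channels
print(mvke_records)  # 93000
```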
Not Applicable
Document Management
Not Applicable
Per the US Department of Defense (DoD), Cybersecurity Maturity Model Certification (CMMC) 2.0 is a mandatory certification for all contractors handling Controlled Unclassified Information (CUI) and Federal Contract Information (FCI). CUI includes sensitive technical data (e.g., design specs, system information) related to U.S. military and space applications. The Composites business handles CUI and is therefore within CMMC scope. Without certification, the business risks disqualification from existing and future DoD programs.
It is mandatory to implement CMMC-compliant systems and processes for all organizations dealing with CUI.
Therefore, there will be one SAP instance dedicated to CUI-related entities. The migration of CUI-related entities will be handled by a US-based data consultant using separate tools.
Materials - Sales View with Sales Long Text
A. Different SAP Instance Migration Approach
Due to compliance requirements, there will be one SAP instance for the Rest of the World (ROW), one for China, and one for CUI.
Please refer to the link for the entity mapping for each instance. Open point: if a data object is applicable to multiple instances, which business rule should be followed?
- A record is identified as belonging to the SAP ROW instance using the logic below.

| Source | Company Code | Company Name |
|---|---|---|
| PF2 | 3383 | SOLVAY SPOL KR |
| PF2 | 5686 | SOLVAY QUIMICA (AR) |
| PF2 | 5835 | SOLVAY SPOL IT |
| PF2 | 5846 | SOLVAY SPOL JP |
| PF2 | 5955 | SOLVAY SPECIALIT IN |
| PF2 | 5978 | SOLVAY SPOL BE |
| PF2 | 6080 | SOLVAY FRANCE FR |
| PF2 | 6327 | SLV SP CHEM ASIA PAC |
| WP2 | AE01 | AE Composite Matls |
| WP2 | AU01 | Technology Solutions |
| WP2 | AU02 | AU Novecare |
| WP2 | BR42 | Quimicos BR Novecare |
| WP2 | BR44 | Quimicos BR SpecPoly |
| WP2 | BR47 | Quimicos BR Tech Sol |
| WP2 | CA12 | Technology Solutions |
| WP2 | CA15 | Cytec CA Novecare |
| WP2 | CL01 | Cytec Chile Tech Sol |
| WP2 | DE13 | Composites Materials |
| WP2 | FRAZ | SPOP Novecare |
| WP2 | GB34 | Solvay Solutions UK |
| WP2 | GB40 | Composites Materials |
| WP2 | ID01 | CYID Tech Solutions |
| WP2 | IN04 | SSIPL IN Novecare |
| WP2 | IT07 | Novecare Italie |
| WP2 | JP02 | Solvay Japan, Ltd |
| WP2 | JP03 | Solvay Nicca, Ltd |
| WP2 | JP10 | Technology Solutions |
| WP2 | KR11 | CY KR Novecare |
| WP2 | KR12 | CY KR Tech Solutions |
| WP2 | MX01 | MX Tech Solutions |
| WP2 | MX02 | MX Novecare |
| WP2 | MX08 | MX Spec Polymers |
| WP2 | NL01 | Cytec Ind BV TS |
| WP2 | NL05 | Solvay Sol NL Noveca |
| WP2 | NZ01 | Solvay New Zealand |
| WP2 | PE01 | Technology Solutions |
| WP2 | SG03 | Solvay Sp Chem SG |
| WP2 | SG08 | SSCAP SG Tech Sol |
| WP2 | TH01 | Technology Solutions |
| WP2 | TH03 | Solvay (Bangpoo) SC |
- A record is identified as belonging to the SAP China instance using the logic below.

| Source | Company Code | Company Name |
|---|---|---|
| PF2 | 5876 | SOLVAY SHANGHAI LTD |
| PF2 | 5991 | SOLVAY SPEC POLYMERS |
| WP2 | CN15 | Solvay ZJ Novecare |
| WP2 | CN18 | Zhuhai SLV Novecare |
| WP2 | CN19 | Solvay ZJG Novecare |
| WP2 | CN27 | Solvay ZJ Tech Sol |
| WP2 | CN28 | CYIS Tech Solutions |
| WP2 | CN36 | CYIS Composites Matl |
| WP2 | CN41 | CEM (SH) Compos Mats |
- A record is identified as belonging to the SAP CUI instance using the logic below.

| Source | Company Code | Company Name |
|---|---|---|
| PF2 | 5782 | SOLVAY SPOL US |
| WP2 | US05 | Solvay US Novecare |
| WP2 | US23 | Technology Solutions |
| WP2 | US32 | Composite Materials |
| WP2 | US33 | CEM Defense Material |
| WP2 | US50 | Composite Mat (7008) |
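Illustratively, the routing rules above amount to a lookup on source system plus company code. The sets below list only a few of the codes from the tables above, so this is a sketch of the rule, not the implemented logic:

```python
# Hypothetical sketch: a record's legacy source system and company code
# determine its target S/4HANA instance. Sets are abbreviated samples of
# the full code lists documented above.
CHINA = {("PF2", "5876"), ("PF2", "5991"), ("WP2", "CN15"), ("WP2", "CN41")}
CUI = {("PF2", "5782"), ("WP2", "US05"), ("WP2", "US50")}
ROW = {("PF2", "3383"), ("WP2", "AE01"), ("WP2", "TH03")}

def target_instance(source_system, company_code):
    """Return the target instance for a record, or None if out of scope."""
    key = (source_system, company_code)
    if key in CUI:
        return "S/4HANA CUI"
    if key in CHINA:
        return "S/4HANA China"
    if key in ROW:
        return "S/4HANA ROW"
    return None  # not in any migration scope list

print(target_instance("WP2", "CN15"))  # S/4HANA China
print(target_instance("PF2", "5782"))  # S/4HANA CUI
```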
The Materials - Sales View with Sales Long Text data strictly adheres to the Master Data Standard; the complete definition of the key fields holding this information is documented in the Master Data Standard document.
The table below outlines the technical design of the target structure for this conversion approach.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
| MARA | MATNR | MATNR | Material Number | CHAR | 18 | Mandatory |
All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.
If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 2003-001 | | | | | PF2/WP2 |
| 2003-002 | | | | | PF2/WP2 |
| 2003-003 | | | | | PF2/WP2 |
| | | Set sales view block for material not eligible for migration | Check the usage of material sales view not in use for 5 years and without stock, but still active | | |
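The usage rule above can be sketched as a simple check per material. Field names and the cut-off date are assumptions for illustration only:

```python
from datetime import date, timedelta

# Hypothetical sketch of the cleansing rule: block the sales view for
# materials whose sales view has not been used for 5 years, that carry no
# stock, yet are still active. Parameter names are illustrative.
def should_block_sales_view(last_sales_usage, stock_qty, is_active,
                            cutoff=date(2025, 1, 1)):
    unused_5y = last_sales_usage is None or \
        (cutoff - last_sales_usage) > timedelta(days=5 * 365)
    return is_active and stock_qty == 0 and unused_5y

print(should_block_sales_view(date(2018, 6, 1), 0, True))  # True: stale, no stock
print(should_block_sales_view(date(2023, 3, 1), 0, True))  # False: recent usage
```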
The high-level process is represented by the diagram below:

Extract data from a source into Syniti Migrate. There are two possibilities:
The agreed relevancy criteria are applied to the extracted records to identify the records applicable for the target loads.
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| 1 | Extraction Scope Definition - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti |
| 2 | Extraction Methodology - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| 3 | Extraction Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| 4 | Data Quality and Validation - Define error handling mechanisms for extraction failures. | Syniti |
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
Materials - Sales view with sales long text DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
The steps below need to occur before extraction can commence:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
| 7 | Data cleansing of legacy Material Master - Sales View with sales long text data must be completed. If standardization within the DCT begins using relevant data from PF2 and WP2 before the cleansing is finalized, it is understood that the business will take due diligence to ensure any subsequent delta cleansing is verified and aligned within the DCT. | Business |
The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, Functional, and Data teams, and it identifies the transformation activity required for Syniti Migrate to make the data target-ready:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units). | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable. - Apply conditional transformations based on predefined logic/business rules. | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms. | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
| Mapping Table Name | Mapping Table Description |
|---|---|
| MAP_VKORG | Sales Organization Mapping |
| MAP_VTWEG | Distribution Channel Mapping. Per SyWay - Sales Area.pptx, not all current distribution channels and divisions will exist in the to-be solution; check the current distribution channels and discuss the mapping approach with the Functional team. |
| MAP_MEINS | Unit of Measure Mapping MVKE_VRKME shares the same mapping table with Basic view (MARA_MEINS). Share the same mapping table with S2P. To check with Jasleen Madhok, John Hancock, Angelo Buosi |
| MAP_MSTAV | Material Status - Sales Mapping. MSTAV: cross-distribution-chain status; VMSTA: distribution-chain-specific status. Open point: check the configuration in ECC and S/4HANA to determine whether this mapping is needed, or whether materials with a non-blank status are excluded from the ECC download. |
| MAP_WERKS | Plant Mapping WERKS - Plant DWERK - Delivering Plant Share the same mapping table with S2P. To check with Jasleen Madhok, John Hancock, Angelo Buosi |
| MAP_TATYP | Tax Condition Type Mapping |
| MAP_TAXKM | Tax Classification - Material Mapping |
| MAP_VERSG | Statistics Group - Material Mapping TBD |
| MAP_PRDHA | Product Hierarchy Mapping MARA_PRDHA - Basic Data 1 MVKE_PRODH - Sales: Sales Org. 2 Prod. Hierarchies' levels are the same. Share the same mapping table with Basic view (MARA_PRDHA). Share the same mapping table with S2P. To check with Jasleen Madhok, John Hancock, Angelo Buosi |
| MAP_KONDM | Material Price Group Mapping |
| MAP_KTGRM | Account Assignment Group - Material Mapping |
| MAP_MTPOS | Item Category Group Mapping MTPOS_MARA - Basic Data 1 MTPOS - Sales: Sales Org. 2 Share the same mapping table with S2P. To check with Jasleen Madhok, John Hancock, Angelo Buosi |
| MAP_MVGR1 | Material Group 1 Mapping Use in QF2 |
| MAP_MVGR2 | Material Group 2 Mapping Use in QF2 and WQ2 |
| MAP_MVGR3 | Material Group 3 Mapping Use in QF2 |
| MAP_MVGR5 | Material Group 5 Mapping Use in WQ2 |
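The mapping tables above are applied as simple cross-reference lookups during transformation, with unmapped values reported rather than passed through silently. A minimal sketch, with made-up legacy and target codes (the MAP_VKORG entries here are hypothetical):

```python
# Hypothetical sketch of applying a cross-reference table such as MAP_VKORG.
# The legacy -> target code pairs below are invented for illustration.
MAP_VKORG = {"3383": "KR01", "AE01": "AE10"}

def apply_mapping(records, field, mapping):
    """Replace legacy values in `field`; collect records with no mapping entry."""
    mapped, unmapped = [], []
    for rec in records:
        target = mapping.get(rec[field])
        if target is None:
            unmapped.append(rec)  # surface for correction, don't pass through
        else:
            mapped.append({**rec, field: target})
    return mapped, unmapped

recs = [{"material": "M001", "vkorg": "3383"}, {"material": "M002", "vkorg": "ZZ99"}]
ok, missing = apply_mapping(recs, "vkorg", MAP_VKORG)
print(ok)       # M001 mapped to KR01
print(missing)  # M002 has no mapping entry
```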
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source Data Integrity - Ensure extracted data is complete, accurate, and consistent. - Validate that data types and formats align with transformation requirements. | Syniti |
| 2 | Referential Integrity - Ensure dependent records are transformed together or in advance. | Syniti |
| 3 | Transformation Logic and Mapping - Define data mapping rules between source and target schemas. | Data Team |
| 4 | Performance and Scalability Considerations - Optimize transformation processes for large datasets. - Ensure system resources can handle transformation workloads. | Syniti |
| 5 | Logging and Error Handling - Maintain detailed logs of transformation activities. - Define error-handling procedures for failed transformations. | Syniti |
| Task | Action |
|---|---|
| Compare Data Counts | |
| Validate the mandatory fields | Validate there is a value for all the mandatory fields |
| Validate Primary Keys and Unique Constraints | |
| Test Referential Integrity | Confirm dependent records exist in related tables |
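A minimal sketch of the mandatory-field and duplicate-key checks above, assuming the MVKE key fields MATNR/VKORG/VTWEG identify a sales-view record (an assumption for illustration; the actual checks run in Syniti Migrate):

```python
from collections import Counter

# Hypothetical sketch of pre-load validation: mandatory fields populated
# and no duplicate sales-view keys. Field names assume the MVKE key.
MANDATORY = ("MATNR", "VKORG", "VTWEG")

def validate(records):
    errors = []
    for i, rec in enumerate(records):
        for f in MANDATORY:
            if not rec.get(f):
                errors.append((i, f"missing mandatory field {f}"))
    keys = Counter((r.get("MATNR"), r.get("VKORG"), r.get("VTWEG")) for r in records)
    for key, n in keys.items():
        if n > 1:
            errors.append((key, f"duplicate key ({n} occurrences)"))
    return errors

recs = [
    {"MATNR": "M001", "VKORG": "KR01", "VTWEG": "10"},
    {"MATNR": "M001", "VKORG": "KR01", "VTWEG": "10"},  # duplicate key
    {"MATNR": "M002", "VKORG": "", "VTWEG": "10"},      # missing VKORG
]
print(validate(recs))
```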
| Task | Action |
|---|---|
| Validate the transformation | Validate that fields requiring transformation contain the transformed value rather than the original field value |
| Check Data Consistency | |
| Task | Action |
|---|---|
| Verify Record Count | Business Data Owner/s to verify that the total number of relevant records from the DCT is equal to the total number of records in the Preload and Load Sheets. |
| Task | Action |
|---|---|
| Conversion Accuracy | Business Data Owner/s to verify that all the data in the load table/file is accurate as per the endorsed transformation/mapping rules (and signed-off DCT data). |
The load process includes:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data Team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures. | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks. | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors. | Syniti |
Load Phase and Dependencies
The Materials – Sales views with sales long text will be loaded in the pre-cutover period.
Loading depends on configuration and on object 2019 - Materials - Basic View (required material types: ROH/FERT/HALB/SRV/DIEN/ZDIE/Packaging/Spares). The configuration, including manual configuration such as the number range setup, must first be transported into the respective system, and the 2019 Materials - Basic View records must be loaded into the respective system before this object.
The following configurations are required before loading can commence:
| Item # | Configuration Item |
|---|---|
| 1 | Sales Organization |
| 2 | Distribution Channel |
| 3 | Unit of Measure |
| 4 | Material Status - Sales |
| 5 | Plant |
| 6 | Tax Condition Type |
| 7 | Tax Classification - Material |
| 8 | Statistics Group - Material |
| 9 | Product Hierarchy |
| 10 | Material Price Group |
| 11 | Account Assignment Group - Material |
| 12 | Item Category Group |
| 13 | Material Group 1 |
| 14 | Material Group 2 |
| 15 | Material Group 3 |
| 16 | Material Group 5 |
| Object # | Preceding Object Conversion Approach |
|---|---|
| 2019 | Materials - Basic View (required Material Types: ROH/FERT/HALB/SRV/DIEN/ZDIE/Packaging/Spares) |
The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.
| Error Type | Error Description | Action Taken |
|---|---|---|
| Configuration / Data Transformation | The value XXX for field XXX doesn't exist | |
| Configuration | Mandatory field XXX is missing | |
| Task | Action |
|---|---|
| Validate Record Count in the Backend | Validate that the main tables, such as MVKE, contain the same number of records as the load file |
| Display Records | Pick a few random material numbers and run the material report to validate that the Material Sales View with Sales Text can be displayed without error |
| Perform Source-to-Target Comparisons | |
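The source-to-target comparison can be sketched as a per-sales-area count reconciliation between the load file and the loaded MVKE records. All keys and counts below are illustrative:

```python
# Hypothetical sketch of a source-to-target count comparison, keyed by
# sales area (sales organization, distribution channel). Values invented.
def compare_counts(load_file_counts, mvke_counts):
    """Return sales areas whose loaded record counts differ from the load file."""
    mismatches = {}
    for key in set(load_file_counts) | set(mvke_counts):
        if load_file_counts.get(key, 0) != mvke_counts.get(key, 0):
            mismatches[key] = (load_file_counts.get(key, 0), mvke_counts.get(key, 0))
    return mismatches

loaded = {("KR01", "10"): 120, ("AE10", "10"): 80}
in_s4 = {("KR01", "10"): 120, ("AE10", "10"): 79}
print(compare_counts(loaded, in_s4))  # AE10/10 differs: 80 vs 79
```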
| Task | Action |
|---|---|
| Execute Sample Queries and Reports | |
| Conduct Post-Migration Reconciliation | Generate reports comparing pre- and post-migration data. |
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
| Task | Action |
|---|---|
| Task | Action |
|---|---|
Perform Manual Testing | Conduct manual spot-checks for additional assurance. |