| Status | Revision in Progress |
|---|---|
| Owner | |
| Stakeholders | |
Purpose
The purpose of this document is to define the conversion approach for creating open sales orders in SAP S/4HANA.
Open sales orders are orders that are not fully delivered or billed and still require fulfillment or financial processing. This conversion ensures business continuity and data consistency for ongoing customer transactions. The migration should be seamless, so that the business can continue working with open sales orders in S/4HANA.
Conversion Scope
The scope of this document covers the approach for converting Open Sales Orders, including relevant schedule lines and partner data, from legacy source systems into SAP S/4HANA.
From the current system landscape, open sales order data resides separately across legacy systems (WP2 and PF2), often with inconsistent formatting, incomplete references, and varying document structures. It is required to harmonize, cleanse, and validate the sales order data to ensure that only business-relevant, open orders are migrated to S/4HANA.
While WP2 and PF2 serve as the primary source systems, various transformation and mapping logic will be applied to generate load templates that conform to the target S/4HANA order structure, including header, item, schedule line, pricing, and partner functions. The conversion scope specifically includes orders that are not fully delivered or billed, with open quantities, future delivery dates, or billing schedules, and which remain critical to business continuity post go-live.
The data from the legacy systems includes:
- Sales Orders with Open Quantities:
  - Orders with undelivered items, i.e., delivery quantity < order quantity.
  - Orders where the delivery status is not marked as complete.
  - Orders with open billing quantity, meaning not fully invoiced.
- Special Sales Documents Still in Processing: XXX to be decided
  - Return orders or credit memo requests not yet fully processed.
  - Partial deliveries or partial credit documents awaiting completion.
- Orders Required for Business Continuity or Customer Service:
  - Sales orders flagged by business or customer service for visibility after go-live.
  - Orders still under negotiation or partial execution (e.g., changed or parked orders).
The data from the legacy systems excludes:
- Fully Delivered and Billed Sales Orders:
  - Orders where all items are delivered (delivery complete) and invoiced (billing complete).
  - Orders archived in the legacy system and no longer used operationally.
- Rejected or Cancelled Orders:
  - Sales orders where the header or item status indicates cancellation or rejection.
  - Orders marked with final rejection reason codes.
- Out-of-Scope Sales Document Types or Sales Areas:
  - Document types not in use or excluded by process design (e.g., internal orders, sample/test).
  - Orders belonging to sales organizations, distribution channels, or divisions not part of the S/4 scope.
- Orders Without Business or Legal Justification:
  - Orders with zero values, no meaningful transactional history, or created erroneously.
  - Orders flagged by business for exclusion due to redundancy.
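A minimal Python sketch of the relevancy check implied by the inclusion and exclusion criteria above. The field names, the in-scope sales organization values, and the simplified status logic are illustrative assumptions; the actual status fields in WP2/PF2 must be confirmed with the functional team.

```python
from dataclasses import dataclass

@dataclass
class LegacyOrderItem:
    order_qty: float       # cumulative order quantity
    delivered_qty: float   # cumulative delivered quantity
    billed_qty: float      # cumulative billed quantity
    rejection_reason: str  # non-empty if the item is rejected/cancelled
    sales_org: str         # legacy sales organization

# Placeholder scope list; the real values come from the S/4 scope definition.
IN_SCOPE_SALES_ORGS = {"1000", "2000"}

def is_migration_relevant(item: LegacyOrderItem) -> bool:
    """Apply the inclusion/exclusion criteria to a single order item."""
    if item.rejection_reason:                      # rejected or cancelled: exclude
        return False
    if item.sales_org not in IN_SCOPE_SALES_ORGS:  # out-of-scope sales area: exclude
        return False
    open_delivery = item.delivered_qty < item.order_qty
    open_billing = item.billed_qty < item.order_qty
    return open_delivery or open_billing           # fully delivered and billed: exclude

print(is_migration_relevant(LegacyOrderItem(10, 4, 0, "", "1000")))    # True
print(is_migration_relevant(LegacyOrderItem(10, 10, 10, "", "1000")))  # False
```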
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Open Sales Orders | XXX | S/4HANA System | XXX |
| PF2 | Open Sales Orders | XXX | S/4HANA System | XXX |
Additional Information
Multi-language Requirement
Document Management
Legal Requirement
Special Requirements
Target Design
This section describes the technical design of the target for this conversion approach.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
Data Cleansing
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 9065-001 | | Sales order open for more than 3 years | Orders created more than 3 years ago | | PF2/WP2 |
| 9065-002 | | Missing sold-to data | Sold-to party is not in migration scope | | PF2/WP2 |
| 9065-003 | | Missing ship-to data | Ship-to party is not in migration scope | | PF2/WP2 |
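An illustrative Python check for rule 9065-001 (order open for more than 3 years), assuming the extract exposes the creation date; the reference date and the 3-year threshold are to be confirmed against the agreed cut-off.

```python
from datetime import date

def flag_9065_001(created_on: date, reference_date: date) -> bool:
    """Flag rule 9065-001: order created more than 3 years before the reference date."""
    try:
        cutoff = reference_date.replace(year=reference_date.year - 3)
    except ValueError:  # reference date is Feb 29 in a leap year
        cutoff = reference_date.replace(year=reference_date.year - 3, day=28)
    return created_on < cutoff

print(flag_9065_001(date(2020, 1, 15), date(2024, 6, 1)))  # True -> report for cleansing
print(flag_9065_001(date(2023, 5, 1), date(2024, 6, 1)))   # False
```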
Conversion Process
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with extracting metadata and raw data from the source systems, such as the Syensqo ECC systems (i.e., WP2/PF2), on a periodic basis. The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system.
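As an illustration only, the three ETL stages can be sketched as plain Python functions; the in-memory rows stand in for Syniti Migrate staging tables, and the load step is stubbed rather than calling any real load tool.

```python
def extract(source_system: str) -> list[dict]:
    """Stage raw records from a legacy source (WP2 or PF2)."""
    # Placeholder rows; in practice these come from the source tables or reports.
    return [{"order": "4711", "qty": "10", "uom": "pc", "source": source_system}]

def transform(rows: list[dict]) -> list[dict]:
    """Validate, standardize formats, and apply business rules."""
    out = []
    for row in rows:
        if not row.get("order"):          # validation: drop inconsistent rows
            continue
        row["uom"] = row["uom"].upper()   # standardization: align unit format
        row["qty"] = float(row["qty"])    # standardization: numeric quantity
        out.append(row)
    return out

def load(rows: list[dict]) -> None:
    """Hand the target-ready rows to the load step (stubbed as a print)."""
    for row in rows:
        print("load ->", row)

load(transform(extract("WP2") + extract("PF2")))
```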
Data Privacy and Sensitivity
N/A
Extraction
Extract data from a source into Syniti Migrate. There are two possibilities:
- The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are three methods:
  - Perform full data extraction from the relevant tables in the source system(s).
  - Perform extraction through the application layer.
  - Only if Syniti Migrate cannot connect to the source, load the data to the repository from a provided source system extract/report (see the file-load sketch below).
- The data does not exist (or cannot be converted from its current state). The data is collected manually by the business directly in Syniti Migrate, using a Data Collection Template (DCT).
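A hedged Python sketch of the fallback path above (loading a provided source extract when Syniti Migrate cannot connect directly); the file name and semicolon delimiter are assumptions for illustration.

```python
import csv

def load_extract_file(path: str) -> list[dict]:
    """Read a delimited source extract into staging rows."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f, delimiter=";"))

# rows = load_extract_file("WP2_open_orders_extract.csv")  # hypothetical file name
```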
The agreed relevancy criteria are applied to the extracted records to identify the records applicable for the target loads.
Extraction Run Sheet
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| 1 | Extraction Scope Definition - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti |
| 2 | Extraction Methodology - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| 3 | Extraction Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| 4 | Data Quality and Validation - Define error handling mechanisms for extraction failures. | Syniti |
Selection Screen
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
| N/A | | | | |
Data Collection Template (DCT)
A target-ready Data Collection Template will be created for the data, with the exception of some fields that require transformation as described in the transformation rules.
DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
Extraction Dependencies
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
Transformation
The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, the Functional team, and the Data team. It identifies the transformation activity required to allow Syniti Migrate to make the data target ready:
- Perform value mapping and apply data transformation rules.
- Legacy values are mapped to the to-be values (this could include a default value).
- Values are transformed according to the rules defined in Syniti Migrate.
- Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data for the business to perform pre-load data validation, as illustrated in the sketch below.
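A minimal Python sketch of the target-ready preparation step, assuming legacy SAP field names (AUART, KUNNR, ZTERM, KWMENG) and illustrative target column names; the actual load-template structure comes from the field-mapping workshop, not from this example.

```python
def to_target_row(legacy: dict, zterm_map: dict, default_zterm: str) -> dict:
    """Map legacy values to to-be values (with a default) and shape the load row."""
    return {
        "SalesOrderType": legacy["AUART"],
        "SoldToParty": legacy["KUNNR"],
        "PaymentTerms": zterm_map.get(legacy["ZTERM"], default_zterm),
        "RequestedQuantity": float(legacy["KWMENG"]),
    }

row = to_target_row(
    {"AUART": "OR", "KUNNR": "100001", "ZTERM": "Z030", "KWMENG": "10"},
    zterm_map={"Z030": "NT30"},   # placeholder MAP_ZTERM content
    default_zterm="NT30",         # assumed default, to be agreed with the business
)
print(row)
```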
Transformation Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units) | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable - Apply conditional transformations based on predefined logic/business rules | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms. | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
Transformation Mapping
| Mapping Table Name | Mapping Table Description |
|---|---|
| MAP_ZTERM | Payment Term Mapping |
| MAP_PARVW | Partner Function Mapping |
| MAP_INCO1 | Incoterm 1 Mapping |
| MAP_VKORG | Sales Organization Mapping |
| MAP_VTWEG | Distribution Channel Mapping |
| MAP_TDID | Text ID Mapping |
| MAP_VRKME | Unit of Measure Mapping |
| MAP_WERKS | Plant Mapping |
| MAP_SPART | Division Mapping |
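An illustrative way to apply the mapping tables above, modeling each as a Python dict keyed by legacy value; all mapping contents shown are placeholders, not agreed values, and the real mappings are maintained in Syniti Migrate.

```python
MAPPINGS = {
    "ZTERM": {"Z030": "NT30"},   # MAP_ZTERM: payment terms (placeholder)
    "PARVW": {"AG": "SP"},       # MAP_PARVW: partner functions (placeholder)
    "VKORG": {"0001": "1000"},   # MAP_VKORG: sales organizations (placeholder)
}

def apply_mappings(record: dict) -> dict:
    """Replace legacy values with to-be values for every field that has a mapping table."""
    out = dict(record)
    for field, table in MAPPINGS.items():
        if field in out:
            out[field] = table.get(out[field], out[field])  # keep the value if unmapped
    return out

print(apply_mappings({"ZTERM": "Z030", "VKORG": "0001", "AUART": "OR"}))
```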
Transformation Dependencies
The following steps must occur before transformation can commence:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source Data Integrity - Ensure extracted data is complete, accurate, and consistent. - Validate that data types and formats align with transformation requirements. | Syniti |
| 2 | Referential Integrity - Ensure dependent records are transformed together or in advance. | Syniti |
| 3 | Transformation Logic and Mapping - Define data mapping rules between source and target schemas. | Data Team |
| 4 | Performance and Scalability Considerations - Optimize transformation processes for large datasets. - Ensure system resources can handle transformation workloads. | Syniti |
| 5 | Logging and Error Handling - Maintain detailed logs of transformation activities. - Define error-handling procedures for failed transformations. | Syniti |
Pre-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Validate the record count | Validate that the loading file has the correct number of records |
| Validate the mandatory fields | Validate that all mandatory fields have values |
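A minimal Python sketch of the two completeness checks above, assuming the load file has been staged as a list of dict rows; the mandatory field names are placeholders.

```python
MANDATORY_FIELDS = ["SalesOrderType", "SoldToParty", "RequestedQuantity"]  # placeholders

def record_count_ok(rows: list[dict], expected: int) -> bool:
    """The load file must contain exactly the expected number of records."""
    return len(rows) == expected

def missing_mandatory(rows: list[dict]) -> list[tuple[int, str]]:
    """Return (row index, field name) pairs where a mandatory field is empty."""
    return [
        (i, field)
        for i, row in enumerate(rows)
        for field in MANDATORY_FIELDS
        if not str(row.get(field, "")).strip()
    ]

rows = [{"SalesOrderType": "OR", "SoldToParty": "", "RequestedQuantity": "10"}]
print(record_count_ok(rows, 1))  # True
print(missing_mandatory(rows))   # [(0, 'SoldToParty')]
```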
Accuracy
| Task | Action |
|---|---|
| Validate the transformation | Validate that fields requiring transformation contain the transformed value rather than the original field value |
| Check Data Consistency | |
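An illustrative Python check for the transformation validation above: confirm a transformed field carries the mapped to-be value rather than the original legacy value, using MAP_ZTERM as an example with placeholder contents.

```python
MAP_ZTERM = {"Z030": "NT30"}  # placeholder mapping content

def transformation_applied(legacy_row: dict, load_row: dict) -> bool:
    """True if the load row carries the mapped payment term, not the legacy value."""
    expected = MAP_ZTERM.get(legacy_row["ZTERM"], legacy_row["ZTERM"])
    return load_row["PaymentTerms"] == expected

print(transformation_applied({"ZTERM": "Z030"}, {"PaymentTerms": "NT30"}))  # True
print(transformation_applied({"ZTERM": "Z030"}, {"PaymentTerms": "Z030"}))  # False
```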
Business
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
Load
The load process includes:
- Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually (see the sketch below).
- Once the data is loaded into the target system, it is extracted and prepared for post-load data validation.
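A hedged Python sketch of producing a manual load file; the delimiter, columns, and file name are assumptions, and the agreed load template governs the real format.

```python
import csv

def write_load_file(rows: list[dict], path: str) -> None:
    """Write target-ready rows to a delimited load template."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()), delimiter=";")
        writer.writeheader()
        writer.writerows(rows)

write_load_file(
    [{"SalesOrderType": "OR", "SoldToParty": "100001"}],  # placeholder row
    "open_sales_orders_load.csv",                         # hypothetical file name
)
```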
Load Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures. | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks. | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors. | Syniti |
Load Phase and Dependencies
Open Sales Orders will be loaded in the pre-cutover or cutover phase, depending on business cut-off.
Configuration
| Item # | Configuration Item |
|---|---|
| 1 | Order type |
| 2 | Item categories |
| 3 | Schedule line categories |
| 4 | Pricing procedure |
| 5 | Number range settings for orders |
Conversion Objects
| Object # | Preceding Object Conversion Approach |
|---|---|
| 1 | Business Partner |
| 2 | Material Master |
| 3 | Price condition |
| 4 | Output condition |
Error Handling
| Error Type | Error Description | Action Taken |
|---|---|---|
| Configuration / Data Transformation | The value XXX for field XXX does not exist | |
| Configuration | Mandatory field XXX is missing | |
Post-Load Validation
Project Team
The following post-load validations will be performed by the Project Team.
Completeness
| Task | Action |
|---|---|
| Validate record count in the backend | Validate that the main tables, such as VBAK and VBAP, have the same number of records as the loading file |
| Display Records | Pick a few random sales document numbers, display them with transaction VA03, and validate the details |
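A minimal Python sketch of the backend count check above; the VBAK/VBAP counts would come from a backend query or extract and are passed in here as plain integers.

```python
def post_load_counts(load_headers: int, load_items: int,
                     vbak_count: int, vbap_count: int) -> list[str]:
    """Return discrepancies between the load file and the backend tables."""
    issues = []
    if vbak_count != load_headers:
        issues.append(f"VBAK: expected {load_headers}, found {vbak_count}")
    if vbap_count != load_items:
        issues.append(f"VBAP: expected {load_items}, found {vbap_count}")
    return issues

print(post_load_counts(100, 340, 100, 338))  # flags the VBAP mismatch
```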
Accuracy
| Task | Action |
|---|---|
| Execute Sample Queries and Reports | |
| Conduct Post-Migration Reconciliation | Generate reports comparing pre- and post-migration data. |
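A minimal reconciliation sketch in Python, comparing pre- and post-migration values by order number; the record structure is a placeholder for the real comparison report.

```python
def reconcile(pre: dict[str, dict], post: dict[str, dict]) -> list[str]:
    """Compare records by order number and list field-level differences."""
    diffs = []
    for order, old in pre.items():
        new = post.get(order)
        if new is None:
            diffs.append(f"{order}: missing after migration")
            continue
        for field, value in old.items():
            if new.get(field) != value:
                diffs.append(f"{order}.{field}: {value!r} -> {new.get(field)!r}")
    return diffs

print(reconcile({"4711": {"qty": 10}}, {"4711": {"qty": 8}}))  # ["4711.qty: 10 -> 8"]
```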
Business
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
| Display Records | Pick a few random sales document numbers, display them with transaction VA03, and validate the details |
Key Assumptions
- Master data standards are up to date as of the date this conversion approach is documented and the data load is performed.
- Data is in scope based on the data design and any exceptions requested by the business.
See also
Change log
Workflow history
