Status

Owner
Stakeholders
The business stakeholders involved in making, reviewing, and endorsing this decision.

Purpose

The purpose of this document is to define the conversion approach for creating Open Sales Orders in S/4HANA.

Open sales orders are orders that are not fully delivered or billed and still require fulfillment or financial processing. This conversion ensures business continuity and data consistency for ongoing customer transactions. The migration should be seamless, so that the business can continue working with open sales orders in S/4HANA.

Summarise how the data is currently utilized and set up in the legacy system(s), how the object is intended to be represented in S/4HANA, and any other relevant information.


Conversion Scope

The scope of this document covers the approach for converting Open Sales Orders, including relevant schedule lines and partner data, from legacy source systems into SAP S/4HANA.

From the current system landscape, open sales order data resides separately across legacy systems (WP2 and PF2), often with inconsistent formatting, incomplete references, and varying document structures. It is required to harmonize, cleanse, and validate the sales order data to ensure that only business-relevant, open orders are migrated to S/4HANA.

While WP2 and PF2 serve as the primary source systems, various transformation and mapping logic will be applied to generate load templates that conform to the target S/4HANA order structure, including header, item, schedule line, pricing, and partner functions. The conversion scope specifically includes orders that are not fully delivered or billed, with open quantities, future delivery dates, or billing schedules, and which remain critical to business continuity post go-live.

The data from the legacy systems includes:

  1. Sales Orders with Open Quantities:
    1. Orders with undelivered items, i.e., delivery quantity < order quantity.
    2. Orders where the delivery status is not marked as complete.
    3. Orders with open billing quantity, meaning not fully invoiced.
  2. Special Sales Documents Still in Processing: XXX to be decided
    1. Return orders or credit memo requests not yet fully processed.
    2. Partial deliveries or partial credit documents awaiting completion.
  3. Orders Required for Business Continuity or Customer Service:
    1. Sales orders flagged by business or customer service for visibility after go-live.
    2. Orders still under negotiation or partial execution (e.g., changed or parked orders).
  4. Relevancy Criteria 2

The data from the legacy systems excludes (see the filter sketch following this list):

  1. Fully Delivered and Billed Sales Orders:
    1. Orders where all items are delivered (delivery complete) and invoiced (billing complete).
    2. Orders archived in legacy system and no longer used operationally.
  2. Rejected or Cancelled Orders:
    1. Sales orders where header or item status indicates cancellation or rejection.
    2. Orders marked with final rejection reason codes.
  3. Out-of-Scope Sales Document Types or Sales Areas:
    1. Document types not in use or excluded by process design (e.g., internal orders, sample/test).
    2. Orders belonging to sales organizations, distribution channels, or divisions not part of S/4 scope.
  4. Orders Without Business or Legal Justification:
    1. Orders with zero values, no meaningful transactional history, or created erroneously.
    2. Orders flagged by business for exclusion due to redundancy.
  5. Exclusion Criteria 2
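
As a sketch of how the inclusion and exclusion rules above might be applied to a staged extract, the Python fragment below filters order items on open quantity and the main exclusion flags. All column names, the scope set, and the file name are illustrative assumptions, not actual WP2/PF2 field names:

```python
import pandas as pd

IN_SCOPE_SALES_ORGS = {"1000", "2000"}   # placeholder: S/4-scoped sales orgs

def is_open_and_relevant(item: pd.Series) -> bool:
    """Apply the inclusion/exclusion criteria to one extracted order item."""
    open_delivery = item["delivered_qty"] < item["order_qty"]    # inclusion 1.1
    open_billing = item["billed_qty"] < item["order_qty"]        # inclusion 1.3
    rejected = pd.notna(item["rejection_code"])                  # exclusion 2
    archived = bool(item["archived_flag"])                       # exclusion 1.2
    out_of_scope = item["sales_org"] not in IN_SCOPE_SALES_ORGS  # exclusion 3.2
    flagged = bool(item["business_exclude_flag"])                # exclusion 4.2
    return (open_delivery or open_billing) and not (
        rejected or archived or out_of_scope or flagged
    )

orders = pd.read_csv("legacy_order_items.csv")        # staged legacy extract
open_orders = orders[orders.apply(is_open_and_relevant, axis=1)]
```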


List of source systems and approximate number of records:

Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records
WP2 | Open Sales Orders | XXX | S/4HANA System | XXX
PF2 | Open Sales Orders | XXX | S/4HANA System | XXX











Additional Information

Multi-language Requirement

Summarize Multi-language Requirement/s, if any

Document Management

Summarize Document Management requirement, if any

Legal Requirement

Summarize Legal Requirement/s, if any

Special Requirements

Specify any special requirements or considerations that may impact the data conversion process based on specific locations, regulatory compliance or system limitations. Clearly outline any regional or localization requirements such as country-specific data formats, legal reporting obligations or industry standards that must be adhered to (e.g., localization rules for countries like China).

If the data conversion involves third-party systems or external data sources, such as Icertis, describe any additional requirements related to data mapping, transformation logic, validation rules or security measures that must be followed.




Target Design

With Functional input, document the technical design of the target fields that are in the scope of this document.

The table below documents the technical design of the target fields for this conversion approach.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
MARA | MATNR | MATNR | Material Number | CHAR | 18 | Mandatory






















Data Cleansing

All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.

If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.

ID | Criticality | Error Message/Report Description | Rule | Output | Source System
9065-001 | | Sales order open for more than 3 years | Orders created more than 3 years ago | | PF2/WP2
9065-002 | | Missing sold-to data | The sold-to party is not in migration scope | | PF2/WP2
9065-003 | | Missing ship-to data | The ship-to party is not in migration scope | | PF2/WP2
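
As an illustration, the three rules above could be expressed as a simple rule check over the staged extract. Column names (created_on, sold_to, ship_to) and the in-scope sets are assumptions; the actual checks run in Syniti Migrate:

```python
import pandas as pd
from datetime import datetime, timedelta

def run_cleansing_checks(orders: pd.DataFrame,
                         sold_to_in_scope: set[str],
                         ship_to_in_scope: set[str]) -> pd.DataFrame:
    """Return one finding per rule violation: (rule_id, order_id, message)."""
    findings = []
    cutoff = datetime.now() - timedelta(days=3 * 365)   # rule 9065-001 horizon
    for _, o in orders.iterrows():
        if o["created_on"] < cutoff:        # created_on assumed parsed to datetime
            findings.append(("9065-001", o["order_id"],
                             "Sales order open for more than 3 years"))
        if o["sold_to"] not in sold_to_in_scope:
            findings.append(("9065-002", o["order_id"], "Missing sold-to data"))
        if o["ship_to"] not in ship_to_in_scope:
            findings.append(("9065-003", o["order_id"], "Missing ship-to data"))
    return pd.DataFrame(findings, columns=["rule_id", "order_id", "message"])
```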








Conversion Process

The high-level process is represented by the diagram below:

The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, i.e., the Syensqo ECC systems (WP2/PF2). The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.


3. Loading
The transformed data is then loaded into the target S/4HANA system.
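
A minimal control-flow skeleton of these three stages is sketched below; the real work happens in Syniti Migrate and the load tool, and the file name and record shape are placeholders:

```python
import csv

def extract() -> list[dict]:
    """Stage 1: read the staged WP2/PF2 extract (Syniti Migrate in practice)."""
    with open("staged_extract.csv", newline="") as f:
        return list(csv.DictReader(f))

def transform(records: list[dict]) -> list[dict]:
    """Stage 2: cleanse, map values, and shape to the S/4HANA order layout.
    Identity transform here as a placeholder for the real rules."""
    return [dict(r) for r in records]

def load(records: list[dict]) -> None:
    """Stage 3: hand target-ready records to the prescribed load tool."""
    print(f"{len(records)} records ready for load")

if __name__ == "__main__":
    load(transform(extract()))
```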


Data Privacy and Sensitivity

N/A


Extraction

Extract data from a source into Syniti Migrate. There are two possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are three methods:
    1. Perform a full data extraction from the relevant tables in the source system(s).
    2. Perform the extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, data is loaded to the repository from a provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template).

The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.
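
For method 1 (table-level extraction), a direct RFC pull from the legacy ECC system might look like the sketch below. Connection parameters and the date filter are placeholders; the standard RFC_READ_TABLE module is used here only for illustration, as it has row-width limits that a dedicated migration tool's connectors do not:

```python
from pyrfc import Connection   # SAP NetWeaver RFC SDK binding

# Placeholder logon data for the WP2 source system
conn = Connection(ashost="wp2-host", sysnr="00", client="100",
                  user="EXTRACT_USER", passwd="***")

# Pull selected sales-order header fields (VBAK) created since a cutoff date
result = conn.call(
    "RFC_READ_TABLE",
    QUERY_TABLE="VBAK",
    DELIMITER="|",
    FIELDS=[{"FIELDNAME": f} for f in ("VBELN", "AUART", "VKORG", "ERDAT")],
    OPTIONS=[{"TEXT": "ERDAT GE '20220101'"}],   # selection as ABAP WHERE text
)
rows = [r["WA"].split("|") for r in result["DATA"]]
```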

Extraction Run Sheet

Req # | Requirement Description | Team Responsible
1 | Extraction Scope Definition: identify the source systems and databases involved; define the data objects (tables, fields, records) to be extracted; establish business rules for data selection. | Syniti
2 | Extraction Methodology: specify the extraction approach (full, incremental, or delta extraction); determine the tools and technologies used; define data filtering criteria to exclude irrelevant records. | Syniti
3 | Extraction Execution Plan: establish execution timelines and batch processing schedules; assign responsibilities for extraction monitoring; document dependencies on other migration tasks. | Syniti
4 | Data Quality and Validation: define error handling mechanisms for extraction failures. | Syniti


Selection Screen

If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set
N/A | | | |



















Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the data object, with the exception of some fields that require transformation, as specified in the transformation rules.

<Object> DCT Rules

Field Name | Field Description | Rule












Extraction Dependencies

List the steps that need to occur before extraction can commence

Item # | Step Description | Team Responsible
1 | Source System Availability: ensure that the source database or application is accessible; confirm that the necessary credentials and permissions are granted. | Syensqo IT
2 | Data Structure: identify relationships between tables, views, and stored procedures. | Syniti
3 | Referential Integrity: ensure dependent records are extracted together. | Syniti
4 | Extraction Methodology: define whether extraction is full, incremental, or delta-based; establish batch processing schedules for large datasets. | Syniti
5 | Performance and Scalability Considerations: optimize extraction queries to prevent system overload; ensure network bandwidth supports data transfer volumes. | Syniti
6 | Security and Compliance: adhere to regulatory standards for sensitive information, if applicable. | Syniti



Transformation

The target fields are mapped to the applicable legacy fields that will be their source; this is a three-way activity involving the Business, Functional, and Data teams. This identifies the transformation activity required to allow Syniti Migrate to make the data target-ready:

  1. Perform value mapping and data transformation rules (see the sketch after this list).
    1. Legacy values are mapped to the to-be values (this could include a default value).
    2. Values are transformed according to the rules defined in Syniti Migrate.
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform pre-load data validation.
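
A sketch of step 1, value mapping with a default fallback, using two of the mapping tables listed under Transformation Mapping below. The mapping contents and default values are illustrative placeholders:

```python
# Placeholder extracts of two mapping tables (see "Transformation Mapping")
MAP_ZTERM = {"N30": "Z030"}    # legacy payment term  -> S/4 payment term
MAP_VKORG = {"0001": "1000"}   # legacy sales org     -> S/4 sales org
DEFAULTS = {"ZTERM": "Z000", "VKORG": None}   # None = no default, hard error

def map_value(field: str, legacy_value: str, mapping: dict) -> str:
    """Map a legacy value to its S/4 equivalent, falling back to a default."""
    mapped = mapping.get(legacy_value, DEFAULTS[field])
    if mapped is None:
        raise ValueError(f"No mapping or default for {field}={legacy_value!r}")
    return mapped

legacy_order = {"ZTERM": "N30", "VKORG": "0001"}
target_order = {
    "ZTERM": map_value("ZTERM", legacy_order["ZTERM"], MAP_ZTERM),
    "VKORG": map_value("VKORG", legacy_order["VKORG"], MAP_VKORG),
}
```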

Transformation Run Sheet

Item # | Step Description | Team Responsible
1 | Transformation Scope Definition: identify the source and target data structures; define business rules for data standardization; establish data cleansing requirements to remove inconsistencies. | Data Team
2 | Data Mapping and Standardization: align source fields with target fields; ensure unit consistency (e.g., currency, measurement units). | Data Team
3 | Business Rule Application: implement data enrichment/collection, if applicable; apply conditional transformations based on predefined logic/business rules. | Data Team
4 | Transformation Execution Plan: define batch processing schedules; assign responsibilities for monitoring execution; establish error-handling mechanisms. | Syniti


Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic









































Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
Mapping Table Name | Mapping Table Description
MAP_ZTERM | Payment term mapping
MAP_PARVW | Partner function mapping
MAP_INCO1 | Incoterm 1 mapping
MAP_VKORG | Sales organization mapping
MAP_VTWEG | Distribution channel mapping
MAP_TDID | Text ID mapping
MAP_VRKME | Unit of measure mapping
MAP_WERKS | Plant mapping
MAP_SPART | Division mapping

Transformation Dependencies

List the steps that need to occur before transformation can commence
Item # | Step Description | Team Responsible
1 | Source Data Integrity: ensure extracted data is complete, accurate, and consistent; validate that data types and formats align with transformation requirements. | Syniti
2 | Referential Integrity: ensure dependent records are transformed together or in advance. | Syniti
3 | Transformation Logic and Mapping: define data mapping rules between source and target schemas. | Data Team
4 | Performance and Scalability Considerations: optimize transformation processes for large datasets; ensure system resources can handle transformation workloads. | Syniti
5 | Logging and Error Handling: maintain detailed logs of transformation activities; define error-handling procedures for failed transformations. | Syniti


Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

Task | Action
Validate the record count | Validate that the loading file has the correct number of records.
Validate the mandatory fields | Validate that there is a value for all mandatory fields.




Accuracy

Task | Action
Validate the transformation | Validate that fields requiring transformation carry the transformed value rather than the original field value.
Check data consistency | Compare field values across systems, including sold-to, ship-to, PO date, pricing, number of line items, and open quantities; validate data formats and structures.
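
The two completeness checks and the mandatory-field rule could be scripted against the load file roughly as follows; the mandatory field names and the load-file layout are assumptions:

```python
import pandas as pd

MANDATORY_FIELDS = ["order_id", "sold_to", "ship_to", "order_qty"]  # assumed

def preload_validate(load_df: pd.DataFrame, expected_count: int) -> list[str]:
    """Return a list of human-readable validation errors (empty = pass)."""
    errors = []
    if len(load_df) != expected_count:                       # record count
        errors.append(f"expected {expected_count} records, found {len(load_df)}")
    for field in MANDATORY_FIELDS:                           # mandatory values
        missing = int(load_df[field].isna().sum())
        if missing:
            errors.append(f"{missing} records missing mandatory field '{field}'")
    return errors
```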



Business

The following pre-load validations will be performed by the business.

Completeness

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g., which reports are being used>





Accuracy

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g., which reports are being used>





Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
  2. Once the data is loaded into the target system, it will be extracted and prepared for post-load data validation.
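
If a manual load were scripted rather than run through the Migration Cockpit, a single order could be created with the standard sales-order BAPI, as in this hedged sketch; logon data and all field values are placeholders:

```python
from pyrfc import Connection

conn = Connection(ashost="s4-host", sysnr="00", client="100",
                  user="LOAD_USER", passwd="***")   # placeholder logon

result = conn.call(
    "BAPI_SALESORDER_CREATEFROMDAT2",
    ORDER_HEADER_IN={"DOC_TYPE": "OR", "SALES_ORG": "1000",
                     "DISTR_CHAN": "10", "DIVISION": "00"},
    ORDER_PARTNERS=[{"PARTN_ROLE": "AG", "PARTN_NUMB": "0000100001"}],
    ORDER_ITEMS_IN=[{"ITM_NUMBER": "000010", "MATERIAL": "MAT-001",
                     "TARGET_QTY": "5"}],
)
errors = [m for m in result["RETURN"] if m["TYPE"] in ("E", "A")]
if errors:
    conn.call("BAPI_TRANSACTION_ROLLBACK")  # log as defect (see Error Handling)
else:
    conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")
```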

Load Run Sheet

Item # | Step Description | Team Responsible
1 | Load Scope Definition: identify the target system and database structure; define the data objects (tables, fields, records) to be loaded; establish business rules for data validation. | Data Team
2 | Load Methodology: specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti
3 | Data Quality and Validation: ensure data integrity checks (null values, duplicates, format validation); perform pre-load validations to verify completeness; define error handling mechanisms for load failures. | Syniti
4 | Load Execution Plan: establish execution timelines and batch processing schedules; assign responsibilities for monitoring execution; document dependencies on other migration tasks. | Syniti
5 | Logging and Reporting: maintain detailed logs of loading activities; generate summary reports on loaded data volume and quality; define escalation procedures for errors. | Syniti


Load Phase and Dependencies

Open Sales Orders will be loaded in the pre-cutover or cutover phase, depending on business cut-off.

Configuration

List the Configurations required before loading can commence

Item # | Configuration Item
1 | Order types
2 | Item categories
3 | Schedule line categories
4 | Pricing procedure
5 | Number range settings for orders

Conversion Objects

Object # | Preceding Object Conversion Approach
1 | Business Partner
2 | Material Master
3 | Price condition
4 | Output condition

Error Handling

The table below depicts some possible system errors for this data object during the data load. All data load errors are to be logged as defects and managed within the Defect Management process.

Error Type | Error Description | Action Taken
Configuration / Data Transformation | The value XXX for field XXX does not exist | 1. Check that the mapping/conversion is done properly in the loading file. 2. Validate that the target value is configured/transported in the target system. 3. Reach out to the functional team to validate the configuration.
Configuration | Mandatory field XXX is missing | 1. Validate in MDS whether the fields are set as mandatory. 2. Validate whether there is a value in the pre-load file. 3. Validate that the configuration for the mandatory fields is done properly.




Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action
Validate the record count in the backend | Validate that the main tables, such as VBAK and VBAP, contain the same number of records as the loading file.
Display records | Pick a few random sales document numbers, display them with transaction VA03, and validate the details.




Accuracy

Task | Action
Execute sample queries and reports | Run queries to validate business logic; generate reports to compare expected vs. actual results.
Conduct post-migration reconciliation | Generate reports comparing pre- and post-migration data.
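
A minimal reconciliation sketch for the backend record-count check, assuming the VBAK/VBAP rows were pulled back into DataFrames (e.g., via the extraction approach shown earlier) and that legacy order numbers are retained in S/4HANA; the load-file column name is an assumption:

```python
import pandas as pd

def reconcile(load_file: pd.DataFrame, vbak: pd.DataFrame,
              vbap: pd.DataFrame) -> dict:
    """Compare loaded orders/items in S/4HANA against the load file."""
    missing = set(load_file["order_id"]) - set(vbak["VBELN"])
    return {
        "missing_orders": sorted(missing),             # in file, not in VBAK
        "item_count_ok": len(vbap) == len(load_file),  # load file is item-level
    }
```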



Business

Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.

1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.

Completeness

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g., which reports are being used>





Accuracy

Task | Action
Display records | Pick a few random sales document numbers, display them with transaction VA03, and validate the details.






Key Assumptions

  • The Master Data Standard is up to date as of the date of documenting this conversion approach and performing the data load.
  • The data object is in scope based on the data design and any exceptions requested by the business.

Any additional key assumptions.


See also

Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.

Change log

Workflow history