| Status | In Progress |
|---|---|
| Owner | |
| Stakeholders | |
Purpose
The purpose of this document is to define the conversion approach for migrating Open Sales Scheduling Agreements (SAs) from legacy systems into SAP S/4HANA. Scheduling Agreements represent long-term customer agreements with pre-defined delivery schedules. The migration ensures continuity in customer commitments, supply planning, and delivery accuracy post-cutover.
Conversion Scope
The scope of this document includes the identification, extraction, transformation, and loading of Open Sales Scheduling Agreements (SAs) and their associated delivery schedule lines into SAP S/4HANA. The objective is to ensure that ongoing and active customer delivery commitments are maintained post-go-live and that planning and fulfillment processes can continue seamlessly within the new system.
The data from the legacy system includes:
- Documents where the document category (VBAK-VBTYP) is either:
  - "E" Scheduling agreement
  - "F" Scheduling agreement with external service agent
- Scheduling Agreements with:
  - Order items where the delivery status is not marked as complete.
  - Order items with an open billing quantity, meaning not fully invoiced.
- The Sales Area of the Scheduling Agreement is within the scope of S/4HANA.
- The Scheduling Agreement is created for Customers and Materials in scope.
- The Scheduling Agreement is within its validity dates.
The data from the legacy system excludes:
- Fully delivered and billed Scheduling Agreement items:
  - Order items where all items are delivered (delivery complete) and invoiced (billing complete).
  - Order items archived in the legacy system and no longer used operationally.
- Out-of-scope sales document types or Sales Areas:
  - Document types not in use or excluded by process design (e.g., internal orders, sample/test).
  - Orders belonging to sales organizations, distribution channels, or divisions not part of the S/4 scope.
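The inclusion criteria above can be expressed as a single relevancy filter. The sketch below is a minimal Python illustration, assuming a flattened record layout; apart from VBTYP/VKORG/VTWEG/SPART, the field names and all in-scope value sets are placeholders, not the extraction logic prescribed here.

```python
from datetime import date

IN_SCOPE_DOC_CATEGORIES = {"E", "F"}           # VBAK-VBTYP: SA / SA with external service agent
IN_SCOPE_SALES_AREAS = {("1000", "10", "00")}  # (VKORG, VTWEG, SPART) placeholder values

def is_relevant(header: dict, item: dict, cutover: date) -> bool:
    """True when a Scheduling Agreement item passes the inclusion criteria."""
    return (
        header["VBTYP"] in IN_SCOPE_DOC_CATEGORIES                  # "E" or "F"
        and not item["DELIVERY_COMPLETE"]                           # delivery not complete (assumed flag)
        and item["OPEN_BILLING_QTY"] > 0                            # not fully invoiced (assumed field)
        and (header["VKORG"], header["VTWEG"], header["SPART"]) in IN_SCOPE_SALES_AREAS
        and header["VALID_FROM"] <= cutover <= header["VALID_TO"]   # within validity dates (assumed fields)
    )
```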
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Open Scheduling Agreements | XXX | S/4HANA System | XXX |
| PF2 | Open Scheduling Agreements | XXX | S/4HANA System | XXX |
Additional Information
Multi-language Requirement
Document Management
Legal Requirement
Special Requirements
Target Design
This section defines the technical design of the target structures for this conversion approach.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
Data Cleansing
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 9064-001 | | Sales order open for more than 3 years | Documents created more than 3 years ago | | PF2/WP2 |
| 9064-002 | | Missing sold-to data | The sold-to party is marked for deletion | | PF2/WP2 |
| 9064-003 | | Missing ship-to data | The ship-to party is marked for deletion | | PF2/WP2 |
| 9064-004 | | Missing bill-to data | The bill-to party is marked for deletion | | PF2/WP2 |
| 9064-005 | | Missing payer data | The payer is marked for deletion | | PF2/WP2 |
| 9064-006 | | Missing material master | The material master is marked for deletion | | PF2/WP2 |
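As an illustration of how such rules can be executed against the extract, the sketch below implements rule 9064-001 in Python; the record layout (ERDAT, VBELN) and the 365-day-year approximation are assumptions.

```python
from datetime import date, timedelta

def apply_rule_9064_001(records: list[dict], today: date) -> list[dict]:
    """Flag documents created more than 3 years ago (rule 9064-001)."""
    threshold = today - timedelta(days=3 * 365)   # approximation of 3 years
    findings = []
    for rec in records:
        if rec["ERDAT"] < threshold:              # ERDAT: creation date (assumed field)
            findings.append({
                "ID": "9064-001",
                "VBELN": rec["VBELN"],             # document number (assumed field)
                "message": "Sales order open for more than 3 years",
                "source": rec.get("SOURCE", "PF2/WP2"),
            })
    return findings
```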
Conversion Process
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components (a code sketch follows the three steps below):
1. Extraction
The process begins with periodically extracting metadata and raw data from source systems, such as the Syensqo ECC systems (i.e., WP2/PF2). The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system.
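The sketch below illustrates this three-step sequencing in Python, assuming list-of-dict staging between steps; in practice these steps are carried out in Syniti Migrate, and the helper names here are placeholders.

```python
def is_consistent(record: dict) -> bool:
    return True            # placeholder: data validation to remove inconsistencies

def standardize(record: dict) -> dict:
    return record          # placeholder: align formats across datasets

def apply_business_rules(record: dict) -> dict:
    return record          # placeholder: refine data for operational use

def extract(source_system: str) -> list[dict]:
    """1. Extraction: pull raw data from a legacy source (e.g., WP2/PF2)."""
    raise NotImplementedError  # connector-specific

def transform(staged: list[dict]) -> list[dict]:
    """2. Transformation: cleansing, consolidation, and governance."""
    validated = [r for r in staged if is_consistent(r)]
    return [apply_business_rules(standardize(r)) for r in validated]

def load(target_ready: list[dict]) -> None:
    """3. Loading: push target-ready records into the S/4HANA system."""
    raise NotImplementedError  # load-tool specific
```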
Data Privacy and Sensitivity
Extraction
Extract data from a source into Syniti Migrate. There are two possibilities:
- The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are three methods (see the sketch after this list):
  - Perform full data extraction from relevant tables in the source system(s).
  - Perform extraction through the application layer.
  - Only if Syniti Migrate cannot connect to the source, data is loaded to the repository from the provided source system extract/report.
- The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template).
The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.
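A minimal sketch of this dispatch over the extraction options is shown below; the `read_table` connector call and the CSV fallback format are assumptions for illustration, as Syniti Migrate manages the actual connections.

```python
import csv
from pathlib import Path

def extract_records(table: str, connection=None, fallback_file: Path | None = None) -> list[dict]:
    """Extract a source table via a connection, a provided extract, or fail over to DCT."""
    if connection is not None:
        # Preferred: extraction directly from the relevant source table.
        return connection.read_table(table)  # hypothetical connector method
    if fallback_file is not None:
        # Only if Syniti Migrate cannot connect: load the provided extract/report.
        with fallback_file.open(newline="", encoding="utf-8") as fh:
            return list(csv.DictReader(fh))
    # Otherwise the data must be collected manually by the business via a DCT.
    raise ValueError(f"No source available for {table}; collect via DCT")
```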
Extraction Run Sheet
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| 1 | Extraction Scope Definition - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti |
| 2 | Extraction Methodology - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| 3 | Extraction Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| 4 | Data Quality and Validation - Define error handling mechanisms for extraction failures. | Syniti |
Selection Screen
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
Data Collection Template (DCT)
A Target-Ready Data Collection Template will be created for the data, with the exception of some fields that require transformation as mentioned in the transformation rules.
DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
Extraction Dependencies
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
Transformation
Each Target field is mapped to the applicable Legacy field that will be its source. This is a three-way activity involving the Business, the Functional team, and the Data team. It identifies the transformation activity required to allow Syniti Migrate to make the data Target ready:
- Perform value mapping and data transformation rules.
- Legacy values are mapped to the to-be values (this could include a default value), as illustrated in the sketch after this list.
- Values are transformed according to the rules defined in Syniti Migrate.
- Prepare target-ready data in the structure and format required for loading via the prescribed Load Tool. This step also produces the load data ready for the business to perform Pre-Load Data Validation.
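The value-mapping step can be illustrated as below, reusing the MAP_ZTERM mapping-table name from the Transformation Mapping section; the legacy/target codes are placeholders, as the real values live in Syniti Migrate.

```python
MAP_ZTERM = {"N30": "Z030", "N60": "Z060"}  # placeholder legacy -> S/4 payment terms

def map_value(value: str, mapping: dict[str, str], default: str | None = None) -> str:
    """Map a legacy value to its to-be value, optionally falling back to a default."""
    mapped = mapping.get(value, default)
    if mapped is None:
        raise KeyError(f"No mapping (and no default) for legacy value {value!r}")
    return mapped

# Example: map_value("N30", MAP_ZTERM) returns "Z030";
# map_value("N45", MAP_ZTERM, default="Z000") returns the default "Z000".
```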
Transformation Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units) | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable - Apply conditional transformations based on predefined logic/business rules | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
Transformation Mapping
| Mapping Table Name | Mapping Table Description |
|---|---|
| MAP_ZTERM | Payment term Mapping |
| MAP_PARVW | Partner function Mapping |
| MAP_INCO1 | Incoterm1 Mapping |
| MAP_VKORG | Sales Organization Mapping |
| MAP_VTWEG | Distribution Channel Mapping |
| MAP_TDID | Text ID Mapping |
| MAP_VRKME | Unit of Measure |
| MAP_WERKS | Plant |
| MAP_SPART | Division |
Transformation Dependencies
The following steps need to occur before transformation can commence:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source Data Integrity - Ensure extracted data is complete, accurate, and consistent. - Validate that data types and formats align with transformation requirements. | Syniti |
| 2 | Referential Integrity - Ensure dependent records are transformed together or in advance. | Syniti |
| 3 | Transformation Logic and Mapping - Define data mapping rules between source and target schemas. | Data Team |
| 4 | Performance and Scalability Considerations - Optimize transformation processes for large datasets. - Ensure system resources can handle transformation workloads. | Syniti |
| 5 | Logging and Error Handling - Maintain detailed logs of transformation activities. - Define error-handling procedures for failed transformations. | Syniti |
Pre-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Validate mandatory and key fields | Mandatory field check: - Check Business Partners Master Data - Check Material Master Data - Check Sales Order Item Consistency - Check Currency and Pricing Data |
| Reconciliation check | - Record Count - Check Amount in Document Currency and Local Currency - Check Total Quantity |
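A minimal sketch of the reconciliation check above, assuming NETWR (net value) and KWMENG (quantity) item fields; any mismatch between the legacy extract and the pre-load file should be raised as a defect.

```python
def totals(records: list[dict]) -> dict:
    """Aggregate record count, document-currency amount, and total quantity."""
    agg = {"count": 0, "amount_doc_curr": 0.0, "quantity": 0.0}
    for r in records:
        agg["count"] += 1
        agg["amount_doc_curr"] += r.get("NETWR", 0.0)  # net value (assumed field)
        agg["quantity"] += r.get("KWMENG", 0.0)        # order quantity (assumed field)
    return agg

def reconcile(source: list[dict], preload: list[dict]) -> list[str]:
    """Return mismatch descriptions; an empty list means the check passed."""
    src, tgt = totals(source), totals(preload)
    return [f"{k}: source={src[k]} pre-load={tgt[k]}" for k in src if src[k] != tgt[k]]
```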
Accuracy
| Task | Action |
|---|---|
| Validate the transformation | Make sure all fields with transformation are populated with S/4 HANA values according to the mapping file. |
| Check Data Consistency | |
Business
Completeness
| Task | Action |
|---|---|
| Conversion check | In the legacy system, execute the Open Sales Orders List report (e.g., via transaction code VA35 or an equivalent custom report). Group the output of the report by Sales Organization and Sold-to Party using the subtotal function and compare the count in this report against the Open Sales Orders count in the pre-load file. The record count for Open Scheduling Agreements may also be done at a more granular level; the recommended granular level or subtotal fields can consist of: If any of the counts differ, raise a defect and flag the relevant records as errors. |
Accuracy
| Task | Action |
|---|---|
| Key fields check | Perform the following checks on a selected sample of open sales orders: |
| Validate the transformation | Make sure all fields with transformation are populated with S/4 HANA values according to the mapping file. |
Load
The load process includes:
- Executing the automated data load into the target system using the load tool, or producing the load file if the load must be done manually (a load-file sketch follows this list).
- Once the data is loaded to the target system, it is extracted and prepared for Post-Load Data Validation.
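For the manual path, a minimal load-file writer is sketched below; the delimited layout is an assumption, since the real structure comes from the Target Design section and the prescribed load tool.

```python
import csv

def produce_load_file(records: list[dict], path: str) -> None:
    """Write target-ready records to a delimited file for a manual load."""
    if not records:
        return  # nothing to load
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=list(records[0]))
        writer.writeheader()
        writer.writerows(records)
```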
Load Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures. | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks. | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors. | Syniti |
Load Phase and Dependencies
Configuration
| Item # | Configuration Item |
|---|---|
| 1 | Order type |
| 2 | Item categories |
| 3 | Schedule line categories |
| 4 | Pricing procedure |
| 5 | Number range settings for orders |
Conversion Objects
| Object # | Preceding Object Conversion Approach |
|---|---|
| 1 | Business Partner |
| 2 | Material Master |
| 3 | Price condition |
| 4 | Output condition |
Error Handling
| Error Type | Error Description | Action Taken |
|---|---|---|
| Configuration / Data Transformation | The value XXX for field XXX doesn't exist | |
| Configuration | Mandatory field XXX is missing | |
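Both error types above can be caught before the load with a simple pre-check, sketched below; the configured order types and the mandatory field list are placeholders for illustration.

```python
TARGET_ORDER_TYPES = {"LZ", "LK"}               # assumed configured SA order types
MANDATORY_FIELDS = ("AUART", "VKORG", "KUNNR")  # assumed mandatory fields

def precheck(record: dict) -> list[str]:
    """Return configuration/transformation errors for one record before load."""
    errors = []
    if record.get("AUART") not in TARGET_ORDER_TYPES:
        errors.append(f"The value {record.get('AUART')!r} for field AUART doesn't exist")
    for field in MANDATORY_FIELDS:
        if not record.get(field):
            errors.append(f"Mandatory field {field} missing")
    return errors
```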
Post-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Validate Record count | The total number of records loaded for Open Sales Orders will be generated in the post-load reports in the tool, based on the target tables and fields mentioned in the Target Design section. The reconciliation needs to be executed on the total number of 'valid' records and amounts per Sales Organization in the source, compared to the total number of records and amounts in S/4HANA: - Record Count - Check Amount in Document Currency and Local Currency |
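A sketch of this per-Sales-Organization reconciliation, assuming VKORG and NETWR fields in both the source extract and the S/4HANA post-load report:

```python
from collections import Counter, defaultdict

def per_vkorg(records: list[dict]) -> dict[str, tuple[int, float]]:
    """Valid record count and amount per Sales Organization (VKORG)."""
    counts: Counter = Counter()
    amounts: defaultdict = defaultdict(float)
    for r in records:
        counts[r["VKORG"]] += 1
        amounts[r["VKORG"]] += r.get("NETWR", 0.0)  # net value (assumed field)
    return {k: (counts[k], amounts[k]) for k in counts}

def post_load_reconcile(source: list[dict], s4_extract: list[dict]) -> list[str]:
    """Return per-VKORG mismatches between source and S/4HANA."""
    src, tgt = per_vkorg(source), per_vkorg(s4_extract)
    return [f"VKORG {k}: source={src.get(k)} S/4={tgt.get(k)}"
            for k in set(src) | set(tgt) if src.get(k) != tgt.get(k)]
```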
Accuracy
| Task | Action |
|---|---|
| Validate Key Fields | |
Business
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
| Display Records | Pick a few random sales document numbers, display them with transaction code VA33, and validate the details |
Key Assumptions
- Master Data Standard is up to date as of the date of documenting this conversion approach and the data load.
- is in scope based on data design and any exception requested by business.
See also
Change log
Workflow history
| Title | Last Updated By | Updated | Status |
|---|---|---|---|
