| Status | |
|---|---|
| Owner | |
| Stakeholders | The business stakeholders involved in making, reviewing, and endorsing this decision. |
The purpose of this document is to define the conversion approach for migrating Open Sales Scheduling Agreements (SAs) from legacy systems into SAP S/4HANA. Scheduling Agreements are long-term agreements with customers that include pre-defined delivery schedules. The migration ensures continuity in customer commitments, supply planning, and delivery accuracy post-cutover.
The scope of this document includes the identification, extraction, transformation, and loading of Open Sales Scheduling Agreements (SAs) and their associated delivery schedule lines into SAP S/4HANA. The objective is to ensure that ongoing and active customer delivery commitments are maintained post-go-live and that planning and fulfillment processes can continue seamlessly within the new system.
The data from the legacy systems includes:
The data from the legacy systems excludes:
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Open Scheduling Agreements | XXX | S/4HANA System | XXX |
| PF2 | Open Scheduling Agreements | XXX | S/4HANA System | XXX |
Summarize Multi-language Requirement/s, if any
Summarize Document Management requirement, if any
Summarize Legal Requirement/s, if any
Specify any special requirements or considerations that may impact the data conversion process based on specific locations, regulatory compliance or system limitations. Clearly outline any regional or localization requirements such as country-specific data formats, legal reporting obligations or industry standards that must be adhered to (e.g., localization rules for countries like China).
If the data conversion involves third-party systems or external data sources, such as Icertis, describe any additional requirements related to data mapping, transformation logic, validation rules or security measures that must be followed.
With Functional input, document the technical design of the target fields that are in the scope of this document.
The technical design of the target structures for this conversion approach is documented below.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
| MARA | MATNR | MATNR | Material Number | CHAR | 18 | Mandatory |
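As an illustration of how this target design could be enforced before load, the Python sketch below checks a record against a small field catalogue. The `FIELD_SPECS` entries are an assumption seeded from the single MARA-MATNR row above and would be extended once the full target design is signed off.

```python
# Illustrative sketch only: validates a record against the target field
# design above. FIELD_SPECS is an assumption; extend it with the full
# field list once the target design is agreed.
FIELD_SPECS = {
    "MATNR": {"max_len": 18, "mandatory": True},  # Material Number, CHAR 18
}

def validate_record(record: dict) -> list[str]:
    """Return a list of human-readable validation errors for one record."""
    errors = []
    for field, spec in FIELD_SPECS.items():
        value = record.get(field)
        if spec["mandatory"] and not value:
            errors.append(f"{field}: mandatory field is missing")
            continue
        if value is not None and len(str(value)) > spec["max_len"]:
            errors.append(f"{field}: exceeds maximum length {spec['max_len']}")
    return errors

print(validate_record({"MATNR": ""}))  # -> ['MATNR: mandatory field is missing']
```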
All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.
If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 9064-001 | | Sales order open for more than 3 years | Orders created more than 3 years ago | | PF2/WP2 |
| 9064-002 | | Missing sold-to data | The sold-to is marked for deletion | | PF2/WP2 |
| 9064-003 | | Missing ship-to data | The ship-to is marked for deletion | | PF2/WP2 |
| 9064-004 | | Missing bill-to data | The bill-to is marked for deletion | | PF2/WP2 |
| 9064-005 | | Missing payer data | The payer is marked for deletion | | PF2/WP2 |
| 9064-006 | | Missing material master | The material master is marked for deletion | | PF2/WP2 |
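As an illustration of how these rules could be automated if cleansing is handled outside the source system, the following hypothetical Python sketch evaluates a record against the rule IDs above. The field names (`created_on`, `sold_to_deleted`, etc.) are assumptions, not actual WP2/PF2 structures.

```python
# Hypothetical sketch of the cleansing rules above. Field names are
# illustrative; the real rules run in the source system or in Syniti
# Migrate as defined in this document.
from datetime import date, timedelta

THREE_YEARS = timedelta(days=3 * 365)

def apply_cleansing_rules(record: dict, today: date) -> list[str]:
    """Return the IDs of the cleansing rules that a record violates."""
    findings = []
    if today - record["created_on"] > THREE_YEARS:
        findings.append("9064-001")  # order open for more than 3 years
    for rule_id, flag in [("9064-002", "sold_to_deleted"),
                          ("9064-003", "ship_to_deleted"),
                          ("9064-004", "bill_to_deleted"),
                          ("9064-005", "payer_deleted"),
                          ("9064-006", "material_deleted")]:
        if record.get(flag):  # partner/material marked for deletion
            findings.append(rule_id)
    return findings

sample = {"created_on": date(2020, 1, 15), "sold_to_deleted": True}
print(apply_cleansing_rules(sample, date.today()))  # e.g. ['9064-001', '9064-002']
```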
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, i.e., the Syensqo ECC systems (WP2/PF2). The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system.
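A minimal, hypothetical Python sketch of the three stages is shown below. The real pipeline runs in Syniti Migrate; the in-memory lists and the standardization logic (e.g., zero-padding material numbers) are assumptions that only illustrate the shape of the flow.

```python
# Minimal sketch of the ETL stages described above; not the real connectors.

def extract(source_rows: list[dict]) -> list[dict]:
    """Stage 1: stage raw records pulled periodically from WP2/PF2."""
    return [dict(row) for row in source_rows]  # copy into the staging area

def transform(staged: list[dict]) -> list[dict]:
    """Stage 2: validate, standardize, and apply business rules."""
    valid = [r for r in staged if r.get("MATNR")]          # drop incomplete rows
    for r in valid:
        r["MATNR"] = r["MATNR"].strip().upper().zfill(18)  # standardize format
    return valid

def load(target: list[dict], records: list[dict]) -> int:
    """Stage 3: write target-ready records into the S/4HANA load area."""
    target.extend(records)
    return len(records)

s4_hana: list[dict] = []
loaded = load(s4_hana, transform(extract([{"MATNR": "abc123"}, {"MATNR": ""}])))
print(f"{loaded} record(s) loaded")  # -> 1 record(s) loaded
```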
Extract data from a source into Syniti Migrate. There are two possibilities:
The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.
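A small hypothetical sketch of such a relevancy filter follows; the status field name and values are assumptions to be replaced by the agreed criteria once they are defined.

```python
# Hypothetical relevancy filter: keeps only records that meet the agreed
# criteria (here, "open" scheduling agreements). Field names are assumed.
def is_relevant(record: dict) -> bool:
    return record.get("overall_status") != "Completed"

extracted = [{"vbeln": "30000001", "overall_status": "Open"},
             {"vbeln": "30000002", "overall_status": "Completed"}]
relevant = [r for r in extracted if is_relevant(r)]
print([r["vbeln"] for r in relevant])  # -> ['30000001']
```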
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| 1 | Extraction Scope Definition - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti |
| 2 | Extraction Methodology - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records (a delta-extraction sketch follows this table). | Syniti |
| 3 | Extraction Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| 4 | Data Quality and Validation - Define error handling mechanisms for extraction failures. | Syniti |
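For the delta-extraction option named in the Extraction Methodology row, a hypothetical Python sketch is shown below; the `changed_on` field and the last-run watermark are assumptions standing in for the source system's change-date handling.

```python
# Hypothetical delta-extraction sketch: only records created or changed
# since the previous run are picked up. The change-date field and the
# watermark handling are assumptions to illustrate the approach.
from datetime import date

def delta_extract(rows: list[dict], last_run: date) -> list[dict]:
    """Return rows created or changed after the previous extraction run."""
    return [r for r in rows if r["changed_on"] > last_run]

rows = [{"vbeln": "30000001", "changed_on": date(2024, 5, 1)},
        {"vbeln": "30000002", "changed_on": date(2024, 6, 20)}]
print(len(delta_extract(rows, date(2024, 6, 1))))  # -> 1
```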
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
<Object> DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
List the steps that need to occur before extraction can commence
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
Each target field is mapped to the applicable legacy field that will be its source. This is a three-way activity involving the Business, the Functional team, and the Data team, and it identifies the transformation activity required for Syniti Migrate to make the data target-ready:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units) | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable - Apply conditional transformations based on predefined logic/business rules | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
| Mapping Table Name | Mapping Table Description |
|---|---|
| MAP_ZTERM | Payment Term Mapping |
| MAP_PARVW | Partner Function Mapping |
| MAP_INCO1 | Incoterm 1 Mapping |
| MAP_VKORG | Sales Organization Mapping |
| MAP_VTWEG | Distribution Channel Mapping |
| MAP_TDID | Text ID Mapping |
| MAP_VRKME | Unit of Measure Mapping |
| MAP_WERKS | Plant Mapping |
| MAP_SPART | Division Mapping |
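The following hypothetical Python sketch shows how such mapping tables could be applied during transformation, with unmapped values rejected rather than passed through silently. The sample legacy and S/4 values are illustrative only; the real mappings live in the tables listed above.

```python
# Hypothetical application of the mapping tables above. Sample values
# are illustrative; real mappings are maintained in Syniti Migrate.
MAP_VKORG = {"1000": "S100"}   # legacy sales org -> S/4 sales org (example)
MAP_ZTERM = {"Z030": "NT30"}   # legacy payment term -> S/4 term (example)

def map_value(table: dict, value: str, field: str) -> str:
    """Look up a legacy value; fail loudly on unmapped values."""
    if value not in table:
        raise ValueError(f"Unmapped {field} value: {value!r}")
    return table[value]

record = {"VKORG": "1000", "ZTERM": "Z030"}
record["VKORG"] = map_value(MAP_VKORG, record["VKORG"], "VKORG")
record["ZTERM"] = map_value(MAP_ZTERM, record["ZTERM"], "ZTERM")
print(record)  # -> {'VKORG': 'S100', 'ZTERM': 'NT30'}
```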
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source Data Integrity - Ensure extracted data is complete, accurate, and consistent. - Validate that data types and formats align with transformation requirements. | Syniti |
| 2 | Referential Integrity - Ensure dependent records are transformed together or in advance | Syniti |
| 3 | Transformation Logic and Mapping - Define data mapping rules between source and target schemas. | Data Team |
| 4 | Performance and Scalability Considerations - Optimize transformation processes for large datasets. - Ensure system resources can handle transformation workloads | Syniti |
| 5 | Logging and Error Handling - Maintain detailed logs of transformation activities. - Define error-handling procedures for failed transformations | Syniti |
| Task | Action |
|---|---|
| Validate mandatory and key fields | - Mandatory field check - Check Business Partner master data - Check Material master data - Check sales order item consistency - Check currency and pricing data |
| Reconciliation check | - Record count - Check amount in document currency and local currency - Check total quantity (a sketch follows this table) |
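A hypothetical Python sketch of the reconciliation check follows; the `net_value` and `quantity` field names are assumptions standing in for the actual extract columns.

```python
# Hypothetical pre-load reconciliation: compares record counts and summed
# amounts/quantities between the source extract and the pre-load file.
def totals(records: list[dict]) -> tuple[int, float, float]:
    count = len(records)
    amount = sum(r["net_value"] for r in records)
    quantity = sum(r["quantity"] for r in records)
    return count, amount, quantity

source = [{"net_value": 100.0, "quantity": 5.0},
          {"net_value": 250.0, "quantity": 2.0}]
preload = [{"net_value": 100.0, "quantity": 5.0},
           {"net_value": 250.0, "quantity": 2.0}]

if totals(source) != totals(preload):
    print("Reconciliation failed: raise a defect")
else:
    print("Counts, amounts, and quantities reconcile")
```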
| Task | Action |
|---|---|
| Validate the transformation | Make sure all fields with transformation are populated with S/4HANA values according to the mapping file. |
| Check Data Consistency | |
| Task | Action |
|---|---|
| Conversion check | In the legacy system, execute the Open Sales Orders List report (e.g., via transaction code VA35 or an equivalent custom report). Group the output of the report by Sales Organization and Sold-to Party using the subtotal function and compare the count in this report against the Open Sales Orders count in the pre-load file. The record count for Open Scheduling Agreements may also be done at a more granular level; the recommended granular level or subtotal fields can consist of: <list subtotal fields>. If any count differs, raise a defect and flag the relevant record as an error (see the sketch after this table). |
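A hypothetical Python sketch of the subtotal comparison follows; the field names are illustrative, and in practice the two inputs would come from the VA35 output and the pre-load file.

```python
# Hypothetical subtotal comparison: group both datasets by Sales
# Organization and Sold-to Party, then compare counts per group.
from collections import Counter

def subtotal_counts(rows: list[dict]) -> Counter:
    return Counter((r["sales_org"], r["sold_to"]) for r in rows)

legacy = [{"sales_org": "1000", "sold_to": "CUST01"},
          {"sales_org": "1000", "sold_to": "CUST01"}]
preload = [{"sales_org": "1000", "sold_to": "CUST01"}]

# Counter subtraction keeps only positive differences; run the comparison
# both ways to also catch records present in the pre-load file only.
diffs = subtotal_counts(legacy) - subtotal_counts(preload)
for key, missing in diffs.items():
    print(f"Defect: {missing} record(s) missing for {key}")
```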
| Task | Action |
|---|---|
| Key fields check | Perform the following checks on a selected sample of open sales orders: <list key field checks> |
| Validate the transformation | Make sure all fields with transformation are populated with S/4HANA values according to the mapping file. |
The load process includes:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data Team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors | Syniti |
Load Phase and Dependencies
Identify the phase in which the load for this object will occur <Pre-Cutover, Cutover, Post-Cutover> and list the steps that must be completed before the load can commence.
List the Configurations required before loading can commence
| Item # | Configuration Item |
|---|---|
| 1 | Order type |
| 2 | Item categories |
| 3 | Schedule line categories |
| 4 | Pricing procedure |
| 5 | Number range settings for orders |
| Object # | Preceding Object Conversion Approach |
|---|---|
| 1 | Business Partner |
| 2 | Material Master |
| 3 | Price condition |
| 4 | Output condition |
The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.
| Error Type | Error Description | Action Taken |
|---|---|---|
| Configuration / Data Transformation | The value XXX for field XXX doesn't exist | |
| Configuration | Mandatory field XXX is missing | |
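As an illustration, a hypothetical Python sketch of how load errors could be classified into the error types above and captured as defect log entries; the matching patterns are assumptions based on the sample messages, and the real handling follows the Defect Management process.

```python
# Hypothetical classification of load error messages into the error
# types listed above; patterns are assumptions based on the samples.
import re

def classify_load_error(message: str) -> str:
    if re.search(r"value .+ for field .+ doesn't exist", message):
        return "Configuration / Data Transformation"
    if re.search(r"mandatory field .+ missing", message, re.IGNORECASE):
        return "Configuration"
    return "Unclassified"

defect_log = []
for msg in ["The value Z9 for field VKORG doesn't exist"]:
    defect_log.append({"type": classify_load_error(msg), "message": msg})
print(defect_log[0]["type"])  # -> Configuration / Data Transformation
```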
| Task | Action |
|---|---|
| Validate record count | The total number of records loaded for Open Sales Orders will be generated in the post-load reports in the tool, based on the target tables and fields listed in the Target Design section. The reconciliation needs to be executed on the total number of valid records and amounts per Sales Organization in the source, compared to the total number of records and amounts in S/4HANA: - Record count - Check amount in document currency and local currency (see the sketch after this table) |
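A hypothetical Python sketch of the per-Sales-Organization reconciliation follows; the field names are assumptions standing in for the real post-load report columns.

```python
# Hypothetical post-load reconciliation: per Sales Organization, compare
# record counts and amounts in document and local currency between the
# source and the loaded S/4HANA data.
from collections import defaultdict

def per_sales_org(rows: list[dict]) -> dict:
    agg = defaultdict(lambda: [0, 0.0, 0.0])  # count, doc amount, local amount
    for r in rows:
        a = agg[r["sales_org"]]
        a[0] += 1
        a[1] += r["amount_doc_curr"]
        a[2] += r["amount_local_curr"]
    return dict(agg)

source = [{"sales_org": "1000", "amount_doc_curr": 100.0, "amount_local_curr": 92.0}]
loaded = [{"sales_org": "1000", "amount_doc_curr": 100.0, "amount_local_curr": 92.0}]
print("OK" if per_sales_org(source) == per_sales_org(loaded) else "Raise a defect")
```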
| Task | Action |
|---|---|
| Validate Key Fields |
|
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
| Task | Action |
|---|---|
| <title> | Specific details of what needs to be performed and how, e.g., which reports are used. |
| Task | Action |
|---|---|
| Display Records | Pick a few random sales document numbers, display them with transaction code VA33, and validate the details. |
Any additional key assumptions.
Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.