| Status | |
|---|---|
| Owner | |
| Stakeholders | The business stakeholders involved in making, reviewing, and endorsing this decision. Type @ to mention people by name |
The purpose of this document is to define the conversion approach for creating attachments for open sales transactions in SAP S/4HANA.
Attachment Functionality in SAP ECC
- Storage Method: Attachments are typically stored using Generic Object Services (GOS), which allow users to link files to sales documents (via transaction codes like VA02 or VA22).
- Technical Reference: The attachments are managed via the Business Object Repository (BOR) object. Below are the object names with their document types.
- BUS2031 - Customer Quotation
- BUS2032 - Sales Order
- BUS2034 - Sales Contract
- BUS2035 - Sales Scheduling Agreement
- BUS2094 - Credit Memo Request
- BUS2096 - Debit Memo Request
Changes in SAP S/4HANA
- Attachments are linked using the Attachment List Service (also used in Fiori apps).
- Technically embedded in the Business Object Processing Framework (BOPF) or handled by Object Page Layout for Fiori views.
- User Experience Enhancements: Drag-and-drop uploads via Fiori.
The scope of this document covers the approach for converting active attachments for open sales transactions from the legacy source systems into S/4HANA.
The data from legacy system includes:
The data from legacy system excludes:
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Attachment for open sales transaction | | S4 Hana ROW | |
| PF2 | Attachment for open sales transaction | | S4 Hana ROW | |
| WP2 | Attachment for open sales transaction | | S4 Hana China | |
| PF2 | Attachment for open sales transaction | | S4 Hana China | |
| WP2 | Attachment for open sales transaction | | S4 Hana CUI | |
| PF2 | Attachment for open sales transaction | | S4 Hana CUI | |
N/A
The DMS approach is elaborated in KDD "KDD085 - Document Management in the SyWay Solution".
CMMC 2.0 is a mandatory DoD cybersecurity certification for contractors handling Controlled Unclassified Information (CUI) and Federal Contract Information (FCI). CUI includes sensitive technical data (e.g., design specs, system info) related to U.S. military and space applications. The Composites Business handles CUI and is therefore within CMMC scope. Without certification, the business risks disqualification from existing and future DoD programs.
It is mandatory to implement CMMC-compliant systems and processes for all organizations handling CUI.
Therefore, there will be one SAP instance dedicated to CUI-related entities.
Different SAP Instance Migration Approach
Due to compliance requirements, there will be three SAP instances: one for the Rest of the World, one for China, and one for CUI.
Please refer to the link for the entity mapping for each instance.
With functional input, this section documents the technical design of the target fields that are in the scope of this document.
The technical design of the target table for this conversion approach is shown below.
| Table | Field | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|
| SRGBTBREL | CLIENT | Client | C | 3 | Internal |
| SRGBTBREL | BRELGUID | GUID | X | 16 | Internal |
| SRGBTBREL | RELTYPE | Relationship type | C | 10 | Mandatory |
| SRGBTBREL | INSTID_A | Instance ID | C | 70 | Mandatory |
| SRGBTBREL | TYPEID_A | Object Type | C | 32 | Mandatory |
| SRGBTBREL | CATID_A | Object Category | C | 2 | Mandatory |
| SRGBTBREL | INSTID_B | Instance ID | C | 70 | Internal |
| SRGBTBREL | TYPEID_B | Object Type | C | 32 | Internal |
| SRGBTBREL | CATID_B | Object Category | C | 2 | Internal |
| SRGBTBREL | LOGSYS_A | Logical System | C | 10 | Not in use |
| SRGBTBREL | ARCH_A | Object Archived | C | 1 | Not in use |
| SRGBTBREL | LOGSYS_B | Logical System | C | 10 | Not in use |
| SRGBTBREL | ARCH_B | Object Archived | C | 1 | Not in use |
| SRGBTBREL | UTCTIME | Short Time Stamp | P | 8 | Internal |
| SRGBTBREL | HOMESYS | Logical System | C | 10 | Not in use |
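To make the target design concrete, the SRGBTBREL row can be sketched as a typed record. This is an illustrative model only, assuming simplified Python types for the SAP CHAR/RAW fields; it is not the ABAP dictionary definition.

```python
from dataclasses import dataclass

# Illustrative record mirroring the SRGBTBREL target design above.
# Types are simplified stand-ins for the SAP CHAR/RAW/packed types.
@dataclass
class SrgbtbrelRecord:
    client: str              # CLIENT, CHAR 3 (internal)
    brelguid: bytes          # BRELGUID, RAW 16 (internal)
    reltype: str             # RELTYPE, CHAR 10 (mandatory), e.g. "ATTA"
    instid_a: str            # INSTID_A, CHAR 70 (mandatory): sales document number
    typeid_a: str            # TYPEID_A, CHAR 32 (mandatory), e.g. "BUS2032"
    catid_a: str             # CATID_A, CHAR 2 (mandatory), "BO"
    instid_b: str = ""       # INSTID_B, CHAR 70 (internal)
    typeid_b: str = "MESSAGE"  # TYPEID_B, CHAR 32 (internal)
    catid_b: str = "BO"      # CATID_B, CHAR 2 (internal)
```

The "Internal" fields default because they are populated by the system rather than mapped from legacy data.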
All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.
If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 9067-001 | C-2 | Review and remove the obsolete attachments | |||
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, such as the Syensqo ECC systems (i.e., WP2/PF2). The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system.
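The three stages above can be sketched as a minimal pipeline. The function names, in-memory staging, and record shape are illustrative assumptions, not the actual Syniti Migrate implementation.

```python
# Minimal ETL sketch of the three stages described above.

def extract(source_rows):
    """Stage raw records copied from the legacy source (e.g. WP2/PF2)."""
    return [dict(row) for row in source_rows]

def transform(staged, rules):
    """Apply cleansing/standardization rules; a rule returning None filters the record."""
    out = []
    for rec in staged:
        for rule in rules:
            rec = rule(rec)
            if rec is None:
                break
        if rec is not None:
            out.append(rec)
    return out

def load(transformed, target):
    """Load target-ready records into the target store; return the loaded count."""
    target.extend(transformed)
    return len(transformed)

# Example run with a trivial standardization rule (uppercase the relationship type).
rows = [{"RELTYPE": "atta"}, {"RELTYPE": "note"}]
rules = [lambda r: {**r, "RELTYPE": r["RELTYPE"].upper()}]
target = []
loaded = load(transform(extract(rows), rules), target)
```

In the real flow each stage is a separate Syniti Migrate step with its own logging and sign-off; the sketch only shows the data handover between stages.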

Extract data from the source into Syniti Migrate for the SAP ROW and SAP China relevant entities. There are 2 possibilities:
The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.
Extraction Run Sheet
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| Extraction Scope Definition | - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti / LTC Data team |
| Extraction Methodology | - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| Extraction Execution Plan | - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| Data Quality and Validation | - Define error handling mechanisms for extraction failures. | Syniti |
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
| N/A | ||||
<Object> DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
| N/A | ||
List the steps that need to occur before extraction can commence
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
Each target field is mapped to the applicable legacy field that will be its source. This is a three-way activity involving the Business, Functional, and Data teams, and it identifies the transformation activity required to allow Syniti Migrate to make the data target ready:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units) | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable - Apply conditional transformations based on predefined logic/business rules | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
| 1 | WP2/PF2 | SRGBTBREL | CLIENT | Client | S4 Hana | SRGBTBREL | CLIENT | Client | Internal |
| 2 | WP2/PF2 | SRGBTBREL | BRELGUID | GUID | S4 Hana | SRGBTBREL | BRELGUID | GUID | Internal |
| 3 | WP2/PF2 | SRGBTBREL | RELTYPE | Relationship type | S4 Hana | SRGBTBREL | RELTYPE | Relationship type | Copy - Will include types NOTE/PNOT/ATTA (note, private note, attachment) |
| 4 | WP2/PF2 | SRGBTBREL | INSTID_A | Instance ID | S4 Hana | SRGBTBREL | INSTID_A | Instance ID | Xref - Map ECC sales document number to S4 sales document number |
| 5 | WP2/PF2 | SRGBTBREL | TYPEID_A | Object Type | S4 Hana | SRGBTBREL | TYPEID_A | Object Type | Copy - Will include the object types BUS2031, BUS2032, BUS2034, BUS2035, BUS2094 and BUS2096 |
| 6 | WP2/PF2 | SRGBTBREL | CATID_A | Object Category | S4 Hana | SRGBTBREL | CATID_A | Object Category | Copy - BO |
| 7 | WP2/PF2 | SRGBTBREL | INSTID_B | Instance ID | S4 Hana | SRGBTBREL | INSTID_B | Instance ID | Internal |
| 8 | WP2/PF2 | SRGBTBREL | TYPEID_B | Object Type | S4 Hana | SRGBTBREL | TYPEID_B | Object Type | Copy - MESSAGE |
| 9 | WP2/PF2 | SRGBTBREL | CATID_B | Object Category | S4 Hana | SRGBTBREL | CATID_B | Object Category | Copy - BO |
| 10 | WP2/PF2 | SRGBTBREL | LOGSYS_A | Logical System | S4 Hana | SRGBTBREL | LOGSYS_A | Logical System | Not in use |
| 11 | WP2/PF2 | SRGBTBREL | ARCH_A | Object Archived | S4 Hana | SRGBTBREL | ARCH_A | Object Archived | Not in use |
| 12 | WP2/PF2 | SRGBTBREL | LOGSYS_B | Logical System | S4 Hana | SRGBTBREL | LOGSYS_B | Logical System | Not in use |
| 13 | WP2/PF2 | SRGBTBREL | ARCH_B | Object Archived | S4 Hana | SRGBTBREL | ARCH_B | Object Archived | Not in use |
| 14 | WP2/PF2 | SRGBTBREL | UTCTIME | Short Time Stamp | S4 Hana | SRGBTBREL | UTCTIME | Short Time Stamp | Internal |
| 15 | WP2/PF2 | SRGBTBREL | HOMESYS | Logical System | S4 Hana | SRGBTBREL | HOMESYS | Logical System | Not in use |
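The Copy and Xref rules above can be sketched as a single transformation step. This is a minimal illustration: the `xref` dict stands in for the actual ECC-to-S/4 document number mapping maintained in Syniti Migrate, and the function name is hypothetical.

```python
# Sketch of the RELTYPE filter and the INSTID_A cross-reference rule.
COPIED_RELTYPES = {"NOTE", "PNOT", "ATTA"}  # note, private note, attachment

def transform_link(rec, xref):
    """Return a target-ready SRGBTBREL row, or None if out of scope."""
    if rec["RELTYPE"] not in COPIED_RELTYPES:
        return None                     # relationship type not migrated
    new_doc = xref.get(rec["INSTID_A"])
    if new_doc is None:
        return None                     # no S/4 document: not an open transaction
    out = dict(rec)
    out["INSTID_A"] = new_doc           # Xref: ECC number -> S/4 number
    out["CATID_A"] = "BO"               # Copy rule for the object category
    return out

# Example: remap an attachment link on a sales order.
xref = {"0000012345": "0000054321"}     # illustrative mapping entry
row = {"RELTYPE": "ATTA", "INSTID_A": "0000012345", "TYPEID_A": "BUS2032"}
result = transform_link(row, xref)
```

Records whose sales document has no S/4 counterpart are dropped here, which matches the scope restriction to open transactions.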
| Mapping Table Name | Mapping Table Description |
|---|---|
| N/A | |
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source Data Integrity - Ensure extracted data is complete, accurate, and consistent. - Validate that data types and formats align with transformation requirements. | Syniti |
| 2 | Referential Integrity - Ensure dependent records are transformed together or in advance, such as CNV-3007 Business Partner General and DMS server migration | Syniti |
| 3 | Transformation Logic and Mapping - Define data mapping rules between source and target schemas. | Data Team |
| 4 | Performance and Scalability Considerations - Optimize transformation processes for large datasets. - Ensure system resources can handle transformation workloads | Syniti |
| 5 | Logging and Error Handling - Maintain detailed logs of transformation activities. - Define error-handling procedures for failed transformations | Syniti |
| Task | Action |
|---|---|
| Compare Data Counts | Compare record counts between the source extract and the staged data |
| Validate the mandatory fields | Validate there is a value for all the mandatory fields |
| Validate Primary Keys and Unique Constraints | Check for duplicate or missing key values |
| Test Referential Integrity | Confirm dependent records exist in related tables |
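The four checks in the table above can be sketched as one validation pass. The field and key names follow the SRGBTBREL design in this document; the check shapes and the function name are illustrative, not the Syniti Migrate implementation.

```python
from collections import Counter

MANDATORY = ("RELTYPE", "INSTID_A", "TYPEID_A", "CATID_A")

def preload_checks(source_count, staged, valid_docs):
    """Run the pre-load checks; return a list of error strings (empty = pass)."""
    errors = []
    # 1. Compare data counts between source extract and staged data.
    if source_count != len(staged):
        errors.append(f"count mismatch: {source_count} vs {len(staged)}")
    # 2. Mandatory fields must be populated.
    for i, rec in enumerate(staged):
        missing = [f for f in MANDATORY if not rec.get(f)]
        if missing:
            errors.append(f"record {i}: missing {missing}")
    # 3. Duplicate key check (BRELGUID is the table key).
    dupes = [k for k, n in Counter(r.get("BRELGUID") for r in staged).items() if n > 1]
    if dupes:
        errors.append(f"duplicate keys: {dupes}")
    # 4. Referential integrity: the linked sales document must exist.
    for i, rec in enumerate(staged):
        if rec.get("INSTID_A") not in valid_docs:
            errors.append(f"record {i}: unknown document {rec.get('INSTID_A')}")
    return errors

staged = [{"BRELGUID": "g1", "RELTYPE": "ATTA", "INSTID_A": "D1",
           "TYPEID_A": "BUS2032", "CATID_A": "BO"}]
issues = preload_checks(source_count=1, staged=staged, valid_docs={"D1"})
```

Any non-empty result would be logged and resolved before the load starts.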
| Task | Action |
|---|---|
| Validate the transformation | Validate that fields requiring transformation hold the transformed value rather than the original source value |
| Check Data Consistency | Verify that copied field values match the source data unchanged |
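Both transformation checks can be sketched as a record-level comparison. The function name is illustrative; the transformed field (INSTID_A via xref) and the copied fields follow the transformation rules in this document.

```python
def validate_transformation(source_rec, target_rec, xref):
    """Compare a source/target record pair; return a list of issues (empty = pass)."""
    issues = []
    # Transformed field: INSTID_A must hold the xref value, not the source value.
    if target_rec["INSTID_A"] != xref.get(source_rec["INSTID_A"]):
        issues.append("INSTID_A not remapped via xref")
    # Copied fields: must match the source unchanged.
    for field in ("RELTYPE", "TYPEID_A", "CATID_A"):
        if target_rec.get(field) != source_rec.get(field):
            issues.append(f"{field} differs from source")
    return issues

# Example pair: the document number was remapped, everything else copied.
src = {"INSTID_A": "D1", "RELTYPE": "ATTA", "TYPEID_A": "BUS2032", "CATID_A": "BO"}
tgt = {**src, "INSTID_A": "S1"}
issues = validate_transformation(src, tgt, {"D1": "S1"})
```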
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
The load process includes:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors | Syniti |
Load Phase and Dependencies
Attachments for open sales transactions will be loaded in the pre-cutover period.
Loading depends on the completion of the DMS server migration.
List the Configurations required before loading can commence
| Item # | Configuration Item |
|---|---|
| N/A | |
| Object # | Preceding Object Conversion Approach |
|---|---|
| 9065 | Open Sales Orders - Standard/Rush/FOC/Consignment/Intercompany/Thirdparty/Returns/Credit Memo/Debit Memo |
| 9064 | Open Sales Scheduling Agreements with long text with customized fields |
| 9062 | Open Quotations with long text with customized fields |
| 9063 | Open Sales Contracts with long text with customized fields |
The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.
| Error Type | Error Description | Action Taken |
|---|---|---|
| Technical | An error message is raised when running the upload program | Raise a ticket to the technical team |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
| Task | Action |
|---|---|
| Task | Action |
|---|---|
Any additional key assumptions.
Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.