| Status | Revision in Progress |
|---|---|
| Owner | |
| Stakeholders | |
Purpose
The purpose of this document is to define the conversion approach for migrating attachments for customer master data to SAP S/4HANA.
Attachments in SAP ECC are typically stored using Generic Object Services (GOS), which allows users to link files to master data objects such as customers (via transaction codes like XD02). In SAP S/4HANA, attachments are linked using the Attachment List Service (also used in Fiori apps). This conversion specification captures how the attachments of in-scope customers will be migrated into S/4HANA to maintain business continuity.
Conversion Scope
The scope of this document covers the approach for converting attachments for active customer master data from SAP ECC into S/4HANA, following the document "DD-FUN-050 Master Data Standard_3004-Attachment for customer master data".
The data from legacy system includes:
- Customer general data in migration scope (the scope is elaborated in CNV-3007 Business Partners - General (Role 000000)).
- Attachments maintained in the legacy ECC systems (WP2/PF2) for the active customers. The relationship types include note, private note, and attachment.
The data from legacy system excludes:
- Attachments for customers outside of migration scope.
- Attachments not stored in SAP ECC (WP2/PF2).
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Attachment for customer master data | 473 | S/4HANA ROW | 473 |
| PF2 | Attachment for customer master data | 72 | S/4HANA ROW | 72 |
Additional Information
Multi-language Requirement
N/A
Document Management
The DMS approach is elaborated in KDD "KDD085 - Document Management in the SyWay Solution". The migration approach for the attachment will follow the DMS approach approved in this KDD.
Legal Requirement
CMMC 2.0 is a mandatory DoD cybersecurity certification for contractors handling Controlled Unclassified Information (CUI) and Federal Contract Information (FCI). CUI includes sensitive technical data (e.g., design specs, system info) related to U.S. military and space applications. The Composites Business handles CUI and is therefore within CMMC scope. Without certification, the business risks disqualification from existing and future DoD programs.
It is mandatory to implement CMMC-compliant systems and processes for all organizations that handle CUI.
For this data object, any CUI-related information will be handled by US-based consultants to meet the compliance requirement.
Special Requirements
N/A
Target Design
This section describes the technical design of the target tables for this conversion approach.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
| SRGBTBREL | CLIENT | CLIENT | Client | C | 3 | Internal |
| SRGBTBREL | BRELGUID | BRELGUID | GUID | X | 16 | Internal |
| SRGBTBREL | RELTYPE | RELTYPE | Relationship type | C | 10 | Mandatory |
| SRGBTBREL | INSTID_A | INSTID_A | Instance ID | C | 70 | Mandatory |
| SRGBTBREL | TYPEID_A | TYPEID_A | Object Type | C | 32 | Mandatory |
| SRGBTBREL | CATID_A | CATID_A | Object Category | C | 2 | Mandatory |
| SRGBTBREL | INSTID_B | INSTID_B | Instance ID | C | 70 | Internal |
| SRGBTBREL | TYPEID_B | TYPEID_B | Object Type | C | 32 | Internal |
| SRGBTBREL | CATID_B | CATID_B | Object Category | C | 2 | Internal |
| SRGBTBREL | LOGSYS_A | LOGSYS_A | Logical System | C | 10 | Not in use |
| SRGBTBREL | ARCH_A | ARCH_A | Object Archived | C | 1 | Not in use |
| SRGBTBREL | LOGSYS_B | LOGSYS_B | Logical System | C | 10 | Not in use |
| SRGBTBREL | ARCH_B | ARCH_B | Object Archived | C | 1 | Not in use |
| SRGBTBREL | UTCTIME | UTCTIME | Short Time Stamp | P | 8 | Internal |
| SRGBTBREL | HOMESYS | HOMESYS | Logical System | C | 10 | Not in use |
| SOOD | OBJTP | OBJTP | Document class | C | 3 | Internal |
| SOOD | OBJYR | OBJYR | Object year | C | 2 | Internal |
| SOOD | OBJNO | OBJNO | Object number | C | 12 | Internal |
| SOOD | OBJLA | OBJLA | Doc. language | C | 1 | Internal |
| SOOD | OBJSRT | OBJSRT | Sort field | C | 10 | Not in use |
| SOOD | OBJNAM | OBJNAM | Document Name | C | 12 | Internal |
| SOOD | OBJDES | OBJDES | Document title | C | 50 | Mandatory |
| SOOD | OWNTP | OWNTP | Owner type | C | 3 | Internal |
| SOOD | OWNYR | OWNYR | Owner year | C | 2 | Internal |
| SOOD | OWNNO | OWNNO | Owner number | C | 12 | Internal |
| SOOD | OWNNAM | OWNNAM | Owner name | C | 12 | Internal |
| SOOD | CROTP | CROTP | User Type | C | 3 | Internal |
| SOOD | CROYR | CROYR | User year | C | 2 | Internal |
| SOOD | CRONO | CRONO | User number | C | 12 | Internal |
| SOOD | CRONAM | CRONAM | Created by | C | 12 | Internal |
| SOOD | CRDAT | CRDAT | Date created | D | 8 | Internal |
| SOOD | CRTIM | CRTIM | Created at | T | 6 | Internal |
| SOOD | FILE_EXT | FILE_EXT | File extension | C | 3 | Mandatory |
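To illustrate the link model in the table above, a GOS relationship row in SRGBTBREL can be sketched as a simple record: side A is the business object (here the customer, object type KNA1) and side B is the stored document. This is a minimal sketch in Python; the sample key values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class GosLink:
    """One row of SRGBTBREL: links a business object (A) to a document (B)."""
    reltype: str    # NOTE / PNOT / ATTA (note, private note, attachment)
    instid_a: str   # key of the business object, e.g. the customer number
    typeid_a: str   # object type of side A, e.g. KNA1
    catid_a: str    # object category of side A
    instid_b: str   # key of the stored document (internal)
    typeid_b: str   # object type of side B (internal)
    catid_b: str    # object category of side B (internal)

# Hypothetical link: attachment (ATTA) on customer 0000100001
link = GosLink(
    reltype="ATTA",
    instid_a="0000100001",
    typeid_a="KNA1",
    catid_a="BO",
    instid_b="FOL33000000000004EXT39000000000123",
    typeid_b="MESSAGE",
    catid_b="BO",
)
print(link.reltype, link.typeid_a)
```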
Data Cleansing
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| N/A | | | | | |
Conversion Process
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, i.e., the Syensqo ECC systems (WP2/PF2). The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system. Attachments will be migrated via a custom program. The table below lists the major steps of this ETL process.
| Step | Step Description | Owner | Remarks |
|---|---|---|---|
| 1 | Extract the tables SRGBTBREL / SOOD for attachments | Syniti | |
| 2 | Apply the relevancy rule and define the scope of attachments to be migrated | Syniti | The output is an Excel file that includes the ECC attachment information relevant for migration |
| 3 | Based on Step 2, download the attachments to a local directory | | A custom program will be developed in ECC (WP2/PF2). Reading the Excel file from Step 2, it downloads the attachments to the local directory |
| 4 | Perform transformation, i.e., convert the ECC customer number to the S/4 BP number | Syniti | The output is an Excel file that includes the S/4 data object and the attachment information (i.e., file name from Step 2) |
| 5 | Based on Step 4, upload the attachments in S/4 | | A custom program will be developed in S/4. Reading the Excel file from Step 4, it uploads the attachments from the local directory (saved in Step 3) |
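The five steps above can be sketched end-to-end as a small driver. Everything here is illustrative: the function names (`extract_links`, `apply_relevancy`, `download_attachments`, `map_to_bp`, `upload_attachments`) are placeholders for the Syniti extraction and the two custom programs, not real APIs.

```python
def extract_links(tables):
    """Step 1: extract SRGBTBREL / SOOD rows (stub for the Syniti extraction)."""
    return [{"customer": "0000100001", "doc_id": "DOC1", "file": "spec.pdf"}]

def apply_relevancy(rows, in_scope):
    """Step 2: keep only attachments belonging to customers in migration scope."""
    return [r for r in rows if r["customer"] in in_scope]

def download_attachments(rows, target_dir):
    """Step 3: placeholder for the ECC custom download program."""
    return [dict(r, path=f"{target_dir}/{r['file']}") for r in rows]

def map_to_bp(rows, kunnr_to_bp):
    """Step 4: convert ECC customer numbers to S/4 BP numbers (MAP_KUNNR)."""
    return [dict(r, bp=kunnr_to_bp[r["customer"]]) for r in rows]

def upload_attachments(rows):
    """Step 5: placeholder for the S/4 custom upload program."""
    return len(rows)

scope = {"0000100001"}                       # hypothetical in-scope customer
mapping = {"0000100001": "0001000501"}       # hypothetical MAP_KUNNR entry
staged = apply_relevancy(extract_links(["SRGBTBREL"]), scope)
rows = map_to_bp(download_attachments(staged, "/tmp/gos"), mapping)
uploaded = upload_attachments(rows)
print(uploaded)
```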
Data Privacy and Sensitivity
N/A
Extraction
Extract data from SAP ECC table SRGBTBREL into Syniti Migrate. Syniti Migrate connects to the source systems and loads the data, performing a full extraction from the relevant tables in the source system(s).
Extraction Run Sheet
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| 1 | Extraction Scope Definition - Identify the source systems WP2/PF2 and databases involved. - Define the data objects (table SRGBTBREL) to be extracted. - Establish business rules for data selection. | Syniti |
| 2 | Extraction Methodology - Specify the extraction approach (full extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti / |
| 3 | Extraction Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti / |
| 4 | Data Quality and Validation - Define error handling mechanisms for extraction failures. | Syniti / |
| 5 | Extract the attachments - Run the custom program based on the template below to download the attachments to a local directory. | Data |
Sample template to download the attachment from SAP
| Object Type | Object Key | Document ID | File Name | MIME Type |
|---|---|---|---|---|
| BUS2032 | 4500001234 | DOC123456 | invoice.pdf | application/pdf |
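A minimal sketch of how the download control file above could be consumed: parse the rows and derive a local file path per attachment. The CSV layout mirrors the sample template; the actual download itself happens in the ABAP custom program, so only the planning step is shown here.

```python
import csv
import io
import os

# Control file in the layout of the sample template above
template = """Object Type,Object Key,Document ID,File Name,MIME Type
BUS2032,4500001234,DOC123456,invoice.pdf,application/pdf
"""

rows = list(csv.DictReader(io.StringIO(template)))

# Build a download plan: (document ID, target local path), prefixing the
# object key so files from different objects cannot collide
download_plan = [
    (r["Document ID"], os.path.join("gos_dump", r["Object Key"] + "_" + r["File Name"]))
    for r in rows
]
print(download_plan)
```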
Selection Screen
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
| N/A | ||||
Data Collection Template (DCT)
A target-ready Data Collection Template will be created for the data, with the exception of some fields that require transformation as described in the transformation rules.
DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
| N/A | ||
Extraction Dependencies
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti / |
| 3 | Referential Integrity | Syniti / |
| 4 | Extraction Methodology | Syniti / |
| 5 | Performance and Scalability Considerations | Syniti / |
| 6 | Security and Compliance | Syniti / |
Transformation
The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, Functional team, and Data team. It identifies the transformation activity required to allow Syniti Migrate to make the data target ready:
- Perform value mapping and data transformation rules.
- Legacy values are mapped to the to-be values (this could include a default value).
- Values are transformed according to the rules defined in Syniti Migrate.
- Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform pre-load data validation.
Transformation Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units) | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable - Apply conditional transformations based on predefined logic/business rules | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms | Syniti / |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
| 1 | WP2/PF2 | SRGBTBREL | CLIENT | Client | S4 Hana | SRGBTBREL | CLIENT | Client | Internal - |
| 2 | WP2/PF2 | SRGBTBREL | BRELGUID | GUID | S4 Hana | SRGBTBREL | BRELGUID | GUID | Internal - |
| 3 | WP2/PF2 | SRGBTBREL | RELTYPE | Relationship type | S4 Hana | SRGBTBREL | RELTYPE | Relationship type | Copy - value will be NOTE/PNOT/ATTA (note, private note, attachment) |
| 4 | WP2/PF2 | SRGBTBREL | INSTID_A | Instance ID | S4 Hana | SRGBTBREL | INSTID_A | Instance ID | Mapping - MAP_KUNNR maps the ECC customer number to the S/4 BP number |
| 5 | WP2/PF2 | SRGBTBREL | TYPEID_A | Object Type | S4 Hana | SRGBTBREL | TYPEID_A | Object Type | Copy - KNA1 |
| 6 | WP2/PF2 | SRGBTBREL | CATID_A | Object Category | S4 Hana | SRGBTBREL | CATID_A | Object Category | Internal |
| 7 | WP2/PF2 | SRGBTBREL | INSTID_B | Instance ID | S4 Hana | SRGBTBREL | INSTID_B | Instance ID | Internal - |
| 8 | WP2/PF2 | SRGBTBREL | TYPEID_B | Object Type | S4 Hana | SRGBTBREL | TYPEID_B | Object Type | Internal |
| 9 | WP2/PF2 | SRGBTBREL | CATID_B | Object Category | S4 Hana | SRGBTBREL | CATID_B | Object Category | Internal |
| 10 | WP2/PF2 | SRGBTBREL | LOGSYS_A | Logical System | S4 Hana | SRGBTBREL | LOGSYS_A | Logical System | Not in Use - |
| 11 | WP2/PF2 | SRGBTBREL | ARCH_A | Object Archived | S4 Hana | SRGBTBREL | ARCH_A | Object Archived | Not in Use - |
| 12 | WP2/PF2 | SRGBTBREL | LOGSYS_B | Logical System | S4 Hana | SRGBTBREL | LOGSYS_B | Logical System | Not in Use - |
| 13 | WP2/PF2 | SRGBTBREL | ARCH_B | Object Archived | S4 Hana | SRGBTBREL | ARCH_B | Object Archived | Not in Use - |
| 14 | WP2/PF2 | SRGBTBREL | UTCTIME | Short Time Stamp | S4 Hana | SRGBTBREL | UTCTIME | Short Time Stamp | Internal - |
| 15 | WP2/PF2 | SRGBTBREL | HOMESYS | Logical System | S4 Hana | SRGBTBREL | HOMESYS | Logical System | Not in Use - |
| 16 | WP2/PF2 | SOOD | OBJTP | Document class | S4 Hana | SOOD | OBJTP | Document class | Internal |
| 17 | WP2/PF2 | SOOD | OBJYR | Object year | S4 Hana | SOOD | OBJYR | Object year | Internal |
| 18 | WP2/PF2 | SOOD | OBJNO | Object number | S4 Hana | SOOD | OBJNO | Object number | Internal |
| 19 | WP2/PF2 | SOOD | OBJLA | Doc. language | S4 Hana | SOOD | OBJLA | Doc. language | Internal |
| 20 | WP2/PF2 | SOOD | OBJSRT | Sort field | S4 Hana | SOOD | OBJSRT | Sort field | Not in use |
| 21 | WP2/PF2 | SOOD | OBJNAM | Document Name | S4 Hana | SOOD | OBJNAM | Document Name | Internal |
| 22 | WP2/PF2 | SOOD | OBJDES | Document title | S4 Hana | SOOD | OBJDES | Document title | Copy |
| 23 | WP2/PF2 | SOOD | OWNTP | Owner type | S4 Hana | SOOD | OWNTP | Owner type | Internal |
| 24 | WP2/PF2 | SOOD | OWNYR | Owner year | S4 Hana | SOOD | OWNYR | Owner year | Internal |
| 25 | WP2/PF2 | SOOD | OWNNO | Owner number | S4 Hana | SOOD | OWNNO | Owner number | Internal |
| 26 | WP2/PF2 | SOOD | OWNNAM | Owner name | S4 Hana | SOOD | OWNNAM | Owner name | Internal |
| 27 | WP2/PF2 | SOOD | CROTP | User Type | S4 Hana | SOOD | CROTP | User Type | Internal |
| 28 | WP2/PF2 | SOOD | CROYR | User year | S4 Hana | SOOD | CROYR | User year | Internal |
| 29 | WP2/PF2 | SOOD | CRONO | User number | S4 Hana | SOOD | CRONO | User number | Internal |
| 30 | WP2/PF2 | SOOD | CRONAM | Created by | S4 Hana | SOOD | CRONAM | Created by | Internal |
| 31 | WP2/PF2 | SOOD | CRDAT | Date created | S4 Hana | SOOD | CRDAT | Date created | Internal |
| 32 | WP2/PF2 | SOOD | CRTIM | Created at | S4 Hana | SOOD | CRTIM | Created at | Internal |
| 33 | WP2/PF2 | SOOD | FILE_EXT | File extension | S4 Hana | SOOD | FILE_EXT | File extension | Rule - concatenate 'application/' with this field |
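Rules 4 and 33 above (the MAP_KUNNR customer-to-BP mapping and the MIME-type derivation from FILE_EXT) can be sketched as a per-row transformation. The function name, field dictionary, and sample mapping entry are all illustrative, not part of the real Syniti rule set.

```python
def transform(row, map_kunnr):
    """Apply the field rules above to one extracted link row (illustrative)."""
    out = dict(row)
    # Rule 4: map the ECC customer number (INSTID_A) to the S/4 BP number
    out["INSTID_A"] = map_kunnr[row["INSTID_A"]]
    # Rules 3 and 5: relationship type and object type are copied unchanged
    # Rule 33: MIME type = 'application/' concatenated with the file extension
    out["MIME"] = "application/" + row["FILE_EXT"].lower()
    return out

map_kunnr = {"0000100001": "0001000501"}   # hypothetical MAP_KUNNR entry
row = {"INSTID_A": "0000100001", "RELTYPE": "ATTA",
       "TYPEID_A": "KNA1", "FILE_EXT": "PDF"}
print(transform(row, map_kunnr))
```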
Transformation Mapping
| Mapping Table Name | Mapping Table Description |
|---|---|
| MAP_KUNNR | BP Customer Mapping Table |
Transformation Dependencies
List the steps that need to occur before transformation can commence:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source Data Integrity - Ensure extracted data is complete, accurate, and consistent. - Validate that data types and formats align with transformation requirements. | Syniti / |
| 2 | Referential Integrity - Ensure dependent records are transformed together or in advance, such as CNV-3007 Business Partner General and DMS server migration | Syniti / |
| 3 | Transformation Logic and Mapping - Define data mapping rules between source and target schemas. | Data Team |
| 4 | Performance and Scalability Considerations - Optimize transformation processes for large datasets. - Ensure system resources can handle transformation workloads | Syniti / |
| 5 | Logging and Error Handling - Maintain detailed logs of transformation activities. - Define error-handling procedures for failed transformations | Syniti / |
Pre-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Compare Data Counts | |
| Validate the mandatory fields | Validate there is a value for all the mandatory fields |
| Validate Primary Keys and Unique Constraints | |
| Test Referential Integrity | Confirm dependent records exist in related tables |
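The completeness checks above can be expressed as simple assertions over the extracted and target-ready record sets. This is a sketch only; the field names and the record-set shape (lists of dictionaries) are assumptions, not the Syniti Migrate validation API.

```python
def preload_checks(source_rows, target_rows,
                   mandatory=("RELTYPE", "INSTID_A", "TYPEID_A", "CATID_A")):
    """Return a list of findings; an empty list means all checks pass."""
    findings = []
    # Compare record counts between source extract and target-ready set
    if len(source_rows) != len(target_rows):
        findings.append(f"count mismatch: {len(source_rows)} vs {len(target_rows)}")
    # Mandatory fields must be populated
    for i, r in enumerate(target_rows):
        for f in mandatory:
            if not r.get(f):
                findings.append(f"row {i}: missing {f}")
    # Link keys must be unique (no duplicate object/document relationships)
    keys = [(r["INSTID_A"], r.get("INSTID_B")) for r in target_rows]
    if len(keys) != len(set(keys)):
        findings.append("duplicate link keys")
    return findings

src = [{"RELTYPE": "ATTA", "INSTID_A": "0001000501", "TYPEID_A": "KNA1",
        "CATID_A": "BO", "INSTID_B": "FOL1"}]
print(preload_checks(src, src))
```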
Accuracy
| Task | Action |
|---|---|
| Validate the transformation | Validate the fields which require transformation have the value after transformation instead of the original field value |
| Check Data Consistency | |
Business
Completeness
| Task | Action |
|---|---|
| Compare Data Counts | |
Accuracy
| Task | Action |
|---|---|
| Check Data Consistency | |
Load
The load process includes:
- Execute the data load into target system using load tool
- Once the data is loaded to the target system, it will be extracted and prepared for Post Load Data Validation
Load Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors | Syniti |
This object will be loaded via a custom program. The custom program will use the upload template below.
| Object Type | Object Key | File Path (to be filled by uploader) | File Name | MIME Type |
|---|---|---|---|---|
| KNA1 | BP Number | C:\GOS\invoice.pdf (local directory) | invoice.pdf | application/pdf |
Load Phase and Dependencies
The Attachment for customer master data will be loaded in the pre-cutover period.
Before loading, there is a dependency on the DMS server being set up.
Configuration
| Item # | Configuration Item |
|---|---|
| N/A | |
Conversion Objects
| Object # | Preceding Object Conversion Approach |
|---|---|
| 3007 | Business Partners - General (Role 000000) |
Error Handling
| Error Type | Error Description | Action Taken |
|---|---|---|
| Technical | An error message is raised when running the upload program | Raise a ticket to the technical team |
Post-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Validate the data count in the database | Use SE16N to count records in table SRGBTBREL with Object Type KNA1, then compare with the source data count |
| Validate the attachment in the BP | |
Accuracy
| Task | Action |
|---|---|
| Compare uploaded data against source file | |
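One way to implement the source-vs-uploaded comparison above is a checksum per file: hash the local source files saved in Step 3 and the corresponding files retrieved from S/4, and flag any mismatch. The in-memory dictionaries below stand in for the actual file retrieval, which is an assumption of this sketch.

```python
import hashlib

def sha256_bytes(data: bytes) -> str:
    """Hex digest of a file's content, used as its fingerprint."""
    return hashlib.sha256(data).hexdigest()

def compare_files(source_files, uploaded_files):
    """source_files / uploaded_files: dicts of file name -> content bytes.
    Returns (name, reason) pairs; an empty list means contents match."""
    mismatches = []
    for name, content in source_files.items():
        if name not in uploaded_files:
            mismatches.append((name, "missing in target"))
        elif sha256_bytes(content) != sha256_bytes(uploaded_files[name]):
            mismatches.append((name, "checksum differs"))
    return mismatches

src = {"invoice.pdf": b"%PDF-1.4 demo"}
print(compare_files(src, {"invoice.pdf": b"%PDF-1.4 demo"}))
```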
Business
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
Completeness
| Task | Action |
|---|---|
| Verify the attachment count | |
Accuracy
| Task | Action |
|---|---|
| Compare uploaded data against source file | |
Key Assumptions
- The Master Data Standard is up to date as of the date of documenting this conversion approach and data load.
- Attachment for customer master data is in scope based on data design and any exception requested by business.
- There will be 3 SAP instances, one for ROW, one for China and one for CUI only.
See also
Change log
Workflow history
| Title | Last Updated By | Updated | Status |
|---|---|---|---|
