| Status | |
| Owner | |
| Stakeholders | The business stakeholders involved in making, reviewing, and endorsing this decision. |
The purpose of this document is to define the conversion approach for creating Business Partners - General in S/4HANA.
In SAP ECC, customer and vendor master data are maintained separately as distinct entities. Customers are managed through Customer Master Data, while vendors are handled via Vendor Master Data. These records store essential details such as company name, address, payment terms, and tax information.
In SAP S/4HANA, the Business Partner (BP) concept replaces the traditional customer and vendor master data approach. The BP model integrates both customer and vendor roles into a single entity, simplifying data management and ensuring consistency across different business functions.
The scope of this document covers the approach for converting active Customer Master Data General and Vendor Master General from Legacy Source Systems into S/4HANA Business Partner (BP) General (Role 000000) Master Data Design Standard.
Customer Master Data - General Information
The data from the legacy system includes:
The data from the legacy system excludes:
Vendor Master Data - General Information
The data from the legacy system includes:
The data from the legacy system excludes:
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Customer Master Data General Information | 50,000 | S/4HANA System | 50,000 |
| PF2 | Customer Master Data General Information | 50,000 | S/4HANA System | 50,000 |
The customer and vendor general data may contain international addresses. Therefore, the conversion will also need to support multi-language addresses. The languages (international versions) below are supported.
| International Version | Description |
|---|---|
| C | Simplified Chinese |
| R | Cyrillic |
| K | Kanji (Japanese) |
| A | Arabic |
| 3 | Korean |
| T | Thai |
| H | Hangul |
It is possible that customers have attachments in the legacy system. The migration of attachments will be covered in conversion spec CNV-3004 - Attachment for customer master data.
Legal requirements, if any, are summarized below.
Due to compliance requirements, there will be one SAP instance for the Rest of the World and one specifically for China. For entities in China, the data will be loaded into the China SAP instance, while the overall migration process remains the same as for the Rest of the World.
If the data conversion involves third-party systems or external data sources, such as Icertis, describe any additional requirements related to data mapping, transformation logic, validation rules or security measures that must be followed.
With Functional input, the technical design of the target fields in scope for this conversion approach is documented below.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
| MARA | MATNR | MATNR | Material Number | CHAR | 18 | Mandatory |
All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.
If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.
Customer Master Data - General Information
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 3007-001 | | Missing postal code in the general data | | | PF2/WP2 |
| 3007-002 | | Missing street in the general data | | | PF2/WP2 |
| 3007-003 | | Missing region in the general data | | | PF2/WP2 |
| 3007-004 | | Review the international version addresses maintained for the customer | | | PF2/WP2 |
| 3007-005 | | Review the customers with an obsolete region code | | | PF2/WP2 |
| 3007-006 | | Identify duplicate BPs: customer vs. customer | | | PF2/WP2 |
| 3007-007 | | Identify duplicate BPs: vendor vs. customer, if applicable | | | PF2/WP2 |
| | | Block customer general data without any usage for more than 2 years | Customers without a general block that have not been used in any sales or finance transaction for more than 2 years under any company code | | |
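As a hedged illustration only, rules such as 3007-001 to 3007-003 could be expressed as simple checks over the extracted staging data, as sketched below in Python. The record layout and field names (postal_code, street, region) are assumptions for the example; the actual rules are implemented in the agreed cleansing environment, not in custom code.

```python
# Minimal sketch of cleansing checks in the spirit of rules 3007-001..003.
# Field names and record layout are illustrative assumptions, not the real
# staging schema used for this conversion.

def check_general_data(record):
    """Return a list of (rule_id, message) findings for one customer record."""
    findings = []
    if not record.get("postal_code"):
        findings.append(("3007-001", "Missing postal code in the general data"))
    if not record.get("street"):
        findings.append(("3007-002", "Missing street in the general data"))
    if not record.get("region"):
        findings.append(("3007-003", "Missing region in the general data"))
    return findings

# Example staging extract (illustrative values only).
staging = [
    {"customer": "0000100001", "postal_code": "1050", "street": "Rue X", "region": "BRU"},
    {"customer": "0000100002", "postal_code": "", "street": "Main St", "region": ""},
]

for rec in staging:
    for rule_id, message in check_general_data(rec):
        print(f"{rule_id} | {rec['customer']} | {message}")
```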
Vendor Master Data - General Information
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, such as the Syensqo ECC systems (i.e., WP2/PF2). The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system.
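Purely as an illustration of the three stages described above, the skeleton below sketches one possible structure for an extract-transform-load run. The function names, record layout, and standardization rules are assumptions for the example; the actual pipeline is executed in Syniti Migrate rather than custom code.

```python
# Illustrative ETL skeleton only; the real pipeline runs in Syniti Migrate.

def extract(source_rows):
    """Stage raw rows from the source system (e.g. WP2/PF2 extracts)."""
    return list(source_rows)

def transform(rows):
    """Cleanse and standardize staged rows before loading."""
    transformed = []
    for row in rows:
        row = dict(row)
        # Standardization example: trim whitespace and upper-case the country key.
        row["country"] = row.get("country", "").strip().upper()
        # Validation example: skip rows that fail a mandatory-field check.
        if not row.get("name"):
            continue
        transformed.append(row)
    return transformed

def load(rows):
    """Hand the target-ready rows to the load mechanism for S/4HANA."""
    for row in rows:
        print("ready to load:", row)

source = [{"name": "ACME GmbH", "country": " de "}, {"name": "", "country": "FR"}]
load(transform(extract(source)))
```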

Extract data from a source into Syniti Migrate. There are two possibilities:
The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| Extraction Scope Definition | - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti |
| Extraction Methodology | - Specify the extraction approach (full, incremental, or delta extraction; see the sketch after this table). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| Extraction Execution Plan | - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| Data Quality and Validation | - Define error handling mechanisms for extraction failures. | Syniti |
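As a hedged example of what a delta-extraction filter could look like, the sketch below selects only records changed since the last extraction run. The change-date field and the cutoff handling are assumptions for the illustration; the actual extraction approach and filtering criteria are defined in Syniti.

```python
# Illustrative delta-extraction filter; the change-date field name and the
# last-run cutoff handling are assumptions for this sketch.
from datetime import date

def delta_filter(rows, last_extracted_on):
    """Keep only rows changed on or after the last extraction date."""
    return [r for r in rows if r["changed_on"] >= last_extracted_on]

rows = [
    {"customer": "0000100001", "changed_on": date(2024, 1, 10)},
    {"customer": "0000100002", "changed_on": date(2024, 6, 2)},
]
print(delta_filter(rows, last_extracted_on=date(2024, 5, 1)))
```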
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
| N/A | | | | |
Business Partner General DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
The steps below need to occur before extraction can commence:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, the Functional team, and the Data team, and it identifies the transformation activity required for Syniti Migrate to make the data target-ready:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units) | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable - Apply conditional transformations based on predefined logic/business rules | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
| Mapping Table Name | Mapping Table Description |
|---|---|
| MAP_BU_GROUP | BP Grouping Mapping Table |
| MAP_REGION | Country/Region Code Mapping Table |
| MAP_BPKIND | BP Type Mapping Table |
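As a hedged illustration of how the mapping tables above might be applied during transformation, the sketch below resolves a source account group and region through simple lookup dictionaries standing in for MAP_BU_GROUP and MAP_REGION. The mapping values and field names shown are examples only, not the agreed mappings.

```python
# Illustrative use of mapping tables during transformation.
# The mapping contents below are examples only, not the agreed values.
MAP_BU_GROUP = {"0001": "BP01", "ZVEN": "BP02"}   # source account group -> BP grouping
MAP_REGION = {("CN", "010"): "BJ"}                 # (country, legacy region) -> target region

def map_record(record):
    """Apply the BP grouping and region mappings to one staged record."""
    mapped = dict(record)
    mapped["bp_grouping"] = MAP_BU_GROUP.get(record["account_group"])
    mapped["region"] = MAP_REGION.get((record["country"], record["region"]), record["region"])
    return mapped

print(map_record({"account_group": "ZVEN", "country": "CN", "region": "010"}))
```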
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source Data Integrity - Ensure extracted data is complete, accurate, and consistent. - Validate that data types and formats align with transformation requirements. | Syniti |
| 2 | Referential Integrity - Ensure dependent records are transformed together or in advance | Syniti |
| 3 | Transformation Logic and Mapping - Define data mapping rules between source and target schemas. | Data Team |
| 4 | Performance and Scalability Considerations - Optimize transformation processes for large datasets. - Ensure system resources can handle transformation workloads | Syniti |
| 5 | Logging and Error Handling - Maintain detailed logs of transformation activities. - Define error-handling procedures for failed transformations | Syniti |
| Task | Action |
|---|---|
| Compare Data Counts | Compare record counts between the source extract and the staged data to confirm all in-scope records are present |
| Validate the mandatory fields | Validate that all mandatory fields have a value |
| Validate Primary Keys and Unique Constraints | Check for duplicate or missing key values in the staged data |
| Test Referential Integrity | Confirm dependent records exist in related tables |
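A minimal sketch of the pre-load checks above (record counts, mandatory fields, and key uniqueness), using illustrative field names; the actual checks are executed within Syniti Migrate, so this is an assumption-based example rather than the real implementation.

```python
# Illustrative pre-load validation checks; field names are assumptions.
from collections import Counter

MANDATORY_FIELDS = ["bp_number", "name", "country"]

def validate(source_count, staged_rows):
    issues = []
    # Compare data counts between source and staging.
    if source_count != len(staged_rows):
        issues.append(f"count mismatch: source={source_count}, staged={len(staged_rows)}")
    # Validate the mandatory fields.
    for row in staged_rows:
        for field in MANDATORY_FIELDS:
            if not row.get(field):
                issues.append(f"{row.get('bp_number', '?')}: missing {field}")
    # Validate primary key uniqueness.
    duplicates = [k for k, n in Counter(r["bp_number"] for r in staged_rows).items() if n > 1]
    if duplicates:
        issues.append(f"duplicate keys: {duplicates}")
    return issues

staged = [{"bp_number": "100001", "name": "ACME", "country": "DE"},
          {"bp_number": "100001", "name": "", "country": "FR"}]
print(validate(source_count=3, staged_rows=staged))
```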
| Task | Action |
|---|---|
| Validate the transformation | Validate that fields requiring transformation contain the transformed value rather than the original source value |
| Check Data Consistency | Confirm values are consistent across related records and datasets after transformation |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
The load process includes:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data Team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors | Syniti |
Load Phase and Dependencies
The Business Partner General will be loaded in the pre-cutover period.
Before loading, there is a dependency on configuration: the configuration must be transported into the respective system first, including manual configuration such as the number range setup.
The configurations required before loading can commence are listed below:
| Item # | Configuration Item |
|---|---|
| 1 | BP Grouping |
| 2 | Customer/Vendor Account Group |
| 3 | International Version |
| 4 | Tax Category |
| 5 | BP Type |
| 6 | BP Number Range/Customer/Vendor Number range |
| Object # | Preceding Object Conversion Approach |
|---|---|
| 1083 | Bank Master |
The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.
| Error Type | Error Description | Action Taken |
|---|---|---|
| Configuration / Data Transformation | The value XXX for field XXX doesn't exist | |
| Configuration | Mandatory field XXX is missing | |
| Configuration | The BP grouping is set to an external or internal number range | |
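As an illustration of how the configuration-related errors above could be caught before the load, the sketch below cross-checks staged values against the configuration expected in the target system. The configured value sets and field names are placeholders for the example, not the actual S/4HANA configuration.

```python
# Illustrative pre-check of staged values against target configuration,
# intended to catch "value XXX for field XXX doesn't exist" errors early.
# The configured value sets below are placeholders only.
CONFIGURED = {
    "bp_grouping": {"BP01", "BP02"},
    "bp_type": {"Z001", "Z002"},
}

def precheck(rows):
    errors = []
    for row in rows:
        for field, allowed in CONFIGURED.items():
            value = row.get(field)
            if value and value not in allowed:
                errors.append(f"{row['bp_number']}: value {value} for field {field} doesn't exist")
    return errors

print(precheck([{"bp_number": "100001", "bp_grouping": "BP09", "bp_type": "Z001"}]))
```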
| Task | Action |
|---|---|
| Display Records | Pick a few random BP numbers and run the BP report to validate that the BP general information can be displayed without errors |
| Perform Source-to-Target Comparisons | Compare field values between the source extract and the loaded BP records for a sample of business partners |
| Task | Action |
|---|---|
| Execute Sample Queries and Reports | Run sample queries and reports against the migrated data and confirm the results are as expected |
| Conduct Post-Migration Reconciliation | Generate reports comparing pre- and post-migration data. |
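A hedged sketch of a source-to-target comparison in the spirit of the tasks above: record counts plus a field-level spot check between the staged data and the data read back from the target. The field names and comparison scope are assumptions; the actual reconciliation reports are produced as part of the migration tooling.

```python
# Illustrative post-load reconciliation; field names are assumptions.
COMPARE_FIELDS = ["name", "country", "postal_code"]

def reconcile(staged, loaded):
    """Compare counts and selected field values between staged and loaded data."""
    report = {"staged": len(staged), "loaded": len(loaded), "mismatches": []}
    loaded_by_key = {row["bp_number"]: row for row in loaded}
    for row in staged:
        target = loaded_by_key.get(row["bp_number"])
        if target is None:
            report["mismatches"].append((row["bp_number"], "missing in target"))
            continue
        for field in COMPARE_FIELDS:
            if row.get(field) != target.get(field):
                report["mismatches"].append((row["bp_number"], field))
    return report

staged = [{"bp_number": "100001", "name": "ACME", "country": "DE", "postal_code": "10115"}]
loaded = [{"bp_number": "100001", "name": "ACME", "country": "DE", "postal_code": "10115"}]
print(reconcile(staged, loaded))
```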
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| Perform Manual Testing | Conduct manual spot-checks for additional assurance. |
Any additional key assumptions.