| Status | In Progress |
|---|---|
| Owner | |
| Stakeholders | |
Purpose
The purpose of this document is to define the conversion approach for creating Service Master records in SAP S/4HANA. It follows the Procurement Master Data Design Standard and is based on data migrated from the legacy SAP ECC systems (PF2, WP2) into SAP S/4HANA.
Checklist
- Logical Source and Target Systems are identified.
- Processing Type is specified (i.e. Direct Input, BAPI, IDoc, Web Service, OData API, etc.)
- Standard or custom load program name/BAPI/IDoc is specified to support this functionality.
- Data Mapping is provided for mappings which are not obvious to a suitably skilled and experienced developer.
- All translation requirements are clearly defined (including data validation rules, data derivation/calculation, and default values, if applicable).
- Input and/or Output file layouts are provided for all possible record types.
- Transaction volume is specified.
- Execution frequency is specified.
- Restart/Recovery requirements have been defined.
- Error handling requirements are specified, including alerting requirements and the expected action on failure.
- Application log requirements are specified, if applicable.
Additional topics to check
- Rules for Data Merging (all systems)
Conversion Scope
Current Usage and Setup in Legacy Systems:
In the legacy SAP ECC systems (PF2 and WP2), Service Master records are maintained primarily to support procurement processes involving services. However, the level of data quality and consistency varies across systems. In some cases, key fields such as Service Descriptions, Units of Measure, or Valuation Classes may be incomplete or non-standardized. Additionally, the linkage between services and associated purchasing conditions is often loosely maintained or inconsistently applied.
Target Representation in SAP S/4HANA:
In S/4HANA, Service Master data will be created in alignment with the Procurement Master Data Design Standard. The objective is to ensure that services are structured, clean, and standardized across the entire enterprise. This includes enforcing harmonized naming conventions, classification, and valuation data to support downstream procurement and analytics processes. Where applicable, service hierarchies and pricing conditions will also be defined to enable integrated procurement planning and execution.
Additional Relevant Information:
Service Master records in S/4HANA will be loaded using SAP LSMW with the function module BAPI_SERVICE_CREATE, based on validated and cleansed legacy extracts.
Redundant or obsolete service entries will be excluded as part of the cleansing process.
Each service entry will be reviewed and approved by the business before final load.
The mapping of legacy keys to S/4-compatible structures (e.g., Service Type, Material Group) will be maintained for traceability.
The data from the legacy systems includes:
Service records marked as relevant for purchasing processes
Entries with valid service numbers and active status
Records linked to active cost centers or purchasing groups
Service descriptions maintained in English
The data from the legacy systems excludes:
Obsolete or blocked service records
Entries with missing mandatory fields (e.g., Unit of Measure, Valuation Class)
Services not linked to any purchasing documents or usage history
Duplicate entries identified through harmonization process
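The inclusion/exclusion criteria above can be sketched as a simple relevancy filter. This is an illustrative sketch only: the record keys used here (`service_no`, `status`, `purchasing_relevant`, `language`, etc.) are assumed names for the cleansed extract, not actual ASMD column names.

```python
# Illustrative relevancy filter for legacy Service Master extracts.
# Field names are assumptions for a cleansed extract, not real table columns.

MANDATORY_FIELDS = ("unit_of_measure", "valuation_class")

def is_in_scope(rec: dict) -> bool:
    """Apply the inclusion/exclusion criteria listed above to one record."""
    if rec.get("deleted") or rec.get("blocked"):           # obsolete/blocked records
        return False
    if not rec.get("service_no") or rec.get("status") != "active":
        return False                                       # must have valid number + active status
    if not rec.get("purchasing_relevant"):                 # no purchasing usage
        return False
    if rec.get("language") != "EN":                        # English descriptions only
        return False
    if any(not rec.get(f) for f in MANDATORY_FIELDS):      # missing mandatory fields
        return False
    return True
```

In the actual process this logic lives in the Syniti Migrate rules; the sketch only makes the decision order explicit.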
| Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records |
|---|---|---|---|---|
| PF2 | The Service Master records will be extracted/collected via DCT. An initial extract of the relevant data will be provided in Google Sheets format to assist the business in deciding which relevant data from the source systems to include. Any additional data that needs to be created to support the new design may be added in the DCT. A data review and standardization will be done across all records in the DCT. | 38,594 | S4H | 16,000 (approx. 50% reduction) |
| WP2 | | 883 | S4H | 440 (approx. 50% reduction) |
Additional Information
Multi-language Requirement
Service Master records will have a default description available in English.
Texts in additional languages can be maintained accordingly, provided they are available in the source system. If a custom language exists in the source system, it can be migrated only if the same language exists in the target system; otherwise the description record is skipped.
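The language rule above (migrate a description only when its language is configured in the target, skip custom languages that are not) can be sketched as follows. The function and variable names are illustrative, not part of any migration tool API.

```python
# Sketch of the multi-language rule: keep a description record only if its
# language key exists in the target system's configured languages; custom
# source languages without a target equivalent are skipped.

def filter_descriptions(descriptions, target_languages):
    """descriptions: iterable of (language_key, text) pairs.
    Returns (kept, skipped) lists per the rule above."""
    kept, skipped = [], []
    for lang, text in descriptions:
        (kept if lang in target_languages else skipped).append((lang, text))
    return kept, skipped
```

The skipped list should be reported back to the business rather than silently discarded.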
| International Version | Description |
|---|---|
Document Management
Legal Requirement
Minor effort is foreseen for some specific countries with specific taxation requirements; the topic is limited to mapping tax codes from old to new.
Special Requirements
Processing Type ETL:
- Extraction (E): Data will be extracted from at least two source systems. This activity will be performed by Syniti.
- Transformation (T): Syniti will be responsible for cleansing, transforming, and merging the data, as well as updating the staging areas of the SAP Migration Cockpit.
- Loading (L): The final data load into the target SAP system will be carried out using the standard functionalities of the SAP Migration Cockpit.
Deduplication Rules
| Rule ID | Semantic Rule | Purpose |
|---|---|---|
| SR01 | Short text (Description) in combination with Language | |
| SR02 | Material Group or Service Group (check mapping when deduplicating between PF2 & WP2) | |
| SR03 | Activity in the past 3 years (excluding records marked for deletion) | |
| SR04 | Material Type is the same | |
| SR05 | Material Status field is set as active | |
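One way to read rules SR01–SR05 together is: only active, non-deleted records participate (SR03/SR05), and two records are duplicates when they match on short text + language (SR01), service group after value mapping (SR02), and material type (SR04). The sketch below illustrates that reading with assumed field names; the real rules run in Syniti Migrate.

```python
# Illustrative deduplication pass combining SR01-SR05. Field names are
# assumptions for the staged extract, not actual table columns.

def deduplicate(records):
    seen, unique, dropped = set(), [], []
    for rec in records:
        # SR03/SR05 guard: deleted or inactive records never survive
        if rec.get("deleted") or rec.get("status") != "active":
            dropped.append(rec)
            continue
        key = (
            rec["short_text"].strip().upper(),  # SR01: short text ...
            rec["language"],                    # ... in combination with language
            rec["service_group"],               # SR02: group (after value mapping)
            rec["material_type"],               # SR04: same material type
        )
        if key in seen:
            dropped.append(rec)                 # duplicate: keep first occurrence
        else:
            seen.add(key)
            unique.append(rec)
    return unique, dropped
```

Keeping the first occurrence is an assumption here; the survivor rule for merged PF2/WP2 records should come from the harmonization decisions.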
Target Design
This section defines the technical design of the target structures for this conversion approach.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
Data Cleansing
All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.
If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 001 | Medium | Duplicated Service Descriptions | Remove duplicates based on Service Description + UoM + Service Group | | PF2 / WP2 |
| 002 | Medium | Invalid Service Group, UoM, or status | Validate existence of Service Group, UoM, and active status in target configuration | | PF2 / WP2 |
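Cleansing rule 002 (validate Service Group, UoM, and status against the target configuration) can be sketched like this. The configuration sets are placeholders for the real customizing extracts; the function name is illustrative.

```python
# Sketch of cleansing rule 002: check each record's Service Group and UoM
# against target-system configuration values before load. The valid_* sets
# stand in for real customizing extracts from the target system.

def validate_against_config(rec, valid_service_groups, valid_uoms):
    """Return a list of error messages for one record (empty = clean)."""
    errors = []
    if rec.get("service_group") not in valid_service_groups:
        errors.append("Service Group not in target config")
    if rec.get("uom") not in valid_uoms:
        errors.append("UoM not in target config")
    if rec.get("status") != "active":
        errors.append("Record not active")
    return errors
```

Records with a non-empty error list would be routed to the cleansing report rather than loaded.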
Conversion Process
The high-level process is represented by the diagram below:
Data Privacy and Sensitivity
N/A
Data Merging / Data Consolidation
The data merging is restricted to the Material Types Z720 / ZB20 / ZDIE.
Extraction
Extract data from a source into . There are 2 possibilities:
- The data exists. connects to the source and loads the data into . There are 3 methods:
- Perform full data extraction from relevant tables in the source system(s).
- Perform extraction through the application layer.
- Only if a connection to the source is not possible, data is loaded to the repository from the provided source system extract/report.
- The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in . This is to be conducted using DCT (Data Collection Template) in
The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.
Extraction Run Sheet
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| 001 | Do not extract records where ASMD-LVORM = X (record deleted) | Syniti Team |
| 002 | Do not extract records where ASMD-MATKL (Material Group) is set as out of scope | Syniti Team |
| 003 | Do not extract records where ASMD-MLANG (Main Language) is set as out of scope | Syniti Team |
| 004 | Do not extract records where ASMD-BKLAS (Valuation Class) is set as out of scope | Syniti Team |
| 005 | Service Group (ASMD-ASBGR) must exist in customizing table T023 | Syniti Team |
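The extraction run sheet rules 001–005 amount to a row-level filter over ASMD. A sketch, where the `out_of_scope_*` sets and `t023_groups` stand in for the project's maintained scope lists and a T023 extract:

```python
# Extraction filter implementing run-sheet rules 001-005 for ASMD-like rows.
# The scope sets are placeholders for the maintained out-of-scope value lists;
# t023_groups stands in for an extract of customizing table T023.

def extract_filter(row, out_of_scope_matkl, out_of_scope_mlang,
                   out_of_scope_bklas, t023_groups):
    if row.get("LVORM") == "X":                  # 001: record flagged for deletion
        return False
    if row.get("MATKL") in out_of_scope_matkl:   # 002: material group out of scope
        return False
    if row.get("MLANG") in out_of_scope_mlang:   # 003: main language out of scope
        return False
    if row.get("BKLAS") in out_of_scope_bklas:   # 004: valuation class out of scope
        return False
    if row.get("ASBGR") not in t023_groups:      # 005: service group must exist in T023
        return False
    return True
```

Rule 005 is a hard dependency: the T023 extract must be refreshed before each extraction run or valid records may be dropped.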
Selection Screen
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
Data Collection Template (DCT)
A target-ready Data Collection Template will be created for the data, with the exception of some fields which require transformation as described in the transformation rules.
DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
Extraction Dependencies
| Item # | Step Description | Team Responsible |
|---|---|---|
Transformation
The target fields are mapped to the applicable legacy field that will be its source; this is a 3-way activity involving the Business, Functional team, and Data team. This identifies the transformation activity required to make the data target-ready:
- Perform value mapping and data transformation rules.
- Legacy values are mapped to the to-be values (this could include a default value)
- Values are transformed according to the rules defined in
- Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform Pre-Load Data Validation.
Transformation Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
| 001 | PF2 & WP2 | ASMD | MATKL | Material Group | S4HANA | ASMD | MATKL | Material Group | Create a transformation rule for MATKL by mapping source values to target equivalents using Syniti Migrate. |
| 002 | PF2 & WP2 | ASMD | MLANG | Language | S4HANA | ASMD | MLANG | Language | Create a transformation rule for MLANG by mapping source values to target equivalents using Syniti Migrate. |
| 003 | PF2 & WP2 | ASMD | BKLAS | Valuation Class | S4HANA | ASMD | BKLAS | Valuation Class | Create a transformation rule for BKLAS by mapping source values to target equivalents using Syniti Migrate. |
| 004 | PF2 & WP2 | ASMD | ASBGR | Service Group | S4HANA | ASMD | ASBGR | Service Group | Ensure the Service Group (ASMD-ASBGR) exists in the customizing table T023 before migration. |
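The value-mapping rules 001–003 follow one pattern: translate a legacy field value to its target equivalent via a mapping table, and report (rather than default) anything unmapped. A sketch with purely illustrative mapping pairs — the real mappings are maintained in Syniti Migrate:

```python
# Value-mapping sketch for transformation rules 001-003. The legacy→target
# pairs below are invented examples; real mappings live in Syniti Migrate.

FIELD_MAPS = {
    "MATKL": {"0070000": "L001"},   # Material Group mapping (example pair)
    "MLANG": {"E": "EN"},           # Language key mapping (example pair)
    "BKLAS": {"3200": "3210"},      # Valuation Class mapping (example pair)
}

def transform(row):
    """Return (transformed_row, issues); unmapped values are reported, not defaulted."""
    out, issues = dict(row), []
    for field, mapping in FIELD_MAPS.items():
        src = row.get(field)
        if src in mapping:
            out[field] = mapping[src]
        elif src is not None:
            issues.append(f"{field}: unmapped value '{src}'")
    return out, issues
```

Reporting unmapped values back to the mapping workshop, instead of silently defaulting, keeps the 3-way mapping activity as the single source of truth.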
Transformation Mapping
| Mapping Table Name | Mapping Table Description |
|---|---|
Transformation Dependencies
List the steps that need to occur before transformation can commence.
| Item # | Step Description | Team Responsible |
|---|---|---|
Pre-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Mandatory Fields | Check mandatory fields for completeness |
| Field Format | Validate field formats (text length, data types) |
| Service Group and UOM | Confirm Service Group and UoM exist in target system |
| Duplicated Records | Check for duplicates before load |
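The four pre-load checks in the table above can be run in one pass over the load file. A sketch, where the mandatory field list and maximum lengths are assumptions for illustration, not confirmed S/4 field definitions:

```python
# One-pass pre-load validation: mandatory fields, field length, duplicates.
# MANDATORY and FIELD_LENGTHS are illustrative assumptions, not real limits.

MANDATORY = ("short_text", "service_group", "uom")
FIELD_LENGTHS = {"short_text": 40, "service_group": 9, "uom": 3}

def preload_validate(records):
    """Return a list of (record_index, issue_text) pairs."""
    issues, seen = [], set()
    for i, rec in enumerate(records):
        for f in MANDATORY:                       # completeness of mandatory fields
            if not rec.get(f):
                issues.append((i, f"missing mandatory field {f}"))
        for f, max_len in FIELD_LENGTHS.items():  # field format / length check
            if rec.get(f) and len(str(rec[f])) > max_len:
                issues.append((i, f"{f} exceeds {max_len} characters"))
        key = (rec.get("short_text"), rec.get("uom"), rec.get("service_group"))
        if key in seen:                           # duplicate check before load
            issues.append((i, "duplicate record"))
        seen.add(key)
    return issues
```

The Service Group / UoM existence check against the target system would plug into the same loop once the configuration extracts are available.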
Accuracy
| Task | Action |
|---|---|
Business
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
Load
The load process includes:
- Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
- Once the data is loaded to the target system, it will be extracted and prepared for Post-Load Data Validation.
The loading will be executed using LSMW with the BAPI BAPI_SERVICE_CREATE, as there is no standard object available in the SAP Migration Cockpit for Service Master data.
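Since LSMW consumes a flat file matched to its defined source structure, the load step reduces to writing the staged records in that structure. The column list below is purely illustrative; the real layout must match the source structure defined in the LSMW object for BAPI_SERVICE_CREATE.

```python
# Minimal sketch of producing a flat load file for LSMW. The column order is
# an assumption for illustration and must be replaced by the actual LSMW
# source structure defined for the BAPI_SERVICE_CREATE object.

import csv
import io

COLUMNS = ["ASNUM", "MATKL", "MEINS", "BKLAS", "ASKTX"]  # assumed layout

def write_load_file(records) -> str:
    """Render staged records as a tab-delimited file body for LSMW import."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t")
    for rec in records:
        writer.writerow([rec.get(c, "") for c in COLUMNS])
    return buf.getvalue()
```

Tab-delimited text is a common LSMW import choice; the actual delimiter and field order are confirmed when the LSMW object is recorded.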
Load Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
Load Phase and Dependencies
Configuration
| Item # | Configuration Item |
|---|---|
Conversion Objects
| Object # | Preceding Object Conversion Approach |
|---|---|
Error Handling
| Error Type | Error Description | Action Taken |
|---|---|---|
Post-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Record Count | Check record count against DCT and load file. |
| Sample Verification | Verify random samples in S/4 using transaction AC03. |
| Search Helps | Ensure data appears in standard search helps (e.g., ML81N). |
| Error Logging | Log any mismatches or errors for remediation. |
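The record-count check above is a set reconciliation between the load file keys and an extract of the loaded services. A sketch, where the key lists are placeholders for whatever key field the project reconciles on (e.g. the service number):

```python
# Post-load completeness sketch: reconcile keys between the load file and a
# target-system extract, and surface mismatches for remediation. The key
# values are placeholders for the project's reconciliation key.

def reconcile(load_keys, target_keys):
    load_set, target_set = set(load_keys), set(target_keys)
    return {
        "loaded": len(target_set & load_set),                  # confirmed in target
        "missing_in_target": sorted(load_set - target_set),    # failed/skipped loads
        "unexpected_in_target": sorted(target_set - load_set), # not from this load
    }
```

Anything in `missing_in_target` or `unexpected_in_target` would be logged for remediation per the table above.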
Accuracy
| Task | Action |
|---|---|
Business
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
Key Assumptions
- The Master Data Standard is up to date as of the date of documenting this conversion approach and the data load.
- is in scope based on the data design and any exception requested by the business.
See also
Change log
Workflow history
| Title | Last Updated By | Updated | Status | |
|---|---|---|---|---|
