Status

Owner
Stakeholders

The business stakeholders involved in making, reviewing, and endorsing this decision.

Purpose

The purpose of this document is to define the conversion approach to create Service Master records in S/4HANA. It follows the Procurement Master Data Design Standard and is based on data being migrated from legacy SAP ECC systems (PF2, WP2) into SAP S/4HANA.



---CHECKLIST---

  • Logical Source and Target Systems are identified.
  • Processing Type is specified (e.g. Direct Input, BAPI, IDoc, Web Service, OData API).
  • The standard or custom load program/BAPI/IDoc name is specified to support this functionality.
  • Data Mapping is provided for mappings that are not obvious to a suitably skilled and experienced developer.
  • All translation requirements are clearly defined (including data validation rules, data derivation/calculation, and default values, if applicable).
  • Input and/or Output file layouts are provided for all possible record types.
  • Transaction volume is specified.
  • Execution frequency is specified.
  • Restart/Recovery requirements have been defined.
  • Error handling requirements are specified, including alerting requirements and the expected action on failure.
  • Application log requirements, if applicable, are specified.



--- Additional topics to check---

  • Update the Data Volume with actual figures
  • Rules for Data Merging (all systems)
  • Mapping Rules & Conversion Rules



Conversion Scope

Current Usage and Setup in Legacy Systems:

In the legacy SAP ECC systems (PF2 and WP2), Service Master records are maintained primarily to support procurement processes involving services. However, the level of data quality and consistency varies across systems. In some cases, key fields such as Service Descriptions, Units of Measure, or Valuation Classes may be incomplete or non-standardized. Additionally, the linkage between services and associated purchasing conditions is often loosely maintained or inconsistently applied.

Target Representation in SAP S/4HANA:

In S/4HANA, Service Master data will be created in alignment with the Procurement Master Data Design Standard. The objective is to ensure that services are structured, clean, and standardized across the entire enterprise. This includes enforcing harmonized naming conventions, classification, and valuation data to support downstream procurement and analytics processes. Where applicable, service hierarchies and pricing conditions will also be defined to enable integrated procurement planning and execution.

Additional Relevant Information:

  • Service Master records in S/4HANA will be loaded using the SAP Migration Cockpit, based on validated and cleansed legacy extracts.

  • Redundant or obsolete service entries will be excluded as part of the cleansing process.

  • Each service entry will be reviewed and approved by the business before final load.

  • The mapping of legacy keys to S/4-compatible structures (e.g., Service Type, Material Group) will be maintained for traceability.


The data from legacy system includes:

  1. Service records marked as relevant for purchasing processes

  2. Entries with valid service numbers and active status

  3. Records linked to active cost centers or purchasing groups

  4. Service descriptions maintained in English

The data from legacy system excludes:

  1. Obsolete or blocked service records

  2. Entries with missing mandatory fields (e.g., Unit of Measure, Valuation Class)

  3. Test or dummy records (e.g., codes starting with “ZZZ”)

  4. Services not linked to any purchasing documents or usage history

  5. Duplicate entries identified through harmonization process
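The inclusion and exclusion criteria above can be combined into a single relevancy filter. The following Python sketch is purely illustrative: field names such as `purchasing_relevant`, `has_usage_history`, and `description_lang` are hypothetical stand-ins for the actual extract columns (the real fields come from ASMD/ASMDT), and duplicate handling is covered separately by the cleansing rules.

```python
def is_in_scope(record: dict) -> bool:
    """Apply the in-scope/out-of-scope criteria to one legacy record.
    All field names are assumed placeholders for the real extract columns."""
    # Exclusions
    if record.get("blocked") or record.get("obsolete"):
        return False  # obsolete or blocked service records
    if not record.get("unit_of_measure") or not record.get("valuation_class"):
        return False  # missing mandatory fields (UoM, Valuation Class)
    if str(record.get("service_no", "")).startswith("ZZZ"):
        return False  # test/dummy records
    if not record.get("has_usage_history"):
        return False  # no linked purchasing documents or usage history
    # Inclusions
    return (
        record.get("purchasing_relevant", False)  # relevant for purchasing
        and record.get("active", False)           # valid number, active status
        and record.get("description_lang") == "EN"  # English description
    )
```

In practice such a filter would be expressed as selection criteria and rules inside Syniti Migrate rather than standalone code.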


List of source systems and approximate number of records

Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records
PF2 | The Service Master records will be extracted/collected via DCT. An initial extract of the relevant data will be provided in Google Sheets format to assist the business in deciding which relevant data from the source systems to include. Any additional data that needs to be created to support the new design may be added in the DCT. A data review and standardization will be done across all records in the DCT. | 38,590 | S4H | 35,000
WP2 | | 882 | S4H | 880

Additional Information

Multi-language Requirement

Texts in multiple languages can be maintained accordingly, provided they are available in the source system. A custom language in the source system can be migrated only if the same language exists in the target system; otherwise the record is skipped.

International Version | Description
BG | Bulgarian
CS | Czech
DE | German
EL | Greek
EN | English
ES | Spanish
FR | French
IT | Italian
JA | Japanese
KO | Korean
NL | Dutch (Netherlands)
PL | Polish
PT | Portuguese
SK | Slovak
ZH | Chinese
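The skip rule above can be sketched as a simple filter over text records. This is an illustrative Python sketch, not the actual migration logic; the `spras` key and the record shape are assumptions, and the target language set is taken from the table above.

```python
# Language keys available in the target system, per the table above.
TARGET_LANGUAGES = {
    "BG", "CS", "DE", "EL", "EN", "ES", "FR", "IT",
    "JA", "KO", "NL", "PL", "PT", "SK", "ZH",
}

def migratable_texts(texts):
    """Keep only text records whose language key exists in the target
    system; custom source languages without a target match are skipped."""
    return [t for t in texts if t["spras"] in TARGET_LANGUAGES]
```

A record with a custom language key (e.g. `Z1`) would simply be dropped rather than raising an error, mirroring the "otherwise skip the record" rule.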

Document Management

Summarize Document Management requirement, if any

Legal Requirement

Minor effort is foreseen for some countries with specific taxation requirements; the topic is limited to mapping the tax codes from old to new.

Special Requirements

Processing Type ETL: 

  • Extraction (E): Data will be extracted from at least two source systems. This activity will be performed by Syniti.

  • Transformation (T): Syniti will be responsible for cleansing, transforming, and merging the data, as well as updating the staging areas of the SAP Migration Cockpit.

  • Loading (L): The final data load into the target SAP system will be carried out using the standard functionalities of the SAP Migration Cockpit.





Target Design

With Functional input, document the technical design of the target fields that are in the scope of this document.

The technical design of the target for this conversion approach.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
ASMD | MANDT | MANDT | Client | CLNT | 3 | Required
ASMD | SRV_INTERNAL | ASNUM | Service Number | CHAR | 10 | Required
ASMD | LOEKZ | LOEKZ | Deletion Indicator | CHAR | 1 | Optional
ASMD | KTGRM | KTGRM | Account Assignment Group | CHAR | 2 | Optional
ASMD | MEINS | MEINS | Base Unit of Measure | UNIT | 3 | Required
ASMD | BKLAS | BKLAS | Valuation Class | CHAR | 4 | Optional
ASMD | PEINH | PEINH | Price Unit | DEC | 5 | Optional
ASMD | WAERS | WAERS | Currency Key | CUKY | 5 | Optional







Table | Field | Data Element | Field Description | Data Type | Length | Requirement
ASMDT | MANDT | MANDT | Client | CLNT | 3 | Required
ASMDT | SRV_INTERNAL | ASNUM | Service Number | CHAR | 10 | Required
ASMDT | SPRAS | SPRAS | Language Key | LANG | 1 | Required
ASMDT | KTEXT | ASMD_KTEXT | Short Text | CHAR | 40 | Required
ASMDT | MAKTX | MAKTX | Service Description | CHAR | 40 | Optional


Data Cleansing

All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.


If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.


ID | Criticality | Error Message/Report Description | Rule | Output | Source System
001 | Medium | Remove Duplicated Service Descriptions | Remove duplicates based on Service Description + UoM + Service Group | | PF2 / WP2
002 | Medium | Validation of Service Group, UoM, and active status in target config | Validate existence of Service Group, UoM, and active status in target configuration | | PF2 / WP2
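Cleansing rule 001 (remove duplicates based on Service Description + UoM + Service Group) can be sketched as follows. This is an illustrative Python sketch under assumed column names (`description`, `uom`, `service_group`); the actual cleansing is performed in Syniti Migrate.

```python
def remove_duplicates(records):
    """Cleansing rule 001: keep the first record per
    (Service Description, UoM, Service Group) key.
    Descriptions are compared case-insensitively and trimmed,
    an assumed normalization for the harmonization process."""
    seen = set()
    kept = []
    for r in records:
        key = (r["description"].strip().upper(), r["uom"], r["service_group"])
        if key not in seen:
            seen.add(key)
            kept.append(r)
    return kept
```

Keeping the first occurrence is one possible survivorship rule; the actual rule (e.g. newest record wins, or PF2 takes priority over WP2) must be agreed with the business.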



Conversion Process

The high-level process is represented by the diagram below:

image-2025-6-20_9-51-38.png



Data Privacy and Sensitivity

N/A


Extraction

Extract data from a source into Syniti Migrate. There are two possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are three methods:
    1. Perform full data extraction from the relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, data is loaded to the repository from a provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template).

The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.

Extraction Run Sheet

Req # | Requirement Description | Team Responsible
001 | Do not extract records where ASMD-LVORM = X (record deleted) | Syniti Team
002 | Do not extract records where ASMD-MATKL (Material Group) is set as out of scope | Syniti Team
003 | Do not extract records where ASMD-MLANG (Main Language) is set as out of scope | Syniti Team
004 | Do not extract records where ASMD-BKLAS (Valuation Class) is set as out of scope | Syniti Team
005 | Service Group (ASMD-ASBGR) must exist in customizing table T023 | Syniti Team
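Requirements 001–005 above amount to one record-level extraction filter. This Python sketch is illustrative only (the real filter is implemented as extraction rules in Syniti Migrate); the out-of-scope value sets and the T023 service group list are assumed inputs agreed during design.

```python
def extract_filter(rec, out_of_scope_matkl, out_of_scope_mlang,
                   out_of_scope_bklas, t023_service_groups):
    """Apply extraction requirements 001-005 to one ASMD record.
    `rec` is a dict keyed by ASMD field name; the scope sets are
    assumed to be provided from the design decisions and T023."""
    if rec.get("LVORM") == "X":                   # 001: deleted records
        return False
    if rec.get("MATKL") in out_of_scope_matkl:    # 002: material group out of scope
        return False
    if rec.get("MLANG") in out_of_scope_mlang:    # 003: main language out of scope
        return False
    if rec.get("BKLAS") in out_of_scope_bklas:    # 004: valuation class out of scope
        return False
    return rec.get("ASBGR") in t023_service_groups  # 005: service group must exist in T023
```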


Selection Screen

If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set





















Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the data object, with the exception of some fields that require transformation, as described in the transformation rules.

<Object> DCT Rules

Field Name | Field Description | Rule












Extraction Dependencies

List the steps that need to occur before extraction can commence

Item # | Step Description | Team Responsible













Transformation

The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, the Functional team, and the Data team. It identifies the transformation activity required to allow Syniti Migrate to make the data target ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value).
    2. Values are transformed according to the rules defined in Syniti Migrate.
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform pre-load data validation.
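Step 1.1 (value mapping with an optional default) can be sketched as a lookup against a mapping table. The Python below is illustrative only: the mapping entries and the default value are invented examples, as the real value mappings are maintained inside Syniti Migrate.

```python
# Hypothetical legacy-to-target value map for MATKL (Material Group).
# Real mappings live in Syniti Migrate mapping tables.
MATKL_MAP = {"007": "SRV01", "008": "SRV02"}

def map_matkl(rec, default="SRV99"):
    """Map the legacy MATKL value to its target equivalent,
    falling back to an agreed default when no mapping exists."""
    out = dict(rec)  # do not mutate the source record
    out["MATKL"] = MATKL_MAP.get(rec["MATKL"], default)
    return out
```

The same pattern applies to MLANG and BKLAS; unmapped values may alternatively be reported as errors rather than defaulted, depending on the agreed rule.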

Transformation Run Sheet

Item # | Step Description | Team Responsible













Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic
001 | PF2 & WP2 | ASMD | MATKL | Material Group | S4HANA | ASMD | MATKL | Material Group | Create a transformation rule for MATKL by mapping source values to target equivalents using Syniti Migrate.
002 | PF2 & WP2 | ASMD | MLANG | Language | S4HANA | ASMD | MLANG | Language | Create a transformation rule for MLANG by mapping source values to target equivalents using Syniti Migrate.
003 | PF2 & WP2 | ASMD | BKLAS | Valuation Class | S4HANA | ASMD | BKLAS | Valuation Class | Create a transformation rule for BKLAS by mapping source values to target equivalents using Syniti Migrate.
004 | PF2 & WP2 | ASMD | ASBGR | Service Group | S4HANA | ASMD | ASBGR | Service Group | Ensure the Service Group (ASMD-ASBGR) exists in customizing table T023 before migration.


Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
Mapping Table Name | Mapping Table Description








Transformation Dependencies

List the steps that need to occur before transformation can commence
Item # | Step Description | Team Responsible













Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

Task | Action
Mandatory Fields | Check mandatory fields for completeness
Field Format | Validate field formats (text length, data types)
Service Group and UoM | Confirm Service Group and UoM exist in the target system
Duplicated Records | Check for duplicates before load
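The mandatory-field check can be sketched as a pre-load sweep that reports every gap rather than stopping at the first one. This Python sketch assumes the required fields from the Target Design (SRV_INTERNAL and MEINS in ASMD; MANDT is system-supplied) and dict-shaped staging records; the actual validation runs in Syniti Migrate or the Migration Cockpit staging checks.

```python
# Required ASMD fields per the Target Design table (MANDT excluded,
# as the client is supplied by the system at load time).
MANDATORY = ("SRV_INTERNAL", "MEINS")

def missing_mandatory(records):
    """Return (service number, missing field) pairs so every gap in a
    staging extract is reported, not just the first one found."""
    issues = []
    for r in records:
        for field in MANDATORY:
            if not r.get(field):
                issues.append((r.get("SRV_INTERNAL", "<new>"), field))
    return issues
```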

Accuracy

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used>





Business

The following pre-load validations will be performed by the business.

Completeness

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used>





Accuracy

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used>





Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
  2. Once the data is loaded to the target system, it will be extracted and prepared for post-load data validation.


The loading will be executed using LSMW with the BAPI BAPI_SERVICE_CREATE, as there is no standard object available in the SAP Migration Cockpit for Service Master data.

Load Run Sheet

Item # | Step Description | Team Responsible













Load Phase and Dependencies

Identify the phase as to “when” the load for this object will occur. <Pre-Cutover, Cutover, Post Cutover> and list the steps that need to occur before the load can commence

Configuration

List the Configurations required before loading can commence

Item # | Configuration Item






Conversion Objects

Object # | Preceding Object Conversion Approach






Error Handling

The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.

Error Type | Error Description | Action Taken










Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action
Record Count | Check record count against DCT and load file.
Sample Verification | Verify random samples in S/4 using transaction AC03.
Search Helps | Ensure data appears in standard search helps (e.g., ML81N).
Error Log | Log any mismatches or errors for remediation.
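The record-count check above is a simple reconciliation between what was staged for load and what is actually found in S/4 afterwards. This Python sketch is illustrative; the counts would come from the load file and an S/4 extract (e.g. a count on ASMD), and any nonzero delta is logged as a defect.

```python
def reconcile_counts(expected_count, loaded_count):
    """Compare the load-file record count against the records found in
    the target after load; a nonzero delta must be logged for remediation."""
    delta = expected_count - loaded_count
    return {
        "expected": expected_count,
        "loaded": loaded_count,
        "delta": delta,
        "ok": delta == 0,
    }
```

For example, reconciling the planned PF2 volume (35,000 target records) against the post-load count would immediately surface any records rejected during load.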

Accuracy

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used>







Business

The following post-load validations will be performed by the business.

Completeness

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used>





Accuracy

Task | Action
<title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used>





Key Assumptions

  • The Master Data Standard is up to date as of the date of documenting this conversion approach and data load.
  • The data object is in scope based on the data design and any exceptions requested by the business.

Any additional key assumptions.


See also

Insert links and references to other documents that are relevant to understanding this decision and its implications. Other decisions are often impacted, so it is good to list them here with links. Attachments are also possible but risky, as they are static documents and are not updated by their authors.

Change log

Workflow history