Status

Owner
Stakeholders

Purpose

The purpose of this document is to define the conversion approach for creating object 1057 - QM Master Inspection Characteristics (MICs) in SAP S/4HANA.

Master Inspection Characteristics are a fundamental element of SAP Quality Management (QM) used to define the parameters, specifications, and methods for quality inspections. MICs provide a standardized definition of what is to be inspected and how it should be measured, ensuring uniformity and consistency across inspection plans and quality processes. MICs can be maintained as Quantitative (numeric specifications such as measurement ranges and tolerances) or Qualitative (descriptive specifications such as defect classes or codes).

In SAP S/4HANA, the structure and usage of MICs remain consistent with SAP ECC, typically defined by key combinations such as plant, characteristic name, and characteristic type. MICs may include additional settings such as selected sets for qualitative characteristics, catalog assignments, default inspection methods, target values, upper and lower specification limits, and sampling procedures.

In SAP ECC, aside from the standard MIC structures, there may be additional configurations, such as plant-independent characteristics, characteristics linked to custom catalogs, or MICs with special control indicators and custom fields. Certain legacy systems may also include MICs with obsolete catalog references, inactive units of measure, or unused selected sets (pending MDS review), which must be validated before conversion.

This conversion aims to migrate active and relevant MIC records, together with their associated control indicators, selected sets, catalog assignments, default inspection methods, and specification limits, from the existing ECC systems into S/4HANA. The migration will apply the required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target S/4HANA system using standard SAP mechanisms such as BAPIs (e.g. BAPI_INSPOPER_RECORDRESULTS for recording results against inspection operations, or transaction QS21 for MIC creation), IDocs, or direct table loads where applicable, ensuring data integrity, compliance, and reusability across inspection plans.


Conversion Scope

The scope of this document covers the approach for converting active Master Inspection Characteristics (MICs) from the legacy source systems into S/4HANA, following the MIC Master Data Design Standard.


The data from the legacy systems includes:

  1. Data that is active and has been used within the last three years
  2. Data without a deletion flag
  3. Data relevant to a material/plant combination that is active according to the latest MARC definition

The data from the legacy systems excludes:

  1. Data older than three years
  2. Data flagged for deletion
  3. Data belonging to inspection plans for materials not mapped in the latest version of the MARC table
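The inclusion and exclusion rules above can be sketched as a simple relevancy filter. This is an illustrative Python sketch only: the field names LOEKZ, last_used_date, MATNR, and WERKS, and the shape of the MARC lookup, are assumptions, not the final Syniti rule set.

```python
from datetime import date, timedelta

# Hypothetical cutoff for the "three years or less usage" rule.
CUTOFF = date.today() - timedelta(days=3 * 365)

def is_relevant(mic, active_marc_keys):
    """Apply the inclusion/exclusion rules to one extracted MIC record."""
    if mic["LOEKZ"]:                       # flagged for deletion -> exclude
        return False
    if mic["last_used_date"] < CUTOFF:     # older than three years -> exclude
        return False
    # Material/plant combination must be active per the latest MARC definition.
    return (mic["MATNR"], mic["WERKS"]) in active_marc_keys

marc = {("MAT-001", "1000")}
rec = {"LOEKZ": "", "last_used_date": date.today(),
       "MATNR": "MAT-001", "WERKS": "1000"}
print(is_relevant(rec, marc))  # True for this sample record
```

The actual relevancy criteria are implemented and agreed in Syniti Migrate; this sketch only mirrors the three rules listed above.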


List of source systems and approximate number of records

Source System | Source Approx. No. of Records | Target System | Target Approx. No. of Records
PF2 | | S/4HANA |
WP2 | | S/4HANA |










Additional Information

Multi-language Requirement


Document Management

Not applicable

Legal Requirement

Not applicable

Special Requirements

Not applicable


Target Design

With Functional input, this section documents the technical design of the target fields that are in scope for this conversion approach.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
QPMK | VERWMERKM | VERWMERKM | Master Inspection Characteristic (MIC) | CHAR | 18 | Mandatory
QPMK | VERSION | VERSION | MIC Version | NUMC | 2 | Mandatory
QPMK | WERKS | WERKS_D | Plant (plant-specific MIC) | CHAR | 4 | Mandatory
QPMK | SPRAS | SPRAS | Language Key | LANG | 1 | Mandatory
QPMK | KURZTEXT | KURZTEXT | Short Text (MIC name/description) | CHAR | 40 | Mandatory
QPMK | QUALI | QUALI | Qual/Quant Indicator (blank = qualitative, '1' = quantitative) | CHAR | 1 | Mandatory
QPMK | PRUEFME | MSEHI | Unit of Measure (for quantitative MIC) | UNIT | 3 | Mandatory for quantitative MIC
QPMK | ANZSTELL | ANZSTELL | Decimal Places | NUMC | 2 | Mandatory for quantitative MIC
QPMK | MWERTS | MWERTS | Target Value (default) | DEC | 15 | Optional
QPMK | MWERTU | MWERTU | Lower Spec Limit (default) | DEC | 15 | Optional
QPMK | MWERTO | MWERTO | Upper Spec Limit (default) | DEC | 15 | Optional
QPMK | TOLKZ | TOLKZ | Tolerance Indicator | CHAR | 1 | Optional
QPMK | KATALOGART | KATALOGART | Catalog Type (for qualitative MIC) | CHAR | 3 | Mandatory for qualitative MIC
QPMK | CODEGRP | CODEGRP | Code Group (qualitative default) | CHAR | 8 | Optional
QPMK | AUSWMENGE | AUSWMENGE | Selected Set (qualitative default) | CHAR | 8 | Optional
QPMK | STICHPRVER | STICHPRVER | Sampling Procedure (default) | CHAR | 8 | Optional
QPMK | PRFGEB | PRFGEB | Inspection Method (default) | CHAR | 12 | Optional
QPMK | PRFGVER | PRFGVER | Inspection Method Version | NUMC | 2 | Optional
QPMK | STEUERKZ | STEUERKZ | Control Indicators (packed flags) | CHAR | 30 | Optional
QPMK | ERSTELLER | ERNAM | Created By | CHAR | 12 | Optional
QPMK | DATUMERST | ERDAT | Created On | DATS | 8 | Optional
QPMK | AENDERER | AENAM | Last Changed By | CHAR | 12 | Optional
QPMK | DATUMAEND | AEDAT | Last Changed On | DATS | 8 | Optional
QPMK | LOEKZ | LOEKZ | Deletion Flag (MIC) | CHAR | 1 | Optional (leave blank for active)
QPMT | VERWMERKM | VERWMERKM | MIC (text object link) | CHAR | 18 | Mandatory
QPMT | VERSION | VERSION | MIC Version | NUMC | 2 | Mandatory
QPMT | SPRAS | SPRAS | Language | LANG | 1 | Mandatory
QPMT | KURZTEXT | KURZTEXT | Short Text (language-dependent) | CHAR | 40 | Optional
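A minimal pre-load structural check for the target design can be sketched as follows. This illustrative Python follows the Requirement column above (unit of measure and decimal places required only for quantitative MICs, catalog type only for qualitative ones); the dictionary keys are the QPMK field names listed in the table, and the record shape is an assumption about the Syniti staging layout.

```python
# Fields mandatory for every MIC regardless of type.
ALWAYS_MANDATORY = ["VERWMERKM", "VERSION", "WERKS", "SPRAS", "KURZTEXT"]

def missing_fields(rec):
    """Return the mandatory QPMK fields that are empty in a target-ready record.

    QUALI itself is not checked for emptiness, because blank is a valid
    value (blank = qualitative, '1' = quantitative)."""
    required = list(ALWAYS_MANDATORY)
    if rec.get("QUALI") == "1":            # quantitative MIC
        required += ["PRUEFME", "ANZSTELL"]
    else:                                  # qualitative MIC
        required += ["KATALOGART"]
    return [f for f in required if not rec.get(f)]

rec = {"VERWMERKM": "HARDNESS", "VERSION": "01", "WERKS": "1000",
       "SPRAS": "E", "KURZTEXT": "Hardness check", "QUALI": "1",
       "PRUEFME": "HRC", "ANZSTELL": "01"}
print(missing_fields(rec))  # [] -> record passes the structural check
```

A record failing this check would be reported in the pre-load validation rather than sent to the load.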


Data Cleansing

All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.

If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.

ID | Criticality | Error Message/Report Description | Rule | Output | Source System


























Conversion Process

The high-level process is represented by the diagram below:

The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:

1. Extraction
The process begins with the periodic extraction of metadata and raw data from the source systems, i.e. the Syensqo ECC systems (WP2/PF2). The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
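As an illustration of the standardization step, the following sketch normalizes a legacy date and unit format into the target conventions. The legacy formats shown (DD.MM.YYYY dates, mixed-case units) are assumptions for illustration; the actual rules are defined in Syniti Migrate.

```python
from datetime import datetime

def standardize(rec):
    """Normalize assumed legacy formats into the target conventions."""
    out = dict(rec)
    # Dates: legacy DD.MM.YYYY -> S/4HANA internal YYYYMMDD (DATS format)
    out["ERDAT"] = datetime.strptime(rec["ERDAT"], "%d.%m.%Y").strftime("%Y%m%d")
    # Units and text keys: trim whitespace and upper-case
    out["PRUEFME"] = rec["PRUEFME"].strip().upper()
    return out

print(standardize({"ERDAT": "10.06.2025", "PRUEFME": " mm "}))
# {'ERDAT': '20250610', 'PRUEFME': 'MM'}
```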


3. Loading
The transformed data is then loaded into the target S/4HANA system. 

[Diagram: ETL process overview (image-2025-6-10_17-0-33.png)]

Data Privacy and Sensitivity

Not applicable


Extraction

Extract data from a source into Syniti Migrate. There are two possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are three methods:
    1. Perform a full data extraction from the relevant tables in the source system(s).
    2. Perform the extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, load the data into the repository from a provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template).

The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.

Extraction Run Sheet

Req # | Requirement Description | Team Responsible
1 | Extraction Scope Definition: identify the source systems and databases involved; define the data objects (tables, fields, records) to be extracted; establish business rules for data selection. | Syniti / LTC Data team
2 | Extraction Methodology: specify the extraction approach (full, incremental, or delta extraction); determine the tools and technologies used; define data filtering criteria to exclude irrelevant records. | Syniti
3 | Extraction Execution Plan: establish execution timelines and batch processing schedules; assign responsibilities for extraction monitoring; document dependencies on other migration tasks. | Syniti
4 | Data Quality and Validation: define error handling mechanisms for extraction failures. | Syniti


Selection Screen

If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set





















Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the Data Object data, with the exception of some fields that require transformation as specified in the transformation rules.

MIC DCT Rules

Field Name | Field Description | Rule


Pending MDS review and confirmation of record counts in the source systems.









Extraction Dependencies

List the steps that need to occur before extraction can commence

Item # | Step Description | Team Responsible
1 | Source System Availability: ensure that the source database or application is accessible; confirm that necessary credentials and permissions are granted. | Syensqo IT
2 | Data Structure: identify relationships between tables, views, and stored procedures. | Syniti
3 | Referential Integrity: ensure dependent records are extracted together. | Syniti
4 | Extraction Methodology: define whether extraction is full, incremental, or delta-based; establish batch processing schedules for large datasets. | Syniti
5 | Performance and Scalability Considerations: optimize extraction queries to prevent system overload; ensure network bandwidth supports data transfer volumes. | Syniti
6 | Security and Compliance: adhere to regulatory standards for sensitive information, if applicable. | Syniti


Transformation

The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, the Functional team, and the Data team, and it identifies the transformation activity required to allow Syniti Migrate to make the data target ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value).
    2. Values are transformed according to the rules defined in Syniti Migrate.
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform pre-load data validation.
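The value-mapping step with an optional default, as described above, can be sketched as follows. This is illustrative Python: the mapping table PLANT_MAP and its values are placeholders, not the actual Syniti mapping tables referenced in the Transformation Mapping section.

```python
# Hypothetical mapping table: legacy plant code -> S/4HANA plant code.
PLANT_MAP = {"0001": "1000", "0002": "1100"}

def map_value(value, mapping, default=None):
    """Map a legacy value to its to-be value, falling back to a default.

    An unmapped value with no default is an error, to be logged and
    resolved rather than silently passed through."""
    if value in mapping:
        return mapping[value]
    if default is not None:
        return default
    raise KeyError(f"No mapping or default for legacy value {value!r}")

print(map_value("0001", PLANT_MAP))          # 1000
print(map_value("0099", PLANT_MAP, "1000"))  # 1000 (default applied)
```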

Transformation Run Sheet

Item # | Step Description | Team Responsible













Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic









































Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
Mapping Table Name | Mapping Table Description








Transformation Dependencies

List the steps that need to occur before transformation can commence
Item # | Step Description | Team Responsible













Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

Task | Action
<title> | Specific details of what and how the task needs to be performed, e.g. which reports are being used





Accuracy

Task | Action
<title> | Specific details of what and how the task needs to be performed, e.g. which reports are being used





Business

The following pre-load validations will be performed by the business.

Completeness

Task | Action
<title> | Specific details of what and how the task needs to be performed, e.g. which reports are being used





Accuracy

Task | Action
<title> | Specific details of what and how the task needs to be performed, e.g. which reports are being used





Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
  2. Once the data is loaded into the target system, it will be extracted and prepared for post-load data validation.
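The post-load completeness check, once the loaded data has been re-extracted, can be sketched as a key-set reconciliation. This is an illustrative Python sketch; the key tuple (MIC name, version, plant) follows the QPMK key fields in the target design, and the sample values are hypothetical.

```python
def reconcile(source_keys, target_keys):
    """Compare MIC keys sent to the load against keys found in the target.

    Returns the records that failed to load and any unexpected extras."""
    missing = sorted(source_keys - target_keys)  # sent but not loaded
    extra = sorted(target_keys - source_keys)    # loaded but not sent
    return missing, extra

sent = {("HARDNESS", "01", "1000"), ("VISUAL", "01", "1000")}
loaded = {("HARDNESS", "01", "1000")}
missing, extra = reconcile(sent, loaded)
print(missing)  # [('VISUAL', '01', '1000')] -> one record to log as a defect
```

Any key reported as missing would be logged as a defect and managed within the Defect Management process.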

Load Run Sheet

Item # | Step Description | Team Responsible













Load Phase and Dependencies

Identify the phase in which the load for this object will occur <Pre-Cutover, Cutover, Post-Cutover> and list the steps that need to occur before the load can commence.

Configuration

List the Configurations required before loading can commence

Item # | Configuration Item






Conversion Objects

Object # | Preceding Object Conversion Approach

List the exact title of the conversion object of only the immediate predecessor; this will then confirm the DDD (Data Dependency Diagram).




Error Handling

The table below depicts some possible system errors for this data object during the data load. All data load errors are to be logged as defects and managed within the Defect Management process.

Error Type | Error Description | Action Taken










Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action
<title> | Specific details of what and how the task needs to be performed, e.g. which reports are being used





Accuracy

Task | Action
<title> | Specific details of what and how the task needs to be performed, e.g. which reports are being used





Business

The following post-load validations will be performed by the business.

Completeness

Task | Action
<title> | Specific details of what and how the task needs to be performed, e.g. which reports are being used





Accuracy

Task | Action
<title> | Specific details of what and how the task needs to be performed, e.g. which reports are being used





Key Assumptions

  • The Master Data Standard is up to date as of the date of documenting this conversion approach and the data load.
  • The Data Object is in scope based on the data design and any exceptions requested by the business.

Any additional key assumptions.


See also

Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.

Change log

Workflow history