| Status | |
|---|---|
| Owner | |
| Stakeholders | |
The purpose of this document is to define the conversion approach for creating 1057 - QM Master Inspection Characteristics in SAP S/4HANA.
Master Inspection Characteristics (MICs) are a fundamental element of SAP Quality Management (QM) used to define the parameters, specifications, and methods for quality inspections. MICs provide a standardized definition of what is to be inspected and how it should be measured, ensuring uniformity and consistency across inspection plans and quality processes. MICs can be maintained as Quantitative (numeric specifications such as measurement ranges and tolerances) or Qualitative (descriptive specifications such as defect classes or codes).
In SAP S/4HANA, the structure and usage of MICs remain consistent with SAP ECC, typically defined by key combinations such as plant, characteristic name, and characteristic type. MICs may include additional settings such as selected sets for qualitative characteristics, catalog assignments, default inspection methods, target values, upper and lower specification limits, and sampling procedures.
In SAP ECC, aside from the standard MIC structures, there may be additional configurations, such as plant-independent characteristics, characteristics linked to custom catalogs, or MICs with special control indicators and custom fields. Certain legacy systems may also include MICs with obsolete catalog references, inactive units of measure, or unused selected sets (pending MDS review), which must be validated before conversion.
This conversion aims to migrate active and relevant MIC records, along with their associated control indicators, selected sets, catalog assignments, default inspection methods, and specification limits, from the existing ECC systems into S/4HANA. The migration will apply the required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target S/4HANA system using standard SAP mechanisms such as batch input on transaction QS21 for MIC creation, IDocs, or direct table loads where applicable, ensuring data integrity, compliance, and reusability across inspection plans.
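The validation of legacy records mentioned above (obsolete catalog references, inactive units of measure) can be sketched as a pre-conversion check. This is a minimal illustration only: the lookup sets and field names are assumptions, not the actual Syniti staging schema or the project's agreed validation rules.

```python
# Hedged sketch: pre-conversion validation of legacy MIC records.
# The lookup sets and dictionary keys below are illustrative assumptions.

ACTIVE_UNITS = {"MM", "KG", "PC"}          # assumed active-unit lookup
ACTIVE_CATALOG_TYPES = {"1", "3", "9"}     # assumed valid catalog types

def validate_mic(mic: dict) -> list[str]:
    """Return a list of validation findings for one legacy MIC record."""
    findings = []
    if mic.get("unit") and mic["unit"] not in ACTIVE_UNITS:
        findings.append("inactive unit of measure")
    if mic.get("catalog_type") and mic["catalog_type"] not in ACTIVE_CATALOG_TYPES:
        findings.append("obsolete catalog reference")
    return findings
```

Records with findings would be routed to MDS review before conversion rather than loaded as-is.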
The scope of this document covers the approach for converting active QM Master Inspection Characteristics from the legacy source systems into S/4HANA, following the QM Master Inspection Characteristic Master Data Design Standard.
The data from the legacy system includes:
The data from the legacy system excludes:
| Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records |
|---|---|---|---|---|
| PF2 | | | S/4HANA | |
| WP2 | | | S/4HANA | |
Not applicable
Not applicable
Not applicable
With Functional input, document the technical design of the target fields that are in the scope of this document.
The table below describes the technical design of the target structure for this conversion approach.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
| QPMK | VERWMERKM | VERWMERKM | Master Inspection Characteristic (MIC) | CHAR | 18 | Mandatory |
| QPMK | VERSION | VERSION | MIC Version | NUMC | 2 | Mandatory |
| QPMK | WERKS | WERKS_D | Plant (plant-specific MIC) | CHAR | 4 | Mandatory |
| QPMK | SPRAS | SPRAS | Language Key | LANG | 1 | Mandatory |
| QPMK | KURZTEXT | KURZTEXT | Short Text (MIC name/description) | CHAR | 40 | Mandatory |
| QPMK | QUALI | QUALI | Qual/Quant Indicator (blank = qualitative, ‘1’ = quantitative) | CHAR | 1 | Mandatory |
| QPMK | PRUEFME | MSEHI | Unit of Measure (for quantitative MIC) | UNIT | 3 | Mandatory for Quantitative MIC |
| QPMK | ANZSTELL | ANZSTELL | Decimal Places | NUMC | 2 | Mandatory for Quantitative MIC |
| QPMK | MWERTS | MWERTS | Target Value (Default) | DEC | 15 | Optional |
| QPMK | MWERTU | MWERTU | Lower Spec Limit (Default) | DEC | 15 | Optional |
| QPMK | MWERTO | MWERTO | Upper Spec Limit (Default) | DEC | 15 | Optional |
| QPMK | TOLKZ | TOLKZ | Tolerance Indicator | CHAR | 1 | Optional |
| QPMK | KATALOGART | KATALOGART | Catalog Type (for qualitative MIC) | CHAR | 3 | Mandatory for Qualitative MIC |
| QPMK | CODEGRP | CODEGRP | Code Group (qualitative default) | CHAR | 8 | Optional |
| QPMK | AUSWMENGE | AUSWMENGE | Selected Set (qualitative default) | CHAR | 8 | Optional |
| QPMK | STICHPRVER | STICHPRVER | Sampling Procedure (default) | CHAR | 8 | Optional |
| QPMK | PRFGEB | PRFGEB | Inspection Method (default) | CHAR | 12 | Optional |
| QPMK | PRFGVER | PRFGVER | Inspection Method Version | NUMC | 2 | Optional |
| QPMK | STEUERKZ | STEUERKZ | Control Indicators (packed flags) | CHAR | 30 | Optional |
| QPMK | ERSTELLER | ERNAM | Created By | CHAR | 12 | Optional |
| QPMK | DATUMERST | ERDAT | Created On | DATS | 8 | Optional |
| QPMK | AENDERER | AENAM | Last Changed By | CHAR | 12 | Optional |
| QPMK | DATUMAEND | AEDAT | Last Changed On | DATS | 8 | Optional |
| QPMK | LOEKZ | LOEKZ | Deletion Flag (MIC) | CHAR | 1 | Optional (leave blank for active) |
| QPMT | VERWMERKM | VERWMERKM | MIC (text object link) | CHAR | 18 | Mandatory |
| QPMT | VERSION | VERSION | MIC Version | NUMC | 2 | Mandatory |
| QPMT | SPRAS | SPRAS | Language | LANG | 1 | Mandatory |
| QPMT | KURZTEXT | KURZTEXT | Short Text (language-dependent) | CHAR | 40 | Optional |
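The Requirement column above can be expressed as a completeness check, distinguishing the always-mandatory fields from those that are conditionally mandatory depending on the Qual/Quant indicator. This is a hedged sketch: the keys mirror the QPMK field names, but the rule implementation in Syniti Migrate may differ.

```python
# Hedged sketch: completeness check derived from the Requirement column.
# QUALI is a flag (blank = qualitative, '1' = quantitative), so it is
# checked for its value rather than for non-emptiness.

ALWAYS_MANDATORY = ["VERWMERKM", "VERSION", "WERKS", "SPRAS", "KURZTEXT"]

def missing_fields(rec: dict) -> list[str]:
    """Return the mandatory fields missing from one target-ready record."""
    missing = [f for f in ALWAYS_MANDATORY if not rec.get(f)]
    if rec.get("QUALI") == "1":
        # Quantitative MIC: unit of measure and decimal places required
        missing += [f for f in ("PRUEFME", "ANZSTELL") if not rec.get(f)]
    else:
        # Blank QUALI = qualitative MIC: catalog type required
        if not rec.get("KATALOGART"):
            missing.append("KATALOGART")
    return missing
```

Records returning a non-empty list would be rejected back to cleansing rather than loaded.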
All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.
If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with the periodic extraction of metadata and raw data from the source systems, i.e. the Syensqo ECC systems (WP2 and PF2). The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system.
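The three ETL stages above can be sketched as a minimal pipeline. The function bodies are placeholders for illustration; in practice Syniti Migrate performs each stage, and the cleansing and relevancy logic shown here is an assumption.

```python
# Hedged sketch of the extract -> transform -> load flow described above.

def extract(source_rows):
    """Stage raw legacy rows (extraction)."""
    return list(source_rows)

def transform(rows):
    """Cleanse, standardize, and apply business rules (transformation)."""
    cleaned = [dict(r, KURZTEXT=r.get("KURZTEXT", "").strip()) for r in rows]
    return [r for r in cleaned if r.get("VERWMERKM")]  # drop rows without a key

def load(rows):
    """Hand target-ready rows to the load mechanism (loading)."""
    return len(rows)  # placeholder: count of rows passed to the loader

staged = extract([{"VERWMERKM": "HARDNESS", "KURZTEXT": " Hardness "},
                  {"KURZTEXT": "orphan row"}])
loaded = load(transform(staged))
```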

Extract data from a source into Syniti Migrate. There are two possibilities:
The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.
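As an illustration, one common relevancy rule is to exclude deletion-flagged records (LOEKZ = 'X'). This rule is an assumption for the sketch below; the actual criteria are those agreed in the relevancy workshop.

```python
# Hedged sketch: applying an assumed relevancy rule to extracted records.

def is_relevant(rec: dict) -> bool:
    """Assumed rule: a MIC is relevant when its deletion flag is not set."""
    return rec.get("LOEKZ", "") != "X"

records = [{"VERWMERKM": "LENGTH", "LOEKZ": ""},
           {"VERWMERKM": "OBSOLETE", "LOEKZ": "X"}]
relevant = [r for r in records if is_relevant(r)]  # keeps only "LENGTH"
```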
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| Extraction Scope Definition | - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti / LTC Data team |
| Extraction Methodology | - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| Extraction Execution Plan | - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| Data Quality and Validation | - Define error handling mechanisms for extraction failures. | Syniti |
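The delta-extraction option named in the Extraction Methodology row can be sketched using the created-on/changed-on dates from the target design (DATUMERST/DATUMAEND). The cutoff handling here is an assumption; the actual methodology is defined per the table above.

```python
# Hedged sketch: delta extraction keyed on the last-changed date.
from datetime import date

def delta(rows, since: date):
    """Keep rows created or changed on/after the cutoff date."""
    out = []
    for r in rows:
        changed = r.get("DATUMAEND") or r.get("DATUMERST")
        if changed and changed >= since:
            out.append(r)
    return out
```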
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
<Object> DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
| Pending MDS and number of records in source system | | |
List the steps that need to occur before extraction can commence
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, Functional team, and Data team, and it identifies the transformation activity required to allow Syniti Migrate to make the data target-ready:
| Item # | Step Description | Team Responsible |
|---|---|---|
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
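One way to express entries of the transformation-rule table in code is as (source field, target field, logic) triples. The field names and logic below are illustrative placeholders, not actual agreed rules.

```python
# Hedged sketch: each rule maps a source field to a target field with a
# transformation callable. Names and logic are illustrative assumptions.

RULES = [
    # (source field, target field, transformation logic)
    ("legacy_char_name", "VERWMERKM", lambda v: v.strip().upper()[:18]),
    ("legacy_plant",     "WERKS",     lambda v: v.zfill(4)),
]

def apply_rules(src: dict) -> dict:
    """Produce a target-ready record by applying each matching rule."""
    return {tgt: fn(src[srcf]) for srcf, tgt, fn in RULES if srcf in src}
```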
| Mapping Table Name | Mapping Table Description |
|---|---|
| Item # | Step Description | Team Responsible |
|---|---|---|
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
The load process includes:
| Item # | Step Description | Team Responsible |
|---|---|---|
Load Phase and Dependencies
Identify the phase in which the load for this object will occur (<Pre-Cutover, Cutover, Post Cutover>) and list the steps that must occur before the load can commence.
List the Configurations required before loading can commence
| Item # | Configuration Item |
|---|---|
| Object # | Preceding Object Conversion Approach |
|---|---|
| list the exact title of the conversion object of only the immediate predecessor – this will then confirm the DDD (Data Dependency Diagram) | |
The table below depicts some possible system errors for this data object during the data load. All data load errors are to be logged as defects and managed within the Defect Management process.
| Error Type | Error Description | Action Taken |
|---|---|---|
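The defect-logging requirement above can be sketched as a small helper that captures one defect entry per failed record. The defect record shape is an assumption; defects are actually raised and tracked in the project's Defect Management tool.

```python
# Hedged sketch: capturing load failures as defect entries.

def log_defect(defects: list, rec_key: str, error_type: str, message: str):
    """Append one defect entry for a failed load record."""
    defects.append({
        "object": "QM Master Inspection Characteristic",
        "record": rec_key,
        "error_type": error_type,  # e.g. validation, lock, authorization
        "message": message,
    })

defects: list = []
log_defect(defects, "HARDNESS/01", "validation",
           "Unit of measure not active in target")
```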
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
| Task | Action |
|---|---|
| title | specific details of what and how the task needs to be performed e.g. which reports are being used etc. |
Any additional key assumptions.
Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.