| Status | Revision in Progress |
|---|---|
| Owner | |
| Stakeholders | |
Purpose
The purpose of this document is to define the conversion approach for the 1041 Inspection Plan object in S/4HANA.
Inspection Plans are a core component of SAP Quality Management (QM) that define how materials are inspected, including inspection characteristics, sampling procedures, and inspection methods. An inspection plan contains detailed steps for quality checks and ensures that inspections are carried out in a standardized, repeatable, and compliant manner for specific materials, plants, and usage scenarios.
In SAP S/4HANA, the structure and usage of inspection plans remain largely consistent with SAP ECC, typically organized by key combinations such as material/plant, material group/plant, or multiple materials assigned to the same inspection plan via task lists. Inspection plans may also be linked to specific usage types (e.g., goods receipt, in-process inspection, final inspection) and can be maintained with reference to alternative groupings for flexibility.
In SAP ECC, aside from the standard structures, there may be inspection plans with additional variants such as multiple alternative plans for the same material/plant combination, inspection plans assigned to multiple task list groups, or plans that use reference materials. In some legacy systems, plans may also include non-standard fields or custom enhancements (pending MDS review) that will require evaluation before conversion.
This conversion aims to migrate active and relevant inspection plan master data, including associated operations, inspection characteristics, sampling procedures, and inspection methods, from existing ECC systems into S/4HANA. The migration will apply required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target S/4HANA system using standard SAP mechanisms such as BAPIs (e.g., BAPI_INSPECTIONPLAN_CREATE), IDocs, or direct table loads where applicable, ensuring data consistency and compliance with S/4HANA requirements.
Conversion Scope
The scope of this document covers the approach for converting active Inspection Plans from the legacy source systems into the S/4HANA system, following the TBD Inspection Plan Master Data Design Standard.
From the current system landscape, Inspection Plan data exists separately in the legacy systems (PF2 and WP2), with potential discrepancies in both systems. Harmonization and validation are required to ensure accurate and consolidated data in S/4HANA. While PF2 and WP2 serve as source systems, extensive mapping and transformation logic will be necessary to produce properly formatted load templates in line with the target design.
The data from the legacy systems includes:
- Data not older than 3 years of usage
- Data not flagged for deletion
- Data that has a material/plant relationship where the material is active
- Data relevant for migration according to the Material/Plant mapping match with the MARC table (To-Be)
The data from the legacy systems excludes:
- Data older than 3 years
- Data flagged for deletion
- Data that has a material/plant relationship where the material is inactive or flagged for deletion
- Data that is not material/plant relevant according to the latest MARC table definition
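As an illustration, the relevancy criteria above can be sketched as a single filter function. This is a minimal sketch, not part of the agreed Syniti design: the record field names (`last_used`, `deletion_flag`, `material_active`, etc.) are hypothetical stand-ins for the actual extract columns.

```python
from datetime import date, timedelta

def is_relevant(record, marc_keys, today=None):
    """Apply the inclusion/exclusion criteria to one extracted record.

    marc_keys: set of (material, plant) pairs valid per the To-Be MARC mapping.
    Field names on `record` are illustrative, not actual PLKO/MAPL columns.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=3 * 365)   # "not older than 3 years"
    if record["last_used"] < cutoff:
        return False
    if record["deletion_flag"]:                # flagged for deletion
        return False
    if not record["material_active"]:          # material must be active
        return False
    # Material/plant must match the target MARC mapping
    return (record["material"], record["plant"]) in marc_keys
```

A record would then be kept for the target load only when every criterion passes, e.g. `relevant = [r for r in extracted if is_relevant(r, marc_keys)]`.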
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| PF2/WP2 | Inspection Plans will be extracted from PF2 and WP2 | PF2 = 8,129 records; WP2 = 37,451 records | S/4 HANA | 13,209 records |
| DCT | Material with Plant and Inspection type combination which do not have data existing from PF2 and WP2 | TBD | S/4 HANA | TBD |
Additional Information
Multi-language Requirement
Not applicable
Document Management
None identified. The conversion scope is limited to the migration of Inspection Plan object. No document attachments, images, or supporting files are included.
Legal Requirement
Not applicable
Special Requirements
Not applicable
Target Design
The Inspection Plan design strictly adheres to the Master Data Standard. The complete information on the tables and key fields that hold Inspection Plan data follows the Master Data Standard document.
The technical design of the target for this conversion approach is outlined below (pending MDS review):
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
| PLKO | PLNTY | PLNTY | Task List Type (Q = Inspection Plan) | CHAR | 1 | Mandatory |
| PLKO | PLNNR | PLNNR | Task List Group | CHAR | 8 | Mandatory |
| PLKO | PLNAL | PLNAL | Group Counter (alternative) | CHAR | 2 | Mandatory |
| PLKO | WERKS | WERKS_D | Plant | CHAR | 4 | Mandatory |
| PLKO | DATUV | DATUV | Valid-From Date | DATS | 8 | Mandatory |
| PLKO | VERWE | VERWE | Task List Usage (e.g., 1/5/9 – confirm in T416) | CHAR | 3 | Mandatory |
| PLKO | STATU | STATU | Task List Status (e.g., Released) | CHAR | 1 | Mandatory for load (set to released if required) |
| PLKO | KTEXT | KTEXT | Header Description | CHAR | 40 | Recommended |
| PLKO | LOEKZ | LOEKZ | Deletion Indicator | CHAR | 1 | Optional (leave blank for active) |
| PLKO | KLART | KLART | Class Type (if classification used) | CHAR | 3 | Optional |
| PLKO | KZKUP | KZKUP | Reference Indicator (Header) | CHAR | 1 | Optional |
| PLPO | PLNKN | PLNKN | Node Number (internal op ID) | NUMC | 8 | Mandatory |
| PLPO | VORNR | VORNR | Operation Number (sequence) | CHAR | 4 | Mandatory |
| PLPO | LTXA1 | LTXA1 | Operation Short Text | CHAR | 40 | Mandatory |
| PLPO | STEUS | STEUS | Control Key | CHAR | 4 | Mandatory (drives QM/confirmation behavior) |
| PLPO | ARBID | ARBID | Work Center (internal ID) | NUMC | 8 | Mandatory (or map via ARBPL/CRHD) |
| PLPO | WERKS | WERKS_D | Plant (Work Center Plant) | CHAR | 4 | Mandatory |
| PLPO | BMSCH | BMSCH | Base Quantity (for std values) | QUAN | 13 | Recommended |
| PLPO | VGW01 | VGWRT | Standard Value 1 (e.g., Setup/Inspect time) | DEC | 7 | Optional |
| PLPO | VGE01 | VGWTS | Unit for Standard Value 1 | UNIT | 3 | Optional |
| PLPO | VGW02 | VGWRT | Standard Value 2 | DEC | 7 | Optional |
| PLPO | VGE02 | VGWTS | Unit for Standard Value 2 | UNIT | 3 | Optional |
| PLPO | KAPAZ | KAPAZ | Capacity Category (if used) | CHAR | 3 | Optional |
| PLPO | ARBPL | ARBPL | Work Center (external key, if you load views) | CHAR | 8 | Optional (derived from ARBID) |
| PLPO | LOEKZ | LOEKZ | Deletion Indicator (Operation) | CHAR | 1 | Optional |
| PLMK | MERKNR | MERKNR | Characteristic Number (in plan) | NUMC | 4 | Mandatory |
| PLMK | KURZTEXT | KURZTEXT | Char. Short Text | CHAR | 40 | Mandatory |
| PLMK | VORGLFNR | VORGLFNR | Op Link (operation sequence number) | NUMC | 4 | Mandatory (links to PLPO) |
| PLMK | KZREF | KZREF | Reference to Master Insp. Char. (MIC) | CHAR | 1 | Mandatory if using MIC |
| PLMK | VERWMERKM | VERWMERKM | MIC Name (Referenced Char.) | CHAR | 18 | Mandatory if using MIC |
| PLMK | VERSION | VERSION | MIC Version | NUMC | 2 | Mandatory if using MIC versioning |
| PLMK | PRUEFUMF | PRUEFUMF | Inspection Scope / Sample Procedure Link | CHAR | 4 | Optional (if using sampling procedures) |
| PLMK | STICHPRVER | STICHPRVER | Sampling Procedure | CHAR | 8 | Optional |
| PLMK | KATALGART | KATALGART | Catalog Type (for qualitative) | CHAR | 3 | Optional/Required for qualitative |
| PLMK | CODEGRP | CODEGRP | Code Group (qualitative) | CHAR | 8 | Optional/Required for qualitative |
| PLMK | AUSWMENGE | AUSWMENGE | Selected Set (qualitative) | CHAR | 8 | Optional |
| PLMK | PRUEFME | MSEHI | Unit of Measure (quantitative) | UNIT | 3 | Required for quantitative |
| PLMK | TOLKZ | TOLKZ | Tolerance Indicator | CHAR | 1 | Optional (quantitative) |
| PLMK | MWERTS | MWERTS | Target Value | DEC | 15 | Optional (quantitative) |
| PLMK | MWERTU | MWERTU | Lower Spec Limit | DEC | 15 | Optional/Required per spec rule |
| PLMK | MWERTO | MWERTO | Upper Spec Limit | DEC | 15 | Optional/Required per spec rule |
| PLMK | PRFGEB | PRFGEB | Inspection Method Assignment | CHAR | 12 | Optional |
| PLMK | PRFGVER | PRFGVER | Inspection Method Version | NUMC | 2 | Optional |
| PLMK | ANZST | ANZST | No. of Single Values / Samples | NUMC | 3 | Optional |
| PLMK | LOEKZ | LOEKZ | Deletion Indicator (Characteristic) | CHAR | 1 | Optional |
| MAPL | MATNR | MATNR | Material | CHAR | 40 | Mandatory (if plan is material-assigned) |
| MAPL | WERKS | WERKS_D | Plant | CHAR | 4 | Mandatory |
| MAPL | PLNTY | PLNTY | Task List Type | CHAR | 1 | Mandatory |
| MAPL | PLNNR | PLNNR | Task List Group | CHAR | 8 | Mandatory |
| MAPL | PLNAL | PLNAL | Group Counter | CHAR | 2 | Mandatory |
| MAPL | DATUV | DATUV | Valid-From Date | DATS | 8 | Mandatory |
| MAPL | LOEKZ | LOEKZ | Deletion Indicator | CHAR | 1 | Optional |
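A small portion of the field requirements above can be expressed as a data-driven check that flags empty mandatory fields and values exceeding the defined length before load. This is an illustrative sketch only; it lists just a subset of the PLKO mandatory fields and is not the actual Syniti validation logic.

```python
# Subset of the target design table above: (table, field, max length).
MANDATORY = [
    ("PLKO", "PLNTY", 1), ("PLKO", "PLNNR", 8), ("PLKO", "PLNAL", 2),
    ("PLKO", "WERKS", 4), ("PLKO", "DATUV", 8),
]

def validate_row(table, row):
    """Return error strings for missing or overlong mandatory fields."""
    errors = []
    for tab, field, max_len in MANDATORY:
        if tab != table:
            continue
        value = str(row.get(field, "") or "")
        if not value.strip():
            errors.append(f"{tab}-{field}: mandatory field is empty")
        elif len(value) > max_len:
            errors.append(f"{tab}-{field}: length {len(value)} exceeds {max_len}")
    return errors
```

Extending `MANDATORY` with the remaining PLKO, PLPO, PLMK, and MAPL entries keeps the rule set in one place, mirroring the table above.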
Data Cleansing
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 2009-001 | C1 | Invalid or inactive Material Assignment in Inspection Plan | Inspection plan (MAPL) is assigned to a material/plant that does not exist or is inactive in Material Master (MARA/MARC). | Material Master | PF2/WP2 |
| 1006-001 | C1 | Invalid or inactive Work Center linked to Operation | Operation (PLPO) assigned to a Work Center (CRHD) that does not exist or is flagged for deletion. | Work Center | PF2/WP2 |
| 1041-001 | C1 | Inspection Plan not released | Header (PLKO-STATU) not set to “Released” (status = 4). Only released plans are migrated. | Inspection Plan Header | PF2/WP2 |
| 1041-002 | C1 | Invalid Usage assignment | Inspection plan header (PLKO-VERWE) has an invalid or non-configured Task List Usage. | Usage | PF2/WP2 |
| 1041-003 | C1 | Missing or invalid Group Counter | Inspection plan (PLKO-PLNAL) is blank, duplicated, or inconsistent with group. | Plan Counter | PF2/WP2 |
| 1057-001 | C1 | Invalid Inspection Characteristic | Plan characteristic (PLMK) references a Master Inspection Characteristic (QPMK) that does not exist, is inactive, or marked for deletion. | Master Insp. Characteristic | PF2/WP2 |
| 1057-002 | C1 | Inconsistent Quantitative/Qualitative Settings | Plan characteristic (PLMK-QUALI) is inconsistent with assigned MIC type (quantitative/qualitative). | Characteristic | PF2/WP2 |
| 1064-001 | C1 | Missing Sampling Procedure | Plan characteristic (PLMK-STICHPRVER) references a Sampling Procedure (QDSV) that does not exist or is obsolete. | Sampling Procedure | PF2/WP2 |
| 1043-001 | C1 | Missing or Invalid Inspection Method | Plan characteristic (PLMK-PRFGEB) references an Inspection Method (QMTB) that does not exist, is obsolete, or not valid for plant. | Inspection Method | PF2/WP2 |
| 1057-003 | C1 | Duplicate Inspection Characteristics | Inspection plan contains duplicate operation/characteristic entries (same VORNR + MERKNR). | Inspection Plan Operations/Characteristics | PF2/WP2 |
| 1041-004 | C1 | Invalid Validity Date | Valid-From date (PLKO-DATUV) is in the future or overlaps inconsistently with other plans for same material/usage. | Inspection Plan Header | PF2/WP2 |
| 1041-005 | C1 | Obsolete Inspection Plan | Header marked with deletion flag (PLKO-LOEKZ = X). Exclude from migration. | Inspection Plan Header | PF2/WP2 |
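As an example of how one of these cleansing rules could be automated, the sketch below implements rule 1057-003 (duplicate operation/characteristic entries, same VORNR + MERKNR within a plan). It is illustrative only; the row dictionaries assume the plan key fields named in the target design table, not the actual extract layout.

```python
from collections import Counter

def find_duplicate_characteristics(plmk_rows):
    """Flag duplicate operation/characteristic pairs per plan (rule 1057-003).

    Each row is expected to carry the plan key (PLNTY, PLNNR, PLNAL)
    plus VORNR and MERKNR, following the target design table.
    """
    keys = [
        (r["PLNTY"], r["PLNNR"], r["PLNAL"], r["VORNR"], r["MERKNR"])
        for r in plmk_rows
    ]
    return [key for key, count in Counter(keys).items() if count > 1]
```

Rows returned here would be reported under error ID 1057-003 and sent back for cleansing rather than loaded.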
Conversion Process
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, i.e. the Syensqo ECC systems (WP2/PF2). The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system.
Data Privacy and Sensitivity
Not applicable
Extraction
Extract data from a source into Syniti. There are 2 possibilities:
- The data exists. Syniti connects to the source and loads the data into its repository. There are 3 methods:
  - Perform full data extraction from relevant tables in the source system(s).
  - Perform extraction through the application layer.
  - Only if Syniti cannot connect to the source, data is loaded to the repository from the provided source system extract/report.
- The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti, using a DCT (Data Collection Template).
The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.
Extraction Run Sheet
| Requirement | Description | Team Responsible |
|---|---|---|
| Extraction Scope Definition | - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti / LTC Data team |
| Extraction Methodology | - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| Extraction Execution Plan | - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| Data Quality and Validation | - Define error handling mechanisms for extraction failures. | Syniti |
Selection Screen
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
| Not applicable | ||||
Data Collection Template (DCT)
A target-ready Data Collection Template will be created for the data, with the exception of some fields that require transformation as described in the transformation rules.
DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
| Pending MDS and number of data in source system | ||
Extraction Dependencies
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
| 7 | Data cleansing of legacy Material Master - QM View with assigned inspection type data must be completed. If standardization within the DCT begins using relevant data from PF2 and WP2 before the cleansing is finalized, it is understood that the business will take due diligence to ensure any subsequent delta cleansing is verified and aligned within the DCT. | Business |
Transformation
The target fields are mapped to the applicable legacy fields that will be their source. This is a 3-way activity involving the Business, Functional team, and Data team. It identifies the transformation activity required to make the data target ready:
- Perform value mapping and data transformation rules.
- Legacy values are mapped to the to-be values (this could include a default value).
- Values are transformed according to the rules defined in the Transformation Rules section.
- Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform Pre-Load Data Validation.
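The value-mapping step above (legacy value to to-be value, with an optional default) can be sketched as follows. The mapping entries and default shown are invented placeholders; the actual legacy-to-S/4 values come from the agreed mapping tables, not from this sketch.

```python
# Hypothetical legacy->S/4 usage mapping; real values come from the
# endorsed mapping tables, not this example.
USAGE_MAP = {"01": "1", "03": "5", "06": "9"}
DEFAULT_USAGE = "5"   # assumed default when no mapping entry exists

def map_value(legacy_value, mapping, default=None):
    """Map a legacy value to its to-be value, falling back to a default."""
    if legacy_value in mapping:
        return mapping[legacy_value]
    if default is not None:
        return default
    raise KeyError(f"No mapping or default for legacy value {legacy_value!r}")
```

Raising on an unmapped value with no default makes mapping gaps surface during transformation rather than as load failures in the target system.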
Transformation Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units) | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable - Apply conditional transformations based on predefined logic/business rules | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
Transformation Mapping
| Mapping Table Name | Mapping Table Description |
|---|---|
Transformation Dependencies
List the steps that need to occur before transformation can commence.
| Item # | Step Description | Team Responsible |
|---|---|---|
Pre-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Compare Data Counts | |
| Validate the mandatory fields | Validate there is a value for all the mandatory fields |
| Validate Primary Keys and Unique Constraints | |
| Test Referential Integrity | Confirm dependent records exist in related tables |
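The completeness tasks above (count comparison and primary-key/uniqueness checks) can be sketched together as one report. This is an illustrative sketch under assumed inputs: rows are plain dictionaries and `key_fields` names the primary-key columns of the load template.

```python
def completeness_report(source_rows, target_rows, key_fields):
    """Compare record counts and check primary-key uniqueness before load."""
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "counts_match": len(source_rows) == len(target_rows),
    }
    # A duplicate primary key collapses in the set, so the difference
    # in sizes counts the duplicated entries.
    keys = [tuple(r[f] for f in key_fields) for r in target_rows]
    report["duplicate_keys"] = len(keys) - len(set(keys))
    return report
```

A non-zero `duplicate_keys` or a count mismatch would be investigated before the load is released.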
Accuracy
| Task | Action |
|---|---|
| Validate the transformation | Validate that fields requiring transformation contain the transformed value rather than the original field value |
| Check Data Consistency | |
Business
Completeness
| Task | Action |
|---|---|
| Compare Data Count | |
| Review populated templates for missing or incorrect values | Use checklists to verify completeness and correctness before submission |
Accuracy
| Task | Action |
|---|---|
| Conversion Accuracy | Business Data Owner(s) to verify that all the data in the load table/file is accurate as per the endorsed transformation/mapping rules (and signed-off DCT data). |
Load
The load process includes:
- Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
- Once the data is loaded into the target system, it will be extracted and prepared for Post-Load Data Validation.
Load Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors | Syniti |
Load Phase and Dependencies
The Inspection Plans will be loaded in the pre-cutover (PreCutover 4) phase.
Before loading, there are dependencies on the following configuration and data objects in S/4HANA.
Configuration
| Item # | Configuration Item |
|---|---|
Conversion Objects
| Object # | Preceding Object Conversion Approach |
|---|---|
| list the exact title of the conversion object of only the immediate predecessor – this will then confirm the DDD (Data Dependency Diagram) | |
Error Handling
| Error Type | Error Description | Action Taken |
|---|---|---|
| Configuration / Data Transformation | The value XXX for field XXX doesn't exist | |
| Configuration | There is a mandatory field XXX missing | |
Post-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Validate Record count in the backend | Validate that the Inspection Plan tables (PLKO, PLPO, PLMK, MAPL) contain the same number of records as the loading file |
| Display Records | Pick a few random Inspection Plans and run t-code QP03 to validate that the Inspection Plans can be displayed without any error. |
| Perform Source-to-Target Comparisons | |
Accuracy
| Task | Action |
|---|---|
| Execute Sample Queries and Reports | |
| Conduct Post-Migration Reconciliation | Generate reports comparing pre- and post-migration data. |
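The post-migration reconciliation task above can be sketched as a per-table count comparison between the load file and the backend. This is an illustrative sketch; input dictionaries of table-name to record count are assumed, and the table names used below are examples.

```python
def reconcile(pre_counts, post_counts):
    """Compare pre- and post-migration record counts per table.

    pre_counts:  records prepared in the load file, keyed by table name.
    post_counts: records found in the target backend after the load.
    Returns only the tables whose loaded count differs.
    """
    mismatches = {}
    for table, expected in pre_counts.items():
        actual = post_counts.get(table, 0)   # missing table counts as 0 loaded
        if actual != expected:
            mismatches[table] = {"expected": expected, "loaded": actual}
    return mismatches
```

An empty result supports sign-off; any mismatch is routed to error handling for analysis and reload.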
Business
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
Completeness
| Task | Action |
|---|---|
| Perform Source-to-Target Comparisons | |
| Conduct Post-Migration Reconciliation | Go through reports comparing pre- and post-migration data. |
Accuracy
| Task | Action |
|---|---|
Perform Manual Testing | Conduct manual spot-checks for additional assurance. |
Key Assumptions
- The Master Data Standard is up to date as of the date of documenting this conversion approach and the data load.
- Inspection Plan is in scope based on the data design and any exceptions requested by the business.
See also
Change log
Workflow history
| Title | Last Updated By | Updated | Status | |
|---|---|---|---|---|
