Status

Owner
Stakeholders

Purpose

The purpose of this document is to define the conversion approach for the 1041 Inspection Plan object in S/4HANA.

Inspection Plans are a core component of SAP Quality Management (QM) that define how materials are inspected, including inspection characteristics, sampling procedures, and inspection methods. An inspection plan contains detailed steps for quality checks and ensures that inspections are carried out in a standardized, repeatable, and compliant manner for specific materials, plants, and usage scenarios.

In SAP S/4HANA, the structure and usage of inspection plans remain largely consistent with SAP ECC, typically organized by key combinations such as material/plant, material group/plant, or multiple materials assigned to the same inspection plan via task lists. Inspection plans may also be linked to specific usage types (e.g., goods receipt, in-process inspection, final inspection) and can be maintained with reference to alternative groupings for flexibility.

In SAP ECC, aside from the standard structures, there may be inspection plans with additional variants such as multiple alternative plans for the same material/plant combination, inspection plans assigned to multiple task list groups, or plans that use reference materials. In some legacy systems, plans may also include non-standard fields or custom enhancements (pending MDS review) that will require evaluation before conversion.

This conversion aims to migrate active and relevant inspection plan master data, including associated operations, inspection characteristics, sampling procedures, and inspection methods, from the existing ECC systems into S/4HANA. The migration will apply the required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target S/4HANA system using standard SAP mechanisms such as BAPIs (e.g., BAPI_INSPECTIONPLAN_CREATE), IDocs, or direct table loads where applicable, ensuring data consistency and compliance with S/4HANA requirements.


Conversion Scope

The scope of this document covers the approach for converting active Inspection Plans from the Legacy Source Systems into the S/4HANA system, following the TBD Inspection Plan Master Data Design Standard.

From the current system landscape, Inspection Plan data exists separately in the legacy systems (PF2 and WP2), with potential discrepancies in both systems. Harmonization and validation are required to ensure accurate and consolidated data in S/4HANA. While PF2 and WP2 serve as source systems, extensive mapping and transformation logic will be necessary to produce properly formatted load templates in line with the target design.

The data from legacy system includes:

  1. Data not older than 3 years of usage
  2. Data not flagged for deletion
  3. Data that has a material/plant relationship and the material is active
  4. Data relevant for migration in accordance with the Material/Plant mapping match against the MARC table (To-Be)

The data from legacy system excludes:

  1. Data older than 3 years
  2. Data flagged for deletion
  3. Data that has a material/plant relationship and the material is inactive or flagged for deletion
  4. Data that is not material/plant relevant according to the latest MARC table definition
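The inclusion and exclusion criteria above can be sketched as a single relevancy filter. This is a minimal illustration only: the record fields (`last_used`, `deletion_flag`, `material_active`) and the MARC key set are stand-ins for the real extract structures, not actual SAP table columns.

```python
from datetime import date, timedelta

def is_relevant(record: dict, marc_keys: set, today: date) -> bool:
    """Apply the four relevancy criteria to one extracted record.

    The field names are illustrative stand-ins, not real table columns.
    """
    three_years_ago = today - timedelta(days=3 * 365)
    return (
        record["last_used"] >= three_years_ago                  # not older than 3 years
        and not record["deletion_flag"]                         # not flagged for deletion
        and record["material_active"]                           # material is active
        and (record["material"], record["plant"]) in marc_keys  # MARC (To-Be) match
    )

records = [
    {"material": "MAT1", "plant": "P001", "last_used": date(2024, 6, 1),
     "deletion_flag": False, "material_active": True},
    {"material": "MAT2", "plant": "P001", "last_used": date(2019, 1, 1),
     "deletion_flag": False, "material_active": True},  # older than 3 years -> excluded
]
marc_keys = {("MAT1", "P001"), ("MAT2", "P001")}
relevant = [r for r in records if is_relevant(r, marc_keys, today=date(2025, 1, 1))]
```

The same predicate, expressed as transformation rules in Syniti Migrate, would drive the relevancy flag on the extracted record set.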


List of source systems and approximate number of records
Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records

PF2/WP2 | Inspection Plans will be extracted from PF2 and WP2 | PF2 = 8129 records; WP2 = 37451 records | S/4 HANA | 13209 records

DCT | Material/Plant and Inspection Type combinations that have no existing data in PF2 or WP2 | TBD | S/4 HANA | TBD


Additional Information

Multi-language Requirement

Not applicable

Document Management

None identified. The conversion scope is limited to the migration of the Inspection Plan object. No document attachments, images, or supporting files are included.

Legal Requirement

Not applicable

Special Requirements

Not applicable



Target Design

Inspection Plan strictly adheres to the Master Data Standard. The complete information on the tables and key fields that hold Inspection Plan information follows the Master Data Standard document.

The technical design of the target for this conversion approach is shown below (pending MDS):

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
PLKO | PLNTY | PLNTY | Task List Type (Q = Inspection Plan) | CHAR | 1 | Mandatory
PLKO | PLNNR | PLNNR | Task List Group | CHAR | 8 | Mandatory
PLKO | PLNAL | PLNAL | Group Counter (alternative) | CHAR | 2 | Mandatory
PLKO | WERKS | WERKS_D | Plant | CHAR | 4 | Mandatory
PLKO | DATUV | DATUV | Valid-From Date | DATS | 8 | Mandatory
PLKO | VERWE | VERWE | Task List Usage (e.g., 1/5/9 – confirm in T416) | CHAR | 3 | Mandatory
PLKO | STATU | STATU | Task List Status (e.g., Released) | CHAR | 1 | Mandatory for load (set to released if required)
PLKO | KTEXT | KTEXT | Header Description | CHAR | 40 | Recommended
PLKO | LOEKZ | LOEKZ | Deletion Indicator | CHAR | 1 | Optional (leave blank for active)
PLKO | KLART | KLART | Class Type (if classification used) | CHAR | 3 | Optional
PLKO | KZKUP | KZKUP | Reference Indicator (Header) | CHAR | 1 | Optional
PLPO | PLNKN | PLNKN | Node Number (internal op ID) | NUMC | 8 | Mandatory
PLPO | VORNR | VORNR | Operation Number (sequence) | CHAR | 4 | Mandatory
PLPO | LTXA1 | LTXA1 | Operation Short Text | CHAR | 40 | Mandatory
PLPO | STEUS | STEUS | Control Key | CHAR | 4 | Mandatory (drives QM/confirmation behavior)
PLPO | ARBID | ARBID | Work Center (internal ID) | NUMC | 8 | Mandatory (or map via ARBPL/CRHD)
PLPO | WERKS | WERKS_D | Plant (Work Center Plant) | CHAR | 4 | Mandatory
PLPO | BMSCH | BMSCH | Base Quantity (for std values) | QUAN | 13 | Recommended
PLPO | VGW01 | VGWRT | Standard Value 1 (e.g., Setup/Inspect time) | DEC | 7 | Optional
PLPO | VGE01 | VGWTS | Unit for Standard Value 1 | UNIT | 3 | Optional
PLPO | VGW02 | VGWRT | Standard Value 2 | DEC | 7 | Optional
PLPO | VGE02 | VGWTS | Unit for Standard Value 2 | UNIT | 3 | Optional
PLPO | KAPAZ | KAPAZ | Capacity Category (if used) | CHAR | 3 | Optional
PLPO | ARBPL | ARBPL | Work Center (external key, if you load views) | CHAR | 8 | Optional (derived from ARBID)
PLPO | LOEKZ | LOEKZ | Deletion Indicator (Operation) | CHAR | 1 | Optional
PLMK | MERKNR | MERKNR | Characteristic Number (in plan) | NUMC | 4 | Mandatory
PLMK | KURZTEXT | KURZTEXT | Char. Short Text | CHAR | 40 | Mandatory
PLMK | VORGLFNR | VORGLFNR | Op Link (operation sequence number) | NUMC | 4 | Mandatory (links to PLPO)
PLMK | KZREF | KZREF | Reference to Master Insp. Char. (MIC) | CHAR | 1 | Mandatory if using MIC
PLMK | VERWMERKM | VERWMERKM | MIC Name (Referenced Char.) | CHAR | 18 | Mandatory if using MIC
PLMK | VERSION | VERSION | MIC Version | NUMC | 2 | Mandatory if using MIC versioning
PLMK | PRUEFUMF | PRUEFUMF | Inspection Scope / Sample Procedure Link | CHAR | 4 | Optional (if using sampling procedures)
PLMK | STICHPRVER | STICHPRVER | Sampling Procedure | CHAR | 8 | Optional
PLMK | KATALGART | KATALGART | Catalog Type (for qualitative) | CHAR | 3 | Optional/Required for qualitative
PLMK | CODEGRP | CODEGRP | Code Group (qualitative) | CHAR | 8 | Optional/Required for qualitative
PLMK | AUSWMENGE | AUSWMENGE | Selected Set (qualitative) | CHAR | 8 | Optional
PLMK | PRUEFME | MSEHI | Unit of Measure (quantitative) | UNIT | 3 | Required for quantitative
PLMK | TOLKZ | TOLKZ | Tolerance Indicator | CHAR | 1 | Optional (quantitative)
PLMK | MWERTS | MWERTS | Target Value | DEC | 15 | Optional (quantitative)
PLMK | MWERTU | MWERTU | Lower Spec Limit | DEC | 15 | Optional/Required per spec rule
PLMK | MWERTO | MWERTO | Upper Spec Limit | DEC | 15 | Optional/Required per spec rule
PLMK | PRFGEB | PRFGEB | Inspection Method Assignment | CHAR | 12 | Optional
PLMK | PRFGVER | PRFGVER | Inspection Method Version | NUMC | 2 | Optional
PLMK | ANZST | ANZST | No. of Single Values / Samples | NUMC | 3 | Optional
PLMK | LOEKZ | LOEKZ | Deletion Indicator (Characteristic) | CHAR | 1 | Optional
MAPL | MATNR | MATNR | Material | CHAR | 40 | Mandatory (if plan is material-assigned)
MAPL | WERKS | WERKS_D | Plant | CHAR | 4 | Mandatory
MAPL | PLNTY | PLNTY | Task List Type | CHAR | 1 | Mandatory
MAPL | PLNNR | PLNNR | Task List Group | CHAR | 8 | Mandatory
MAPL | PLNAL | PLNAL | Group Counter | CHAR | 2 | Mandatory
MAPL | DATUV | DATUV | Valid-From Date | DATS | 8 | Mandatory
MAPL | LOEKZ | LOEKZ | Deletion Indicator | CHAR | 1 | Optional
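As one example of how the table above could drive pre-load checks, the sketch below validates that the mandatory PLKO header fields are populated in a load-template row. The field list mirrors the table and remains pending MDS confirmation; the row values are illustrative, not real data.

```python
# Mandatory PLKO header fields per the target design table above (pending MDS).
PLKO_MANDATORY = ["PLNTY", "PLNNR", "PLNAL", "WERKS", "DATUV", "VERWE", "STATU"]

def missing_mandatory(row: dict, mandatory=PLKO_MANDATORY) -> list:
    """Return the mandatory fields that are blank or absent in a load-template row."""
    return [f for f in mandatory if not str(row.get(f, "")).strip()]

# Illustrative load-template row (example values only).
row = {"PLNTY": "Q", "PLNNR": "10000001", "PLNAL": "01",
       "WERKS": "1000", "DATUV": "20250101", "VERWE": "5", "STATU": "4"}
bad_row = {**row, "WERKS": ""}  # plant left blank -> should be reported
```

The same pattern extends to the PLPO, PLMK, and MAPL mandatory field lists above.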


Data Cleansing

ID | Criticality | Error Message/Report Description | Rule | Output | Source System
2009-001 | C1 | Invalid or inactive Material Assignment in Inspection Plan | Inspection plan (MAPL) is assigned to a material/plant that does not exist or is inactive in Material Master (MARA/MARC). | Material Master | PF2/WP2
1006-001 | C1 | Invalid or inactive Work Center linked to Operation | Operation (PLPO) assigned to a Work Center (CRHD) that does not exist or is flagged for deletion. | Work Center | PF2/WP2
1041-001 | C1 | Inspection Plan not released | Header (PLKO-STATU) not set to "Released" (status = 4). Only released plans are migrated. | Inspection Plan Header | PF2/WP2
1041-002 | C1 | Invalid Usage assignment | Inspection plan header (PLKO-VERWE) has an invalid or non-configured Task List Usage. | Usage | PF2/WP2
1041-003 | C1 | Missing or invalid Group Counter | Inspection plan (PLKO-PLNAL) is blank, duplicated, or inconsistent with group. | Plan Counter | PF2/WP2
1057-001 | C1 | Invalid Inspection Characteristic | Plan characteristic (PLMK) references a Master Inspection Characteristic (QPMK) that does not exist, is inactive, or is marked for deletion. | Master Insp. Characteristic | PF2/WP2
1057-002 | C1 | Inconsistent Quantitative/Qualitative Settings | Plan characteristic (PLMK-QUALI) is inconsistent with the assigned MIC type (quantitative/qualitative). | Characteristic | PF2/WP2
1064-001 | C1 | Missing Sampling Procedure | Plan characteristic (PLMK-STICHPRVER) references a Sampling Procedure (QDSV) that does not exist or is obsolete. | Sampling Procedure | PF2/WP2
1043-001 | C1 | Missing or Invalid Inspection Method | Plan characteristic (PLMK-PRFGEB) references an Inspection Method (QMTB) that does not exist, is obsolete, or is not valid for the plant. | Inspection Method | PF2/WP2
1057-003 | C1 | Duplicate Inspection Characteristics | Inspection plan contains duplicate operation/characteristic entries (same VORNR + MERKNR). | Inspection Plan Operations/Characteristics | PF2/WP2
1041-004 | C1 | Invalid Validity Date | Valid-From date (PLKO-DATUV) is in the future or overlaps inconsistently with other plans for the same material/usage. | Inspection Plan Header | PF2/WP2
1041-005 | C1 | Obsolete Inspection Plan | Header marked with deletion flag (PLKO-LOEKZ = X). Exclude from migration. | Inspection Plan Header | PF2/WP2
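Two of the rules above, 1041-005 (obsolete plan) and 1057-003 (duplicate characteristics), can be sketched as checks over flat extract rows. The dict structures are illustrative, not the real PLKO/PLMK extract layout.

```python
from collections import Counter

def rule_1041_005(header: dict) -> bool:
    """1041-005: header flagged for deletion (PLKO-LOEKZ = 'X') -> exclude."""
    return header.get("LOEKZ") == "X"

def rule_1057_003(plmk_rows: list) -> list:
    """1057-003: report duplicate operation/characteristic keys (VORNR + MERKNR)."""
    counts = Counter((r["VORNR"], r["MERKNR"]) for r in plmk_rows)
    return [key for key, n in counts.items() if n > 1]

headers = [{"PLNNR": "A0000001", "LOEKZ": ""}, {"PLNNR": "B0000001", "LOEKZ": "X"}]
obsolete = [h["PLNNR"] for h in headers if rule_1041_005(h)]

plmk = [{"VORNR": "0010", "MERKNR": "0010"},
        {"VORNR": "0010", "MERKNR": "0010"},  # duplicate -> flagged
        {"VORNR": "0010", "MERKNR": "0020"}]
dupes = rule_1057_003(plmk)
```

In practice, each rule would emit its record key and the cleansing report ID so the output can be routed to the owning team per the table above.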



Conversion Process

The high-level process is represented by the diagram below:

The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:

1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, i.e., the Syensqo ECC systems (WP2/PF2). The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.


3. Loading
The transformed data is then loaded into the target S/4HANA system. 

[Diagram: ETL process overview (image-2025-6-10_17-0-33.png)]

Data Privacy and Sensitivity

Not applicable


Extraction

Extract data from a source into Syniti Migrate. There are 2 possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are 3 methods:
    1. Perform full data extraction from relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source is the data loaded into the repository from the provided source-system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate. This is conducted using a DCT (Data Collection Template) in Syniti Migrate.

The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.

Extraction Run Sheet

Req # | Requirement Description | Team Responsible

1

Extraction Scope Definition

  • Identify the source systems and databases involved.
  • Define the data objects (tables, fields, records) to be extracted.
  • Establish business rules for data selection.

Syniti / LTC Data team

2

Extraction Methodology

  • Specify the extraction approach (full, incremental, or delta extraction).
  • Determine the tools and technologies used.
  • Define data filtering criteria to exclude irrelevant records.

Syniti

3

Extraction Execution Plan

  • Establish execution timelines and batch processing schedules.
  • Assign responsibilities for extraction monitoring.
  • Document dependencies on other migration tasks.

Syniti

4

Data Quality and Validation

  • Define error handling mechanisms for extraction failures.

Syniti


Selection Screen

If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set
Not applicable


Data Collection Template (DCT)

A Target-Ready Data Collection Template will be created for the object data, with the exception of fields that require transformation, as described in the transformation rules.

<Object> DCT Rules

Field Name | Field Description | Rule


Pending MDS and the record counts in the source system.









Extraction Dependencies

List the steps that need to occur before extraction can commence

Item # | Step Description | Team Responsible
1

Source System Availability

  • Ensure that the source database or application is accessible.
  • Confirm that necessary credentials and permissions are granted

Syensqo IT

2

Data Structure

  • Identify relationships between tables, views, and stored procedures.

Syniti

3

Referential Integrity

  • Ensure dependent records are extracted together.

Syniti

4

Extraction Methodology

  • Define whether extraction is full, incremental, or delta-based.
  • Establish batch processing schedules for large datasets.

Syniti

5

Performance and Scalability Considerations

  • Optimize extraction queries to prevent system overload.
  • Ensure network bandwidth supports data transfer volumes.

Syniti

6

Security and Compliance

  • Adhere to regulatory standards for sensitive information if applicable

Syniti

7

Data cleansing of legacy Material Master - QM View with assigned inspection type data must be completed.

If standardization within the DCT begins using relevant data from PF2 and WP2 before the cleansing is finalized, it is understood that the business will take due diligence to ensure any subsequent delta cleansing is verified and aligned within the DCT.

Business


Transformation

The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, Functional, and Data teams. It identifies the transformation activity required to allow Syniti Migrate to make the data target ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform Pre-load Data Validation
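Step 1.1 above (legacy values mapped to to-be values, possibly with a default) can be sketched as a simple lookup. The usage-code mapping below is hypothetical; the real mappings are maintained as mapping tables in Syniti Migrate.

```python
# Hypothetical legacy-to-target value mapping for Task List Usage (VERWE).
USAGE_MAP = {"GR": "5", "INPROC": "3", "FINAL": "1"}
DEFAULT_USAGE = "5"  # assumed default; to be confirmed with the functional team

def map_usage(legacy_value: str) -> str:
    """Map a legacy usage code to its to-be value, falling back to the default."""
    return USAGE_MAP.get(legacy_value, DEFAULT_USAGE)
```

Unmapped legacy values falling through to the default should also be reported, so the mapping table can be completed rather than silently defaulted.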

Transformation Run Sheet

Item # | Step Description | Team Responsible

1

Transformation Scope Definition

- Identify the source and target data structures.

- Define business rules for data standardization.

- Establish data cleansing requirements to remove inconsistencies.

Data Team

2

Data Mapping and Standardization

- Align source fields with target fields.

- Ensure unit consistency (e.g., currency, measurement units)

Data Team

3

Business Rule Application

- Implement data enrichment/collection if applicable

- Apply conditional transformations based on predefined logic/business rules

Data Team

4

Transformation Execution Plan

- Define batch processing schedules.

- Assign responsibilities for monitoring execution.

- Establish error-handling mechanisms

Syniti


Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic


Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
Mapping Table Name | Mapping Table Description


Transformation Dependencies

List the steps that need to occur before transformation can commence
Item # | Step Description | Team Responsible



Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

Task | Action

Compare Data Counts

  1. Verify row counts between source and target databases.
  2. Identify missing or duplicated records.

Validate the mandatory fields

Validate that there is a value for all the mandatory fields

Validate Primary Keys and Unique Constraints

  1. Check for duplicate or missing primary key values.
  2. Ensure unique constraints are maintained.

Test Referential Integrity

Confirm dependent records exist in related tables
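The four completeness tasks above can be sketched together as one report: row-count comparison, duplicate-key detection, and a referential-integrity check against a parent key set. All structures are illustrative.

```python
def completeness_report(source_rows, target_rows, key_fields, parent_keys):
    """Summarize count deltas, duplicate keys, and orphaned rows for pre-load checks."""
    keys = [tuple(r[f] for f in key_fields) for r in target_rows]
    duplicates = {k for k in keys if keys.count(k) > 1}   # O(n^2); fine for a sketch
    orphans = [k for k in keys if k not in parent_keys]   # referential integrity
    return {
        "count_delta": len(source_rows) - len(target_rows),
        "duplicate_keys": duplicates,
        "orphans": orphans,
    }

source = [{"PLNNR": "A0000001", "PLNAL": "01"}, {"PLNNR": "B0000001", "PLNAL": "01"}]
target = [{"PLNNR": "A0000001", "PLNAL": "01"}]  # one record lost in transformation
report = completeness_report(source, target, ["PLNNR", "PLNAL"],
                             parent_keys={("A0000001", "01")})
```

A non-zero `count_delta`, any duplicate key, or any orphan would be logged as a pre-load defect before the load is released.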


Accuracy

Task | Action

Validate the transformation

Validate that fields requiring transformation carry the transformed value rather than the original field value

Check Data Consistency

  1. Compare field values across systems
  2. Validate data formats and structures

Business

The following pre-load validations will be performed by the business.

Completeness

Task | Action

Compare Data Count

  1. Verify row counts between source and target databases.
  2. Identify missing or duplicated records.
Review populated templates for missing or incorrect values

Use checklists to verify completeness and correctness before submission

Accuracy

Task | Action

Conversion Accuracy

Business Data Owner/s to verify that all the data in the load table/file is accurate as per endorsed transformation/ mapping rules (and signed-off DCT data).


Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually
  2. Once the data is loaded to the target system, it will be extracted and prepared for Post Load Data Validation

Load Run Sheet

Item # | Step Description | Team Responsible

1

Load Scope Definition

- Identify the target system and database structure.

- Define data objects (tables, fields, records) to be loaded.

- Establish business rules for data validation.

Data team

2

Load Methodology

- Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program).

Syniti 

3

Data Quality and Validation

- Ensure data integrity checks (null values, duplicates, format validation).

- Perform pre-load validations to verify completeness.

- Define error handling mechanisms for load failures

Syniti 

4

Load Execution Plan

- Establish execution timelines and batch processing schedules.

- Assign responsibilities for monitoring execution.

- Document dependencies on other migration tasks

Syniti

5

Logging and Reporting

- Maintain detailed logs of loading activities.

- Generate summary reports on loaded data volume and quality.

- Define escalation procedures for errors

Syniti 


Load Phase and Dependencies

The Inspection Plans will be loaded in the pre-cutover (PreCutover 4) phase.

Before loading, this object depends on the following configuration and data objects in S/4HANA.

Configuration

List the Configurations required before loading can commence

Item # | Configuration Item






Conversion Objects

Object # | Preceding Object Conversion Approach

List the exact title of the conversion object of only the immediate predecessor – this will then confirm the DDD (Data Dependency Diagram)




Error Handling

The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.

Error Type | Error Description | Action Taken

Configuration / Data Transformation

The value XXX for field XXX doesn't exist

  1. Check that the mapping/conversion is done properly in the loading file
  2. Validate that the target value is configured/transported in the target system
  3. Reach out to the functional team to validate the configuration

Configuration

Mandatory field XXX is missing

  1. Validate in the MDS whether the fields are set as mandatory
  2. Validate whether there is a value in the pre-load file
  3. Validate that the configuration for the mandatory fields is done properly


Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

Task | Action

Validate Record count in the backend

Validate that the Inspection Plan tables (PLKO, PLPO, PLMK, MAPL) have the same record counts as the loading file.

Display Records

Pick a few random Inspection Plans and run t-code QP03 to validate that they can be displayed without any error.

Perform Source-to-Target Comparisons

  1. Validate that migrated data matches source records.
  2. Check for discrepancies in numerical values, text fields, and timestamps

Accuracy

Task | Action

Execute Sample Queries and Reports

  1. Run queries to validate business logic.
  2. Generate reports to compare expected vs. actual results

Conduct Post-Migration Reconciliation

Generate reports comparing pre- and post-migration data.
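A source-to-target comparison of the kind described above can be sketched as a field-level reconciliation keyed on the plan key (e.g., PLNNR/PLNAL). The flat dict structures are illustrative.

```python
def reconcile(source: dict, target: dict, fields: list) -> dict:
    """Compare field values for each key in the source extract.

    Returns per-key field mismatches plus keys missing from the target.
    """
    mismatches, missing = {}, []
    for key, src_row in source.items():
        tgt_row = target.get(key)
        if tgt_row is None:
            missing.append(key)
            continue
        diffs = {f: (src_row[f], tgt_row[f]) for f in fields if src_row[f] != tgt_row[f]}
        if diffs:
            mismatches[key] = diffs
    return {"mismatches": mismatches, "missing": missing}

source = {("A0000001", "01"): {"WERKS": "1000", "VERWE": "5"},
          ("B0000001", "01"): {"WERKS": "1000", "VERWE": "5"}}
target = {("A0000001", "01"): {"WERKS": "1000", "VERWE": "3"}}  # VERWE changed on load
result = reconcile(source, target, ["WERKS", "VERWE"])
```

Any mismatch or missing key from this report would feed the post-migration reconciliation report and be raised as a defect.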


Business

Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.

1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections


Completeness

Task | Action
Perform Source-to-Target Comparisons
  1. Validate that migrated data matches source records.
  2. Check for discrepancies in numerical values, text fields, and timestamps
Conduct Post-Migration Reconciliation

Go through reports comparing pre- and post-migration data.

Accuracy

Task | Action

Perform Manual Testing

Conduct manual spot-checks for additional assurance.


Key Assumptions

  • The Master Data Standard is up to date as of the date of documenting this conversion approach and data load.
  • Inspection Plan is in scope based on the data design and any exceptions requested by the business.

Any additional key assumptions.


See also

Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.

Change log

Workflow history