Status

Owner
Stakeholders

Purpose

The purpose of this document is to define the conversion approach for creating the 1044 Quality Info Record object in S/4HANA.

Quality Info Records are used in SAP Quality Management (QM) to control the procurement and usage of materials from vendors or for customers. A QIR documents whether a supplier is approved to deliver a particular material, under what conditions, and with what quality agreements in place. It also supports restrictions such as blocking or requiring source inspections before delivery. This feature helps ensure supplier quality compliance and strengthens the linkage between procurement and quality processes.

In SAP S/4HANA, the structure and usage of Quality Info Records remain consistent with SAP ECC, generally defined by key combinations such as Material / Vendor / Plant or Material / Customer / Plant. QIRs can include attributes such as release status, validity dates, procurement blocks, source inspection requirements, and quality agreement indicators.

In SAP ECC, aside from the standard structures, Quality Info Records may also exist with extended combinations or enhancements, for example:

  • Material / Vendor without plant reference,

  • Material / Customer for sales-side QIRs,

  • Additional custom fields capturing supplier quality ratings,

  • Legacy blocks or obsolete release indicators that are no longer in use.
    Such cases must be reviewed carefully (pending MDS) to determine whether they remain relevant for migration.

This conversion aims to migrate active and relevant Quality Info Records from existing ECC systems into S/4HANA by applying the required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target S/4HANA system using standard SAP mechanisms such as BAPIs (e.g., BAPI_QM_QUALINFORECORD_CREATE), IDOCs, or direct table loads where applicable, ensuring compliance with procurement and quality integration processes.

Conversion Scope

The scope of this document covers the approach for converting active Quality Info Records from the Legacy Source Systems into S/4HANA, following the Quality Info Record - Master Data Design Standard.

The data from the legacy systems includes:

  1. Active QIRs used within the last three (3) years in procurement or sales processes.

  2. QIRs without deletion flag.

  3. Plant-specific QIRs that will be migrated to the To-Be Plant Mapping (taking into consideration the To-Be definition of Plants).

  4. QIRs referenced in active material-vendor (procurement) or material-customer (sales) combinations.

The data from the legacy systems excludes:

  1. Inactive QIRs not used for more than three (3) years.

  2. QIRs marked for deletion.

  3. QIRs belonging to deleted plants (per To-Be Plant mapping).

  4. Obsolete or duplicate QIRs that are no longer relevant (e.g., replaced by harmonized corporate standards).
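The inclusion and exclusion rules above can be sketched as a simple relevancy filter. This is a hedged illustration only: the record layout (dict keys such as `deletion_flag`, `last_used_on`, `plant`) and the input shapes are assumptions for the example, not the actual Syniti Migrate implementation.

```python
from datetime import date, timedelta

def filter_relevant_qirs(records, cutover_date, to_be_plants, active_combinations):
    """Keep only QIRs that satisfy the scope rules above (illustrative)."""
    three_years_before = cutover_date - timedelta(days=3 * 365)
    relevant = []
    for rec in records:
        if rec.get("deletion_flag") == "X":              # exclusion 2: marked for deletion
            continue
        if rec["last_used_on"] < three_years_before:     # exclusion 1: inactive > 3 years
            continue
        if rec["plant"] not in to_be_plants:             # exclusion 3: plant not in To-Be mapping
            continue
        partner = rec.get("vendor") or rec.get("customer")
        if (rec["material"], partner) not in active_combinations:  # inclusion 4
            continue
        relevant.append(rec)
    return relevant
```

The actual relevancy criteria are maintained as rules in Syniti Migrate; the sketch only mirrors their logical order.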


List of source systems and approximate number of records

| Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records |
| --- | --- | --- | --- | --- |
| PF2 and WP2 | Quality Info Records will be extracted from the PF2 and WP2 clients | PF2 = 33642 records; WP2 = 8889 records | S/4HANA | 13655 records |
| DCT | Quality Info Records for Material / Vendor / Customer / Plant combinations which do not have data existing from PF2 and WP2 | TBD | S/4HANA | TBD |










Additional Information

Multi-language Requirement

Not applicable

Document Management

Not applicable

Legal Requirement

Not applicable

Special Requirements

Not applicable


Target Design

The table below describes the technical design of the target structures for this conversion approach.

| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
| --- | --- | --- | --- | --- | --- | --- |
| QINF | MATNR | MATNR | Material Number | CHAR | 18 | Mandatory – Material subject to QIR |
| QINF | LIFNR | LIFNR | Vendor Number | CHAR | 10 | Mandatory (for procurement-side QIR) |
| QINF | KUNNR | KUNNR | Customer Number | CHAR | 10 | Mandatory (for customer-side QIR) |
| QINF | WERKS | WERKS_D | Plant | CHAR | 4 | Mandatory – Plant where QIR applies |
| QINF | QINFART | QINFART | QIR Category (Procurement, Sales) | CHAR | 1 | Mandatory |
| QINF | GUELTIGAB | DATUV | Valid-From Date | DATS | 8 | Mandatory |
| QINF | GUELTIGBIS | DATUB | Valid-To Date | DATS | 8 | Optional (blank = indefinite) |
| QINF | SPERRE | QSPERR | Procurement Block Indicator | CHAR | 1 | Optional (X = Blocked) |
| QINF | FREIGABE | QFREIG | Release Indicator | CHAR | 1 | Mandatory – Status of approval (Released/Not Released) |
| QINF | QVERWEIS | QVERW | Quality Agreement Reference (custom use) | CHAR | 10 | Optional |
| QINF | PRUEFLOSERF | QPRLOS | Source Inspection Required (Yes/No) | CHAR | 1 | Optional |
| QINF | KZLOE | KZLOEKZ | Deletion Flag | CHAR | 1 | Optional (exclude if flagged) |
| QINF | ERNAM | ERNAM | Created By | CHAR | 12 | Optional (audit) |
| QINF | ERDAT | ERDAT | Created On | DATS | 8 | Optional (audit) |
| QINF | AENAM | AENAM | Last Changed By | CHAR | 12 | Optional |
| QINF | AEDAT | AEDAT | Last Changed On | DATS | 8 | Optional |
| QIHT | MATNR | MATNR | Material Number | CHAR | 18 | Mandatory (link key) |
| QIHT | LIFNR / KUNNR | LIFNR / KUNNR | Vendor/Customer | CHAR | 10 | Mandatory (link key) |
| QIHT | WERKS | WERKS_D | Plant | CHAR | 4 | Mandatory (link key) |
| QIHT | SPRAS | SPRAS | Language Key | LANG | 1 | Mandatory |
| QIHT | KURZTEXT | KURZTEXT | Short Text / Description of QIR | CHAR | 40 | Mandatory (per language) |
| QIHT | LTEXTKZ | LTEXTKZ | Long Text Exists Indicator | CHAR | 1 | Optional (if long text maintained) |
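The requirement column above implies a simple structural check before load. The sketch below is illustrative only: the QINFART category codes "P" (procurement) and "S" (sales) are assumed values for the example, not confirmed S/4HANA domain values.

```python
# Mandatory fields common to every QIR per the target design table.
MANDATORY_COMMON = ["MATNR", "WERKS", "QINFART", "GUELTIGAB", "FREIGABE"]

def missing_mandatory_fields(qir):
    """Return the list of mandatory target fields that are empty or absent."""
    missing = [f for f in MANDATORY_COMMON if not qir.get(f)]
    if qir.get("QINFART") == "P" and not qir.get("LIFNR"):  # vendor mandatory on procurement side
        missing.append("LIFNR")
    if qir.get("QINFART") == "S" and not qir.get("KUNNR"):  # customer mandatory on sales side
        missing.append("KUNNR")
    return missing
```

A record passing this check is not necessarily loadable; it only satisfies the field-presence requirements of the table above.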


Data Cleansing

| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
| --- | --- | --- | --- | --- | --- |
| 2020-001 | C1 | Invalid or inactive Material in QIR | QIR references a material that does not exist in MARA/MARC or is flagged for deletion/blocked at plant. | Material | PF2/WP2 |
| 1139-001, 3018-001, 3021-001, 3024-001, 3025-001, 3028-001, 3031-001 | C1 | Invalid or inactive Vendor in QIR (procurement side) | QIR (MATNR/LIFNR/WERKS) references a vendor not in LFA1/LFM1 (or blocked for purchasing). | Vendor | PF2/WP2 |
| 1139-001, 3018-001, 3021-001, 3024-001, 3025-001, 3028-001, 3031-001 | C1 | Invalid or inactive Customer in QIR (sales side) | QIR (MATNR/KUNNR/WERKS) references a customer not in KNA1/KNVV (or blocked). | Customer | PF2/WP2 |
| 2010-001, 3027-001 | C1 | Invalid Plant | QIR plant (WERKS) does not exist or is obsolete. | Plant | PF2/WP2 |
| 1044-001 | C1 | Duplicate QIR key | Duplicate active QIRs with the same combination (Material + Vendor/Customer + Plant) and overlapping validity. | QIR | PF2/WP2 |
| 1044-001 | C1 | Validity overlap within same key | Multiple QIR records for the same key have overlapping GUELTIGAB/GUELTIGBIS dates. | QIR | PF2/WP2 |
| 1044-001 | C1 | QIR not released/approved | Release indicator/status not set to “Released/Approved” as per business rule. | QIR | PF2/WP2 |
| 1044-001 | C1 | QIR blocked | Procurement block indicator set (e.g., SPERRE = X) while record is flagged as active for migration. | QIR | PF2/WP2 |
| 1044-001 | C1 | Expired QIR | Valid-to date is before the cutover date; exclude unless explicitly required. | QIR | PF2/WP2 |
| 1044-001 | C1 | Future-dated QIR outside go-live horizon | Valid-from date is after the go-live window (e.g., > N days after cutover) and no business justification provided. | QIR | PF2/WP2 |
| 2020-001, 1139-001, 3018-001, 3021-001, 3024-001, 3025-001, 3028-001, 3031-001 | C1 | Source inspection required but no enabling setup | QIR requires source inspection (e.g., indicator set) but vendor/material lacks inspection type and control in the S/4 target design. | Material/Vendor Setup | PF2/WP2 |
| 1044-001 | C1 | Missing text/description for key languages | Language-dependent short text missing in required languages for the QIR. | QIR Text | PF2/WP2 |
| 1044-001 | C1 | Conflicting status flags | Record simultaneously marked for deletion and released (inconsistent). | QIR | PF2/WP2 |
| 1139-001, 3018-001, 3021-001, 3024-001, 3025-001, 3028-001, 3031-001 | C1 | Vendor Subrange / Purchasing Org mismatch | QIR expects vendor subrange or purchasing org-specific data that doesn’t exist/align. | Vendor Master | PF2/WP2 |
| 2009-001 | C1 | Material quality view missing | Material lacks required QM view/settings where policy mandates QM integration with QIR. | Material | PF2/WP2 |
| | C1 | Customer-side QIR without usage | Customer QIR exists but material/plant not configured to use it (sales-side QM not in scope). | Scope Decision | PF2/WP2 |
| | C1 | Obsolete custom fields populated | Legacy custom fields used in ECC with no S/4 mapping; values must be cleared/mapped. | Mapping Issue | PF2/WP2 |
| 1044-001 | C1 | Audit fields inconsistent | Created/changed by/on values are missing or malformed (optional clean-up for traceability). | QIR | PF2/WP2 |
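The duplicate-key and validity-overlap checks (rules 1044-001 above) can be sketched as follows. This is an illustrative check, not the rule implementation: dates are assumed to be YYYYMMDD strings (as in DATS fields), and a blank valid-to is treated as "indefinite" per the target design.

```python
from itertools import groupby

def find_validity_overlaps(qirs):
    """Return the (material, partner, plant) keys with overlapping validity windows."""
    key = lambda r: (r["material"], r["partner"], r["plant"])
    conflicts = []
    for k, group in groupby(sorted(qirs, key=key), key=key):
        recs = sorted(group, key=lambda r: r["valid_from"])
        for a, b in zip(recs, recs[1:]):
            a_to = a["valid_to"] or "99991231"  # blank valid-to = valid indefinitely
            if b["valid_from"] <= a_to:         # next window starts before this one ends
                conflicts.append(k)
                break
    return conflicts
```

Sorting by key before `groupby` is required, since `groupby` only merges adjacent records.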


Conversion Process

The high-level process is represented by the diagram below:

The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:

1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, such as the Syensqo ECC system (i.e. WP2/PF2). The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.


3. Loading
The transformed data is then loaded into the target S/4HANA system. 
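The three stages above can be sketched as plain function composition. This is a toy shape only: Python lists stand in for Syniti Migrate's staging tables and rule engine.

```python
def run_etl(extract, transforms, load):
    """Run the three ETL stages in order (illustrative shape only)."""
    staged = extract()                      # 1. extraction into the staging area
    for step in transforms:                 # 2. cleansing / standardization / business rules
        staged = [step(rec) for rec in staged]
    return load(staged)                     # 3. load into the target system
```

Each transformation step is applied record by record, mirroring how validation, standardization, and business rules are layered in the platform.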

image-2025-6-10_17-0-33.png


Data Privacy and Sensitivity

Not applicable


Extraction

Extract data from a source into Syniti Migrate. There are 2 possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are 3 methods:
    1. Perform full data extraction from relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source, data is loaded to the repository from the provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate. This is conducted using the DCT (Data Collection Template) in Syniti Migrate.

The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.

Extraction Run Sheet

| Req # | Requirement Description | Team Responsible |
| --- | --- | --- |
| 1 | Extraction Scope Definition: identify the source systems and databases involved; define the data objects (tables, fields, records) to be extracted; establish business rules for data selection. | Syniti / LTC Data team |
| 2 | Extraction Methodology: specify the extraction approach (full, incremental, or delta extraction); determine the tools and technologies used; define data filtering criteria to exclude irrelevant records. | Syniti |
| 3 | Extraction Execution Plan: establish execution timelines and batch processing schedules; assign responsibilities for extraction monitoring; document dependencies on other migration tasks. | Syniti |
| 4 | Data Quality and Validation: define error handling mechanisms for extraction failures. | Syniti |


Selection Screen


| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
| --- | --- | --- | --- | --- |





















Data Collection Template (DCT)

A target-ready Data Collection Template will be created for Quality Info Record data, with the exception of some fields which require transformation as described in the Transformation Rules section.

Quality Info Record DCT Rules

| Field Name | Field Description | Rule |
| --- | --- | --- |


Pending MDS and record counts in the source system









Extraction Dependencies

| Item # | Step Description | Team Responsible |
| --- | --- | --- |
| 1 | Source System Availability: ensure that the source database or application is accessible; confirm that necessary credentials and permissions are granted. | Syensqo IT |
| 2 | Data Structure: identify relationships between tables, views, and stored procedures. | Syniti |
| 3 | Referential Integrity: ensure dependent records are extracted together. | Syniti |
| 4 | Extraction Methodology: define whether extraction is full, incremental, or delta-based; establish batch processing schedules for large datasets. | Syniti |
| 5 | Performance and Scalability Considerations: optimize extraction queries to prevent system overload; ensure network bandwidth supports data transfer volumes. | Syniti |
| 6 | Security and Compliance: adhere to regulatory standards for sensitive information, if applicable. | Syniti |


Transformation

The target fields are mapped to the applicable legacy fields that will be their source. This is a 3-way activity involving the Business, Functional team, and Data team. The mapping identifies the transformation activity required to allow Syniti Migrate to make the data target ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform Pre-Load Data Validation.
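Step 1a above (legacy values mapped to to-be values, possibly with a default) can be sketched as a lookup with fallback. The plant mapping table and its values below are hypothetical examples, not the project's actual mapping.

```python
# Hypothetical legacy-plant -> to-be-plant mapping for illustration.
PLANT_MAP = {"1000": "S401", "2000": "S402"}

def map_value(legacy_value, mapping, default=None):
    """Map a legacy value to its to-be value, falling back to a default."""
    if legacy_value in mapping:
        return mapping[legacy_value]
    if default is not None:
        return default
    raise KeyError(f"no mapping and no default for legacy value {legacy_value!r}")
```

Raising on an unmapped value with no default surfaces mapping gaps during transformation rather than silently loading wrong data.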

Transformation Run Sheet

| Item # | Step Description | Team Responsible |
| --- | --- | --- |













Transformation Rules

| Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |









































Transformation Mapping

Use the exact mapping table name and reference this section in the “Transformation Rules” section above.
| Mapping Table Name | Mapping Table Description |
| --- | --- |








Transformation Dependencies

List the steps that need to occur before transformation can commence
| Item # | Step Description | Team Responsible |
| --- | --- | --- |













Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

| Task | Action |
| --- | --- |
| <title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used etc.> |





Accuracy

| Task | Action |
| --- | --- |
| <title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used etc.> |





Business

The following pre-load validations will be performed by the business.

Completeness

| Task | Action |
| --- | --- |
| <title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used etc.> |





Accuracy

| Task | Action |
| --- | --- |
| <title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used etc.> |





Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
  2. Once the data is loaded into the target system, it will be extracted and prepared for Post-Load Data Validation.
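Step 2 above feeds a completeness reconciliation: per-key record counts in the load file are compared against the post-load target extract. A minimal sketch, with key shape assumed for illustration:

```python
from collections import Counter

def reconcile_counts(loaded_keys, target_keys):
    """Return keys whose counts differ, as key -> (count_loaded, count_in_target)."""
    loaded, target = Counter(loaded_keys), Counter(target_keys)
    return {k: (loaded[k], target[k])
            for k in loaded.keys() | target.keys()
            if loaded[k] != target[k]}
```

An empty result means counts match per key; any entry is a candidate defect for the Error Handling process.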

Load Run Sheet

| Item # | Step Description | Team Responsible |
| --- | --- | --- |













Load Phase and Dependencies

Identify the phase in which the load for this object will occur <Pre-Cutover, Cutover, Post-Cutover> and list the steps that need to occur before the load can commence.

Configuration

List the Configurations required before loading can commence

| Item # | Configuration Item |
| --- | --- |






Conversion Objects

| Object # | Preceding Object Conversion Approach |
| --- | --- |

List the exact title of only the immediate predecessor conversion object – this will then confirm the DDD (Data Dependency Diagram).




Error Handling

The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.

| Error Type | Error Description | Action Taken |
| --- | --- | --- |










Post-Load Validation

Project Team

The following post-load validations will be performed by the Project Team.

Completeness

| Task | Action |
| --- | --- |
| <title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used etc.> |





Accuracy

| Task | Action |
| --- | --- |
| <title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used etc.> |





Business

The following post-load validations will be performed by the business.

Completeness

| Task | Action |
| --- | --- |
| <title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used etc.> |





Accuracy

| Task | Action |
| --- | --- |
| <title> | <specific details of what and how the task needs to be performed, e.g. which reports are being used etc.> |





Key Assumptions

  • Master Data Standard is up to date as of the date of documenting this conversion approach and data load.
  • Data Object is in scope based on data design and any exception requested by business.

Any additional key assumptions.


See also

Insert links and references to other documents which are relevant when trying to understand this decision and its implications. Other decisions are often impacted, so it's good to list them here with links. Attachments are also possible but dangerous as they are static documents and not updated by their authors.

Change log