Status

Owner
Stakeholders

Purpose

The purpose of this document is to define the conversion approach for creating the 1044 Quality Info Record object in S/4HANA.

Quality Info Records are used in SAP Quality Management (QM) to control the procurement and usage of materials from vendors or for customers. A QIR documents whether a supplier is approved to deliver a particular material, under what conditions, and with what quality agreements in place. It also supports restrictions such as blocking or requiring source inspections before delivery. This feature helps ensure supplier quality compliance and strengthens the linkage between procurement and quality processes.

In SAP S/4HANA, the structure and usage of Quality Info Records remain consistent with SAP ECC, generally defined by key combinations such as Material / Vendor / Plant or Material / Customer / Plant. QIRs can include attributes such as release status, validity dates, procurement blocks, source inspection requirements, and quality agreement indicators.

In SAP ECC, aside from the standard structures, Quality Info Records may also exist with extended combinations or enhancements, for example:

  • Material / Vendor without plant reference,

  • Material / Customer for sales-side QIRs,

  • Additional custom fields capturing supplier quality ratings,

  • Legacy blocks or obsolete release indicators that are no longer in use.
    Such cases must be reviewed carefully (pending MDS) to determine whether they remain relevant for migration.

This conversion aims to migrate active and relevant Quality Info Records from existing ECC systems into S/4HANA by applying the required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target S/4HANA system using standard SAP mechanisms such as BAPIs (e.g., BAPI_QM_QUALINFORECORD_CREATE), IDOCs, or direct table loads where applicable, ensuring compliance with procurement and quality integration processes.
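As a sketch of the target-ready payload assembled per record for the BAPI-based load path, the Python example below maps a cleansed legacy row to a BAPI-style parameter set. The parameter names (I_MATERIAL, I_VENDOR, etc.) are illustrative assumptions, not the verified BAPI interface, which must be checked in the target system.

```python
# Illustrative sketch only: assembles a target-ready payload for one
# Quality Info Record. The parameter names (I_MATERIAL, I_VENDOR, ...)
# are assumptions for illustration, not the verified BAPI interface.

def build_qir_payload(legacy_row):
    """Map a cleansed legacy QIR row to a BAPI-style parameter dict."""
    matnr = legacy_row["MATNR"]
    return {
        # Legacy-style zero padding for purely numeric material numbers.
        "I_MATERIAL": matnr.zfill(18) if matnr.isdigit() else matnr,
        "I_VENDOR": legacy_row["LIEFERANT"].zfill(10),
        "I_PLANT": legacy_row["WERK"],
        # Default the valid-to date when the legacy record has none.
        "I_VALID_TO": legacy_row.get("FREI_DAT", "99991231"),
    }

row = {"MATNR": "4711", "LIEFERANT": "100200", "WERK": "1000", "FREI_DAT": "20261231"}
payload = build_qir_payload(row)
```

In practice this dict would be handed to the prescribed load tool rather than built by hand; the sketch only shows the shape of the mapping step.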

Conversion Scope

The scope of this document covers the approach for converting active Quality Info Records from the legacy source systems into S/4HANA, following the Quality Info Record - Master Data Design Standard.

The data from the legacy systems includes:

  1. Active QIRs used within the last three (3) years in procurement or sales processes.

  2. QIRs without deletion flag.

  3. Plant-specific QIRs that will be migrated to the To-Be Plant Mapping (taking into consideration the To-Be definition of Plants).

  4. QIRs referenced in active material-vendor (procurement) or material-customer (sales) combinations.

The data from the legacy systems excludes:

  1. Inactive QIRs not used for more than three (3) years.

  2. QIRs marked for deletion.

  3. QIRs belonging to deleted plants (per To-Be Plant mapping).

  4. Obsolete or duplicate QIRs that are no longer relevant (e.g., replaced by harmonized corporate standards).
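The inclusion and exclusion rules above can be sketched as a relevancy filter applied to each extracted record. Field names and the three-year window implementation below are illustrative assumptions, not the actual Syniti rule definitions:

```python
from datetime import date

def is_in_scope(qir, cutover, tobe_plants):
    """Apply the inclusion/exclusion rules to one legacy QIR record.

    qir: dict with 'last_used' (date), 'deletion_flag' (bool), 'plant' (str).
    tobe_plants: legacy-plant -> To-Be-plant mapping; plants absent from the
    mapping are treated as deleted and therefore out of scope.
    """
    if qir["deletion_flag"]:                         # excluded: flagged for deletion
        return False
    if (cutover - qir["last_used"]).days > 3 * 365:  # excluded: inactive > 3 years
        return False
    if qir["plant"] not in tobe_plants:              # excluded: deleted plant
        return False
    return True

cutover = date(2025, 1, 1)
plants = {"P100": "S100"}
active = {"last_used": date(2023, 6, 1), "deletion_flag": False, "plant": "P100"}
stale = {"last_used": date(2020, 1, 1), "deletion_flag": False, "plant": "P100"}
```

A usage check: `is_in_scope(active, cutover, plants)` passes all three rules, while `stale` fails the three-year activity rule.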


List of source systems and approximate number of records

Source: PF2 and WP2

Scope: Quality Info Records will be extracted from the PF2 and WP2 clients.

Source Approx. No. of Records:

  • PF2: Raw Data: 33,650 records; In-scope: 8,086 records
  • WP2: Raw Data: 8,892 records; In-scope: 5,168 records

Target System: S/4HANA

Target Approx. No. of Records: 13,254 in-scope records

Additional Information

Multi-language Requirement

Not applicable

Document Management

Not applicable

Legal Requirement

Not applicable

Special Requirements

Not applicable


Target Design

The table below defines the technical design of the target structure for this conversion approach.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
QINF | MANDT | MANDT | Client for which the record was created | CLNT | 3 | R
QINF | MATNR | MATNR | Material number for which the quality info record is to be created | CHAR | 40 | R
QINF | WERK | WERKS_D | Plant code for which the quality info record is to be created | CHAR | 4 | R
QINF | LIEFERANT | ELIFN | Vendor code for which the quality info record is to be created | CHAR | 10 | R
QINF | REVLV | REVLV | The specific version of a material for a given vendor-plant combination in a quality info record | CHAR | 2 | R
QINF | FREI_DAT | QFREIDAT | Valid-to date for the quality info record | DATS | 8 | R
QINF | FREI_MNG | QFREIMG | Release quantity: a limit that controls how much of a product can be ordered from a specific vendor before further action is taken, such as a new quality inspection | QUAN | 13 | C
QINF | SPERRGRUND | QSPERRGR | Blocking reason code specifying why a material-vendor combination is blocked or released for procurement transactions such as quotations, purchase orders, or goods receipts | CHAR | 40 | C
QINF | SPERRFKT | QSPERRFKT | Blocking function: can block a supplier for purchase orders, goods receipts, or both, by setting the appropriate value | CHAR | 2 | C
QINF | FREI_MGKZ | QFREIMGKZ | Indicator that blocks new purchase orders for the specified material from the vendor once the release quantity has been exhausted | CHAR | 1 | C
QINF | NOINSP | QNOINSP | Indicates whether the goods receipt inspection is skipped for a specific material, vendor, and plant combination, e.g., for materials already known to be of high quality or where the business process requires no incoming inspection | CHAR | 1 | C
QINF | QSSYSFAM | QSSYS_IST | The quality management system used by the supplier for a particular material and vendor combination | CHAR | 4 | C
QINF | QSSYSDAT | QQSSYSDAT | Valid-to date for the actual (existing) QM system | DATS | 8 | C
QINF | VORLABN | QVORLABN | Lot creation lead time for source inspection | INT1 | 3 | C
QINF | VARIABNAHM | QINFVAR1 | Inspection type for inspection lot origin (source inspection) | CHAR | 8 | C
QINF | CERTCONTROL | QCERTCONTROL | Controls how quality certificates are handled for a material from a specific supplier, for example to trigger inspections or block procurement until a certificate is received and verified | CHAR | 1 | C
QINF | ZAEHL | Q_COUNT | Internal counter for DB objects | NUMC | 6 | S
QINF | ERSTELLER | QINFERST | Name of the user who created the Q-info record | CHAR | 12 | S
QINF | ERSTELLDAT | QINFERSTDT | Creation date of the Q-info record | DATS | 8 | S
QINF | AENDERER | QINFAEND | Name of the user who most recently changed the Q-info record | CHAR | 12 | S
QINF | AENDERDAT | QINFAENDDT | Change date of the Q-info record | DATS | 8 | S
QINF | BEST_MG | QBSTMG | Accumulated total quantity of the material ordered from the vendor for the given plant | QUAN | 13 | C
QINF | ME | QBSTME | Base unit of measure for the material | UNIT | 3 | C
QINF | DAT_RUECK | QRSDAT | Reset date: clears the accumulated total ordered quantity, allowing new orders to be placed again against the release quantity | DATS | 8 | C
QINF | SPRACHE | SPRAS | Language in which information such as text descriptions or status messages is stored or displayed | LANG | 1 | C
QINF | QVVORH | QQSVVORH | Indicates that a quality agreement exists for this material with this supplier; if set, certain aspects of the quality info record are determined by the quality agreement | CHAR | 1 | S
QINF | LTEXTKZ | QLTEXTKZ | Inspection characteristic long text exists | CHAR | 1 | C
QINF | SPRAS | SPRAS | Language key allowing information to be stored and retrieved in different languages for users and customers with varying language preferences | LANG | 1 | C
QINF | PLOS | QPLOSQINF | Direct link between the material's procurement record and the quality inspection process for the specific batch or quantity | NUMC | 12 | S
QINF | PLOS2 | QPLOSQINF2 | Links the info record to a subsequent inspection lot, to define conditions or actions triggered by the outcome of a previous inspection | NUMC | 12 | S
QINF | STSMA | QSTSMAQINF | Status profile defining the set of statuses for a quality info record | CHAR | 8 | C
QINF | OBJNR | QINFOBJNR | System-generated unique identifier for the quality info record | CHAR | 22 | S
QINF | NOINSPABN | QNOWEABN | Prevents a goods receipt (GR) inspection from taking place if a source inspection lot has already been created for the material and vendor combination | CHAR | 1 | C
QINF | LOEKZ | QLOEKZ | Indicates that the quality info record is flagged for deletion or has been archived, but not yet physically deleted from the database | CHAR | 1 | C
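To illustrate how the release-quantity fields above interact, the sketch below evaluates whether the accumulated ordered quantity (BEST_MG) has exhausted the release quantity (FREI_MNG), the condition under which the FREI_MGKZ indicator blocks further purchase orders. This is a simplified illustration of the behaviour, not the SAP implementation:

```python
def release_quantity_exhausted(rec):
    """Return True when the accumulated ordered quantity (BEST_MG) has
    reached the release quantity (FREI_MNG), i.e. the state in which the
    FREI_MGKZ indicator blocks further purchase orders.

    Simplified illustration only; a record without a maintained release
    quantity is never considered exhausted.
    """
    if rec.get("FREI_MNG", 0) <= 0:
        return False
    return rec.get("BEST_MG", 0) >= rec["FREI_MNG"]
```

Note that the reset date (DAT_RUECK) would clear BEST_MG in the real system, re-opening ordering against the release quantity.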


Data Cleansing

ID | Criticality | Error Message/Report Description | Rule | Output | Source System
2020-001 | C1 | Invalid or inactive Material in QIR | QIR references a material that does not exist in MARA/MARC or is flagged for deletion/blocked at plant level. | Material | PF2/WP2
1139-001, 3018-001, 3021-001, 3024-001, 3025-001, 3028-001, 3031-001 | C1 | Invalid or inactive Vendor in QIR (procurement side) | QIR (MATNR/LIFNR/WERKS) references a vendor not in LFA1/LFM1 (or blocked for purchasing). | Vendor | PF2/WP2
1139-001, 3018-001, 3021-001, 3024-001, 3025-001, 3028-001, 3031-001 | C1 | Invalid or inactive Customer in QIR (sales side) | QIR (MATNR/KUNNR/WERKS) references a customer not in KNA1/KNVV (or blocked). | Customer | PF2/WP2
2010-001, 3027-001 | C1 | Invalid Plant | QIR plant (WERKS) does not exist or is obsolete. | Plant | PF2/WP2
1044-001 | C1 | Duplicate QIR key | Duplicate active QIRs with the same combination (Material + Vendor/Customer + Plant) and overlapping validity. | QIR | PF2/WP2
1044-001 | C1 | Validity overlap within same key | Multiple QIR records for the same key have overlapping GUELTIGAB/GUELTIGBIS dates. | QIR | PF2/WP2
1044-001 | C1 | QIR not released/approved | Release indicator/status not set to “Released/Approved” as per business rule. | QIR | PF2/WP2
1044-001 | C1 | QIR blocked | Procurement block indicator set (e.g., SPERRE = X) while the record is flagged as active for migration. | QIR | PF2/WP2
1044-001 | C1 | Expired QIR | Valid-to date is before the cutover date; exclude unless explicitly required. | QIR | PF2/WP2
1044-001 | C1 | Future-dated QIR outside go-live horizon | Valid-from date is after the go-live window (e.g., > N days after cutover) and no business justification is provided. | QIR | PF2/WP2
2020-001, 1139-001, 3018-001, 3021-001, 3024-001, 3025-001, 3028-001, 3031-001 | C1 | Source inspection required but no enabling setup | QIR requires source inspection (e.g., indicator set) but the vendor/material lacks the inspection type and control in the S/4 target design. | Material/Vendor Setup | PF2/WP2
1044-001 | C1 | Missing text/description for key languages | Language-dependent short text missing in required languages for the QIR. | QIR Text | PF2/WP2
1044-001 | C1 | Conflicting status flags | Record simultaneously marked for deletion and released (inconsistent). | QIR | PF2/WP2
1139-001, 3018-001, 3021-001, 3024-001, 3025-001, 3028-001, 3031-001 | C1 | Vendor Subrange / Purchasing Org mismatch | QIR expects vendor subrange or purchasing-org-specific data that does not exist or align. | Vendor Master | PF2/WP2
2009-001 | C1 | Material quality view missing | Material lacks the required QM view/settings where policy mandates QM integration with QIR. | Material | PF2/WP2
| C1 | Customer-side QIR without usage | Customer QIR exists but material/plant is not configured to use it (sales-side QM not in scope). | Scope Decision | PF2/WP2
| C1 | Obsolete custom fields populated | Legacy custom fields used in ECC with no S/4 mapping; values must be cleared or mapped. | Mapping Issue | PF2/WP2
1044-001 | C1 | Audit fields inconsistent | Created/changed by/on values are missing or malformed (optional clean-up for traceability). | QIR | PF2/WP2
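Several of the 1044-001 rules above hinge on duplicate keys with overlapping validity windows. A minimal sketch of such a check, assuming each record carries its key fields and ISO-formatted validity dates:

```python
from collections import defaultdict

def find_validity_overlaps(records):
    """Flag QIRs that share a key (material, partner, plant) and have
    overlapping validity windows.

    Each record is (matnr, partner, werks, valid_from, valid_to) with
    ISO-formatted date strings, which sort and compare correctly as text.
    """
    by_key = defaultdict(list)
    for rec in records:
        by_key[rec[:3]].append(rec)
    overlaps = []
    for recs in by_key.values():
        recs.sort(key=lambda r: r[3])      # order by valid-from
        for prev, cur in zip(recs, recs[1:]):
            if cur[3] <= prev[4]:          # starts before the previous ends
                overlaps.append((prev, cur))
    return overlaps
```

Flagged pairs would then be routed to the cleansing report rather than corrected automatically.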


Conversion Process

The high-level process is represented by the diagram below:

The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:

1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, in this case the Syensqo ECC clients (WP2 and PF2). The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.


3. Loading
The transformed data is then loaded into the target S/4HANA system. 
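The three steps above can be sketched as a minimal pipeline, with the extractor, transformation rules, and loader supplied as functions. The toy extractor and loader here are placeholders, not Syniti components:

```python
def run_etl(extract, transforms, load):
    """Minimal ETL skeleton: pull rows from the extractor, apply each
    transformation rule in order, then hand the result to the loader."""
    rows = extract()
    for transform in transforms:
        rows = [transform(row) for row in rows]
    return load(rows)

# Toy stand-ins for the real stages (illustrative only).
extract = lambda: [{"MATNR": " 4711 "}, {"MATNR": "0815"}]
trim = lambda row: {k: v.strip() for k, v in row.items()}  # standardization rule
result = run_etl(extract, [trim], lambda rows: rows)
```

Ordering the transforms as a list mirrors the design intent: validation, standardization, then business rules applied in sequence.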


Data Privacy and Sensitivity

Not applicable


Extraction

Extract data from a source into Syniti Migrate. There are 2 possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are 3 methods:
    1. Perform full data extraction from relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source is the data loaded into the repository from a provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template).

The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.

Extraction Run Sheet

Req #Requirement DescriptionTeam Responsible

1

Extraction Scope Definition

- Identify the source systems and databases involved.
- Define the data objects (tables, fields, records) to be extracted.
- Establish business rules for data selection.

Syniti / LTC Data team

2

Extraction Methodology

- Specify the extraction approach (full, incremental, or delta extraction).
- Determine the tools and technologies used.
- Define data filtering criteria to exclude irrelevant records.

Syniti

3

Extraction Execution Plan

- Establish execution timelines and batch processing schedules.
- Assign responsibilities for extraction monitoring.
- Document dependencies on other migration tasks.

Syniti

4

Data Quality and Validation

- Define error handling mechanisms for extraction failures.

Syniti


Selection Screen


Selection Ref ScreenParameter NameSelection TypeRequirementValue to be entered/set
Data Collection Template (DCT)

The Data Collection Template (DCT) will not be applicable in this case. If there is a need to create new Master Data (MD) for the Quality Info Record object, the business must perform this activity in the source system. The newly created object will then be captured and migrated as part of the standard migration process.

Extraction Dependencies

Item #Step DescriptionTeam Responsible
1

Source System Availability

  • Ensure that the source database or application is accessible.
  • Confirm that necessary credentials and permissions are granted

Syensqo IT

2

Data Structure

  • Identify relationships between tables, views, and stored procedures.

Syniti

3

Referential Integrity

  • Ensure dependent records are extracted together.

Syniti

4

Extraction Methodology

  • Define whether extraction is full, incremental, or delta-based.
  • Establish batch processing schedules for large datasets.

Syniti

5

Performance and Scalability Considerations

  • Optimize extraction queries to prevent system overload.
  • Ensure network bandwidth supports data transfer volumes.

Syniti

6

Security and Compliance

  • Adhere to regulatory standards for sensitive information if applicable

Syniti


Transformation

The target fields are mapped to the applicable legacy fields that will be their source; this is a three-way activity involving the Business, Functional team, and Data team. The mapping identifies the transformation activity required for Syniti Migrate to make the data target-ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data for the business to perform Pre-Load Data Validation.
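Step 1 above can be sketched as a per-field lookup with agreed defaults. The field names and mapping tables here are illustrative, standing in for the value mappings maintained in Syniti Migrate:

```python
def transform_row(row, value_maps, defaults):
    """Apply value-mapping rules to one extracted row.

    value_maps: field -> {legacy value: to-be value} lookup tables.
    defaults:   field -> default to-be value used when the legacy value
                has no mapping entry; without a default, the legacy
                value is kept as-is.
    """
    out = dict(row)
    for field, mapping in value_maps.items():
        legacy = out.get(field)
        out[field] = mapping.get(legacy, defaults.get(field, legacy))
    return out

# Illustrative plant mapping: legacy plant 0001 becomes To-Be plant 1000.
plant_maps = {"WERK": {"0001": "1000"}}
```

An unmapped legacy value either falls back to the agreed default or is carried over unchanged, which makes missing mapping entries easy to spot during pre-load validation.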

Transformation Run Sheet

Item #Step DescriptionTeam Responsible

1

Transformation Scope Definition

- Identify the source and target data structures.

- Define business rules for data standardization.

- Establish data cleansing requirements to remove inconsistencies.

Data Team

2

Data Mapping and Standardization

- Align source fields with target fields.

- Ensure unit consistency (e.g., currency, measurement units)

Data Team

3

Business Rule Application

- Implement data enrichment/collection if applicable

- Apply conditional transformations based on predefined logic/business rules

Data Team

4

Transformation Execution Plan

- Define batch processing schedules.

- Assign responsibilities for monitoring execution.

- Establish error-handling mechanisms

Syniti


Transformation Rules

Rule #Source systemSource TableSource FieldSource DescriptionTarget SystemTarget TableTarget FieldTarget DescriptionTransformation Logic

Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
Mapping Table NameMapping Table Description








Transformation Dependencies

List the steps that need to occur before transformation can commence
Item #Step DescriptionTeam Responsible

1

Value Mappings are according to the latest design - <List of Value Mappings>

SyWay Data Team


Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

TaskAction

Compare Data Counts

  1. Verify row counts between source and target databases.
  2. Identify missing or duplicated records.

Validate the mandatory fields

Validate that all mandatory fields are populated

Validate Primary Keys and Unique Constraints

  1. Check for duplicate or missing primary key values.
  2. Ensure unique constraints are maintained.

Test Referential Integrity

Confirm dependent records exist in related tables
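The completeness checks above can be sketched as a single routine over the source extract and the load file. Field and key names here are illustrative:

```python
def preload_checks(source_rows, target_rows, key_fields, mandatory_fields):
    """Run the completeness checks: record-count match, mandatory fields
    populated, and unique keys in the load file. Returns finding strings
    for the validation report."""
    findings = []
    if len(source_rows) != len(target_rows):
        findings.append(
            f"count mismatch: {len(source_rows)} source vs {len(target_rows)} target")
    seen = set()
    for i, row in enumerate(target_rows):
        for field in mandatory_fields:
            if not row.get(field):
                findings.append(f"row {i}: missing mandatory field {field}")
        key = tuple(row.get(k) for k in key_fields)
        if key in seen:
            findings.append(f"row {i}: duplicate key {key}")
        seen.add(key)
    return findings
```

An empty findings list means the load file passes these completeness checks; anything else is raised before the load is executed.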


Accuracy

TaskAction

Validate the transformation

Validate that fields requiring transformation contain the transformed value rather than the original field value

Check Data Consistency

  1. Compare field values across systems
  2. Validate data formats and structures

Business

Completeness

TaskAction

Compare Data Count

  1. Verify row counts between source and target databases.
  2. Identify missing or duplicated records.
Review populated templates for missing or incorrect values

Use checklists to verify completeness and correctness before submission



Accuracy

TaskAction

Conversion Accuracy

Business Data Owner/s to verify that all the data in the load table/file is accurate as per endorsed transformation/ mapping rules (and signed-off DCT data).


Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually
  2. Once the data is loaded to the target system, it will be extracted and prepared for Post-Load Data Validation

Load Run Sheet

Item #Step DescriptionTeam Responsible

1

Load Scope Definition

- Identify the target system and database structure.

- Define data objects (tables, fields, records) to be loaded.

- Establish business rules for data validation.

Data team

2

Load Methodology

- Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program).

Syniti 

3

Data Quality and Validation

- Ensure data integrity checks (null values, duplicates, format validation).

- Perform pre-load validations to verify completeness.

- Define error handling mechanisms for load failures

Syniti 

4

Load Execution Plan

- Establish execution timelines and batch processing schedules.

- Assign responsibilities for monitoring execution.

- Document dependencies on other migration tasks

Syniti

5

Logging and Reporting

- Maintain detailed logs of loading activities.

- Generate summary reports on loaded data volume and quality.

- Define escalation procedures for errors

Syniti 


Load Phase and Dependencies

The Quality Info Records will be loaded in the pre-cutover period (PreCutover 4 phase).

Before loading, this object depends on the following configuration and data objects in S/4HANA.

Configuration

List the Configurations required before loading can commence

Item #Configuration Item






Conversion Objects

Object #Preceding Object Conversion Approach

List the exact title of the conversion object of only the immediate predecessor; this will then confirm the DDD (Data Dependency Diagram)




Error Handling

The table below depicts some possible system errors for this data object during data load. All data load errors are to be logged as defects and managed within the Defect Management process.

Error TypeError DescriptionAction Taken










Post-Load Validation

Project Team

Completeness

TaskAction

Validate Record count in the backend

Validate that the QINF table contains the same number of records as the load file

Display Records

Pick a few random Quality Info Records and run t-code QI03 to validate that the quality info records can be displayed without any error.

Perform Source-to-Target Comparisons

  1. Validate that migrated data matches source records.
  2. Check for discrepancies in numerical values, text fields, and timestamps
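A minimal sketch of the source-to-target comparison for one migrated record pair, returning the fields whose values diverge (field names are illustrative):

```python
def compare_records(source, target, fields):
    """Field-by-field comparison for one migrated record; returns the
    mismatching (field, source_value, target_value) triples for the
    post-load reconciliation report."""
    return [(f, source.get(f), target.get(f))
            for f in fields if source.get(f) != target.get(f)]

# Example: the plant value diverged between source and target.
diff = compare_records({"MATNR": "A", "WERK": "1000"},
                       {"MATNR": "A", "WERK": "2000"},
                       ["MATNR", "WERK"])
```

Transformed fields would be compared against their expected to-be values rather than the raw source values, so this check applies after the transformation rules have been accounted for.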

Accuracy

TaskAction

Execute Sample Queries and Reports

  1. Run queries to validate business logic.
  2. Generate reports to compare expected vs. actual results

Conduct Post-Migration Reconciliation

Generate reports comparing pre- and post-migration data.


Business

Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.

1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections

Completeness

TaskAction
Perform Source-to-Target Comparisons
  1. Validate that migrated data matches source records.
  2. Check for discrepancies in numerical values, text fields, and timestamps
Conduct Post-Migration Reconciliation

Go through reports comparing pre- and post-migration data.

Accuracy

TaskAction

Perform Manual Testing

Conduct manual spot-checks for additional assurance.


Key Assumptions

  • The Master Data Standard is up to date as of the date of documenting this conversion approach and data load.
  • Quality Info Record is in scope based on the data design and any exceptions requested by the business.
  • Data cleansing has met the required percentage threshold for the specified mock cycle and all preparation activities have been completed.
  • Data entries in the DCT are target-ready unless a specific transformation rule is stated for that field in the transformation rules.


See also


Change log