| Status | Revision in Progress |
|---|---|
| Owner | |
| Stakeholders | |
Purpose
The purpose of this document is to define the conversion approach for creating the 1044 Quality Info Record (QIR) object in S/4HANA.
Quality Info Records are used in SAP Quality Management (QM) to control the procurement and usage of materials from vendors or for customers. A QIR documents whether a supplier is approved to deliver a particular material, under what conditions, and with what quality agreements in place. It also supports restrictions such as blocking or requiring source inspections before delivery. This feature helps ensure supplier quality compliance and strengthens the linkage between procurement and quality processes.
In SAP S/4HANA, the structure and usage of Quality Info Records remain consistent with SAP ECC, generally defined by key combinations such as Material / Vendor / Plant or Material / Customer / Plant. QIRs can include attributes such as release status, validity dates, procurement blocks, source inspection requirements, and quality agreement indicators.
In SAP ECC, aside from the standard structures, Quality Info Records may also exist with extended combinations or enhancements, for example:
- Material / Vendor without plant reference.
- Material / Customer for sales-side QIRs.
- Additional custom fields capturing supplier quality ratings.
- Legacy blocks or obsolete release indicators that are no longer in use.
Such cases must be reviewed carefully (pending MDS) to determine whether they remain relevant for migration.
This conversion aims to migrate active and relevant Quality Info Records from existing ECC systems into S/4HANA by applying the required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target S/4HANA system using standard SAP mechanisms such as BAPIs (e.g., BAPI_QM_QUALINFORECORD_CREATE), IDOCs, or direct table loads where applicable, ensuring compliance with procurement and quality integration processes.
Conversion Scope
The scope of this document covers the approach for converting active Quality Info Records from Legacy Source Systems into S/4HANA, following the Quality Info Record - Master Data Design Standard.
The data from legacy system includes:
- Active QIRs used within the last three (3) years in procurement or sales processes.
- QIRs without a deletion flag.
- Plant-specific QIRs that will be migrated according to the To-Be Plant Mapping (taking into consideration the To-Be definition of Plants).
- QIRs referenced in active material-vendor (procurement) or material-customer (sales) combinations.
The data from legacy system excludes:
- Inactive QIRs not used within the last three (3) years.
- QIRs marked for deletion.
- QIRs belonging to deleted plants (per the To-Be Plant Mapping).
- Obsolete or duplicate QIRs that are no longer relevant (e.g., replaced by harmonized corporate standards).
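To make the inclusion/exclusion rules concrete, a minimal relevancy filter can be sketched in Python. The record layout, cutoff date, and To-Be plant mapping below are illustrative assumptions; the actual filter is implemented in Syniti against the extracted QINF data.

```python
from datetime import date, timedelta

# Illustrative assumptions: cutoff for the "used within last 3 years" rule
# and a sample To-Be plant mapping (not the project's real mapping table).
CUTOFF = date(2025, 1, 1) - timedelta(days=3 * 365)
TO_BE_PLANTS = {"1000": "S401", "2000": "S402"}

def is_in_scope(qir: dict) -> bool:
    """Apply the inclusion/exclusion rules above to one legacy QIR record."""
    if qir.get("deletion_flag"):          # exclude: marked for deletion
        return False
    if qir["plant"] not in TO_BE_PLANTS:  # exclude: plant deleted in To-Be mapping
        return False
    if qir["last_used"] < CUTOFF:         # exclude: inactive for more than 3 years
        return False
    return True

records = [
    {"plant": "1000", "deletion_flag": False, "last_used": date(2024, 6, 1)},
    {"plant": "9999", "deletion_flag": False, "last_used": date(2024, 6, 1)},
    {"plant": "1000", "deletion_flag": True,  "last_used": date(2024, 6, 1)},
]
in_scope = [r for r in records if is_in_scope(r)]  # only the first record survives
```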
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| PF2 and WP2 | Quality Info Records will be extracted from the PF2 and WP2 clients | PF2: 33,650 raw records; WP2: 8,892 raw records | S/4HANA | In-scope data: 13,254 records |
Additional Information
Multi-language Requirement
Not applicable
Document Management
Not applicable
Legal Requirement
Not applicable
Special Requirements
Not applicable
Target Design
This section describes the technical design of the target structure for this conversion approach.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
| QINF | MANDT | MANDT | The client field, identifies the SAP client for which record was created | CLNT | 3 | R |
| QINF | MATNR | MATNR | Material Number for which quality info record is to be created | CHAR | 40 | R |
| QINF | WERK | WERKS_D | Plant Code for which quality info record is to be created | CHAR | 4 | R |
| QINF | LIEFERANT | ELIFN | Vendor Code for which quality info record is to be created | CHAR | 10 | R |
| QINF | REVLV | REVLV | This field stores the specific version of a material for a given vendor-plant combination in a quality info record | CHAR | 2 | R |
| QINF | FREI_DAT | QFREIDAT | Valid to date for the quality info record | DATS | 8 | R |
| QINF | FREI_MNG | QFREIMG | This field functions as a limit that controls how much of a material can be ordered from a specific vendor before further action is taken, such as requiring a new quality inspection. | QUAN | 13 | C |
| QINF | SPERRGRUND | QSPERRGR | This field is a blocking reason code that specifies why a material-vendor combination is blocked or released for procurement transactions like quotations, purchase orders, or goods receipts. | CHAR | 40 | C |
| QINF | SPERRFKT | QSPERRFKT | This field can block a supplier for purchase orders, goods receipts, or both, depending on the value set. | CHAR | 2 | C |
| QINF | FREI_MGKZ | QFREIMGKZ | This indicator effectively blocks new purchase orders for the specified material from that vendor, as the "release quantity" has been exhausted or used up. | CHAR | 1 | C |
| QINF | NOINSP | QNOINSP | This field indicates whether a goods receipt inspection is to be skipped for a specific material, vendor, and plant combination. It allows users to skip inspections for materials that are already known to be of high quality or where no incoming inspection is required by the business process. | CHAR | 1 | C |
| QINF | QSSYSFAM | QSSYS_IST | This field indicates the specific quality management system used by a supplier for a particular material and vendor combination. | CHAR | 4 | C |
| QINF | QSSYSDAT | QQSSYSDAT | Valid to Date for Actual (Existing) QM system | DATS | 8 | C |
| QINF | VORLABN | QVORLABN | Lot Creation lead time for source inspection | INT1 | 3 | C |
| QINF | VARIABNAHM | QINFVAR1 | Inspection type for inspection lot origin (source inspection) | CHAR | 8 | C |
| QINF | CERTCONTROL | QCERTCONTROL | This field controls how quality certificates are handled for a material from a specific supplier, for example to trigger inspections or block procurement until a certificate is received and verified. | CHAR | 1 | C |
| QINF | ZAEHL | Q_COUNT | Internal Counter for DB Objects | NUMC | 6 | S |
| QINF | ERSTELLER | QINFERST | Name of User who Created the Q-Info Record | CHAR | 12 | S |
| QINF | ERSTELLDAT | QINFERSTDT | Create Date of Q-Info Record | DATS | 8 | S |
| QINF | AENDERER | QINFAEND | Name of User who Most Recently Changed the Q-Info Record | CHAR | 12 | S |
| QINF | AENDERDAT | QINFAENDDT | Change Date of Q-Info Record | DATS | 8 | S |
| QINF | BEST_MG | QBSTMG | This field accumulates the total quantity of a specific material that has been ordered from a particular vendor for a given plant. | QUAN | 13 | C |
| QINF | ME | QBSTME | Base Unit of Measure for the Material | UNIT | 3 | C |
| QINF | DAT_RUECK | QRSDAT | The reset date clears the accumulated total ordered quantity, allowing for new orders to be placed again against the release quantity. | DATS | 8 | C |
| QINF | SPRACHE | SPRAS | This field indicates the language in which information, like text descriptions or status messages, is stored or displayed | LANG | 1 | C |
| QINF | QVVORH | QQSVVORH | If a quality agreement exists for this material with this supplier, certain aspects of the quality info record are determined by the quality agreement. | CHAR | 1 | S |
| QINF | LTEXTKZ | QLTEXTKZ | Inspection Characteristic Long Text Exists | CHAR | 1 | C |
| QINF | SPRAS | SPRAS | This field allows SAP to store and retrieve information in different languages for users and customers with varying language preferences. | LANG | 1 | C |
| QINF | PLOS | QPLOSQINF | This field provides the direct link between the material's procurement record and quality inspection process for that specific batch or quantity | NUMC | 12 | S |
| QINF | PLOS2 | QPLOSQINF2 | This field is used to link a info record to a subsequent inspection lot, to define conditions or actions that should be triggered based on the outcome of a previous inspection. | NUMC | 12 | S |
| QINF | STSMA | QSTSMAQINF | Status profile defines a set of statuses for a quality info record | CHAR | 8 | C |
| QINF | OBJNR | QINFOBJNR | System generated unique identifier for quality info record | CHAR | 22 | S |
| QINF | NOINSPABN | QNOWEABN | This field prevents goods receipt (GR) inspection from taking place if a source inspection lot has already been created for that material and vendor combination. | CHAR | 1 | C |
| QINF | LOEKZ | QLOEKZ | This field indicates that quality info record is flagged for deletion or has been archived, but it is not yet physically deleted from the database | CHAR | 1 | C |
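The Requirement column above (R = required, C = conditional, S = system-populated) can drive an automated completeness check during transformation. The sketch below is a simplified illustration covering only a subset of fields; conditional (C) logic is not evaluated, and system-populated (S) fields are deliberately omitted because they are not supplied in the load file.

```python
# Requirement codes from the target design table (subset, for illustration).
FIELD_REQUIREMENTS = {
    "MATNR": "R", "WERK": "R", "LIEFERANT": "R", "REVLV": "R", "FREI_DAT": "R",
    "SPERRFKT": "C", "NOINSP": "C",
}

def missing_required(record: dict) -> list:
    """Return required (R) fields that are empty or absent in a staged QINF record."""
    return [field for field, req in FIELD_REQUIREMENTS.items()
            if req == "R" and not record.get(field)]

rec = {"MATNR": "MAT-001", "WERK": "1000", "LIEFERANT": "", "REVLV": "01"}
print(missing_required(rec))  # prints ['LIEFERANT', 'FREI_DAT']
```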
Data Cleansing
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
| 2020-001 | C1 | Invalid or inactive Material in QIR | QIR references a material that does not exist in MARA/MARC or is flagged for deletion/blocked at plant. | Material | PF2/WP2 |
| 1139-001 3018-001 3021-001 3024-001 3025-001 3028-001 3031-001 | C1 | Invalid or inactive Vendor in QIR (procurement side) | QIR (MATNR/LIFNR/WERKS) references vendor not in LFA1/LFM1 (or blocked for purchasing). | Vendor | PF2/WP2 |
| 1139-001 3018-001 3021-001 3024-001 3025-001 3028-001 3031-001 | C1 | Invalid or inactive Customer in QIR (sales side) | QIR (MATNR/KUNNR/WERKS) references customer not in KNA1/KNVV (or blocked). | Customer | PF2/WP2 |
| 2010-001 3027-001 | C1 | Invalid Plant | QIR plant (WERKS) does not exist or is obsolete. | Plant | PF2/WP2 |
| 1044-001 | C1 | Duplicate QIR key | Duplicate active QIRs with same combination (Material + Vendor/Customer + Plant) and overlapping validity. | QIR | PF2/WP2 |
| 1044-001 | C1 | Validity overlap within same key | Multiple QIR records for same key have overlapping GUELTIGAB/GUELTIGBIS dates. | QIR | PF2/WP2 |
| 1044-001 | C1 | QIR not released/approved | Release indicator/status not set to “Released/Approved” as per business rule. | QIR | PF2/WP2 |
| 1044-001 | C1 | QIR blocked | Procurement block indicator set (e.g., SPERRE = X) while record flagged as active for migration. | QIR | PF2/WP2 |
| 1044-001 | C1 | Expired QIR | Valid-to date is before cutover date; exclude unless explicitly required. | QIR | PF2/WP2 |
| 1044-001 | C1 | Future-dated QIR outside go-live horizon | Valid-from date is after the go-live window (e.g., > N days after cutover) and no business justification provided. | QIR | PF2/WP2 |
| 2020-001 1139-001 3018-001 3021-001 3024-001 3025-001 3028-001 3031-001 | C1 | Source Inspection required but no enabling setup | QIR requires source inspection (e.g., indicator set) but vendor/material lacks inspection type & control in S/4 target design. | Material/Vendor Setup | PF2/WP2 |
| 1044-001 | C1 | Missing text/description for key languages | Language-dependent short text missing in required languages for the QIR. | QIR Text | PF2/WP2 |
| 1044-001 | C1 | Conflicting status flags | Record simultaneously marked for deletion and released (inconsistent). | QIR | PF2/WP2 |
| 1139-001 3018-001 3021-001 3024-001 3025-001 3028-001 3031-001 | C1 | Vendor Subrange / Purchasing Org mismatch | QIR expects vendor subrange or purchasing org-specific data that doesn’t exist/align. | Vendor Master | PF2/WP2 |
| 2009-001 | C1 | Material quality view missing | Material lacks required QM view/settings where policy mandates QM integration with QIR. | Material | PF2/WP2 |
| | C1 | Customer-side QIR without usage | Customer QIR exists but material/plant not configured to use it (sales-side QM not in scope). | Scope Decision | PF2/WP2 |
| | C1 | Obsolete custom fields populated | Legacy custom fields used in ECC with no S/4 mapping; values must be cleared/mapped. | Mapping Issue | PF2/WP2 |
| 1044-001 | C1 | Audit fields inconsistent | Created/changed by/on are missing or malformed (optional clean-up for traceability). | QIR | PF2/WP2 |
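The duplicate-key and validity-overlap rules (1044-001 above) can be sketched as a grouping check. The field names below (`matnr`, `partner`, `werks`, with `valid_from`/`valid_to` standing in for GUELTIGAB/GUELTIGBIS) are illustrative; the production rule runs inside Syniti.

```python
from collections import defaultdict
from datetime import date

def overlapping_duplicates(qirs: list) -> list:
    """Flag QIR keys (Material + Partner + Plant) that have more than one
    record with overlapping validity ranges."""
    by_key = defaultdict(list)
    for q in qirs:
        by_key[(q["matnr"], q["partner"], q["werks"])].append(q)
    hits = []
    for key, group in by_key.items():
        group.sort(key=lambda q: q["valid_from"])
        for a, b in zip(group, group[1:]):
            if b["valid_from"] <= a["valid_to"]:  # next record starts before previous ends
                hits.append(key)
    return hits

qirs = [
    {"matnr": "M1", "partner": "V1", "werks": "1000",
     "valid_from": date(2023, 1, 1), "valid_to": date(2024, 12, 31)},
    {"matnr": "M1", "partner": "V1", "werks": "1000",
     "valid_from": date(2024, 6, 1), "valid_to": date(2025, 12, 31)},
]
flagged = overlapping_duplicates(qirs)  # the M1/V1/1000 key overlaps
```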
Conversion Process
The high-level process is represented by the diagram below:
The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:
1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, i.e., the Syensqo ECC clients (WP2/PF2). The extracted data is then staged for transformation.
2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.
3. Loading
The transformed data is then loaded into the target S/4HANA system.
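The three steps above can be sketched as a minimal pipeline skeleton. This is an illustrative Python sketch only; the actual extraction, transformation, and load run inside Syniti and the SAP load mechanisms described later.

```python
def extract(source):
    """Stage raw rows from the legacy client (stand-in for the table extract)."""
    return list(source)

def transform(rows, plant_map):
    """Cleanse and apply value mapping (To-Be plant codes), per step 2 above."""
    out = []
    for r in rows:
        if r.get("deleted"):
            continue  # cleansing: drop deletion-flagged rows
        out.append(dict(r, WERK=plant_map.get(r["WERK"], r["WERK"])))
    return out

def load(rows, target):
    """Stand-in for the BAPI/IDoc/table load into S/4HANA."""
    target.extend(rows)
    return len(rows)

staging = [{"WERK": "1000"}, {"WERK": "2000", "deleted": True}]
target = []
loaded = load(transform(extract(staging), {"1000": "S401"}), target)
```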
Data Privacy and Sensitivity
Not applicable
Extraction
Extract data from the source into Syniti. There are two possibilities:
- The data exists. Syniti connects to the source and loads the data into the repository. There are three methods:
  - Perform full data extraction from relevant tables in the source system(s).
  - Perform extraction through the application layer.
  - Only if Syniti cannot connect to the source, data is loaded to the repository from a provided source system extract/report.
- The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti. This is to be conducted using a DCT (Data Collection Template) in Syniti.
The agreed relevancy criteria are applied to the extracted records to identify the records that are applicable for the target loads.
Extraction Run Sheet
| Req # | Requirement Description | Team Responsible |
|---|---|---|
| 1 | Extraction Scope Definition - Identify the source systems and databases involved. - Define the data objects (tables, fields, records) to be extracted. - Establish business rules for data selection. | Syniti / LTC Data team |
| 2 | Extraction Methodology - Specify the extraction approach (full, incremental, or delta extraction). - Determine the tools and technologies used. - Define data filtering criteria to exclude irrelevant records. | Syniti |
| 3 | Extraction Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for extraction monitoring. - Document dependencies on other migration tasks. | Syniti |
| 4 | Data Quality and Validation - Define error handling mechanisms for extraction failures. | Syniti |
Selection Screen
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
Data Collection Template (DCT)
The Data Collection Template (DCT) will not be applicable in this case. If there is a need to create new Master Data (MD) for the Quality Info Record object, the business must perform this activity in the source system. The newly created object will then be captured and migrated as part of the standard migration process.
Extraction Dependencies
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Source System Availability | Syensqo IT |
| 2 | Data Structure | Syniti |
| 3 | Referential Integrity | Syniti |
| 4 | Extraction Methodology | Syniti |
| 5 | Performance and Scalability Considerations | Syniti |
| 6 | Security and Compliance | Syniti |
Transformation
The target fields are mapped to the applicable legacy fields that will be their source. This is a three-way activity involving the Business, Functional team, and Data team. It identifies the transformation activity required to make the data target-ready:
- Perform value mapping and data transformation rules.
- Legacy values are mapped to the to-be values (this could include a default value).
- Values are transformed according to the rules defined in the Transformation Rules section.
- Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform pre-load data validation.
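Value mapping with a default fallback (second bullet above) can be illustrated as follows. The QM-system codes in the mapping are hypothetical examples, not the project's actual mapping table; `__DEFAULT__` is a sketch convention for the default value mentioned above.

```python
# Illustrative value-mapping table (legacy code -> To-Be code). The special
# "__DEFAULT__" entry is the fallback applied when no explicit mapping exists.
QM_SYSTEM_MAP = {"0001": "Z001", "0002": "Z002", "__DEFAULT__": "Z999"}

def map_value(mapping: dict, legacy: str) -> str:
    """Return the To-Be value for a legacy value, falling back to the default."""
    return mapping.get(legacy, mapping["__DEFAULT__"])

mapped = map_value(QM_SYSTEM_MAP, "0001")    # explicit mapping hit
fallback = map_value(QM_SYSTEM_MAP, "0007")  # unmapped -> default value
```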
Transformation Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Transformation Scope Definition - Identify the source and target data structures. - Define business rules for data standardization. - Establish data cleansing requirements to remove inconsistencies. | Data Team |
| 2 | Data Mapping and Standardization - Align source fields with target fields. - Ensure unit consistency (e.g., currency, measurement units). | Data Team |
| 3 | Business Rule Application - Implement data enrichment/collection if applicable. - Apply conditional transformations based on predefined logic/business rules. | Data Team |
| 4 | Transformation Execution Plan - Define batch processing schedules. - Assign responsibilities for monitoring execution. - Establish error-handling mechanisms. | Syniti |
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
Transformation Mapping
| Mapping Table Name | Mapping Table Description |
|---|---|
Transformation Dependencies
The following steps need to occur before transformation can commence:
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Value Mappings are according to the latest design - <List of Value Mappings> | SyWay Data Team |
Pre-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Compare Data Counts | |
| Validate the mandatory fields | Validate there is a value for all the mandatory fields |
| Validate Primary Keys and Unique Constraints | |
| Test Referential Integrity | Confirm dependent records exist in related tables |
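The referential integrity task above can be sketched as a set-membership check against the dependent master data objects. Field names follow the QINF design table; the record layout and the use of plain Python sets are illustrative assumptions (the production check runs in Syniti).

```python
def referential_errors(qirs, materials, vendors, plants):
    """Pre-load check: every QIR must reference an existing material,
    vendor, and plant in the target master data."""
    errors = []
    for q in qirs:
        if q["MATNR"] not in materials:
            errors.append((q["MATNR"], "material missing"))
        if q["LIEFERANT"] not in vendors:
            errors.append((q["LIEFERANT"], "vendor missing"))
        if q["WERK"] not in plants:
            errors.append((q["WERK"], "plant missing"))
    return errors

qirs = [{"MATNR": "M1", "LIEFERANT": "V9", "WERK": "1000"}]
errs = referential_errors(qirs, materials={"M1"}, vendors={"V1"}, plants={"1000"})
# only the vendor reference fails for this sample record
```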
Accuracy
| Task | Action |
|---|---|
| Validate the transformation | Validate that fields requiring transformation contain the transformed value rather than the original field value |
| Check Data Consistency | |
Business
Completeness
| Task | Action |
|---|---|
| Compare Data Count | |
| Review populated templates for missing or incorrect values | Use checklists to verify completeness and correctness before submission |
Accuracy
| Task | Action |
|---|---|
| Conversion Accuracy | Business Data Owner/s to verify that all the data in the load table/file is accurate as per the endorsed transformation/mapping rules (and signed-off DCT data). |
Load
The load process includes:
- Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
- Once the data is loaded into the target system, it will be extracted and prepared for post-load data validation.
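Preparing one load row for the create call can be sketched as below. The payload keys are hypothetical placeholders derived from the QINF design table, not a verified BAPI signature; the actual interface must be confirmed in the target system before build. Zero-padding the vendor number mimics SAP's ALPHA input conversion for numeric account numbers, which is an assumption about the load format.

```python
def build_qir_payload(row: dict) -> dict:
    """Assemble an illustrative create-call payload from a transformed row.
    Keys are placeholders - confirm against the real interface in the target."""
    return {
        "MATERIAL": row["MATNR"],
        "PLANT": row["WERK"],
        "VENDOR": row["LIEFERANT"].zfill(10),  # assumed 10-char, zero-padded (ALPHA-style)
        "VALID_TO": row["FREI_DAT"],           # DATS format: YYYYMMDD
    }

payload = build_qir_payload(
    {"MATNR": "MAT1", "WERK": "1000", "LIEFERANT": "12345", "FREI_DAT": "20251231"}
)
```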
Load Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
| 1 | Load Scope Definition - Identify the target system and database structure. - Define data objects (tables, fields, records) to be loaded. - Establish business rules for data validation. | Data team |
| 2 | Load Methodology - Specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti |
| 3 | Data Quality and Validation - Ensure data integrity checks (null values, duplicates, format validation). - Perform pre-load validations to verify completeness. - Define error handling mechanisms for load failures. | Syniti |
| 4 | Load Execution Plan - Establish execution timelines and batch processing schedules. - Assign responsibilities for monitoring execution. - Document dependencies on other migration tasks. | Syniti |
| 5 | Logging and Reporting - Maintain detailed logs of loading activities. - Generate summary reports on loaded data volume and quality. - Define escalation procedures for errors. | Syniti |
Load Phase and Dependencies
The Quality Info records will be loaded in the pre-cutover (PreCutover 4 phase) period.
Before loading, the following configuration and data objects must already be in place in S/4HANA.
Configuration
| Item # | Configuration Item |
|---|---|
Conversion Objects
| Object # | Preceding Object Conversion Approach |
|---|---|
| list the exact title of the conversion object of only the immediate predecessor – this will then confirm the DDD (Data Dependency Diagram) | |
Error Handling
| Error Type | Error Description | Action Taken |
|---|---|---|
Post-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
| Validate Record count in the backend | Validate that table QINF contains the same number of records as the loading file |
| Display Records | Pick a few random Quality Info Records and run t-code QI03 to validate that they can be displayed without any error. |
| Perform Source-to-Target Comparisons | |
Accuracy
| Task | Action |
|---|---|
| Execute Sample Queries and Reports | |
Conduct Post-Migration Reconciliation | Generate reports comparing pre- and post-migration data. |
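The post-migration reconciliation above can be expressed as a key-set comparison between the data staged for load and a re-extract from the target. The tuple key construction (Material, Vendor, Plant) is illustrative.

```python
def reconcile(source_keys: set, target_keys: set) -> dict:
    """Compare QIR keys staged for load with keys re-extracted from S/4HANA."""
    return {
        "missing_in_target": sorted(source_keys - target_keys),
        "unexpected_in_target": sorted(target_keys - source_keys),
        "matched": len(source_keys & target_keys),
    }

src = {("M1", "V1", "1000"), ("M2", "V1", "1000")}  # staged load keys
tgt = {("M1", "V1", "1000")}                        # keys found in target after load
result = reconcile(src, tgt)  # one key failed to load in this sample
```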
Business
Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.
1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections.
Completeness
| Task | Action |
|---|---|
| Perform Source-to-Target Comparisons | |
| Conduct Post-Migration Reconciliation | Review reports comparing pre- and post-migration data. |
Accuracy
| Task | Action |
|---|---|
Perform Manual Testing | Conduct manual spot-checks for additional assurance. |
Key Assumptions
- The Master Data Standard is up to date as of the date of documenting this conversion approach and data load.
- Quality Info Record is in scope based on the data design and any exceptions requested by the business.
- Data cleansing has met the required percentage threshold for the specified mock cycle and all preparation activities have been completed.
- Data entries in DCT are target-ready data unless a specific transformation rule is stated for that field in the transformation rules.
