
Status: Revision in Progress

Owner: REDDY-ext, Naren
Stakeholders:

Purpose

The purpose of this document is to define the conversion approach to create 1151 Certificate Profiles in S/4HANA.

Certificate Profiles are used in SAP Quality Management (QM) to control and define the certification requirements for materials and suppliers. A Certificate Profile specifies which quality certificates are required for a material, at what stage of the process they must be provided, and the specific inspection characteristics or parameters to be validated. This ensures that incoming materials or finished products meet the defined quality and regulatory standards before acceptance or shipment.

In SAP S/4HANA, the structure and usage of Certificate Profiles remain consistent with SAP ECC, generally defined by key attributes such as Material / Plant / Certificate Type. Certificate Profiles may include additional parameters such as certificate category, inspection lot linkage, and certificate relevance for procurement or production.

In SAP ECC, aside from the standard structure, Certificate Profiles may also exist with extended combinations or enhancements, for example:
    •    Profiles assigned to multiple materials or plants
    •    Custom fields defining regulatory or customer-specific certification needs
    •    Obsolete or inactive profiles that are no longer in use

Such cases must be reviewed carefully against the Master Data Standard (MDS) to determine whether they remain relevant for migration.

This conversion aims to migrate active and valid Certificate Profiles from existing ECC systems into SAP S/4HANA by applying the required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target SAP S/4HANA system using standard SAP mechanisms such as BAPIs (e.g., BAPI_QM_CERTIFICATE_PROFILE_CREATE), IDOCs, or direct table loads, ensuring compliance with quality and regulatory requirements.


Conversion Scope

The scope of this document covers the approach for converting active Certificate Profiles from Legacy Source Systems into S/4HANA following the Certificate Profiles Master Data Design Standard.


The data from the legacy systems includes:

  1. All active Certificate Profiles (header table QCVK) created or active during the last four years:
    QCVK-ERSTELDAT >= current date - 4 years
  2. Certificate profile characteristics with MICs active in the last four years (refer to the Master Inspection Characteristics object):
    QCVM-MKMNR = QPMT-MKMNR
  3. Plant-specific Certificate Profiles that will be migrated per the To-Be plant mapping in the Enterprise Structure Catalog worksheet "30. Plants".
  4. Certificate Profiles referenced in active material-vendor (procurement) relationships. Refer to the Material and BP-Vendor objects.

The data from the legacy systems excludes:

  1. Certificate Profiles not used in the last four (4) years.

  2. Certificate Profiles belonging to deleted plants (per the To-Be plant mapping).

  3. Certificate Profiles referenced only in inactive material-vendor (procurement) relationships.
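As an illustration, the inclusion/exclusion criteria above can be expressed as a filter over the extracted header records. This is a minimal sketch, not the actual Syniti implementation: the record layout (one dict per extracted row, with the plant carried alongside the QCVK fields) and the `plant_mapping` structure are simplifying assumptions.

```python
from datetime import date

def is_relevant(record, plant_mapping, as_of):
    """Apply the relevancy criteria to one extracted profile header record.

    record        -- dict with 'ERSTELDAT' (creation date) and optional 'WERKS' (plant)
    plant_mapping -- To-Be plant mapping; plants absent from it count as deleted
    as_of         -- reference date for the four-year window
    """
    # Inclusion rule 1 / exclusion rule 1: four-year activity window
    cutoff = as_of.replace(year=as_of.year - 4)
    if record["ERSTELDAT"] < cutoff:
        return False
    # Exclusion rule 2: drop profiles belonging to deleted plants
    plant = record.get("WERKS")
    if plant is not None and plant not in plant_mapping:
        return False
    return True

profiles = [
    {"VORLNR": "CP0001", "ERSTELDAT": date(2024, 6, 1), "WERKS": "1000"},
    {"VORLNR": "CP0002", "ERSTELDAT": date(2019, 2, 1), "WERKS": "1000"},   # too old
    {"VORLNR": "CP0003", "ERSTELDAT": date(2025, 1, 15), "WERKS": "9999"},  # deleted plant
]
mapping = {"1000": "S401"}  # hypothetical legacy-to-target plant mapping
kept = [p["VORLNR"] for p in profiles if is_relevant(p, mapping, date(2026, 3, 1))]
print(kept)  # ['CP0001']
```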


List of source systems and approximate number of records
Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records
PF2 & WP2 | Certificate profiles data will be extracted from clients PF2 and WP2 | PF2 = 662 records; WP2 = 1,709 records | S/4HANA | 2,371

Additional Information

Multi-language Requirement

Not applicable

Document Management

Not applicable

Legal Requirement

Not applicable

Special Requirements

Not applicable




Target Design

The table below defines the technical design of the target structures for this conversion approach.

Table | Field | Data Element | Field Description | Data Type | Length | Requirement
QCVK | CTYP | QZGTYP | Certificate Type | CHAR | 4 | R
QCVK | VORLNR | QVORLNR | Certificate Profile Number | CHAR | 12 | R
QCVK | VERSION | QCVERSION | Version Number of Certificate Profile | CHAR | 6 | S
QCVK | SORTFELD | QSORTFELD | Search Field | CHAR | 40 | C
QCVK | GUELTIGAB | Q_UNUSEDDT | Field Not Used (Reserved for SAP) | DATS | 8 | S
QCVK | OBJNR | J_OBJNR | Object Number | CHAR | 22 | S
QCVK | OBTYP | J_OBTYP | Object Category | CHAR | 3 | S
QCVK | STSMA | J_STSMA | Status Profile | CHAR | 8 | NU
QCVK | TDFORM | QCTDFORM | Form Name | CHAR | 16 | R
QCVK | KZGUTLOSE | QKZGUTLOSE | Indicator: Only Lots Whose Usage Decision is OK | CHAR | 1 | C
QCVK | MINQKZ | QKZMIN | Minimum Quality Score for Good Quality | DEC | 3 | C
QCVK | KZLASTLOT | QKZLASTLOT | Indicator: Last Lot for Batch | CHAR | 1 | C
QCVK | KZLIEFSCH | QKZLIEFSCH | Indicator: Delivery Note in Inspection Lot | CHAR | 1 | C
QCVK | HERKUNFT | QHERK | Inspection Lot Origin | CHAR | 2 | C
QCVK | ART | QPART | Inspection Type | CHAR | 8 | C
QCVK | KONSISTENT | QKZKONSIST | Indicator: Certificate Profile is Consistent | CHAR | 1 | NU
QCVK | KURZTEXT | QKURZTEXT | Short Text | CHAR | 40 | R
QCVK | LTEXTKZ | QKZLTX | Long Text Exists | CHAR | 1 | S
QCVK | WITHTRANSF | QCTRANSFER | Stock Transfers from Batch Where-Used List | CHAR | 1 | C
QCVK | QDI_NOPDF | QC_QDI_NOPDF | Indicator: Do Not Send PDF Document | CHAR | 1 | C
QCVK | ALSTF | QCALSTF | Expansion Level of Batch Where-Used List | NUMC | 2 | NU
QCVK | ADBFORM | QC_PDF_FORMNAME | PDF-Based Forms: Form Name | CHAR | 30 | NU
QCVK | QKZ_USEADB | QC_QKZ_USEADB | Use Adobe Form | CHAR | 1 | S
QCVK | CHAR_FIELD_CONF | QC_CHAR_FIELD_CONF | Configuration of Characteristics Field | CHAR | 3 | NU
QCVM | CTYP | QZGTYP | Certificate Type | CHAR | 4 | R
QCVM | VORLNR | QVORLNR | Certificate Profile Number | CHAR | 12 | R
QCVM | VERSION | QCVERSION | Version Number of Certificate Profile | CHAR | 6 | S
QCVM | BLOCKNR | QBLOCKNR | Number of Characteristic Block | NUMC | 4 | S
QCVM | MERKMALNR | QMKNR | Consecutive Characteristic Number | NUMC | 4 | S
QCVM | SORTNR | QSORTNR | Sort Number | NUMC | 4 | R
QCVM | TEXT_ELEM | QCTEXTELEM | Assignment of a Characteristic to a Form Text Element | CHAR | 4 | S
QCVM | KATEGORIE | QCCHARCAT | Characteristic Category for Certificates | CHAR | 1 | S
QCVM | WERKS | WERKS_D | Plant | CHAR | 4 | R
QCVM | MKMNR | QMERKNR | Master Inspection Characteristic | CHAR | 8 | R
QCVM | VERSIONMK | QVERSNRMK | Version Number of the Master Inspection Characteristic | CHAR | 6 | C
QCVM | CMERKMALID | ATINN | Internal Characteristic | NUMC | 10 | NU
QCVM | KZHERKWERT | QKZHERKWT | Origin of the Characteristic Result on the Certificate | CHAR | 2 | R
QCVM | KZHERKTEXT | QKZHERKTXT | Origin of Characteristic Short Text | CHAR | 2 | R
QCVM | KZAUSGFMT | QKZAUSGFMT | Characteristic Output Format | CHAR | 2 | C
QCVM | KZMETHODE | QKZMETHODE | Output Inspection Method | CHAR | 2 | C
QCVM | KZESEL | QKZESEL | Level for Result Selection | CHAR | 1 | R
QCVM | KZHERKVG | QKZHERKVG | Origin of Inspection Specifications | CHAR | 2 | R
QCVM | KZSTRSKIP | QKZSTRSKIP | Output Strategy for Skip Characteristics | CHAR | 2 | R
QCVM | MASSEINHSW | QMASSEH | Unit of Measurement in Which Quantitative Data Is Stored | UNIT | 3 | S
QCVM | AUSGLEN | QLAENGE | Output Length | INT1 | 3 | NU
QCVM | STELLEN | QSTELLEN | Number of Places to the Right of a Decimal Point (Accuracy) | INT1 | 3 | C
QCVM | KZEXPDS | QKZEXPDS | Indicator: Exponential Notation | CHAR | 1 | NU
QCVM | KZOFFSPTX | QKZOFFSPTX | Additional Text When Result Is Outside Specifications | CHAR | 1 | NU
QCVM | KZOBL | QKZMKOBL | Indicator: Characteristic Required | CHAR | 1 | C
QCVM | VMSORTNR | QCSORTNR | Sort Number in List of Materials for Certificate Profile | NUMC | 4 | NU
QCVM | COLUMNSORT | QCCOLSORT | Display by Columns | CHAR | 1 | NU
QCVM | COLUMNS | QCCOLUMNS | Number of Columns | NUMC | 2 | NU
QCVM | UNITNRSORT | QCUNITSORT | Sort by Term for Inspected Unit | CHAR | 1 | NU
QCVM | CHARACT_ID1 | QCHARACT_ID1 | Characteristic Description for Quality Data Exchange | CHAR | 40 | C
QCVMT | CTYP | QZGTYP | Certificate Type | CHAR | 4 | R
QCVMT | VORLNR | QVORLNR | Certificate Profile Number | CHAR | 12 | R
QCVMT | VERSION | QCVERSION | Version Number of Certificate Profile | CHAR | 6 | S
QCVMT | BLOCKNR | QBLOCKNR | Number of Characteristic Block | NUMC | 4 | S
QCVMT | MERKMALNR | QMKNR | Consecutive Characteristic Number | NUMC | 4 | S
QCVMT | SPRACHE | SPRAS | Language Key | LANG | 1 | R
QCVMT | KURZTEXT | QKURZTEXT | Short Text | CHAR | 40 | R
QCVV | CTYP | QZGTYP | Certificate Type | CHAR | 4 | R
QCVV | VORLNR | QVORLNR | Certificate Profile Number | CHAR | 12 | R
QCVV | VERSION | QCVERSION | Version Number of Certificate Profile | CHAR | 6 | S
QCVV | VMKATEG | QCVMATKAT | Category of Data Origin (Insp. Lot/Batch) for Certificate | CHAR | 1 | R
QCVV | MATNR | MATNR | Material Number | CHAR | 40 | R
QCVV | COUNTER | QCVVCOUNT | Sequential Number for Material in a Certificate Profile | NUMC | 3 | S
QCVV | BLOCKNR | QBLOCKNR | Number of Characteristic Block | NUMC | 4 | S
QCVV | VMSORTNR | QCSORTNR | Sort Number in List of Materials for Certificate Profile | NUMC | 4 | S
QCVV | HERKUNFT | QHERK | Inspection Lot Origin | CHAR | 2 | C
QCVV | ART | QPART | Inspection Type | CHAR | 8 | C
QCVV | KZGUTLOSE | QKZGUTLOSE | Indicator: Only Lots Whose Usage Decision is OK | CHAR | 1 | C
QCVV | MINQKZ | QKZMIN | Minimum Quality Score for Good Quality | DEC | 3 | NU
KONDI | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KONDI | CTYP | QZGTYP | Certificate Type | CHAR | 4 | C
KONDI | VORLNR | QVORLNR | Cert. Profile | CHAR | 12 | C
KONDI | VERSION | QCVERSION | Version No. | CHAR | 6 | C
KONDI | ERSTELLER | QERSTELLER | Created By | CHAR | 12 | C
KONDI | ERSTELDAT | QDATUMERST | Created On | DATS | 8 | C
KOTI001 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI001 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI001 | MATNR | MATNR | Material | CHAR | 40 | C
KOTI001 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI001 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI001 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI002 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI002 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI002 | MATNR | MATNR | Material | CHAR | 40 | C
KOTI002 | KNDNR | KUNNR_V | Customer | CHAR | 10 | C
KOTI002 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI002 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI002 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI501 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI501 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI501 | WERKS | WERKS_D | Plant | CHAR | 4 | C
KOTI501 | MATNR | MATNR | Material | CHAR | 18 | C
KOTI501 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI501 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI501 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI502 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI502 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI502 | WERKS | WERKS_D | Plant | CHAR | 4 | C
KOTI502 | MATNR | MATNR | Material | CHAR | 18 | C
KOTI502 | KNDNR | KUNNR_V | Customer | CHAR | 10 | C
KOTI502 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI502 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI502 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI901 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI901 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI901 | KUNWE | KUNWE | Ship-To Party | CHAR | 10 | C
KOTI901 | MATNR | MATNR | Material | CHAR | 18 | C
KOTI901 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI901 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI901 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI902 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI902 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI902 | KUNWE | KUNWE | Ship-To Party | CHAR | 10 | C
KOTI902 | MATKL | MATKL | Material Group | CHAR | 9 | C
KOTI902 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI902 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI902 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI903 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI903 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI903 | WERKS | WERKS_D | Plant | CHAR | 4 | C
KOTI903 | KUNWE | KUNWE | Ship-To Party | CHAR | 10 | C
KOTI903 | MATKL | MATKL | Material Group | CHAR | 9 | C
KOTI903 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI903 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI903 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI904 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI904 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI904 | WERKS | WERKS_D | Plant | CHAR | 4 | C
KOTI904 | MATKL | MATKL | Material Group | CHAR | 9 | C
KOTI904 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI904 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI904 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI905 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI905 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI905 | VKORG | VKORG | Sales Org. | CHAR | 4 | C
KOTI905 | MATKL | MATKL | Material Group | CHAR | 9 | C
KOTI905 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI905 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI905 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI907 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI907 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI907 | WERKS | WERKS_D | Plant | CHAR | 4 | C
KOTI907 | KUNWE | KUNWE | Ship-To Party | CHAR | 10 | C
KOTI907 | MATKL | MATKL | Material Group | CHAR | 9 | C
KOTI907 | ZQ_OUTYPE | KSCHL | Condition Type | CHAR | 4 | C
KOTI907 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI907 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI907 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C
KOTI908 | KAPPL | KAPPL | Application | CHAR | 2 | C
KOTI908 | KSCHL | KSCHI | Assignment Type | CHAR | 4 | C
KOTI908 | WERKS | WERKS_D | Plant | CHAR | 4 | C
KOTI908 | MATKL | MATKL | Material Group | CHAR | 9 | C
KOTI908 | ZQ_OUTYPE | KSCHL | Condition Type | CHAR | 4 | C
KOTI908 | DATBI | KODATBI | Valid To | DATS | 8 | C
KOTI908 | DATAB | KODATAB | Valid From | DATS | 8 | C
KOTI908 | KNUMH | KNUMH | Cond. Record No. | CHAR | 10 | C


Data Cleansing


ID | Criticality | Error Message/Report Description | Rule | Output | Source System
Conversion Process

The high-level process is represented by the diagram below:


The ETL (Extract, Transform, Load) process is a structured approach to data migration and management, ensuring high-quality data is seamlessly transferred across systems. Here’s a breakdown of its key components:

1. Extraction
The process begins with periodically extracting metadata and raw data from the source systems, in this case the Syensqo ECC clients WP2 and PF2. The extracted data is then staged for transformation.


2. Transformation
Once extracted, the data undergoes cleansing, consolidation, and governance. This step ensures data integrity, consistency, and compliance with business rules. The transformation process includes:
- Data validation to remove inconsistencies.
- Standardization to align formats across datasets.
- Business rule application to refine data for operational use.


3. Loading
The transformed data is then loaded into the target S/4HANA system. 
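The three ETL steps above can be sketched as a small pipeline. This is purely illustrative: the real flow runs in Syniti, and the field names, value map, and mandatory-field rule used here are placeholder assumptions.

```python
def extract(source_rows):
    """Extraction: stage a copy of the raw source records."""
    return [dict(row) for row in source_rows]

def transform(rows, value_map, mandatory_fields):
    """Transformation: cleanse, validate, and apply business rules."""
    out = []
    for row in rows:
        # Standardization: trim stray whitespace from character fields
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        # Validation: drop records missing a mandatory field
        if any(not row.get(f) for f in mandatory_fields):
            continue
        # Business rule: map legacy values to to-be values (pass through if unmapped)
        row["CTYP"] = value_map.get(row.get("CTYP"), row.get("CTYP"))
        out.append(row)
    return out

def load(rows, target_table):
    """Loading: append target-ready records and report the count."""
    target_table.extend(rows)
    return len(rows)

staged = extract([
    {"VORLNR": " CP0001 ", "CTYP": "Z001"},
    {"VORLNR": "", "CTYP": "Z001"},  # fails the mandatory-field check
])
ready = transform(staged, {"Z001": "0001"}, mandatory_fields=["VORLNR"])
target = []
print(load(ready, target))  # 1
print(target[0]["CTYP"])    # 0001
```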


Data Privacy and Sensitivity

Not applicable


Extraction

Extract data from the source into the migration platform (Syniti). There are two possibilities:

  1. The data exists. Syniti connects to the source and loads the data into its repository. There are three methods:
    1. Perform full data extraction from the relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti cannot connect to the source, data is loaded into the repository from a provided source-system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in the platform, using a Data Collection Template (DCT).

The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.

Extraction Run Sheet

Req # | Requirement Description | Team Responsible
1 | Extraction Scope Definition: identify the source systems and databases involved; define the data objects (tables, fields, records) to be extracted; establish business rules for data selection. | Syniti / LTC Data team
2 | Extraction Methodology: specify the extraction approach (full, incremental, or delta extraction); determine the tools and technologies used; define data filtering criteria to exclude irrelevant records. | Syniti
3 | Extraction Execution Plan: establish execution timelines and batch processing schedules; assign responsibilities for extraction monitoring; document dependencies on other migration tasks. | Syniti
4 | Data Quality and Validation: define error-handling mechanisms for extraction failures. | Syniti


Selection Screen


Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set
Not applicable




Data Collection Template (DCT)

The Data Collection Template (DCT) will not be applicable in this case. If there is a need to create a new Master Data (MD) for Certificate Profiles object, the business must perform this activity in the source system. The newly created object will then be captured and migrated as part of the standard migration process.

Extraction Dependencies

Item # | Step Description | Team Responsible
1 | Source System Availability: ensure that the source database or application is accessible; confirm that the necessary credentials and permissions are granted. | Syensqo IT
2 | Data Structure: identify relationships between tables, views, and stored procedures. | Syniti
3 | Referential Integrity: ensure dependent records are extracted together. | Syniti
4 | Extraction Methodology: define whether extraction is full, incremental, or delta-based; establish batch processing schedules for large datasets. | Syniti
5 | Performance and Scalability: optimize extraction queries to prevent system overload; ensure network bandwidth supports the data transfer volumes. | Syniti
6 | Security and Compliance: adhere to regulatory standards for sensitive information, if applicable. | Syniti


Transformation

Each target field is mapped to the legacy field that will be its source. This is a three-way activity involving the business, the functional team, and the data team, and it identifies the transformation work required to make the data target-ready:

  1. Perform value mapping and apply data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value).
    2. Values are transformed according to the defined transformation rules.
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data for the business to perform pre-load data validation.
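Step 1 (value mapping with an optional default) can be sketched as follows. The rule structure is a simplifying assumption for illustration, not the Syniti rule syntax, and the `CTYP` mapping values are hypothetical.

```python
def apply_value_mapping(record, rules):
    """Map legacy field values to to-be values.

    rules -- {field: (mapping_dict, default)}; unmapped values fall back
             to the default, and fields without a rule pass through unchanged.
    """
    out = dict(record)
    for field, (mapping, default) in rules.items():
        if field in out:
            out[field] = mapping.get(out[field], default)
    return out

# Hypothetical rule: map legacy certificate types to to-be values, defaulting to '0001'
rules = {
    "CTYP": ({"Z001": "0001", "Z002": "0002"}, "0001"),
}
print(apply_value_mapping({"CTYP": "Z002", "VORLNR": "CP0001"}, rules))
# {'CTYP': '0002', 'VORLNR': 'CP0001'}
print(apply_value_mapping({"CTYP": "ZZZZ", "VORLNR": "CP0002"}, rules)["CTYP"])
# 0001  (fell back to the default)
```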

Transformation Run Sheet

Item # | Step Description | Team Responsible
1 | Transformation Scope Definition: identify the source and target data structures; define business rules for data standardization; establish data cleansing requirements to remove inconsistencies. | Data Team
2 | Data Mapping and Standardization: align source fields with target fields; ensure unit consistency (e.g., currency, measurement units). | Data Team
3 | Business Rule Application: implement data enrichment/collection if applicable; apply conditional transformations based on predefined logic/business rules. | Data Team
4 | Transformation Execution Plan: define batch processing schedules; assign responsibilities for monitoring execution; establish error-handling mechanisms. | Syniti


Transformation Rules

Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic

Transformation Mapping

Mapping Table Name | Mapping Table Description

Transformation Dependencies

List the steps that need to occur before transformation can commence.

Item # | Step Description | Team Responsible

Pre-Load Validation

Project Team

Completeness

Task | Action
Compare data counts | 1. Verify row counts between source and target databases. 2. Identify missing or duplicated records.
Validate the mandatory fields | Validate that every mandatory field has a value.
Validate primary keys and unique constraints | 1. Check for duplicate or missing primary key values. 2. Ensure unique constraints are maintained.
Test referential integrity | Confirm dependent records exist in the related tables.
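The completeness checks above (count comparison, mandatory fields, key uniqueness) can be sketched generically. The record layout, key field, and mandatory field chosen here are illustrative assumptions.

```python
from collections import Counter

def completeness_report(source_rows, target_rows, key_fields, mandatory_fields):
    """Summarize pre-load completeness checks for a set of records."""
    target_keys = [tuple(r.get(k) for k in key_fields) for r in target_rows]
    # Duplicate-key check against the intended primary key
    duplicate_keys = [k for k, n in Counter(target_keys).items() if n > 1]
    # Mandatory-field check: empty string or None both count as missing
    rows_missing_mandatory = [
        i for i, r in enumerate(target_rows)
        if any(not r.get(f) for f in mandatory_fields)
    ]
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "count_match": len(source_rows) == len(target_rows),
        "duplicate_keys": duplicate_keys,
        "rows_missing_mandatory": rows_missing_mandatory,
    }

source = [{"VORLNR": "CP0001"}, {"VORLNR": "CP0002"}]
target = [
    {"VORLNR": "CP0001", "KURZTEXT": "Profile 1"},
    {"VORLNR": "CP0001", "KURZTEXT": ""},  # duplicate key, empty mandatory text
]
report = completeness_report(source, target, ["VORLNR"], ["KURZTEXT"])
print(report["count_match"], report["duplicate_keys"], report["rows_missing_mandatory"])
# True [('CP0001',)] [1]
```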


Accuracy

Task | Action
Validate the transformation | Validate that fields requiring transformation contain the transformed value rather than the original field value.
Check data consistency | 1. Compare field values across systems. 2. Validate data formats and structures.

Business

Completeness

Task | Action
Compare data counts | 1. Verify row counts between source and target databases. 2. Identify missing or duplicated records.
Review populated templates for missing or incorrect values | Use checklists to verify completeness and correctness before submission.

Accuracy

Task | Action
Conversion accuracy | Business Data Owner(s) verify that all data in the load table/file is accurate per the endorsed transformation/mapping rules (and the signed-off DCT data).


Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
  2. Once the data is loaded into the target system, it is extracted and prepared for post-load data validation.
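The automated load with basic error capture could look like the sketch below. `load_fn` is a hypothetical stand-in for whatever mechanism actually posts each record (BAPI call, Migration Cockpit, or load file); the validation inside `fake_bapi` is an invented example.

```python
def run_load(rows, load_fn, key_field):
    """Load records one by one, collecting failures for the error log."""
    loaded, errors = 0, []
    for row in rows:
        try:
            load_fn(row)
            loaded += 1
        except Exception as exc:  # real code would catch specific error types
            errors.append((row.get(key_field), str(exc)))
    return loaded, errors

def fake_bapi(row):
    # Hypothetical stand-in for the target-system load call
    if not row.get("CTYP"):
        raise ValueError("Certificate Type is mandatory")

ok, errs = run_load(
    [{"VORLNR": "CP0001", "CTYP": "0001"}, {"VORLNR": "CP0002", "CTYP": ""}],
    fake_bapi, key_field="VORLNR",
)
print(ok, errs)  # 1 [('CP0002', 'Certificate Type is mandatory')]
```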

Load Run Sheet

Item # | Step Description | Team Responsible
1 | Load Scope Definition: identify the target system and database structure; define the data objects (tables, fields, records) to be loaded; establish business rules for data validation. | Data team
2 | Load Methodology: specify the loading tools and technologies (Migration Cockpit, LSMW, custom loading program). | Syniti
3 | Data Quality and Validation: ensure data integrity checks (null values, duplicates, format validation); perform pre-load validations to verify completeness; define error-handling mechanisms for load failures. | Syniti
4 | Load Execution Plan: establish execution timelines and batch processing schedules; assign responsibilities for monitoring execution; document dependencies on other migration tasks. | Syniti
5 | Logging and Reporting: maintain detailed logs of loading activities; generate summary reports on loaded data volume and quality; define escalation procedures for errors. | Syniti


Load Phase and Dependencies

The Certificate Profiles will be loaded in the pre-cutover period (PreCutover 4 phase).

Before loading, the object depends on the following configuration and data objects being in place in S/4HANA.

Configuration

Item # | Configuration Item

Conversion Objects

Object # | Preceding Object Conversion Approach

(List the exact title of the conversion object of only the immediate predecessor; this confirms the DDD (Data Dependency Diagram).)




Error Handling

Error Type | Error Description | Action Taken


Post-Load Validation

Project Team

Completeness

Task | Action
Validate record count in the backend | Validate that the certificate profile tables (prefix "QCV") have the same record counts as the load file.
Display records | Pick a few random Certificate Profiles and run transaction QC03 to validate that each profile can be displayed without error.
Perform source-to-target comparisons | 1. Validate that migrated data matches source records. 2. Check for discrepancies in numerical values, text fields, and timestamps.
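A generic source-to-target comparison for the checks above might be sketched as follows; the key and compare fields are illustrative choices, not the actual reconciliation report.

```python
def compare_source_target(source_rows, target_rows, key_fields, compare_fields):
    """Return keys missing in the target and field-level mismatches."""
    def index(rows):
        return {tuple(r.get(k) for k in key_fields): r for r in rows}

    src, tgt = index(source_rows), index(target_rows)
    # Completeness: keys present in the source but absent from the target
    missing = sorted(set(src) - set(tgt))
    # Accuracy: field-by-field differences for keys present in both
    mismatches = [
        (key, field, src[key].get(field), tgt[key].get(field))
        for key in sorted(set(src) & set(tgt))
        for field in compare_fields
        if src[key].get(field) != tgt[key].get(field)
    ]
    return missing, mismatches

source = [
    {"VORLNR": "CP0001", "KURZTEXT": "Profile 1"},
    {"VORLNR": "CP0002", "KURZTEXT": "Profile 2"},
]
target = [{"VORLNR": "CP0001", "KURZTEXT": "Profile One"}]
missing, mismatches = compare_source_target(source, target, ["VORLNR"], ["KURZTEXT"])
print(missing)     # [('CP0002',)]
print(mismatches)  # [(('CP0001',), 'KURZTEXT', 'Profile 1', 'Profile One')]
```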

Accuracy

Task | Action
Execute sample queries and reports | 1. Run queries to validate business logic. 2. Generate reports to compare expected vs. actual results.
Conduct post-migration reconciliation | Generate reports comparing pre- and post-migration data.


Business

Post-load validation is a critical step in data migration, ensuring that transferred data is accurate, complete, and functional within the target system.

1. Ensuring Data Integrity
After migration, data must be consistent with its original structure. Post-load validation checks for missing records, incorrect mappings, and formatting errors to prevent discrepancies.
2. Business Continuity
Faulty data can disrupt operations, leading to financial losses and inefficiencies. Validating post-load data ensures that applications function as expected, preventing downtime.
3. Error Detection and Resolution
By validating data post-migration, businesses can detect anomalies early, reducing the cost and effort required for corrections

Completeness

Task | Action
Perform source-to-target comparisons | 1. Validate that migrated data matches source records. 2. Check for discrepancies in numerical values, text fields, and timestamps.
Conduct post-migration reconciliation | Go through reports comparing pre- and post-migration data.

Accuracy

Task | Action
Perform manual testing | Conduct manual spot-checks for additional assurance.


Key Assumptions

  • The Master Data Standard is up to date as of the date this conversion approach is documented and the data is loaded.
  • Certificate Profiles are in scope based on the data design and any exceptions requested by the business.
  • Data cleansing has met the required percentage threshold for the specified mock cycle, and all preparation activities have been completed.
  • Data entries in the DCT are target-ready unless a specific transformation rule is stated for that field in the transformation rules.


See also


Change log

Version Published Changed By Comment
CURRENT (v. 4) Mar 25, 2026 14:04 REDDY-ext, Naren
v. 21 Feb 24, 2026 11:43 REDDY-ext, Naren Removed the CUI object statement from Purpose
v. 20 Feb 20, 2026 13:51 REDDY-ext, Naren Added VERSION field across DCT tables
v. 19 Feb 17, 2026 16:23 REDDY-ext, Naren Added QCVV-VMSORTNR, QCVM-KATEGORIE fields to DCT
v. 18 Feb 17, 2026 15:21 REDDY-ext, Naren Updated the DCT for Long text
v. 17 Feb 16, 2026 14:00 REDDY-ext, Naren Updated DCT template
v. 16 Nov 28, 2025 14:57 REDDY-ext, Naren Updated the Validation reports link(Post load validation)
v. 15 Nov 27, 2025 14:05 REDDY-ext, Naren Removed STXH-TDLOCK, STXL-TDSPO from transformation rules
v. 14 Nov 21, 2025 14:44 REDDY-ext, Naren Updated Transformation mappings for Material group as per Inline comments
v. 13 Nov 21, 2025 13:27 REDDY-ext, Naren
