Status

Owner
Stakeholders

Purpose

The purpose of this document is to define the conversion approach for creating Batches with Classification Data in S/4HANA.

Batches are used extensively in the Syensqo legacy systems for the following purposes:

Quality Control

  • Each batch can be tested and its quality attributes recorded. SAP enables linking quality results to specific batches, ensuring only compliant products are shipped.

Inventory Management

  • Batches help manage inventory by expiration date, production date, or quality status.
  • Enables First-Expired-First-Out (FEFO) or First-In-First-Out (FIFO) inventory strategies.
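For illustration, a FEFO picking order can be derived by sorting batches on their shelf-life expiration date (SLED). The batch numbers, dates, and field names below are invented sample data, not project data:

```python
from datetime import date

# Sample batch stock with shelf-life expiration dates (invented values).
batches = [
    {"batch": "B001", "sled": date(2026, 3, 1), "qty": 100},
    {"batch": "B002", "sled": date(2025, 12, 15), "qty": 50},
    {"batch": "B003", "sled": date(2026, 1, 10), "qty": 75},
]

def fefo_order(batches):
    """Return batches sorted for First-Expired-First-Out consumption."""
    return sorted(batches, key=lambda b: b["sled"])

for b in fefo_order(batches):
    print(b["batch"], b["sled"].isoformat())
```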

Production Efficiency

  • Batches allow for detailed production planning and tracking.


Conversion Scope

The scope of this document covers the approach for converting active Batches & Classification Data from the legacy source systems into S/4HANA, following the Master Data Design Standard.


The data from legacy system includes:

  1. Batches in Stock
  2. Relevancy Criteria 2
  3. Relevancy Criteria n

The data from legacy system excludes:

  1. Exclusion Criteria 1
  2. Exclusion Criteria 2
  3. Exclusion Criteria n


List of source systems and approximate number of records

| Source | Scope | Source Approx. No. of Records | Target System | Target Approx. No. of Records |
| PF2 | Batches in Stock | | | |
| WP2 | Batches in Stock | | | |












Additional Information

Multi-language Requirement

NA

Document Management

NA

Legal Requirement

NA

Special Requirements

NA





Target Design

With Functional input, the technical design of the target fields in the scope of this conversion approach is documented below.

| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
| MCHA | HSDAT | HSDAT | Date of manufacture of the batch | DATS | 8 | Mandatory |
| MCHA | VFDAT | VFDAT | Shelf life expiration date of the batch | | | |
| MCHA | VERAB | VERAB | Batch available-from date | | | |
| MARA | IPRKZ | IPRKZ | Period indicator | | | |
| MCHA | ZUSCH | ZUSCH | Unrestricted or restricted | | | |
| MCHA | ZAEDT | ZAEDT | Date of last status change | | | |
| MCHA | QNDAT | QNDAT | Date of next inspection | | | |
| MCH1 | LVORM | LVORM | Deletion flag at client level | | | |
| MCHA | LVORM | LVORM | Deletion flag at plant level | | | |
| MCHA | ZFDAT | ZFDAT | Date of certification | | | |
| MCHA | LIFNR | LIFNR | Supplier number | | | |
| MCHA | LICHA | LICHA | Supplier batch | | | |
| MCHA | LWEDT | LWEDT | Date of last goods receipt | | | |
| MCHA | HERKL | HERKL | Batch country of origin | | | |
| MCHA | HERKR | HERKR | Region of origin | | | |
| MCHA | MTVER | MTVER | Grouping of materials with similar Intrastat requirements | | | |
| MCHA | FVDT1 | FVDT1 | Date for free use | | | |
| MCHA | FVDT2 | FVDT2 | Date for free use | | | |
| MCHA | FVDT3 | FVDT3 | Date for free use | | | |
| MCHA | FVDT4 | FVDT4 | Date for free use | | | |
| MCHA | FVDT5 | FVDT5 | Date for free use | | | |
| MCHA | FVDT6 | FVDT6 | Date for free use | | | |
| AUSP | ATWRT | ATWRT | Value for each characteristic of the class | | | |
| MARA | MHDRZ | MHDRZ | Calculated remaining shelf life | | | |
| MARA | MHDHB | MHDHB | Total shelf life | | | |
| MARA | IPRKZ | IPRKZ | Period indicator from material | | | |
| MARA | RDMHD | RDMHD | Rounding rule | | | |

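The shelf-life fields above are related: in SAP, the shelf-life expiration date (VFDAT) is typically derived from the date of manufacture (HSDAT) plus the total shelf life (MHDHB), in the unit given by the period indicator (IPRKZ). A minimal sketch of that derivation, handling only day- and week-based indicators (a real implementation follows the IPRKZ configuration):

```python
from datetime import date, timedelta

# Sketch: derive the shelf-life expiration date (VFDAT) from the date of
# manufacture (HSDAT) and total shelf life (MHDHB). The period indicator
# (IPRKZ) gives the unit; only 'D' (days) and 'W' (weeks) are handled here.
def derive_vfdat(hsdat, mhdhb, iprkz="D"):
    days_per_unit = {"D": 1, "W": 7}
    return hsdat + timedelta(days=days_per_unit[iprkz] * mhdhb)

print(derive_vfdat(date(2024, 1, 1), 30, "D"))  # 30-day shelf life
```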


Data Cleansing

All data cleansing should take place in the data source system as defined in this document, unless system limitations prevent it.

If data cleansing is managed outside of the source system (e.g. Syniti Migrate, 3rd Party Vendor, DCT), the necessary documentation must be produced and appended to this deliverable for sign-off.

| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |


























Conversion Process

The high-level process is represented by the diagram below:

Summarize the high-level process, including diagrams where applicable, and supporting details of the Extract, Transform, and Load steps specific to this data object.


Data Privacy and Sensitivity

Summarize Data Privacy and Sensitivity Requirements, if any


Extraction

Extract data from a source into Syniti Migrate. There are 2 possibilities:

  1. The data exists. Syniti Migrate connects to the source and loads the data into Syniti Migrate. There are 3 methods:
    1. Perform full data extraction from relevant tables in the source system(s).
    2. Perform extraction through the application layer.
    3. Only if Syniti Migrate cannot connect to the source is the data loaded into the repository from a provided source system extract/report.
  2. The data does not exist (or cannot be converted from its current state). The data is manually collected by the business directly in Syniti Migrate, using a DCT (Data Collection Template).

The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.
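As an illustrative sketch only, a "Batches in Stock" relevancy filter might be expressed as below; the field names and the stock criterion are assumptions, not the project's agreed relevancy rules:

```python
# Illustrative relevancy filter: keep only batches with stock on hand.
# Field names (CHARG = batch number, CLABS = unrestricted stock) and the
# criterion itself are assumptions for this sketch.
extracted = [
    {"CHARG": "0001", "CLABS": 120.0},  # has stock -> relevant
    {"CHARG": "0002", "CLABS": 0.0},    # no stock -> excluded
    {"CHARG": "0003", "CLABS": 35.5},
]

def apply_relevancy(records):
    """Select records meeting the 'Batches in Stock' criterion."""
    return [r for r in records if r["CLABS"] > 0]

print([r["CHARG"] for r in apply_relevancy(extracted)])
```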

Extraction Run Sheet

| Req # | Requirement Description | Team Responsible |













Selection Screen

If applicable, this section will give the details on any selection screen parameters, including the parameter type, that are required to be entered to ensure consistent data extracts.
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |





















Data Collection Template (DCT)

A target-ready Data Collection Template will be created for the data object, with the exception of some fields that require transformation as specified in the transformation rules.

<Object> DCT Rules

| Field Name | Field Description | Rule |












Extraction Dependencies

List the steps that need to occur before extraction can commence

| Item # | Step Description | Team Responsible |













Transformation

Each target field is mapped to the applicable legacy field that will be its source; this is a three-way activity involving the Business, Functional, and Data teams. The mapping identifies the transformation activity required for Syniti Migrate to make the data target ready:

  1. Perform value mapping and data transformation rules.
    1. Legacy values are mapped to the to-be values (this could include a default value)
    2. Values are transformed according to the rules defined in Syniti Migrate
  2. Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data ready for the business to perform pre-load data validation.
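Step 1 above can be sketched as a simple lookup with a default; the status codes and the mapping table here are invented placeholders, not the rules defined in Syniti Migrate:

```python
# Illustrative value mapping: legacy batch status codes are mapped to
# to-be values, with a default applied when no mapping exists. The codes
# and mapping table are placeholders for this sketch.
STATUS_MAP = {"U": "", "R": "S"}  # e.g. legacy Restricted -> target 'S'
DEFAULT_STATUS = ""

def map_status(legacy_value):
    """Map a legacy status to its to-be value, falling back to a default."""
    return STATUS_MAP.get(legacy_value, DEFAULT_STATUS)

print(map_status("R"))  # mapped value
print(map_status("X"))  # unmapped -> default
```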

Transformation Run Sheet

| Item # | Step Description | Team Responsible |













Transformation Rules

| Rule # | Source System | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |









































Transformation Mapping

Use the exact name and reference this section in the “Transformation rules” above
| Mapping Table Name | Mapping Table Description |








Transformation Dependencies

List the steps that need to occur before transformation can commence
| Item # | Step Description | Team Responsible |
| 1 | Extract PF2 & WP2 source tables – MCHA & AUSP | |
| 2 | Ensure all vendor master data collection templates, deduplication, and cleansing are complete and error-free | |
| 3 | Ensure that Business Partner – General is loaded | |
| 4 | Ensure that all required configurations are in S/4HANA | |



Pre-Load Validation

Project Team

The following pre-load validations will be performed by the Project Team.

Completeness

| Task | Action |
| Verify DCT & Load File Count | SCM Data Team to verify that the total number of relevant records from the DCT is equal to the total number of records in the Preload and Load Sheets. |
| Verify Consent | Verify that the appropriate consents for the records have been obtained by the business/Data Owners and properly recorded. |


Accuracy

| Task | Action |
| Conversion Accuracy | SCM Data team to verify that all the data in the load table/file is accurate as per signed-off DCT contents and transformation rules, with the following checks: 1) mandatory fields, 2) field and value mapping correctness, 3) null checks. |
| Review Error Reports | Review and correct the errors. Achieve a zero-error record count as far as possible. Raise defects for data remediated and requiring a correction in the source data. |
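The three checks above can be sketched as follows; the field names, the mandatory-field list, and the allowed-value set are assumptions for illustration, not the signed-off rules:

```python
# Illustrative pre-load checks on a target-ready record: mandatory
# fields, value-mapping correctness, and null checks. Field names,
# the mandatory list, and the allowed-value set are assumptions.
MANDATORY = ["CHARG", "MATNR", "HSDAT"]
VALID_STATUS = {"", "S"}

def check_record(rec):
    """Return a list of error messages for one record."""
    errors = []
    for f in MANDATORY:
        if not rec.get(f):                    # mandatory / null check
            errors.append(f"{f} is missing or empty")
    if rec.get("ZUSCH") not in VALID_STATUS:  # value-mapping check
        errors.append("ZUSCH not in allowed value set")
    return errors

rec = {"CHARG": "0001", "MATNR": "", "HSDAT": "20240101", "ZUSCH": "X"}
print(check_record(rec))
```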

Business

The following pre-load validations will be performed by the business.

Completeness

| Task | Action |
| Verify Record Count | Business Data Owner/s to verify that the total number of relevant records from the DCT is equal to the total number of records in the Preload and Load Sheets. |
| Verify Consent | Verify that the appropriate consents for the records have been obtained by the business and properly recorded. |


Accuracy

| Task | Action |
| Conversion Accuracy | Business Data Owner/s to verify that all the data in the load table/file is accurate as per endorsed transformation/mapping rules (and signed-off DCT data). |


Load

The load process includes:

  1. Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually
  2. Once the data is loaded to the target system, it will be extracted and prepared for Post Load Data Validation
  3. Load approach: Migration Cockpit using Staging Tables
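As a sketch of producing a load file manually (step 1), target-ready records could be written to a delimited file; the column layout below is an assumption, since the actual structure is defined by the Migration Cockpit staging tables:

```python
import csv
import io

# Illustrative: write target-ready batch records into a delimited load
# file. The column layout is an assumption for this sketch; the real
# structure comes from the Migration Cockpit staging tables.
records = [
    {"MATNR": "MAT-001", "CHARG": "0001", "WERKS": "PL01", "HSDAT": "20240101"},
    {"MATNR": "MAT-002", "CHARG": "0002", "WERKS": "PL01", "HSDAT": "20240215"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["MATNR", "CHARG", "WERKS", "HSDAT"])
writer.writeheader()
writer.writerows(records)
print(buf.getvalue())
```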

Load Run Sheet

| Item # | Step Description | Team Responsible |
| 1 | Ensure pre-load sign-offs are obtained. | SCM Data team |
| 2 | Go to the load tool and select the correct load program. | SCM Data team |
| 3 | Proceed with the data load. | SCM Data team |
| 4 | Validate a few loaded records by accessing standard transactions. | SCM Data team |
| 5 | Generate the post-load reports in the tool. | SCM Data team |
| 6 | Log errors as defects, if any, and address resolutions. Close defects. | SCM Data team |
| 7 | Resolve defects by re-uploading and re-generate post-load reports if necessary. | SCM Data team |
| 8 | Business to validate the post-load files as part of post-load validation, raise data defects, or provide the post-load sign-off. | Business |
| 9 | Repeat steps 5 to 7 if necessary. | SCM Data team |


Load Phase and Dependencies

Identify the phase in which the load for this object will occur <Pre-Cutover, Cutover, Post-Cutover> and list the steps that need to occur before the load can commence.

Configuration

List the Configurations required before loading can commence

| Item # | Configuration Item |






Conversion Objects

| Object # | Preceding Object Conversion Approach |
| 1 | CNV-1047 Batch Characteristics of Class Type: 023 |
| 2 | CNV-1046 Classes Type: 023 (Batch) |

Error Handling


| Error Type | Error Description | Action Taken |
| Configuration | <configuration> is not valid/missing | If it is a missing configuration item, engage the Functional team to expedite and fix the error in the system. |
| Invalid Data | <parameter> is not valid. | The parameter entry needs to be reviewed (e.g. invalid payment terms). If the data is invalid, the business needs to review and correct the source of the data, either in the legacy systems PF2 and WP2 or, if it is mapped, in the DCT. |
| Technical Setup | Interface/connection issue within the target system's landscape | N/A – the data will be loaded directly into the S/4HANA environment |
| Authorisation Error | <USER ID> is not authorised to access Business Partner create | Contact Basis to obtain a Fire Fighter ID or fix the authorisation setup |


Post-Load Validation

Project Team

Completeness

| Task | Action |
| Verify the Count | Verify that the record count in the post-load file is the same as the record count based on the relevancy (including any deduplication) results. |


Accuracy

| Task | Action |
| Data Accuracy | SCM Data team to verify that all the data in the post-load table/file is accurate as per signed-off transformation rules and DCT contents. |
| Error Reports | Verify that all necessary error reports have been validated, and that errors have been addressed. |
| Data Consistency | Verify that the data loaded is correctly reflected in T-Code MSC3N or tables MCHA & AUSP. |
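The count-based consistency check can be sketched as a simple reconciliation between the relevant source record count and the count extracted back from the target; the counts below are sample values:

```python
# Illustrative post-load reconciliation: compare the number of relevant
# source records against the number of records extracted from the target
# after the load. The counts are sample values for this sketch.
relevant_source_count = 1250  # relevancy (incl. deduplication) result
loaded_target_count = 1248    # records extracted back from target tables

def reconcile(source_count, target_count):
    """Return (is_complete, delta) for a post-load count check."""
    return source_count == target_count, source_count - target_count

ok, delta = reconcile(relevant_source_count, loaded_target_count)
print(ok, delta)
```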


Business


Completeness

| Task | Action |
| Verify Count | Verify that the record count in the post-load file is the same as the record count based on the relevancy (including deduplication) results. |
| Validate Loaded Data | Validate, as per the signed-off load files, that all records were created. |


Accuracy

| Task | Action |
| Verify Data Accuracy | Verify that all the data in the target tables is accurate as per signed-off transformation rules and DCT contents. |
| Review Post-Load Error Reports | Verify that all necessary post-load error reports have been validated, and that errors have been addressed. |
| Validate Loaded Data | Validate, as per the signed-off load files, that all records are in the target system. |


Key Assumptions

  • The Master Data Standards document is an evolving document; value mappings/validation checks are still being finalized, as we are currently in the detailed phase as of September 2025.
  • Batch Master and Classification data is in scope based on the data design and any exceptions requested by the business.


See also

Insert links and references to other documents which are relevant to understanding this conversion approach and its implications. Related decisions are often impacted, so it is good to list them here with links. Attachments are also possible but risky, as they are static documents and not updated by their authors.

Change log