| Status | Revision under Review |
|---|---|
| Owner | THANGARAJAN-ext, Ganesan |
| Stakeholders | |
Purpose
The purpose of this document is to define the data conversion approach for creating Pricing Condition Records in the SAP S/4HANA target system, as part of a Greenfield implementation.
In SAP ECC, sales pricing condition records are maintained against the pricing procedure configuration and stored in the condition tables (KONH for header data, KONP for item data, and the A* tables for the key combinations), while document-level pricing results reside in KONV. These records define pricing elements such as base price, discounts, freight, and surcharges across sales documents.
In SAP S/4HANA, the structure and usage of pricing condition records remain largely consistent; however, data models may be simplified, and dependencies on business partners (replacing customer/vendor master records) become critical in ensuring consistency across sales and purchasing functions.
This conversion aims to migrate active and relevant sales pricing condition records from existing ECC systems into S/4HANA by applying required transformation logic using Syniti as the data migration and transformation platform. The converted records will be loaded into the target S/4HANA system using standard SAP mechanisms such as IDOCs, BAPIs, or direct table loads where applicable.
Conversion Scope
The scope of this document includes the end-to-end approach for:
- Extracting existing Pricing Condition Records from SAP ECC systems WP2 and PF2.
- Applying transformation and cleansing logic via Syniti to conform with the S/4HANA data model and business partner framework.
- Loading the transformed condition records into SAP S/4HANA while ensuring data integrity, correct assignment to condition types, condition tables, access sequences, and pricing procedures.
This process will support the migration of condition records such as:
- Base Price (e.g., PR00)
- Discounts (e.g., K007, K004)
- Freight and surcharges
The conversion will ensure all condition records are aligned to the new pricing key combinations defined for the target S/4HANA landscape, for example combinations based on Business Partner, Material Master, and the relevant organizational units (Sales Organization, Distribution Channel, Division).
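As an illustration of the key-combination realignment described above, the sketch below remaps a legacy ECC customer-keyed condition record to the S/4HANA business-partner key. The field names and the customer-to-BP cross-reference are hypothetical assumptions for illustration, not the project's actual mapping tables.

```python
# Sketch: remap a legacy ECC condition-record key (customer-based) to the
# S/4HANA business-partner-based key combination. The cross-reference table
# and record layout below are illustrative assumptions.

# Hypothetical cross-reference: legacy customer number -> business partner ID
CUSTOMER_TO_BP = {
    "0000100001": "BP100001",
    "0000100002": "BP100002",
}

def remap_condition_key(record: dict) -> dict:
    """Return a copy of the record with the customer key replaced by its BP."""
    target = dict(record)
    legacy_customer = target.pop("KUNNR")  # legacy customer number
    bp = CUSTOMER_TO_BP.get(legacy_customer)
    if bp is None:
        raise ValueError(f"No business partner mapped for customer {legacy_customer}")
    target["PARTNER"] = bp  # business partner ID in the target key
    return target

record = {"VKORG": "1000", "VTWEG": "10", "SPART": "00",
          "KUNNR": "0000100001", "MATNR": "MAT-001", "KBETR": "99.50"}
print(remap_condition_key(record)["PARTNER"])  # BP100001
```

Records whose customer has no mapped business partner are rejected rather than defaulted, so gaps in the cross-reference surface during transformation instead of at load time.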
The data from the legacy systems includes:
- List price at various key combinations including scale-based pricing as applicable
- Discounts and surcharges at various key combinations
- Freight condition records at various key combinations
The data from the legacy systems excludes:
- Tax condition records, which are covered in the CONV-1034 Tax Condition Records spec.
- Rebate condition records, which will be managed via condition contract setup.
| Source | Scope | Source Approx No. of Records | Target System | Target Approx No. of Records |
|---|---|---|---|---|
| WP2 | Pricing Condition Records | | S/4HANA System | |
| PF2 | Pricing Condition Records | | S/4HANA System | |
Additional Information
Multi-language Requirement
| International Version | Description |
|---|---|
| C | Simplified Chinese |
| R | Cyrillic |
| K | Kanji (Japanese) |
| A | Arabic |
| 3 | Korean |
| T | Thai |
| H | Hangul |
Document Management
Legal Requirement
Special Requirements
Target Design
This section describes the technical design of the target structures for this conversion approach.
| Table | Field | Data Element | Field Description | Data Type | Length | Requirement |
|---|---|---|---|---|---|---|
| KONH | KNUMH | KNUMH | Condition Record Number | CHAR | 10 | Mandatory (Key) |
| KONH | ERDAT | ERDAT | Created On | DATS | 8 | Optional |
| KONH | DATAB | DATAB | Valid From | DATS | 8 | Mandatory |
| KONH | DATBI | DATBI | Valid To | DATS | 8 | Mandatory |
| KONP | KSCHL | KSCHL | Condition Type | CHAR | 4 | Mandatory |
| KONP | KBETR | KBETR | Rate | CURR | 11 | Mandatory |
| KONP | KPEIN | KPEIN | Pricing Unit | DEC | 5 | Mandatory if UoM is specified |
| KONP | KMEIN | KMEIN | Unit of Measure | UNIT | 3 | Optional |
| KONP | WAERS | WAERS | Currency | CUKY | 5 | Mandatory |
| KONP | KZBZG | KZBZG | Calculation Type | CHAR | 1 | Mandatory |
Data Cleansing
| ID | Criticality | Error Message/Report Description | Rule | Output | Source System |
|---|---|---|---|---|---|
Conversion Process
The high-level process is represented by the diagram below:
Data Privacy and Sensitivity
Extraction
Extract data from the source into the migration repository. There are two possibilities:
- The data exists. The migration platform connects to the source and loads the data into the repository. There are three methods:
  - Perform full data extraction from relevant tables in the source system(s).
  - Perform extraction through the application layer.
  - Only if the platform cannot connect to the source, load the data into the repository from a provided source system extract/report.
- The data does not exist (or cannot be converted from its current state). The data is manually collected by the business using a Data Collection Template (DCT).
The agreed relevancy criteria are applied to the extracted records to identify those applicable for the target loads.
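A relevancy filter of the kind applied here might, for example, keep only records still valid at cutover and not flagged for deletion. The cutover date and the use of the KONP deletion indicator (LOEVM_KO) are illustrative assumptions; the actual criteria are whatever is agreed with the business.

```python
# Sketch: apply an assumed relevancy rule to extracted condition records,
# keeping records whose validity extends past a hypothetical cutover date
# and which are not flagged for deletion (LOEVM_KO).
from datetime import date

CUTOVER = date(2025, 1, 1)  # hypothetical cutover date

def parse_dats(s: str) -> date:
    """Parse an SAP DATS value (YYYYMMDD)."""
    return date(int(s[:4]), int(s[4:6]), int(s[6:8]))

def is_relevant(record: dict) -> bool:
    still_valid = parse_dats(record["DATBI"]) >= CUTOVER
    return still_valid and record.get("LOEVM_KO", "") != "X"

extracted = [
    {"KNUMH": "0000000001", "DATBI": "99991231", "LOEVM_KO": ""},
    {"KNUMH": "0000000002", "DATBI": "20230630", "LOEVM_KO": ""},   # expired
    {"KNUMH": "0000000003", "DATBI": "99991231", "LOEVM_KO": "X"},  # flagged
]
relevant = [r for r in extracted if is_relevant(r)]
print([r["KNUMH"] for r in relevant])  # ['0000000001']
```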
Extraction Run Sheet
| Req # | Requirement Description | Team Responsible |
|---|---|---|
Selection Screen
| Selection Ref Screen | Parameter Name | Selection Type | Requirement | Value to be entered/set |
|---|---|---|---|---|
Data Collection Template (DCT)
A target-ready Data Collection Template will be created for the data, with the exception of some fields that require transformation as described in the transformation rules.
DCT Rules
| Field Name | Field Description | Rule |
|---|---|---|
Extraction Dependencies
| Item # | Step Description | Team Responsible |
|---|---|---|
Transformation
The target fields are mapped to the applicable legacy fields that will serve as their source. This is a three-way activity involving the Business, the Functional team, and the Data team, and it identifies the transformation work required to make the data target-ready:
- Perform value mapping and data transformation rules.
- Legacy values are mapped to the to-be values (this may include a default value).
- Values are transformed according to the rules defined in the Transformation Rules section below.
- Prepare target-ready data in the structure and format required for loading via the prescribed load tool. This step also produces the load data for the business to perform pre-load data validation.
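The value-mapping step above, including the default-value fallback, can be sketched as follows. The sales-organization values and the default are hypothetical placeholders, not the project's agreed mapping table.

```python
# Sketch: map a legacy value to its to-be value, falling back to a default
# when no mapping entry exists. The mapping entries and default below are
# illustrative assumptions.

SALES_ORG_MAP = {"W100": "1000", "W200": "2000"}  # legacy -> target
DEFAULT_SALES_ORG = "1000"                        # hypothetical default

def map_value(legacy_value: str, mapping: dict, default: str) -> str:
    """Return the mapped to-be value, or the agreed default if unmapped."""
    return mapping.get(legacy_value, default)

print(map_value("W200", SALES_ORG_MAP, DEFAULT_SALES_ORG))  # 2000
print(map_value("W999", SALES_ORG_MAP, DEFAULT_SALES_ORG))  # 1000 (default)
```

Whether unmapped values default or error out is a per-field decision captured in the transformation rules; defaulting silently is only appropriate where the business has signed off on the default.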
Transformation Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
Transformation Rules
| Rule # | Source system | Source Table | Source Field | Source Description | Target System | Target Table | Target Field | Target Description | Transformation Logic |
|---|---|---|---|---|---|---|---|---|---|
Transformation Mapping
| Mapping Table Name | Mapping Table Description |
|---|---|
Transformation Dependencies
List the steps that need to occur before transformation can commence.
| Item # | Step Description | Team Responsible |
|---|---|---|
Pre-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
Business
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
Load
The load process includes:
- Execute the automated data load into the target system using the load tool, or produce the load file if the load must be done manually.
- Once the data is loaded into the target system, it is extracted and prepared for post-load data validation.
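The post-load extract feeds a completeness check of the kind described later in this document: comparing the keys that were loaded against the keys found in the target. A minimal sketch, with illustrative record keys:

```python
# Sketch: reconcile the load file against the post-load extract by condition
# record key, flagging records that did not arrive in the target.

def reconcile(loaded_keys: set, target_keys: set) -> dict:
    """Summarize load completeness and list missing record keys."""
    return {
        "loaded": len(loaded_keys),
        "in_target": len(loaded_keys & target_keys),
        "missing": sorted(loaded_keys - target_keys),
    }

loaded = {"0000000001", "0000000002", "0000000003"}
in_target = {"0000000001", "0000000003"}
result = reconcile(loaded, in_target)
print(result["missing"])  # ['0000000002']
```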
Load Run Sheet
| Item # | Step Description | Team Responsible |
|---|---|---|
Load Phase and Dependencies
Configuration
| Item # | Configuration Item |
|---|---|
Conversion Objects
| Object # | Preceding Object Conversion Approach |
|---|---|
| List the exact title of the conversion approach for the immediate predecessor object only – this confirms the DDD (Data Dependency Diagram) | |
Error Handling
| Error Type | Error Description | Action Taken |
|---|---|---|
Post-Load Validation
Project Team
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
Business
Completeness
| Task | Action |
|---|---|
Accuracy
| Task | Action |
|---|---|
Key Assumptions
- The Master Data Standard is up to date as of the date this conversion approach is documented and the data is loaded.
- is in scope based on data design and any exception requested by business.
See also
Change log
Workflow history
| Title | Last Updated By | Updated | Status | |
|---|---|---|---|---|
