When a new code (for example a site code) is created in SAP to replace an existing one, transcodification can be used to automatically move all the data from the old code to the new one in Dynasys.

Two steps:

  • Master table transcodification
  • Dynamic DH transcodification


1/ Master table transcodification

Topic: transcodify a master data record in a master table of the hierarchy

Target: when a record is replaced by another one (a SKU or a site, for example), transcodifying the record avoids having to copy/paste all the manual data entered in Dynasys from the old record to the new one

How to do it: either by editing the master table and modifying the record manually, or by executing a SQL script in the APP database

=> Applicative database architecture:

  • CR_CRITERION table contains the list of master tables with their associated IDs
  • CR_MODALITY table contains the records of all the master tables, each record linked to its master table ID
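These two tables can be joined to browse the content of a given master table before any change; a read-only check, using the column names from the transcodification scripts below:

```sql
-- List all records of the "Site" master table (read-only, run before any update)
select m.NAME
from CR_MODALITY m
inner join CR_CRITERION c on m.CRITERION_ID = c.ID
where c.name = 'Site';
```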


=> Example of a full site transcodification (site 7712R replaced by 4049R):

  • Transcodification of the site code inside the SKU records (master table "SKU"): update m set m.name = left(m.NAME, CHARINDEX('@', m.NAME)-1) + '@' + '4049R' from CR_MODALITY m inner join CR_CRITERION c on m.CRITERION_ID = c.ID where c.name = 'SKU' and m.name like '%@7712R'
  • Transcodification of the site code itself (master table "Site"): update m set m.name = '4049R' from CR_MODALITY m inner join CR_CRITERION c on m.CRITERION_ID = c.ID where c.name = 'Site' and m.name = '7712R'
  • Transcodification of the main production site code (master table "Main production site"): update m set m.name = '4049R' from CR_MODALITY m inner join CR_CRITERION c on m.CRITERION_ID = c.ID where c.name = 'Main production site' and m.name = '7712R'
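After running the updates, a quick check can confirm that no record still references the old code; a minimal verification query on the same tables:

```sql
-- Verification after the transcodification: no record should still
-- reference the old site code, either directly or as a SKU suffix
select c.name as master_table, m.NAME as record
from CR_MODALITY m
inner join CR_CRITERION c on m.CRITERION_ID = c.ID
where m.NAME = '7712R' or m.NAME like '%@7712R';
```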


=> Warnings / prerequisites:

  • Ensure the new record does not already exist, otherwise the update will generate duplicates => duplicates lead to KO statuses in batches and in reporting
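A pre-check helps here: the queries below (read-only, same tables as the update scripts) return the collisions that the transcodification would create; a non-empty result means the data has to be analysed or excluded first.

```sql
-- Does the target record already exist in the "Site" master table?
select m.NAME
from CR_MODALITY m
inner join CR_CRITERION c on m.CRITERION_ID = c.ID
where c.name = 'Site' and m.NAME = '4049R';

-- Same idea for the SKU table: SKUs that already exist on BOTH the old
-- and the new site code would collide after the transcodification
select left(m.NAME, CHARINDEX('@', m.NAME) - 1) as sku_root
from CR_MODALITY m
inner join CR_CRITERION c on m.CRITERION_ID = c.ID
where c.name = 'SKU'
  and m.NAME like '%@7712R'
  and left(m.NAME, CHARINDEX('@', m.NAME) - 1) + '@4049R'
      in (select NAME from CR_MODALITY where CRITERION_ID = c.ID);
```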


=> Actions to perform :

  1. Data analysis:
    1. Would the new records generate duplicates? => if yes, the data has to be analysed / excluded
    2. Do the new attributes already exist? Depending on the answer, the attributes only have to be updated, or also created
  2. Transcodify the records
  3. Update the attributes:
    1. either manually
    2. or using a manual CSV import mapping with dedicated data
    3. or by letting the interfaces update the data (if the delay does not impact the users and if no data is missing in the interfaces)


=> Once the master data is transcodified, the old master data (e.g. the SKU) must not be created again in the hierarchy:

  • The master data has to be deactivated in SAP => the material is no longer linked to a site code, so that it is received with a deletion flag
  • When this is not possible, a safeguard has been set up in GetUNC to force the deletion flag on SKUs linked to a transcoded site


=> Case of PO2:

  • Material, shipto, soldto and site replacement depending on the GBU: new codes for some GBUs, current codes kept for the other GBUs
  • After analysis it was decided:
    • to transcodify only the DFU, SKU and Site master tables
    • not to transcodify material, shipto and soldto, but to create the new ones manually in the master tables of the concerned GBUs
  • DP:
    • transcodify DFU
    • transcodify Main Shipping Site / Main Production Site / Main production plant
    • create the new material / shipto / soldto / STDC / SoldtoDC / SoldtoBU records
    • update the attributes
    • some new sites already existed in DP => DFU transcodification, then update of the site attribute
  • DIP:
    • transcodify SKU
    • transcodify Site / Main shipping site / SAP plant code / Main production site / Main production plant / Manufacturing plant / Prod source / Homologated sites / Resource / Line Workshop
    • create the new material / shipto / soldto / STDC / SoldtoDC / SoldtoBU records
    • update the attributes
    • some new sites already existed in DIP => SKU transcodification, then update of the site attribute
    • DH initialization for homologation


2/ Dynamic DH transcodification

Issue:

  • When a new code (like a site code) is created in SAP to replace another one, some transactional data (like actuals) is not moved to the new code
  • This means some transactional data is still stored on the old code while the new code is already used in SAP by the business teams
  • Dynasys receives the data on the old and new codes as is, and all the data needs to be merged onto the new code so that the users have a single, consolidated view on the new code

Principle:

  • The transcodification is done in the DH MAIN tables, set up in GetUNC, file by file depending on the master data to be transcodified
  • The sites, SKUs and DFUs to be transcodified are listed in a datafield in Dynasys, exported to the DH and then used in GetUNC to transcodify the MAIN tables
  • The DH tables have to be initialized before the batches
  • The settings DH tables are the following:
    • INT_SITE_TRANSCO_MAIN
    • INT_SKU_TRANSCO_MAIN
    • INT_DFU_TRANSCO_MAIN
  • The transcodification can be extended to other data and tables
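The exact structure of these settings tables is not shown here; based on the update scripts used later in the procedure, a plausible layout for the site table is the following sketch (column names inferred from the scripts, types and naming to be checked against the actual project):

```sql
-- Inferred layout of the site settings DH table; the column names come from
-- the GetUNC update scripts (site_code, target_site_code, Material_Code),
-- the types are assumptions to validate per project.
create table INT_SITE_TRANSCO_MAIN (
    site_code        varchar(20) not null, -- old site code, as received from SAP
    target_site_code varchar(20) not null, -- new site code to merge the data onto
    Material_Code    varchar(40) null      -- DIP only: filled to flag a SKU exception
);
```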

Procedure :

DP:

  • fill in the Site and DFU datafields
  • the data is exported at the end of the static batch
  • the data is then used in the next update DH to transcodify the MAIN tables

Datafields to be used:

The datafields are then exported to the settings DH tables within the static batch, using the following mappings:

They are then used in the GetUNC / Update DH batch with the following script (example of the MATERIAL MAIN table transcodification):

TableArray(9,11) = "update a set a.Main_Production_Plant_Code = b.target_site_code from DYS_BW_MATERIAL a inner join int_site_transco b on a.Main_Production_Plant_Code = b.site_code"


DIP:

  • fill in the Site, SKU exception and SKU datafields
  • the "Site SKU Exception" datafield is used to flag the exceptions, i.e. the SKUs for which we want to keep the old site and not transcodify the SKU
  • the data is exported at the end of the static batch
  • the data is then used in the next update DH to transcodify the MAIN tables

Datafields to be used:

The datafields are then exported to the settings DH tables within the static batch, using the following mappings:

They are then used in the GetUNC / Update DH batch with the following script (example of the MATERIAL MAIN table transcodification, including the exception handling):

TableArray(5,10) = "update a set a.Main_Production_Plant_Code = b.target_site_code from DYS_BW_MATERIAL a inner join int_site_transco b on a.Main_Production_Plant_Code = b.site_code where b.Material_Code is NULL and a.Material_Code not in (select material_code from int_site_transco where Material_Code is not NULL and a.Main_Production_Plant_Code = site_code)"
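To read the where clause: rows of int_site_transco with an empty Material_Code carry the general site rule, while rows with a Material_Code filled flag an exception. A hypothetical table content illustrating the behaviour:

```sql
-- Hypothetical content of int_site_transco (illustration only):
--   site_code | target_site_code | Material_Code
--   7712R     | 4049R            | NULL      -- general rule: 7712R -> 4049R
--   7712R     | 4049R            | MAT001    -- exception: MAT001 stays on 7712R
--
-- With this content, a material MAT002 on plant 7712R is moved to 4049R,
-- while MAT001 on plant 7712R is left unchanged by the update above.
```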

