Definition

  • Boost export speeds up export execution from DSCP to the DH (Datahub)
  • How it works:

1.Export data to a CSV file, using the existing export scenario/mapping

2.Load the CSV file into a temporary table in the DH

3.Insert data from the temporary table into the destination table

4.Delete the CSV file
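The four steps above can be sketched as follows. This is purely illustrative (using Python and SQLite as stand-ins); the real implementation is the DSCP macro function described below, and all names here are hypothetical:

```python
import csv
import os
import sqlite3

def boost_export(rows, header, csv_path, conn, temp_table, dest_table):
    """Illustrative 4-step boost-export flow: export to CSV, bulk-load into
    a temporary table, copy into the destination table, delete the CSV."""
    # 1. Export data to a CSV file (plain csv writer as a stand-in for the
    #    existing export scenario/mapping).
    with open(csv_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    # 2. Load the CSV file into a temporary table in the DH.
    cols = ", ".join(header)
    conn.execute(f"CREATE TABLE {temp_table} ({cols})")
    placeholders = ", ".join("?" for _ in header)
    with open(csv_path, newline="") as f:
        conn.executemany(
            f"INSERT INTO {temp_table} VALUES ({placeholders})", csv.reader(f)
        )
    # 3. Insert data from the temporary table into the destination table.
    conn.execute(f"CREATE TABLE IF NOT EXISTS {dest_table} ({cols})")
    conn.execute(f"INSERT INTO {dest_table} SELECT * FROM {temp_table}")
    # 4. Delete the CSV file.
    os.remove(csv_path)
```

The gain comes from replacing many row-by-row ODBC writes with one file dump plus one set-based insert on the database side.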


Function BulkGenericExportScenarioInfoById

Structure :
  • BulkGenericExportScenarioInfoById(ByRef TextFile, ByRef oDBCON, ByVal nId, ByRef vsFileName, ByRef bRemoveZero, ByRef nPastYearsToKeep, ByRef nFutureMonthsToKeep)
Example :
  • BulkGenericExportScenarioInfoById TextFile, oDBCON, 16787388, strBulkFileName, False, 1, 0
  • This export is applied to DR export tables


Structure :
  • ByRef TextFile: file type
  • ByRef oDBCON: DH connection
  • ByVal nId: export scenario ID
  • ByRef vsFileName: file name
  • ByRef bRemoveZero: boolean controlling 0 values (relevant when the default value is <> 0):
    • True: rows with value 0 are deleted from the temporary table
    • False: rows with value 0 are kept in the temporary table
  • ByRef nPastYearsToKeep: past data management (by complete year):
    • 0: data over the whole horizon are kept
    • Otherwise the formula is: delete data where year <= current year - nPastYearsToKeep
    • 1: past data before the current year are deleted
      • Example: current day 24/11/2020 => past data up to and including 2019 are deleted
    • 2: past data before current year - 1 are deleted
      • Example: current day 24/11/2020 => past data up to and including 2018 are deleted
    • 3 and more (variable not limited): 2017 and before are deleted
  • ByRef nFutureMonthsToKeep: future data management (by month):
    • 0: data over the whole horizon are kept
    • Otherwise the formula is: delete data where month > current month + nFutureMonthsToKeep
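The two horizon parameters above can be sketched as a row filter. This Python sketch is illustrative only (it is not the function's internals); the future-month comparison is interpreted as an absolute month index so it crosses year boundaries:

```python
from datetime import date

def keep_row(row_year, row_month, n_past_years_to_keep, n_future_months_to_keep,
             today=date(2020, 11, 24)):
    """Illustrative sketch of the past/future horizon filters.
    Rows returning False are deleted from the temporary table."""
    # Past data management (by complete year): 0 keeps the whole horizon,
    # otherwise delete data where year <= current year - nPastYearsToKeep.
    if n_past_years_to_keep != 0 and row_year <= today.year - n_past_years_to_keep:
        return False
    # Future data management (by month): 0 keeps the whole horizon,
    # otherwise delete data where month > current month + nFutureMonthsToKeep
    # (assumption: compared as an absolute month index, crossing years).
    if n_future_months_to_keep != 0:
        row_index = row_year * 12 + row_month
        limit = today.year * 12 + today.month + n_future_months_to_keep
        if row_index > limit:
            return False
    return True
```

With the document's example date of 24/11/2020 and nPastYearsToKeep = 1, all rows from 2019 and earlier are deleted while 2020 onward is kept.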
Use in a macro :
  • Build the file name strBulkFileName:

strBulkFileName = GetUNC(1) & "\" & strSQLConnectionName & "_" & GetMacroIdByName(GetCurrentMacroName) & "_bf"

  • Call export function :

 BulkGenericExportScenarioInfoById TextFile, oDBCON, 16787388, strBulkFileName, False, 1, 0

  • The export scenario must be created beforehand => its ID is passed to the function
  • Call the delete function to remove the CSV file after the export:

DeleteFile(strBulkFileName)


Use in a macro :
  • The following functions must be created in « Import – Library – Base Function »:
    • GetMacroIdByName
    • DeleteFile
  • The paths where the generated CSV files are located are defined in « GetUNC »:
    • strAppInterfacesCusPI / TI / SI / DI
    • Corresponding to GetUNC(1)


Datahub :
  • Use of « sys_bulk » temporary tables



Execution details :
  • Step 1: data are exported (excluding default values) to a CSV file named after the scenario on the server => data are exported following the datafield's dimension order

  • Step 2: the function determines the datafield's number of dimensions and empties the corresponding temporary table

  • Step 3: the CSV file is loaded into the corresponding temporary table

  • Step 4: 0 values are managed in the temporary table


Execution details :
  • Step 5: past data management

  • Step 6: future data management

  • Step 7: the destination table is reset (data deleted)
    • The function uses the name of the ODBC query to call the corresponding SQL view => they must have the same name to feed the corresponding destination DH table

  • Step 8: data are inserted from the temporary table into the destination DH table
    • The function also uses the name of the ODBC query to call the SQL view

  • In the case above, the function inserts:
    • strSQLConnectionName
    • 2 empty fields
    • Then the fields of the temporary table
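As a rough illustration of the insert described above, the statement could be built along these lines. Table and column names here (`sys_bulk_<n>`, `dim1`, `value`) are assumptions for the sketch; the real function derives its SQL from the ODBC query name:

```python
def build_insert_sql(odbc_query_name, connection_name, n_dims):
    """Sketch of the INSERT from the sys_bulk temporary table into the
    destination DH view/table (all names here are illustrative)."""
    temp_table = f"sys_bulk_{n_dims}"  # one temporary table per dimension count
    dim_cols = ", ".join(f"dim{i}" for i in range(1, n_dims + 1))
    # The function inserts: the connection name, 2 empty fields,
    # then the fields of the temporary table.
    return (
        f"INSERT INTO {odbc_query_name} "
        f"SELECT '{connection_name}', '', '', {dim_cols}, value "
        f"FROM {temp_table}"
    )
```

Because the insert is positional, the column order produced by the export scenario must match the destination table's field order, as noted below.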


Execution details :
  • Steps 7 and 8:
    • Temporary and destination table structures are different => the insert structure is specific
    • The insert doesn't manage the order of data between the temporary and destination tables => the data order when generating the CSV file must match the destination table's field order
    • In the DR export case, temporary tables and DR tables have consistent data types => VARCHAR => no problem with the insert
  • Global:
    • If the data selection needs to be adjusted, modify the function variables rather than the mapping scenario (the best is to align both)


Deployment – Pre-requisite

  • Can be applied to exports whose execution takes more than 10 s
  • To be checked before:

1.Scenario export settings:

  1.Selection on horizon, SKU...

  2.Default values + null values settings

  3.Destination table reset

2.Exported datafield dimension order vs destination table

3.Date format of the horizon period

4.SQL table or view and corresponding ODBC query names – identical names without spaces and parentheses

5.Temporary tables corresponding to the number of dimensions to be exported

6.Data types consistent between temporary and destination tables

7.Insert structure corresponding to the destination tables


  • The function is currently adapted to QV and DR SQL tables
  • The code will probably have to be updated in case of deployment of this function on other exports


Step by step

  • Step 1: identify an export to be boosted
    • Example: execution > 10 s (info in the log file)


  • Step 2: check pre-requisites
    • Datafield settings:
      • Default value = 0


  • Step 2: identify settings of this export
    • Datafield settings:
      • Dimension order conforms to the destination export table: DFU_Code (Item) first, then Period (Horizon)



  • Step 2: identify settings of this export
    • Current mapping settings:
      • Only non-default values are exported
      • Past horizon before the current year is not exported


  • The table is reset before export + the SQL view and ODBC query have the same name:


  • Step 3: modify the export macro
    • Set up the macro with the boost function:





