Document Links

Introduction

This Data Flow Specification (DFS) defines the end-to-end data flow required to meet the following requirements:

| Level | Process | Description | Jira Sty | Jira Mod | Migration | Details |
| --- | --- | --- | --- | --- | --- | --- |
| Bucket | Bucket Planning | Top-down and Bottom-up planning | 1246 | 729 | 036 | 4 views, year, category |
| Item | Item Planning | Plan Project Cost (FEC) & FY Budget Request | 669 | 2665 | 9035 / 1022 | 2 views, month, category |
| Business Case | | Calculate Return on Investment | | 263 (2822, 2823, 2824, 2825, 2826, 2827, 2828); 9105 (Portfolio MD (defaults), Item MD, Costs, P&L, Cash Impacts, Subsidies, Calculations) | | Only Financial relevant projects; Commentary |
| Scoring | | Make comparable | 669 | 2875 | n/a | Probably just a view in the DDFS rather than an AM; separate model (persist data for snapshots) |
| Prioritisation | | Priority | Req 519 | n/a | n/a | |
| Item Retraction | | Export to S/4 | 671 | n/a | n/a | Issues regarding the filter and locking |
| Item Snapshots | | Change of Gate or variation | Req 879 | 2964 | n/a | Quarterly, Variation, Gates, Adhoc |
| Item Automation | | Quarterly or adhoc snapshot; accuracy calc | Req 759 | n/a | | Overwrite plan with actuals (business need, not reporting solution); mass update of scoring, update accuracy |
| Project | WBS Planning | Plan costs | 670 | 2664 | 1026 | Extracting WBS from RPSCO |
| Portfolio Reporting | Bucket, Item | Reporting on the Plan data against the Actuals | Req 152 | | n/a | Analytic Model design TBC |
| Project Reporting | WBS | Reporting on the Plan data against the Actuals | Req 760 | | n/a | Analytic Model design TBC |

Gaps

| Item | Detail | Status |
| --- | --- | --- |
| Bucket snapshot | No request? Potentially just a version copy? | Open |

Data Flow Diagram

Source System Extractors


| System | Code | Extractor Name | Purpose | Delta | Build Jira Ref For Extension Information | Texts |
| --- | --- | --- | --- | --- | --- | --- |
| S4H | RAuths | P_PPM_AUTHORIZATIONBYUSER | Maps users to Items | no | TBC | no |
| S4H | RT_Scoring | /SYQ/T_SCORING | Global parameters | no | TBC | no |
| S4H | RT_RiskM | /SYQ/T_RISK_MAGNITUDE | Starting Risk Magnitude, Residual Risk Magnitude | no | TBC | no |
| S4H | RT_RiskL | /SYQ/T_RISK_LKLHD | Starting Risk Likelihood, Residual Risk Likelihood | no | TBC | no |
| S4H | RT_RegM | /SYQ/T_REG_MAGNITUDE | Regulatory Risk Magnitude | no | TBC | no |
| S4H | RT_RegL | /SYQ/T_REG_LKLHD | Regulatory Risk Likelihood | no | TBC | no |
| S4H | RT_RegT | /SYQ/T_REG_TIME_HORIZON | Regulatory Mandated Time Horizon | no | TBC | no |
| S4H | RT_Strat | /SYQ/T_STRATEGIC | Strategic Weighting | no | TBC | no |
| S4H | xRPSCO | /SYS/I_PROJECTPLANCOSTAMOUNTUNION | WBS Planning. SAP standard uses ACDOCP, so we will have to design our own extractor where we union the 16 period columns | | TBC | |
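The RPSCO extractor note above implies an unpivot: SAP standard plan data keeps one amount column per period, and the custom extractor unions the 16 period columns into one row per period. A minimal sketch of that logic in Python (the WBS_ELEMENT, FISCAL_YEAR and PERIODnn field names are illustrative placeholders, not the actual CDS field names):

```python
# Sketch: unpivot one wide plan record (one amount column per period)
# into one row per period, mirroring what the custom extractor's UNION
# over the 16 period columns would produce. Field names are illustrative.
def unpivot_periods(record: dict, n_periods: int = 16) -> list[dict]:
    rows = []
    for p in range(1, n_periods + 1):
        col = f"PERIOD{p:02d}"          # e.g. PERIOD01 .. PERIOD16
        rows.append({
            "WBS_ELEMENT": record["WBS_ELEMENT"],
            "FISCAL_YEAR": record["FISCAL_YEAR"],
            "PERIOD": p,
            "AMOUNT": record.get(col, 0.0),  # unfilled periods default to 0
        })
    return rows

wide = {"WBS_ELEMENT": "WBS-001", "FISCAL_YEAR": 2025,
        "PERIOD01": 100.0, "PERIOD02": 250.0}
long_rows = unpivot_periods(wide)
```

In the DSP view this would be expressed as a UNION of 16 SELECTs, one per period column; the sketch only illustrates the target shape.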
The Item and Bucket hierarchy will not be extracted but rather built in DSP via the Parent UUID. This will work for the SAC planning component, but the DSP reporting component will need to be tested.

All of the above are considered Tier 1 and are only maintained in RoW.



Inbound Layer

No inbound field adjustments are applied. Standard technical fields (load date/time, source system) are retained as delivered.

The /SYQ/ namespace prefix is replaced by SYQ in object names.


Harmonisation Layer

2VR_RPSCO → 2VR_S4HARM_SYQI_ProjectPlanCostAmountUnion

Purpose: 

Plan and Budget for projects at WBS element

2VR_Auths → 2VR_S4HARM_SYQI_AuthorizationUser

Purpose: 

Assigns the Item UUID to authorised users where the project is sensitive. Will be read and added to the ProjectPortfolioItem Read/Write fields.

2VR_TScore → 2VR_S4HARM_SYQ_T_Scoring

Purpose: 

Global Parameter table used in scoring

2VR_TStrat → 2VR_S4HARM_SYQ_T_Strategic

Purpose: 

Parameter table used in scoring. Detail by GBU

2VR_TRiskM → 2VR_S4HARM_SYQ_T_RiskM

Purpose: 

Master data table to map the key to a value

2VR_TRiskL → 2VR_S4HARM_SYQ_T_RiskL

Purpose: 

Master data table to map the key to a value

2VR_TRegM → 2VR_S4HARM_SYQ_T_RegM

Purpose: 

Master data table to map the key to a value

2VR_TRegL → 2VR_S4HARM_SYQ_T_RegL

Purpose: 

Master data table to map the key to a value

2VR_TRegT → 2VR_S4HARM_SYQ_T_RegT

Purpose: 

Master data table to map the key to a value


Propagation Layer

Join with long texts (excluding Z07)

Create an OData connection to fill the master data required by SAC planning

2VR_LeadLed → 3VR_PrjAct

Purpose: 

Restrict to PPM relevant data

Filter:

where project <> ''

Comments:

Ensure we have commitments in this view



Data Migration approach:

Migrated data would normally be stored in a separate model; however, as this is seamless planning and the data will need to be amended alongside new data, it will have to be loaded as normal data in the same model.

Comments:

Long texts are required from STXL

Types of projects:

Capital expenditure (Capex), which is always managed via a project using PPM

Capital maintenance is assigned to a project

Operational expenditure (Opex) can be managed via PPM but does not have to be, e.g. finance-only projects

Master data:

When attributes need to be planned, we will try to plan them as measures rather than attributes

Scoring:

This is a separate model

3VR_Bucket → 2VR_SPOD_I2MPPM_I_ProjectPortfolioBucket

Purpose:

Read data from planning model to be used further in DSP




Project Score is calculated according to 3 methodologies: Financial, Risk or Regulatory.

How is this score used (attribute or measure) - prioritisation as a number to be sorted

Calculation - SAC planning action vs DSP persisted vs on the fly

Include value in snapshot - read score when snapping (better to persist)

I have split this into 3 views for simplicity, but they could be combined

3VR_Item → 3VR_SPOD_I2MPPM_ProjectPortfolioItem

Purpose:

Planning model made available for reporting

Item master data

Purpose:

Read data from planning model to be used further in DSP

Scoring:

Source Views:

I_ProjectPortfolioItem

4MP_Item

2VR_TScore

2VR_TStrat

2VR_TRiskL

2VR_TRiskM

2VR_TRegL

2VR_TRegM

2VR_TRegT

Join:

Using the attributes, read the master data scoring tables and update the value (attribute vs measure vs matrix view)

Formula:

Risk Matrix Score = Starting Risk - Residual Risk, where Starting Risk = Starting RISK_MAGNITUDE_VALUE * Starting RISK_LKLHD_VALUE and Residual Risk = Residual RISK_MAGNITUDE_VALUE * Residual RISK_LKLHD_VALUE

Reg Matrix Score = REG_MAGNITUDE_VALUE * REG_LKLHD_VALUE * REG_TIME_HORIZON_FACTOR
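Assuming the magnitude and likelihood values have already been resolved by key from the /SYQ/ master data tables (/SYQ/T_RISK_MAGNITUDE, /SYQ/T_RISK_LKLHD, /SYQ/T_REG_*), the two matrix scores reduce to:

```python
# Sketch of the matrix score formulas above. Inputs are the numeric
# values looked up from the /SYQ/ scoring master data tables.
def risk_matrix_score(start_magnitude: float, start_likelihood: float,
                      resid_magnitude: float, resid_likelihood: float) -> float:
    starting_risk = start_magnitude * start_likelihood
    residual_risk = resid_magnitude * resid_likelihood
    return starting_risk - residual_risk

def reg_matrix_score(reg_magnitude: float, reg_likelihood: float,
                     time_horizon_factor: float) -> float:
    return reg_magnitude * reg_likelihood * time_horizon_factor

score = risk_matrix_score(5, 4, 2, 3)  # starting 20, residual 6 -> 14
```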

3VR_ItemSS → 3TL_SPOD_I2MPPM_ItemSS

Purpose:

Perform a snapshot of the data at a particular point in time.
To cater for attributes that can change over time, these attributes will be moved to dimensions.

Source:

3VR_Item

Transformation flow:

Graphical flow

Formula:

Add a timestamp and derive the SnapShot ID

Process:

Adhoc - scheduled via a task chain triggered by an API step in a SAC multi-action

Quarterly - scheduled via a task chain based on time

Stage gate - scheduled via a task chain based on a table extracted from S/4 - TBC

3VR_Score → 3TL_SPOD_I2MPPM_Score

Purpose:

Project Score is calculated according to 3 methodologies: Financial, Risk or Regulatory.

How is this score used (attribute or measure) - prioritisation as a number to be sorted

Include value in snapshot - read score when snapping (better to persist)

Source:

Item

Business case

Scoring

Strategy

Parameter:

Restricted to Items in the Item Planning form

Transformation flow:

Suggest using SQL, as it makes complex calculations easier to express.

Read the above views and branch on the item type (CASE itemtype WHEN ...).

Read 2VR_Score, calculating Unweighted Project Score = 1 / (1 + EXP(-((Intensity Index + Index Offset) - Location Parameter) / Scale Parameter)) * PES Range + PES Minimum

Read 2VR_Strat where GBU = Item GBU and Buss Driver = Item Buss Driver, calculating Score = Unweighted Project Score * WGHTG_FACTOR
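The two read steps combine into a logistic score scaled into the PES range, which is then weighted by the GBU/business-driver factor. A Python sketch using the parameter names from the formula above:

```python
from math import exp

# Sketch of the project scoring calculation described above.
# Logistic parameters come from the scoring parameter table;
# WGHTG_FACTOR from the strategic weighting table.
def unweighted_project_score(intensity_index: float, index_offset: float,
                             location_parameter: float, scale_parameter: float,
                             pes_range: float, pes_minimum: float) -> float:
    logistic = 1 / (1 + exp(-((intensity_index + index_offset)
                              - location_parameter) / scale_parameter))
    return logistic * pes_range + pes_minimum

def project_score(unweighted: float, wghtg_factor: float) -> float:
    return unweighted * wghtg_factor

# At the location parameter the logistic term is exactly 0.5,
# i.e. the score sits at the midpoint of the PES range:
u = unweighted_project_score(intensity_index=10, index_offset=0,
                             location_parameter=10, scale_parameter=2,
                             pes_range=100, pes_minimum=0)
```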

Target:

3TL_Score with only ItemUUID and score

Trigger:

Can be initiated from the planning form for Item

However, if an ad-hoc request is required for all projects, an identical Transformation Flow without parameters will be required.




Reporting Layer

If you want to show the external format and text in SAP Analytics Cloud instead of the internal format, the modelling in the Analytic Model needs to be done like this example:


4MP_Item 

Purpose:

This is for planning

Functional Spec:

ERP-2665 Data Model - PPM Items

Comments:

Include live data 3VF_PrjAct (must be exposed for consumption)






Outbound Layer

When retracting, you cannot filter on the company code, even if it is in the model, as filtering is restricted to the fields in the API.

Options: experiment with having two fin view types, which can be corrected in the API mapping.

Maybe the GUIDs have a number-range sequence.