Attendees:
Fernando Girante
Todd Tupper
Matteo Menghetti
Mukti Patel
Open Actions from 9th May
- Mukti to start an issues log on Confluence
- Mukti to put a rough plan together to show team
- Todd and Matteo to start liaising with the Pricing Data Lake data engineers to add data to the environments. Was there an issue here?
- Joao Madrinha also mentioned adding access to projects owned by the data lake squad, which is to be discussed with the GCP team.
Minutes for 16th May
- Once we are happy with the domains and environments (staging, ODS, prod), we should look at security and the logging tables, which Matteo is working on
- Todd asked if we are responsible for reporting on who is accessing the data. Fernando mentioned there is no plan in our scope to provide any kind of reporting; it can be discussed in the future (out of scope)
- We will have some type of monitoring tool in the future, not limited to the Data Ocean (out of scope)
This week
- Todd and Matteo on leave on Thursday and Friday
- All team members to start adding documents, currently in draft form across multiple areas, to the wiki
- Todd and Matteo to continue working on Terraform to build and test tables; Matteo to add a supporting document to the wiki
- Matteo to fix the bug in the logging system
- Matteo to work on the reference table job for use with Talend
- Fernando to discuss the new Novicare AI initiative with Naveen
- Fernando to make small changes to the Data Architect documents as discussions and agreements progress with Matteo
- Fernando to continue working on the Data Architect SCD document, which is close to ready for review
- The team to continue working with the Pricing Data Lake architect and data engineers as they become able to start adding data to the staging environment from ODS and Salesforce
- As time permits, Fernando to start looking at the data creation engine related to data validation, as it is high priority for George, and to document the ideas discussed in draft form
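For reference, a minimal sketch of the kind of Terraform table definition Todd and Matteo are building and testing, assuming the tables live in BigQuery on GCP (the minutes mention the GCP team). The dataset name, table name, and columns below are hypothetical placeholders, not from the minutes:

```hcl
# Hypothetical example: defining a staging table in BigQuery via Terraform.
resource "google_bigquery_table" "staging_pricing" {
  dataset_id = "staging"            # assumed dataset name
  table_id   = "pricing_reference"  # assumed table name

  # Column schema supplied as a JSON-encoded list of field definitions.
  schema = jsonencode([
    { name = "id",         type = "STRING",    mode = "REQUIRED" },
    { name = "updated_at", type = "TIMESTAMP", mode = "NULLABLE" }
  ])
}
```

Keeping the schema in Terraform this way lets table changes be reviewed and promoted through staging before prod.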
Next week
- Fernando and Matteo to ensure the Data Architect Scheduler document covers what we do now using Talend, what information we need from jobs to ensure the correct scheduling sequence, what happens when a job fails, etc.