| Status | WIP |
|---|---|
| Stakeholders | |
| Outcome | |
| Due Date | |
| Owner | |
| Solution/Domain/Data Architect | |
Most of the lab sites are equipped with a file server running on top of an Iomega NAS. In some cases these devices are malfunctioning, and they are also at their limit for adding more hard disks, making it difficult to accommodate new storage demands while remaining compliant with data storage regulations.
TL;DR: tbd
Use cases
- Lyon/RICL
- OpenLab instruments generate files at the hundreds-of-GB scale for some techniques. Files of this size require local compute power to be loaded into stand-alone applications for expert analysis; their magnitude also makes it impractical to upload them to a cloud repository and creates difficulties in complying with regulatory storage terms.
- Shanghai
- Waters instruments demand 12 TB of space to store application data. Storage of this magnitude reaches the local limit of the LabPC disk and of the Iomega NAS on the local network.
- Bristol
- Waters instruments demand 12 TB of space to store application data. Storage of this magnitude reaches the local limit of the LabPC disk and of the Iomega NAS on the local network.
Questions and Concerns
Architecturally Significant Requirements
- Data rotation
- Retention
- Export Control/Cyber Sec
- Access Control
- Data transfer latency SLA
- Data Consumers (data structure/data model)
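The data-rotation and retention ASRs can be made concrete as a simple tiering rule. A minimal sketch, assuming hypothetical thresholds (the actual values must come from the regulatory and business requirements, not from this example):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention thresholds -- placeholders, not agreed policy.
HOT_DAYS = 90            # keep on LabPC / NAS for active analysis
ARCHIVE_DAYS = 365 * 7   # keep in a cloud archive tier for compliance

def storage_tier(last_access: datetime, now: Optional[datetime] = None) -> str:
    """Classify a file into a storage tier based on its last access time."""
    now = now or datetime.now(timezone.utc)
    age = now - last_access
    if age <= timedelta(days=HOT_DAYS):
        return "hot"       # stays on local storage
    if age <= timedelta(days=ARCHIVE_DAYS):
        return "archive"   # candidate for a cloud archive tier
    return "expired"       # eligible for deletion per retention policy
```

A rule like this would typically be expressed as a lifecycle policy on the target cloud storage rather than run as code; the sketch only pins down the decision logic to be agreed with stakeholders.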
Impediments and Blockers
Tradeoff analysis
- Alternatives
- XYZ
- YZX
- Sensitivity Points
- Risks
- Non Risks
- Architectural Approaches
Quick-Wins
The criticality of the problem demands at least a temporary solution to mitigate the immediate business impact of losing sensitive data due to the current local storage limit.
Some questions that can facilitate the analysis:
- To this end, it is necessary to identify the business requirements for using locally stored historical data:
- What type of search criteria is used on historical data?
- What retrieval is done on this found data?
- What type of processing is done on the found and retrieved data?
- Is it possible to parse this data?
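Answering these questions starts with an inventory of what historical data actually sits on the NAS. A minimal sketch that totals file sizes per age bucket (the bucket boundaries are illustrative assumptions, to be replaced by the retention terms identified above):

```python
import time
from collections import defaultdict
from pathlib import Path
from typing import Dict, Optional

# Illustrative age buckets in days -- adjust to the business requirement.
BUCKETS = [(30, "<30d"), (180, "30-180d"), (365, "180-365d")]

def summarize_by_age(root: str, now: Optional[float] = None) -> Dict[str, int]:
    """Walk a share and total file sizes (bytes) per modification-age bucket."""
    now = now or time.time()
    totals: Dict[str, int] = defaultdict(int)
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        stat = path.stat()
        age_days = (now - stat.st_mtime) / 86400
        label = next((name for limit, name in BUCKETS if age_days < limit), ">365d")
        totals[label] += stat.st_size
    return dict(totals)
```

Run against the NAS share of each site, the output shows how much data is actively used versus candidate for rotation, which directly informs the quick-win scope.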
Possible alternatives to consider in advance:
- Promote historical data to the ACD Labs domain (AWS Landing Zone) so that it can be ingested for later analysis in reports
- Promote historical data to the Lab-Booster domain (GCP) so that it can be ingested for later analysis in reports
- Promote historical data to the Azure Fabric Lakehouse so that it can be ingested for later analysis in reports
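Whichever target is chosen, hundreds-of-GB instrument files will not transfer reliably as a single object; all three platforms support chunked or multipart ingestion. A minimal, platform-neutral sketch that splits a file into parts with per-part checksums for integrity verification (the 64 MB part size is an assumption, not a platform requirement):

```python
import hashlib
from typing import Iterator, Tuple

PART_SIZE = 64 * 1024 * 1024  # 64 MB -- illustrative part size

def iter_parts(path: str, part_size: int = PART_SIZE) -> Iterator[Tuple[int, bytes, str]]:
    """Yield (part_number, data, sha256_hex) for each chunk of the file.

    Each part can then be handed to the target platform's multipart
    upload API (S3, GCS, or Azure Blob), with the checksum used to
    verify integrity after transfer.
    """
    with open(path, "rb") as f:
        part_number = 1
        while True:
            data = f.read(part_size)
            if not data:
                break
            yield part_number, data, hashlib.sha256(data).hexdigest()
            part_number += 1
```

The latency-SLA ASR above would then be evaluated per part size and available site bandwidth rather than per whole file.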
