Suggestions:
- System setup: multi-provider site version: a central instance where a data user selects data sets and algorithms -> the submission is communicated to all relevant data providers
- Data management: a means (a simple protocol) for sites to communicate their available data to the central instance; the central instance merges all site information and displays it.
- Execution/Output: support for federated workflows (federated learning is probably the most important case, e.g. running a PySyft workflow like the one Klaus/Max want to conduct).
- Executor: separation - user workflows run separately from the system flow (-> e.g. running a user workload does not alter the system flow…)
- Authentication: DKFZ AD -> Helmholtz AAI
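The data-management point above could be as simple as a JSON inventory message per site that the central instance merges for display. A minimal sketch; all field names (`site_id`, `datasets`, `num_cases`, …) are assumptions for illustration, not Kaapana's actual protocol:

```python
import json

def build_availability_message(site_id, datasets):
    """Data provider side: serialize the site's data-set inventory
    as a JSON message for the central instance (hypothetical schema)."""
    return json.dumps({
        "site_id": site_id,
        "datasets": [
            {"name": name, "modality": modality, "num_cases": n}
            for name, modality, n in datasets
        ],
    })

def merge_availability(messages):
    """Central instance side: merge per-site messages into one table
    keyed by data-set name, ready for display to the data user."""
    merged = {}
    for raw in messages:
        msg = json.loads(raw)
        for ds in msg["datasets"]:
            merged.setdefault(ds["name"], []).append(
                {"site": msg["site_id"], **ds}
            )
    return merged

msg_a = build_availability_message("site-a", [("CT-Lung", "CT", 120)])
msg_b = build_availability_message("site-b", [("CT-Lung", "CT", 80)])
merged = merge_availability([msg_a, msg_b])
print(merged)
```

Keeping the wire format to plain JSON keeps the "simple protocol" requirement: any site can produce it, and the central instance needs no shared database, only the merged view.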
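For the federated-workflow point, a minimal federated-averaging loop in plain Python illustrates the pattern such a PySyft workflow follows: each site trains on its local data, and only model weights travel to the coordinator. PySyft would add the remote execution and privacy layers that this sketch deliberately omits.

```python
def local_update(w, data, lr=0.1):
    """One gradient step of a 1-D least-squares model (y ~ w*x)
    on a site's local data; the data never leaves the site."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_round(global_w, site_data):
    """Coordinator: broadcast the global weight, let each site train
    locally, then average the returned weights (FedAvg-style)."""
    local_ws = [local_update(global_w, data) for data in site_data]
    return sum(local_ws) / len(local_ws)

# Two sites whose (x, y) data both follow y = 2x, so the global
# weight should converge to 2 without pooling the raw data.
sites = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = federated_round(w, sites)
print(round(w, 2))  # -> 2.0
```

The same round structure (broadcast, local step, aggregate) is what the execution layer would need to schedule across the data providers.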
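The executor-separation point can be illustrated by running the user workload in its own child process, so a crash or nonzero exit in the workload cannot alter the system flow. This is only a toy sketch of the principle; real isolation in a Kaapana-style deployment would be container-based.

```python
import subprocess
import sys

def run_user_workload(cmd, timeout=60):
    """Run a user workload in a separate process and report its exit
    code; the calling system flow is unaffected by what it does."""
    try:
        result = subprocess.run(
            cmd, capture_output=True, text=True, timeout=timeout
        )
        return result.returncode
    except subprocess.TimeoutExpired:
        return None  # workload was killed; the system flow continues

# A user workload that exits with code 3 -- the caller just records it.
rc = run_user_workload([sys.executable, "-c", "raise SystemExit(3)"])
print(rc)  # -> 3
```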
Topics for discussion at the Kaapana Retreat.