Description
The ESRF will soon restart after its Extremely Brilliant Source upgrade, which will increase the photon flux available to many experiments by two orders of magnitude, and which also brings several new beamlines producing high-throughput data, from macromolecular crystallography to large-volume tomography. This creates significant data-handling challenges, both for the facility (the 'data deluge') and for users, who are focused on practical results about their materials rather than on methodological details.
We will discuss a number of domains where artificial intelligence could improve the workflow from experiment to quantitative analysis, including:
- data reduction by detecting the relevant datasets (e.g. hit finding in serial experiments; see the first sketch after this list)
- feature recognition in various techniques (imaging, spectroscopy, diffraction; second sketch below)
- improved (faster) algorithms for data inversion (e.g. iterative phase retrieval for coherent scattering experiments; third sketch below)
- more unsupervised/automated workflows for standard experiments, to broaden the user community, including for industrial applications
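
As an illustration of the first point, here is a minimal sketch of threshold-based hit finding for a serial-experiment frame stream; the peak-count threshold, neighbourhood size, and robust-statistics cutoff are illustrative assumptions, not ESRF production values.

```python
import numpy as np
from scipy import ndimage

def count_bragg_peaks(frame, threshold=None, min_distance=3):
    """Count candidate Bragg peaks in a 2D detector frame.

    A pixel counts as a peak if it is the maximum of its
    (2*min_distance+1)^2 neighbourhood and exceeds the threshold.
    """
    if threshold is None:
        # Robust background cutoff: median + 5 sigma, with sigma
        # estimated from the median absolute deviation (assumption).
        med = np.median(frame)
        mad = np.median(np.abs(frame - med))
        threshold = med + 5.0 * 1.4826 * mad
    local_max = ndimage.maximum_filter(frame, size=2 * min_distance + 1)
    peaks = (frame == local_max) & (frame > threshold)
    return int(peaks.sum())

def is_hit(frame, min_peaks=15):
    """Keep a frame only if it shows enough peaks to be a crystal hit."""
    return count_bragg_peaks(frame) >= min_peaks

# Usage: reduce a stack of frames to the hits worth storing.
frames = np.random.poisson(1.0, size=(10, 256, 256)).astype(float)
hits = [f for f in frames if is_hit(f)]
```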
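For feature recognition, a small convolutional classifier is a typical starting point. The PyTorch sketch below labels detector frames with hypothetical classes (e.g. empty / diffuse scattering / Bragg peaks); the architecture and class list are assumptions for illustration, not a specific ESRF model.

```python
import torch
import torch.nn as nn

class FrameClassifier(nn.Module):
    """Tiny CNN labelling a detector frame (classes are hypothetical,
    e.g. 0 = empty, 1 = diffuse scattering, 2 = Bragg peaks)."""

    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pools to 1x1, any input size
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        # x: (batch, 1, ny, nx) normalised detector frames
        return self.head(self.features(x).flatten(1))

# Usage: classify a batch of four 256x256 frames.
model = FrameClassifier()
logits = model(torch.randn(4, 1, 256, 256))
predicted = logits.argmax(dim=1)
```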
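For data inversion in coherent scattering, the classical baseline that faster (including learned) approaches aim to accelerate is iterative phase retrieval. Below is a minimal error-reduction loop (alternating Fourier-modulus and real-space support constraints); the iteration count and random initialisation are illustrative assumptions.

```python
import numpy as np

def error_reduction(amplitude, support, n_iter=200, seed=0):
    """Error-reduction phase retrieval for a far-field diffraction pattern.

    amplitude : measured Fourier-space modulus (sqrt of the intensities)
    support   : boolean mask of pixels where the object may be non-zero
    Returns the reconstructed complex object after n_iter iterations.
    """
    rng = np.random.default_rng(seed)
    # Start from the measured modulus with random phases.
    phases = np.exp(2j * np.pi * rng.random(amplitude.shape))
    obj = np.fft.ifft2(amplitude * phases)
    for _ in range(n_iter):
        f = np.fft.fft2(obj)
        # Fourier-space constraint: keep phases, impose measured modulus.
        f = amplitude * np.exp(1j * np.angle(f))
        obj = np.fft.ifft2(f)
        # Real-space constraint: the object vanishes outside the support.
        obj = np.where(support, obj, 0.0)
    return obj

# Usage with synthetic data: a small object inside a known support.
support = np.zeros((128, 128), dtype=bool)
support[48:80, 48:80] = True
truth = support * np.random.default_rng(1).random((128, 128))
amplitude = np.abs(np.fft.fft2(truth))
reconstruction = error_reduction(amplitude, support)
```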