Interview with PaNOSC computational physicist Aljoša Hafner on the use and benefits of the EOSC

We interviewed one of PaNOSC's contributors, Dr. Aljoša Hafner, to ask him how the EOSC could benefit the photon and neutron user community, as well as the workflow of researchers working in his scientific domain, soft condensed matter.

Aljoša is currently employed as a Computational Physicist at CERIC-ERIC, where he is actively involved in the development of OASYS, a software package for X-ray optics simulations used to model beamline properties such as the divergence or spot size of the beam at different positions downstream. The work is carried out within PaNOSC WP5 – Virtual Neutron and X-Ray Laboratory (VINYL), one of whose goals is to integrate OASYS into the rest of the PaNOSC workflow: first by giving users easy access to such X-ray optics simulations, and eventually by feeding the simulation results into other data analysis software.

During his PhD at the Université Libre de Bruxelles, Aljoša focused on surface and interfacial behaviours in soft condensed matter, such as self-assembly, which is often encountered in this field. In particular, he investigated the mechanism of thin polymer dewetting, a process in which thin polymer layers break up into individual, separated droplets. Such systems are very complex and require highly sophisticated analytical tools, both in the lab and at large research infrastructures such as X-ray and neutron sources.

“Data collection and analysis is therefore very complicated,” states Hafner, “and can be performed much better via a common framework and environment, such as the one that will be provided through the European Open Science Cloud.” During the interview, he also points out that the services available through and integrated with the EOSC will make it possible to combine very complicated tools in a single working environment, with data (and metadata) stored in a unified format, collected in one place and accessible from anywhere, and to run computationally very expensive calculations on remote servers for data analysis and experiment simulations.

In Hafner’s view, “one of the more important steps will be the possibility to compare the data gathered in a current experiment with those collected in previous ones, to acquire complementary information for a proper interpretation of the data. This can only be achieved through proper metadata and properly labelled data. It would then be possible to use the same settings in simulation and directly compare the data collected with those gathered before.” In addition, “combining several different methods, such as advanced simulation experiments and other analytical tools, will also be much easier, due to a common user interface.”

Watch the video below to listen to the full interview:
