iToBoS project at Data4Business Days


The Data4Business Days, hosted by EnBW Köln, are a week-long event where data and AI professionals exchange ideas.

iToBoS presented at MPNE Conference 2023


On April 28th-30th 2023, 89 melanoma patients, carers and patient advocates from 24 countries within and outside the European Union met for the first full post-Covid annual MPNE conference, organised by the iToBoS project partner.

Gravity-induced coma aberration in the liquid lenses


The decision to integrate liquid lenses into iToBoS’ full-body scanner was driven by the need to capture thousands of pictures in the shortest possible time.

Latent Diffusion Models


Diffusion models rival and can even surpass GANs on image synthesis: they generate more diverse outputs by covering the data distribution better, and they do not suffer from the mode collapse and training instabilities of GANs.
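The generative process these models invert can be illustrated with a minimal sketch of the diffusion forward (noising) step, here with a linear beta schedule; all names (`betas`, `alpha_bar`, `q_sample`) and values are illustrative assumptions, not taken from any iToBoS code.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 1000
betas = np.linspace(1e-4, 0.02, T)   # linear noise schedule
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)       # cumulative signal retention per step

def q_sample(x0, t):
    """Sample x_t ~ q(x_t | x_0) in closed form: scaled signal plus Gaussian noise."""
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * noise

x0 = rng.standard_normal((8, 8))     # stand-in for an image
x_T = q_sample(x0, T - 1)            # after T steps, nearly pure Gaussian noise
```

A trained diffusion model learns the reverse of this process, denoising step by step from `x_T` back toward a sample from the data distribution; latent diffusion models run the same procedure in a compressed latent space rather than pixel space.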

iToBoS presented at Semmelweis University to engage upcoming medical professionals


Hungary’s oldest still-operating medical and health sciences higher-education institution, Semmelweis University, invited SZTAKI deputy director and iToBoS colleague Róbert Lovas to give a presentation about European Union-funded projects that foster medical research and innovation.

Transformers


Transformers are deep learning architectures created to solve sequence-to-sequence tasks, such as language translation.

Quantus: An Explainable AI Toolkit for Responsible Evaluation of Neural Network Explanations and Beyond


Just over a year ago, the Quantus toolkit v0.1.1 was shared with the Machine Learning (ML) community as a pre-print on

iToBoS project presented at the 19th EADO Congress


iToBoS was discussed in a presentation at the 19th EADO (European Association of Dermato Oncology) Congress, held in Rome, Italy, in April 2023.

XAI Beyond Explaining


Explainable Artificial Intelligence (XAI) can be employed not only to gain insight into the reasoning process of an Artificial Intelligence (AI) model.

Attention is all you need


The paper ‘Attention Is All You Need’ introduces the Transformer, a sequence-to-sequence architecture: a neural network that transforms a sequence of elements (for example, the words of a sentence) into another sequence.
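The core operation of that architecture, scaled dot-product attention, can be sketched in a few lines; the matrix shapes and variable names below are illustrative assumptions, not from the paper's reference code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # similarity of queries to keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    return weights @ V                                     # weighted mix of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))   # 4 query positions, d_k = 8
K = rng.standard_normal((6, 8))   # 6 key positions
V = rng.standard_normal((6, 8))   # one value vector per key
out = scaled_dot_product_attention(Q, K, V)   # shape (4, 8)
```

Each output row is a convex combination of the value vectors, weighted by how strongly the corresponding query attends to each key; the full Transformer stacks many such attention layers with feed-forward blocks.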