How to detect overfit models?

Detecting overfitting is not possible unless we evaluate the model on data it has never seen during training.
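
The point above can be illustrated with a minimal sketch (toy data and model choices are ours, not from the post): a high-capacity model can look perfect on its training points, and the problem only becomes visible when we compare its error on a held-out set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy samples of a smooth underlying function.
x = rng.uniform(-1, 1, 30)
y = np.sin(np.pi * x) + rng.normal(0, 0.1, 30)

# Hold out part of the data; overfitting only shows up on unseen points.
x_train, y_train = x[:20], y[:20]
x_test, y_test = x[20:], y[20:]

def mse(coeffs, xs, ys):
    """Mean squared error of a polynomial fit on a data set."""
    return np.mean((np.polyval(coeffs, xs) - ys) ** 2)

# Degree-15 polynomial: enough capacity to memorise the training noise.
overfit = np.polyfit(x_train, y_train, 15)
# Degree-3 polynomial: closer to the true complexity of the data.
simple = np.polyfit(x_train, y_train, 3)

train_gap = mse(overfit, x_test, y_test) - mse(overfit, x_train, y_train)
```

The overfit model achieves the lower training error, yet its held-out error is far larger than its training error (a large `train_gap`), which is the signature of overfitting.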

What is overfitting in Deep Learning?

Out of all the things that can go wrong with your ML model, overfitting is one of the most common and most detrimental errors.

Image Relighting

Image relighting is the task of simulating a light source change in an image. It is a challenging inverse problem, usually solved by estimating the scene geometry, the reflectance and the lighting. The results often contain artifacts that make the output look unrealistic.
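
Once geometry (surface normals) and reflectance (albedo) have been estimated, the re-rendering step can be sketched under a simple Lambertian (diffuse) model, where each pixel's intensity is the albedo times the cosine between the surface normal and the light direction. This is a minimal illustration with made-up data, not the method used in practice:

```python
import numpy as np

def relight_lambertian(albedo, normals, new_light):
    """Re-render a scene under a new light direction, assuming a
    Lambertian reflectance model: I = albedo * max(0, n . l)."""
    l = np.asarray(new_light, dtype=float)
    l = l / np.linalg.norm(l)                 # unit light direction
    shading = np.clip(normals @ l, 0.0, None)  # per-pixel n . l, clamped
    return albedo * shading

# Hypothetical 2x2 "image": per-pixel albedo and unit surface normals.
albedo = np.array([[0.8, 0.5], [0.3, 1.0]])
normals = np.zeros((2, 2, 3))
normals[..., 2] = 1.0  # all normals pointing toward the camera (+z)

frontal = relight_lambertian(albedo, normals, [0, 0, 1])  # fully lit
grazing = relight_lambertian(albedo, normals, [1, 0, 0])  # light at 90 degrees
```

With frontal lighting the image reduces to the albedo; with grazing light every pixel goes dark, since the normals are perpendicular to the light.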

iToBoS project coordinator supported the MPNE 2023 Conference

Prof. Rafael Garcia (University of Girona), coordinator of iToBoS project, attended the annual Conference of the European Association of Melanoma Patients (MPNE 2023).

Interview with Lasse Folkersen, the author of Understand your DNA

As part of iToBoS educational module prs4pa- polygenic risk scores for patient advocates- we will be reading 'Understand your DNA' as an introduction to genetics in the form of a book club.

Clinical trial has commenced in Hospital Clinic Barcelona

The Foundation Clinic for Biomedical Research (FCRB) has started the iToBoS Data Acquisition Clinical Trial, taking place at the Hospital Clinic of Barcelona.

How to solve the gravity-induced coma aberration of liquid lenses

In a previous post, we explained that the biggest limitation of liquid lenses is gravity-induced coma aberration, which degrades optical performance whenever a lens is used with its optical axis oriented away from the vertical.

Gravity-induced coma aberration in the liquid lenses

The decision to integrate liquid lenses into iToBoS’ full-body scanner was driven by the need to capture thousands of pictures in the shortest possible time.

Latent Diffusion Models

Diffusion models rival, and can even surpass, GANs on image synthesis: they generate more diverse outputs thanks to better coverage of the data distribution, and they do not suffer from the mode collapse and training instabilities of GANs.
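
At the heart of a diffusion model is a forward process that gradually destroys an image with Gaussian noise according to a variance schedule; the model is then trained to reverse it (latent diffusion runs this process in an autoencoder's latent space rather than in pixel space). A minimal sketch of the forward step, with an illustrative DDPM-style linear schedule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear variance schedule beta_1..beta_T (values are illustrative).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas_bar = np.cumprod(1.0 - betas)  # cumulative signal retention

def q_sample(x0, t):
    """Forward diffusion: sample x_t ~ q(x_t | x_0)
    = N(sqrt(alpha_bar_t) * x_0, (1 - alpha_bar_t) * I)."""
    noise = rng.normal(size=x0.shape)
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * noise

x0 = np.ones(4)          # a toy "image"
xt = q_sample(x0, T - 1)  # near step T the sample is almost pure noise
```

By the last step almost no signal remains (`alphas_bar[-1]` is close to zero), which is what lets sampling start from pure Gaussian noise.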

Transformers

Transformers are deep learning architectures, proposed in [1], created to solve sequence-to-sequence tasks such as language translation.
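
The core operation of the Transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch for a single head (the shapes below are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # similarity of queries and keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                       # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))  # 3 query positions, d_k = 8
K = rng.normal(size=(5, 8))  # 5 key positions
V = rng.normal(size=(5, 8))  # 5 value vectors
out = scaled_dot_product_attention(Q, K, V)  # shape (3, 8)
```

Each output row is a convex combination of the value vectors; with an all-zero query every key gets the same weight, so the output collapses to the mean of the values.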