iToBoS 6th Newsletter launched

7/04/2024.

The sixth issue of the iToBoS newsletter was released!

Have a look at the iToBoS scanner prototype

5/04/2024.

Have a quick look at the iToBoS scanner prototype at the Bosch Manufacturing Solutions facilities.

Edge Computing Platforms for Medical Applications

4/04/2024.

The demand for mobile, multitasking devices has shifted the focus of research towards embedded systems and microcomputers.

Use of Neural Radiance Fields in the Medical Domain

3/04/2024.

Understanding the geometry of an existing scene, and being able to use this knowledge to produce (and refine) data, is an important task in any research field, particularly in the medical domain, where the study and understanding of 3D structures of interest play a crucial role in abnormality detection.
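For readers unfamiliar with the technique named in the title: a Neural Radiance Field is typically a small neural network mapping a 3D position and viewing direction to an emitted colour and a density, which volume rendering then composites along camera rays into an image. The Python sketch below is a minimal, self-contained illustration of that rendering step, with a hypothetical placeholder field standing in for a trained network; it is not code from the project.

import numpy as np

def radiance_field(xyz, view_dir):
    """Hypothetical stand-in for a trained NeRF MLP.

    Maps 3D positions (and, in a real NeRF, the view direction)
    to an RGB colour and a volume density sigma."""
    rgb = 0.5 * (np.sin(xyz) + 1.0)                 # placeholder colour in [0, 1]
    sigma = np.exp(-np.linalg.norm(xyz, axis=-1))   # placeholder density
    return rgb, sigma

def render_ray(origin, direction, n_samples=64, near=0.0, far=4.0):
    """Classic volume-rendering quadrature along one camera ray."""
    t = np.linspace(near, far, n_samples)
    pts = origin + t[:, None] * direction           # sample points along the ray
    rgb, sigma = radiance_field(pts, direction)
    delta = np.diff(t, append=far)                  # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)            # opacity of each segment
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))  # transmittance
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)     # composited pixel colour

print(render_ray(np.zeros(3), np.array([0.0, 0.0, 1.0])))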

Streamlining Operations: isahit's Management in iToBoS Project

2/04/2024.

The iToBoS project is a demonstration of collaborative effort, with multiple partners combining their specialized knowledge towards a common objective.

How are isahit's expertise and diversified workforce enhancing Melanoma Research?

1/04/2024.

Precision and diversity are crucial in melanoma research, and isahit's approach to data annotation reflects this necessity.

Privacy risk assessment of AI models

31/03/2024.

The need to analyze personal data to drive business, alongside the requirement to preserve the privacy of data subjects, creates a known tension.

Generative AI for Specialized Dataset Enhancement and Expansion

30/03/2024.

An important challenge in applying machine and deep learning methods to applications where data collection is difficult or costly is the limited amount of annotated data.

Empowering HITers for Melanoma annotation tasks: Training and Selection Process

28/03/2024.

Precision and expertise are necessary to reach 100% quality in data annotation. Selecting image annotators, called HITers at isahit, for projects like iToBoS involves rigorous criteria to ensure the highest quality in the final annotations.

Scaled Dot-Product Attention

27/03/2024.

Self-attention is the core mechanism behind Transformer models, which have achieved state-of-the-art results in various scientific fields (e.g. Natural Language Processing).
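As a quick illustration of the mechanism named in the title (a minimal NumPy sketch, not code from the project), scaled dot-product attention computes softmax(QK^T / sqrt(d_k)) V, where Q, K and V are query, key and value matrices and d_k is the key dimension; the toy shapes below are assumptions for illustration.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                # weighted sum of values

# Toy example: 3 tokens, key/query and value dimension 4 (illustrative only)
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)    # (3, 4)

The division by sqrt(d_k) keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.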