Explainable AI for Melanoma Detection at MPNEconsensus 2024

The AI, Data, and Data-Dependent Business Models workshop took place between 31 January and 2 February 2024 at Fraunhofer Heinrich Hertz Institute in Berlin.

The workshop was proudly organized by Melanoma Patient Network Europe (MPNE) in the framework of the iToBoS project. Co-creator partners of the project presented their work to the cancer patient community and made connections for future collaborations.

A group of distinguished researchers from the iToBoS (Intelligent Total Body Scanner for Early Detection of Melanoma) project provided engaging discussions and presented cutting-edge research in their respective fields during the meeting. In particular, Sebastian Lapuschkin, head of the Explainable AI group at Fraunhofer Heinrich Hertz Institute, presented the group's technological innovations Concept Relevance Propagation (CRP) [1] and Prototypical Concept-based Explanations (PCX) [2], to be employed in the iToBoS project for automatic melanoma detection.

CRP enables practitioners to determine which concepts a machine learning model detects and uses for its decision. Furthermore, CRP produces explanations that show which parts of the input contain these concepts and how relevant each concept is to the model's prediction.
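The idea of attributing a prediction to concepts can be sketched in a few lines. The following is a toy illustration only, not the actual CRP implementation: for a linear model, the LRP-style relevance of each input feature is simply w_i * x_i, and grouping these contributions by a named "concept" (a hypothetical set of features here; in CRP, channels or neuron groups of a deep network) shows which concepts the decision relies on and how strongly.

```python
import numpy as np

# Toy illustration (not the actual CRP implementation): for a linear
# model, the relevance of feature i to the output w @ x is w[i] * x[i].
# CRP's key idea is conditioning relevance on concepts; here we mimic
# that by summing per-feature relevance within named feature groups.

def concept_relevances(x, w, concepts):
    """Return total relevance per concept and per-feature relevance."""
    feature_relevance = w * x  # elementwise contribution of each feature
    per_concept = {
        name: float(feature_relevance[idx].sum())
        for name, idx in concepts.items()
    }
    return per_concept, feature_relevance

# Hypothetical example: 6 input features grouped into two concepts.
# The concept names are invented for illustration.
x = np.array([1.0, 0.5, 0.0, 2.0, 1.0, 0.2])
w = np.array([0.3, 0.3, 0.1, -0.4, 0.8, 0.0])
concepts = {
    "border_irregularity": np.array([0, 1, 2]),
    "color_asymmetry": np.array([3, 4, 5]),
}

per_concept, per_feature = concept_relevances(x, w, concepts)
print(per_concept)
```

The per-feature relevance map plays the role of a localized explanation (where a concept appears in the input), while the per-concept sums indicate how much each concept contributed to the prediction.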

Using the information about detected concepts provided by CRP, PCX disambiguates the decision patterns the model has learned, depending on the presence of different concepts. This lets us explore the different strategies the model uses in general and identify which strategy is employed in a specific case.
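The prototype idea can be sketched as follows. This is a minimal toy sketch, not the actual PCX implementation: each sample is summarized by a vector of concept relevances (as computed, e.g., by CRP), prototypes are mean relevance vectors over groups of similar samples, and a new prediction is explained by the prototype, i.e. decision strategy, it lies closest to. The strategy labels and numbers are invented for illustration.

```python
import numpy as np

# Toy sketch (not the actual PCX implementation): prototypes are mean
# concept-relevance vectors per strategy; a new sample is assigned to
# the nearest prototype, revealing which learned strategy it follows.

def build_prototypes(relevance_vectors, labels):
    """Mean concept-relevance vector for each strategy label."""
    labels = np.array(labels)
    return {
        lab: relevance_vectors[labels == lab].mean(axis=0)
        for lab in set(labels)
    }

def nearest_strategy(sample, prototypes):
    """Label of the closest prototype (Euclidean distance)."""
    return min(prototypes, key=lambda lab: np.linalg.norm(sample - prototypes[lab]))

# Hypothetical data: 4 samples x 3 concept relevances, two strategies.
R = np.array([[0.9, 0.1, 0.0],
              [0.8, 0.2, 0.1],
              [0.1, 0.7, 0.9],
              [0.0, 0.8, 0.8]])
labels = ["uses_border", "uses_border", "uses_color", "uses_color"]

prototypes = build_prototypes(R, labels)
print(nearest_strategy(np.array([0.85, 0.15, 0.05]), prototypes))  # prints "uses_border"
```

In PCX the grouping is done over the model's own predictions rather than hand-assigned labels, but the nearest-prototype comparison conveys the core mechanism: a specific case is explained by the general strategy it most resembles.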

The meeting continued with presentations and workshops organized by other participants, and discussions were held comparing the perspectives of the different parties involved. The meeting concluded after the first consensus statements were formulated by the MPNE patient advocates. The consensus statements and the discussions during the meeting are being compiled into a document that will be made public. To follow updates, please consult the MPNE partner website.

References

[1] Achtibat, R., Dreyer, M., Eisenbraun, I. et al. From attribution maps to human-understandable explanations through Concept Relevance Propagation. Nat Mach Intell 5, 1006–1019 (2023). https://doi.org/10.1038/s42256-023-00711-8

[2] Dreyer, M. et al. Understanding the (Extra-)Ordinary: Validating Deep Model Decisions with Prototypical Concept-based Explanations. arXiv preprint arXiv:2311.16681 (2023).