In recent years, there have been remarkable developments in computational methods to help dermatologists diagnose skin cancer at an early stage.
Computerized analysis of pigmented skin lesions (PSLs) is a growing field of research that aims to develop reliable automatic tools for recognizing skin cancer from images, and a growing number of researchers are applying machine learning techniques to the early detection of skin cancer.
As dermatologic AI applications expand and are integrated into clinical care, particularly for diagnostic support, it is critical to evaluate sex and gender equity to ensure that the adoption of these new technologies does not create new, or widen existing, clinical inequities. If specific sexes or genders are excluded from the datasets used to train machine learning algorithms, or if sex- and gender-based disparities in clinical information and diagnosis are not sufficiently accounted for, bias can result. In general, AI systems trained primarily on male individuals perform poorly when tested on female participants. For instance, Google's speech recognition has been shown to be 13% more accurate for men than for women, even though Google is regularly the highest performer compared to the Bing, AT&T, WIT, and IBM Watson systems.
When it comes to bias in AI, two sorts of bias are reported in the literature: desirable and undesirable. Desirable biases capture genuine, evidence-based differences between groups and improve care, whereas undesirable biases arise when algorithms developed on insufficient or skewed evidence result in discrimination. In the case of sex and gender, a desirable bias is the inclusion of sex in the diagnostic criteria for diseases with sex-based differences, whereas an undesirable bias is the use of training datasets in which particular genders are under-represented.
To achieve sex and gender equity in the AI and ML technologies developed for the iToBoS project, we will promote desirable bias and avoid undesirable bias. Experts have recommended several approaches to accomplish this objective: increasing desirable bias; advocating for adequate representation of all genders in AI training data; taking into account the differences in dermatologic disorders between men and women, as well as transgender, nonbinary, and other gender-diverse patient groups; and, finally, considering the intersectionality of patient factors including race, sexuality, and gender.
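A concrete first step toward these recommendations is auditing a labelled dataset and a model's predictions for subgroup representation and per-group accuracy, since a large gap between groups can signal undesirable bias. The sketch below is purely illustrative: the record structure, the `subgroup_report` helper, and the toy data are assumptions, not part of the iToBoS codebase.

```python
from collections import Counter


def subgroup_report(records, predictions, group_key="sex"):
    """Summarize dataset representation and per-group accuracy.

    `records` is a list of dicts holding at least `group_key` and a
    ground-truth "label"; `predictions` is a parallel list of predicted
    labels. (Hypothetical structure, for illustration only.)
    """
    counts = Counter(r[group_key] for r in records)
    total = len(records)
    report = {}
    for group, n in counts.items():
        correct = sum(
            1
            for r, p in zip(records, predictions)
            if r[group_key] == group and r["label"] == p
        )
        report[group] = {
            "share": n / total,       # representation in the dataset
            "accuracy": correct / n,  # diagnostic accuracy for this group
        }
    return report


# Toy example: a lesion classifier evaluated separately by sex.
records = [
    {"sex": "male", "label": "benign"},
    {"sex": "male", "label": "malignant"},
    {"sex": "male", "label": "benign"},
    {"sex": "female", "label": "malignant"},
]
predictions = ["benign", "malignant", "benign", "benign"]
report = subgroup_report(records, predictions)
```

In this toy case the report would reveal both an under-represented group (25% female) and a large per-group accuracy gap, the kind of signal that should trigger data re-balancing before deployment.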
To obtain the best diagnostic accuracy, the iToBoS team will employ all of the approaches mentioned above to maintain gender equity and avoid undesirable biases during all stages of the project, including data acquisition and AI development.
R. Tatman, "Gender and Dialect Bias in YouTube's Automatic Captions," Proceedings of the First ACL Workshop on Ethics in Natural Language Processing, pp. 53-59, 2017.
M. S. Lee, L. N. Guo and V. E. Nambudiri, "Towards gender equity in artificial intelligence and machine learning applications in dermatology," Journal of the American Medical Informatics Association, vol. 29, no. 2, pp. 400-403, 2022.