ABSTRACT
Background
The development of artificial intelligence (AI) is rapidly expanding, showing promise in the dermatological field. Skin checks are a resource‐heavy challenge that could potentially benefit from AI‐tool assistance, particularly if provided in widely available AI solutions. A novel smartphone application (app)‐based AI system, "SCAI," was developed and trained to recognize spots in paired images of skin, pursuing identification of new skin lesions. This pilot study aimed to investigate the feasibility of the SCAI‐app to identify simulated skin changes in vivo.
Materials and methods
The study was conducted in a controlled setting with healthy volunteers and standardized, simulated skin changes (test spots), consisting of customized 3‐mm adhesive spots in three colors (black, brown, and red). Each volunteer had a total of eight test spots adhered to four areas on the back and legs. The SCAI‐app collected smartphone‐ and template‐guided standardized images before and after test spot application, using its backend AI algorithms to identify changes between the paired images.
Results
Twenty‐four volunteers were included, amounting to a total of 192 test spots. Overall, the detection algorithms identified test spots with a sensitivity of 92.0% (CI: 88.1–95.9) and a specificity of 95.5% (CI: 95.0–96.0). The SCAI‐app's positive predictive value was 38.0% (CI: 31.0–44.9), while the negative predictive value was 99.7% (CI: 99.0–100).
Conclusion
This pilot study showed that SCAI‐app could detect simulated skin changes in a controlled in vivo setting. The app's feasibility in a clinical setting with real‐life skin lesions remains to be investigated, where the challenge with false positives in particular needs to be addressed.
Keywords: AI, artificial intelligence, detection, feasibility study, pilot study, skin change, smartphone
Abbreviations
- 3D
three‐dimensional
- AI
artificial intelligence
- app
application
- CI
confidence intervals
- CNN
convolutional neural network
- iOS
iPhone Operating System
- ISO
International Organization for Standardization
- LED
light emitting diode
- NPV
negative predictive value
- PPV
positive predictive value
- SCAI
SkinChange.AI, a smartphone application‐based AI system
1. Introduction
Amid the rapidly growing field of artificial intelligence (AI) in the medical sector, there is potential for new applications in dermatology, ranging from skin conditions such as acne and eczema to skin aging and skin cancer [1, 2, 3, 4, 5]. Skin changes and the appearance of new skin lesions are common concerns, and while keeping track may be crucial, it is also a time‐dependent and resource‐heavy endeavor [6, 7]. It has been asserted that AI tools may contribute to making detection of skin changes more accessible, especially where resources are sparse [8, 9].
The field of smartphone applications for skin checks is expanding rapidly, and there is consumer‐driven interest in new solutions [10]. However, the requirements for such tools are substantial and include ease of use, widespread availability, and low cost, as well as a sound scientific foundation with studies backing their applicability [11, 12].
Upcoming AI‐system developments mainly center on lesion‐specific features based on clinical or dermoscopic images [7, 13–17]. In contrast, fewer solutions provide AI tools for nondiscriminatory identification of skin changes over time [18, 19]. Such tools could be helpful for general skin checks, particularly of large or hard‐to‐reach skin areas, and in skin types with multiple skin lesions.
A novel smartphone application (app)‐based visual AI system, SkinChange.AI (SCAI), has been developed in the pursuit of AI‐assisted identification of skin changes over time. The SCAI‐app has been trained with open‐source AI to non‐discriminatorily identify spots on skin. This is in contrast to many available AI tools that rely on a large photo library with validated skin lesions, histology, imaging, or dermoscopy [20, 21]. In addition, a common challenge in application of AI tools is the quality of images, as the collection of standardized images can be very operator‐dependent. In anticipation of this, the SCAI‐app provides both app‐based and AI‐assisted standardization of paired images of skin. However, the SCAI‐app's feasibility in terms of sensitivity and specificity to detect new spots on skin remains to be tested. This pilot study aimed to investigate the feasibility of the SCAI‐app for identification of simulated skin changes in vivo in a controlled setting.
2. Materials and Methods
2.1. Study Design
This pilot study investigated the feasibility of the SCAI‐app for detecting simulated skin changes. The study was carried out at the Danish Research Center for Skin Cancer, Department of Dermatology, Private Hospital Molholm, Vejle, Denmark, in the first quarter of 2024.
Healthy adults with Fitzpatrick skin types I–III were included on a volunteer basis. Volunteers with visible skin disease or tattoos on the back or lower legs and/or known intolerance to adhesive tape were excluded. The study was conducted in accordance with the Helsinki Declaration and adhered to the ISO 14155:2020 standard for medical devices. No further approvals were required due to the inclusion of healthy volunteers and the non‐invasive intervention. Informed consent for participation and image publication was obtained from all volunteers.
2.2. Study Materials
Simulated skin changes (test spots) made of adhesive tape were manually applied to the skin on the back and lower legs, and images were captured with the SCAI‐app (SkinChange.AI, Denmark) both before and after test spot application. The study was conducted in a controlled environmental setting with a customized light box (TOLEDO Flex LED, Global Greentech, Denmark) with a fixed light output.
2.2.1. Test Spots
The simulated skin changes consisted of adhesive test spots customized for this study, made of laminated paper on medical tape (3M Micropore, USA) and cut out with a 3 mm biopsy punch (PFM, Germany). Test spots were created in three colors: black (RAL: 9005), brown (NCS: S4030‐Y30R), and red (clinical image of basal cell carcinoma). A total of eight spots were placed on each volunteer, comprising two black spots, one brown spot, and one red spot on the back and on the legs, respectively. See Figure 1 for test spot examples, a close‐up on skin, and a sketch of the systematic test spot placement.
FIGURE 1.
2.2.2. Artificial Intelligence App
The SCAI‐app system (SkinChange.AI version 2.2.2 for iOS version 13.0 and above) consists of (a) a smartphone app interface for standardized image collection, and a backend algorithm comprising (b) AI‐assisted standardized alignment and (c) AI‐assisted comparison of paired images.
The smartphone app interface provides anatomical templates of individual body parts that guide the operator in capturing standardized images. Based on the first image, individually adjusted templates are created for a second image, allowing direct pairing of images on the screen. The template and the iOS built‐in smartphone gyroscope further secure alignment of angles as well as photographic distance. The images are saved on a host server for the AI analysis.
The first AI‐assisted standardization step secures exact alignment between the paired images. It begins by identifying and eliminating distracting background features. The AI used for the background identifier is based on a convolutional neural network (CNN) with the DeeplabV3Plus architecture [22] and a ResNet18 encoder [23]. It was pretrained on the ImageNet dataset (1M+ images) [24], with additional training for general skin detection on 6000 images plus 50–200 images for each body part. Second, the AI algorithm adjusts for any remaining standardization differences between the images in three steps: (i) misalignment of relative position and body posture is compensated through a 3D representation; (ii) color differences are aligned by eliminating pixel errors and adjusting for regional and local color differences; (iii) each paired image pixel is aligned, allowing pixel‐by‐pixel comparison.
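The final pixel‐by‐pixel comparison step can be illustrated with a minimal sketch; the `change_mask` helper, the summed‐RGB difference metric, and the threshold value are illustrative assumptions, not the SCAI backend's actual implementation:

```python
import numpy as np

def change_mask(before, after, threshold=30):
    """Flag pixels whose summed RGB difference between two already
    aligned images exceeds a threshold (illustrative value)."""
    diff = np.abs(before.astype(np.int16) - after.astype(np.int16)).sum(axis=-1)
    return diff > threshold

# Tiny synthetic example: a uniform "skin" patch with one new dark spot pixel.
before = np.full((4, 4, 3), 200, dtype=np.uint8)
after = before.copy()
after[1, 1] = (40, 30, 30)

mask = change_mask(before, after)  # True only at the changed pixel
```

In practice, such a difference map is only meaningful once the alignment steps above have been applied, since any residual misregistration would appear as spurious change.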
The second AI‐assisted step is spot identification. The AI compares the images before and after test spot application, identifying possible changes and marking the areas of interest with red rings. The spot identification AI is based on the YOLOv5s CNN object detector [25], pretrained on the COCO dataset (300k+ images) [26]. The dataset for additional training was created from 360 images with multiple spots in each image.
See Figure 2a,b for the smartphone app interface and the backend AI algorithm web interface.
FIGURE 2.
The smartphone used in this study was an iPhone 14 Pro Max, and all images were collected by a single operator.
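How detector output might be turned into on‐screen ring markings can be sketched as follows; the box tuple format, the confidence cutoff, and the `mark_new_spots` helper are hypothetical illustrations rather than the SCAI‐app's actual post‐processing:

```python
def mark_new_spots(detections, min_confidence=0.5):
    """Filter raw detections and describe a red ring per retained spot.
    detections: (x, y, width, height, confidence) boxes in pixel coordinates."""
    rings = []
    for x, y, w, h, conf in detections:
        if conf >= min_confidence:
            rings.append({
                "cx": x + w / 2,     # ring centre
                "cy": y + h / 2,
                "r": max(w, h) / 2,  # ring radius covers the box
                "confidence": conf,
            })
    return rings

# One confident detection is kept; a low-confidence one is discarded.
rings = mark_new_spots([(10, 10, 6, 6, 0.9), (40, 40, 5, 5, 0.2)])
```

The confidence cutoff is exactly where the false‐negative/false‐positive tradeoff discussed later would be tuned: lowering it flags more candidate changes at the cost of more false rings.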
2.3. Outcome Measures
The outcome measures refer to the SCAI‐app's feasibility for detecting the study‐specific simulated skin changes, derived from paired images captured before and after application of a fixed number of test spots and from the corresponding AI‐based output of the SCAI‐app.
2.4. Statistics
Characteristics were presented with descriptive statistics. Sensitivity and specificity for detection of test spots were displayed as percentages with 95% confidence intervals (CI). The positive predictive value (PPV) and negative predictive value (NPV) were also estimated for the overall data. Additionally, sensitivity and specificity analyses were stratified by anatomical location (back and legs). Stratification by test spot color (black, brown, and red) was feasible for true positives and false negatives, while the numbers for false positives and true negatives were not color‐specific.
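The reported intervals are consistent with the standard Wald (normal approximation) interval for a proportion; a minimal sketch follows, where the `wald_ci` helper name is our own and the Wald method is an assumption inferred from the reported numbers rather than stated in the study:

```python
import math

def wald_ci(k, n, z=1.96):
    """Point estimate and Wald 95% confidence interval
    for the proportion k/n, all expressed in percent."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return 100 * p, 100 * (p - z * se), 100 * (p + z * se)

# Overall sensitivity: 172 detected spots out of 187 analyzed test spots.
est, lo, hi = wald_ci(172, 187)  # ≈ 92.0% (88.1–95.9)
```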
3. Results
3.1. Characteristics
A total of 24 healthy volunteers were consecutively included, while one additional volunteer was excluded due to a tattoo in a study area. Study volunteers comprised both women (75%) and men (25%), with a median age of 55 years (range 19–62). More than half of the volunteers presented with Fitzpatrick skin type II (54%), a single volunteer had skin type I (4%), and the remaining had skin type III (42%).
3.2. SCAI‐App Output
A total of 192 test spots were placed, 96 on backs and 96 on legs. Five test spots were excluded from the analysis: in one volunteer, the back images (2 black, 1 brown, and 1 red spot) were not saved correctly to the server, and in another volunteer, one test spot (1 black) was placed too laterally for the AI detection. See Figure 3 for an example of the SCAI‐app detection output.
FIGURE 3.
3.2.1. Sensitivity, Specificity and Predictive Values
Overall, the SCAI‐app provided test spot detection with a sensitivity of 92.0% (CI: 88.1–95.9) and a specificity of 95.5% (CI: 95.0–96.0). The proportion of positive markings that represented true test spots (PPV) was 38.0% (CI: 31.0–44.9), while the proportion of negative assessments that were truly negative (NPV) was 99.7% (CI: 99.0–100). The overall number of true negatives identified by the SCAI‐app was substantial (n = 5949).
Stratification of the data by location and test spot color revealed pertinent details of the SCAI‐app output. Sensitivity was higher on the back than on the lower legs (back: 98.9% [CI: 96.8–100], legs: 85.3% [CI: 78.1–92.4]), while the opposite was true for specificity (back: 93.8% [CI: 93.0–94.5], legs: 98.1% [CI: 97.5–98.6]). There was a tendency for the red and black test spots to be more distinguishable than the brown (sensitivity; red: 93.6% [CI: 86.6–100], black: 92.5% [CI: 87.1–97.8], brown: 89.4% [CI: 80.5–98.2]). See Table 1 for an overview of the sensitivity and specificity analysis.
TABLE 1.
Sensitivity and specificity analysis of the SCAI‐app output in percentages with 95% confidence intervals (CI) and absolute numbers; overall, by anatomical location and color‐code, respectively.
| Test spots | Overall (n = 187) | Back (n = 92) | Legs (n = 95) | Black (n = 93) | Brown (n = 47) | Red (n = 47) |
|---|---|---|---|---|---|---|
| Sensitivity % (95% CI) | 92.0 (88.1–95.9) | 98.9 (96.8–100) | 85.3 (78.1–92.4) | 92.5 (87.1–97.8) | 89.4 (80.5–98.2) | 93.6 (86.6–100) |
| Specificity % (95% CI) | 95.5 (95.0–96.0) | 93.8 (93.0–94.5) | 98.1 (97.5–98.6) | 95.5 (95.0–96.0) | 94.8 (94.2–95.4) | 95.1 (94.4–95.7) |
| True positive, n | 172 | 91 | 81 | 86 | 42 | 44 |
| False negative, n | 15 | 1 | 14 | 7 | 5 | 3 |
| False positive, n | 281 | 233 | 48 | 281* | 281* | 281* |
| True negative, n | 5949 | 3495 | 2454 | 5949* | 5949* | 5949* |
\* Not color‐specific.
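As a consistency check, the overall percentages follow directly from the table's absolute numbers (true positive 172, false negative 15, false positive 281, true negative 5949):

```python
tp, fn, fp, tn = 172, 15, 281, 5949  # overall counts from Table 1

sensitivity = 100 * tp / (tp + fn)  # ≈ 92.0
specificity = 100 * tn / (tn + fp)  # ≈ 95.5
ppv = 100 * tp / (tp + fp)          # ≈ 38.0
npv = 100 * tn / (tn + fn)          # ≈ 99.7
```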
4. Discussion
This pilot study sought to investigate the detection performance of a novel smartphone app‐based AI system, the SCAI‐app. Overall, the app provided in vivo detection of the study‐specific test spots with a sensitivity of 92.0% and a specificity of 95.5%. Meanwhile, a PPV of 38.0% and an NPV of 99.7% indicated that the app rarely overlooked the simulated skin changes, but that a substantial share of the marked areas of interest were not actual changes. Balancing false negatives against false positives is a general tradeoff and presents a substantial challenge for tools such as the SCAI‐app. While a low rate of false negatives is crucial for relevant detection of skin changes, a high rate of false positives presents other considerable challenges that are important to address. Overestimation of possible skin changes may induce significant unnecessary worry in individuals operating such an AI tool. By extension, the handling and differentiation of false positives may add further load on health care systems, which would counteract the purpose of AI‐assisted tools for skin checks.
This study provided important learning points for optimization of the SCAI‐app. Generally, detection was better on the back than on the lower legs, which could be explained by several factors. First, images of the lower legs were captured from a longer distance than those of the back, so the spots consisted of fewer pixels, making them harder to track with the fixed study settings and the current AI‐training status. Second, the background identifier AI was more precise in border definition on the back than on the legs, which caused the system to exclude spots on the legs as background features. Post‐analysis investigations showed that the AI‐system performance could be further enhanced with minor setting adjustments and additional AI training.
4.1. Perspectives
Detection of skin changes is challenging as well as time‐consuming and resource‐heavy [6, 7]. Individuals may struggle to identify new or growing skin lesions, especially on large or hard‐to‐reach areas of skin such as the back. Additionally, skin types with multiple skin lesions or severely pigmented and sun‐damaged skin may be even more demanding in regard to skin checks [27].
Currently, tools for assisting identification of skin changes mainly exist in passive forms, comparing still images over time [28]. However, this leaves the identification of new or changing skin lesions entirely up to the eye of the beholder, which may be anyone from a non‐trained individual to a dermatology novice or a dermatologist [29, 30]. Moreover, image quality is crucial for optimized AI application but remains challenging. While newer smartphone applications such as the SCAI‐app have taken steps toward improved image standardization, the image acquisition techniques of available tools have proven to vary substantially [12]. In a wider perspective, AI systems such as the SCAI‐app may prove useful in democratizing early detection of skin lesions, especially in regions where clinical assessment is unavailable or sparse. However, AI‐tool limitations are important to consider in future implementation [12].
Compared with other AI systems for skin lesion detection [17, 20, 21, 31], the SCAI‐app was trained to identify non‐specific spots. The SCAI‐app could potentially be applicable to a variety of skin lesions, as it is more broadly trained to non‐discriminatorily detect skin changes rather than specific clinical features. However, the possibility of applying the SCAI‐app in less controlled settings, as well as for detection of actual clinical lesions, is yet to be investigated in a clinical trial.
5. Conclusion
Overall, the SCAI‐app presented a sensitivity of 92.0% and a specificity of 95.5% for in vivo detection of the simulated skin changes. The SCAI‐app showed particularly reliable identification of true negatives (NPV 99.7%) but less precise distinction between true and false positives (PPV 38.0%), representing the challenging tradeoff between false negatives and false positives.
Based on this pilot study, the SCAI‐app may potentially prove relevant as an AI assistant for skin checks. However, the clinical feasibility remains to be investigated in a trial with clinical skin lesions.
Ethics Statement
The study was conducted according to the guidelines of the Declaration of Helsinki and ISO 14155:2020 standards for medical devices.
Conflicts of Interest
B.V. is a shareholder and employee at SkinChange.AI Aps.
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Acknowledgments
The work was executed as a part of the Danish Research Center for Skin Cancer (www.researchcenterforskincancer.org), a public‐private research partnership between the Private Hospital Molholm, Aalborg University Hospital, and Copenhagen University Hospital, Bispebjerg and Frederiksberg.
Funding: The study equipment and licenses were provided by SkinChange.AI Aps. The light box setup was provided by Global Greentech Aps.
Data Availability Statement
A data compilation from this study is available upon request from the authors. Raw data based on images on the host server cannot be shared due to a GDPR agreement (Edora A/S).
References
- 1. Liopyris K., Gregoriou S., Dias J., and Stratigos A. J., "Artificial Intelligence in Dermatology: Challenges and Perspectives," Dermatology and Therapy 12, no. 12 (2022): 2637–2651, 10.1007/s13555-022-00833-8.
- 2. Sangers T. E., Kittler H., Blum A., et al., "Position Statement of the EADV Artificial Intelligence (AI) Task Force on AI‐Assisted Smartphone Apps and Web‐Based Services for Skin Disease," Journal of the European Academy of Dermatology and Venereology 38, no. 1 (2024): 22–30, 10.1111/jdv.19521.
- 3. Li T., Ma X., Li Z., et al., "Facial Adult Female Acne in China: An Analysis Based on Artificial Intelligence Over One Million," Skin Research and Technology 30, no. 4 (2024): e13693, 10.1111/srt.13693.
- 4. Li T., Ma X., Li Z., et al., "Artificial Intelligence Analysis of Over a Million Chinese Men and Women Reveals Level of Dark Circle in the Facial Skin Aging Process," Skin Research and Technology 29, no. 11 (2023): e13492, 10.1111/srt.13492.
- 5. Thomsen K., Christensen A. L., Iversen L., Lomholt H. B., and Winther O., "Deep Learning for Diagnostic Binary Classification of Multiple‐Lesion Skin Diseases," Frontiers in Medicine 7 (2020): 574329, 10.3389/fmed.2020.574329.
- 6. Furriel B., Oliveira B. D., Prôa R., et al., "Artificial Intelligence for Skin Cancer Detection and Classification for Clinical Environment: A Systematic Review," Frontiers in Medicine 10 (2024): 1305954, 10.3389/fmed.2023.1305954.
- 7. Fujisawa Y., Inoue S., and Nakamura Y., "The Possibility of Deep Learning‐Based, Computer‐Aided Skin Tumor Classifiers," Frontiers in Medicine 6 (2019): 191, 10.3389/fmed.2019.00191.
- 8. de Carvalho T. M., Noels E., Wakkee M., Udrea A., and Nijsten T., "Development of Smartphone Apps for Skin Cancer Risk Assessment: Progress and Promise," JMIR Dermatology 2, no. 1 (2019): e13376, 10.2196/13376.
- 9. Felmingham C., Pan Y., Kok Y., et al., "Improving Skin Cancer Management With Artificial Intelligence: A Pre‐Post Intervention Trial of an Artificial Intelligence System Used as a Diagnostic Aid for Skin Cancer Management in a Real‐World Specialist Dermatology Setting," Journal of the American Academy of Dermatology 88, no. 5 (2023): 1138–1142, 10.1016/j.jaad.2022.10.038.
- 10. Kong F. W., Horsham C., Ngoo A., Soyer H. P., and Janda M., "Review of Smartphone Mobile Applications for Skin Cancer Detection: What Are the Changes in Availability, Functionality, and Costs to Users Over Time?" International Journal of Dermatology 60, no. 3 (2021): 289–308, 10.1111/ijd.15132.
- 11. Freeman K., Dinnes J., Chuchu N., et al., "Algorithm Based Smartphone Apps to Assess Risk of Skin Cancer in Adults: Systematic Review of Diagnostic Accuracy Studies," British Medical Journal 368 (2020): m127, 10.1136/bmj.m127.
- 12. Sun M. D., Kentley J., Wilson B. W., et al., "Digital Skin Imaging Applications, Part I: Assessment of Image Acquisition Technique Features," Skin Research and Technology 28, no. 4 (2022): 623–632, 10.1111/srt.13163.
- 13. Haenssle H. A., Fink C., Schneiderbauer R., et al., "Man Against Machine: Diagnostic Performance of a Deep Learning Convolutional Neural Network for Dermoscopic Melanoma Recognition in Comparison to 58 Dermatologists," Annals of Oncology 29, no. 8 (2018): 1836–1842, 10.1093/annonc/mdy166.
- 14. Krakowski I., Kim J., Cai Z. R., et al., "Human‐AI Interaction in Skin Cancer Diagnosis: A Systematic Review and Meta‐Analysis," NPJ Digital Medicine 7, no. 1 (2024): 78, 10.1038/s41746-024-01031-w.
- 15. Lyakhov P. A., Lyakhova U. A., and Nagornov N. N., "System for the Recognizing of Pigmented Skin Lesions With Fusion and Analysis of Heterogeneous Data Based on a Multimodal Neural Network," Cancers 14, no. 7 (2022): 1819, 10.3390/cancers14071819.
- 16. Sangers T., Reeder S., van der Vet S., et al., "Validation of a Market‐Approved Artificial Intelligence Mobile Health App for Skin Cancer Screening: A Prospective Multicenter Diagnostic Accuracy Study," Dermatology 238, no. 4 (2022): 649–656, 10.1159/000520474.
- 17. Tschandl P., Rinner C., Apalla Z., et al., "Human–Computer Collaboration for Skin Cancer Recognition," Nature Medicine 26, no. 8 (2020): 1229–1234, 10.1038/s41591-020-0942-0.
- 18. Stafford H., Buell J., Chiang E., et al., "Non‐Melanoma Skin Cancer Detection in the Age of Advanced Technology: A Review," Cancers 15, no. 12 (2023): 3094, 10.3390/cancers15123094.
- 19. Ukharov A. O., Shlivko I. L., Klemenova I. A., et al., "Skin Cancer Risk Self‐Assessment Using AI as a Mass Screening Tool," Informatics in Medicine Unlocked 38 (2023): 101223, 10.1016/j.imu.2023.101223.
- 20. Foltz E. A., Witkowski A., Becker A. L., et al., "Artificial Intelligence Applied to Non‐Invasive Imaging Modalities in Identification of Nonmelanoma Skin Cancer: A Systematic Review," Cancers 16, no. 3 (2024): 629, 10.3390/cancers16030629.
- 21. Patel R. H., Foltz E. A., Witkowski A., and Ludzik J., "Analysis of Artificial Intelligence‐Based Approaches Applied to Non‐Invasive Imaging for Early Detection of Melanoma: A Systematic Review," Cancers 15, no. 19 (2023): 4694, 10.3390/cancers15194694.
- 22. Chen L. C., Zhu Y., Papandreou G., Schroff F., and Adam H., "Encoder‐Decoder With Atrous Separable Convolution for Semantic Image Segmentation," in Computer Vision – ECCV 2018 (Munich: Springer, 2018), 833–851, 10.1007/978-3-030-01234-2_49.
- 23. He K., Zhang X., Ren S., and Sun J., "Deep Residual Learning for Image Recognition," in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (Las Vegas, NV: IEEE, 2016), 770–778, 10.1109/CVPR.2016.90.
- 24. Deng J., Dong W., Socher R., Li L. J., Li K., and Fei‐Fei L., "ImageNet: A Large‐Scale Hierarchical Image Database," in 2009 IEEE Conference on Computer Vision and Pattern Recognition (Miami, FL: IEEE, 2009), 248–255, 10.1109/CVPR.2009.5206848.
- 25. Redmon J., Divvala S., Girshick R., and Farhadi A., "You Only Look Once: Unified, Real‐Time Object Detection," in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (Las Vegas, NV: IEEE, 2016), 779–788, 10.1109/CVPR.2016.91.
- 26. Lin T. Y., Maire M., Belongie S., et al., "Microsoft COCO: Common Objects in Context," in Computer Vision – ECCV 2014 (Zurich: Springer, 2014), 740–755, 10.1007/978-3-319-10602-1_48.
- 27. Lim H. W. and Cooper K., "The Health Impact of Solar Radiation and Prevention Strategies," Journal of the American Academy of Dermatology 41, no. 1 (1999): 81–99, 10.1016/S0190-9622(99)70412-3.
- 28. Kassianos A. P., Emery J. D., Murchie P., and Walter F. M., "Smartphone Applications for Melanoma Detection by Community, Patient and Generalist Clinician Users: A Review," British Journal of Dermatology 172, no. 6 (2015): 1507–1518, 10.1111/bjd.13665.
- 29. Dinnes J., Deeks J. J., Grainge M. J., et al., "Visual Inspection for Diagnosing Cutaneous Melanoma in Adults," Cochrane Database of Systematic Reviews 2018, no. 12 (2018): CD013194, 10.1002/14651858.CD013194.
- 30. Nervil G. G., Ternov N. K., Vestergaard T., et al., "Improving Skin Cancer Diagnostics Through a Mobile App With a Large Interactive Image Repository: Randomized Controlled Trial," JMIR Dermatology 6 (2023): e48357, 10.2196/48357.
- 31. Ternov N. K., Christensen A. N., Kampen P. J. T., et al., "Generalizability and Usefulness of Artificial Intelligence for Skin Cancer Diagnostics: An Algorithm Validation Study," JEADV Clinical Practice 1, no. 4 (2022): 344–354, 10.1002/jvc2.59.