AI Summary • Published on Feb 21, 2026
Diagnosing dermatophytosis often relies on potassium hydroxide (KOH) microscopy, a technique with well-known limitations: artefacts, inconsistent keratin clearance, and high inter-observer variability all hinder accurate identification of fungal hyphae. Traditional diagnostic methods, including direct microscopy and fungal culture, suffer from long turnaround times, limited sensitivity, and subjective interpretation. Existing deep learning approaches in this field have largely focused on image-level classification or pixel-wise segmentation rather than object-level localization that can differentiate true fungal structures from visually similar artefacts, a major source of diagnostic ambiguity.
This study developed an AI-based detection system using the RT-DETR (Real-Time Detection Transformer) architecture to localize fungal elements in KOH microscopy images. A dataset of 2,540 high-resolution microscopic images was acquired and expertly annotated using a multi-class strategy: 631 fungal elements (hyphae and spore clusters) were labeled as the primary target class, and 381 clinically relevant artefacts (such as keratin debris and fibers) were labeled as a distinct 'artefact' class so the model could actively discriminate between the two. The RT-DETR-L model, pre-trained on the COCO dataset, was trained for up to 250 epochs with the AdamW optimizer using morphology-preserving augmentations (horizontal flipping, scaling, translation, and minor rotations); aggressive augmentations were avoided. Performance was evaluated at the object level (recall, precision, AP@0.50, and mean IoU) and at the image level (sensitivity, specificity, and diagnostic accuracy), with an image classified as positive if at least one fungal element was detected with a confidence score above 0.25.
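The image-level decision rule described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' code: the detection format (dicts with `cls` and `conf` keys) and the class names `"fungal"` and `"artefact"` are assumptions; only the 0.25 threshold and the two-class scheme come from the study.

```python
# Sketch of the described aggregation rule: an image is called positive
# if at least one detection of the fungal class exceeds the confidence
# threshold. Detection dict format and class names are assumed.

CONF_THRESHOLD = 0.25
FUNGAL_CLASS = "fungal"  # as opposed to the distinct "artefact" class


def classify_image(detections):
    """detections: list of dicts with 'cls' (str) and 'conf' (float) keys."""
    return any(
        d["cls"] == FUNGAL_CLASS and d["conf"] > CONF_THRESHOLD
        for d in detections
    )


# Artefact detections alone never make an image positive, and
# sub-threshold fungal detections are ignored.
dets = [
    {"cls": "artefact", "conf": 0.91},  # e.g. keratin debris or a fiber
    {"cls": "fungal", "conf": 0.12},    # below threshold, ignored
]
print(classify_image(dets))  # False

dets.append({"cls": "fungal", "conf": 0.48})
print(classify_image(dets))  # True
```

Note how the explicit artefact class keeps mimics out of the positivity decision entirely: only fungal-class detections can trigger a positive call, regardless of how confidently an artefact is detected.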
At the object level, the detection framework achieved a recall of 0.9737 and a precision of 0.8043. The model demonstrated strong spatial agreement with expert annotations, indicated by an AP@0.50 of 93.56% and a mean IoU of 0.8560. Critically, when aggregated for image-level diagnosis on an independent test set of 254 images, the model achieved 100% sensitivity, correctly identifying all 89 positive cases without any false negatives. It also showed a specificity of 98.18% and an overall diagnostic accuracy of 98.82%. Qualitative analysis confirmed the model's ability to robustly localize low-contrast fungal hyphae even in artefact-rich environments, effectively distinguishing them from mimics.
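The reported image-level figures are internally consistent, as a quick check shows. The 89-positive/254-image split is stated above; the 3 false positives are inferred from the 98.18% specificity (162 of 165 negatives correct) rather than stated directly.

```python
# Sanity check of the reported image-level metrics. Counts of 89 positives
# out of 254 images are from the text; fp = 3 is inferred from the stated
# 98.18% specificity, not reported directly.

tp, fn = 89, 0            # all 89 positives detected (no false negatives)
total = 254
negatives = total - tp    # 165 negative images
fp = 3                    # inferred: 165 - 162 correctly rejected
tn = negatives - fp       # 162

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / total

print(f"sensitivity = {sensitivity:.4f}")  # 1.0000
print(f"specificity = {specificity:.4f}")  # 0.9818
print(f"accuracy    = {accuracy:.4f}")     # 0.9882
```

The arithmetic reproduces the reported 100% sensitivity, 98.18% specificity, and 98.82% accuracy, implying exactly three negative images were flagged as positive.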
This transformer-based AI system demonstrates high reliability in detecting fungal elements in challenging KOH microscopy images, positioning it as a safe and effective automated screening tool in dermatomycology. Its ability to provide object-level localization and explicitly distinguish between fungal structures and artefacts offers interpretable visual cues, enhancing diagnostic transparency and potentially reducing inter-observer variability. The perfect image-level sensitivity ensures that no positive cases are overlooked, which is crucial for clinical safety. This approach bridges the gap between complex AI predictions and practical clinical explainability. Future work includes validating the model through external multi-center studies, refining annotation schemes to differentiate specific fungal morphologies, integrating anomaly detection to flag atypical patterns (e.g., malignancies), and conducting prospective reader studies to assess real-world impact on diagnostic accuracy and efficiency.