Journal: Scientific Reports
This study presents BreastXploreAI, a novel multimodal, multitask deep learning framework designed to enhance breast cancer diagnosis by simultaneously predicting cancer subtypes and disease stages from full-field digital mammograms.
The system is built around TransBreastNet, a hybrid model that combines:
- Convolutional neural networks for spatial lesion encoding
- Transformer modules for modeling temporal lesion progression
- Dense metadata encoders for patient-specific clinical data
This design addresses the limitations of previous static, single-view models; a minimal architectural sketch is given below.
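As an illustration of the hybrid design described above, the following sketch outlines a CNN + Transformer + metadata-encoder multitask model in PyTorch. It is not the authors' TransBreastNet implementation; the ResNet-18 backbone, layer sizes, pooling strategy, and head dimensions are illustrative assumptions.

```python
# A minimal sketch (not the authors' implementation) of a CNN + Transformer +
# metadata-encoder multitask model of the kind described above. All layer sizes,
# names, and the ResNet-18 backbone are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class HybridMammogramNet(nn.Module):
    def __init__(self, n_subtypes=4, n_stages=4, meta_dim=8, d_model=256):
        super().__init__()
        # CNN backbone encodes each mammogram in a temporal sequence.
        backbone = resnet18(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, d_model)
        self.cnn = backbone
        # Transformer encoder models lesion progression across visits.
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.temporal = nn.TransformerEncoder(layer, num_layers=2)
        # Dense encoder for patient metadata (age, breast density, etc. -- assumed fields).
        self.meta = nn.Sequential(nn.Linear(meta_dim, 64), nn.ReLU(), nn.Linear(64, 64))
        # Two task heads share the fused representation (multitask learning).
        self.subtype_head = nn.Linear(d_model + 64, n_subtypes)
        self.stage_head = nn.Linear(d_model + 64, n_stages)

    def forward(self, images, metadata):
        # images: (batch, time, 3, H, W); metadata: (batch, meta_dim)
        b, t = images.shape[:2]
        feats = self.cnn(images.flatten(0, 1)).view(b, t, -1)   # per-visit embeddings
        fused = self.temporal(feats).mean(dim=1)                # pool over time
        fused = torch.cat([fused, self.meta(metadata)], dim=1)
        return self.subtype_head(fused), self.stage_head(fused)
```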
By leveraging both genuine longitudinal data and synthetic temporal sequences, the framework effectively models lesion progression patterns.
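The abstract does not specify how synthetic temporal sequences are constructed; the sketch below shows one plausible approach, assumed purely for illustration, in which progressively weaker augmentations of a single mammogram stand in for earlier visits.

```python
# One plausible way (an assumption, not the paper's stated procedure) to build a
# synthetic temporal sequence when only a single visit is available: apply
# progressively weaker augmentations so the CNN-Transformer still sees a sequence.
import torch
import torchvision.transforms as T


def synthetic_sequence(image: torch.Tensor, n_visits: int = 3) -> torch.Tensor:
    """image: (3, H, W) single mammogram -> (n_visits, 3, H, W) pseudo-sequence."""
    frames = []
    for i in range(n_visits - 1, -1, -1):
        # Earlier "visits" get stronger jitter/blur; the last frame is near-original.
        strength = i / max(n_visits - 1, 1)
        aug = T.Compose([
            T.ColorJitter(brightness=0.2 * strength, contrast=0.2 * strength),
            T.GaussianBlur(kernel_size=5, sigma=(0.1, 0.1 + 2.0 * strength)),
        ])
        frames.append(aug(image))
    # Stack along the time axis; add a batch dimension before feeding the model above.
    return torch.stack(frames)
```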
Evaluated on a public mammogram dataset, BreastXploreAI outperformed existing state-of-the-art methods, achieving:
- Macro accuracy of 95.2% for subtype classification
- Macro accuracy of 93.8% for stage prediction
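For reference, one way such macro-averaged scores can be computed per task is sketched below; interpreting macro accuracy as the unweighted mean of per-class recall (balanced accuracy) is an assumption, not a definition taken from the paper.

```python
# Hedged sketch: compute a macro-averaged score for each task, assuming
# "macro accuracy" means the unweighted mean of per-class recall.
from sklearn.metrics import recall_score


def macro_accuracy(y_true, y_pred):
    # average="macro" takes the unweighted mean over classes.
    return recall_score(y_true, y_pred, average="macro")


# Illustrative labels only: one score for subtype classification, one for staging.
subtype_score = macro_accuracy([0, 1, 2, 2], [0, 1, 2, 1])
stage_score = macro_accuracy([0, 0, 1, 3], [0, 1, 1, 3])
```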
The inclusion of explainability modules further supports clinical interpretability. Together, these results position the approach as a scalable, clinically relevant tool for improving precision in breast cancer diagnosis and supporting oncology decision-making.