Abstract of Deep Learning-based Computer-Aided Diagnosis systems: a contribution to prostate cancer detection in histopathological images

Lourdes Durán López

  • In this work, novel computer-aided diagnosis systems for medical image analysis, focusing on prostate cancer, are proposed and implemented. First, the histopathology of prostate cancer was studied, along with the Gleason Grading System, which measures the aggressiveness of a tumor through different tissue patterns and is used to guide therapy for this disease. Furthermore, a study of Deep Learning techniques, particularly neural networks applied to medical image analysis, was conducted. Based on these studies, a Deep Learning-based system was proposed and developed to detect malignant regions in gigapixel-size whole-slide prostate cancer tissue images, reporting the spatial location of the malignant areas. This solution was evaluated in terms of performance and execution time, obtaining promising results when compared to other state-of-the-art methods. Since the implemented system locates malignant regions within the image without providing a global class, a custom Wide & Deep network was developed to report a slide-level label per image. The proposed system provides a fast screening method for analyzing histopathological images. Next, a neural network was proposed to assign a specific Gleason pattern to the malignant areas of the tissue. Finally, with the purpose of developing a global computer-aided diagnosis system for prostate cancer detection and classification, the three aforementioned subsystems were combined, allowing a complete analysis of histopathological images by reporting whether the sample is normal or malignant and, in the latter case, a heatmap of the malignant areas with their corresponding Gleason pattern. The studied algorithms were also applied to other medical image analysis tasks. The performance of these systems was evaluated, the obtained results were discussed, and conclusions and improvements for future work were presented.
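
The abstract describes a three-stage pipeline (malignant-region detection, slide-level labeling with a custom Wide & Deep network, and Gleason pattern assignment). The sketch below is only a conceptual illustration of how such subsystems might be chained; it is not the author's code, and every function, class name, threshold, and patch size is a hypothetical placeholder.

```python
# Conceptual sketch of the combined CAD pipeline described in the abstract.
# All names, thresholds, and patch sizes are hypothetical assumptions.
import numpy as np

PATCH = 256  # assumed patch size in pixels; the thesis may use a different value


def detect_malignant_patches(slide: np.ndarray) -> np.ndarray:
    """Subsystem 1 (placeholder): patch-level malignancy probabilities.

    A real system would tile the gigapixel whole-slide image and run a CNN on
    each patch; here random scores are returned just to keep the sketch runnable.
    """
    rows, cols = slide.shape[0] // PATCH, slide.shape[1] // PATCH
    return np.random.rand(rows, cols)


def slide_level_label(heatmap: np.ndarray) -> str:
    """Subsystem 2 (placeholder): slide-level label from the patch heatmap.

    The thesis uses a custom Wide & Deep network; this stand-in simply
    thresholds the fraction of suspicious patches.
    """
    return "malignant" if (heatmap > 0.5).mean() > 0.05 else "normal"


def gleason_patterns(heatmap: np.ndarray) -> np.ndarray:
    """Subsystem 3 (placeholder): assign a Gleason pattern (3-5) to malignant patches."""
    patterns = np.zeros_like(heatmap, dtype=int)
    malignant = heatmap > 0.5
    patterns[malignant] = np.random.randint(3, 6, size=int(malignant.sum()))
    return patterns


def diagnose(slide: np.ndarray) -> dict:
    """Combined pipeline: heatmap -> slide-level label -> Gleason grading."""
    heatmap = detect_malignant_patches(slide)
    label = slide_level_label(heatmap)
    result = {"label": label, "heatmap": heatmap}
    if label == "malignant":
        result["gleason"] = gleason_patterns(heatmap)
    return result


if __name__ == "__main__":
    fake_slide = np.zeros((4 * PATCH, 4 * PATCH, 3), dtype=np.uint8)  # stand-in for a WSI
    print(diagnose(fake_slide)["label"])
```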

