Mammography CADx System based on Transfer Learning

  • J.A. Almaraz-Damian
  • Oscar Garcia-Avila
  • Volodymyr Ponomaryov
  • R. Reyes-Reyes
  • C. Cruz-Ramos
  • Instituto Politecnico Nacional, Santa Ana Ave. #1000, Mexico City, 04430, Mexico
Cite as
Almaraz-Damian J.A., Garcia-Avila O., Ponomaryov V., Reyes-Reyes R., Cruz-Ramos C. (2021). Mammography CADx System based on Transfer Learning. Proceedings of the 10th International Workshop on Innovative Simulation for Healthcare (IWISH 2021), pp. 55-63. DOI: https://doi.org/10.46354/i3m.2021.iwish.009

Abstract

Early detection of breast cancer is crucial in the treatment of this disease; mammography imaging is the principal diagnostic tool because it is non-invasive. In this paper, a Computer-Aided Diagnosis (CADx) system is presented for the analysis of digital mammogram images. The methods used in the proposed CADx system are Transfer Learning, a Support Vector Machine as classifier, and feature reduction based on Principal Component Analysis. The system demonstrates improved performance in comparison with state-of-the-art methods in terms of quality metrics such as Accuracy, Specificity, Sensitivity, and F1-Score.
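
To make the described pipeline concrete, the sketch below outlines one plausible arrangement of the three stages named in the abstract: deep features extracted by a pretrained CNN (transfer learning), Principal Component Analysis for feature reduction, and a Support Vector Machine for classification. The ResNet50 backbone, the 224x224 input size, the retained-variance setting, and the SVM hyperparameters are illustrative assumptions, not values reported in the paper.

  # Minimal sketch of the abstract's pipeline, under assumed settings:
  # 1) a frozen pretrained CNN maps mammogram patches to deep features,
  # 2) PCA reduces the feature dimensionality,
  # 3) an SVM performs the benign/malignant classification.
  import numpy as np
  import tensorflow as tf
  from sklearn.decomposition import PCA
  from sklearn.pipeline import make_pipeline
  from sklearn.preprocessing import StandardScaler
  from sklearn.svm import SVC

  # Pretrained backbone used as a frozen feature extractor (transfer learning).
  # ResNet50 and the 224x224x3 input shape are illustrative choices.
  backbone = tf.keras.applications.ResNet50(
      include_top=False, weights="imagenet", pooling="avg",
      input_shape=(224, 224, 3))
  backbone.trainable = False

  def extract_features(images: np.ndarray) -> np.ndarray:
      """Map preprocessed mammogram patches to deep feature vectors."""
      x = tf.keras.applications.resnet50.preprocess_input(images)
      return backbone.predict(x, verbose=0)

  # Feature reduction + classification stage.
  classifier = make_pipeline(
      StandardScaler(),
      PCA(n_components=0.95),    # keep 95% of the variance (assumed setting)
      SVC(kernel="rbf", C=1.0))  # SVM classifier (assumed hyperparameters)

  # Usage (X_train: (N, 224, 224, 3) image patches, y_train: class labels):
  # classifier.fit(extract_features(X_train), y_train)
  # y_pred = classifier.predict(extract_features(X_test))

In this arrangement the CNN stays frozen, so only the PCA projection and the SVM are fitted on the mammogram data, which is what makes the transfer-learning approach practical for relatively small medical image datasets.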
