The emergence of the COVID-19 pandemic has accelerated the digital transformation of medicine worldwide. Medical consultations are a routine part of everyone's life, and new technologies can improve them. To avoid crowded spaces, online medical applications are increasingly relevant and sought after: they ease the workload of medical staff and reduce the risk of contracting various diseases. The SmartRay module was conceived out of the desire to strengthen the role of online applications in the field of medicine. It is intended to analyze X-rays and return a cancer assessment. The Microsoft Azure Machine Learning platform was used for the artificial-intelligence part. We collected a data set of over 10,000 radiographic images of different human organs (breast, lung, kidney, etc.), with and without cancer, from the Cancer Imaging Archive of the Department of Biomedical Informatics at the University of Arkansas for Medical Sciences. Transformations were applied to this data set, and the DenseNet algorithm, a type of convolutional neural network, was used to train the PyTorch model. The trained model was evaluated against the initial data set to obtain a score, and an NVIDIA GPU inference endpoint was created to evaluate radiographic images in real time. The module allows the user to create an account in order to access the radiograph-analysis part and to build a portfolio that retains both the radiographic images and the results obtained after the analysis. In conclusion, the module developed in this research can be used as a solution for health monitoring through the analysis of medical images.


Machine Learning, DenseNet, Real-Time Processing, NVIDIA GPU