Real-Time Liver Lesion Segmentation in Ultrasound Imaging using Deep Learning
Keywords:
Ultrasound Imaging, Liver Tumor, Segmentation, U-Net, Focal Tversky Loss, Real-Time Detection

Abstract
Ultrasound (US) imaging is a widely used, non-invasive method for detecting liver tumors and assessing parenchymal changes. However, the inherent variability and noise in US images pose challenges for accurate lesion identification. This study aims to develop and evaluate a deep learning (DL) model capable of performing real-time segmentation of liver lesions in US scans. A dataset of 50 video examinations was used, from which frames were extracted and manually annotated by an experienced gastroenterologist. The segmentation process was conducted using a U-Net architecture with focal Tversky loss (FTL) to address class imbalance. Two versions of the model were trained with different FTL parameters: Model 1 (α = β = 0.5, γ = 1) and Model 2 (α = 0.7, β = 0.3, γ = 0.75). The models were assessed based on key performance metrics, including intersection over union (IoU), recall, and precision. Model 1 achieved a higher IoU score (0.84) than Model 2. Both models demonstrated inference times between 30 and 80 milliseconds, confirming their feasibility for real-time US applications. Visual analysis showed that Model 1 produced more precise and contiguous lesion segmentation, whereas Model 2 tended to separate lesions that were close together. These findings suggest that the proposed DL models are effective in real-time liver lesion segmentation in US imaging. Model 1, which utilized balanced FTL parameters, demonstrated superior segmentation accuracy.
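The abstract reports the FTL parameter settings (α, β, γ) but not the loss formulation itself. The sketch below illustrates one common definition of the focal Tversky loss, FTL = (1 − TI)^γ with Tversky index TI = TP / (TP + α·FN + β·FP), together with the IoU metric used for evaluation. The TensorFlow backend, function names, and smoothing constant are assumptions for illustration, not the authors' implementation.

```python
import tensorflow as tf

def focal_tversky_loss(y_true, y_pred, alpha=0.5, beta=0.5,
                       gamma=1.0, smooth=1e-6):
    """Focal Tversky loss for binary segmentation (illustrative sketch).

    alpha weights false negatives and beta weights false positives;
    gamma shifts emphasis toward harder examples. With the abstract's
    Model 1 settings (alpha = beta = 0.5, gamma = 1) this reduces to a
    plain Dice-style Tversky loss; Model 2 uses alpha = 0.7, beta = 0.3,
    gamma = 0.75.
    """
    y_true = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred = tf.reshape(tf.cast(y_pred, tf.float32), [-1])

    tp = tf.reduce_sum(y_true * y_pred)          # true positives
    fn = tf.reduce_sum(y_true * (1.0 - y_pred))  # false negatives
    fp = tf.reduce_sum((1.0 - y_true) * y_pred)  # false positives

    tversky = (tp + smooth) / (tp + alpha * fn + beta * fp + smooth)
    return tf.pow(1.0 - tversky, gamma)

def iou_score(y_true, y_pred, threshold=0.5, smooth=1e-6):
    """Intersection over union after thresholding the predicted mask."""
    y_true = tf.cast(y_true, tf.float32)
    y_pred = tf.cast(y_pred > threshold, tf.float32)
    intersection = tf.reduce_sum(y_true * y_pred)
    union = tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) - intersection
    return (intersection + smooth) / (union + smooth)
```

Under this formulation, raising α above β penalizes missed lesion pixels more heavily, which typically trades precision for recall; the abstract's visual findings (Model 2 splitting adjacent lesions) are consistent with that trade-off.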
License
Copyright (c) 2025 Marinela-Cristiana URHUŢ, Larisa Daniela SĂNDULESCU, Costin Teodor STREBA, Mădălin MĂMULEANU, Adriana CIOCÂLTEU, Sergiu Marian CAZACU, Suzana DĂNOIU

This work is licensed under a Creative Commons Attribution 4.0 International License.