
Please use this identifier to cite or link to this item: https://elib.bsu.by/handle/123456789/288996
Title: Intelligent Labeling of Tumor Lesions Based on Positron Emission Tomography/Computed Tomography
Authors: Ye, Sh.
Shen, Ch.
Bai, Z.
Wang, J.
Yao, X.
Nedzvedz, O.
Keywords: BSU Electronic Library::NATURAL AND EXACT SCIENCES::Physics
BSU Electronic Library::NATURAL AND EXACT SCIENCES::Biology
BSU Electronic Library::TECHNICAL AND APPLIED SCIENCES. BRANCHES OF THE ECONOMY::Medicine and Healthcare
Issue Date: 2022
Publisher: MDPI
Citation: Sensors 2022;22(14)
Abstract: Positron emission tomography/computed tomography (PET/CT) plays a vital role in diagnosing tumors. However, PET/CT imaging relies primarily on manual interpretation and labeling by medical professionals, and this enormous workload hampers the construction of training samples for deep learning. Labeling tumor lesions in PET/CT images lies at the intersection of computer graphics and medicine, involving registration, fusion of medical images, and labeling of lesions. This paper extends linear interpolation, enhances it in a specific area of the PET image, and applies outer-frame scaling of the PET/CT image together with a least-squares residual affine method. The PET and CT images are subjected to wavelet transformation and then synthesized in proportion to form a PET/CT fusion image. Based on the standardized uptake value (SUV) of 18F-FDG (fluorodeoxyglucose) in the PET image, a professional selects an arbitrary point in the focus area of the fusion image, and the system automatically selects the seed point of the focus area to delineate the tumor focus with the region-growing method. Finally, the focus delineated on the PET/CT fusion image is automatically mapped onto the CT image as a polygon, from which a rectangular segmentation and label are formed. This study took actual PET/CT scans of patients with lymphatic cancer as an example. The system's semiautomatic labeling was compared and verified against manual labeling by imaging specialists: the recognition rate was 93.35%, and the misjudgment rate was 6.52%. © 2022 by the authors.
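The region-growing step summarized in the abstract can be sketched as follows. This is an illustrative implementation only, not the authors' code: it assumes a 2D intensity array (e.g. a slice of the fusion image) and a fixed intensity tolerance around the seed value; the paper's SUV-based automatic seed selection and wavelet fusion are not reproduced here.

```python
from collections import deque

import numpy as np


def region_grow(image, seed, tolerance):
    """Grow a region from `seed` (row, col), accepting 4-connected
    neighbors whose intensity differs from the seed value by at most
    `tolerance`. Returns a boolean mask of the delineated region."""
    rows, cols = image.shape
    seed_val = image[seed]
    mask = np.zeros(image.shape, dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not mask[nr, nc]:
                if abs(image[nr, nc] - seed_val) <= tolerance:
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask


# Toy example: a bright 3x3 "lesion" inside a dark background.
img = np.zeros((5, 5))
img[1:4, 1:4] = 10.0
lesion_mask = region_grow(img, seed=(2, 2), tolerance=1.0)
```

The bounding rectangle of `lesion_mask` (its min/max row and column indices) would then give the rectangular label mapped onto the CT image.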
URI: https://elib.bsu.by/handle/123456789/288996
DOI: 10.3390/s22145171
Scopus: 85135130828
Sponsorship: This research was funded by the Ministry of Science and Technology of the People’s Republic of China (grant numbers G2021016001L, G2021016002L, G2021016028L).
Licence: info:eu-repo/semantics/openAccess
Appears in Collections: Articles of the Faculty of Biology

Files in This Item:
File: sensors-22-05171-v2.pdf (3.27 MB, Adobe PDF)


