
Please use this identifier to cite or link to this item: https://elib.bsu.by/handle/123456789/288996
Title: Intelligent Labeling of Tumor Lesions Based on Positron Emission Tomography/Computed Tomography
Authors: Ye, Sh.
Shen, Ch.
Bai, Z.
Wang, J.
Yao, X.
Nedzvedz, O.
Subject: BSU EL::NATURAL AND EXACT SCIENCES::Physics
BSU EL::NATURAL AND EXACT SCIENCES::Biology
BSU EL::TECHNICAL AND APPLIED SCIENCES. BRANCHES OF THE ECONOMY::Medicine and healthcare
Publication date: 2022
Publisher: MDPI
Bibliographic description of the source: Sensors 2022;22(14)
Abstract: Positron emission tomography/computed tomography (PET/CT) plays a vital role in diagnosing tumors. However, PET/CT imaging relies primarily on manual interpretation and labeling by medical professionals, and this enormous workload hinders the construction of training samples for deep learning. The labeling of tumor lesions in PET/CT images lies at the intersection of computer graphics and medicine and involves registration, fusion of medical images, and labeling of lesions. This paper extends linear interpolation, enhancing it in a specific area of the PET image, and applies outer-frame scaling of the PET/CT image together with the least-squares residual affine method. The PET and CT images are subjected to wavelet transformation and then combined in proportion to form a PET/CT fusion image. Guided by the 18F-FDG (fluorodeoxyglucose) standardized uptake value (SUV) in the PET image, a professional selects an arbitrary point in the lesion area of the fusion image; the system then automatically selects a seed point for the lesion area and delineates the tumor lesion with the region-growing method. Finally, the lesion delineated on the PET/CT fusion image is automatically mapped onto the CT image as a polygon, from which rectangular segmentation and labels are generated. This study used actual PET/CT scans of patients with lymphoma as an example. The system's semiautomatic labeling was compared against manual labeling by imaging specialists: the recognition rate was 93.35%, and the misjudgment rate was 6.52%. © 2022 by the authors.
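The paper's own implementation is not reproduced in this record, but the core step described in the abstract — growing a lesion region from a seed point and then deriving a rectangular label — can be sketched minimally. The grid values, `tolerance` threshold, and function names below are illustrative assumptions, not the authors' code; a real system would grow the region on SUV values of the fused PET/CT image.

```python
from collections import deque

def region_grow(image, seed, tolerance):
    """Seeded region growing with 4-connectivity: a neighboring pixel joins
    the region if its intensity is within `tolerance` of the seed intensity."""
    rows, cols = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    region = {seed}
    frontier = deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - seed_val) <= tolerance):
                region.add((nr, nc))
                frontier.append((nr, nc))
    return region

def bounding_box(region):
    """Axis-aligned rectangle (min_row, min_col, max_row, max_col) enclosing
    the grown region -- the rectangular label mapped onto the CT image."""
    rs = [r for r, _ in region]
    cs = [c for _, c in region]
    return min(rs), min(cs), max(rs), max(cs)

# Toy SUV-like grid: a bright "lesion" embedded in a low-uptake background.
grid = [
    [1, 1, 1, 1, 1],
    [1, 8, 9, 1, 1],
    [1, 9, 9, 8, 1],
    [1, 1, 8, 1, 1],
    [1, 1, 1, 1, 1],
]
lesion = region_grow(grid, seed=(2, 2), tolerance=2)
print(len(lesion))            # 6 connected high-uptake pixels
print(bounding_box(lesion))   # (1, 1, 3, 3)
```

The breadth-first frontier keeps the growth strictly to pixels connected to the seed, so nearby bright regions that do not touch the lesion are excluded from the rectangle.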
URI: https://elib.bsu.by/handle/123456789/288996
DOI: 10.3390/s22145171
Scopus ID: 85135130828
Funding: This research was funded by the Ministry of Science and Technology of the People's Republic of China (grant numbers G2021016001L, G2021016002L, G2021016028L).
License: info:eu-repo/semantics/openAccess
Appears in collections: Articles of the Faculty of Biology

Full text:
File: sensors-22-05171-v2.pdf (3.27 MB, Adobe PDF)



All items in the Electronic Library are protected by copyright; all rights reserved.