
author:

Zhu, Shitao (Zhu, Shitao.) [1] | Lin, Ling (Lin, Ling.) [2] | Liu, Qin (Liu, Qin.) [3] | Liu, Jing (Liu, Jing.) [4] | Song, Yanwen (Song, Yanwen.) [5] | Xu, Qin (Xu, Qin.) [6]

Indexed by:

Scopus SCIE

Abstract:

Background: Automated tumor segmentation and survival prediction are critical to clinical diagnosis and treatment. This study aimed to develop deep-learning models for automatic tumor segmentation and survival prediction in magnetic resonance imaging (MRI) of cervical cancer (CC) by combining deep neural networks and the Transformer architecture. Methods: This study included 406 patients with CC, each with comprehensive clinical information and MRI scans. We randomly divided patients into training, validation, and independent test cohorts in a 6:2:2 ratio. During model training, we employed two architecture types: a hybrid model combining a convolutional neural network (CNN) with a Transformer (CoTr), and pure CNN models. For survival prediction, the hybrid model combined tumor image features extracted by the segmentation models with clinical information. The performance of the segmentation models was evaluated using the Dice similarity coefficient (DSC) and the 95% Hausdorff distance (HD95). The performance of the survival models was assessed using the concordance index. Results: The CoTr model performed well in both contrast-enhanced T1-weighted (ceT1W) and T2-weighted (T2W) imaging segmentation tasks, with average DSCs of 0.827 and 0.820, respectively, outperforming the other CNN models such as U-Net (DSC: 0.807 and 0.808), attention U-Net (DSC: 0.814 and 0.811), and V-Net (DSC: 0.805 and 0.807). For survival prediction, the proposed deep-learning model significantly outperformed traditional methods, yielding a concordance index of 0.732. Moreover, it effectively divided patients into low-risk and high-risk groups for disease progression (P<0.001). Conclusions: Combining the Transformer architecture with a CNN can improve MRI tumor segmentation, and this deep-learning model excelled in the survival prediction of patients with CC as compared to traditional methods.
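The abstract evaluates segmentation with the Dice similarity coefficient (DSC) and survival prediction with the concordance index. As an illustration only (not the authors' implementation), the two metrics can be sketched in plain NumPy; the function names and the toy masks below are hypothetical:

```python
import numpy as np

def dice_coefficient(pred, target):
    """Dice similarity coefficient (DSC) between two binary masks:
    2*|A ∩ B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    denom = pred.sum() + target.sum()
    return 2.0 * intersection / denom if denom else 1.0

def concordance_index(times, scores, events):
    """Harrell's concordance index: among comparable pairs (subject i
    had an observed event before subject j's time), the fraction where
    the higher risk score belongs to the earlier event; ties count 0.5."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if scores[i] > scores[j]:
                    concordant += 1.0
                elif scores[i] == scores[j]:
                    concordant += 0.5
    return concordant / comparable if comparable else 0.0

# Toy example: overlap of 1 voxel, mask sizes 2 and 1 -> DSC = 2/3
print(dice_coefficient([[1, 1], [0, 0]], [[1, 0], [0, 0]]))
# Perfectly ranked risk scores -> concordance index = 1.0
print(concordance_index([1, 2, 3], [3, 2, 1], [1, 1, 1]))
```

A DSC of 0.827, as reported for CoTr on ceT1W images, means the predicted and reference tumor masks overlap in roughly 83% of their combined volume; a concordance index of 0.732 means the model correctly orders about 73% of comparable patient pairs by progression risk.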

Keyword:

cervical cancer (CC); deep learning; magnetic resonance imaging (MRI); survival; Transformer

Community:

  • [ 1 ] [Zhu, Shitao]Fuzhou Univ, Coll Comp & Data Sci, Fuzhou, Peoples R China
  • [ 2 ] [Lin, Ling]Fujian Med Univ, Fujian Canc Hosp, Dept Gynecol, Clin Oncol Sch, 420 Fuma Rd, Fuzhou 350011, Peoples R China
  • [ 3 ] [Liu, Jing]Fujian Med Univ, Fujian Canc Hosp, Dept Gynecol, Clin Oncol Sch, 420 Fuma Rd, Fuzhou 350011, Peoples R China
  • [ 4 ] [Xu, Qin]Fujian Med Univ, Fujian Canc Hosp, Dept Gynecol, Clin Oncol Sch, 420 Fuma Rd, Fuzhou 350011, Peoples R China
  • [ 5 ] [Liu, Qin]Univ Hong Kong, Li Ka Shing Fac Med, Dept Clin Oncol, Hong Kong, Peoples R China
  • [ 6 ] [Song, Yanwen]Xiamen Humanity Hosp, Dept Radiat Oncol, Xiamen, Peoples R China

Reprint Author's Address:

  • [Xu, Qin]Fujian Med Univ, Fujian Canc Hosp, Dept Gynecol, Clin Oncol Sch, 420 Fuma Rd, Fuzhou 350011, Peoples R China


Source :

QUANTITATIVE IMAGING IN MEDICINE AND SURGERY

ISSN: 2223-4292

Year: 2024

Issue: 8

Volume: 14

Impact Factor: 2.900 (JCR@2023)

ESI Highly Cited Papers on the List: 0
