
Author:

Zhou, Xiaogen [1] | Nie, Xingqing [2] | Li, Zhiqiang [3] | Lin, Xingtao [4] | Xue, Ensheng [5] | Wang, Luoyan [6] | Lan, Junlin [7] | Chen, Gang [8] | Du, Min [9] | Tong, Tong [10]

Indexed by:

EI

Abstract:

Skin lesions and thyroid cancer have become highly prevalent diseases. Computer-aided diagnosis (CAD) systems for dermatological diseases are among the most remarkable applications in which deep learning technologies have demonstrated performance surpassing human experts. A critical step in developing a CAD system is the diagnosis of skin lesions and thyroid nodules from dermoscopic images and ultrasound images, respectively. Although notable successes have been achieved with deep convolutional neural network (DCNN) models, several challenges hamper practical application in clinical settings due to the complexity of clinical data; for example, skin lesions and thyroid nodules often have irregular shapes or low contrast. To alleviate these issues, we propose a novel dual encoder-decoder network, called H-Net, for automated thyroid nodule and skin lesion segmentation. Specifically, a shallow CNN, called L-Net, is applied on the left to learn low-level detail information, and a deep CNN, called R-Net, is employed on the right to capture high-level information. Furthermore, to transfer information mutually between the L-Net and the R-Net, we propose a novel crossed skip connection strategy, a specific and reliable form of skip connection. In addition, to enhance the representation learning ability of the proposed pipeline, we propose a novel contextual information encoding module, which replaces the conventional convolutional layers in H-Net. Meanwhile, we propose a novel hybrid loss to alleviate the imbalanced training problem. To validate the effectiveness of H-Net, 600 pairs of dermoscopic images and 139 pairs of ultrasound images were used for evaluation in experiments. Seven recent biomedical image segmentation approaches are compared, and ten metrics are used to evaluate segmentation performance. Extensive experimental results demonstrate that H-Net sets a new record, achieving an mIoU of 84.8% on the ISIC-2017 dataset and 87.5% on the TNUI-2021 dataset, outperforming state-of-the-art approaches in both visual comparison and quantitative evaluation. The code is available at https://github.com/zxg3017/H-Net. © 2022 Elsevier Inc.
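
For orientation, the following is a minimal PyTorch-style sketch of two ideas named in the abstract: a shallow and a deep encoder branch that exchange pooled features through crossed skip connections, and a hybrid loss combining cross-entropy with a soft Dice term. It is not the authors' implementation (the official code is at https://github.com/zxg3017/H-Net); the class name DualBranchSegNet, the helpers conv_block and hybrid_loss, and all layer widths and loss weights are illustrative assumptions.

# Minimal sketch, NOT the official H-Net code (see the linked GitHub repo).
# It only illustrates a shallow branch ("L-Net") and a deep branch ("R-Net")
# exchanging pooled features via crossed skip connections, plus a hybrid
# BCE + soft-Dice loss; widths, depths and weights are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch, n_convs):
    # Plain 3x3 conv + BN + ReLU stack; a stand-in for the paper's
    # contextual information encoding module.
    layers = []
    for i in range(n_convs):
        layers += [nn.Conv2d(in_ch if i == 0 else out_ch, out_ch, 3, padding=1),
                   nn.BatchNorm2d(out_ch),
                   nn.ReLU(inplace=True)]
    return nn.Sequential(*layers)


class DualBranchSegNet(nn.Module):
    # Toy dual encoder-decoder: a shallow branch for low-level detail and
    # a deeper branch for high-level context, fused in a shared decoder.
    def __init__(self, in_ch=3, n_classes=1, w=32):
        super().__init__()
        self.pool = nn.MaxPool2d(2)
        self.l_enc1 = conv_block(in_ch, w, n_convs=1)      # shallow branch
        self.l_enc2 = conv_block(2 * w, 2 * w, n_convs=1)
        self.r_enc1 = conv_block(in_ch, w, n_convs=2)      # deep branch
        self.r_enc2 = conv_block(2 * w, 2 * w, n_convs=2)
        self.bottleneck = conv_block(4 * w, 4 * w, n_convs=2)
        self.up2 = nn.ConvTranspose2d(4 * w, 2 * w, 2, stride=2)
        self.dec2 = conv_block(6 * w, 2 * w, n_convs=2)
        self.up1 = nn.ConvTranspose2d(2 * w, w, 2, stride=2)
        self.dec1 = conv_block(3 * w, w, n_convs=2)
        self.head = nn.Conv2d(w, n_classes, 1)

    def forward(self, x):
        l1 = self.l_enc1(x)
        r1 = self.r_enc1(x)
        # Crossed skip connections: each branch also receives the other
        # branch's pooled features before its next encoder stage.
        l2 = self.l_enc2(torch.cat([self.pool(l1), self.pool(r1)], dim=1))
        r2 = self.r_enc2(torch.cat([self.pool(r1), self.pool(l1)], dim=1))
        b = self.bottleneck(torch.cat([self.pool(l2), self.pool(r2)], dim=1))
        d2 = self.dec2(torch.cat([self.up2(b), l2, r2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), l1, r1], dim=1))
        return self.head(d1)                               # segmentation logits


def hybrid_loss(logits, target, dice_weight=0.5, eps=1.0):
    # Binary cross-entropy plus a soft Dice term, a common recipe for
    # imbalanced foreground/background segmentation (the paper's exact
    # hybrid loss may differ).
    bce = F.binary_cross_entropy_with_logits(logits, target)
    probs = torch.sigmoid(logits)
    inter = (probs * target).sum(dim=(2, 3))
    union = probs.sum(dim=(2, 3)) + target.sum(dim=(2, 3))
    dice = (2.0 * inter + eps) / (union + eps)
    return bce + dice_weight * (1.0 - dice.mean())


if __name__ == "__main__":
    model = DualBranchSegNet()
    images = torch.randn(2, 3, 128, 128)                   # e.g. dermoscopy crops
    masks = (torch.rand(2, 1, 128, 128) > 0.5).float()     # dummy lesion masks
    logits = model(images)
    print(logits.shape, hybrid_loss(logits, masks).item())

Because both branches are pooled to the same resolution before concatenation, the crossed skips keep the feature maps shape-compatible at every stage; the real H-Net additionally replaces the plain convolution stacks with its contextual information encoding module.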

Keyword:

Computer aided diagnosis; Convolution; Convolutional neural networks; Decoding; Deep neural networks; Dermatology; Image enhancement; Image segmentation; Signal encoding; Ultrasonics

Community:

  • [ 1 ] [Zhou, Xiaogen]College of Physics and Information Engineering, Fuzhou University, Fuzhou, China
  • [ 2 ] [Zhou, Xiaogen]Fujian Key Lab of Medical Instrumentation & Pharmaceutical Technology, Fuzhou University, Fuzhou, China
  • [ 3 ] [Nie, Xingqing]College of Physics and Information Engineering, Fuzhou University, Fuzhou, China
  • [ 4 ] [Nie, Xingqing]Fujian Key Lab of Medical Instrumentation & Pharmaceutical Technology, Fuzhou University, Fuzhou, China
  • [ 5 ] [Li, Zhiqiang]College of Physics and Information Engineering, Fuzhou University, Fuzhou, China
  • [ 6 ] [Li, Zhiqiang]Fujian Key Lab of Medical Instrumentation & Pharmaceutical Technology, Fuzhou University, Fuzhou, China
  • [ 7 ] [Lin, Xingtao]College of Physics and Information Engineering, Fuzhou University, Fuzhou, China
  • [ 8 ] [Lin, Xingtao]Fujian Key Lab of Medical Instrumentation & Pharmaceutical Technology, Fuzhou University, Fuzhou, China
  • [ 9 ] [Xue, Ensheng]Fujian Medical University Union Hospital, Fuzhou, China
  • [ 10 ] [Wang, Luoyan]College of Physics and Information Engineering, Fuzhou University, Fuzhou, China
  • [ 11 ] [Wang, Luoyan]Fujian Key Lab of Medical Instrumentation & Pharmaceutical Technology, Fuzhou University, Fuzhou, China
  • [ 12 ] [Lan, Junlin]College of Physics and Information Engineering, Fuzhou University, Fuzhou, China
  • [ 13 ] [Lan, Junlin]Fujian Key Lab of Medical Instrumentation & Pharmaceutical Technology, Fuzhou University, Fuzhou, China
  • [ 14 ] [Chen, Gang]Department of Pathology, Fujian Cancer Hospital & Fujian Medical University Cancer Hospital, Fuzhou, China
  • [ 15 ] [Du, Min]College of Physics and Information Engineering, Fuzhou University, Fuzhou, China
  • [ 16 ] [Du, Min]Fujian Key Lab of Medical Instrumentation & Pharmaceutical Technology, Fuzhou University, Fuzhou, China
  • [ 17 ] [Tong, Tong]College of Physics and Information Engineering, Fuzhou University, Fuzhou, China
  • [ 18 ] [Tong, Tong]Fujian Key Lab of Medical Instrumentation & Pharmaceutical Technology, Fuzhou University, Fuzhou, China
  • [ 19 ] [Tong, Tong]Imperial Vision Technology, Fuzhou, China

Reprint's Address:

Email:

Source:

Information Sciences

ISSN: 0020-0255

Year: 2022

Volume: 613

Page: 575-590

JCR@2022: 8.1

JCR@2023: 0.000

ESI HC Threshold: 61

JCR Journal Grade: 1

CAS Journal Grade: 1

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count: 9

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

Affiliated Colleges:
