A T1WI-fMRI image tumor collaborative segmentation method based on 3D-Unet and graph theory segmentation

A tumor collaborative segmentation technology in the field of image processing, addressing the problems that a single-modality image can hardly express all of the tumor's information and that existing deep learning networks cannot accurately describe the tumor boundary, so as to achieve accurate delineation of the tumor and its subregions

Active Publication Date: 2019-05-10
ZHEJIANG UNIV OF TECH +1

AI Technical Summary

Problems solved by technology

[0003] In order to overcome the problems that existing single-modality tumor segmentation methods can hardly express all of the information of the tumor and that existing deep learning networks cannot accurately describe the tumor boundary, the present invention provides a T1WI-fMRI image tumor collaborative segmentation method based on 3D-Unet and graph theory segmentation.




Detailed Description of the Embodiments

[0027] To make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further explained below in conjunction with specific embodiments and the accompanying drawings.

[0028] Referring to Figures 1 and 2, a T1WI-fMRI image tumor collaborative segmentation method based on 3D-Unet and graph theory segmentation is provided, which makes full use of the information shared between multi-modal MRI images to realize automatic and accurate delineation of tumors and tumor subregions. The method includes the following steps:

[0029] Step 1. Image preprocessing: obtain the MRI training data set, interpolate the MRI images of different modalities to the same voxel size with a bilinear interpolation algorithm, spatially align the T1WI image and the corresponding fMRI image using a B-spline-based registration algorithm, and further generate the corresponding brain-tissue / non-brain-area mask through gray-value thresholding;
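A minimal sketch of this preprocessing step is given below, assuming SimpleITK as the toolkit (the patent does not name a library). The file names, the 1 mm isotropic target spacing, the mutual-information metric, the B-spline mesh size and the choice of registering the fMRI volume onto the T1WI grid are all illustrative assumptions, and the fMRI data are treated here as a single 3D volume for simplicity.

```python
# Hedged sketch of Step 1: resampling, B-spline registration and brain masking.
# Library choice (SimpleITK), file names and parameters are assumptions.
import SimpleITK as sitk

def resample_to_spacing(img, spacing=(1.0, 1.0, 1.0)):
    """Resample an image to a common voxel size with linear interpolation."""
    orig_size, orig_spacing = img.GetSize(), img.GetSpacing()
    new_size = [int(round(sz * sp / ns))
                for sz, sp, ns in zip(orig_size, orig_spacing, spacing)]
    return sitk.Resample(img, new_size, sitk.Transform(), sitk.sitkLinear,
                         img.GetOrigin(), spacing, img.GetDirection(),
                         0.0, img.GetPixelID())

t1wi = resample_to_spacing(sitk.ReadImage("t1wi.nii.gz", sitk.sitkFloat32))
fmri = resample_to_spacing(sitk.ReadImage("fmri.nii.gz", sitk.sitkFloat32))

# B-spline registration: align the (moving) fMRI volume to the (fixed) T1WI space.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsLBFGSB()
reg.SetInterpolator(sitk.sitkLinear)
reg.SetInitialTransform(sitk.BSplineTransformInitializer(t1wi, [8, 8, 8]),
                        inPlace=False)
transform = reg.Execute(t1wi, fmri)
fmri_aligned = sitk.Resample(fmri, t1wi, transform, sitk.sitkLinear, 0.0)

# Brain-tissue / non-brain mask by gray-value thresholding (Otsu picks the threshold).
brain_mask = sitk.OtsuThreshold(t1wi, 0, 1)
```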

[0030] Step ...
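The remaining steps are truncated in this excerpt. Step 3, as summarized in the abstract below, trains two independent 3D-Unet networks (one for T1WI, one for fMRI) that output coarse masks and voxel-level probability maps. The PyTorch sketch below shows one possible minimal 3D-Unet; the two-level depth, channel widths and the four output classes (background, necrosis, active tumor, peritumoral edema) are illustrative assumptions rather than the patent's actual configuration.

```python
# Hedged minimal 3D-Unet sketch (PyTorch); architecture details are assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3x3 convolutions, each followed by batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm3d(out_ch), nn.ReLU(inplace=True),
        nn.Conv3d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm3d(out_ch), nn.ReLU(inplace=True),
    )

class UNet3D(nn.Module):
    def __init__(self, in_ch=1, n_classes=4, base=16):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)          # encoder, full resolution
        self.enc2 = conv_block(base, base * 2)       # encoder, 1/2 resolution
        self.bottleneck = conv_block(base * 2, base * 4)
        self.pool = nn.MaxPool3d(2)
        self.up2 = nn.ConvTranspose3d(base * 4, base * 2, kernel_size=2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)   # decoder with skip connection
        self.up1 = nn.ConvTranspose3d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv3d(base, n_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)   # per-class logits; softmax gives the probability map

# One independent network per modality, as in step 3 of the abstract.
net_t1wi, net_fmri = UNet3D(), UNet3D()
logits = net_t1wi(torch.randn(1, 1, 64, 64, 64))   # -> (1, 4, 64, 64, 64)
```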



Abstract

A T1WI-fMRI image tumor collaborative segmentation method based on 3D-Unet and graph theory segmentation is provided. The method comprises the following steps: step 1, obtaining a T1WI and fMRI tumor segmentation data set and carrying out preprocessing; step 2, generating training samples; step 3, training two independent 3D-Unet networks that carry out tumor and tumor sub-region segmentation on the T1WI image and the fMRI image respectively, using the strong descriptive capability of the networks to generate high-quality voxel-level tumor and tumor sub-region/non-tumor masks and probability maps; and step 4, carrying out fine segmentation based on a graph theory method, using the two obtained probability maps and the coarse segmentation masks in a continuous graph segmentation model with a label consistency constraint to simultaneously generate the final tumor and tumor subregion segmentation results on the T1WI and fMRI images. According to the invention, automatic and accurate delineation of tumors and tumor subregions (including tumor necrosis, active tumor and peritumoral edema) is realized.
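Step 4 couples the two modalities through a label consistency constraint, but this excerpt does not give the exact model. As a hedged illustration only, a continuous (relaxed) graph segmentation energy of the following general form could express the idea; the data, smoothness and coupling terms below are assumptions, not the patent's actual formulation:

$$
E(u^{T1}, u^{f}) = \sum_{m \in \{T1,\, f\}} \Big( \sum_{i \in V} -\log p_i^{m}\big(u_i^{m}\big) \;+\; \alpha \sum_{(i,j) \in \mathcal{E}} w_{ij}^{m}\, \big|u_i^{m} - u_j^{m}\big| \Big) \;+\; \beta \sum_{i \in V} \big|u_i^{T1} - u_i^{f}\big|
$$

Here $u_i^{m}$ is the relaxed label of voxel $i$ in modality $m$, $p_i^{m}$ the 3D-Unet probability map restricted to the coarse segmentation mask, $V$ and $\mathcal{E}$ the voxels and edges of the graph, $w_{ij}^{m}$ an edge weight encoding intensity similarity, and $\alpha$, $\beta$ weighting parameters. The first term follows the network's probabilities, the second enforces spatial smoothness, and the last term penalizes label disagreement between the T1WI and fMRI segmentations, so that minimizing $E$ produces the final results on both images simultaneously.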

Description

Technical Field

[0001] The present invention relates to the field of image processing, and in particular to a tumor image segmentation method based on deep learning.

Background

[0002] Brain tumors are abnormal tissue growths that can lead to increased intracranial pressure and damage to the central nervous system, thereby threatening the patient's life. Reliable detection and segmentation of brain tumors from magnetic resonance images can aid surgical planning and treatment assessment in medical diagnosis. Currently, most brain tumors are segmented manually by medical experts, which is time-consuming and relies heavily on the experts' subjective experience. Computer-aided tumor segmentation therefore plays an increasingly important role in modern medical analysis. However, due to the large spatial and structural variability of brain tumors and the overlap between the gray-intensity range of tumors and that of healthy tissue, traditional machine learning methods are still...


Application Information

IPC(8): G06T7/11, G06T7/136
Inventor: 冯远静, 谭志豪, 陈余凯, 金儿, 曾庆润, 李思琦, 诸葛启钏
Owner: ZHEJIANG UNIV OF TECH