
Multi-modal MR image brain tumor segmentation method based on deep learning and multi-guidance

A deep learning, multi-modal technology applied in the field of image processing, addressing problems such as low segmentation accuracy, uneven brightness distribution, and blurred target boundaries.

Active Publication Date: 2021-02-12
ZHONGBEI UNIV

AI Technical Summary

Problems solved by technology

[0005] The present invention aims to overcome the deficiencies of the existing technologies and to solve the problem of low segmentation accuracy caused by uneven brightness distribution and blurred target boundaries in MRI, by providing a multi-modal MRI brain tumor segmentation method based on deep learning and multi-guidance. The invention fuses the overall glioma segmentation result and the glioma edge prediction result through the proposed fusion mechanism, realizing multi-modal MRI glioma segmentation under the guidance and fusion of multiple feature maps.
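As a rough illustration of the kind of guidance-based fusion described above, the sketch below (PyTorch) multiplies a feature tensor by an overall-tumor probability map and an edge probability map, then merges the two guided streams with a 1x1 convolution. The class name GuidedFusion, the specific operations, and the tensor shapes are assumptions made for illustration; this paragraph does not disclose the exact fusion mechanism.

```python
# Hypothetical sketch (PyTorch): fusing an overall-tumor probability map and an
# edge probability map as guidance for a feature tensor. The operations below
# are assumptions for illustration, not the patent's disclosed fusion mechanism.
import torch
import torch.nn as nn

class GuidedFusion(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # 1x1 conv to merge the two guided feature streams back to `channels`
        self.merge = nn.Conv2d(channels * 2, channels, kernel_size=1)

    def forward(self, features, whole_tumor_map, edge_map):
        # features:        (N, C, H, W) features of the substructure branch
        # whole_tumor_map: (N, 1, H, W) probability map from the overall segmentation net
        # edge_map:        (N, 1, H, W) probability map from the edge prediction net
        region_guided = features * whole_tumor_map      # emphasize the tumor region
        edge_guided = features * (1.0 + edge_map)       # sharpen boundary responses
        return self.merge(torch.cat([region_guided, edge_guided], dim=1))

# Usage sketch with dummy tensors
fusion = GuidedFusion(channels=32)
feats = torch.randn(2, 32, 64, 64)
wt = torch.sigmoid(torch.randn(2, 1, 64, 64))
edge = torch.sigmoid(torch.randn(2, 1, 64, 64))
out = fusion(feats, wt, edge)   # (2, 32, 64, 64)
```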




Embodiment Construction

[0031] In order to make the purpose, features, and advantages of the present invention easy to understand, the specific embodiments of the present invention are described in detail below in conjunction with the accompanying drawings.

[0032] As shown in Figure 1, the multi-modal MRI brain tumor segmentation method based on deep learning and multi-guidance is mainly composed of three network modules: the overall brain glioma segmentation network module, the brain glioma edge prediction network module, and the brain glioma substructure segmentation network module. The overall segmentation network module and the edge prediction network module are used to generate multiple guide maps, and this guidance information is used to guide brain glioma sub-region segmentation. The segmentation result obtained by the substructure segmentation network module is the final segmentation result, and the substructures of different...
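The paragraph above describes two guidance networks whose outputs steer a third, substructure segmentation network. The following sketch wires up that three-module layout under assumed placeholder backbones (tiny_cnn) and an assumed four-modality 2D input; it is an illustrative assumption, not the patent's disclosed architecture.

```python
# Hypothetical sketch (PyTorch) of the three-module layout described above:
# two guidance networks produce guide maps, and a third network consumes the
# image together with those maps to predict the substructure labels.
# The concrete backbones (plain conv stacks here) are placeholders.
import torch
import torch.nn as nn

def tiny_cnn(in_ch: int, out_ch: int) -> nn.Sequential:
    # Placeholder backbone; the patent's actual networks are not reproduced here.
    return nn.Sequential(
        nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(16, out_ch, 1),
    )

class MultiGuidedSegmentation(nn.Module):
    def __init__(self, modalities: int = 4, substructures: int = 4):
        super().__init__()
        self.whole_tumor_net = tiny_cnn(modalities, 1)   # overall glioma segmentation
        self.edge_net = tiny_cnn(modalities, 1)          # glioma edge prediction
        # substructure net sees the modalities plus the two guide maps
        self.substructure_net = tiny_cnn(modalities + 2, substructures)

    def forward(self, x):
        # x: (N, modalities, H, W) stacked multi-modal MRI slices
        whole = torch.sigmoid(self.whole_tumor_net(x))   # guide map 1
        edge = torch.sigmoid(self.edge_net(x))           # guide map 2
        guided_input = torch.cat([x, whole, edge], dim=1)
        return self.substructure_net(guided_input)       # final substructure logits

# Usage sketch: 4 MRI modalities (e.g. T1, T1ce, T2, FLAIR), 4 output classes
model = MultiGuidedSegmentation()
logits = model(torch.randn(1, 4, 128, 128))  # (1, 4, 128, 128)
```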



Abstract

The invention discloses a multi-modal MR image brain tumor segmentation method based on deep learning and multi-guidance, belongs to the field of image processing, and solves three problems in the multi-modal MRI brain glioma segmentation process: (1) inaccurate segmentation caused by unclear brain glioma boundaries; (2) discrete mis-segmented points appearing in the segmentation result due to the uneven brightness distribution of multi-modal MRI; and (3) the problem of fusing the features of various kinds of guidance information in a brain glioma MRI segmentation network. Feature fusion is carried out on the overall brain glioma segmentation result and the brain glioma edge prediction result through the proposed fusion mechanism, so that multi-modal MRI brain glioma segmentation under multi-feature-map guidance and fusion is realized. The deep segmentation network achieves high-accuracy segmentation with a small number of parameters, so the method can conveniently be embedded into edge devices to assist doctors in diagnosing and analyzing brain glioma.
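The abstract stresses high accuracy with a small parameter count for deployment on edge devices. One common way to shrink a segmentation network, shown below purely as an assumed illustration and not as the patent's disclosed design, is to replace standard convolutions with depthwise separable convolutions.

```python
# Illustrative sketch only: a depthwise separable convolution, a common way to
# cut parameter count in segmentation networks for edge deployment. This is an
# assumption for illustration, not the patent's disclosed lightweight design.
import torch.nn as nn

class SeparableConv2d(nn.Module):
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, padding=1, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

# A standard 3x3 conv from 64 to 64 channels has 64*64*9 = 36,864 weights; the
# separable version has 64*9 + 64*64 = 4,672, roughly an 8x reduction.
block = SeparableConv2d(64, 64)
print(sum(p.numel() for p in block.parameters()))  # 4800 incl. biases, vs 36928 for nn.Conv2d(64, 64, 3)
```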

Description

Technical Field

[0001] The invention belongs to the field of image processing, and in particular relates to a multi-modal MRI brain tumor segmentation method based on deep learning and multi-guidance.

Background Technique

[0002] Glioma is a primary brain tumor, accounting for about 80% of malignant brain tumors. MRI assists doctors in diagnosing gliomas: doctors can use MRI to locate a glioma and then conduct quantitative analysis of it in order to implement treatment plans such as radiotherapy and surgery. Generally, doctors diagnose and label gliomas on multi-modal MRI, but manually labeling gliomas slice by slice is quite time-consuming. At the same time, different doctors have different levels of experience, which leads to large differences in the labels produced for the same case and is not conducive to the precise treatment of glioma. Therefore, computer assistance is needed to complete the MR image labeling of glioma.

[0003] At present, the image segmentation me...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T 7/00; G06T 7/11; G06N 3/08; G06N 3/04
CPC: G06T 7/0012; G06T 7/11; G06N 3/08; G06T 2207/10088; G06T 2207/30096; G06N 3/045
Inventors: 张晋京, 曾建潮, 秦品乐, 赵利军
Owner: ZHONGBEI UNIV