
Neural network prediction method for colorectal cancer treatment effect based on MRI and CT images

A technology for predicting the treatment effect of colorectal cancer, applied in the field of neural network prediction, which can solve problems such as misjudgment of the lesion area, interference from the training data set, and feature artifacts.

Pending Publication Date: 2022-08-09
FUZHOU UNIVERSITY

AI Technical Summary

Problems solved by technology

[0004] Disadvantage 1: In the method of Patent Publication No. CN111210909A, delineation of the lesion area is easily disturbed by the training data set, which can lead to misjudgment when discriminating the lesion region.
[0005] Disadvantage 2: The patent with publication number CN113345576A uses only multimodal CT images and cannot make full use of the rich image information contained in both CT and MRI images. Moreover, traditional MRI and CT image fusion techniques are prone to feature artifacts caused by the peristalsis of many organs.




Embodiment Construction

[0030] The technical solutions of the present invention will be described in detail below with reference to the accompanying drawings.

[0031] The invention provides a neural network prediction method for colorectal cancer treatment effect based on MRI and CT images. First, the tumor region of the input image is segmented by a deep learning network; then, the region of interest (ROI) is automatically extracted according to the segmentation result; finally, the MRI and CT features are fused by channel, and a convolutional neural network is used for pCR classification.
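
The patent summary does not disclose the concrete fusion or classifier architecture, so the following is only a minimal PyTorch-style sketch of the idea described above: the MRI and CT ROI patches are concatenated along the channel dimension ("fused by channel") and fed to a small convolutional network for binary pCR prediction. The ROI size, channel counts, and layer widths are illustrative assumptions, not values from the patent.

import torch
import torch.nn as nn

class FusionPCRClassifier(nn.Module):
    # Sketch only: channel-wise fusion of MRI and CT ROI patches + small CNN.
    def __init__(self, roi_size: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (roi_size // 4) ** 2, 64), nn.ReLU(),
            nn.Linear(64, 2),  # pCR vs. non-pCR
        )

    def forward(self, mri_roi: torch.Tensor, ct_roi: torch.Tensor) -> torch.Tensor:
        # mri_roi, ct_roi: (batch, 1, H, W) ROI patches from the segmentation step.
        x = torch.cat([mri_roi, ct_roi], dim=1)  # fuse by channel
        return self.classifier(self.features(x))

# Example usage with random tensors standing in for registered ROI patches.
model = FusionPCRClassifier()
logits = model(torch.randn(4, 1, 64, 64), torch.randn(4, 1, 64, 64))

In practice the two ROI patches would need to be resampled to a common size and spatially aligned before concatenation; the patent summary does not specify how this is done.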

[0032] The present invention also provides a neural network prediction system for the treatment effect of colorectal cancer based on MRI and CT images, including a segmentation module and a classification module;

[0033] The segmentation module uses the CE-Net network to automatically segment the tumor region of the input image, and uses the binary mask obtained from the segmentation to locate and crop the image tumor region...
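
The CE-Net segmentation network itself is not reproduced here; the sketch below only illustrates the mask-to-ROI step described above, assuming the segmentation output is a binary NumPy mask with the same shape as the image. The margin value is an assumption for illustration.

import numpy as np

def crop_roi(image: np.ndarray, mask: np.ndarray, margin: int = 8) -> np.ndarray:
    # Locate the bounding box of the binary tumor mask and crop the image patch,
    # padded by a small margin and clipped to the image borders.
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        raise ValueError("Empty mask: no tumor region was segmented.")
    y0, y1 = max(ys.min() - margin, 0), min(ys.max() + margin + 1, image.shape[0])
    x0, x1 = max(xs.min() - margin, 0), min(xs.max() + margin + 1, image.shape[1])
    return image[y0:y1, x0:x1]

# Example: a fake 256x256 slice with a rectangular "tumor" mask.
img = np.random.rand(256, 256).astype(np.float32)
msk = np.zeros_like(img)
msk[100:140, 120:170] = 1
roi = crop_roi(img, msk)  # shape (56, 66) with the 8-pixel margin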



Abstract

The invention relates to a neural network prediction method for colorectal cancer treatment effect based on MRI and CT images. The method comprises the following steps: first, a lesion is segmented by a deep learning network, and a region of interest (ROI) is then automatically extracted according to the segmentation result; finally, the MRI and CT features are fused by channel, and pCR classification is carried out using a convolutional neural network. The method can reduce the complexity and time consumption of manual segmentation of the tumor region when predicting a patient's pCR.

Description

Technical Field

[0001] The invention relates to a neural network prediction method for the treatment effect of colorectal cancer based on MRI and CT images.

Background Technique

[0002] The patent with publication number CN111210909A discloses a deep neural network-based automatic diagnosis system for rectal cancer T staging. The system includes a deep neural network model comprising a feature extraction network, a region generation network, a pooling layer, and a classification and regression layer. First, a ResNet-50 model is used to learn to judge the layer type and orientation of the entire input image, establishing a layer recognition module; then, a ResNet-101 model is used as the basic network, and a target detection model is trained at each image level to delineate the tumor area and determine its T stage. The system produces consistent results with comparable accuracy, allowing easy integration and large-scale application. [0003] T...


Application Information

IPC(8): G06V10/764; G06N3/04; G06N3/08; G06T7/00; G06T7/11; G06V10/25; G06V10/40; G16H50/30
CPC: G06V10/764; G06T7/0012; G06T7/11; G06V10/25; G16H50/30; G06N3/08; G06V10/40; G06T2207/10081; G06T2207/10088; G06N3/045
Inventor: 李兰兰, 胡益煌, 王大彪, 徐斌
Owner: FUZHOU UNIVERSITY