
Multi-mode cooperation esophageal cancer lesion image segmentation system based on self-sampling similarity

A multi-modal esophageal cancer technology in the field of intelligent medical image processing. It addresses the problem that existing methods consider only white-light images, and achieves precise segmentation of lesion regions and improved clinical diagnostic efficiency.

Active Publication Date: 2021-06-01
FUDAN UNIV


Problems solved by technology

[0006] Although the above methods have achieved certain results, they consider only white-light images and do not exploit the complementary NBI modality.


Image

  • Multi-mode cooperation esophageal cancer lesion image segmentation system based on self-sampling similarity


Detailed Description of the Embodiments

[0034] The embodiments of the present invention will be described in detail below, but the protection scope of the present invention is not limited to the examples.

[0035] Using the network structure in Figure 1, a multimodal neural network is trained on 268 paired white-light/NBI images to obtain an automatic lesion-region segmentation model.

[0036] The specific steps are:

[0037] (1) During training, scale each image to 500×500. Set the initial learning rate to 0.0001 with a decay rate of 0.9, decaying once every two epochs. Minimize the network's loss function with mini-batch stochastic gradient descent, using a batch size of 4. Update all network parameters and train until convergence;
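The hyperparameters in step (1) imply a simple step-decay schedule. A minimal sketch, assuming the decay is applied multiplicatively once every two epochs (the helper name `lr_at_epoch` is illustrative, not from the patent):

```python
def lr_at_epoch(epoch, base_lr=1e-4, decay=0.9, step=2):
    """Step-decay schedule from step (1): start at 1e-4 and
    multiply the learning rate by 0.9 once every two epochs."""
    return base_lr * decay ** (epoch // step)

BATCH_SIZE = 4  # mini-batch size for stochastic gradient descent

# Learning rate over the first six epochs:
schedule = [lr_at_epoch(e) for e in range(6)]
# epochs 0-1: 1.0e-4, epochs 2-3: 9.0e-5, epochs 4-5: 8.1e-5
```

In a framework such as PyTorch the same schedule would normally be expressed with a built-in step scheduler rather than written by hand.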

[0038] (2) During testing, resize the image I to 500×500 and input it into the trained model; the model outputs lesion-region segmentation results for both the white-light and NBI images;
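The 500×500 rescaling in step (2) can be sketched with a nearest-neighbour resize; the interpolation method is an assumption here, since the patent does not specify one:

```python
def resize_nearest(img, out_h=500, out_w=500):
    """Nearest-neighbour resize of a 2-D list-of-lists image to
    out_h x out_w, a stand-in for the 500x500 rescaling in step (2)."""
    in_h, in_w = len(img), len(img[0])
    return [[img[i * in_h // out_h][j * in_w // out_w]
             for j in range(out_w)]
            for i in range(out_h)]

# A 2x2 test image blown up to 4x4 keeps its corner values:
tiny = [[1, 2],
        [3, 4]]
big = resize_nearest(tiny, 4, 4)
```

In practice one would use a library routine (e.g. bilinear resampling in OpenCV or PIL) rather than this sketch.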

[0039] Figure 4 shows the segmentation results ...



Abstract

The invention belongs to the technical field of intelligent medical image processing, and specifically relates to a multi-modal cooperative esophageal cancer lesion image segmentation system based on self-sampling similarity. For early esophageal cancer diagnosis, a paired white-light/NBI data set is first constructed. The segmentation system comprises a feature extraction encoder, a self-sampling similar feature separation module, and a feature fusion decoder. The white-light image and the NBI image serve as multi-modal input, and the features of the two modalities are extracted by two separate encoders. For each modality, the self-sampling similar feature separation module effectively distinguishes lesion-region features from normal-region features. The decoder then fuses the multi-modal features in the feature domain and outputs lesion segmentation results for both modality images. Experimental results show that the system reasonably fuses features of different modalities, clearly distinguishes lesion regions from normal regions, achieves accurate segmentation of the lesion region, and improves the efficiency of clinical diagnosis.
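The pipeline described in the abstract (two modality-specific encoders, a self-sampling similar feature separation step per modality, and a shared fusion decoder emitting one mask per modality) can be sketched at the tensor-shape level as follows. All layer sizes, the 4× downsampling factor, and the function names are illustrative assumptions; the real modules are learned networks, not the random maps used here:

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 500   # input resolution used by the system
C = 16        # illustrative feature-channel count (assumption)

def encoder(img):
    """Stand-in for one modality-specific feature extractor:
    maps an H x W image to a C-channel feature map, downsampled 4x."""
    return rng.standard_normal((C, H // 4, W // 4))

def separate(feat):
    """Stand-in for the self-sampling similar feature separation
    module: splits channels into 'lesion' and 'normal' halves."""
    return feat[: C // 2], feat[C // 2:]

def fusion_decoder(wl_feat, nbi_feat):
    """Stand-in decoder: fuses the two modalities in feature space,
    then emits one full-resolution mask per input modality."""
    fused = np.concatenate([wl_feat, nbi_feat], axis=0)  # feature-domain fusion
    logits = fused.mean(axis=0)                          # (125, 125)
    up = np.kron(logits, np.ones((4, 4)))                # back to 500 x 500
    return up, up.copy()

wl_img = rng.standard_normal((H, W))    # white-light image
nbi_img = rng.standard_normal((H, W))   # NBI image
wl_lesion, _ = separate(encoder(wl_img))
nbi_lesion, _ = separate(encoder(nbi_img))
wl_mask, nbi_mask = fusion_decoder(wl_lesion, nbi_lesion)
```

The point of the sketch is the data flow: each modality keeps its own encoder, separation happens per modality, and fusion happens once, in feature space, before decoding.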

Description

Technical field

[0001] The invention belongs to the technical field of intelligent medical image processing, and in particular relates to an esophageal cancer lesion image segmentation system; more specifically, to a self-sampling similarity-based multi-modal cooperative esophageal cancer lesion image segmentation system.

Background technique

[0002] Esophageal cancer is a common upper gastrointestinal cancer in China and other developing countries, ranking sixth in cancer deaths worldwide [1]. Studies have shown that the 5-year survival rate of patients with advanced esophageal cancer is less than 20%, but it can exceed 85% when the disease is detected early. Advanced unresectable tumors are the main cause of poor prognosis in most patients with esophageal cancer. Therefore, accurate diagnosis of early esophageal cancer plays an important role in reducing mortality.

[0003] Endoscopic screening significantly reduces mortality from esophageal cancer. White-light endoscopic imaging (WH) is the most ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/11; G06T7/00; G06N3/04; G06N3/08
CPC: G06T7/11; G06T7/0012; G06N3/08; G06T2207/10068; G06T2207/20081; G06T2207/20084; G06T2207/30096; G06N3/045
Inventor: 钟芸诗, 颜波, 蔡世伦, 谭伟敏, 林青
Owner FUDAN UNIV