Training method and device for neural network model, storage medium and electronic device

A neural network model training technology, applied in the fields of neural network model training methods and devices, storage media, and electronic equipment. It addresses the problems that comprehensively analyzing multi-modal images increases difficulty, that the correlation among multiple modal images cannot be effectively mined, and that the differences and commonalities between features in images of multiple modalities are not considered.

Active Publication Date: 2022-07-22
INFERVISION MEDICAL TECH CO LTD

AI Technical Summary

Problems solved by technology

In medical image analysis, however, it is usually necessary to comprehensively analyze images of multiple modalities of the same body part, which increases the difficulty of applying deep learning algorithms to medical image analysis.
[0003] In addition, although some researchers have used deep neural networks to train on images of multiple modalities to assist doctors in making correct judgments, they usually simply merge the features of the images of the multiple modalities without considering the differences and commonalities between those features, so the correlation among the multiple modal images cannot be effectively mined.




Embodiment Construction

[0025] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art, based on the embodiments of the present invention and without creative effort, shall fall within the protection scope of the present invention.

[0026] Figure 1 is a schematic flowchart of a training method for a neural network model provided by an embodiment of the present invention. The method may be performed by a computer device (e.g., a server). As shown in Figure 1, the method includes the following steps.

[0027] S110: Perform feature extraction on each of multiple modal images of the same body part to obtain multiple corresponding modal features.
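Step S110 can be illustrated with a toy sketch. Everything below is a hypothetical stand-in: the linear-projection encoder merely plays the role of whatever per-modality backbone (e.g. a CNN) an implementation would actually use, since the patent does not specify the extractor architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(image, w):
    """Hypothetical per-modality encoder: flatten the image and apply a
    linear projection followed by tanh (a stand-in for a CNN backbone)."""
    return np.tanh(image.reshape(-1) @ w)

# Three modalities of the same body part (e.g. different MRI sequences),
# represented here as 8x8 toy images.
modalities = [rng.normal(size=(8, 8)) for _ in range(3)]

# Each modality gets its own projection matrix: 64 input pixels -> 16 features.
weights = [rng.normal(scale=0.1, size=(64, 16)) for _ in range(3)]

modal_features = [extract_features(img, w) for img, w in zip(modalities, weights)]
```

In practice each modality would typically have its own trained encoder, since the intensity statistics of different modalities differ.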

[0028] Spe...



Abstract

The invention provides a training method and device for a neural network model, a storage medium, and an electronic device. The training method includes: performing feature extraction on multiple modal images of the same body part to obtain multiple corresponding modal features; fusing the multiple modal features to obtain a first fusion vector; performing feature aggregation and reconstruction on the first fusion vector by using a recurrent neural network to obtain a first feature vector; calculating a first loss function according to the first feature vector; and training the neural network model according to the first loss function. By fusing the multiple modal features into the first fusion vector and using the recurrent neural network to aggregate and reconstruct it into the first feature vector, the invention effectively mines the correlation among the features of the multiple modal images.
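The pipeline summarized above can be sketched end to end as follows. This is a minimal numerical illustration under stated assumptions, not the patented implementation: concatenation as the fusion operator, a single hand-rolled GRU-style cell as the recurrent aggregator, and a mean-squared reconstruction error as the first loss are all assumptions, since the abstract does not fix these choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16                                   # per-modality feature size (assumed)
modal_features = [rng.normal(size=d) for _ in range(3)]

# Step 1: fuse the modal features into a "first fusion vector"
# (concatenation is an assumption; the fusion operator is not specified).
fusion_vec = np.concatenate(modal_features)          # shape (48,)

# Step 2: feature aggregation/reconstruction with a recurrent step.
# A GRU-style cell reads the fused vector in d-sized chunks, one per modality.
h = np.zeros(d)
Wz, Wh = rng.normal(scale=0.1, size=(2, 2 * d, d))
for t in range(0, fusion_vec.size, d):
    x = fusion_vec[t:t + d]
    xh = np.concatenate([x, h])
    z = 1.0 / (1.0 + np.exp(-(xh @ Wz)))             # update gate
    h_tilde = np.tanh(xh @ Wh)                       # candidate state
    h = (1 - z) * h + z * h_tilde
first_feature_vec = h                                # "first feature vector"

# Step 3: a "first loss", assumed here to be a reconstruction-style MSE
# between the aggregated feature and the mean of the modal features.
target = np.mean(modal_features, axis=0)
first_loss = float(np.mean((first_feature_vec - target) ** 2))
```

In a real implementation the cell weights would be learned by backpropagating `first_loss`; the point of the sketch is only the data flow: modal features → fusion vector → recurrent aggregation → feature vector → loss.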

Description

Technical Field

[0001] The invention relates to the technical field of medical imaging, and in particular to a training method and device for a neural network model, a storage medium, and an electronic device.

Background Technique

[0002] In recent years, deep learning algorithms have been applied with great success in the field of image processing, which has inspired many researchers to apply the technology to medical image analysis. However, medical image analysis usually requires a comprehensive analysis of images of multiple modalities of the same body part, which increases the difficulty of applying deep learning algorithms to medical image analysis.

[0003] In addition, although some researchers have used deep neural networks to train on images of multiple modalities to assist doctors in making correct judgments, they usually simply merge the features of the images of the multiple modalities without considering the differences and commonalities between features in images of multi...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/08; G06T7/00
CPC: G06N3/08; G06T7/0012; G06T2207/10088; G06T2207/20221
Inventors: 陈伟导印宏坤武江芬张荣国李新阳王少康陈宽
Owner: INFERVISION MEDICAL TECH CO LTD