
Depth feature bag-based classification method

A classification method based on bags of deep features, applied in image analysis, image data processing, instruments, etc. It addresses the heavy computation and long runtime of three-dimensional convolutional neural networks, avoids over-fitting, shortens computation time, and improves classification performance.

Active Publication Date: 2018-10-12
SOUTHERN MEDICAL UNIVERSITY
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0006] The present invention provides a classification method based on bags of deep features. The method reduces the heavy computation and long runtime of three-dimensional convolutional neural networks, avoids the over-fitting caused by the scarcity of medical-image training data, and improves the classification performance of the support vector machine.



Examples


Embodiment 1

[0040] As shown in Figures 1-3, a classification method based on bags of deep features comprises the following steps in order:

[0041] S1. Acquire a three-dimensional image of the target object, and delineate and segment the region of interest;

[0042] S2. Decompose the three-dimensional region-of-interest image obtained in step S1 into two-dimensional images on three orthogonal planes, yielding three groups of two-dimensional images; from each group, select the two-dimensional image containing the most region-of-interest pixels as the input image for that orthogonal plane;
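
Step S2 can be sketched as follows. The axis order and plane naming here are assumptions about the volume's orientation, and `representative_slices` is a hypothetical helper, not code from the patent:

```python
import numpy as np

def representative_slices(volume, roi_mask):
    """For each of the three orthogonal planes, return the 2-D slice of
    `volume` whose cross-section contains the most ROI pixels."""
    slices = {}
    for axis, name in enumerate(("transverse", "coronal", "sagittal")):
        # Count ROI pixels in every slice perpendicular to this axis.
        other_axes = tuple(a for a in range(3) if a != axis)
        counts = roi_mask.sum(axis=other_axes)
        best = int(np.argmax(counts))      # slice with the most ROI pixels
        slices[name] = np.take(volume, best, axis=axis)
    return slices

# Tiny synthetic example: a 4x4x4 volume with the ROI concentrated at index 2
# along the first axis.
vol = np.arange(64, dtype=float).reshape(4, 4, 4)
mask = np.zeros((4, 4, 4), dtype=bool)
mask[2, 1:3, 1:3] = True
reps = representative_slices(vol, mask)   # reps["transverse"] is vol[2]
```

Each selected slice then serves as the input image for its plane in step S3.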

[0043] S3. Use a convolutional neural network to extract features from the input images of the three two-dimensional orthogonal planes obtained in step S2, obtaining the depth features of the three two-dimensional orthogonal planes. The depth features of the three two-dimensional orthogonal planes include training-sample depth features and t...
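
The patent does not disclose the network architecture in the extracted text; as a toy stand-in for step S3, a single convolution layer with ReLU and global average pooling already illustrates how a 2-D input image becomes a fixed-length depth-feature vector:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Plain 'valid' 2-D cross-correlation, no padding."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def deep_features(image, kernels):
    """One conv layer + ReLU + global average pooling: one scalar
    feature per kernel, i.e. a small depth-feature vector."""
    feats = []
    for k in kernels:
        fmap = np.maximum(conv2d_valid(image, k), 0.0)  # ReLU
        feats.append(fmap.mean())                       # global average pool
    return np.array(feats)

rng = np.random.default_rng(0)
img = rng.standard_normal((16, 16))
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]
f = deep_features(img, kernels)   # 4-dimensional feature vector
```

A real implementation would use a trained deep network (e.g. several such layers with learned kernels), applied identically to the transverse, coronal, and sagittal input images.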

Embodiment 2

[0059] A classification method based on bags of deep features, with all other features the same as in Embodiment 1; the difference is that in step S5 the kernel function is a Gaussian kernel, whose expression is:

K(x_i, x_j) = ω_1·exp(−D¹_RBF(x_i, x_j)/(2σ²)) + ω_2·exp(−D²_RBF(x_i, x_j)/(2σ²)) + ω_3·exp(−D³_RBF(x_i, x_j)/(2σ²)), with D^V_RBF(x_i, x_j) = Σ_k (x^V_{i,k} − x^V_{j,k})²

where D^V_RBF(x_i, x_j) is the Gaussian distance on plane V; V = 1, 2, or 3, denoting the transverse, coronal, and sagittal planes respectively; ω_1, ω_2, ω_3 are the weighting coefficients of the transverse, coronal, and sagittal planes; D¹_RBF, D²_RBF, D³_RBF denote the Gaussian distances of the transverse, coronal, and sagittal planes respectively; x_i, x_j are the i-th and j-th samples of the training-sample features; σ is a hyperparameter determined during training; k indexes the features of a feature vector; and N is the total number of training samples.
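
The kernel-fusion step can be sketched as a weighted sum of per-plane RBF kernels. Assuming the per-plane Gaussian distance is the squared Euclidean distance between plane feature vectors (the exact expression is not recoverable from the extracted text):

```python
import numpy as np

def fused_gaussian_kernel(planes_i, planes_j, weights, sigma):
    """Weighted sum of per-plane Gaussian (RBF) kernels.

    planes_i, planes_j: sequences of three 1-D feature vectors, one per
    orthogonal plane (transverse, coronal, sagittal).
    weights: the three weighting coefficients w1, w2, w3.
    """
    k = 0.0
    for xv_i, xv_j, w in zip(planes_i, planes_j, weights):
        d = np.sum((xv_i - xv_j) ** 2)        # squared Euclidean distance
        k += w * np.exp(-d / (2.0 * sigma ** 2))
    return k

# Sanity check: for identical samples every exponential is 1, so the
# kernel value is just the sum of the weights.
x = [np.ones(3), np.zeros(3), np.full(3, 2.0)]
k_same = fused_gaussian_kernel(x, x, (0.5, 0.3, 0.2), sigma=1.0)
```

Because each plane gets its own weight, the three sections can contribute unequally to the fused similarity, which is what the abstract credits for the improved discrimination.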

[0060] In step S6, the classification function is the kernel support-vector-machine decision function:

[0061] f(x) = sgn( Σ_{i=1}^{N} α_i·y_i·K(x_i, x) + b )

[0062] where α_i indicates the weight coefficient o...
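
A minimal sketch of such a kernel-SVM decision function, with a linear kernel standing in for the fused kernel above (the weights, labels, and bias here are illustrative, not trained values):

```python
import numpy as np

def svm_decision(x, support_vectors, alphas, labels, bias, kernel):
    """Standard kernel-SVM decision:
    f(x) = sign( sum_i alpha_i * y_i * K(x_i, x) + b )."""
    s = bias
    for x_i, a_i, y_i in zip(support_vectors, alphas, labels):
        s += a_i * y_i * kernel(x_i, x)
    return 1 if s >= 0 else -1

lin = lambda u, v: float(np.dot(u, v))    # stand-in kernel
svs = [np.array([1.0, 0.0]), np.array([-1.0, 0.0])]
alphas, labels, b = [1.0, 1.0], [1, -1], 0.0
pred = svm_decision(np.array([0.5, 0.2]), svs, alphas, labels, b, lin)
```

In the patent's method, `kernel` would be the weighted three-plane Gaussian kernel of step S5.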



Abstract

The invention discloses a depth feature bag-based classification method. The method comprises the following steps: extracting depth features from representative images of the two-dimensional orthogonal planes; recoding the extracted features with a bag-of-features model; and finally combining the features of the three two-dimensional orthogonal planes through kernel fusion to solve the classification function that classifies and labels the images. The method greatly decreases the amount of computation in training and testing, saves data space, and shortens the whole computation process. Encoding the image features with a codebook yields a high-dimensional sparse representation, making the features more discriminative and more compact and easing the over-fitting caused by insufficient data. Because the three two-dimensional orthogonal planes are multiplied by different weight coefficients when the kernel function is computed, the spatial information carried by the three different sections is better exploited and the constructed classification function is more discriminative.
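
The codebook recoding described above is, in the common bag-of-features formulation, a nearest-codeword assignment followed by a histogram. A minimal sketch (in practice the codebook would come from clustering the training features, e.g. with k-means):

```python
import numpy as np

def encode_bof(features, codebook):
    """Bag-of-features encoding: assign each local feature to its nearest
    codeword and return the L1-normalised histogram of assignments."""
    # Squared distances from every feature to every codeword.
    d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    assignments = d2.argmin(axis=1)
    hist = np.bincount(assignments, minlength=len(codebook)).astype(float)
    return hist / max(hist.sum(), 1.0)

# Two codewords; three features, two near the first codeword, one near
# the second, so the code is [2/3, 1/3].
codebook = np.array([[0.0, 0.0], [10.0, 10.0]])
feats = np.array([[0.1, 0.0], [0.0, 0.2], [9.8, 10.1]])
code = encode_bof(feats, codebook)
```

The resulting histogram is the compact, sparse representation fed to the kernel-fusion SVM.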

Description

Technical Field

[0001] The invention relates to the technical field of medical image classification and prediction, and in particular to a classification method based on bags of deep features.

Background

[0002] In recent years, with the rapid development of image processing technology and machine learning methods, medical image processing has attracted increasing attention. Many studies have shown that, based on patient medical images including MRI (magnetic resonance imaging), CT (computed tomography), and PET (positron emission tomography) images, methods such as pattern recognition and machine learning can achieve benign-malignant tumor classification, preoperative prediction, and prognosis analysis, providing powerful support for clinical decision-making.

[0003] Deep learning methods have powerful learning capabilities and have achieved great success in image processing, object detection, and other fields, while convolutional neural netw...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06K9/62
CPC: G06T7/0012; G06T2207/20084; G06T2207/20081; G06T2207/10104; G06T2207/10116; G06T2207/10081; G06T2207/10088; G06T2207/30068; G06T2207/30064; G06T2207/30081; G06F18/23213; G06F18/2413
Inventors: 张煜; 罗嘉秀; 宁振源
Owner SOUTHERN MEDICAL UNIVERSITY