A Convolutional Neural Network Object Detection Method Based on Pyramid Input Gain

A convolutional neural network and target detection technology, applied to biological neural network models, neural architectures, image enhancement, and related fields. It addresses the problems of a high missed-detection rate and low reliability, and achieves assured accuracy, wide applicability, and reduced precision loss.

Active Publication Date: 2021-06-22
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0009] The purpose of the present invention is to propose a convolutional neural network target detection method based on pyramid input gain, aimed at the technical defects of low reliability and high missed-detection rate in existing convolutional-neural-network-based target detection methods. The method proposes a convolutional neural network model, PiaNet, based on pyramid input gain. PiaNet combines multi-scale processing and multi-task learning to effectively improve detection accuracy.



Examples


Embodiment

[0051] According to the method steps described in the summary of the invention, the PiaNet network model structure corresponding to an embodiment of the present invention for detecting pulmonary nodules in CT images is shown in Figure 1.

[0052] Step (1) data preprocessing;

[0053] The original image is preprocessed by de-meaning and grayscale normalization to obtain the preprocessed image, where the original CT input image is shown in Figure 2;
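A minimal sketch of this preprocessing step is given below. The patent only names de-meaning and grayscale normalization; the use of min-max scaling to [0, 1], the operation order, and the NumPy representation are illustrative assumptions, not the patented procedure.

```python
import numpy as np

def preprocess_ct(image: np.ndarray) -> np.ndarray:
    """De-mean and grayscale-normalize a CT image, as in Step (1).

    `image` is assumed to be a 2D slice or 3D volume of raw intensities;
    the exact normalization constants are not specified in the text, so
    min-max scaling is used here purely for illustration.
    """
    image = image.astype(np.float32)

    # Grayscale normalization: rescale intensities to the [0, 1] range.
    lo, hi = image.min(), image.max()
    if hi > lo:
        image = (image - lo) / (hi - lo)

    # De-meaning: subtract the mean intensity so the input is zero-centered.
    return image - image.mean()
```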

[0054] Step (2) inputs the preprocessed image produced in Step (1) into the PiaNet network;

[0055] Step (2) comprises the following sub-steps:

[0056] Step (2A) The input image is subjected to an average pooling operation on the source connection path to obtain a compressed source image. The multi-scale source images generated by multi-level average pooling form an image pyramid, as shown in Figure 3;
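The following sketch shows one way to build such a source-image pyramid by repeated average pooling. The number of levels and the 2x2 pooling kernel are assumptions for illustration; the patent text only states that multi-level average pooling on the source connection path produces the pyramid.

```python
import torch
import torch.nn.functional as F

def build_source_pyramid(x: torch.Tensor, levels: int = 3) -> list:
    """Build the multi-scale source-image pyramid of Step (2A).

    `x` is the preprocessed input as an (N, C, H, W) tensor. Each level
    halves the spatial resolution with 2x2 average pooling (an assumed,
    illustrative choice).
    """
    pyramid = [x]
    for _ in range(levels):
        # Average pooling on the source connection path compresses the image.
        x = F.avg_pool2d(x, kernel_size=2, stride=2)
        pyramid.append(x)
    return pyramid
```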

[0057] Step (2B) At the same time, the input image undergoes feature extraction through convo...
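Step (2B) is truncated in this excerpt, so the exact way the compressed source images are injected into the convolutional feature-extraction path is not available here. The block below is a hypothetical sketch of one plausible "pyramid input gain" fusion (channel concatenation followed by convolution); it is an assumption for illustration, not the patented design.

```python
import torch
import torch.nn as nn

class PyramidGainBlock(nn.Module):
    """Hypothetical fusion of a pyramid source image with the feature map
    at the matching resolution (concatenation is an assumed mechanism)."""

    def __init__(self, in_channels: int, out_channels: int, src_channels: int = 1):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels + src_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, features: torch.Tensor, source: torch.Tensor) -> torch.Tensor:
        # The compressed source image is assumed to share the spatial size of
        # the feature map at this stage and is concatenated along channels.
        return self.conv(torch.cat([features, source], dim=1))
```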



Abstract

The invention relates to a target detection method of a convolutional neural network based on pyramid input gain, and belongs to the technical field of computer vision and target detection. The method is based on the convolutional neural network model PiaNet, which comprises a feature extraction module and a multi-task prediction module, and includes a training phase and a testing phase. The training phase adopts a two-stage transfer learning strategy, comprising: Step (1), data augmentation and data preprocessing to produce the training set for first-stage training and the training set and test set for second-stage training; Step (2), first-stage training on a binary classification network; Step (3), second-stage training to obtain the trained PiaNet network. The testing phase performs accurate target detection, specifically: the test set is input into the trained PiaNet network, and the detection box positions and classification results are output through the multi-task loss function. The method has wide applicability and high robustness.
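The abstract states only that detection-box positions and classification results are produced through a multi-task loss. The sketch below shows a common multi-task detection loss (cross-entropy for classification plus smooth L1 for box regression); the specific terms, weighting, and positive-sample masking are assumptions for illustration, not the patent's exact formulation.

```python
import torch
import torch.nn.functional as F

def multi_task_loss(cls_logits, box_preds, cls_targets, box_targets, box_weight=1.0):
    """Hedged sketch of a multi-task detection loss (assumed form)."""
    # Classification branch: cross-entropy over class logits.
    cls_loss = F.cross_entropy(cls_logits, cls_targets)

    # Box-regression branch: smooth L1, computed only on positive samples
    # (non-background), a common convention assumed here.
    pos = cls_targets > 0
    if pos.any():
        box_loss = F.smooth_l1_loss(box_preds[pos], box_targets[pos])
    else:
        box_loss = box_preds.sum() * 0.0  # keep the graph, contribute nothing

    return cls_loss + box_weight * box_loss
```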

Description

Technical Field

[0001] The invention relates to a target detection method of a convolutional neural network based on pyramid input gain, and belongs to the technical field of computer vision and target detection.

Background

[0002] Target detection refers to finding the location information, such as the specific position and size, of all targets of interest in an image. This is one of the basic problems in computer vision and pattern recognition, and is widely used in applications such as monitoring and analysis, face recognition, and nodule or tumor detection in medical CT images.

[0003] Existing object detection methods are mainly divided into two categories:

[0004] 1) Traditional object detection methods. Traditional target detection generally adopts the sliding window framework, which mainly includes steps such as image space segmentation, feature design and extraction, and classification recognition. It needs to search in several dime...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K 9/62; G06T 7/00; G06N 3/04
CPC: G06T 7/0012; G06T 2207/30064; G06T 2207/10012; G06T 2207/10081; G06N 3/045; G06F 18/24; G06F 18/214
Inventor: 刘峡壁, 刘伟华, 李慧玉
Owner: BEIJING INSTITUTE OF TECHNOLOGY