Improved few-sample target detection method based on Faster RCNN

A target detection method in the fields of deep-learning target detection and few-sample (few-shot) learning. It addresses the problems that existing target detection methods cover a single task and a single application scenario, which restricts their application and promotion, and achieves the effects of improving detection performance, improving detection accuracy, and reducing within-class variance.

Pending Publication Date: 2022-06-24
INST OF OPTICS & ELECTRONICS - CHINESE ACAD OF SCI
Cites: 0 | Cited by: 3

AI Technical Summary

Problems solved by technology

Although many mature target detection algorithms have been applied in practice, their disadvantages have also begun to emerge. One of the bigger problems is that most mature algorithms require large-scale labeled data, while in the vast majority of practical application scenarios, collecting labeled data that meets the requirements ...



Examples


Embodiment Construction

[0027] The specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings. However, the following examples serve only to explain the present invention; the protection scope of the present invention includes the full content of the claims, and through the following examples, those skilled in the art can realize the full content of the claims of the present invention.

[0028] In this specific embodiment, the few-sample target detection method mainly includes the following steps:

[0029] Step 1: Input Image

[0030] The input images are divided into support set images and query set images. The query set images are the images to be detected and contain unlabeled target samples, while the support set images are a small number of images containing labeled target samples.
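As a concrete illustration of this split, the sketch below builds an N-way K-shot support set and a query set from labeled data. The function name, dataset layout, and parameters are hypothetical, chosen only to illustrate the support/query division, not taken from the patent.

```python
import random

def make_episode(dataset, n_way=3, k_shot=5, n_query=10, seed=0):
    """Split labeled data into a support set (a few labeled samples per
    class) and a query set (images to be detected).

    `dataset` maps class name -> list of image ids (hypothetical layout).
    """
    rng = random.Random(seed)
    classes = rng.sample(sorted(dataset), n_way)
    support, query = {}, {}
    for c in classes:
        imgs = rng.sample(dataset[c], k_shot + n_query)
        support[c] = imgs[:k_shot]   # small labeled support set
        query[c] = imgs[k_shot:]     # images treated as queries to detect
    return support, query
```

In few-shot detection benchmarks, such episodes are drawn repeatedly so that the detector learns to localize a class from only the K support examples.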

[0031] Step 2: Feature Extraction

[0032] The support set image and the query image to be detected are regarded as the support image ...
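The CBAM attention module mentioned in the abstract (applied to such feature maps before the RPN) can be sketched roughly as follows. This is a simplified NumPy illustration with hypothetical weight shapes: channel attention uses a shared two-layer MLP over average- and max-pooled descriptors, and the spatial branch replaces CBAM's 7x7 convolution with a plain average of the channel-wise pooled maps for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam(feat, w1, w2):
    """Simplified CBAM sketch on a (C, H, W) feature map.

    w1: (C//r, C) and w2: (C, C//r) form the shared channel-attention MLP
    (shapes are illustrative assumptions). The spatial branch here is a
    simplification of CBAM's 7x7 conv over concatenated pooled maps.
    """
    avg = feat.mean(axis=(1, 2))                    # (C,) avg-pooled descriptor
    mx = feat.max(axis=(1, 2))                      # (C,) max-pooled descriptor
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)    # shared 2-layer MLP (ReLU)
    ch_att = sigmoid(mlp(avg) + mlp(mx))            # (C,) channel attention
    feat = feat * ch_att[:, None, None]
    # spatial attention from channel-wise avg/max maps (simplified)
    sp_att = sigmoid(0.5 * (feat.mean(axis=0) + feat.max(axis=0)))  # (H, W)
    return feat * sp_att[None, :, :]
```

Because both attention maps pass through a sigmoid, each output element is a damped version of the input, which is what lets the RPN emphasize category-relevant regions.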



Abstract

The invention provides an improved few-sample target detection method based on Faster RCNN (Faster Region-based Convolutional Neural Network). On the basis of the traditional target detection framework Faster RCNN, deep optimization and improvement are carried out in combination with a CBAM attention module, a global-local relation detector, and a classifier based on cosine Softmax loss, so that Faster RCNN is better suited to few-sample target detection. The CBAM attention module is combined with the RPN network in Faster RCNN to form the attention-based CBAM-Attention-RPN network, which facilitates the generation of candidate boxes of a specific category and improves the precision of the subsequent network. The method provides a global-local relation detector that performs feature matching between the support image features and the query image features using both global and local relations, so that candidate boxes more relevant to the target category can be obtained. The method further provides a classifier based on cosine Softmax loss as the classification branch, which reduces intra-class variance and improves the detection precision of new classes.
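The cosine-Softmax classification branch described above can be sketched as follows. L2-normalizing both the RoI features and the class weight vectors makes the logits scaled cosine similarities, which bounds per-class logits and tends to reduce intra-class variance compared with a plain dot-product Softmax. This is an illustrative NumPy sketch, not the patent's exact head; `scale` is an assumed temperature value.

```python
import numpy as np

def cosine_softmax(features, weights, scale=20.0):
    """Cosine-Softmax head sketch.

    features: (N, D) RoI feature vectors; weights: (K, D) class weights.
    Logits are scale * cosine(feature, class weight), so only the
    direction of a feature matters, not its magnitude.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    logits = scale * f @ w.T                        # (N, K), each in [-scale, scale]
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)
```

During training, the usual cross-entropy loss would be applied to these probabilities; the bounded logits keep same-class features clustered on the unit hypersphere.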

Description

Technical field

[0001] The invention relates to the fields of deep-learning target detection and few-sample learning, in particular to an improved few-sample target detection method based on Faster RCNN.

Background technique

[0002] As an important computer vision task, object detection aims to find the objects of interest in an image and determine their location and category, and it is the basis of many other computer vision tasks. In recent years, with the emergence of powerful computing equipment, large-scale data sets, and advanced models and algorithms, object detection based on deep learning has developed rapidly and gradually replaced traditional detection methods. Object detection is now widely used in many practical applications, such as autonomous driving, robot vision, and video surveillance. Although many mature target detection algorithms have been applied in practice, their drawbacks have also begun to emerge. One of the major problems is tha...

Claims


Application Information

IPC(8): G06V10/764; G06V10/42; G06V10/44; G06V10/74; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/048; G06N3/045; G06F18/22; G06F18/2415
Inventor: 江彧, 杜芸彦, 毛耀, 李鸿, 杨锦辉, 刘超, 彭锦锦
Owner INST OF OPTICS & ELECTRONICS - CHINESE ACAD OF SCI