
A Method of Object Classification and Pose Detection Based on Deep Convolutional Neural Network

A deep convolutional neural network and target classification technology, applied to biological neural network models, neural architectures, image analysis, and related fields. It addresses problems such as limited detection capability, reduced signal-to-noise ratio, and the inability to characterize target pose, with the effects of improving classification accuracy, completing pose characterization, and avoiding a large amount of computation.

Active Publication Date: 2019-09-24
BEIJING INSTITUTE OF TECHNOLOGY
Cites: 5 | Cited by: 2

AI Technical Summary

Problems solved by technology

If the target is framed by an upright rectangular box, the box will contain a large amount of background, significantly lowering the signal-to-noise ratio and hindering subsequent target classification and position correction.
Moreover, because such methods ignore changes in the target's attitude angle in the image, they cannot characterize the target's pose.
For these reasons, current target detection methods have limited detection capability and represent the target state insufficiently.

Method used



Examples


Embodiment Construction

[0032] To express the purpose, technical solution, and advantages of the present invention more clearly, the invention is described in further detail below in conjunction with specific embodiments and with reference to the accompanying drawings; however, the scope of protection of the present invention is not limited to the following embodiments.

[0033] The framework for detecting the target in this embodiment is shown in Fig. 1. After the deep convolutional features of the image are obtained through the convolutional neural network, the candidate window with an attitude angle is mapped onto the feature layer to obtain a directional regional feature vector; the final detection result is then obtained by classifying this feature vector and performing prediction on it.
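The mapping step above — sampling a fixed-length feature vector from a rotated candidate window on the conv feature layer — can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the function name, the nearest-neighbor sampling, and the max-free per-cell lookup are all assumptions chosen for brevity.

```python
import numpy as np

def oriented_region_feature(feature_map, cx, cy, w, h, angle, out_size=2):
    """Pool a fixed-size feature vector from a rotated candidate window.

    feature_map: (C, H, W) conv features. (cx, cy, w, h, angle) is a
    hypothetical rotated window in feature-map coordinates, angle in
    radians. Each output cell is filled from the feature column whose
    location corresponds to that cell's center after rotation
    (nearest-neighbor sampling).
    """
    C, H, W = feature_map.shape
    cos_a, sin_a = np.cos(angle), np.sin(angle)
    pooled = np.zeros((C, out_size, out_size))
    for i in range(out_size):
        for j in range(out_size):
            # center of cell (i, j) in the window's local (unrotated) frame
            lx = (j + 0.5) / out_size * w - w / 2
            ly = (i + 0.5) / out_size * h - h / 2
            # rotate into feature-map coordinates
            x = int(round(cx + lx * cos_a - ly * sin_a))
            y = int(round(cy + lx * sin_a + ly * cos_a))
            if 0 <= x < W and 0 <= y < H:
                pooled[:, i, j] = feature_map[:, y, x]
    # flatten into the "directional regional feature vector"
    return pooled.reshape(-1)
```

Because the window carries its own attitude angle, the pooled cells follow the target's orientation rather than the image axes, which is what keeps background pixels out of the feature vector.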

[0034] The data sets used in this example are partially derived from the public image databases PASCAL VOC 2007 and PASCAL VOC2012. In addition, we have ...



Abstract

The invention belongs to the field of image processing and target detection, and discloses a target classification and attitude detection method based on a deep convolutional neural network. The core of the method comprises: 1. a method for making a sample set; 2. a method for generating candidate windows; 3. a method for feature extraction; 4. a method for network training. After the image to be detected passes through the convolutional neural network to obtain deep convolutional features, the candidate window with an attitude angle is mapped onto the feature layer to obtain a directional regional feature vector, and the final detection result is obtained by classifying this feature vector and performing prediction on it. This method extracts purer target features from samples, improves classification accuracy, and realizes detection of the target's attitude angle.

Description

Technical field

[0001] The invention belongs to the field of image processing and target detection, and discloses a target classification and attitude detection method based on a deep convolutional neural network.

Background technique

[0002] Object detection needs to accurately mark the location of an object on a given image and identify its type. The variable size of the target, its variable position in the image, its variable attitude angle, and changes in background illumination all cause difficulties in detection and reduce detection accuracy.

[0003] Classical target detection methods usually combine template matching with a sliding window to recognize and locate targets in an image; this takes a long time and is less effective when the target changes drastically. The emergence of convolutional neural networks has advanced this topic, which has greatly...
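The classical template-matching-plus-sliding-window baseline mentioned in the background can be sketched as below. This is a toy illustration of the general technique, not code from the patent; the function name and the normalized cross-correlation score are assumptions.

```python
import numpy as np

def sliding_window_match(image, template, stride=1):
    """Slide a template over a grayscale image and score each window by
    normalized cross-correlation; return the best window's top-left
    corner and its score. Every window is scored, which is why this
    classical approach is slow on large images.
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum()) + 1e-8
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(0, ih - th + 1, stride):
        for x in range(0, iw - tw + 1, stride):
            win = image[y:y + th, x:x + tw]
            w = win - win.mean()
            score = (w * t).sum() / (np.sqrt((w ** 2).sum()) * t_norm + 1e-8)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```

Because the template is fixed, the score degrades quickly when the target's scale, pose, or appearance changes — the weakness the CNN-based method in this patent is designed to overcome.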

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06T7/00; G06N3/04
CPC: G06T7/0002; G06T2207/10004; G06T2207/20081; G06N3/045
Inventor: 刘明, 杜浩源, 董立泉, 赵跃进, 刘小华, 惠梅, 孔令琴
Owner: BEIJING INSTITUTE OF TECHNOLOGY