
Deep learning image target mapping and positioning method based on weak supervision information

A deep learning target mapping and positioning method in the field of image processing, addressing the problem that pooled feature points are insufficient to represent the original features.

Active Publication Date: 2018-11-06
PEKING UNIV
Cites: 10 · Cited by: 37

AI Technical Summary

Problems solved by technology

[0006] The disadvantage of the above-mentioned image mapping method in the prior art is that pooling the feature map by computing a global average or a global maximum leaves the pooled feature points with insufficient ability to represent the original features.
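The patent names a "global learnable pooling layer with parameters" but this page does not disclose its exact form. The sketch below, a hypothetical parametric pooling with a learnable sharpness parameter `beta` (an assumption, not the patent's construction), illustrates how a learned pooling can interpolate between the two fixed schemes criticized above:

```python
import numpy as np

def global_average_pool(fmap):
    # fmap: (C, H, W) feature map -> (C,) vector; every spatial location
    # contributes equally, so salient peaks are diluted.
    return fmap.mean(axis=(1, 2))

def global_max_pool(fmap):
    # Only the single strongest location per channel survives.
    return fmap.max(axis=(1, 2))

def learnable_global_pool(fmap, beta):
    # Hypothetical parametric pooling: a softmax-weighted average whose
    # sharpness beta would be learned by backpropagation. beta -> 0
    # recovers average pooling; beta -> inf approaches max pooling.
    C, H, W = fmap.shape
    flat = fmap.reshape(C, H * W)
    w = np.exp(beta * flat)
    w /= w.sum(axis=1, keepdims=True)
    return (w * flat).sum(axis=1)

fmap = np.random.rand(4, 7, 7)
avg = global_average_pool(fmap)
mx = global_max_pool(fmap)
mid = learnable_global_pool(fmap, beta=5.0)
# For beta > 0 the weighted mean lies between the two fixed schemes.
assert np.all(mid >= avg - 1e-9) and np.all(mid <= mx + 1e-9)
```

Because the pooling weights are differentiable in `beta`, such a layer can be trained end-to-end with the classification loss, which is consistent with the "learnable" pooling the abstract describes.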



Embodiment Construction

[0036] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary, intended only to explain the present invention, and should not be construed as limiting it.

[0037] Those skilled in the art will understand that, unless otherwise stated, the singular forms "a", "an", "said" and "the" used herein may also include plural forms. It should be further understood that the word "comprising" used in the description of the present invention refers to the presence of said features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood...



Abstract

The invention provides a deep learning image target mapping and positioning method based on weak supervision information. The method comprises the following steps: using image data with category labels to independently train two deep convolutional neural network frameworks, obtaining a classification model M1 and a classification model M2, together with the parameters of a global learnable pooling layer with parameters; using the classification model M2 to perform feature extraction on a test image to obtain a feature map, and obtaining a preliminary positioning box from the feature map through feature category mapping and a thresholding method; using a selective search method to extract candidate regions from the test image, and using the classification model M1 to screen the set of candidate boxes whose category belongs to the target object; and performing non-maximum suppression on the preliminary positioning box and the candidate boxes to obtain the final target positioning box of the test image. With this method, a global learnable pooling layer with parameters can be introduced, a better feature representation of a target category j can be learned, and a selective feature category mapping approach effectively obtains the position information of the target object in the image.
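Two steps of the pipeline above can be sketched concretely: turning a class activation map into a preliminary positioning box by thresholding, and fusing boxes with non-maximum suppression. The threshold fraction, IoU cutoff, and box conventions below are illustrative assumptions, not values taken from the patent's claims:

```python
import numpy as np

def cam_to_box(cam, frac=0.5):
    # Threshold a class activation map at frac * max and return the
    # bounding box (x1, y1, x2, y2) of the surviving region.
    mask = cam >= frac * cam.max()
    ys, xs = np.where(mask)
    return (xs.min(), ys.min(), xs.max(), ys.max())

def iou(a, b):
    # Intersection-over-union of two (x1, y1, x2, y2) boxes.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def nms(boxes, scores, thr=0.5):
    # Greedy non-maximum suppression: keep the highest-scoring box,
    # drop boxes overlapping it beyond the IoU threshold, repeat.
    order = np.argsort(scores)[::-1]
    keep = []
    while len(order):
        i = order[0]
        keep.append(i)
        order = [j for j in order[1:] if iou(boxes[i], boxes[j]) < thr]
    return keep

# Synthetic activation blob: the thresholded box tightly covers it.
cam = np.zeros((10, 10))
cam[2:5, 3:7] = 1.0
box = cam_to_box(cam)          # (3, 2, 6, 4)

# NMS merges a near-duplicate box while keeping a distant one.
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
keep = nms(boxes, np.array([0.9, 0.8, 0.7]))   # indices [0, 2]
```

In the patent's pipeline the preliminary box from the class mapping and the screened selective-search candidates would both enter the suppression step, so the final box benefits from both sources.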

Description

Technical Field [0001] The invention relates to the technical field of image processing, and in particular to a deep learning image target mapping and positioning method based on weakly supervised information. Background Technique [0002] With the development of deep learning technology, represented by deep convolutional neural networks, great breakthroughs have been made in image classification and image object recognition, triggering many influential academic studies and related industrial applications. In the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2015, the deep residual model proposed by Microsoft Research Asia won the championship with a recognition error rate of 3.57%, surpassing human recognition accuracy for the first time. [0003] The Region-based Convolutional Neural Network (R-CNN), proposed in 2014, was the first to apply a deep convolutional network to the image target detection task, and its performance was significantly im...

Claims


Application Information

IPC(8): G06K 9/62; G06N 3/04; G06N 3/08
CPC: G06N 3/084; G06V 2201/07; G06N 3/048; G06N 3/045; G06F 18/214
Inventors: 田永鸿, 李宗贤, 史业民, 曾炜, 王耀威
Owner: PEKING UNIV