Method, device and system for detecting object in image

A method and technology for detecting objects in images, applied in neural learning methods, biological neural network models, instruments, etc. It addresses problems such as poor detection results, low robustness of hand-crafted features, and incomplete detection of targets.

Active Publication Date: 2020-02-04
HUAZHONG UNIV OF SCI & TECH
Cites: 4 · Cited by: 10

AI Technical Summary

Problems solved by technology

[0003] At present, salient object detection is mostly based on 2D or 3D data. When the target and background in the image are complex, problems such as incomplete detection of the target occur.
[0004] In addition, existing light field saliency detection models all rely on hand-crafted features, extracting cues such as degree of focus and background probability from the focus stack. These light-field saliency cues are used as additional features to be fused with the RGB features of the all-in-focus image and the depth features of the depth map, or participate in the fusion of traditional features as weighting coefficients. Such simple fusion of light field features with traditional features inherits the low robustness of hand-crafted features and therefore yields poor detection results.




Embodiment Construction

[0071] In order to make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the present invention.

[0072] The terms "first", "second", "third", "fourth", etc. (if any) in the description and claims of the present invention and in the above-mentioned drawings are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence. It should be understood that the data used in this way can be interch...



Abstract

The invention provides a method, a device and a system for detecting an object in an image. The method comprises the following steps: a focus stack of a scene is acquired, where the focus stack comprises focusing slices of the same scene focused on different depth planes; multi-level feature extraction is performed on the focus stack through a deep convolutional neural network to obtain L levels of features of the focus stack, where L is a natural number greater than 1; fusion processing is performed on each level of features through a convolutional long short-term memory (ConvLSTM) model to obtain L levels of focusing fusion features of the focus stack; multi-level feature fusion is performed on the L levels of focusing fusion features to obtain target focusing fusion features; and convolution processing is performed on the target focusing fusion features, whose output is passed through an activation function to obtain a saliency map. The accuracy and robustness of object detection in images of complex scenes are thereby improved.
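The pipeline described above can be sketched in PyTorch as an illustrative approximation, not the patented implementation. The class names `FocusStackSaliency` and `ConvLSTMCell`, the number of levels, the channel widths, and the backbone layers are assumptions chosen for a minimal, self-contained example:

```python
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """Minimal ConvLSTM cell: all four gates computed by one convolution."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.conv = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        gates = self.conv(torch.cat([x, h], dim=1))
        i, f, o, g = gates.chunk(4, dim=1)
        c = f.sigmoid() * c + i.sigmoid() * g.tanh()
        h = o.sigmoid() * c.tanh()
        return h, c

class FocusStackSaliency(nn.Module):
    """Sketch of the pipeline: per-slice CNN features at L levels ->
    ConvLSTM fusion across focus slices at each level -> multi-level
    fusion -> 1x1 convolution + sigmoid activation -> saliency map."""
    def __init__(self, levels=3, base=16):
        super().__init__()
        chs = [base * 2 ** l for l in range(levels)]
        self.stages = nn.ModuleList()
        in_ch = 3
        for ch in chs:  # small stand-in for a deep CNN backbone
            self.stages.append(nn.Sequential(
                nn.Conv2d(in_ch, ch, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2)))
            in_ch = ch
        self.fusers = nn.ModuleList([ConvLSTMCell(ch, ch) for ch in chs])
        self.head = nn.Conv2d(sum(chs), 1, 1)

    def forward(self, stack):              # stack: (N_slices, 3, H, W)
        n, _, H, W = stack.shape
        feats, x = [], stack
        for stage in self.stages:          # multi-level feature extraction
            x = stage(x)
            feats.append(x)
        fused = []
        for f, cell in zip(feats, self.fusers):
            h = torch.zeros(1, cell.hid_ch, f.shape[2], f.shape[3])
            c = torch.zeros_like(h)
            for t in range(n):             # fuse slices sequentially
                h, c = cell(f[t:t + 1], (h, c))
            fused.append(nn.functional.interpolate(
                h, size=(H, W), mode='bilinear', align_corners=False))
        # multi-level fusion, then convolution + sigmoid activation
        return torch.sigmoid(self.head(torch.cat(fused, dim=1)))
```

For example, `FocusStackSaliency()(torch.randn(5, 3, 64, 64))` fuses a five-slice focus stack into a single `(1, 1, 64, 64)` saliency map with values in [0, 1].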

Description

Technical field

[0001] The present invention relates to the technical field of computer vision and image processing, in particular to a method, device and system for detecting objects in images.

Background technique

[0002] With the development of technology, image processing is gradually permeating all areas of daily life. Light field cameras are constantly improving; light field data can be obtained through a micro-lens array placed in front of the image sensor, which provides a new approach to image saliency analysis.

[0003] At present, salient object detection is mostly based on 2D or 3D data. When the target and background in the image are complex, problems such as incomplete detection of the target may occur.

[0004] In addition, light field saliency detection models rely on hand-crafted features, extracting cues such as degree of focus and background probability from the focus stack. These light-field saliency cues are used as additional features to be fused with the R...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V10/40; G06N3/045; G06F18/253
Inventors: 杨铀, 刘琼, 李贝
Owner: HUAZHONG UNIV OF SCI & TECH