Target detection method based on global and local information fusion

A target detection technology based on the fusion of global and local information, applied to computer components, character and pattern recognition, instruments and the like; it addresses the problem that scene-level features are ignored by existing detection methods.

Active Publication Date: 2020-04-28
NORTHEAST NORMAL UNIVERSITY

AI Technical Summary

Problems solved by technology

However, scene-level features are ignored by existing detection methods.

Examples

Experimental program
Comparison scheme
Effect test

Experiment example

[0099] In order to evaluate the proposed method effectively and systematically, a large number of object detection experiments were carried out on two standard databases, PASCAL VOC and MS COCO 2014. PASCAL VOC comprises the VOC 2007 and VOC 2012 datasets. The PASCAL VOC 2007 dataset contains 9,963 annotated images divided into three parts (train / val / test), with a total of 24,640 annotated objects. The train / val / test splits of VOC 2012 contain all corresponding images from 2008 to 2011, and its train+val split has 11,540 images with a total of 27,450 annotated objects. Compared with PASCAL VOC, the images in MS COCO 2014 cover natural scenes and common everyday objects, and the dataset is organized into two parts (train / minival). The image backgrounds in this database are relatively complex, the number of objects per image is larger, and the objects are smaller, so the task on MS COCO 2014 is more difficult and challenging. Figures 1 and 2 show partial images from the t...
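As a rough illustration of this experimental setup, the sketch below shows one way the benchmark datasets could be loaded with torchvision for evaluation. The root directories and the minival annotation path are hypothetical placeholders; the patent does not prescribe any particular loading code.

```python
# Hypothetical dataset-loading sketch for the evaluation setup described above.
# Paths are placeholders; torchvision's VOCDetection and CocoDetection are used
# only as one convenient way to access PASCAL VOC 2007/2012 and MS COCO 2014.
from torchvision import datasets, transforms

to_tensor = transforms.ToTensor()

# PASCAL VOC 2007: 9,963 annotated images split into train / val / test.
voc07_test = datasets.VOCDetection(
    root="data/VOCdevkit", year="2007", image_set="test", transform=to_tensor
)

# PASCAL VOC 2012: train+val with 11,540 images (27,450 annotated objects).
voc12_trainval = datasets.VOCDetection(
    root="data/VOCdevkit", year="2012", image_set="trainval", transform=to_tensor
)

# MS COCO 2014: train / minival split; backgrounds are more complex and
# objects are smaller, which makes this benchmark harder.
coco14_minival = datasets.CocoDetection(
    root="data/coco/val2014",
    annFile="data/coco/annotations/instances_minival2014.json",
    transform=to_tensor,
)

print(len(voc07_test), len(voc12_trainval), len(coco14_minival))
```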

Abstract

The invention relates to a target detection method based on global and local information fusion, and belongs to the field of video image processing. The method comprises the following steps: first, the scene is sent into a convolutional neural network whose memory ability has been increased, so that the network learns scene context information better and global scene features are obtained; second, relationships between objects are established adaptively with reference to an attention mechanism, so that local object features are obtained; finally, the scene features and the object features are fused through information transmission to enhance the feature representation. The method has the advantage that global scene features and local object features are considered at the same time, and target features are represented better through information transmission; a large number of comparative experiments show that the detection performance of the method is clearly superior to that of other target detection methods.
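The patent text does not include reference code, so the following is only a minimal sketch, assuming a PyTorch-style implementation, of how a pooled global scene feature and attention-related local object features might be fused into enhanced object features. The module and parameter names (SceneObjectFusion, embed_dim, num_heads) are hypothetical and are not taken from the patent.

```python
# Hypothetical PyTorch sketch of fusing a global scene feature with
# attention-related local object features; not the patented architecture itself.
import torch
import torch.nn as nn


class SceneObjectFusion(nn.Module):
    def __init__(self, embed_dim=256, num_heads=4):
        super().__init__()
        # Global branch: pool the backbone feature map into one scene vector.
        self.scene_pool = nn.AdaptiveAvgPool2d(1)
        self.scene_proj = nn.Linear(embed_dim, embed_dim)
        # Local branch: self-attention adaptively relates object (RoI) features.
        self.obj_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Fusion: pass scene information into every object feature.
        self.fuse = nn.Linear(2 * embed_dim, embed_dim)

    def forward(self, feat_map, obj_feats):
        # feat_map: (B, C, H, W) backbone feature map of the whole scene
        # obj_feats: (B, N, C) per-object features, e.g. from RoI pooling
        scene = self.scene_pool(feat_map).flatten(1)                  # (B, C)
        scene = self.scene_proj(scene)                                # (B, C)
        related, _ = self.obj_attn(obj_feats, obj_feats, obj_feats)   # (B, N, C)
        scene_tiled = scene.unsqueeze(1).expand_as(related)           # (B, N, C)
        fused = self.fuse(torch.cat([related, scene_tiled], dim=-1))
        return fused  # enhanced object features


if __name__ == "__main__":
    m = SceneObjectFusion()
    out = m(torch.randn(2, 256, 32, 32), torch.randn(2, 10, 256))
    print(out.shape)  # torch.Size([2, 10, 256])
```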

Description

technical field

[0001] The invention belongs to the field of video image processing, and in particular relates to a target detection method based on the fusion of global and local information.

Background technique

[0002] Object detection has a wide range of applications in autonomous driving, robotics, video surveillance, pedestrian detection and other areas, and is a research hotspot in computer vision and machine learning. Classic object detection techniques are mainly based on hand-crafted features and can be divided into three steps: (1) selection of object regions; (2) feature extraction; (3) classification. In the first step, a sliding-window strategy is widely adopted to perform an exhaustive search for candidate regions, using sliding windows with different scales and aspect ratios. The second step analyzes the candidate regions; a variety of techniques can be used for feature extraction, such as traditional descriptors like the scale-invariant feature...
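To make the three-step classical pipeline of paragraph [0002] concrete, the sketch below illustrates a sliding-window search, a hand-crafted descriptor, and a classifier. HOG and a linear SVM are stand-in choices for illustration only; the window size, stride, and threshold are hypothetical, since the background section names only the general steps.

```python
# Illustrative sketch of the classical three-step pipeline described above:
# (1) sliding-window region selection, (2) hand-crafted feature extraction,
# (3) classification. HOG + linear SVM are stand-ins for the general idea.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC


def sliding_windows(image, win=(64, 64), stride=16):
    """Yield (x, y, patch) for every window position over a grayscale image."""
    h, w = image.shape[:2]
    for y in range(0, h - win[1] + 1, stride):
        for x in range(0, w - win[0] + 1, stride):
            yield x, y, image[y:y + win[1], x:x + win[0]]


def detect(image, clf, win=(64, 64), stride=16, threshold=0.5):
    """Score every window with the classifier and keep the confident ones."""
    detections = []
    for x, y, patch in sliding_windows(image, win, stride):
        feat = hog(patch, orientations=9, pixels_per_cell=(8, 8),
                   cells_per_block=(2, 2))           # step (2): feature extraction
        score = clf.decision_function([feat])[0]     # step (3): classification
        if score > threshold:
            detections.append((x, y, win[0], win[1], score))
    return detections


# Toy usage: train a linear SVM on random features, then run detection.
rng = np.random.default_rng(0)
dummy_feats = rng.normal(size=(20, 1764))   # 1764 = HOG length for a 64x64 window
dummy_labels = rng.integers(0, 2, size=20)
clf = LinearSVC().fit(dummy_feats, dummy_labels)
print(len(detect(rng.random((128, 128)), clf)))
```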

Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V20/41, G06V20/46, G06V2201/07, G06F18/253
Inventors: 齐妙, 王建中, 张燕妮, 孔俊, 吕英华, 郑彩侠, 徐慧
Owner: NORTHEAST NORMAL UNIVERSITY