A spiking neural network target tracking method and system based on event camera

A spiking neural network target tracking technology in the field of target tracking. It addresses the problems of limited feature extraction ability, limited real-time tracking performance, and strong sensitivity to illumination, and achieves the effects of reducing the amount of data transmitted, reducing computation latency, and improving tracking accuracy.

Active Publication Date: 2022-07-08
ZHEJIANG LAB


Problems solved by technology

[0003] Correlation-filtering-based methods are fast, but their feature extraction ability is limited and they perform poorly under scale change and target loss.
Deep-learning-based methods have good feature representation ability and higher tracking accuracy, but the accompanying increase in computation limits real-time tracking, and they are strongly affected by lighting, making them unsuitable for high-speed, high-dynamic scenes.




Detailed Description of the Embodiments

[0035] To make the objectives, technical solutions, and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0036] The present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the related invention, not to limit it. In addition, it should be noted that, for convenience of description, only the parts related to the related inv...



Abstract

The invention belongs to the field of target tracking, and in particular relates to a spiking neural network target tracking method and system based on an event camera. The method includes: acquiring an asynchronous event data stream of a target in a high-dynamic scene through an event camera; dividing the asynchronous event data stream into millisecond-level event frame images with high temporal resolution; taking the target image as the template image and the complete image as the search image, and training a Siamese network based on a spiking neural network, which comprises a feature extractor and a cross-correlation calculator, where images pass through the feature extractor to obtain feature maps and the cross-correlation calculator computes the correlation between the feature maps; and using the trained network to interpolate and upsample the cross-correlation result to obtain the position of the target in the original image, thereby achieving target tracking. The invention reduces the transmission delay of image data and the computation delay of the target tracking algorithm, and improves tracking accuracy in high-dynamic scenes.
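The pipeline described in the abstract (event binning into frames, Siamese cross-correlation matching, response-map upsampling) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the spiking-neural-network feature extractor is omitted (features are used directly), and the event format `(t_us, x, y, polarity)` and all function names (`events_to_frames`, `cross_correlate`, `upsample_bilinear`) are hypothetical.

```python
import numpy as np

def events_to_frames(events, dt_ms, height, width):
    """Bin an asynchronous event stream into fixed-interval event-frame
    images. `events` is an (N, 4) array of (t_us, x, y, polarity) rows
    sorted by timestamp in microseconds; polarity > 0 increments a pixel,
    polarity <= 0 decrements it. (Assumed format, for illustration.)"""
    t0 = events[0, 0]
    bin_idx = ((events[:, 0] - t0) // (dt_ms * 1000)).astype(int)
    frames = np.zeros((bin_idx.max() + 1, height, width), dtype=np.float32)
    for b, x, y, p in zip(bin_idx,
                          events[:, 1].astype(int),
                          events[:, 2].astype(int),
                          events[:, 3]):
        frames[b, y, x] += 1.0 if p > 0 else -1.0
    return frames

def cross_correlate(search_feat, template_feat):
    """Dense (valid-mode) cross-correlation of a template feature map over
    a search feature map; the peak of the response indicates the target."""
    th, tw = template_feat.shape
    sh, sw = search_feat.shape
    out = np.zeros((sh - th + 1, sw - tw + 1), dtype=np.float32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(search_feat[i:i + th, j:j + tw] * template_feat)
    return out

def upsample_bilinear(resp, factor):
    """Bilinear-interpolation upsampling of the response map, so the peak
    can be located at finer resolution before mapping back to the image."""
    h, w = resp.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = resp[np.ix_(y0, x0)] * (1 - wx) + resp[np.ix_(y0, x1)] * wx
    bot = resp[np.ix_(y1, x0)] * (1 - wx) + resp[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```

In a full tracker the feature maps would come from the SNN feature extractor and have a spatial stride, so the upsampled peak index would be divided by the upsampling factor and multiplied by that stride to recover the target position in the original event frame.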

Description

Technical Field
[0001] The invention relates to the field of target tracking, and in particular to a spiking neural network target tracking method and system based on an event camera.
Background Technique
[0002] Recognition and tracking of moving objects is a hot issue in the field of computer vision, with wide applications in human-computer interaction, video tracking, visual navigation, robotics, and military guidance. At present, there are two mainstream technical routes for target tracking: correlation-filtering-based and deep-learning-based.
[0003] Correlation-filtering-based methods are fast, but their feature extraction ability is limited and they perform poorly under scale change and target loss. Deep-learning-based methods have good feature representation ability and higher tracking accuracy, but the accompanying increase in computation limits real-time trackin...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/246; G06T3/40; G06N3/04; G06N3/08
CPC: G06T7/246; G06T3/4007; G06T3/4046; G06N3/049; G06N3/08; G06T2207/20081; G06T2207/20084; G06N3/045
Inventor: 赵文一, 唐华锦, 洪朝飞, 王笑, 袁孟雯, 陆宇婧, 张梦骁, 黄恒, 潘纲
Owner: ZHEJIANG LAB