Pulse neural network target tracking method and system based on event camera

A spiking (pulse) neural network and target tracking technology, applied in the field of target tracking, which can solve problems such as limited feature extraction capability, restricted real-time tracking, and strong sensitivity to lighting, so as to reduce the volume of transmitted data, lower computation delay, and improve tracking accuracy.

Active Publication Date: 2022-05-03
ZHEJIANG LAB +1

AI Technical Summary

Problems solved by technology

[0003] Correlation-filtering methods are fast, but their feature extraction capability is limited, and they perform poorly in the face of scale changes and target loss.
Deep-learning methods have good feature representation ability and higher tracking accuracy, but the accompanying increase in the amount of computation limits real-time tracking, and they are greatly affected by lighting, making them unsuitable for high-speed, highly dynamic scenes.



Examples


Detailed Description of the Embodiments

[0035] To make the purpose, technical solutions, and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. The described embodiments are evidently only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0036] The application is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the related invention, not to limit it. It should also be noted that, for convenience of description, only the parts related to the related invention are sho...



Abstract

The invention belongs to the field of target tracking, and particularly relates to a spiking (pulse) neural network target tracking method and system based on an event camera. The method comprises the steps of: obtaining an asynchronous event data stream of a high-dynamic scene containing the target through the event camera; dividing the asynchronous event data stream into event frame images with millisecond-level time resolution; training a twin (Siamese) network based on a spiking neural network, with the target image as the template image and the complete image as the search image, the network comprising a feature extractor and a cross-correlation calculator, where the feature extractor extracts the feature maps of the images and the cross-correlation calculator computes the cross-correlation of the feature maps; and, using the trained network, performing interpolation up-sampling on the cross-correlation result to obtain the position of the target in the original image, thereby realizing target tracking. The invention reduces the transmission delay of image data and the computation delay of the target tracking algorithm, and improves the accuracy of target tracking in high-dynamic scenes.
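
To make the tracking pipeline described in the abstract concrete, the following is a minimal Python sketch of the data flow, not the patented implementation: the asynchronous event stream is binned into millisecond-resolution event frames, features are extracted from a template (target) image and a search image, the two feature maps are cross-correlated, and the response map is up-sampled by interpolation to locate the target. The spiking neural network feature extractor is not specified in this document, so extract_features below is only a placeholder; the event tuple layout (t_us, x, y, polarity), the bin size, and all function names are illustrative assumptions.

import numpy as np

def events_to_frames(events, t_start, t_end, bin_ms, height, width):
    # Accumulate asynchronous events (t_us, x, y, polarity) into event frames
    # with millisecond-level time resolution (one frame per bin_ms).
    n_bins = int(np.ceil((t_end - t_start) / (bin_ms * 1000)))
    frames = np.zeros((n_bins, height, width), dtype=np.float32)
    for t_us, x, y, p in events:
        b = int((t_us - t_start) // (bin_ms * 1000))
        if 0 <= b < n_bins:
            frames[b, y, x] += 1.0 if p > 0 else -1.0
    return frames

def extract_features(img):
    # Placeholder for the spiking-neural-network feature extractor of the
    # twin network; a simple downsampling keeps the sketch self-contained.
    return img[::4, ::4]

def cross_correlate(template_feat, search_feat):
    # Slide the template feature map over the search feature map and compute
    # the correlation response at every valid offset.
    th, tw = template_feat.shape
    sh, sw = search_feat.shape
    out = np.zeros((sh - th + 1, sw - tw + 1), dtype=np.float32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(search_feat[i:i + th, j:j + tw] * template_feat)
    return out

def upsample_bilinear(resp, out_h, out_w):
    # Interpolation up-sampling of the response map back to image resolution.
    ys = np.linspace(0, resp.shape[0] - 1, out_h)
    xs = np.linspace(0, resp.shape[1] - 1, out_w)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, resp.shape[0] - 1)
    x1 = np.minimum(x0 + 1, resp.shape[1] - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    return ((1 - wy) * (1 - wx) * resp[np.ix_(y0, x0)]
            + (1 - wy) * wx * resp[np.ix_(y0, x1)]
            + wy * (1 - wx) * resp[np.ix_(y1, x0)]
            + wy * wx * resp[np.ix_(y1, x1)])

def track(template_img, search_img):
    # Extract features, cross-correlate, up-sample, and take the peak of the
    # response map as the approximate target position in the search image.
    resp = cross_correlate(extract_features(template_img),
                           extract_features(search_img))
    resp_full = upsample_bilinear(resp, *search_img.shape)
    y, x = np.unravel_index(np.argmax(resp_full), resp_full.shape)
    return x, y

# Illustrative usage with made-up parameters (e.g. a 346x260 event sensor):
# frames = events_to_frames(events, t_start=0, t_end=100_000, bin_ms=1,
#                           height=260, width=346)
# x, y = track(template_img=frames[0][40:80, 40:80], search_img=frames[1])

In the patented system, the feature extractor of the twin network would be a trained spiking neural network operating on the event frames, and the cross-correlation and interpolation up-sampling would be performed by that trained network rather than by these stand-ins.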

Description

Technical Field

[0001] The invention relates to the field of target tracking, and in particular to an event camera-based spiking (pulse) neural network target tracking method and system.

Background Technique

[0002] Recognition and tracking of moving targets is a hot issue in the field of computer vision, with wide applications in human-computer interaction, video tracking, visual navigation, robotics, and military guidance. At present, there are two mainstream technical routes for target tracking: based on correlation filtering and based on deep learning.

[0003] Correlation-filtering methods are fast, but their feature extraction capability is limited, and they perform poorly in the face of scale changes and target loss. Deep-learning methods have good feature representation ability and higher tracking accuracy, but the accompanying increase in the amount of computation limits real-time tracking, and they are greatly affected by li...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/246, G06T3/40, G06N3/04, G06N3/08
CPC: G06T7/246, G06T3/4007, G06T3/4046, G06N3/049, G06N3/08, G06T2207/20081, G06T2207/20084, G06N3/045
Inventor: 赵文一, 唐华锦, 洪朝飞, 王笑, 袁孟雯, 陆宇婧, 张梦骁, 黄恒, 潘纲
Owner: ZHEJIANG LAB