
Multi-target tracking method based on radar-vision fusion

A multi-target tracking and radar target detection technology, applied in the field of intelligent traffic detection, achieving strong adaptability, enhanced comprehensiveness, and reduced frequent ID switching

Pending Publication Date: 2022-04-08
连云港杰瑞电子有限公司

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to address the shortcomings of the above-mentioned single-sensor detection and tracking methods by providing a multi-target tracking method based on radar-vision fusion: a deep-learning detection method detects video targets, cluster analysis groups radar detection points, and the radar and video detection information is fused to achieve accurate tracking of multiple targets.
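The patent names cluster analysis for grouping radar detection points but does not specify the algorithm. A minimal sketch, assuming DBSCAN over planar (x, y) radar points with illustrative parameter values:

```python
# Hedged sketch: clustering raw radar detection points into per-target groups.
# The patent only states that "cluster analysis" is used; DBSCAN and the
# parameter values below are illustrative assumptions, not the claimed method.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_radar_points(points_xy, eps_m=1.5, min_samples=2):
    """Group raw radar points (N x 2 array of x, y in metres) into clusters.

    Returns an array of cluster centroids; points labelled -1 are noise.
    """
    labels = DBSCAN(eps=eps_m, min_samples=min_samples).fit_predict(points_xy)
    centroids = []
    for label in set(labels) - {-1}:
        centroids.append(points_xy[labels == label].mean(axis=0))
    return np.array(centroids)

if __name__ == "__main__":
    pts = np.array([[10.1, 3.0], [10.3, 3.2], [25.0, -1.0], [25.2, -0.8]])
    print(cluster_radar_points(pts))  # two centroids, one per target
```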



Examples


Embodiment Construction

[0050] In order to make the purpose, technical solution and advantages of the present application clearer, the present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application, and are not intended to limit the present application.

[0051] In one embodiment, with reference to Figure 1, a multi-target tracking method based on radar-vision fusion is provided, including the following steps:

[0052] Step 1, collect radar signals and analyze them to obtain status information of radar detection points, including position and speed;
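As a minimal illustration of the per-point state produced in Step 1 (the field names are assumptions; the patent only lists position and speed):

```python
# Hypothetical container for one radar detection point from Step 1.
# The patent specifies only "position and speed"; the exact fields and units
# below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class RadarDetection:
    x: float          # longitudinal position (m)
    y: float          # lateral position (m)
    speed: float      # radial speed (m/s)
    timestamp: float  # time of the radar frame (s)
```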

[0053] Step 2, collect real-time video streams, load the target detection model, input the acquired video images into the detection model for inference, and obtain the state information of the video targets, including position, category, and size;
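A hedged sketch of Step 2, assuming the video stream is read with OpenCV and the detector is any pre-loaded callable returning boxes, classes, and scores (the patent does not name a specific network or framework):

```python
# Illustrative sketch of Step 2: pull frames from a video stream and run a
# pre-loaded detector on each one. `detector` is assumed to be a callable
# returning (boxes, classes, scores); the stream URL and output layout are
# assumptions for illustration only.
import cv2

def run_video_detection(stream_url, detector):
    """Yield per-frame video target states: position, category, and size."""
    cap = cv2.VideoCapture(stream_url)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        boxes, classes, scores = detector(frame)
        targets = []
        for (x1, y1, x2, y2), cls, score in zip(boxes, classes, scores):
            targets.append({
                "center": ((x1 + x2) / 2.0, (y1 + y2) / 2.0),  # position (px)
                "category": cls,                               # class label
                "size": (x2 - x1, y2 - y1),                    # width, height (px)
                "score": score,
            })
        yield targets
    cap.release()
```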

[0054] Step 3, use perspective transformation to transform the radar detection points and the video targets into the same coordinate system, establish data association between radar targets and video targets, fuse the radar and video information, and store it in the tracker;
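A minimal sketch of the perspective (homography) mapping in Step 3, assuming four illustrative calibration point pairs between the radar ground plane and the image plane; in practice these would come from a joint radar-camera calibration:

```python
# Hedged sketch of Step 3: map radar ground-plane coordinates into the image
# plane with a homography so radar and video targets share one coordinate
# system. The four calibration point pairs below are illustrative assumptions.
import cv2
import numpy as np

# Corresponding points: (x, y) on the road plane in metres -> (u, v) in pixels.
road_pts = np.float32([[0, 0], [0, 20], [7, 20], [7, 0]])
image_pts = np.float32([[120, 700], [580, 240], [720, 240], [1180, 700]])
H = cv2.getPerspectiveTransform(road_pts, image_pts)

def radar_to_image(radar_xy):
    """Project radar detections (N x 2, metres) into pixel coordinates."""
    pts = np.float32(radar_xy).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

print(radar_to_image([[3.5, 10.0]]))  # pixel location of one radar target
```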



Abstract

The invention discloses a multi-target tracking method based on radar-vision fusion. The method comprises the steps of: obtaining the state information of a plurality of targets in a radar frame through analysis of the radar signal; inputting an image acquired by the video monitoring equipment into a convolutional model for target identification to obtain video target state information; transforming the radar signal and the video target signal into the same coordinate system by using perspective transformation, establishing data association between a radar target and a video target, and fusing the radar and video information and storing it in a tracker; predicting the state of each target in the tracker at the next moment, associating the data information of the predicted targets and the observed targets, and carrying out track matching; and filtering and updating the associated signals by using a filtering algorithm and parameter thresholds to complete multi-target tracking. By fusing radar and video information, the method overcomes the problems of a single-sensor system, such as inaccurate radar detection of static targets, multiple detections of large targets, and the susceptibility of video-only detection to interference from ambient light, thereby realizing more comprehensive and accurate target detection.
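A hedged sketch of the prediction and track-matching step described above, assuming a constant-velocity prediction model, a Euclidean gating distance, and Hungarian assignment; the patent does not fix these particular choices:

```python
# Illustrative sketch of track matching: predict each tracked target's next
# position with a constant-velocity model, then associate predictions with
# fused observations via the Hungarian algorithm. The state layout, time step,
# and gating threshold are assumptions for illustration.
import numpy as np
from scipy.optimize import linear_sum_assignment

def predict(tracks, dt=0.1):
    """tracks: N x 4 array [x, y, vx, vy]; returns predicted (x, y)."""
    pred = tracks.copy()
    pred[:, 0] += tracks[:, 2] * dt
    pred[:, 1] += tracks[:, 3] * dt
    return pred[:, :2]

def associate(predicted_xy, observed_xy, gate=3.0):
    """Match predictions to observations; pairs farther apart than `gate`
    metres are rejected, which helps limit frequent ID switching."""
    cost = np.linalg.norm(predicted_xy[:, None, :] - observed_xy[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]

tracks = np.array([[10.0, 3.0, 1.0, 0.0], [25.0, -1.0, -0.5, 0.0]])
observations = np.array([[24.9, -1.0], [10.2, 3.0]])
print(associate(predict(tracks), observations))  # [(0, 1), (1, 0)]
```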

Description

Technical field

[0001] The invention belongs to the field of intelligent traffic detection, and in particular relates to a multi-target tracking method based on multi-source information, specifically a multi-target tracking method based on radar-vision fusion.

Background technique

[0002] In recent years, with the growing importance of "smart city" construction, traffic monitoring systems have become more and more widely used. In order to reduce the incidence of traffic accidents, traffic casualties, and traffic losses, it is necessary to accurately identify and track targets in the monitored area.

[0003] Traditional target tracking is based on the analysis of signals collected by a single sensor, which has various drawbacks. Millimeter-wave radar detects and tracks targets by transmitting and receiving modulated continuous waves in the detection area, and can obtain target position and speed information relatively accurately. When the target is stationary or large in size, however, the signal dis...

Claims


Application Information

IPC(8): G06V20/40, G06K9/62, G06V10/80, G06V10/774
Inventors: 颜耀, 乜灵梅, 张宇杰, 孙浩凯, 纪彬, 佟世继, 章涛涛, 刘昌杰, 李元青, 赵忠刚
Owner 连云港杰瑞电子有限公司