
Multi-target tracking method and system suitable for embedded terminal

A multi-target tracking technology for embedded terminals, applied in neural learning methods, character and pattern recognition, image enhancement, etc. It solves problems such as the inability of existing methods to achieve real-time tracking and their limitation by hardware equipment, and achieves the effects of a reduced computing scale, low hardware performance requirements, and high practicability and market promotion value.

Active Publication Date: 2021-06-25
安徽科大擎天科技有限公司

AI Technical Summary

Problems solved by technology

[0006] In order to overcome the problems in the prior art, the present invention provides an improved multi-target tracking method, which solves the problem that existing multi-target tracking methods are limited by hardware equipment and cannot achieve a real-time tracking effect.



Examples


Embodiment 1

[0074] As shown in Figure 1, this embodiment provides a multi-target tracking method suitable for embedded terminals, and the multi-target tracking method includes the following steps:

[0075] S1: Split the video of the monitored and tracked objects into frames to obtain several consecutive frame images, and use the obtained consecutive frame images as the target images for processing;
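A minimal sketch of the framing operation in S1, assuming OpenCV is available; the function name and the optional sampling stride are illustrative choices, not specified by the patent.

```python
import cv2

def split_video_into_frames(video_path, stride=1):
    """Split a surveillance video into consecutive frame images (step S1)."""
    capture = cv2.VideoCapture(video_path)
    frames = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:                  # end of the video stream
            break
        if index % stride == 0:     # keep every `stride`-th frame as a target image
            frames.append(frame)    # BGR ndarray used as one target image
        index += 1
    capture.release()
    return frames
```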

[0076] S2: Build an improved multi-target tracking neural network model. The multi-target tracking neural network model includes a detector and a tracker. The detector uses the YOLOv4 network as its basic network and replaces the CSPDarkNet53 feature extraction network in the YOLOv4 structure with the lightweight backbone network ShuffleNetV1; the tracker uses the Deep-SORT network as its basic network, and the feature extraction originally performed by the tracker's convolutional neural network is replaced with the features extracted by the ShuffleNetV1 network.

[0077] The CSPDarkN...
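As a rough illustration of the backbone replacement described in step S2, the sketch below wraps a ShuffleNet from torchvision as a multi-scale feature extractor, assuming PyTorch and a recent torchvision. torchvision ships ShuffleNetV2 rather than the ShuffleNetV1 named in the patent, so it is used here only as a stand-in, and the YOLOv4 detection heads and the patented network details are omitted.

```python
import torch
import torch.nn as nn
from torchvision.models import shufflenet_v2_x1_0

class LightweightBackbone(nn.Module):
    """ShuffleNet feature extractor standing in for the replaced CSPDarkNet53."""

    def __init__(self):
        super().__init__()
        net = shufflenet_v2_x1_0(weights=None)  # V2 used as a stand-in for ShuffleNetV1
        # Keep the stage-wise layers so intermediate feature maps can feed
        # multi-scale detection heads, YOLO-style (heads omitted here).
        self.stem = nn.Sequential(net.conv1, net.maxpool)
        self.stage2, self.stage3, self.stage4 = net.stage2, net.stage3, net.stage4

    def forward(self, x):
        x = self.stem(x)
        c3 = self.stage2(x)   # higher-resolution features
        c4 = self.stage3(c3)  # mid-level features
        c5 = self.stage4(c4)  # deepest features; reusable as appearance features for the tracker
        return c3, c4, c5

features = LightweightBackbone()(torch.randn(1, 3, 416, 416))
print([f.shape for f in features])  # three feature maps at decreasing resolution
```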

Embodiment 2

[0126] As shown in Figure 5, this embodiment also provides a multi-target tracking system suitable for embedded terminals. The system adopts the aforementioned multi-target tracking method suitable for embedded terminals: based on the consecutive target images obtained after video framing, target objects are identified, detected, and continuously tracked, and the detected and tracked target objects are matched and associated. The multi-target tracking system includes a video preprocessing module, a multi-target tracking neural network module, an association cost matrix building module, and a cascade matching module.

[0127] The video preprocessing module is used to split the video used for monitoring and tracking objects into frames, and to take the consecutive frame images obtained after framing as the target images for multi-target tracking, forming a sample data set.

[0128] The multi-target tracking neural network module includes a dete...
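A rough skeleton of how the four modules of Embodiment 2 could compose, assuming Python; the class and method names are hypothetical, and only the four-module split itself comes from the text.

```python
class MultiTargetTrackingSystem:
    """Composes the four modules named in Embodiment 2 (names are illustrative)."""

    def __init__(self, video_preprocessor, tracking_network,
                 cost_matrix_builder, cascade_matcher):
        self.video_preprocessor = video_preprocessor    # splits the video into frames
        self.tracking_network = tracking_network        # detector + tracker (YOLOv4 / Deep-SORT style)
        self.cost_matrix_builder = cost_matrix_builder  # motion + appearance matching degrees
        self.cascade_matcher = cascade_matcher          # associates tracks with detections

    def run(self, video_path):
        tracks = []
        for frame in self.video_preprocessor.split(video_path):
            detections = self.tracking_network.detect(frame)     # target detection frames
            predictions = self.tracking_network.predict(tracks)  # target tracking frames
            costs = self.cost_matrix_builder.build(predictions, detections)
            tracks = self.cascade_matcher.match(costs, tracks, detections)
        return tracks
```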



Abstract

The invention relates to the field of computer vision, in particular to a multi-target tracking method and system suitable for an embedded terminal. The method comprises the following steps: S1, framing a video to obtain target images; S2, constructing a multi-target tracking neural network model comprising a detector and a tracker, wherein the detector adopts the YOLOv4 network as its basic network and replaces the backbone feature extraction network with ShuffleNetV1, and the tracker adopts the Deep-SORT network as its basic network and uses features extracted by the ShuffleNetV1 network as part of its input; S3, performing target detection on the target images to obtain a set of target detection frames; S4, performing state prediction on the tracking objects to obtain target tracking frames; S5, calculating a motion matching degree and a feature matching degree according to the results of the tracker and the detector, and constructing an association cost matrix; and S6, performing cascade matching on the association cost matrix, determining the tracking result, and realizing the multi-target tracking process. The method solves the problem that existing multi-target tracking methods are limited by hardware equipment and cannot achieve a real-time tracking effect.
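To make steps S5 and S6 concrete, the following sketch builds an association cost matrix from a motion term and an appearance term and then assigns detections to tracks, assuming NumPy and SciPy. The IoU-based motion term, the Hungarian assignment standing in for cascade matching, and the weighting parameter `lam` are assumptions for illustration; the patent's exact formulas and thresholds may differ.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def association_cost(track_boxes, det_boxes, track_feats, det_feats, lam=0.5):
    """cost[i, j] mixes a motion mismatch and an appearance mismatch."""
    cost = np.zeros((len(track_boxes), len(det_boxes)))
    for i, (t_box, t_feat) in enumerate(zip(track_boxes, track_feats)):
        for j, (d_box, d_feat) in enumerate(zip(det_boxes, det_feats)):
            motion = 1.0 - iou(t_box, d_box)            # motion matching degree (IoU stand-in)
            appearance = 1.0 - float(t_feat @ d_feat)   # cosine distance on unit-norm features
            cost[i, j] = lam * motion + (1.0 - lam) * appearance
    return cost

# Toy usage: one predicted track box vs. one detection box with identical features.
cost = association_cost(track_boxes=[[0, 0, 10, 10]], det_boxes=[[1, 1, 11, 11]],
                        track_feats=np.ones((1, 4)) / 2.0, det_feats=np.ones((1, 4)) / 2.0)
rows, cols = linear_sum_assignment(cost)  # Hungarian assignment as a stand-in for cascade matching
print(cost, list(zip(rows, cols)))
```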

Description

Technical Field

[0001] The invention relates to the field of computer vision, in particular to a multi-target tracking method and system suitable for embedded terminals.

Background Technique

[0002] In computer vision, target detection and target tracking are intersecting fields. Target detection scans and searches for targets in images and videos, locating and identifying the targets in a scene; target tracking, after the initial state of a target is obtained, accurately predicts the position and size of the target so as to obtain the motion state of the object. In recent years, with the continuous development of deep learning, convolutional neural networks have been widely used in target detection and target tracking and have achieved good results.

[0003] Target detection is mainly performed with deep learning neural networks, among which the YOLO series and SSD algorithms are representative; YOLOv4 is the fourth version of the YOLO series. After the iteration of the first three generations, the...


Application Information

IPC(8): G06T7/246; G06T7/13; G06K9/20; G06K9/46; G06N3/04; G06N3/08
CPC: G06T7/246; G06T7/13; G06N3/08; G06T2207/30241; G06T2207/10016; G06T2207/10004; G06T2207/20081; G06V10/22; G06V10/44; G06N3/045
Inventor: 刘子龙, 万森, 程腾, 张海涛, 黄凌
Owner 安徽科大擎天科技有限公司