Target tracking method and system based on composite convolutional network

A target tracking technology based on convolutional neural networks, applied in the field of video target tracking to achieve a more complete feature representation, more accurate and robust tracking, and increased discriminative power

Pending Publication Date: 2022-05-31
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

Convolutional neural networks can extract multi-scale local spatial information and fuse them t...




Embodiment Construction

[0050] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings of those embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0051] In the description of the present invention, it should be understood that the terms "comprises" and "comprising" indicate the presence of the described features, integers, steps, operations, elements and/or components, but do not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.

[0052] It should also be understood that the terminology used in the descriptio...



Abstract

The invention discloses a target tracking method and system based on a composite convolutional network. The method comprises the following steps: constructing a twin (Siamese) composite convolutional feature extraction sub-network, Siam-Co-CNNs; inputting the feature maps of the template branch and the detection branch extracted by Siam-Co-CNNs into a region proposal sub-network to form a target tracking network based on the composite convolutional network; performing offline pre-training of this target tracking network using the video frame sequences in the training data set; and converting the test video into image frames, inputting them into the system, marking the target to be tracked in the first frame, performing online tracking, and obtaining and outputting the target tracking result. Experiments show that the target tracking method provided by the invention improves the accuracy and success rate of target tracking.
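The abstract outlines the pipeline (a weight-shared template/detection backbone whose feature maps feed a region proposal sub-network), but the text available here does not disclose the exact layer configuration of Siam-Co-CNNs. The PyTorch sketch below is therefore only an illustrative assumption: the "composite convolution" block is approximated by parallel multi-scale convolutions fused with a 1x1 convolution, and the region proposal sub-network by a SiamRPN-style cross-correlation head. Class names, the anchor count k, channel widths and input sizes are hypothetical.

```python
# Hypothetical sketch only: the page does not disclose the exact Siam-Co-CNNs
# layers, so the "composite convolution" block (parallel multi-scale
# convolutions fused by a 1x1 convolution) and the SiamRPN-style
# cross-correlation proposal head below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CompositeConvBlock(nn.Module):
    """Parallel 3x3 / 5x5 convolutions whose outputs are fused by a 1x1 conv."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.branch3 = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.branch5 = nn.Conv2d(in_ch, out_ch, 5, padding=2)
        self.fuse = nn.Conv2d(2 * out_ch, out_ch, 1)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        y = torch.cat([self.branch3(x), self.branch5(x)], dim=1)
        return F.relu(self.bn(self.fuse(y)))


class SiamCoCNN(nn.Module):
    """Weight-shared backbone for the template and detection branches,
    followed by a region-proposal head (classification + box regression)."""

    def __init__(self, k=5):  # k anchors per location (assumed value)
        super().__init__()
        self.k = k
        self.backbone = nn.Sequential(
            CompositeConvBlock(3, 64), nn.MaxPool2d(2),
            CompositeConvBlock(64, 128), nn.MaxPool2d(2),
            CompositeConvBlock(128, 256),
        )
        self.cls_z = nn.Conv2d(256, 256 * 2 * k, 3)  # template -> cls kernels
        self.cls_x = nn.Conv2d(256, 256, 3)
        self.reg_z = nn.Conv2d(256, 256 * 4 * k, 3)  # template -> reg kernels
        self.reg_x = nn.Conv2d(256, 256, 3)

    @staticmethod
    def _xcorr(z, x, out_ch):
        # Use the template features as correlation kernels over the detection
        # features, one group per batch element.
        b = z.size(0)
        kernel = z.view(b * out_ch, 256, z.size(2), z.size(3))
        x = x.view(1, b * 256, x.size(2), x.size(3))
        out = F.conv2d(x, kernel, groups=b)
        return out.view(b, out_ch, out.size(2), out.size(3))

    def forward(self, template, detection):
        fz = self.backbone(template)   # template-branch feature map
        fx = self.backbone(detection)  # detection-branch feature map
        cls = self._xcorr(self.cls_z(fz), self.cls_x(fx), 2 * self.k)
        reg = self._xcorr(self.reg_z(fz), self.reg_x(fx), 4 * self.k)
        return cls, reg


# Example shapes (assumed): 127x127 template crop, 255x255 search region.
net = SiamCoCNN()
z = torch.randn(1, 3, 127, 127)
x = torch.randn(1, 3, 255, 255)
cls_map, reg_map = net(z, x)  # -> [1, 2k, 33, 33] and [1, 4k, 33, 33]
```

In this reading, offline pre-training would optimize classification and regression losses over (template, detection) frame pairs sampled from the training videos, while online tracking would fix the first-frame template features and run only the detection branch and correlation head on each new frame.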

Description

Technical field

[0001] The invention belongs to the technical field of video target tracking, and in particular relates to a composite convolutional network-based target tracking method and system, which integrates techniques from fields such as image processing, feature fusion and computer science.

Background technique

[0002] Object tracking is one of the most important and challenging problems in computer vision, with a wide range of applications including video surveillance, self-driving cars, etc. Given only an annotation of the target in the first frame of the video, the tracking algorithm must locate the object in subsequent frames, where it may face various changes in appearance and motion caused by illumination, deformation, occlusion and motion, etc.

[0003] In recent years, deep learning technology has developed rapidly and plays an important role in the field of digital image processing. Deep learning can be driven by data to automatically learn features that meet t...
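As a minimal sketch of the protocol described above (the target is annotated only in the first frame, then localized in every later frame), the snippet below assumes a hypothetical `Tracker` interface with `init`/`update` methods; it is not the patent's implementation.

```python
# Minimal sketch of the first-frame-annotation protocol, against a
# hypothetical Tracker interface (init/update); not the patent's code.
from typing import Iterable, List, Protocol, Tuple

Box = Tuple[float, float, float, float]  # (x, y, w, h)


class Tracker(Protocol):
    def init(self, frame, box: Box) -> None: ...
    def update(self, frame) -> Box: ...


def track_video(frames: Iterable, first_box: Box, tracker: Tracker) -> List[Box]:
    """Mark the target in the first frame, then locate it in every later frame."""
    frames = iter(frames)
    tracker.init(next(frames), first_box)      # build the template from frame 1
    boxes = [first_box]
    for frame in frames:
        boxes.append(tracker.update(frame))    # per-frame localization
    return boxes
```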


Application Information

IPC(8): G06T 7/246; G06N 3/04; G06N 3/08
CPC: G06T 7/246; G06N 3/08; G06T 2207/10016; G06T 2207/20081; G06T 2207/20084; G06N 3/045
Inventors: 陈璞花, 单鼎丞, 王璐, 焦李成, 刘芳, 古晶
Owner: XIDIAN UNIV