Visual target tracking method based on self-adaptive subject sensitivity

A target tracking technology with adaptive capability, applied in the field of computer vision. It addresses the problems that existing models have difficulty adapting to different types of targets and fail to obtain a feature representation that is sensitive to the target subject and effectively distinguishes the foreground from the background, thereby alleviating gradient vanishing and improving the expressive power of the features.

Active Publication Date: 2019-09-06
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

In real, complex natural scenes it is difficult for existing models to adapt to different types of targets, and they have not obtained a feature representation that is sensitive to the target subject and effectively distinguishes the foreground from the background; the performance of visual target tracking methods therefore needs further improvement.



Examples


Detailed Description of the Embodiments

[0040] The embodiments of the present invention are described in detail below in conjunction with the accompanying drawings:

[0041] A visual target tracking method based on adaptive subject sensitivity; the overall flow chart is shown in Figure 1. The algorithm is divided into an offline part and an online part, whose flow charts are shown in Figure 2 and Figure 3, respectively. In the offline part, image pairs are first generated from the training sample set to serve as the input template image and the current image; the two are then fed into the constructed adaptive subject-sensitive Siamese network for feature extraction, the tracking regression response is generated, and the tracking regression loss function is computed; next, the backpropagation gradient is calculated through the chain rule, and a target template image mask is generated and superimposed on the output of the network model; finally, the tracking reg...
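The offline step described above can be made concrete with a minimal sketch. The code below is a hypothetical PyTorch illustration, not the patented implementation: the backbone layers, the regression loss, and the exact rule for turning the backpropagated gradient into a template mask are placeholder assumptions chosen for clarity.

```python
# Hypothetical sketch of the offline step: Siamese feature extraction,
# cross-correlation response, tracking loss, backpropagated gradient,
# and a gradient-derived template mask. Placeholder architecture and loss only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseBackbone(nn.Module):
    """Placeholder shared backbone; the patent's actual network structure is not reproduced."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(128, 256, kernel_size=3), nn.ReLU(),
        )

    def forward(self, x):
        return self.features(x)

backbone = SiameseBackbone()
template = torch.rand(1, 3, 127, 127)   # template image from the image pair
current  = torch.rand(1, 3, 255, 255)   # current (search) image from the image pair

z_feat = backbone(template)             # shared-weight feature extraction
x_feat = backbone(current)
response = F.conv2d(x_feat, z_feat)     # tracking regression response (cross-correlation)

# Placeholder regression target and loss; the patent defines its own
# tracking regression loss function.
target = torch.zeros_like(response)
loss = F.mse_loss(response, target)

# Backpropagation gradient via the chain rule, taken w.r.t. the template features.
grad_z = torch.autograd.grad(loss, z_feat, retain_graph=True)[0]

# A simple gradient-magnitude mask, normalized and superimposed on the template
# features; the patent's exact masking rule may differ.
mask = grad_z.abs().mean(dim=1, keepdim=True)
mask = (mask - mask.min()) / (mask.max() - mask.min() + 1e-8)
masked_z_feat = z_feat * mask
```

In a full offline loop, a gradient loss item derived from quantities such as this mask would be combined with the tracking regression loss before the network weights are updated, in line with the six offline steps listed in the abstract.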



Abstract

The invention discloses a visual target tracking method based on self-adaptive subject sensitivity, and belongs to the technical field of computer vision. The method comprises an overall process, an offline part and an online part. The overall process includes designing the target tracking workflow and the network structure, and adjusting the feature map of each stage of the network to an adaptive size so as to complete the end-to-end tracking process of the Siamese network. The offline part comprises six steps: generating a training sample library; carrying out forward tracking training; calculating the back-propagation gradient; calculating a gradient loss item; generating a target template image mask; and training the network model to obtain the model. The online part comprises three steps: model updating, online tracking, and target region localization. Model updating comprises forward tracking, back-propagation gradient calculation, gradient loss item calculation and target template image mask generation; online tracking comprises performing forward tracking to obtain a similarity matrix, calculating the confidence coefficient of the current tracking result, and returning the target region. The method can better adapt to appearance changes and achieve robust tracking of the target.
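For concreteness, the online-part steps listed above (forward tracking to a similarity matrix, confidence calculation, and target-region localization) might look like the following sketch. It reuses the placeholder backbone and cached template features from the previous sketch; the confidence measure, the feature stride, and the mapping from response-map coordinates back to frame coordinates are assumptions for illustration, not the patent's definitions.

```python
# Hypothetical sketch of the online steps: forward tracking to a similarity
# matrix, a confidence score for the current result, and mapping the response
# peak back to a target region.
import torch
import torch.nn.functional as F

@torch.no_grad()
def track_one_frame(backbone, template_feat, search_img, prev_box, search_offset, stride=4):
    """Locate the target in the current frame.

    `stride` (total feature stride) and `search_offset` (top-left corner of the
    cropped search region in frame coordinates) are assumptions of this sketch.
    `prev_box` is (x, y, w, h) of the previous tracking result.
    """
    x_feat = backbone(search_img)                 # features of the current search region
    response = F.conv2d(x_feat, template_feat)    # similarity matrix
    response = response.squeeze()                 # (H, W)

    # Peak-to-sidelobe-style confidence of the current tracking result
    # (an assumed measure, not the patent's confidence coefficient).
    peak = response.max()
    confidence = ((peak - response.mean()) / (response.std() + 1e-8)).item()

    # Response peak -> position in frame coordinates -> new target region,
    # keeping the previous box size in this simplified sketch.
    idx = int(torch.argmax(response))
    py, px = divmod(idx, response.shape[1])
    cx = search_offset[0] + px * stride
    cy = search_offset[1] + py * stride
    w, h = prev_box[2], prev_box[3]
    return (cx - w / 2, cy - h / 2, w, h), confidence
```

A low confidence value could then trigger the model-updating branch described above before the next frame is processed.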

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision and relates to a target tracking method, in particular to a visual target tracking method based on adaptive subject sensitivity.

Background Art

[0002] Visual object tracking is one of the most basic tasks in computer vision and video processing. It has important applications in video content analysis, intelligent transportation systems, human-computer interaction, unmanned driving, and visual navigation. Given the bounding box of the object in the first frame of a video, a typical online tracking method automatically localizes the object in all subsequent frames. In real application scenarios, the appearance changes of the target caused by factors such as imaging conditions and pose deformation are intricate, and distinguishing the target from a cluttered background to achieve accurate tracking is a very challenging problem.

[0003] At prese...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04; G06K9/48
CPC: G06V10/473; G06V10/46; G06V2201/07; G06N3/045; G06F18/22; G06F18/214
Inventor: 张辉, 齐天卉, 卓力, 李嘉锋
Owner: BEIJING UNIV OF TECH