
Vision tracking method based on consistency predictor model

A visual tracking technology based on a consistency predictor, applied in the field of data processing, intended to solve problems such as the inability of existing trackers to adapt to target occlusion, appearance changes, and background interference, as well as their poor accuracy and poor robustness.

Publication Date: 2018-08-28 (status: Inactive)
SOUTHWEST UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, when deep features are used for tracking, the common problem is that a large number of samples are required to train and update the CNN parameters; for visual tracking tasks, it is usually difficult to obtain a large number of training samples of the tracked target in advance.
[0006] Existing visual tracking methods lack robustness when tracking targets in video sequences: they cannot adapt to complex situations such as target occlusion, appearance changes, and background interference, and the accuracy of the various existing tracking algorithms is poor.




Detailed Description of the Embodiments

[0085] To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the examples. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0086] At present, target tracking for video sequences lacks robustness: existing methods cannot adapt to complex situations such as target occlusion, appearance changes, and background interference, and the accuracy of existing tracking algorithms is poor.

[0087] The application principle of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0088] As shown in Figure 1, the visual tracking method based on a consistency predictor model provided by an embodiment of the present invention includes:

[0089] S101: First, construct a dual-input convolutional neural network model that synchronously extracts high-level features of the video-frame sampling region and of the target template, and use logistic regression to distinguish the target from the background region.
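The patent page gives no layer specification for this network, so the following Python (PyTorch) sketch is only a hypothetical rendering of S101: a shared convolutional branch applied to both the video-frame sampling region and the target template, whose concatenated features feed a logistic-regression (sigmoid) output. All architecture details (channel counts, kernel sizes, input resolution) are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the dual-input CNN described in S101.
# Layer sizes are illustrative assumptions, not patent specifications.
import torch
import torch.nn as nn

class DualInputCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # One convolutional branch, shared by both inputs, so the
        # sampling region and the target template are embedded in
        # the same feature space.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Logistic-regression head: one linear unit plus a sigmoid.
        self.classifier = nn.Linear(64 * 2, 1)

    def forward(self, sample_region, target_template):
        # Extract high-level features from both inputs in parallel.
        f_sample = self.features(sample_region).flatten(1)
        f_template = self.features(target_template).flatten(1)
        joint = torch.cat([f_sample, f_template], dim=1)
        # Probability that the sampled region contains the target.
        return torch.sigmoid(self.classifier(joint))

# Usage: score one 64x64 RGB sampling region against the template.
model = DualInputCNN()
p_target = model(torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64))
```

Sharing weights between the two branches is one natural reading of "synchronously extracting" features from both inputs; the patent may instead use two independent branches.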



Abstract

The invention belongs to the technical field of data processing and discloses a visual tracking method based on a consistency predictor model. The method comprises the following steps: first, a dual-input convolutional neural network model is constructed that synchronously extracts high-level features of the video-frame sampling region and of the target template, and logistic regression is used to distinguish the target from the background region; second, the convolutional neural network is embedded in the consistency predictor framework, an algorithmic randomness test is used to evaluate the reliability of the classification result, and, at a specified risk level, the classification result is output as a prediction region together with a credibility index; finally, high-confidence regions are selected as candidate target regions, and the target trajectory is obtained by optimizing a spatio-temporal global energy function. The method can adapt to complex situations such as target occlusion, appearance change, and background interference, and offers stronger robustness and accuracy than currently popular tracking algorithms.
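The "consistency predictor" named here is the conformal prediction framework, in which an algorithmic randomness test assigns each tentative label a p-value, and every label whose p-value exceeds the chosen risk level is kept in the prediction region. Below is a minimal NumPy sketch of the inductive (split) variant for the two-class target-vs-background setting; the nonconformity score (one minus the network's probability for the hypothesized class) and all numbers are illustrative assumptions, not values from the patent.

```python
# Minimal inductive conformal predictor sketch (two classes:
# "target" and "background"). Standard p-value and prediction-region
# logic; the nonconformity scores below are illustrative assumptions.
import numpy as np

def conformal_p_value(cal_scores, test_score):
    """Fraction of calibration examples at least as nonconforming
    as the test example (the algorithmic randomness test)."""
    return (np.sum(cal_scores >= test_score) + 1) / (len(cal_scores) + 1)

def predict_region(cal_scores_per_class, test_score_per_class, epsilon):
    """Prediction region at risk level epsilon, plus a credibility
    index (the largest p-value over all hypothesized labels)."""
    p_values = {label: conformal_p_value(cal, test_score_per_class[label])
                for label, cal in cal_scores_per_class.items()}
    region = {label for label, p in p_values.items() if p > epsilon}
    credibility = max(p_values.values())
    return region, credibility, p_values

# Example with made-up scores: nonconformity = 1 - CNN probability
# for the hypothesized class, computed on a held-out calibration set.
cal = {"target": np.array([0.10, 0.20, 0.15, 0.30]),
       "background": np.array([0.25, 0.40, 0.20, 0.35])}
test = {"target": 0.12, "background": 0.60}
region, credibility, p = predict_region(cal, test, epsilon=0.25)
# region == {"target"}: only "target" survives at the 25% risk level.
```

Candidate regions whose prediction set is exactly {"target"} with high credibility would then correspond to the "high-confidence regions" fed into the spatio-temporal energy optimization described in the final step of the abstract.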

Description

Technical Field

[0001] The invention belongs to the technical field of data processing, and in particular relates to a visual tracking method based on a consistency predictor model.

Background

[0002] Visual object tracking is a fundamental problem in computer vision. Its task is to determine the motion state of an object in a video, including its position, speed, and trajectory. Although visual tracking technology has made great progress in recent years, achieving robust tracking in complex situations such as object occlusion, pose changes, and cluttered backgrounds remains a great challenge.

[0003] In the visual tracking problem, the feature representation of the target is one of the important factors affecting tracking performance. Features used to represent objects should adapt to changes in object appearance while remaining well discriminative against the background. A large number of feature extraction methods have been applied to visual tracking,...


Application Information

IPC(8): G06T7/246
CPC: G06T7/246; G06T2207/10016; G06T2207/20081; G06T2207/20084; G06T2207/30241
Inventor: Gao Lin (高琳)
Owner: SOUTHWEST UNIV OF SCI & TECH