
Visual object tracking method based on multiple model integration and structured depth characteristics

A deep-feature-based target tracking technology, applied in image data processing, instrumentation, computing and related fields, which addresses problems such as target loss, the small number of online samples, and susceptibility to overfitting, with the effect of improving tracking accuracy and enhancing robustness.

Active Publication Date: 2018-02-13
XIDIAN UNIV
Cites: 4 | Cited by: 16

AI Technical Summary

Problems solved by technology

Among the above two approaches, methods based on generative models do not use information from the background, so they have difficulty isolating the target in front of a complex background and have a high probability of drifting to an unrelated object or to the background. Methods based on discriminative models, although strong at separating the target from the background, are limited by the small number of online samples and are prone to overfitting.
[0003] To sum up, the problems with the existing technology are: an appearance model based on a single feature can hardly describe the tracking target fully, cannot accurately distinguish the target from the background, and is likely to lose the target after a period of tracking. In addition, because few samples are available for online model updates, the current mainstream tracking methods based on discriminative models are prone to overfitting, which reduces the accuracy of the target position estimate.

Method used

Embodiment Construction

[0033] To make the object, technical solution and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0034] The application principle of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0035] As shown in Figure 1, the visual target tracking method based on multi-model fusion and structured depth features provided by the embodiment of the present invention includes the following steps:

[0036] S101: Construct the ideal correlation filter response as the reference output for subsequent training;
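As a concrete illustration of S101, the sketch below builds the ideal correlation filter response as a 2D Gaussian peak centered on the target, which is a common choice of reference output in correlation-filter trackers; the Gaussian shape, patch size and bandwidth sigma are illustrative assumptions rather than values fixed by the patent text.

```python
import numpy as np

def ideal_response(height, width, sigma=2.0):
    """Construct the ideal correlation filter response: a 2D Gaussian
    peak centered on the target, used as the reference (regression)
    output when training the filter."""
    ys = np.arange(height) - height // 2
    xs = np.arange(width) - width // 2
    xx, yy = np.meshgrid(xs, ys)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    # Shift the peak to the top-left corner to match the circular
    # (FFT-based) correlation convention used by such trackers.
    return np.fft.ifftshift(g)

y_ref = ideal_response(64, 64, sigma=2.0)  # reference output for training
```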

[0037] S102: According to the target position calibrated in the first frame, cut out the image block containing the target from the first-frame image, and extract the d...
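Assuming the truncated step refers to extracting the structured (multi-layer) deep features of the cropped block, the sketch below crops a context region around the first-frame target position and collects feature maps from several depths of a pretrained CNN. The choice of VGG-16 from torchvision as the backbone, the context factor, and the layer indices are illustrative assumptions, not details given in the excerpt.

```python
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF

# Pretrained backbone used as a stand-in for the structured deep network.
backbone = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()

def crop_patch(frame, cx, cy, w, h, context=2.0):
    """Cut out the image block containing the target (with surrounding
    context) from the first frame.
    frame: HxWx3 uint8 numpy array; (cx, cy, w, h): calibrated target box."""
    size = int(max(w, h) * context)
    x0, y0 = max(int(cx - size // 2), 0), max(int(cy - size // 2), 0)
    return frame[y0:y0 + size, x0:x0 + size, :]

def extract_structured_features(patch, layer_ids=(8, 15, 22)):
    """Pass the patch through the backbone and keep feature maps from
    several depths (low-level and high-level), giving a multi-layer
    structured representation of the target appearance."""
    x = TF.to_tensor(patch).unsqueeze(0)  # 1x3xHxW float tensor in [0, 1]
    feats, out = [], x
    with torch.no_grad():
        for i, layer in enumerate(backbone):
            out = layer(out)
            if i in layer_ids:
                feats.append(out)
    return feats
```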

Abstract

The invention belongs to the technical field of pattern recognition and computer vision, and discloses a visual object tracking method based on multi-model fusion and structured depth features. The appearance of the tracking object is modeled with multilayer structured deep network features, which are more robust to interference factors such as motion blur. The low-level features in the structured representation both help distinguish strong distractors and make position estimation more accurate, while the high-level features make it easier to separate the tracking object from the background. On this basis, tracking is performed by fusing long-time and short-time models: the short-time model improves the precision of position estimation, while the long-time model suppresses similar strong distractors. The method is high in precision and robustness and can be used in applications such as video surveillance, road traffic condition analysis, and human-computer interaction.
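To make the long-time / short-time model fusion described above concrete, the following sketch combines the correlation response maps produced by the two models with a fixed weight and reads the target position off the fused peak; the linear weighting scheme and the value of alpha are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def fuse_responses(resp_short, resp_long, alpha=0.6):
    """Fuse the response maps of the short-time and long-time models.
    The short-time model sharpens position estimation, while the
    long-time model helps suppress similar strong distractors;
    alpha is an assumed fixed fusion weight."""
    fused = alpha * resp_short + (1.0 - alpha) * resp_long
    # The estimated target position is the peak of the fused response.
    row, col = np.unravel_index(np.argmax(fused), fused.shape)
    return (row, col), fused
```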

Description

Technical field
[0001] The invention belongs to the technical field of pattern recognition and computer vision, and in particular relates to a visual target tracking method based on multi-model fusion and structured depth features.
Background technique
[0002] Visual object tracking is one of the most fundamental problems in the field of computer vision. With the development of information technology, video has become an important information carrier, and the large amount of video data creates a demand for automatic video analysis and processing; object tracking is one of the key technologies that must be solved. Since the input of visual object tracking is only the images in a video, it is easily affected by the quality of video capture. According to the video classification of several publicly available test databases, the main factors affecting video quality and tracking difficulty are: illumination changes, scale changes, occlusion, motion blur, complex backgrounds, rot...

Claims

Application Information

IPC(8): G06T7/246
CPC: G06T2207/10016; G06T2207/20056; G06T2207/20081; G06T2207/20084; G06T7/251
Inventor 田春娜李明郎君高新波王秀美刘丽莎刘恒姜萌萌
Owner XIDIAN UNIV