
A Robust Object Tracking Device and Tracking Method Based on Compact Expression

A robust tracking device and method, applied in image analysis, image enhancement, and instrumentation, addressing the unsolved problem of robust tracking of specific objects, with the effect of reducing tracking time and stabilizing the tracking result.

Active Publication Date: 2021-08-17
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

However, deep learning methods are generally aimed at general-purpose objects, and the problem of robust tracking of specific objects has not yet been solved.



Examples


Embodiment 1

[0048] As shown in Figure 1, a robust object tracking device based on compact representation comprises a computer and a camera connected to it.

Embodiment 2

[0050] A robust object tracking method based on compact representation, as shown in Figure 2, comprising the following steps:

[0051] 1) Construct a complete and compact target dictionary. To achieve robust tracking in marker-free AR environments, a representation of the target object must first be created; we frame this as the problem of constructing a high-dimensional manifold from normalized sub-images covering all possible appearances of the target. This divides into the following five steps:

[0052] 1.1) Capture sub-images of the target from multiple viewpoints, against multiple backgrounds, and under changing lighting conditions to form an instance pool. Note that these sub-images should be free of occlusion. To acquire such images quickly and in large quantities, captured video sequences and computer-synthesized images can be used. The sub-images should constitute a complete description of the target, i.e. they should contain i...
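The construction of an instance pool from captured sub-images can be sketched as below. The fixed 32x32 patch size, the nearest-neighbor resizing, and the l2 normalization are illustrative assumptions, not details specified by the patent:

```python
import numpy as np

def normalize_subimage(patch, size=(32, 32)):
    """Resize a grayscale patch by nearest-neighbor sampling and
    l2-normalize it into a flat feature vector."""
    h, w = patch.shape
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    resized = patch[rows][:, cols].astype(np.float64)
    v = resized.ravel()
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def build_instance_pool(patches):
    """Stack normalized sub-images as the columns of a d x m matrix,
    one column per instance x_j in the pool."""
    return np.stack([normalize_subimage(p) for p in patches], axis=1)
```

Each column of the resulting matrix is one instance; later steps (template learning, sparse coding) operate on this matrix.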

Embodiment 3

[0073] A robust object tracking method based on compact representation, as shown in Figure 3. In the training phase (the upper row of Figure 3, right to left), various video images of a person are collected and general-purpose target tracking is applied to the person; when tracking fails, the errors are removed or corrected through manual intervention. The sub-images corresponding to accurately tracked targets are extracted, yielding thousands to hundreds of thousands of instances {x_j, j=1,2,...,m} that form an instance pool. A compact template set T* is then selected through an iterative sparse learning method. In the online phase (the lower row of Figure 3, left to right), this template set T* is used to track the target, yielding a stable two-dimensional region; using the constraint that the person walks on the ground, the bottom of the two-dimensional region is in contact with the ground plane, from which the three-di...
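One way to realize the selection of a compact template set T* from the instance pool is a greedy, residual-driven sketch like the following. Here plain least-squares reconstruction stands in for the sparse coding of the patent's iterative sparse learning method, and all names and the stopping tolerance are hypothetical:

```python
import numpy as np

def select_templates(X, k, tol=1e-8):
    """Greedy template selection from an instance pool X (d x m):
    repeatedly add the pool column that is worst reconstructed
    (in the least-squares sense) by the templates chosen so far.
    Stops early once every instance is reconstructed within tol."""
    chosen = [int(np.argmax(np.linalg.norm(X, axis=0)))]
    for _ in range(k - 1):
        T = X[:, chosen]
        # Least-squares reconstruction of every instance from T.
        coef, *_ = np.linalg.lstsq(T, X, rcond=None)
        resid = np.linalg.norm(X - T @ coef, axis=0)
        resid[chosen] = -1.0  # never re-pick a chosen template
        j = int(np.argmax(resid))
        if resid[j] < tol:
            break  # pool already well covered by current templates
        chosen.append(j)
    return chosen
```

The early stop is what makes the template set compact: redundant instances (near-duplicates of already-chosen templates) are never selected.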



Abstract

The invention relates to a robust object tracking device and tracking method based on compact representation, and belongs to the technical field of object tracking. The method realizes a compact representation (dictionary) of an object for robust tracking in virtual-reality environments. It mainly comprises capturing sub-images of the target, constructing an instance pool, learning compact templates, expanding the instance pool to obtain the optimal templates, computing the two-dimensional position of the object, computing the three-dimensional trajectory of the object, and simulating virtual objects. The basic idea is to use sparse representation to learn a compact representation of the object from image data of the target. The realization comprises three steps: first, collect images containing the target; then, initialize a dictionary using the residual of the sparse representation; finally, use a support vector machine to find sparse regions on the target manifold and construct a complete dictionary by resampling and applying the method of the second step. The method learns a compact and complete representation of an object, which can be used to achieve robust tracking in virtual-reality applications.
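The second step of the abstract, initializing a dictionary from the residual of the sparse representation, can be sketched as follows. Least-squares reconstruction again stands in for true sparse coding, and the threshold `tau` is an assumed parameter; the SVM-based resampling of sparse manifold regions (the third step) is only indicated in a comment:

```python
import numpy as np

def init_dictionary(X, tau=0.3):
    """Residual-driven initialization: scan the instance pool X
    (d x m, one instance per column) and add an instance to the
    dictionary whenever the current dictionary reconstructs it
    with a residual above tau."""
    D = [X[:, 0]]  # seed the dictionary with the first instance
    for j in range(1, X.shape[1]):
        A = np.stack(D, axis=1)
        c, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        r = np.linalg.norm(X[:, j] - A @ c)
        if r > tau:
            D.append(X[:, j])
        # The patent's third step would then use an SVM to locate
        # sparsely covered regions of the target manifold and
        # resample them, re-running this residual test.
    return np.stack(D, axis=1)
```

A larger `tau` yields a smaller, coarser dictionary; a smaller `tau` keeps more instances and approximates the pool more closely.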

Description

technical field

[0001] The invention relates to a robust object tracking device and tracking method based on compact representation, and belongs to the technical field of object tracking.

background technique

[0002] Augmented Reality (AR) applications need to perceive dynamic objects in the real world in order to respond appropriately to moving objects. Object tracking provides the spatial information of objects of interest in each frame, but robust and accurate tracking is often difficult. Once tracking fails, the augmented reality application loses its perception of the object, which prevents proper feedback and causes the system to fail. Appearance changes of the target, caused by changes in viewing angle, the dynamics of the object itself, occlusion, and changes in the lighting environment, are a main cause of tracking failure. Therefore, to achieve accurate and robust tracking, we ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/246, G06K9/62
CPC: G06T7/251, G06T2207/10016, G06T2207/20081, G06F18/28, G06F18/2411
Inventor: 秦学英, 王同翰
Owner: SHANDONG UNIV