
Video tracking method based on spread fusion

A video tracking technology based on propagation (spread) fusion, applied in the field of video tracking. It addresses the problem that existing matching-based methods do not consider the information provided by unlabeled samples, and achieves the effect of improved tracking accuracy.

Inactive Publication Date: 2014-04-09
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, this method only considers the appropriate metric selection problem under one feature representation, and does not consider the information provided by unlabeled samples in incoming video frames.

Method used




Embodiment Construction

[0017] The invention relates to a video target tracking method based on matching. Consider a simple tracking scheme: the position and scale of the tracking target region in the t-th frame of the video sequence are known. Candidate target regions of the same size as the known target are sampled from the (t+1)-th frame, and each candidate region is matched one by one against the known tracking target region of the current frame. The candidate region with the highest matching degree is taken as the position of the tracking target in the (t+1)-th frame. The (t+1)-th frame, with its tracking target position and scale now determined, then serves as the new t-th frame; repeating the previous step tracks the target in the (t+2)-th frame, and so on for all subsequent video frames.
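The frame-by-frame matching loop above can be sketched in a few lines. This is a minimal illustration assuming grayscale NumPy frames; the matching degree here is normalized cross-correlation, which is only one possible choice (the text does not fix a particular measure at this point), and `search_radius` is a hypothetical sampling parameter.

```python
import numpy as np

def match_score(patch_a, patch_b):
    """Normalized cross-correlation between two equal-sized patches
    (one simple choice of 'matching degree'; the patent does not fix it here)."""
    a = patch_a.astype(np.float64).ravel()
    b = patch_b.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom > 0 else 0.0

def track_step(frame_next, target_patch, prev_xy, search_radius=10, step=2):
    """Given the known target patch from frame t, scan same-size candidate
    regions in frame t+1 around the previous position (x, y) and return the
    best-matching position together with its score."""
    h, w = target_patch.shape[:2]
    px, py = prev_xy
    best_xy, best_score = prev_xy, -np.inf
    for dy in range(-search_radius, search_radius + 1, step):
        for dx in range(-search_radius, search_radius + 1, step):
            x, y = px + dx, py + dy
            # skip candidates that fall outside the image
            if x < 0 or y < 0 or y + h > frame_next.shape[0] or x + w > frame_next.shape[1]:
                continue
            s = match_score(frame_next[y:y + h, x:x + w], target_patch)
            if s > best_score:
                best_score, best_xy = s, (x, y)
    return best_xy, best_score
```

Running `track_step` once per incoming frame, and replacing the reference patch position with the returned one, reproduces the "t becomes t+1" loop described in the paragraph.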

[0018] In the traditional matching-based video tr...



Abstract

The invention discloses a video tracking method based on spread fusion. The method comprises the following steps: (1) a frame of video image t with a known tracking-object region is received; (2) the (t+1)-th frame of video image is received, and candidate object regions of the same size as the tracking object in frame t are sampled to form a candidate object set; (3) the similarities between the tracking object and each candidate object are calculated separately under the HOG, LBP, and Haar-like feature representations; (4) spread fusion is performed on the three similarities obtained from the different feature representations to fuse them into a single similarity; (5) the candidate object region in the (t+1)-th frame with the highest similarity to the tracking object in frame t is taken as the position of the tracking object in that frame, and the (t+1)-th frame with the tracking object marked by a rectangular frame is output; (6) let t = t+1, and steps (2) to (5) are repeated until the video is finished. According to the invention, the similarity between the tracking object and a candidate object is described through multiple similarities based on multiple feature representations, so that object tracking accuracy can be improved.
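Steps (3) and (4) boil down to fusing several per-feature similarity vectors (one score per candidate) into one. The patent's actual spread-fusion update rule is not detailed in this abstract, so the function below uses a labeled placeholder (min-max normalization followed by a weighted mean) purely to show where the fusion step sits in the pipeline; the `sim_hog` / `sim_lbp` / `sim_haar` values are hypothetical.

```python
import numpy as np

def fuse_similarities(sims, weights=None):
    """Fuse per-feature similarity vectors (one score per candidate region)
    into a single vector. NOTE: this is NOT the patent's spread-fusion rule,
    which is not reproduced in the abstract; min-max normalization plus a
    weighted mean stands in for step (4)."""
    if weights is None:
        weights = [1.0 / len(sims)] * len(sims)
    fused = np.zeros(len(sims[0]), dtype=np.float64)
    for s, w in zip(sims, weights):
        s = np.asarray(s, dtype=np.float64)
        span = s.max() - s.min()
        # rescale each feature's scores to [0, 1] so they are comparable
        s_norm = (s - s.min()) / span if span > 0 else np.zeros_like(s)
        fused += w * s_norm
    return fused

# Hypothetical similarity scores of 3 candidates under HOG, LBP, Haar-like
sim_hog  = [0.2, 0.9, 0.4]
sim_lbp  = [0.3, 0.8, 0.5]
sim_haar = [0.1, 0.7, 0.6]
fused = fuse_similarities([sim_hog, sim_lbp, sim_haar])
best = int(np.argmax(fused))  # candidate 1 scores highest under all three features
```

Step (5) then simply selects `best` as the tracked position in frame t+1.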

Description

Technical field

[0001] The invention relates to tracking a target object in a video, and in particular to a video tracking method based on propagation fusion.

Background technique

[0002] Video tracking is a very important problem in computer vision and has a wide range of practical applications. Designing a robust video tracking method faces two challenges: on the one hand, changes in the appearance of the target object, such as viewing-angle changes, posture changes, and scale changes; on the other hand, external noise caused by the environment in which the target object is located, such as occlusion and background clutter.

[0003] In recent years, many robust video tracking methods have been proposed, the most representative of which are matching-based tracking methods; the method proposed by the present invention also belongs to this type. Such a method mainly includes two key steps. One is to express the appearance of t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20; G06K9/00
Inventor: 白翔 (Xiang Bai), 周瑜 (Yu Zhou), 鲁勤 (Qin Lu), 刘文予 (Wenyu Liu)
Owner HUAZHONG UNIV OF SCI & TECH