
Visual tracking method based on multi-cue fusion

A visual tracking and cue-fusion technology, applied in the field of visual tracking, which can solve problems such as the lack of generality of existing methods and achieve effective tracking.

Status: Inactive · Publication Date: 2009-12-23
PEKING UNIV

AI Technical Summary

Problems solved by technology

However, it assumes that the background obeys a single Gaussian model and requires prior training on a video sequence without moving objects to obtain the initial background model, which limits its application. In the cue evaluation function, a rectangle slightly larger than the target is used to represent the region of interest, and the area between this rectangle and the tracking window is defined as the background region. For the reliability evaluation function of a given cue, the size of the background region directly affects its value: the larger the tracking window, the smaller the value of the reliability evaluation function, so the measure lacks generality.

Examples


Embodiment Construction

[0055] Embodiments of the present invention will be described in detail below in conjunction with the accompanying drawings. The protection scope of the present invention is not limited to the following examples.

[0056] Visual tracking in this embodiment is carried out according to the following steps:

[0057] First, the tracking window is set in the first frame of the video sequence. The length and width of the tracking window are determined by the operator according to the size of the tracked target and remain unchanged during tracking. The tracking window is divided into three parts: the middle part (A) is the target region, and the left and right parts (B and C) are the background regions, as shown in Figure 2.
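As an illustration of this partition step, the sketch below splits the window into three equal-width vertical strips. The equal-thirds split, the function name and the (x, y, w, h) coordinate convention are assumptions made for illustration; the text only states that the middle part is the target region and the two sides are background.

```python
def partition_tracking_window(x, y, w, h):
    """Split a tracking window (top-left corner (x, y), width w, height h)
    into a central target region A and left/right background regions B, C.
    The equal-thirds split is an assumption, not prescribed by the patent."""
    third = w // 3
    region_b = (x, y, third, h)                      # left background strip
    region_a = (x + third, y, third, h)              # central target strip
    region_c = (x + 2 * third, y, w - 2 * third, h)  # right background strip
    return region_a, region_b, region_c

# Example: a 90x60 window placed at (100, 50) in the first frame
A, B, C = partition_tracking_window(100, 50, 90, 60)
```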

[0058] Second, starting from the second frame, the two most reliable color features (W=2; for example, the R channel and the B channel) are selected according to the previous frame, and the color feature probability distribution map M1 is calculated.
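A minimal sketch of how such a color feature probability distribution map could be computed with OpenCV histogram back-projection, assuming a BGR frame and the R/B channel pair from the example; the specific calls (cv2.calcHist, cv2.calcBackProject), the bin counts and the normalization are implementation assumptions, not prescribed by the patent.

```python
import cv2
import numpy as np

def color_probability_map(prev_frame, target_rect, frame, channels=(2, 0)):
    """Build a color feature probability distribution map M1 by
    back-projecting a 2-channel histogram of the target region of the
    previous frame onto the current frame.  channels=(2, 0) selects the
    R and B channels of a BGR image, matching the example in the text."""
    x, y, w, h = target_rect
    target = prev_frame[y:y + h, x:x + w]
    # 2-D histogram over the two selected channels of the target region
    hist = cv2.calcHist([target], list(channels), None,
                        [32, 32], [0, 256, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    # Probability (back-projection) map over the whole current frame
    m1 = cv2.calcBackProject([frame], list(channels), hist,
                             [0, 256, 0, 256], 1)
    return m1.astype(np.float32) / 255.0
```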

[0059] Third, calc...

Abstract

The invention discloses a visual tracking method based on multi-cue fusion, which belongs to the field of information technology. The method comprises the following steps: a) determining a tracking window comprising a target region and a background region in the first frame of a video sequence; b) from the second frame onward, obtaining a color feature probability distribution map, a position feature probability distribution map and a motion continuity feature probability distribution map from the previous frame; c) adding the three probability distribution maps in a weighted manner to obtain a total probability distribution map; and d) applying the CAMSHIFT algorithm to the total probability distribution map to obtain the coordinates of the center point of the tracking window in the current frame. The method can be used in human-computer interaction, intelligent visual surveillance, intelligent robotics, virtual reality, model-based image coding, content-based retrieval of streaming media and other fields.
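A minimal sketch of steps c) and d), assuming the three per-cue maps have already been computed and that OpenCV's CamShift implementation is used for step d). The fixed example weights are illustrative only; the patent derives the weighting from per-cue reliability, which is not reproduced here.

```python
import cv2
import numpy as np

def fuse_and_track(m_color, m_position, m_motion, window,
                   weights=(0.5, 0.3, 0.2)):
    """Steps c) and d): weighted sum of the three probability maps,
    then CAMSHIFT on the fused map to update the tracking window.
    The weight values are illustrative assumptions."""
    w1, w2, w3 = weights
    fused = w1 * m_color + w2 * m_position + w3 * m_motion
    fused = cv2.normalize(fused, None, 0, 255, cv2.NORM_MINMAX)
    fused = fused.astype(np.uint8)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    _, window = cv2.CamShift(fused, window, criteria)
    # Center point of the updated tracking window in the current frame
    center = (window[0] + window[2] // 2, window[1] + window[3] // 2)
    return center, window
```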

Description

Technical Field

[0001] The invention relates to visual tracking, and in particular to a visual tracking method that fuses multiple cues; it belongs to the field of information technology.

Background Technique

[0002] With the rapid development of information technology and intelligence science, computer vision, which uses computers to realize human visual functions, has become one of the most popular research directions in the computer field. Visual tracking is one of its core problems: finding the position of a moving target of interest in each frame of an image sequence. Studying it is both necessary and urgent.

[0003] In 2007, Hong Liu et al. published the paper "Collaborative mean shift tracking based on multi-cue integration and auxiliary objects" in the Proceedings of the 14th IEEE International Conference on Image Processing (ICIP 2007)...


Application Information

IPC(8): H04N7/26; G06T7/20; G06T7/246
Inventors: 杨戈 (Ge Yang), 刘宏 (Hong Liu)
Owner: PEKING UNIV