
Object tracking method based on state fusion of multiple cell blocks

An object-tracking technology based on cell-block states, applied in image data processing, instrumentation, and computation. It addresses the problems of limited applicability and low computational efficiency, and achieves simple confidence computation and real-time, stable object tracking.

Inactive Publication Date: 2015-03-04
SOUTHWEST JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Fan et al. proposed learning regions of interest with strong discriminative power to assist tracking; however, when there is violent motion in the scene, computational efficiency remains low due to the limitations of these local regions.
Godec et al. achieved a satisfactory tracking effect by clustering the scene to classify the background into multiple virtual types, but this method assumes that the background changes only gradually and slightly, which does not hold in many tracking scenarios, so its application is limited.



Examples


Embodiment Construction

[0023] The present invention will be further described below with reference to the accompanying drawings. The method of the present invention can be used in various object-tracking applications, such as intelligent video analysis, automatic human-computer interaction, traffic video surveillance, driverless vehicle operation, biological group analysis, and fluid surface velocity measurement.

[0024] The technical scheme of the present invention comprises the following steps:

[0025] (1) Target selection

[0026] Select and determine the target object to be tracked from the initial image. The target can be extracted automatically by a moving-target detection method, or specified manually through human-computer interaction.
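The automatic route mentioned above can be illustrated with a minimal frame-differencing detector. This is only a sketch of one common moving-target detection technique, not the patent's own method; the function name, threshold value, and bounding-box convention are all illustrative assumptions.

```python
import numpy as np

def detect_moving_target(prev_frame, frame, thresh=25):
    """Hypothetical moving-target detector via frame differencing.

    Compares two grayscale frames and returns the bounding box
    (x, y, w, h) enclosing all pixels that changed by more than
    `thresh`, or None if nothing moved. Real systems would add
    background modeling and morphological filtering.
    """
    diff = np.abs(frame.astype(int) - prev_frame.astype(int)) > thresh
    ys, xs = np.nonzero(diff)
    if xs.size == 0:
        return None  # no motion: fall back to manual selection
    x, y = int(xs.min()), int(ys.min())
    return (x, y, int(xs.max()) - x + 1, int(ys.max()) - y + 1)
```

The returned box can then serve as the initial target region for the cell-block setup in step (2).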

[0027] (2) Set the target cell block

[0028] Extract image blocks as target cell blocks according to randomly generated center-point positions, widths, and heights within the target object area. In figure 1, I is used to represent the image, and T to repres...
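The random cell-block generation described above can be sketched as follows. This is an assumed reading of the step: the function name, the `(cx, cy, w, h)` block representation, the size bounds, and the clipping rule are illustrative choices, not taken from the patent.

```python
import random

def generate_cell_blocks(target_box, n_blocks, min_size=4, seed=None):
    """Sample n_blocks cell blocks with random center, width, and height,
    all constrained to lie inside the target box.

    target_box: (x, y, W, H) of the selected target region.
    Returns a list of (cx, cy, w, h) tuples (center-point convention).
    """
    rng = random.Random(seed)
    x, y, W, H = target_box
    blocks = []
    for _ in range(n_blocks):
        # Random size, capped at half the target extent (assumption).
        w = rng.randint(min_size, max(min_size, W // 2))
        h = rng.randint(min_size, max(min_size, H // 2))
        # Random center, clipped so the block stays inside the target box.
        cx = rng.randint(x + w // 2, x + W - (w - w // 2))
        cy = rng.randint(y + h // 2, y + H - (h - h // 2))
        blocks.append((cx, cy, w, h))
    return blocks
```

Each tuple identifies one cell block whose state is later configured and fused during tracking.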



Abstract

The invention provides an object tracking method based on state fusion of multiple cell blocks, belonging to the technical field of visual object tracking. The method can effectively handle non-rigid appearance changes such as object rotation, deformation, and scaling, as well as tracking under occlusion. The method comprises the following steps: select and determine the target object to be tracked from an initial image, either extracted automatically by a moving-target detection method or specified manually through human-computer interaction; set target cell blocks at randomly generated center-point positions within the target object area; under real-time processing conditions, take the video images captured by a camera and stored in a storage area, or decompose the video file to be tracked into an image sequence consisting of multiple frames, and extract the frame images one by one as input images; if the input image is null, the whole process ends; otherwise, configure the state of each cell block and determine its best configuration according to the corresponding target cell block. Target localization is then used to estimate the current state of the target.
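The top-level control flow in the abstract, namely pulling frame images one by one and ending the whole process when the input image is null, can be sketched as a simple driver loop. The function and parameter names here are illustrative, and `process_frame` stands in for the per-frame cell-block configuration and fusion, which the abstract does not specify in code form.

```python
def track(frames, process_frame):
    """Driver loop sketched from the abstract: extract frame images one
    by one as input images; a None (null) input image ends the process.

    frames: any iterable yielding frame images (e.g. a decoded sequence).
    process_frame(frame, state) -> new state; stands in for cell-block
    state configuration, fusion, and target localization.
    """
    frame_iter = iter(frames)
    state = None
    while True:
        frame = next(frame_iter, None)
        if frame is None:  # null input image: end the whole process
            break
        state = process_frame(frame, state)
    return state  # final estimated target state, or None if no frames
```

In a real deployment the iterable would wrap a camera capture or a decoded video file, per the abstract's two input modes.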

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision object tracking, and in particular to the technical field of computer graphic image processing.

Background Technique

[0002] Visual object tracking is a basic and critical problem in many computer vision applications, such as video analysis, intelligent surveillance, human-computer interaction, and behavior recognition. Although researchers have done a great deal of work on it, achieving real-time, stable object tracking remains an extremely challenging task.

[0003] At present, object tracking methods relying on detection or learning (such as TLD, Tracking-Learning-Detection) are receiving more and more attention. These methods explore unknown data and information by learning some kind of classifier, such as a support vector machine, bootstrap, random forest, or random fern, so as to enhance adaptability to the target and to scene changes. When basic (short-term) tracking ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06T7/20
CPC: G06T7/0012; G06T7/248; G06T2207/10016
Inventors: 权伟 (Quan Wei), 陈锦雄 (Chen Jinxiong), 张卫华 (Zhang Weihua), 江永全 (Jiang Yongquan), 何武 (He Wu)
Owner: SOUTHWEST JIAOTONG UNIV