Video object tracking method based on feature optical flow and online ensemble learning

An ensemble learning and target tracking technology, applied in image data processing, instrumentation, and computing, which addresses the problem of poor tracking results

Active Publication Date: 2013-01-30
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

[0004] In order to overcome the shortcoming of poor tracking results of existing methods for tracking a specific target in digital video, the present invention provides a video target tracking method based on feature optical flow and online ensemble learning.



Examples

Embodiment Construction

[0041] The specific steps of the method of the invention are as follows:

[0042] 1. Tracking part.

[0043] The preprocessing part takes the video sequence as input, tracks the feature points of each frame with the iterative pyramidal optical flow method using OpenCV's built-in function, and obtains the positions of these feature points in the next frame.
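
The patent does not name the exact OpenCV routine; the sketch below assumes the pyramidal Lucas-Kanade function cv2.calcOpticalFlowPyrLK is the built-in function referred to here, and the window size and pyramid depth are illustrative values rather than ones taken from the patent.

```python
import cv2
import numpy as np

def track_points(prev_gray, next_gray, prev_pts):
    """Track feature points from one grayscale frame to the next and
    return the surviving correspondences (positions in both frames)."""
    lk_params = dict(
        winSize=(15, 15),   # search window at each pyramid level
        maxLevel=3,         # pyramid depth (iterative, coarse to fine)
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01),
    )
    # prev_pts must be an (N, 1, 2) float32 array of point coordinates.
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_pts, None, **lk_params)
    ok = status.reshape(-1).astype(bool)   # points tracked successfully
    return prev_pts[ok], next_pts[ok]
```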

[0044] For target tracking without scaling, proceed as follows:

[0045] (i) When the number of feature points is less than 4, use the median of the feature-point displacements in the x and y directions as the displacement of the entire target in the x and y directions.

[0046] (ii) When the number of feature points is ≥ 4, use the RANSAC algorithm to calculate the transformation matrix from the bounding box of the previous frame to the bounding box of the next frame, as sketched below.
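
A minimal sketch of the two cases above, assuming OpenCV's RANSAC-based cv2.estimateAffinePartial2D as the transform estimator; the patent only says "transformation matrix", so the similarity model, the reprojection threshold, and the fallback behaviour are assumptions.

```python
import cv2
import numpy as np

def update_bbox(prev_pts, next_pts, bbox):
    """Propagate a bounding box (x, y, w, h) to the next frame from
    tracked feature-point correspondences given as (N, 2) arrays."""
    x, y, w, h = bbox

    def median_shift():
        # Case (i): shift the whole box by the median displacement.
        dx = float(np.median(next_pts[:, 0] - prev_pts[:, 0]))
        dy = float(np.median(next_pts[:, 1] - prev_pts[:, 1]))
        return (x + dx, y + dy, w, h)

    if len(prev_pts) < 4:
        return median_shift()

    # Case (ii): fit a transform from the previous box to the next one
    # with RANSAC; a similarity (partial affine) model is assumed here.
    M, _inliers = cv2.estimateAffinePartial2D(
        prev_pts.astype(np.float32), next_pts.astype(np.float32),
        method=cv2.RANSAC, ransacReprojThreshold=3.0)
    if M is None:                       # RANSAC failed: fall back to case (i)
        return median_shift()
    corners = np.float32([[x, y], [x + w, y], [x, y + h], [x + w, y + h]])
    moved = corners @ M[:, :2].T + M[:, 2]      # apply the 2x3 transform
    x1, y1 = moved.min(axis=0)
    x2, y2 = moved.max(axis=0)
    return (float(x1), float(y1), float(x2 - x1), float(y2 - y1))
```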

[0047] Since the moving object may undergo slight scaling between two consecutive frames, it is insufficient to consider only the unscaled...
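
The description is cut off here. One common way to estimate such a slight scale change from the same tracked points, used by median-flow style trackers and offered purely as an assumed illustration rather than the patent's own formula, is the median ratio of pairwise point distances between the two frames:

```python
import numpy as np

def estimate_scale(prev_pts, next_pts):
    """Estimate the relative scale change of the target between two
    consecutive frames as the median ratio of pairwise distances
    among the tracked feature points ((N, 2) arrays)."""
    ratios = []
    n = len(prev_pts)
    for i in range(n):
        for j in range(i + 1, n):
            d_prev = np.linalg.norm(prev_pts[i] - prev_pts[j])
            d_next = np.linalg.norm(next_pts[i] - next_pts[j])
            if d_prev > 1e-6:           # ignore coincident points
                ratios.append(d_next / d_prev)
    return float(np.median(ratios)) if ratios else 1.0
```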

Abstract

The invention discloses a video object tracking method based on feature optical flow and online ensemble learning, and solves the technical problem that existing methods for tracking a specified object in digital video give poor tracking results. According to the technical scheme, the tracking part takes a video sequence as input, tracks the feature points of each frame with the iterative pyramidal optical flow method using OpenCV's built-in functions, and obtains the positions of the feature points in the next frame; the detection part selects positive and negative samples and processes them with the AdaBoost algorithm; and machine learning is then applied to the candidate object positions produced by the tracking part and the detection part. The feature extraction modes of tracking and detection are separated, candidate object positions are filtered during detection so that candidates far from the object are removed, and once the tracking and detection results are obtained, the Fisher discriminant ratio between the object and the online model is computed adaptively to determine the corresponding weights, so that the two results are fused with adaptive weights rather than a fixed value, which gives a good tracking effect.
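
The abstract only names the fusion rule; the sketch below shows one way such an adaptive weighting could be realized, with the Fisher discriminant ratio computed from the positive and negative sample features stored in the online model. The function names and the exact scoring scheme are assumptions for illustration, not the patent's definitive implementation.

```python
import numpy as np

def fisher_discriminant_ratio(pos_vals, neg_vals):
    """Fisher discriminant ratio of one feature between the positive and
    negative samples of the online model:
        (mu_pos - mu_neg)^2 / (var_pos + var_neg)."""
    mu_p, mu_n = np.mean(pos_vals), np.mean(neg_vals)
    var_p, var_n = np.var(pos_vals), np.var(neg_vals)
    return (mu_p - mu_n) ** 2 / (var_p + var_n + 1e-12)

def fuse_results(track_box, detect_box, track_score, detect_score):
    """Fuse the tracking and detection bounding boxes with adaptive
    weights derived from their confidence scores (e.g. Fisher-ratio
    based similarities to the online model) instead of a fixed split."""
    total = track_score + detect_score
    if total <= 0:
        w_t = w_d = 0.5                 # degenerate scores: equal weights
    else:
        w_t, w_d = track_score / total, detect_score / total
    return tuple(w_t * np.asarray(track_box, dtype=float) +
                 w_d * np.asarray(detect_box, dtype=float))
```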

Description

Technical field

[0001] The invention relates to a video target tracking method, in particular to a video target tracking method based on feature optical flow and online ensemble learning.

Background technique

[0002] As an important branch of the field of computer vision, video target tracking methods are researched and applied ever more widely in science and technology, national defense construction, aerospace, medicine and health, and the national economy; it is therefore of great practical value to study target tracking technology. Existing methods for tracking a specific target mainly include detection-based methods, such as the inter-frame difference method, the background difference method, and the motion field estimation method, and recognition-based methods, such as the region matching, model matching, frequency-domain matching, and feature matching methods.

[0003] The document "Online learning of robust object...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20; G06T3/40
Inventor: 张艳宁, 杨涛, 屈冰欣, 陈挺
Owner: NORTHWESTERN POLYTECHNICAL UNIV