Weighted extreme learning machine video target tracking method based on weighted multi-instance learning

A target tracking technique combining an extreme learning machine with multi-instance learning. It addresses the poor tracking accuracy of prior methods and improves the stability, accuracy, and robustness of tracking.

Inactive Publication Date: 2017-02-22
XIDIAN UNIV

Problems solved by technology

[0006] The purpose of the present invention is to address the problem of poor tracking accuracy in the above-mentioned prior art by proposing a weighted extreme learning machine video target tracking method based on weighted multi-instance learning, so as to improve tracking accuracy in complex environments, such as target posture changes, fast-moving targets, blurred video images, complex backgrounds, and partial occlusion, thereby meeting the requirements of video target tracking.



Embodiment Construction

[0038] The technical scheme and effects of the present invention are further described below with reference to the accompanying drawings:

[0039] Referring to figure 1, the specific implementation steps of the present invention are as follows:

[0040] Step 1. Initialize.

[0041] 1.1) Initialize target features:

[0042] Features commonly used in video tracking include grayscale features; RGB (red, green, blue) color features; HSV (hue, saturation, value) color features; gradient features; scale-invariant feature transform (SIFT) features; local binary pattern (LBP) features; and Haar-like features. This example uses, but is not limited to, Haar-like features as the target feature, and constructs a feature model pool Φ containing M types of Haar feature models;
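A pool of random Haar-like feature models can be built over integral images, which make any rectangle sum an O(1) operation. The sketch below is illustrative only: the patch size, rectangle counts, and weighting scheme are hypothetical choices, not the patent's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def integral_image(gray):
    """Cumulative sums over rows and columns; any rectangle sum then costs O(1)."""
    return gray.astype(np.float64).cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in rectangle [x, x+w) x [y, y+h) from the integral image ii."""
    a = ii[y + h - 1, x + w - 1]
    b = ii[y - 1, x + w - 1] if y > 0 else 0.0
    c = ii[y + h - 1, x - 1] if x > 0 else 0.0
    d = ii[y - 1, x - 1] if x > 0 and y > 0 else 0.0
    return a - b - c + d

def make_haar_model_pool(M, patch_size=32, max_rects=4):
    """Each 'feature model' is a random set of signed rectangles inside the patch."""
    pool = []
    for _ in range(M):
        n = int(rng.integers(2, max_rects + 1))
        rects = []
        for _ in range(n):
            w = int(rng.integers(4, patch_size // 2))
            h = int(rng.integers(4, patch_size // 2))
            x = int(rng.integers(0, patch_size - w))
            y = int(rng.integers(0, patch_size - h))
            sign = float(rng.choice([-1.0, 1.0]))
            rects.append((x, y, w, h, sign))
        pool.append(rects)
    return pool

def haar_feature(gray_patch, model):
    """Evaluate one Haar-like feature model on a grayscale patch."""
    ii = integral_image(gray_patch)
    return sum(s * rect_sum(ii, x, y, w, h) for x, y, w, h, s in model)
```

Usage: `pool = make_haar_model_pool(M)` builds the model pool Φ, and `[haar_feature(patch, m) for m in pool]` evaluates all M features on a candidate patch.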

[0043] 1.2) Randomly assign the feature models in the feature model pool Φ to obtain E groups of feature model blocks, where e is the serial number of a feature model block, taking values 1,...,E, and E is ...
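The random assignment in step 1.2) can be sketched as a random partition of the M model indices into E blocks. Since the text above does not specify block sizes, this sketch assumes an as-even-as-possible split:

```python
import numpy as np

def group_feature_models(M, E, rng=None):
    """Randomly partition feature-model indices 0..M-1 into E feature model blocks.

    Assumption: blocks are as evenly sized as possible; the patent text does not
    fix the block sizes, so this split is illustrative.
    """
    rng = rng or np.random.default_rng(0)
    perm = rng.permutation(M)                  # random ordering of model indices
    return [sorted(block.tolist()) for block in np.array_split(perm, E)]
```

Every model lands in exactly one block, so the E blocks form a partition of the pool Φ.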



Abstract

The invention discloses a weighted extreme learning machine video target tracking method based on weighted multi-instance learning, solving the problem of poor tracking accuracy in the prior art. The method includes: 1. initializing a Haar-like feature model pool and constructing multiple feature model blocks, and setting the weighted extreme learning machine network parameters; 2. extracting the training samples in the current frame and their feature blocks corresponding to the different feature model blocks; 3. calculating the weighted multi-instance learning weight values; 4. constructing a plurality of networks corresponding to the different feature blocks, and selecting the network with the largest bag similarity function value together with its corresponding feature model block; 5. calculating the network global output weight values; 6. extracting the detection samples in the next frame and their feature blocks corresponding to the selected feature model block; 7. classifying the detection samples by means of the selected network and obtaining the target position in the next frame; and 8. repeating the above steps until the video ends. The invention improves tracking accuracy and realizes robust tracking of the target.
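The weighted extreme learning machine referenced in steps 4-5 admits a closed-form solution for the output weights. The sketch below assumes the standard weighted-ELM ridge formulation, beta = (I/C + H^T W H)^(-1) H^T W T with W = diag(w), a sigmoid hidden layer, and illustrative parameter values; it is not the patent's exact network:

```python
import numpy as np

def train_weighted_elm(X, T, w, L=100, C=1.0, rng=None):
    """Weighted ELM: random fixed hidden layer, closed-form weighted ridge output weights.

    X: (n, d) sample features; T: (n,) targets; w: (n,) per-sample weights
    (e.g. the multi-instance weight values of step 3). L hidden nodes,
    C regularization strength -- both illustrative.
    """
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    A = rng.standard_normal((d, L))            # random input weights (never trained)
    b = rng.standard_normal(L)                 # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ A + b)))     # sigmoid hidden-layer outputs, (n, L)
    W = np.diag(w)
    # beta = (I/C + H^T W H)^(-1) H^T W T
    beta = np.linalg.solve(np.eye(L) / C + H.T @ W @ H, H.T @ W @ T)
    return A, b, beta

def elm_predict(X, A, b, beta):
    """Network output for detection samples; sign gives the class (step 7)."""
    H = 1.0 / (1.0 + np.exp(-(X @ A + b)))
    return H @ beta
```

Because only `beta` is solved for (the hidden layer stays random), training reduces to one linear solve, which is what makes per-frame retraining cheap enough for tracking.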

Description

Technical field

[0001] The invention belongs to the technical field of target tracking, and in particular relates to a weighted extreme learning machine video target tracking method, which can be used for intelligent video retrieval, medical image processing, and terminal guidance of photoelectric weapons.

Background technique

[0002] Video target tracking is an important research direction in the fields of computer vision and artificial intelligence. Its main task is to track single or multiple targets of interest in a video sequence.

[0003] The tracking models used for video target tracking can be divided into two main categories: generative models and discriminative models. A generative method first establishes an appearance model of the target, then searches the next frame image for the candidate whose appearance best matches the model established from the current frame image, and takes the corresponding position as the tracking result. T...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20
Inventors: 姬红兵, 曹奕, 张文博, 刘龙, 殷鹏飞
Owner: XIDIAN UNIV