Target detection tracking method based on feature fusion and environment self-adaption

A feature-fusion and target-detection technology, applied in character and pattern recognition, image data processing, instruments, and the like. It addresses problems such as tracking failure and interference from similar targets, achieves good robustness, improves tracking speed, and reduces computational complexity.

Pending Publication Date: 2022-01-04
DATANG DONGBEI ELECTRIC POWER TESTING & RES INST

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a target detection and tracking method based on feature fusion and environment adaptation. On the basis of the KCF algorithm, an adaptive classifier and a color moment feature descriptor are introduced to solve the tracking failures that the KCF tracker suffers when the target is severely occluded, interfered with by similar targets, or leaves the field of view, so as to achieve fast and accurate tracking of the target with good robustness.
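For reference, a classical color-moment descriptor summarizes each color channel of a patch by its mean, standard deviation, and skewness. The sketch below (Python with NumPy) illustrates that generic form only; the function name, the signed cube-root skewness, and the RGB patch layout are assumptions for illustration and are not taken from the patent text.

import numpy as np

def color_moments(patch):
    """Per-channel mean, standard deviation and (signed cube-root) skewness
    of an H x W x C image patch -- a generic color-moment descriptor given
    only to illustrate the kind of feature the patent refers to."""
    patch = patch.astype(np.float64)
    feats = []
    for c in range(patch.shape[2]):
        channel = patch[..., c].ravel()
        mean = channel.mean()
        std = channel.std()
        third = np.mean((channel - mean) ** 3)            # third central moment
        skew = np.sign(third) * np.abs(third) ** (1.0 / 3.0)
        feats.extend([mean, std, skew])
    return np.asarray(feats)                              # length 3 * C

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patch = rng.integers(0, 256, size=(32, 32, 3))
    print(color_moments(patch))                           # 9 values for an RGB patch

A descriptor of this size is cheap to compute and compare, which is consistent with the patent's stated goals of reducing complexity and improving tracking speed.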



Examples


Embodiment Construction

[0044] The specific embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings and examples. The following examples illustrate the present invention but are not intended to limit its scope.

[0045] Referring to figure 1, this embodiment provides a target detection and tracking method based on feature fusion and environment adaptation, comprising the following steps:

[0046] Step 1, use a pre-classifier to preprocess the video data, and divide the video data into simple frames and complex frames;

[0047] Step 2, for simple frames, extract a single feature; for complex frames, extract HoG, Haar, CN features for multi-directional feature fusion;

[0048] Step 3, based on a sample screening mechanism driven by motion prediction, use the least squares method to fit the target's motion trajectory, take the fitted trajectory as a parameter for the weight calculation, and re-evaluate the samples generated by cyclic sampling to obtain more accurate weights (an illustrative sketch of this step follows).
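A minimal sketch of how step 3 could be realized is given below, assuming the trajectory is fitted as a low-order polynomial over the last few center positions and that cyclically generated samples are down-weighted by a Gaussian of their distance to the predicted center. The window length, polynomial degree, Gaussian bandwidth, and function names are illustrative assumptions, not the patent's exact formulation.

import numpy as np

def fit_trajectory(centers, degree=2):
    """Least-squares polynomial fit of recent target center positions
    (x(t), y(t)) against the frame index t."""
    centers = np.asarray(centers, dtype=np.float64)
    t = np.arange(len(centers))
    px = np.polyfit(t, centers[:, 0], degree)
    py = np.polyfit(t, centers[:, 1], degree)
    return px, py

def predict_next_center(px, py, n_seen):
    """Evaluate the fitted trajectory one frame ahead."""
    return np.polyval(px, n_seen), np.polyval(py, n_seen)

def reweight_samples(sample_centers, predicted_center, base_weights, sigma=15.0):
    """Re-evaluate the weights of cyclically generated samples: candidates
    far from the motion-predicted position are down-weighted by a Gaussian
    factor (an assumed re-weighting rule)."""
    d = np.linalg.norm(np.asarray(sample_centers, dtype=np.float64)
                       - np.asarray(predicted_center), axis=1)
    w = np.asarray(base_weights, dtype=np.float64) * np.exp(-d ** 2 / (2.0 * sigma ** 2))
    return w / (w.sum() + 1e-12)

if __name__ == "__main__":
    history = [(100, 80), (104, 82), (109, 85), (115, 89), (122, 94)]
    px, py = fit_trajectory(history)
    pred = predict_next_center(px, py, len(history))
    print(pred)
    print(reweight_samples([(128, 99), (150, 60), (126, 97)], pred, [1.0, 1.0, 1.0]))

In this form, samples that disagree with the predicted motion contribute less to the weight update, which is the screening effect the step describes.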



Abstract

The invention relates to a target detection and tracking method based on feature fusion and environment self-adaption, comprising the following steps: 1, preprocessing the video data with a pre-classifier and dividing it into simple frames and complex frames; 2, extracting a single feature for simple frames and, for complex frames, extracting HoG, Haar and CN features for multi-direction feature fusion; and 3, based on a sample screening mechanism driven by motion prediction, fitting the target's motion trajectory by the least squares method, taking the fitted trajectory as a parameter of the weight calculation, and re-evaluating the samples generated by cyclic sampling to obtain more accurate weights. While maintaining tracking speed and precision, the method is more robust to target deformation, fast motion and scale change, and is particularly advantageous in complex backgrounds and long-term sequence processing. It reduces the computational complexity of the system as far as possible without affecting tracking accuracy, and improves the tracking speed of the algorithm.
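As a rough illustration of how the pre-classifier and the two feature branches described in the abstract could be organized, the sketch below dispatches between a single feature for simple frames and a fused feature for complex frames. The HoG-, Haar-, and CN-style extractors are crude stand-ins, and the variance-based complexity test and its threshold are assumptions; none of these details come from the patent itself.

import numpy as np

def hog_like(gray):
    """Coarse histogram of gradient orientations (a stand-in for HoG)."""
    gy, gx = np.gradient(gray.astype(np.float64))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi
    hist, _ = np.histogram(ang, bins=9, range=(0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-12)

def haar_like(gray):
    """Two-rectangle Haar-like response: left half minus right half."""
    h, w = gray.shape
    return np.array([gray[:, : w // 2].mean() - gray[:, w // 2 :].mean()])

def cn_like(rgb):
    """Coarse color histogram (a stand-in for Color Names features)."""
    hist, _ = np.histogramdd(rgb.reshape(-1, 3), bins=(4, 4, 4),
                             range=((0, 256), (0, 256), (0, 256)))
    hist = hist.ravel()
    return hist / (hist.sum() + 1e-12)

def is_complex_frame(gray, clutter_threshold=30.0):
    """Toy pre-classifier: call a frame 'complex' when the patch has high
    intensity variance (an assumed proxy for background clutter)."""
    return gray.std() > clutter_threshold

def extract_features(rgb):
    """Single feature for simple frames, fused features for complex frames."""
    gray = rgb.astype(np.float64).mean(axis=2)
    if not is_complex_frame(gray):
        return hog_like(gray)                     # simple frame: one feature
    return np.concatenate([hog_like(gray),        # complex frame: fuse HoG-,
                           haar_like(gray),       # Haar- and CN-style features
                           cn_like(rgb)])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    patch = rng.integers(0, 256, size=(64, 64, 3))
    print(extract_features(patch).shape)

Real HoG, Haar-like, and Color Names features are far richer than these stand-ins, but the dispatch-and-concatenate structure stays the same.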

Description

Technical field

[0001] The invention relates to the technical field of computer vision target tracking, and in particular to a target detection and tracking method based on feature fusion and environment self-adaptation.

Background technique

[0002] Video object tracking has long been a key research problem in computer vision, with applications in many fields such as public transportation, intelligent surveillance, intelligent security, and human-computer interaction. As the range of applications expands and the level of science and technology rises, research on target tracking is becoming ever more in-depth.

[0003] Target tracking refers to accurately following the target region across continuous video frames. However, many current target tracking algorithms are not well suited to tracking tasks in multiple scenarios. Currently, relatively mature tracking algorithms are generally... It c...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/246; G06T7/90; G06K9/00; G06K9/62
CPC: G06T7/246; G06T7/90; G06F18/22; G06F18/24; G06F18/253
Inventors: 马迪, 袁智, 马博洋, 李金拓, 胡嘉铭, 董蔚, 李强
Owner: DATANG DONGBEI ELECTRIC POWER TESTING & RES INST