
Target tracking and positioning method based on feature fusion

A target tracking and feature fusion technology, applied in the field of visual target tracking, that addresses problems such as the limited accuracy of discriminative tracking, background clutter, and illumination changes, and achieves the effects of preventing model drift, improving accuracy, and improving robustness.

Active Publication Date: 2020-09-01
HUAQIAO UNIVERSITY +1

AI Technical Summary

Problems solved by technology

[0003] Target tracking technology faces the problems of fast-moving targets, illumination changes, deformation, occlusion, scale changes, and background clutter. Traditional target tracking methods usually rely on hand-crafted features to cope with these appearance changes; however, the limited discriminative power of shallow visual features restricts their tracking accuracy.



Examples


Embodiment Construction

[0060] The general idea of the technical solution in the embodiments of the present application is as follows: fuse the HOG feature, CN feature, and depth feature of the image to improve the expressive ability of the image features; update the tracking model of the previous frame through a change factor to prevent model drift and tracking failure; and update the change factor by calculating the average peak correlation energy of the fused response value, which improves the robustness of tracking and thereby the accuracy of target tracking.
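
As a concrete but non-authoritative reading of the confidence measure mentioned above, the sketch below computes the average peak correlation energy of a 2-D response map using the commonly cited APCE definition; the patent text does not show its exact formula, so the function name and formula here are assumptions.

```python
import numpy as np

def average_peak_correlation_energy(response):
    """APCE of a 2-D response map (assumed standard definition):
    APCE = |R_max - R_min|^2 / mean((R - R_min)^2).
    A sharp, unimodal peak yields a high APCE; occlusion or background
    clutter flattens the response map and lowers it."""
    r_max = float(response.max())
    r_min = float(response.min())
    return (r_max - r_min) ** 2 / (float(np.mean((response - r_min) ** 2)) + 1e-12)
```

A tracker can compare this value against its running average and skip or damp the model update when the current APCE is low, which is one simple way to realize the anti-drift behavior described above.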

[0061] Referring to Figure 1 to Figure 12, a preferred embodiment of the feature fusion-based target tracking and positioning method of the present invention includes the following steps:

[0062] Step S10: given the target position in the first frame of the image, crop the first frame centered on the target position to generate a target template O, and create a search area Z;

[0063] Step S20: respectively extract the HOG features, CN features, and depth features of the target template O and the search area Z, and fuse them to generate a fusion response value; ...
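
The following minimal sketch illustrates steps S10 and S20 in spirit only: it crops a template and a search region around the given position and fuses per-feature correlation responses by a weighted sum. The feature extractors are simple stand-ins (intensity and gradient magnitude) rather than real HOG, CN, or CNN features, and the equal fusion weights are illustrative, not taken from the patent.

```python
import numpy as np

def crop_patch(image, center, size):
    """Crop a square patch of side `size` centered at (row, col), clipped to the image."""
    r, c = center
    h = size // 2
    r0, r1 = max(r - h, 0), min(r + h, image.shape[0])
    c0, c1 = max(c - h, 0), min(c + h, image.shape[1])
    return image[r0:r1, c0:c1]

def correlation_response(template, search):
    """Cross-correlate a template with a (larger) search region via the FFT."""
    T = np.fft.fft2(template, s=search.shape)   # zero-pad template to search size
    S = np.fft.fft2(search)
    return np.real(np.fft.ifft2(np.conj(T) * S))

def fused_response(template, search, extractors, weights):
    """Weighted sum of per-feature correlation responses (stand-in for the fusion in S20)."""
    maps = [correlation_response(f(template), f(search)) for f in extractors]
    return sum(w * m for w, m in zip(weights, maps))

if __name__ == "__main__":
    frame = np.random.rand(240, 320)
    template = crop_patch(frame, (120, 160), 32)        # S10: target template O
    search = crop_patch(frame, (120, 160), 96)          # S10: search area Z
    intensity = lambda p: p.astype(float)
    grad_mag = lambda p: np.hypot(*np.gradient(p.astype(float)))
    R = fused_response(template, search, [intensity, grad_mag], [0.5, 0.5])
    row, col = np.unravel_index(np.argmax(R), R.shape)  # peak of the fused response
```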



Abstract

The invention provides a target tracking and positioning method based on feature fusion in the field of visual target tracking, and the method comprises the steps: S10, giving the target position of a first frame of image, cutting the first frame of image with the target position as the center to generate a target template, and creating a search region; S20, extracting HOG features, CN features and depth features of the target template and the search region, and performing fusion to generate a fusion response value; S30, calculating change factors P<t-1> and Q<t-1> of the video by using regularized linear regression; S40, based on the fusion response value, P<t-1> and Q<t-1>, performing calculation to obtain the latest target position; and S50, calculating the average peak correlation energy of the fusion response value, updating P<t-1> and Q<t-1> according to the average peak correlation energy, and tracking the next frame of image. The method has the advantage that the accuracy of target tracking is greatly improved.
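
The abstract does not spell out how the change factors P<t-1> and Q<t-1> are defined or updated. A common correlation-filter reading, which is an assumption on my part rather than the patent's stated method, is that they act as the numerator and denominator terms of a filter obtained by regularized linear regression (ridge regression) in the Fourier domain, with the average peak correlation energy scaling the update rate (S50). A minimal sketch under that assumption, with all names and constants illustrative:

```python
import numpy as np

LAMBDA = 1e-2    # ridge (regularized linear regression) penalty -- assumed value
BASE_LR = 0.02   # base learning rate for the change factors -- assumed value

def ridge_filter_terms(feature_fft, label_fft):
    """Per-frame numerator/denominator of a correlation filter learned by
    ridge regression in the Fourier domain (assumed roles of P and Q)."""
    P = np.conj(feature_fft) * label_fft
    Q = np.conj(feature_fft) * feature_fft + LAMBDA
    return P, Q

def update_change_factors(P_prev, Q_prev, P_new, Q_new, apce, apce_mean):
    """Blend previous and current terms: a high APCE (confident, sharp peak)
    permits a larger update, a low APCE keeps the previous model (anti-drift)."""
    lr = BASE_LR * min(apce / (apce_mean + 1e-12), 1.0)
    return (1 - lr) * P_prev + lr * P_new, (1 - lr) * Q_prev + lr * Q_new
```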

Description

Technical Field

[0001] The invention relates to the field of visual target tracking, and in particular to a feature fusion-based target tracking and positioning method.

Background Technique

[0002] Target tracking is part of video analysis, which covers the middle- and high-level processing stages of visual target tracking, that is, processing video image sequences to study the behavior of moving targets, including motion detection, target classification, target tracking, and behavior understanding. Given information such as the position and size of the target in the first frame, the task of target tracking is to determine the position of the target in each subsequent frame by analyzing the video image sequence and to accurately frame the target. As an important branch of the field of visual target tracking, target tracking methods are being researched and applied ever more widely in various fields such ...

Claims


Application Information

IPC(8): G06K9/32; G06K9/62; G06T7/246
CPC: G06T7/248; G06T2207/10016; G06V10/25; G06V2201/07; G06F18/22; G06F18/253
Inventor: 柳培忠, 柳垚, 庄加福, 陈智, 杜永兆, 邓建华
Owner: HUAQIAO UNIVERSITY