
Target tracking based on adaptive feature fusion.

A feature fusion and target tracking technology, applied in the fields of image processing and computer vision, that addresses problems such as scale variation, occlusion, and the difficulty of achieving real-time tracking, and achieves accurate tracking with improved precision.

Inactive Publication Date: 2018-01-30
ANHUI UNIVERSITY
Cites: 1; Cited by: 60

AI Technical Summary

Problems solved by technology

Although researchers have proposed many tracking algorithms in recent years, it remains difficult to achieve real-time, efficient, and stable tracking in the face of target appearance changes, fast motion, scale variation, and occlusion.
[0003] Traditional tracking methods build complex appearance models and extract large numbers of learning samples, which incurs a heavy computational cost and makes real-time tracking difficult.

Method used




Embodiment Construction

[0102] In order to make the purpose, technical route, and beneficial effects of the present invention clearer, the invention is further described below in conjunction with the accompanying drawings and specific embodiments.

[0103] The implementation process of the target tracking method based on adaptive feature fusion is shown in Figure 1 and includes the following steps:

[0104] Step 1: Initialize the target and select the target area;

[0105] Specific steps:

[0106] According to the first frame, the initial position of the tracked target is p = [x, y, w, h].

[0107] Where x, y represent the abscissa and ordinate of the center point of the target, and w, h represent the width and height of the target frame, respectively.

[0108] The target area P_t is taken to be the rectangular region centered at the target's center point and twice the size of the target.
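
As an illustration of Step 1, the following is a minimal sketch in Python, assuming frames are NumPy arrays; the function name init_target_area and the clipping of the window to the image bounds are illustrative additions, not part of the patent text.

import numpy as np

def init_target_area(frame, p):
    """Step 1 sketch: given the first-frame target box p = [x, y, w, h]
    (x, y: centre coordinates; w, h: width and height), crop a search
    region P_t that is twice the target size and centred on the target."""
    x, y, w, h = p
    win_w, win_h = 2 * w, 2 * h          # search window is 2x the target size
    H, W = frame.shape[:2]
    x0 = int(max(0, x - win_w / 2))      # clip to image bounds (safeguard)
    y0 = int(max(0, y - win_h / 2))
    x1 = int(min(W, x + win_w / 2))
    y1 = int(min(H, y + win_h / 2))
    return frame[y0:y1, x0:x1]

# Example: region = init_target_area(first_frame, [120, 80, 40, 60])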

[0109] Step 2: Select samples in the target area to calculate HOG features and CN...
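
A hedged sketch of the feature extraction mentioned in Step 2. The HOG part uses scikit-image's skimage.feature.hog; the CN (Color Names) part assumes the 32768x11 RGB-to-color-name lookup table of van de Weijer et al. is available as an array w2c (its source and filename are not specified by the patent), so it is only a placeholder.

import numpy as np
from skimage.feature import hog
from skimage.color import rgb2gray

def extract_hog(patch, cell=4):
    # Illustrative HOG: 9 orientations over cell x cell pixel cells.
    return hog(rgb2gray(patch), orientations=9,
               pixels_per_cell=(cell, cell), cells_per_block=(1, 1),
               feature_vector=False)

def extract_cn(patch, w2c):
    # Map each RGB pixel to an 11-dim Color Name probability vector using
    # a precomputed lookup table w2c of shape (32768, 11) (assumed given).
    r = patch[..., 0].astype(int) // 8
    g = patch[..., 1].astype(int) // 8
    b = patch[..., 2].astype(int) // 8
    index = r + 32 * g + 32 * 32 * b
    return w2c[index]                    # shape (H, W, 11)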



Abstract

The invention provides an adaptive feature fusion target tracking method. The method comprises: in the first frame image, initializing the target area and constructing a location filter and a scale filter; extracting detection samples around the target, calculating the HOG (Histogram of Oriented Gradients) feature and the CN (Color Names) feature respectively, and obtaining response values through the location filter; calculating feature weights according to the response values, normalizing the weight coefficients, fusing the feature response values, and selecting the point with the largest fused response as the target's center location; judging whether occlusion occurs according to the target response and, when the target is occluded, updating only the scale filter without updating the location filter; and repeating this process cyclically to obtain the target location in each frame. The method provides an adaptive feature fusion scheme and a model updating strategy based on APCE (Average Peak-to-Correlation Energy), which greatly improve tracking precision and robustness under occlusion.
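
The two ideas highlighted in the abstract, response-weighted feature fusion and the APCE-based update test, can be sketched as follows. This is only an illustration under assumptions: the weights are taken as the normalized peak responses of the HOG and CN channels, APCE follows the common definition APCE = |F_max - F_min|^2 / mean((F - F_min)^2), and the thresholds beta1 and beta2 are illustrative rather than the patent's exact criterion.

import numpy as np

def fuse_responses(resp_hog, resp_cn):
    # Adaptive fusion sketch: weight each feature's response map by its
    # own peak, normalize the weights, and sum the maps.
    w_hog, w_cn = resp_hog.max(), resp_cn.max()
    s = w_hog + w_cn
    fused = (w_hog / s) * resp_hog + (w_cn / s) * resp_cn
    cy, cx = np.unravel_index(np.argmax(fused), fused.shape)
    return fused, (cx, cy)               # fused map and new target centre

def apce(response):
    # Average Peak-to-Correlation Energy of a response map.
    f_max, f_min = response.max(), response.min()
    return (f_max - f_min) ** 2 / np.mean((response - f_min) ** 2)

def should_update(response, apce_hist, peak_hist, beta1=0.6, beta2=0.6):
    # Occlusion test sketch: update the location filter only when the
    # current APCE and peak exceed fractions of their historical means.
    return (apce(response) >= beta1 * np.mean(apce_hist) and
            response.max() >= beta2 * np.mean(peak_hist))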

Description

Technical Field

[0001] The invention belongs to the fields of image processing and computer vision, and in particular relates to a target tracking method based on adaptive feature fusion.

Technical Background

[0002] In computer vision, object tracking is a hot research field with applications in many human-computer interaction scenarios such as video surveillance and automatic monitoring. Although researchers have proposed many tracking algorithms in recent years, it remains difficult to achieve real-time, efficient, and stable tracking in the face of target appearance changes, fast motion, scale variation, and occlusion.

[0003] Traditional tracking methods build complex appearance models and extract large numbers of learning samples, which incurs a heavy computational cost and makes real-time tracking difficult. Correlation filtering methods bypass the construction of complex appearance models and a large number of learning samples, a...
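
To illustrate why correlation filtering sidesteps heavy appearance modelling, the following is a generic MOSSE-style single-channel filter trained and applied in the Fourier domain; it is a sketch of the general technique, not the patent's specific location/scale filter formulation.

import numpy as np

def train_filter(patch, sigma=2.0, lam=1e-2):
    # Closed-form filter in the Fourier domain:
    # H* = (G . conj(F)) / (F . conj(F) + lambda), with a Gaussian label G.
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    g = np.exp(-((xs - w // 2) ** 2 + (ys - h // 2) ** 2) / (2 * sigma ** 2))
    F, G = np.fft.fft2(patch), np.fft.fft2(g)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def detect(H_conj, patch):
    # Correlate the learned filter with a new search patch; the offset of
    # the response peak from the window centre gives the translation.
    resp = np.real(np.fft.ifft2(H_conj * np.fft.fft2(patch)))
    dy, dx = np.unravel_index(np.argmax(resp), resp.shape)
    return resp, (dx, dy)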

Claims


Application Information

IPC(8): G06T7/246; G06K9/62
Inventors: 孙战里, 谷成刚
Owner: ANHUI UNIVERSITY