
Convolutional neural network regression model-based visual tracking method and device

A visual tracking technology based on convolutional neural networks, applied in the field of computer vision. It solves the problem that the target position cannot be inferred directly from its components, thereby improving tracking performance and achieving robust target tracking.

Active Publication Date: 2017-12-29
INST OF AUTOMATION CHINESE ACAD OF SCI
Cites: 4 · Cited by: 16

AI Technical Summary

Problems solved by technology

[0005] In order to solve the above-mentioned problems in the prior art, namely that the target tracking process is divided into two independent steps of component matching and target positioning, so that the position of the target cannot be inferred directly from the components, one aspect of the present invention proposes a visual tracking method based on a convolutional neural network regression model, comprising the following steps:




Embodiment Construction

[0053] Preferred embodiments of the present invention are described below with reference to the accompanying drawings. Those skilled in the art should understand that these embodiments are only used to explain the technical principles of the present invention, and are not intended to limit the protection scope of the present invention.

[0054] The purpose of the present invention is to use convolutional neural networks to regress object positions for robust visual tracking. The present invention comprehensively considers component context information and component reliability, and realizes regression from component to target in an end-to-end framework.

[0055] The method of the present invention implements a robust component-to-target regression model in an end-to-end framework through convolutional neural networks. The proposed model not only maintains the overall spatial layout structure via component context information, but also learns the reliability of each component.
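The idea of combining per-component predictions with learned reliability weights can be illustrated with a minimal numpy sketch. All names and shapes here are hypothetical stand-ins (the patent does not disclose its network architecture or parameters); the sketch only shows how reliability-weighted component votes can regress a single target offset in one differentiable step, rather than via separate matching and positioning stages.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: the target patch is split into K components, each described
# by a D-dimensional feature vector (stand-ins for CNN features).
K, D = 4, 8
component_feats = rng.normal(size=(K, D))

# Hypothetical learned parameters:
#  - W maps each component's features to a predicted (dx, dy) target offset
#  - reliability scores (softmax-normalized) weight each component's vote,
#    so occluded or unreliable components contribute less.
W = rng.normal(size=(D, 2)) * 0.1
reliability_logits = rng.normal(size=K)
reliability = np.exp(reliability_logits) / np.exp(reliability_logits).sum()

# Each component regresses a candidate offset; the final estimate is the
# reliability-weighted combination, so component cues and target positioning
# are fused in a single end-to-end computation.
per_component_offsets = component_feats @ W          # shape (K, 2)
target_offset = reliability @ per_component_offsets  # shape (2,)

print(target_offset.shape)  # (2,)
```

In a real end-to-end model both `W` and the reliability logits would be outputs of convolutional layers trained jointly with the regression loss.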



Abstract

The present invention belongs to the field of computer vision and proposes a visual tracking method and device based on a convolutional neural network regression model. The invention aims to solve the problem that the target tracking process is divided into two independent steps, component matching and target positioning, and therefore cannot infer the location of the target directly from its components. The method comprises the following steps: S1, at the initial frame of visual tracking, an image block is sampled around the given target to be tracked and divided into a plurality of components; S2, a pre-constructed convolutional neural network regression model is trained using stochastic gradient descent; and S3, in subsequent frames of visual tracking, a search area is constructed based on the position of the target in the previous frame, and the position of the target in the current frame is obtained through the trained convolutional neural network regression model. Because the components are fully combined with the positioning of the target, the method and device achieve better robustness.
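The geometric parts of steps S1 and S3 can be sketched as follows. This is a minimal illustration under assumed conventions (boxes as `(x, y, w, h)`, a 2×2 component grid, a search area twice the target size); the patent does not specify these values, and the trained CNN regression of step S2 is omitted.

```python
import numpy as np

def split_into_components(patch, grid=2):
    """S1: divide a sampled image block into a grid of components."""
    h, w = patch.shape[:2]
    ch, cw = h // grid, w // grid
    return [patch[i*ch:(i+1)*ch, j*cw:(j+1)*cw]
            for i in range(grid) for j in range(grid)]

def search_region(prev_box, scale=2.0, frame_hw=(480, 640)):
    """S3: build a search area centered on the previous target position,
    clipped to the frame boundaries."""
    x, y, w, h = prev_box
    cx, cy = x + w / 2, y + h / 2
    sw, sh = w * scale, h * scale
    H, W = frame_hw
    x0 = int(max(0, cx - sw / 2)); y0 = int(max(0, cy - sh / 2))
    x1 = int(min(W, cx + sw / 2)); y1 = int(min(H, cy + sh / 2))
    return (x0, y0, x1 - x0, y1 - y0)

frame = np.zeros((480, 640), dtype=np.uint8)
init_box = (300, 200, 64, 48)            # given target at the initial frame
x, y, w, h = init_box
patch = frame[y:y+h, x:x+w]
components = split_into_components(patch, grid=2)
print(len(components))                   # 4 components
print(search_region(init_box))           # (268, 176, 128, 96)
```

In the full method, the trained regression model of step S2 would take features computed over the components inside this search region and output the target's position in the current frame.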

Description

Technical Field

[0001] The invention belongs to the field of computer vision, and in particular relates to a visual tracking method and device based on a convolutional neural network regression model.

Background

[0002] Visual tracking is a fundamental component in computer vision applications; fields such as intelligent video surveillance, augmented reality, robotics, and human-computer interaction all require robust tracking of objects of interest. Although great progress has been made in this field in recent years, visual tracking remains a difficult task because it faces challenges such as partial occlusion, deformation, lighting changes, motion blur, fast motion, background clutter, and scale changes.

[0003] To address these problems, many component-based methods have been studied in recent years, which decompose the target object into a set of components for study. In fact, some parts of the target remain visible when...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T 7/223; G06N 3/04
CPC: G06N 3/04; G06T 7/223; G06T 2207/10016; G06T 2207/20021; G06T 2207/20081; G06T 2207/20084
Inventors: 徐常胜, 张天柱, 高君宇
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI