
Training and tracking method based on multi-challenge perception learning model

A training and tracking technology based on a multi-challenge perception learning model, applied to neural learning methods, biological neural network models, neural architectures, etc. It solves the problem of poor extraction of challenge information, increases the richness of feature expression, achieves real-time tracking performance, and ensures tracking accuracy.

Active Publication Date: 2020-03-27
ANHUI UNIVERSITY

AI Technical Summary

Problems solved by technology

[0006] The technical problem to be solved by the present invention is to provide a training and tracking method based on a multi-challenge perception learning model, so as to solve the problem of poor extraction of multi-level challenge information.



Examples


Embodiment 1

[0053] As shown in Figure 1, a schematic diagram of the network model structure, the training method based on the multi-challenge perception learning model includes the following steps:

[0054] S11. Constructing a network model;

[0055] Obtain the first frame of the current tracking video sequence. Given the ground-truth box of the target in the first frame, take the center point of the ground-truth box as the mean and perform Gaussian-distribution sampling to obtain candidate samples. In this embodiment, the covariance is (0.09r², 0.09r², 0.25), yielding 256 candidate samples;

[0056] where r is the average of the target's width and height in the previous frame.

[0057] Obtaining the current tracking video sequence is existing technology (for example, through a camera) and will not be described in detail here; Gaussian-distribution sampling is likewise existing technology.
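The paragraphs above specify only the sampling distribution, not an implementation. Below is a minimal sketch of such Gaussian candidate sampling, assuming an MDNet-style (x, y, scale) parameterization; the function name and the 1.05 scale base are illustrative assumptions, while the covariance diagonal (0.09r², 0.09r², 0.25) and the 256-sample count come from [0055]-[0056].

```python
import numpy as np

def sample_candidates(gt_box, n_samples=256, rng=None):
    """Draw candidate boxes around gt_box = (cx, cy, w, h).

    Samples (dx, dy, ds) from a zero-mean Gaussian with covariance
    diag(0.09*r^2, 0.09*r^2, 0.25), where r is the mean of the target's
    width and height ([0056]).
    """
    rng = rng or np.random.default_rng()
    cx, cy, w, h = gt_box
    r = (w + h) / 2.0

    # Standard deviations are the square roots of the covariance diagonal.
    std = np.array([0.3 * r, 0.3 * r, 0.5])
    dx, dy, ds = (rng.normal(size=(n_samples, 3)) * std).T

    scale = 1.05 ** ds  # assumed scale base; not specified in the text
    return np.stack([cx + dx, cy + dy, w * scale, h * scale], axis=1)
```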

[0058] As shown in Figure 2, the networ...

Embodiment 2

[0081] As shown in Figures 1, 3, and 4: Figure 1 is a schematic diagram of the network model structure; Figure 3 is a flow block diagram of Embodiment 2 of the present invention; Figure 4 is a flowchart of the real-time visual tracking method based on the multi-challenge perception learning model;

[0082] A real-time visual tracking method based on a multi-challenge perception learning model comprises the following steps:

[0083] S21. Input the currently tracked video frame, and use Gaussian sampling to obtain candidate samples of the current frame around the target position predicted in the previous frame;

[0084] The first frame provided by the video sequence to be tracked is used as the previous frame; 5500 samples are randomly generated according to a Gaussian distribution around the ground-truth box that frames the target location area: S⁺ = 500 positive samples (IoU ≥ 0.7) and S⁻ = 5000 negative samples (IoU ≤ 0.3);
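As a hedged sketch of the labeling rule in [0084] (not the patent's code): candidates are split by their IoU against the ground-truth box, with S⁺ taking IoU ≥ 0.7 and S⁻ taking IoU ≤ 0.3. The (cx, cy, w, h) box format and helper names are assumptions.

```python
import numpy as np

def iou(boxes, gt_box):
    """IoU of each (cx, cy, w, h) box in `boxes` against a single gt_box."""
    def corners(b):
        b = np.atleast_2d(b)
        return (b[:, 0] - b[:, 2] / 2, b[:, 1] - b[:, 3] / 2,
                b[:, 0] + b[:, 2] / 2, b[:, 1] + b[:, 3] / 2)
    ax1, ay1, ax2, ay2 = corners(boxes)
    bx1, by1, bx2, by2 = corners(gt_box)
    iw = np.clip(np.minimum(ax2, bx2) - np.maximum(ax1, bx1), 0, None)
    ih = np.clip(np.minimum(ay2, by2) - np.maximum(ay1, by1), 0, None)
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union

def split_samples(candidates, gt_box, pos_thr=0.7, neg_thr=0.3):
    """Partition candidates into positives (S+) and negatives (S-)."""
    scores = iou(candidates, gt_box)
    return candidates[scores >= pos_thr], candidates[scores <= neg_thr]
```

In practice one would keep resampling until the required 500 positives and 5000 negatives are collected, since a single Gaussian draw rarely yields those counts exactly.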

[0085] Use 5500 samples to initialize ...



Abstract

The invention relates to a training and real-time tracking method based on a multi-challenge perception learning model. The method comprises two parts in sequence: a model training process and a tracking process carried out with the pre-trained model. The steps are as follows: S11, constructing a network model; S12, training the whole network model with a VOT data set having calibrated targets; S21, inputting the currently tracked video frame and obtaining candidate samples of the current frame around the target position predicted in the previous frame by Gaussian sampling; S22, obtaining feature maps of the candidate samples; S23, inputting the feature maps into a classifier module and predicting the target position; S24, judging whether the current frame is tracked successfully. The method effectively increases the richness of feature expression, improves tracking robustness, and achieves real-time tracking performance.
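Steps S21 to S24 form the online tracking loop. A minimal sketch of that loop follows, with `sample_candidates`, `extract_features`, `classifier`, and the success threshold passed in as placeholders; these names and the thresholding detail are assumptions for illustration, not the patent's actual components.

```python
import numpy as np

def track(frames, init_box, sample_candidates, extract_features,
          classifier, success_thr=0.0):
    """Track a target through `frames`, starting from `init_box`."""
    target = np.asarray(init_box, dtype=float)
    trajectory = [target]
    for frame in frames[1:]:
        # S21: Gaussian candidate samples around the previous prediction.
        candidates = sample_candidates(target)
        # S22: feature maps of the candidate samples.
        feats = extract_features(frame, candidates)
        # S23: classifier scores the candidates; the best one is the target.
        scores = classifier(feats)
        best = int(np.argmax(scores))
        # S24: judge whether the current frame is tracked successfully.
        if scores[best] > success_thr:
            target = candidates[best]
        trajectory.append(target)
    return trajectory
```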

Description

Technical Field

[0001] The invention relates to the field of computer vision, and in particular to a training and tracking method based on a multi-challenge perception learning model.

Background Technique

[0002] Visual tracking is a fundamental research problem in computer vision. It aims to estimate the state of a tracked object in subsequent video frames, given the object's initial state (such as size and position) in the first frame of a video sequence. Visual tracking technology is now widely used in intelligent video surveillance, unmanned driving, augmented reality, and other fields, and has important research significance for the development of social security, cultural entertainment, and other areas.

[0003] With the continuous improvement of computer hardware performance and the introduction of large-scale visual data sets (such as ImageNet, a large-scale image classification data set), methods based on deep learning, especially deep...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045
Inventors: 李成龙 (Li Chenglong), 刘磊 (Liu Lei), 鹿安东 (Lu Andong)
Owner: ANHUI UNIVERSITY