
A target tracking method based on a deep forest

A target tracking method applied in the field of deep forest-based target tracking. It addresses the problems of large computational and sample-size requirements and the large number of hyperparameters in deep neural networks, and achieves good real-time performance, good adaptability, and improved accuracy and stability.

Active Publication Date: 2019-04-26
NANJING UNIV OF SCI & TECH


Problems solved by technology

Deep neural networks can automatically learn deep features from large amounts of data. Compared with hand-crafted image features, these learned features are more stable, and deep networks have been widely applied to target tracking in recent years. At the same time, they have limitations: large sample-size and computational requirements, many hyperparameters, and a training process that depends heavily on experience.



Examples


Embodiment

[0056] With reference to Figure 1, the deep forest-based target tracking method of the present invention comprises the following steps:

[0057] Step 1: As shown in Figure 2, establish the target tracking video set. Normalize the target image in the video set and the image to be matched, and merge them in the horizontal direction to form a sample image, establishing a sample set containing 50,000 positive samples and 50,000 negative samples. Divide the set into a training set and a test set at a ratio of 5:5, as follows:
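The 5:5 division of the sample set into training and test sets can be sketched as follows. This is a minimal illustration; the shuffle and fixed seed are assumptions for reproducibility, since the patent does not state how the split is performed.

```python
import random

def split_samples(samples, ratio=0.5, seed=0):
    """Shuffle the sample set and split it into a training set and a
    test set at the given ratio (5:5 per Step 1). The shuffle and
    seed are assumptions; the patent only specifies the ratio."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * ratio)
    return shuffled[:cut], shuffled[cut:]
```

With 100,000 samples (50,000 positive plus 50,000 negative), this yields 50,000 training and 50,000 test samples.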

[0058] Step 1.1: Denote the target image and the image to be matched as I1 and I2. Normalize I1 and I2 to size [32, 32], obtaining the normalized target image I1' and the normalized image to be matched I2', and calculate the overlap rate (IOU) between the two.
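The overlap rate in Step 1.1 is the standard intersection-over-union of two bounding boxes. A minimal sketch follows; the `(x, y, w, h)` box convention is an assumption, as the patent does not specify a coordinate format.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x, y, w, h).

    The (x, y, w, h) convention is assumed for illustration; the
    patent text only says an overlap rate IOU is computed."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Intersection rectangle: clamp widths/heights at zero for
    # non-overlapping boxes.
    ix = max(ax, bx)
    iy = max(ay, by)
    iw = max(0.0, min(ax + aw, bx + bw) - ix)
    ih = max(0.0, min(ay + ah, by + bh) - iy)
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0
```

For example, `iou((0, 0, 10, 10), (5, 5, 10, 10))` gives 25/175 ≈ 0.143, well below the 0.8 threshold of Step 1.2.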

[0059] Step 1.2: If the overlap rate IOU is greater than the threshold 0.8, merge the normalized target image I1' and the normalized image to be matched I2' into a sample...
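Steps 1.1–1.2 together can be sketched as follows, treating images as NumPy arrays. The nearest-neighbour resize is a stand-in for a proper image resize, and labeling a pair negative when the IoU is at or below 0.8 is an assumption — the truncated text gives only the positive-pair rule.

```python
import numpy as np

def normalize(img, size=(32, 32)):
    """Nearest-neighbour resize to `size` (a stand-in for a proper
    image resize; the patent only requires normalization to [32, 32])."""
    h, w = img.shape[:2]
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    return img[rows][:, cols]

def make_sample(target, candidate, iou_value, pos_thresh=0.8):
    """Normalize both images, merge them horizontally into one sample
    image, and label it 1 (positive) when the IoU exceeds the 0.8
    threshold of Step 1.2, else 0 (negative rule assumed)."""
    i1 = normalize(target)
    i2 = normalize(candidate)
    sample = np.hstack([i1, i2])  # merged sample has shape (32, 64)
    label = 1 if iou_value > pos_thresh else 0
    return sample, label
```

The horizontal merge is what lets a single classifier judge target/candidate pairs: the model sees both patches side by side in one input.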



Abstract

The invention discloses a target tracking method based on a deep forest. The method comprises the steps of: connecting a target image and a to-be-matched image in series in the horizontal direction to form a sample image, and establishing positive and negative sample sets; defining a multi-granularity scanning layer and a cascade forest layer, and connecting them in series to construct a deep forest model; training the multi-granularity scanning layer and the cascade forest layer respectively, determining the number of cascade forest layers by stopping training once the increase in test accuracy falls below a threshold; defining a local search range based on the target position in the previous frame; extracting candidate images at discrete scale intervals and normalizing them; connecting the first-frame target image with each candidate image in series, inputting the result into the deep forest model, comparing the probabilities output by the model, and taking the position corresponding to the highest probability as the current position of the target, thereby realizing target tracking. The invention improves the stability of target tracking in complex environments and has good real-time performance.
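The per-frame tracking loop described in the abstract — scan candidate positions in a local window around the previous target location, score each (first-frame target, candidate) pair, and take the argmax of the model's output probability — can be sketched as follows. The `score_fn` is a hypothetical stand-in for the trained deep forest's matching probability; the window radius and stride are illustrative, and the discrete-scale search is omitted for brevity.

```python
def track_step(score_fn, prev_pos, search_radius=16, stride=4):
    """One tracking step: scan candidate positions in a local window
    around `prev_pos` and return the position whose matching score is
    highest. `score_fn(pos)` stands in for the deep forest's output
    probability for the (target, candidate-at-pos) pair."""
    px, py = prev_pos
    best_pos, best_p = prev_pos, float("-inf")
    for dx in range(-search_radius, search_radius + 1, stride):
        for dy in range(-search_radius, search_radius + 1, stride):
            pos = (px + dx, py + dy)
            p = score_fn(pos)  # probability that this candidate matches
            if p > best_p:
                best_pos, best_p = pos, p
    return best_pos
```

Restricting the scan to a local window around the previous position is what keeps the method real-time: only a few dozen candidates are scored per frame instead of the whole image.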

Description

technical field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a deep forest-based object tracking method.

Background technique

[0002] Object tracking is an important research direction in the field of computer vision that has received extensive attention from academia and industry. Its purpose is to locate a moving target object in a sequence of images and obtain its related parameters, such as position, speed, scale, and trajectory. These parameters can be further used to understand the behavior of the target object or to complete higher-level tasks. With the improvement of computer performance and the growing number of camera terminals, the demand for target tracking is increasing, and it has good development prospects. Typical applications include security monitoring, traffic monitoring, human-computer interaction, robotics, military fields, and medical applications.

[0003] There are many influencing facto...

Claims


Application Information

IPC(8): G06T7/20
CPC: G06T7/20; G06T2207/20081; G06T2207/10016
Inventors: 朱周, 刘英舜, 胡启洲, 郭唐仪, 周竹萍
Owner: NANJING UNIV OF SCI & TECH