
Target tracking method and device based on targeted candidate areas

A target-candidate-based target tracking technology, applied in the field of image processing, intended to solve problems such as the inability to determine target candidate regions in video images and the resulting tracking failures.

Status: Inactive; Publication Date: 2017-06-30
BOCOM SMART INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] Therefore, the technical problem to be solved by the embodiments of the present invention is that the target tracking method in the prior art mainly extracts the feature information of the target object by detection and then tracks it; because the target candidate area cannot be determined in the video image, the tracking process is susceptible to external conditions and is therefore more likely to fail.



Examples


Embodiment 1

[0062] An embodiment of the present invention provides a target tracking method based on target candidates which, as shown in Figure 1, includes the following steps:

[0063] S1. Determine the current frame image containing the tracked target. The multiple consecutive images input from the video stream form a complete video, and these images are continuous, time-series-associated data; the specific tracking can therefore only be completed by obtaining the current frame image of the tracked target, and the tracked target can only be found in the video stream once the position of the current frame image containing the tracked target has been determined.
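As a minimal illustrative sketch of step S1 (not taken from the patent text), the snippet below reads the frame that contains the tracked target from a video file using OpenCV; the use of cv2, the function name get_current_frame, and the assumption that the index of the annotated frame is known externally are all illustrative choices made here.

```python
import cv2

def get_current_frame(video_path, frame_index=0):
    """Read the frame of the video stream that contains the tracked target.

    Assumption: the index of the frame containing the tracked target
    (e.g. the first, annotated frame) is supplied by the caller.
    """
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise IOError("could not read frame %d from %s" % (frame_index, video_path))
    return frame
```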

[0064] Specifically, object tracking usually refers to automatically estimating the state of a target object in subsequent frames, given the initial state of the object in the first frame of the tracking video. The human eye can easily follow a specific target for a period of time, but for a machine this task is not easy. During the tracking process, t...
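The excerpt above shows only step S1 of the method; the remaining steps are listed in the abstract and mirrored by the device units of Embodiment 2 below. The following is a hedged sketch of the resulting per-frame loop. The candidate-generation and similarity functions are passed in as placeholders because this excerpt does not specify how they are implemented.

```python
def track(frames, initial_box, get_candidates, similarity):
    """Per-frame tracking loop following steps S1-S6 as summarized in this excerpt.

    frames         -- list of video frames; frames[0] contains the tracked target (S1)
    initial_box    -- (x, y, w, h) of the tracked target area in frames[0] (S2)
    get_candidates -- callable(frame, previous_box) -> list of candidate boxes (S4);
                      placeholder, not specified in the excerpt
    similarity     -- callable(frame_a, box_a, frame_b, box_b) -> float (S5);
                      likewise a placeholder
    Returns the target tracking area selected in each frame (S6).
    """
    current_frame, tracked_box = frames[0], initial_box
    trajectory = [tracked_box]
    for next_frame in frames[1:]:                               # S3: next frame image
        candidates = get_candidates(next_frame, tracked_box)    # S4: candidate areas
        scores = [similarity(current_frame, tracked_box, next_frame, c)
                  for c in candidates]                          # S5: similarities
        tracked_box = candidates[scores.index(max(scores))]     # S6: highest similarity wins
        trajectory.append(tracked_box)
        current_frame = next_frame                              # next frame becomes current
    return trajectory
```

Passing the helpers in as arguments keeps the sketch faithful to the excerpt, which fixes the control flow of steps S1-S6 but not the concrete candidate-generation or similarity algorithms.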

Embodiment 2

[0083] This embodiment provides a target tracking device based on target candidates, corresponding to the target tracking method based on target candidates in Embodiment 1. As shown in Figure 5, it includes:

[0084] The first determining unit 41 is configured to determine the current frame image containing the tracked target;

[0085] The first acquiring unit 42 is configured to acquire the tracked target area in the current frame image;

[0086] The second acquiring unit 43 is configured to acquire the next frame image of the current frame;

[0087] The third acquiring unit 44 is configured to acquire a plurality of target candidate regions in the next frame image;

[0088] The calculation unit 45 is configured to calculate the similarity between the tracked target area and each target candidate area;

[0089] The second determining unit 46 is configured to determine the target tracking area in the plurality of target candidate areas according to the similarity.

[0090] As an i...
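For illustration only, the structural sketch below mirrors the six units 41-46 of Embodiment 2 as methods of a single Python class. The class name, constructor arguments, and injected helper callables are assumptions; the internal logic of each unit beyond what the excerpt states is not reproduced.

```python
class TargetTrackingDevice:
    """Structural sketch of the device units 41-46 described in Embodiment 2."""

    def __init__(self, frames, initial_box, candidate_generator, similarity_fn):
        self.frames = frames                            # video frames (assumption: preloaded list)
        self.box = initial_box                          # current tracked target area
        self.candidate_generator = candidate_generator  # injected helper (placeholder)
        self.similarity_fn = similarity_fn              # injected helper (placeholder)

    def determine_current_frame(self, index):           # first determining unit 41
        return self.frames[index]

    def acquire_tracked_area(self):                     # first acquiring unit 42
        return self.box

    def acquire_next_frame(self, index):                # second acquiring unit 43
        return self.frames[index + 1]

    def acquire_candidate_areas(self, next_frame):      # third acquiring unit 44
        return self.candidate_generator(next_frame, self.box)

    def compute_similarities(self, frame, next_frame, candidates):   # calculation unit 45
        return [self.similarity_fn(frame, self.box, next_frame, c) for c in candidates]

    def determine_tracking_area(self, candidates, scores):           # second determining unit 46
        self.box = candidates[scores.index(max(scores))]
        return self.box
```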



Abstract

The invention discloses a target tracking method and device based on target candidate areas. The target tracking method comprises: determining the current frame image containing a tracked target; obtaining the tracked target area from the current frame image; obtaining the frame image following the current frame; obtaining a plurality of target candidate areas from the following frame image; calculating the similarity between the tracked target area and each target candidate area; and determining the target tracking area from the multiple target candidate areas according to the similarities. By generating target candidates in this way, the specific position of the tracked target can be determined and the tracked target object can be detected accurately during the tracking process, thereby effectively increasing the stability of target tracking and avoiding tracking failure.
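The abstract does not specify how the similarity between the tracked target area and a candidate area is computed. Purely as an illustrative assumption, the sketch below scores two regions by the correlation of their HSV hue histograms using OpenCV; any other region descriptor or distance could be substituted.

```python
import cv2

def region_similarity(frame_a, box_a, frame_b, box_b, bins=32):
    """Illustrative similarity: correlation of HSV hue histograms of two regions.
    The actual similarity measure used by the patented method is not given in this excerpt."""
    def hue_hist(frame, box):
        x, y, w, h = box
        patch = frame[y:y + h, x:x + w]
        hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0], None, [bins], [0, 180])
        return cv2.normalize(hist, hist).flatten()
    return cv2.compareHist(hue_hist(frame_a, box_a),
                           hue_hist(frame_b, box_b),
                           cv2.HISTCMP_CORREL)
```

A function of this shape could be passed as the similarity argument of the tracking loop sketched under Embodiment 1.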

Description

Technical field

[0001] The present invention relates to the field of image processing, and in particular to a target tracking method and device based on target candidates.

Background technique

[0002] The purpose of target tracking is to obtain the trajectory of a specific target in a video sequence. In recent years, with the rapid spread of video over computer networks, target tracking has remained a hot research topic in the field of computer vision and plays an important role in many practical vision systems; in the process of tracking the target object, it is often necessary to select a candidate area of the tracked target in the video image in order to complete the specific tracking.

[0003] At present, the target tracking method in the existing technology mainly extracts the feature information of the target object through the detection method of a learning classification task and then tracks the target object, but the image information in the video stream is v...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00
CPC: G06V20/42
Inventors: 谯帅, 蒲津, 何建伟, 张如高
Owner: BOCOM SMART INFORMATION TECH CO LTD