Target tracking method and device

A target tracking technology, applied in the field of target tracking methods and devices, which addresses problems such as the inability to identify individual differences between objects of the same class.

Inactive Publication Date: 2017-07-04
BOCOM SMART INFORMATION TECH CO LTD
Cites 8 · Cited by 33
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] Therefore, the technical problem to be solved by the embodiments of the present invention lies in the fact that the target tracking method in the prior art cannot identify individual differences between objects of the same class.

Method used


Image

  • Target tracking method and device

Examples


Embodiment 1

[0070] This embodiment provides a target tracking method which, as shown in Figure 1, includes the following steps:

[0071] S1. Establish a network tracking model based on the current frame image. The network tracking model is trained on time-series video data to obtain an individual-level network. Object tracking usually means that, given the initial state of the object in the first frame of a video, the state of the object in subsequent frames is estimated automatically. The human eye can easily follow a specific target for a period of time, but for a machine this task is not easy: during tracking, the target may change drastically, be occluded by other targets, or be interfered with by similar-looking objects, among other complex situations. The above-mentioned current frame is the image at the current moment from the input video stream (the initial frame, the previous frame, or the next frame), and the inform...
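The patent text does not name a concrete similarity measure for comparing the tracked target area against candidate areas. As a minimal illustrative sketch, assuming the areas have already been mapped to feature vectors and using cosine similarity (a hypothetical choice, not stated in the patent), the comparison step might look like:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors.
    Illustrative choice; the patent does not specify the measure."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def best_candidate(target_feat: np.ndarray, candidate_feats: np.ndarray) -> int:
    """Return the index of the candidate region whose feature vector
    is most similar to the tracked target's feature vector."""
    scores = [cosine_similarity(target_feat, c) for c in candidate_feats]
    return int(np.argmax(scores))
```

Any vector-valued feature extractor (for example, an embedding produced by the individual-level network described above) could supply `target_feat` and `candidate_feats`.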

Embodiment 2

[0096] This embodiment provides a target tracking device corresponding to the target tracking method of Embodiment 1. As shown in Figure 5, it includes the following units:

[0097] An establishment unit 511, configured to establish a network tracking model according to the current frame image;

[0098] The first determining unit 512 is configured to determine the current frame image containing the tracked target;

[0099] The first acquiring unit 513 is configured to acquire the tracked target area in the current frame image;

[0100] The second acquiring unit 514 is configured to acquire the next frame image of the current frame;

[0101] The third acquiring unit 515 is configured to acquire a plurality of target candidate regions in the next frame image;

[0102] A calculation unit 516, configured to calculate the similarity between the tracked target area and each target candidate area;

[0103] The second determining unit 517 is configured to determine a target tracking area among the plurality of target candidate areas according to the similarities.
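The units above can be read as components of one pipeline object. The sketch below wires units 512-517 into a single method; the `extract` and `similarity` callables are hypothetical plug-in functions (the patent does not define their interfaces), so this is an interpretation of the structure, not the claimed implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence, Tuple

Box = Tuple[float, float, float, float]  # (x, y, w, h), an assumed box format

@dataclass
class TargetTrackingDevice:
    """Sketch of the device in Embodiment 2.
    `extract` maps (frame, box) to a feature; `similarity` scores two features.
    Both are assumptions for illustration, not taken from the patent."""
    extract: Callable[[object, Box], List[float]]
    similarity: Callable[[List[float], List[float]], float]

    def determine_tracking_area(self, frame, target_box: Box,
                                next_frame, candidates: Sequence[Box]) -> Box:
        # Units 512-516: feature of the tracked area vs. each candidate area.
        target_feat = self.extract(frame, target_box)
        scores = [self.similarity(target_feat, self.extract(next_frame, c))
                  for c in candidates]
        # Unit 517: pick the candidate with the highest similarity.
        return candidates[max(range(len(scores)), key=scores.__getitem__)]
```

For example, with `extract` returning the box itself and `similarity` as a negative squared distance, the device picks the candidate box closest to the previous target box.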



Abstract

The invention discloses a target tracking method and device. The target tracking method comprises the following steps: establishing a network tracking model according to the current frame image; determining the current frame image containing a tracked target; obtaining the tracked target area in the current frame image; obtaining the next frame image of the current frame; obtaining a plurality of target candidate areas in the next frame image; calculating the similarity between the tracked target area and each target candidate area; determining a target tracking area among the plurality of target candidate areas according to the similarities; determining the tracked target in the target tracking area; and obtaining the current state of the tracked target and updating the network tracking model according to the current state. With the disclosed method and device, establishing the network tracking model makes it possible to recognize individual differences between objects of the same class as the tracked target, and the appearance changes of the target object under various conditions can be expressed effectively when sample objects are updated.
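The abstract's step sequence can be sketched as a per-frame loop. The `model` interface below (with `embed`, `propose`, `similarity`, and `update` methods) is a hypothetical stand-in for the network tracking model; the patent describes the steps but not this API.

```python
def track(frames, init_box, model):
    """Sketch of the abstract's step sequence over a frame list.
    `model` is an assumed interface: embed(frame, box) -> feature,
    propose(frame, box) -> candidate boxes, similarity(a, b) -> score,
    update(frame, box) -> None (online model update)."""
    boxes = [init_box]
    for prev, nxt in zip(frames, frames[1:]):
        target_feat = model.embed(prev, boxes[-1])     # tracked target area
        candidates = model.propose(nxt, boxes[-1])     # target candidate areas
        scores = [model.similarity(target_feat, model.embed(nxt, c))
                  for c in candidates]
        best = candidates[scores.index(max(scores))]   # target tracking area
        boxes.append(best)
        model.update(nxt, best)                        # update the tracking model
    return boxes
```

The final `update` call reflects the abstract's last step: the model is refreshed with the current state of the tracked target so that later frames benefit from appearance changes observed so far.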

Description

technical field

[0001] The invention relates to the field of image processing, in particular to a target tracking method and device.

Background technique

[0002] The purpose of target tracking is to obtain the trajectory of a specific target in a video sequence. In recent years, with the rapid spread of networked video, target tracking has remained a hot topic in computer vision research, and it plays an important role in many practical vision systems. Target tracking refers to giving the initial state of the target in the first frame of a video and predicting the precise position of the target in subsequent frames. Visual target tracking is also a foundation of artificial intelligence, as it can simulate human visual behavior.

[0003] At present, the target tracking method in the prior art mainly uses image-based classification tasks to detect and track target objects, but it cannot identify the differences between individuals of the same class.

Claims


Application Information

IPC(8): G06T7/207, G06K9/00
CPC: G06T2207/20081, G06V20/41
Inventors: 谯帅, 蒲津, 何建伟, 张如高
Owner BOCOM SMART INFORMATION TECH CO LTD