
Object scale self-adaption tracking method based on spatial-temporal model

A scale-adaptive tracking technology based on a spatio-temporal model, applied in character and pattern recognition, instruments, computer parts, etc. It addresses problems such as the decline in tracking accuracy caused by target scale change, and achieves robust tracking, improved accuracy, and wide application prospects.

Active Publication Date: 2015-12-02
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

[0004] Aiming at the decline in tracking accuracy caused by changes in target scale, the present invention introduces a multi-scale historical target template library constructed by borrowing the idea of clustering, proposes a target scale adaptive tracking algorithm based on a spatio-temporal model, and achieves real-time, robust tracking of targets undergoing scale change.
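The paragraph above names two ingredients: a spatio-temporal model for localization and a clustering-style multi-scale historical template library for scale selection. The sketch below illustrates only the second ingredient, under stated assumptions: the class name, the fixed scale set, the merge-oldest heuristic standing in for clustering, and the normalized cross-correlation score are all illustrative choices, not the patent's actual construction.

```python
# Hedged sketch (not the patented implementation): a minimal multi-scale
# historical template library. Templates cropped at several scales are kept
# in bounded per-scale buckets; merging the two oldest entries is a crude
# stand-in for the clustering idea mentioned above.
import numpy as np
import cv2

class MultiScaleTemplateLibrary:
    def __init__(self, scales=(0.9, 1.0, 1.1), max_per_scale=5):
        self.scales = scales
        self.max_per_scale = max_per_scale
        self.templates = {s: [] for s in scales}   # grayscale patches per scale

    def add(self, patch):
        """Store the tracked patch resampled at every library scale."""
        h, w = patch.shape[:2]
        for s in self.scales:
            resized = cv2.resize(patch, (max(1, int(w * s)), max(1, int(h * s))))
            bucket = self.templates[s]
            bucket.append(resized.astype(np.float32))
            if len(bucket) > self.max_per_scale:    # keep the history bounded
                a, b = bucket.pop(0), bucket.pop(0)
                b = cv2.resize(b, (a.shape[1], a.shape[0]))
                bucket.insert(0, 0.5 * (a + b))     # rough clustering proxy

    def best_scale(self, candidate):
        """Return the scale whose stored templates best match the candidate patch."""
        best_s, best_score = self.scales[0], -np.inf
        for s, bucket in self.templates.items():
            for tpl in bucket:
                c = cv2.resize(candidate.astype(np.float32),
                               (tpl.shape[1], tpl.shape[0]))
                score = cv2.matchTemplate(c, tpl, cv2.TM_CCOEFF_NORMED)[0, 0]
                if score > best_score:
                    best_s, best_score = s, score
        return best_s
```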




Embodiment Construction

[0033] In order to better illustrate the purpose, concrete steps and features of the present invention, the invention is described in further detail below in conjunction with the accompanying drawings:

[0034] Referring to Figure 1, the target scale adaptive tracking method based on the spatio-temporal model proposed by the present invention mainly comprises the following steps:

[0035] Step 1. Read in the first frame image Image₁ and manually specify the tracking target rectangle position Ζ;

[0036] Step 2. Based on the context region $\Omega_c$, initialize the spatio-temporal model, letting

[0037] $H_1^{stc}(\mathbf{x}) = h_1^{sc}(\mathbf{x}) = F^{-1}(\ldots)$
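A minimal sketch of Step 2, assuming the spatio-temporal context model is initialized in the usual STC fashion: the first-frame model $H_1^{stc}$ is taken to equal the spatial context model $h_1^{sc}$, solved in the Fourier domain from a confidence-map prior and an intensity context prior. The function name and the parameters alpha, beta and sigma are illustrative, not taken from the patent text.

```python
# Illustrative sketch only: initialize a spatio-temporal context model on the
# context region Omega_c around the manually specified target rectangle.
# Assumes the context region lies fully inside the image.
import numpy as np

def init_stc_model(gray, center, context_size, alpha=2.25, beta=1.0, sigma=None):
    cx, cy = center                                # target centre (x, y)
    ch, cw = context_size                          # context region height, width
    y0, x0 = int(cy) - ch // 2, int(cx) - cw // 2
    context = gray[y0:y0 + ch, x0:x0 + cw].astype(np.float64)

    xs, ys = np.meshgrid(np.arange(cw) - cw // 2, np.arange(ch) - ch // 2)
    dist = np.sqrt(xs ** 2 + ys ** 2)
    if sigma is None:
        sigma = 0.25 * (ch + cw)

    # Confidence-map prior: peaks at the target centre, decays with distance.
    conf = np.exp(-((dist / alpha) ** beta))
    # Context prior: pixel intensities weighted by a Gaussian window.
    prior = context * np.exp(-(dist ** 2) / (2.0 * sigma ** 2))
    prior /= prior.sum() + 1e-12

    # h1_sc = F^-1( F(conf) / F(prior) ); the first-frame H1_stc equals h1_sc.
    h_sc = np.real(np.fft.ifft2(np.fft.fft2(conf) / (np.fft.fft2(prior) + 1e-12)))
    return h_sc
```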



Abstract

The invention discloses an object scale self-adaptive tracking method based on a spatial-temporal model. The method comprises the following steps: at the start of the video, read in the first frame and manually specify the rectangle position of the tracked object; then, based on the context region, initialize the spatial-temporal model and a multi-scale historical object template library; next, read in the next frame, build the spatial-temporal model iteratively, calculate a confidence map, and estimate the object center position; then, according to the historical object template library, judge the optimal template scale, determine the rectangle position of the object, complete tracking of the object in the current frame, and update the scale parameters of the spatial-temporal model and the multi-scale historical object template library; finally, detect whether the video is finished, and if not, continue reading the next frame, otherwise tracking is complete. With this method, changes in the apparent scale of the object are handled effectively under illumination change, partial occlusion and rapid motion, and robust tracking is achieved.
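Read as a pipeline, the abstract maps onto a simple per-frame loop. The sketch below strings the steps together; it leans on hypothetical helpers (init_stc_model, estimate_center, update_stc_model, crop_around, MultiScaleTemplateLibrary) that are assumptions of this illustration rather than routines disclosed in the patent.

```python
# Hedged end-to-end sketch of the pipeline in the abstract. All helper
# routines referenced here are hypothetical placeholders, not the patent's.
import cv2

def track(video_path, init_rect):
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    x, y, w, h = init_rect                          # manually specified rectangle
    center = (x + w // 2, y + h // 2)
    model = init_stc_model(gray, center, (2 * h, 2 * w))     # context region
    library = MultiScaleTemplateLibrary()                    # multi-scale history
    library.add(gray[y:y + h, x:x + w])

    while True:
        ok, frame = cap.read()
        if not ok:                                  # video finished: tracking done
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Build the spatio-temporal model for this frame, compute the confidence
        # map, and take its peak as the new object centre.
        center = estimate_center(model, gray, center, (2 * h, 2 * w))

        # Judge the optimal scale against the historical template library, fix
        # the object rectangle, then update scale parameters and the library.
        s = library.best_scale(crop_around(gray, center, (h, w)))
        w, h = int(round(w * s)), int(round(h * s))
        library.add(crop_around(gray, center, (h, w)))
        model = update_stc_model(model, gray, center, (2 * h, 2 * w))

        yield (center[0] - w // 2, center[1] - h // 2, w, h)
```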

Description

Technical field: [0001] The invention belongs to the field of machine vision, and in particular relates to a target scale adaptive tracking method based on a spatio-temporal model.

Background technique: [0002] Target tracking is a high-level visual processing task in video surveillance systems: computer vision, image and video processing, and related technologies are used to describe, process and analyze the targets in the image sequence captured by a camera without human intervention, so as to detect, track and identify moving objects in dynamic scenes and then obtain trajectories of interest from the processed and analyzed object characteristics. Moving target tracking and detection is an important research topic in the field of video surveillance. The quality of target tracking and detection in video images directly affects the accuracy of advanced processing such as target behavior detection, and event understanding and analysis in the subsequent v...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/00; G06K 9/62
CPC: G06V 20/42; G06V 20/46; G06V 20/52; G06V 2201/07; G06F 18/23
Inventors: 蒋敏, 吴佼, 孔军, 柳晨华, 皮昕鑫
Owner: JIANGNAN UNIV