Fast positioning method for video events from rough state to fine state

A video event positioning technology, applied in the field of video event localization, that addresses problems of existing methods: absence of detection-rate statistics, inability to make quantitative and accurate comparisons, and high computational cost

Inactive Publication Date: 2011-08-17
南方报业传媒集团

AI Technical Summary

Problems solved by technology

[0005] The above two methods provide only visual experimental results and do not report detection rates, so they cannot be compared quantitatively and accurately.
In addition, to localize the query event in a video, both methods must search the complete X-Y-T space. To shrink this huge search space, both down-sample the original video, which increases the risk of missed detections, while significant computational cost is still spent on space-time locations that are unlikely to contain the query event.
These defects mean that existing video event localization methods fall short of practical requirements in both performance and time efficiency, which limits their scope of application.

Method used



Examples


Embodiment 1

[0085] As shown in Figure 1, a fast coarse-to-fine localization method for video events specifically includes the following steps:

[0086] (1) Coarse search for space-time volumes of interest: a group of video clips most likely to contain the query event is obtained by temporally segmenting the real video, and the region of interest of each frame is obtained by spatially segmenting it. The regions of interest of the frames in each clip are normalized and stacked in time order to form a set of space-time volumes of interest. Temporal segmentation of the real video includes detecting space-time interest points, extracting HOG and HOF features for each space-time segment, matching features between segments using the chi-square distance, and using a classification algorithm to determine the start and end points of each video clip; spatial segmentation of the real video includes using the historical frame and current frame inf...
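The chi-square feature matching mentioned in this step can be sketched as follows. This is a minimal illustration, not the patent's exact procedure: the descriptor values, segment names, and the threshold are invented for the example, and real HOG/HOF descriptors would be much longer histograms.

```python
import numpy as np

def chi_square_distance(h1, h2, eps=1e-10):
    """Chi-square distance between two normalized feature histograms
    (e.g. HOG or HOF descriptors): 0.5 * sum((a - b)^2 / (a + b))."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    return 0.5 * float(np.sum((h1 - h2) ** 2 / (h1 + h2 + eps)))

# Hypothetical example: compare a query descriptor against candidate
# space-time segments and keep those whose distance is below a
# threshold (the threshold value here is an assumption).
query = np.array([0.2, 0.3, 0.1, 0.4])
segments = {
    "seg_a": np.array([0.25, 0.25, 0.1, 0.4]),   # similar to query
    "seg_b": np.array([0.7, 0.1, 0.1, 0.1]),     # dissimilar
}
threshold = 0.2
matches = {name: d for name, h in segments.items()
           if (d := chi_square_distance(query, h)) < threshold}
```

With these toy descriptors, only `seg_a` survives the match; identical histograms give a distance of zero, which is why the chi-square distance works as a similarity criterion for start/end point classification.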



Abstract

The invention discloses a fast coarse-to-fine positioning method for video events, comprising the following steps: coarsely matching the query event against the real video to obtain the start and end points that divide the video in time; extracting a region of interest from each frame of the video to complete its spatial division; combining the spatial and temporal divisions to obtain a series of space-time volumes of interest; finely matching each space-time volume of interest against the query event to construct a relevance volume; applying a global significance test to the relevance volume to identify whether each space-time volume of interest contains an event related to the query sample; and finally using a post-processing method to localize and display the space-time region around the best-matched salient points. Because the coarse matching removes a large number of irrelevant space-time regions, the matching search space is effectively reduced and fine matching is performed only between the space-time volumes of interest and the query event, which improves the search speed.
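The two-stage structure described in the abstract can be sketched in a few lines. This is an illustrative skeleton under simplifying assumptions (cosine similarity as the matching score, fixed-length temporal segments, and an invented threshold), not the patent's actual matching functions:

```python
import numpy as np

def coarse_to_fine_locate(frame_feats, query_feat,
                          segment_len=8, coarse_thresh=0.8):
    """Coarse-to-fine event localization sketch.

    frame_feats : (T, D) array of per-frame descriptors.
    query_feat  : (D,) descriptor of the query event.

    Coarse stage: score each fixed-length segment cheaply via its
    mean descriptor and discard segments below the threshold.
    Fine stage: score individual frames only inside survivors.
    Returns the best-matching frame index within each survivor.
    """
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-10))

    hits = []
    for s in range(0, len(frame_feats), segment_len):
        seg = frame_feats[s:s + segment_len]
        if cosine(seg.mean(axis=0), query_feat) >= coarse_thresh:  # coarse
            best = max(range(len(seg)),
                       key=lambda i: cosine(seg[i], query_feat))   # fine
            hits.append(s + best)
    return hits
```

The key property claimed by the abstract is visible here: the expensive per-frame (in the patent, per-voxel) matching runs only on segments that pass the cheap coarse filter, so most of the video never reaches the fine stage.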

Description

Technical field

[0001] The invention belongs to the field of video event localization methods, and in particular relates to a fast coarse-to-fine localization method for video events.

Background technique

[0002] The localization of video events has a wide range of applications in video retrieval, video browsing, intelligent surveillance, and human motion analysis. Current video event localization methods fall into two main categories: learning-based methods and learning-free methods. A learning-based method must build a training model for each query event, but because training the model requires tuning multiple parameters, over-fitting may occur. A learning-free method requires no training: the user only provides the corresponding query event, and the space-time location of the query event in the real video is obtained by searching. This type of method uses query events as templates to perform matchi...
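A back-of-envelope calculation illustrates why exhaustive template matching over the full X-Y-T space, as described in the background, is expensive. The video and template dimensions below are assumptions chosen only to show the scale of the problem:

```python
def num_placements(W, H, T, w, h, t, stride=1):
    """Number of candidate placements of a w x h x t query template
    inside a W x H x T video when sliding with the given stride."""
    nx = (W - w) // stride + 1
    ny = (H - h) // stride + 1
    nt = (T - t) // stride + 1
    return nx * ny * nt

# Assumed example: a 64x64x30 template in 5 minutes of 320x240,
# 30 fps video (9000 frames) yields hundreds of millions of
# candidate space-time windows at stride 1.
full_search = num_placements(320, 240, 9000, 64, 64, 30)
```

Down-sampling the video shrinks this count but risks missed detections, which is the trade-off the coarse-to-fine strategy of the invention is designed to avoid: it prunes whole temporal segments and spatial regions before any dense matching takes place.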

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30; G06T7/00
Inventors: 吴娴, 杨兴锋, 王春芙, 张东明, 何崑
Owner 南方报业传媒集团