
Video space-time search method

A video spatio-temporal retrieval technology, applied to video data retrieval, video data query, and other special data-processing applications. It addresses the problems of large retrieval result sets with dense spatial distribution, which make visual analysis of the result video set difficult, and achieves the effects of clear imaging and improved usability.

Active Publication Date: 2017-05-31
NANJING NORMAL UNIVERSITY
Cites: 3 | Cited by: 4
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] However, video data exhibits spatial aggregation and redundant expression of information in space, because multiple videos are usually shot of a single object of interest; these characteristics are even more pronounced for spontaneously collected and uploaded video data. This leads to large retrieval result sets with dense spatial distribution, which complicates the visualization and further analysis of the result video collection.



Examples


Detailed Description of the Embodiments

[0039] The technical scheme of the present invention is described in further detail below with reference to the accompanying drawings:

[0040] The basic steps of the video spatio-temporal retrieval method of the present invention, as shown in Figure 1, are as follows:

[0041] Step 1: Obtain the spatio-temporal information of the video:

[0042] (1) Obtain the information of each video, including imaging chip size, image resolution, aperture diameter, focus distance, relative aperture, circle-of-confusion diameter, video shooting start time, and video shooting end time;

[0043] (2) Obtain the information of each video frame in each video, including the shooting position (longitude/latitude coordinates and shooting height), shooting attitude (pitch angle and rotation angle), focal length, and shooting time;

[0044] (3) All of the video information, together with the corresponding information of each video frame, constitutes the complete spatio-temporal description of the video.
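The per-video and per-frame metadata collected in step 1 can be sketched as a simple record structure. This is a minimal illustration only; the class and field names, units, and types below are our assumptions, not part of the patent:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FrameInfo:
    """Per-frame spatio-temporal metadata (step 1, item (2))."""
    position: Tuple[float, float, float]  # (longitude, latitude, shooting height)
    pitch_angle: float                    # shooting attitude: pitch, degrees
    rotation_angle: float                 # shooting attitude: rotation, degrees
    focal_length: float                   # assumed millimetres
    timestamp: float                      # shooting time, e.g. Unix seconds

@dataclass
class VideoInfo:
    """Per-video metadata (step 1, item (1)) plus its frames (item (3))."""
    chip_size: Tuple[float, float]        # imaging chip width/height
    resolution: Tuple[int, int]           # image width/height, pixels
    aperture_diameter: float
    focus_distance: float
    relative_aperture: float              # f-number
    confusion_circle_diameter: float      # circle-of-confusion diameter
    start_time: float                     # video shooting start time
    end_time: float                       # video shooting end time
    frames: List[FrameInfo] = field(default_factory=list)
```

Together, one `VideoInfo` record with its list of `FrameInfo` records corresponds to the "complete spatio-temporal description" of item (3).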

[0045]S...



Abstract

The invention discloses a video space-time search method, comprising the steps of: searching for the set of videos that satisfy a time constraint according to a temporal search condition; sampling the search object into a number of sampled spatial points, and sampling the shooting direction into a number of direction units; determining, for each video frame, whether a sampled spatial point lies within the frame's visual range, is unobstructed by barriers, and is clearly imaged, and, if these conditions are satisfied, calculating the perception intensity of the current spatial point by the current video frame; calculating the direction from which each video frame shoots the sampled spatial point, and assigning each frame to the corresponding angle unit; and, for each sampled spatial point and each angle unit, summing the perception intensities of the video frames belonging to the same video and selecting the video with the maximum perception intensity as the target video. The set formed by the target videos over all sampled spatial points and direction units is the space-time search result. On this basis, an ordered list comprehensively describing the spatial object information is obtained.

Description

Technical field

[0001] The invention relates to a video spatio-temporal retrieval method, and in particular to a video retrieval method that considers temporal and spatial information such as the video shooting time, the position of the spatial object, and the direction from which the spatial object is shot.

Background technique

[0002] Video data is a streaming medium that carries visual, auditory, temporal, and spatial information. With the growing deployment of surveillance cameras and the popularity of video acquisition devices such as smartphones, video data has grown explosively. The explosive growth in the acquisition, sharing, and use of video data poses great challenges for video retrieval.

[0003] Current video retrieval can be divided into two categories: content-based video retrieval and metadata-based video retrieval. With the integrated application of space-related sensors (GPS, electronic compass, gravity sensor, gyroscope, etc.), the method is automatically calculated...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30
CPC: G06F16/73, G06F16/738
Inventors: 王美珍, 刘学军, 孙开新, 王自然
Owner: NANJING NORMAL UNIVERSITY