Video interaction inquiry method and system based on mass data

A query method and system for massive video data, applied in the field of video data query. It addresses the problems that existing video data cannot provide accurate location information, that a semantic retrieval mechanism is lacking, and that current approaches cannot meet the video query needs of smart cities, with the effect of increasing the relevance and effectiveness of query results.

Active Publication Date: 2014-08-13
SHENZHEN INST OF ADVANCED TECH

AI Technical Summary

Problems solved by technology

[0002] At present, video data in cities has aggregated into massive (big data) volumes. The traditional approach of labeling surveillance cameras by the area in which they are located and storing the files in directories can no longer meet the video query needs of smart cities.
[0003] On the one hand, because video data cannot provide accurate location information, it is difficult to combine it with electronic maps, GPS, and similar services for higher-level applications. On the other hand, the lack of a semantic retrieval mechanism makes it impossible to quickly search and accurately locate specific video content.



Examples


Embodiment 1

[0028] A video interactive query method based on massive data mainly includes the following steps:

[0029] In step S101, based on the spatial positioning method of video analysis, a video object spatial positioning coordinate system centered on a single monitoring camera is established.
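The patent does not disclose the positioning math, but a camera-centered coordinate system as in step S101 can be sketched as follows: given a camera's world position and heading, an object detected at some range and bearing in the camera's local frame is mapped into a shared planar world frame. All function and parameter names here are illustrative assumptions, not the patent's method.

```python
import math

def locate_object(cam_x, cam_y, cam_heading_deg, obj_range_m, obj_bearing_deg):
    """Map an object detected in one camera's local polar frame
    (range + bearing relative to the camera's optical axis) into a
    shared planar world frame (x east, y north).

    This is a minimal sketch assuming the camera's world position and
    heading are known from calibration; the actual range/bearing
    estimation by video analysis is not modelled here."""
    theta = math.radians(cam_heading_deg + obj_bearing_deg)
    world_x = cam_x + obj_range_m * math.sin(theta)
    world_y = cam_y + obj_range_m * math.cos(theta)
    return world_x, world_y
```

Because every camera maps detections into the same world frame, coordinates produced by different cameras become directly comparable, which is what makes the cross-camera association of step S102 possible.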

[0030] In step S102, based on the spatial positioning coordinate system, an association of video data between a plurality of surveillance cameras is established.

[0031] Said associations include:

[0032] (1) Association by time, for example, to show the traffic conditions of a road section during the 6:00 p.m. rush hour;

[0033] (2) Association by geographic information. For example, five surveillance cameras installed along a street can be associated according to geographic information such as their monitoring ranges and relative distances;

[0034] (3) Association by video object, such as tracking the video data of a suspected vehicle...
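The three association criteria above can be sketched as different grouping keys over the same pool of video segments. This is a toy illustration under assumed field names (hour buckets for time, a coarse grid for geography, an analysis-assigned object label), not the patent's actual association mechanism.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Segment:
    """One video segment from one camera; field names are assumptions."""
    camera_id: str
    start: float       # shooting time, epoch seconds
    x: float           # spatial coordinate from the positioning step
    y: float
    object_id: str     # label assigned by video analysis (e.g. a plate number)

def associate(segments, key):
    """Group segments from many cameras by one of the three criteria:
    'time' (same hour), 'geo' (same ~100 m grid cell), or 'object'
    (same tracked object id)."""
    buckets = defaultdict(list)
    for s in segments:
        if key == "time":
            k = int(s.start // 3600)
        elif key == "geo":
            k = (round(s.x / 100), round(s.y / 100))
        else:  # "object"
            k = s.object_id
        buckets[k].append(s)
    return buckets
```

Each grouping answers a different query: the time buckets support the rush-hour example, the grid cells support the five-cameras-on-a-street example, and the object buckets support tracking a suspected vehicle across cameras.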

Embodiment 2

[0053] Referring to figure 2, which shows the block diagram of the video interactive query system based on massive data. The query system includes: a video data collection terminal 10, a server 20, and a query terminal 30.

[0054] The video data collection terminal 10 is used to provide the content of the video data, the shooting time, and the location information of the shooting place. It mainly includes a plurality of surveillance cameras 11 distributed throughout the city.

[0055] The server 20 is used for unified management of the video data, including establishing a unified video format, associating video data from multiple sources, and setting indexing rules. Specifically, the server includes:

[0056] The video database 21 is used to establish a unified coordinate system and a unified video format for the collected video data, and perform association according to part of the content in the video format.
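The unified video format managed by the video database 21 can be sketched as a record carrying the fields the abstract names as mandatory: shooting time, spatial coordinate in the unified system, and a semantic index. The record layout and field names below are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class VideoRecord:
    """Unified storage format sketched from the abstract: each stored
    clip carries at least its shooting time, its spatial coordinate in
    the unified coordinate system, and its semantic tags. Field names
    are illustrative, not taken from the patent."""
    clip_path: str                 # where the compressed clip is stored
    shot_time: float               # shooting time, epoch seconds
    coord: tuple                   # (x, y) in the unified coordinate system
    semantic_tags: list = field(default_factory=list)  # basis of the semantic index
```

Storing every clip in one such format is what lets the database associate data from heterogeneous cameras by any shared field rather than by source directory.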

[0057] The index generation module 22 is used to set index rules and extract video semantic features...



Abstract

The invention provides a video interactive query method based on massive data, applied in the field of urban security. The method comprises the steps of: constructing a video object spatial positioning coordinate system centered on a single surveillance camera; constructing an association of video data among multiple surveillance cameras; adding spatial coordinates to the video data; compressing the video data in real time; extracting video semantic features during compression and generating a semantic index; storing the data in a unified coordinate system and a unified video format, the video format comprising at least the shooting time, the spatial coordinates, and the semantic index; performing a query using input semantics, the shooting time, and/or the spatial coordinates as keywords; and searching for video data related to the keywords and outputting the query result. The invention also provides a corresponding query system. Thanks to precise spatial positioning, attribute-space bidirectional interactive query over massive video data is realized.

Description

Technical field

[0001] The invention relates to video data query, and in particular to a video interactive query method and system based on massive data.

Background technique

[0002] At present, video data in cities has aggregated into massive (big data) volumes, and the traditional method of labeling surveillance cameras by the area in which they are located and storing the files in catalogs can no longer meet the video query needs of smart cities.

[0003] On the one hand, because video data cannot provide accurate location information, it is difficult to combine it with electronic maps, GPS, and similar services for higher-level applications, such as conditional selection of video resources and correlation search over a given spatial activity range. On the other hand, due to the lack of a semantic retrieval mechanism for video data, it is impossible to quickly search and accurately locate specific video content, for example according to characteristics such as vehicle color...

Claims


Application Information

IPC(8): G06F17/30; H04N7/18
CPC: G06F16/7867
Inventor: 修文群
Owner: SHENZHEN INST OF ADVANCED TECH