Method for realizing extrasensory experience by retrieving video content

A video-content retrieval technology, applied in the fields of super-sensing device mapping and video event retrieval, achieving automation and improved retrieval efficiency.

Inactive Publication Date: 2016-04-27
EAST CHINA NORMAL UNIV

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to overcome the shortcoming of manually introducing special effects in 4D theaters by providing a method for introducing such effects automatically and efficiently, realizing real-time synchronization between super-sensing equipment and video scenes, together with a video event retrieval system.




Embodiment Construction

[0030] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0031] Referring to figure 1, the present invention comprises three steps: video preprocessing, video event retrieval, and processing of the video event retrieval results. In video preprocessing, the video frame is taken as the minimum unit: the video is divided into multiple analysis units, representative semantic features are extracted from each unit to obtain its semantic information, and a semantic model is established from that information. The description of a video event mainly covers two aspects: the object involved in the event (Who) and the development process of the event (How). The method described in this application studies three aspects: the description and feature extraction of event dynamic information, the establishment of a motion dictionary, and the resolution of the impact of interactions between events. For the description...
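The three-step pipeline above can be sketched in code. This is a minimal, hypothetical illustration, not the patented implementation: all names are invented, the per-unit "feature" is reduced to a single scalar, and the motion dictionary is a toy two-entry lookup standing in for the semantic model.

```python
# Sketch of the pipeline described in [0031]:
# (1) video preprocessing into fixed-length analysis units of frames,
# (2) video event retrieval by matching each unit's feature against a
#     motion dictionary, (3) result processing into (Who, How, start,
#     end) event records. All names and values are illustrative.

from dataclasses import dataclass

@dataclass
class Event:
    who: str      # event occurrence object (Who)
    how: str      # event development process (How)
    start: int    # first frame of the analysis unit
    end: int      # last frame of the analysis unit

def split_into_units(num_frames, unit_len=30):
    """Video preprocessing: the frame is the minimum unit; group
    frames into consecutive analysis units of unit_len frames."""
    return [(s, min(s + unit_len, num_frames) - 1)
            for s in range(0, num_frames, unit_len)]

def retrieve_events(unit_features, motion_dictionary, units):
    """Video event retrieval: label each unit with the nearest
    motion-dictionary entry (a stand-in for the semantic model)."""
    events = []
    for (start, end), feat in zip(units, unit_features):
        who, how = min(motion_dictionary,
                       key=lambda k: abs(motion_dictionary[k] - feat))
        events.append(Event(who, how, start, end))
    return events

# Toy usage: three 30-frame units, scalar per-unit features, and a
# two-entry motion dictionary keyed by (Who, How).
units = split_into_units(num_frames=90, unit_len=30)
dictionary = {("rain", "falling"): 0.2, ("wind", "gusting"): 0.8}
events = retrieve_events([0.1, 0.9, 0.25], dictionary, units)
```

A real system would replace the scalar features with motion descriptors extracted per unit and the nearest-neighbour lookup with the learned semantic model.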



Abstract

The invention discloses a method for realizing an extrasensory experience by retrieving video content. The method comprises three steps: video preprocessing, video event retrieval, and processing of the video event retrieval results; it relates to the technical fields of video event retrieval and extrasensory device mapping. The method aims to automate the introduction of four-dimensional effects in a 4D (four-dimensional) cinema. Compared with the prior-art practice of introducing 4D effects manually, the disclosed method automates their introduction to the greatest extent, greatly improves the efficiency of introducing 4D effects in the cinema, and can provide the audience with a more realistic 4D experience.

Description

Technical field

[0001] The present invention relates to the technical fields of video event retrieval and super-sensing equipment mapping. Specifically, video event retrieval technology is used to automate the motion path of a third-party super-sensing device under the control of video events, so that the super-sensing device and the video events are synchronized in real time, providing an event-oriented video viewing experience.

Background technique

[0002] With the continuous development of social material culture, people have ever higher requirements for the experience of watching movies in theaters, hence the appearance of 4D theaters. Effects such as wind, rain, thunder, and lightning are designed around the scenes of the film to form a unique experience. At present, most of this experience is semi-automatic, that is, the time and type of special effects are confirmed manually; but this manual work is time-consuming and labor-intensive, and its accuracy is not high, an...
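The synchronization described in [0001] amounts to turning retrieved video events into a time-stamped schedule of device commands. The sketch below is a hypothetical illustration of that mapping, assuming events are given as frame ranges and playback runs at a fixed frame rate; the device API and effect names are invented, not taken from the patent.

```python
# Illustrative: drive a third-party "super-sensing" device from
# retrieved video events. Each event's frame range is converted to
# wall-clock time via the frame rate, yielding a schedule the device
# controller can replay in sync with playback. Names are invented.

def schedule_effects(events, fps=25.0):
    """events: list of (effect_name, start_frame, end_frame) tuples.
    Returns (effect_name, start_sec, duration_sec) triples."""
    schedule = []
    for effect, start_frame, end_frame in events:
        start_sec = start_frame / fps             # when to trigger
        duration = (end_frame - start_frame + 1) / fps  # how long
        schedule.append((effect, round(start_sec, 3), round(duration, 3)))
    return schedule

# A storm scene: wind from frame 0, rain from frame 250, at 25 fps.
commands = schedule_effects([("wind", 0, 124), ("rain", 250, 499)])
# commands[0] == ("wind", 0.0, 5.0); commands[1] == ("rain", 10.0, 10.0)
```

Real-time synchronization would additionally require latency compensation for the physical device, but the frame-to-time mapping is the core of replacing the manual cueing described in [0002].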


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30
CPC: G06F16/786; G06F16/783
Inventor: 吕钊, 刘欢, 陈梦伟
Owner: EAST CHINA NORMAL UNIV