Generating method and system of video scene database and method and system for searching video scenes

A technology relating to video scenes and video files, applied in the field of video search, which solves the problem that users must spend a great deal of time to obtain a small number of target video scene segments, and achieves the effect of saving time.

Status: Inactive; Publication Date: 2011-04-20
李平辉

AI Technical Summary

Problems solved by technology

[0005] From the above, it can be seen that under the existing network technology, the user must spend a lot of time to obtain a small number of target video scene segments.



Examples


Embodiment 1

[0073] The content, format, type and other attributes of the video files involved do not affect the implementation of the solution of the present invention. The following example takes an ordinary English-language movie video file as an example, but the solution is not limited to English-language movies; it is equally applicable to Chinese-language movies, other foreign-language movies, and non-movie videos.


[0074] Referring to figure 1, which discloses a preferred embodiment of the method for generating a video scene library according to the present invention, the method includes the following steps:

[0075] [Step 101]

[0076] According to the preset framing rules, each video scene of the video file is marked with time anchor points and annotated with subtitles. Typical framing rules include taking each complete dialogue or narration in the video as a scene unit, or taking a specific scene as a scene unit. The subtitle content can be the original text of the dialogue/narration, a synonymous explanation or summary of the dialogue/narration (corresponding to a dialogue- or narration-type video scene), or a scene description tag (corresponding to a descriptive video scene).
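
As a minimal illustration of this step (not part of the patent disclosure), the sketch below models one annotated scene as a record carrying its time anchor points and its subtitle annotation; all names here (SceneAnnotation, annotate_video, the example file name and timings) are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SceneAnnotation:
    """One video scene: time anchor points plus a subtitle annotation (hypothetical structure)."""
    video_id: str      # identifies the source video file in the data source
    start_sec: float   # time anchor marking where the scene begins
    end_sec: float     # time anchor marking where the scene ends
    subtitle: str      # original dialogue/narration, a summary of it, or a scene description tag

def annotate_video(video_id: str, scenes: List[Tuple[float, float, str]]) -> List[SceneAnnotation]:
    """Turn (start, end, subtitle) triples produced under the framing rules into annotations."""
    return [SceneAnnotation(video_id, start, end, text) for start, end, text in scenes]

# Example: two scenes framed as complete dialogues, per the framing rule described above.
annotations = annotate_video("forrest_gump.mp4", [
    (62.0, 71.5, "Life was like a box of chocolates."),
    (120.0, 128.0, "Run, Forrest! Run!"),
])
```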

[0077] Take the video file of the film "Forrest Gump" as an example, and assume that the preset framing rule is to take each complete dialogue or narration in the video as a scene unit, and also to frame s...

Embodiment 2

[0121] The difference between this embodiment and Embodiment 1 is that no video scene library is constructed in advance; when the user requests a search for video scene segments, the corresponding video file in the data source is cut in real time according to the time anchor points of the subtitle segments found by the search, and the target video scene segment is returned to the user. For the technical details of each step and the working principle of each unit, refer to Embodiment 1; they are not repeated here.
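
A minimal sketch of the real-time cutting idea in this embodiment, assuming the ffmpeg command-line tool is installed; the function name, arguments and file names are illustrative and not the patent's actual implementation.

```python
import subprocess

def cut_scene(video_path: str, start_sec: float, end_sec: float, out_path: str) -> None:
    """Cut the scene between the stored time anchors out of the source video.

    Illustrative only: relies on the external ffmpeg tool. Stream copy (-c copy)
    avoids re-encoding so the segment can be produced and returned quickly.
    """
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-ss", str(start_sec),           # seek to the scene's start anchor
            "-t", str(end_sec - start_sec),  # keep only the scene's duration
            "-i", video_path,
            "-c", "copy",
            out_path,
        ],
        check=True,
    )

# Example: a search hit whose time anchors are 62.0 s and 71.5 s.
# cut_scene("forrest_gump.mp4", 62.0, 71.5, "scene_chocolates.mp4")
```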

[0122] A method for directly searching video scene segments, the method comprising the following steps (an illustrative sketch follows the listed steps):

[0123] A'. The video scenes in the video files in the data source are marked with time anchor points and annotated with subtitles;

[0124] B'. The marked time anchor points, subtitle segments and related video file information are extracted and stored in a subtitle library;

[0125] C'. The user inputs a keyword to reques...
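
As an illustration of steps B' and C' (all names and structures below are hypothetical, not the patent's implementation), the sketch stores annotated subtitle segments with their time anchors in a simple in-memory subtitle library and matches a user keyword against the stored subtitle text.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SubtitleEntry:
    """One record in the subtitle library: subtitle text, its time anchors and the source file."""
    video_path: str
    start_sec: float
    end_sec: float
    subtitle: str

def search_subtitle_library(library: List[SubtitleEntry], keyword: str) -> List[SubtitleEntry]:
    """Step C': return entries whose subtitle text contains the user's keyword (case-insensitive)."""
    return [entry for entry in library if keyword.lower() in entry.subtitle.lower()]

# Step B': the library is populated from the time anchors and subtitles marked in step A'.
library = [
    SubtitleEntry("forrest_gump.mp4", 62.0, 71.5, "Life was like a box of chocolates."),
    SubtitleEntry("forrest_gump.mp4", 120.0, 128.0, "Run, Forrest! Run!"),
]
hits = search_subtitle_library(library, "chocolates")
# Each hit carries the time anchors needed to cut the target scene in real time, as in Embodiment 2.
```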



Abstract

The invention discloses a generating method and system of a video scene database, and a method and system for searching video scene segments based on the video scene database generated by the former method. The generating method of the video scene database comprises the following steps: (A) marking time anchor points and annotating subtitles for the video scenes in the video files in a data source; (B) extracting the annotated subtitles into a subtitle database; (C) carrying out redundancy cutting on the corresponding video file according to the marked time anchor points, intercepting the video scene segment corresponding to each subtitle and storing it in a video scene segment database; and (D) establishing the corresponding relation between the subtitle segments in the subtitle database and the video scene segments in the video scene segment database. The invention provides data support for a user to conveniently and rapidly find a target video scene segment.
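
To make step (D) concrete, here is a small sketch under assumed names (nothing below is taken from the patent itself): given the subtitles and cut segment files produced by steps (A) through (C), the correspondence of step (D) is modeled as a mapping from each subtitle segment to its scene segment file.

```python
from typing import Dict, List, Tuple

# Assumed output of steps (A)-(C): for each scene, its subtitle text and the path of the
# cut scene segment file. All names and paths are illustrative.
annotated_scenes: List[Tuple[str, str]] = [
    ("Life was like a box of chocolates.", "segments/forrest_gump_0001.mp4"),
    ("Run, Forrest! Run!", "segments/forrest_gump_0002.mp4"),
]

# Step (D): correspondence between subtitle segments and video scene segments,
# here modeled as a dictionary keyed by subtitle text.
subtitle_to_segment: Dict[str, str] = dict(annotated_scenes)

# A keyword search over the subtitle database then leads directly to the matching scene segments.
matches = [path for subtitle, path in subtitle_to_segment.items() if "run" in subtitle.lower()]
```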

Description

Technical Field

[0001] The invention belongs to the technical field of video search, and in particular relates to a method for generating a video scene library and a method and system for searching video scenes based on the library; in addition, the invention also relates to a method and system for directly searching video scenes.

Background Art

[0002] With the popularization of the Internet and the development of network technology, video search technology is now widely used on the Internet. Users can easily obtain the video information they want by using a video search engine. Today's video search technology is generally based on keyword search: video files that meet the search conditions are returned to the user by matching the keyword against the video file names or related tags in the video database. For example, if the user enters the keyword "crazy" to search for videos, then "Crazy Stone", "Crazy Racing" and other video files whose file names cont...

Claims


Application Information

IPC(8): G06F17/30; H04N5/262
CPC: G06F17/30781; G06F16/70
Inventor 李平辉
Owner 李平辉