Event triggering method based on audio and video characteristic fragment index

An event-triggering technology based on feature clips, applied in the field of audio and video applications, which can solve problems such as the inability to relate interactions or advertisements to the audio and video content currently being played.

Status: Inactive
Publication Date: 2008-06-11
Applicant: 北京天天宽广网络科技有限公司
Cites: 0, Cited by: 17

AI Technical Summary

Problems solved by technology

However, these scheduled delivery systems target the advertisements entirely on the basis of pre-computed user preference data, and the delivery time is also pre-set, so the delivery cannot be related to the audio and video content currently being played, which also limits its effect.



Embodiment Construction

[0009] The method proposed by the present invention is described in detail below in conjunction with the accompanying drawings and an embodiment.

[0010] The event-triggering method based on an audio and video feature segment index of the present invention comprises the following steps:

[0011] 1) Scan the content of the audio and video file (for example, manually, i.e. by human review), and determine and record the number and time positions of its feature segments;

[0012] 2) Scan each feature segment in chronological order and select several description words for it according to its content (for example, if the segment features the well-known actor Schwarzenegger in a car-chase scene with an explosion and a gun battle, the description words can include Schwarzenegger, car chase, explosion, gun battle, etc.); record the description words together with the start and end time of the feature segment to form a ...
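To make step 2) concrete, the sketch below shows one possible shape for a feature-segment index entry and index file. It assumes JSON storage and the field names `descriptors`, `start`, and `end`; the patent does not prescribe a file format, so these names and the `media_id` field are illustrative only.

```python
# Hypothetical feature-segment index file, assuming JSON storage.
# Field names (descriptors, start, end) are illustrative; the patent only
# requires that description words and the start/end time of each feature
# segment are recorded as index information.
import json

# One entry per feature segment, in chronological order. The example segment
# mirrors paragraph [0012]: Schwarzenegger, car chase, explosion, gun battle.
# Times are seconds from the start of the audio/video file.
feature_index = {
    "media_id": "example-movie",  # illustrative identifier
    "segments": [
        {
            "descriptors": ["Schwarzenegger", "car chase", "explosion", "gun battle"],
            "start": 1835.0,
            "end": 1921.5,
        },
    ],
}

# Write the index file that would later be sent to the client alongside the
# audio/video content.
with open("feature_index.json", "w", encoding="utf-8") as f:
    json.dump(feature_index, f, ensure_ascii=False, indent=2)
```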



Abstract

The invention relates to an event-triggering method based on an audio and video feature-clip index, belonging to the field of audio and video applications. The method comprises the following steps: scanning the content of audio and video files and recording the number and time positions of the feature clips; selecting description words and the start and end times for each feature clip and storing the index information of every feature clip in an index file; adding an event-trigger module to the client-side player; the audio and video content server sending the audio and video content and the index file in response to a client request; the client passing the audio and video content to the player and the feature-clip index information in the index file to the event-trigger module; and the event-trigger module forming, from the feature-clip index information and the user information, a trigger event associated with each feature clip and firing it when playback of the audio and video content reaches the time point of the feature clip. The invention achieves good interaction and advertising effects.
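As a rough illustration of the client-side behaviour described in the abstract, the sketch below shows a minimal event-trigger module that receives the feature-clip index information, keeps the clips in playback order, and fires a trigger when the reported playback position reaches a clip's start time. The class, method, and field names are assumptions for illustration; the patent describes the behaviour, not an API.

```python
# Minimal sketch of a client-side event-trigger module, assuming index entries
# carrying "descriptors", "start" and "end" fields (seconds). All names here
# are illustrative.
from typing import Callable, Dict, List


class EventTriggerModule:
    def __init__(self, index_entries: List[Dict], on_trigger: Callable[[Dict], None]):
        # Keep pending segments sorted by start time so they can be consumed
        # in playback order.
        self._pending = sorted(index_entries, key=lambda e: e["start"])
        self._on_trigger = on_trigger

    def on_playback_position(self, position_seconds: float) -> None:
        """Called periodically by the player with the current playback time."""
        while self._pending and self._pending[0]["start"] <= position_seconds:
            segment = self._pending.pop(0)
            # Playback has reached this feature clip: fire the trigger event,
            # e.g. show an interaction or an advertisement related to its
            # description words.
            self._on_trigger(segment)


# Illustrative usage: the client hands the index information (received from the
# content server together with the audio/video content) to the module, and the
# player reports its position as playback progresses.
index_entries = [
    {"descriptors": ["Schwarzenegger", "car chase"], "start": 1835.0, "end": 1921.5},
]
module = EventTriggerModule(
    index_entries,
    on_trigger=lambda seg: print("trigger:", seg["descriptors"]),
)
module.on_playback_position(1800.0)  # nothing fires yet
module.on_playback_position(1900.0)  # prints: trigger: ['Schwarzenegger', 'car chase']
```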

Description

Technical field

[0001] The invention belongs to the field of audio and video applications and relates to an event-triggering method for audio and video feature segments.

Background technique

[0002] Event triggering based on the relevance of audio and video content is mainly used in audio and video interactive services and audio and video advertising services, with the main purpose of increasing interactivity and the content relevance of advertisements.

[0003] Currently, audio and video interactive programs and advertisement content are pre-set, and the play time of the interaction or advertisement is determined by the playing schedule of the audio and video content; it is not possible to automatically select which interaction or advertisement to play according to the feature-segment information of the audio and video content being played, so the content of these interactions or advertisements is not very relevant, ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G11B 27/10; G11B 19/02; H04L 29/06
Inventor: 赵树乔
Owner: 北京天天宽广网络科技有限公司