
Data processing apparatus

A data processing apparatus and data structure technology, applied to electrical digital data processing, special-purpose data processing applications, data recording, and the like, addressing problems such as the difficulty of summarizing a user's moving images.

Status: Inactive; Publication Date: 2006-11-29
PANASONIC INTELLECTUAL PROPERTY CORP OF AMERICA

AI Technical Summary

Problems solved by technology

In addition, it is also difficult to accommodate what the user desires, such as selecting only the scenes of a moving image that include a specific character.


Examples


First Embodiment

[0171] A first embodiment of the present invention will be described below. In this embodiment, moving pictures of MPEG-1 system streams are used as the media content. In this case, a media segment corresponds to a single scene delimited by scene segmentation, and a score represents the objective degree of contextual importance of the scene of interest.

[0172] Figure 1 is a block diagram showing the data processing method according to the first embodiment of the present invention. In Figure 1, reference numeral 101 indicates the selection step, and reference numeral 102 indicates the extraction step. In the selection step 101, a scene of the media content is selected on the basis of the context description data, and the start time and end time of the scene are output. In the extraction step 102, data pertaining to the segment of media content specified by the start time and end time output in the selection step 101 are extracted.
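The two steps can be illustrated with a short sketch. This is only an illustration of the general idea, not the patent's implementation: the `MediaSegment` fields, the score threshold `min_score`, and the `time_to_byte` mapping are assumptions introduced here for clarity.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class MediaSegment:
    start: float   # start time of the scene, in seconds
    end: float     # end time of the scene, in seconds
    score: float   # objective degree of contextual importance

def selection_step(segments: List[MediaSegment], min_score: float) -> List[Tuple[float, float]]:
    """Step 101: select scenes on the basis of their score and output
    the (start, end) times of each selected scene."""
    return [(s.start, s.end) for s in segments if s.score >= min_score]

def extraction_step(stream: bytes,
                    spans: List[Tuple[float, float]],
                    time_to_byte: Callable[[float], int]) -> bytes:
    """Step 102: extract only the data of the media content specified by
    the (start, end) times output by the selection step. `time_to_byte`
    maps a playback time to an offset in the MPEG-1 system stream."""
    out = bytearray()
    for start, end in spans:
        out += stream[time_to_byte(start):time_to_byte(end)]
    return bytes(out)
```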

[0173] Figure 2 shows the structure of the context description data ac...

Second Embodiment

[0193] A second embodiment of the present invention will be described below. This second embodiment differs from the first embodiment only in the processing related to the selection step.

[0194] Processing related to the selection step 101 according to the second embodiment will be described below with reference to the drawings. In the selection step 101 according to the second embodiment, the priority values assigned to all elements, from the highest-ranked to the lowest, are utilized. The priority assigned to each element indicates the objective degree of contextual importance. The processing related to the selection step 101 is described below with reference to Figure 31. In Figure 31, reference numeral 1301 represents one of a plurality of elements included in the highest hierarchical layer of the context description data; 1302 represents a child element of element 1301; 1303 represents a child element of element 1302; and 1304 represents a child element of the...
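As a rough sketch of how a priority attached to every element of the hierarchy might drive selection, the following is illustrative only; the `Element` structure and the `keep` parameter are assumptions, since the patent text here is truncated.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Element:
    priority: float                        # objective degree of contextual importance
    start: Optional[float] = None          # time information (lowest-layer elements only)
    end: Optional[float] = None
    children: List["Element"] = field(default_factory=list)

def leaves(element: Element):
    """Yield the lowest-layer elements (media segments) beneath `element`."""
    if not element.children:
        yield element
    else:
        for child in element.children:
            yield from leaves(child)

def select_by_priority(root: Element, keep: int) -> List[Tuple[float, float]]:
    """Rank every media segment by its priority, highest first,
    and keep only the `keep` most important ones."""
    ranked = sorted(leaves(root), key=lambda e: e.priority, reverse=True)
    return sorted((e.start, e.end) for e in ranked[:keep])
```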

Third Embodiment

[0199] A third embodiment of the present invention will be described below. The third embodiment differs from the first embodiment only in the processing related to the selection step.

[0200] Processing related to the selection step 101 according to the third embodiment will be described below with reference to the drawings. As in the processing described in conjunction with the first embodiment, in the selection step 101 according to the third embodiment the selection is performed only for elements each of which has a child. In the third embodiment, a threshold is set which takes into account the sum of the duration intervals of all scenes to be selected. Specifically, elements are selected in decreasing order of priority for as long as the sum of the duration intervals of the elements selected so far remains smaller than the threshold. The flowchart of Figure 33 shows the processing related to the selection step 101 ...
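A greedy sketch of this selection rule might look as follows; the flat `(priority, start, end)` tuples and the decision to stop (rather than skip) once the threshold would be exceeded are assumptions made for brevity.

```python
from typing import List, Tuple

Scene = Tuple[float, float, float]   # (priority, start_time, end_time), times in seconds

def select_within_threshold(scenes: List[Scene], threshold: float) -> List[Tuple[float, float]]:
    """Take scenes in decreasing order of priority for as long as the
    running sum of their durations stays below `threshold` seconds."""
    selected: List[Tuple[float, float]] = []
    total = 0.0
    for priority, start, end in sorted(scenes, reverse=True):
        duration = end - start
        if total + duration >= threshold:
            break   # assumption: stop here; skipping to shorter, lower-priority scenes is another reading
        selected.append((start, end))
        total += duration
    return sorted(selected)   # return in temporal order for playback
```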


Abstract

A context of media content is represented by context description data having a hierarchical stratum. The context description data has the highest hierarchical layer, the lowest hierarchical layer, and other hierarchical layers. The highest hierarchical layer is formed from a single element representing content. The lowest hierarchical layer is formed from an element representing a segment of media content which corresponds to a change between scenes of video data or a change in audible tones. The remaining hierarchical layers are formed from an element representing a scene or a collection of scenes. A score corresponding to the context of a scene of interest is appended, as an attribute, to the element in each of the remaining hierarchical layers. Time information about the corresponding media segment and a score relating to its context are appended, as attributes, to the individual elements in the lowest hierarchical layer. In a selection step of a data processing method, the context of the media content is expressed, and one or a plurality of scenes of the media content is or are selected on the basis of the score of the context description data. Further, in the extraction step of the data processing method, only data pertaining to the scenes selected in the selection step are extracted.
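To make the hierarchy concrete, the nested structure below sketches one possible layout of such context description data; the keys, scores, and time codes are illustrative placeholders, not values from the patent.

```python
# Highest layer: a single element representing the whole content.
# Middle layers: scenes or collections of scenes, each carrying a score.
# Lowest layer: media segments carrying time information and a score.
context_description = {
    "content": {
        "title": "example program",
        "scenes": [
            {
                "score": 4,   # contextual importance of this scene
                "segments": [
                    {"start": "00:00:00", "end": "00:01:30", "score": 5},
                    {"start": "00:01:30", "end": "00:02:10", "score": 2},
                ],
            },
            {
                "score": 3,
                "segments": [
                    {"start": "00:02:10", "end": "00:03:00", "score": 4},
                ],
            },
        ],
    }
}
```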

Description

Technical Field

[0001] The present invention relates to a media content data processing method, a storage medium, and a program, all of which relate to the viewing, playback, and transmission of continuous audio-visual data (media content) such as moving images, video programs, or audio programs, wherein a summary of the highlighted scenes of the media content, or only those scenes of the media content desired by the viewer, is played back and transmitted.

Background Art

[0002] Media content has traditionally been played, transmitted, or stored on the basis of separate files storing the media content.

[0003] As described in Japanese Unexamined Patent Application No. Hei-10-111872, according to a method of extracting a specific scene of a moving image, changes between scenes of a moving image (hereinafter referred to as "scene segmentation") are detected. Additional data such as the time code of the start frame, the time code of the end fr...


Application Information

IPC(8): G06F17/40; G06F13/00; G06F9/06; G06F17/30; G11B27/00; G11B27/10; G11B27/28; H04N5/76; H04N5/91; H04N5/93
CPC: G11B2220/2575; G11B2220/2562; G11B27/10; G06F17/30858; G11B27/32; G11B2220/216; G11B27/28; G11B27/105; G06F16/71; G11B27/00
Inventors: 宗续敏彦, 荣藤稔, 荒木昭一, 江村恒一
Owner PANASONIC INTELLECTUAL PROPERTY CORP OF AMERICA