
User-directed navigation of multimedia search results

A multimedia search and user-directed navigation technology, applied to multimedia data indexing, metadata-based video data retrieval, instruments, and related fields. It addresses the problems that a search engine may fail to return the corresponding audio/video podcast as a result, and that the metadata describing audio or video content is typically limited to what the publisher provides.

Publication status: Inactive
Publication date: 2009-09-03
Owner: CXENSE
Cites: 80 | Cited by: 64

AI Technical Summary

Benefits of technology

Enables users to efficiently locate and play back specific segments of audio / video content by generating detailed metadata, improving search accuracy and user-directed navigation within media files.

Problems solved by technology

With respect to media files or streams, the metadata information that describes the audio content or the video content is typically limited to information provided by the content publisher.
If this limited information fails to satisfy a search query, the search engine is not likely to provide the corresponding audio / video podcast as a search result even if the actual content of the audio / video podcast satisfies the query.



Examples


Embodiment Construction

Generation of Enhanced Metadata for Audio / Video

[0026]The invention features an automated method and apparatus for generating metadata enhanced for audio / video search-driven applications. The apparatus includes a media indexer that obtains a media file / stream (e.g., an audio / video podcast), applies one or more automated media processing techniques to the media file / stream, combines the results of the media processing into metadata enhanced for audio / video search, and stores the enhanced metadata in a searchable index or other data repository.
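
A minimal sketch of how such a media-indexing pipeline might be organized, assuming a hypothetical MediaIndexer class, an EnhancedMetadata record, and pluggable processing functions (e.g., a speech-to-text engine); these names and data shapes are illustrative assumptions, not structures specified by the patent.

```python
# Illustrative sketch only: class and field names are assumptions,
# not taken from the patent.
from dataclasses import dataclass, field


@dataclass
class EnhancedMetadata:
    """Combined output of the automated media processing steps."""
    media_url: str
    publisher_fields: dict = field(default_factory=dict)     # e.g. title, author
    transcript_segments: list = field(default_factory=list)  # (start_s, end_s, text)


class MediaIndexer:
    def __init__(self, processors):
        # Each processor maps a local media path to a list of
        # (start_s, end_s, text) segments, e.g. a speech-to-text engine.
        self.processors = processors
        self.searchable_index = {}  # media_url -> EnhancedMetadata

    def index_media(self, media_url, local_path, publisher_fields):
        """Run all processors on the media and store the combined metadata."""
        metadata = EnhancedMetadata(media_url, dict(publisher_fields))
        for process in self.processors:
            metadata.transcript_segments.extend(process(local_path))
        self.searchable_index[media_url] = metadata
        return metadata
```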

[0027]FIG. 1A is a diagram illustrating an apparatus and method for generating metadata enhanced for audio / video search-driven applications. As shown, the media indexer 10 cooperates with a descriptor indexer 50 to generate the enhanced metadata 30. A content descriptor 25 is received and processed by both the media indexer 10 and the descriptor indexer 50. For example, if the content descriptor 25 is a Really Simple Syndication (RSS) document, th...
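
For the RSS example above, a hedged sketch of how the publisher-provided fields and media enclosure URLs might be pulled from the content descriptor; parse_rss_descriptor and the record layout are assumptions for illustration, and only standard RSS 2.0 tag names are used.

```python
# Illustrative sketch only: the function name and record layout are assumptions;
# tag names (<channel>, <item>, <enclosure>) follow RSS 2.0.
import xml.etree.ElementTree as ET


def parse_rss_descriptor(rss_xml: str):
    """Return one record per <item>: publisher metadata plus the media URL."""
    channel = ET.fromstring(rss_xml).find("channel")
    if channel is None:
        return []
    records = []
    for item in channel.findall("item"):
        enclosure = item.find("enclosure")
        records.append({
            "title": item.findtext("title", default=""),
            "description": item.findtext("description", default=""),
            "pub_date": item.findtext("pubDate", default=""),
            "media_url": enclosure.get("url") if enclosure is not None else None,
        })
    return records
```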



Abstract

A method and apparatus for timed tagging of content is featured. The method and apparatus can include the steps of, or structure for, obtaining at least one keyword tag associated with discrete media content; generating a timed segment index of discrete media content, the timed segment index identifying content segments of the discrete media content and corresponding timing boundaries of the content segments; searching the timed segment index for a match to the at least one keyword tag, the match corresponding to at least one of the content segments identified in the segment index; and generating a timed tag index that includes the at least one keyword tag and the timing boundaries corresponding to the at least one content segment of the discrete media content containing the match.
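
A minimal sketch of the steps recited above, assuming the timed segment index is a list of (start_seconds, end_seconds, text) entries and that a "match" is a simple case-insensitive substring hit; the function name build_timed_tag_index and these data shapes are illustrative assumptions, not the patented implementation.

```python
# Illustrative sketch only: data shapes and matching strategy are assumptions.
def build_timed_tag_index(keyword_tags, timed_segment_index):
    """Map each keyword tag to the timing boundaries of the segments that match it."""
    timed_tag_index = {}
    for tag in keyword_tags:
        for start_s, end_s, text in timed_segment_index:
            if tag.lower() in text.lower():
                timed_tag_index.setdefault(tag, []).append((start_s, end_s))
    return timed_tag_index


# Example: a player could seek straight to the first boundary returned for a tag.
segments = [(0.0, 12.5, "welcome to the show"),
            (12.5, 48.0, "today we discuss metadata for podcasts")]
print(build_timed_tag_index(["metadata"], segments))  # {'metadata': [(12.5, 48.0)]}
```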

Description

RELATED APPLICATIONS[0001]This application is a continuation-in-part of U.S. patent application Ser. No. 11 / 395,732, filed on Mar. 31, 2006, which claims the benefit of U.S. Provisional Application No. 60 / 736,124, filed on Nov. 9, 2005. The entire teachings of the above applications are incorporated herein by reference.FIELD OF THE INVENTION[0002]Aspects of the invention relate to methods and apparatus for generating and using enhanced metadata in search-driven applications.BACKGROUND OF THE INVENTION[0003]As the World Wide Web has emerged as a major research tool across all fields of study, the concept of metadata has become a crucial topic. Metadata, which can be broadly defined as “data about data,” refers to the searchable definitions used to locate information. This issue is particularly relevant to searches on the Web, where metatags may determine the ease with which a particular Web site is located by searchers. Metadata that is embedded with content is called embedded metad...

Claims


Application Information

Patent Type & Authority: Application (United States)
IPC(8): G06F17/30, G06F7/00, G10L11/00
CPC: G06F17/30247, G06F17/30787, G06F17/30852, G06F17/30817, G06F17/30796, G06F16/7844, G06F16/78, G06F16/685, G06F16/7834, G06F16/745, G06F16/583, G06F16/41, G06F16/489
Inventor: HOUH, HENRY; STERN, JEFFREY NATHAN
Owner: CXENSE