Video highlight detection with pairwise deep ranking

A video technology for highlight detection with pairwise deep ranking, addressing the problem that first-person videos are lengthy, redundant, and unstructured.

Inactive Publication Date: 2018-06-08
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

This limitation is especially severe when directly applying these methods to first-person videos, which are recorded in unconstrained settings, making them lengthy, redundant, and unstructured.




Embodiment Construction

[0017] Concepts and techniques are described herein for providing a video highlight detection system that generates highlight content, giving users efficient access to long video streams.

[0018] Overview

[0019] Current systems that provide highlights of video content cannot effectively identify special moments in a video stream. The advent of wearable devices such as camcorders and smart glasses has made it possible to document life in first-person video. Browsing through such long, unstructured videos is time-consuming and tedious.

[0020] In some examples, the techniques described herein identify main or special moments of interest (e.g., highlights) in a video (e.g., a first-person video) for use in generating a summary of the video.

[0021] In one example, the system uses a pairwise deep ranking model that employs deep learning techniques to learn the relationship between highlight and non-highlight video segments. The result of deep learning can b...
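The pairwise ranking idea above can be illustrated with a minimal sketch. This is not the patent's implementation; the scoring function, margin value, and function names are illustrative assumptions. The objective simply requires a highlight segment to outscore a paired non-highlight segment by some margin:

```python
# Hedged sketch of a pairwise ranking objective. Assumes some model
# assigns each video segment a scalar highlight score; names and the
# margin value are illustrative, not taken from the patent.

def pairwise_hinge_loss(score_highlight, score_non_highlight, margin=1.0):
    """Penalize pairs where the highlight segment does not outscore
    the non-highlight segment by at least `margin`."""
    return max(0.0, margin - (score_highlight - score_non_highlight))

# A correctly ranked pair incurs zero loss; a misranked pair is penalized.
print(pairwise_hinge_loss(3.0, 0.5))  # correctly ranked pair
print(pairwise_hinge_loss(0.5, 3.0))  # misranked pair
```

During training, such a loss would be summed over many (highlight, non-highlight) pairs and minimized, pushing the network to score true highlights higher.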


Abstract

Video highlight detection using pairwise deep ranking neural network training is described. In some examples, highlights in a video are discovered and then used to generate summaries of videos, such as first-person videos. A pairwise deep ranking model is employed to learn the relationship between previously identified highlight and non-highlight video segments. This relationship is encapsulated in a neural network. An example two-stream process generates highlight scores for each segment of a user's video. The obtained highlight scores are used to summarize the highlights of the user's video.
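Once per-segment scores exist, summarization reduces to selecting the best segments. The following is a minimal sketch under an assumed top-k selection policy (the abstract does not specify how scores are thresholded); the function name and sample scores are illustrative:

```python
# Hedged sketch: given per-segment highlight scores, build a summary by
# keeping the k highest-scoring segments in their original temporal
# order. The top-k policy is an assumption, not from the patent text.

def summarize(scores, k):
    """Return indices of the k highest-scoring segments, in video order."""
    top = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
    return sorted(top)  # restore temporal order for playback

segment_scores = [0.1, 0.9, 0.3, 0.8, 0.2]  # one score per segment
print(summarize(segment_scores, k=2))
```

Re-sorting the selected indices keeps the summary chronologically coherent, which matters for watchability even though selection is purely score-driven.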

Description

Background technique

[0001] The advent of wearable devices such as camcorders and smart glasses has made it possible to document life, recorded in first-person video. For example, wearable camcorders such as Go-Pro cameras and Google Glass are now capable of capturing high-quality first-person video to document our everyday experiences. These first-person videos are often very unstructured and long-running. Browsing and editing such videos is a very tedious job. Video summarization applications can generate short summaries of full-length videos that encapsulate the most informative parts, alleviating many of the problems associated with first-person video browsing, editing, and indexing.

[0002] Research on video summarization is mainly carried out along two dimensions, namely, keyframe- or shot-based methods and structure-driven methods. Keyframe- or shot-based methods select a collection of keyframes or shots by optimizing the diversity or representativeness of the summar...


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N21/8549 · H04N21/4545 · H04N21/466
CPC: H04N21/45457 · H04N21/4666 · H04N21/8549 · G11B27/031 · G11B27/3081 · G06V20/41 · G06V20/47
Inventors: 姚霆 (Ting Yao), 梅涛 (Tao Mei), 芮勇 (Yong Rui)
Owner: MICROSOFT TECH LICENSING LLC