
Saliency prediction using part affinity fields in videos

A technology of part affinity fields and video, applied in the field of conference systems, that can solve problems such as viewer discomfort when manually selecting viewpoints in 360° videos.

Active Publication Date: 2021-08-19
FUJIFILM BUSINESS INNOVATION CORP
Cites: 0 | Cited by: 0

AI Technical Summary

Benefits of technology

Effectively focuses on salient interactions in less dynamic scenes like conference rooms, improving user experience by eliminating reliance on audio and addressing privacy concerns, while maintaining performance even in distorted 360° video projections.

Problems solved by technology

Users are free to select the viewpoints in these 360° videos; however, viewers who select the viewpoints by themselves may frequently feel discomfort.




Embodiment Construction

[0018] The following detailed description provides details of the figures and example implementations of the present application. Reference numerals and descriptions of redundant elements between figures are omitted for clarity. Terms used throughout the description are provided as examples and are not intended to be limiting. For example, use of the term "automatic" may involve fully automatic or semi-automatic implementations involving user or administrator control over certain aspects of the implementation, depending on the desired implementation of one of ordinary skill in the art practicing implementations of the present application. Selection can be conducted by a user through a user interface or other input means, or can be implemented through a desired algorithm. Example implementations as described herein can be utilized either singularly or in combination, and the functionality of the example implementations can be implemented through any means according to the desired implementation.



Abstract

Systems and methods are provided that involve processing video to identify a plurality of people in the video; obtaining a plurality of gaze part affinity fields (PAFs) and torso PAFs from the identified plurality of people; determining orthogonal vectors from first vectors derived from the torso PAFs; determining an intersection between second vectors derived from the gaze PAFs and the orthogonal vectors; and changing a viewpoint of the video based on the intersection.
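As a rough geometric sketch of the pipeline the abstract describes, the steps can be reduced to 2D vector operations: derive a facing normal orthogonal to each torso vector, then intersect another person's gaze ray with that normal. The function names, the 2D simplification, and the choice of orthogonal are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def torso_orthogonal(torso_vec):
    """Unit vector orthogonal to a 2D torso direction (a facing normal).
    Which of the two orthogonals actually faces outward would need extra
    cues; this sketch arbitrarily takes the counter-clockwise one."""
    x, y = torso_vec
    n = np.array([-y, x], dtype=float)
    return n / np.linalg.norm(n)

def ray_intersection(p1, d1, p2, d2):
    """Intersect rays p1 + t*d1 and p2 + s*d2 (t, s >= 0) in 2D.
    Returns the intersection point, or None if parallel or behind."""
    A = np.column_stack([np.asarray(d1, float), -np.asarray(d2, float)])
    if abs(np.linalg.det(A)) < 1e-9:
        return None  # parallel rays never meet
    t, s = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    if t < 0 or s < 0:
        return None  # intersection lies behind one of the origins
    return np.asarray(p1, float) + t * np.asarray(d1, float)

# One person's gaze ray intersected with another person's torso-normal ray:
gaze_origin, gaze_dir = np.array([0.0, 0.0]), np.array([1.0, 0.0])
torso_origin = np.array([2.0, -2.0])
hit = ray_intersection(gaze_origin, gaze_dir,
                       torso_origin, torso_orthogonal([1.0, 0.0]))
print(hit)  # point where the gaze meets the other person's facing line
```

In this toy configuration the gaze ray along the x-axis meets the facing line rising from (2, −2), so the intersection lands at (2, 0); the patent's viewpoint change would then be driven by such a point.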

Description

BACKGROUND

Field

[0001] The present disclosure relates generally to conference systems, and more specifically, to utilization of video to determine viewpoints.

Related Art

[0002] The number of cameras and streams that support 360° video is growing on the Internet. Users are free to select the viewpoints in these 360° videos; however, viewers who select the viewpoints by themselves may frequently feel discomfort. While there are related-art methods to automatically predict 360° viewpoints, such methods often focus on dynamic scenes and egocentric video streams.

SUMMARY

[0003] In the related art, there are no implementations for less dynamic streams such as conference rooms, as often found in enterprise environments. Example implementations described herein involve a geometry-based method and a learning-based method to assist in navigating 360° videos of people interacting in conference rooms and lecture hall environments.

[0004] Aspects of the present disclosure involve...
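Once a salient point has been located in the frame, changing the viewpoint of a 360° video amounts to mapping that pixel to pan/tilt angles for the virtual camera. A minimal sketch under the common equirectangular layout (the mapping convention and function name are assumptions for illustration, not taken from the patent):

```python
def pixel_to_view_angles(x, y, width, height):
    """Map an equirectangular pixel to (yaw, pitch) in degrees.
    Assumes yaw spans [-180, 180) left-to-right and pitch spans
    [90, -90] top-to-bottom, the usual equirectangular convention."""
    yaw = (x / width) * 360.0 - 180.0
    pitch = 90.0 - (y / height) * 180.0
    return yaw, pitch

# Center the virtual camera on a salient point at (1440, 240)
# in a 1920x960 equirectangular frame:
yaw, pitch = pixel_to_view_angles(1440, 240, 1920, 960)
print(yaw, pitch)  # 90.0 45.0
```

A viewer would then pan the 360° player to yaw 90°, pitch 45°; the geometry-based and learning-based methods above differ only in how the salient pixel is chosen, not in this final mapping.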

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Applications (United States)
IPC(8): H04N13/167; G06K9/00; G06F3/01; H04N13/282
CPC: H04N13/167; G06K9/00362; G06N3/08; H04N13/282; G06F3/013; H04N7/15; G06V20/41; G06V10/56; G06V10/462; G06F3/04815; G06V40/10; G06N20/00
Inventors: LEE, HU-CHENG; KENNEDY, LYNDON; SHAMMA, DAVID AYMAN
Owner FUJIFILM BUSINESS INNOVATION CORP