VR video live broadcasting interaction method and device based on eye tracking technology

A technology combining eye tracking and live video broadcasting, applied to colour-TV components, TV systems, TVs, etc. It addresses problems such as high delay, the large data volume of VR video, and high bandwidth demand, ensuring picture quality while reducing bandwidth requirements and mitigating distortion and chromatic-aberration effects.

Inactive Publication Date: 2017-08-29
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a VR video live interaction method and device based on eye-tracking.




Embodiment Construction

[0047] All features disclosed in this specification, except mutually exclusive features and/or steps, can be combined in any way.

[0048] The present invention will be described in detail below in conjunction with the accompanying drawings.

[0049] A VR video live broadcast interaction method based on eye-tracking technology, as shown in Figure 1, comprises the following steps:

[0050] S1: VR video preprocessing

[0051] S11: Use multiple cameras to obtain multiple video sources and stitch them into a fused spherical VR video. The stitching method is based on invariant feature matching and, as shown in Figure 2, comprises the following steps:

[0052] S11: Feature extraction, including establishment of a scale space, detection of extreme points, precise localization of extreme points, and generation of feature vectors;

[0053] S12: Feature matching, through a given search strategy based on the BBF (best-bin-first) algorithm and the R...
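The scale-space step above (S11) is the detection stage of SIFT-style feature extraction: the image is blurred at several Gaussian scales, adjacent blurred versions are subtracted to form a difference-of-Gaussians (DoG) response, and local extrema of that response are taken as candidate keypoints. The patent does not give an implementation; the following is a minimal 1-D sketch of the idea, with illustrative sigma values, not values from the patent. A full detector would also compare across scales and refine the extremum location.

```python
import math

def gaussian_blur_1d(signal, sigma):
    """Blur a 1-D signal with a sampled, normalized Gaussian kernel
    (indices are clamped at the borders)."""
    radius = max(1, int(3 * sigma))
    kernel = [math.exp(-(x * x) / (2 * sigma * sigma))
              for x in range(-radius, radius + 1)]
    total = sum(kernel)
    kernel = [k / total for k in kernel]
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - radius, 0), n - 1)  # clamp at borders
            acc += w * signal[j]
        out.append(acc)
    return out

def dog_extrema(signal, sigmas=(1.0, 1.6, 2.56)):
    """Build a tiny difference-of-Gaussians 'scale space' and report
    positions whose DoG response is a strict local extremum within
    its scale layer (candidate keypoints)."""
    blurred = [gaussian_blur_1d(signal, s) for s in sigmas]
    dogs = [[b - a for a, b in zip(blurred[i], blurred[i + 1])]
            for i in range(len(blurred) - 1)]
    extrema = set()
    for d in dogs:
        for i in range(1, len(d) - 1):
            if (d[i] > d[i - 1] and d[i] > d[i + 1]) or \
               (d[i] < d[i - 1] and d[i] < d[i + 1]):
                extrema.add(i)
    return sorted(extrema)

# A signal with a single impulse at index 8: the DoG response has a
# strict extremum at the impulse position.
sig = [0.0] * 16
sig[8] = 1.0
print(dog_extrema(sig))
```

In 2-D the same comparison is done against all neighbours in a 3x3x3 scale-space cube, which is what "detection of extreme points" in step S11 refers to.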
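For the matching step (S12), a common approach pairs each descriptor with its nearest neighbour in the other image and keeps the pair only if it passes Lowe's ratio test (nearest vs. second-nearest distance). The sketch below uses a brute-force search over plain lists for clarity; the BBF algorithm named in the text accelerates exactly this nearest-neighbour query with a prioritized k-d tree search. The descriptors and the 0.8 ratio are illustrative assumptions, not values from the patent.

```python
import math

def euclidean(a, b):
    """Distance between two feature descriptors (plain lists of floats)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_features(desc_a, desc_b, ratio=0.8):
    """Match each descriptor in desc_a to its nearest neighbour in desc_b,
    keeping only matches that pass the ratio test: the nearest distance
    must be clearly smaller than the second-nearest, otherwise the match
    is ambiguous and is discarded."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = sorted((euclidean(d, e), j) for j, e in enumerate(desc_b))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches

# Toy 2-D "descriptors": a[0] has one clear match (b[0]);
# a[1] is equidistant from b[2] and b[3], so the ratio test rejects it.
a = [[0.0, 0.0], [5.0, 5.0]]
b = [[0.1, 0.0], [9.0, 9.0], [5.1, 5.0], [5.0, 5.1]]
print(match_features(a, b))  # → [(0, 0)]
```

The surviving matches would then feed a robust homography estimation (e.g. RANSAC) to align the camera views before blending them into the spherical video.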



Abstract

The invention provides a VR video live broadcasting interaction method and device based on eye-tracking technology, relating to the fields of gaze interaction, voice interaction and VR video compression coding. The viewer's gaze point is determined by eye tracking, and during compression coding the gaze region, i.e. the region of interest (ROI), is transmitted at high resolution. This ensures that the picture actually being watched is sufficiently sharp while greatly reducing the amount of transmitted data, which in turn lowers the hardware requirements of VR devices. In addition, voice recognition allows the control mode to be switched at any time, increasing the diversity of interaction modes.
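The ROI idea in the abstract can be sketched as a per-block quality map: blocks near the gaze point are encoded at high quality, the periphery at low quality. The block size and ROI radius below are illustrative parameters invented for the sketch, not values from the patent; a real encoder would map these levels to quantization parameters.

```python
def roi_quality_map(width, height, gaze_x, gaze_y, block=8, roi_radius=2):
    """Assign a quality level to each block of a frame: blocks within
    roi_radius (in block units) of the gaze point get 'high' quality,
    everything else 'low'."""
    cols, rows = width // block, height // block
    gx, gy = gaze_x // block, gaze_y // block
    qmap = []
    for r in range(rows):
        row = []
        for c in range(cols):
            near = abs(c - gx) <= roi_radius and abs(r - gy) <= roi_radius
            row.append("high" if near else "low")
        qmap.append(row)
    return qmap

# A 64x64 frame in 8x8 blocks with the gaze at the centre: only a
# 5x5 patch of the 8x8 block grid is sent at high quality.
qm = roi_quality_map(64, 64, gaze_x=32, gaze_y=32)
high = sum(row.count("high") for row in qm)
print(high, len(qm) * len(qm[0]))  # → 25 64
```

Even in this toy setting fewer than half the blocks need full quality, which illustrates how gaze-driven ROI coding cuts the transmitted data volume.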

Description

Technical field

[0001] The present invention relates to the fields of gaze interaction technology, voice interaction technology and VR video compression coding technology, and in particular to a VR video live broadcast interaction method and device based on eye-tracking technology. The technology can be used in virtual-reality live video and gaming applications, military aiming assistance, human-computer interaction, smart homes, medical research, and psychological analysis.

Background technique

[0002] Human-Computer Interaction (HCI), as the name suggests, is the study of the interactive relationship between a system and its users. Input methods have evolved from the earliest punched paper tape, through keyboard and mouse input, to today's touch operation and voice recognition, with 3D gesture and eye-movement recognition still under development. Every technological innovation and product upgrade brings major changes in the way of human-computer int...

Claims


Application Information

IPC(8): H04N21/426; H04N21/44; H04N21/472; H04N21/485; H04N21/81; H04N5/265
CPC: H04N21/42653; H04N5/265; H04N21/44008; H04N21/47205; H04N21/4854; H04N21/816
Inventors: 张汝民, 赵丽丽, 张梦, 王文一, 陈建文, 曾辽原
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA