
Visual angle synchronization method and device in virtual reality VR live broadcast

A virtual reality viewing angle synchronization technology, applied to the generation of 2D images, input/output processes for data processing, instruments, and the like. It addresses problems such as viewing angle synchronization and achieves the effect of smooth viewing angle changes.

Active Publication Date: 2019-08-23
ALIBABA GRP HLDG LTD

AI Technical Summary

Problems solved by technology

[0004] This application provides a method and device for viewing angle synchronization in virtual reality (VR) live broadcasting, which can solve the problem of viewing angle synchronization during VR live broadcasting.



Examples


Embodiment 1

[0075] Embodiment 1 mainly introduces the viewing angle synchronization solution for the live broadcast mode in which VR content is purely displayed, such as movie-like content. That is, in this solution, the display angle of view used by the VR sending device when it displays the VR content is related to the motion of the VR sending device, but the display angle of view used by the VR receiving device for the current image frame to be displayed has nothing to do with the motion of the VR receiving device. Refer to Figure 1, which is a schematic diagram of an exemplary embodiment of the present application in practical use. Figure 1 shows an application scenario of viewing angle synchronization in VR video live broadcasting. The user of the VR sending device 101 is the live broadcast user, also called the sender user. The sender user uses the VR sending device 101, such as a VR head-mounted display or a mobile te...
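
To make the data flow in this mode concrete, the following is a minimal sketch, not taken from the patent, of how a sending device might attach the sender user's current viewing angle to each outgoing image frame; the ViewingAngle fields, the yaw/pitch/roll representation, and the packet layout are illustrative assumptions.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ViewingAngle:
    """Sender user's head orientation in degrees (illustrative representation)."""
    yaw: float
    pitch: float
    roll: float

def package_frame(frame_id: int, frame_bytes: bytes, angle: ViewingAngle) -> dict:
    """Bundle one VR image frame with the sender user's viewing angle information,
    so that the VR receiving device can later choose its display angle from the
    per-frame angles rather than from its own motion."""
    return {
        "frame_id": frame_id,
        "timestamp": time.time(),
        "viewing_angle": asdict(angle),   # viewing angle information of the sender user
        "frame": frame_bytes,             # panoramic image data for this frame
    }

# Example: attach the currently tracked angle to a dummy frame before sending it out.
packet = package_frame(0, b"<panorama bytes>", ViewingAngle(yaw=30.0, pitch=-5.0, roll=0.0))
print(json.dumps({k: v for k, v in packet.items() if k != "frame"}, indent=2))
```

The receiving device then reads the angle from each packet instead of consulting its own head tracking, which is what makes the receiver's display angle independent of the receiver's motion in this mode.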

Embodiment 2

[0093] The second embodiment corresponds to the first embodiment and, from the perspective of the VR receiving device, provides a method for synchronizing the viewing angle in a virtual reality VR live broadcast. Refer to Figure 3; the method may specifically include:

[0094] S301: Obtain the VR content information provided by the VR sending device, where the VR content information includes image frames and the corresponding viewing angle information of the sender user;

[0095] S302: Determine, according to the viewing angle information of the sender user corresponding to the current image frame to be displayed and a preset number of image frames before it, the display angle of view used by the VR receiving device for the current image frame to be displayed;

[0096] Specifically, when determining the display angle of view used by the VR receiving device for the current image frame to be displayed, the average value of the viewing angle information of the sender user corresponding to t...
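
As a concrete illustration of step S302, here is a minimal sketch, assuming the viewing angle information is a (yaw, pitch, roll) tuple in degrees and that the "preset number" is a small sliding window chosen by the receiver; the class and function names are illustrative, and the circular mean used for yaw (to handle wrap-around at 360 degrees) is an implementation detail not spelled out in the excerpt.

```python
import math
from collections import deque
from typing import Deque, Tuple

Angle = Tuple[float, float, float]  # (yaw, pitch, roll) in degrees -- illustrative

def circular_mean(degrees):
    """Mean of angles in degrees that is robust to wrap-around at 360."""
    s = sum(math.sin(math.radians(d)) for d in degrees)
    c = sum(math.cos(math.radians(d)) for d in degrees)
    return math.degrees(math.atan2(s, c)) % 360.0

class DisplayAngleSmoother:
    """Determine the receiver's display angle for the current frame from the sender-user
    viewing angles of that frame and the preceding preset number of frames (S302)."""

    def __init__(self, window: int = 5):
        self.history: Deque[Angle] = deque(maxlen=window)

    def display_angle(self, sender_angle: Angle) -> Angle:
        self.history.append(sender_angle)
        yaws, pitches, rolls = zip(*self.history)
        return (circular_mean(yaws),
                sum(pitches) / len(pitches),
                sum(rolls) / len(rolls))

# Example: a sudden 40-degree jump in the sender's yaw is smoothed across frames.
smoother = DisplayAngleSmoother(window=4)
for yaw in (10.0, 10.0, 10.0, 50.0):
    print(smoother.display_angle((yaw, 0.0, 0.0)))
```

In the example, the sender's abrupt yaw jump is spread over several frames, which is the smooth change of viewing angle the solution aims to show the receiver user.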

Embodiment 3

[0102] The third embodiment mainly addresses viewing angle synchronization during the live broadcast of game-oriented, exploratory VR content. That is, in this case, the display angle of view of the VR content on the VR sending device is related to the motion of the VR sending device, and the display angle of view of the VR content on the VR receiving device is likewise related to the motion of the VR receiving device. Specifically, the third embodiment first provides a method for synchronizing viewing angles in a virtual reality VR live broadcast from the perspective of the VR sending device. Refer to Figure 4; the method may include:

[0103] S401: Determine the viewing angle information of the sender user corresponding to each image frame during the playback of the VR content on the sending device side;

[0104] This step may be the same as step S201 in Embodiment 1; that is, even if the VR receiving device displays VR content, the specific viewing angle of view ch...
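
The excerpt is truncated before it explains how the receiving side behaves in this exploratory mode, so the following is only a speculative sketch of one plausible arrangement: the receiver starts from the synchronized sender angle and applies its own head-motion offset on top of it. The additive combination and the parameter names are assumptions, not the patent's method.

```python
def exploratory_display_angle(sender_angle, local_head_offset):
    """Speculative sketch only: start from the synchronized sender angle and add the
    receiver user's own head-motion offset on top of it. Both arguments are
    (yaw, pitch, roll) tuples in degrees; the additive combination is an assumption."""
    return tuple((s + o) % 360.0 for s, o in zip(sender_angle, local_head_offset))

# Example: the sender looks toward yaw 120 deg while the receiver has turned 15 deg further.
print(exploratory_display_angle((120.0, 0.0, 0.0), (15.0, 0.0, 0.0)))  # (135.0, 0.0, 0.0)
```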



Abstract

The invention provides a viewing angle synchronization method and device for virtual reality (VR) live broadcast. The method comprises: determining, during playback of the VR content on the sending device side, the viewing angle information of the sender user corresponding to each image frame; and providing the image frames of the VR content, together with the corresponding sender-user viewing angle information, to a VR receiving device, so that when the VR receiving device displays the VR content, it determines its display angle of view for the current image frame to be displayed according to the sender-user viewing angle information corresponding to that frame and a preset number of preceding image frames. With the embodiments of the invention, the VR live broadcast content can be played smoothly on the VR receiving device side for the receiver user, so that the dizziness caused by sudden changes of the viewing angle is avoided.

Description

Technical field

[0001] The present application relates to the technical field of virtual reality (VR) live broadcasting, and in particular to a method and device for viewing angle synchronization in VR live broadcasting.

Background technique

[0002] Virtual reality (VR) uses a computer to generate a simulated environment, rendered as real-time, dynamic, realistic images, in which users wearing head-mounted display devices and the like are immersed for viewing. An obvious difference between VR content and ordinary video content is that each frame of a VR video is usually shot as a 360-degree panorama, which restores the shooting scene more clearly and accurately. During playback, since the screen of the playback device is usually a flat surface, it cannot display the whole 360-degree panorama at once. Therefore, the playback device first needs to determ...
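
To illustrate the background point that a playback device can only show a viewing-angle-dependent portion of a 360-degree panoramic frame, here is a minimal sketch, not from the patent, that crops a viewport out of an equirectangular panorama given a display yaw, pitch, and field of view; the crop is a rough stand-in for a real perspective reprojection, and all names and defaults are illustrative.

```python
import numpy as np

def crop_viewport(equirect: np.ndarray, yaw_deg: float, pitch_deg: float,
                  h_fov_deg: float = 90.0, v_fov_deg: float = 60.0) -> np.ndarray:
    """Cut a rectangular viewport out of an equirectangular panorama.

    equirect: H x W x 3 image covering 360 deg horizontally and 180 deg vertically.
    yaw_deg / pitch_deg: the display angle chosen by the playback device.
    A real renderer would reproject to a perspective view; this simple crop is
    enough to show that only a viewing-angle-dependent portion is displayed."""
    h, w = equirect.shape[:2]
    cx = int((yaw_deg % 360.0) / 360.0 * w)           # horizontal centre pixel
    cy = int((90.0 - pitch_deg) / 180.0 * h)          # vertical centre pixel
    half_w = int(h_fov_deg / 360.0 * w / 2)
    half_h = int(v_fov_deg / 180.0 * h / 2)
    cols = np.arange(cx - half_w, cx + half_w) % w    # wrap around horizontally
    rows = np.clip(np.arange(cy - half_h, cy + half_h), 0, h - 1)
    return equirect[np.ix_(rows, cols)]

# Example with a dummy 180x360 panorama: only a 60x90-degree window is shown.
pano = np.zeros((180, 360, 3), dtype=np.uint8)
view = crop_viewport(pano, yaw_deg=30.0, pitch_deg=0.0)
print(view.shape)  # (60, 90, 3)
```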


Application Information

IPC (IPC8): H04N13/366; H04N13/398; H04N21/2187; H04N21/43; H04N21/4402; H04N21/442; H04N21/45
CPC: H04N21/2187; H04N21/4302; H04N21/4402; H04N21/44213; H04N21/4508; H04N13/00; H04N21/816; H04N21/43076; H04N21/41407; G06F3/011; G06T11/00; H04L65/80; H04L67/131; H04L65/613
Inventor 张哲
Owner ALIBABA GRP HLDG LTD