Virtual reality image processing method and device

An image processing device and virtual reality technology, applied in the field of virtual reality image processing, which can solve the problem that transmitting image data occupies a large bandwidth, and achieve the effects of reducing compression loss, increasing the bandwidth compression rate, and saving transmission bits.

Inactive Publication Date: 2019-04-05
XIAN CREATION KEJI CO LTD
Cites: 0 | Cited by: 0

AI Technical Summary

Problems solved by technology

Transmitting such a large amount of image data occupies a large bandwidth. The existing solution is to compress the data before transmission and decompress it afterwards. How to provide a data compression method with high compression efficiency has therefore become a hot research issue.



Examples


Embodiment 1

[0044] See Figure 1, which is a schematic flowchart of a virtual reality image processing method provided by an embodiment of the present invention. The image processing method includes the following steps:

[0045] (a) Obtain left-eye image data and corresponding right-eye image data;

[0046] (b) Obtain the difference pixel matrix of the left-eye image and the right-eye image;

[0047] (c) Obtain the macroblock to be processed of the difference pixel matrix;

[0048] (d) Obtain sampling points and non-sampling points of the macroblock to be processed;

[0049] (e) Obtain the prediction residuals of the sampling points and the non-sampling points;

[0050] (f) Obtain the distribution type of the prediction residual of the macroblock to be processed;

[0051] (g) Obtain the quantized residual of the macroblock to be processed according to the distribution type.

[0052] Wherein, step (b) includes:

[0053] The difference pixel matrix is obtained by taking the difference between ea...
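For illustration only, the following Python sketch (not part of the original disclosure) walks through steps (a) to (g). The macroblock size of 16, the checkerboard choice of sampling points, the left-neighbour predictor, the two distribution classes and the quantization steps are all assumptions, since the embodiment does not specify them here:

# Hedged sketch of steps (a)-(g); unspecified details are assumptions.
import numpy as np

BLOCK = 16  # assumed macroblock size


def difference_matrix(left, right):
    """(b) Difference pixel matrix: element-wise left minus right (widened to int16)."""
    return left.astype(np.int16) - right.astype(np.int16)


def macroblocks(diff):
    """(c) Split the difference matrix into BLOCK x BLOCK macroblocks."""
    m, n = diff.shape
    for r in range(0, m - BLOCK + 1, BLOCK):
        for c in range(0, n - BLOCK + 1, BLOCK):
            yield diff[r:r + BLOCK, c:c + BLOCK]


def split_samples(block):
    """(d) Assumed checkerboard split into sampling / non-sampling points."""
    mask = (np.indices(block.shape).sum(axis=0) % 2) == 0
    return block[mask], block[~mask]


def prediction_residuals(block):
    """(e) Assumed prediction: each pixel is predicted by its left neighbour."""
    pred = np.roll(block, 1, axis=1)
    pred[:, 0] = block[:, 0]  # first column predicts itself
    return block - pred


def distribution_type(residuals):
    """(f) Assumed classification of the residual distribution by its spread."""
    return "narrow" if np.std(residuals) < 4 else "wide"


def quantize(residuals, dist_type):
    """(g) Assumed uniform quantization, coarser for wide distributions."""
    step = 2 if dist_type == "narrow" else 8
    return np.round(residuals / step).astype(np.int16)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.integers(0, 256, (64, 64), dtype=np.uint8)   # (a) left-eye image
    right = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # (a) right-eye image
    diff = difference_matrix(left, right)
    for block in macroblocks(diff):
        samples, non_samples = split_samples(block)
        residuals = prediction_residuals(block)
        q = quantize(residuals, distribution_type(residuals))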

Embodiment 2

[0079] Referring again to Figure 1, on the basis of the above-mentioned embodiments, this embodiment describes a virtual reality image processing method in detail. Specifically, the method includes the following steps:

[0080] (S01) Acquire left-eye image data and right-eye image data obtained at any time on the host of the virtual reality device;

[0081] Wherein, the left-eye image data and the right-eye image data can be two image pixel matrices of the same size. Let the two pixel matrices be the left-eye pixel matrix A and the right-eye pixel matrix B, respectively. The sizes of A and B are both m×n, where m and n are integers greater than 0, m represents the number of rows of the pixel matrix, and n represents the number of columns of the pixel matrix.

[0082] (S02) Take the difference between each pixel value in the left-eye image data and the corresponding pixel value in the right-eye image data to obtain a difference pixel matrix;

[0083] Among them, the differ...
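As an illustration, a minimal Python sketch of (S01) and (S02), assuming 8-bit grayscale pixel matrices and widening to a signed type before subtraction so that negative differences are representable (an implementation assumption not stated in the embodiment):

import numpy as np

# (S01) left-eye and right-eye pixel matrices A and B, both of size m x n
A = np.array([[120, 121, 119, 118],
              [122, 123, 121, 120]], dtype=np.uint8)
B = np.array([[118, 120, 119, 117],
              [121, 121, 120, 120]], dtype=np.uint8)

# (S02) difference pixel matrix: widen to int16 first so that negative
# differences do not wrap around in unsigned arithmetic
D = A.astype(np.int16) - B.astype(np.int16)

# The left-eye image is exactly recoverable from B and D, so only one full
# image plus the typically small-valued difference matrix needs to be coded.
assert np.array_equal(A, (B.astype(np.int16) + D).astype(np.uint8))
print(D)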

Embodiment 3

[0164] Based on the above embodiments, this embodiment focuses on a detailed description of a virtual reality image processing device. The virtual reality image processing device includes a processor and a storage medium, which are used to execute the virtual reality image processing methods of the first and second embodiments above, wherein the storage medium is used to store the related variables of the virtual reality image processing method, and the processor is used to execute the virtual reality image processing method.
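A minimal sketch of how such a device could be organized in software, assuming the storage medium corresponds to a variable store and the processor executes the method; the class and method names below are hypothetical and only step (b)/(S02) is shown:

import numpy as np

class VRImageProcessingDevice:
    """Processor plus storage medium executing the image processing method."""

    def __init__(self):
        self.storage = {}  # storage medium: related variables of the method

    def process_frame(self, left, right):
        # processor side: run step (b)/(S02); the later steps would follow
        # the earlier pipeline sketch
        diff = left.astype(np.int16) - right.astype(np.int16)
        self.storage["difference"] = diff
        return diff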



Abstract

The invention relates to a method for processing a virtual reality image. The method comprises the following steps: (a) acquiring left-eye image data and corresponding right-eye image data; (b) obtaining a difference pixel matrix of the left-eye image and the right-eye image; (c) obtaining a to-be-processed macroblock of the difference pixel matrix; (d) acquiring sampling points and non-sampling points of the to-be-processed macroblock; (e) obtaining prediction residuals of the sampling points and the non-sampling points; (f) obtaining the distribution type of the prediction residuals of the to-be-processed macroblock; and (g) obtaining the quantized residuals of the to-be-processed macroblock according to the distribution type. By acquiring the difference information between the left-eye video data and the right-eye video data and applying a compression encoding method with a high compression ratio, the compression loss of complex texture images is reduced, the number of transmission bits is further reduced, and the bandwidth compression rate is increased.

Description

Technical field

[0001] The invention belongs to the technical field of virtual reality data processing, and specifically relates to a method and a device for processing virtual reality images.

Background technique

[0002] With the advancement of science, virtual reality (VR) device technology has become a hot research topic today. Its display principle is that the left-eye and right-eye screens display the images for the left and right eyes respectively; after the human eyes obtain this information with disparity, a three-dimensional perception is formed in the brain.

[0003] In order to achieve a high-quality VR application experience and meet the three performance requirements of VR applications, namely 1) low response delay, 2) high frame refresh rate and 3) high-quality images, VR systems usually hand over the heavy rendering work to a powerful host. The host transmits the rendered frame to the head-mounted display worn by the user ...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F3/01H04N19/124H04N19/176
CPCG06F3/01H04N19/124H04N19/176
Inventor 李雯冉文方
Owner XIAN CREATION KEJI CO LTD