
Method, device and electronic device for determining rendering objects in a virtual scene

A technology for determining rendering objects in a virtual scene, applied in 3D image processing, instruments and computing. It addresses problems such as reduced model accuracy, reduced image rendering quality and degraded 3D scene visual effects, with the effect of saving resource overhead and reducing the amount of computation.

Active Publication Date: 2022-04-22
BEIJING QIYI CENTURY SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] Reducing the image rendering quality or the model accuracy degrades the visual effect of the rendered 3D scene, which affects the user's visual experience.



Examples


Embodiment Construction

[0074] The technical solutions in the embodiments of the present invention are described below with reference to the accompanying drawings.

[0075] Referring to Figure 1, which is a schematic flowchart of a method for determining a rendering object in a virtual scene provided by an embodiment of the present invention, the method may include the following steps:

[0076] S110. Determine a visible space in the virtual scene, and use the visible space as an effective rendering space.

[0077] The visible space refers to the spatial area in the virtual scene that the user can see at the current moment. In this embodiment, the virtual reality program may determine a position in the virtual scene as a viewpoint to simulate the position of the user's eyes in the virtual scene. The observation space that the viewpoint can observe in the virtual scene is determined according to the position of the viewpoint and the field of view (Field Of View, FOV for...
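The patent text publishes no source code, but the visible space described in step S110 and paragraph [0077] is conventionally modelled as a view frustum built from the viewpoint position, its orientation and the field of view. The C++ sketch below shows one such construction; all type and function names (Vec3, Plane, Frustum, buildVisibleSpace) are illustrative assumptions, and the aspect ratio and near/far distances are extra parameters not mentioned in the patent text.

```cpp
// Illustrative sketch only: the patent publishes no source code, and every
// name here (Vec3, Plane, Frustum, buildVisibleSpace) is a hypothetical stand-in.
#include <cmath>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 normalize(const Vec3& v) { return v * (1.0f / std::sqrt(dot(v, v))); }

// A bounding plane of the visible space: inward-pointing unit normal plus a point on the plane.
struct Plane {
    Vec3 normal;
    Vec3 point;
};

// The effective rendering space modelled as the six planes of a view frustum.
struct Frustum {
    Plane planes[6];  // near, far, left, right, top, bottom
};

// Build the visible space from the simulated eye position (the viewpoint), its
// orientation (front/up/right) and the vertical field of view, as in step S110.
Frustum buildVisibleSpace(const Vec3& viewpoint, const Vec3& front, const Vec3& up,
                          const Vec3& right, float fovY, float aspect,
                          float zNear, float zFar) {
    const float halfV = zFar * std::tan(fovY * 0.5f);  // half height of the far plane
    const float halfH = halfV * aspect;                // half width of the far plane
    const Vec3 farCenter = front * zFar;

    Frustum f;
    f.planes[0] = {front, viewpoint + front * zNear};                              // near
    f.planes[1] = {front * -1.0f, viewpoint + farCenter};                          // far
    f.planes[2] = {normalize(cross(up, farCenter + right * halfH)), viewpoint};    // left
    f.planes[3] = {normalize(cross(farCenter - right * halfH, up)), viewpoint};    // right
    f.planes[4] = {normalize(cross(right, farCenter - up * halfV)), viewpoint};    // top
    f.planes[5] = {normalize(cross(farCenter + up * halfV, right)), viewpoint};    // bottom
    return f;
}
```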



Abstract

Embodiments of the present invention provide a method, device and electronic device for determining rendering objects in a virtual scene. The method includes: determining the visible space in the virtual scene and using the visible space as an effective rendering space; judging whether the space occupied by an object in the virtual scene intersects the effective rendering space; and, if the space occupied by the object does not intersect the effective rendering space, determining the space occupied by the object as a space in which complete rendering is not performed. In the embodiments of the present invention, when the space occupied by an object in the virtual scene does not intersect the current visible space, that space can be marked as one that does not undergo complete rendering. Because the space occupied by the object does not intersect the current visible space, the amount of computation performed by the rendering module when rendering the virtual scene can be reduced without affecting the user's current visual experience, saving resource overhead.
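The intersection judgment described in the abstract is, in conventional engines, a test between the object's bounding volume and the view frustum. The sketch below continues the hypothetical Vec3/Plane/Frustum types from the sketch above and treats the space occupied by an object as an axis-aligned bounding box; it is a standard culling check under those assumptions, not code taken from the patent.

```cpp
// Illustrative sketch only, continuing the hypothetical Vec3/Plane/Frustum types
// from the sketch above; a conventional AABB-vs-frustum intersection test.

// Axis-aligned bounding box standing in for "the space occupied by the object".
struct AABB {
    Vec3 center;
    Vec3 halfExtents;  // half size along x, y and z
};

// True when the box lies entirely on the outward side of one plane.
static bool outsidePlane(const AABB& box, const Plane& p) {
    // Projection radius of the box onto the plane normal.
    const float r = box.halfExtents.x * std::fabs(p.normal.x) +
                    box.halfExtents.y * std::fabs(p.normal.y) +
                    box.halfExtents.z * std::fabs(p.normal.z);
    // Signed distance from the box center to the plane (normals point inward).
    const float d = dot(p.normal, box.center - p.point);
    return d < -r;
}

// Returns true when the object's space intersects the effective rendering space
// and should therefore be rendered completely; false means the space can be
// marked as one in which complete rendering is not performed.
bool intersectsEffectiveRenderingSpace(const AABB& objectSpace, const Frustum& visibleSpace) {
    for (const Plane& p : visibleSpace.planes) {
        if (outsidePlane(objectSpace, p)) {
            return false;  // entirely outside at least one plane: no intersection
        }
    }
    return true;  // conservative: may keep a few boxes that only graze a corner
}
```

A renderer would typically run such a test per object each frame and submit only the objects for which it returns true, which matches the stated effect of reducing the rendering module's computation without affecting what the user currently sees.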

Description

Technical Field

[0001] The present invention relates to the technical field of intelligent hardware, and in particular to a method, device and electronic device for determining rendering objects in a virtual scene.

Background Technique

[0002] Virtual reality (Virtual Reality, VR for short) software can render a three-dimensional (3 Dimensions, 3D for short) virtual scene for the user, and after the user puts on a corresponding VR device, it can provide the user with an immersive experience. However, rendering the entire virtual scene brings a large resource overhead to the device; on devices with poor performance, rendering takes a long time to complete, resulting in dropped frames, freezes and other phenomena that damage the user experience. In the prior art, the resource overhead that rendering the virtual scene imposes on the device is reduced by lowering the image rendering quality or the accuracy of the model, so as to avoid the occurrence of ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T15/00
CPC: G06T15/005
Inventor: 赵献静, 黄安成, 秦涛
Owner: BEIJING QIYI CENTURY SCI & TECH CO LTD