Real-time rendering method of large-scale scenes based on user behavior analysis

A behavior-analysis and real-time rendering technology, applied in the field of machine learning, which solves problems such as excessive resource consumption and rendering that cannot be achieved in real time, and achieves the effects of avoiding rendering delay and saving rendering resources.

Status: Inactive
Publication Date: 2021-04-06
UNIV OF SCI & TECH BEIJING +1

AI Technical Summary

Problems solved by technology

When the number of models in the scene is large and the model quality is high, real-time high-quality rendering of all models in the scene is very resource-intensive, and sometimes it is even impossible to achieve.

Method used

As shown in figure 1, the method collects user movement behavior data in the virtual reality scene and builds a user behavior database, trains a neural-network-based user behavior analysis model on this database to obtain a personalized model, and uses the personalized model to predict the user's position and vision range at the next moment so that the corresponding part of the scene can be pre-loaded and rendered.


Examples


Embodiment Construction

[0025] In order to make the technical problems to be solved, the technical solutions, and the advantages of the present invention clearer, the invention is described in detail below with reference to the drawings and specific embodiments.

[0026] The invention provides a large-scale scene real-time rendering method based on user behavior analysis.

[0027] As shown in figure 1, the method first collects user movement behavior data in the virtual reality scene and builds a user behavior database for the virtual scene; then it builds a basic user behavior analysis model based on a neural network and trains the model with the user behavior database obtained above to obtain a personalized model; finally, it calculates the user's position and vision range at the next moment according to the trained personalized model and outputs the scene to be rendered according to the current virtual reality scene data.
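The patent publishes no implementation code, so the following is only a minimal sketch of the behavior-analysis step, assuming PyTorch, a hypothetical data layout in which each training sample is a short window of logged (position, view-direction) records, and the next-moment record as the prediction target; layer sizes, window length, and training details are illustrative assumptions rather than values from the source.

```python
# Sketch of a neural-network behavior model that predicts the user's position
# and view direction at the next moment from a short history of movement
# records. Assumes PyTorch; dimensions and hyperparameters are illustrative.
import torch
import torch.nn as nn

class BehaviorModel(nn.Module):
    def __init__(self, feature_dim=6, hidden_dim=64):
        super().__init__()
        # feature_dim = 3 position components + 3 view-direction components
        self.encoder = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, feature_dim)

    def forward(self, history):
        # history: (batch, window_len, feature_dim)
        _, (h_n, _) = self.encoder(history)
        return self.head(h_n[-1])  # predicted record for the next moment

# Training loop over the user behavior database (stand-in tensors below).
model = BehaviorModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

history = torch.randn(32, 10, 6)  # 32 windows of 10 logged records each
target = torch.randn(32, 6)       # the record observed at the next moment
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(history), target)
    loss.backward()
    optimizer.step()
```

One plausible reading of the "personalized model" in the text is a base model trained on the whole user behavior database and then fine-tuned on an individual user's logs, but the source does not spell out the personalization step.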

[0028] The following describes the method in conjunction with specific embodiments. ...



Abstract

The invention provides a large-scale scene real-time rendering method based on user behavior analysis, which belongs to the technical field of machine learning. The method first collects user movement behavior data in a virtual reality scene and constructs a user behavior database for the virtual scene. It then builds a basic user behavior analysis model based on a neural network and trains the model with the user behavior database to obtain a personalized model. Finally, it calculates the user's position and vision range at the next moment according to the personalized model and outputs the scene to be rendered according to the current virtual reality scene data. When the number of models in the scene is large and the model quality is high, this method can pre-load the models within the user's possible vision range at the next moment, realizing real-time rendering of large-scale scenes, saving rendering resources, avoiding rendering delay, and creating a truly immersive experience.
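To accompany the pre-loading claim in the abstract, here is a minimal sketch of how models inside the predicted vision range might be selected for pre-loading, assuming a simple distance-plus-cone test; the Model record, max_dist, and fov_deg are illustrative assumptions, since the text quoted here does not specify the selection criterion.

```python
# Sketch: select only the scene models that fall inside the user's predicted
# vision range at the next moment so they can be loaded ahead of time.
import math
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    position: tuple  # (x, y, z) centre of the model in scene coordinates

def in_vision_range(model, eye, view_dir, max_dist=50.0, fov_deg=110.0):
    """Rough test: the model is within max_dist of the predicted eye position
    and within half of fov_deg of the predicted (unit-length) view direction."""
    offset = [m - e for m, e in zip(model.position, eye)]
    dist = math.sqrt(sum(d * d for d in offset))
    if dist > max_dist:
        return False
    if dist == 0.0:
        return True
    cos_angle = sum(d * v for d, v in zip(offset, view_dir)) / dist
    return cos_angle >= math.cos(math.radians(fov_deg / 2))

def models_to_preload(scene_models, predicted_eye, predicted_view_dir):
    return [m for m in scene_models
            if in_vision_range(m, predicted_eye, predicted_view_dir)]

# Example: only "statue" lies ahead of the user and close enough to preload.
scene = [Model("statue", (5.0, 0.0, 1.0)), Model("tower", (-80.0, 0.0, 0.0))]
print(models_to_preload(scene, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
```

In such a pipeline the predicted eye position and view direction would come from the trained behavior model, and the selected models would be streamed to the renderer before the next frame is due.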

Description

Technical field

[0001] The invention relates to the technical field of machine learning, and in particular to a large-scale scene real-time rendering method based on user behavior analysis.

Background technique

[0002] Virtual reality (VR) technology has the characteristics of immersion, interaction and creativity [1]. It integrates the latest achievements of computer graphics, digital image processing, multimedia, sensors, networks and other disciplines, and is widely used in education, medical care, design, art, planning and many other fields [2-3]. In order to create a truly immersive experience, virtual reality technology places high demands on real-time rendering of scenes. When the number of models in the scene is large and the model quality is high, real-time high-quality rendering of all the models in the scene is very resource-intensive, and sometimes it is even impossible to achieve. Therefore, it is necess...


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06F3/01; G06F16/20; G06T15/00
CPC: G06F3/011; G06T15/005
Inventor: 王晓慧, 张彦春, 杨晓红
Owner: UNIV OF SCI & TECH BEIJING