Large-scale scene real-time rendering method based on user behavior analysis

A behavior analysis and real-time rendering technology, applied in the field of machine learning, which addresses the problems that rendering all models in a large scene is resource-intensive or even infeasible, and achieves the effects of avoiding rendering delay and saving rendering resources

Active Publication Date: 2019-03-08
UNIV OF SCI & TECH BEIJING +1
Cites: 12 | Cited by: 1

AI Technical Summary

Problems solved by technology

When the number of models in the scene is large and the model quality is high, real-time high-quality rendering of all the models in the scene is very resource-intensive, and sometimes it is even impossible to achieve.



Examples


Embodiment Construction

[0025] To make the technical problems to be solved, the technical solutions and the advantages of the present invention clearer, the invention is described in detail below with reference to the drawings and specific embodiments.

[0026] The invention provides a large-scale scene real-time rendering method based on user behavior analysis.

[0027] As shown in figure 1, the method first collects user movement behavior data in the virtual reality scene and constructs a user behavior database for the virtual scene; it then builds a basic user behavior analysis model based on a neural network and trains it with the user behavior database to obtain a personalized model; finally, it calculates the user's position and vision range at the next moment according to the trained personalized model and, combined with the current virtual reality scene data, outputs the scene to be rendered.
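For concreteness, the sketch below shows one way the prediction step could be set up in Python with PyTorch. The window length, the pose encoding (position plus view angles) and the names NextPosePredictor and train_personalized_model are assumptions made for this illustration only; the patent does not specify the network architecture or the training procedure.

# Minimal sketch, assuming the behavior database stores fixed-length windows of
# recent head poses; names and sizes below are hypothetical, not from the patent.
import torch
import torch.nn as nn

WINDOW = 8      # number of past behavior samples fed to the model (assumed)
POSE_DIM = 6    # x, y, z position plus yaw, pitch, roll of the view (assumed)

class NextPosePredictor(nn.Module):
    """Basic behavior-analysis model: predicts the user's pose at the next
    moment from a short history of recent poses."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(WINDOW * POSE_DIM, 128),
            nn.ReLU(),
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Linear(64, POSE_DIM),   # predicted pose at the next time step
        )

    def forward(self, history):        # history: (batch, WINDOW, POSE_DIM)
        return self.net(history.flatten(1))

def train_personalized_model(model, loader, epochs=10, lr=1e-3):
    """Trains the basic model on one user's behavior database, given as pairs
    of (pose history, pose actually observed at the next moment)."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for history, next_pose in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(history), next_pose)
            loss.backward()
            optimizer.step()
    return model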

[0028] The method is described below in conjunction with specific embodiments.

[0...



Abstract

The invention provides a large-scale scene real-time rendering method based on user behavior analysis, belonging to the technical field of machine learning. The method first collects user behavior data in the virtual reality scene and constructs a database of user behavior in the virtual scene. A basic user behavior analysis model is then constructed based on a neural network, and the user behavior database is used to train the model into a personalized model. Finally, according to the personalized model, the user's position and vision range at the next moment are calculated, and, combined with the current virtual reality scene data, the scene to be rendered is output. When the number of scene models is large and the model quality is high, the method can load models in advance, thereby realizing real-time rendering of large-scale scenes, saving rendering resources, avoiding rendering delay, and creating a truly immersive experience.
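To illustrate how a predicted position and vision range could drive loading models in advance, the following minimal Python sketch selects the models whose bounding spheres fall inside a simple distance-plus-view-cone approximation of the predicted view. The Model record, the models_to_render function and the default field-of-view and distance values are hypothetical assumptions for this sketch; the patent does not prescribe this particular culling test.

# Minimal sketch, assuming each scene model is summarized by a bounding sphere
# and the predicted view is given as an eye position and a unit view direction.
import math
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    position: tuple   # (x, y, z) centre of the model in world coordinates
    radius: float     # bounding-sphere radius

def models_to_render(models, eye, view_dir, fov_deg=110.0, max_dist=50.0):
    """Returns the models whose bounding spheres intersect a distance-limited
    view cone around the predicted viewpoint; view_dir must be a unit vector."""
    half_fov = math.radians(fov_deg) / 2.0
    selected = []
    for m in models:
        to_m = tuple(p - e for p, e in zip(m.position, eye))
        dist = math.sqrt(sum(c * c for c in to_m))
        if dist - m.radius > max_dist:
            continue                           # too far away to matter
        if dist > m.radius:                    # viewer is outside the bounding sphere
            cos_angle = sum(a * b for a, b in zip(to_m, view_dir)) / dist
            angle = math.acos(max(-1.0, min(1.0, cos_angle)))
            if angle - math.asin(m.radius / dist) > half_fov:
                continue                       # entirely outside the view cone
        selected.append(m)
    return selected

A renderer could call models_to_render(scene_models, predicted_position, predicted_view_direction) one step ahead of time and begin loading the returned models before they actually enter the user's view.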

Description

Technical field

[0001] The invention relates to the technical field of machine learning, and in particular to a large-scale scene real-time rendering method based on user behavior analysis.

Background technique

[0002] Virtual reality technology (Virtual Reality, VR) has the characteristics of immersion, interaction and creativity [1]; it integrates the latest achievements of computer graphics, digital image processing, multimedia, sensors, networks and other disciplines, and is widely used in education, medical care, design, art, planning and many other fields [2-3]. To create a truly immersive experience, virtual reality technology places high demands on real-time rendering of scenes. When the number of models in the scene is large and the model quality is high, real-time high-quality rendering of all the models in the scene is very resource-intensive and sometimes even impossible to achieve. Therefore, it is necess...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06F16/20, G06T15/00
CPC: G06F3/011, G06T15/005
Inventors: 王晓慧, 张彦春, 杨晓红
Owner: UNIV OF SCI & TECH BEIJING