43 results for "Photon mapping" patented technology

In computer graphics, photon mapping is a two-pass global illumination algorithm developed by Henrik Wann Jensen that approximately solves the rendering equation. Rays from the light source and rays from the camera are traced independently until some termination criterion is met, then they are connected in a second step to produce a radiance value. It is used to realistically simulate the interaction of light with different objects. Specifically, it is capable of simulating the refraction of light through a transparent substance such as glass or water, diffuse interreflection between illuminated objects, the subsurface scattering of light in translucent materials, and some of the effects caused by particulate matter such as smoke or water vapor. It can also be extended to more accurate simulations of light such as spectral rendering.
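The two-pass structure described above can be illustrated with a minimal sketch (not a full renderer): pass 1 deposits photons on a surface and records their positions and power, pass 2 estimates radiance at a query point by density estimation over the k nearest photons. The scene here (a point light over the y=0 plane) and all helper names are illustrative assumptions.

```python
import math, random

def trace_photons(n, light_pos=(0.0, 1.0, 0.0), seed=0):
    """Pass 1: shoot n photons downward onto the y=0 plane, record hits."""
    rng = random.Random(seed)
    photon_map = []
    power = 1.0 / n  # total light power split evenly across photons
    for _ in range(n):
        # random direction in the downward hemisphere (not normalized; a sketch)
        dx, dz = rng.uniform(-1, 1), rng.uniform(-1, 1)
        dy = -1.0
        t = light_pos[1] / -dy  # ray parameter where the photon reaches y = 0
        hit = (light_pos[0] + t * dx, 0.0, light_pos[2] + t * dz)
        photon_map.append((hit, power))
    return photon_map

def radiance_estimate(photon_map, x, k=50):
    """Pass 2: gather the k nearest photons; irradiance ~ sum(power) / (pi * r^2)."""
    dists = sorted((math.dist(p[0], x), p[1]) for p in photon_map)
    nearest = dists[:k]
    r = nearest[-1][0]  # radius of the gather sphere
    return sum(pw for _, pw in nearest) / (math.pi * r * r) if r > 0 else 0.0

pm = trace_photons(10000)
e = radiance_estimate(pm, (0.0, 0.0, 0.0))
```

A production implementation would store the photon map in a kd-tree rather than sorting all photons per query, and would trace full bounce paths instead of a single light-to-plane segment.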

Photon mapping accelerating method based on point cache

The invention discloses a photon mapping acceleration method based on a point cache. The method includes the following steps: (1) photon tracing, in which a number of photons are emitted into the scene from a light source, their paths are traced, and the information of photons colliding with objects is recorded and stored in a photon map; (2) preprocessing, in which the irradiance is precomputed and stored in the point cache, shading then continues from the computed irradiance, and the resulting color values are likewise stored in the point cache; (3) rendering, in which the image is rendered with a ray tracing algorithm, emitting traced reflection, refraction, and diffuse rays at each intersection of a ray with an object surface according to the surface's material attributes. A final gathering step emits N diffuse rays; where these rays intersect objects in the scene, the color values of the N cached points closest to each intersection are returned and averaged to give the indirect illumination color value. The final image is the sum of this indirect illumination color value and the direct illumination color value.
Owner:SHANDONG UNIV
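The final-gather step in the abstract above can be sketched as follows. The helper names and the tiny cache are illustrative assumptions, not from the patent: each diffuse gather ray's hit point looks up the nearest precomputed cache entry instead of tracing further bounces, and the mean of those cached colors is added to the direct illumination.

```python
import math

def nearest_cache_color(cache, point):
    """Return the color stored at the cache entry closest to `point`."""
    _, color = min(cache, key=lambda entry: math.dist(entry[0], point))
    return color

def shade(direct_color, gather_hits, cache):
    """Indirect = mean of nearest-cache colors over the gather rays; final = direct + indirect."""
    n = len(gather_hits)
    indirect = [0.0, 0.0, 0.0]
    for hit in gather_hits:
        c = nearest_cache_color(cache, hit)
        indirect = [a + b / n for a, b in zip(indirect, c)]
    return [d + i for d, i in zip(direct_color, indirect)]

# toy point cache: (position, precomputed color) pairs
cache = [((0, 0, 0), (0.2, 0.1, 0.0)), ((1, 0, 0), (0.0, 0.3, 0.1))]
hits = [(0.1, 0, 0), (0.9, 0, 0)]  # hit points of two diffuse gather rays
final = shade((0.5, 0.5, 0.5), hits, cache)
```

The speed-up claimed by the patent comes from replacing a recursive radiance evaluation at each gather hit with this constant-time cache lookup.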

Parallelization type progressive photon mapping method and device based on OpenCL

The invention discloses an OpenCL-based parallelized progressive photon mapping method and device, applied to global illumination in virtual reality. Progressive photon mapping is parallelized with OpenCL in the following steps: first, initialization is performed, the scene model is loaded, and the OpenCL compute parameters are initialized; second, viewpoint ray tracing, photon tracing, and scene rendering are parallelized with OpenCL, the workloads are assigned to the appropriate processors, and after the command queue has executed, the computation results are read back and transmitted to the CPU; finally, the data resources held on the CPU are released through the OpenCL standard library functions. The method markedly improves the efficiency of the progressive photon mapping algorithm, running four to nine times faster than a CPU-only implementation, with good portability and a somewhat improved rendering result.
Owner:BEIJING UNIV OF POSTS & TELECOMM
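The host-side flow described above (partition the work, enqueue per-ray kernels, read results back) can be mimicked in plain Python with a thread pool standing in for the OpenCL command queue. The kernel body below is a placeholder, not the patent's code; the point is only the enqueue/execute/read-back structure.

```python
from concurrent.futures import ThreadPoolExecutor

def trace_kernel(work_item):
    """Stand-in for a per-ray OpenCL kernel: one work item per viewpoint ray."""
    ray_id, scene_scale = work_item
    return ray_id * scene_scale  # placeholder for an intersection result

def render(n_rays, scene_scale=2, n_workers=4):
    work = [(i, scene_scale) for i in range(n_rays)]  # the NDRange of work items
    # analogous to clEnqueueNDRangeKernel followed by clEnqueueReadBuffer:
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = list(pool.map(trace_kernel, work))
    return results  # results are now resident on the "host" side

out = render(8)
```

In real OpenCL the speed-up comes from mapping each work item to a GPU/accelerator thread; ray tracing, photon tracing, and rendering each parallelize this way because their work items are independent.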

System and method for generating and using systems of cooperating and encapsulated shaders and shader dags for use in a computer graphics system

A computer graphics system is described in which a new type of entity, referred to as a "phenomenon," can be created, instantiated and used in rendering an image of a scene. A phenomenon is an encapsulated shader DAG comprising one or more nodes, each comprising a shader, or an encapsulated set of such DAGs interconnected so as to cooperate. Phenomena are instantiated and attached to entities in the scene created during the scene definition process, and define diverse features of the scene, including color and textural features of object surfaces, characteristics of volumes and geometries, features of the light sources illuminating the scene, features of the simulated cameras used during rendering, and numerous other features useful in rendering. Phenomena selected by an operator for use with a scene may be predefined, or they may be constructed from base shader nodes using a phenomenon creator, which ensures that phenomena are constructed so that the shaders in the DAG or cooperating DAGs can cooperate correctly during rendering of an image of the scene. Before being attached to a scene, a phenomenon is instantiated by providing values, or functions that define the values, for each of the phenomenon's parameters, using a phenomenon editor; the phenomenon editor allows the operator to view the effects produced by the selected parameter values. During scene image generation, a scene image generator operates in a series of phases: a pre-processing phase, a rendering phase and a post-processing phase. During the pre-processing phase, the scene image generator can perform pre-processing operations, such as shadow and photon mapping, multiple inheritance resolution, and the like.
The scene image generator may perform pre-processing operations if, for example, a phenomenon attached to the scene includes a geometry shader to generate geometry defined thereby for the scene. During the rendering phase, the scene image generator renders the image. During the post-processing phase, the scene image generator may perform post-processing operations if, for example, a phenomenon attached to the scene includes a shader that defines post-processing operations.
Owner:MENTAL IMAGES
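The shader-DAG idea above can be sketched minimally: each node wraps a shader function whose inputs are the evaluated outputs of its child nodes, and evaluating the root walks the DAG. All names here are illustrative; the patent's "phenomenon" additionally encapsulates parameters bound at instantiation time via the phenomenon editor.

```python
class ShaderNode:
    """One node of a shader DAG: a shader function plus its input nodes."""
    def __init__(self, fn, children=()):
        self.fn = fn
        self.children = list(children)

    def evaluate(self, context):
        # evaluate inputs bottom-up, then apply this node's shader
        inputs = [child.evaluate(context) for child in self.children]
        return self.fn(context, *inputs)

# leaf shaders: a procedural checker pattern and a flat tint parameter
checker = ShaderNode(lambda ctx: 1.0 if (ctx["u"] + ctx["v"]) % 2 == 0 else 0.0)
tint    = ShaderNode(lambda ctx: ctx["tint"])
# root shader multiplies its children's outputs
root = ShaderNode(lambda ctx, a, b: a * b, children=[checker, tint])

value = root.evaluate({"u": 1, "v": 1, "tint": 0.5})
```

Encapsulating such a DAG behind a fixed parameter interface is what lets a non-programmer operator instantiate and tune it without being able to wire the internal shaders together incorrectly.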

Photon mapping parallel method for MIC (intel many integrated core) framework coprocessor

The invention discloses a photon mapping parallelization method for an MIC (Intel Many Integrated Core) architecture coprocessor. The method comprises the following stages: a photon tracing start stage, in which the scene data are uploaded to the MIC coprocessor; a photon-ray intersection stage, in which each thread performs the intersection computation for one ray at a time and returns the ray's intersection information; a photon map generation stage, in which different photon impact behaviors are executed according to the intersection information and the impacted photons are stored into a photon map, until all photon rays have been traced and a complete photon map is produced; a rendering start stage, in which the photon map is uploaded, the rendering points are organized into groups, and hierarchical clustering is applied to each group; a nearest-photon search stage, in which the clustered rendering points are uploaded, the nearest photons of each rendering point in a cluster are computed, and the outgoing radiance of each rendering point is determined; and an image generation stage, in which the colors of the rendering points are computed per cluster and returned to screen space to form the final image.
Owner:SHANDONG UNIV
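The grouping and nearest-photon stages can be illustrated with a small sketch. Grid-cell bucketing stands in for the patent's hierarchical clustering, and the brute-force nearest-photon search below stands in for the coprocessor kernel; both simplifications, and all names, are assumptions for illustration.

```python
import math

def cluster_points(points, cell=1.0):
    """Group rendering points by integer grid cell (a stand-in for hierarchical clustering)."""
    groups = {}
    for p in points:
        key = tuple(int(c // cell) for c in p)
        groups.setdefault(key, []).append(p)
    return list(groups.values())

def nearest_photon(photons, point):
    """Brute-force nearest-photon lookup for one rendering point."""
    return min(photons, key=lambda ph: math.dist(ph, point))

photons = [(0.0, 0.0), (2.0, 2.0)]
render_pts = [(0.1, 0.2), (0.3, 0.1), (2.1, 1.9)]
matches = [(p, nearest_photon(photons, p))
           for group in cluster_points(render_pts)
           for p in group]
```

Clustering matters on a many-core coprocessor because nearby rendering points search overlapping photon neighborhoods, so processing a cluster together keeps those photons in fast local memory.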

Method for drawing surface caustic effect of 3D virtual scene generated by smooth surface refraction

The invention discloses a method for drawing the surface caustic effects produced in a three-dimensional (3D) virtual scene by refraction through smooth surfaces, belonging to the technical field of realistic 3D virtual scene rendering. At present, such surface caustic effects are generally drawn with a photon mapping algorithm, which must trace and process a large number of photons emitted by the light source and therefore seriously reduces the drawing efficiency. The method comprises the following steps: creating all the caustic illuminants generated in the 3D virtual scene by smooth-surface refraction and storing them in corresponding data structures; and, when the 3D virtual scene is drawn, determining from the positional relationship between the scene point to be drawn and the caustic illuminants whether their contribution should be added to that point's illumination value, and finally drawing the surface caustic effect. The method can easily be integrated into a ray tracing framework and markedly improves the realism of the rendered 3D virtual scene.
Owner:CHANGCHUN UNIV OF SCI & TECH
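The positional test described above can be sketched as follows: each stored caustic illuminant contributes to a scene point only if the point falls inside the illuminant's region of influence. The radius test, the linear falloff, and the tuple layout are illustrative assumptions; the patent's actual data structures and falloff are not specified here.

```python
import math

def shade_point(point, base_illum, caustic_illuminants):
    """Add each caustic illuminant's contribution if `point` lies in its region of influence."""
    total = base_illum
    for center, radius, intensity in caustic_illuminants:
        d = math.dist(center, point)
        if d <= radius:  # positional test: does this caustic reach the scene point?
            total += intensity * (1.0 - d / radius)  # linear falloff (assumed)
    return total

# one caustic illuminant: (center, radius of influence, intensity)
caustics = [((0.0, 0.0, 0.0), 1.0, 0.8)]
lit   = shade_point((0.5, 0.0, 0.0), 0.2, caustics)  # inside the caustic region
unlit = shade_point((3.0, 0.0, 0.0), 0.2, caustics)  # outside it
```

The saving over photon mapping is that the expensive photon tracing collapses into a one-time construction of the caustic illuminants, after which shading reduces to these cheap per-point position tests.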
