
3D scene generation method based on depth video

A depth-video and 3D technology, applied in the field of 3D scene generation based on depth video. It addresses the problems that traditional scene modeling cannot cope with the workload of large-scale, complex scenes, consumes substantial computation and storage, and depends heavily on computer hardware. The method avoids geometric surface modeling, offers fast rendering speed, and makes modeling convenient and efficient.

Inactive Publication Date: 2014-08-27
广东小草科技有限公司
Cites: 2 · Cited by: 35

AI Technical Summary

Problems solved by technology

On the one hand, traditional scene modeling methods cannot cope with the huge workload of large-scale, complex scene modeling. On the other hand, they depend heavily on the performance of computer hardware; in fields such as virtual reality and the Internet, which often require real-time rendering, traditional geometry-based scene modeling faces enormous challenges.




Embodiment Construction

[0036] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative efforts fall within the protection scope of the present invention.

[0037] Embodiments of the present invention provide a method for generating a 3D scene based on depth video, as shown in Figure 1, including the following steps:

[0038] S1. Use a depth camera to collect depth video and color video data, and apply filtering;

[0039] S2. Convert the filtered depth video into three-dimensional point cloud data by combining plane coordinates and depth values, and then establish a scene model based on the three-dimensional point cloud data;
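Step S2's conversion from a depth image to a point cloud is conventionally done by back-projecting each pixel through the pinhole camera model. A minimal sketch follows; the function name and the intrinsic parameters (`fx`, `fy`, `cx`, `cy`, chosen to roughly match a consumer depth camera) are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to an N x 3 point cloud using
    the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Hypothetical 640x480 frame: a flat wall 2 m from the camera.
depth = np.full((480, 640), 2.0)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)  # (307200, 3)
```

Each row of `cloud` is one 3D point; a scene model could then be built over these points as step S2 describes.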



Abstract

The invention discloses a 3D scene generation method based on depth video. The method comprises the following steps: S1, depth video and color video data are collected with a depth camera and filtered; S2, the filtered depth video is converted into three-dimensional point cloud data by combining plane coordinates and depth values, and a scene model is built from the point cloud data; S3, the color corresponding to each point in the point cloud is obtained from the filtered color video data and applied to the scene model, yielding a colored scene model; S4, the colored scene model is converted into a 3D file format, and the converted files are the 3D scene files. Scenes generated by this method are realistic and natural, modeling is convenient and efficient, the computation and memory required for rendering are small, and the method is well suited to real-time modeling and rendering.
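Steps S1 and S3 of the abstract can be sketched in a few lines. The sketch below is illustrative only: it uses a simple sliding-window median filter for S1 (real pipelines often use bilateral or temporal filters) and assumes the depth and color frames are already registered pixel-to-pixel for S3; the function names are hypothetical.

```python
import numpy as np

def median_filter_depth(depth, k=3):
    """Step S1 (sketch): suppress depth-sensor noise with a k x k
    sliding-window median filter, edge-replicating at the borders."""
    pad = k // 2
    padded = np.pad(depth, pad, mode='edge')
    out = np.empty_like(depth)
    for i in range(depth.shape[0]):
        for j in range(depth.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

def colorize_points(depth, color):
    """Step S3 (sketch): attach to each valid depth pixel its RGB value,
    assuming depth and color frames are registered pixel-to-pixel."""
    mask = depth.reshape(-1) > 0
    return color.reshape(-1, 3)[mask]

# Toy 5x5 frame: flat depth with one noise spike and one invalid pixel.
depth = np.ones((5, 5))
depth[2, 2] = 10.0   # sensor noise spike
depth[0, 0] = 0.0    # invalid (missing) measurement
smoothed = median_filter_depth(depth)   # spike is filtered back to 1.0
color = np.zeros((5, 5, 3), dtype=np.uint8)
colors = colorize_points(depth, color)  # 24 colored points (zero dropped)
```

The median filter removes the isolated spike while leaving the flat region untouched, which matches the role filtering plays before the point-cloud conversion in S2.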

Description

Technical Field

[0001] The invention relates to the field of computer graphics and image processing, and in particular to a 3D scene generation method based on depth video.

Background Technique

[0002] For a long time, computer researchers have pursued the use of computers to construct realistic virtual 3D scenes, and 3D scene modeling has long been one of the most active research areas in computer graphics and image processing.

[0003]

[0004] Traditional scene modeling usually adopts geometry-based methods: a scene is modeled manually with existing 3D modeling software, mainly by superimposing 3D models. Scenes constructed this way have high precision, complete model descriptions, and good interactivity. But the shortcomings are also prominent: the workload of human-computer interaction is huge, which makes modeling inefficient, and it also tends to greatly reduce the authenticity ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/00
Inventors: 蔡昭权, 冯嘉良, 黄翰
Owner: 广东小草科技有限公司