
Real-time three-dimensional model reconstruction method and system based on depth video stream

A depth-video-stream, real-time 3D reconstruction technology, applied in the field of 3D modeling, that addresses problems such as failure to achieve virtual-real fusion, complicated processing, and high cost, and achieves the effects of reduced time consumption, a clear modeling result, and high real-time performance.

Inactive Publication Date: 2018-07-24
ZHONGKE HENGYUN CO LTD

AI Technical Summary

Problems solved by technology

At present, fine 3D modeling requires expensive equipment and complex processing and can only generate static models, while cheap depth cameras can only produce blurred scenes, which falls far short of the needs of virtual-real fusion.




Detailed Description of the Embodiments

[0033] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0034] As shown in Figure 1, the real-time three-dimensional model reconstruction method based on a depth video stream in an embodiment of the present invention includes the following steps:

[0035] Step S1, build a shooting device, and calibrate the reference point in the virtual scene according to the real relative position of the shooting device.
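The patent record does not specify how the reference points are calibrated. The following is a minimal sketch, assuming the real position and orientation of each camera relative to a chosen scene origin have been measured, so each camera gets a 4x4 camera-to-world transform that places its observations (and the scene's reference points) in a common virtual frame. The helper make_extrinsic, the yaw-only rotation, and all numeric values are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def make_extrinsic(yaw_deg, translation_m):
    """Build a 4x4 camera-to-world transform from a yaw rotation (degrees)
    about the vertical axis and a translation in metres.

    Hypothetical helper: the patent only states that reference points are
    calibrated from the cameras' real relative positions; a yaw-only
    parameterisation is an illustrative simplification.
    """
    theta = np.radians(yaw_deg)
    R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                  [0.0,           1.0, 0.0          ],
                  [-np.sin(theta), 0.0, np.cos(theta)]])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = translation_m
    return T

# Illustrative placement of three cameras around the captured volume.
extrinsics = {
    "cam0": make_extrinsic(0.0,    [0.0, 1.2,  2.0]),
    "cam1": make_extrinsic(120.0,  [1.7, 1.2, -1.0]),
    "cam2": make_extrinsic(-120.0, [-1.7, 1.2, -1.0]),
}

# A point observed in cam1's local frame, mapped into the shared virtual scene.
p_cam = np.array([0.1, 0.0, 1.5, 1.0])   # homogeneous coordinates
p_world = extrinsics["cam1"] @ p_cam
print(p_world[:3])
```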

[0036] Specifically, the shooting device includes three cameras. In one embodiment of the present invention, the shooting device may use a Kinect 2.0 camera.
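The record does not detail frame acquisition or the distortion-correction step (the following paragraph is truncated). The sketch below, assuming OpenCV and placeholder intrinsics, shows one way a depth/color frame pair from a single camera could be undistorted before matching; the intrinsic matrix, distortion coefficients, and dummy frames are illustrative assumptions rather than values from the patent.

```python
import numpy as np
import cv2

# Placeholder intrinsics and distortion coefficients for one camera; real
# values would come from a per-camera calibration, not from the patent text.
K = np.array([[365.0,   0.0, 256.0],
              [  0.0, 365.0, 212.0],
              [  0.0,   0.0,   1.0]])
dist = np.array([0.09, -0.27, 0.0, 0.0, 0.09])   # k1, k2, p1, p2, k3

def undistort_pair(depth_u16, color_bgr):
    """Correct lens distortion on a depth/color frame pair that share the
    same resolution and intrinsics (an assumption of this sketch).

    Depth is remapped with nearest-neighbour interpolation so depth values
    are not blended across object boundaries.
    """
    h, w = depth_u16.shape
    map1, map2 = cv2.initUndistortRectifyMap(K, dist, None, K, (w, h), cv2.CV_32FC1)
    depth_ok = cv2.remap(depth_u16, map1, map2, interpolation=cv2.INTER_NEAREST)
    color_ok = cv2.remap(color_bgr, map1, map2, interpolation=cv2.INTER_LINEAR)
    return depth_ok, color_ok

# Dummy frames stand in for the output of the Kinect 2.0 grabber, which the
# patent record does not describe.
depth = np.full((424, 512), 1500, dtype=np.uint16)   # millimetres
color = np.zeros((424, 512, 3), dtype=np.uint8)
depth_ok, color_ok = undistort_pair(depth, color)
```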

[0037] Su...



Abstract

The invention provides a real-time three-dimensional model reconstruction method and system based on a depth video stream. The method comprises the following steps: constructing shooting devices; collecting depth information and color information through the shooting devices and carrying out distortion correction processing on them; carrying out feature matching and spatial matching on the depth and color information obtained after distortion correction, fusing the depth information shot by each shooting device into one point cloud model, and carrying out feature matching and cropping on the timestamp-synchronized images of each shooting device; carrying out triangular facet calculation and optimization on the point cloud information to generate an information group with a time label; and, through a three-dimensional reconstruction computer, obtaining the information groups, carrying out time matching and fusion on the model information sent by the plurality of shooting devices to realize three-dimensional reconstruction, and carrying out adjustment and display according to the calibration parameters. The method and system reduce the time consumed by three-dimensional reconstruction by unifying depth and color, resampling to reduce frames, and optimizing triangular facets.
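As one way to picture the pipeline the abstract describes (fusing per-camera depth into one point cloud, resampling, and triangular facet calculation with optimization into a time-labelled information group), here is a minimal sketch using the Open3D library. Open3D is not mentioned in the patent; the intrinsics, voxel size, Poisson surface reconstruction, and quadric decimation are assumed stand-ins for the unspecified fusion, facet-calculation, and optimization steps.

```python
import numpy as np
import open3d as o3d

# Placeholder Kinect-2-like depth intrinsics; real values come from calibration.
intrinsic = o3d.camera.PinholeCameraIntrinsic(512, 424, 365.0, 365.0, 256.0, 212.0)

def frame_to_cloud(color_bgr, depth_u16, extrinsic):
    """Convert one undistorted color/depth pair into a point cloud placed in
    the shared scene frame via the camera's calibrated extrinsic
    (world-to-camera, as Open3D expects)."""
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        o3d.geometry.Image(np.ascontiguousarray(color_bgr[:, :, ::-1])),  # BGR -> RGB
        o3d.geometry.Image(depth_u16),
        depth_scale=1000.0, depth_trunc=4.0, convert_rgb_to_intensity=False)
    return o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, intrinsic, extrinsic)

def fuse_and_mesh(frames, timestamp, voxel=0.01, target_triangles=50000):
    """Fuse timestamp-synchronized frames from all shooting devices into one
    point cloud, then build and simplify a triangle mesh; returns a
    time-labelled 'information group'."""
    merged = o3d.geometry.PointCloud()
    for color_bgr, depth_u16, extrinsic in frames:
        merged += frame_to_cloud(color_bgr, depth_u16, extrinsic)
    merged = merged.voxel_down_sample(voxel)               # resampling / data reduction
    merged.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))
    mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(merged, depth=8)
    mesh = mesh.simplify_quadric_decimation(target_triangles)  # facet optimization
    return {"timestamp": timestamp, "mesh": mesh}
```

A downstream reconstruction computer could then collect these timestamped groups from all devices and match them by time label before display, which is the fusion role the abstract assigns to it.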

Description

Technical Field

[0001] The invention relates to the technical field of three-dimensional modeling, and in particular to a real-time three-dimensional model reconstruction method and system based on depth video streams.

Background Technique

[0002] Three-dimensional model reconstruction based on depth cameras is an important part of automatically modeling real objects and of virtual-real fusion applications. At present, fine 3D modeling requires expensive equipment and complex processing and can only generate static models, while cheap depth cameras can only produce blurred scenes, which falls far short of the needs of virtual-real fusion.

Contents of the Invention

[0003] The aim of the present invention is to solve at least one of the above technical drawbacks.

[0004] Therefore, the purpose of the present invention is to propose a real-time 3D model reconstruction method and system based on a depth video stream.

[0005] In order to achieve the above object, an embodiment of...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T17/20G06T7/80
CPCG06T17/20G06T7/80
Inventor 钟秋发黄煦高晓光李晓阳楚圣辉
Owner ZHONGKE HENGYUN CO LTD