
Three-dimensional-mapping-table-based three-dimensional point cloud rapid reconstruction method

A 3D point cloud and mapping-table technology, applied in the field of camera 3D scene reconstruction, which solves the problem of a large amount of calculation and achieves a low computational load and high real-time performance.

Inactive Publication Date: 2016-08-31
CHANGAN UNIV

AI Technical Summary

Problems solved by technology

The traditional method of restoring 3D information applies the camera's calibration parameters directly to each depth image. Its limitation is that it requires a large amount of calculation and therefore cannot reconstruct the 3D scene in real time.
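The conventional baseline can be sketched as follows. This is an illustrative per-pixel back-projection under an assumed pinhole model; the intrinsics fx, fy, cx, cy and the millimetre depth unit are hypothetical, not taken from the patent:

```python
import numpy as np

def backproject_depth(depth, fx, fy, cx, cy):
    """Conventional per-frame 3D recovery: back-project every pixel of
    every frame with the calibration parameters (computationally costly)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx                            # camera-frame X
    y = (v - cy) * z / fy                            # camera-frame Y
    return np.stack([x, y, z], axis=-1)              # (H, W, 3) point cloud

# One 320x240 frame with every pixel at 1000 mm (assumed values)
cloud = backproject_depth(np.full((240, 320), 1000.0),
                          525.0, 525.0, 160.0, 120.0)
print(cloud.shape)   # (240, 320, 3)
```

The arithmetic above has to be repeated for every pixel of every frame, which is exactly the per-frame cost the patent's mapping table removes.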

Method used


Image

Three figures accompany the patent, each titled "Three-dimensional-mapping-table-based three-dimensional point cloud rapid reconstruction method".

Examples


Embodiment

[0061] In this embodiment, the sampling frequency is 25 frames per second and the size of each frame image is 320×240. Figure 4 shows a section of the depth video stream collected from the RGB-D camera; the method of the present invention is used to restore the point cloud of the depth map.

[0062] Step 1: Establish a world coordinate system, as shown in Figure 2. The point on the ground directly below the camera is selected as the origin of the world coordinate system, with the X-Y plane parallel to the ground. Using the calibration bracket, obtain 6 pairs of corresponding points in the image coordinate system and the world coordinate system, as shown in Figure 3, and calculate the parameter matrix P of the camera.
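One standard way to compute a 3×4 parameter matrix P from such point pairs is the direct linear transform (DLT). The sketch below is an assumption about this step, since the patent does not spell out the computation, and the numbers in the demo are synthetic:

```python
import numpy as np

def estimate_projection_matrix(world_pts, image_pts):
    """Direct linear transform (DLT): estimate the 3x4 camera parameter
    matrix P from >= 6 world/image point correspondences."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # P (up to scale) is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

# Synthetic check: project 6 non-coplanar points with a known matrix, recover it
P_true = np.array([[800.0,   0.0, 160.0, 0.0],
                   [  0.0, 800.0, 120.0, 0.0],
                   [  0.0,   0.0,   1.0, 0.0]])
world = [(0.0, 0.0, 5.0), (1.0, 0.0, 5.0), (0.0, 1.0, 6.0),
         (1.0, 1.0, 7.0), (-1.0, 0.0, 4.0), (0.0, -1.0, 6.0)]
image = []
for X, Y, Z in world:
    p = P_true @ np.array([X, Y, Z, 1.0])
    image.append((p[0] / p[2], p[1] / p[2]))
P = estimate_projection_matrix(world, image)
# The recovered P equals P_true up to scale; verify by projecting a new point
q = P @ np.array([2.0, 1.0, 5.0, 1.0])
print(q[0] / q[2], q[1] / q[2])   # ~ (480.0, 280.0)
```

The 6 pairs give 12 linear equations for the 11 degrees of freedom of P, which is why the patent's 6-point calibration bracket suffices.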

[0063] Step 2: According to the parameter matrix P, generate a three-dimensional mapping table with the structure Sheet[240][320][200][3]. The pseudo code of the process is as follows:

[0064]

[0065] S...
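Since the pseudo code above is truncated, here is a hedged sketch of what building such a Sheet[height][width][depth_levels][3] table could look like. The pinhole intrinsics, the linear depth quantization step, and the camera-to-world convention are all assumptions for illustration, not the patent's actual procedure:

```python
import numpy as np

def build_mapping_table(h, w, depth_levels, fx, fy, cx, cy,
                        depth_step=25.0, R=None, t=None):
    """Precompute Sheet[h][w][depth_levels][3]: the world coordinates of
    every pixel at every quantized depth value. Done once, offline."""
    R = np.eye(3) if R is None else R      # assumed rotation camera->world
    t = np.zeros(3) if t is None else t    # assumed camera position offset
    u, v = np.meshgrid(np.arange(w), np.arange(h))          # pixel grid (h, w)
    z = ((np.arange(depth_levels) + 1) * depth_step)[None, None, :]
    x = (u[..., None] - cx) * z / fx                        # camera-frame X
    y = (v[..., None] - cy) * z / fy                        # camera-frame Y
    cam = np.stack([x, y, np.broadcast_to(z, x.shape)], axis=-1)
    # camera -> world: X_w = R^T (X_c - t)  (assumed extrinsic convention)
    return (cam - t) @ R                    # shape (h, w, depth_levels, 3)

# Reduced-size demo; the full 240x320x200 table would be several hundred MB
Sheet = build_mapping_table(h=24, w=32, depth_levels=20,
                            fx=525.0, fy=525.0, cx=16.0, cy=12.0)
print(Sheet.shape)   # (24, 32, 20, 3)
```

The table trades memory for speed: all projective arithmetic is paid once here, so per-frame reconstruction needs no multiplication at all.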



Abstract

The invention discloses a three-dimensional-mapping-table-based rapid 3D point cloud reconstruction method. The parameter matrix of the camera is obtained; a three-dimensional mapping table is generated from the parameter matrix; the table is then queried to obtain the world coordinates of each image coordinate at its different pixel (depth) values, forming the three-dimensional point cloud of the image. The method effectively reduces the computational load of 3D point cloud recovery: recovery requires only lookups in the mapping table, giving high real-time performance at a low calculation cost.
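The lookup step described in the abstract can be sketched as follows; a toy table stands in for the real Sheet, and the linear depth quantization is an assumption:

```python
import numpy as np

def reconstruct_frame(depth_frame, sheet, depth_step=25.0):
    """Per-frame reconstruction reduces to a pure table lookup: quantize
    each depth pixel to its level index and fetch the stored world point."""
    h, w, levels, _ = sheet.shape
    k = np.clip((depth_frame / depth_step).astype(int) - 1, 0, levels - 1)
    rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    return sheet[rows, cols, k]            # (h, w, 3) world-coordinate cloud

# Toy stand-in table whose entries store only the quantized depth as Z
h, w, levels, step = 4, 5, 10, 25.0
sheet = np.zeros((h, w, levels, 3))
for k in range(levels):
    sheet[:, :, k, 2] = (k + 1) * step
depth = np.full((h, w), 125.0)             # every pixel at 125 mm -> level 4
cloud = reconstruct_frame(depth, sheet, step)
print(cloud[0, 0])                          # world point with Z = 125
```

Indexing into the precomputed array replaces the per-pixel matrix arithmetic of the traditional method, which is the source of the claimed real-time performance.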

Description

technical field [0001] The invention belongs to the field of camera three-dimensional scene reconstruction, and in particular relates to a fast three-dimensional point cloud reconstruction method based on a three-dimensional mapping table. Background technique [0002] At present, RGB-D cameras are gradually being widely used in various fields, such as 3D reconstruction, image understanding and video surveillance. An RGB-D camera can obtain the distance to objects, and that distance is presented to the user in the form of an image (called a depth image). Because of this characteristic of the RGB-D camera, after the camera is calibrated, the 3D information of the depth image can be recovered directly by using the calibration parameters. The traditional method of restoring 3D information applies the camera's calibration parameters directly to the depth image; its limitation is that it requires a large amount of calculation and cannot reconstruct the 3D scene in real time.

Claims


Application Information

Patent Timeline
no application data
Patent Type & Authority: Application (China)
IPC(8): G06T17/00
CPC: G06T17/00
Inventors: 宋焕生, 孙士杰, 张朝阳, 刘瑞芝, 王璇, 陈艳, 李怀宇, 崔华, 张文涛, 张向清, 李莹, 严腾, 郑宝峰, 张斌
Owner: CHANGAN UNIV