
A Method of Using Invisible Light Casting Features for 3D Space Modeling

A technology relating to three-dimensional space modeling and invisible light, applied in 3D modeling, image analysis, image enhancement, etc. It addresses the degraded quality and increased difficulty of 3D model reconstruction in scenes with few color features, and achieves the effect of avoiding calibration errors.

Active Publication Date: 2021-12-31
China-Germany (Zhuhai) Artificial Intelligence Institute Co., Ltd. +1

AI Technical Summary

Problems solved by technology

However, in a space scene with a relatively uniform color (such as a construction site, white walls, or glass), it is difficult to obtain sufficient feature information because the color transitions are weak or absent, and the quality of the 3D model reconstruction is greatly reduced.
The reason for the above problems is that traditional 3D model reconstruction uses the SIFT (Scale-Invariant Feature Transform) method to extract feature points; in a single-color space, 3D model reconstruction therefore becomes more difficult and the result is poor.
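The difficulty described above can be illustrated with a minimal NumPy sketch (not the patent's method): gradient-based detectors such as SIFT only fire where intensity changes, so a uniform "white wall" image yields almost no candidate feature pixels, while a textured scene yields many. The function name and threshold below are illustrative assumptions.

```python
import numpy as np

def count_gradient_features(img, thresh=10.0):
    """Count pixels whose gradient magnitude exceeds a threshold.
    Gradient-based detectors (SIFT, Harris, etc.) find few features
    where this count is low, e.g. on white walls or glass."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return int((mag > thresh).sum())

rng = np.random.default_rng(0)
flat = np.full((64, 64), 200.0)             # uniform "white wall"
textured = rng.integers(0, 256, (64, 64))   # richly varying scene

print(count_gradient_features(flat))      # 0: no color transitions at all
print(count_gradient_features(textured))  # many candidate feature pixels
```

Projecting an invisible-light pattern onto the flat surface is, in effect, a way of artificially raising this count.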



Examples


Embodiment Construction

[0031] In order that the present invention may be further understood, it will be further described below with reference to specific embodiments:

[0032] A method for three-dimensional space modeling using invisible light projection features provided by the present invention includes the following steps:

[0033] S1. Take a group of position pictures at each of two different positions in the same space, where each group of position pictures includes an RGB picture and an invisible light projection picture;

[0034] S2. Extract feature points from the captured RGB picture and invisible light projection picture respectively, and fuse the feature points of the RGB picture and the invisible light projection picture within the same group of position pictures to obtain the feature points of the scene captured at that position;

[0035] S3. Perform matching calculations on the feature points of the scenes captured at the different positions, and calculate the initial camera positions at which the two groups of position pictures were taken;
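Steps S2 and S3 can be sketched in pure NumPy under stated assumptions: feature "fusion" is modeled as pooling keypoints and descriptors from the RGB and invisible-light pictures of one position, and cross-position matching uses a Lowe-style ratio test on descriptor distances (a stand-in for the SIFT matching the patent names). All arrays below are synthetic, and the function names are hypothetical.

```python
import numpy as np

def fuse_features(kp_rgb, desc_rgb, kp_ir, desc_ir):
    """Step S2 (sketch): pool keypoints/descriptors from the RGB picture
    and the invisible-light projection picture taken at the same position,
    so the projected pattern supplies features where the colour is uniform."""
    kps = np.vstack([kp_rgb, kp_ir])
    descs = np.vstack([desc_rgb, desc_ir])
    return kps, descs

def match_features(desc_a, desc_b, ratio=0.75):
    """Step S3 (sketch): ratio-test matching of fused feature descriptors
    between two capture positions."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        j, k = np.argsort(dists)[:2]          # nearest and second nearest
        if dists[j] < ratio * dists[k]:       # accept only distinctive matches
            matches.append((i, int(j)))
    return matches

rng = np.random.default_rng(1)
kp_rgb, kp_ir = rng.random((5, 2)), rng.random((8, 2))
d_rgb, d_ir = rng.random((5, 128)), rng.random((8, 128))
kps, descs = fuse_features(kp_rgb, d_rgb, kp_ir, d_ir)

# the second position sees the same points with slight descriptor noise
descs_b = descs + rng.normal(0, 0.01, descs.shape)
matches = match_features(descs, descs_b)
print(len(matches))  # all 13 fused features find their counterpart
```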



Abstract

The invention discloses a method for three-dimensional space modeling using invisible light projection features, and relates to the technical field of three-dimensional imaging digital modeling. The steps include: taking a group of position pictures at each of two different positions in a space; extracting feature points from the position pictures; fusing the feature points of the same group of position pictures to obtain the feature points of that position; matching the feature points of different positions with the SIFT algorithm; using the SLAM algorithm to calculate the initial camera positions at which the different groups of position pictures were taken; using the SFM algorithm to calculate the precise camera positions and a sparse point cloud; performing 3D structural modeling based on the camera positions and sparse point cloud; and finally performing 3D scene texture mapping. The present invention fuses the feature points of the invisible-light projection picture and the RGB photo so that the two sets of features complement each other, solving the problems that relatively few feature points are extracted when modeling with SIFT in a single-color space, that 3D model reconstruction is therefore difficult, and that its results are poor.
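Once camera positions are known, the SFM stage named in the abstract recovers the sparse point cloud by triangulating matched features. A minimal sketch of that core operation, linear (DLT) triangulation from two projection matrices, is shown below; the camera geometry and the 3D point are invented for illustration and are not from the patent.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one matched feature from two camera
    projection matrices: the operation by which SFM turns matched 2D
    features into points of the sparse 3D cloud."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)   # null vector of A is the homogeneous point
    X = vt[-1]
    return X[:3] / X[3]

# two hypothetical cameras one unit apart along x, both looking down +z
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.3, -0.2, 4.0])
h = np.append(X_true, 1.0)
x1 = (P1 @ h)[:2] / (P1 @ h)[2]   # projection into camera 1
x2 = (P2 @ h)[:2] / (P2 @ h)[2]   # projection into camera 2

print(triangulate(P1, P2, x1, x2))  # recovers [0.3, -0.2, 4.0]
```

In a full pipeline this runs over every cross-position match, and the resulting cloud is then refined before structural modeling and texture mapping.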

Description

technical field

[0001] The invention relates to the technical field of three-dimensional imaging digital modeling, in particular to a method for three-dimensional space modeling using invisible light projection features.

Background technique

[0002] Traditional 3D space modeling methods are mainly aimed at scenes with rich and diverse color information. Because a space scene with obvious color changes contains a large number of feature points, a large amount of feature information can be extracted, so the reconstructed 3D model is more accurate, that is, more consistent with the real scene. However, in a space scene with a relatively uniform color (such as a construction site, white walls, or glass), it is difficult to obtain sufficient feature information because the color transitions are weak or absent, and the quality of the 3D model reconstruction is greatly reduced. The reason for the above problems is that traditional 3D model reconstruction uses the SIFT (Scale-Invariant Feature Transform) method to extract feature points; in a single-color space, 3D model reconstruction therefore becomes more difficult and the result is poor.

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/55; G06T7/73; G06T17/00
CPC: G06T7/55; G06T7/73; G06T17/00; G06T2207/10004; G06T2207/10024; G06T2207/10028
Inventor: 崔岩 (Cui Yan), 刘强 (Liu Qiang)
Owner: China-Germany (Zhuhai) Artificial Intelligence Institute Co., Ltd.