
Indoor three-dimensional scene rebuilding method based on double-layer rectification method

A 3D scene registration technology for the reconstruction of large-scale indoor scenes and the field of indoor-environment 3D reconstruction. It addresses the problem that existing methods impose GPU hardware configuration requirements that cannot always be met, with the effect of improving reconstruction accuracy and solving cost and real-time problems.

Inactive Publication Date: 2013-05-15
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

Because this method is implemented on GPU hardware, it places high demands on the GPU configuration and is limited by GPU memory: it can only reconstruct a volume of 3 m × 3 m × 3 m, which cannot meet the needs of large-scale indoor 3D scene creation.




Detailed Description of the Embodiments

[0036] The present invention will be described in further detail with reference to the accompanying drawings. As shown in Figure 1, the present invention comprises the following steps:

[0037] Step 1: perform Kinect calibration, as follows:

[0038] (1) Print a chessboard template. The present invention uses a single sheet of A4 paper with a chessboard square spacing of 0.25 cm.

[0039] (2) Photograph the chessboard from multiple angles. When shooting, make the chessboard fill the frame as much as possible, ensure that every corner of the chessboard is visible, and capture a total of 8 template pictures.

[0040] (3) Detect the feature points in the images, i.e., each intersection of black squares on the chessboard.

[0041] (4) Calculate the Kinect calibration parameters.

[0042] The intrinsic parameter matrix K_ir of the infrared camera:

[0043] K_ir = ...
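The intrinsic matrix is what later maps a matched feature pixel plus its Kinect depth measurement to a 3D space point. A minimal numpy sketch of that back-projection follows; the intrinsic values are illustrative assumptions, since the patent elides the actual calibrated matrix:

```python
import numpy as np

# Illustrative intrinsics for the Kinect infrared camera; the patent
# elides the actual calibrated values, so these numbers are assumptions.
K_ir = np.array([[580.0,   0.0, 320.0],
                 [  0.0, 580.0, 240.0],
                 [  0.0,   0.0,   1.0]])

def backproject(u, v, depth_m, K):
    """Map pixel (u, v) with metric depth to a 3D point in the camera frame."""
    return depth_m * (np.linalg.inv(K) @ np.array([u, v, 1.0]))

# A pixel at the principal point lands on the optical axis.
p = backproject(320.0, 240.0, 1.5, K_ir)
print(p)  # [0.  0.  1.5]
```

Applying this to each matched SURF feature pixel in two frames yields the 3D point pairs that the registration step operates on.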



Abstract

The invention belongs to the crossing field of computer vision and intelligent robots, relates to an indoor three-dimensional scene rebuilding method based on a double-layer rectification method, and solves the problems that existing indoor scene rebuilding methods require expensive equipment, have high computational complexity, and offer poor real-time performance. The method comprises Kinect calibration, SURF feature point extraction and matching, mapping from feature point pairs to three-dimensional space point pairs, double-layer rectification of three-dimensional space points based on the random sample consensus (RANSAC) and iterative closest point (ICP) methods, and scene updating. Kinect is adopted to obtain environmental data, and the double-layer rectification method is proposed based on RANSAC and ICP. Economical and rapid indoor three-dimensional scene rebuilding is achieved, and the real-time performance of the rebuilding algorithm and the rebuilding precision are effectively improved. The method is applicable to the service robot field and other computer vision fields related to three-dimensional scene rebuilding.
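The patent names RANSAC and ICP as the two registration layers but gives no code. The following is a minimal self-contained numpy sketch of such a two-layer scheme on synthetic data: a coarse RANSAC layer over matched 3D point pairs that rejects feature mismatches, then an ICP layer that refines the estimate. All function names, thresholds, and iteration counts are illustrative assumptions, not values from the patent.

```python
import numpy as np

def rigid_from_pairs(P, Q):
    """Least-squares rigid transform (R, t) with Q ≈ P @ R.T + t (Kabsch)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cQ - R @ cP

def ransac_coarse(P, Q, iters=200, thresh=0.05, seed=0):
    """Layer 1: RANSAC over matched 3D point pairs rejects feature mismatches."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(iters):
        idx = rng.choice(len(P), 3, replace=False)
        R, t = rigid_from_pairs(P[idx], Q[idx])
        inliers = np.linalg.norm(P @ R.T + t - Q, axis=1) < thresh
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return rigid_from_pairs(P[best], Q[best])

def icp_refine(P, cloud, R, t, iters=10):
    """Layer 2: ICP re-pairs points by nearest neighbour and refits."""
    for _ in range(iters):
        P2 = P @ R.T + t
        nn = np.argmin(((P2[:, None, :] - cloud[None, :, :]) ** 2).sum(-1), axis=1)
        R, t = rigid_from_pairs(P, cloud[nn])
    return R, t

# Synthetic check: a known rotation about z plus a translation.
rng = np.random.default_rng(1)
P = rng.uniform(-1.0, 1.0, (40, 3))
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.3, -0.2, 0.5])
cloud = P @ R_true.T + t_true                # correctly transformed scene points
pairs = cloud.copy()
pairs[:5] += rng.uniform(1.0, 2.0, (5, 3))   # five corrupted feature matches
R0, t0 = ransac_coarse(P, pairs)             # coarse layer tolerates the outliers
R1, t1 = icp_refine(P, cloud, R0, t0)        # fine layer converges on the clouds
```

The two layers complement each other: RANSAC gives an initial pose that is robust to mismatched features but uses only the sparse feature pairs, while ICP exploits the denser point clouds once the initial pose is close enough for nearest-neighbour correspondences to be reliable.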

Description

technical field

[0001] The invention belongs to the intersecting field of computer vision and intelligent robots, relates to three-dimensional reconstruction technology for indoor environments, and in particular to a large-scale indoor scene reconstruction method based on a double-layer registration method.

Background technique

[0002] In recent years, with the continuous development of information technology, demand for 3D scene reconstruction technology has been increasing, and an economical and fast indoor 3D scene reconstruction method has become a key technical problem to be solved in many fields. In the field of home service robots, the market demand for smart home service robots triggered by the aging population is becoming increasingly strong. At present, most service robots on the market can only provide a single, simple service in a specific scene because they cannot perceive the three-dimensional environment. This problem seriously restricts th...

Claims


Application Information

IPC(8): G06T17/00; G06T7/00
Inventor: 贾松敏, 郭兵, 王可, 李秀智
Owner BEIJING UNIV OF TECH