
Joint modeling method of indoor and outdoor scenes based on line features

A modeling method for indoor and outdoor scenes, applied in the field of 3D reconstruction, which addresses the problems of differing data quality, low overlap rate, and difficulty of processing, to achieve improved overlap between indoor and outdoor scenes, a higher registration success rate, and a simple representation.

Active Publication Date: 2020-06-30

AI Technical Summary

Problems solved by technology

[0004] 1. Indoor data and outdoor data come from different sources, so their quality differs, which is difficult for traditional handcrafted feature algorithms (Spin-Images, FPFH, SHOT, etc.) to handle.
[0005] 2. The indoor scene and the outdoor scene are separated by walls, so the overlap rate between them is very low, which is difficult for 4PCS to handle.
An existing method uses window detection to generate correspondences between the two models and then uses these correspondences for registration, but that method is image-based.


Examples


Embodiment

[0047] Referring to figure 1, the invention discloses a joint indoor and outdoor scene modeling method based on line features, comprising the following steps:

[0048] S1. Obtain the original point cloud data; the original point cloud data includes an indoor point cloud and an outdoor point cloud.
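As an illustration of this step, the following is a minimal sketch that loads the two scans with the Open3D library; the file names and format are hypothetical, since the patent does not specify how the data is acquired or stored.

```python
# Minimal sketch of step S1: load the raw indoor and outdoor point clouds.
# Open3D usage and the file names are assumptions; the patent only states
# that indoor and outdoor point clouds are obtained.
import open3d as o3d

indoor_pcd = o3d.io.read_point_cloud("indoor_scan.pcd")    # e.g. backpack scanner data
outdoor_pcd = o3d.io.read_point_cloud("outdoor_scan.pcd")  # e.g. vehicle-mounted scanner data

print(f"indoor points:  {len(indoor_pcd.points)}")
print(f"outdoor points: {len(outdoor_pcd.points)}")
```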

[0049] S2. Wall surface extraction is performed on the indoor point cloud and the outdoor point cloud respectively to obtain wall surface point clouds. This step is achieved through the following sub-steps:

[0050] S21. Divide the indoor point cloud and the outdoor point cloud into small blocks based on an octree to obtain point cloud patches, and classify the patches. The categories to be labelled are wall, floor, ceiling and other.
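A minimal sketch of the octree partition in S21, using Open3D's octree and collecting one index set per leaf node; the tree depth and the label set are taken from the text only loosely, and the downstream classifier is not shown.

```python
# Sketch of step S21: split a point cloud into patches via an octree.
# max_depth is illustrative; the patent labels each patch as wall, floor,
# ceiling or other but does not fix these parameters here.
import numpy as np
import open3d as o3d

LABELS = ("wall", "floor", "ceiling", "other")

def octree_patches(pcd, max_depth=6):
    """Return a list of index arrays, one per octree leaf (one per patch)."""
    if not pcd.has_colors():                      # the leaf node type stores colors,
        pcd.paint_uniform_color([0.5, 0.5, 0.5])  # so give uncolored clouds a dummy color
    octree = o3d.geometry.Octree(max_depth=max_depth)
    octree.convert_from_point_cloud(pcd, size_expand=0.01)

    patches = []
    def collect(node, node_info):
        # Leaf nodes carry the indices of the points falling in that block.
        if isinstance(node, o3d.geometry.OctreePointColorLeafNode):
            patches.append(np.asarray(node.indices))
        return False  # False: continue traversing the whole tree
    octree.traverse(collect)
    return patches
```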

[0051] S22. FPFH features and height features are used to describe the point cloud patches, where x_i denotes the feature vector of point cloud patch i and x_ij denotes the feature vector of point cloud block ...
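A minimal sketch of the patch descriptor in S22: FPFH computed with Open3D plus a simple height term. Averaging the per-point FPFH over the patch and using the mean z value as the height feature are assumptions, since the exact construction of x_i is cut off above.

```python
# Sketch of step S22: build a feature vector x_i for point cloud patch i
# from FPFH features plus a height feature. The aggregation (mean FPFH,
# mean z) is an assumption; only "FPFH features and height features" is
# stated in the text.
import numpy as np
import open3d as o3d

def patch_feature(pcd, indices, radius_normal=0.2, radius_fpfh=0.5):
    patch = pcd.select_by_index(np.asarray(indices).tolist())
    patch.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=radius_normal, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        patch, o3d.geometry.KDTreeSearchParamHybrid(radius=radius_fpfh, max_nn=100))
    fpfh_mean = np.asarray(fpfh.data).mean(axis=1)       # 33-D averaged FPFH
    height = np.asarray(patch.points)[:, 2].mean()       # simple height feature
    return np.concatenate([fpfh_mean, [height]])         # x_i for patch i
```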


Abstract

The invention discloses a joint indoor and outdoor scene modeling method based on line features, comprising the following steps: obtaining original point cloud data, wherein the original point cloud data comprises an indoor point cloud and an outdoor point cloud; extracting wall surfaces from the indoor point cloud and the outdoor point cloud respectively to obtain wall surface point clouds; extracting a line structure from the wall surface point clouds; and registering the indoor point cloud and the outdoor point cloud based on the line structure. The invention can process point clouds of different quality with a simple representation. By extracting the wall surfaces, the invention increases the overlap between the indoor and outdoor scenes and thereby improves the success rate of registration.
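The registration step is only summarized above. As an illustration, the sketch below estimates the rigid transform between the indoor and outdoor models from already-matched 3D line-segment endpoints using a standard SVD (Kabsch) alignment; how the line correspondences are obtained from the wall line structure is not reproduced here, and this generic alignment is not the patent's specific procedure.

```python
# Illustrative sketch: rigid registration from matched 3D line endpoints.
# Generic Kabsch/SVD alignment; the src/dst correspondences (e.g. endpoints
# of matched wall lines) are assumed to be given.
import numpy as np

def rigid_from_correspondences(src, dst):
    """src, dst: (N, 3) arrays of corresponding points, e.g. wall-line endpoints."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t                                # aligns src to dst: p -> R @ p + t
```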

Description

Technical field

[0001] The invention relates to the technical field of three-dimensional reconstruction, and in particular to a joint indoor and outdoor scene modeling method based on line features.

Background technique

[0002] In recent years, 3D reconstruction has received more and more attention. Due to the limitations of instruments and scenes, outdoor scenes are usually captured with vehicle-mounted or static scanners, while indoor scenes are relatively small and better suited to portable devices such as backpack scanners. Therefore, most of the acquired data are separated into indoor and outdoor scenes. On the other hand, GPS signals are good outdoors but poor indoors. By integrating indoor and outdoor point cloud data, more detailed information can be provided for outdoor scenes and more complete information for indoor scenes (the indoor and outdoor data are complementary). In addition, the GPS coordinates of the outdoor point cloud can be introduced into the i...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T17/00, G06T7/33
CPC: G06T7/33, G06T17/00, G06T2207/10028
Inventors: 温程璐, 张正, 王程, 侯士伟, 李军
Owner: 厦门思总建设有限公司