
A three-dimensional scene reconstruction method, device and electronic device based on lidar

A 3D scene reconstruction and lidar technology, applied in the field of computer vision, which addresses problems in existing methods such as long processing times, limited joint (cooperative) detection, and high hardware requirements

Active Publication Date: 2022-07-15
TSINGHUA UNIV
Cites: 2 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] Existing approaches mostly rely on 2D images and multi-frame data for reconstruction, which leads to high hardware requirements, complex algorithms, and long processing times. More importantly, existing reconstruction methods are largely limited to detecting objects alone or predicting the 3D scene layout alone; few perform the two detections jointly, which limits both time efficiency and the network models used.




Detailed Description of Embodiments

[0061] In the description of the present invention, it should be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be interpreted as indicating or implying relative importance or the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "plurality" means two or more, unless otherwise expressly and specifically defined.

[0062] The "plurality" mentioned in this embodiment refers to two or more. "And / or", which describes the association relationship of the associated objects, indicates that there can be three kinds of relationships, for example, A and / or B, which can indicate that A exists alone, A and B exist at the same time, and B exists alone. Words such as "exemplary" or "such as" are used to denote an example, illustration, or illustration, are intended to present the relevant concep...



Abstract

The invention provides a lidar-based three-dimensional scene reconstruction method, device and electronic equipment, relating to the technical field of computer vision. Based on the joint detection results, an implicit representation function of each object is learned and a compact object surface is reconstructed, so as to achieve an accurate and complete understanding of the three-dimensional scene. The method includes: acquiring point cloud data of a target area; jointly detecting the point cloud data with a neural network based on an attention mechanism to obtain a joint detection result, where the joint detection result includes the objects in the target area and the layout of the target area; learning an implicit representation function of the objects in the target area based on the joint detection result; and reconstructing the 3D scene of the objects in the target area based on the implicit representation function. The lidar-based three-dimensional scene reconstruction device applies the lidar-based three-dimensional scene reconstruction method, and the method is applied in electronic equipment.
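The abstract outlines a four-step pipeline: point cloud in, attention-based joint detection of objects plus layout, a learned implicit (occupancy) function per object, and surface reconstruction from that function. Below is a minimal PyTorch-style sketch of that pipeline under stated assumptions; the module structure, query count, box parameterization, grid resolution, and 0.5 occupancy threshold are illustrative assumptions, not the patent's disclosed network.

```python
# Hypothetical sketch (not the patent's actual network) mirroring the abstract's steps.
import torch
import torch.nn as nn

class JointDetector(nn.Module):
    """Attention-based joint detection of object boxes and scene layout."""
    def __init__(self, d_model=128, num_queries=16):
        super().__init__()
        self.point_embed = nn.Linear(3, d_model)              # lift xyz to features
        self.queries = nn.Parameter(torch.randn(num_queries, d_model))
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.box_head = nn.Linear(d_model, 7)                  # center(3) + size(3) + yaw(1)
        self.layout_head = nn.Linear(d_model, 7)               # one coarse layout box

    def forward(self, points):                                 # points: (B, N, 3)
        feats = self.point_embed(points)
        q = self.queries.unsqueeze(0).expand(points.size(0), -1, -1)
        ctx, _ = self.attn(q, feats, feats)                    # object queries attend to points
        return self.box_head(ctx), self.layout_head(ctx.mean(dim=1))

class ImplicitObject(nn.Module):
    """Implicit occupancy function f(x) -> [0, 1] for one detected object."""
    def __init__(self, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(3, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, xyz):                                    # xyz: (..., 3)
        return torch.sigmoid(self.mlp(xyz))

# Usage sketch: detect, then sample the object's implicit function on a grid and
# keep voxels whose occupancy exceeds 0.5 as a coarse surface reconstruction.
points = torch.randn(1, 2048, 3)                               # stand-in for a lidar point cloud
boxes, layout = JointDetector()(points)
grid = torch.stack(torch.meshgrid(*[torch.linspace(-1, 1, 32)] * 3, indexing="ij"), -1)
occ = ImplicitObject()(grid.reshape(-1, 3)).reshape(32, 32, 32)
surface_voxels = (occ > 0.5).nonzero()                         # occupied cells approximate the surface
```

In a trained system the occupancy grid would typically be converted to a mesh (e.g. by marching cubes) rather than thresholded voxels; the sketch stops at the grid to stay dependency-free.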

Description

Technical field
[0001] The present invention relates to the technical field of computer vision, and more particularly to a lidar-based three-dimensional scene reconstruction method, device and electronic device.
Background technique
[0002] Three-dimensional scenes include indoor scenes and outdoor scenes. In the prior art, application No. CN201910925338.0, an octree-based robot-vision-guided three-dimensional object reconstruction method, uses an octree structure as the storage structure for point cloud data and selects a fixed region growing method as the reconstruction algorithm, disclosing detection and reconstruction of objects alone. Application No. CN201811466786.0, a three-dimensional reconstruction method for indoor scenes based on RGB-D images, uses semantic segmentation results to repair holes in depth images, provides object outline and category information for three-dimensional reconstruction, and obtains the shape and sh...
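The first prior-art reference stores its point cloud in an octree. As a rough illustration of that storage idea only (not that patent's actual code), here is a minimal sketch assuming a simple capacity-based split rule; the class, parameter names, and capacity value are hypothetical.

```python
# Hypothetical octree point-cloud storage sketch: a cubic cell holds points and
# splits into eight children once it exceeds a fixed capacity.
import numpy as np

class OctreeNode:
    def __init__(self, center, half_size, capacity=8):
        self.center = np.asarray(center, dtype=float)
        self.half_size = float(half_size)
        self.capacity = capacity
        self.points = []
        self.children = None                                   # list of 8 children once split

    def _child_index(self, p):
        # 3-bit index: bit k is set if the point lies on the positive side of axis k
        return (int(p[0] > self.center[0])
                | (int(p[1] > self.center[1]) << 1)
                | (int(p[2] > self.center[2]) << 2))

    def insert(self, p):
        p = np.asarray(p, dtype=float)
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.capacity:               # over capacity: subdivide
                offsets = [np.array([(i >> k & 1) * 2 - 1 for k in range(3)]) for i in range(8)]
                self.children = [OctreeNode(self.center + o * self.half_size / 2,
                                            self.half_size / 2, self.capacity) for o in offsets]
                for q in self.points:                          # redistribute stored points
                    self.children[self._child_index(q)].insert(q)
                self.points = []
        else:
            self.children[self._child_index(p)].insert(p)

# Usage: store a random point cloud that fits in a unit cube centered at the origin.
root = OctreeNode(center=[0.0, 0.0, 0.0], half_size=1.0)
for pt in np.random.uniform(-1.0, 1.0, size=(100, 3)):
    root.insert(pt)
```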


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06T17/20G06V10/80G06V10/82G06K9/62G06N3/04
CPCG06T17/20G06N3/045G06F18/253
Inventor 陈小雪周谷越
Owner TSINGHUA UNIV