
Three-dimensional laser mapping method and system

A 3D laser mapping technology, applied in 3D modeling, image analysis, image data processing, etc. It addresses the problems of low positioning accuracy and overly complex methods in the prior art, achieving high positioning accuracy with a simple algorithm and improved robustness and accuracy.

Pending Publication Date: 2020-05-15
SUZHOU AGV ROBOT CO LTD

AI Technical Summary

Problems solved by technology

[0003] The technical problem to be solved by the present invention is to overcome the low positioning accuracy and complexity of prior-art methods, and thereby to provide a three-dimensional laser mapping method that is simple and achieves high positioning accuracy.


Examples


Embodiment 1

[0032] As shown in figures 1 and 2, this embodiment provides a three-dimensional laser mapping method comprising the following steps.

Step S1: acquire data with the sensor mounted on the robot, and filter out unqualified data.

Step S2: calculate the pose of the robot with a coarse-to-fine matching algorithm. The coarse-to-fine matching algorithm includes: describing the point cloud map Q_k with a K-D tree; for each laser point of the current frame, searching Q_k for its closest point and establishing a pairing; building a nonlinear equation whose objective is to minimize the distances of all the pairs; solving this nonlinear optimization problem with the L-M method to obtain T_opt = (R, t); and then deriving the pose of the robot at time k+1 from T_opt, where Q_k denotes the point cloud map built at time k, a second quantity denotes the point cloud converted from the current frame to the world coordinate system at time k+1, and a third denotes the pose of the robot in the world coordinate system at time ...
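A minimal sketch of the matching step above, assuming a point-to-point ICP reading of it: the map Q_k is indexed with a K-D tree, each laser point of the current frame is paired with its nearest neighbour in Q_k, and the summed pair distances are minimised with a Levenberg-Marquardt (L-M) solver to obtain T_opt = (R, t). The function name match_scan_to_map, the iteration count, and the rotation-vector parameterisation are illustrative assumptions, not taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation
from scipy.optimize import least_squares

def match_scan_to_map(scan_pts, map_pts, n_iters=10):
    """Align the current scan (N, 3) to the map Q_k (M, 3); return (R, t)."""
    tree = cKDTree(map_pts)                      # K-D tree description of Q_k
    R_opt, t_opt = np.eye(3), np.zeros(3)
    for _ in range(n_iters):                     # coarse-to-fine refinement loop
        moved = scan_pts @ R_opt.T + t_opt       # scan under the current estimate
        _, idx = tree.query(moved)               # closest point in Q_k for each laser point
        targets = map_pts[idx]

        def residuals(x):                        # x = [rotation vector (3), translation (3)]
            R = Rotation.from_rotvec(x[:3]).as_matrix()
            return ((moved @ R.T + x[3:]) - targets).ravel()

        sol = least_squares(residuals, np.zeros(6), method="lm")  # L-M step
        dR = Rotation.from_rotvec(sol.x[:3]).as_matrix()
        R_opt, t_opt = dR @ R_opt, dR @ t_opt + sol.x[3:]         # compose the increment
    return R_opt, t_opt                          # T_opt = (R, t)
```

Under these assumptions, the pose of the robot at time k+1 would then be obtained by composing T_opt with the pose at time k.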

Embodiment 2

[0069] Based on the same inventive concept, this embodiment provides a three-dimensional laser mapping system. Since its problem-solving principle is similar to that of the above-described three-dimensional laser mapping method, repeated descriptions are omitted here.

[0070] The three-dimensional laser mapping system described in this embodiment includes:

[0071] an acquisition and filtering module, used to acquire data with the sensor mounted on the robot and to filter out unqualified data;

[0072] a calculation module, used to calculate the pose of the robot with the coarse-to-fine matching algorithm, which includes: describing Q_k with a K-D tree; for each laser point of the current frame, searching Q_k for its closest point and establishing a pairing; building a nonlinear equation whose objective is to minimize the distances of all the pairs; solving this nonlinear optimization problem with the L-M method to obtain T_opt = (R, t); and then deriving the pose of the robot at ...
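A minimal structural sketch of how the modules described in this embodiment could be organised, assuming simple range/NaN filtering for "unqualified data" and a pluggable scan-to-map matcher inside the calculation module; all class names, method names, and thresholds here are illustrative assumptions, not taken from the patent.

```python
import numpy as np

class AcquisitionFilterModule:
    """Acquire sensor data and drop unqualified points (assumed: NaNs and out-of-range returns)."""
    def __init__(self, min_range=0.5, max_range=80.0):     # assumed thresholds
        self.min_range, self.max_range = min_range, max_range

    def filter(self, scan):                                 # scan: (N, 3) laser points
        r = np.linalg.norm(scan, axis=1)
        return scan[np.isfinite(r) & (r > self.min_range) & (r < self.max_range)]

class CalculationModule:
    """Compute the robot pose by delegating to a scan-to-map matcher (e.g. the ICP sketch above)."""
    def __init__(self, matcher):
        self.matcher = matcher

    def compute_pose(self, scan, map_pts):
        R, t = self.matcher(scan, map_pts)
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, t
        return T                                            # 4x4 pose in the world frame

class MapModule:
    """Grow the point-cloud map from each posed scan; loop-closure results would rewrite it."""
    def __init__(self):
        self.points = np.empty((0, 3))

    def insert(self, scan, pose):
        world = scan @ pose[:3, :3].T + pose[:3, 3]
        self.points = np.vstack([self.points, world])
```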



Abstract

The invention relates to a three-dimensional laser mapping method and system. The method comprises the steps of: acquiring data with a sensor placed on a robot and filtering out unqualified data; calculating the pose of the robot with a coarse-to-fine matching algorithm; and constructing a new map according to the pose of the robot, then modifying the map according to the loop closure detection result. The method can generate a scene map with high consistency; the algorithm is simple and the accuracy is high.
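A hedged sketch of the last step in the abstract, correcting the map after loop-closure detection. The linear distribution of translational drift over the intervening poses shown here is an illustrative assumption standing in for whatever correction the patent actually applies, and all names are hypothetical.

```python
import numpy as np

def correct_trajectory(poses, loop_start, loop_end, drift):
    """Spread the translational drift (3,) observed at loop closure linearly
    over poses[loop_start..loop_end], each a 4x4 homogeneous matrix."""
    corrected = [p.copy() for p in poses]
    span = max(loop_end - loop_start, 1)
    for i in range(loop_start, loop_end + 1):
        alpha = (i - loop_start) / span           # 0 at the loop start, 1 at the loop end
        corrected[i][:3, 3] += alpha * drift      # blend the correction in
    return corrected
```

The map would then be rebuilt by re-projecting each stored scan with its corrected pose, which is what restores global consistency once the loop is closed.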

Description

Technical Field

[0001] The invention relates to the technical field of computer image processing, and in particular to a three-dimensional laser mapping method.

Background

[0002] In a large-scale outdoor environment, creating an environmental map with a laser ranging sensor is more complicated than indoors. Because the outdoor environment is large and feature-poor, the information the sensor can perceive is limited, which poses a new challenge for autonomous robot mapping. At this stage, 3D laser SLAM technology is generally used, and it usually stores the map as a point cloud. However, as the scope of the scene expands, a point cloud map inevitably causes memory to surge, so a more suitable data structure is needed. In addition, when the robot returns to a previously visited location after a long path, the unavoidable cumulative error of data association produces an inconsistent map ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/05, G06T7/70, G06F17/11
CPC: G06T17/05, G06T7/70, G06F17/11
Inventors: 刘胜明, 姜志英, 芮青青, 司秀芬
Owner: SUZHOU AGV ROBOT CO LTD