
Slam method and system based on laser radar point cloud and camera image data fusion

A technology fusing lidar and camera image data, applied to image data processing, image enhancement, and image analysis. It addresses the low robustness of traditional SLAM methods and achieves improved robustness, an optimized pose sequence, and good pose-estimation accuracy.

Active Publication Date: 2020-08-21
SHANGHAI JIAO TONG UNIV
Cites: 9 · Cited by: 34

AI Technical Summary

Problems solved by technology

Traditional SLAM methods rely on low-level feature information such as points and lines to estimate pose; their robustness is low in complex outdoor environments, especially in scenes with sparse features.




Embodiment Construction

[0049] The present invention will be described in detail below with reference to specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit it in any form. It should be noted that those skilled in the art may make several changes and improvements without departing from the concept of the present invention; these all fall within the protection scope of the present invention.

[0050] As shown in Figure 1, the present invention provides a SLAM method for point cloud and image fusion, which involves an image segmentation module, a point cloud segmentation module, an object segmentation and fusion module, a front/rear frame object association module, a multi-constraint pose estimation module, and a closed-loop detection and optimization module. Figure 2 is a schematic flow chart of an embodiment of the present invention. Among them, step 1 involves two modules, the image seg...
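The module pipeline described above can be sketched as follows. This is a minimal illustrative skeleton, not the patent's implementation: the class and function names, the data shapes, and the label-intersection fusion rule are all assumptions.

```python
# Hypothetical sketch of the pipeline from [0050]: object segmentation and
# fusion, then front/rear frame object association. All names and the
# label-intersection rule are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class KeyFrame:
    image_objects: list                 # instances from image segmentation
    cloud_objects: list                 # objects from point cloud segmentation
    fused_objects: list = field(default_factory=list)

def fuse_objects(frame: KeyFrame) -> KeyFrame:
    """Object segmentation and fusion module: unify image and point cloud
    detections into one object set (here, simply by shared label)."""
    labels = ({o["label"] for o in frame.image_objects}
              & {o["label"] for o in frame.cloud_objects})
    frame.fused_objects = sorted(labels)
    return frame

def associate(prev: KeyFrame, curr: KeyFrame) -> list:
    """Front/rear frame object association module: match objects between
    consecutive key frames (here, by label intersection)."""
    return sorted(set(prev.fused_objects) & set(curr.fused_objects))

f1 = fuse_objects(KeyFrame([{"label": "car"}, {"label": "tree"}],
                           [{"label": "car"}, {"label": "pole"}]))
f2 = fuse_objects(KeyFrame([{"label": "car"}, {"label": "pole"}],
                           [{"label": "car"}, {"label": "pole"}]))
print(associate(f1, f2))  # -> ['car']
```

A real system would match by geometric overlap and instance descriptors rather than labels alone; the skeleton only shows how the modules hand data to one another.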



Abstract

The invention provides a SLAM method and system based on fusion of laser radar point clouds and camera image data. The method comprises the steps of: extracting a key frame; carrying out object instance segmentation of the key frame image to obtain object instances in the image; performing object segmentation on the point cloud of the key frame to obtain objects in the point cloud space; fusing and unifying the object instances in the image with the objects in the point cloud space to obtain an object set; matching the objects of the front and rear frames according to the object set; solving for the camera pose according to the point cloud matching error of the front and rear frames, the re-projection error of the image, and the object category error of the feature points in the front and rear frames; and registering the image carrying the object instance information into a point cloud map according to the camera pose, obtaining a point cloud map with image semantic information. The method improves the robustness of object instance segmentation and adds semantic constraints to the optimization equation, so that the solved pose is more accurate.
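The pose-solving step combines a point cloud matching error, an image re-projection error, and an object category error into one objective. A minimal 2-D sketch of such a multi-constraint cost is below; the re-projection term is omitted for brevity, and the function names and weights are assumptions rather than the patent's formulation.

```python
# Toy multi-constraint cost: weighted sum of a 2-D point cloud matching
# error and a semantic category error. Weights and names are assumptions;
# the patent's image re-projection term is omitted for brevity.
import math

def point_cloud_error(pose, src, dst):
    """Sum of squared distances between transformed source points and
    their matched target points (2-D toy version of the ICP term)."""
    tx, ty, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    err = 0.0
    for (x, y), (u, v) in zip(src, dst):
        xr, yr = c * x - s * y + tx, s * x + c * y + ty
        err += (xr - u) ** 2 + (yr - v) ** 2
    return err

def category_error(labels_prev, labels_curr):
    """Semantic constraint: count matched feature points whose object
    categories disagree between the front and rear frames."""
    return sum(a != b for a, b in zip(labels_prev, labels_curr))

def multi_constraint_cost(pose, src, dst, labels_prev, labels_curr,
                          w_cloud=1.0, w_sem=0.5):
    return (w_cloud * point_cloud_error(pose, src, dst)
            + w_sem * category_error(labels_prev, labels_curr))

src = [(0.0, 0.0), (1.0, 0.0)]
dst = [(1.0, 0.0), (2.0, 0.0)]          # pure translation by (1, 0)
cost = multi_constraint_cost((1.0, 0.0, 0.0), src, dst,
                             ["car", "tree"], ["car", "tree"])
print(cost)  # -> 0.0 (correct pose, consistent categories)
```

In practice this cost would be minimized over the pose with a nonlinear least-squares solver; the sketch only shows how the semantic term enters the objective alongside the geometric one.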

Description

technical field

[0001] The present invention relates to the technical field of mobile robot positioning and navigation, in particular to a SLAM method based on fusion of laser radar point cloud and camera image data, and more particularly to a SLAM method for mobile robots based on multi-sensor fusion.

Background technique

[0002] In the field of mobile robot positioning and navigation, Simultaneous Localization and Mapping (SLAM) refers to a robot determining its own pose in the working environment while simultaneously building a map of that environment. The essence of the SLAM problem is that the robot must model the environment and estimate its own pose while the environment is unknown: to build a model of the surroundings, the robot needs to know its own pose, and to determine its pose, it relies on the environment model.

[0003] SLAM problems can often be divided into front-end data association and back-end loop closure optimization. The existing SLAM methods based on lidar...
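The front-end/back-end split in [0003] can be illustrated with a toy 1-D pose graph: the front end chains relative odometry into absolute poses, and the back end corrects accumulated drift when a loop closure is detected. The function names and the even-distribution correction scheme are simplifying assumptions, not the patent's method.

```python
# Toy 1-D pose graph: front end integrates odometry, back end distributes
# a loop-closure residual over the trajectory. Names and the even
# distribution of the residual are illustrative assumptions.

def integrate_odometry(odom):
    """Front end: chain relative motions into absolute 1-D poses."""
    poses = [0.0]
    for d in odom:
        poses.append(poses[-1] + d)
    return poses

def apply_loop_closure(poses, i, j, measured):
    """Back end (toy): a loop-closure edge says pose[j] - pose[i] should
    equal `measured`; spread the residual evenly over poses i..j."""
    residual = measured - (poses[j] - poses[i])
    n = j - i
    return [p + residual * max(0, min(k - i, n)) / n
            for k, p in enumerate(poses)]

poses = integrate_odometry([1.0, 1.0, 1.1])      # drift on the last step
corrected = apply_loop_closure(poses, 0, 3, 3.0)
# corrected[3] - corrected[0] now equals the loop measurement 3.0
# (up to float rounding), with the drift spread along the trajectory.
```

Real back ends solve this as nonlinear least squares over SE(3) poses with many constraints at once; the 1-D version only shows how a loop closure reshapes a drifted trajectory.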


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/34, G06K9/62, G06T7/55, G06T7/10, G01S17/86, G06N3/04, G06N3/08
CPC: G06T7/55, G06T7/10, G01S17/86, G06T2207/10028, G06T2207/20081, G06T2207/20084, G06N3/08, G06V20/10, G06V10/267, G06V10/751, G06N3/045, G06F18/23213, G06F18/241, Y02T10/40
Inventors: 王贺升, 赵小文
Owner SHANGHAI JIAO TONG UNIV