
Implementation method from 3D sparse point cloud to 2D grid map based on VSLAM

A method applied in image enhancement, image analysis, and photo interpretation. It addresses the need for two-dimensional maps in indoor robot positioning and navigation and the high cost of laser radar, and achieves the effect of improving operating efficiency.

Active Publication Date: 2020-01-10
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

[0004] To address the problems that indoor mobile robot positioning and navigation require two-dimensional maps and that laser radar is costly, the present invention proposes a method for converting the 3D sparse point clouds generated by visual SLAM into 2D grid navigation maps.
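The core conversion described here, flattening a 3D sparse point cloud into a 2D grid, can be sketched as a height-band filter followed by a projection of the surviving points onto grid cells. The function below is an illustrative reconstruction, not the patented implementation; the resolution, height band, and grid size are assumed parameters.

```python
import numpy as np

def project_to_grid(points, resolution=0.05, z_min=0.1, z_max=1.5, grid_size=200):
    """Project 3D points within a height band onto a 2D occupancy grid.

    points: (N, 3) array of x, y, z coordinates in the map frame.
    Cells hit by at least one point inside the band are marked occupied (1).
    All parameter values are illustrative assumptions.
    """
    grid = np.zeros((grid_size, grid_size), dtype=np.uint8)
    # Keep only points in the obstacle-relevant height band
    band = points[(points[:, 2] >= z_min) & (points[:, 2] <= z_max)]
    # World (x, y) -> integer grid indices, with the origin at the grid centre
    ij = np.floor(band[:, :2] / resolution).astype(int) + grid_size // 2
    # Discard points that fall outside the grid bounds
    ok = (ij >= 0).all(axis=1) & (ij < grid_size).all(axis=1)
    grid[ij[ok, 1], ij[ok, 0]] = 1
    return grid
```

A point at z = 3.0 m (e.g. a ceiling feature) is discarded by the band filter, while floor-level and obstacle-level points land in grid cells.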



Examples


Embodiment Construction

[0031] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0032] Table 1 lists the system platform parameters. The hardware platform is a Jetson TX2 running Linux. The experiments include data-set testing and field testing: field testing uses a MYNTEYE camera for data collection, and performance was evaluated under different lighting conditions in different scenes, including an experimental corridor and a rooftop balcony.

[0033]
  Parameter                  Implementation condition
  System hardware platform   Jetson TX2
  Vision sensor              MYNTEYE S1030
  Operating environment      Ubuntu 16.04
  Programming language       C++, Python
  Test environment           Laboratory building (30 × 20 m²)

[0034] Table 1

[0035] As shown in Figure 1, the implementation method of the present invention based on 3D sparse point cloud gener...



Abstract

The invention discloses a VSLAM-based method for converting a 3D sparse point cloud into a 2D grid map. The method comprises the following steps: visual SLAM processes the image information obtained by a camera and generates a 3D sparse point cloud. The image is then divided into a plane region and a height region through color-based visual segmentation. The plane region is used for map-point mapping: the Bresenham line-drawing algorithm is applied to all received key frames and map points, and a counter is added to optimize each mapping unit. The height region is used for obstacle detection, employing a gradient threshold and the Canny edge-detection algorithm. Finally, autonomous navigation is performed on the converted 2D grid map, which can be used for navigation and obstacle avoidance, using the Dijkstra global planning algorithm and the DWA local path-planning algorithm. A low-cost visual sensor acquires the environmental information, enabling indoor operation of the robot.
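The map-point mapping step in the abstract, running the Bresenham line-drawing algorithm between key-frame (camera) positions and map points with per-cell counters, resembles a standard counting-based occupancy-grid update: cells a ray passes through are counted as observed free, and the endpoint cell as occupied. The sketch below is one interpretation under that assumption; the `CountingGrid` class and the 0.5 occupancy ratio are illustrative choices, not taken from the patent.

```python
def bresenham(x0, y0, x1, y1):
    """Integer grid cells on the line from (x0, y0) to (x1, y1), endpoints included."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    while True:
        cells.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x0 += sx
        if e2 < dx:
            err += dx
            y0 += sy
    return cells

class CountingGrid:
    """Occupancy grid with per-cell hit/miss counters, updated once per observation."""

    def __init__(self, w, h):
        self.hits = [[0] * w for _ in range(h)]
        self.misses = [[0] * w for _ in range(h)]

    def integrate(self, cam, point):
        # Cells between the camera and the map point were seen through, so
        # they count as free; the endpoint cell counts as a hit.
        ray = bresenham(*cam, *point)
        for (x, y) in ray[:-1]:
            self.misses[y][x] += 1
        px, py = ray[-1]
        self.hits[py][px] += 1

    def occupied(self, x, y, ratio=0.5):
        # A cell is occupied when enough of its observations were hits.
        total = self.hits[y][x] + self.misses[y][x]
        return total > 0 and self.hits[y][x] / total >= ratio
```

After integrating one ray from a camera at (0, 0) to a map point at (4, 0), the endpoint cell reads as occupied while the intermediate cells read as free, which is the effect the per-cell counter is meant to produce across many key frames.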

Description

Technical field

[0001] The invention belongs to the field of computer vision and mobile robot navigation, and relates to a method for converting 3D sparse point clouds generated by visual SLAM into 2D grid maps.

Background technique

[0002] Simultaneous localization and mapping (SLAM) is an important research field for autonomous robot navigation. It is a process in which a robot equipped with specific sensors moves without prior information about the environment, builds a map of the environment, and estimates its own motion. SLAM technology has clear applications in indoor mobile robots, autonomous driving, augmented reality (AR), and other fields. The emergence of indoor mobile robots has injected new impetus into industries such as logistics transportation, warehousing inventory, port transportation, indoor rescue, and supermarket service robots.

[0003] At present, the more mature simultaneous positioning and map construction solutions on the ...

Claims


Application Information

IPC (IPC8): G06T3/00, G06T7/70, G06T7/11, G06T7/136, G01C21/00, G01C21/32, G01C11/04
CPC: G06T7/70, G06T7/11, G06T7/136, G01C21/005, G01C21/32, G01C11/04, G06T2207/10028, G06T3/067, Y02T10/40
Inventor: 胡莉英, 闫泽昊, 孙玲玲, 李郑慧, 刘新宇
Owner HANGZHOU DIANZI UNIV