Three-dimensional SLAM method based on events with a depth-enhanced vision sensor

A vision sensor and vision-enhancement technology, applied in fields such as road-network navigation, intended to solve problems such as high power and computational resource consumption.

Active Publication Date: 2016-08-17
上海趣立信息科技有限公司


Problems solved by technology

However, these existing dense 3D SLAM methods share a serious disadvantage: they are all very demanding in computation and power.




Embodiment Construction

[0050] The present invention will be further described below in conjunction with the accompanying drawings.

[0051] An event-based three-dimensional SLAM method with a depth-enhanced vision sensor, characterized in that the method comprises the following steps:

[0052] Step 1: Generation of input data stream:

[0053] Step 1.1: The eDVS generates sparse event streams: the embedded dynamic vision sensor (eDVS) directly generates dynamically changing sparse event streams, requiring only hardware support and no software preprocessing.
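
Step 1.1 can be pictured with a minimal sketch of what such a sparse event stream looks like in code. The class and field names below are illustrative assumptions, not taken from the patent; a real eDVS emits events of roughly this shape in hardware.

```python
from dataclasses import dataclass

# Hypothetical sketch of the sparse event stream an eDVS emits: each event
# carries only a pixel location, a timestamp, and a brightness-change polarity.
# Names (DvsEvent, filter_recent) are illustrative, not from the patent.
@dataclass
class DvsEvent:
    x: int          # pixel column
    y: int          # pixel row
    t_us: float     # timestamp in microseconds
    polarity: int   # +1 brightness increase, -1 decrease

def filter_recent(events, t_now_us, window_us=10_000):
    """Keep only events inside a short time window (the stream stays sparse)."""
    return [e for e in events if t_now_us - e.t_us <= window_us]

events = [DvsEvent(10, 20, 100.0, +1), DvsEvent(11, 20, 9_500.0, -1)]
recent = filter_recent(events, t_now_us=10_000.0)
print(len(recent))  # both events fall inside the 10 ms window
```

Because only changed pixels produce events, such a stream is far smaller than full frames, which is what makes the hardware-only generation in Step 1.1 cheap.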

[0054] Step 1.2: The D-eDVS acquires depth-enhanced pixel events: the embedded dynamic vision sensor (eDVS) is combined with a separate active depth sensor (a PrimeSense RGB-D device, the Asus Xtion Pro Live, with a resolution of 320×240 at 69 Hz), and the corresponding pixels of the two sensors are calibrated against each other; the depth sensor (PrimeSense RGB-D) can then provide the depth at the corresponding pixel position of each generated...
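
Once an event pixel has a calibrated depth value, it can be back-projected to a 3D point. The sketch below shows the standard pinhole back-projection this kind of depth augmentation relies on; the intrinsic parameters (fx, fy, cx, cy) are made-up values for a 320×240 sensor, not calibration data from the patent.

```python
import numpy as np

# Illustrative pinhole back-projection (not the patent's exact math): a
# depth-augmented pixel event (u, v, depth) maps to a 3D point in the
# sensor frame. Intrinsics below are assumed placeholder values.
fx, fy = 280.0, 280.0    # assumed focal lengths in pixels
cx, cy = 160.0, 120.0    # assumed principal point (image centre of 320x240)

def backproject(u, v, depth_m):
    """Pixel (u, v) with depth in metres -> 3D point [X, Y, Z]."""
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    return np.array([x, y, depth_m])

p = backproject(200.0, 120.0, 1.4)
print(p)  # X = (200-160)/280*1.4 = 0.2, Y = 0.0, Z = 1.4
```

In the method, each sparse event would be augmented this way, so the SLAM back end receives 3D points rather than bare 2D pixel events.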



Abstract

The invention provides an event-based three-dimensional SLAM (simultaneous localization and mapping) method with a depth-enhanced vision sensor. The method comprises the following steps: an embedded dynamic vision sensor is combined with a separately movable depth sensor to obtain pixel events enhanced with depth information; these pixel events serve as the sole input of the event-based three-dimensional SLAM method, and a panoramic map is generated by selecting an incremental particle model, building a local map from discrete-probability sparse voxel grid modeling, and updating the local map iteratively. No special hardware is needed, processing runs 20 times faster than real time, and position updates are performed at frequencies of hundreds of Hertz; the method offers good results, low memory demand, low power consumption, and a highly efficient computation rate.
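
The two map-side ingredients named in the abstract, a sparse voxel grid and a particle-based pose model, can be sketched minimally as follows. Everything here (the dict-based grid, the counting score, the 2D translation-only pose) is a deliberately simplified assumption to illustrate the idea, not the patent's actual model.

```python
import numpy as np

# Hypothetical sketch: a sparse voxel grid stored as a dict of occupancy
# counts, and candidate poses (particles) scored by how many back-projected
# event points land in already-occupied voxels. The scoring rule and the
# 2D translation-only pose are illustrative simplifications.
VOXEL = 0.05  # assumed voxel edge length in metres

def voxel_key(point_xy):
    return tuple(np.floor(point_xy / VOXEL).astype(int))

def score(pose_xy, points, grid):
    """Count event points that fall in occupied voxels under this pose."""
    return sum(1 for p in points if voxel_key(p[:2] + pose_xy) in grid)

def update_map(grid, pose_xy, points):
    """Incrementally raise the discrete occupancy count of hit voxels."""
    for p in points:
        k = voxel_key(p[:2] + pose_xy)
        grid[k] = grid.get(k, 0) + 1

grid = {}
pts = [np.array([0.1, 0.2, 1.0]), np.array([0.3, 0.4, 1.2])]
update_map(grid, np.zeros(2), pts)

# A particle near the mapped pose scores higher than a far-away one.
particles = [np.zeros(2), np.array([1.0, 1.0])]
scores = [score(p, pts, grid) for p in particles]
print(scores)  # [2, 0]
```

Because the grid is a sparse dict rather than a dense array, memory scales with the occupied surface rather than the volume, matching the abstract's low-memory claim.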

Description

Technical field

[0001] The invention relates to an event-based three-dimensional SLAM (simultaneous localization and mapping) method with a depth-enhanced vision sensor, and belongs to the field of simultaneous localization and mapping (SLAM) for mobile robots.

Background technique

[0002] The SLAM (simultaneous localization and mapping) algorithm is one of the core tasks in robotics and computer vision, enabling robots to explore unknown and unconstrained environments. Traditional 2D and 2.5D SLAM algorithms can construct a bird's-eye view. In addition, several 3D SLAM algorithms have been proposed in recent years, most of which combine color and depth (RGB-D) sensors such as PrimeSense devices. A typical existing 3D SLAM method is KinectFusion, a dense 3D SLAM method that matches depth images with iterative closest points and obtains a 3D map from a signed-distance representation. Another is the method proposed by Bylow et al. ...
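
The signed-distance representation mentioned for KinectFusion can be illustrated with a toy example. This is a hedged sketch of the general idea only (each spatial sample stores its signed distance to the nearest surface, negative inside and positive outside), not KinectFusion's actual truncated-SDF fusion pipeline; the sphere and its parameters are made up.

```python
import numpy as np

# Toy signed-distance function: for a sphere, the signed distance of a point
# is its distance to the centre minus the radius (negative inside, zero on
# the surface, positive outside). Dense SLAM methods store such values per
# voxel and extract the surface at the zero crossing.
def sphere_sdf(point, center, radius):
    """Signed distance from `point` to the surface of a sphere."""
    return np.linalg.norm(point - center) - radius

c = np.zeros(3)
print(sphere_sdf(np.array([2.0, 0.0, 0.0]), c, 1.0))  # 1.0 (outside)
print(sphere_sdf(np.array([0.5, 0.0, 0.0]), c, 1.0))  # -0.5 (inside)
```

Storing such values densely over a volume is exactly what makes KinectFusion-style methods memory- and compute-hungry, which is the disadvantage the invention targets.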

Claims


Application Information

IPC(8): G01C21/32
Inventors: 廖鸿宇, 孙放
Owner: 上海趣立信息科技有限公司