
Robust slam method for indoor robots combined with environmental semantics

An indoor robot SLAM method combining environmental semantics, in the field of computer vision. It addresses problems such as cumulative error, lack of texture features, and degraded visual-odometry positioning accuracy, and achieves improved robustness.

Pending Publication Date: 2019-09-10
紫光云技术有限公司

AI Technical Summary

Problems solved by technology

Based on this, the design and use of robots are gradually shifting from functionality to robustness. Existing indoor robots can meet the needs of high-precision indoor positioning and navigation in simple environments, but once the environment changes or noise is high, robot vision accumulates errors at different scales, which is highly unfavorable for navigation. In addition, indoor robots that have entered human society cannot understand environmental information, so completing assigned tasks depends on the support of subsequent path planning; such systems cannot be called truly "strong" intelligence.
[0003] Logistics robots can perform intelligent warehousing and automated logistics indoors. However, because sealed warehouse environments are often dominated by white backgrounds and lack texture features, scale information is lost during navigation and the positioning accuracy of the visual odometry is degraded. The present invention integrates environmental semantics into the map and applies optimization techniques such as potential-field guidance to obtain a robust SLAM method that can read the environment map.
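The potential-field guidance mentioned above can be illustrated with a standard artificial-potential-field step: an attractive gradient pulls the robot toward its goal while repulsive gradients push it away from nearby obstacles. This is a minimal 2-D sketch of the general technique, not the patent's implementation; all function and parameter names are assumptions.

```python
import math

def potential_field_step(pos, goal, obstacles,
                         k_att=1.0, k_rep=100.0, d0=2.0, step=0.1):
    """One gradient-descent step on an artificial potential field.

    The attractive term pulls toward the goal; repulsive terms push away
    from obstacles closer than the influence radius d0. (Illustrative
    sketch only; gains and radius are assumed values.)
    """
    x, y = pos
    gx, gy = goal
    # Attractive gradient: linear pull toward the goal.
    fx = k_att * (gx - x)
    fy = k_att * (gy - y)
    for ox, oy in obstacles:
        d = math.hypot(x - ox, y - oy)
        if 1e-6 < d < d0:
            # Repulsive gradient, growing sharply near the obstacle.
            mag = k_rep * (1.0 / d - 1.0 / d0) / (d * d)
            fx += mag * (x - ox)
            fy += mag * (y - oy)
    # Move a fixed step length along the combined force direction.
    norm = math.hypot(fx, fy) or 1.0
    return (x + step * fx / norm, y + step * fy / norm)
```

With no obstacles nearby, each call simply advances the robot a fixed step toward the goal; near an obstacle the repulsive term bends the path around it.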

Method used




Embodiment Construction

[0035] It should be noted that, where no conflict arises, the embodiments of the present invention and the features within them may be combined with each other.

[0036] The present invention will be described in detail below with reference to the accompanying drawings and embodiments.

[0037] As shown in figure 1, the present invention provides a robust SLAM method for indoor robots combined with environmental semantics, comprising the following steps:

[0038] 1. Indoor robot positioning: use GPS global signals, IMU (inertial measurement unit) signals, and lidar signals as the raw inputs, and apply prior-input and observation-correction methods to make indoor robot positioning more accurate and robust.
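The prior-input and observation-correction loop described in this step is the classic predict/correct structure of a Kalman filter. As a minimal sketch (one dimension only, whereas the patent fuses GPS, IMU, and lidar in full pose space; the class and variable names are assumptions):

```python
class KalmanFilter1D:
    """Minimal 1-D Kalman filter: prior prediction plus observation
    correction. Illustrative of the fusion loop only, not the patent's
    actual estimator."""

    def __init__(self, x0, p0, q, r):
        self.x = x0   # state estimate (e.g. position along a corridor)
        self.p = p0   # estimate variance
        self.q = q    # process noise (e.g. IMU/odometry drift per step)
        self.r = r    # measurement noise (e.g. lidar or GPS variance)

    def predict(self, u):
        # Prior input: propagate the state with a motion increment u.
        self.x += u
        self.p += self.q

    def update(self, z):
        # Observation correction: blend in the measurement z,
        # weighted by the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

After a prediction of 1.0 and a measurement of 1.2 with comparable variances, the corrected estimate lands between the two, which is exactly the "prior input plus observation correction" behavior the step describes.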

[0039] When the robot makes a non-right-angle turn at an indoor corner, a positioning error is introduced; this error accumulates and is amplified more and more over later road sections. In order to eliminate it, the indoor robot robu...



Abstract

The present invention provides a robust SLAM method for indoor robots combined with environmental semantics. The method comprises: indoor robot positioning fusing GPS global signals, IMU inertial measurement unit signals, and lidar signals; indoor robot perception fusing radar data, image data, calibrated radar extrinsic parameters, calibrated camera intrinsic and extrinsic parameters, and host speed and angular velocity; an indoor robot semantic map fusing the NavigationInfo of a dreamview module, the LaneMarker from a sensing module, and the positioning information of a positioning module; indoor robot decision making fusing obstacle information, vehicle status, traffic lights, and map information; and indoor robot path planning fusing positioning, perception, prediction, routing, and the map. With the method provided by the present invention, the robot moves and makes decisions more robustly in its environment.
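The five fusion stages listed in the abstract form a pipeline in which each module consumes the outputs of earlier ones. The following structural sketch shows only that data flow; the `fuse` placeholder and all function names are assumptions standing in for the real per-module fusion logic:

```python
def fuse(*inputs):
    """Placeholder fusion: merge dict inputs into one. Stands in for
    each module's actual sensor/data fusion."""
    out = {}
    for d in inputs:
        out.update(d)
    return out

def slam_pipeline(gps, imu, lidar, camera):
    """Sketch of the abstract's five-stage data flow (not the patent's
    implementation)."""
    # 1. Positioning: fuse GPS, IMU, and lidar signals.
    positioning = fuse(gps, imu, lidar)
    # 2. Perception: fuse radar/image data (calibration parameters
    #    and host velocity would also enter here).
    perception = fuse(lidar, camera)
    # 3. Semantic map: fuse navigation info, lane markers, and
    #    positioning output.
    semantic_map = fuse(positioning, perception)
    # 4. Decision making: fuse obstacles, vehicle status, and map info.
    decision = fuse(perception, semantic_map)
    # 5. Path planning: fuse positioning, perception, prediction,
    #    routing, and the map.
    return fuse(positioning, perception, decision, semantic_map)
```

The point of the sketch is the dependency order: positioning and perception feed the semantic map, which in turn feeds decision making and path planning.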

Description

Technical field: [0001] The invention belongs to the technical field of computer vision, and in particular relates to a robust SLAM method for indoor robots combined with environmental semantics. Background technique: [0002] With the rapid development of artificial intelligence technology, especially strong intelligence, robot application scenarios have gradually shifted from extreme environments and industrial settings to indoor work scenarios that closely interact with humans. Based on this, the design and use of robots are gradually shifting from functionality to robustness. Existing indoor robots can meet the needs of high-precision indoor positioning and navigation in simple environments, but once the environment changes or noise is high, robot vision accumulates errors at different scales, which is highly unfavorable for navigation; in addition, indoor robots that have entered human society cannot understand environmental informa...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/20; G05D1/02
CPC: G01C21/206; G05D1/0278; G05D1/027; G05D1/0257; G05D1/0231; G05D1/0272
Inventor 常俊龙
Owner 紫光云技术有限公司