
Vision sensor based robot location and obstacle avoidance method

A robot positioning and obstacle avoidance technology based on visual sensors, applied in the fields of instrumentation, mapping, and navigation. It addresses the problems of sensors being limited by their working state, suboptimal optimization results, and designs that hinder robot miniaturization.

Pending Publication Date: 2019-09-20
李守斌

AI Technical Summary

Problems solved by technology

This method is costly to implement, is not conducive to the miniaturization of the robot, and is limited by the working state of the sensor.
The data optimization method used is the extended Kalman filter (EKF), whose solution is not guaranteed to be optimal.

Method used



Embodiment Construction

[0029] The present invention is further illustrated below in conjunction with specific embodiments. It should be understood that these embodiments are intended only to illustrate the invention and not to limit its scope. After reading the present disclosure, those skilled in the art will understand that various equivalent modifications of the invention all fall within the scope defined by the appended claims of the present application.

[0030] A robot positioning and obstacle avoidance method based on a visual sensor comprises the following steps:

[0031] Step 1: acquire images through the depth perception module. As shown in Figure 1, the depth perception module is composed of an infrared sensor 3, an infrared laser emitter 2, a color camera 4, and a real-time processing chip. The camera and chip of the depth perception module can provide high-resolution, high-frequency depth images, and r...
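To make Step 1 concrete, the sketch below shows one plausible way to turn a raw depth frame from such a module into the pixel matrix that later steps operate on. The frame resolution (640x480) and the 16-bit millimetre encoding are illustrative assumptions, not details taken from the patent text.

```python
import numpy as np

def depth_frame_to_matrix(raw_bytes: bytes, width: int = 640, height: int = 480) -> np.ndarray:
    """Convert a raw depth frame (one 16-bit depth value per pixel, in
    millimetres) into an (height, width) matrix of depths in metres."""
    depth_mm = np.frombuffer(raw_bytes, dtype=np.uint16).reshape(height, width)
    return depth_mm.astype(np.float32) / 1000.0  # millimetres -> metres

# Usage with a synthetic frame: every pixel reports 1500 mm (1.5 m).
raw = np.full((480, 640), 1500, dtype=np.uint16).tobytes()
depth = depth_frame_to_matrix(raw)
print(depth.shape, float(depth[0, 0]))
```

A matrix in metres is a convenient common currency: obstacle thresholds, map updates, and feature depth lookups in the later steps can all index it directly by pixel coordinates.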



Abstract

The invention discloses a vision sensor based robot location and obstacle avoidance method. The method includes obtaining depth map data through a depth sensing module and converting the depth map data into a matrix of pixels; performing map construction and localization during the movement of the robot; and optimizing the depth sensing module data, performing image feature extraction and matching, retaining key features and keyframes, performing relocalization, completing a global map of the local space, and using the Floyd algorithm to recommend a locally optimal path. The method provides an effective guarantee for indoor localization and obstacle avoidance, and it does not depend on a specific environment, so it can be applied to a variety of vision sensor based robots.
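The abstract's path-recommendation step names the Floyd (Floyd-Warshall) all-pairs shortest-path algorithm. A minimal sketch of that step is below; the small 4-node graph and its edge weights are illustrative assumptions, standing in for whatever graph the robot builds over its local map.

```python
INF = float("inf")

def floyd(dist):
    """Floyd-Warshall: dist[i][j] is the edge weight (INF if no edge).
    Returns shortest distances and a next-hop table for path recovery."""
    n = len(dist)
    d = [row[:] for row in dist]
    nxt = [[j if d[i][j] < INF else None for j in range(n)] for i in range(n)]
    for k in range(n):            # allow intermediate node k
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
                    nxt[i][j] = nxt[i][k]
    return d, nxt

def path(nxt, i, j):
    """Reconstruct the node sequence from i to j via the next-hop table."""
    if nxt[i][j] is None:
        return []
    p = [i]
    while i != j:
        i = nxt[i][j]
        p.append(i)
    return p

# 4-node example: going 0 -> 1 -> 3 (cost 3) beats the direct 0 -> 3 edge (cost 10).
w = [[0,   1,   INF, 10],
     [1,   0,   1,   2],
     [INF, 1,   0,   1],
     [10,  2,   1,   0]]
d, nxt = floyd(w)
print(d[0][3], path(nxt, 0, 3))
```

Floyd-Warshall computes all pairs in O(n^3), which suits a bounded local map where the robot may need a route between any two waypoints, whereas a single-source method like Dijkstra would be re-run per query.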

Description

Technical Field

[0001] The invention relates to a visual sensor-based robot positioning and obstacle avoidance method, suitable for indoor intelligent robots.

Background

[0002] The first truly intelligent robot, equipped with visual sensors, was developed by the Stanford Research Institute in the United States in 1968. Current research on positioning and obstacle avoidance for indoor intelligent robots mainly includes the following approaches.

[0003] Navigation along a specified path, with obstacle avoidance realized through a scheduling system (manual or preset). This approach relies on a complex scheduling system design, and such systems generalize poorly. Moreover, because the path is predetermined, additional auxiliary equipment and sensors are needed to identify it, so the required hardware facilities are relatively complicated. In this way, it is determined that it is mor...

Claims


Application Information

IPC(8): G01C21/20
CPC: G01C21/206
Inventors: 蒋涛, 周志坚, 李守斌, 裴鹏飞
Owner: 李守斌