Robot navigation method based on machine vision and machine learning

A navigation method and robot technology, applied in the field of robot navigation based on machine vision and machine learning, to solve problems such as the lack of a generally applicable navigation method, complex and changeable environments, and navigation methods that cannot be applied to most mobile robots.

Status: Inactive | Publication Date: 2016-07-13
HARBIN WEIFANG INTELLIGENT SCI & TECH DEV
Cites: 1 | Cited by: 18

AI Technical Summary

Problems solved by technology

However, because these technologies depend heavily on the robot's configuration, the environment, and parameter settings, they cannot be applied to most mobile robots.

[0004] The robot navigation environment is relatively complex. Existing navigation methods are mainly aimed at specific environments, particular robot configurations (robots equipped with binocular vision, trinocular vision, panoramic cameras, laser radar, and other sensors), simple dynamic environments, and the like, and most of the experiments are verified only by simulation. Because robots differ in mechanical structure and hardware configuration, and because environments are complex and changeable, each navigation method is generally effective only in a specific environment; so far, there is no general navigation method applicable to different environments.


Image

(Drawings: three views titled "Robot navigation method based on machine vision and machine learning"; Figure 1 shows the robot hardware platform described in the embodiment below.)

Examples

Embodiment Construction

[0022] In order to make the objectives, technical solutions, and advantages of the present application clearer, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments. The described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by those skilled in the art from the embodiments in this application without creative effort shall fall within the protection scope of this application.

[0023] The robot hardware platform involved in the present invention is shown in Figure 1. It mainly includes the following components:

[0024] Host computer: for example, a laptop; it mainly runs the machine learning algorithms and the visual image processing algorithms.

[0025] Robot operating handle: used by the expert to remotely control the robot's operation.

[0026] Camera module: In the form of each frame image, t...
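These components support the demonstration-recording phase of imitation learning: the camera supplies one image per frame while the expert drives the robot with the operating handle, and each frame is stored together with the command in force at that moment. The following is a minimal sketch of such a recording loop, assuming OpenCV for the camera and pygame for the handle; the axis threshold and the discrete command labels (0 = forward, 1 = left, 2 = right) are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch (not the patent's implementation): pair each camera frame
# with the command currently issued on the operating handle.
import cv2
import pygame

def record_demonstration(camera_index=0, max_frames=1000):
    pygame.init()
    pygame.joystick.init()
    handle = pygame.joystick.Joystick(0)  # the robot operating handle
    handle.init()
    cap = cv2.VideoCapture(camera_index)

    samples = []  # list of (frame, command) pairs, one per video frame
    try:
        while len(samples) < max_frames:
            ok, frame = cap.read()
            if not ok:
                break
            pygame.event.pump()        # refresh joystick state
            axis = handle.get_axis(0)  # left/right deflection in [-1, 1]
            # Quantize the expert's input into a discrete command label
            # (assumed encoding: 0 = forward, 1 = left, 2 = right).
            command = 1 if axis < -0.3 else 2 if axis > 0.3 else 0
            samples.append((frame, command))
    finally:
        cap.release()
    return samples
```

Each recorded pair becomes one training example once the frame is reduced to a visual feature vector, as described in the abstract below.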



Abstract

The invention discloses a robot navigation method comprising a robot imitation learning step and an autonomous navigation step. The imitation learning step comprises: recording each video frame obtained by the camera together with the corresponding control command; extracting a visual feature vector from each video frame; using the visual feature vectors and control commands as the training data set of a classifier; and computing the optimal vector of the classifier. The autonomous navigation step comprises: extracting the visual feature vector of each video frame; and having the classifier compute and output control commands from the visual feature vectors and the optimal vector, thereby realizing autonomous navigation. The method requires neither a complex environment model nor a robot motion control model; the robot's perception of the environment depends mainly on visual feature information, so the method suits both indoor and outdoor navigation; and it can be applied to mobile robots, humanoid robots, mechanical arms, unmanned aerial vehicles, and the like.
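To make the two steps concrete, here is a minimal sketch assuming OpenCV for feature extraction and a linear classifier from scikit-learn; the HSV-histogram feature, the SGDClassifier choice, and all function names are illustrative assumptions, since the excerpt does not fix a particular feature or classifier implementation.

```python
# Minimal sketch (assumptions, not the patent's exact method): a color-
# histogram visual feature plus a linear classifier whose learned weights
# play the role of the "optimal vector" computed during imitation learning.
import cv2
import numpy as np
from sklearn.linear_model import SGDClassifier

def extract_feature_vector(frame):
    """Illustrative visual feature: a coarse 8x8 HSV color histogram."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [8, 8], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

# Imitation learning step: features plus expert commands form the training set.
def train_classifier(samples):
    X = np.array([extract_feature_vector(frame) for frame, _ in samples])
    y = np.array([command for _, command in samples])
    clf = SGDClassifier(loss="hinge")  # linear SVM fitted by SGD
    clf.fit(X, y)  # clf.coef_ plays the role of the learned optimal vector
    return clf

# Autonomous navigation step: map each new frame to a control command.
def navigation_loop(clf, send_command, camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            x = extract_feature_vector(frame).reshape(1, -1)
            send_command(int(clf.predict(x)[0]))  # drive the robot
    finally:
        cap.release()
```

With the pairs produced by the recording sketch above, training and deployment reduce to clf = train_classifier(samples) followed by navigation_loop(clf, robot.send_command), where robot.send_command stands for a hypothetical interface that forwards a command to the robot's motion controller.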

Description

Technical field

[0001] The invention belongs to the field of robots and relates to robot motion control, robot indoor and outdoor navigation, and computer technology, in particular to a robot navigation method based on machine vision and machine learning.

Background technique

[0002] With the continuous development of robotics technology, intelligent mobile robots have entered many areas of social life, playing an increasingly important role in household services, medical services, shopping malls, industry, agriculture, and the military, and demand for them is growing day by day. As a basic problem in intelligent mobile robot research, autonomous navigation has become a key technology for mobile robots to achieve autonomy and intelligence.

[0003] In recent years, many mobile robot platforms with excellent performance have emerged one after another. A large number of theoretical and applied studies have been carried out in combination with s...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06K9/66; G05D1/02; G01C21/20
CPC: G05D1/0246; G01C21/206; G06V30/194; G06F18/2415; G06F18/24
Inventors: 莫宏伟 (MO Hongwei); 徐立芳 (XU Lifang)
Owner: HARBIN WEIFANG INTELLIGENT SCI & TECH DEV