
A Vision-Based Terrain Classification Method for Mobile Robots

A terrain classification technology for mobile robots, applied in the field of robotics, which addresses the problems of the weak computing power of robots, reduced classification accuracy over time, and heavy occupation of computing resources.

Active Publication Date: 2019-07-23
UNIV OF SCI & TECH OF CHINA
Cites: 1 · Cited by: 1

AI Technical Summary

Problems solved by technology

[0004] There are few research results on vision-based terrain classification, and the following problems remain: 1) The computing power of robots is often weak, and robots require relatively long battery life. Many algorithms, however, use complex image descriptors to represent terrain samples; while this can improve classification accuracy to some extent, it occupies substantial computing resources, reduces real-time performance, and increases energy consumption. 2) A classifier trained on a fixed data set is effective only for a short time: as time passes, even samples collected on the same terrain change greatly. If the parameters of the classifier are not automatically adjusted, classification accuracy will inevitably decline.




Embodiment Construction

[0059] To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0060] As shown in Figure 1, the present invention includes two parts, offline training and online classification; the specific implementation steps are as follows:

[0061] The offline training part includes the following steps:

[0062] The first step is to control the robot to move at a constant speed over different terrains, while a camera installed on the robot captures ground images, yielding a ground-image sequence;

[0063] In the second step, feature extraction and normalization are applied to each image in the ground-image sequence obtained in the first step to obtain the sample set Σ. Each sample S_t in the sample set is described by 12 features, so each sample is a vector in the 12-dimensional sample sp...
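The second step can be sketched roughly as follows. The patent does not enumerate its 12 features, so the descriptor below (per-channel colour statistics plus simple gradient statistics, chosen as deliberately cheap stand-ins in the spirit of a "simple and compact" descriptor) and the min-max normalization are illustrative assumptions, not the claimed method:

```python
import numpy as np

def extract_features(image):
    """Compute a compact 12-dimensional descriptor for one ground image.

    Hypothetical feature set: per-channel colour statistics (6),
    gradient-magnitude statistics as a cheap texture proxy (4),
    and overall intensity statistics (2).
    """
    img = np.asarray(image, dtype=np.float64)
    feats = []
    for c in range(3):                       # mean and std of R, G, B -> 6 features
        feats.append(img[..., c].mean())
        feats.append(img[..., c].std())
    gray = img.mean(axis=2)
    gx = np.diff(gray, axis=1)               # horizontal intensity differences
    gy = np.diff(gray, axis=0)               # vertical intensity differences
    feats += [np.abs(gx).mean(), gx.std(),   # texture-like gradient statistics -> 4
              np.abs(gy).mean(), gy.std()]
    feats += [gray.mean(), gray.std()]       # overall intensity statistics -> 2
    return np.array(feats)                   # one 12-dimensional sample vector

def build_sample_set(images):
    """Stack per-image descriptors into the sample set and
    min-max normalise each feature to [0, 1]."""
    S = np.stack([extract_features(im) for im in images])
    lo, hi = S.min(axis=0), S.max(axis=0)
    return (S - lo) / np.where(hi > lo, hi - lo, 1.0)
```

Each row of the returned array corresponds to one sample S_t; normalizing features to a common range keeps no single feature from dominating the classifier.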



Abstract

The invention discloses a vision-based terrain classification method for a mobile robot, comprising two parts: offline training and online classification. Its advantages are: 1) a relatively simple and compact image descriptor and classifier are used, which reduces the consumption of computing resources and power and ensures the real-time performance of the system; a correction algorithm is added at the output back end of the classifier, solving the problem of reduced accuracy; 2) misclassified samples are extracted by the classification correction algorithm, and the classifier performs incremental learning on these samples, improving generalization performance and ensuring long-run accuracy.
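As a rough illustration of pairing a back-end correction algorithm with incremental learning, the sketch below combines a nearest-centroid classifier with a temporal majority vote. The patent does not disclose its exact classifier, correction rule, or update scheme, so the `OnlineTerrainClassifier` class, the voting window, and the centroid update rate are all assumptions:

```python
import numpy as np
from collections import deque, Counter

class OnlineTerrainClassifier:
    """Illustrative sketch, not the patented algorithm: a nearest-centroid
    classifier over 12-dimensional feature vectors, with a majority-vote
    correction at the output back end. Samples the correction step
    relabels are fed back via a small incremental centroid update."""

    def __init__(self, window=5, lr=0.05):
        self.centroids = {}                  # terrain label -> centroid vector
        self.history = deque(maxlen=window)  # recent raw predictions
        self.lr = lr                         # incremental-learning step size

    def fit_offline(self, X, y):
        """Offline training: one centroid per terrain class."""
        for label in np.unique(y):
            self.centroids[label] = X[y == label].mean(axis=0)

    def classify_online(self, x):
        """Online classification with back-end correction."""
        # raw prediction: nearest centroid
        raw = min(self.centroids,
                  key=lambda c: np.linalg.norm(x - self.centroids[c]))
        self.history.append(raw)
        # correction: terrain changes slowly, so smooth raw outputs
        # with a majority vote over a short temporal window
        corrected = Counter(self.history).most_common(1)[0][0]
        if raw != corrected:
            # likely misclassified sample: incrementally pull the
            # corrected class centroid toward it (simple online learning)
            self.centroids[corrected] += self.lr * (x - self.centroids[corrected])
        return corrected
```

The design choice mirrors the abstract's trade-off: both the descriptor and the classifier stay cheap, and accuracy drift is handled by the correction-plus-update loop rather than by a heavier model.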

Description

Technical field

[0001] The invention relates to the technical field of robots, in particular to a vision-based terrain classification method for mobile robots.

Background technique

[0002] Wheeled robots encounter various terrains when moving on the ground, and it is very important to ensure that the robot can traverse them safely and avoid dangerous environments. Existing research mainly focuses on obstacle recognition based on lidar or vision sensors. Impassable obstacles such as walls and stones are called "geometric threats". However, the ground itself may also pose a threat to robots: for example, a robot may get stuck in loose sand while traversing it. This kind of threat is called a "non-geometric threat". Through real-time terrain perception, the robot can adopt different control strategies for different terrains, so that it can traverse them safely and effectively. Therefore, real-time terrain perception is very important ...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC (8): G06K9/62; G06K9/46; G05D1/02
CPC: G05D1/0246; G06V10/462; G06F18/24; G06F18/214
Inventors: 康宇, 吕文君, 昌吉, 李泽瑞
Owner: UNIV OF SCI & TECH OF CHINA