
Indoor mobile robot visual positioning method based on convolutional neural network

A convolutional neural network and mobile robot technology, applied to neural learning methods, biological neural network models, and neural architectures. It addresses problems such as heavy computing load and strong sensitivity to the environment, and achieves a small computing load, a wider range of applications, and improved reliability and robustness.

Pending Publication Date: 2021-07-16
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS


Problems solved by technology

Visual SLAM uses a camera as its sensor, which makes it low-cost, structurally simple, and suitable for large-scale adoption, but its biggest disadvantage is that it is strongly affected by the environment. As the map is built, cumulative errors arise and the computational load becomes large.
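The cumulative-error problem can be illustrated with a minimal sketch (purely illustrative, not from the patent): composing noisy frame-to-frame motion estimates, as a visual odometer does, lets error grow with trajectory length even when each individual estimate is only slightly noisy.

```python
import random

random.seed(0)

def accumulated_drift(steps, noise_std=0.01):
    """1-D odometry sketch: true motion is 1.0 per step, but each
    measured step carries small Gaussian noise; integrating the
    measurements accumulates error without bound."""
    est, true = 0.0, 0.0
    for _ in range(steps):
        true += 1.0
        est += 1.0 + random.gauss(0.0, noise_std)
    return abs(est - true)
```

On average the drift grows roughly with the square root of the number of composed steps, which is why long uncorrected visual-odometry trajectories need loop closure or relocalization.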




Detailed Description of the Embodiments

[0041] The indoor mobile robot visual positioning method based on a convolutional neural network of the present invention collects binocular images through a binocular camera to realize positioning and control of the robot. A convolutional-neural-network-based method extracts feature points from the binocular images, the bundle adjustment (BA) method performs image tracking, and a target detection algorithm determines the image pose when tracking fails. The image pose serves as the control signal of the robot and is used to control the robot's position. The invention overcomes the drawback that images are sensitive to environmental changes (such as changes in illumination conditions) and realizes robust positioning and control of an indoor mobile robot in a GPS-denied environment.
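The track-then-relocalize control flow described in [0041] can be sketched as follows. The class and method names are illustrative stand-ins, not the patent's implementation: `Tracker` stands for the CNN-feature plus bundle-adjustment tracking path, and `Relocalizer` for the target-detection fallback.

```python
class Tracker:
    """Stand-in for CNN-feature extraction + BA-based image tracking."""
    def __init__(self):
        self.lost = False  # set True when tracking fails

    def track(self, frame):
        # Returns a pose estimate, or None on tracking failure.
        return None if self.lost else ("tracked", frame)

class Relocalizer:
    """Stand-in for the target-detection relocalization path."""
    def detect(self, frame):
        return ("relocalized", frame)

def localize(frame, tracker, relocalizer):
    """Normal path: BA tracking; on failure, fall back to detection."""
    pose = tracker.track(frame)
    if pose is None:
        pose = relocalizer.detect(frame)
    return pose  # fed back as the robot's control signal
```

The key design point the patent describes is the fallback: the detection-based path only runs when frame-to-frame tracking fails, keeping the normal-case computing load small.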

[0042] The technical solutions of the present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[00...



Abstract

The invention discloses an indoor mobile robot visual positioning method based on a convolutional neural network, belonging to the field of robot autonomous navigation. A forward-facing binocular camera and an onboard computer are mounted on an indoor robot platform. The binocular camera collects images, feature points are extracted with a convolutional neural network, and the feature points drive a feature-point-based visual odometry. During relocalization, a target detection method extracts feature vectors from the images and motion estimation is carried out. The onboard computer builds a local map for positioning from the result of the visual odometry (or relocalization) and obtains the real-time pose of the robot. The pose is fed back to the robot control system to control the robot's position. The invention realizes real-time pose estimation of the robot in the absence of GPS and greatly improves the autonomous navigation level of indoor robots.
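As a small supporting sketch for the binocular setup the abstract relies on: once a feature is matched between the left and right images, its depth follows the standard pinhole-stereo relation Z = f·B/d (focal length times baseline over disparity). The function and the numbers below are illustrative, not from the patent.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo depth: Z = f * B / d.

    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two camera centers, meters
    disparity_px -- horizontal pixel offset of the matched feature
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. f = 700 px, baseline 0.12 m, disparity 42 px -> depth 2.0 m
```

This is why binocular feature points give metric scale directly, which a monocular visual odometry cannot recover on its own.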

Description

Technical field

[0001] The invention relates to automatic driving and positioning-and-navigation technology, and in particular to a convolutional-neural-network-based visual positioning method for an indoor mobile robot.

Background technique

[0002] A mobile robot is a comprehensive system integrating environmental perception, dynamic decision-making and planning, and behavior control and execution. It integrates multi-disciplinary research results in sensor technology, information processing, electronic engineering, computer engineering, automation control engineering, and artificial intelligence; it represents the highest achievement of mechatronics and is one of the most active fields of scientific and technological development. With the continuous improvement of robot performance, the application range of mobile robots has been greatly expanded, not only in industry, agriculture, medical care, service, and other industries, but also in harmful and dangerous settings such as urban sec...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/73; G01C21/00; G01C21/20; G06K9/62; G06N3/04; G06N3/08; G06T5/00; G06T7/55
CPC: G06T7/73; G06T7/55; G06N3/08; G01C21/206; G01C21/005; G06T2207/20228; G06T2207/20081; G06T2207/20084; G06V2201/07; G06N3/045; G06F18/22; G06T5/80; Y02T10/40
Inventor: 吴乐天, 王从庆
Owner: NANJING UNIV OF AERONAUTICS & ASTRONAUTICS