
Visual navigation method of mobile robot based on indoor lighting

A mobile robot visual navigation technology in the field of intelligent visual navigation. It addresses the problems that markers are easily occluded, image processing efficiency is low, and the system lags, and achieves the effect of a simple image processing algorithm and improved real-time performance.

Inactive Publication Date: 2019-06-18
XIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] The purpose of the present invention is to provide a mobile robot visual navigation method based on indoor lighting, which solves the prior-art problems that markers are easily occluded, image processing efficiency is low, and the system lags.

Method used



Examples


Embodiment

[0058] The binocular camera is a black-and-white stereo camera with a CMOS sensor. Its main parameters are: a lens spacing (baseline) B of 5 cm and a frame rate of 25 frames per second. The lens parameters are: a focal length f of 4 mm, a sensor size of 1/3 inch, an aperture of F1.6, and a viewing angle of 70.6 degrees.
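The baseline and focal length quoted above fix how disparity maps to depth through the standard stereo relation Z = f·B/d. The following is a minimal Python sketch of that relation only; the image resolution and sensor width used to convert the focal length to pixels are illustrative assumptions, since the patent does not state them.

```python
# Sketch of the pinhole stereo depth relation Z = f * B / d using the
# parameters quoted above. The 640-pixel width and 4.8 mm sensor width for a
# 1/3-inch CMOS are ASSUMPTIONS, not values from the patent.

BASELINE_M = 0.05        # lens spacing B = 5 cm
FOCAL_MM = 4.0           # focal length f = 4 mm
SENSOR_WIDTH_MM = 4.8    # assumed width of a 1/3-inch CMOS sensor
IMAGE_WIDTH_PX = 640     # assumed horizontal resolution

# Focal length expressed in pixels (~533 px under the assumptions above).
focal_px = FOCAL_MM / (SENSOR_WIDTH_MM / IMAGE_WIDTH_PX)

def depth_from_disparity(disparity_px: float) -> float:
    """Return the depth in metres for a disparity given in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * BASELINE_M / disparity_px

# Example: a 20-pixel disparity corresponds to roughly 1.3 m.
print(depth_from_disparity(20.0))
```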

[0059] The left and right cameras are calibrated with a calibration board, and the resulting internal parameters of the binocular camera are shown in Table 1.

[0060] Table 1. Internal parameters of the binocular camera

[0061]
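The entries of Table 1 are not reproduced in this record. As a rough illustration of how intrinsics of this kind are typically obtained with a calibration board, the sketch below runs a checkerboard-based stereo calibration in OpenCV; the board geometry, square size, and file names are assumptions, and this is not the patent's own calibration procedure.

```python
# Hedged sketch: estimate left/right intrinsics and stereo extrinsics from
# checkerboard image pairs with OpenCV. Board size, square size, and image
# file names are ASSUMPTIONS for illustration only.
import glob
import cv2
import numpy as np

BOARD_SIZE = (9, 6)      # inner corners per row/column (assumed)
SQUARE_SIZE = 0.025      # square edge length in metres (assumed)

# 3-D coordinates of the board corners in the board's own frame.
objp = np.zeros((BOARD_SIZE[0] * BOARD_SIZE[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD_SIZE[0], 0:BOARD_SIZE[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points_l, img_points_r = [], [], []
for left_path, right_path in zip(sorted(glob.glob("left_*.png")),
                                 sorted(glob.glob("right_*.png"))):
    gray_l = cv2.imread(left_path, cv2.IMREAD_GRAYSCALE)
    gray_r = cv2.imread(right_path, cv2.IMREAD_GRAYSCALE)
    ok_l, corners_l = cv2.findChessboardCorners(gray_l, BOARD_SIZE)
    ok_r, corners_r = cv2.findChessboardCorners(gray_r, BOARD_SIZE)
    if ok_l and ok_r:
        obj_points.append(objp)
        img_points_l.append(corners_l)
        img_points_r.append(corners_r)

image_size = gray_l.shape[::-1]

# Per-camera intrinsics (camera matrix K and distortion coefficients).
_, K_l, dist_l, _, _ = cv2.calibrateCamera(obj_points, img_points_l, image_size, None, None)
_, K_r, dist_r, _, _ = cv2.calibrateCamera(obj_points, img_points_r, image_size, None, None)

# Extrinsics between the two cameras (rotation R and translation T).
_, K_l, dist_l, K_r, dist_r, R, T, _, _ = cv2.stereoCalibrate(
    obj_points, img_points_l, img_points_r,
    K_l, dist_l, K_r, dist_r, image_size,
    flags=cv2.CALIB_FIX_INTRINSIC)

print("left K:\n", K_l)
print("right K:\n", K_r)
print("baseline (m):", np.linalg.norm(T))
```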

[0062] Using the above method of the present invention to control the position of the mobile robot, the position error of the mobile robot is found to be less than 3%.



Abstract

The invention discloses an indoor-lamp-based mobile robot visual navigation method. The method comprises the following steps: firstly, modeling the plant environment; secondly, calibrating a binocular camera; thirdly, detecting the binocular visual attitude of the mobile robot; and fourthly, navigating with a PID control algorithm. The PID control rule is used to control the movement speed and movement direction of the mobile robot. Throughout the navigation process, the tracking target point of the mobile robot is continuously updated, the current attitude is detected cyclically, and the mobile robot is controlled to run along the planned route according to the PID control rule based on the target position. The method solves the problem that a marker is easily occluded, uses a simple image processing algorithm, and improves real-time performance during navigation.
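To make the fourth step concrete, the sketch below shows one way a PID rule can turn the current pose (from the binocular attitude detection) and the current tracking target point into a speed and heading command. The gains, the pose/target representation, and the 0.04 s step (matching 25 frames per second) are illustrative assumptions, not the patent's specific control law.

```python
# Hedged sketch of a PID-based navigation step: compare the current pose with
# the current tracking target point and output forward and angular speed.
# Gains and interfaces are ASSUMPTIONS for illustration only.
import math

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

speed_pid = PID(kp=0.8, ki=0.05, kd=0.1)     # assumed gains
heading_pid = PID(kp=2.0, ki=0.0, kd=0.2)    # assumed gains

def navigation_step(pose, target, dt=0.04):
    """Return (forward speed, angular speed) toward the current target point."""
    x, y, theta = pose                        # pose from binocular detection
    tx, ty = target                           # current tracking target point
    distance = math.hypot(tx - x, ty - y)
    heading_error = math.atan2(ty - y, tx - x) - theta
    # Wrap the heading error to [-pi, pi].
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    v = speed_pid.step(distance, dt)
    w = heading_pid.step(heading_error, dt)
    return v, w

# Example call for a robot at the origin heading along +x, target at (1, 0.5).
print(navigation_step((0.0, 0.0, 0.0), (1.0, 0.5)))
```

In a loop, the target point would be advanced along the planned route as the robot approaches it, and the pose re-detected each cycle, which is what the abstract describes as continuously changing the tracking target point and detecting the attitude cyclically.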

Description

Technical field

[0001] The invention belongs to the technical field of intelligent visual navigation, and relates to a mobile robot visual navigation method based on an indoor lighting lamp.

Background technique

[0002] Pose estimation is a core issue in the research of mobile robots (or mobile cars, marked as "car" in the drawings below). Accurate pose estimation is of great significance for mobile robot positioning, map generation, path planning, and target detection and tracking. At present, pose estimation methods are mainly divided into two categories: relative pose estimation and absolute pose estimation. Absolute pose estimation presupposes preset environmental information, and its accuracy is relatively high. Landmark positioning is an absolute pose estimation method, mostly used in structured environments; this method mainly relies on computer vision image processing to extract and process the features...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC (8): G05D1/02; G05B11/42
CPC: G05B11/42; G05D1/0231
Inventor: 杨静, 史恩秀, 王宇佳
Owner: XIAN UNIV OF TECH