
Relocating method and apparatus of indoor robot

A relocation technology for indoor robots, applied in the fields of measuring devices, instruments, surveying and navigation, etc. It addresses problems such as positioning accuracy being strongly affected by image quality, the inability to accurately determine the robot's current position information and current attitude information, and limited application scenarios.

Active Publication Date: 2016-11-09
SHENZHEN WEIFU ROBOT TECH CO LTD

AI Technical Summary

Problems solved by technology

However, this vision-based positioning method has the problem that the positioning accuracy is greatly affected by the image quality.
[0007] In the process of realizing the present invention, the inventors found that the related art has at least the following problems: because the indoor robot relocation methods given in the related art suffer from low positioning accuracy and limited application scenarios, the current position information and current attitude information of the robot cannot be determined accurately, and the robot therefore cannot navigate autonomously with precision.



Embodiment Construction

[0086] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. The components of the embodiments of the invention, as generally described and illustrated in the figures herein, may be arranged and designed in a variety of different configurations. Accordingly, the following detailed description of the embodiments of the invention provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without making...



Abstract

The invention provides a relocation method and apparatus for an indoor robot. The method comprises the following steps: controlling a visual sensor to acquire visual image data and controlling a laser sensor to acquire laser point cloud data; judging, according to the laser point cloud data, whether the robot is lost in the current environment map; if so, performing laser relocation of the robot according to the laser point cloud data, performing visual relocation of the robot according to the visual image data, and determining whether a candidate area exists according to the laser relocation result and the visual relocation result; when a candidate area exists, carrying out pose optimization for the robot and determining the current position information and current attitude information of the robot; and when no candidate area exists, controlling the robot to perform obstacle-avoidance motion according to the laser point cloud data until a candidate area is determined. Because the robot is relocated by combining the laser sensor and the visual sensor, the relocation accuracy of the robot is improved, and the robot can navigate autonomously with precision.
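To make the flow in the abstract concrete, the sketch below outlines one relocation cycle in Python. It is a minimal illustration only: the Pose structure, the scan-to-map score, the agreement radius, and the function names (is_lost, candidate_areas, relocation_step) are hypothetical placeholders and are not taken from the patent.

```python
# Hypothetical sketch of the relocation flow described in the abstract.
# All data structures, function names, and thresholds are illustrative
# assumptions, not the patent's actual implementation.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Pose:
    x: float      # position in the map frame (meters)
    y: float
    theta: float  # heading (radians)


def is_lost(scan_to_map_score: float, threshold: float = 0.5) -> bool:
    """Judge, from how well the laser point cloud matches the current
    environment map, whether the robot is lost."""
    return scan_to_map_score < threshold


def candidate_areas(laser_hypotheses: List[Pose],
                    visual_hypotheses: List[Pose],
                    radius: float = 1.0) -> List[Pose]:
    """A candidate area exists where a laser relocation hypothesis and a
    visual relocation hypothesis roughly agree on position."""
    agreed = []
    for lp in laser_hypotheses:
        if any((lp.x - vp.x) ** 2 + (lp.y - vp.y) ** 2 <= radius ** 2
               for vp in visual_hypotheses):
            agreed.append(lp)
    return agreed


def relocation_step(scan_to_map_score: float,
                    laser_hypotheses: List[Pose],
                    visual_hypotheses: List[Pose]) -> Optional[Pose]:
    """One cycle of the combined laser + visual relocation flow."""
    if not is_lost(scan_to_map_score):
        return None                      # still tracking; nothing to do

    candidates = candidate_areas(laser_hypotheses, visual_hypotheses)
    if candidates:
        # Placeholder for the pose-optimization step: take the first
        # agreed-upon hypothesis as the current position/attitude estimate.
        return candidates[0]

    # No candidate area yet: the caller should drive the robot with
    # obstacle avoidance (using the laser point cloud) and retry later.
    return None


if __name__ == "__main__":
    laser = [Pose(2.0, 3.1, 0.1), Pose(8.0, 1.0, 1.5)]
    visual = [Pose(2.3, 2.9, 0.0)]
    print(relocation_step(scan_to_map_score=0.2,
                          laser_hypotheses=laser,
                          visual_hypotheses=visual))
```

The design choice mirrored here is that a candidate area is accepted only where the laser and visual hypotheses agree, which is what allows the combined method to tolerate poor image quality or ambiguous laser scans.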

Description

Technical field [0001] The present invention relates to the technical field of indoor robot control, and in particular to a relocation method and apparatus for an indoor robot. Background technique [0002] At present, a mobile robot perceives the environment and its own state through sensors and then realizes goal-oriented autonomous movement in an environment containing obstacles; this is commonly referred to as the navigation technology of intelligent autonomous mobile robots. Positioning means determining the position of the mobile robot relative to the global coordinates of the working environment, together with its own attitude, and it is the basic link of mobile robot navigation. However, when the robot's system is shut down or powered off and the robot's position and attitude change, the robot cannot determine its position in the map or its own attitude after restarting. At this time, the robot needs to be manually moved to the initial position to restart the system before autonomous navigatio...
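As background for the "position plus attitude" state that positioning must recover, here is a small, generic example of a 2-D pose and the transform it defines from the robot frame into the global map frame. This is standard robotics background, not code from the patent.

```python
# Illustrative example of a 2-D pose (x, y, theta) and the transform it
# induces from the robot frame into the global map frame.

import math


def robot_to_map(pose, point):
    """Transform a point observed in the robot frame into map coordinates.

    pose  -- (x, y, theta): robot position and heading in the map frame
    point -- (px, py): point coordinates in the robot frame
    """
    x, y, theta = pose
    px, py = point
    mx = x + px * math.cos(theta) - py * math.sin(theta)
    my = y + px * math.sin(theta) + py * math.cos(theta)
    return mx, my


# A robot at (1, 2) facing 90 degrees sees an obstacle 1 m straight ahead;
# in the map frame that obstacle lies at (1, 3).
print(robot_to_map((1.0, 2.0, math.pi / 2), (1.0, 0.0)))
```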


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C21/20
CPC: G01C21/206
Inventor: 魏磊磊
Owner: SHENZHEN WEIFU ROBOT TECH CO LTD