
A robot autonomous navigation system and method based on depth image data

A depth-image-data and autonomous-navigation technology, applied in the field of visual control, that addresses the problems of heavy computation and the high cost of processing modules.

Active Publication Date: 2021-07-06
CHANGAN UNIV
View PDF · 15 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0006] In order to overcome the defects of high processing-module cost and heavy computation in prior-art depth-data navigation methods, the purpose of the present invention is to provide a robot autonomous navigation system and method based on depth image data.

Method used


Examples


Embodiment

[0068] As shown in Figure 2, in this embodiment a monocular multi-focal-length robot navigates autonomously on a wheeled motion platform using skid steering. The image processing area is set according to the platform's size parameters and the camera's installation position: the horizontal plane through the camera's lens axis is taken as the base interface, which is then shifted upward in proportion to the ratio of the device's size above the camera to its total size. The region below the base interface is the depth-data image processing area. In the horizontal direction, the image is divided into three main areas (left, middle, and right) according to the widths of the walking wheels and of the platform, each area's share being proportional to that part's fraction of the total width; the vertical direction is based on the minimum ground ...
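The proportional left/middle/right column split described above can be sketched as follows. This is an illustrative reconstruction, not the patent's code; the image width, wheel width, platform width, and the exact proportional rule are assumptions drawn from the paragraph.

```python
# Hypothetical sketch: divide the columns of a depth image into left /
# middle / right processing areas in proportion to the wheel and platform
# widths, as the embodiment describes.

def split_columns(image_width, wheel_width, platform_width):
    """Return (left, middle, right) column counts.

    Each wheel-side strip gets a share proportional to one wheel's width;
    the middle strip covers the remainder of the platform width.
    """
    left = round(image_width * wheel_width / platform_width)
    right = left
    middle = image_width - left - right
    return left, middle, right

# Example with assumed dimensions: a 640-px-wide image, 80 mm wheels,
# 400 mm platform width.
left, middle, right = split_columns(image_width=640,
                                    wheel_width=80,
                                    platform_width=400)
print(left, middle, right)  # prints: 128 384 128
```

The split keeps each strip's pixel count proportional to that part's share of the total width, so the side strips watch the wheel tracks while the middle strip covers the body of the platform.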



Abstract

The invention belongs to the field of visual control technology and relates to a robot autonomous navigation system and method based on depth image data. A depth-image acquisition system captures a depth image of the current environment and processes it with a depth-data processing method to obtain the environment's depth image data. A processing and transmission system processes this data to obtain the real data of each area of the depth image, subdivides each area, sets an early-warning value for each subdivided area, compares each subdivided area's real data against its early-warning value, and generates command information. A control system turns the command information into control instructions and, combined with the robot's state, completes the robot's movement while feeding robot data back to the robot's console. An auxiliary system records the motion control system's motion-state data and uses it as the source for correcting the motion path and for uploading data to the console. The invention overcomes the problems of high processing-module cost and heavy computation.
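The per-area comparison in the abstract (real data vs. early-warning value, then command generation) can be sketched roughly as follows. The warning thresholds, the use of minimum depth as the "real data" statistic, and the turn logic are illustrative assumptions, not the patent's actual values.

```python
import numpy as np

# Hypothetical sketch of the early-warning comparison the abstract
# describes: each subdivided area's measured depth is compared with its
# warning value, and a movement command is generated.

WARN = {"left": 0.6, "middle": 0.8, "right": 0.6}  # metres (assumed)

def area_min_depth(depth, col_ranges):
    """Minimum depth (closest obstacle) per named column range."""
    return {name: float(depth[:, c0:c1].min())
            for name, (c0, c1) in col_ranges.items()}

def command(depths):
    """Blocked middle -> turn toward the more open side; else go forward."""
    if depths["middle"] > WARN["middle"]:
        return "forward"
    return "turn_left" if depths["left"] >= depths["right"] else "turn_right"

depth = np.full((480, 640), 2.0)           # synthetic open scene, 2 m away
depth[:, 250:390] = 0.5                    # obstacle in the middle strip
ranges = {"left": (0, 213), "middle": (213, 427), "right": (427, 640)}
print(command(area_min_depth(depth, ranges)))  # prints: turn_left
```

In the patent's terms, `area_min_depth` stands in for obtaining the "real data" of each subdivided area, and `command` for comparing it against the early-warning value to generate command information for the control system.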

Description

Technical Field

[0001] The invention belongs to the technical field of vision control and relates to a robot autonomous navigation system and method based on depth image data.

Background

[0002] Robot navigation technology obtains information about the environment and the robot's own state through various sensors (such as distance sensors, positioning sensors, and odometers) and plans a path to the target according to one or more optimization criteria (such as minimum work cost, shortest route, or shortest travel time). It forms a complete, autonomous system from perception through execution, able to work independently in the environment much as a human would.

[0003] Existing robot navigation technology mostly uses ranging sensors, which actively emit detection signals, receive the signals reflected by objects in the environment, and obtain scene information by comparing the transmitted and returned signals. The detection...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G05D1/02
CPC: G05D1/0214; G05D1/0223; G05D1/0242; G05D1/0253; G05D1/0255; G05D1/0276
Inventors: 夏晓华, 武士达, 李贺华, 徐光辉, 杨晶晶, 王朋
Owner CHANGAN UNIV