Brain-like robot navigation method based on Bayesian multimodal perception fusion

A navigation method and robot technology, applied in navigation, surveying, and navigation computing tools, etc., to achieve the effects of improved practicability and high biological fidelity

Active Publication Date: 2022-03-22
SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0005] Although some biologically plausible attractor network models have been proposed to explain the mechanism of probabilistic computation in the brain, no neurobiologically inspired Bayesian model of multimodal perception fusion has been applied in robot navigation systems.



Examples


Embodiment approach

[0110] Step 3: calibrating cells perform visual information fusion. After visual information is received, if the current observation differs from all previously observed information, a new topological network node is established and no energy is injected into the head direction cell network or the grid cell network. If the current visual information is similar to visual features seen at an earlier time and the similarity exceeds a threshold, the corresponding visual cells are activated and energy is injected into the grid cell network and the head direction cell network; both networks are injected in a similar manner, thereby changing the reliability (sharpness) of the calibration cells' probability distribution and the position of its mean. Calibration is implemented as follows:

[0111]–[0112] (The two injection equations appear only as images in the source and were not extracted.)

[0113] where the first symbol denotes the strength of the injected energy, and the second denotes the location of the injected energy on the one-dimensional head direction...
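Because the equations in [0111]–[0112] were not extracted, the following is a hedged reconstruction based only on the surrounding description (energy of a given strength injected at a given location on the one-dimensional head direction network, and analogously on the two-dimensional grid cell network) and on the Gaussian-bump injections commonly used in continuous attractor calibration models. The symbols $\alpha$ (injection strength), $\theta_{\mathrm{cal}}$ and $\mathbf{x}_{\mathrm{cal}}$ (injection locations), and $\sigma$ (injection width) are illustrative labels, not the patent's notation:

```latex
E^{\mathrm{HD}}_i = \alpha\,\exp\!\left(-\frac{(\theta_i - \theta_{\mathrm{cal}})^2}{2\sigma^2}\right),
\qquad
E^{\mathrm{grid}}_j = \alpha\,\exp\!\left(-\frac{\lVert \mathbf{x}_j - \mathbf{x}_{\mathrm{cal}} \rVert^2}{2\sigma^2}\right)
```

Under the same assumptions, a minimal Python sketch of the Step-3 logic (view-template matching, topological node creation, and Gaussian energy injection); the data structures, similarity measure, and constants are hypothetical, not the patent's implementation:

```python
import numpy as np

def decode_hd(activity, theta_pref):
    """Population-vector decoding of head direction (phase of the bump)."""
    return np.angle(np.sum(activity * np.exp(1j * theta_pref)))

def decode_grid(activity, grid_pos):
    """Activity-weighted mean of the grid cells' preferred 2-D positions."""
    w = activity / (activity.sum() + 1e-9)
    return w @ grid_pos  # grid_pos: (N, 2) array

def calibrate(feature, templates, hd, grid, theta_pref, grid_pos,
              match_thresh=0.8, alpha=0.1, sigma=0.3):
    """Sketch of Step 3: visual calibration of the HD and grid networks.

    `templates` holds view templates {feature, theta_cal, x_cal} stored
    when topological nodes were created.
    """
    # Find the most similar stored view (cosine similarity).
    sims = [feature @ t["feature"] /
            (np.linalg.norm(feature) * np.linalg.norm(t["feature"]) + 1e-9)
            for t in templates]
    best = int(np.argmax(sims)) if sims else -1

    if best < 0 or sims[best] < match_thresh:
        # Novel view: create a new topological node; inject no energy.
        templates.append({"feature": feature,
                          "theta_cal": decode_hd(hd, theta_pref),
                          "x_cal": decode_grid(grid, grid_pos)})
        return hd, grid

    # Familiar view: inject Gaussian energy at the remembered pose,
    # sharpening each network's distribution and shifting its mean.
    t = templates[best]
    dth = np.angle(np.exp(1j * (theta_pref - t["theta_cal"])))  # wrap angle
    hd = hd + alpha * np.exp(-dth**2 / (2 * sigma**2))
    d2 = np.sum((grid_pos - t["x_cal"])**2, axis=1)
    grid = grid + alpha * np.exp(-d2 / (2 * sigma**2))
    return hd, grid
```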



Abstract

The invention relates to a brain-inspired robot navigation method based on Bayesian multimodal perception fusion, comprising the following steps: integrating cells perform vestibular information fusion, with the firing rates of the integrating cells changing according to the acquired vestibular information; calibrating cells perform visual information fusion, injecting energy into the grid cell network and the head direction cell network and changing the firing rates of the integrating cells and calibrating cells; global inhibition is applied; the robot's actual position and head orientation are obtained by estimating the phases of the firing rates of the grid cell network and the head direction cell network; and a topological map is built. The model designed in the present invention can fuse multimodal perception and achieve stable encoding of the robot's spatial environment and of the robot's own state. The model is consistent with the neural mechanism of head direction in mammals, matches the results of single-neuron recording experiments collected in neurobiology, and has high biological fidelity.
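The abstract's global-inhibition step keeps the attractor bumps bounded after energy injection. A minimal sketch of one plausible form, subtractive global inhibition followed by rectification and normalization; the function name and constant are illustrative assumptions, not the patent's specification:

```python
import numpy as np

def global_inhibition(activity, k=0.05):
    """Subtract a uniform inhibition proportional to total activity,
    rectify, and renormalize so the bump stays stable and unimodal."""
    a = np.maximum(activity - k * activity.sum(), 0.0)
    s = a.sum()
    return a / s if s > 0 else a

# Per-cycle order implied by the abstract: path-integrate (vestibular),
# calibrate (visual), apply global inhibition, then decode position and
# head orientation from the phases of the network bumps.
```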

Description

technical field

[0001] The invention belongs to the field of robot navigation, and in particular relates to a robot navigation method that is inspired by mammalian neurobiology and uses Bayesian inference for multimodal perception fusion.

Background technique

[0002] Multimodal information fusion in robotic navigation systems has been extremely challenging. Because the available perceptual information is usually unreliable and corrupted by noise, multimodal perception fusion can encode the robot's pose and environment more accurately. The two main mechanisms by which animals explore their environments are path integration and landmark calibration. To improve a robot's navigation performance, it is necessary to address the uncertainty caused by the accumulation of path integration errors and by ambiguous landmark perception.

[0003] The fusion of multimodal perception is the key to the precise perception and behavior of animals. Animals are able to explore long distances, navigate com...
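Paragraph [0002]'s claim that fusing cues yields a more accurate encoding can be made concrete with the standard Bayesian combination of two independent Gaussian cues (e.g., a path-integration estimate and a landmark observation); the symbols here are generic, not the patent's notation:

```latex
\mu_{\mathrm{fused}} = \frac{\sigma_2^2\,\mu_1 + \sigma_1^2\,\mu_2}{\sigma_1^2 + \sigma_2^2},
\qquad
\sigma_{\mathrm{fused}}^2 = \frac{\sigma_1^2\,\sigma_2^2}{\sigma_1^2 + \sigma_2^2}
\le \min(\sigma_1^2,\ \sigma_2^2)
```

The fused variance never exceeds that of the better cue, which is why calibration against familiar landmarks can correct accumulated path-integration error.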


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G01C21/16, G01C21/20
CPC: G01C21/165, G01C21/20
Inventors: 斯白露 (Si Bailu), 曾太平 (Zeng Taiping)
Owner SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI