
Brainlike robot navigation method based on Bayes multi-mode perception fusion

A navigation method and robot technology, applied in navigation, mapping, and navigation calculation tools, achieving high biological fidelity and improved practicability.

Active Publication Date: 2019-03-05
SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0005] Although some biologically similar attractor network models have been proposed to demonstrate the mechanism of probabilistic computation in the brain, no neurobiologically inspired Bayesian model of multimodal perception fusion has been used in robot navigation systems.

Method used



Examples


Embodiment approach

[0110] Step 3: the calibration cells fuse the visual information. After visual information is received, if the current observation differs from all previously observed views, a new topological network node is created, and no energy is injected into the head direction cell network or the grid cell network. If the current visual information is similar to visual features seen at some earlier time and the similarity exceeds a threshold, the corresponding visual cells are activated and energy is injected into both the grid cell network and the head direction cell network; the two networks are injected in a similar manner, which changes both the confidence (height) and the mean (position) of the calibration cells' probability distribution. The calibration is implemented as follows:

[0111]-[0112] (The calibration equations are rendered as images in the source and are not reproduced here.)

[0113] where one symbol indicates the strength of the injected energy, and another indicates the location of injected energy on the one-dimensional head-orient...
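The injection described in [0110] can be sketched in code. This is a hedged illustration, not the patent's exact equations (which are not reproduced above): the strength parameter `alpha`, the centre `mu`, and the Gaussian width `sigma` are assumed names for the quantities [0113] alludes to, and the Gaussian bump shape is a common modeling choice for energy injection into a one-dimensional ring of head direction cells.

```python
import numpy as np

# Hypothetical sketch of the calibration step in [0110]: when a familiar
# visual cue is recognized, a Gaussian-shaped packet of energy is injected
# into the one-dimensional head direction ring. `alpha` (strength), `mu`
# (location), and `sigma` (width) are assumed parameter names.

N = 100                                                  # head direction cells
theta = np.linspace(0.0, 2 * np.pi, N, endpoint=False)   # preferred directions

def inject_energy(rates, alpha, mu, sigma=0.3):
    """Add a circular Gaussian bump of amplitude `alpha` centred on `mu`."""
    # circular distance between each cell's preferred direction and mu
    d = np.angle(np.exp(1j * (theta - mu)))
    bump = alpha * np.exp(-d**2 / (2 * sigma**2))
    return rates + bump

rates = np.zeros(N)
rates = inject_energy(rates, alpha=1.0, mu=np.pi / 2)
print(theta[np.argmax(rates)])  # peak sits at the injected location, pi/2
```

Because the distance is computed on the circle, a bump injected near 0 or 2π wraps around correctly, which matters for a ring attractor.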



Abstract

The invention relates to a brain-like robot navigation method based on Bayesian multimodal perception fusion, comprising the following steps: the integration cells fuse vestibular information, and the integration cell spike rates are updated from the obtained vestibular input; the calibration cells fuse visual information; energy is injected into the grid cell network and the head direction cell network, changing the spike rates of the integration cells and calibration cells; global inhibition is applied; the phases of the spike-rate patterns of the grid cell network and the head direction cell network are estimated to obtain the robot's current position and head direction; and a topological map is constructed. The disclosed model performs multimodal perception fusion and stably encodes both the robot's spatial environment and its own state; its head direction mechanism matches that of mammals, its results resemble single-neuron recording results from neurobiology, and it achieves very high biological fidelity.
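The abstract's decoding step, estimating the phase of the head direction network's spike-rate pattern to recover the robot's heading, can be illustrated with a standard population-vector readout. This is a sketch of one common estimator, not necessarily the patent's own; the network size and noise model are assumptions.

```python
import numpy as np

# Hedged sketch of the decoding step: recover heading as the phase of the
# population activity of head direction cells (population-vector readout).

N = 60
preferred = np.linspace(0.0, 2 * np.pi, N, endpoint=False)

def decode_heading(rates):
    """Circular mean of preferred directions, weighted by spike rates."""
    z = np.sum(rates * np.exp(1j * preferred))
    return np.angle(z) % (2 * np.pi)

# Example: a noisy Gaussian bump of activity centred on 1.0 rad
rng = np.random.default_rng(0)
true_heading = 1.0
d = np.angle(np.exp(1j * (preferred - true_heading)))
rates = np.exp(-d**2 / (2 * 0.2**2)) + 0.01 * rng.random(N)
print(decode_heading(rates))  # close to 1.0
```

The same readout applies per axis to a grid cell sheet, giving the periodic position phases mentioned in the abstract.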

Description

Technical field

[0001] The invention belongs to the field of robot navigation, and in particular relates to a robot navigation method inspired by mammalian neurobiology that uses Bayesian inference for multimodal perception fusion.

Background technique

[0002] Multimodal information fusion in robotic navigation systems is extremely challenging. Because the available perceptual information is usually unreliable and corrupted by noise, multimodal perception fusion can encode the robot's pose and environment more accurately. The two main mechanisms by which animals explore their environments are path integration and landmark calibration. To improve a robot's navigation performance, it is necessary to address the uncertainty caused by accumulated path-integration error and by ambiguous landmark perception.

[0003] The fusion of multimodal perception is the key to the precise perception and behavior of animals. Animals are able to explore long distances, navigate com...
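The path integration mechanism mentioned in [0002] can be made concrete with a minimal dead-reckoning sketch. This is an illustration of the general mechanism, not the patent's implementation: position and heading are updated from self-motion cues alone, which is why per-step errors accumulate until a landmark observation calibrates the estimate.

```python
import math

# Minimal illustration of path integration (dead reckoning): integrate
# (turn, distance) self-motion commands into a pose estimate. With no
# landmark calibration, any per-step noise would accumulate without bound.

def path_integrate(steps, x=0.0, y=0.0, heading=0.0):
    """Dead-reckon through a sequence of (turn, distance) motion commands."""
    for turn, dist in steps:
        heading += turn                  # integrate angular velocity
        x += dist * math.cos(heading)    # integrate linear velocity
        y += dist * math.sin(heading)
    return x, y, heading

# Drive a 10 x 10 square: four sides with 90-degree left turns
square = [(0.0, 10.0)] + [(math.pi / 2, 10.0)] * 3
x, y, h = path_integrate(square)
print(round(x, 6), round(y, 6))  # returns to the origin when noise-free
```

Adding even small Gaussian noise to each `turn` makes the final pose drift, which is exactly the uncertainty the Bayesian fusion with visual landmarks is meant to correct.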


Application Information

IPC(8): G01C21/16, G01C21/20
CPC: G01C21/165, G01C21/20
Inventors: 斯白露 (Bailu Si), 曾太平 (Taiping Zeng)
Owner: SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI