
A Robot Navigation Method Based on the Neural Mechanism of Visual Perception and Spatial Cognition

A technology of visual perception and spatial cognition, applied in the field of brain-like navigation, addressing the problem that breakthroughs in this research direction remain scarce owing to the difficulty of brain-science exploration.

Active Publication Date: 2021-05-11
SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

Although this research field provides strong impetus for the further development of robot navigation technology, breakthroughs remain few, owing to a series of factors such as the difficulty of brain-science exploration.

Method used



Examples


Embodiment 1

[0081] Example 1 Robot Navigation under Illumination

[0082] Under illuminated conditions, the system is stimulated by multiple information sources: visual images and robot motion. At time t, many position nodes form the discharge peak region (bump) shown in Figure 2. As shown in Figures 3a-3d, x and y denote the positions in the x and y directions, respectively. The movement of the bump on the spatial cortex is proportional to the movement of the robot in the simulated environment, and the normalized difference is close to 0. The system designed by the present invention is therefore dynamic: it realizes the mapping between the robot's motion plane and the spatial cortex and helps the robot learn a two-dimensional spatial cognitive map of its environment. Referring to Figure 4, when the robot takes different head orientations, the spatial responses of the position nodes are consistent, forming a firing pattern similar to the "place field" of place cells. According...
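
As a rough, hypothetical sketch of the behaviour described above (not the patent's actual network), the following Python snippet models the position nodes' discharge peak as a Gaussian bump on a periodic two-dimensional sheet: the bump is shifted in proportion to the robot's velocity (path integration) and, under illumination, is continuously pulled toward the visually cued position, so the normalized difference between bump motion and robot motion stays close to 0. All sizes, gains, and function names are assumptions.

```python
import numpy as np

# Minimal sketch of the illuminated case: a Gaussian activity bump on a
# periodic 2-D sheet of position nodes is shifted by the robot's velocity
# (path integration) and corrected toward a visually cued position.
# All sizes, gains, and noise levels below are hypothetical.

N = 32                                        # position nodes per axis
coords = np.linspace(0.0, 1.0, N, endpoint=False)
X, Y = np.meshgrid(coords, coords, indexing="ij")

def bump_activity(cx, cy, sigma=0.08):
    """Gaussian bump centred at (cx, cy) on the periodic sheet."""
    dx = np.minimum(np.abs(X - cx), 1.0 - np.abs(X - cx))
    dy = np.minimum(np.abs(Y - cy), 1.0 - np.abs(Y - cy))
    return np.exp(-(dx ** 2 + dy ** 2) / (2.0 * sigma ** 2))

def decode_bump(act):
    """Population-vector decoding of the bump centre (periodic coordinates)."""
    cx = (np.angle(np.sum(act * np.exp(2j * np.pi * X))) / (2 * np.pi)) % 1.0
    cy = (np.angle(np.sum(act * np.exp(2j * np.pi * Y))) / (2 * np.pi)) % 1.0
    return np.array([cx, cy])

def wrap(d):
    """Wrap a displacement on the unit torus into [-0.5, 0.5)."""
    return (d + 0.5) % 1.0 - 0.5

rng = np.random.default_rng(0)
dt, visual_gain = 0.1, 0.3
true_pos = np.array([0.5, 0.5])               # robot position in the arena
bump_pos = np.array([0.5, 0.5])               # bump centre on the spatial cortex

errors = []
for t in range(500):
    v = rng.uniform(-0.05, 0.05, size=2)      # robot velocity (toy trajectory)
    true_pos = (true_pos + v * dt) % 1.0
    # Path integration: the bump moves in proportion to the robot's motion.
    bump_pos = (bump_pos + v * dt) % 1.0
    # Illuminated condition: visual cells cue the true position and
    # continuously pull the bump toward it, correcting any drift.
    bump_pos = (bump_pos + visual_gain * wrap(true_pos - bump_pos)) % 1.0
    decoded = decode_bump(bump_activity(*bump_pos))
    errors.append(np.linalg.norm(wrap(decoded - true_pos)))

# Normalized difference between bump motion and robot motion stays near 0.
print("mean position error:", float(np.mean(errors)))
```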

Embodiment 2

[0083] Example 2 Robot navigation in the dark

[0084] In the dark condition, the visual-image stimulation to the system disappears, but the bump formed by the position nodes in the spatial cortex at time t still exists. As shown in Figures 6a-6b, it exhibits dynamics similar to those in Example 1; that is, the motion of the bump in the spatial cortex remains proportional to the robot's velocity.
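
A minimal sketch of the dark condition, under the same assumptions as the previous snippet: the visual correction term is switched off, so the bump is driven by path integration alone. Its per-step motion still tracks the robot's velocity, and only a slow drift accumulates from (assumed) integration noise.

```python
import numpy as np

# Sketch of the dark case (hypothetical parameters): no visual correction,
# the bump is driven by path integration alone. Per-step bump motion still
# matches the robot's motion; only slow drift accumulates over time.

def wrap(d):
    return (d + 0.5) % 1.0 - 0.5              # displacement on the unit torus

rng = np.random.default_rng(1)
dt = 0.1
true_pos = np.array([0.5, 0.5])
bump_pos = np.array([0.5, 0.5])

per_step_diff, drift = [], []
for t in range(500):
    v = rng.uniform(-0.05, 0.05, size=2)            # robot velocity
    true_pos = (true_pos + v * dt) % 1.0
    # Path integration with a small assumed imperfection and no visual input.
    bump_step = v * dt + rng.normal(0.0, 1e-4, size=2)
    bump_pos = (bump_pos + bump_step) % 1.0
    per_step_diff.append(np.linalg.norm(wrap(bump_step - v * dt)))
    drift.append(np.linalg.norm(wrap(bump_pos - true_pos)))

print("mean per-step difference:", float(np.mean(per_step_diff)))  # stays tiny
print("final drift without vision:", drift[-1])                    # grows slowly
```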

Embodiment 3

[0085] Example 3 Robot navigation under motion noise

[0086] When there is random noise in the robot's velocity information, the system retains its anti-interference ability. As shown in Figures 7a-7d, the normalized error between the bump's motion and the actual robot motion fluctuates within a small range around 0.
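
The snippet below sketches this noise test under the same hypothetical model: Gaussian noise is added to the velocity signal driving the bump, while the visual correction keeps pulling the bump back toward the cued position, so the normalized error stays in a small band around 0. The noise level and gain are assumptions.

```python
import numpy as np

# Sketch of the motion-noise case (hypothetical parameters): Gaussian noise
# corrupts the velocity signal driving the bump, while the visual correction
# of the illuminated example pulls the bump back toward the cued position,
# so the normalized error fluctuates around 0.

def wrap(d):
    return (d + 0.5) % 1.0 - 0.5              # displacement on the unit torus

rng = np.random.default_rng(2)
dt, visual_gain, noise_std = 0.1, 0.3, 0.02
true_pos = np.array([0.5, 0.5])
bump_pos = np.array([0.5, 0.5])

errors = []
for t in range(2000):
    v = rng.uniform(-0.05, 0.05, size=2)                 # true robot velocity
    true_pos = (true_pos + v * dt) % 1.0
    v_noisy = v + rng.normal(0.0, noise_std, size=2)     # corrupted velocity
    bump_pos = (bump_pos + v_noisy * dt) % 1.0
    bump_pos = (bump_pos + visual_gain * wrap(true_pos - bump_pos)) % 1.0
    errors.append(np.linalg.norm(wrap(bump_pos - true_pos)))

print("mean error:", float(np.mean(errors)), "max error:", float(np.max(errors)))
```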



Abstract

The invention relates to a robot navigation method based on the neural mechanisms of visual perception and spatial cognition. Collected visual images are converted by a neural network into visual nodes that represent the robot's position and orientation-angle information, forming visual cells; this encoding is transformed into a spatial description of the environment, and a cognitive map is constructed similar to the one formed in the brain of freely moving mammals; the robot's localization and navigation are then realized according to the cognitive map. Based on the neural computing mechanisms of environmental perception and spatial memory, the robot completes a series of tasks such as visual processing, spatial representation, self-localization, and map updating, realizing robot navigation with high bionic fidelity and strong autonomy in unknown environments. Compared with traditional SLAM technology, the invention avoids a series of complex computations such as manual design of visual features and feature-point matching, and greatly improves the system's robustness to factors such as illumination changes, viewing-angle changes, and object movement in natural environments.
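
As a rough illustration of the pipeline the abstract describes (visual images converted to visual cells, spatial encoding, cognitive-map construction, localization), the following Python sketch wires placeholder stages together. The class name, interfaces, and weighting are hypothetical stand-ins, not the patent's implementation.

```python
import numpy as np

# High-level sketch of the processing pipeline described in the abstract.
# Stage names and interfaces are hypothetical placeholders.

class BrainLikeNavigator:
    def __init__(self, grid_size=64):
        # Cognitive map: accumulated familiarity of visited locations.
        self.cognitive_map = np.zeros((grid_size, grid_size))
        self.grid_size = grid_size
        self.estimate = np.array([0.5, 0.5])   # current position estimate

    def visual_cells(self, image):
        """Convert an image into a visual code for position and heading.
        Placeholder: a real system would use a neural network here."""
        h = abs(hash(image.tobytes())) % 10**6 / 10**6
        return np.array([h, (h * 7.0) % 1.0]), (h * 360.0) % 360.0

    def spatial_encoding(self, visual_pos, velocity, dt=0.1):
        """Fuse path integration with the visual cue (assumed weighting)."""
        self.estimate = (self.estimate + velocity * dt) % 1.0
        self.estimate += 0.3 * (((visual_pos - self.estimate) + 0.5) % 1.0 - 0.5)
        self.estimate %= 1.0
        return self.estimate

    def update_map(self, position):
        """Mark the estimated position as visited on the cognitive map."""
        i, j = (position * self.grid_size).astype(int) % self.grid_size
        self.cognitive_map[i, j] += 1.0

    def step(self, image, velocity):
        visual_pos, heading = self.visual_cells(image)
        position = self.spatial_encoding(visual_pos, velocity)
        self.update_map(position)
        return position, heading

# Toy usage: feed random "images" and velocities through the pipeline.
nav = BrainLikeNavigator()
rng = np.random.default_rng(0)
for _ in range(10):
    frame = rng.integers(0, 255, size=(32, 32), dtype=np.uint8)
    pos, heading = nav.step(frame, rng.uniform(-0.05, 0.05, size=2))
print("estimated position:", pos, "visited cells:", int((nav.cognitive_map > 0).sum()))
```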

Description

Technical field

[0001] The invention relates to a brain-like navigation method; specifically, a system that enables a robot to navigate in unknown environments using the neural computing mechanisms of visual perception and spatial cognition.

Background technique

[0002] Research on robot autonomous navigation mainly focuses on two questions: "where am I" (localization) and "where should I go" (path planning). Although existing navigation technology has solved these two problems to a certain extent, significant defects remain. GPS offers low positioning accuracy and cannot provide navigation information in special or concealed environments such as indoors, underwater, or after disasters; traditional simultaneous localization and mapping (SLAM) relies on expensive sensors such as odometers and lasers, and its spatial perception information is limited. Visual navigation has become a research hotspot in recent years due to its rich source of perceptual information...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G05D1/02
CPC: G05D1/0253; G05D1/0276
Inventor: 斯白露, 赵冬晔
Owner: SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI