Bionic vision self-motion perception map drawing method, storage medium and equipment

A map drawing and visual perception technology, applied in the fields of non-electric variable control, surveying and navigation, and control/regulation systems. It can solve the problems that the two-dimensional position estimation map is distorted, the position coordinates cannot be updated in real time, and the image matching accuracy and positioning accuracy are poor.

Active Publication Date: 2020-10-23
ANHUI UNIVERSITY OF TECHNOLOGY AND SCIENCE


Problems solved by technology

[0004] The purpose of the present invention is to provide a bionic vision self-motion perception map drawing method to solve the problems in the prior art that the position coordinates cannot be updated in real time, the image matching accuracy and positioning accuracy are poor, and the two-dimensional position estimation map is seriously distorted.



Examples


Embodiment 2

[0132] Corresponding to Embodiment 1 of the present invention, Embodiment 2 of the present invention provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the following steps are implemented:

[0133] Step S1, according to the environmental information collected by the bionic visual perception system, construct an episodic memory unit representing a spatial position and an environmental landmark template, and generate a corresponding episodic memory library;

[0134] Step S2, according to the self-motion information collected by the gyroscope and the accelerometer, perform direction and displacement encoding, and update the pose perception information;

[0135] Step S3, using the episodic memory and the environmental landmark template to correct the ego-motion information;

[0136] Step S4, drawing and correcting the bionic self-motion perception two-dimensional environment position estimation map according to the mutual relationship among the environmental landmark template, the episodic memory unit and the self-motion perception information.
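As a rough illustration only (not the patent's actual implementation), steps S1 to S4 might be sketched as follows; every class and function name here is hypothetical, and the landmark templates are stand-in feature vectors:

```python
import math

class EpisodicMemory:
    """Step S1: store (pose, landmark template) pairs as episodic memory units."""
    def __init__(self):
        self.units = []  # episodic memory library: list of (pose, template)

    def add(self, pose, template):
        self.units.append((pose, template))

    def best_match(self, template):
        # Return the stored pose whose template is closest in L2 distance.
        if not self.units:
            return None
        dist = lambda t1, t2: sum((a - b) ** 2 for a, b in zip(t1, t2))
        return min(self.units, key=lambda u: dist(u[1], template))[0]

def update_pose(pose, gyro_rate, accel_speed, dt):
    """Step S2: direction and displacement encoding from gyroscope/accelerometer."""
    x, y, theta = pose
    theta += gyro_rate * dt      # direction encoding (heading integration)
    d = accel_speed * dt         # displacement encoding
    return (x + d * math.cos(theta), y + d * math.sin(theta), theta)

def correct_pose(pose, memory, template, blend=0.5):
    """Step S3: correct ego-motion drift toward the best-matching memory unit."""
    match = memory.best_match(template)
    if match is None:
        return pose
    return tuple(blend * m + (1 - blend) * p for m, p in zip(match, pose))

# Step S4: the sequence of corrected poses forms the 2-D position estimation map.
memory = EpisodicMemory()
pose = (0.0, 0.0, 0.0)
trajectory = [pose]
for template in ([1.0, 0.0], [0.9, 0.1], [1.0, 0.0]):
    pose = update_pose(pose, gyro_rate=0.1, accel_speed=1.0, dt=1.0)
    pose = correct_pose(pose, memory, template)
    memory.add(pose, template)
    trajectory.append(pose)
print(len(trajectory))  # 4 poses drawn into the map
```

The blend factor in `correct_pose` is a simplification of the patent's correction using episodic memory and landmark templates; the real method's weighting is not described in this excerpt.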

Embodiment 3

[0140] Corresponding to Embodiment 1 of the present invention, Embodiment 3 of the present invention provides a computer device, including a memory, a processor, and a computer program stored in the memory and operable on the processor. When the processor executes the program, the following steps are implemented:

[0141] Step S1, according to the environmental information collected by the bionic visual perception system, construct an episodic memory unit representing a spatial position and an environmental landmark template, and generate a corresponding episodic memory library;

[0142] Step S2, according to the self-motion information collected by the gyroscope and the accelerometer, perform direction and displacement encoding, and update the pose perception information;

[0143] Step S3, using the episodic memory and the environmental landmark template to correct the ego-motion information;

[0144] Step S4, drawing and correcting the bionic self-motion perception two-dimensional environment position estimation map according to the mutual relationship among the environmental landmark template, the episodic memory unit and the self-motion perception information.



Abstract

The invention discloses a bionic vision self-motion perception map drawing method, a storage medium and equipment. The method comprises the steps of: building an episodic memory unit representing a spatial position and an environmental landmark template according to the environmental information collected by a bionic vision perception system, and generating a corresponding episodic memory library; acquiring self-motion information from a gyroscope and an accelerometer, performing direction and displacement encoding, and updating the pose perception information; correcting the self-motion information using the episodic memory library and the environmental landmark template; and drawing and correcting the bionic self-motion perception map according to the mutual relationship among the environmental landmark template, the episodic memory unit and the self-motion perception. The method has the advantages that self-motion perception errors can be reduced by using the visual perception template, improving both the precision of self-positioning and the accuracy of closed-loop detection against the episodic memory units.

Description

technical field

[0001] The invention belongs to the technical field of SLAM (simultaneous localization and mapping), and relates to a bionic visual self-motion perception map drawing method, a storage medium and a device.

Background technique

[0002] The core problem of SLAM is that a robot placed in an unfamiliar environment must explore and understand that environment (build a map) while simultaneously using the map to track its own position within it (localization). Traditional solutions to the SLAM problem are mainly based on probabilistic methods, among which the Kalman filter, the particle filter and the expectation-maximization algorithm are the basic approaches. Although these traditional SLAM algorithms still use laser ranging and sonar ranging to collect information, the information collected by these sensors is often inaccurate...
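The Kalman filter named above as a classical probabilistic SLAM tool can be illustrated with a minimal one-dimensional position tracker (an illustrative sketch of the general technique, not the patent's method):

```python
def kalman_step(x, p, u, z, q=0.1, r=0.5):
    """One predict/update cycle of a 1-D Kalman filter.
    x, p : current position estimate and its variance
    u    : odometry motion used in the prediction
    z    : range-sensor position measurement
    q, r : process and measurement noise variances (illustrative values)."""
    # Predict: move by u; uncertainty grows by the process noise q.
    x_pred = x + u
    p_pred = p + q
    # Update: blend prediction and measurement by the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0  # start at the origin with unit variance
for u, z in [(1.0, 1.2), (1.0, 1.9), (1.0, 3.1)]:
    x, p = kalman_step(x, p, u, z)
print(round(x, 2))  # → 3.06
```

The gain `k` shifts weight toward whichever of odometry and measurement is currently less uncertain; the sensor inaccuracy described above shows up as a larger `r`, which makes the filter trust odometry more.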


Application Information

Patent Type & Authority: Application (China)
IPC(8): G05D1/02, G01C21/00, G01C21/20
CPC: G01C21/005, G01C21/20, G01C21/206, G05D1/0221, G05D1/0223, G05D1/0246, G05D1/0276
Inventor: 陈孟元, 田德红
Owner: ANHUI UNIVERSITY OF TECHNOLOGY AND SCIENCE