
Autonomous unmanned system position identification and positioning method based on sequence image features

A sequence-image recognition and positioning technique in the field of mobile robots. It addresses problems such as heavy computation, oversized maps, and large retrieval time, and achieves the effect of enhanced robustness.

Active Publication Date: 2020-04-07
HUNAN UNIV +1

AI Technical Summary

Problems solved by technology

However, CNN-based image feature descriptors have high dimensionality, which leads to a large amount of computation during similarity measurement; some dimensionality reduction and optimization is therefore usually applied before subsequent operations.
In addition, the map built while moving through a large-scale scene becomes very large, so retrieval tasks consume a great deal of time.
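The dimensionality-reduction step mentioned above can be illustrated with a small PCA sketch. This is a generic illustration, not the patent's specific method; the function name, feature sizes, and the target dimension `k` are all assumptions.

```python
import numpy as np

def pca_reduce(features, k=64):
    """Project D-dim CNN descriptors onto their top-k principal
    components to cut the cost of later similarity measurement."""
    mean = features.mean(axis=0)
    centered = features - mean
    # SVD of the centered feature matrix; rows are descriptors.
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    components = Vt[:k]                      # top-k principal directions
    return centered @ components.T, mean, components

# Illustrative: 200 reference descriptors of dimension 512
rng = np.random.default_rng(0)
feats = rng.standard_normal((200, 512)).astype(np.float64)
reduced, mean, comps = pca_reduce(feats, k=64)
print(reduced.shape)  # (200, 64)
```

Distance computations on the 64-dim projections are then roughly an order of magnitude cheaper than on the raw 512-dim descriptors, at the cost of a small loss in discriminative power.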



Examples


Embodiment 1

[0054] Embodiment 1: The present invention will be further described below in conjunction with the drawings and embodiments.

[0055] Visual position recognition is a method based on two-dimensional images. The images used in the present invention are all RGB images acquired by an ordinary monocular camera, and each data set includes at least two groups of images, collected along the same route at different times and from different viewpoints. The position recognition task based on image sequences rests on the fact that the robot's motion is continuous in time and space: images collected at nearby times can be assumed to be highly similar, so matches for the frames adjacent to the current frame can be sought within the contiguous range around the best matching image for the current frame.
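The temporal-continuity assumption described above can be sketched as a windowed search: each new frame is matched only against references near the previous frame's best match. This is a minimal illustration under stated assumptions, not the claimed algorithm; `cosine_dist`, the window size, and the fallback to a global search for the first frame are all choices made here for the sketch.

```python
import numpy as np

def cosine_dist(a, b):
    """Cosine distance between two feature vectors (0 = identical direction)."""
    return 1.0 - (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def sequence_match(query_feats, ref_feats, window=2):
    """Match each query frame to a reference frame, constraining the
    search to a window around the previous frame's best match."""
    matches = []
    prev = None
    for q in query_feats:
        if prev is None:
            # First frame: no prior, fall back to a global search.
            candidates = range(len(ref_feats))
        else:
            # Temporal continuity: only look near the previous match.
            lo = max(0, prev - window)
            hi = min(len(ref_feats), prev + window + 1)
            candidates = range(lo, hi)
        best = min(candidates, key=lambda i: cosine_dist(q, ref_feats[i]))
        matches.append(best)
        prev = best
    return matches

# Toy data: identical query and reference sequences should match 1:1.
ref = np.eye(10)
que = np.eye(10)
print(sequence_match(que, ref, window=2))  # [0, 1, 2, ..., 9]
```

The window both cuts the per-frame search cost and rejects visually similar but spatially distant references (perceptual aliasing), at the risk of losing track after an abrupt jump.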

[0056] Figure 1 shows the flow chart of the present invention, a method for identifying and locating the position of an autonomous unmanned system based on sequence image features...



Abstract

The invention discloses a position identification and positioning method for autonomous unmanned systems based on sequence image features. First, features of the images to be recognized are extracted with an improved convolutional neural network model; the resulting deep features have strong illumination invariance and viewing-angle invariance, enhancing the robustness of the algorithm to changes in scene conditions and robot viewing angle. Second, a difference measurement method based on image sequences is adopted, which effectively constrains the position recognition of adjacent frames and improves recognition accuracy. Third, an approximate nearest neighbor search method greatly reduces the computation of sequence search and improves efficiency in large-scale environments. Finally, a method of dynamically updating candidate matches effectively reduces omissions in sequence search and improves the fault tolerance of the algorithm. The visual position recognition algorithm has the outstanding advantages of high robustness, high efficiency, and adaptability to a variety of scenes.
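The approximate nearest neighbor idea in the abstract can be illustrated with a crude inverted-file (coarse k-means) index: references are bucketed by nearest centroid, and a query probes only its closest centroid's bucket. This is a generic sketch, not the patent's specific ANN method; the bucket count `k`, the iteration count, and the single-probe strategy are assumptions.

```python
import numpy as np

def build_ivf(ref, k=2, iters=10, seed=0):
    """Crude inverted-file index: k-means centroids plus per-centroid
    buckets of reference indices. Assumes buckets end up non-empty."""
    rng = np.random.default_rng(seed)
    cent = ref[rng.choice(len(ref), k, replace=False)]
    for _ in range(iters):
        # Assign each reference to its nearest centroid.
        assign = np.argmin(((ref[:, None] - cent[None]) ** 2).sum(-1), axis=1)
        # Move each centroid to the mean of its assigned points.
        for c in range(k):
            if (assign == c).any():
                cent[c] = ref[assign == c].mean(axis=0)
    buckets = {c: np.flatnonzero(assign == c) for c in range(k)}
    return cent, buckets

def ann_search(q, ref, cent, buckets):
    """Approximate search: scan only the nearest centroid's bucket,
    instead of all references."""
    c = int(np.argmin(((cent - q) ** 2).sum(-1)))
    ids = buckets[c]
    return int(ids[np.argmin(((ref[ids] - q) ** 2).sum(-1))])

# Toy data: two well-separated clusters of reference descriptors.
ref = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 10.0], [10.1, 10.0]])
cent, buckets = build_ivf(ref, k=2)
print(ann_search(ref[2], ref, cent, buckets))  # 2
```

Because a query scans only one bucket, the cost drops from O(N) to roughly O(k + N/k) distance evaluations, which is what makes sequence search tractable in large-scale maps.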

Description

technical field

[0001] The invention belongs to the field of mobile robots and relates to a position recognition and positioning method for autonomous unmanned systems based on sequence image features.

Background technique

[0002] Realizing long-term autonomous navigation and positioning of robots in dynamically changing environments is one of the main research difficulties and hotspots in mobile robot technology, and performing efficient position recognition during long-term, large-scale motion has become an urgent problem to solve. Vision-based position recognition retrieves and matches the current image acquired by the robot against the reference images in the map to determine the robot's current position in the map. When a robot moves for a long time in a large-scale scene, it operates in a dynamically changing environment; affected by factors such as illumination, seasons, weather, occlusions, moving objects, and shooting angle, the appearance...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/51, G06F16/583, G06K9/46, G06N3/04
CPC: G06F16/51, G06F16/583, G06V10/40, G06N3/045
Inventor: 余洪山王静文蔺薛菲付强王佳龙郭林峰喻逊孙炜刘小燕
Owner: HUNAN UNIV