
Feature extraction and dimension-reduced neural network-based visual SLAM (simultaneous localization and mapping) closed-loop detection method

A feature extraction and neural network technology, applied in the fields of robot vision and mobile robots, which addresses the problems that the environment changes dynamically and that illumination changes affect detection, and achieves the effects of improved accuracy and recall, reduced sensitivity to environmental change, and fast feature extraction.

Active Publication Date: 2019-03-08
BEIJING UNIV OF TECH
Cites: 7 · Cited by: 18

AI Technical Summary

Problems solved by technology

To address the problem that traditional loop closure detection methods are easily affected by dynamic environmental changes and illumination changes, the present invention uses a convolutional neural network model trained on large datasets, so that the network acquires the ability to learn features.

Method used



Examples


Embodiment

[0060] The first step is to build the network model. A convolutional neural network for classification is constructed from the Base-Block unit shown in figure 1 together with a pooling layer and a softmax classification layer; the resulting classification network is shown in Figure 4. The concrete implementation is written with the open-source deep learning framework TensorFlow.
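As a rough illustration of this step, the following is a minimal sketch of such a classification network in TensorFlow/Keras. The internal layout of the Base-Block (assumed here to be convolution, batch normalization and ReLU), the number of blocks and the channel widths are assumptions; the patent's figure 1 and Figure 4 define the actual structure.

```python
# Minimal sketch of the classification network from [0060] in TensorFlow/Keras.
# The Base-Block layout (Conv2D + BatchNorm + ReLU), the number of blocks and
# the channel widths are assumptions, not the structure defined in the patent.
import tensorflow as tf
from tensorflow.keras import layers, models


def base_block(x, filters):
    """Hypothetical Base-Block: 3x3 convolution, batch norm, ReLU."""
    x = layers.Conv2D(filters, 3, padding="same")(x)
    x = layers.BatchNormalization()(x)
    return layers.ReLU()(x)


def build_classifier(num_classes=205, input_shape=(224, 224, 3)):
    inputs = layers.Input(shape=input_shape)
    x = inputs
    for filters in (64, 128, 256):      # assumed channel progression
        x = base_block(x, filters)
        x = layers.MaxPooling2D()(x)    # pooling layer after each block
    x = layers.GlobalAveragePooling2D()(x)
    # softmax classification layer over the 205 scene categories
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)


model = build_classifier()
```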

[0061] In the second step, the classification convolutional neural network built in the first step is trained. The network is trained on the Places205 scene classification dataset, which contains 205 scene categories. The loss function of the network is as follows:

[0062]
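For a softmax classifier over 205 scene categories, a standard choice (assumed here; the patent's exact formula may differ) is the categorical cross-entropy:

```latex
% Assumed form of the classification loss in [0062]: categorical cross-entropy
% over N training samples and K = 205 scene classes.
\[
  \mathrm{loss}_t(\theta) \;=\; -\frac{1}{N}\sum_{i=1}^{N}\sum_{k=1}^{K}
  y_{i,k}\,\log \hat{y}_{i,k}(\theta)
\]
% y_{i,k}: one-hot ground-truth label for sample i, class k;
% \hat{y}_{i,k}(\theta): softmax output of the network with weights \theta.
```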

[0063] The network weights are updated with the Adam algorithm:

[0064] $g_t = \nabla_\theta \,\mathrm{loss}_t(\theta_{t-1})$

[0065] $m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t$

[0066] $v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2$

[0067] $\hat{m}_t = m_t / (1-\beta_1^t)$

[0068] $\hat{v}_t = v_t / (1-\beta_2^t)$

[0069] $\theta_t = \theta_{t-1} - \alpha\, \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)$

[0070] The parameters are set to $\beta_1 = 0.9$, $\beta_2 = 0.999$, $\epsilon = 10^{-8}$. At the initial iteration, set $t = 0$, $m_0 = 0$, $v_0 = 0$, the initi...
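Continuing the earlier sketch, the training setup could be configured as follows in TensorFlow/Keras. The β₁, β₂ and ε values follow the text above; the learning rate, batch size and loss form are assumed placeholders.

```python
# Training configuration sketched from [0061]-[0070], reusing `model` from the
# earlier sketch. beta_1, beta_2 and epsilon follow the stated values; the
# learning rate and the cross-entropy loss are assumptions.
import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(
    learning_rate=1e-3,   # assumed initial value
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-8,
)

model.compile(
    optimizer=optimizer,
    loss="categorical_crossentropy",   # assumed loss form, see [0062]
    metrics=["accuracy"],
)

# model.fit(places205_dataset, epochs=...)  # Places205 input pipeline not shown
```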


Abstract

The invention discloses a visual SLAM loop closure detection method based on feature extraction and a dimension-reducing neural network. A convolutional neural network model is trained on large datasets to give the network the ability to learn features, so that similarity comparison between images can be converted into similarity comparison between feature vectors. To further improve detection speed, an auto-encoder network is attached after the last layer of the convolutional neural network to reduce the dimensionality of the extracted image features. Because the convolutional neural network has properties such as translation invariance and scale invariance, it can effectively overcome the sensitivity of traditional hand-crafted feature extraction to environmental change and achieves a higher feature extraction speed. The method overcomes the shortcomings of traditional visual SLAM loop closure detection methods, namely long feature extraction time and strong susceptibility to environmental and illumination changes, effectively improves the accuracy and recall of loop closure detection, and is of great significance for building globally consistent environment maps.
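As an illustration of converting image comparison into feature-vector comparison, the following is a minimal sketch of a loop-closure check over dimension-reduced feature vectors. The similarity measure (cosine), the function names and the threshold value are assumptions for illustration, not details taken from the patent.

```python
# Illustrative loop-closure check on dimension-reduced feature vectors.
# The cosine similarity measure, names and threshold are assumptions.
from typing import List, Optional

import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def detect_loop(query: np.ndarray,
                keyframe_features: List[np.ndarray],
                threshold: float = 0.9) -> Optional[int]:
    """Return the index of the most similar keyframe if its similarity
    exceeds the threshold, otherwise None (no loop closure)."""
    if not keyframe_features:
        return None
    scores = [cosine_similarity(query, f) for f in keyframe_features]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None
```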

Description

Technical field

[0001] The invention belongs to the category of loop closure detection (Loop Closure Detection) methods within visual simultaneous localization and mapping (Visual Simultaneous Localization and Mapping, VSLAM) algorithms for mobile robots, and falls within the technical field of robot vision.

Background technique

[0002] With the rapid development of artificial intelligence technology in recent years, the closely related field of robotics has also made great progress. Among its branches, mobile robots are a key research direction. Enabling a robot to navigate in an unknown environment is the key basis for autonomous robot motion. After long-term research, researchers have converged on a general algorithmic framework for this problem, namely simultaneous localization and mapping. Depending on the sensors used, it can be divided into simultaneous localization and mapping using...

Claims


Application Information

IPC(8): G01C25/00, G06N3/04, G06N3/08
CPC: G06N3/08, G01C25/00, G06N3/045
Inventors: 阮晓钢, 王飞, 黄静, 朱晓庆, 周静, 张晶晶, 董鹏飞
Owner BEIJING UNIV OF TECH