Simultaneous localization and mapping method using a residual attention mechanism network
A simultaneous localization and mapping (SLAM) technique combined with an attention mechanism, applicable to neural learning methods, biological neural network models, image analysis, and related fields. It addresses the high redundancy of geometric feature information and yields a model that is easy to understand and solves the six-degree-of-freedom pose estimation problem.
[0095] In this embodiment, the image is first fed into the attention mechanism network. Against cluttered backgrounds, different types of attention are needed to model scenes with complex content and large appearance changes, so features from different layers are modulated by different attention masks. The incremental nature of the stacked network structure gradually sharpens attention on complex images, while the trunk branch performs the feature processing. A subsequent LSTM module ties the attention distribution over the image to the location prediction: to find and exploit correlations between images captured along long trajectories, long short-term memory units, which learn long-term dependencies through their memory gates and cells, are used as the subsequent network structure. Although LSTM gates can handle long-term dependencies and provide a deep temporal structure, the network still requires depth in its lay...
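The residual modulation step described above can be sketched as follows. This is a minimal NumPy illustration, not the patent's implementation: it assumes the standard stacked residual attention form H = (1 + M) · T, where the soft mask M in (0, 1) re-weights the trunk features T while the identity term preserves them. The tensor shapes and the linear 6-DoF pose head are hypothetical, chosen only to make the example concrete.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def residual_attention(trunk_features, mask_logits):
    """Residual attention modulation: H = (1 + M) * T.

    The soft mask M (sigmoid of the mask-branch logits) re-weights the
    trunk features, while the added identity term keeps the original
    signal, so stacking several such modules degrades gracefully.
    """
    mask = sigmoid(mask_logits)          # M in (0, 1), same shape as T
    return (1.0 + mask) * trunk_features  # elementwise modulation

rng = np.random.default_rng(0)
trunk = rng.standard_normal((8, 16, 16))    # hypothetical C x H x W feature map
logits = rng.standard_normal((8, 16, 16))   # mask-branch output (pre-sigmoid)
out = residual_attention(trunk, logits)

# Hypothetical 6-DoF pose head: flatten and project to (tx, ty, tz, rx, ry, rz).
# In the described method this role is played by the LSTM plus a regressor.
W = rng.standard_normal((6, out.size)) * 0.01
pose = W @ out.ravel()
```

Because (1 + M) lies in (1, 2), the modulation never attenuates a trunk activation; it only amplifies features the mask attends to, which is the property that lets attention modules be stacked incrementally.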