
Topological map generation method based on visual fusion landmarks

A topological map and landmark technology applied in the field of robotics. It addresses problems such as 3D maps that contain no semantic information, semantic information that is easily affected by lighting and other conditions, and the lack of language-based human-computer interaction, achieving fast queries, low cost, and improved robustness.

Active Publication Date: 2020-05-29
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

[0006] To address the problems in the prior art, the present invention proposes a topological map generation method based on visually fused landmarks that can run on embedded airborne or vehicle-mounted platforms. It solves the problem that 3D maps built by existing methods contain no semantic information, and at the same time solves the problem that pixel-level semantic information fused into a visual SLAM system is easily affected by lighting and other conditions and cannot support language-based human-computer interaction.



Examples


Detailed Description of the Embodiments

[0029] Specific embodiments of the present invention are described in more detail below with reference to the accompanying Figures 1 and 2. Although specific embodiments of the invention are shown in the drawings, it should be understood that the invention may be embodied in various forms and is not limited to the embodiments set forth herein. Rather, these embodiments are provided so that the present invention will be understood more thoroughly and its scope fully conveyed to those skilled in the art.

[0030] It should be noted that certain terms are used in the specification and claims to refer to specific components. Those skilled in the art should understand that different terms may be used to refer to the same component. The specification and claims do not distinguish components by differences in naming, but by differences in function. "Includes" or "comprises" mentioned throughout ...



Abstract

The invention discloses a topological map generation method based on visually fused landmarks. The method comprises the steps of: inputting an RGB image, performing semantic segmentation of the RGB image with a convolutional neural network, and extracting points whose values in a feature map of an intermediate semantic-segmentation layer exceed a preset value as feature points; obtaining a depth map of the scene and, from the depth information and the two-dimensional image coordinates of the feature points, computing the three-dimensional coordinates of the feature points and the pose of the robot according to a camera model to build a three-dimensional map of the scene; performing deep-learning-based texture segmentation on the input image to obtain a code for each pixel in a texture feature space; applying a landmark-level data fusion method based on fuzzy mathematics to obtain a membership degree distribution function of the point cloud relative to predetermined landmarks, and combining this semantic membership degree distribution function with the three-dimensional map to obtain a three-dimensional semantic map fused with the landmarks; and constructing a topological map from the landmark-fused three-dimensional semantic map to generate a landmark-fused semantic topological map.
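
As an illustration of the feature-point extraction step described above, the following minimal Python sketch thresholds the activations of an intermediate layer of a generic semantic-segmentation network. The choice of network (torchvision's FCN-ResNet50), the hooked layer, the threshold, and the extract_feature_points helper are assumptions made for illustration; the patent does not specify them.

    import torch
    import torch.nn.functional as F
    import torchvision

    # Generic semantic-segmentation backbone as a stand-in for the patent's
    # unspecified network (pretrained weights would be loaded in practice).
    model = torchvision.models.segmentation.fcn_resnet50(weights=None).eval()

    features = {}

    def save_activation(name):
        # Forward hook that stores an intermediate feature map.
        def hook(module, inputs, output):
            features[name] = output.detach()
        return hook

    # Hooking backbone layer3 is an arbitrary choice of intermediate layer.
    model.backbone.layer3.register_forward_hook(save_activation("mid"))

    def extract_feature_points(rgb, response_threshold=0.8):
        """Return (row, col) pixel coordinates whose intermediate-layer
        response exceeds a preset value, as described in the abstract."""
        with torch.no_grad():
            model(rgb)                                  # rgb: (1, 3, H, W), normalized
        fmap = features["mid"]                          # (1, C, h, w)
        response = fmap.abs().max(dim=1).values         # strongest channel response
        response = response / (response.max() + 1e-8)   # normalize to [0, 1]
        # Upsample the response map back to image resolution.
        response = F.interpolate(response.unsqueeze(1), size=rgb.shape[-2:],
                                 mode="bilinear", align_corners=False)[0, 0]
        ys, xs = torch.nonzero(response > response_threshold, as_tuple=True)
        return torch.stack([ys, xs], dim=1)             # (N, 2) pixel coordinates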
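
The lifting of the extracted 2D feature points into 3D using the depth map and a pinhole camera model can be sketched as follows. The function name backproject and the camera-to-world pose argument are illustrative assumptions, not the patent's notation.

    import numpy as np

    def backproject(points_uv, depth, K, T_wc=np.eye(4)):
        """Lift 2D feature points to 3D with the depth map and a pinhole
        camera model, then transform into the world frame with the pose T_wc.

        points_uv : (N, 2) array of (row, col) pixel coordinates
        depth     : (H, W) depth map in meters
        K         : (3, 3) camera intrinsic matrix
        T_wc      : (4, 4) camera-to-world pose (e.g. from the SLAM front end)
        """
        fx, fy = K[0, 0], K[1, 1]
        cx, cy = K[0, 2], K[1, 2]
        rows, cols = points_uv[:, 0], points_uv[:, 1]
        z = depth[rows, cols]
        valid = z > 0                                    # discard missing depth
        z, rows, cols = z[valid], rows[valid], cols[valid]
        x = (cols - cx) * z / fx
        y = (rows - cy) * z / fy
        pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=1)   # homogeneous
        pts_world = (T_wc @ pts_cam.T).T[:, :3]
        return pts_world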
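
For the fuzzy-mathematics landmark fusion and the final topological map, one possible reading is sketched below: a Gaussian membership function over distances in the texture/semantic code space, landmark nodes at the centroids of their dominant points, and edges between landmarks within a fixed radius. The specific membership function, the networkx graph representation, and the distance-based edge rule are assumptions, since the abstract only names a fuzzy landmark-level fusion.

    import numpy as np
    import networkx as nx

    def landmark_membership(point_codes, landmark_prototypes, sigma=1.0):
        """Fuzzy membership of each 3D point to each predetermined landmark.

        point_codes         : (N, D) per-point codes in the texture/semantic space
        landmark_prototypes : (L, D) one prototype code per landmark
        Returns an (N, L) membership matrix; a Gaussian membership function
        is assumed here, the patent does not state its exact form.
        """
        d2 = ((point_codes[:, None, :] - landmark_prototypes[None, :, :]) ** 2).sum(-1)
        mu = np.exp(-d2 / (2.0 * sigma ** 2))
        return mu / (mu.sum(axis=1, keepdims=True) + 1e-8)   # distribution per point

    def build_topological_map(points_xyz, membership, landmark_names, edge_radius=3.0):
        """Collapse the labeled point cloud into a landmark-level topological
        graph: one node per landmark (centroid of its dominant points), edges
        between landmarks whose centroids lie within edge_radius meters."""
        labels = membership.argmax(axis=1)
        graph = nx.Graph()
        for idx, name in enumerate(landmark_names):
            pts = points_xyz[labels == idx]
            if len(pts) == 0:
                continue
            graph.add_node(name, position=pts.mean(axis=0))
        nodes = list(graph.nodes(data="position"))
        for i in range(len(nodes)):
            for j in range(i + 1, len(nodes)):
                if np.linalg.norm(nodes[i][1] - nodes[j][1]) < edge_radius:
                    graph.add_edge(nodes[i][0], nodes[j][0])
        return graph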

Description

Technical field

[0001] The invention belongs to the technical field of robots, and in particular relates to a method for generating topological maps based on visually fused landmarks.

Background technique

[0002] Mapping technology is crucial for intelligent robots to explore a task environment and complete various tasks autonomously. While moving, an intelligent robot collects environmental information through its sensors, analyzes the data, and perceives the environment, which helps it make decisions autonomously according to the needs of the task and the current state of the environment, and then complete various tasks to achieve true intelligence. In the field of robotics, SLAM (simultaneous localization and mapping) is a key technology for robots to build maps and perceive the environment. While traveling, the robot uses sensor data such as lidar and camera to estimate its position and attitude at that time, and b...


Application Information

IPC (8): G06T17/05; G06T19/20; G06T7/10; G06T7/41; G06K9/62
CPC: G06T17/05; G06T19/20; G06T7/10; G06T7/41; G06T2207/10028; G06F18/24155; G06F18/25
Inventors: 任鹏举, 李创, 丁焱, 赵子瑞, 毛艺钧, 郑南宁
Owner XI AN JIAOTONG UNIV