
Terrain semantic perception method based on vision and vibration tactile fusion

A vision and vibrotactile fusion technology, applied in the field of terrain semantic perception, which solves the problem of insufficient terrain semantic perception ability in existing perception modes and achieves more reliable perception capability

Active Publication Date: 2020-04-03
HARBIN INST OF TECH
Cites: 18 · Cited by: 29

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to solve the problem of insufficient terrain semantic awareness in prior-art perception modes, and to propose a terrain semantic perception method based on the fusion of vision and vibrotactile sensing.



Examples


Embodiment

[0119] 1. Experimental settings

[0120] The Blue Whale XQ unmanned vehicle platform is selected as the experimental test platform, equipped with a Kinect V1.0 depth vision camera whose intrinsic parameters are f_x = 517.306408, f_y = 516.469215, c_x = 318.643040, c_y = 255.313989. The radial distortion coefficients are k_1 = 0.262383, k_2 = -0.953104, k_3 = 1.163314, and the tangential distortion coefficients are p_1 = -0.005358, p_2 = 0.002628. The effective depth range can then be calculated by the following formula:

[0121]
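The formula itself is not reproduced in this excerpt, but the listed intrinsics are enough to illustrate how the camera model is used downstream. Below is a minimal sketch, assuming the standard pinhole model and OpenCV's (k1, k2, p1, p2, k3) distortion ordering; back_project is an illustrative helper, not a function from the patent.

    import numpy as np
    import cv2

    # Kinect V1.0 intrinsics from paragraph [0120].
    K = np.array([[517.306408, 0.0,        318.643040],
                  [0.0,        516.469215, 255.313989],
                  [0.0,        0.0,        1.0]])

    # OpenCV distortion vector order: (k1, k2, p1, p2, k3).
    dist = np.array([0.262383, -0.953104, -0.005358, 0.002628, 1.163314])

    def back_project(u, v, depth_m, K):
        # Pinhole back-projection of one depth pixel to a 3D camera-frame point.
        fx, fy = K[0, 0], K[1, 1]
        cx, cy = K[0, 2], K[1, 2]
        x = (u - cx) * depth_m / fx
        y = (v - cy) * depth_m / fy
        return np.array([x, y, depth_m])

    # Undistort a color frame before feature extraction / segmentation:
    # frame = cv2.imread("rgb.png"); rectified = cv2.undistort(frame, K, dist)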

[0122] During the physical test, the Kinect V1.0 camera acquires color and depth images at 30 Hz, the vibration sensor samples at 100 Hz, feature vectors are produced at 1.6 Hz, and ORB_SLAM2 runs at 15 Hz.
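Since the camera, vibration sensor, feature extractor, and SLAM thread all run at different rates, the fusion step has to associate samples across streams. The patent's exact synchronization scheme is not given in this excerpt; the sketch below shows one common approach, nearest-timestamp pairing, with the rates taken from paragraph [0122].

    import bisect

    def nearest(timestamps, t):
        # Index of the timestamp closest to t (timestamps sorted ascending).
        i = bisect.bisect_left(timestamps, t)
        if i == 0:
            return 0
        if i == len(timestamps):
            return len(timestamps) - 1
        return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

    # Pair each 1.6 Hz feature vector with the nearest 30 Hz RGB-D frame
    # and the nearest 100 Hz vibration sample.
    cam_ts = [k / 30.0 for k in range(300)]    # 30 Hz RGB-D frames
    vib_ts = [k / 100.0 for k in range(1000)]  # 100 Hz vibration samples
    for k in range(16):                        # 1.6 Hz fused feature vectors
        t = k / 1.6
        frame_idx = nearest(cam_ts, t)
        vib_idx = nearest(vib_ts, t)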

[0123] In addition, this experiment sets the depth scale of the point cloud to DepthMapFactor = 1000; the number of ORB feature points extracted from a single frame is nFeature...
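DepthMapFactor = 1000 matches ORB_SLAM2's RGB-D convention of raw depth stored in millimetres. A minimal sketch of the implied conversion follows; the NaN masking of zero-depth pixels is an assumption, not stated in the patent.

    import numpy as np

    DEPTH_MAP_FACTOR = 1000.0  # from [0123]: raw Kinect depth units per metre

    def depth_to_metres(depth_raw):
        # Convert a raw 16-bit Kinect depth image to metres.
        depth_m = depth_raw.astype(np.float32) / DEPTH_MAP_FACTOR
        depth_m[depth_raw == 0] = np.nan  # zero depth means no sensor return
        return depth_m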



Abstract

The invention provides a terrain semantic perception method based on the fusion of vision and vibrotactile sensing. The method comprises the steps of: firstly, giving an implementation of visual three-dimensional semantic mapping based on ORB_SLAM2 and semantic segmentation; secondly, combining this with a terrain semantic classification and recognition method based on CNN-LSTM, and giving the implementation approach and fusion strategy for visual/tactile fusion; and finally, carrying out algorithm tests in a real physical environment on the Blue Whale XQ unmanned vehicle platform equipped with the Kinect V1.0 visual sensing unit and the vibration sensing unit. Comparison of the test results with the real environment shows that the semantic labeling precision of the method meets application requirements. Moreover, comparing the fusion results with and without vibrotactile input clearly shows the difference in terrain semantic cognition, so that fusing vibrotactile sensing with terrain semantic cognition provides the rover with more reliable perception capability, and even under visual failure the vibrotactile channel can still provide terrain cognition within a limited range.
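The abstract names CNN-LSTM as the vibration-based terrain classifier, but this excerpt does not disclose its architecture. The following PyTorch sketch shows the general shape of such a model; every layer size, the class count, and the 62-sample window (about 0.625 s of the 100 Hz vibration stream per 1.6 Hz feature vector) are illustrative assumptions.

    import torch
    import torch.nn as nn

    class VibrationCNNLSTM(nn.Module):
        # Hypothetical CNN-LSTM terrain classifier over 1-D vibration windows.
        # Layer sizes are illustrative; the patent excerpt gives no architecture.
        def __init__(self, n_classes=5, in_channels=1):
            super().__init__()
            self.cnn = nn.Sequential(
                nn.Conv1d(in_channels, 16, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
            self.lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
            self.head = nn.Linear(64, n_classes)

        def forward(self, x):              # x: (batch, channels, samples)
            feats = self.cnn(x)            # (batch, 32, samples // 4)
            feats = feats.transpose(1, 2)  # (batch, time, 32) for the LSTM
            out, _ = self.lstm(feats)
            return self.head(out[:, -1])   # classify from the last time step

    # A 100 Hz window of ~0.625 s gives 62 samples per 1.6 Hz feature vector.
    logits = VibrationCNNLSTM()(torch.randn(8, 1, 62))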

Description

Technical field

[0001] The invention belongs to the technical field of terrain semantic perception, and in particular relates to a terrain semantic perception method based on the fusion of vision and vibrotactile sensing.

Background art

[0002] Research on terrain semantic perception technology has not yet been reported in depth, but there have been studies in the related fields of environmental semantic mapping and semantic segmentation. The following analyzes the current state and trends of the technology from these aspects.

[0003] Kostavelis gave an overview of research on semantic mapping for mobile robots, analyzing it from multiple aspects such as category composition, development trends, and practical applications. In 2016, the Davison team proposed SemanticFusion, a dense three-dimensional semantic mapping method based on convolutional neural networks; by combining a CNN with a dense SLAM system, the improvement from traditional geom...


Application Information

IPC (IPC8): G06T7/246; G06T7/73; G06K9/00; G06K9/34; G06T17/00; G06T17/05; G06T15/06; G06K9/62
CPC: G06T7/246; G06T7/73; G06T17/005; G06T17/05; G06T15/06; G06T2207/10024; G06T2207/10028; G06T2207/30244; G06T2207/30252; G06V20/56; G06V10/267; G06F18/25; G06F18/214
Inventors: Bai Chengchao, Guo Jifeng, Zheng Hongxing, Liu Tianhang
Owner: HARBIN INST OF TECH