Binocular stereo vision matching method based on neural network and operation framework thereof

A technology combining binocular stereo vision and neural networks, applied in the field of neural network-based binocular stereo vision matching methods and their computing frameworks

Inactive Publication Date: 2020-01-31
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

[0005] In order to overcome the problem that binocular stereo vision matching algorithms in the prior art cannot achieve both fast and high-precision matching, the present invention provides a neural network-based binocular stereo vision matching method and an operation framework thereof.


Examples


Embodiment 1

[0060] As shown in Figure 1, an embodiment of the neural network-based binocular stereo vision matching method comprises the following steps:

[0061] Step 1: Construct the neural network computing framework, construct a binary neural network and train it; the 194 pairs of images with accurate ground-truth depth provided by the KITTI2012 database are processed to construct a total of 24,472,099 positive and negative sample pairs, which are used for training to obtain the binary neural network model.
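As an illustration of how such training pairs can be assembled, the sketch below builds positive and negative patch pairs from one rectified stereo pair with ground-truth disparity. The patch size, the negative-offset range, and the function name are illustrative assumptions; the excerpt only states that the sample pairs are derived from the KITTI2012 ground-truth depth images.

```python
import numpy as np

def build_sample_pairs(left, right, disparity, patch=9, neg_offset_range=(4, 10)):
    """Build positive/negative training pairs from one rectified stereo pair.

    left, right : (H, W) grayscale images
    disparity   : (H, W) ground-truth disparity (0 where invalid)
    Returns a list of (left_patch, right_patch, label) with label 1 for the
    correct correspondence and 0 for an offset (wrong) correspondence.
    """
    h, w = left.shape
    r = patch // 2
    samples = []
    ys, xs = np.nonzero(disparity > 0)
    for y, x in zip(ys, xs):
        d = int(round(disparity[y, x]))
        xr = x - d                                  # matching column in the right image
        if y - r < 0 or y + r >= h or x - r < 0 or x + r >= w:
            continue
        if xr - r < 0 or xr + r >= w:
            continue
        lp = left[y - r:y + r + 1, x - r:x + r + 1]
        # positive pair: patch centred on the true correspondence
        rp_pos = right[y - r:y + r + 1, xr - r:xr + r + 1]
        samples.append((lp, rp_pos, 1))
        # negative pair: patch shifted away from the true correspondence
        off = np.random.randint(*neg_offset_range) * np.random.choice([-1, 1])
        xn = int(np.clip(xr + off, r, w - r - 1))
        rp_neg = right[y - r:y + r + 1, xn - r:xn + r + 1]
        samples.append((lp, rp_neg, 0))
    return samples
```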

[0062] Step 2: Input the left image and the right image into the binary neural network for image feature extraction, and obtain a series of binary sequences as feature descriptions of image pixels;
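A minimal sketch of this step, assuming a small convolutional backbone whose channel responses are thresholded into bits; the layer sizes and the 64-bit code length are not given in the excerpt and are chosen here only for illustration (PyTorch):

```python
import torch
import torch.nn as nn

class BinaryDescriptorNet(nn.Module):
    """Maps an image to per-pixel binary codes: one bit per output channel."""
    def __init__(self, bits=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, bits, 3, padding=1),
        )

    def forward(self, x):
        f = self.features(x)            # (N, bits, H, W) real-valued responses
        return (f > 0).to(torch.uint8)  # binarise: one bit per channel per pixel

# usage (assumed shapes): left/right are (N, 1, H, W) tensors in [0, 1]
# net = BinaryDescriptorNet()
# left_bits, right_bits = net(left), net(right)
```

During training one would keep the real-valued responses (or a differentiable surrogate) so gradients can flow; the hard thresholding shown here is what produces the binary sequences used for matching.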

[0063] Step 3: The binary neural network matches the left image and the right image through a matching algorithm, specifically:

[0064] S1: Disparity cost calculation: a similarity computation is performed on the binary sequences of the two pixels, and the resulting similarity score represents the similar...
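The excerpt does not spell out the similarity measure, but for binary descriptors a common choice is the Hamming distance between the two bit strings; the sketch below builds a full disparity cost volume on that assumption (lower cost means higher similarity):

```python
import numpy as np

def hamming_cost_volume(left_bits, right_bits, max_disp=64):
    """Per-pixel matching cost over candidate disparities for binary descriptors.

    left_bits, right_bits : (H, W, B) uint8 arrays of 0/1 bits (B bits per pixel).
    Returns cost of shape (H, W, max_disp); Hamming distance is used as the
    dissimilarity measure, an assumption not stated explicitly above.
    """
    h, w, b = left_bits.shape
    cost = np.full((h, w, max_disp), b, dtype=np.int32)  # worst case: all bits differ
    for d in range(max_disp):
        # left pixel at column x corresponds to right pixel at column x - d
        diff = left_bits[:, d:, :] != right_bits[:, :w - d, :]
        cost[:, d:, d] = diff.sum(axis=2)
    return cost

# winner-takes-all disparity estimate: disp = cost.argmin(axis=2)
```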

Embodiment 2

[0092] A computing framework used in Embodiment 1. In step 1 of Embodiment 1, the neural network computing framework is constructed as a modular neural network computing framework; the data is compressed by channel packing, and the computation time is reduced by laminar flow technology.
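Reading "channel packing" as packing the per-pixel binary channels into machine words (an interpretation, not a quotation of the patent), the sketch below shows how 32 binary channels collapse into one uint32 word, so that the pixel comparison of Embodiment 1 reduces to an XOR plus a popcount:

```python
import numpy as np

def pack_channels(bits):
    """Pack per-pixel binary channels into 32-bit words ("channel packing").

    bits : (H, W, B) array of 0/1 values, with B a multiple of 32 (assumed).
    Returns (H, W, B // 32) uint32 words.
    """
    h, w, b = bits.shape
    assert b % 32 == 0
    words = bits.astype(np.uint32).reshape(h, w, b // 32, 32)
    shifts = np.arange(32, dtype=np.uint32)
    return (words << shifts).sum(axis=3).astype(np.uint32)

def hamming_packed(a, b):
    """Hamming distance between two packed descriptors (1-D uint32 arrays)."""
    x = np.bitwise_xor(a, b)
    # popcount each 32-bit word via its byte view
    return int(np.unpackbits(x.view(np.uint8)).sum())
```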

[0093] Specifically, the modular neural network computing framework is divided, from coarse to fine granularity, into networks, layers, tensors, and data blocks. In this framework, a network is divided into several layers, with the corresponding parameters set in each layer; the data in the framework are stored in tensors, which are in turn stored as data blocks. The network framework uses its own GPU memory management and recovery system: during initialization, the framework requests a block of memory resources from the GPU and manages it internally through memory pointers. The dat...
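A toy sketch of the "allocate once from the GPU, then manage blocks by pointer" idea described above; the class and method names are invented for illustration, and the single buffer here is only a stand-in for a real device-memory reservation:

```python
class DevicePool:
    """Hands out (offset, size) blocks from one up-front reservation."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.free = [(0, capacity)]          # list of (offset, size) free ranges

    def allocate(self, size):
        for i, (off, avail) in enumerate(self.free):
            if avail >= size:
                remainder = (off + size, avail - size)
                self.free[i:i + 1] = [remainder] if remainder[1] else []
                return off                   # "pointer" into the big buffer
        raise MemoryError("pool exhausted")

    def release(self, offset, size):
        # return the range to the free list (no coalescing in this sketch)
        self.free.append((offset, size))

# pool = DevicePool(256 << 20)   # one reservation at framework initialization
# t1 = pool.allocate(4 << 20)    # tensor data blocks carved out of the pool
# pool.release(t1, 4 << 20)
```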



Abstract

The invention relates to a binocular stereo vision matching method based on a neural network and an operation framework thereof. The matching method comprises the following steps: 1. constructing the operation framework of the neural network, constructing a binary neural network and training it; 2. initializing the neural network operation framework; 3. inputting the left image and the right image into the binary neural network for image feature extraction, obtaining a string of binary sequences as the feature description of the image pixels; 4. using the binary neural network in place of a convolutional neural network for image feature extraction. By designing a training mode and an operation framework dedicated to the binary neural network, binocular stereo vision matching achieves higher precision and, at the same time, a higher operation speed.

Description

technical field

[0001] The present invention relates to the field of binocular stereo vision matching algorithms, and more specifically, to a neural network-based binocular stereo vision matching method and an operation framework thereof.

Background technique

[0002] Binocular stereo vision is a passive ranging sensing method designed on the principle of bionics. It captures two pictures at the same time and, through algorithmic calculation, obtains a picture containing pixel-level depth information, which opens up more possibilities for applications.

[0003] Binocular stereo vision is increasingly used in scene reconstruction, event detection, video tracking, target recognition, pose estimation, motion estimation, and other fields. With its advantages of low cost, simple structure and high precision, it is widely applied to non-contact ranging in industrial production, intelligent robot navigation, unmanned vehicles, medical diagnosis, security monitoring and drones, et...


Application Information

IPC(8): G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06N3/045, G06F18/22
Inventor: 陈刚, 孟海涛, 黄凯
Owner: SUN YAT SEN UNIV