Visual odometry method based on convolutional neural network

A visual odometry technology based on a convolutional neural network, applied in the field of navigation and positioning. It addresses problems such as insufficient matched point pairs, odometry estimation errors, and GPS signals failing to provide positioning and navigation, and achieves the effect of robust feature points and accurate mileage or pose estimation.

Active Publication Date: 2019-05-03
ZHEJIANG UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Visual odometry overcomes the problem that traditional GPS-based odometers cannot provide stable and accurate positioning and navigation in indoor environments or in some outdoor environments, especially where the GPS signal is subject to interference or even blocked.
[0003] Most traditional visual odometry methods are based on traditional feature point detection. These methods often focus only on feature points in a local region of a single image and ignore the changes and connections of the environmental background between consecutive frames. As a result, they are easily affected by background changes, producing many mismatched point pairs or an insufficient number of matched point pairs, which ultimately leads to unavoidable odometry estimation errors.




Detailed Description of the Embodiments

[0037] The present invention will be further described below in conjunction with the accompanying drawings.

[0038] Referring to Figure 1 to Figure 5, a visual odometry method based on a convolutional neural network includes the following steps:

[0039] Step 1, collect the original environmental data through the camera carried by the mobile robot, and train the feature point detector A based on the convolutional neural network;

[0040] Step 2, the mobile robot performs the motion whose mileage is to be estimated, and collects the raw data to be estimated with its onboard camera;

[0041] Step 3, perform sampling and cropping preprocessing operations on the data to be estimated collected by the camera to obtain the data to be processed;

[0042] Step 4, use the feature point detector A to filter the data to be detected to obtain the feature point information;

[0043] Step 5, use the feature point information combined with the epipolar constraint method to solve the motion estimation matrix of the moving subject, and further reckon the mileage coordinates.
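As an illustration only, and not the patent's implementation, the following Python sketch shows how steps 3 to 5 could be realized with OpenCV: frames are preprocessed, feature points are detected and matched between consecutive frames, the essential matrix is solved under the epipolar constraint, and the recovered relative poses are chained into mileage coordinates. The camera intrinsics K, the function names, and the use of ORB as a stand-in for the trained CNN detector A are assumptions made for this sketch.

```python
import cv2
import numpy as np

# Assumed example camera intrinsics (KITTI-like values); replace with the
# calibration of the actual onboard camera.
K = np.array([[718.856, 0.0, 607.193],
              [0.0, 718.856, 185.216],
              [0.0, 0.0, 1.0]])

def preprocess(frame, size=(640, 192)):
    """Step 3 (sketch): sampling/cropping preprocessing of a raw frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.resize(gray, size)

def detect_feature_points_A(image):
    """Hypothetical stand-in for the trained CNN detector A of step 4.
    ORB is used here only so the sketch runs end to end."""
    orb = cv2.ORB_create(nfeatures=1000)
    return orb.detectAndCompute(image, None)

def relative_pose(img_prev, img_curr):
    """Step 5 (sketch): match feature points between consecutive frames,
    solve the essential matrix under the epipolar constraint, recover R, t."""
    kp1, des1 = detect_feature_points_A(img_prev)
    kp2, des2 = detect_feature_points_A(img_curr)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

def run_odometry(frames):
    """Chain the per-frame relative poses to reckon the mileage (trajectory)
    coordinates. Monocular scale is ambiguous: each translation is unit-norm."""
    pose_R, pose_t = np.eye(3), np.zeros((3, 1))
    trajectory = [pose_t.ravel().copy()]
    prev = preprocess(frames[0])
    for frame in frames[1:]:
        curr = preprocess(frame)
        R, t = relative_pose(prev, curr)
        pose_t = pose_t + pose_R @ t   # accumulate translation in the world frame
        pose_R = pose_R @ R            # accumulate rotation
        trajectory.append(pose_t.ravel().copy())
        prev = curr
    return np.array(trajectory)
```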



Abstract

The invention relates to a visual odometry method based on a convolutional neural network. The method comprises the following steps: S1, original environment data is collected by a camera carried by a mobile robot, and a feature point detector A based on the convolutional neural network is trained; S2, the motion whose mileage is to be estimated is performed by the mobile robot, and the original data to be estimated is collected by the carried camera; S3, sampling and cropping preprocessing operations are performed on the to-be-estimated data collected by the camera to obtain the to-be-processed data; S4, the feature point detector A is used to filter the to-be-detected data to obtain the feature point information; and S5, the feature point information is used in combination with the epipolar constraint method to solve a motion estimation matrix of the moving subject, and the mileage coordinates are further reckoned. The advantage of the method is that changes in the environmental background between preceding and succeeding frames can be associated during filtering, yielding more stable feature points and higher matching accuracy, and thereby reducing the estimation error of the visual odometry.
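This excerpt does not disclose the architecture of the convolutional feature point detector A. As a hedged illustration of what such a detector could look like, the following Python (PyTorch) sketch defines a small fully convolutional network that outputs a per-pixel keypoint score map, from which local maxima above a threshold are taken as feature points. All class and function names here are hypothetical, and the weights in the usage example are untrained placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureDetectorA(nn.Module):
    """Hypothetical convolutional feature point detector: grayscale image in,
    per-pixel keypoint score map (values in [0, 1]) out."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        self.head = nn.Conv2d(64, 1, kernel_size=1)  # per-pixel keypoint logit

    def forward(self, gray_image):
        # gray_image: tensor of shape (B, 1, H, W), values in [0, 1]
        return torch.sigmoid(self.head(self.backbone(gray_image)))

def extract_keypoints(score_map, threshold=0.5, nms_radius=4):
    """Keep scores that are local maxima (simple non-maximum suppression)
    and above the threshold; returns (N, 4) indices: batch, channel, y, x."""
    pooled = F.max_pool2d(score_map, kernel_size=2 * nms_radius + 1,
                          stride=1, padding=nms_radius)
    keep = (score_map == pooled) & (score_map > threshold)
    return keep.nonzero(as_tuple=False)

# Usage sketch: step S1 would train this detector on the robot's own
# environment data; here the weights are random and the frame is a placeholder.
detector = FeatureDetectorA()
frame = torch.rand(1, 1, 192, 640)
keypoints = extract_keypoints(detector(frame))
```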

Description

Technical field

[0001] The invention relates to the technical field of navigation and positioning, and in particular to a visual odometry method based on a convolutional neural network.

Background technique

[0002] Visual odometry technology is mainly used in robot positioning and navigation. In recent years, with the surge of research in the field of autonomous driving, visual odometry technology has been widely researched and applied. Visual odometry overcomes the problem that traditional GPS-based odometers cannot provide stable and accurate positioning and navigation in indoor environments or in some outdoor environments, especially where the GPS signal is subject to interference or even blocked by buildings.

[0003] Most traditional visual odometry methods are based on traditional feature point detection. These methods often focus only on feature points in a local region of a single image and ignore the changes and connections of the environmental background between consecutive frames. As a result, they are easily affected by background changes, producing many mismatched point pairs or an insufficient number of matched point pairs, which ultimately leads to unavoidable odometry estimation errors.

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G01C22/00G06T7/246
Inventor 潘赟陈佳平包瑶琦杨哲惠思琦吴筱
Owner ZHEJIANG UNIV