Vision positioning method in dynamic environment

A visual positioning technology for dynamic scenes, applied in image data processing, instruments, computation, etc. It addresses the problems of reduced positioning accuracy, positioning errors, and mismatched feature points, so as to reduce errors, increase matching stability, and improve real-time performance and effectiveness.

Inactive Publication Date: 2013-06-12
BEIJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0003] 1: When the environment contains areas of similar texture and a large number of similar repetitive structures, feature points are easily mismatched, resulting in positioning errors.
[0004] 2: When the environment contains a large number of independently moving objects, the feature points located on the dynamic objects interfere with the odometry result, resulting in reduced positioning accuracy.

Method used




Embodiment Construction

[0030] The method of the present invention is described in detail below with reference to the accompanying drawings:

[0031] Step 1: Calibrate the binocular camera and obtain the camera's intrinsic and extrinsic parameters, including the focal length f, the baseline length b, the image-center pixel position u0, v0, the rectification matrix for the whole image, etc.
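The calibration parameters listed in Step 1 are exactly what the later 3D recovery needs. As a minimal sketch (with hypothetical values for f, b, u0, v0), a rectified stereo match can be triangulated with the standard pinhole/disparity model Z = f·b/d:

```python
import numpy as np

# Hypothetical calibration values from Step 1: focal length f (pixels),
# baseline b (metres), and principal point (u0, v0).
f, b = 700.0, 0.12
u0, v0 = 320.0, 240.0

def triangulate(uL, vL, uR):
    """Recover a 3D point from a rectified stereo match.

    For rectified images the match lies on the same row, so the
    disparity is d = uL - uR and depth is Z = f * b / d; X and Y
    follow from the pinhole back-projection of the left pixel."""
    d = uL - uR                      # disparity in pixels (must be > 0)
    Z = f * b / d
    X = (uL - u0) * Z / f
    Y = (vL - v0) * Z / f
    return np.array([X, Y, Z])

# A feature at (400, 240) in the left image and (330, 240) in the right:
P = triangulate(400.0, 240.0, 330.0)
```

The same formula, run over every matched feature pair, yields the 3D feature positions used by the motion estimation later in the pipeline.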

[0032] Step 2: Turn on the binocular camera, continuously collect left and right images, and use the camera parameters obtained in Step 1 to rectify them.
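Rectification in Step 2 is typically applied as a per-pixel lookup through maps precomputed from the calibration. A minimal nearest-neighbour remap sketch (real pipelines interpolate bilinearly; the identity maps here are only a stand-in for the calibrated rectification maps):

```python
import numpy as np

def remap_nearest(img, map_x, map_y):
    """Output pixel (r, c) takes the value of input pixel
    (map_y[r, c], map_x[r, c]), rounded to the nearest neighbour
    and clipped to the image bounds."""
    xs = np.clip(np.rint(map_x).astype(int), 0, img.shape[1] - 1)
    ys = np.clip(np.rint(map_y).astype(int), 0, img.shape[0] - 1)
    return img[ys, xs]

# Identity maps leave the image unchanged; calibrated maps from Step 1
# would instead warp the image onto the rectified (row-aligned) grid.
img = np.arange(16).reshape(4, 4)
mx, my = np.meshgrid(np.arange(4.0), np.arange(4.0))
rectified = remap_nearest(img, mx, my)
```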

[0033] Step 3: If the image is the first frame, extract features from the collected left and right images with the Sobel operator, describe them with SIFT descriptors, and match the left and right images to obtain the correspondence of feature points. If the image is not the first frame, perform feature matching across the four images from the two adjacent moments, and the accuracy of the matching ...
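The loop-back matching over the four images of two adjacent moments can be sketched as a consistency check: follow each feature's match chain L1 → R1 → R2 → L2 → L1 and keep only features that return to their starting point. The match tables and feature ids below are purely illustrative:

```python
def circular_match(m_L1_R1, m_R1_R2, m_R2_L2, m_L2_L1):
    """Keep only features whose match chain closes into a loop.

    Each argument is a dict mapping a feature id in one image to the id
    of its best match in the next image; a broken or inconsistent chain
    drops the feature."""
    kept = []
    for i in m_L1_R1:
        j = m_L1_R1[i]
        k = m_R1_R2.get(j)
        l = m_R2_L2.get(k) if k is not None else None
        back = m_L2_L1.get(l) if l is not None else None
        if back == i:                 # chain returned to its start
            kept.append(i)
    return kept

# Toy tables: feature 0's chain closes, feature 1's does not.
kept = circular_match({0: 10, 1: 11}, {10: 20, 11: 21},
                      {20: 30, 21: 31}, {30: 0, 31: 99})
```

Requiring the loop to close is what makes the matching robust against the repetitive-texture mismatches described in paragraph [0003].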



Abstract

The invention discloses a stereoscopic vision positioning method capable of accurate positioning in a dynamic environment. An environment image is acquired in real time by a binocular camera. Feature points in the image are extracted with a feature extraction algorithm. The four images acquired by the binocular camera at two adjacent moments are stereoscopically matched, and the three-dimensional information of the feature points is recovered using binocular geometry. Matching accuracy is improved by loop-back matching. Feature points on dynamic objects are removed with a scene flow method, whose effect is improved by modelling the error sources and computing a covariance matrix. The motion parameters of the robot are obtained from the feature-point positions by Gauss-Newton iteration, and visual positioning accuracy is further improved with the RANSAC algorithm. The whole process iterates continuously, computing the pose and position of the robot in real time.
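The dynamic-point removal described in the abstract can be sketched as a covariance-weighted gate on the scene-flow residual r = P_curr − (R·P_prev + t): for a static point r is zero up to noise, so points whose Mahalanobis distance exceeds a chi-square threshold are flagged as dynamic. The shared covariance and the threshold below are simplifying assumptions, not the patent's exact formulation:

```python
import numpy as np

def static_point_mask(P_prev, P_curr, R, t, Sigma, chi2_3dof=7.815):
    """Return True for points consistent with the ego-motion (R, t).

    P_prev, P_curr: (N, 3) triangulated feature positions at the two
    moments. Sigma: a (shared, for simplicity) 3x3 residual covariance.
    The 7.815 default is the 95% chi-square quantile for 3 dof."""
    Sinv = np.linalg.inv(Sigma)
    res = P_curr - (P_prev @ R.T + t)            # scene-flow residuals
    d2 = np.einsum('ni,ij,nj->n', res, Sinv, res)  # Mahalanobis distances
    return d2 <= chi2_3dof

# Toy check: with identity ego-motion, a point that moved 1 m is dynamic.
P_prev = np.array([[1.0, 0.0, 5.0], [0.0, 1.0, 4.0]])
P_curr = np.array([[1.0, 0.0, 5.0], [1.0, 1.0, 4.0]])
mask = static_point_mask(P_prev, P_curr, np.eye(3), np.zeros(3),
                         0.01 * np.eye(3))
```

In the full pipeline this gate would run inside the Gauss-Newton/RANSAC loop, so that only points surviving the gate contribute to the next motion estimate.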

Description

Technical field:

[0001] The invention relates to a mobile-robot self-positioning technology involving image processing and machine vision. The technology uses visual information as input to achieve mobile-robot self-positioning in complex dynamic scenes.

Background technique:

[0002] In the field of intelligent robots, mobile-robot positioning and navigation has always been a key research problem. The accuracy and speed of positioning directly affect the efficiency and accuracy with which the robot completes a given task. Earlier positioning methods, such as encoders or wheel odometers, cannot overcome the positioning errors caused by nonlinear factors such as friction and slipping. The visual positioning methods that have emerged in recent years handle such nonlinear errors well, but existing visual positioning methods still have certain defects:

[0003] 1: When there are areas with similar textures and a large number of similar repetitive str...

Claims


Application Information

IPC(8): G06T7/00
Inventor: 贾庆轩, 叶平, 王轩, 孙汉旭, 张天石, 窦仁银
Owner BEIJING UNIV OF POSTS & TELECOMM