
Collaborative visual SLAM method based on multiple cameras

A multi-camera visual SLAM technology, applied in image analysis, image data processing, and instruments, addressing problems such as SLAM failure when a large fraction of scene points is moving.

Publication status: Inactive. Publication date: 2016-08-17.

AI Technical Summary

Problems solved by technology

However, traditional SLAM methods tend to fail when the fraction of moving points is relatively large.
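
A back-of-the-envelope illustration (general robust-estimation arithmetic, not part of the patent) makes this concrete: a RANSAC-style pose estimator needs at least one minimal sample drawn purely from static-scene inliers, and the number of iterations required for that grows explosively as moving points take over.

    import math

    def ransac_iterations(inlier_fraction, sample_size=5, confidence=0.99):
        """Iterations needed so that, with probability `confidence`, at
        least one minimal sample is outlier-free (standard formula;
        sample_size=5 matches five-point relative-pose estimation)."""
        p_good = inlier_fraction ** sample_size
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_good))

    # Static-point (inlier) fraction vs. required iterations:
    for w in (0.9, 0.7, 0.5, 0.3):
        print(f"inlier fraction {w:.1f}: {ransac_iterations(w):5d} iterations")

And even with enough iterations, once moving points form a large coherent cluster they can win the consensus vote outright, which is the failure mode described above.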




Embodiment Construction

[0089] In this embodiment, the standard deviation σ of the feature-detection uncertainty is set to 3.0 pixels; the Mahalanobis-distance threshold θ is set to 2.0, which determines whether a feature point is an inlier or an outlier (about 95% confidence under a Gaussian distribution); the ZNCC threshold T_ncc used to measure the similarity between image patches is set to 0.7; the minimum number of frames N_min required to triangulate a feature trajectory is set to 60; the effective map-point cache holds 200 frames; the search radius for finding nearby seed points matching the camera-extrinsic mapping is 10% of the larger of the image width and the image height, and
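
These thresholds can be read as gates in the feature-matching stage. The sketch below is only an illustrative reading of the parameters in [0089], not code from the patent; the numpy implementation and all function names are assumptions.

    import numpy as np

    SIGMA = 3.0   # feature-detection uncertainty in pixels ([0089])
    THETA = 2.0   # Mahalanobis-distance threshold ([0089])
    T_NCC = 0.7   # ZNCC similarity threshold ([0089])

    def zncc(patch_a, patch_b):
        """Zero-mean normalized cross-correlation of two equal-size patches."""
        a = patch_a.astype(np.float64).ravel()
        b = patch_b.astype(np.float64).ravel()
        a -= a.mean()
        b -= b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b) / denom if denom > 0 else 0.0

    def passes_mahalanobis_gate(observed_px, predicted_px, cov=None):
        """Accept a feature whose reprojection residual lies inside the
        Mahalanobis gate; with the isotropic sigma^2 * I covariance assumed
        here, theta = 2.0 keeps roughly the 95% Gaussian mass."""
        if cov is None:
            cov = (SIGMA ** 2) * np.eye(2)
        r = np.asarray(observed_px, float) - np.asarray(predicted_px, float)
        d2 = float(r @ np.linalg.solve(cov, r))
        return np.sqrt(d2) <= THETA

    def patches_match(patch_a, patch_b):
        """Accept a candidate correspondence only above the ZNCC threshold."""
        return zncc(patch_a, patch_b) >= T_NCC

With the isotropic covariance σ²I the gate reduces to ‖r‖ ≤ θσ, i.e. a reprojection radius of 6 pixels for the values above.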

[0090] The present invention will be further described below in conjunction with the accompanying drawings.

[0091] As shown in Figure 1, the multi-camera-based collaborative visual SLAM method includes the following steps:

[0092] 1) When initializing the system, it is a...



Abstract

The present invention provides a multi-camera-based collaborative visual SLAM method, in particular a collaborative visual SLAM method using multiple cameras in a dynamic environment. The method allows the relative position and orientation of the cameras to change over time, so that multiple cameras can move independently and be mounted on different platforms. It solves problems related to camera pose estimation, map point classification, and camera group management, so that it works robustly in dynamic scenes and can reconstruct the 3D trajectories of moving objects. Compared with existing single-camera SLAM methods, the method of the present invention is more accurate and robust, and can be applied to micro-robots and wearable augmented reality.
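
The abstract names map point classification as one ingredient of the method's robustness. The extract does not give the actual classifier, so the following is only a hedged sketch of one plausible rule, reusing the σ = 3.0 px and θ = 2.0 gate from paragraph [0089]: a point whose reprojection residual escapes the gate in any observation is treated as dynamic.

    import numpy as np

    def classify_map_point(point_w, observations, sigma=3.0, theta=2.0):
        """Label a 3D map point 'static' or 'dynamic' from its reprojection
        residuals. Each observation is (K, T_cw, observed_px): 3x3 intrinsics,
        4x4 world-to-camera pose, and the measured pixel location.
        Illustrative rule only, not the patent's classifier."""
        for K, T_cw, observed_px in observations:
            p_c = (T_cw @ np.append(point_w, 1.0))[:3]   # world -> camera frame
            if p_c[2] <= 0:                              # behind the camera
                return "dynamic"
            uv = (K @ (p_c / p_c[2]))[:2]                # pinhole projection
            if np.linalg.norm(observed_px - uv) > theta * sigma:
                return "dynamic"                         # residual outside gate
        return "static"

A real system would presumably smooth this decision over several frames before feeding confirmed dynamic points to the moving-object trajectory reconstruction the abstract describes.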

Description

Technical Field

[0001] The present invention relates to a simultaneous localization and mapping (SLAM) method, and in particular to a collaborative visual SLAM method with multiple moving cameras in a dynamic environment.

Background

[0002] There are two main families of traditional visual SLAM methods: single-camera visual SLAM and multi-camera visual SLAM.

[0003] Single-camera visual SLAM methods are based mainly on structure-from-motion (SFM) techniques or on Bayesian inference, the latter typically implemented with an extended Kalman filter. SFM-based methods produce more accurate results per unit of computation time, while filtering-based methods can be more efficient when processing resources are limited. However, these methods usually do not consider dynamic scenes. Recently proposed methods apply multi-body SFM to handle dynamic environments; however, this approach is only applic...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06T7/20
Inventors: 廖鸿宇, 孙放
Owner: BEIJING ROBOTLEO INTELLIGENT TECH