
Visual and inertial navigation fusion SLAM-based external parameter and time sequence calibration method on mobile platform

Technology areas: mobile platforms, visual technology; applied in navigation, mapping, navigation calculation tools, etc.

Active Publication Date: 2018-12-18
SOUTHEAST UNIV
Cites: 6 · Cited by: 72

AI Technical Summary

Problems solved by technology

[0008] 3. In addition, due to hardware limitations, the camera and IMU have different clocks. How to fuse the camera and IMU under asynchronous conditions is also one of the difficulties in solving this problem.
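The asynchronous-clock problem described above is often handled by estimating a constant time offset between the two sensor streams. The following is a minimal illustrative sketch, not the patent's actual algorithm: it grid-searches a candidate offset `td` (with the assumed model `imu_time ≈ cam_time + td`) that best aligns the camera's visually estimated angular-rate magnitude with the interpolated IMU gyroscope magnitude. All function and parameter names here are hypothetical.

```python
import numpy as np

def estimate_time_offset(cam_t, cam_rate, imu_t, imu_rate,
                         candidates=np.linspace(-0.05, 0.05, 101)):
    """Grid-search the clock offset td (imu_time ~ cam_time + td) that best
    aligns a camera-derived angular-rate signal with the IMU gyro signal.
    Illustrative only; real systems usually refine td inside the estimator."""
    best_td, best_err = 0.0, np.inf
    for td in candidates:
        # Sample the dense IMU signal at the shifted camera timestamps.
        shifted = np.interp(cam_t + td, imu_t, imu_rate)
        err = np.mean((shifted - cam_rate) ** 2)
        if err < best_err:
            best_td, best_err = td, err
    return best_td
```

A coarse grid like this only resolves the offset to the grid spacing; tightly-coupled systems typically add `td` as a variable in the back-end optimization for sub-millisecond accuracy.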




Embodiment Construction

[0094] The technical solution of the present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0095] To address the inability of current monocular visual SLAM systems to estimate metric scale, the present invention adopts a visual-inertial SLAM system that fuses IMU measurements and, targeting application on mobile devices, proposes a method for automatically calibrating the extrinsic parameters between the camera and the IMU, thereby solving the problem of online extrinsic calibration. To address the asynchronous sensor clocks on mobile devices, a method for sensor fusion under asynchronous conditions is proposed. Experimental results show that the proposed method effectively solves the above problems.
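The camera-IMU rotation extrinsic mentioned above is commonly initialized with a hand-eye-style rotation alignment. The sketch below illustrates that standard formulation under common VIO conventions; it is not the patent's exact algorithm. For each consecutive frame pair, IMU gyro integration gives a relative rotation quaternion q_b and the vision front end gives q_c, related by q_b ⊗ q_bc = q_bc ⊗ q_c; stacking these constraints yields a homogeneous linear system whose null space is the extrinsic rotation q_bc, recovered by SVD. All names here are illustrative.

```python
import numpy as np

def quat_mul(p, q):
    # Hamilton product of quaternions stored as [w, x, y, z].
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def left_mat(q):
    # L(q) such that q ⊗ p == L(q) @ p.
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def right_mat(q):
    # R(q) such that p ⊗ q == R(q) @ p.
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def calibrate_rotation(imu_rel_quats, cam_rel_quats):
    """Solve q_b ⊗ q_bc = q_bc ⊗ q_c over all frame pairs:
    stack (L(q_b) - R(q_c)) q_bc = 0 and take the right singular
    vector with the smallest singular value."""
    A = np.vstack([left_mat(qb) - right_mat(qc)
                   for qb, qc in zip(imu_rel_quats, cam_rel_quats)])
    _, _, vt = np.linalg.svd(A)
    q = vt[-1]
    return q / np.linalg.norm(q)
```

The solution is defined only up to sign (q and -q represent the same rotation); in practice the smallest singular value is also monitored to decide when enough rotational motion has been accumulated for the calibration to be well conditioned.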

[0096] A method for calibrating external parameters and timing based on visual and inertial navigation fusion SLAM on a mobile platform of the present in...



Abstract

The invention discloses a visual and inertial navigation fusion SLAM-based external parameter and time sequence calibration method on a mobile platform. The method comprises: an initialization stage, in which the relative rotations between frames estimated by the camera and by the IMU are aligned through a loosely-coupled method, yielding an estimate of the relative rotation between the camera and the IMU; a front-end stage, in which the front end performs the function of a visual odometer, i.e., the pose of the camera's current frame in the world coordinate system is estimated from the camera poses estimated in the preceding frames, and this estimate serves as the initial value for back-end optimization; and a back-end stage, in which key frames are selected from all frames, the variables to be optimized are set, a unified objective function is established, and optimization is carried out under the corresponding constraints, thereby obtaining accurate external parameters. With the method disclosed by the invention, the error of the estimated external parameters is low, and the time-sequence-calibrated trajectory is accurate.
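As a rough illustration of the "unified objective function" referred to in the abstract, a typical tightly-coupled visual-inertial back end minimizes a sum of sensor residuals over the selected keyframes. The formulation below is a generic sketch under standard VIO conventions; the symbols are not taken from the patent text.

```latex
\min_{\mathcal{X}} \;
\underbrace{\sum_{k} \left\| r_{\mathcal{B}}\!\left(\hat{z}_{b_k b_{k+1}}, \mathcal{X}\right) \right\|^{2}_{\Sigma_{b_k}}}_{\text{IMU preintegration residuals}}
\;+\;
\underbrace{\sum_{(l,j)} \left\| r_{\mathcal{C}}\!\left(\hat{z}^{\,c_j}_{l}, \mathcal{X}\right) \right\|^{2}_{\Sigma_{c}}}_{\text{visual reprojection residuals}}
```

Here the state vector $\mathcal{X}$ typically collects the keyframe poses, velocities, IMU biases, the camera-IMU extrinsic parameters, and, when timing is calibrated jointly, the clock offset between the two sensors.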

Description

Technical field

[0001] The present invention relates to a method for fusing camera and IMU (Inertial Measurement Unit) information for positioning and map construction, and in particular to a method for calibrating external parameters and timing based on visual and inertial navigation fusion SLAM on a mobile platform.

Background technique

[0002] With people's increasing demands and the rapid development of robot technology, related fields such as unmanned aerial vehicles, autonomous driving, and augmented reality are making rapid progress. These fields involve a wide range of technologies, including machine control, parallel computing, sensor technology, signal processing, and more. In terms of perception, highly intelligent devices depend on localizing themselves and perceiving their environment. SLAM (Simultaneous Localization and Mapping) technology is the key technology for solving this problem, as it can realize positioning and mapping in unknown ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/16, G01C21/20, G01C25/00
CPC: G01C21/165, G01C21/20, G01C25/005
Inventors: 姚莉 (Yao Li), 王秉凤 (Wang Bingfeng), 吴含前 (Wu Hanqian)
Owner: SOUTHEAST UNIV