
Monocular vio (Visual-Inertial Odometry) based mobile terminal AR method

A mobile terminal, monocular-VIO technology, applied in the field of AR, which addresses the problems that the effect depends heavily on the environment, the effect is poor, and it is difficult to accurately estimate the camera pose, so as to achieve a good, stable and drift-resistant effect

Inactive Publication Date: 2018-03-06
NANJING WEIJING SHIKONG INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

Although sparse SLAM is fast, its effect depends heavily on the environment: in a complex environment with many feature points, sparse SLAM works well, but in environments with few feature points, such as white walls and smooth planes, the effect is very poor and it is difficult to accurately estimate the pose of the camera.



Embodiment Construction

[0029] As shown in Figure 1, the present invention provides a mobile terminal AR method based on monocular VIO, comprising the following steps:

[0030] Step 1: use the camera and the IMU to acquire images and inertial data respectively, where the acquisition frequency of the IMU is higher than that of the camera;
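Because the IMU samples faster than the camera, several inertial readings accumulate between two consecutive images. A minimal sketch of how such readings could be grouped by timestamp is given below; the Python helper, its name and the data layout are illustrative assumptions, not taken from the patent.

```python
from bisect import bisect_left, bisect_right

def imu_samples_between(imu_stamps, imu_data, t_frame_k, t_frame_k1):
    """Return the IMU measurements whose timestamps lie between image frame k
    and image frame k+1 (imu_stamps is assumed sorted and parallel to imu_data)."""
    lo = bisect_left(imu_stamps, t_frame_k)
    hi = bisect_right(imu_stamps, t_frame_k1)
    return imu_data[lo:hi]   # usually several samples per image pair
```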

[0031] Step 2: extract feature points from each image frame acquired by the camera, track each feature point with the optical flow method, and set a frame whose feature points are successfully tracked as a key frame;
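A minimal sketch of this tracking step, assuming OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade optical flow; the thresholds, helper names and key-frame criterion below are illustrative assumptions and may differ from the patent's actual implementation.

```python
import cv2

def extract_features(gray, max_corners=150):
    """Detect new corner features (Shi-Tomasi) in the current frame."""
    return cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                   qualityLevel=0.01, minDistance=20)

def track_features(prev_gray, curr_gray, prev_pts):
    """Track feature points into the current frame with pyramidal LK optical flow."""
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
    good = status.reshape(-1) == 1
    return curr_pts[good], int(good.sum())

def process_frames(frames, min_tracked=50):
    """Flag a frame as a key frame when enough feature points were tracked
    successfully from the previous frame (the threshold is an assumption)."""
    prev_gray, prev_pts, key_frames = None, None, []
    for idx, frame in enumerate(frames):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        tracked_pts, n_tracked = None, 0
        if prev_gray is not None and prev_pts is not None and len(prev_pts) > 0:
            tracked_pts, n_tracked = track_features(prev_gray, gray, prev_pts)
        if n_tracked >= min_tracked:
            key_frames.append(idx)            # successfully tracked -> key frame
            prev_pts = tracked_pts
        else:
            prev_pts = extract_features(gray)  # (re)detect features
        prev_gray = gray
    return key_frames
```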

[0032] Step 3: since the IMU frequency is much higher than the rate at which the camera acquires images, several sets of IMU data are usually obtained between two image frames; these IMU measurements are therefore pre-integrated to compute the IMU position and velocity corresponding to the two image frames:

[0033]

[0034]

[0035] In the formula, P_k and P_{k+1} indicate the posit...
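The two pre-integration equations themselves are not reproduced in this excerpt. For orientation only, a commonly used textbook form of the position and velocity propagation between frames k and k+1 (as in VINS-Mono-style pre-integration) is sketched below; the exact form in the patent may differ. Here P_k and V_k are the position and velocity at frame k, R_t the IMU rotation at time t, \hat{a}_t the measured acceleration, b_a the accelerometer bias, g gravity, and \Delta t_k = t_{k+1} - t_k.

```latex
% Assumed standard IMU propagation between image frames k and k+1;
% the integrals are evaluated numerically by summing the IMU samples
% acquired between the two images (hence the pre-integration).
\begin{aligned}
P_{k+1} &= P_k + V_k\,\Delta t_k
          + \iint_{t\in[t_k,\,t_{k+1}]} \bigl(R_t(\hat{a}_t - b_a) - g\bigr)\,dt^2 ,\\
V_{k+1} &= V_k
          + \int_{t_k}^{t_{k+1}} \bigl(R_t(\hat{a}_t - b_a) - g\bigr)\,dt .
\end{aligned}
```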



Abstract

The invention provides a monocular VIO (Visual-Inertial Odometry) based mobile terminal AR method, which comprises the steps of: performing image acquisition and inertial data acquisition respectively by using a camera and an IMU; extracting feature points from each image acquired by the camera, tracking them, and setting the successfully tracked frames as key frames; calculating the corresponding IMU position and rotation information between two images in a pre-integration manner; acquiring the tracked feature points in the key frames and calculating the pose of each key frame from them; calculating the extrinsic parameters between the camera and the IMU from the condition that the displacement and rotation measured by the IMU equal the displacement and rotation measured by the camera between the two frames; initializing the velocity, the acceleration and the scale factor of the IMU; and acquiring the pose of the camera by tightly coupled nonlinear optimization and then superimposing a virtual object to achieve the AR effect. The method uses the IMU to make up for the defects of VSLAM (Visual Simultaneous Localization and Mapping), so it is not affected by environmental feature points over short periods; over longer periods, drift is prevented by aligning the VSLAM and IMU estimates.
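The extrinsic-calibration step mentioned in the abstract, in which the rotation measured by the IMU and by the camera between two frames must agree, is commonly written as a hand-eye-style constraint. The formulation below is an assumed standard form for orientation, not the patent's own equation:

```latex
% Rotation constraint used to solve for the camera--IMU extrinsic rotation
% R_{bc}: the IMU (body) rotation between frames k and k+1, obtained from
% pre-integration, and the camera rotation from visual tracking must agree
% once expressed in a common frame.
R^{b_k}_{b_{k+1}} \, R_{bc} \;=\; R_{bc} \, R^{c_k}_{c_{k+1}}
```

Stacking this constraint over many frame pairs gives an over-determined system that can be solved for R_{bc}, typically in quaternion form.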

Description

Technical field

[0001] The present invention relates to an AR method, and in particular to a mobile terminal AR method based on monocular VIO.

Background technique

[0002] SLAM (simultaneous localization and mapping) comprises two functions: localization and mapping. Mapping serves mainly to understand the surrounding environment and to establish the correspondence between the environment and space; localization mainly judges the pose of the device within the built map, so as to obtain the camera's position in the environment.

[0003] SLAM is divided into dense and sparse approaches. Given the limited computing power of mobile terminals, sparse SLAM is generally chosen. Although sparse SLAM is fast, its effect depends heavily on the environment: in a complex environment with many feature points, sparse SLAM works well, but for environments with fewer feature points such as white walls and smooth planes,...


Application Information

IPC(8): G06T7/80; G06T7/246; G06T7/269
CPC: G06T7/248; G06T7/269; G06T7/80; G06T2207/10016; G06T2207/30244
Inventor: 潘铭星, 冯向文, 孙健, 杨佩星, 付俊国, 雷青
Owner: NANJING WEIJING SHIKONG INFORMATION TECH CO LTD