
Vision and inertia combined positioning method

A vision and inertia joint positioning technology, applied in the field of positioning. It addresses the problems of unsmooth positioning output and poor robustness, achieves high precision, improves real-time performance and accuracy, and reduces the cumulative effect of drift on precision.

Active Publication Date: 2018-04-20
苏州斯米莱斯智能科技有限公司

AI Technical Summary

Problems solved by technology

[0010] Aiming at the problems of poor robustness and unsmooth positioning output in visual-inertial joint positioning for mobile devices, the present invention provides a visual-inertial joint positioning method that improves positioning accuracy and real-time performance and makes the positioning result smoother.

Method used



Examples


Embodiment

[0040] The joint positioning method mainly uses the inertial components to interpolate between, and mutually correct, the visual positioning results of the mobile device, so as to obtain high-precision, stable positioning information.
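The interpolation role of the inertial components described above can be sketched as follows. This is an illustrative simplification, not the patent's implementation: the function name, the zero-velocity assumption at each visual fix, and the world-frame acceleration input are all assumptions made for the sketch.

```python
import numpy as np

def interpolate_pose(visual_pos, visual_t, imu_samples, query_t):
    """Dead-reckon from the last visual fix using IMU samples.

    visual_pos : np.ndarray, last visual position fix (m)
    visual_t   : timestamp of that fix (s)
    imu_samples: list of (t, accel) tuples, accel already in the world frame (m/s^2)
    query_t    : time at which an interpolated pose is needed (s)

    Assumes zero velocity at the visual fix (illustrative only).
    """
    pos = visual_pos.copy()
    vel = np.zeros(3)
    t_prev = visual_t
    for t, accel in imu_samples:
        if t > query_t:
            break
        dt = t - t_prev
        vel += accel * dt   # integrate acceleration -> velocity
        pos += vel * dt     # integrate velocity -> position
        t_prev = t
    return pos
```

Between two visual fixes, calls like this supply pose estimates at the higher IMU rate, which is what makes the fused output smoother than vision alone.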

[0041] Visual positioning: visual images are collected by an imaging device, changes between different images are detected, and the motion is estimated from those changes to obtain a visual positioning result at a given moment (taking time k as an example).

[0042] Inertial element positioning: the inertial measurement element measures linear acceleration and rotational angular rate; the attitude of the inertial element at a given moment (taking time k as an example) is computed from the attitude changes at consecutive moments, and a predicted positioning result at time k is dead-reckoned from the positioning result at the previous moment (time k-1) together with the inertial element's attitude result ...
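The attitude-propagation step from consecutive angular-rate measurements can be sketched as a first-order update on a rotation matrix. This is a generic exponential-map (Rodrigues) update, not the patent's specific formulation; the function name and interface are hypothetical.

```python
import numpy as np

def propagate_attitude(R_prev, omega, dt):
    """Propagate a rotation matrix by gyro rate omega (rad/s) over dt seconds
    using the Rodrigues rotation formula on the incremental rotation."""
    theta = omega * dt                 # incremental rotation vector
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return R_prev                  # negligible rotation
    axis = theta / angle
    # Skew-symmetric cross-product matrix of the rotation axis
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    dR = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)
    return R_prev @ dR
```

Chaining such updates across consecutive IMU samples yields the attitude at time k from the attitude at time k-1, which in turn drives the dead-reckoned position prediction.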



Abstract

The invention relates to the technical field of positioning, and concretely to a vision and inertia combined positioning method. A visual positioning result is combined with the inertial component's attitude result to dead-reckon a predicted positioning result. When a new visual positioning result arrives, its consistency with the predicted positioning result is judged; if they are consistent, the new visual positioning result, the new predicted positioning results reckoned over a subsequent period, and the predicted positioning result undergo attitude fusion, and the fused attitude is output as the positioning result. The consistency judgment and attitude fusion give the positioning high real-time performance and accuracy and make the positioning result smooth.
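The consistency judgment and fusion described in the abstract can be sketched as a residual-gated blend. The threshold, the blending weight, and the linear-blend fusion are illustrative assumptions for this sketch; the patent does not disclose these specifics here, and its fusion operates on attitudes rather than a simple weighted average.

```python
import numpy as np

def fuse(visual_pos, predicted_pos, threshold=0.5, alpha=0.7):
    """Gate a new visual fix against the IMU-predicted position and,
    if consistent, blend them; otherwise keep the prediction.

    threshold (m) and alpha are illustrative tuning parameters.
    Returns (fused_position, is_consistent).
    """
    residual = np.linalg.norm(visual_pos - predicted_pos)
    if residual <= threshold:
        # Consistent: fuse the two estimates
        return alpha * visual_pos + (1 - alpha) * predicted_pos, True
    # Inconsistent (e.g. a visual outlier): reject the visual fix
    return predicted_pos, False
```

Rejecting inconsistent visual fixes is what gives the method its robustness, while blending consistent ones keeps the output smooth instead of jumping to each new visual result.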

Description

technical field

[0001] The present invention relates to the technical field of positioning, in particular to a combined visual and inertial positioning method.

Background technique

[0002] The process of using sensor data to obtain the three-dimensional position of a target in the environment is called positioning; it plays an important role in transportation and service applications of mobile devices. In general, inertial and vision sensors are the two types of onboard sensors most commonly used on mobile devices. [0003] Visual positioning uses an imaging device to collect environmental images while the mobile device moves, extracts feature points from different images, and estimates the device's motion from the changes of those feature points across successive images. Inertial positioning calculates the position and attitude at the next moment from a known initial position according to ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/16
CPC: G01C21/165
Inventor: 黄广宁, 林欢
Owner: 苏州斯米莱斯智能科技有限公司