Sampling inertial guidance-based visual IMU direction estimation method

A direction-estimation and inertial-guidance technology, applied in computing, instrumentation, and electrical digital data processing, that addresses problems such as low precision, long-term error accumulation, and time-consuming matching.

Active Publication Date: 2017-05-24
SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI


Problems solved by technology

[0004] In view of the low accuracy and long-term error accumulation of existing IMU direction estimation methods, and the time-consuming matching and mismatching problems of vision-based direction estimation methods, the present invention proposes a visual IMU direction estimation method based on inertial guidance sampling. The method makes full use of the direction estimation information of the IMU to guide the sampling of matching point pairs and the removal of mismatched points in visual direction estimation, which ensures the accuracy of visual IMU direction estimation and improves its operation speed.
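As a rough illustration of the guided-sampling idea described above, the sketch below (my own construction, not the patent's exact algorithm) uses an IMU-predicted relative rotation to discard putative feature matches that are grossly inconsistent with that prior before any visual direction estimation is run. The function name, the bearing-vector formulation, and the angular threshold are assumptions, and the sketch ignores translation-induced parallax, which the full method would have to handle.

    import numpy as np

    def prune_matches_with_imu_prior(pts1, pts2, K, R_imu, angle_thresh_deg=5.0):
        """Keep matches whose bearing change agrees with the IMU-predicted rotation.

        pts1, pts2: (N, 2) pixel coordinates of putative matches in two frames.
        K:          (3, 3) camera intrinsic matrix.
        R_imu:      (3, 3) relative rotation between the frames predicted by the IMU.
        Returns a boolean inlier mask of length N.
        """
        K_inv = np.linalg.inv(K)
        ones = np.ones((pts1.shape[0], 1))
        # Back-project pixels to unit bearing vectors in each camera frame.
        b1 = (K_inv @ np.hstack([pts1, ones]).T).T
        b2 = (K_inv @ np.hstack([pts2, ones]).T).T
        b1 /= np.linalg.norm(b1, axis=1, keepdims=True)
        b2 /= np.linalg.norm(b2, axis=1, keepdims=True)
        # Rotate frame-1 bearings by the IMU prior and measure the angle to frame 2.
        b1_pred = (R_imu @ b1.T).T
        cos_err = np.clip(np.sum(b1_pred * b2, axis=1), -1.0, 1.0)
        err_deg = np.degrees(np.arccos(cos_err))
        return err_deg < angle_thresh_deg

Matches surviving such a pre-filter would then feed the iterative visual direction estimation, with the IMU estimate serving as the initial value.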




Embodiment Construction

[0067] The present invention will be further described in detail below in conjunction with the accompanying drawings and examples.

[0068] The present invention is mainly divided into three parts. Figure 1 shows the principle diagram of the method of the present invention, and the specific implementation process is as follows.

[0069] Step 1: IMU orientation estimation based on gain-adaptive complementary filter.

[0070] The IMU contains three main sensors: a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer. Attitude estimation of the IMU comprises direction estimation from each of the three types of sensors, after which their estimated values are fused.
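A minimal sketch of the kind of gain-adaptive complementary fusion this step refers to, under my own assumptions (the patent's exact gain schedule and fusion equations are not reproduced in this extract): the gyroscope propagates the quaternion, and an accelerometer/magnetometer-derived quaternion corrects it with a gain that shrinks when the measured acceleration deviates from gravity.

    import numpy as np

    def quat_multiply(a, b):
        # Hamilton product of two quaternions given as [w, x, y, z].
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return np.array([
            w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
        ])

    def complementary_update(q, gyro, q_am, dt, accel_norm, g=9.81, base_gain=0.02):
        """One filter step: gyro propagation corrected by an accel/mag quaternion q_am."""
        # Gyroscope propagation via the quaternion derivative q_dot = 0.5 * q (x) [0, w].
        q_dot = 0.5 * quat_multiply(q, np.array([0.0, *gyro]))
        q_gyro = q + q_dot * dt
        q_gyro /= np.linalg.norm(q_gyro)
        # Adaptive gain: trust accel/mag less when |a| deviates from gravity,
        # e.g. during fast motion. This particular gain rule is illustrative only.
        alpha = base_gain / (1.0 + abs(accel_norm - g))
        # Complementary blend (simple normalized linear interpolation).
        q_new = (1.0 - alpha) * q_gyro + alpha * q_am
        return q_new / np.linalg.norm(q_new)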

[0071] Step 1.1: Compute the orientation estimate from the gyroscope measurements.

[0072] Step 1.1.1: Solve for the rate of change (i.e. the derivative) of the quaternion describing the direction of the IMU at time t+Δt. The formula is as follows:

[0073]
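The formula body for [0073] is not reproduced in this extract. The standard gyroscope propagation rule used in quaternion-based complementary filters, which I assume is what this step computes, is:

    \dot{q}_{\omega,\, t+\Delta t} = \frac{1}{2}\, q_t \otimes \begin{bmatrix} 0 & \omega_x & \omega_y & \omega_z \end{bmatrix}

where q_t is the orientation quaternion at time t, ⊗ denotes quaternion multiplication, and (ω_x, ω_y, ω_z) are the gyroscope angular rates measured at time t+Δt.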

[0074] Among them, the q in the quate...



Abstract

The invention provides a sampling inertial guidance-based visual IMU direction estimation method, in which direction estimation information from an IMU (Inertial Measurement Unit) guides the sampling of matching point pairs and the removal of mismatched points in visual direction estimation. The method comprises three steps: gain-adaptive complementary filter-based IMU direction estimation, scale- and rotation-invariant feature detection, and visual-IMU fusion-based direction estimation. By adopting a gain-adaptive complementary filter and removing clearly mismatched point pairs in the initial iterations, the accuracy of direction estimation is improved. By introducing the IMU pose estimate as the initial value of the visual direction estimation and iteratively removing mismatched point pairs, the estimation process is accelerated and the large computational cost of a random initial value is avoided. The method has wide applicability, good robustness, and high accuracy, and can be widely applied to motion capture for human body rehabilitation training.
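The second step in the abstract names scale- and rotation-invariant feature detection, but this extract does not say which detector the patent uses. The sketch below is therefore only an illustration of that step using SIFT via OpenCV, with Lowe's ratio test as a stand-in for the initial rejection of ambiguous matches; the detector choice, function names, and ratio threshold are my assumptions.

    import cv2

    def detect_and_match(img1, img2, ratio=0.75):
        """Detect scale/rotation-invariant features in two frames and match them."""
        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(img1, None)
        kp2, des2 = sift.detectAndCompute(img2, None)
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        knn = matcher.knnMatch(des1, des2, k=2)
        # Ratio test discards ambiguous matches before any geometric check.
        good = [m for m, n in knn if m.distance < ratio * n.distance]
        pts1 = [kp1[m.queryIdx].pt for m in good]
        pts2 = [kp2[m.trainIdx].pt for m in good]
        return pts1, pts2

The resulting point pairs would then be pruned and refined under the IMU prior as sketched earlier.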

Description

Technical field

[0001] The invention relates to computer vision technology and to data processing and information fusion technology, in particular to a visual IMU direction estimation method based on inertial guidance sampling.

Background technique

[0002] The high-resolution, large-capacity data obtained by human motion capture systems can be used to study the causes of diseases and contribute to their prevention and treatment. Such data are widely used in the biomedical field, for example in clinical gait analysis, outpatient rehabilitation, rehabilitation of leg joints, activity monitoring and assessment of the elderly, and rehabilitation of the visually impaired. As MEMS sensors have become cheaper and smaller, inertial measurement units (Inertial Measurement Unit, IMU) integrating accelerometers, gyroscopes and magnetometers have emerged. Existing human body motion capture systems are realized by fixing different numbers of IMUs in vari...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F19/00
Inventor 梁炜张吟龙谈金东张晓玲李杨
Owner SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI