
Indoor positioning method based on OS-ELM fusion vision and inertial information

An indoor positioning technology based on fusing visual and inertial information, applied in measuring devices, instruments, surveying and mapping, and navigation; it solves problems such as poor positioning results, and achieves the effects of improved positioning performance and precise positioning results.

Pending Publication Date: 2019-08-30
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

C. Piciarelli [4] proposed a visual indoor localization technique (referred to here as the VL algorithm), which matches the current image against a reference model of position-labelled visual features to achieve localization. The experimental results show that, although the VL algorithm achieves precise positioning most of the time, it performs poorly in situations such as occlusion, lighting changes and interference from people entering and leaving the scene; a visual navigation system (VNS) can therefore be fused with other navigation devices to provide higher positioning accuracy.



Examples


Embodiment 1

[0053] The technical solution of the present invention is further described below in conjunction with specific calculation formulas and the accompanying drawings; see the following description for details:

[0054] Preprocess the obtained inertial and visual sensor data to generate feature vectors, and model the training data containing feature vectors and target outputs:

[0055] First, the visual information is preprocessed: SURF (Speeded-Up Robust Features) features are extracted from each frame of the training images, and image I_i (numbered i) is matched against image I_{i+m} (numbered i+m). A two-way matching algorithm is then used to remove mismatched points, keeping the N_a matching pairs with the highest matching scores, and the affine transformation matrix P is calculated from these matching points.
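As a concrete illustration of this matching step, here is a minimal Python sketch, assuming OpenCV with the xfeatures2d contrib module (SURF is patented and absent from some builds); the function name and thresholds are illustrative, not from the patent:

```python
import cv2
import numpy as np

def estimate_affine(img_i, img_i_m, n_a=24):
    """Match SURF features between frames i and i+m, keep the N_a best
    cross-checked pairs, and estimate the affine transformation P."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(img_i, None)
    kp2, des2 = surf.detectAndCompute(img_i_m, None)

    # crossCheck=True keeps only mutual nearest-neighbour matches,
    # playing the role of the two-way mismatch removal described above.
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    best = matches[:n_a]  # keep the N_a pairs with the highest match quality

    src = np.float32([kp1[m.queryIdx].pt for m in best])
    dst = np.float32([kp2[m.trainIdx].pt for m in best])
    # Estimate a rotation + scale + translation (partial affine) model.
    P, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return P
```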

[0056] Each affine transformation matrix P is calculated by the following formula:

[0057] $$P = \begin{bmatrix} A_x \cos r & -A_y \sin r & T_x \\ A_x \sin r & A_y \cos r & T_y \\ 0 & 0 & 1 \end{bmatrix}$$

[0058] In the formula, r represents the rotation angle, A = (A_x, A_y) is the scaling vector, T ...
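Under the rotation-scale-translation form above, r, A and T can be recovered from an estimated 2x3 affine matrix; a minimal sketch, assuming the matrix layout shown in [0057]:

```python
import numpy as np

def decompose_affine(P):
    """Recover rotation angle r, scaling vector A and translation T
    from a 2x3 affine matrix of the form in [0057]."""
    r = np.arctan2(P[1, 0], P[0, 0])           # rotation angle r
    A = np.array([np.hypot(P[0, 0], P[1, 0]),  # scale A_x from first column
                  np.hypot(P[0, 1], P[1, 1])]) # scale A_y from second column
    T = P[:, 2]                                # translation (T_x, T_y)
    return r, A, T
```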

Embodiment 2

[0091] The feasibility of the scheme in Embodiment 1 is verified below in conjunction with Figure 2 to Figure 4, Table 1 to Table 2, and specific examples; see the following description for details:

[0092] To evaluate the effect of this method, the algorithm steps of Embodiment 1 above are applied to positioning analysis of an experiment with a total duration of 56 seconds and a displacement length of 15 m. The experiment includes interference conditions such as people entering and leaving at random and abrupt scene changes. The parameters are set as follows: the number of hidden-layer nodes is 150, the number of SURF features is 40, the matching-pair threshold N_a is 24, and the initial matching step in the online adaptive positioning stage is 5.
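For reference, these settings can be gathered into a single configuration; the dictionary keys below are hypothetical names for the sketches in this document, not identifiers from the patent:

```python
# Experiment parameters from [0092]; key names are illustrative only.
params = {
    "hidden_nodes": 150,         # OS-ELM hidden-layer nodes
    "surf_features": 40,         # SURF features extracted per frame
    "n_a": 24,                   # matching-pair threshold N_a
    "initial_matching_step": 5,  # initial step of the online adaptive stage
}
```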

[0093] From a qualitative angle, Figure 2 shows the comparison of the positioning effect between this method and the positioning algorithm proposed in [4]. Figure 3 is a schematic comparison of the cumulative error distribution curves between ...
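A cumulative error distribution comparison of the kind shown in Figure 3 can be produced with a short script; a sketch with synthetic placeholder error samples (the gamma-distributed data below is for illustration only, not the patent's results):

```python
import numpy as np
import matplotlib.pyplot as plt

def empirical_cdf(errors):
    """Return sorted errors and the fraction of samples at or below each."""
    x = np.sort(np.asarray(errors))
    return x, np.arange(1, x.size + 1) / x.size

rng = np.random.default_rng(0)
errors = {                        # synthetic placeholder error samples (m)
    "this method": rng.gamma(2.0, 0.10, 500),
    "VL algorithm [4]": rng.gamma(2.0, 0.25, 500),
}
for label, errs in errors.items():
    x, y = empirical_cdf(errs)
    plt.plot(x, y, label=label)
plt.xlabel("positioning error (m)")
plt.ylabel("cumulative probability")
plt.legend()
plt.show()
```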



Abstract

The invention discloses an indoor positioning method that fuses visual and inertial information based on OS-ELM. The method comprises the steps of: pre-processing the obtained inertial and visual sensor data to generate training feature vectors, and modelling the training data comprising the training feature vectors and target outputs; inputting the training data into an OS-ELM model, initializing the output weight vector, taking the training feature vector as the input vector and the corresponding target displacement as the training output vector; sequentially learning new training data online, updating the output weight vector of the OS-ELM model, and outputting the final weight vector through iteration as the optimal weight vector; modelling the test data to generate test feature vectors, and obtaining the test output vectors corresponding to the test data through the optimal weight vector; and introducing corner judgement to optimize the test output vectors, and calculating the output result after corner judgement to obtain the final positioning result.
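To make the training flow concrete, the following is a minimal OS-ELM sketch following the steps above; it implements the standard OS-ELM recursive least-squares update (Liang et al.) rather than the patent's exact code, and all class and method names are illustrative:

```python
import numpy as np

class OSELM:
    """Online Sequential Extreme Learning Machine for regression."""

    def __init__(self, n_in, n_hidden=150, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.uniform(-1, 1, (n_in, n_hidden))  # fixed random input weights
        self.b = rng.uniform(-1, 1, n_hidden)          # fixed random biases

    def _hidden(self, X):
        # Sigmoid hidden-layer output matrix H
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit_initial(self, X0, T0):
        """Initialise the output weight vector beta from the first batch."""
        H = self._hidden(X0)
        # A small ridge term keeps H^T H invertible for small initial batches.
        self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ T0

    def fit_sequential(self, X, T):
        """Update beta as a new chunk of training data arrives online."""
        H = self._hidden(X)
        K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ K @ H @ self.P
        self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        """Map test feature vectors to output (displacement) vectors."""
        return self._hidden(X) @ self.beta
```

After iterating fit_sequential over all chunks, beta plays the role of the optimal weight vector, and predict yields the test output vectors that the corner-judgement step would then refine.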

Description

Technical field

[0001] The invention relates to the fields of indoor positioning, information fusion and signal processing, and in particular to an indoor positioning method based on OS-ELM (Online Sequential Extreme Learning Machine) fusion of visual and inertial information.

Background technique

[0002] In recent years, with the rapid growth of demand for indoor positioning services, indoor positioning systems have become increasingly important. The Global Positioning System (GPS) is the most popular system for positioning and navigation; it can provide accurate location information anywhere in the world, day and night. In indoor environments, however, wall barriers and multipath effects make it difficult for GPS to receive sufficient satellite signals, so the positioning accuracy drops sharply and cannot match that achieved outdoors. Therefore, many alternatives to GPS have been proposed to solve the indoor positioning ...


Application Information

IPC(8): G01C21/20; G01C21/16
CPC: G01C21/206; G01C21/165
Inventor: 徐岩 (Xu Yan), 李宁宁 (Li Ningning), 安卫凤 (An Weifeng), 崔媛媛 (Cui Yuanyuan)
Owner: TIANJIN UNIV