High-precision indoor positioning method based on joint vision and wireless signal characteristics

A wireless-signal indoor positioning technology, applicable to location-information-based services, environment-specific services, neural learning methods, and related fields. The method reduces the influence of the external environment and improves positioning accuracy.

Active Publication Date: 2021-01-01
SHANGHAI UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, the positioning accuracy of vision-based methods is greatly limited by changes in the camera's internal parameters, motion blur during image acquisition, and external environmental factors such as varying lighting conditions in the indoor scene; in addition, keeping the image database up to date requires substantial manpower and material resources.
Moreover, most of these technologies perform signal-based positioning and vision-based positioning separately, using the results of the former only to select candidate areas that limit the processing complexity of the latter positioning stage, while ignoring the interaction between the two schemes.

Method used



Examples


Embodiment Construction

[0017] As shown in Figure 4, this embodiment involves a high-precision positioning method that combines vision and wireless signal features. The test is carried out in the corridor of a 4000-square-meter office building. Because pedestrians move through the corridor, this environment poses a challenge to achieving good Wi-Fi signal-based positioning. Moreover, one side of the corridor has many windows, so the lighting conditions change greatly during the day. In this setting the method achieves a positioning accuracy of 0.62 m at a grid size of 1.5 m. The method comprises the steps shown in Figure 1:

[0018] Step 1: Collect indoor scene information to obtain RSSI values and image data. Specifically, in the offline stage, within a predetermined range of the indoor scene, a mobile robot such as the one shown in Figure 5, equipped with a Wi-Fi module and an image acquisition module, establishes communication wi...
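Step 1's offline fingerprint collection can be sketched as follows: repeated RSSI scans taken at each grid point are averaged into a single fingerprint per point, forming the WiFi fingerprint database. This is an illustrative sketch only, not the patent's implementation; the data layout and function names are assumptions.

```python
from collections import defaultdict

def build_fingerprint_db(samples):
    """Average repeated RSSI scans per grid point into one fingerprint.

    samples: list of (grid_point, {ap_id: rssi_dbm}) tuples, one per scan.
    Returns {grid_point: {ap_id: mean_rssi_dbm}}.
    """
    sums = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(lambda: defaultdict(int))
    for point, scan in samples:
        for ap, rssi in scan.items():
            sums[point][ap] += rssi
            counts[point][ap] += 1
    return {
        point: {ap: sums[point][ap] / counts[point][ap] for ap in sums[point]}
        for point in sums
    }

# Hypothetical scans at two grid points spaced 1.5 m apart, as in the embodiment.
scans = [
    ((0.0, 0.0), {"ap1": -40.0, "ap2": -70.0}),
    ((0.0, 0.0), {"ap1": -44.0, "ap2": -66.0}),
    ((1.5, 0.0), {"ap1": -60.0, "ap2": -50.0}),
]
db = build_fingerprint_db(scans)
```

Averaging several scans per point smooths out the instantaneous RSSI fluctuations that the Background section identifies as a weakness of fingerprint methods.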


PUM

No PUM

Abstract

The invention discloses a high-precision indoor positioning method based on joint vision and wireless signal characteristics. The method comprises the following steps: in the offline stage, scene information is collected at the indoor site to be positioned, and a WiFi fingerprint database and an image database are constructed; in the online stage, a mobile terminal collects WiFi fingerprint data and image data in real time, performs coarse positioning on the collected WiFi fingerprint data to determine the user's potential area, and then applies deep-neural-network regression to the image data of the coarse positioning area to predict the precise position. By fusing wireless signal characteristics with visual characteristics, the method further reduces the positioning error while lowering computing resource usage and computational complexity, realizing high-precision indoor positioning.
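The abstract's two-stage pipeline begins with coarse positioning on the WiFi fingerprint. The patent text here does not give the exact matching rule, so the following is a minimal sketch assuming nearest-neighbor ranking by Euclidean distance in RSSI space (all names hypothetical); the top-k grid points would define the candidate area handed to the image-based DNN regression stage:

```python
import math

def coarse_locate(query, db, k=3, missing=-100.0):
    """Rank fingerprint grid points by RSSI-space distance to the query scan.

    query: {ap_id: rssi_dbm} measured online by the mobile terminal.
    db:    {grid_point: {ap_id: rssi_dbm}} offline fingerprint database.
    APs absent from a scan are treated as a weak floor value (`missing`).
    Returns the k closest grid points (the coarse candidate area).
    """
    aps = sorted({ap for fp in db.values() for ap in fp} | set(query))
    def dist(fp):
        return math.sqrt(sum(
            (query.get(ap, missing) - fp.get(ap, missing)) ** 2 for ap in aps))
    return sorted(db, key=lambda p: dist(db[p]))[:k]

# Hypothetical three-point database and an online scan near (0, 0).
db = {
    (0.0, 0.0): {"ap1": -40.0, "ap2": -70.0},
    (1.5, 0.0): {"ap1": -55.0, "ap2": -60.0},
    (3.0, 0.0): {"ap1": -70.0, "ap2": -45.0},
}
query = {"ap1": -42.0, "ap2": -68.0}
candidates = coarse_locate(query, db, k=2)
```

Restricting the fine stage to these candidate points is what lets the method cut computational complexity: the DNN regression only needs to consider images from the coarse area rather than the whole site.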

Description

technical field

[0001] The present invention relates to a technology in the field of indoor wireless positioning, specifically a high-precision indoor positioning method based on joint vision and wireless signal characteristics, applicable to various indoor positioning systems equipped with Wi-Fi and cameras, such as the numerous mobile robots and other smart devices currently in use.

Background technique

[0002] Existing positioning methods based on Wi-Fi location fingerprints need to model the complex environment accurately to mitigate the impact of multipath propagation, and are very sensitive to the instantaneous signal fluctuations caused by unpredictable obstacles (such as pedestrian movement), which attenuate and distort wireless signals.

[0003] Some improved technologies combine visual information for real-time positioning in a targeted manner, but this type of technology needs to rely on cameras to collect images of the surrounding environment of the u...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority Applications(China)
IPC(8): H04W4/021, H04W4/33, H04W4/02, G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: H04W4/023, H04W4/33, H04W4/021, G06N3/049, G06N3/08, G06V20/10, G06N3/045, G06F18/241
Inventor 王钰周广兵向晨路张舜卿徐树公
Owner SHANGHAI UNIV