
Indoor positioning method based on visual feature matching and shooting angle estimation

A visual-feature-based indoor positioning technology, applied in computing, surveying, mapping and navigation, and navigation computing tools. It addresses the problem that existing indoor positioning methods require large investments in equipment and infrastructure, with the effect of improving visual positioning accuracy and relaxing the requirements on the positioning scene.

Active Publication Date: 2016-01-06
Assignee: 成都卫瀚科技合伙企业(有限合伙)

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the problems that existing indoor positioning methods require a large amount of up-front investment in equipment and infrastructure and impose strict requirements on the positioning scene, by proposing an indoor positioning method based on visual feature matching and shooting angle estimation.



Examples


Specific Embodiment 1

[0017] Specific Embodiment 1: This embodiment is described with reference to Figure 1. An indoor positioning method based on visual feature matching and shooting angle estimation is carried out according to the following steps:

[0018] Step 1: Build a visual database of the indoor scene, which contains the indoor visual features, the position coordinates of the visual features in the indoor coordinate system, and the positions of the visual features in the camera coordinate system during acquisition; see Figure 2;
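As a rough illustration of what one record in such a database could hold, here is a minimal Python sketch; the class and field names are hypothetical and only mirror the three kinds of information listed in Step 1 (features, indoor-frame coordinates, camera-frame information at acquisition time):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class DatabaseImageEntry:
    """One surveyed image in the indoor visual database (hypothetical layout)."""
    descriptors: np.ndarray        # SURF descriptors of the image, shape (N, 64)
    features_indoor: np.ndarray    # feature positions in the indoor frame ODXDYDZD, shape (N, 3)
    features_camera: np.ndarray    # feature positions in the camera frame at acquisition, shape (N, 3)
    camera_position: np.ndarray    # camera position in the indoor frame at acquisition, shape (3,)

# The database is simply a collection of such entries, filled during an offline survey.
visual_database: list[DatabaseImageEntry] = []
```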

[0019] Step 2: According to the visual database of the indoor scene obtained in Step 1, find the database image with the largest matching rate with the query images P1Q and P2Q; see Figure 3;
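One common way to implement this "largest matching rate" search is brute-force descriptor matching with Lowe's ratio test; the sketch below assumes that convention (the 0.7 ratio and the definition of the matching rate as the fraction of confidently matched query descriptors are assumptions, not values given in the patent):

```python
import cv2
import numpy as np

def matching_rate(query_desc: np.ndarray, db_desc: np.ndarray, ratio: float = 0.7) -> float:
    """Fraction of query descriptors that find a confident match in one database image."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(query_desc, db_desc, k=2)
    good = [pair[0] for pair in knn
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
    return len(good) / max(len(query_desc), 1)

def best_database_match(query_desc: np.ndarray, database):
    """Return the database entry whose image has the largest matching rate with the query."""
    return max(database, key=lambda entry: matching_rate(query_desc, entry.descriptors))
```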

[0020] Step 3: Calculate the rotation matrix R1 corresponding to the query image P1Q and the rotation matrix R2 correspondin...
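The text is truncated here, so the following is only a generic sketch of how a rotation matrix between a query image and its best-matching database image is often recovered from feature correspondences (via the essential matrix); it is not the specific shooting-angle derivation claimed by the patent. The camera intrinsic matrix K and the matched pixel coordinates are assumed inputs:

```python
import cv2
import numpy as np

def relative_rotation(pts_query: np.ndarray, pts_db: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Estimate the rotation between the query camera and the database camera.

    pts_query, pts_db: matched pixel coordinates, shape (N, 2); K: 3x3 intrinsics.
    Generic epipolar-geometry approach, used here only as a stand-in for the
    patent's rotation-matrix computation.
    """
    E, mask = cv2.findEssentialMat(pts_query, pts_db, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, _, _ = cv2.recoverPose(E, pts_query, pts_db, K, mask=mask)
    return R

# Applying this once per query image would yield R1 and R2, from which the
# camera position is then derived in the final step of the method.
```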

Specific Embodiment 2

[0022] Specific Embodiment 2: This embodiment differs from Specific Embodiment 1 in the details of Step 1, in which the visual database of the indoor scene is built; the database contains the indoor visual features, the position coordinates of the visual features in the indoor coordinate system, and the positions of the visual features in the camera coordinate system during acquisition. The specific process is:

[0023] Step 1.1: Define the indoor coordinate system;

[0024] In the indoor scene, define a three-dimensional Cartesian orthogonal coordinate system ODXDYDZD, where the ZD axis points due north, the XD axis points due east, the YD axis points downward perpendicular to the indoor ground plane, and OD is the origin of the coordinate system ODXDYDZD.
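To make the convention concrete, the small sketch below encodes the indoor frame defined above together with an assumed rigid-body relation between the indoor frame and a camera frame; the transform convention p_camera = R·(p_indoor − C) is an illustrative assumption, not something stated in the excerpt:

```python
import numpy as np

def indoor_to_camera(p_indoor: np.ndarray, R: np.ndarray, C: np.ndarray) -> np.ndarray:
    """Express an indoor-frame point in a camera frame.

    The indoor frame ODXDYDZD follows Step 1.1: XD due east, YD straight down
    (perpendicular to the indoor ground plane), ZD due north, origin OD.
    R is the 3x3 indoor-to-camera rotation and C the camera centre in the
    indoor frame; the convention p_camera = R @ (p_indoor - C) is assumed.
    """
    return R @ (p_indoor - C)

# Example: a point 2 m east and 3 m north of the origin, at floor level.
p = np.array([2.0, 0.0, 3.0])
```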

[0025] Step 1.2: Define the camera coordinate system and the image coordinate system;

[0026]...

Specific Embodiment 3

[0034] Specific Embodiment 3: This embodiment differs from Specific Embodiments 1 and 2 in Step 2, in which, according to the visual database of the indoor scene obtained in Step 1, the database image with the largest matching rate with the query images P1Q and P2Q is found. The specific process is:

[0035] Step 2.1: Extract SURF features from the query images and the database images;

[0036] The SURF (Speeded Up Robust Features) algorithm is an accelerated robust feature algorithm. In this step, the input of the SURF algorithm is a database image and its output is the feature vectors of that database image. The user to be positioned uses a camera to capture two images at the same position, obtaining two query images containing different visual features, denoted PkQ, where the superscript Q indicates that the image is a query image and the subscript k is the index of the query image, k = 1, ...
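SURF is available in OpenCV's contrib module (cv2.xfeatures2d); a minimal extraction sketch consistent with this step is shown below. The Hessian threshold and file names are illustrative choices, not values from the patent, and a build with the non-free contrib modules enabled is required:

```python
import cv2

def extract_surf_features(image_path: str, hessian_threshold: float = 400.0):
    """Detect SURF keypoints and compute their descriptors for one image."""
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    keypoints, descriptors = surf.detectAndCompute(image, None)
    return keypoints, descriptors  # descriptors: (N, 64), or (N, 128) in extended mode

# Usage on the two query images P1Q and P2Q captured at the same position
# (hypothetical file names), and likewise on every database image:
#   kp1, desc1 = extract_surf_features("P1Q.jpg")
#   kp2, desc2 = extract_surf_features("P2Q.jpg")
```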



Abstract

The invention provides an indoor positioning method based on visual feature matching and shooting angle estimation, which solves the problems that existing indoor positioning methods require a large amount of up-front investment in equipment and infrastructure and impose stringent requirements on the positioning scene. According to the technical scheme, a visual database of the indoor scene is built, comprising the position coordinates of the visual features in the indoor coordinate system and the positions of the visual features in the camera coordinate system during acquisition; according to the visual database obtained in the first step, the database image that matches the query images P1Q and P2Q to the greatest extent is found; a rotation matrix R1 corresponding to the query image P1Q and a rotation matrix R2 corresponding to the query image P2Q are calculated; and the position of the query camera is calculated from R1 and R2. The method is applicable to the fields of computer vision and image processing.
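Taken together, the abstract describes a four-stage pipeline. The skeleton below only mirrors that flow and reuses the helpers sketched with the embodiments above; the two callables passed in stand for the rotation-matrix and position computations that the excerpt truncates, so none of this is an API defined by the patent:

```python
def locate_user(query_path_1, query_path_2, visual_database, K,
                estimate_rotation, estimate_position):
    """High-level flow of the positioning scheme summarized in the abstract (sketch only).

    Reuses extract_surf_features and best_database_match from the sketches above;
    estimate_rotation and estimate_position are placeholders for the patent's
    Step 3 and final position computation.
    """
    # Largest-matching-rate database image for each of the two query images.
    _, desc1 = extract_surf_features(query_path_1)
    _, desc2 = extract_surf_features(query_path_2)
    db1 = best_database_match(desc1, visual_database)
    db2 = best_database_match(desc2, visual_database)

    # Rotation matrices R1 and R2 for the query images P1Q and P2Q.
    R1 = estimate_rotation(query_path_1, db1, K)
    R2 = estimate_rotation(query_path_2, db2, K)

    # Position of the query camera, derived from R1, R2 and the database poses.
    return estimate_position(R1, R2, db1, db2)
```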

Description

Technical Field

[0001] The invention relates to an indoor positioning method based on visual feature matching and shooting angle estimation.

Background Art

[0002] With the continuous development of information technology, location-based services have attracted the attention of more and more researchers and research institutions. Applications based on location services have gradually penetrated into many aspects of daily life, for example electronic maps that make travel more convenient and navigation systems that assist driving. These applications rely mainly on global satellite positioning systems; GPS, GLONASS, Galileo and Beidou are the satellite positioning systems in widespread use today. However, global satellite positioning systems can only provide location information services for users in outd...

Claims


Application Information

IPC(8): G06T7/00; G01C21/20
CPC: G01C21/206; G06T2207/10016
Inventors: 谭学治, 冯冠元, 马琳
Owner: 成都卫瀚科技合伙企业(有限合伙)