Mobile side vision fusion positioning method and system, electronic equipment

A technology combining positioning with mobile terminals, applied at the intersection of artificial intelligence and geographic information technology. It addresses the problems of existing approaches: semantic information is not used, preparation work is cumbersome, and only relative positioning is achieved.

Active Publication Date: 2019-10-25
SHENZHEN INST OF ADVANCED TECH CHINESE ACAD OF SCI
Cites 3 | Cited by 13

AI Technical Summary

Problems solved by technology

[0008] However, most existing visual positioning systems, such as EasyLiving and Google VPS, are based on the SLAM principle: feature points captured by the visual sensor are extracted and, by triangulation ranging combined with accelerometer, gyroscope and other sensors, the movement offset of the current position is calculated, yielding just relative ...
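
As a rough illustration of why such relative positioning drifts, the following Python sketch (the frame offsets and the 1 cm bias are invented numbers, not from the patent) integrates per-frame displacement estimates and shows a small per-frame error growing into a large absolute error:

```python
import numpy as np

def integrate_relative_motion(start, frame_offsets):
    """Accumulate per-frame displacement estimates (as a SLAM-style visual
    odometry produces) into a trajectory. Without an absolute reference the
    result is only relative to `start`, and any per-frame error drifts."""
    pose = np.asarray(start, dtype=float)
    track = [pose.copy()]
    for dx in frame_offsets:
        pose = pose + dx          # relative offset from triangulated features / IMU
        track.append(pose.copy())
    return np.array(track)

# Illustration: a 1 cm per-frame bias on top of 10 cm true motion per frame
# accumulates to a 10 m absolute error after 1000 frames.
true_step = np.array([0.10, 0.0])
biased_steps = np.tile(true_step + [0.01, 0.0], (1000, 1))
print(integrate_relative_motion([0.0, 0.0], biased_steps)[-1])   # ~[110, 0] vs true [100, 0]
```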



Examples


Embodiment Construction

[0064] In order to make the purpose, technical solutions and advantages of the present application clearer, the present application is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present application, not to limit it.

[0065] Referring to figure 1, which is a flow chart of the mobile terminal visual fusion positioning method according to an embodiment of the present application, the method includes the following steps:

[0066] Step 100: system initialization;

[0067] In step 100, system initialization includes the following steps:

[0068] Step 110: initialization of visual odometry;

[0069] In step 110, the initialization of the visual odometry includes: the memory allocation of the pose manager, the initial value assignment...
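
As a loose illustration of what a pose manager's memory allocation and initial value assignment might look like, here is a hypothetical Python sketch; the PoseManager class, its buffer layout, and the init_visual_odometry helper are assumptions for illustration, not the patented implementation:

```python
import numpy as np

class PoseManager:
    """Hypothetical pose manager: pre-allocates a fixed buffer of poses and
    assigns initial values, sketching the kind of work step 110 refers to."""

    def __init__(self, capacity=1024):
        # memory allocation: one row per frame, [x, y, z, qw, qx, qy, qz]
        self.poses = np.zeros((capacity, 7))
        # initial value assignment: identity orientation for the first pose
        self.poses[0, 3] = 1.0
        self.count = 1

def init_visual_odometry(start_xyz=(0.0, 0.0, 0.0)):
    """Initialize visual odometry state at the calibrated starting location."""
    pm = PoseManager()
    pm.poses[0, :3] = start_xyz
    return pm

vo = init_visual_odometry()
print(vo.poses[0])   # first pose: start position with identity quaternion
```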



Abstract

The application relates to a mobile-side vision fusion positioning method and system, and electronic equipment. The method comprises the following steps: step a, acquiring an initial location of a mobile terminal as the current location of a positioning target based on the calibrated starting location and sensor information; step b, acquiring a video frame by using the mobile terminal; and step c, detecting a static object in the video frame, acquiring geographic coordinate information of the static object through a BIM spatial database, substituting the coordinate information of the static object into a multi-target object positioning model, iteratively solving the positioning model by the Gauss-Newton method to acquire the current location of the mobile terminal, and combining the current location of the mobile terminal with the coordinate information of the static object to obtain a positioning result of the positioning target. Through the positioning method disclosed by the application, a more convenient, more precise, and cheaper positioning method can be realized.
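
As a sketch of how the abstract's step c could be realized, the following Python example solves for the mobile terminal's position by Gauss-Newton iteration, under the assumption that the observations are ranges to static objects whose coordinates were retrieved from the BIM spatial database; the observation model, landmark coordinates, and noise level are illustrative assumptions, not the patent's exact formulation:

```python
import numpy as np

def gauss_newton_position(landmarks, ranges, x0, iters=20, tol=1e-6):
    """Estimate a 2-D position from ranges to static objects whose geographic
    coordinates are known (e.g. looked up in a BIM spatial database), by
    Gauss-Newton iteration on the range residuals."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diffs = x - landmarks                       # (N, 2): landmark -> candidate position
        dists = np.linalg.norm(diffs, axis=1)       # predicted ranges
        r = dists - ranges                          # residuals
        J = diffs / dists[:, None]                  # Jacobian of ranges w.r.t. position
        delta = np.linalg.solve(J.T @ J, -J.T @ r)  # normal equations
        x = x + delta
        if np.linalg.norm(delta) < tol:
            break
    return x

# Example: three static objects with known coordinates and noisy range observations
landmarks = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])
true_pos = np.array([4.0, 3.0])
ranges = np.linalg.norm(landmarks - true_pos, axis=1) + np.random.normal(0.0, 0.05, 3)
print(gauss_newton_position(landmarks, ranges, x0=[1.0, 1.0]))   # close to (4, 3)
```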

Description

technical field

[0001] This application belongs to the interdisciplinary field of artificial intelligence and geographic information technology, and particularly relates to a mobile terminal visual fusion positioning method, system and electronic equipment.

Background technique

[0002] Global Navigation Satellite Systems (GNSS) realize navigation and positioning outdoors. At present, radio positioning technologies represented by GNSS, cellular networks, WIFI, etc. can achieve sub-meter accuracy in open outdoor settings; the principle is to detect characteristic parameters of the propagation signal in order to determine position. Common methods include proximity detection, observed time difference of arrival (OTDOA), and so on.

[0003] Indoor positioning technology mainly realizes the positioning and tracking of people and objects in various indoor spaces. Based on indoor positioning, the demand for safety and monitoring of people and ...
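
As a small illustration of the OTDOA principle mentioned above, the following Python sketch evaluates how well a candidate position explains observed arrival-time differences from three base stations; the anchor positions, the coarse grid search, and all numbers are invented for illustration only:

```python
import numpy as np

def otdoa_cost(pos, anchors, tdoas, c=299_792_458.0):
    """Sum of squared OTDOA residuals for a candidate position: observed
    arrival-time differences (relative to anchors[0]) versus those predicted
    from the geometry."""
    d = np.linalg.norm(anchors - pos, axis=1)
    predicted = (d[1:] - d[0]) / c
    return float(np.sum((predicted - tdoas) ** 2))

# Base stations and the time differences a receiver at (120, 80) would observe
c = 299_792_458.0
anchors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 400.0]])
true = np.array([120.0, 80.0])
tdoas = (np.linalg.norm(anchors[1:] - true, axis=1) - np.linalg.norm(anchors[0] - true)) / c

# Coarse grid search over candidate positions, just to show the principle
grid = [(x, y) for x in np.arange(0, 500, 5.0) for y in np.arange(0, 400, 5.0)]
best = min(grid, key=lambda p: otdoa_cost(np.array(p), anchors, tdoas))
print(best)   # near (120, 80)
```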

Claims


Application Information

IPC(8): G01C21/20
CPC: G01C21/20
Inventor: 赵希敏, 胡金星
Owner: SHENZHEN INST OF ADVANCED TECH CHINESE ACAD OF SCI